Intel Moves to Free Gadgets of Their Recharging
Cords
New York Times (08/21/08) P. C4; Markoff, John
Intel has developed "wireless resonant energy link" technology that could
enable the wireless recharging of handheld devices and other gadgets and
appliances. The technology uses a magnetic field to broadcast up to 60
watts of power over a distance of two to three feet, and Intel says only
25 percent of that power is lost in transmission. The technology builds on
the work of Massachusetts Institute
of Technology physicist Marin Soljacic, who pioneered the idea of
wirelessly transmitting power using resonant magnetic fields. Both the
Intel and MIT researchers are exploring a phenomenon known as "resonant
induction," which makes it possible to transmit power several feet without
wires. Induction is already used to charge devices such as electric
toothbrushes, but those applications require the device to be placed in a
base station. The MIT group has demonstrated efficiencies of 50 percent
while transmitting power over several meters. Intel is also testing whether the
technology could be used to power supercapacitors, which can be recharged
far more quickly than modern batteries. Intel's Justin Rattner says that
someday countertop appliances such as coffeemakers may only need to be
placed on the counter to be powered. Intel researchers are experimenting
with antennas less than two feet in diameter to remotely power a 60-watt
light bulb.
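The article's figures reduce to simple arithmetic; a minimal back-of-the-envelope check in Python, using only the numbers quoted above:

```python
# Back-of-the-envelope check using the article's figures: 60 W broadcast
# over two to three feet with 25 percent lost in transmission.
transmitted_w = 60
loss_fraction = 0.25

delivered_w = transmitted_w * (1 - loss_fraction)
print(f"power available at the receiver: {delivered_w:.0f} W "
      f"({1 - loss_fraction:.0%} efficient)")
# For comparison, the MIT group reported ~50% efficiency over several meters.
```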
Space Age Engineers to Verify Control Software for Future
Robotic Inter-Planetary Missions
University of Leicester (08/20/08)
The University of Leicester will help develop new verification and
validation techniques for next-generation satellite systems. Researchers
from Leicester's Control and Instrumentation Research Group are involved in
an international consortium that is working to improve mission-critical
control software for the rendezvous of groups of satellites. The European
Space Agency (ESA) is funding the two-year project. "Future ESA missions,
like the autonomous robotic satellites which will collect and return
samples from the surface of Mars, require control systems involving complex
requirements, system architectures, software algorithms, and hardware
implementations," says University of Leicester senior lecturer Declan
Bates. "It is essential to show that the control system is sufficiently
robust to ensure the desired safety levels under a large number of adverse
and unforeseen conditions." Engineers from Spain's GMV, Canada's NGC
Aerospace, and the University of Oxford are also involved in the
project.
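The article does not describe the consortium's verification techniques in detail; as a rough illustration of one standard way to show a control system is robust "under a large number of adverse and unforeseen conditions," here is a minimal Monte Carlo dispersion sketch for a toy closed-loop system (the dynamics, parameter ranges, and gain are all invented for illustration):

```python
import random

# Toy closed loop: x[k+1] = (a - b * K_C) * x[k], stable when |a - b*K_C| < 1.
# Plant parameters a and b are uncertain; the controller gain K_C is fixed.
K_C = 0.8            # hypothetical controller gain
N_TRIALS = 100_000   # number of random parameter draws

failures = 0
for _ in range(N_TRIALS):
    a = random.uniform(1.0, 1.5)   # uncertain open-loop pole
    b = random.uniform(0.5, 1.0)   # uncertain input gain
    if abs(a - b * K_C) >= 1.0:    # closed loop unstable for this draw
        failures += 1

print(f"estimated probability of instability: {failures / N_TRIALS:.4f}")
```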
Universities Detail Declines in Federal R&D Funding for
Science and Engineering Fields
National Science Foundation (08/21/08) Mixon, Bobbie
Federal funding of academic science and engineering research and
development (R&D) rose 1.1 percent in current dollars to $30.4 billion in
fiscal year (FY) 2007, according to the National Science Foundation (NSF).
However, the NSF's latest Survey of Research and Development Expenditures
at Universities and Colleges found that R&D funding fell for the second
straight year after adjusting for inflation. In inflation-adjusted
dollars, R&D funding decreased 1.6 percent from FY 2006, following a
decline of 0.2 percent in FY 2006. The federal government still accounted
for 62 percent of R&D funding in FY 2007. Overall, R&D expenditures
totaled $49.4 billion in FY 2007, as nonfederal funding grew 5 percent.
Industry funding increased 11.2 percent to $2.7 billion; state and local
government funding grew 6.1 percent to $3.1 billion; academic institutions'
contributions rose 6.6 percent to $9.7 billion; and funding from nonprofit
organizations, nongovernmental entities, and other sources increased 10
percent to $3.5 billion.
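The survey figures above are internally consistent; a quick sanity check (all values in billions of dollars, taken from the article):

```python
# Sanity check of the figures quoted above (billions of dollars, FY 2007).
federal = 30.4
nonfederal = 2.7 + 3.1 + 9.7 + 3.5   # industry, state/local, institutions, other

total = federal + nonfederal
print(f"total R&D expenditures: ${total:.1f} billion")  # matches the $49.4B reported
print(f"federal share: {federal / total:.0%}")          # matches the 62% reported
```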
Better Access to Scientific Articles on EU-Funded
Research: European Commission Launches Online Pilot Project
EUROPA (08/20/08)
The European Commission (EC) is working to ensure that the results of the
research it funds through the European Union's (EU's) 7th Research
Framework Programme (FP7) are disseminated as widely and as efficiently as
possible to guarantee maximum exploitation and impact in the world of
researchers and beyond. The EC recently launched a pilot project that will
give unrestricted online access to EU-funded research results, primarily
research articles published in peer-reviewed journals, after a six-month to
12-month embargo period. The pilot covers areas accounting for about 20
percent of the FP7 program budget, including research projects in
health, energy, environment, social sciences, and information and
communication technologies. The open access pilot project will run until
the end of FP7, and aims to ensure that the results of EU-funded research
are made available to everyone. Grant recipients will be required to
deposit peer-reviewed research articles or final manuscripts from their
projects in an online repository. Researchers will have to make their best
effort to ensure that access to these articles is given either six or 12
months after publication, depending on the research area. The embargo
period is intended to allow scientific publishers to get a return on their
investment. "The rapid development of digital technologies offers
researchers unprecedented possibilities for the timely and efficient
sharing of information," says EU commissioner for information society and
media Viviane Reding. "Our new pilot will harness that potential, making
it easier for researchers, businesses, and policy makers to address global
challenges like climate change by providing them with access to the latest
research."
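The embargo rule the pilot imposes is easy to model; a minimal sketch in Python (the mapping of research areas to six- or 12-month embargoes below is an assumption for illustration; the article states only that the period depends on the area):

```python
from datetime import date

# Illustrative sketch of the pilot's embargo rule: articles become openly
# accessible 6 or 12 months after publication, depending on research area.
# This area-to-embargo mapping is assumed, not taken from the article.
EMBARGO_MONTHS = {"health": 6, "energy": 6, "environment": 6,
                  "social sciences": 12, "ICT": 12}

def open_access_date(published: date, area: str) -> date:
    """Date on which a deposited article must become openly accessible."""
    total = published.month - 1 + EMBARGO_MONTHS[area]
    return published.replace(year=published.year + total // 12,
                             month=total % 12 + 1)

print(open_access_date(date(2008, 8, 20), "health"))   # 2009-02-20
print(open_access_date(date(2008, 8, 20), "ICT"))      # 2009-08-20
```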
UC San Diego Computer Scientists Propose New Data Center
Architecture Based on Commodity Network Elements
University of California, San Diego (08/20/08) Fox, Tiffany
University of California, San Diego (UCSD) computer scientists have
proposed a new way of building data centers that could lower costs and
provide more computing capability for end users. UCSD professor Amin
Vahdat's paper, "A Scalable Commodity Data Center Network Architecture,"
was presented at the annual ACM SIGCOMM meeting in Seattle. "Large
companies are putting together server farms of tens of thousands of
computers--even approaching 100-thousand--and the big challenge is to
interconnect all these computers so that they can talk to each other as
quickly as possible, without incurring significant costs," Vahdat says.
"We are proposing a new topology for Ethernet data center connectivity."
The research addresses problems inherent in modern data center networks
with large-scale computation or storage requirements. Vahdat says the work
addresses data center network connectivity at a time when consolidation
into ever-larger facilities is increasingly common. The UCSD researchers
propose creating a data center that has scalable interconnection bandwidth,
making it possible for an arbitrary host in the data center to communicate
with any other host in the network at the full bandwidth of its local
network interface. The approach requires no modifications to end-host
network interfaces, operating systems, or applications, and is backward
compatible with Ethernet, IP, and TCP.
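The paper's topology is a fat tree built entirely from identical k-port commodity switches: each of k pods contains k/2 edge and k/2 aggregation switches, with (k/2)^2 core switches on top, supporting k^3/4 hosts at full bisection bandwidth. A sketch of that capacity arithmetic (the sample k values are illustrative):

```python
def fat_tree_capacity(k: int) -> tuple[int, int]:
    """Host and switch counts for a fat tree built from k-port switches."""
    assert k % 2 == 0, "k must be even"
    hosts = k ** 3 // 4            # k pods x (k/2 edge switches) x (k/2 hosts)
    edge = agg = k * (k // 2)      # edge and aggregation layers across all pods
    core = (k // 2) ** 2           # core layer
    return hosts, edge + agg + core

for k in (4, 24, 48):
    hosts, switches = fat_tree_capacity(k)
    print(f"k={k:2d}: {hosts:6,d} hosts using {switches:5,d} identical switches")
```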
Microsoft Launches Free Photosynth for Combining Shots
Into One Picture
Seattle Times (08/21/08) Romano, Benjamin J.
Microsoft Live Labs has released Photosynth, a free program that creates
three-dimensional virtual environments, or synths, from overlapping
photographs. Dozens of synths are available for viewing on the Photosynth
Web site, including some from National Geographic, which assigned
photographers to document wonders of the world such as Stonehenge using
Photosynth. Photosynth group manager David Gedye says that if the project
is successful, the researchers will have invented a new form of media,
somewhere between photos, computer games, and video, that offers rich
detail, user-controlled navigation, and cinematic qualities. Microsoft suggests
that people create synths using from 10 to 300 photos taken specifically
for Photosynth. The photos should cover the subject from all sides and
from a variety of perspectives. A moderately powerful computer can
calculate a synth from about a dozen photographs in about five minutes by
matching elements they have in common to reconstruct the subject. Larger
synths can take several hours to process, depending on the size of the
photos and the power of the machine, which is still a significant
improvement over the original application. Live Labs group project manager
Alex Daley says when Photosynth was demonstrated two years ago it took 36
hours on a large cluster of computers to calculate synths.
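The matching step the article alludes to can be illustrated with off-the-shelf tools. Below is a minimal sketch using OpenCV; Photosynth's actual pipeline, descended from the Photo Tourism project, uses SIFT features and bundle adjustment, so ORB serves here only as a freely available stand-in, and the file names are placeholders:

```python
import cv2  # assumes the opencv-python package is installed

# Find elements two overlapping photos have in common -- the first step
# in reconstructing a synth. File names below are placeholders.
img1 = cv2.imread("photo1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo2.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)       # keypoint detector/descriptor
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Cross-checked brute-force matching keeps only mutual best matches,
# a crude proxy for reliably shared scene elements.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

print(f"{len(matches)} candidate common elements between the two photos")
```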
Risk Assessment Planned for Voting Systems
Government Computer News (08/19/08) Jackson, William
The Election Assistance Commission (EAC) wants to conduct a formal risk
assessment of voting systems to identify an acceptable level of risk, as
well as appropriate security controls, for all types of voting systems used
in federal elections. The assessment will apply principles established in
the Federal Information Security Management Act (FISMA), as well as
procedures and guidelines for FISMA compliance created by the National
Institute of Standards and Technology. Although the EAC does not have
authority over state and local jurisdictions, the commission provides a set
of voluntary guidelines for certifying voting systems used in many states.
The EAC has released a request for proposals for a contractor to conduct a
"scientifically founded voting system risk assessment." The commission is
looking for a multidisciplinary team of academic researchers, security and
software engineers, security professionals, and election administration
professionals to conduct the work. The first phase will produce reference
models for election processes to define the operational context in which
voting systems are used, and models for each generic type of voting system,
such as paper ballot, optical scan, or Direct Recording Electronic
machines. The second phase will analyze risks associated with each
technology and perform assessments of potential harm from those risks. The
third phase will identify an acceptable level of impact for voting
systems.
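The phased structure mirrors standard NIST practice, in which each identified threat is scored by likelihood and impact. A hedged sketch of that style of scoring follows; the threats and numbers are entirely invented and do not come from the EAC:

```python
# Illustrative only: NIST SP 800-30-style risk scoring, where
# risk = likelihood x impact per threat. Threats and scores are invented.
threats = {
    "tampered memory card":    (2, 5),   # (likelihood 1-5, impact 1-5)
    "miscalibrated scanner":   (3, 3),
    "insider ballot stuffing": (1, 5),
}

ranked = sorted(threats.items(),
                key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for threat, (likelihood, impact) in ranked:
    print(f"{threat:25s} risk score = {likelihood * impact}")
```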
The 160-Mile Download Diet: Local File-Sharing
Drastically Cuts Network Load
University of Washington News and Information (08/19/08) Hickey, Hannah;
Muzzin, Suzanne
University of Washington (UW) and Yale University researchers have
proposed P4P, an approach to peer-to-peer (P2P) file-sharing in which users
share preferentially with nearby computers. Such a system would allow P2P
traffic to continue to grow without crippling the Internet, and could
provide a basis for future P2P systems. A paper on P4P will be presented
at ACM's Special Interest Group on Data Communications conference. Paper
co-author and UW professor Arvind Krishnamurthy says initial tests indicate
that the network load could be reduced by a factor of five or more without
compromising network performance, while simultaneously increasing speeds by
about 20 percent. In traditional P2P networks, data packets traveled an
average of 1,000 miles and took 5.5 metro-hops, which are connections
through major hubs. In the P4P network, data packets traveled 160 miles on
average and made only 0.89 metro-hops, significantly reducing traffic on routes
between cities where bottlenecks are most likely to occur. Tests show that
only 6 percent of file-sharing is currently done locally, while in a P4P
network, local file-sharing increased to 58 percent. P4P requires more
cooperation between the ISP and the file-sharing host, but it does not
force companies to disclose information on how they route Internet
traffic.
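The core idea is that an ISP-provided locality hint lets clients prefer nearby peers. A minimal sketch of that selection policy follows; the ranking tiers and data layout are invented for illustration, not taken from the P4P specification:

```python
import random

# Hypothetical P4P-style peer selection: prefer peers in the same metro
# area, then the same ISP, and only then fall back to distant peers.
def locality_rank(me: dict, peer: dict) -> int:
    if peer["metro"] == me["metro"]:
        return 0   # cheapest: traffic stays inside the metro area
    if peer["isp"] == me["isp"]:
        return 1   # avoids inter-ISP transit links
    return 2       # last resort: long-haul traffic between cities

me = {"isp": "ISP-A", "metro": "Seattle"}
candidates = [
    {"id": i,
     "isp": random.choice(["ISP-A", "ISP-B"]),
     "metro": random.choice(["Seattle", "Chicago", "Boston"])}
    for i in range(20)
]

neighbors = sorted(candidates, key=lambda p: locality_rank(me, p))[:8]
print([(p["id"], p["metro"]) for p in neighbors])
```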
Simple and Secure Networked Home
ICT Results (08/18/08)
The European Union-funded ESTIA project has demonstrated software that
enables a person to control audiovisual equipment and other products in the
home through a single, remote interface. Networked devices are
automatically recognized by the system, and the network can be administered
using a variety of home electronics, including TVs, cordless phones, PDAs,
and PCs. An increasing number of home-based electronics are being
manufactured with networking and remote-control capabilities, even washing
machines, dryers, and ovens, but few people are using these features.
ESTIA lead researcher Lars Dittmann says this is partly because people
perceive the control of networked devices as too complicated, since most
networkable devices have their own proprietary control systems, and partly
because of trust and control issues. The ESTIA researchers aimed to address
these issues by creating an interface that gives users a personal identity
with different access rights to different networked devices. For example,
the interface enables people entering the house to type in a four-digit
code on a pad by the door, allowing the house to monitor who is there. If
an adult is in the house, the children would be allowed to use the oven or
microwave, but if the children are home alone their access may be limited
to the TV. The ESTIA home networking architecture selects and uses
whatever networking technologies are available, from IP-based networks to
KNX, a wire-based platform for building control systems.
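The per-identity access rights the article describes can be sketched as a simple policy check. Everything below (names, codes, device lists) is invented to illustrate the oven-and-microwave rule, and is not ESTIA's actual API:

```python
# Toy model of ESTIA-style access rights: identity comes from a four-digit
# code entered at the door, and some devices require an adult to be home.
PRESENT: set[str] = set()

ADULTS = {"alice", "bob"}
ALWAYS_ALLOWED = {"tv"}
SUPERVISED_ONLY = {"oven", "microwave"}

def enter(person: str, code: str) -> None:
    # A real system would verify the code against the person's identity.
    if len(code) == 4 and code.isdigit():
        PRESENT.add(person)

def may_use(person: str, device: str) -> bool:
    if person in ADULTS or device in ALWAYS_ALLOWED:
        return True
    if device in SUPERVISED_ONLY:
        return bool(ADULTS & PRESENT)   # allowed only if an adult is home
    return False

enter("carol", "1234")              # a child comes home alone
print(may_use("carol", "oven"))     # False: no adult present
enter("alice", "9876")
print(may_use("carol", "oven"))     # True: an adult is now home
```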
What Linux Will Look Like in 2012
InformationWeek (08/14/08) Yegulalp, Serdar
Serdar Yegulalp predicts that over the next four years Linux will evolve
into an operating system that is easy for non-Linux-savvy users to employ,
and that usage will split into three fundamental models: for-pay, free to
use, and free/libre. The most common model, free to use, is a free
distribution with optional paid support and optional closed-source
elements such as proprietary, binary-only device drivers; free/libre
distributions are wholly free.
"Over the next few years, the distinctions between these three licensing
models will become heavily accentuated by both the Linux community and by
the creators of these distributions themselves," writes Yegulalp. He
expects the Linux desktop of 2012 will develop into a bare-bones,
click-and-go interface to ease non-technical users' adoption of Linux,
while Linux hardware circa 2012 should include an array of mobile devices,
including phones, netbooks, and products that use open architectures.
Yegulalp anticipates a migration toward hardware with open standards and
accessibility, while application trends he foresees include the browser
assuming the role of application deployment framework. The running of
Linux in parallel with any other operating system will be greatly
simplified via virtualization in the Linux kernel, while Windows apps could
be run side-by-side with Linux apps through the use of a virtual machine
and cut-and-paste functionality. Yegulalp expects Linux to become even
more dominant among servers, also partly thanks to virtualization.
"Linux's mutability allows for its use not only as a server platform but as
hypervisor and container for other operating systems," he writes.
'Virtual Archaeologist' Reconnects Fragments of Ancient
Civilization
Princeton University (08/13/08) Shekhar, Chandra
Archaeologists in Greece are working to reconstruct wall paintings that
contain clues about the ancient culture of Thera, an island civilization
buried under volcanic ash more than 3,500 years ago. At their current
rate, the task will take more than a century of work, but the process could
become much easier thanks to an automated system developed by a team of
Princeton University computer scientists. The new technology could change
how people do archaeology, says Princeton University dean of faculty and
computer science professor David Dobkin. Reconstructing an excavated
archaeological object is like solving a giant jigsaw puzzle, only much more
difficult because the object has been broken into thousands of tiny pieces,
many of which lack any distinctive color, pattern, or texture, and may
possess edges that have eroded over centuries. The task of reassembling
artifacts normally falls to humans, with archaeologists sifting through
fragments and using a trial and error approach to find matches. Previous
efforts to create computer systems that could be used to automate parts of
the process generally relied on expensive, unwieldy equipment that could
only be operated by a trained computer expert. Princeton's new system uses
inexpensive, off-the-shelf hardware and is designed to be used by
archaeologists and preservationists. The system uses a combination of
powerful computer algorithms and a processing system that mirrors the
procedures traditionally followed by archaeologists. In 2007, a team of
Princeton researchers went to Akrotiri, initially to observe and learn from
the conservators at the site, and eventually to test their system. They
successfully measured 150 fragments using their automated system.
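The article gives only a high-level picture of the matching problem; as a toy illustration (not Princeton's system), automated match proposal can be framed as scoring fragment pairs by how well simple edge descriptors agree, then ranking candidates for a human to verify. The descriptors and values below are invented:

```python
from itertools import combinations

# Invented edge descriptors per fragment (edge length in cm, thickness in mm).
fragments = {
    "A": {"edge_len": 4.1, "thickness": 7.2},
    "B": {"edge_len": 4.0, "thickness": 7.1},
    "C": {"edge_len": 9.5, "thickness": 4.0},
}

def mismatch(f1: dict, f2: dict) -> float:
    """Lower is better: matching edges should have similar length/thickness."""
    return (abs(f1["edge_len"] - f2["edge_len"])
            + abs(f1["thickness"] - f2["thickness"]))

ranked = sorted(combinations(fragments, 2),
                key=lambda p: mismatch(fragments[p[0]], fragments[p[1]]))
for a, b in ranked:
    score = mismatch(fragments[a], fragments[b])
    print(f"candidate match {a}-{b}: mismatch {score:.2f}")
```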
Virtual Hand Gets Under the Skin
New Scientist (08/14/08) Barras, Colin
Researchers at the University of British Columbia in Vancouver presented
realistic animations of the hand at the recent ACM SIGGRAPH computer
graphics conference in Los Angeles. The team modeled the bones, tendons,
and muscles of the hand and forearm, and created software to coordinate the
contraction and relaxation of muscles and forces transmitted by tendons to
produce hand gestures. They used a layer of skin to clothe the virtual
muscles and tendons. Motion capture technology is unable to provide such a
view of the movement of muscles and tendons underneath the skin. The new
hand animation models would make it possible for surgeons to predict the
results of hand surgery. "Using our technique, you can show what effect
rerouting a tendon would have on the hand before you actually do the
surgery," says Shinjiro Sueda, head of the research team. Sueda, Andrew
Kaufman, and Dinesh Pai are also offering a graphics software plug-in so
animators can improve the realism of their hand animations.
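The article does not detail the UBC model's equations; as a toy illustration of the basic relation any tendon-driven simulation rests on, joint torque is the moment-arm-weighted sum of tendon tensions produced by muscle activation. All names and values below are invented:

```python
# Toy tendon-driven joint (not the UBC model). Each muscle's activation
# in [0, 1] scales its maximum tension; torque = sum of arm * tension.
TENDONS = [
    # (name, moment arm in meters, max tension in newtons) -- invented
    ("flexor",    0.010, 300.0),
    ("extensor", -0.008, 200.0),
]

def joint_torque(activations: dict) -> float:
    """Net torque (N*m) at one joint for the given muscle activations."""
    return sum(arm * f_max * activations.get(name, 0.0)
               for name, arm, f_max in TENDONS)

print(f"light grip: {joint_torque({'flexor': 0.3}):+.2f} N*m")
print(f"open hand:  {joint_torque({'extensor': 0.5}):+.2f} N*m")
```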
Comprehensive Study Shows IPv6 Shift Isn't
Happening
Extreme Tech (08/18/08) Hachman, Mark
Despite predictions that the Internet's IPv4 protocol will run out of
addresses sometime in 2011, little seems to have been done to transition to
IPv6, the new protocol that will provide enough Internet addresses for the
near future, according to an Arbor Networks study. The study found that
there was only about 600 Mbps of inter-domain IPv6 traffic between
June 2007 and June of this year, or just 0.0026 percent of the amount of
overall IPv4 traffic. The study also found that IPv6 traffic peaked twice
between Nov. 4 and Dec. 25 of last year. Arbor's Scott Iekel-Johnson says
the largest of these spikes happened when 1,168 people in attendance at a
meeting of the Internet Engineering Task Force were asked to turn off IPv4
functionality on their network connections and routers and test to see
which sites could be accessed. Iekel-Johnson says the fact that just under
1,200 people caused the largest spike in IPv6 traffic over the last 12
months "speaks volumes" about the lack of adoption of the new protocol. He
called for action to spur the adoption of IPv6, such as a company issuing a
mandate to adopt the protocol. "If Comcast says to its customers, okay,
you need to go over to IPv6 because we're out of addresses and we want to
add customers, that will force the issue," Iekel-Johnson says.
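The study's two headline figures imply a striking ratio; a quick back-of-the-envelope calculation using only the numbers quoted above:

```python
# If ~600 Mbps of inter-domain IPv6 traffic is 0.0026 percent of the
# IPv4 volume, the implied IPv4 traffic across the monitored networks:
ipv6_mbps = 600
ipv6_share = 0.0026 / 100        # 0.0026 percent, as a fraction

ipv4_mbps = ipv6_mbps / ipv6_share
print(f"implied inter-domain IPv4 traffic: {ipv4_mbps / 1e6:.0f} Tbps")
# -> roughly 23 Tbps, dwarfing the IPv6 total
```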
Planning to E-Vote? Read This First
Scientific American (08/18/08) Greenemeier, Larry
With less than three months until the U.S. presidential election, many
states continue to struggle with electronic-voting technology. In an
effort to avoid the problems that plagued the 2000 presidential election,
and to meet the requirements of the 2002 Help America Vote Act, many states
and counties rushed to obtain e-voting systems, but now those machines also
are problematic. Faulty e-voting systems could allow voters and poll
workers to place multiple votes, crash the system with a virus, create fake
vote tallies, and cause miscounts through other errors, according to
studies commissioned by California and Ohio within the past year. "Nothing
we do now will affect the November election," says Stanford University
professor and Verified Voting Foundation founder David Dill. "We don't
know how to make secure paperless voting." In Ohio, problems with e-voting
technology have cost the state $112 million, including discrepancies during
the primary election when the county board of elections determined that the
Premier DRE system malfunctioned and failed to count votes from memory
cards uploaded to the system's vote tabulation computer server. Ohio
Secretary of State Jennifer Brunner commissioned Project EVEREST to study
e-voting technology throughout Ohio. The team of academics and private
researchers found exploitable weaknesses in all three e-voting vendors'
systems. EVEREST researcher and Pennsylvania State University professor
Patrick McDaniel says e-voting systems have to be completely redesigned
with security in mind, which, in the short term, means adding more thorough
vote-auditing capabilities so discrepancies can be investigated.
The New Face of R&D
Computerworld (08/11/08) Vol. 42, No. 32, P. 32; Anthes, Gary
IBM, Hewlett-Packard (HP), and Microsoft's research and development
agendas all emphasize the growing importance of collaborative partnerships
with universities, customers, and other companies. IBM, HP, and Microsoft
are embracing open innovation, the belief that useful ideas originate both
inside and outside a company and that the fruits of those ideas should be
brought to market through both internal and external channels. Henry
Chesbrough with the University of California, Berkeley's Center for Open
Innovation says IBM has refocused the bulk of its research efforts toward
services and support technologies, and the company's R&D approach uses
"collaboratories," which are regional joint ventures with universities,
foreign governments, or commercial partners that harness local skills,
funding, and sales pathways to quickly bring new technology to market. IBM
Research director John Kelly announced that IBM would make a three-year,
$100 million-plus commitment to each of four "high risk" basic research
areas: a new integrated systems and chip architecture; business integrity
management via advanced math and computer science; nanotechnology; and
cloud computing and Internet-scale data centers. Less
than a year after hiring former University of Illinois at Chicago
engineering dean Prith Banerjee as head of HP Labs, HP announced that it
would concentrate on smaller "big bet" research projects in five
areas--information explosion, dynamic cloud services, content
transformation, intelligent infrastructure, and sustainability--that
Banerjee believes are most critical to customers in the next 10 years.
Greater collaboration with other companies, universities, and venture
capitalists constitutes a sizable portion of HP's R&D strategy. Microsoft,
meanwhile, announced a major expansion of its Beijing research campus and
the opening of a new lab in Cambridge, Mass. "We are growing outward into
areas where computer science intersects with other disciplines, like AIDS
research, computational biology and the environment," says Microsoft
Research director Richard Rashid.