Herculean Device for Molecular Mysteries
New York Times (07/08/08) P. D2; Markoff, John
A team of scientists and engineers has almost finished working on Anton, a
special-purpose supercomputer that promises to improve the performance of
complex molecular simulations a thousandfold. Anton, named in honor of
microbiology pioneer Anton van Leeuwenhoek, could be used to investigate
significant computational problems such as the folding of protein molecules
or the design of drugs based on the simulated activity of specific
molecules. Anton, expected to be operational by the end of this year, is
described in the current issue of Communications of the ACM. The new
supercomputer features 512 application-specific integrated circuits working
in parallel and designed to calculate the three-dimensional characteristics
of molecules. Molecular modeling using supercomputers has been going on
for more than a decade, but the field is still developing. Simulations of
processes such as the folding of a protein into a 3D structure, the
interactions between proteins, or the interaction between a protein and a
drug molecule could lead to advancements in science and drug development.
However, following each simulation, the results must be validated by
experimental scientists in a laboratory, so improving the speed of the
simulations, which currently take thousands of hours on the fastest
supercomputers, will allow scientists to test the results sooner.
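Anton's 512 ASICs parallelize the pairwise arithmetic at the heart of such
simulations. As a rough illustration only (not Anton's actual method), a
minimal Lennard-Jones energy loop shows the O(n^2) pair interactions that
dominate molecular dynamics time steps; the `lj_energy` function and its
unit parameters are hypothetical:

```python
import math

def lj_energy(positions, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones pair energy for a list of (x, y, z) points.

    The O(n^2) double loop over particle pairs is the kind of work
    that dominates each molecular dynamics time step.
    """
    total = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            dz = positions[i][2] - positions[j][2]
            r = math.sqrt(dx * dx + dy * dy + dz * dz)
            sr6 = (sigma / r) ** 6
            total += 4.0 * epsilon * (sr6 * sr6 - sr6)
    return total
```

Because every pair must be visited on every step, even modest speedups per
pair multiply into large gains, which is what special-purpose hardware
exploits.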
Major DNS Flaw Could Disrupt the Internet
Network World (07/08/08) Messmer, Ellen
Security researcher Dan Kaminsky has discovered a fundamental flaw in the
Domain Name System (DNS) protocol that could allow an attacker to massively
disrupt the Internet, causing CERT to issue an alert and major DNS software
vendors to issue patches. Kaminsky says this is the first time such a
synchronized, multi-vendor patch release has ever been executed. Leaving
the ISP infrastructure unpatched would allow a hacker to attack an ISP
and redirect its traffic at will, Kaminsky says.
Both current and older versions of DNS may be vulnerable, although patches
may not be available for older DNS software. Kaminsky says the problem
centers on a lack of sufficient port randomization related to the
transaction ID of a query, but he is waiting to discuss the
vulnerability further until most DNS patching has been completed. Kaminsky found
the problem by accident about six months ago, and organized an
industry-wide response that culminated in 16 researchers meeting at the
Microsoft campus in late March to fix the problem. CERT Coordination
Center's Art Manion says ISPs have been informed and several government
agencies are working closely with CERT to correct the DNS flaw. Kaminsky
says the DNS patch upgrade should go smoothly, but there is the potential
that if the DNS patch is not applied correctly people could experience a
"sudden outage."
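Although Kaminsky is withholding details, the patches' general approach of
randomizing the UDP source port in addition to the 16-bit transaction ID
can be sketched with back-of-the-envelope arithmetic. The figures below are
idealized assumptions (real ephemeral port ranges are smaller than 2^16):

```python
import random

# Before the patch: a spoofed DNS reply only has to match the
# query's 16-bit transaction ID.
txid_space = 2 ** 16

# After the patch: the resolver also randomizes its UDP source port.
# Treating the port as a full 16-bit value is an idealization.
port_space = 2 ** 16

# A blind attacker must now guess both values at once.
combined_space = txid_space * port_space  # about 4.3 billion

def forged_reply_matches(txid, port):
    """Simulate one blind spoofing attempt against a patched resolver."""
    return (random.randrange(txid_space) == txid
            and random.randrange(port_space) == port)
```

The point is that port randomization multiplies, rather than adds to, the
attacker's search space.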
Computer Jobs Hit Record High
CIO Insight (07/07/08) Chabrow, Eric
U.S. information technology employment is approaching an all-time high as
nearly 4 million workers are now employed in IT-related jobs, according to
a CIO Insight analysis of U.S. Bureau of Labor Statistics data. The IT
unemployment rate increased one-tenth of a percentage point last quarter to
reach 2.3 percent, but it is still near historic lows. In fact, at 4.7
percent, overall unemployment in the United States is more than double IT's
jobless rate, writes Eric Chabrow. He says IT employment remains strong
because IT performs a critical role in business productivity and the
efficiencies IT creates are crucial for employers looking to cut costs.
Even those looking to reduce payroll are employing IT professionals because
better IT systems allow them to eliminate other positions. Furthermore,
companies cannot operate without functioning IT systems, so certain
business technology skills cannot be eliminated if a company wants to
remain competitive. Over the past four quarters, the IT workforce has
grown by 10.2 percent. Meanwhile, the number of workers employed by IT
services firms, defined by the U.S. government as computer systems design
and related services, rose by 56,100 over the past year, a 4.1 percent
increase.
Nanotubes Hold Promise for Next-Generation
Computing
Wired News (07/09/08) Tweney, Dylan
Papers published by two separate research groups detail the possibility of
carbon nanotubes finding use in electronics thanks to breakthroughs in
their creation, sorting, and organization. Until recently, nanotubes were
produced in a random fashion, and both ordering them in regular patterns
and growing exclusively semiconducting or exclusively metallic nanotubes
were extremely difficult. A paper presented at the VLSI
Symposium by Stanford University researchers offers a solution to the
disorder exhibited by nanotubes grown on silicon wafers by using
crystalline quartz, which facilitates more orderly growth, after which the
aligned nanotubes can be transplanted onto the silicon wafer. Stanford and
Samsung chemical engineers recently published a paper in Science explaining
how semiconducting and metallic carbon nanotubes can be selectively
cultivated by changing the substrate on which the nanotubes are grown.
Almost entirely semiconducting nanotubes can be produced using a substrate
of aminosilanes, while substrates of aromatic compounds can produce
metallic nanotubes. Prior to this discovery, the only option was to sort
semiconducting and metallic nanotubes after they were made using electrical
or magnetic fields, which was commercially impractical. The application of
nanotubes to electronics may come at an opportune time, since modern chip
fabrication technologies are approaching their physical limits.
Soft, Squishy Robots Can Change Shape, Size
Computerworld (07/08/08) Gaudin, Sharon
Tufts University scientists are developing soft, squishy robots that are
able to squeeze through tight spaces and then return to their original size
and shape. Tufts professor Barry Trimmer says creating soft robots could
lead to a new way of approaching robotics. He says such robots could use
biological materials such as silk proteins to make muscles and sensory
organs. They could be used for detecting land mines or for
search-and-rescue efforts in hazardous areas. Trimmer says the robots
would be like an octopus, which can radically change its shape and compress
itself down to the size of its eyeball. "We have no idea how to do that
yet, but this project is trying to understand the technology that is needed
to do that," he says. Researchers have spent a significant amount of time
studying the caterpillar, which can control its body with a relatively
simple nervous system despite its lack of bones or joints. The caterpillar
is particularly interesting because it can move itself with only two
muscles controlling each leg, thanks to the way its body responds to the
simple contraction and release of muscles. Soft robots would be controlled
by tiny, flexible computer chips. "If you look at a soft-bodied animal, in
a traditional engineering approach, you'd expect to use more computation to
control it," Trimmer says. "It should have a bigger brain, but you don't
see that."
Google Introduces a Cartoonlike Method for Talking in
Chat Rooms
New York Times (07/09/08) P. C4; Stone, Brad
Google Labs has released Lively, an online application that enables users
to create a cartoonish avatar for text-based conversations in virtual chat
rooms. The chat rooms can be added to any blog or Web site. Lively could
revolutionize how people interact on the Internet. Traditional online chat
rooms are limited to text with the occasional video or voice chat. Lively
tries to make conversations three dimensional and more interactive, as if
users were playing a game. Users can select their own avatar and create
their own chat rooms, incorporating videos from YouTube and photos from
Google's photo service Picasa as pieces of art. Up to 20 people can occupy
a single chat room, with text appearing above the avatars as cartoon-style
bubbles. Similar products include Linden Lab's Second Life, which boasts a
much larger virtual world that hundreds of thousands of users can enter at
the same time, and Vivaty, a virtual world startup that creates similar 3D
chat rooms that run on Facebook and AOL Instant Messenger. Vivaty CEO
Keith McCurdy says Google's entry into the field is validation of the
concept.
A Picowatt Processor
Technology Review (07/08/08) Greene, Kate
University of Michigan professor David Blaauw says his research team has
created a 1-millimeter-square processor that could be powered for a decade
or more by emerging thin-film batteries. Blaauw's Phoenix processor
consumes only about 30 picowatts of power in sleep mode, and just 2.8
picojoules of energy per computing cycle when active, which amounts to only
about 10 percent of the power consumed by the most energy-efficient chips
on the market, says University of California, Berkeley professor Jan
Rabaey. The processor needs only 500 millivolts to operate, one-75th
the voltage needed by PC microprocessors. Blaauw says parts of the
chip do not function well at so low a voltage, so his team reworked the
processor's memory and its internal clock so that it could run with minimal
electrical input. When combined with a battery, the Phoenix chip would be
just a cubic millimeter in volume. Phoenix's transistors were fabricated
using a 180-nanometer process, which Blaauw says makes them big enough to
have minimal electrical leakage yet sufficiently small for the researchers
to plant a large number of transistors on the chip. When the processor is
in standby mode, special "power-gating" transistors completely shut off the
power supply to the processing transistors to reduce leakage even more.
Challenges ahead for the Michigan team include adding a battery to the
chip, developing a method for offloading data from the Phoenix for further
analysis, and fully integrating the device within a biological system.
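The article's figures permit a rough energy budget that suggests why a
thin-film battery could last a decade. This sketch simply multiplies the
reported sleep power by ten years; it is a back-of-the-envelope
illustration, not the Michigan team's analysis:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

sleep_power_watts = 30e-12          # 30 picowatts in sleep mode
energy_per_cycle_joules = 2.8e-12   # 2.8 picojoules per active cycle

# Energy drained by ten years spent almost entirely asleep:
ten_year_sleep_joules = sleep_power_watts * 10 * SECONDS_PER_YEAR
# roughly 9.5 millijoules, a trivial demand for even a tiny battery
```

At that draw, the standby budget for a decade is measured in millijoules,
so battery capacity is dominated by the chip's active duty cycle rather
than by sleep.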
Making Sure the Internet Delivers
ICT Results (07/04/08)
The European Go4IT project has developed free software suites that
businesses can use to test whether their software will work with Internet
Protocol version 6 (IPv6). Internet applications are currently being
re-engineered in anticipation of the transition to IPv6, but before
companies can make IPv6 products available they must be tested for
performance, standards compatibility, and interoperability. In addition to
IP test suites, Go4IT researchers have proposed new Testing and Test
Control Notation-3 (TTCN-3) specifications for IPv6-compatible Dynamic
Host Configuration Protocol-type servers. TTCN-3, a computer language
developed for testing telecommunications software, has received the
support of major commercial companies. Project leaders say the Go4IT TTCN-3 test
suites will also be useful to a variety of other industry sectors. The
Go4IT team has established a global open source community devoted to the
development of TTCN-3 tests. The open source approach to TTCN-3
development allows small and medium businesses and academics to participate
in the development of the standard, and has accelerated the development and
acceptance of TTCN-3.
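The Go4IT suites themselves are written in TTCN-3, but the basic idea of
probing a stack's IPv6 readiness can be illustrated far more crudely with
Python's standard library. This is a toy sanity check, not one of the
project's conformance tests:

```python
import socket

def ipv6_stack_available():
    """Return True if the local stack can create an IPv6 socket."""
    if not socket.has_ipv6:
        return False
    try:
        s = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
        s.close()
        return True
    except OSError:
        return False

def is_valid_ipv6(address):
    """Check whether a string is a well-formed IPv6 address literal."""
    try:
        socket.inet_pton(socket.AF_INET6, address)
        return True
    except OSError:
        return False
```

Real conformance testing goes much further, exercising protocol behavior
against the specification rather than merely checking that sockets and
address parsing work.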
Doctor Establishes Brain-Computer Interface Lab
Daily Gleaner (CAN) (07/02/08) Dobrovnik, Frank
A brain-computer interface (BCI) lab has been founded by Algoma University
professor George Townsend, who is pursuing a patent for technology that
might one day give paralysis victims the ability to communicate, send
email, and surf the Internet using brain signals. The technology is
designed to enable computers to recognize these signals and translate them
into action. One of the biggest hindrances to the development of BCI
technology is cost: An electroencephalograph (EEG) amplifier, which
measures electrical activity generated by the brain as recorded from
electrodes positioned on the scalp, typically costs as much as $10,000. Townsend and
intern Michael Lajoie are working on technology for extremely small EEG
microchips that they hope will eventually lower the cost to "a few hundred
dollars at most," Townsend says. He adds that in addition to being
cheaper, the microchips will be more reliable. Townsend admits that BCI
technology is still in an early developmental phase, and suffers from
clumsiness. The current system requires subjects to focus on a screen of
72 letters, numbers, punctuation marks, and short commands, so that the
computer can "read" the one the subject is focusing on by registering a
change in his brainwaves.
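Spellers of this kind typically average many EEG epochs so that a small
evoked response stands out from background noise, then pick the symbol
whose flashes evoked the strongest response. The sketch below illustrates
that general idea; the function names and scoring scheme are hypothetical,
not Townsend's implementation:

```python
def average_epochs(epochs):
    """Average equal-length EEG epochs sample by sample, so a small
    evoked response survives while uncorrelated noise cancels out."""
    n = len(epochs)
    return [sum(epoch[i] for epoch in epochs) / n
            for i in range(len(epochs[0]))]

def pick_target(scores):
    """Choose the symbol whose flashes evoked the strongest averaged
    response; `scores` maps symbol -> peak averaged amplitude."""
    return max(scores, key=scores.get)
```

Averaging is the workhorse here: the evoked signal is the same on every
flash of the attended symbol, while the noise differs, so the signal grows
relative to the noise as more epochs are combined.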
Google Releases 'Protocol Buffers' Data Language
InformationWeek (07/07/08) Claburn, Thomas
Google has released its Protocol Buffers data description language as open
source software. Initially developed for internal use, Protocol Buffers is
a simpler, smaller, and faster alternative to XML. "It's the way we encode
almost any sort of structured information which needs to be passed across
the network or stored on disk," says Google's Chris DiBona. Google's
Kenton Varda says the company developed Protocol Buffers because it uses
thousands of different data formats, most of which are structured, and XML
is unable to handle such large-scale encoding. "By sticking to a simple
lists-and-records model that solves the majority of problems and resisting
the desire to chase diminishing returns, we believe we have created
something that is powerful without being bloated," Varda says. Protocol
Buffers files are said to be three to 10 times smaller than comparable XML
files and can be parsed 20 to 100 times faster. However, Google says files
such as text documents will work better with XML.
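The size advantage comes from encoding fields as compact binary rather
than as tagged text. The comparison below packs the same record both ways
to illustrate the principle; it uses Python's `struct` with a made-up
framing, not Protocol Buffers' actual varint-based wire format:

```python
import struct

# The same record as tagged XML text...
person_xml = "<person><id>1234</id><email>x@example.com</email></person>"

# ...and as a compact binary record: a 4-byte integer id, a 1-byte
# length prefix, then the raw email bytes.
email = b"x@example.com"
person_bin = struct.pack("<IB", 1234, len(email)) + email

xml_size = len(person_xml.encode("utf-8"))  # 58 bytes
bin_size = len(person_bin)                  # 18 bytes
```

Even this toy encoding is more than three times smaller, because none of
the field names travel on the wire; they live in the schema instead.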
As Web Traffic Grows, Crashes Take Bigger Toll
New York Times (07/06/08) P. 1; Stone, Brad
The Internet has become an essential part of many businesses and personal
lives, which is why when a Web site goes down it can be devastating. About
a month ago, Amazon.com's site was down for several hours over two
business days, costing the company an estimated million dollars an hour
in sales. The Internet has always been susceptible to unforeseen
problems, but fewer people used the Internet in its early days and the
importance of a company's Web site was significantly less than it is today.
Companies that provide online services need to provide around-the-clock
availability, which even some of the Internet's biggest and most successful
companies occasionally have trouble doing. The problems that cause
blackouts can vary from unintended consequences resulting from system
upgrades, to human error, to old-fashioned electrical failure. In June, an
electrical explosion in a Houston data center of a Web hosting company
caused thousands of Web businesses to be inaccessible for up to five days.
Web addicts who find themselves unable to access their favorite sites often
write angry posts on Web sites and blogs about the down site's failure to
keep promises. Former Amazon executive Jesse Robbins, who was responsible
for keeping Amazon online from 2004 to 2006, says outcries over Web site
failures are understandable. "When these sites go away, it's a sudden
loss," he says. "It's like you are standing in the middle of Macy's and
the power goes out." Robbins says Web services should be held to the same
standard of reliability as the offline service they aim to replace.
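The stakes can be put in rough numbers using the article's estimate of
about $1 million an hour for Amazon.com, together with the standard
"nines" availability arithmetic. Both helper functions below are
illustrative, not from the article:

```python
def downtime_cost(hours_down, revenue_per_hour=1_000_000):
    """Revenue exposure from an outage, using the article's rough
    $1-million-an-hour estimate for Amazon.com."""
    return hours_down * revenue_per_hour

def allowed_downtime_hours(nines, period_hours=365 * 24):
    """Yearly downtime permitted by an N-nines availability target;
    e.g., three nines (99.9 percent) allows about 8.8 hours."""
    return period_hours * 10 ** -nines
```

Each extra nine cuts the permitted downtime tenfold, which is why the
jump from "reliable" to "around-the-clock" service is so expensive to
engineer.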
Will the QC Kill the PC?
Telegraph.co.uk (07/01/08) Highfield, Roger
Quantum computing experts project that working quantum computers could be
realized in a matter of years as experiments demonstrating their practical
applications stoke enthusiasm. "I believe that in 20 years at the most,
quantum computers will be used in everyday life on people's desktops," says
Vienna University quantum physicist Anton Zeilinger. Physicists are
convinced that quantum computers could perform many calculations
concurrently by tapping the unique properties of small particles, such as
their ability to be in two places--or in two different states--at once. In
quantum computers, the atoms and subatomic particles that act as switches
can be on and off at the same time, and the quantum bit, or qubit, can be
engaged in multiple calculations. In addition to computing speed advances,
quantum computers could be used to mine the unstructured data on the
Internet for video and images rather than for keywords, and facilitate more
realistic game play in video games by adding a truly random element, to
name a few examples. The possibility that quantum computing could be used
to defeat cryptographic security measures is worrisome. On the other
hand, quantum techniques can also detect
people who try to intercept quantum-encrypted data by exploiting a
phenomenon in which the state of a quantum particle is changed by the act
of observation. Zeilinger says that communication between quantum
computers could be effected by entangled particles. MIT professor Seth
Lloyd has conducted research demonstrating that quantum computing could be
employed to maintain the privacy of personal information.
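The "two states at once" property can be seen in the smallest possible
simulation: applying a Hadamard gate to a single qubit leaves it with
equal probability of reading out 0 or 1, and applying it again restores
the original state. A minimal pure-Python sketch, purely illustrative:

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a one-qubit state [amp0, amp1]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    return [abs(amp) ** 2 for amp in state]

# Starting from |0>, one Hadamard puts the qubit in an equal
# superposition: both outcomes become 50 percent likely.
superposed = hadamard([1.0, 0.0])
```

A classical simulation like this needs 2^n amplitudes for n qubits, which
is exactly why hardware that manipulates the amplitudes directly is so
attractive.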
Most Network Data Sits Untouched
Government Computer News (07/01/08) Jackson, Joab
Most data on enterprise networks rarely gets accessed after it is stored,
largely because users are too busy writing new data to access old data,
researchers say. University of California computer science researcher
Andrew Leung presented these findings at the recent USENIX conference in
Boston. Leung says because much of an organization's data rarely gets
accessed, organizations should consider moving their data to slower but
less expensive storage units. The researchers studied the traffic flow on
NetApp's enterprise file servers, which manage more than 22 terabytes of
material. Leung says the study is the first large-scale examination of
network traffic patterns. "How people have been deploying network file
systems has been changing over the past five to 10 years," Leung says.
During the three-month observation period, more than 90 percent of the
material on the servers was never accessed. Additionally, among the files
that were opened, 65 percent were only opened once, and most of the other
files accessed were opened five times or fewer. However, about a dozen
files were opened 100,000 times or more. The researchers also noticed that
the ratio of data being read from storage versus the amount of data written
to storage had changed from what had been seen in previous studies. The
ratio of bytes written to bytes read was about 2-to-1, whereas previous
studies had found read-to-write ratios of 4-to-1 or higher. "The workloads are
becoming more write-oriented, so the decrease in read-only traffic and the
increase in write traffic suggests that file systems want to be more
write-oriented," Leung says.
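The study's headline numbers, the fraction of files never touched and the
write-to-read byte ratio, are straightforward to compute from an access
trace. A toy sketch, with a made-up trace format of (file, operation,
bytes) tuples, not the researchers' actual tooling:

```python
from collections import Counter

def access_stats(trace, total_files):
    """Summarize a trace of (file, op, nbytes) records: the fraction
    of files never opened, and the write-to-read byte ratio."""
    touched = Counter(rec[0] for rec in trace)
    read_bytes = sum(n for _, op, n in trace if op == "read")
    write_bytes = sum(n for _, op, n in trace if op == "write")
    untouched_fraction = 1 - len(touched) / total_files
    ratio = write_bytes / read_bytes if read_bytes else None
    return untouched_fraction, ratio
```

A high untouched fraction is the statistic that argues for tiering cold
data onto slower, cheaper storage, while the write-heavy byte ratio argues
for write-optimized file system designs.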
Electronic Path to Bridge Safety
The Australian (Australia) (07/01/08) Foreshew, Jennifer
Australian researchers are developing bridge management software that uses
an artificial neural network to predict the safety of 10,000 bridges in
Queensland, Australia. Griffith University's Michael Blumenstein says the
artificial intelligence (AI) technologies used in the system will improve
maintenance strategies and minimize costs. "We are trying to incorporate
our technology into a bridge management system for the purpose of assisting
predictions of bridge deterioration, inspection, and maintenance,"
Blumenstein says. He says the technology could save between 10 and 20
percent on bridge maintenance costs. The software uses the artificial
neural network to learn from the historical performance of a bridge and
predict future problems. The system can reconstruct the past by generating
missing historical performance data on older bridges. "We are taking data
that is available from inspection records and we are using that to
back-predict and fill in the gaps to produce a larger historical record so
we can predict future deterioration issues for bridges," Blumenstein says.
The AI technologies are used in conjunction with routine inspections to
generate data that fills in the large missing gaps between inspections.
The goal is to be able to input variables such as bridge location,
construction and material type, weather conditions, and traffic volume and
predict the structural condition of the bridge. The researchers have
tested the system using data from Maryland's Department of Transportation
in the United States.
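The gap-filling idea can be illustrated with the simplest possible
stand-in: linear interpolation between known inspection ratings. The real
system uses an artificial neural network for this reconstruction; the
`fill_gaps` helper below is purely illustrative:

```python
def fill_gaps(ratings):
    """Fill None entries between known condition ratings by linear
    interpolation, producing a continuous historical record that a
    deterioration model can then be trained on."""
    filled = list(ratings)
    known = [i for i, r in enumerate(filled) if r is not None]
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            filled[i] = filled[a] + t * (filled[b] - filled[a])
    return filled
```

A neural network earns its keep over interpolation by also weighing
variables such as material type, weather, and traffic volume, which is
what lets the system back-predict rather than merely smooth.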
The SocialDNS Project ... and Why DNS Is Not the Phone
Book of the Internet
CircleID (06/30/08) Lopez, Pedro Garcia
The SocialDNS Project is a unique naming and directory system for the Web
that is designed to become an open standard with scores of top-level
domains (TLDs) and to complement the existing Domain Name System to provide
advanced services that the current infrastructure for Web settings cannot
support, writes computer engineering professor Pedro Garcia Lopez at
Spain's Rovira i Virgili University. The system's major benefits include
an unlimited number of TLDs, since there are no restrictions on assigning
names and TLDs. SocialDNS also offers a directory service and cloud search engine, and the
support of additional metadata fields besides description, tags, and
geotags. SocialDNS' services and information are based on open systems,
and thus are public and free, says Garcia Lopez. "Third-party search
engines and tools can parse and index this information without cost or
restrictions," he notes. The WebTLD server software is available as open
source on SourceForge, which means that everyone can study the system's
workings. Another advantage of SocialDNS is its ability to facilitate the
development of meta-services around domain names. Challenges to the proper
development of SocialDNS that Garcia Lopez identifies include the threat of
domain name speculation, which must be countered by a combination of simple
naming policies and conflict resolution schemes; the selection and
rewarding of moderators; system scalability for the root TLD; and clashes
with ICANN and DNS registrars, which will be avoided through SocialDNS'
maintenance of ownership in its namespace for TLDs assigned by ICANN.
Identity Problems
National Journal (07/05/08) Vol. 40, No. 27, P. 22; Carney, Eliza Newlin
Outfitting virtually all U.S. citizens with fraud-resistant IDs has proven
to be a major challenge from a practical as well as emotional point of
view, with a multitude of technical, legislative, administrative, and
ethical obstacles impeding progress. Events and trends fueling the drive
for more reliable IDs include the 9/11 terrorist attacks, the push to deter
illegal immigration, credit-driven commerce, the threat of identity theft,
and technological innovations in identity verification methods. A sore
point among various parties is the Real ID Act, which has come under fire
for being passed without public debate or hearings, and for receiving
inadequate federal funding. Real ID sets up federal standards for issuing
driver's licenses, and dictates that states must link their databases in
order to enforce the law's prohibition on drivers holding licenses from
multiple states, which critics warn would create an irresistible target for
hackers and ID thieves. Some experts believe a national, biometric ID card
is the solution. "Right now, we are proceeding in hundreds of different
ways, for dozens of different IDs, at tremendous expense," says Robert
Pastor, co-director of American University's Center for Democracy and
Election Management. Privacy experts favor a scheme in which Americans
carry multiple smart cards with different applications, arguing that a
single ID would reduce Americans' security. "Uniformity in IDs across the
country would create economies of scale" for snoops and could help bring
about a surveillance society, warns the Cato Institute's Jim Harper.
Incubating Next-Gen.Edu
Campus Technology (06/08) Vol. 21, No. 10, P. 26; Schaffhauser, Dian
Incubator classrooms can serve as testbeds for new technologies and
teaching and learning concepts to see which ones work best prior to a
broader campus implementation, and several universities are engaged in such
initiatives. The University of California-Riverside's project includes
"flex classrooms" equipped with wraparound whiteboards, movable tables and
chairs, and dual-projection systems with multiple control points, as well
as the Hyperconstruction Studio, a space for testing new technologies
before rolling them out in classrooms. Among the studio's features are
slate and maple tables to allow flexible groupings, carpeting arranged in
contrasting squares to aid in the creation of groupings, a sophisticated
video wall and instructor workstation, and multiple displays, all of which
are arranged to create an atmosphere "where everyone is engaged with
everyone else," says UC-Riverside's Leo Schouest. The creation and
presentation of new ideas--and the technology controls--are also physically
separated, and both students and instructors are empowered to present
information and make annotations on any or all of the displays in the room,
from any of the room's computer terminals. Meanwhile, Santa Clara
University has set up its new Learning Commons, Technology Center, and
Library, which features incubator rooms configured for power and
networking, and to support a wide array of potential applications. The
rooms include ceiling-mounted projectors, electric motorized screens,
audio-video hookups, and video recording facilities, while other spaces in
the building boast flexible configuration. SCU CIO Ron Danielson hopes
that these experimental spaces will encourage faculty to rethink the kinds
of classroom spaces they want and need.