I.B.M. Researchers Advancing Computer Processing
Ability
New York Times (08/31/07) P. C5; Markoff, John
IBM researchers yesterday announced two nanotechnology breakthroughs that
could lead to the development of much smaller computers and other
electronics. Two IBM papers appearing in the journal Science focus on a
new understanding of the behavior of magnetism at the atomic level. One
paper describes a technique for reading and writing digital 1s and 0s on a
small group of atoms, and even on single atoms. The other paper describes
the ability to use a single molecule as a switch to replicate the behavior
of transistors. The researchers used a scanning tunneling microscope to
observe the magnetic orientation of iron and manganese atoms at low
temperatures. The atomic scale magnetic structures could potentially be
used for data storage and could possibly be harnessed for quantum
computing. Another group of IBM scientists in Zurich was able to place
two hydrogen atoms in an ultra-thin insulating film and cause the atoms to
alternate between two states, the equivalent of 1s and 0s used on standard
chips. The same process also allowed the researchers to inject an electric
charge into one of the molecules and link the effect to a neighboring
molecule, suggesting that it could be possible to extend the effect into a
fabric of trillions of atom-sized switches. The advances are far from any
commercial applications but could be an important step toward quantum
computing.
Student, Prof Build Budget Supercomputer
Calvin College (08/30/07) Graff, Allison
Former Calvin College student Tim Brom, who graduated this year, and
Calvin College computer science professor Joel Adams have built Microwulf,
a Beowulf-based computer cluster that may be the smallest and least
expensive supercomputer in the world. Microwulf was built for just $2,470,
giving it a price/performance ratio of less than $100 per Gflop. Microwulf
is small enough to fit next to a desk and is more than twice as fast as
Deep Blue, the IBM supercomputer that beat world chess champion Garry
Kasparov in 1997. Microwulf uses four dual-core motherboards connected by
an 8-port Gigabit Ethernet switch and delivers 26.25 gigaflops.
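As a quick check of the quoted price/performance ratio, using the two figures from the article:

```python
# Price/performance check for Microwulf, using the figures quoted above.
cost_dollars = 2470.0
performance_gflops = 26.25

dollars_per_gflop = cost_dollars / performance_gflops
print(f"${dollars_per_gflop:.2f} per Gflop")  # about $94, under the $100 mark
```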
Although the National Weather Service and similar organizations use
supercomputers more than 100 times faster than Microwulf, the budget
supercomputer can be used to solve problems that are too complicated for
ordinary desktop computers. In addition to considering the
price/performance ratio, Adams designed Microwulf with power consumption in
mind as well. "This is becoming increasingly important, as excess power
consumption is inefficient and generates waste heat, which can in turn
decrease reliability," Adams says. Instead of using Microwulf in a
supercomputing lab, Adams is going to take Microwulf to middle school and
high school classrooms to try to get teenagers interested in computer
science. SC07, the international conference on high performance computing,
networking, storage, and analysis, sponsored by ACM and IEEE, will be held
November 10-16, 2007, in Reno, NV. More information is available at
http://sc07.supercomputing.org/
Special Military Group Looks Ahead to Fight America's
Future Wars
San Francisco Chronicle (08/26/07) P. E1; Abate, Tom
The Defense Advanced Research Projects Agency (DARPA) is looking to
cutting-edge technology produced in Silicon Valley to fight future battles,
and futurist Paul Saffo noted at DARPA's recent 50th anniversary conference
that "almost every great digital oak has a DARPA acorn at the bottom."
Marine Lt. Gen. James Amos said the U.S. military will have to contend with
a new era of guerrilla warfare in which an "arc of instability" encircles
the globe equatorially. DARPA is hoping to develop weapons that would
enable high-altitude patrolling of such regions by the United States. For
example, DARPA leader Thomas Bussing envisions "an aircraft carrier in the
sky" that can neutralize threats through countermeasures launched from
anywhere in the continental United States that keep civilian casualties to
a minimum. Amos expects missions by ground forces to consist of squads
patrolling populous towns where distinction between friends and enemies is
close to nonexistent, aided by situational awareness delivered via aerial
platforms. Retired political scientist and author Chalmers Johnson is
critical of DARPA's ambitious high-tech warfare visions, arguing that it is
making the country less secure and driving it toward bankruptcy. "We spend
billions of dollars to develop and procure innovative solutions ... but at
the end of the day, it's still not possible for us to completely defeat
these very basic technologies and approaches our adversaries are choosing,"
noted Deputy Secretary of Defense Gordon England at the DARPA conference.
"And of course there's a huge cost disadvantage, probably a million to one
between our outlays and what an IED builder spends on readily available
parts." DARPA and Pentagon officials said at the conference that the
United States will need to spend $1 million for every dollar spent by enemy
guerrillas. "We are like a lion up against bees that are very effective
whenever they swarm," said DARPA's Daniel Newman.
Creating a Computer Currency
Harvard University Gazette (08/29/07) Powell, Alvin
Scientists at Harvard's School of Engineering and Applied Sciences (SEAS)
helped develop the newest version of Tribler, a peer-to-peer video sharing
program that allows researchers to explore next-generation electronic
commerce and the possibility of using bandwidth as a global currency.
Originally developed by scientists at Delft University of Technology and
Vrije Universiteit in Amsterdam, Tribler allows users to create a
peer-to-peer video sharing network. The peer-to-peer software uses the
resources of members' machines to help the network run more smoothly. For
example, when a member places a call using a peer-to-peer telephone system,
that member's computer is used for more than just that call and may be used
to help the network function more effectively overall, or it may be used to
help route another call that may not be able to make a direct connection
because of firewalls. Essentially, members of peer-to-peer networks
participate in transactions, where members exchange each other's network
resources in an economic system similar to bartering, replacing money with
bandwidth, says SEAS professor David Parkes. Parkes says such systems
ensure that every bit of a network's resources is used. Parkes and Tribler
technical director Johan Pouwelse have been working to expand
peer-to-peer's developing economy. Establishing an accounting system that
tallies the amount of the network's resources a member used and contributed
would allow the resources to be saved and spent, creating a new form of
currency. Tribler uses a video sharing network because of the high demand
exchanging video files places on the network.
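The accounting idea described above can be sketched in a few lines; the class, names, and numbers below are invented for illustration, not taken from Tribler:

```python
class BandwidthLedger:
    """Toy accounting of bandwidth contributed to and consumed from a
    peer-to-peer network. Balances act as a spendable 'currency'."""
    def __init__(self):
        self.balances = {}

    def record_transfer(self, uploader, downloader, megabytes):
        # The uploader earns credit; the downloader spends it.
        self.balances[uploader] = self.balances.get(uploader, 0) + megabytes
        self.balances[downloader] = self.balances.get(downloader, 0) - megabytes

ledger = BandwidthLedger()
ledger.record_transfer("alice", "bob", 50)
ledger.record_transfer("bob", "carol", 20)
print(ledger.balances)  # alice has earned 50, bob is at -30, carol at -20
```

A positive balance represents resources a member could later "spend," which is the sense in which bandwidth becomes a currency.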
Saving Power in Handhelds
Technology Review (08/30/07) Hardesty, Larry
Extending battery life in handheld devices is becoming increasingly
important, particularly as handheld devices are being used more frequently
to play videos, an activity that consumes power much faster than playing
MP3s. In the most recent issue of ACM's Transactions on Embedded Computing
Systems, University of Maryland researchers outlined a technique that in
simulations cut power consumption by about 66 percent. Gang Qu, one of the
developers, says the premise of the technique is that users can tolerate
some execution failure in multimedia applications without noticing the
difference. A fair amount of digital video plays at a rate of 30 frames
per second, whereas older movies in theaters played at 24 frames per
second. "That's about 80 percent," Qu says. "If you can get 80 percent of
the frames consistently correct, human beings will not be able to tell
you've made mistakes." Digital video decoding time can vary from frame to
frame, so digital media systems are designed to rapidly decode even the
most difficult frames so they can be displayed on time and without delay.
Qu and his colleagues wrote an algorithm that establishes a series of time
limits on the decoding process. If the time limits are exceeded, the
decoding of that frame is aborted and the system starts on the next frame.
Using statistics on the length of specific tasks, the researchers can
adjust the algorithm to guarantee a certain frame completion rate. Qu
notes the simulations used signals similar but not completely identical to
video signals, and real video decoding might not produce such dramatic
results.
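The article does not give the algorithm's details; the following is a minimal sketch of the general idea (deadline-based decoding that aborts slow frames), with all names and the simulated decode times invented for illustration:

```python
import random

def play(frames, deadline, decode_time):
    """Decode each frame, abandoning any frame whose (simulated)
    decode time exceeds the per-frame deadline."""
    completed = 0
    for frame in frames:
        if decode_time(frame) <= deadline:
            completed += 1          # frame decoded in time and displayed
        # else: abort this frame and move straight on to the next one
    return completed / len(frames)  # achieved frame completion rate

random.seed(0)
frames = range(300)
# Hypothetical per-frame decode times: most frames are cheap, a few are hard.
times = {f: random.gauss(20.0, 8.0) for f in frames}
# A deadline one standard deviation above the mean completes roughly 84
# percent of frames, comfortably above Qu's 80 percent threshold.
rate = play(frames, deadline=28.0, decode_time=times.get)
print(f"completion rate: {rate:.0%}")
```

Tuning the deadline against the decode-time statistics is what lets the real technique guarantee a target completion rate while saving the power that would otherwise be spent finishing every hard frame.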
Controlling Bandwidth in the Clouds
Jacobs School of Engineering (UCSD) (08/28/07) Kane, Daniel B.; Raghavan,
Barath; Snoeren, Alex C.
University of California, San Diego, computer scientists have developed a
new bandwidth management system for cloud-based applications that will
allow mirrored sites with little activity to transfer bandwidth to sites
that are receiving heavy traffic. The algorithm allows distributed rate
limiters to collaborate and enforce global bandwidth rate limits, and to
dynamically shift bandwidth allocations across multiple sites or networks,
according to current network demand. The "flow proportional share"
algorithm developed enables coordinated policing of a cloud-based service's
network traffic. The Transmission Control Protocol (TCP) design is
scalable to hundreds of nodes, runs with very little overhead, and can
withstand both loss and communication delay. "With our system, an
organization with mirrored Web sites or other services across the globe,
could dynamically shift its bandwidth allocations between sites based on
demand," says Barath Raghavan, a Ph.D. candidate and lead author on a new
paper describing the work. "You can't do that now, and this lack of
control is a significant drawback to today's cloud-based computing
approaches." The paper, "Cloud Control with Distributed Rate Limiting,"
won the 2007 SIGCOMM best student paper award and was presented at ACM
SIGCOMM in Kyoto, Japan. "Our primary insight is that we can use TCP
itself to estimate bandwidth demand," says Alex Snoeren, senior author on
the paper. "Relying on TCP, we can provide the fairness that you would see
with one central rate limiter."
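The published flow-proportional-share mechanism is more involved, but its core allocation step can be sketched as follows: each limiter reports its local demand, and the global limit is split proportionally. This is a simplified sketch of the idea, not the algorithm from the paper, and all names and numbers are invented:

```python
def allocate(global_limit_mbps, local_demands):
    """Split one global rate limit across distributed limiters in
    proportion to each site's observed demand."""
    total = sum(local_demands.values())
    if total == 0:
        # No demand anywhere: fall back to an even split.
        share = global_limit_mbps / len(local_demands)
        return {site: share for site in local_demands}
    return {site: global_limit_mbps * d / total
            for site, d in local_demands.items()}

# A busy mirror automatically receives most of the shared budget.
print(allocate(100, {"tokyo": 90, "london": 10, "sydney": 0}))
```

As demand shifts between mirrors, rerunning the allocation moves bandwidth with it, which is the "dynamic shifting" Raghavan describes.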
Freedom Key to Web Evolution, Says Guru
Financial Times (08/31/07) P. 11; Edgecliffe-Johnson, Andrew
Vint Cerf, Google's vice president and chief internet evangelist, says any
threat to open access to the Internet would be "a hazard to innovation,"
and that the Internet's ability to handle a continually expanding number of
users and amount of content on the network is less important than security,
stability, reliability, and privacy. "The most important thing is to make
sure we have a secure and stable network," Cerf says. "There are ways to
attack the system which we need to defend against." As for "net
neutrality," Cerf hopes that the Internet will remain open and that
broadband providers will not discriminate between content providers or move to
block applications that use large amounts of bandwidth. "If we ever move
into a regime where the providers of basic Internet services have control
over what users or entrepreneurs can put on the network then I see a
potential hazard to innovation," Cerf says. Cerf urges regulators around
the world to recognize the importance of an open, generally neutral
network, warning that if the Internet ever falls under the control of
"monopoly broadband providers," the investments in data centers and other
infrastructure necessary to expand its reach could not be made.
Cerf also believes that more consumers will be willing to pay for online
content as broadband expands. "I do think that as time goes on, the
consumer will understand the value of the content and be willing to pay,"
Cerf says. Vint Cerf is a co-winner, with Bob Kahn, of the 2004 ACM A.M.
Turing Award. For more information, click on
http://awards.acm.org/citation.cfm?id=8047952&srt=alpha&alpha=C&aw=140&ao=AMTURING
'Touching' Research at Queen's
Queen's University Belfast (08/30/07) Mitchell, Lisa
Queen's University Belfast researchers are studying haptic technology that
could add a sense of touch to virtual worlds, a project that may eventually
lead to technology that allows online shoppers to feel products, online
gamers to feel the force of an impact, or blind and visually impaired
people to access the Internet in ways that are currently impossible.
Queen's University professor Alan Marshall and his colleagues in the School
of Electronics, Electrical Engineering and Computer Science will spend the
next three years developing new network architectures that would allow
online networks to carry haptic information. Haptic technology allows
users to "touch" virtual objects by applying forces to the user, normally
vibrations or motions. Currently, almost all haptic devices are only
capable of being connected to a standalone system. Marshall wants to
develop networks that increase the user's immersion in a virtual world by
allowing them to see, hear, and touch the environment around them, with the
ability to share those sensations with users in other locations. "If we
are to enter the 'second age' of the Internet, then it must be able to
support multimodal communication, including additional senses," Marshall
says.
Molecules Line Up to Make the Tiniest of Wires
ExpressNews (University of Alberta) (08/27/07) Poon, Ileiren
Researchers at Canada's National Institute for Nanotechnology (NINT) have
developed a new technique for producing minuscule components that would be
used to wire the tiniest computer chips. Jillian Buriak, a chemistry
professor at the University of Alberta who is a senior research officer at
NINT, says her team has relied on molecules to do most of the work involved
in manufacturing conductive nano-wires on silicon chips. "We've figured
out a way to use molecules that will self-assemble to form the lines that
can be used as wires," Buriak says. "Then we use those molecules as
templates and fill them up with metal, and then we have the wires that we
want." The team used the method to produce 25 parallel platinum
nano-wires, each measuring 10 nanometers in width and 50,000 nanometers in
length--about as long as a human hair is wide. The researchers say the
self-assembly strategy could be used to produce wires that are 5,000 times
longer than their width and would connect to the smallest electronic
components. The approach could make electronics faster, cheaper, and
improve their storage capacity.
Digital Detectives Discern Photoshop Fakery
Christian Science Monitor (08/29/07) P. 13; Gaylord, Chris
Image-manipulation software has become increasingly easy to use and
exponentially more difficult to detect, but Hany Farid, a computer science
professor at Dartmouth and head of the college's Image Science Group, has
developed computer algorithms that can test photos to see if they are fakes
by finding the tiny hidden flaws. "There's no way to push a button and
tell if it's real, but there are tests we can run that allow us to be
pretty sure if it's a fake," Farid says. Some of the techniques teach a
computer to identify subtle imperfections that untrained humans have
difficulty spotting, such as inconsistencies in the physics and geometry of
the image. For example, the vanishing points may not match, or the shadows
cast from two or more objects may contradict each other. While some of the
tests seem simple, others are quite complicated. One of the tests checks
the reflection of light in people's eyes to triangulate the location of
the camera flash that took the picture. If the analysis shows that the camera
was in multiple places, the photo is a fake. While a significant amount of
image manipulation is done by tabloid media, fake photos are problematic
for the legal system, and this is where Farid's software will be put to
good use. Farid has already testified in more than two dozen court cases
as to whether photographs were altered. He says that so far most
accusations of fraud turn out to be unfounded.
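The flash-triangulation test can be caricatured in a few lines: if the flash positions estimated from different reflections disagree, the image is suspect. Everything below (coordinates, tolerance, function name) is invented for illustration and is not Farid's method:

```python
import math

def consistent_flash_positions(estimates, tolerance=0.5):
    """Given camera-flash positions triangulated from different people's
    eye reflections (hypothetical (x, y, z) estimates in meters), report
    whether they agree to within a tolerance. Widely scattered estimates
    suggest the faces came from different photographs."""
    centroid = [sum(p[i] for p in estimates) / len(estimates) for i in range(3)]
    spread = max(math.dist(p, centroid) for p in estimates)
    return spread <= tolerance

# Two eyes pointing to nearly the same flash location: plausible original.
print(consistent_flash_positions([(0.0, 1.5, 3.0), (0.1, 1.5, 3.1)]))  # True
# Estimates meters apart: evidence of a composite.
print(consistent_flash_positions([(0.0, 1.5, 3.0), (4.0, 1.5, 3.0)]))  # False
```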
UT Arlington Computer Science Researchers Awarded
$450,000 NSF Grant
University of Texas at Arlington (08/23/07)
Researchers at the University of Texas at Arlington will use a $450,000
grant from the National Science Foundation to study the use of asynchronous
communication architecture for wireless sensor networks. Assistant
professor Yonghe Liu and colleagues Sajal K. Das and Mohan Kumar believe
asynchronous communication architecture will make wireless sensor networks
much more energy efficient and lead to a longer lifetime for networks.
Wireless sensor networks have low data rates, yet their underlying
communications techniques, especially at the physical and link layers, are
rooted in Internet protocols and their wireless extensions, which makes for
operation that is neither energy efficient nor ad-hoc friendly. With
asynchronous communication architecture, a sensor node directly writes data
into a special, reactive module (RFID-tag based) that sits on the receiving
node while the main platform sleeps. Individual sensors will be able to
schedule their transmission, and no network-wide or local synchronization
will be needed.
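The write-while-asleep pattern can be sketched in software; all names here are illustrative, and the real module is RFID-tag-based hardware rather than a Python object:

```python
class ReactiveMailbox:
    """Sketch of the asynchronous idea above: a sender writes into a
    low-power receive module while the receiving node's main platform
    sleeps; the platform reads the mailbox whenever it next wakes."""
    def __init__(self):
        self.buffer = []

    def write(self, reading):
        # Happens without waking the receiver's main processor.
        self.buffer.append(reading)

    def drain_on_wake(self):
        # Called by the main platform after it wakes on its own schedule.
        readings, self.buffer = self.buffer, []
        return readings

node_b_mailbox = ReactiveMailbox()
node_b_mailbox.write({"sensor": "a1", "temp_c": 21.4})  # B's CPU stays asleep
node_b_mailbox.write({"sensor": "a2", "temp_c": 22.0})
print(node_b_mailbox.drain_on_wake())  # both readings, retrieved after wake-up
```

Because the sender never has to wait for the receiver to be awake, no network-wide or local synchronization is needed, which is where the energy savings come from.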
NSF Awards SDSU With Hefty Research Grant
Daily Aztec (San Diego State University) (08/27/07) Obert, Tamara
San Diego State University has received a $250,000 grant from the National
Science Foundation that will fund research into the development of new
wireless radio networks. As the number of wireless devices continues to
mount, space in the radio spectrum dwindles, according to a description of
the project in the proposal. The grant also will enable San Diego State to
purchase new equipment for its Wireless Multimedia Communications and
Networks Laboratory that will be used to test algorithms related to
wireless radio networks. Sunil Kumar, Santosh Nagaraj, and Mahasweta
Sarkar, professors in the Electrical and Computer Engineering Department,
will head the research project. "We already have testbeds for wireless
sensor networks ZigBee and wireless LAN in this laboratory," says Kumar.
Computer science students also will participate in the project, which is
expected to last three to five years.
Ad-Hoc Network Probes Links for Smoother Calls
New Scientist (08/29/07) Reilly, Michael
Researchers at the University of Alberta in Canada have simulated mobile
ad-hoc networks that are able to connect cell phone users more smoothly
between networks for calls. The computer model is designed to probe
potential connections for their viability, and seek out a stronger path if
the connection is too severely weakened. The relay of data over an ad-hoc
network can be disrupted as it reconfigures itself, but the transmission of
voice calls is often more of a problem than other data, such as email or
Web browsing. The Voice over Internet Protocol (VoIP) approach demands a
reliable network, but audio signals weaken as the user moves around and
the distances between nodes vary. Hai
Jiang and his team of researchers are optimistic that their approach can
help extend wireless coverage to the point where an entire city is
blanketed with compatible wireless networks and communication devices share
signals. "Within a city, with enough nodes, this could provide a secondary
network for people to make calls," says Rajit Gadh of the University of
California, Los Angeles, who believes it also could pave the way for free
mobile telephony.
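The probing idea (judge each candidate route by its weakest link, and seek a new route when even the best falls below a viability threshold) can be sketched as follows. The link strengths, threshold, and function names are invented; this is not the published protocol:

```python
def path_quality(path, link_strength):
    """A multi-hop path is only as good as its weakest hop."""
    return min(link_strength[(a, b)] for a, b in zip(path, path[1:]))

def best_path(paths, link_strength, threshold):
    """Probe candidate paths and pick the strongest; if even the best
    falls below the threshold, return None so a new route is sought."""
    best = max(paths, key=lambda p: path_quality(p, link_strength))
    return best if path_quality(best, link_strength) >= threshold else None

links = {("A", "B"): 0.9, ("B", "C"): 0.4, ("A", "D"): 0.8, ("D", "C"): 0.7}
print(best_path([["A", "B", "C"], ["A", "D", "C"]], links, threshold=0.5))
# ["A", "D", "C"]: its weakest link (0.7) beats the 0.4 hop on the other path
```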
Statistics Professor Says Databases Must Balance Privacy,
Utility
Carnegie Mellon News (08/30/07) Potts, Jonathan
Carnegie Mellon University statistics professor George Duncan says that
organizations with large databases such as the U.S. Census Bureau, which
collects tremendous amounts of personal information, need to find ways to
protect individuals' privacy while making the data available to
researchers. Duncan believes that traditional methods of "de-identifying"
records such as removing Social Security numbers and birth dates do not
adequately protect sensitive information because if someone knows enough
about the data they could use other characteristics to identify
individuals. Unfortunately, the information that can be used to
re-identify records is often the information that is most useful to the
researchers. "The question is, 'How can data be made useful for research
purposes without compromising the confidentiality of those who provided the
data?'" Duncan asks. Possible solutions include establishing
administrative procedures that restrict data access to approved personnel,
implementing restrictions on the use of information, and developing
statistical methods that de-identify records so that users cannot readily
reconstruct personal identities but researchers can still view the required
information. "Achieving 'adequate' privacy will require engineering
innovation, managerial commitment, information cooperation of data
subjects, and social controls," Duncan wrote in a commentary published in
the journal Science. ACM's Public Policy Committee (USACM) provided
testimony on protecting Social Security numbers at a recent Congressional
hearing. For more information, go to
http://www.acm.org/public-policy/public-policy-1?pageIndex=1
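Duncan's concern about re-identification can be made concrete: even with names and Social Security numbers removed, a rare combination of the remaining attributes singles a record out. A minimal sketch, with invented records and quasi-identifier columns chosen only for illustration:

```python
from collections import Counter

def unique_records(records, quasi_identifiers):
    """Return records whose quasi-identifier combination is unique in the
    dataset; even de-identified, these can often be re-identified by
    linking the rare combination to outside data sources."""
    key = lambda r: tuple(r[q] for q in quasi_identifiers)
    counts = Counter(key(r) for r in records)
    return [r for r in records if counts[key(r)] == 1]

people = [
    {"zip": "15213", "birth_year": 1971, "sex": "F"},
    {"zip": "15213", "birth_year": 1971, "sex": "F"},  # shares a combination
    {"zip": "15217", "birth_year": 1940, "sex": "M"},  # unique: high risk
]
print(unique_records(people, ["zip", "birth_year", "sex"]))
```

The statistical methods Duncan mentions work by coarsening or perturbing exactly these columns until no combination is rare, while trying to keep them useful to researchers.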
Louisiana Tech Researchers Work on Cyber-Attack
Defense
Associated Press (08/26/07)
Louisiana Tech University's new Center for Secure Cyberspace (CSC) is
developing new technologies for use by the military and the private sector
to protect electronic networks and wireless communications. CSC director
Vir Phoha says the center has eight computer science researchers, four from
Louisiana Tech and four from Louisiana State University. Tech vice
president of research and development Les Guice says Air Force researchers
at Barksdale Air Force Base will also contribute to the research efforts.
Recently, the Air Force started setting up a cyberspace command at
Barksdale, which could lead to a variety of cyberspace-related research
projects. The CSC has been operating since June, and Phoha says that
previous research by CSC computer scientists has led to published cyber
protection research on topics including advanced grid computing, how to
find malicious code online, and how to detect clogged computers before
access is denied. Phoha says a major area of research will involve sensor
networks that could aid the military on the battlefield, including more
advanced computer grids that could detect a terrorist suspect in Iraq, for
example. The research is made possible by the Louisiana Optical Network
Initiative, a fiber-optics network that connects supercomputers at the
state's major research universities.
Green-Card Red Tape Sends Valuable Engineers
Packing
EE Times (08/27/07)No. 1490, P. 1; Riley, Sheila
Foreign engineers and other tech professionals are becoming fed up with
the bureaucratic hassles they must endure to secure employment-based green
cards that allow them to remain in the United States, and are returning to
their home countries. The result is "a massive reverse brain drain" with
dire economic repercussions, according to Harvard Law School's Vivek
Wadhwa, who led a study on the phenomenon. The report estimated that
500,000 foreign nationals living in the United States were awaiting green
cards by the end of fiscal 2006, while more than a quarter of international
patent applications filed from the United States last year listed foreign
nationals as inventors or co-inventors. Wadhwa explained that skilled
foreigners are brought into the country on temporary visas by U.S.
companies, which train them in American business strategy and then send
them back home. "How can this country be so dumb as to bring people in on
temporary visas, train them in our way of doing business and then send them
back to compete with us?" he asked. Indian programmer Praveen Arumbakkan,
who is going home after a prolonged period with no progress on his green
card application, said Indian nationals frequently place too much trust in
employers and are largely unaware of the resources that exist to help them
understand their immigration choices. Boston attorney Russell Swapp said
the debate on illegal immigration is being used by politicians to hold up
the passage of comprehensive legislation, thus drawing focus away from the
issue of legal immigration to the detriment of the economy. Many engineers
have complained that American tech employers are exploiting the work visa
system to import foreigners who are willing to work for less money than
their American counterparts.
Horizon Awards 2007: Bitty Bytes
Computerworld (08/20/07) Vol. 41, No. 34, P. 44; Anthes, Gary
Hewlett-Packard has developed the Memory Spot, a tiny wireless device that
can be attached to almost anything and acts like a super strong and highly
effective radio frequency identification tag. The Memory Spot ranges in
size from 2mm square to 4mm square and can store half a megabyte of
information. It uses a built-in antenna to read and write at 10Mbit per
second, and includes a digital microprocessor and analog circuits for
receiving RF signals. Instead of using a battery, the Memory Spot receives
power through a process called inductive coupling, where electronic devices
transfer power through a shared electromagnetic field. HP Labs associate
director Howard Taub says applications for the Memory Spot could include
animated postcards, ultra-secure passports and identity cards, and medical
records that could be stored in a patient's wristband. Hewlett-Packard is
working with the Near Field Communication Forum to see if Memory Spot
readers could be put in cell phones to receive information and interactive
media, such as a movie preview stored in a movie poster with a Memory Spot.
"It can bring intelligence to inanimate objects," says analyst Tim
Bajarin. "It could be used in all kinds of things, not just RFID-type
applications like inventory."
Squashing Worms
Science News (08/25/07) Vol. 172, No. 8, Rehmeyer, Julie J.
The treatment typically prescribed for computer worms is for system
administrators to patch first the systems most likely to limit an outbreak,
since they usually cannot fix all vulnerabilities at the same time, while
mutating worms are designed to exploit multiple vulnerabilities and
continuously change infection tactics. Determining which computers should
be initially patched in a mutating worm attack scenario is a problem that
has been studied mathematically by Microsoft Research theoretical computer
scientist Jennifer Tour Chayes. She suggests that the most highly
connected systems should be patched first, irrespective of their proximity
to other compromised systems. Chayes' research followed the assumption
that even patched systems remain vulnerable to new attacks by the same
worm. Through experimentation, she concluded that distributing patches to
the most highly connected nodes in her network model, regardless of whether
neighboring nodes were also infected, brings the epidemic under
control with far fewer patches than were required in an earlier strategy
based on system administrators' typical response methodology. Chayes'
findings are sobering, not just with respect to network security, but also
to public health. For example, failure to adopt intelligent vaccine
distribution could lead to situations in which outbreaks of new human
viruses reach epidemic proportions.
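The degree-targeted strategy Chayes describes can be sketched on a toy network; the graph below is invented, and the function is a caricature of the selection step only, not her model:

```python
def patch_order(adjacency, budget):
    """Choose which machines to patch first: the most highly connected
    nodes, regardless of whether their neighbors are already infected."""
    by_degree = sorted(adjacency, key=lambda n: len(adjacency[n]), reverse=True)
    return by_degree[:budget]

network = {
    "hub":    ["a", "b", "c", "d", "e"],
    "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"],
    "e": ["hub", "f"], "f": ["e"],
}
print(patch_order(network, budget=2))  # the hub first, then node "e"
```

Patching the hub alone disconnects most of this toy network's paths, which is why a degree-first strategy needs far fewer patches than working outward from infected machines.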