Internet Helps Americans Save More Energy Every
Year
Christian Science Monitor (02/13/08) P. 4; Clayton, Mark
The widespread adoption of the Internet and other developments of the
communications revolution have helped the United States become increasingly
energy-efficient, concludes a new study by the American Council for an
Energy-Efficient Economy (ACEEE). The study found that for every
kilowatt-hour of energy used by information and communications
technologies, the United States saves at least 10 times the consumed
amount. "Acceleration of information and computer technology across the
U.S. landscape post 1995 is driving much of the nation's
energy-productivity gain," says study co-author John Laitner, who adds that
if the nation had continued at the historic rate of prior years, it would
consume the energy equivalent of 1 billion barrels of oil more per year
than it currently does. The study says companies are using information
technology to make significant energy improvements. For example, UPS
recently introduced software to develop more efficient routes and help
drivers avoid left-hand turns, resulting in 28.5 million fewer miles driven
and about 3 million gallons of gas saved each year. Laitner says
individuals are also saving a lot of energy by using email, instant
messaging, and Internet news to organize and streamline their schedules,
and e-commerce to avoid extra trips to the mall. Telecommuting has also
helped reduce gas consumption and traffic congestion.
Remembering the Search for Jim Gray, a Year Later
InformationWeek (02/12/08) Babcock, Charles
Over 12 months have passed since ACM A.M. Turing Award winner and
Microsoft researcher Jim Gray was lost at sea, and his disappearance
prompted a search for his boat by members of ACM, the IEEE Computer
Society, and the University of California at Berkeley. Fellow Microsoft
researcher Tom Barclay recalled the suggestion to sweep the ocean for
Gray's vessel using satellite data, which was a focus of Microsoft's
Virtual Earth project. Together with Gray, Barclay had constructed the
TerraServer-USA database, which was based on U.S. Geological Survey aerial
photo data. Barclay acquired satellite imagery for the search with the
help of several Virtual Earth team members, and imagery from GeoEye,
Digital Globe, and RadarSat was compiled through the Boulder, Colo.,
Virtual Earth facility. After the Coast Guard decided to call off its
search, Barclay and another former co-worker of Gray's helped arrange
additional flights through a network of friends and associates. Barclay
said he was receiving more than 500 emails a day from people who wanted to
aid in the search as well as offer advice and equipment. A tribute to Gray
on May 31 is being planned by ACM, UC Berkeley, and IEEE.
Computing Education and the Infinite Onion
Computing Research Association (02/11/08) Reed, Dan
Computing Research Association chair Dan Reed writes that new approaches
to computing education are needed to reverse declining enrollment in
computer science. He says that little has changed in computer science
curricula in the past 30 years. Its core elements remain centered on
formal languages and theory, data structures, programming languages and
compilers, operating systems, and computer architecture. Successive layers
have been added to the computing curriculum onion, including graphics and
human-computer interaction, artificial intelligence, mobile and embedded
devices, computational geometry, networks and distributed systems,
numerical and scientific algorithms, parallel computing, databases and data
mining, among others. Reed says that as the computing curriculum onion
grows larger and more complex, the number of students will continue to
approach zero as the knowledge and degree expectations near infinity. He
says that most graduates solve problems using computers rather than working
on core computing technologies, and that computing as a problem-solving
process needs to be accepted and introduced into education through
technically challenging and socially relevant problem domains. "This does
not mean we
should eviscerate the intellectual core of computing," Reed writes, but
that education must emphasize relevance and introduce computing as a means
to solve problems.
Officials Step Up Net-Neutrality Efforts
Wall Street Journal (02/13/08) P. A4; Schatz, Amy; Searcey, Dionne;
Kumar, Vishesh
Big broadband companies and federal lawmakers could soon clash over
whether consumers have the right to access as much as they want on the
Internet, as fast as they want it, without paying extra for the privilege.
Complaints that cable giant Comcast is deliberately slowing some Internet
traffic are spurring movements in Congress to block efforts by broadband
companies to favor some Internet traffic over others. Reps. Edward Markey
(D-Mass.) and Chip Pickering (R-Miss.) recently introduced a bill that
would change federal law to ensure Internet traffic has protections similar
to cell phone calls, which companies are required to connect without delay.
The Internet Freedom Preservation Act would establish a policy that would
"maintain the freedom to use for lawful purposes broadband
telecommunications networks, including the Internet, without unreasonable
interference from or discrimination by network operators." The bill would
give the Federal Communications Commission more authority to police Internet
providers and ensure Internet traffic is delivered fairly. The increased
efforts to enforce net neutrality come at a time when a massive increase in
video downloading and online viewing has forced cable and phone companies
to review how they price Internet service, with many considering deploying
fee plans that charge based on the extent of Internet usage.
Where Are the Women in Tech?
MSNBC (02/10/08) Tahmincioglu, Eve
Women hold only 27 percent of computer-related jobs, reveals a study by
the National Center for Women & Information Technology (NCWIT). The study
also found that from 1983 to 2006, the percentage of computer science
bachelor's degrees awarded to women dropped from 36 percent to 21 percent.
Renee Davias, president of the Rochester Chapter of the Association of
Women in Computing, has been working to encourage women to get into the
technology field, including speaking to the Girl Scouts about the
profession. She says more women are needed in group development projects
because they help the team dynamic. "Women are much more matter-of-fact,
more collaborative," she says. NCWIT CEO Lucy Sanders says the declining
interest in tech careers among women is due to the way computer science is
taught in schools and how society depicts the profession as geeky and
nerdy.  University of Arkansas professor Bill Hardgrave says it is
important to get more women into the field as they are intuitively better
at designing interfaces, and one of the biggest complaints about technology
is that user interfaces are poorly designed. SAS CIO Suzanne Gordon says
that although many women work in computing areas related to payroll,
accounts receivable, and the help desk, it is difficult to find women to
work in hardware, directly with machinery such as Unix and PC boxes and
networks. Gordon says this is a problem because groups work much better
when at least one woman is in the group as they bring a different
perspective and viewpoint.
Joint Nokia Research Project Captures Traffic Data Using
GPS-Enabled Cell Phones
University of California, Berkeley (02/08/08) Yang, Sarah
Nokia and University of California, Berkeley researchers are developing
technology that could change how drivers navigate through congested
highways and gather information on road conditions. The Mobile Century
project recently conducted a field experiment in which researchers tested
the feasibility of using GPS-enabled mobile phones to monitor real-time
traffic flow. One hundred vehicles were deployed on a 10-mile stretch of
highway. Each car was equipped with a GPS-enabled mobile phone running
special software that periodically sent anonymous speed readings to servers
that computed traffic conditions. Traffic information was displayed on the
Internet, allowing viewers to see traffic in real time. An independent
tracking feature allowed a command center to track the position of the cars
to coordinate the experiment and ensure the safety of participants.
GPS-based systems are capable of pinpointing a car's location within a few
meters and calculating traveling speed to within 3 miles per hour. The
researchers say that using GPS-equipped cell phones to monitor traffic
could help provide information on everything from multiple side-street
routes in urban areas to hazardous driving conditions or accidents on rural
roads. "There are cell phone-based systems out there that can collect data
in a variety of ways, such as measuring signal strength from towers and
triangulating positions, but this is the first demonstration of this scale
using GPS-enabled mobile phones to provide traffic related data such as
travel times, and with a deliberate focus on critical deployment factors
such as bandwidth costs and personal privacy issues," says UC Berkeley
California Center for Innovative Transportation director Thomas West.
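
The basic data flow described above -- phones periodically sending
anonymous speed readings to servers that compute traffic conditions -- can
be illustrated with a minimal sketch. The segment names, time buckets, and
averaging rule below are assumptions for exposition, not details of the
Mobile Century software, which also had to address bandwidth costs and
privacy.

from collections import defaultdict
from statistics import mean

# Illustrative sketch (not the Mobile Century code): each probe is an
# anonymized reading sent by a phone -- no vehicle ID, just a highway
# segment, a minute bucket, and a speed in miles per hour.
def aggregate_speeds(probes):
    """probes: iterable of (segment_id, minute_bucket, speed_mph)."""
    by_segment = defaultdict(list)
    for segment_id, minute_bucket, speed_mph in probes:
        by_segment[(segment_id, minute_bucket)].append(speed_mph)
    # Average the anonymous readings per segment and minute to get a
    # real-time traffic estimate that can be displayed on a map.
    return {key: mean(speeds) for key, speeds in by_segment.items()}

if __name__ == "__main__":
    probes = [
        ("I-880_mile_21", "09:15", 62.0),
        ("I-880_mile_21", "09:15", 58.5),
        ("I-880_mile_22", "09:15", 34.0),  # slowdown one segment ahead
    ]
    print(aggregate_speeds(probes))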
Software Gets Smart Cars Talking
ICT Results (02/11/08)
Technology developed by the European Com2React project could allow a group
of vehicles to exchange data automatically with each other and with traffic
control centers to make driving safer and more efficient. The system can
inform drivers of poor weather or road conditions immediately ahead, and
help them choose alternate routes, which could ease congestion and prevent
accidents, says Com2React (C2R) project coordinator Chana Gabay. The C2R
project's main objective was to develop software that creates a virtual
traffic control sub-center, which temporarily forms to manage a moving
group of vehicles in close proximity. "A lot of areas are not covered by
regional traffic control centers," Gabay says. "By creating virtual
sub-centers, the system extends the traffic networks to those areas." The
sub-center obtains and processes data collected by vehicles and quickly
provides drivers with instructions related to traffic and safety
conditions. The software also transmits selective data to a regional
control center and receives current traffic information to send to the
vehicles, which is processed by each vehicle's software to help drivers
make informed decisions. A prototype system was successfully tested in
Munich and Paris last summer. The researchers are now working to bring the
cost of the system down.
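
As a rough illustration of the virtual sub-center idea -- a temporary
controller formed from a moving group of nearby vehicles that pools their
data into advisories and a summary for a regional center -- consider the
sketch below. The data fields, grouping radius, and thresholds are
assumptions for exposition, not the actual C2R design.

from dataclasses import dataclass
from typing import List, Optional, Tuple

# Illustrative sketch only: message contents and grouping rule are assumed.
@dataclass
class VehicleReport:
    vehicle_id: str
    position_km: float            # position along the road
    speed_kmh: float
    hazard: Optional[str] = None  # e.g. "ice", "accident", or None

def form_sub_center(reports: List[VehicleReport], radius_km: float = 2.0):
    """Group vehicles within radius_km behind the lead vehicle into a
    temporary virtual traffic-control sub-center."""
    lead = max(reports, key=lambda r: r.position_km)   # front of the group
    return [r for r in reports if lead.position_km - r.position_km <= radius_km]

def advise(group: List[VehicleReport]) -> Tuple[List[str], dict]:
    """Turn the group's pooled data into driver advisories plus a summary
    that could be forwarded to a regional traffic control center."""
    hazards = {r.hazard for r in group if r.hazard}
    avg_speed = sum(r.speed_kmh for r in group) / len(group)
    advisories = [f"Hazard reported ahead: {h}" for h in sorted(hazards)]
    if avg_speed < 30:
        advisories.append("Heavy congestion: consider an alternate route")
    return advisories, {"vehicles": len(group), "avg_speed_kmh": round(avg_speed, 1)}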
Professors Awarded $1.1M By Keck Foundation to Pursue
Brain Research
The Tartan (02/11/08) Chandna, Marium
Carnegie Mellon neuroscience professor Marcel Just and computer science
professor Tom M. Mitchell have been awarded a $1.1 million grant to
continue their efforts in brain research and brain imaging. Their study,
"Using fMRI Brain Activations to Identify Cognitive States Associated with
Perception of Tools and Dwellings," is a collaborative effort between the
neuroscience and computer science departments. Just says this is the first
time that anyone has been able to track the specific object of a person's
thoughts. The researchers used a combination of brain scans and
machine-learning algorithms to determine what a person is thinking of at a
particular moment. Brain activity was measured using functional magnetic
resonance imaging scanning. Computer algorithms were applied to the
scanned images to decipher the signals transmitted during brain activity.
The first phase of the algorithms involved finding voxels--volume
elements commonly used in medical imaging to represent points in 3D image
data--that express a similar pattern over several
trials. The second phase focused on subsets of voxels, with each voxel
capable of being virtually anywhere in the numerous possible levels of
brain activity. The algorithms helped the scientists map individual items
and brain activity level. The team tested algorithms on a group of
subjects and discovered that the codes emitted in most human brains are
similar. Currently, the technology only applies to concrete objects such
as apples, but Just says that more abstract notions such as people or ideas
will be explored through this technique and one day may be identifiable.
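
A minimal sketch of the two-phase idea -- first selecting voxels that
respond consistently across repeated trials, then classifying a new scan
against per-class average patterns -- might look like the following. This
is not the CMU team's actual pipeline; the stability measure and
nearest-centroid classifier are stand-ins chosen for illustration.

import numpy as np

def select_stable_voxels(trials, n_keep=50):
    """trials: array of shape (n_trials, n_voxels). Keep voxels whose
    across-trial variability is low relative to their mean response."""
    variability = trials.std(axis=0) / (np.abs(trials.mean(axis=0)) + 1e-9)
    return np.argsort(variability)[:n_keep]

def train_centroids(trials, labels, voxel_idx):
    """Average the selected voxels per stimulus class (e.g. 'tool',
    'dwelling') to get one centroid pattern per class.
    labels: array of class names, shape (n_trials,)."""
    data = trials[:, voxel_idx]
    return {c: data[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(scan, centroids, voxel_idx):
    """Assign a new scan to the class whose centroid pattern is closest."""
    x = scan[voxel_idx]
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))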
Copper Connections Created for High-Speed
Computing
Georgia Institute of Technology (02/11/08) Vogel, Abby
Georgia Institute of Technology professor Paul Kohl is working to improve
the connections between computer chips and external computer circuitry to
increase the amount and speed of information that can be sent throughout a
computer. The vertical connections between chips and boards are currently
made by melting tin solder between the two pieces and adding glue to hold
everything together. Kohl's research shows that replacing the solder
connections with copper pillars creates stronger connections, and also
allows for more connections to be made. "Circuitry and computer chips are
made with copper lines on them, so we thought we should make the connection
between the two with copper also," Kohl says. Both copper and solder can
tolerate misalignment between two connecting pieces, Kohl says, but copper
is more conductive and creates a stronger bond. Kohl and graduate student
Tyler Osborn, using funding from the Semiconductor Research Corporation,
have developed a new fabrication method for creating all-copper connections
between computer chips and external circuitry. Kohl is also developing an
improved signal transmission line to preserve signal strength over long
distances, such as in servers where inter-chip distances can be
significant.
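
A back-of-the-envelope comparison suggests why a copper pillar conducts
better than a solder joint of the same geometry. The resistivity figures
and pillar dimensions below are typical handbook and illustrative values,
not numbers from Kohl's study.

import math

RHO_COPPER = 1.7e-8   # ohm-meters (approximate handbook value)
RHO_SOLDER = 1.5e-7   # ohm-meters (approximate, Sn-Pb solder)

def pillar_resistance(rho, height_m, diameter_m):
    # Resistance of a cylindrical connection: R = rho * L / A
    area = math.pi * (diameter_m / 2) ** 2
    return rho * height_m / area

h, d = 50e-6, 25e-6   # assumed 50-micron-tall, 25-micron-wide pillar
r_cu = pillar_resistance(RHO_COPPER, h, d)
r_sn = pillar_resistance(RHO_SOLDER, h, d)
print(f"copper: {r_cu*1e3:.2f} mOhm, solder: {r_sn*1e3:.2f} mOhm, "
      f"ratio ~{r_sn/r_cu:.0f}x")

Under these assumed values the copper connection shows roughly an order of
magnitude less resistance, which is the intuition behind the claim that
copper is more conductive than solder for chip-to-board connections.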
Can Artificial Intelligence Manage the Supply Chain
Better?
AI Magazine (02/11/08) Collins, John; Wellman, Michael
Trading agents have become a popular artificial intelligence application
because of their potential benefits in electronic commerce and because of
the challenges associated with models of rational decision making. A
workshop held in conjunction with the finals of the 2007 Trading Agent
Competition involved two game scenarios and two challenge events that
attracted 39 entries. A supply chain management scenario placed six agents
in the role of a PC manufacturer, with each agent having to procure raw
materials and sell finished goods in a competitive market while managing
inventory and production facilities. A procurement challenge was a side
competition that involved agents balancing risk and cost in a procurement
market offering both long-term and short-term contracts. A prediction
challenge was another side competition that tested the price-prediction
capabilities of competing agents in both procurement and sales markets.
The CAT scenario placed agents in the role of competing exchanges, a
competition motivated by the rise of independent, for-profit stock and
commodity exchanges. CAT agents competed by defining rules for matching
buyers and sellers and by setting commission fees for their services.
Profitability was the measure of success for both the supply chain and CAT
scenarios. The challenges are important because the complexity and
uncertainty in the game scenario make it difficult to understand why one
agent outperforms another in general or in specific market conditions. The
resulting benchmark data lays the foundation for future empirical research
in this area.
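
To make the procurement trade-off concrete, the toy rule below weighs
cheaper long-term commitments against the risk of holding unsold
inventory. It is purely illustrative and is not drawn from any actual
competition entry; the prices, confidence level, and holding cost are
invented parameters.

# Illustrative sketch in the spirit of the procurement challenge.
def plan_procurement(demand_forecast, long_term_price, spot_price,
                     holding_cost_per_unit, confidence=0.8):
    """Commit the confident fraction of forecast demand to long-term
    contracts; cover the uncertain remainder on the spot market."""
    committed = int(demand_forecast * confidence)
    spot = demand_forecast - committed
    expected_cost = (committed * long_term_price
                     + spot * spot_price
                     + committed * holding_cost_per_unit * (1 - confidence))
    return {"long_term_units": committed, "spot_units": spot,
            "expected_cost": expected_cost}

if __name__ == "__main__":
    print(plan_procurement(demand_forecast=1000, long_term_price=90,
                           spot_price=110, holding_cost_per_unit=5))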
Supercomputer 'Virtual Human' to Help Fight
Disease
Telegraph.co.uk (02/08/08) Highfield, Roger
The Virtual Physiological Human project passed its first test when it
successfully simulated an AIDS infection in a "virtual human" to test the
effectiveness of an AIDS drug. The project, which combines the
supercomputing power of British and American computers, is testing whether
computers can be used to create tailored, personal drug treatments. The
study ran numerous simulations to predict how strongly the drug saquinavir
would interact with strains of an AIDS virus that had become resistant.
The simulated results matched reality, increasing confidence that computer
simulations could become a valuable tool for medicine. The study, by
Professor Peter Coveney, Dr. Ileana Stoica, and Kashif Sadiq of University
College London, involved a sequence of simulation steps performed across
half a dozen supercomputers wired into a grid in the U.K. and on the United
States' TeraGrid. The simulation took two weeks and used computational
power approximately equal to that needed for long-range weather
forecasting. There are nine drugs that target the same enzyme as the
simulated drug, but doctors have no way of matching a drug to the unique
profile of the AIDS virus in a particular patient because the virus mutates
so quickly. The hope is that the drugs can be tested on a virtual human
first so that when the patient receives the drug it will work. Coveney
says the study represents the first step toward the ultimate goal of
"on-demand" medical computing.
Reuters Wants the World to Be Tagged
Read/Write Web (02/06/08) Iskold, Alex
Open Calais is an API recently rolled out by Reuters that performs
semantic markup on unstructured HTML documents, turning interesting
portions of those documents into metadata. Calais represents a
next-generation offering from ClearForest, which Reuters acquired in 2007,
and much of its work is carried out by a natural language processing
engine integrated with a hard-coded learning database constructed by
ClearForest. The API is freely available for both commercial and
non-commercial use, and Reuters says it is ready to scale to immense
demand from both types of use simultaneously. Entities are identified,
extracted, and annotated for any document submitted to Calais, and ideally
an API such as Calais should be capable of accepting URLs, because
extracting structure from HTML is no trivial matter for developers.
Calais can be used to make search engines more intelligent and more
sensitive to related content. Automatically identifying entities in a
document also gives Calais the ability to identify what should be linked,
and it can facilitate structured alerts as well as on-the-fly text
analysis within browsers. Calais has the advantage of an expanding
semantic database of people, places, companies, and events whose richness
increases with every new document submitted to the system, as well as the
benefits that come with training the system.
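
The general usage pattern -- submit a document, get back entity metadata
-- can be sketched as follows. The endpoint URL, header names, and
response shape below are placeholders rather than the documented Calais
API; consult Reuters' documentation for the real interface.

import json
import urllib.request

CALAIS_ENDPOINT = "https://example.com/calais/enlighten"   # hypothetical URL
API_KEY = "YOUR_API_KEY"

def extract_entities(text):
    """POST raw text to the (placeholder) service and return (type, name)
    pairs from the assumed JSON response."""
    request = urllib.request.Request(
        CALAIS_ENDPOINT,
        data=text.encode("utf-8"),
        headers={"x-api-key": API_KEY,            # placeholder header name
                 "Content-Type": "text/raw",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    # Assumed response shape: a mapping of entity IDs to metadata dicts.
    return [(e.get("type"), e.get("name")) for e in result.values()
            if isinstance(e, dict) and "name" in e]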
UAHuntsville Researchers Developing Computer Models to
Provide Military With Better Intelligence
University of Alabama in Huntsville (02/08/08) Garner, Ray
University of Alabama in Huntsville researchers are working to provide
better intelligence on asymmetric military threats by developing computer
models that identify trends in the behavior of Iraqi insurgents.
UAHuntsville researcher Wes Colley says some of the trends from the Iraqi
attacks show important day-to-day correlations that could be used to save
lives by heightening awareness of possible events or changing the
allocation of security assets to provide more protection. The researchers
developed a four-step process by reviewing the behavior signatures of
terrorists across 12,000 attacks between 2003 and mid-2007 and calculating the
relative probabilities of future attacks on a variety of target types. The
goal is not to predict exactly where, when, and what type of attack will
take place, but to identify which target types are more likely to be
attacked next, Colley says, noting that military commanders could make
choices from various options to reduce risk. "Despite many difficulties
with the dataset, we did find that our trend analysis very successfully
provided enhanced predictive capability when compared to the broader attack
rate," Colley says. "Our concept has proven successful in identifying
trends and correlations in the attacks."
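
As a simplified illustration of day-to-day correlation analysis (not
UAHuntsville's actual four-step process), the sketch below estimates the
relative likelihood of the next attack's target type from transition
counts in a historical sequence; the target-type names are invented.

from collections import Counter, defaultdict

def transition_probabilities(daily_target_types):
    """daily_target_types: chronological list such as
    ['checkpoint', 'convoy', 'convoy', 'infrastructure', ...]."""
    counts = defaultdict(Counter)
    for today, tomorrow in zip(daily_target_types, daily_target_types[1:]):
        counts[today][tomorrow] += 1
    # Normalize counts into relative probabilities per current target type.
    return {
        today: {t: n / sum(c.values()) for t, n in c.items()}
        for today, c in counts.items()
    }

if __name__ == "__main__":
    history = ["checkpoint", "convoy", "convoy", "infrastructure",
               "convoy", "checkpoint", "convoy"]
    probs = transition_probabilities(history)
    print(probs["convoy"])   # relative likelihood of each next target type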
NICTA Researcher Co-Chairs W3C Emergency Services
Group
Computerworld Australia (02/04/08) Rossi, Sandra
Former World Wide Web Consortium (W3C) Advisory Board member Dr.
Renato Iannella will co-chair the consortium's initial effort to develop an
interoperability framework for information sharing in the emergency
management community. Iannella, a researcher from NICTA, Australia's
Information and Communications Technology Research Center of Excellence,
will help lead the new Emergency Information Interoperability Framework (EIIF)
Incubator Group. EIIF will review and analyze the latest vocabularies used
by local, national, and international emergency groups, and develop
definitions and a framework for collaborative information sharing and
aggregation of information for emergency functions. The work will serve as
the foundation for developing a more comprehensive strategy for ontology
management and semantic information interoperability, and for creating a
proposal for W3C Working Group activity to pursue the idea further. EIIF
will run until December 2008. "It is essential that information gathered
by these organizations is stored and communicated in common formats to
ensure that information can be easily exchanged and aggregated to support
the decision-making process," Iannella says. "A key component of this
process is ensuring that consistent definitions [vocabulary] are used to
support meaningful sharing of information."
'Recordable' Proteins as Next-Generation Memory Storage
Materials
ScienceDaily (02/12/08)
Researchers in Japan have developed a protein that is capable of recording
a specific information pattern on a glass slide. Tetsuro Majima and
colleagues made use of a special fluorescent protein to show that the
pattern could be read and erased at will. The novel combination of light
and chemicals allowed for the storage, playback, and erasure of
information. Protein-based memory devices have the potential to process
information faster and offer greater storage capacity than conventional
magnetic and optical storage systems, which are approaching the limits of
their storage capacity. The researchers also say proteins can lead to better
biosensors and diagnostic tests.
Crashing Software Poses Flight Danger
New Scientist (02/11/08) No. 2642, P. 28; Marks, Paul
The software that controls aircraft is prone to mysterious glitches that
can become potentially serious, though to date no accidents caused solely
by such failures have been recorded. Still, experts warn that the odds of
such accidents happening are rising as aircraft makers prepare to make even
more aircraft operations software-dependent. Horrifying problems
attributed to software glitches that occurred while planes were in the air
are detailed in a report from the U.S. National Academy of Sciences. One
problem involved the concurrent loss of power for the pilots' flight and
navigation displays, radio, auto-throttle, and autopilot for two minutes in
an Airbus A319 in 2005. Ideally, software bugs are supposed to be spotted
and remedied in the testing phase, but the complexity of aircraft systems
and the sheer number of potential scenarios complicates this process. Most
aircraft makers follow the Radio Technical Commission for Aeronautics and
the European Organization for Civil Aviation Equipment's DO-178B guidelines
to test for bugs, but University of Bielefeld computer scientist Peter
Ladkin says that "the criteria currently used to evaluate the dependability
of electronic systems for many safety-related uses are way too weak, way
insufficient." Ladkin and systems engineering consultant Martyn Thomas
would like the writing of safety-critical software to be significantly
revised, with Thomas recommending the use of highly specialized computer
languages that do not permit the writing of ambiguous software
specifications. "Safe programming languages ... are likely to reduce the
cost and difficulty of producing dependable software," concludes the NAS
report.
Biological Moon Shot
Science News (02/02/08) Vol. 173, No. 5, P. 72; Milius, Susan
The Encyclopedia of Life project seeks to furnish a portal that will
ultimately provide easy access to Web pages for every species on the planet,
with the goal of revolutionizing scientific inquiry, according to project
godfather E.O. Wilson of Harvard University. The project's planners raised
$12.5 million in seed money through contributions from the John D. and
Catherine T. MacArthur Foundation and the Alfred P. Sloan Foundation, while
the implementation of the encyclopedia is being handled by a consortium of
museums and other science institutions, including the Smithsonian. The
project's first release will be a portal to basic information about fish,
and the initiative's work will be eased by supplying a pathway to trusted
databases already developed by specialists rather than creating information
resources from scratch. "The scientific community is going to make the
Encyclopedia of Life rich, and it's going to make it correct," says Mark
Westneat of the Field Museum of Natural History in Chicago. Entries will
be updated through Google-like aggregation technology. A scanning and
digitization group of encyclopedia workers at the Smithsonian's National
Museum of Natural History is collaborating with the Biodiversity Heritage
Library to place online information from volumes that describe species.
The Smithsonian's Thomas Garnett says nearly 4 million pages had been
scanned as of Jan. 25. Westneat says the Encyclopedia of Life seeks to attract
not just scientists, but middle schoolers in an effort to make science fun
and to take advantage of a generation that is skilled at Web surfing.