ACM President Says White House Commitment to Increased
Education, R&D Investment Reflects Tech Community Priorities
PRNewswire (02/01/06)
ACM President David Patterson praised President Bush's American
Competitiveness Initiative outlined in his State of the Union address last
night, describing the program as very much in step with the priorities of
the research community. "Innovation deserves a prime place in the national
agenda because of its ability to create new industries and new jobs," said
Patterson. "But the only way to assure continuing innovation is to focus
on improving science, math, and technology education, and to increase
funding for fundamental research programs at the federal level." Patterson
and other leading scientists have been trumpeting the need for renewed
federal support for research amid signs that the United States'
technological advantage may be eroding at a time when scientific research
has never been a more essential piece of the national economy. Patterson
noted that federally funded research, which has led to the development of
the Internet, the personal computer, and search engines, is critical to the
nation's economic viability, as is an improved education system with
updated curricula and better-trained teachers. "In an increasingly
competitive world, innovation is required to create new industries, new
processes, and new jobs," Patterson said. "The benefits to society of
investing in research for IT and the rest of science and engineering are at
least as important for the 21st century as they have already proven for the
20th century."
U.S. Defense Dollars for Computer Science Plunge
IEEE Spectrum (02/06) Vol. 43, No. 2, Kumagai, Jean
When ACM President David Patterson pitched his idea for applying
statistical machine learning to stabilize and optimize military and
commercial distributed computing systems, both DARPA and the NSF, to his
great surprise, turned him down. Concluding that high-risk research with
the potential for great impact was out of favor with the traditional
sources of federal funding, Patterson founded the Reliable, Adaptive, and
Distributed Systems Laboratory (RAD Lab), with joint funding from Google,
Microsoft, and Sun. "In this era of increasing competitive pressures,
people tend to get conservative, and descriptions like 'ambitious proposal'
tend to be a negative," said Patterson of his attempts to garner federal
funding for his artificial intelligence project. "We had to find another
model." Researchers have been sharply critical of DARPA, claiming that in
addition to the overall level of funding having declined, DARPA grants heavily
favor short-term projects with clear military applications. DARPA said it
provided $207 million in funding for university computer science projects
in 2002; by 2004, that amount had dropped to $123 million, excluding
classified projects or those where the university functioned as a
subcontractor. By contrast, NSF funding for university projects has
doubled since 1999, though even that increase has failed to keep pace with
the field's growth: The agency once funded between 30 percent and
35 percent of the proposals that it received; last year it only funded 21
percent, and most grants are for $150,000 or less--well short of the scope
of projects such as Patterson's. DARPA's declining interest in university
research has attracted the attention of legislators, who have expressed
alarm that the agency that carried the torch of technological innovation for
half a century would retreat from that role at a time when the United States faces an
unprecedented challenge to its global leadership.
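The summary gives no technical detail of Patterson's proposed system, but the
general idea of applying statistical machine learning to stabilize a
distributed service can be illustrated with a deliberately simple sketch:
learn a statistical profile of normal behavior, then flag deviations that
might trigger an automated response. The workload, metric, and thresholds
below are hypothetical and not drawn from the RAD Lab's work.

    # Minimal sketch of the general idea (not Patterson's actual system):
    # learn a statistical profile of a service's normal request latencies,
    # then flag windows that deviate enough to warrant an automated response
    # such as restarting or re-provisioning a node.
    import random
    import statistics

    random.seed(1)
    normal_latencies = [random.gauss(100, 10) for _ in range(500)]  # ms, healthy traffic

    mean = statistics.fmean(normal_latencies)
    stdev = statistics.stdev(normal_latencies)

    def anomalous(window, z_threshold=3.0):
        """Flag a window whose average latency drifts more than 3 sigma from normal."""
        return abs(statistics.fmean(window) - mean) > z_threshold * stdev

    print(anomalous([random.gauss(100, 10) for _ in range(20)]))  # False: looks healthy
    print(anomalous([random.gauss(250, 40) for _ in range(20)]))  # True: likely overload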
Enter the Semantic Grid
IST Results (01/31/06)
A team of European researchers at work on the OntoGrid project is
developing the technological framework for the Semantic Grid era of
computing, enabling users to easily and quickly share resources and
information. The system contains semantically enabled middleware that
enables users to comb through a broad collection of resources to form a
virtual organization that crosses the boundaries of institution, industry,
and country, and then to disband it once the problem it set out to address
is solved. "What we are doing is enriching the Grid with
semantics," said Asuncion Gomez-Perez, an OntoGrid coordinator. "This is a
visionary initiative. Few other researchers are working in this area at
present." The researchers developed two semantic applications to test over
the Grid: an insurance settlement system and an analysis application for
satellite images. The insurance application brings together in a unified
system the various participants in the claim process, including insurers,
assessors, lawyers, and repair facilities, streamlining the process and
reducing the risk of fraud with semantically enriched information. Meanwhile,
the quality analysis component of the satellite project will provide
different aerospace organizations with consistent access to a body of
satellite images. While the cross-industry functionality of the Grid has
traditionally been hampered by its rigidity, OntoGrid promises reusable
middleware that is flexible and configurable based on the reference
architecture Semantic-OGSA. The middleware will offer users considerable
agility when addressing distributed environments, and could also link to
legacy systems.
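The article does not describe the OntoGrid middleware interfaces, but the
core idea of semantically annotated Grid resources can be sketched loosely:
each resource advertises ontology terms describing what it offers, and a
broker assembles a temporary virtual organization from whatever resources
match a task, regardless of which organization owns them. The class names,
ontology terms, and matching rule below are hypothetical illustrations, not
the OntoGrid API.

    # Hypothetical sketch, not the OntoGrid API: resources carry ontology
    # terms, and a broker assembles a temporary virtual organization.
    from dataclasses import dataclass, field

    @dataclass
    class Resource:
        name: str
        organization: str
        concepts: set = field(default_factory=set)  # ontology terms the resource provides

    class VirtualOrganization:
        def __init__(self, task_concepts):
            self.task_concepts = set(task_concepts)
            self.members = []

        def recruit(self, catalogue):
            # Select resources whose annotations overlap the task's needs,
            # regardless of which organization or country they belong to.
            self.members = [r for r in catalogue if r.concepts & self.task_concepts]
            return self.members

        def disband(self):
            # The organization dissolves once the problem is solved.
            self.members = []

    catalogue = [
        Resource("claims-db", "InsureCo", {"insurance:Claim", "insurance:Policy"}),
        Resource("assessor-svc", "AssessorsLtd", {"insurance:Damage", "insurance:Claim"}),
        Resource("image-archive", "SpaceAgency", {"eo:SatelliteImage"}),
    ]

    vo = VirtualOrganization({"insurance:Claim"})
    print([r.name for r in vo.recruit(catalogue)])  # ['claims-db', 'assessor-svc']
    vo.disband()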
Electric Slide for Tech Industry?
CNet (02/01/06) Shankland, Stephen
Representatives from all facets of the technology industry met at Sun
Microsystems' headquarters on Tuesday for the Global Conference on Energy
Efficiency to address the widely acknowledged problem of soaring power consumption
that has emerged as IT departments are coming under increasing pressure to
pack more servers into finite spaces while energy prices continue to rise.
While a typical six-foot server rack would have consumed 2 kW to 3 kW four
or five years ago, the same rack can now consume up to 20 kW, said Sun CTO
Greg Papadopoulos. Data centers continue to increase in size and
density, with an estimated 12 million square feet of new data center space
expected to appear by 2009. A recent survey found that for every kilowatt
of power consumed by computers, 1.4 kW go to waste. The group agreed that
systems need a common measure of performance to determine the severity of
the efficiency problem, though defining that process is problematic, as
each company would likely choose tests that would favor its own equipment.
In the meantime, IBM, Hewlett-Packard, and others are reviving the
decades-old technique of liquid cooling, while chip makers are addressing
the problem at the design stage, as many of today's processors have fallen
prey to leakage current. The next step concerns the memory subsystem, which
is expected to consume more than half of a computer's energy by 2008. Sun is
developing a proximity input-output technique that could replace
conventional chip-to-chip communication wires by directly connecting the top
of one processor with the bottom of another, as well as
a technique to replace electrical communication links with optical ones.
To bring servers closer to their full capacity, companies are also investigating
ways to optimize server utilization, such as virtualization, which places
several operating systems on a single server.
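A back-of-the-envelope calculation shows why the figures quoted above worry
data center operators. It takes the article's numbers (a 20 kW rack and
1.4 kW of waste per kilowatt of IT load) plus an assumed electricity price,
which is not from the article:

    # Back-of-the-envelope sketch; the electricity price is an assumption.
    it_load_kw = 20.0        # a densely packed modern rack, per the article
    overhead_per_kw = 1.4    # wasted kW per kW of IT load, per the cited survey
    price_per_kwh = 0.10     # assumed electricity price, USD

    total_kw = it_load_kw * (1 + overhead_per_kw)   # 48 kW drawn from the grid
    annual_kwh = total_kw * 24 * 365
    annual_cost = annual_kwh * price_per_kwh

    print(f"Facility draw: {total_kw:.0f} kW")
    print(f"Annual energy: {annual_kwh:,.0f} kWh (about ${annual_cost:,.0f}/year)")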
Professor Earns Oscar for Technical Development
Daily Bruin (02/01/06) Erlandson, Julia
University of California, Los Angeles, computer science professor Demetri
Terzopoulos and Microsoft's John Platt have won a Technical Achievement
Academy Award for their 1987 paper, "Elastically Deformable Models," which
describes the computer simulation of deformable objects, such as cloth.
The paper was recognized for advancing the industry, as it applies to
technology used in both films and video games, making computer simulations
appear more realistic. "You have to simulate on a computer how clothes are
actually built in the real world," Terzopoulos said. "There are patterns
and seams. But then they can be worn by the virtual actor or human."
Before his technique, which draws on math, computer science, and physics,
animators had to create such effects by hand. Terzopoulos notes that
while the technology has only recently been adopted, it has already been
employed in the graphics of many major movies, including "Troy," "Star
Wars," and several Pixar films. Terzopoulos has begun to develop virtual
characters capable of acting and reacting independently, and notes that his
work in vision and graphics could also have a significant impact on medical
imaging. Scientific and Technical Academy Awards recognize contributions
to the progress of the field and, unlike the awards for motion pictures, are
not limited to achievements from the past year.
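The 1987 paper formulates deformable objects with continuum elasticity, which
is beyond the scope of this summary. The sketch below instead uses the
simpler mass-spring approach to convey the same basic idea behind physically
simulated cloth: internal elastic forces plus gravity, integrated over time.
The grid size, stiffness, damping, and time step are arbitrary illustration
values, not the paper's formulation.

    # Minimal mass-spring cloth sketch (simpler than the continuum model in
    # "Elastically Deformable Models", but the same idea: internal elastic
    # forces plus external forces, integrated over time).
    import numpy as np

    W, H = 10, 10              # grid of particles
    rest = 0.1                 # rest length between neighboring particles
    k, damping, dt = 500.0, 0.02, 0.002
    gravity = np.array([0.0, -9.8, 0.0])

    pos = np.array([[x * rest, 0.0, y * rest] for y in range(H) for x in range(W)])
    vel = np.zeros_like(pos)
    pinned = [0, W - 1]        # pin two corners so the cloth hangs

    springs = []
    for y in range(H):
        for x in range(W):
            i = y * W + x
            if x + 1 < W:
                springs.append((i, i + 1))   # horizontal structural spring
            if y + 1 < H:
                springs.append((i, i + W))   # vertical structural spring

    def step():
        force = np.tile(gravity, (W * H, 1)) - damping * vel
        for a, b in springs:
            d = pos[b] - pos[a]
            length = np.linalg.norm(d) + 1e-9
            f = k * (length - rest) * d / length   # Hooke's law along the spring
            force[a] += f
            force[b] -= f
        vel[:] += dt * force                       # unit mass per particle
        vel[pinned] = 0.0                          # pinned particles never move
        pos[:] += dt * vel

    for _ in range(1000):
        step()
    print("lowest point after 2 simulated seconds:", pos[:, 1].min())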
'Smart' Engine Shows Promise for Leaner, Greener
Vehicle
Newswise (01/30/06)
Researchers at the University of Missouri-Rolla are developing an advanced
controller that holds the potential to make engines run more cleanly and
efficiently, demonstrating particular promise in the technique of exhaust
gas recirculation (EGR), which reduces emissions of nitrogen oxide.
Electrical and computer engineering professor Jagannathan Sarangapani
explains that traditional engines require both air and fuel to function,
but reducing the amount of fuel or diluting the mixture with inert gases
alters the engine's behavior. The researchers created a neural network
controller, implemented in software, that is capable of learning from the
successful connections it has made. "The
neural network observer part of the controller will assess the total air
and fuel in a given cylinder in a given time," said Sarangapani. "It then
sends that estimate to another neural network, which generates the fuel
commands and tells the engine how much fuel to change each cycle." Jim
Drallmeier, professor of mechanical and aerospace engineering, notes that
speed is a critical factor, as the controller operates within a period of
milliseconds. Before the design can transition to hardware, it must
surmount the theoretical obstacles that limit the controller's
understanding of the engine's operating conditions and develop an
appropriate cyclical signal for fuel command. Smart controllers could
improve on today's catalytic converters that fail in conditions such as a
cold start before the engine has warmed up. Sarangapani and Drallmeier's
research is jointly funded by the NSF and the Environmental Protection
Agency.
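The article describes the controller's two-stage structure (an observer
network that estimates the in-cylinder air and fuel, feeding a second
network that issues per-cycle fuel commands) without giving its equations.
The toy sketch below only mimics that structure, with two tiny networks, a
made-up engine model, and simple output-layer updates; none of the numbers
or dynamics reflect the actual Missouri-Rolla design.

    # Toy two-stage sketch: an "observer" net estimates the air/fuel ratio
    # from sensor readings, and a second net turns that estimate into a
    # per-cycle fuel command.  Engine model and learning rates are invented.
    import numpy as np

    rng = np.random.default_rng(0)

    def mlp(n_in, n_hidden, n_out):
        # One hidden layer; only the output weights are trained in this sketch.
        return {"W1": rng.normal(0, 0.5, (n_in, n_hidden)),
                "W2": rng.normal(0, 0.5, (n_hidden, n_out))}

    def forward(net, x):
        h = np.tanh(x @ net["W1"])
        return h @ net["W2"], h

    observer = mlp(2, 8, 1)     # sensor readings -> estimated air/fuel ratio
    controller = mlp(1, 8, 1)   # estimated ratio -> per-cycle fuel command
    target_afr, lr = 14.7, 1e-3

    fuel, recent = 1.0, []
    for cycle in range(2000):
        air = 14.0 + rng.normal(0, 0.3)       # toy engine: noisy air charge per cycle
        true_afr = air / fuel
        recent.append(true_afr)
        sensors = np.array([air, fuel])

        est_afr, h_obs = forward(observer, sensors)
        obs_error = est_afr - true_afr        # observer learns to track the real ratio
        observer["W2"] -= lr * np.outer(h_obs, obs_error)

        correction, h_ctl = forward(controller, est_afr)
        ctl_error = target_afr - true_afr     # positive when the mixture is too rich
        controller["W2"] -= lr * np.outer(h_ctl, ctl_error)
        fuel = float(np.clip(1.0 + 0.05 * correction[0], 0.5, 2.0))  # next cycle's fuel

    print(f"mean air/fuel ratio, last 200 cycles: {np.mean(recent[-200:]):.2f} "
          f"(target {target_afr})")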
Why America Needs to Open Its Doors Wide to Foreign
Talent
Financial Times (01/31/06) P. 15; Barrett, Craig
The immigration crisis that is unfolding in the United States has nothing
to do with the 11 million illegal immigrants currently in the country, but rather
with the closed-door policy that keeps out many of the world's best
scientific and technical minds, to the detriment of U.S. innovation and
technical leadership, writes Intel Chairman Craig Barrett. Dependence on
foreign talent is not new--the computing industry has drawn on German and
Asian immigrants to fill the gap in knowledge workers for decades. The
United States simply does not produce enough graduates to meet the growing
needs of the technical industry. The H-1B visa program allows 65,000
foreigners to come to the United States and work, though even that number
is inadequate in the face of the rising demands for knowledge workers, and
the program is oversubscribed. The H-1B program also does not provide
automatic entry to foreign students who graduate in the United States, but
rather sends them home after educating them partially on the taxpayer's
dime. Concern over unchecked illegal immigration has caused a legislative
backlash that is overly restrictive on legal, and potentially productive,
immigrants who are in critical demand. The home-grown workforce is also
falling short, as just 5 percent of U.S. students earn engineering degrees,
compared with fully half of Chinese graduates. The primary and secondary
education systems are partially at fault, as students at those levels test
significantly behind international students in math and science. U.S.
companies that pride themselves on hiring American workers are facing a
test, as the talent that they need is dwindling in the United States, while
hiring foreign workers increasingly emerges as the formula for success. To
remain competitive, the United States must reform the primary and secondary
school systems, colleges and universities need to devote more funding to
science education, and the visa requirements for foreign workers must be
relaxed.
Interplanetary Broadband
Technology Review (01/31/06) Bullis, Kevin
Researchers are exploring highly sensitive single-photon detectors as a
tool for picking up low-power laser signals that could facilitate high-speed
communications for astronauts to relay information, and even stream video,
to Earth from space. Today's missions rely on agonizingly slow downloads,
while standard optical transmissions could convey video data, though the
lasers required to carry the signals from such great distances have
prohibitive power requirements. A new single-photon detector based on
nanotechnology offers both speed and efficiency, holding the potential to
make interplanetary communications a reality. MIT electrical engineering
professor Karl Berggren notes that nanofabrication helped overcome the
longstanding obstacle of detecting extremely low-level light at high
bandwidth. Whereas existing systems require more laser power to increase
their transmission distance, the researchers instead added a photon trap
to the detector. The photon trap increases the chances of the wire
absorbing a photon, roughly tripling the efficiency of the detection
device. Quantum cryptography also stands to benefit from single-photon
sensors, as eavesdroppers could be easily detected when information is sent
with a single photon. Single-photon sensors could double or triple the
transmission distance of quantum cryptography, though the cost would be
prohibitive for most commercial applications. Focusing the photons on the
miniature sensors remains a challenge, however, which Berggren expects to
be resolved within two years.
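A rough illustration of why detector efficiency matters for such a link: in
a photon-counting system the usable data rate scales roughly with the rate
of detected photons, so tripling efficiency (as the photon trap reportedly
does) triples the rate at a fixed laser power, or allows roughly one-third
the power for the same rate. The photon flux and coding assumption below are
hypothetical, not taken from the article.

    # Rough, hypothetical link-budget illustration (numbers are invented).
    photons_arriving_per_s = 2.0e6      # photons reaching the receiver
    bits_per_detected_photon = 0.5      # crude coding assumption

    for efficiency in (0.20, 0.60):     # before vs. after a ~3x photon trap gain
        detected = photons_arriving_per_s * efficiency
        rate = detected * bits_per_detected_photon
        print(f"efficiency {efficiency:.0%}: ~{rate / 1e6:.1f} Mbit/s")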
ISU Supercomputer to Help With Corn Genome
Associated Press (01/31/06) Pitt, David
Researchers at Iowa State University will be able to process data on the
corn genome in days, rather than two to three months, by using the school's
new IBM BlueGene supercomputer. Introduced Monday, the $1.25 million
BlueGene/L computer is one of the top 10 computers in the United States,
and is the 73rd fastest in the world. The supercomputer can perform as
many as 5.7 trillion calculations per second, according to Srinivas Aluru,
professor of electrical and computer engineering, and has the processing
power of more than 2,000 home computers and more than 1,000 times their
storage capacity. Deciphering the more than 60 million bits of genetic
material of corn could one day lead to new uses for the plant, such as in
plastics, fuel, and fiber. The corn genome project is expected to take
about three years, and involves four other universities. Researchers also
want to use the supercomputer, which is funded by a $600,000 grant from the
National Science Foundation and $650,000 from the university, to study
protein networks in organisms. "It's the unavailability of computers of
this magnitude that limits many projects in engineering and computer
science," says Bob Jernigan, professor of biochemistry and biophysics.
"This can have an important influence on all kinds of research."
'Free' Is the New 'Cheap' for Software Tools
CNet (01/31/06) LaMonica, Martin
IBM recently unveiled DB2 Express-C, a free database for software
developers that follows similar rollouts from Oracle and Microsoft,
symbolizing the effect that the emergence of free, open source software has
had on the business models of major proprietary vendors. "Commercial
vendors competing in areas where there are credible, free open-source
alternatives are increasingly being pressured to lower the barriers to
entry to their product," said RedMonk's Stephen O'Grady. This is
particularly true with programming tools such as database servers, though
offering a product for free can also be a savvy business move, as it can
broaden a company's customer base. IBM's Bernie Spang credits open-source
pioneers with demonstrating the viability of the model, and looks for the
free version of DB2 to nurture the growth of applications that spring from
that database. Free products can sometimes lure developers into embracing
a company's entire line of software. Whether the free databases that
Microsoft, IBM, and Oracle are offering will produce revenue is a secondary
concern to simply staying competitive in an increasingly open-source
programming environment, notes Forrester's Noel Yuhanna. Forrester
predicts that the open-source database market will reach $1 billion by 2008
as corporate adoption continues to increase. Open-source use encourages the
emergence of add-on products, such as browser plug-ins, as well as the
combined use of multiple components. Prices have also been in steady
decline, particularly since the advent of Eclipse, which made it all but
impossible to charge for a rudimentary integrated development
environment.
TSU Develops Software, Intelligence for the
Military
Tennessee State Meter (01/30/06) Terrell, Taylon
The computer science and engineering departments at Tennessee State
University (TSU) are participating in a project that could facilitate the
use of micro-sensors in robots in combat zones. The university has teamed
up with Penn State University for a research project called the Center of
Excellence for Battlefield Sensor Fusion, and the initiative is being
funded by the U.S. Army Research Office. The engineering department is
focusing on providing robots with mobility, sophisticated sensing, and
wireless networking for communication, for handling the enormous amount of
data generated, and for coordinating positions and monitoring enemy
movement. The computer science team has developed software that allows
robots featuring sensors, cameras, grippers, and mechanical arms to respond
to commands and recognize objects, including bombs. The robots with grippers
and robots with arms, which cost about $10,000, are designed to work
together in moving objects and placing sensors around the battlefield in
areas that
soldiers are unable to reach. "The purpose of utilizing this software is
for replacing soldiers in high risk areas so that it [the software] can
become the eyes and ears of soldiers," says Amir Shirkhodaie, professor of
mechanical and manufacturing engineering at TSU and head of the program.
'Mocha' Energizes Online Scheduling
Brown Daily Herald (01/27/06) Carmody, Brenna
Brown University students have a new alternative for registering for
classes online: Mocha, a program developed by five computer science
students. Programmers Dave Pacheco, Daniel Leventhal, Adam Cath, Dave
Hirshberg, and Bill Pijewski, all juniors, are touting Mocha as being
easier to use and more helpful than the Brown Online Course Announcement
(BOCA) system. With BOCA, students can search by only one criterion at a
time, and the system often slows down as the start of classes
approaches. However, Mocha is designed to enable "any kind of search you
can think of," says Pacheco, adding that the load on the servers is not
overbearing because the new interface condenses searches. Using Mocha,
students can add courses to a shopping cart, create a color-coordinated
schedule, bookmark classes, and select classes without entering a number of
zeroes. Mocha is available on a Web site created by the students, and the
Computer Science Department is hosting the Web site on its servers. The
programming students say future enhancements to the software will include
compatibility with Apple's iCal calendaring program, email exam reminders,
and a way to link users' schedules.
The Computer Virus Comes of Age
Financial Times (01/30/06) P. 6; Palmer, Maija
The appearance of the Brain virus 20 years ago touched off an age of
computer vulnerability that has advanced from a single slow-moving,
relatively innocuous virus transmitted via floppy disk to an estimated
120,000 viruses today, some of which are capable of bringing down corporate
networks and
intercepting sensitive personal information. The roughly 1 billion
Internet users, many of whom use high-speed connections, enable viruses to
travel far more quickly today than they did in the days of Brain. MyDoom,
for instance, spread through email, infecting an estimated 250,000
computers a day in 2004. Sophos' Graham Cluley estimates that a computer
operating without anti-virus software has a 50 percent chance of being
infected by a virus if it is connected to the Internet for just 15 minutes,
even if it transmits no email and stays off of the Web. Antivirus software
is reasonably effective at keeping intruders out, but it comes at a
tremendous expense (spending on antivirus software is expected to reach
$5.9 billion by 2009) and drains a computer's processing power. Whereas
early viruses were relatively benign, often the product of a teenager
showing off for his friends, virus writers have grown more malicious,
deploying programs that erase hard drives, crash networks, and swipe
identities. Today's viruses do not make the same headlines as the infamous
Love Letter and Anna Kournikova viruses early in the decade, but they are
far more destructive, and often the product of organized criminal gangs.
"Now that the goal is for profit, we are seeing fewer big outbreaks of
viruses," said McAfee's Sal Viveros. "The virus writers don't want to make
headlines, they want to target a smaller number of people for specific
information." The U.S. Treasury advisor reports that revenue from
cybercrime now exceeds the illegal drug trade, and the trend is only likely
to accelerate should hackers turn their sites to mobile devices.
Building Trust and Validation Into Distributed Computer
Networks
University of Southampton (ECS) (01/30/06)
The Provenance research project in Europe is focused on providing a way to
document results generated while using distributed computer networks, and
to manage the history of the information within the Grid infrastructure.
The EU project is designed to offer authorized users of Grids the necessary
background information for reviewing processes or experiments for errors.
"Ultimately we are building trust, proof and validation into Grids,
enabling users to have the highest levels of confidence in the information
available," explains Steve Munroe, EU Provenance Exploitation Manager in
the School of Electronics and Computer Science (ECS) at the University of
Southampton. The project has developed an initial public version of
software requirements for the provenance architecture, with logical and
process components for provenance systems. "Provenance can be used to
determine that a given process has adhered to the necessary regulations,"
says Munroe, "thus enabling the end user to place trust in the results
received." ECS professor Luc Moreau is the architect of the project, which
includes IBM UK and the Computer and Automation Institute of the Hungarian
Academy of Sciences among its partners.
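The project's actual architecture is only outlined in the article, but the
underlying idea of provenance can be illustrated minimally: each service
that touches a piece of data appends an assertion recording what it did, so
an authorized reviewer can later reconstruct how a result was produced. The
store, record fields, and hash chaining below are illustrative assumptions,
not the EU Provenance design.

    # Minimal sketch of the general idea (not the project's architecture):
    # each step of a distributed process appends an assertion so a reviewer
    # can later reconstruct how a result was produced.
    import hashlib
    import json
    import time

    provenance_store = []   # in a real system, a shared and trusted service

    def record(actor, activity, inputs, outputs):
        entry = {
            "actor": actor,
            "activity": activity,
            "inputs": inputs,
            "outputs": outputs,
            "timestamp": time.time(),
            "prev_hash": provenance_store[-1]["hash"] if provenance_store else None,
        }
        # Chain entries together so tampering with the history is detectable.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        provenance_store.append(entry)
        return entry

    record("assessor-service", "estimate-damage", ["claim-123"], ["estimate-123"])
    record("insurer-service", "approve-claim", ["estimate-123"], ["settlement-123"])

    def history(item):
        """Return every recorded step that consumed or produced an item."""
        return [e for e in provenance_store if item in e["inputs"] + e["outputs"]]

    for step in history("estimate-123"):
        print(step["actor"], "->", step["activity"])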
Browsers Face Triple Threat
Techworld (01/31/06) Broersma, Matthew
Researcher Michal Zalewski says there are three bugs in the way browsers
handle cookies, which he collectively calls "cross-site cooking," that could
possibly be used to carry out attacks on several commercial Web sites. "Cooking"
attacks may be used against commercial sites to overwrite stored
preferences, session identifiers, authentication data, and shopping cart
contents to commit fraud, according to Zalewski. The bugs lie in the way
cookies are created and accepted, and they have not been fixed in the major
browsers, even though they were first identified eight years ago. "These
shortcomings make it possible (and alarmingly easy) for malicious sites to
plant spoofed cookies that will be relayed by unsuspecting visitors to
legitimate, third-party servers," wrote Zalewski in a post to the BugTraq
security mailing list. Browsers normally reject cookies where the domain
specified is too broad; however, that does not work in Mozilla-based
browsers. The bug can be exploited to attack some sites with international domain names
and possibly steal information from e-commerce Web sites around the world,
according to Zalewski. He suggests making changes in the HTTP cookie
format, and implementing a workaround in which browsers maintain a list of
potentially affected top-level domains. Zalewski says browser vendors
must take action and strip the "idle" periods out of cookie domain data as
a possible solution to the problem.
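The article does not reproduce Zalewski's examples, but the kind of check at
issue can be sketched simply: a browser should refuse a cookie whose Domain
attribute is broader than the host that set it, including bare public
suffixes that would make the cookie visible to every site under them. The
tiny suffix list and function below are illustrative only, not any browser's
actual code.

    # Simplified sketch of cookie-domain validation; the "public suffix"
    # list is a stand-in for the real one.
    PUBLIC_SUFFIXES = {"com", "org", "net", "co.uk", "com.pl"}

    def cookie_domain_ok(setting_host: str, cookie_domain: str) -> bool:
        host = setting_host.lower().rstrip(".")
        domain = cookie_domain.lower().lstrip(".").rstrip(".")
        # The cookie domain must be the host itself or a parent of it ...
        if host != domain and not host.endswith("." + domain):
            return False
        # ... but never a bare public suffix, or the cookie would be sent to
        # every site under that suffix (the "cross-site cooking" risk).
        if domain in PUBLIC_SUFFIXES:
            return False
        return True

    print(cookie_domain_ok("shop.example.co.uk", "example.co.uk"))  # True
    print(cookie_domain_ok("shop.example.co.uk", "co.uk"))          # False: too broad
    print(cookie_domain_ok("shop.example.co.uk", "evil.com"))       # False: unrelated host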
Assurance Provider: Designing a Roadmap for Information
Security
Military Information Technology (01/28/06) Vol. 10, No. 1, Donnelly,
Harrison
Director of the National Security Agency's Information Assurance (IA)
Directorate Daniel Wolf is tasked with defining and deploying the IA
strategy to shield the Global Information Grid (GIG) and all related
programs. He also works with NSA customers to guarantee that the agency's IA
programs keep evolving to address their current and future needs for secure
networks and communications, supplying IA technical consulting services and
high-assurance products to the United States. Wolf says the IA roadmap for
the GIG is a collaborative effort with many organizations, including the
Defense Information Systems Agency and U.S. Strategic Command; he notes
that his group's work with the GIG's IA component can be used to fulfill
the IA requirements of the Defense Department's Information Sharing
Environment (ISE) program. He remarks confidently that "IA features are
becoming common and, while we have a long way to go, the robustness of IA
features is also increasing." Shaping a system to provide an environment
where any piece of information can be cheaply and securely delivered to
anyone at any time is the core goal of the GIG-IA architecture, according
to Wolf. The IA roadmap calls for implementing improved information sharing,
auditing, and misuse-detection capabilities within a system-high
environment. Wolf describes the Cryptographic Modernization Initiative as
a joint DoD/NSA venture to facilitate a 21st century upgrade for
cryptographic capabilities, and says the Committee for National Security
Systems has organized a working group at the national policy level to synch
up the program's planning and deployment. Key trends Wolf sees in the
field of encryption technology include vendors' adoption of common
interoperable algorithms, as encouraged by NSA's definition of its Suite B
project. Acknowledging that IA's software foundation is weak, Wolf lists
such challenges as improving software product evaluation capabilities,
adding scalability to evaluation, moving to the system level without
introducing vulnerabilities, and accommodating software's ever-increasing
size and complexity.
Collaborative Advantage
Issues in Science and Technology (01/06) Vol. 22, No. 2, P. 74; Lynn,
Leonard; Salzman, Hal
The emerging global economy requires a new global strategy, as the United
States can no longer sustain technological domination, according to Case
Western Reserve University professor Leonard Lynn and the Urban Institute's
Hal Salzman. What is needed, the authors argue, are technology development
policies promoting "collaborative advantage" that "comes not from
self-sufficiency or maintaining a monopoly on advanced technology, but from
being a valued collaborator at various levels in the international system
of technology development." Lynn and Salzman say the generation and
proliferation of new communications, information, and work-sharing
technologies over the last 10 years have supported a new trade environment
that renders once-successful competitive strategies for U.S. firms
obsolete, and attempts to recapture technological leadership unworkable.
Several elements are cited as factors that could collectively undermine
U.S. firms' capacity to innovate: U.S. multinationals outsourcing tech
development operations without conducting rigorous cost/benefit analysis
first; increased numbers of offshored activities as a result of more and
more multinationals offshoring; dwindling multinational investment in
U.S.-based universities; the rapid emergence of competing innovation
systems; and America's waning ability to attract top science and technology
talent. Lynn and Salzman recommend the development of national strategies
that emphasize collaborative advantage, a goal that requires the creation
of new outsourcing cost/benefit analysis tools and an aggressive search for
global partnership opportunities. Monopolization of the global S&T
workforce must give way to a model that supports an environment where S&T
brainpower flows freely. Finally, America must devise an S&T educational
system that teaches collaborative competencies, not just technical
know-how.
XML: The Center of Attention Up and Down the Stack
IEEE Distributed Systems Online (01/06) Vol. 7, No. 1, Goth, Greg
Despite its historical failure to live up to its promise, XML appeared to
come into its own in 2005, with Microsoft and OASIS squaring off to secure
governance of the standards that will guide its development. ZapThink
projects that 48 percent of corporate networks will use XML by 2008, up
from just 15 percent in 2004. XML messages can be as much as 50 times
larger than binary messages, causing some to speculate that networks could
get bogged down with heavy XML traffic for Web services and
service-oriented architecture (SOA). IBM conducted XML efficiency tests in
2003 and found that only a small portion of a server's processing ability
handled the tasks of the application, with the rest consumed by overhead.
That finding helped spawn the niche industry of XML acceleration appliances,
in which a plug-and-play piece of hardware takes over the bulk of the XML
processing from the server. The popularity of the accelerator calls into question the
future of middleware, with many companies beginning to realize that they
can do without the enterprise service bus. Iona Technologies CTO Eric
Newcomer contends that such speculation is premature, however.
"Everybody's always predicting the death of some old technology, and we
still have mainframes and Cobol years after it was predicted they would
die," he said. Newcomer believes that middleware vendors can still survive
if they develop ways for their product to fit into the changing landscape.
The question of how to deploy XML most effectively has led IBM and Sun to
challenge Microsoft for control over the development of the OpenDocument
Format, a competition that industry analysts around the world are watching
closely.
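The size gap mentioned above is easy to see by encoding the same record both
ways; the exact ratio depends entirely on the message, and the record layout
below is invented purely for illustration.

    # Quick illustration of XML versus a packed binary layout for one record.
    import struct

    # Binary: a 32-bit id, a 64-bit price (in cents), a 32-bit quantity -- 16 bytes.
    binary = struct.pack("<IqI", 10234, 1999, 3)

    xml = (b'<?xml version="1.0" encoding="UTF-8"?>'
           b'<order><id>10234</id><priceCents>1999</priceCents>'
           b'<quantity>3</quantity></order>')

    print(len(binary), "bytes binary")   # 16 bytes
    print(len(xml), "bytes XML")         # over 100 bytes
    print(f"XML is roughly {len(xml) / len(binary):.0f}x larger for this record")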