2007 ACM Turing Award Winners to Speak at 45th Design
Automation Conference
Business Wire (05/20/08)
The winners of the 2007 ACM A.M. Turing Award will be guest speakers in a
session on the first day of the 45th Design Automation Conference (DAC), on
June 9, 2008. ACM honored Edmund M. Clarke, E. Allen Emerson, and Joseph
Sifakis with the award because of their contributions to the development of
model checking as a very effective verification technology, which is widely
used today. Clarke is the FORE Systems professor of computer science and
professor of electrical and computer engineering at Carnegie Mellon
University; Emerson is an endowed professor in computer sciences at the
University of Texas at Austin; and Sifakis is the founder of Verimag
Laboratory in Grenoble, France. "We are thrilled to have these three
distinguished individuals as speakers at DAC this year," says Limor Fix,
general chair, 45th DAC Executive Committee. "It's sure to be a memorable
highlight of the conference." The winners will be introduced by ACM
President Stuart Feldman, Jasper Design Automation CEO Kathryn Kranen, and
Intel director of research Andrew Chien. DAC is scheduled for June 8-13,
at the Anaheim Convention Center in Anaheim, Calif. ACM will present the
Turing Award to Clarke, Emerson, and Sifakis at the annual ACM Awards
Banquet on June 21, in San Francisco.
Alarming Open-Source Security Holes
Technology Review (05/20/08) Garfinkel, Simson
An open-source programming error made in May 2006, which reduced the amount
of randomness used to create cryptographic keys in the widely used OpenSSL
library, has created serious security vulnerabilities in at least four
open-source operating systems, 25 application programs, and millions of
computer systems. Although the vulnerability was discovered on May 13 and
a patch has been distributed, installing the patch does not repair damage
to the compromised systems and some computers may be compromised even
though they are not running the code. Modern computer systems use large
numbers to generate keys that are used to encrypt and decrypt data sent
over a network. The error reduces the number of different keys that Linux
computers can generate to 32,767, making it significantly easier for
hackers to guess the key. Moreover, keys created by the computers with the
error are not fixed when the patch is installed. It's impossible to know
how many computers are affected because vulnerable keys could have been
transferred to non-open source systems if a file encrypted by the flawed
system was transferred to another system. The error was made when
programmers incorrectly used a tool that was intended to catch programming
bugs that lead to security vulnerabilities. Programs that use OpenSSL
include the Apache Web server, the SSH remote access program, the IPsec
Virtual Private Network, secure email programs, and many others.
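The scale of the weakness is easy to illustrate: when a key generator's only remaining entropy is the process ID, which on Linux takes at most 32,768 values, every key it could ever produce can be precomputed in advance. A minimal Python sketch of the idea (using Python's PRNG as a hypothetical stand-in for OpenSSL's key derivation, not the actual flawed code):

```python
import random

MAX_PID = 32768  # Linux process IDs historically range from 1 to 32768

def flawed_keygen(pid):
    """Derive a 'key' from a PRNG seeded only by the process ID --
    the sole entropy left after the flawed 2006 change."""
    rng = random.Random(pid)      # stand-in for the crippled PRNG
    return rng.getrandbits(128)   # stand-in for real key generation

# Enumerate every key any affected machine could ever produce:
all_possible_keys = {flawed_keygen(pid) for pid in range(1, MAX_PID)}
print(len(all_possible_keys))    # 32767 -- a trivially searchable space
```

An attacker who captures a key generated this way need only check it against the precomputed set, which is why installing the patch cannot retroactively protect keys created while the flaw was live.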
Proponents, Critics Give No Ground in Tussle Over
E-voting
Computerworld (05/19/08) Vol. 42, No. 21, P. 12; Weiss, Todd R.; Gross,
Grant; McMillan, Robert
Advocates and critics of touch-screen voting systems are refusing to budge
on their opposing views about the technology's reliability, with the former
continuing to testify to its accuracy and security while the latter
maintain their position that susceptibility to hacking and miscounts are
among the risks users of e-voting systems run. Two studies released in
March from The Brookings Institution and InfoSentry Services conclude that
most voters are comfortable with touch-screen systems, while Gary Bartlett
with the North Carolina State Board of Elections says "the routine voter
has not expressed any dissatisfaction with or distrust of any type of
e-voting equipment." Skeptics counter that such studies are irrelevant
because they are based on people's beliefs and feelings rather than on hard
facts. Experts such as Johns Hopkins University professor Avi Rubin say
the chief problem with e-voting systems is the absence of any way for
voters to assuredly know that their votes are being counted properly.
Election Technology Council executive director David Beirne contends that
many e-voting opponents "are only focusing on the perceptions" of problems.
In some cases election officials are the ones doubting the technology's
reliability, while e-voting vendors and other supporters are quick to say
that human rather than technical error is usually the cause of problems
with e-voting machines.
Design Revamp for '$100 Laptop'
BBC News (05/21/08)
The new version of the One Laptop Per Child (OLPC) XO laptop looks like an
e-book and costs just $75. Instead of having a rubbery keyboard like the
first model, the XO2 has two touch screens divided by a hinge that allows
the device to be oriented either as a laptop and keyboard or like the pages
of a book. The design change combines the functions of a laptop,
electronic book, and electronic board, and allows several children to use
the device simultaneously. The XO2 is also more energy efficient, half the
size of the first device, and lighter and easier to carry. OLPC founder
Nicholas Negroponte says the XO2 is a totally new concept for learning
devices. Initially the new device will be promoted as an e-book reader
with the ability to store more than 500 e-books. OLPC says the price tag
for the new devices will be achievable because of falling flat-panel screen
prices, as the display is the most expensive of the new laptop's components.
Negroponte also recently announced the resumption of the Get-One-Give-One
program that allows people in wealthy nations to buy two XO laptops and
donate one to a child in a developing nation.
The H-1B Visa Dilemma, Part 2: What to Do?
TechNewsWorld (05/20/08) Burger, Andrew K.
Google's Keith Wolfe says Google and other American companies are in fierce
competition with foreign companies for the world's top talent, and if H-1B
visa restrictions prevent U.S. employers from hiring the best candidates,
many of whom already study at American universities, foreign companies will
hire them instead. Meanwhile, H-1B opponents say the national economy and
U.S. programmers, particularly those over 30, suffer when American
companies hire foreign workers. Although the high-tech industry claims
that it needs to import workers because there is a shortage of qualified
people in the U.S., University of California, Davis professor Norman
Matloff says that argument "flies in the face of the economic data."
Matloff says companies are using visa programs to avoid hiring older
workers, giving the example of one of his students who held several
patents but was forced to leave the field after he turned 30 because he
found it difficult to get engineering work, even though his former
employers were hiring workers with H-1B visas. Part of the problem is that
the legal definition of prevailing wage, which companies must pay H-1B
employees, is filled with gaping loopholes, according to a 2003
Congressional report. Matloff says that Congress added even more loopholes
in 2004 legislation. "The advocates of globalization are right about one
thing: Globalization is here to stay," Matloff wrote in the November 2004
issue of Communications of the ACM. "But their claims of its benefits are
misleading, and their remedies will not work, leading only to frustration
and disappointment by U.S. IT workers and missed opportunities by U.S.
businesses. Genuinely thoughtful, realistic solutions to the problems are
imperative."
Homeland Security Helps Reduce Open Source Flaws
InternetNews.com (05/20/08) Kerner, Sean Michael
A Department of Homeland Security multi-year initiative intended to
improve open source code quality, launched over two years ago, has reduced
the defect density in 250 open source projects by 16 percent, essentially
eliminating over 8,500 defects, Coverity reports. The report comes at a
time when open source software is becoming an increasingly integral part of
critical infrastructure in government and private enterprise. Coverity
runs scanning tools on the open source projects included in the initiative
to find coding errors. While many of the projects have benefited from
running Coverity's scans, not every project has managed to reduce errors,
primarily because they have not been actively using the results from the
scan. Projects working in Perl, PHP, Python, Postfix, Samba, and TCL have
been able to reduce their code defect densities by using data from the
Coverity scans. Coverity found a clear pattern indicating that certain
errors occur more often, specifically null pointer dereferences, which
occurred 28 percent of the time. The Coverity report says this error often
occurs when one code path initializes a pointer before its use, but another
code path bypasses the initialization process. The second most common
defect is resource leaks, at 26 percent of all defects, which often involve
failure to release resources when the initial allocation succeeds.
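Both defect classes follow recognizable shapes. The sketch below recreates them in Python for brevity (the scanned projects are largely C and C++; the function names and data here are illustrative, not drawn from any scanned codebase):

```python
def lookup(table, key):
    """One code path initializes the result; another bypasses it --
    the shape of the null-dereference defect Coverity flags most often."""
    entry = None
    if key in table:
        entry = table[key]
    return entry.upper()   # BUG: dereferences entry even when lookup failed

def lookup_fixed(table, key):
    """Guard every path before the dereference."""
    entry = table.get(key)
    if entry is None:
        return None
    return entry.upper()

def read_first_line(path):
    """Resource-leak shape: the allocation succeeds but is never
    released before the function returns."""
    f = open(path)
    return f.readline()    # BUG: f is never closed

def read_first_line_fixed(path):
    with open(path) as f:  # context manager releases f on every path
        return f.readline()
```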
I, Computer: Engineered Bacteria Become the First Living
Computers
Science News (05/19/08) Barry, Patrick
Scientists have genetically engineered the bacterium E. coli so that its
DNA computes a classic mathematical puzzle known as the burned pancake
problem. The new research is the first to achieve DNA computation in
living cells. "Imagine having the parallel processing power of a million
computers all in the space of a drop of water," says Davidson College
biologist Karmella Haynes. "It's possible to do that because cells are so
tiny and DNA is so tiny." The potential computational power of programmed
bacteria is immense, but the DNA-computation system created by Haynes and
her colleagues can only handle a limited set of mathematical problems.
Meanwhile, researchers in Israel recently designed DNA molecules that can
compute games of tic-tac-toe. "I liken this to where video games were when
Pong first came out," says Missouri Western State University mathematician
Jeffrey Poet. The burned pancake problem that Haynes's E. coli DNA solves
is a metaphor for sorting large amounts of data into the right order by
repeatedly flipping chunks of data. Knowing the minimum number of flips to
solve the problem allows programmers to know when their software has been
fully optimized to sort data as quickly as possible. The flipping is done
by an enzyme taken from the salmonella bacterium. The enzyme flipped
segments of E. coli's DNA marked by genetic flags until the DNA was sorted
into the correct order, at which point it spelled out a gene that gives the
bacterium resistance to an antibiotic.
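In the puzzle itself, each pancake has a size and a burned side, and the only allowed operation is inserting a spatula and flipping everything above it, which reverses both the order and the orientation of those pancakes. A simple greedy solver can be sketched in Python (this mirrors the abstract puzzle the bacteria compute, not the researchers' biological implementation, and it is correct but not always minimal in flips):

```python
def flip(stack, k):
    """Flip the top k pancakes: their order reverses and each one is
    turned over (the burned-side-up flag toggles)."""
    return [(size, not burned_up)
            for size, burned_up in reversed(stack[:k])] + stack[k:]

def sort_burned_pancakes(stack):
    """Greedy solver. A pancake is (size, burned_side_up); index 0 is the
    top. Goal: sizes ascending from the top, every burned side facing down.
    Returns the sorted stack and the number of flips used."""
    stack = list(stack)
    flips = 0
    for pos in range(len(stack) - 1, -1, -1):
        i = max(range(pos + 1), key=lambda j: stack[j][0])
        if i == pos and not stack[pos][1]:
            continue                  # largest already placed, burned side down
        if i > 0:                     # bring the largest pancake to the top
            stack = flip(stack, i + 1)
            flips += 1
        if not stack[0][1]:           # make its burned side face up...
            stack = flip(stack, 1)
            flips += 1
        stack = flip(stack, pos + 1)  # ...so this flip lands it burned side down
        flips += 1
    return stack, flips
```

Knowing the true minimum flip count for a given stack, as the article notes, is what tells a programmer when a flip-based sorting routine cannot be improved further.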
Researchers Find New Ways to Steal Data
IDG News Service (05/19/08) McMillan, Robert
Researchers at the University of California, Santa Barbara (UCSB) and
Saarland University in Saarbrucken, Germany, have found unconventional ways
of stealing data. In Saarbrucken, the researchers have been able to read
computer screens using reflections on objects such as glasses and teapots.
Meanwhile, UCSB researchers have created Clear Shot, software that analyzes
a video of hands typing on a keyboard to determine what was being written.
Clear Shot was inspired by the movie "Sneakers," in which Robert Redford's
character obtains a video of his potential victim typing in his password
and says he is going to get a "clear shot." Clear Shot can analyze video
of hand movements on a computer keyboard and transcribe them into text.
UCSB graduate student Marco Cova says Clear Shot is accurate about 40
percent of the time. The software also suggests alternative words that may
have been typed. Saarland University professor Michael Backes says his
research began as a fun project to see if he could tell what other people
were working on by watching windows near computer monitors. The
researchers soon found that using a $500 telescope focused on a reflective
object in front of a monitor could create readable images of Word
documents. The researchers are now working on new image analysis
algorithms and using astronomical cameras in the hopes of getting better
images from more difficult surfaces such as the human eye.
USF Profs' Robots Arrive as Rescuers
St. Petersburg Times (05/16/08) Sickler, Shannon Colavecchio-Van
University of South Florida (USF) professor Robin Murphy and Stanford
University professor Clifford Nass have received a $500,000 Microsoft grant
to develop the Survivor Buddy, an emergency robot companion for people
stuck in dangerous situations. Murphy envisions the robot playing soothing
music to trapped victims while displaying images of loved ones or rescuers
trying to reach them. The robot will also be able to give the victim water
and relay vital signs to doctors. Murphy, as director of USF's
Institute for Safety Security Rescue Technology, works to create robots
designed for disaster rescue. The first test of Murphy's robots came after
the Sept. 11, 2001, attack on the World Trade Center. Murphy's robots
could go deeper into the wreckage than rescue workers and dogs, and while
the rubble was still burning. Murphy's robots were also used to respond to
hurricanes Charley, Wilma, and Katrina. Murphy spent her 50th birthday and
25th wedding anniversary at the Crandall Canyon mine in Utah last year,
trying to get a robot outfitted with cameras and other equipment to find
the six trapped workers. "My dream is that one day you'll see rescuers and
dogs at a disaster site, but if you don't see a robot you'll say, 'Where
are they?' because they'll have become so commonplace," Murphy says.
"They'll do things dogs and people can't."
A Baseball Cap That Reads Your Mind
PhysOrg.com (05/16/08) Zyga, Lisa
Researchers at Taiwan's National Chiao-Tung University and National
Cheng-Kung University and the University of California, San Diego have
designed a new bio-signal monitoring system that fits inside a baseball cap
and detects and analyzes electroencephalogram (EEG) signals from the
wearer's brain. The cap is capable of determining if someone is getting
too tired to drive based on brain-wave patterns and could be configured to
control TVs, computers, and other electronic devices. The wireless system
can process and provide feedback in real time. The researchers say that
measuring EEG signals enables the brain-computer interface system to
monitor an individual's physiological and cognitive states. The system
capitalizes on recent advancements in sensor and information technology to
reduce power consumption and production costs. It can run on a lithium-ion
battery for about two days before needing charging, and the researchers
hope to increase the device's efficiency. The cap includes five embedded
dry electrodes for the wearer's forehead and one electrode for behind the
ear to read EEG signals. The system includes Bluetooth transmissions for
distances up to 10 meters and RF transmissions for distances up to 600
meters.
Video Search Engine Watches and Learns
New Scientist (05/19/08) Reilly, Michael
Researchers led by Adrian Ulges at Germany's University of Kaiserslautern
have developed TubeTagger, a Web video categorization program that learns
how to add keyword tags to videos by watching YouTube. After being given a
keyword such as "soccer," TubeTagger automatically downloads 50 YouTube
videos that humans have labeled with that word and examines the color and
motion content of each video. The researchers repeated this learning
process 22 times using words associated with common sports and words such
as "riot" or "interview." After the training, TubeTagger was shown a
series of videos that it had never seen before, after which it came up with
three possible tags for each video as well as a confidence level for each
tag. TubeTagger chose the most appropriate tag 37 percent of the time, but
the success rate dropped when the system was tested using videos from other
sources. The program also scored better with some words than with others.
Ulges says these results prove the concept and suggest that TubeTagger's
ability to learn from "real world" video makes the program more scalable
than other systems.
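The article does not describe TubeTagger's internal model, but the learn-then-suggest loop it describes can be sketched with a generic nearest-centroid classifier over color/motion feature vectors (the function names, feature layout, and confidence formula here are all illustrative assumptions):

```python
from collections import defaultdict
import math

def train(labeled_clips):
    """labeled_clips: (tag, feature_vector) pairs, e.g. color/motion
    histograms from YouTube videos humans have already tagged.
    Returns one mean feature vector (centroid) per tag."""
    sums, counts = {}, defaultdict(int)
    for tag, vec in labeled_clips:
        if tag not in sums:
            sums[tag] = list(vec)
        else:
            sums[tag] = [a + b for a, b in zip(sums[tag], vec)]
        counts[tag] += 1
    return {tag: [x / counts[tag] for x in s] for tag, s in sums.items()}

def suggest_tags(centroids, vec, top_n=3):
    """Rank tags by closeness to a new clip's features and convert
    inverse distances into rough confidence scores."""
    weights = {tag: 1.0 / (math.dist(c, vec) + 1e-9)
               for tag, c in centroids.items()}
    total = sum(weights.values())
    ranked = sorted(weights, key=weights.get, reverse=True)[:top_n]
    return [(tag, weights[tag] / total) for tag in ranked]
```

The drop in accuracy on non-YouTube test videos that the researchers observed is the classic symptom of such a model overfitting the feature statistics of its training source.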
Inside Lockheed Martin's Wireless Security Lab
Network World (05/19/08) Reed, Brad
Lockheed Martin's Wireless Cyber Security Lab is engaged in a race with
hackers to catch flaws and vulnerabilities in wireless security in the
hopes of correcting them before they are exploited. "We're trying to
ensure that something similar [to 9/11] doesn't happen in the realm of
wireless communications," says lab director John Morrison. Lockheed
Martin's Perri Nijeb says the biggest nascent wireless security threat is
the blurring of the boundary between home and the office, as employees
increasingly access company data via corporate VPNs from their residences.
To address this problem, the company has been testing numerous types of
consumer technology, including cell phones, which have been moving onto
enterprise networks. The spread of Wi-Fi hot spots has been of particular
concern because of the technology's growing ubiquity in urban areas.
Nijeb cites "connection hijacking, deliberate or inadvertent denial
of service, the creation of security holes in corporate or government
networks, and difficulty in attributing network actions to specific IP
addresses, due to the ease of hijacking" as major issues with Wi-Fi, which
Morrison says can add up to immense burdens for corporate IT departments
that fail to educate their users about security matters. Lockheed Martin
R&D investigator Jason Crawford says the proliferation of Bluetooth
technology is also a worrying trend, as products capable of picking up
Bluetooth signals outside their transmission range could theoretically be
used to track people. The problems that Lockheed Martin's wireless
security lab is focusing on are also challenges for the U.S. military,
particularly as they relate to the security of its battlefield
communications networks. Morrison says soldiers' vulnerability is highest
when they use wireless communications in crowded urban settings, which
parallels the risk corporate users run when they link to enterprise
networks using home-based Wi-Fi connections.
Wanted: More Hispanics in STEM Fields
eSchool News (05/06/08) Devaney, Laura
An increasing number of businesses and education groups are launching,
funding, and supporting efforts to increase the number of minorities,
particularly Hispanics, in science, technology, engineering, and
mathematics (STEM) fields. A Public Agenda study, titled "A Matter of
Trust," found that nearly half of Hispanic parents say it is a serious
problem that students are not taught enough math and science, and that they
are more likely to support making sure U.S. standards match those in Japan.
The study also found that less than half of young Hispanic adults believe
that qualified students can find a way to pay for college. Study authors
Paul Gasbarra and Jean Johnson of Public Agenda say education, and higher
education in particular, is highly prized and respected among Hispanic
parents, more so than among parents in general, despite erroneous conventional
wisdom that would suggest otherwise. The authors also say that far too
many Hispanic families are underserved by public education, to a
significantly higher degree than the general population. A TRPI report,
titled "STEM Professions: Opportunities and Challenges for Latinos," found
that Hispanics also suffer from a larger gender gap in STEM careers, when
compared with Asians and African-Americans. However, the report also said
that as the fastest growing ethnic group in the United States, Latinos have
a unique opportunity to aim high and strive for STEM careers, given the
high demand for talent in those fields. A recently released NACME report,
"Confronting the New American Dilemma, Underrepresented Minorities in
Engineering: A Data-Based Look at Diversity," calls for K-12 educators to
infuse STEM education throughout the K-12 curriculum through active,
hands-on, project-based learning, and to introduce students to STEM
careers, starting with preschool awareness activities.
Turning Conventional Video Coding Wisdom on Its
Head
ICT Results (05/19/08)
European researchers have inverted the traditional video coding model with
an alternative model proposed through the Discover distributed video coding
(DVC) research project. Project coordinator Luis Torres says the effort
yielded a software codec that was already "very competitive"
with those created in the United States. By the end of last year, Discover
could exhibit the best rate distortion performance of any DVC codec in the
world, although Torres acknowledges that a lot of work must be carried out
before the Discover codec can generate picture quality comparable to
television. "I am quite sure, in the future, new projects will see DVC
quality catch up with current mainstream broadcast technology and become
indistinguishable from it," he says. Once this milestone is reached,
existing and planned DVC applications such as the provision of
high-quality, real-time video feed by wireless video transmission and
wireless surveillance networks can be optimized, Torres says. Other
concepts under development include a new multi-view image acquisition
standard entailing the generation of a 3D effect using several unconnected
cameras viewing the same scene from different angles and positions, and
transmission of camera imagery from inside the human body for medical
purposes. The Discover software is freely available to the recording
community and other interested parties on the project Web site.
Commercialising the Semantic Web
ZDNet (05/15/08) Miller, Paul
Commercializing the Semantic Web was discussed in a final session of a
track at this year's World Wide Web Conference in Beijing. Giovanni
Tummarello addressed the commercial momentum involving DERI's Sindice
research project; and Nigel Shadbolt discussed how Garlik made use of
university research, built a consumer user-base, and moved toward a
monetary position with major banks. Shadbolt acknowledged how the Semantic
Web allowed Garlik to work faster on the DataPatrol product and made
programming easier. "Unpredictable data is hard to work with, without the
Semantic Web," Shadbolt said. Tim Berners-Lee stressed that Semantic Web
technologies can help facilitate the "unexpected re-use" of data, and the
panelists went on to discuss how their own projects led to new
opportunities to use data in other ways. When it comes to users and
investors, the industry will have to push the applications and solutions
built on the platform, rather than the Semantic Web itself, the panelists
agreed.
Geography as IT Job Destiny
eWeek (05/16/08) Perelman, Deb
New data from IT job board Sapphire Technology reveals significant
patterns in the availability of technology jobs in different regions in the
United States. For example, more than 58 percent of all available tech
jobs in Austin, Texas, were in software development, primarily due to the
large number of startup companies in Austin. In Chicago, project
management positions accounted for more than 52 percent of job listings,
which Sapphire's Mike Giglio attributes to the large number of mergers that
have occurred in recent years. Software development skills were also in
high demand in Tampa and Fort Lauderdale, Fla., the D.C. metropolitan area,
and Sacramento, Calif. Meanwhile, one-third of available IT job listings
for the Los Angeles area were for desktop support. In fact, there were
more listings for help desk jobs than for any other type of technology work
in Los Angeles. Giglio says that companies in the Los Angeles area are so
fast-paced that if anything happens to the systems they depend on, they
need it fixed right away, and they staff accordingly.
Supercomputing's Pied Piper
Government Computer News (05/19/08) Vol. 27, No. 11, Jackson, Joab
The 2008 Government Computer News Technology Leadership Award recipient
Charles Holland, deputy director of the Information Processing Techniques
Office (IPTO) at the Defense Advanced Research Projects Agency (DARPA), has
had a huge impact on the direction of government high-performance
computing. Before others in the military and industry realized there was a
problem, Holland warned the United States was falling dangerously behind
other countries in supercomputing power. At IPTO, Holland oversees
development of cognitive systems, language processing, sensors, and other
advanced computing challenges. Holland is also the program manager of the
High Productivity Computing Systems (HPCS) project, which aims to advance
supercomputing to the petascale era. In 2000, Holland was lead writer on a
report that described how the United States was losing its superiority in
supercomputing and could consequently lose tactical advantages. After
presenting the report to Congress, Holland was asked to commission a
program to get the United States back to the cutting-edge of technology.
Five years later, Holland joined DARPA and became manager of the HPCS
program. Cray CEO Peter Ungaro says the program is the largest
supercomputing research-and-development program ever and is the driving
force in shaping the future of high-performance computing.
Pflops Here; Now What?
EE Times (05/19/08) No. 1527, P. 1; Merritt, Rick
The IBM Roadrunner is expected to be the first supercomputer in the world
to benchmark at a sustained rate of 1 petaflops, but the larger issue
surrounding this breakthrough may be the insights into science and parallel
computing applications that users of petaflops-scale machines could gain.
"What it really accomplishes is giving scientists an ability to do more
science than they could before," says Oak Ridge National Laboratory
supercomputer center project director Buddy Bland, whose lab has a 263
teraflops system that has been used to run global climate simulations and
to design a fusion reactor. Oak Ridge researchers have been testing new
parallel programming languages in preparation for the deployment of a
100,000-core petaflops system. Meanwhile, Bill Thigpen with NASA's Ames
Research Center says he has seen a widening gap between the rate at which
benchmark performance is increasing and the gains in actual work delivered
by new systems, and the problem lies in giving software the scalability to
accommodate a growing number of cores. Among the areas driving NASA's own
plans to implement a petaflops-scale machine is the agency's computing
requirements for spacecraft design and research into ways to address global
warming and other pressing environmental problems, Thigpen says. IBM
Roadrunner's use of heterogeneous processor cores may ultimately be of
greater value to scientists than reaching the petaflops milestone itself.
"Once you get people to think about building algorithms for systems that
are memory-constrained, using heterogeneous cores is not a problem," says
Roadrunner project chief engineer Don Grice.