Open-Source Bug Hunt Results Posted
Government Computer News (03/06/06) Jackson, Joab
Through a far-reaching analysis of open-source code sponsored by the
Department of Homeland Security, Coverity has found an average of fewer than
half a defect per 1,000 lines of code, with even lower
rates in popular applications such as the Linux kernel and the Apache Web
server. The results are the first to appear from the three-year, $1.2
million grant awarded to Coverity, Stanford University, and Symantec. DHS
is hopeful that calling attention to the bugs will prompt developers to fix
them, shoring up vulnerabilities that could be exploited by hackers
to disrupt or take over a system. Coverity CTO Ben Chelf noted that while
the automated scan cannot detect every bug, it discovered some that are
overlooked by in-house reviews. The scan found that XMMS is the cleanest
program, with only six bugs in 116,899 lines of code. The Advanced
Maryland Automatic Network Disk Archiver (AMANDA) proportionally had the
most bugs, with 108 discovered in its 88,950 lines of code. The bug
density for all the programs was 0.43 per thousand lines of code, with the
LAMP stack registering just 0.29 defects per thousand lines of code. Chelf
notes the difficulty of making comparisons between open-source code and its
commercial counterparts, given that Coverity has only tested a few
commercial applications. Coverity has concluded from its study that the
size of a program is a poor indicator of quality, as Linux has
comparatively few bugs, while a smaller program such as AMANDA may contain
many. The number of developers at work on a project, in proportion to its
size, is a better predictor of overall quality.
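The defect-density figures quoted above are easy to reproduce; a minimal sketch in Python, using only the counts from the article:

```python
# Defect density, in bugs per thousand lines of code (KLOC),
# for the programs cited in the Coverity scan.
def defects_per_kloc(bugs, lines_of_code):
    """Return defect density as bugs per 1,000 lines of code."""
    return bugs / (lines_of_code / 1000.0)

# XMMS: 6 bugs in 116,899 lines -- the cleanest program scanned.
xmms = defects_per_kloc(6, 116_899)
# AMANDA: 108 bugs in 88,950 lines -- proportionally the buggiest.
amanda = defects_per_kloc(108, 88_950)

print(f"XMMS:   {xmms:.3f} bugs/KLOC")   # ~0.051
print(f"AMANDA: {amanda:.3f} bugs/KLOC") # ~1.214
# The two bracket the 0.43 bugs/KLOC average reported across all programs.
```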
Quality and Jobs Will Prevail in Offshoring Blitz
Computerworld Australia (03/07/06) Crawford, Michael
A report from the Association of Computing Machinery (ACM) contends that
more call center operations, fundamental research, and other IT functions
are being outsourced, and 30 percent of the 1,000 largest companies in the
world are now moving jobs to developing countries.  For IT professionals to
remain employable, the ACM report says they should train in areas that are
less likely to become automated and develop a high skill level in these
areas. Meanwhile, research firm Gartner predicts that 10 percent of IT
departments will take a hit due to outsourcing over the next five years,
adding that units will also shrink in size because of offshoring and the
increasing commoditization of technology. However, IT managers in
Australia are not fearful of the reports on offshoring, arguing that their
expertise and quality of work will continue to be in demand. Moreover,
they believe companies will eventually come to understand the importance of
quality work, and will ultimately bring the IT work back in-house. Edward
Hore, IT manager for automotive specialists Bob Jane Group, expects to see
a reverse of the offshoring trend in three to five years. "Offshore
support will clearly start to come back in-house.  In 1999, Gartner said all
support desks would be outsourced, but in the last three years we have seen
the exact opposite, as IT departments are growing rapidly," says Hore.
The report "Globalization and Offshoring of Software--A Report from the ACM
Job Migration Task Force" is available at
http://www.acm.org/globalization
Good Idea: Reinventing Invention
Wired News (03/07/06) Glasner, Joanna
While the 1,000 companies that spend the most money on research and
development are increasing their investment, many of the innovations that
they produce are not intuitive to the average consumer. Analysts who study
innovation agree that the best inventors are those who recognize that most
inventions fail. Companies must also be able to look at the strategies
that drive success for their competitors and evaluate how they could be
implemented within their own organization. U.S. auto makers exemplify the
consequences of ignoring the success of your competition, while Microsoft,
in contrast, has responded with agility to competitive threats, frequently
by purchasing potential rivals. With the body of innovation steadily
growing, it becomes increasingly difficult to develop a breakthrough
invention, which explains the declining return on research and development
spending that many companies are experiencing. Ultimately, this trend will
lead to increased specialization and more team-oriented research, according
to Benjamin Jones, a professor at the Kellogg School of Management.
"Whenever researchers look at innovation, they see this upward trend in
collaboration," he said. "People are becoming more specialized over time
and they need to work in bigger teams." While specialization will be a key
driver of innovation, it will not necessarily require a higher level of
education, as researchers in the future will break down large systems into
manageable pieces, enabling developers to focus on improving one part of a
product without having to understand the whole.
Microsoft Gets Behind the Wheel
Financial Times (03/08/06) P. 10; Mackintosh, James
Microsoft unveiled the first cars to carry the full version of Windows
Mobile for Automotive at last week's motor show in Geneva, and it is due to
present a Volkswagen concept car with complete Internet access at Hanover's
CeBIT technology exhibition tomorrow. After cornering the desktop market
and launching initiatives in robotics, PDAs, and mobile phones, Microsoft
is eyeing the automotive industry as the next major platform for growth.
Microsoft is angling for control over the "infotainment" systems in a car,
including the technology that powers satellite navigation, music, and
mobile phone connections. Fiat, Microsoft's first partner in the auto
industry, is offering a Blue & Me package, which includes a version of
Windows, a USB port installed at the factory, and a wireless Bluetooth link
for hands-free mobile-phone use. Fiat will offer navigation, including
mapping devices, in future models, and the controls for every application
will be voice-controlled. Microsoft has provided Windows to 19 car
companies for several years, though the company's name has not been
attached. Microsoft has a built-in advantage with its popular Media Player
and other widely used software, and it is currently developing
Internet-accessible telematics applications, as well as remote diagnosis
for technical problems. Many car manufacturers have been reluctant to
incorporate more technology into their systems after high-profile fiascos
such as BMW's iDrive. Fiat's USB port comes in stark contrast to the
tendency of many auto companies to make their software opaque out of a fear
that their brand would be damaged if software malfunctioned. Fiat is
debating whether to allow users to download and install upgrades on their
own with the USB storage device. Fiat's first models already need an
upgrade, as they can play music from an iPod but cannot play music from
Apple's iTunes store because of Apple's proprietary compression
technology.
Software Shows Mona Lisa to Be Neither Man, nor da
Vinci
News-Gazette (03/06/06) Kline, Greg
A team of University of Illinois researchers using facial-recognition
software has concluded that Leonardo da Vinci was not the model for the
Mona Lisa, and, moreover, that there is a 60 percent probability that the
model was a woman. Last year the researchers used the software to assess
the mood of the painting's subject, which they reported as happy, with
tinges of anger, fear, and disgust. Illinois electrical and computer
engineering professor Thomas Huang is also exploring facial recognition for
multiple applications, including security systems and personalizing a
computer by training it to react to its user's facial expressions. The
technology could be used to create smart kiosks that recognize users or
adaptive billboards that change their message depending on whether a man or a
woman walks by.  Huang has applied the technology to Edvard Munch's "The Scream,"
identifying surprise as the predominant emotion, and a self-portrait of
Vincent van Gogh, which he found to convey profound sadness. In applying
the gender identification feature of the facial-recognition program to the
Mona Lisa, Huang and his students developed a scale of differences among
faces from a database of facial photos and plotted the features of males
and females. When applied to the Mona Lisa, the software reported a 60
percent probability that the subject was a female. When they plugged the
Mona Lisa and a self-portrait of da Vinci into the database, they found
that the images are more than likely different people.
Flexible CRT Displays
Technology Review (03/07/06) Bullis, Kevin
Researchers at Rensselaer Polytechnic Institute, Northeastern University,
and New Mexico State University have developed a technique that could
produce flexible, flat-screen cathode ray tube (CRT) displays using a
pre-patterned surface to direct the growth of carbon nanotubes, which serve
as the electron emitters.  The researchers pour a liquid over the nanotubes,
which is cooked until it
becomes a polymer. The polymer retains the nanotube pattern after it is
peeled off. The researchers claim that by isolating single nanotubes, they
have yielded the most efficient electron emissions yet reported.
Rensselaer postdoctoral materials science and engineering researcher
Swastik Kar, the lead author of the study, noted that a working prototype
display is still a few years away, as flexible nanotube displays still lack
the electronics needed to govern their individual pixels.
Nanotube-plastic composites could lead to other nanotube-based devices, and
their pressure sensitivity could lead to the development of nano-skin that
would mimic the human sense of touch. Researchers are also developing
nanotubes that can be used as adhesives, modeled on the foot structures that
enable a gecko to cling to a wall.  "Flexible nanotube-polymer films will
find a large range of applications, not only for electronics, but also for
sensing applications and even optical applications," said Liming Dai,
professor of materials engineering and chemistry at the University of
Dayton, Ohio. "It's an important area. Now is the time for people to push
these things toward real applications."
Europe's Fastest Supercomputer Turned On
IDG News Service (03/08/06) Niccolai, James
A demonstration of the Julich Blue Gene/L (JUBL) supercomputer is expected
to take place this week at the CeBIT trade show in Hanover, Germany,
according to the Julich Research Center and IBM. The research center in
Germany went live about a month ago with JUBL for environmental and
particle physics research, and about 30 users are currently accessing the
supercomputer for 10 projects. JUBL has a peak performance of 45.8
teraflops, or 45.8 trillion floating-point operations per second, making it
the fastest supercomputer in Europe.
The supercomputer is an IBM Blue Gene system that has more processors,
which also are packed more tightly together, than Julich's JUMP (Julich
Multi Processor) supercomputer, which has a peak performance of 8.9
teraflops.  JUBL has 2,048 processors in each of its eight
racks for a total of 16,384, compared with 32 processors in each of the 41
racks for a total of 1,312 processors in JUMP. "The performance of each
processor is lower, but the architecture gives you more flexibility because
you have less heat to transfer out of the machine," explains Dr. Norbert
Attig of Julich's Central Institute for Applied Mathematics.  On the
Linpack performance benchmark, JUBL achieves 36.5
teraflops, compared to 280.6 teraflops for the fastest supercomputer in the
world, the Blue Gene/L System at the U.S. Department of Energy's Lawrence
Livermore National Laboratory in California.
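The rack arithmetic in this summary can be checked directly; a small Python sketch using only the counts given above:

```python
# Processor counts for the two Julich machines, from the figures above.
jubl_total = 8 * 2048    # JUBL: eight racks of 2,048 processors each
jump_total = 41 * 32     # JUMP: 41 racks of 32 processors each

print(jubl_total)        # 16384
print(jump_total)        # 1312

# JUBL's Linpack (sustained) result as a fraction of its theoretical peak:
linpack_efficiency = 36.5 / 45.8
print(f"{linpack_efficiency:.0%}")   # 80%
```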
UM Engineers Pioneer Digital Fingerprinting to Catch
Cyber Thieves
Newswise (03/07/06)
University of Maryland researchers are developing new digital
fingerprinting applications that could protect entertainment content and
identify the sources of national security leaks without interfering with
appropriate uses. K.J. Ray Liu and Min Wu, both professors of electrical
and computer engineering at Maryland, are exploring new cyber forensics
techniques to protect content and track the pirates who attempt to steal it
through collusion attacks, where multiple attackers attempt to filch and
distribute proprietary or classified materials, deleting or altering the
original digital fingerprint in the process to avoid being traced. The new
technology incorporates anti-collusion codes to safeguard content while
still protecting legitimate uses. The technology could help Hollywood
protect its copyrights as content passes over the Internet, and the
researchers are also exploring techniques that could protect individual
content items from inappropriate use without installing unwanted and
potentially harmful products onto users' computers. The Maryland
researchers' system embeds each item with a unique ID that can tell which
users are involved in a piracy attack, and works with equal effectiveness
for video, audio, and live multicasts, such as pay-per-view events. "The
message our technology sends is: 'Don't bother to try anything, because we
can catch you,'" said Liu. Anti-collusion codes could also help identify a
person who leaks sensitive national security information embedded in a
multimedia format.
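The collusion attack and its defeat can be illustrated with a toy example. The sketch below is not the Maryland team's actual anti-collusion code construction, which this summary does not detail; it uses simple pseudorandom, spread-spectrum-style fingerprints, an averaging attack, and correlation detection to show why every colluder's mark still surfaces.

```python
import random

random.seed(42)
N = 2000      # fingerprint length (the host content itself is omitted here)
USERS = 10    # number of uniquely marked copies issued

# Each user's copy carries a unique pseudorandom Gaussian fingerprint.
fingerprints = [[random.gauss(0, 1) for _ in range(N)] for _ in range(USERS)]

def collude(user_ids):
    """Averaging attack: colluders average their marked copies,
    hoping to wash out the individual fingerprints."""
    return [sum(fingerprints[u][i] for u in user_ids) / len(user_ids)
            for i in range(N)]

def correlation(a, b):
    """Normalized inner product, used as the detection statistic."""
    return sum(x * y for x, y in zip(a, b)) / N

pirated = collude([2, 5, 7])   # three users collude

# Correlate the pirated copy against every issued fingerprint; each
# colluder retains roughly 1/3 of its original correlation (~0.33),
# while innocent users score near zero.
scores = [correlation(pirated, f) for f in fingerprints]
accused = [u for u, s in enumerate(scores) if s > 0.15]
print(accused)   # [2, 5, 7]
```

Averaging dilutes each mark but cannot remove it; real anti-collusion codes extend this idea so the set of detected marks identifies the colluding group even for larger coalitions.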
Machine Control Opens Up
IST Results (03/08/06)
The IST-funded OCEAN project, completed last year, has developed a
standard for the creation of a broad range of new machine controls. Until
the OCEAN project, machine controls were largely closed and their software
stubbornly resisted customization. "Before, you got a machine that could
perform one process with a complete software system and if you wanted to
change the process you had to change all the software," said project
coordinator Fabrizio Meo, who added that customizing software was often
prohibitively expensive.  The OCEAN developers used Linux, RTAI, and Real-Time
CORBA to create a more agile development model.  The manufacturer's
unique software still resides in a black box, though it now communicates
through a common standard, enabling the integration of different products.
The standard also established a real-time communication protocol that
harnesses distributed components into a single machine. Meo describes the
continuous evolution of Linux and the other platforms that power machine
controls as the biggest challenge in the project.
Computers Aren't Just for the Guys
Times of Trenton (03/04/06) Landau, Elizabeth
Princeton Engineering School Dean Maria Klawe, former president of ACM,
says the perception of computers as being toys for boys and the media image
of computer scientists as being geeks are not the only reasons why fewer
women are studying computer science today. Klawe also takes issue with the
way the subject matter is taught. For example, Klawe says a curriculum of
computing applications to video and sound files could focus less on the
technical nature of coding language. "The way that computer science
courses are set up focuses on the technical intricacies of computing rather
than what computing is good for," says Klawe, adding that women want to
know how they can apply such knowledge. Klawe is an advocate of the "pairs
programming" teaching approach, in which students work in pairs--one as the
program driver and the other as the navigator who reads along and notes
errors--and switch roles every half hour. She believes pairs programming
would appeal more to female students than working on assignments
individually. Klawe adds that having women combine computer science
studies with a major in art, music, or psychology is another way to boost
the female presence in the field. Klawe says the number of women in
computer science has plummeted from 30 percent to 15 percent over the past
25 years.
NEC Develops Fastest Data Transmission Technology for
Supercomputers
Asahi News Service (03/08/06)
Japan wants to have the fastest supercomputer in the world in five years,
and a new development from NEC could help the government reach its goal.
NEC says it has new supercomputing technology that delivers a data
transmission speed of 25 Gbps, which is equivalent to the amount of data on
100,000 newspaper pages. The previous record was 14 Gbps, which was
reached last year by researchers in the United States who developed a
supercomputing system for the U.S. Department of Defense. The optical
interconnection technology from NEC makes use of a vertical-cavity
surface-emitting laser that is able to transmit an enormous amount of data
between a CPU and a memory device. The use of optical signals promises to
improve the performance of supercomputers because it facilitates fast
speeds as well as lowers the noise associated with operating supercomputing
systems. Japan's Earth Simulator, which uses electrical signals and was
surpassed by a U.S. supercomputer in 2004, has a transmission speed of 0.5
Gbps. The Japanese government plans to start working with domestic
electronics makers on developing the next-generation supercomputer starting
in fiscal 2006.
Aging Workforce a Concern for US Tech Firms
TechNewsWorld (03/04/06) Koprowski, Gene
The Computing Technology Industry Association (CompTIA) has teamed up with
AARP and several other organizations to help prepare employers for the
graying of their workforce. In particular, the aging workforce has become
an issue for technology companies in the United States, which will see a
decline in the number of young workers in the years to come. According to
CompTIA, in four years a third of employees in the country will be 50 years
of age or older. In response, CompTIA has helped form the Alliance for an
Experienced Workforce, which seeks to bring attention to the issue, assist
employers in retaining and attracting older workers, and create more
opportunities for older workers to utilize their experience, skills, and
knowledge. "Organizations risk losing core competencies, in-house
expertise and mentors for future talent," says CompTIA CEO John Venator.
"The long-term impact of such a trend is a slowdown in innovation."
Meanwhile, experienced technology executives stand to be compensated
handsomely for their skills. According to Silicon Valley recruiting firm
Trilogy Venture Search, salaries for senior tech execs will start at
$250,000 and could well exceed $300,000 for some positions.
Foster to Lead Computation Institute
University of Chicago Chronicle (03/02/06) Vol. 25, No. 11, Koppes, Steve
Grid computing pioneer Ian Foster has replaced Rick Stevens as the
director of the Computation Institute, a joint project between the
University of Chicago and the Argonne National Laboratory that focuses on
computational and communication problems in the sciences, as well as in
medicine, law, the arts, and humanities. Foster, the Arthur Holly Compton
Distinguished Service Professor in computer science at the university and
the associate director of Argonne's Mathematics and Computer Science
Division, assumed his new position effective March 1, 2006. In 1995, he
co-founded the Globus Project, an initiative that developed an open-source
software package for building Grid systems and applications; and in 2004 he
helped found Univa, a company based in Elmhurst, Ill., to promote the
commercial use of the technology. Also, Jonathan Silverstein, assistant
professor in surgery at the university, was appointed associate director of
the Computation Institute.  Silverstein has been an advocate of using
grid computing to improve patient safety and academic efficiency in
biomedicine. "Computation plays an increasingly central role in many
disciplines in the sciences, medicine and the humanities," says Foster.
"What we want to be about is not just doing bigger computations, but
working out how to apply these computation tools in new ways and to new
disciplines."
Hey Neighbor, Stop Piggybacking on My Wireless
New York Times (03/05/06) P. 1; Marriott, Michel; Zarate, Andrea;
Ruethling, Gretchen
"Piggybacking," or the unauthorized use of someone else's wireless
Internet connection, is increasingly becoming an issue for people who live
in densely populated areas such as New York City or Chicago or in apartment
buildings, makers of wireless routers say. One of the reasons why
piggybacking is becoming increasingly common is that so many users do
not bother to secure their networks with passwords or encryption
programs--which allows anyone with a wireless-enabled computer within the
200-foot range of a wireless router to gain access to the network, says
analyst Mike Wolf. That assessment is backed up by Humphrey Cheung, the
editor of a technology Web site, tomshardware.com. In April 2004, Cheung
and his colleagues measured how plentiful open wireless networks have
become by flying two single-engine airplanes over metropolitan Los Angeles
with two wireless laptops. The project logged more than 4,500 wireless
networks, with only about 30 percent of them encrypted to lock out
outsiders, Cheung said. For wireless Internet users who fail to protect
their networks, the consequences can be much greater than slower Internet
access. Symantec Security Response's David Cole says savvy users could
piggyback into unprotected computers to gain access to files containing
sensitive financial and personal information, release malicious viruses and
worms that could do irreparable damage, or use the computer as a launching
pad for identity theft or the uploading and downloading of child
pornography.
Supercomputer Architectures Battle for Hearts and Minds
of Users
Computerworld (03/06/06) P. 36; Ulfelder, Steve
While the supercomputing speed race has long been confined to the lab, the
data-flow problems that supercomputer researchers face will become
problematic for everyday IT managers within the next five years.
Processors have evolved faster than data-movement technologies, as
dual-core PCs have already infiltrated the marketplace, and experts predict
that a standard chip could contain 64 processors, each capable of running
four threads simultaneously. Clustered systems have propelled the
supercomputer's processing ability to the outer limits of latency and
bandwidth. IBM's Blue Gene, the world's fastest supercomputer, harnesses
131,072 clustered processors. "Clusters have completely changed the
scientific computing landscape," says Jack Dongarra, a computer science
professor at the University of Tennessee, who notes that more than 60
percent of the top 500 supercomputers are clustered systems. As their
popularity grows, supercomputers are finding their way into an increasing
number of practical applications. While the ascendancy of clustered
systems has tilted the balance away from the exotic architectures favored
by Cray, many scientists agree with Cray executives that the company's
high-bandwidth, low-latency supercomputers will form an integral part of
the future hybrid applications. Cray's Jason Silverman reports that the
company will bring compilers to market by 2009 that can determine which
code best suits its vector processors and which is better left to its slower
scalar processors, saving programmers from the time-consuming
allocation work.  IBM is developing speculative multithreading, a technique
that lets threads run out of order and then reassembles their results, which
it hopes will be a breakthrough.  DARPA is also
working to advance the development of scientific computers through its High
Productivity Computing Systems initiative that seeks to double performance
every 18 months at least through 2010.
Future Disruptive Technologies: The Perspectives of MIT &
Stanford
Always On (02/01/06) Vol. 1, No. 4, P. 26; Byers, Tom
MIT vice president for research and associate provost Alice Gast and
Stanford University dean of engineering Jim Plummer participated in a panel
moderated by Tom Byers at the recent AO2005 Innovation Summit, where they
discussed what disruptive technologies their schools are focusing on. Gast
and Plummer agreed that universities, industry, and government have a
shared responsibility to inspire the next generation of scientists and
engineers, and key to this inspiration is the emphasis on disruptive
technologies. Plummer said the fields generating the most excitement among
engineers and scientists--biotechnology, energy, nanotechnology, and
environment--can not only lead to new knowledge, but can also establish a
platform for new entrepreneurism. He explained that his university is
attempting "to seed the future by trying out all the wild and crazy ideas
and seeing which ones actually have the possibility of being successful."
Gast said two keys of Internet-related research at MIT are "the human
computer interface and the distributed cell phone everywhere environment."
She noted that MIT and Stanford are also concentrating on the grand
challenge of energy and its connection to the Internet, and that the
various issues associated with energy--consumption, production, storage,
and portability--can lead to the development of many disruptive products.
At the same time, she emphasized that technical advances "must proceed hand
in hand" with tough policy decisions. Both Plummer and Gast said
interdisciplinary collaboration--not just between researchers, but between
researchers, social scientists, and economists--is critical. However,
Plummer said that "while business and policy and technology must work
together, their foundation is technology."
Is Deviant Behavior the Norm on P2P File-Sharing
Networks?
IEEE Distributed Systems Online (02/06) Vol. 7, No. 2, Hughes, Daniel;
Walkerdine, James; Coulson, Geoff
Most unlawful pornography mediated by peer-to-peer (P2P) file-sharing
networks is generated by a small group within the P2P community, according
to an analysis of pornography-related resource-discovery traffic in the
Gnutella P2P network conducted by Lancaster University and York St. John
College researchers. There are two competing assumptions about the effects
of anonymity on computer-mediated communication: One posits that anonymity
promotes and increases the probability of deviant online behavior by
wearing down people's resistance to temptation, while the other contends
that anonymity only nurtures deviant behavior if the norms related to a
specific group identity permit it. The researchers' experiments, which
concentrated on capturing and studying Query and QueryHit messages on the
Gnutella network between Feb. 27 and March 27, 2005, showed that on average
1.6 percent of Query messages and 2.4 percent of QueryHit messages were
associated with illegal pornography.
share such material form a subgroup of the Gnutella community involved
generating a ranked list of the 20 leading pornography-related search
terms, and identifying peers who responded with QueryHits as likely
distributors of illicit content.  The results strongly indicated that 57
percent of peers who share such material share nothing else, while only 17
percent have collections in which less than half the material is illegal.  These findings
suggest that the sharing of illegal pornography is not a common habit of
most Gnutella users, and that anonymity cannot be blamed as the cause of
deviant behavior. The researchers contend that, in view of these results,
there is no need to severely police or prosecute P2P file-sharing networks
if the deviant subgroup can be effectively targeted without infringing on
the wider file-sharing community.