'We're All at Risk' of Attack, Cyber Chief Says
Technology Daily (12/11/07) Viana, Liza Porteus
Greg Garcia, the Homeland Security assistant secretary, spoke to the New
York City Metro InfraGard Alliance on Tuesday regarding the importance of
cybersecurity. InfraGard is an alliance between the private sector, the
FBI, and local law enforcement striving to safeguard key infrastructures,
including technology systems. Garcia pointed out that over 85 percent of
the nation's critical infrastructures are owned and operated by private
industry, which "means the federal government cannot address these cyber
threats alone." Though roughly $6 trillion passes through the U.S.
financial system on a daily basis, major companies continue to leave their
networks vulnerable to data theft and infiltration. The federal government
depends on organizations such as InfraGard and information-sharing centers
to drive industry to take cyber safety measures. The collaborations are
becoming increasingly valuable as hackers grow more sophisticated and as
the market for cybercrime surges. On the government end, the Homeland
Security Department's Einstein network scans systems for intrusions or
irregularities and distributes threat data within hours. Currently, 13
agencies use Einstein, but Garcia urges all agencies to participate.
Garcia also advises industry to take into consideration the physical
threats, such as a pandemic flu outbreak, that could impact networks, and
to incorporate such scenarios into their contingency network plans. In
March 2008, the department will administer Cyber Storm II, an exercise to
rehearse synchronized responses to simulated strings of cyberattacks
involving all levels of industry and government.
'Combinatorial' Approach Squashes Software Bugs Faster,
Cheaper
NIST Tech Beat (12/12/07) Stein, Ben
Computer scientists and mathematicians from the National Institute of
Standards and Technology and the University of Texas, Arlington are
developing an open-source tool that catches programming errors using an
emerging approach called "combinatorial testing." The tool, scheduled for
release early next year, could save software developers significant time
and money. While it is known that the majority of software crashes are
caused by interactions between two variables, the researchers studied
applications in which failures could result from interactions of up to six
variables and designed a method for efficiently testing different
combinations of those variables. The technique is similar to combinatorial
chemistry in which scientists screen multiple chemical compounds
simultaneously. The new tool generates tests for examining interactions
between multiple variables in a given program. The researchers are
currently inviting developers to beta test the tool, which is expected to
be particularly useful for e-commerce Web sites and industrial process
controls, which often contain numerous interacting variables.
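As a rough illustration of the technique (not the forthcoming
NIST/UT-Arlington tool itself), the Python sketch below greedily builds a
small test suite in which every pairwise combination of parameter values
appears in at least one test; the configuration space is invented for the
example.

# Illustrative sketch only -- not the NIST/UT-Arlington tool. Greedily
# builds a test suite covering every t-way combination of parameter values.
from itertools import combinations, product

def required_interactions(params, t):
    """All t-way (parameter, value) combinations that must be covered."""
    names = list(params)
    required = set()
    for name_combo in combinations(names, t):
        for values in product(*(params[n] for n in name_combo)):
            required.add(tuple(zip(name_combo, values)))
    return required

def greedy_suite(params, t=2):
    """Pick whole test cases until every t-way interaction is covered."""
    names = list(params)
    remaining = required_interactions(params, t)
    candidates = list(product(*params.values()))
    suite = []

    def covers(test):
        return {tuple((n, test[names.index(n)]) for n in nc)
                for nc in combinations(names, t)} & remaining

    while remaining:
        best = max(candidates, key=lambda test: len(covers(test)))
        remaining -= covers(best)
        suite.append(dict(zip(names, best)))
    return suite

if __name__ == "__main__":
    # hypothetical configuration space for an e-commerce checkout page
    params = {"browser":  ["Firefox", "IE7", "Safari"],
              "payment":  ["card", "invoice"],
              "currency": ["USD", "EUR", "JPY"],
              "shipping": ["ground", "air"]}
    for test in greedy_suite(params, t=2):
        print(test)  # a handful of tests replaces 36 exhaustive combinations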
Submissions Sought for RSSI'08
HPC Wire (12/12/07)
The fourth annual Reconfigurable Systems Summer Institute (RSSI'08), which
takes place July 7-10, 2008, at the National Center for Supercomputing
Applications, is expected to bring together domain scientists and
technology developers from industry and academia to discuss new
developments in field-programmable gate array technologies for
high-performance reconfigurable computing (HPRC). Submissions for the
event are being solicited in a variety of topics related to HPRC, including
the architecture of HPRC devices and systems, HPRC languages, compilation
techniques, and tools, libraries and run-time environments for HPRC. Two
types of submissions are being accepted--full-length papers up to 12 pages,
and short poster papers up to four pages. Submissions will be reviewed by
at least three reviewers, and authors of accepted full-length papers will
present their work at one of the technical sessions. The best papers at
RSSI'08 will be considered for a special issue of the ACM Transactions on
Reconfigurable Technology and Systems journal.
Communicating With Plastic
Technology Review (12/12/07) Bullis, Kevin
University of Tokyo researchers have developed a plastic pad that allows
electronic devices placed on it to communicate with each other, providing a
more secure and energy efficient alternative to short-range wireless
communications. Professor Takao Someya says the first application might be
an intelligent table that allows multiple devices to communicate without
being physically connected. The ultimate goal is to develop a system that
would allow thousands of devices to be connected, Someya says. The plastic
pad uses a combination of extremely short-range wireless communication and
wires. The sheet, only one millimeter thick, is made by inkjet-printing
various insulating and semiconducting polymers and metal nanoparticles to
make transistors, plastic microelectromechanical (MEM) switches,
communications coils, and memory cells. The communication sheet is made of
an eight-by-eight-inch grid of cells. Each cell has a coil for
transmitting and receiving signals and plastic MEM switches for turning the
coils on and off and for connecting to neighboring cells. When two
electronic devices are on the sheet, sensors register their location and a
control chip at the edge of the sheet maps a route for the signals from
each device. The technology is part of an effort to develop inexpensive,
large-area electronics.
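To give a flavor of the route-mapping step, here is a purely hypothetical
Python sketch in which a breadth-first search chooses a chain of
neighboring grid cells whose switches could be closed to link two device
positions; only the eight-by-eight grid size comes from the article, and
the rest is assumed for illustration.

from collections import deque

GRID = 8  # the article describes an eight-by-eight grid of cells

def route(src, dst, unusable=frozenset()):
    """Shortest chain of adjacent cells from src to dst, skipping any
    cells marked unusable (all coordinates are (column, row) pairs)."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        cell = queue.popleft()
        if cell == dst:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID
                    and nxt not in parent and nxt not in unusable):
                parent[nxt] = cell
                queue.append(nxt)
    return None  # no route available

if __name__ == "__main__":
    # hypothetical device positions reported by the sheet's sensors
    phone, player = (0, 0), (6, 5)
    print("cells to switch on:",
          route(phone, player, unusable={(3, 0), (3, 1)}))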
Virtual Reality Solves Real Life Problems
Golden Gate X Press (San Francisco State University) (12/13/07) Tabatabai,
Ali
At San Francisco State University's Center for Computing in Life Sciences
(CCLS), students and professors work on solving some of California's most
troubling problems, including health and the environment. "We're dealing
with things like California health care, ecology, economy, and using a lot
of collaborative science to deal with what's happening around us," says
CCLS staff researcher Michael Wong. CCLS is an offshoot of the school's
computer science department, but it brings together biologists, chemists,
and other scholars from the College of Science and Engineering to perform
research and find practical solutions. "It's a great launching point for
students," Wong says. "They further their plans and really make a
contribution to science and society in California." Wong is using his
background in game theory to create a computer game to help train future
nurses and reduce the state's nursing shortage. The lab developed a
multiplayer role-playing game that puts student nurses in a virtual
hospital with virtual patients. The game was made with the help of actors
from the drama department who portrayed ailments and emergencies, while
Wong and his team wrote the code and made the game competitive. Another
research project developed software so the biology department could track a
species of invasive ants that threatened California's ecosystem. The
researchers captured the ants on video, allowing biologists and software
engineers to trace patterns in their behavior and discover how certain
movements were related to the ants' genetic makeup. A high-powered,
gene-tracking computer cluster was used to help the biologists analyze the
ants' behavior.
NY School Opens Lab for Serious Games
Associated Press (12/13/07) Long, Colleen
The Parsons design school has established the first research lab devoted
to the development of so-called "serious games," video-based tools that
niche markets use to train public officials, students, and professionals in
various fields. For example, the U.S. military has modeled terrorist
attacks, school hostage crises, and natural disasters for serious games to
help prepare its personnel for such situations. The effort of PETLab, at
Parsons The New School for Design, could increase the popularity of serious
games, says director Colleen Macklin. "Our goal is really to create
intersections between game design, social issues, and learning," she says.
In addition to creating models for new games or interactive designs that
address social issues, PETLab will conduct interactive research to
determine the potential of serious games as a catalyst for positive social
change. The MacArthur Foundation has provided a $450,000 grant, and the
lab is also working with Microsoft to determine whether the Xbox can be
modified to create socially conscious games.
Microsoft Leads Accessibility Effort
eWeek (12/10/07) Taft, Darryl K.
Assistive technology vendors, IT companies, and key nongovernmental
organizations have formed the Accessibility Interoperability Alliance (AIA)
in an effort to improve the interoperability of existing technology for
disabled users. AIA will also develop new software, hardware, and
Web-based products, work to improve developer guidelines, tools, and
technologies, and lower development costs. Consistent keyboard access,
interoperability of accessibility APIs, user interface automation
extensions, and accessible rich Internet application suite mapping through
user interface automation will be the initial focus of the collaboration.
"Today, developers must work across divergent platforms, application
environments, and hardware models to create accessible technology for
customers with disabilities," says Rob Sinclair, director of the
Accessibility Business Unit at Microsoft. "The AIA is an opportunity for
the entire industry to come together to reduce the cost and complexity of
accessibility, increase customer satisfaction, foster inclusive innovation,
and reinforce a sustainable ecosystem of accessible technology
products."
HP Teams Up With UMass CASA
Daily Collegian (12/10/07) Sabin, Tim
Hewlett-Packard will join the industrial advisory board of the Center for
Collaborative and Adaptive Sensing of the Atmosphere (CASA) and assist the
National Science Foundation-sponsored initiative in its efforts to improve
weather analysis and prediction. The University of Massachusetts is the
lead institution involved in CASA, which conducts research, develops
technology, and installs prototype-engineering systems based on distributed
adaptive sensing networks. CASA is better able to understand and predict
atmospheric hazards because of the networks. "HP has a good handle on
traditional utility computing applications, and we believe that utility
computing will involve other classes of applications in the future," says
HP's Rich Friedrich. "The relationship enables HP to closely work on and
understand a new class of applications represented by CASA." Within a
decade, CASA expects to develop a network of low-cost and low-power radars
that can be placed on cell phone towers and rooftops.
Q&A: PC Pioneer Chowaniec Looks Back at the Amiga
Computerworld (12/12/07) Gaudin, Sharon
Adam Chowaniec, chairman of Liquid Computing Corp., joined Commodore only
a year after the Commodore 64 was launched 25 years ago. Chowaniec's task
as vice president of technology at Commodore was to build the Amiga, the
C64's successor. The first Amiga was built with a custom chipset, had
highly advanced graphics for the time, and ran a 16-bit processor.
Chowaniec says joining Commodore right after the release of the C64 was a
significant challenge because the C64 was the end of an era. "With the
Commodore 64, you finally had a machine that was priced for the general
market," Chowaniec says, "Personal computing was really created in those
years." He says there does not seem to be the same passion and excitement
in the computer industry today as there was in the 80s. "There's been less
innovation than we saw back in the 80s. The industry consolidated, and
it's basically dominated by a small group of very large companies,"
Chowaniec says. "As companies get bigger, innovation gets slower. It's
just the way it is." Fortunately, Chowaniec says the industry is cyclical
and has stabilized since the tech bubble collapsed in 2000. He says the
industry is ready for another wave of innovation. "Technology never stands
still. In five or 10 years, I think there will be new approaches to
computing," says Chowaniec, who adds that his biggest problem with today's
technology is that it is too difficult to upgrade software and maintain
computer efficiency, and that the experience should be simpler and more
user friendly.
Rise of the Machines
Cleveland Free Times (12/12/07) Vol. 15, No. 32, Gupta, Charu
Project EVEREST (Evaluation and Validation of Election-Related Equipment,
Standards, and Testing) is a sweeping review of the Diebold, Election
Systems & Software, and Hart InterCivic electronic voting systems in use
in Ohio that was authorized by Secretary of State Jennifer Brunner. The
impetus for the project was the lack of security and reliability that has
dogged the systems, even though they were certified according to federal
guidelines. But the federal testing process did not catch exploits such as
the "Husti hack," which could compromise e-voting systems. Compuware ran a
test of e-voting systems for the state of Ohio prior to an expensive
deployment, and its findings were reviewed by a member of Brunner's
recently organized Voting Rights Institute. He wrote that while Compuware
"produced valid findings," the scope of its investigation was very limited,
and many of its fixes depended on improved policies and procedures from
elected officials that "are not magic solutions that can resolve all
problems." Project EVEREST will involve the assessment of e-voting systems
by two private labs--SysTest and Microsolved--and three teams of academic
computer security experts from Pennsylvania State University, the
University of Pennsylvania, and the University of California-Santa Cruz.
Hackability testing and source-code review are among the areas to
be probed, but SysTest's involvement is controversial, given how deeply
rooted it is in the questionable federal testing and certification
structure that led to Ohio's current e-voting woes. Brunner says
hackability tests will not be assigned to SysTest, which rules out one
potential area of dispute.
How the Next Billion Will Reshape the Internet
Toronto Star (12/10/07) Geist, Michael
While most of the media coverage of the annual Internet Governance Forum
in November focused on domain name issues, Michael Geist says a more
important topic is how the next billion Internet users will reshape the
Internet. More than a billion people already access the Internet
worldwide, and doubling that number, which should happen within the next
decade, could have a dramatic effect on networks, technology, the computer
software industry, and how Internet users access information and interact
with their online environment. Understanding how
the Internet will be affected requires understanding where the new Internet
users are coming from. While some new users will be from North America,
Europe, and other developed countries, most new users will reside in
developing countries. China is already the second-largest Internet-using
country in the world, behind the United States, and is likely to become the
largest Internet-using country within the next year or two, adding 250
million Internet users within 10 years. Countries such as India and Brazil
are expected to add another 200 million Internet users, while countries in
Africa will experience the fastest rate of growth. Most new Internet users
will not speak English as their first language, which will create
increasing pressure to accommodate different languages on the same domain
name system, and many new Internet users will have different cultural and
societal beliefs on issues such as free speech, privacy, and copyrights.
The new generation of Internet users may also change the hardware industry,
as flashy, big-screen laptops with fast DVD players give way to sturdy,
reliable, energy-efficient laptops. Methods for accessing the Internet may
also change, as widespread broadband may be too expensive for small
developing communities, which may rely more on wireless and satellite-based
connectivity.
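That pressure is already visible in the internationalized domain name
(IDN) mechanism, illustrated below with Python's built-in IDNA codec,
which transcodes a non-ASCII label into the ASCII "punycode" form today's
DNS can carry; the domain shown is a made-up example.

name = "bücher.example"            # a hypothetical non-English domain name
ascii_form = name.encode("idna")   # the form that actually travels over DNS
print(ascii_form)                  # b'xn--bcher-kva.example'
print(ascii_form.decode("idna"))   # round-trips back to 'bücher.example'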
Supercomputing for the Masses
BusinessWeek (12/13/07) Ricadela, Aaron
The computer industry is positioned to experience a major transformation
due to the spread of supercomputing technology into corporate data centers
and even desktop PCs, providing an unprecedented amount of power to average
computer users. Technology companies are recruiting computer scientists
and looking to inject their most advanced technology into consumer
products. The major obstacle preventing supercomputers from becoming
household items is not hardware, but a lack of affordable, off-the-shelf
software. Such sophisticated machines require equally sophisticated, and
often customized, software. To solve this problem, Microsoft is building a
brain trust and giving schools research funding to study how
supercomputer-type programming can be used in personal machines. Creating
new markets for widespread supercomputing may be a trap, however, because
average consumers may have little interest in such powerful machines. "Is
this whole infatuation with performance something that has moved beyond
what the vast majority of users really care about?" asks
Intel chief technology officer Justin Rattner. "Are there really a set of
applications that require 10, 100, 1,000 times the performance we have
today?"
The Rise of Parallelism (and Other Computing
Challenges)
International Science Grid This Week (12/12/07) El Baz, Didier
Parallelism is no longer restricted to high-performance or high-speed
computing, as it is used in PCs, cellular phones, and numerous other
electronic devices, writes Didier El Baz, head of the Distributed Computing
and Asynchronism team, LAAS-CNRS. El Baz says the arrival of grid
computing and parallelism has raised numerous questions in computer
science and numerical computing. The combination of parallel and
distributed computing could potentially change the nature of computer
science and numerical computing. To ensure efficient use of new parallel
and distributed architectures, new concepts on communication,
synchronization, fault tolerance, and auto-organization are needed and must
be widely accepted. Manufacturers agree that future supercomputers will
have massively parallel architectures that will need to be fault tolerant
and well suited to dynamicity, which will require some type of
auto-organization, as controlling these large systems efficiently will not
be possible entirely from the outside. Parallel and distributed algorithms
will also have to be more adept at coping with the asynchronous nature of
communication networks and the faults in the system. These problems are
attracting more and more attention, particularly from scientists working on
communication libraries, and will need to be addressed to find solutions
and drive the evolution of computing.
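As a toy illustration of the asynchronous style the article alludes to,
the Python sketch below runs a small fixed-point iteration in which each
worker thread updates its own component using whatever neighboring values
happen to be visible, stale or not, rather than waiting at a
synchronization barrier; the particular contraction and the delays are
invented for the example.

import random
import threading
import time

N = 4
x = [0.0] * N                    # shared state, updated without a barrier
b = [1.0, 2.0, 3.0, 4.0]
done = threading.Event()

def worker(i):
    while not done.is_set():
        # read whatever neighbor values are currently visible (maybe stale)
        left, right = x[(i - 1) % N], x[(i + 1) % N]
        x[i] = (b[i] + left + right) / 4.0        # a contraction, so stable
        time.sleep(random.uniform(0.001, 0.005))  # unpredictable delay

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
time.sleep(1.0)                  # let the asynchronous iterations run
done.set()
for t in threads:
    t.join()
print("approximate fixed point:", [round(v, 4) for v in x])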
How Do We Preserve Scientific Data for the Future?
New Scientist (12/08/07) No. 2633, P. 28; Marks, Paul
Scientific research produces a vast amount of data, which raises questions
about how this data will be preserved for future generations. For example,
CERN's Large Hadron Collider is expected to yield 450 million gigabytes of
data over its 15-year duration, and CERN does not know if it will have the
money or technical assets to preserve this data following the project's
conclusion. "The data needs to be stored in a digestible form and be
available forever," says CERN's deputy director general Jos Engelen. "But
we just don't have a long-term archival strategy for accessing the LHC
data." Open-source software is considered to be problematic for archivists
because the software is so mutable, which means that there is no assurance
that data produced by an experiment that uses open-source programs will be
accessible on the Web later on. The National Science Foundation intends to
spend $100 million on the establishment and operation of up to five trial
repositories for publicly-funded research data, and to investigate new
digital preservation methods. The project's researchers will also examine
ways to transfer huge data sets between different storage media, while
another focus of the initiative is developing best practice guidelines so
that hot-button issues such as privacy are addressed. In Europe, a lobby
group called the Alliance for Permanent Access plans to pressure lawmakers
to apportion about 2 percent of each research grant for long-term
archiving.
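For a sense of scale, a back-of-the-envelope calculation from the
article's own figures (450 million gigabytes over the collider's 15-year
run) works out to roughly 30 petabytes a year, or just under a gigabyte
per second sustained:

total_gb, years = 450e6, 15
seconds = years * 365.25 * 24 * 3600
print(f"{total_gb / years / 1e6:.0f} PB per year, "
      f"{total_gb * 1e3 / seconds:.0f} MB/s sustained")  # ~30 PB/yr, ~950 MB/s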
NIST Looks to Cook Up a New Hash
Government Computer News (12/10/07) Vol. 26, No. 30, Jackson, William
The National Institute of Standards and Technology has launched a
competition for a new crypto algorithm for digital signatures and message
authentication, and is accepting submissions for what will become the
Secure Hashing Algorithm-3. The algorithms currently in use, SHA-1 and
SHA-2, have not been broken, but weaknesses are starting to appear, says
NIST security technology group manager William Burr. The two current
standards likely still have several years of use left, and Burr says it is
prudent to find a new algorithm. Developing a new algorithm that meets the
requirements will be difficult, as it needs to be at least as secure as the
algorithms in use but more efficient regarding speed and computational
resources required to run it. The new algorithm must also be similar
enough to SHA-2 that it can directly substitute for it in any application,
but must be different enough that a successful attack against SHA-2 will
not affect the new algorithm. The selection process will be radically
different from previous secure hashing algorithm development and selection,
which took place behind closed doors. NIST will examine and test
algorithms, but submissions will also be made public so outside evaluators
can test the submissions for weaknesses.
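For readers who want to see the current standards in action, the short
Python snippet below uses the standard library to compute SHA-1 and SHA-2
(SHA-256) digests and a hash-based message authentication code, the two
roles the eventual SHA-3 winner is meant to fill; the message and key are
placeholders.

import hashlib
import hmac

message = b"wire $1,000,000 to account 12345"
print("SHA-1  :", hashlib.sha1(message).hexdigest())    # 160-bit digest
print("SHA-256:", hashlib.sha256(message).hexdigest())  # 256-bit digest

# message authentication: only holders of the shared key can recompute this
key = b"not-a-real-key"
print("HMAC-SHA-256:", hmac.new(key, message, hashlib.sha256).hexdigest())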
Development in 2027
eWeek (12/03/07) Vol. 24, No. 37, P. D3; Taft, Darryl K.
CodeGear's David Intersimone outlined his vision of what programming will
look like 20 years from now during a keynote presentation at the recent
EclipseWorld show. Intersimone's vision includes virtual software teams
and collaborative infrastructure. To reach such an advanced level of
programming, Intersimone said developers will need to overcome many of
today's development obstacles, including disparate, nonintegrated systems
and teams and the lack of cohesive software reuse strategies. Intersimone
suggested creating a new way of approaching software development by
capturing developer intent through application factories, which would
foster "application-driven development" in which the "structure, evolution,
and logic behind developing an application is part of the application."
The components and the application could be shared with other developers as
reusable software assets, which would be platform-neutral,
framework-agnostic, and relevant beyond Java and Eclipse. Intersimone is
promoting a way of annotating and building templates that would enable
better maintenance of applications and allow developers to build new
applications based on the templates and reusable software assets.
Intersimone also addressed concurrent and parallel programming, outlining
ways of dealing with both types of processing, such as wait-free
and lock-free synchronization, transactional servers, rethinking the
sequential programming model, and possibly using more functional
programming.
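To ground the parallel-programming point, here is a small, generic Python
illustration (not drawn from the talk) of restating a sequential loop as a
parallel, more functional-style map over the inputs; the workload function
is a made-up placeholder.

from concurrent.futures import ProcessPoolExecutor

def price_quote(order_id):
    # stand-in for per-item work; the function is a placeholder
    return sum(i * i for i in range(10_000)) % 97 + order_id

if __name__ == "__main__":
    orders = list(range(32))

    # sequential version
    quotes = [price_quote(o) for o in orders]

    # the same computation expressed as a parallel map over the inputs
    with ProcessPoolExecutor() as pool:
        parallel_quotes = list(pool.map(price_quote, orders))

    assert quotes == parallel_quotes
    print(parallel_quotes[:5])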