Cyber Assaults on Estonia Typify a New Battle
Tactic
Washington Post (05/19/07) P. A1; Finn, Peter
Estonia, one of the most wired countries in Europe, was recently subjected
to massive and coordinated attacks against the country's Web sites,
including sites belonging to the government, banks, telecommunications
companies, Internet service providers, and news organizations, according to
Estonian and foreign officials. Computer security specialists called the
attacks against the country's public and private electronic infrastructure
unprecedented. The NATO alliance and the European Union have sent
technology specialists to Estonia to observe and help during the attacks,
which so far have disrupted government email and caused financial
institutions to shut down online banking. Security experts and officials
have warned that during times of war enemies may launch massive online
attacks against a target, and the Department of Homeland Security has
warned that U.S. networks need to be secured against al-Qaeda hackers. The
attacks against Estonia provide an opportunity to observe how such assaults
may be executed. Estonia's minister of defense Jaak Aaviksoo said the
attacks were massive, well targeted, and well organized. Aaviksoo said
about 1 million computers worldwide were used in botnet attacks that began
April 27. By May 1, Estonian Internet service providers were forced to
disconnect all customers for 20 seconds to reboot their networks. By May
10, bots were probing Estonian banks, looking for weaknesses, and Estonia's
largest bank was forced to shut down all services for an hour and a half.
Estonian IT consultant Linnar Viik called the attacks an attempt to take a
country back to the Stone Age, and said in the 21st century a country is no
longer defined only by its territory and airspace, but by its electronic
infrastructure as well.
CACM 60th Anniversary Issue Tracks Impact of Computing
Technology
AScribe Newswire (05/17/07)
Communications of the ACM commemorates the 60th anniversary of the ACM in
the May 2007 issue with a special section that features the memories,
findings, and accounts of historians, archivists, and early pioneers and
volunteers of the association. ACM was founded in August 1947 by
visionaries who desired to focus more intently on the emerging computing
research following World War II. The issue follows the growth of computing
over the years and its impact on society, as well as the role ACM has
played in scientific computing, business computing, and information
technology occupations. "It's important to look back to see where you've
been to know where we want to go," says guest editor David S. Wise,
computer science professor at Indiana University and a member-at-large of
ACM's governing body. "By becoming cognizant of the history and impact of
this critical technology, we are starting to understand how it has evolved
and how we can determine the next new thing." The ACM History Committee,
co-chaired by Wise and Richard Snodgrass, computer science professor at the
University of Arizona, is behind the 60th anniversary issue, which also
highlights the emergence of SIGGRAPH, the Special Interest Group that
focuses on computer-generated graphics, arts, design, and entertainment.
The changing trends in undergraduate education in computer science are also
discussed in an article. The special section is available for free online
at
http://portal.acm.org/cacm60.
Recreating the Feel of Water
Technology Review (05/21/07) Ross, Rachel
Researchers at Hokkaido University in Sapporo, Japan, have created a new
way to simulate the feel of flowing water in two virtual-reality
simulations--one simulates fishing and the other kayaking. Most research
into haptics focuses on giving the user the feeling of touching a solid
object, but Hokkaido University associate professor Yoshinori Dobashi says
creating the sensation of liquids is a difficult task that requires solving
the complex mathematical equations known as the Navier-Stokes equations.
These equations must be solved continuously to keep pace with the
ever-changing movement of water. "The computation of the force field has to
be completed and
updated within 1/500th of a second," Dobashi says. "This is almost
impossible." Previous attempts to recreate the feel of liquids were
limited to two-dimensional models because 3D models were thought to be too
processor-intensive to perform in real time, Dobashi says. However, he
says this newest simulation is more realistic because it works in three
dimensions. To create a real-time, 3D simulation, Dobashi and his team
created a model that approximates real-world forces acting on a fishing rod
or paddle by performing part of the calculations in advance. The forces of
different water velocities and different paddle or fishing lure positions
are precalculated, meaning only the velocity of the water as the user moves
the paddle or rod needs to be calculated in real time. Once the velocity
has been calculated, the appropriate forces are applied to the user's hand.
Dobashi admits that forces have not been calculated for every possible rod
and paddle position, but he hopes to fill in the gaps and create two-player
kayak races.
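The precomputation scheme Dobashi describes can be pictured with a short
sketch: precompute a table of forces over a grid of water speeds and paddle
positions offline, then interpolate into that table at haptic update rates.
The Python fragment below is only a toy illustration; the grid ranges, the
drag formula standing in for the offline solve, and all numeric values are
invented for the example and are not the Hokkaido model.

    import numpy as np

    # Toy illustration of the precompute-then-interpolate idea described above.
    # Offline: tabulate a force response over a grid of water speeds and paddle
    # depths. Runtime: do a cheap bilinear lookup instead of solving the
    # Navier-Stokes equations at every haptic update.
    water_speeds = np.linspace(0.0, 3.0, 31)    # m/s, offline grid axis
    paddle_depths = np.linspace(0.0, 0.5, 26)   # m,   offline grid axis

    # Stand-in for the expensive offline solve: drag grows with speed^2 and depth.
    force_table = np.array([[0.5 * 1000.0 * 0.05 * d * v ** 2
                             for d in paddle_depths]
                            for v in water_speeds])

    def haptic_force(speed, depth):
        """Runtime step: bilinear interpolation into the precomputed table."""
        i = int(np.clip(np.searchsorted(water_speeds, speed) - 1,
                        0, len(water_speeds) - 2))
        j = int(np.clip(np.searchsorted(paddle_depths, depth) - 1,
                        0, len(paddle_depths) - 2))
        tv = (speed - water_speeds[i]) / (water_speeds[i + 1] - water_speeds[i])
        td = (depth - paddle_depths[j]) / (paddle_depths[j + 1] - paddle_depths[j])
        f00, f01 = force_table[i, j], force_table[i, j + 1]
        f10, f11 = force_table[i + 1, j], force_table[i + 1, j + 1]
        return (1 - tv) * ((1 - td) * f00 + td * f01) + tv * ((1 - td) * f10 + td * f11)

    print(round(haptic_force(1.2, 0.3), 1), "N")   # force fed back to the user's hand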
I, Coach: What's in Store in Robotics
Computerworld (05/21/07) Anthes, Gary
Carnegie Mellon University computer science and robotics professor Takeo
Kanade has done revolutionary work in computer vision, smart sensors,
autonomous land and air vehicles, and medical robotics. Kanade believes the
popular view of the future of robotics will soon change: rather than
replacing human labor, robots will enhance it, acting as advisers and
coaches. Within 10 years, computers may have the capacity to recognize
emotions, gestures, and behaviors through vision
sensors alone, Kanade predicts. Kanade has been working on
"quality-of-life technology" that would particularly benefit the elderly
and people with disabilities. One of Kanade's ideas is what he calls
inside-out vision: instead of monitoring a subject from the external
environment, the subject wears a small camera that lets the computer observe
the user's actions from the user's own point of view, making it easier to
predict intent and respond accordingly, such as opening a door when the
computer detects a person walking directly toward it. Computer vision could
also be used to monitor assembly lines to ensure all parts are being
properly installed, and alert the worker when they are doing something
incorrectly. Kanade sees robots becoming a type of coworker, performing
the tedious jobs such as moving parts to and from the product line, while
humans continue to handle process and quality control. Kanade says home
robots will be lightweight robots that mix entertainment, information, and
mobility assistance. The home itself can become robotic as well,
monitoring what a person is trying to do and helping, such as regulating
food temperature while cooking. Kanade says two major challenges hinder
the development of home and quality-of-life robots. The first is the
capacity to recognize human needs, and the second is how to make robots
capable of recognizing a mistake and responding faster.
Rensselaer, IBM, and New York State Unveil New
Supercomputing Center
Rensselaer News (05/18/07)
Rensselaer Polytechnic Institute, working with IBM and the state of New
York, has created the Computational Center for Nanotechnology Innovations
(CCNI). The $100 million university-based supercomputing center is
designed to continue work on nanoscale semiconductor technology and develop
nanotechnology innovations in energy, biotechnology, arts, and medicine.
The heart of the facility, an IBM Blue Gene supercomputer, is capable of more
than 80 teraflops, and when fully operational, the center will provide more
than 100 teraflops of computing power. Rensselaer's vice president of
information services and technology and CIO John E. Kolb said the ability
to design and manufacture smaller and faster semiconductors is vital to
maintaining Moore's Law, which states that the number of transistors in a
given area doubles every 18 to 24 months. Currently, circuit components are
about 90 nanometers wide, and according to the International Technology
Roadmap for Semiconductors, the components need to shrink to 45 nm by 2010,
32 nm by 2013, and 22 nm by 2016. Such extremely small sizes result in
different physical performances, requiring the use of supercomputers to
design chips and predict their performance. CCNI will provide researchers
with the tools necessary to perform a variety of computational simulations,
ranging from interactions between atoms and molecules to modeling the
behavior of complete devices. The 100-plus teraflop system is made up of
massively parallel Blue Gene supercomputers, POWER-based Linux clusters,
and AMD Opteron processor-based clusters, which makes CCNI one of the 10
most powerful supercomputer centers in the world, and the most powerful
university-based supercomputer.
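The roadmap figures above imply a simple scaling argument: transistor density
grows roughly with the inverse square of the feature size, so halving the
feature size roughly quadruples the number of devices that fit in a given
area. A back-of-the-envelope check in Python, using the 90 nm figure quoted
above as the baseline:

    # Rough density check of the ITRS node sizes quoted above, relative to 90 nm.
    roadmap_nm = {"today": 90, "2010": 45, "2013": 32, "2016": 22}
    base = roadmap_nm["today"]

    for label, node in roadmap_nm.items():
        density_gain = (base / node) ** 2   # area scales with the square of feature size
        print(f"{label:>5}: {node:>2} nm node -> ~{density_gain:.1f}x the 90 nm density")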
Computing Grid Helps Get to the Heart of Matter
eWeek (05/20/07) Musich, Paula
The success of experiments performed using the world's largest particle
accelerator, the Large Hadron Collider (LHC), will greatly depend on a
worldwide, high-speed network that will allow scientists to utilize 100,000
computers, primarily PCs, to process the mountains of data generated by the
experiment. The network uses a 10-Gbps backbone to link 11 scientific data
centers, creating the core of the world's largest international scientific
grid service. Francois Grey, director of IT communication at CERN in
Geneva, where the experiments will take place in November, said the LHC is
a 27-kilometer underground ring that accelerates protons to high energy
states and smashes them together, causing an explosion of particles. Huge
underground detectors pick off the signals from the collision every 25
nanoseconds. The data will be stored at a rate of hundreds of megabytes
per second. While the experiment is intended to answer questions
surrounding what unknown particles exist in the universe, the LHC Computing
Grid project will teach network engineers valuable lessons on running and
managing one of the largest 10-Gbps networks. The full network will
include about 200 institutions in 80 countries, some with their own large
data centers. The network will collect and process a predicted 15
petabytes of data per year for the 15-year length of the project, although
data could be studied for many years after the experiments stop.
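The quoted figures are mutually consistent: 15 petabytes per year averages
out to a few hundred megabytes per second, matching the "hundreds of
megabytes per second" storage rate. A quick arithmetic check in Python:

    # Consistency check of the LHC data figures quoted above.
    petabytes_per_year = 15
    seconds_per_year = 365 * 24 * 3600

    avg_rate_mb_per_s = petabytes_per_year * 10 ** 15 / seconds_per_year / 10 ** 6
    total_pb_over_project = petabytes_per_year * 15           # 15-year project length

    print(f"average storage rate:  ~{avg_rate_mb_per_s:.0f} MB/s")   # roughly 475 MB/s
    print(f"data over the project: ~{total_pb_over_project} PB")     # 225 PB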
New Software Can Identify You from Your Online
Habits
New Scientist (05/16/07) Marks, Paul
Microsoft is developing software that will be able to determine the
identity of a Web user by analyzing an individual's history of browsing the
Web. Speaking at the World Wide Web 2007 Conference in Banff, Canada, last
week, software engineer Jian Hu from Microsoft's research lab in Beijing
and colleagues said analytics software can make use of a wide range of
profiles, such as women's preference for searching the Web for health,
medical, and religious information, to perform a probabilistic analysis.
Raw information could be obtained from different sources, including
cookies, a PC's cache of Web pages, or proxy servers for keeping records of
Web surfing history. The software is already accurately guessing gender
and age. The researchers plan to continue to refine the software with an
eye toward accurately guessing occupations, qualifications, locations, and
names as well. "Because of the hierarchical structure--language, country,
region, city--we may need to design algorithms to better discriminate
between user locations," says Hu's colleague Hua-Jun Zeng. The research
has raised concern among some industry observers who believe the software
would violate privacy protections in many countries.
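The article does not describe Microsoft's algorithm in detail, but the kind
of probabilistic profiling it sketches can be illustrated with a toy naive
Bayes classifier over browsing-history categories. Everything below, the
category counts, the priors, and the two-class setup, is invented for
illustration and is not the researchers' model.

    import math
    from collections import Counter

    # Toy naive Bayes guesser: given a list of visited-page categories, pick the
    # demographic class whose (made-up) category profile best explains the history.
    training = {
        "female": Counter(health=40, religion=25, shopping=20, sports=5, tech=10),
        "male":   Counter(health=10, religion=10, shopping=15, sports=35, tech=30),
    }
    priors = {"female": 0.5, "male": 0.5}

    def guess(history):
        best_cls, best_score = None, float("-inf")
        for cls, counts in training.items():
            total = sum(counts.values())
            score = math.log(priors[cls])
            for category in history:
                # Laplace smoothing so an unseen category does not zero out the score.
                score += math.log((counts[category] + 1) / (total + len(counts)))
            if score > best_score:
                best_cls, best_score = cls, score
        return best_cls

    print(guess(["health", "religion", "health"]))   # 'female' under these toy counts
    print(guess(["sports", "tech", "sports"]))       # 'male' under these toy counts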
A Thinking Person's Thinking Robot
Burlington Free Press (VT) (05/18/07) Johnson, Tim
University of Vermont assistant professor of computer science Josh Bongard
is currently best known for his work on a robot called "Starfish" that is
able to sense when it has been damaged and create a new way to walk,
compensating for the damage. Bongard's revolutionary work on robotic
self-awareness and artificial intelligence goes beyond computer science and
incorporates aspects of biology, psychology, neuroscience, mathematics,
economics, and even philosophy. Bongard sees self-awareness as having a
sense of one's own body, and Starfish is apparently capable of that.
Bongard and his Cornell University collaborators assigned Starfish the task
of moving across a surface without telling it how to move. Instead, the
robot was programmed with a "series of playful actions" to test the locomotive
possibilities for itself, much like the movements of a human infant,
according to Bongard. After the robot learned to walk successfully, the
researchers removed a part from one of its legs, rendering the leg useless,
and the robot went through another learning process to figure out how to move.
Instead of hobbling on three legs, as its creators expected, Starfish
basically dragged itself along. Currently, Bongard is working on
developing a program that can learn what an individual's preference is and
predict other preferences that person will have. Bongard is also using
computer simulations to create other robots capable of learning, a feature
that would be highly useful in dangerous places, disaster sites, and on
other planets. Bongard knows that a robot capable of learning and adapting
is controversial, but he believes there is a difference between being
self-aware and being conscious. Bongard says that it is theoretically
possible to create a conscious robot, but he is unsure if it would have any
practical value.
In Future, Everything Will Be a Computer
National Post (CAN) (05/19/07) Medley, Mark
Hundreds of researchers, futurists, computer scientists, academics, and
innovators who convened in Toronto discussed the third wave of computing,
known in many circles as "pervasive computing," in which miniature
computers will be embedded in virtually everything, making an impact on
practically every aspect of our daily lives. "Everything could be a
computer, and could be used or adopted for a purpose by any individual,"
posited University of Toronto computer science professor Khai Truong.
Applications showcased at the conference included "dynamic book
recommendations" in which the act of picking up a book triggers the
transmission of reviews and related titles to the consumer's cell phone;
kitchens and utensils equipped with sensors so that a knife, for instance,
can determine what kind of food it is chopping; remote control of household
appliances by cell phone; and ad displays that customize themselves to the
time of day or the person passing by. But the advantages of pervasive
computing could come at the cost of personal privacy, raising the specter of
Orwellian government surveillance and, if every person's whereabouts become
public knowledge, even more insidious acts such as kidnapping, theft, and
stalking. Adam
Greenfield, author of "Everyware: The Dawning Age of Ubiquitous Computing,"
sounded a warning that pervasive computing technology could give rise to
"unpredictable and undesirable emergent behaviors," and outlined a series
of guidelines that must be followed to prevent this. The guidelines
recommend that the technology default to harmlessness, be self-disclosing,
be deniable, and be conservative of both face and time.
Untangling the World Wide Web
Globe and Mail (CAN) (05/19/07) Dreher, Christopher
The remarkable growth of Internet users and applications has been
accompanied by a concurrent rise in the hazards and frustrations of going
online, leading mavens such as Internet pioneer and MIT researcher David
Clark to conclude that "the situation is not getting better, it's getting
worse." This state of affairs is giving rise to a philosophy that promotes
a complete rethink of the Internet's architecture, and a group of Stanford
University computer scientists formally launched the Clean Slate Design for
the Internet Project in April with this goal in mind. "Instead of trying
to fix problems for today, we're trying to figure out what the Internet
should look like in 15 years," explains Clean Slate research director Nick
McKeown, who points out that the ideas on which Internet technology was
originally founded have gone unchanged for four decades; this is in
contrast to other high-tech fields, where innovation is continuous.
National Science Foundation program director Guru Parulkar will take
over the Clean Slate Design project in August, and the new concepts
stemming from the effort will be tested on Parulkar's Global Environment
for Network Innovations beta network at a cost of between $300 million and
$400 million. Among the advantages that could come from Clean Slate is
faster and more secure data communication between handhelds thanks to
improved wireless spectrum allocation; fulfillment of the potential of
converged TVs, DVD players, and home computers; elimination of the costs
associated with
preventing spam, malicious hacking, and other threats; and broader
physical/virtual world interaction. Experts say the biggest challenges
facing the creation of a "clean slate" Internet are non-technical. McKeown
says the network infrastructure is unprofitable and lacks economic
sustainability, while striking a balance between privacy and security
issues is another area of concern.
Tough, Tech-Smart--And Female
Forbes (05/17/07) Rosemarin, Rachel
Women have not made the same progress in technology as they have in the
business sector; in fact, it appears that women have regressed. According
to the U.S. Bureau of Labor Statistics, in 2000, 23.4 percent of network
and computer system administration jobs belonged to women, but in 2006 that
number dropped to 16.6 percent. The National Center for Women and
Information Technology also reports a decrease in the number of women
receiving undergraduate computer science degrees; in 1985, 37 percent went
to women, but only 21 percent did so in 2006. However, while the number of
women in technology may be declining on the whole, women are holding a
slightly larger percentage of leadership roles in technology than before.
Seven percent of CIOs were women in 2000, but 9 percent were women at the
beginning of this year, according to Sheila Greco Associates. Although a
handful of tech companies have women in major leadership roles, if women
continue to leave computer technology divisions before they have an
opportunity to be promoted to high-ranking technical positions, the number
of female CIOs and their counterparts will drop, along with other
positions. Cora Carmody, CIO at defense contractor SAIC, says some women
in IT are not exposed to the crucial management skills necessary to become
an executive in an IT position. Some tech-oriented women have left careers
in corporate IT and have explored startups and venture capital funding,
although in 2006 only 4.08 percent of venture funding went to tech startups
with female chief executives, down from 5.72 percent in 2001. Victoria
Usherenko, managing partner at headhunting firm Liaison Search
Associates, says that peer support needs to continue beyond education and
into careers, as women do not have the same network of peers that men do to
recommend each other, which artificially keeps the number of female IT
executives down.
Cell Phones Now Helping to Guide the Blind
Computerworld (05/14/07) Kaza, Juris
Swedish researchers completed and tested a prototype of a voice-based
navigation system for the sight-impaired and the blind in late 2006, but
say improvements must be made in several areas, including positioning.
"Standard GPS is not good enough, so we are evaluating other positioning
technologies, including some rather accurate dead-reckoning software to
account for the user's movements, and, eventually, the use of RFID and
Bluetooth tags on certain objects and obstacles," says Tomas Uppgard, CEO
of local firm Mobile Sorcery. The researchers are developing a system that
would allow users to receive voice advisories from a mobile phone. The
prototype makes use of a Nokia 6300 Symbian phone with earphones and a
separate GPS unit linked to the phone through Bluetooth.
The voice guide will call out, for example, an upcoming turn or alert the
user that a vehicle is blocking the crosswalk, as well as provide updates to
other users and accept data entered by users. "The
metaphor is to give them a spoken map and enough detail to make their own
decisions," says Uppgard. Further testing is scheduled for the summer,
with full deployment projected for 2010.
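The dead-reckoning software Uppgard mentions fills the gaps between GPS fixes
by advancing the user's position from sensed steps and heading. A minimal
sketch of the idea in Python; the step length, headings, and coordinate
convention below are illustrative assumptions, not parameters of the Swedish
prototype.

    import math

    # Minimal pedestrian dead-reckoning sketch: advance an (east, north) position
    # from a list of (step_count, compass_heading_degrees) legs between GPS fixes.
    def dead_reckon(start_xy, legs, step_length_m=0.7):
        x, y = start_xy
        for step_count, heading_deg in legs:
            heading = math.radians(heading_deg)
            x += step_count * step_length_m * math.sin(heading)   # east component
            y += step_count * step_length_m * math.cos(heading)   # north component
        return x, y

    # Walk 20 steps due north, then 10 steps due east from the last good GPS fix.
    print(dead_reckon((0.0, 0.0), [(20, 0), (10, 90)]))   # roughly (7.0, 14.0)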
Reaping Results: Data-Mining Goes Mainstream
New York Times (05/20/07) P. BU3; Lohr, Steve
Data-mining programs are being used by an increasing variety of
professions as computing and mathematical analytics are being easily
adapted for individual situations. Shortly after becoming chief of police
in Richmond, Va., Rodney Monroe implemented a program that, in addition to
collecting traditional police information such as emergency calls and
police reports, uses neighborhood demographics, payday schedules, weather,
traffic patterns, and sports events to predict where crimes might occur.
The program found a high number of robberies in Hispanic neighborhoods on
paydays because a large percentage of the population used check-cashing
services instead of depositing the money in a bank, making them easy
targets. The crime rate in Richmond dropped about 20 percent
during the first year of using the program. Productivity research is also
using data mining to examine new areas that were once considered difficult
to measure. Data mining software, for example, allows employers to collect
information from office workers who handle ideas and information from
customers, suppliers, colleagues, and marketers. A company can establish a
system that tracks email traffic, instant messaging, and other digital
communications, removing personal information for security and privacy, to
study the flow of work and ideas through social networks. Meanwhile,
retail chains, including Wal-Mart and Kohl's, are using computing and math
analysis to accurately predict where to send products, such as which sizes
of clothes should go to which stores. Appliance maker Whirlpool uses analytics
software to automatically scan warranty reports, manufacturing, supplier,
sales, and service data to cut warranty costs and improve quality.
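The Richmond example above amounts to conditioning historical incident data
on contextual features such as neighborhood and payday. A heavily simplified
sketch of that idea in Python; the incident records and feature names are
invented for illustration and this is not the Richmond system.

    from collections import defaultdict

    # Toy version of the crime-forecasting idea: bucket past incidents by context
    # (neighborhood, payday) and rank contexts by their historical robbery rate.
    history = [
        {"neighborhood": "A", "payday": True,  "robbery": True},
        {"neighborhood": "A", "payday": True,  "robbery": True},
        {"neighborhood": "A", "payday": False, "robbery": False},
        {"neighborhood": "B", "payday": True,  "robbery": False},
        {"neighborhood": "B", "payday": False, "robbery": False},
    ]

    def risk_by_context(records, keys=("neighborhood", "payday")):
        hits, totals = defaultdict(int), defaultdict(int)
        for record in records:
            context = tuple(record[k] for k in keys)
            totals[context] += 1
            hits[context] += record["robbery"]
        return {context: hits[context] / totals[context] for context in totals}

    for context, rate in sorted(risk_by_context(history).items(),
                                key=lambda kv: -kv[1]):
        print(context, f"robbery rate {rate:.0%}")   # ('A', True) tops the ranking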
U. of I. Intends to Play Key Role in Nationwide Digital
Humanities Effort
University of Illinois at Urbana-Champaign (05/09/07) Lynn, Andrea
The University of Illinois plans to become a leading player in a national
initiative to "digitize the humanities" by designing and building
environments where scholars can carry out research across a wide spectrum
of literature through the use of high-performance computing tools in shared
digital networks. Dean of U of I's Graduate School of Library and
Information Science John Unsworth has netted a pair of major technology
grants from the Mellon Foundation to lead multi-institutional digital
humanities projects. One grant is a two-year, $1 million allocation for
the Metadata Offer New Knowledge (MONK) text-mining collaboration, which
integrates and extends the Nora and WordHoard projects to build "an
inclusive and comprehensive text-mining and text-analysis tool-kit of
software for scholars in the humanities," according to Unsworth. He also
serves as one of the co-principal investigators on the Software Environment
for the Advancement of Scholarly Research (SEASR), an infrastructure
project that received a $1.2 million Mellon grant in March. The goal of
SEASR is to increase the usability of content collections through the
merging of the National Center for Supercomputing Applications'
Data-to-Knowledge and Unstructured Information Management Architecture into
an analytical framework that can be easily learned and adapted by
researchers across all disciplines. Unsworth says the MONK and SEASR
projects are complementary. He is also involved with the $2.6 million ECHO
DEPository digital preservation research and development project. Unsworth
notes that one of the major barriers to the advancement of digital
humanities development is that the field "is often still regarded with
suspicion at the department level as somehow less than scholarly."
DNSSec Too Costly, Difficult to Use, IGP Hears
Washington Internet Daily (05/18/07) Vol. 8, No. 96, Piper, Greg
The Internet Governance Project addressed the issue of the DNS Security
Extensions protocol (DNSSec) during a meeting that was attended by Matt
Larson of VeriSign's naming and directory services unit. Larson explained
that there has not yet been much demand for DNSSec, and it would require a
multimillion-dollar investment from VeriSign; thus the company is "looking
at this landscape very carefully." Speakers at the meeting said IT
departments would not be able to handle the litany of technical problems
posed by DNSSec, which could result in an entire root zone being disabled.
NIST's Scott Rose predicted that it would take a massive, headline-grabbing
DNS attack to produce substantial momentum for the adoption of DNSSec.
Rose noted the long odds that NIST faces in trying to bundle DNSSec with
the .gov domain. The managers of ccTLDs were put off by the Department of
Homeland Security's recent pronouncement about ensuring that the master
root key remains under American control.
Microsoft Man Seeks to Re-Engineer the Web
Inquirer (UK) (05/16/07) Grossman, Wendy M.
Microsoft's Kim Cameron wants to create a mechanism for knowing who you
are talking to on the Web and is working to re-engineer the Internet. His
first big break came last year with the publication of the paper "The Laws
of Identity" and proposals for A Privacy-Compliant Identity Metasystem,
which serve as the foundation for the CardSpace identification technology
that is found in Windows Vista and is available for download for XP. The
technology is in beta at many sites and is "beginning to ramp up," says
Cameron. CardSpace makes use of the Information Card, which is generated
securely on the user's machine, for authentication, and it can be selected
from a graphical display. To complete authentication, the card produces a
security token rather than sending the information in the card to the site.
The graphical display
verifies information such as the owner of the site and the location of the
underlying business. There are some concerns about the idea of an
"identity layer," such as its threat model and use case. Cameron was asked
why Microsoft did not join the Liberty Alliance during the recent ACM
conference on Computers, Freedom, and Privacy, and he said the widescale
vendor initiative was different in that "it doesn't give the user their own
agent under their control."
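The token step Cameron describes can be pictured with a loose sketch: rather
than shipping the card's contents to the relying site, the card signs only
the requested claims into a short-lived token. The snippet below is an
illustrative stand-in; real CardSpace tokens use WS-Trust and SAML formats
with proper key management, none of which is modeled here, and the card data
and key are invented.

    import hashlib, hmac, json, time

    # Illustrative only: sign just the claims a site asked for, instead of sending
    # the whole card. The card contents and key below are made up for the example.
    CARD = {"name": "Alice", "age_over_18": True, "email": "alice@example.org"}
    CARD_KEY = b"secret-key-held-on-the-users-machine"

    def issue_token(requested_claims, audience):
        body = {
            "claims": {c: CARD[c] for c in requested_claims if c in CARD},
            "aud": audience,
            "exp": int(time.time()) + 300,     # short-lived token
        }
        payload = json.dumps(body, sort_keys=True).encode()
        signature = hmac.new(CARD_KEY, payload, hashlib.sha256).hexdigest()
        return payload, signature

    # The relying site asked only for an over-18 claim; the e-mail address and the
    # rest of the card never leave the user's machine.
    token, sig = issue_token(["age_over_18"], "https://shop.example")
    print(token.decode())
    print("signature:", sig[:16], "...")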
What Is the Analogue for the Semantic Web, and Why Is
Finding One Important?
University of Southampton (ECS) (05/15/07) schraefel, m.c.
The growth of the Semantic Web depends on the provision of an analogue that
helps people comprehend it as well as identifies the potential
opportunities it can facilitate, and m.c. schraefel of the IAM Group
proposes a model based on the typical framework for Web interaction. She
maintains that the Semantic Web can be envisioned as a Notebook + Memex, in
comparison to the Page + Links representation of the current Web; this
model presents new challenges for basic human/computing interaction. The
notebook and the memex focus on engagement with, development of, and
cooperation with information, as work in progress. "While the Semantic Web
has the strong Memex-y potential to support dynamic and automatic
associations across inter-related domains, the notebook emphasizes both the
more writerly and the more personal side of engaging with information,"
schraefel writes. "The notebook as model also says that it is critical for
new Semantic Web tools to support this creative, explorative process
directly and explicitly." She explains that social networks of data
sources that are of critical interest to many considering the Web's
structure are a missing element in projections of future human/computing
interaction, which is why a new paradigm of the next-generation Semantic
Web must consider not just computing's use/reuse paradigm and
machine-produced analysis/inference, but human voices as well. "If we
believe that this intermixing of voices and intermixing of idea generation
represents an important set of axes and continuums to support, then our
vision will need to be for tools to support these kinds of
interactions--interactions we carry out regularly in the physical world,
but are less well supported in the digital space," schraefel concludes.
Perils and Pitfalls of HPC Spotlighted at LCI
Conference
HPC Wire (05/18/07) Vol. 16, No. 20, Montry, Gary
The risks of high performance computing were highlighted at this year's
Linux Cluster Institute (LCI) Conference, with Horst Simon kicking off the
first day with a discussion about HPC's current status, hardware
architecture, and the political aura surrounding the effort to construct
the first petaflop machines. Sandia National Laboratory's Robert Ballance
presented a keynote speech concerning the assembly and performance issues
of Red Storm, given the fact that the machine was bigger than any previous
machine the institution had created. There was an intense focus on
parallel I/O systems and the difficulty of getting them to scale on large
cluster systems; some of the tests are so lengthy that the production system
would be unavailable for an unacceptable amount of time, forcing I/O system
administrators to carry out simulations of the I/O systems on smaller
clusters. Hardware and
software sessions comprised the second day of the LCI conference, with
emphasis on the Defense Advanced Research Projects Agency's HPCS program
and the presentation of various HPC machine descriptions, including
Sandia's upgrade to Red Storm. The third day of the conference began with
Peter Ungaro's keynote presentation in which he outlined Cray's plan to
compete in the HPC marketplace with future generations of clusters that
feature 10,000 to 1 million cores, and projected the emergence of a 1
million-core system within half a decade. He argued that commodity Linux
clusters are too generalized to be reliable, scalable, and available when
scaled past approximately 1,000 sockets, while Cray believes the successful
scaling of future megaclusters will depend on custom value-added
simplifications supplied by vendors.