Panel Sees Progress Made in Cybersecurity
CNet (02/14/06) Evers, Joris
In the three years since President Bush approved the National Strategy to
Secure Cyberspace, the country's vulnerability to cyberattacks has been
reduced, a panel of experts at the RSA Conference agreed, though more work
needs to be done to keep pace with the increasing sophistication of
cyberattacks. "Are we making progress? Yes. Do we have to hit some
afterburners? I think that answer is yes also," said panelist Daniel
Mehan, the former CIO at the Federal Aviation Administration. Mehan gives
the state of government cybersecurity a rating between a D and a C+, noting
the 500 percent increase in the number of incidents that CERT tracked from
2000 to 2003. The government has significantly improved its coordination
with industry in responding to threats; the recent Cyber Storm mock attack
demonstrated a high level of information sharing between agencies and
companies. Andy Purdy, the acting director of the National Cyber Security
Division, noted that the government could still simplify security for
consumers, step up its efforts to protect children on the Internet, and
raise awareness about the hazards of file sharing. Independent security
consultant Howard Schmidt agreed with Purdy that software must become more
secure, noting also that small and midsize businesses must bolster their
protections against phishing attacks and other threats that compromise
users' personal information. Increased regulation patterned after Europe's
cybercrime draft treaty could also help, as well as an effort to strengthen
the telecommunications infrastructure.
Flawed Election Machines Leave Maryland Voters Guessing
Baltimore Sun (02/15/06) P. 13A; Rubin, Avi
Maryland's direct recording electronic (DRE) voting machines are the least
transparent voting system available, argues Avi Rubin, computer science
professor at Johns Hopkins University. Rubin claims that without an
auditing capability, voters must take on faith that the machines recorded
their ballots accurately, and that they have not been tampered with by
malicious programmers or election insiders. Installing malicious code in
DREs is far easier than detecting it, and absent an auditing mechanism,
verifying election accuracy is impossible. The most easily verified
auditing mechanism is a paper trail. A recent University of Maryland study
examined the Diebold machines currently in use in Baltimore County, finding
that none contained sufficient verification technologies. The study failed
to examine alternatives to the Diebold machines, however. Several states
have scrapped the Diebold machines in favor of more transparent
alternatives, and 26 states now require their systems to produce a
voter-verified paper trail. Legislation has recently been introduced in
Maryland to mandate paper records, as well as periodic spot checks of the
machines and complete public disclosure in the event that voting
discrepancies occur. Optical scan machines would satisfy these
requirements, and are among the least expensive systems available. While
optical scan machines might complicate the job of election coordinators and
poll workers, the resulting transparency and accuracy would far outweigh
such a minor inconvenience, Rubin says.
For information about ACM e-voting activities, visit
http://www.acm.org/usacm
Cooler Supercomputers
Technology Review (02/14/06) Roush, Wade
Clustered systems have facilitated the supercomputer era and made
computationally intensive applications such as climate modeling and protein
folding possible, though stringing together thousands of independent
machines amplifies the possibility of memory failure and chronically flirts
with overheating. Indeed, the cooling systems at many of the larger
supercomputing centers are severely strained. SGI's Columbia supercomputer
was built for the NASA Ames Research Center, and ranks as the
fourth-fastest system in the world with 20 superclusters, each containing
512 processors, though it is still essentially air-cooled. In a recent
interview, SGI's Eng Lim Goh described Project Ultraviolet, an initiative
to develop more energy-efficient supercomputers. Goh leads the project,
which aims to develop a marketable system by the decade's end. To enhance
memory functions and boost longevity, Goh has been using ASICs that
interface with an Intel Itanium processor and provide both the user and the
operating system with a comprehensive view of the memory. Goh opted for an
off-the-shelf product to keep costs down, and has been enhancing its
reliability with intelligent agents in the chipset to expunge components of
the memory that may be prone to failure. Goh acknowledges that this
technology is comparable to the "self-healing" applications that IBM and
others have been promoting. To address the heat problem, Goh believes the
most effective technique will be to move heat more quickly. To reduce the
run-time heat of applications, Goh hopes to reduce communication delays,
provide consistently high bandwidth, and reduce load imbalance.
Memory latency, a major source of application heat, arises when an
application must fetch a piece of data from main memory because it is not
in the processor's cache.
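To make that cost concrete, the toy model below (an illustration only; the
cache geometry is an arbitrary assumption, not SGI's design) counts how
often a direct-mapped cache must go out to memory for sequential versus
scattered access patterns:

    import random

    def count_misses(addresses, cache_lines=64, line_size=8):
        # Toy direct-mapped cache: each set holds the tag of one cached line.
        cache = {}
        misses = 0
        for addr in addresses:
            block = addr // line_size
            index = block % cache_lines
            tag = block // cache_lines
            if cache.get(index) != tag:
                misses += 1  # data is outside the cache: a trip to memory
                cache[index] = tag
        return misses

    sequential = list(range(4096))
    scattered = random.sample(range(2**20), 4096)
    print(count_misses(sequential))  # one miss per cache line: 512
    print(count_misses(scattered))   # nearly one miss per access

Sequential access touches each cache line once, while scattered access pays
the memory-latency penalty almost every time, exactly the time- and
heat-costly behavior Goh wants applications to avoid.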
New Microchips Shun Transistors
Wired News (02/14/06) Hudson, John
Notre Dame electrical engineering professor Wolfgang Porod and his team of
researchers have developed the first functional prototype of a chip built
around magnetism rather than transistors. As scientists continually strive
for new techniques to keep pace with Moore's Law, magnetic islands that
move binary code could lead to wireless devices that offer much greater
density and processing power than their transistor-based counterparts. The
resulting chips would boot up almost immediately, and consume less power
while emitting less heat. The non-volatile memory would be immune to power
disruptions, and would continue to store data after the power was shut off.
The chip's reprogrammable architecture could be particularly useful for
specialty applications, such as video games and medical diagnostic devices.
"The value of magnetic patterning in storage devices such as hard drives
has been known for a long time," said Porod. "What is unique here is that
we've applied the patterning concept to the actual processing." The 110 nm
magnets can be constructed to mirror the logic gates of conventional
transistors, enabling their binary capabilities. Combining the NAND and
NOR gates, the researchers built a universal logic gate into their chip,
enabling it to perform every basic arithmetic function required for
computer processing. Nanoscale magnets have proven a more effective basis
than quantum dots for transistorless processing, an approach known as
magnetic quantum cellular automata. A pulsed magnetic field initiates the logic
operations inside the processor, creating a magnetostatic attraction and
repulsion that flips the adjacent magnetic fields. The Notre Dame research
is the first application of magnetic fields to a chip that both processes
and stores digital information.
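As a rough sketch of what a universal gate buys (this models only the
Boolean logic, not the Notre Dame magnetic hardware itself), every basic
logic function can be composed from a single NAND primitive:

    def NAND(a, b):
        return 1 - (a & b)

    def NOT(a):
        return NAND(a, a)

    def AND(a, b):
        return NOT(NAND(a, b))

    def OR(a, b):
        return NAND(NOT(a), NOT(b))

    def XOR(a, b):  # the standard four-NAND construction
        c = NAND(a, b)
        return NAND(NAND(a, c), NAND(b, c))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))

A chip whose magnets realize NAND and NOR can therefore implement any
combinational circuit by composition, which is the sense in which the gate
is universal.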
Tech Executive to Run MIT Media Lab
Boston Globe (02/15/06) Weisman, Robert
Veteran technology executive and entrepreneur Frank Moss has taken the
helm of MIT's Media Lab, succeeding co-founder Nicholas Negroponte, who
left last September to focus on his nonprofit, One Laptop per Child, and
interim director Walter Bender, who is taking a two-year leave of absence
from MIT to head software development at One Laptop. MIT hopes to
re-center the lab's research under Moss' direction on areas such as
education, health care, and aging. Moss has expressed his desire to apply
the lab's high-tech research to fields that will have a broad social
impact, and to boost industry sponsorship to expand the lab while it
integrates more closely with other MIT facilities. Comparing running the
lab to running a business, Moss said, "You have to strike a balance between
having academic freedom and doing different types of research, and having
the work
sponsored by companies that want to see research commercialized." The
Media Lab has a rich tradition of taking an interdisciplinary approach to
developing cutting-edge technology. Rival schools began developing similar
labs that chipped away at the Media Lab's funding, and the technology
economy's collapse at the beginning of the decade drove companies away from
investing in academic research. Moss' business experience could make him a
good match for the lab as it attempts to overhaul its image and recapture
corporate funding. Moss will continue the lab's focus on the human-computer
interface, but says that he will not revive the
lab's international efforts, which have stalled in both Dublin and
Bangalore. Moss favors the lab's biomechatronics research that pairs human
tissue with robotics to create prosthetic limbs, the Hyperscore graphical
composing program, and its studies in sociable robotics that aim to advance
human-computer interaction. Moss also hopes to collaborate with the
neighboring Computer Science and Artificial Intelligence Laboratory
(CSAIL).
Q&A: A Lost Interview With ENIAC Co-Inventor J. Presper Eckert
Computerworld (02/14/06) Randall, Alexander
The watershed computing event of the 20th century--the unveiling of the
ENIAC computer--celebrated its 60th anniversary yesterday. Co-creator J.
Presper Eckert died in 1995, but a previously unpublished interview sheds
new light into the origins of ENIAC. Before ENIAC, computing relied on
electromechanical devices to perform basic calculations and solve linear
differential equations. Studying the operations of a machine developed by
MIT's Vannevar Bush, Eckert got the idea to replace mechanical integrators
with electrical ones, and eventually came to believe that the entire system
could function electronically. The resulting ENIAC system could solve
second-order differential equations and add 10-digit numbers together
50,000 times faster than a human could. With an unprecedented 18,000
vacuum tubes, ENIAC was built in a 30 foot by 50 foot room at the
University of Pennsylvania's Moore School in Philadelphia. Many of the
tubes and circuits were off the shelf, though Eckert also invented numerous
circuits, and pioneered integrator circuits and registers. The machine's
functions were split among the accumulator, initiator, master programmer,
multiplier, divider/square root, gate, buffer, and function tables. While
none of the hardware used in ENIAC appears in modern computers, Eckert
notes that the concepts of subroutine and internal memory both originated
with ENIAC. Contrary to ENIAC lore, Eckert claims that a vacuum tube
failed only once every two days, and that it could be identified and
replaced within minutes. Finalized too late to contribute to the war
effort, ENIAC's first significant use was in the development of the
hydrogen bomb.
Internet Firms to Defend Policies
Washington Post (02/15/06) P. D1; Noguchi, Yuki
Yahoo!, Microsoft, and Google will claim to have struck a balance between
business interests and human-rights concerns when they go before Congress
today to defend their corporate policies toward China. Yahoo! will testify
before the House subcommittee on Africa, Global Human Rights, and
International Operations and the subcommittee on Asia and the Pacific that
the Internet's presence has a beneficial effect on closed societies even
when it is subject to censorship. Yet some human rights proponents plan to
argue that American corporations are waiving their ethical duties by
complying with Chinese law, which often comes down hard on free speech.
Google announced last month that it would expurgate certain results on the
Chinese version of its search engine; in December, Microsoft's MSN
shuttered a dissident reporter's blog; and in 2005, Yahoo! supplied the
Chinese government with email data resulting in the imprisonment of another
dissident journalist. "If you're on the ground in China, you have to
comply with the [local] law," said Yahoo! general counsel Michael
Callahan, who is slated to testify today. "Fundamentally, being there
transforms lives, society and economies." Callahan's argument--one echoed
by Google and Microsoft--is that when any government requests information,
the company is frequently in the dark about how that information will be
employed. Reporters Without Borders' Lucie Morillon said a combination of
corporate self-regulation and government oversight could help prevent
future crackdowns on dissident reporters. The
State Department announced on Tuesday the establishment of a Global
Internet Freedom Task Force that will monitor the censorship policies and
information access restrictions of other governments, and make policy
recommendations that maximize Internet access while minimizing government
suppression of information.
Alumni Plan New Google Alternative
Stanford Daily (02/15/06) Fagliano, Stephanie
Stanford University alumni Anand Rajaraman and Venky Harinarayan are
developing a search technology called Kosmix that they hope will improve on
Google's search engine by allowing users to refine their searches by
defining categories. Categorizing data on the Web could significantly
improve search accuracy, though it would require a tremendous amount of
processing, given the Web's ad hoc organization. Further complicating the
effort is the wide variance that would inevitably emerge among different
users' interpretations of a given category. Still, category-based searches
could dispense with many of the random and irrelevant results that are
returned when searching through Google. Stanford computer science
professor Bill Dally notes that once development is completed, a new search
engine can be operational within weeks, though popularity among consumers
is often fickle and hard to predict. Students can learn the basic
algorithms to build a search engine in college, though applying that
background to the frenetic reality of the Internet is an entirely different
proposition. Stanford computer science professor and ACM Fellow Gio
Wiederhold believes that for a startup such as Kosmix to succeed, it will
have to find a market niche that has broad appeal for both consumers and
advertisers. Carving out a foothold in the search engine space also
requires consumers to see room for improvement on the enormously popular
Google technology, which is perhaps the most significant uncertainty that
Kosmix faces.
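The mechanics of category-refined search can be suggested with a small
sketch; the documents, categories, and scoring below are invented for
illustration and are not Kosmix's actual technology:

    docs = [
        {"id": 1, "text": "jaguar repair manual",    "category": "autos"},
        {"id": 2, "text": "jaguar habitat and diet",  "category": "animals"},
        {"id": 3, "text": "jaguar top speed records", "category": "animals"},
    ]

    def search(query, category=None):
        # Score by simple term frequency, optionally restricted to a category.
        terms = query.lower().split()
        hits = []
        for doc in docs:
            words = doc["text"].split()
            score = sum(words.count(t) for t in terms)
            if score and (category is None or doc["category"] == category):
                hits.append((score, doc["id"]))
        return sorted(hits, reverse=True)

    print(search("jaguar"))                      # ambiguous: cars and cats
    print(search("jaguar", category="animals"))  # the category removes noise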
Internet Television, E-Science, Smart Optics for Detecting Structural Failures
EurekAlert (02/09/06)
At this March's Optical Fiber Communication Conference and
Exposition/National Fiber Optic Engineers Conference in Los Angeles,
researchers will unveil their latest optics-based innovations and
discoveries. Among the technical presentations will be a session on making
IPTV practical and affordable, featuring the research of Samrat Kulkarni
and his colleagues at Lucent Technologies Bell Labs on transport networks
that enable providers to reapply their existing infrastructure while
delivering high-speed signals to subscribers' homes. Physicists from the
University of Ottawa will present an optical system that monitors for
indications of structural defects in natural-gas pipes, concrete columns,
and other important pieces of infrastructure. The Distributed Brillouin
Sensor detects cracks, deformation, and other structural flaws through
fiber optics. The conference will also see Georgia Tech's Gee-Kung Chang
present his experimental network enabling the simultaneous transmission of
high-speed wireline and wireless broadband signals. The system could
enable transmission speeds up to 2.5 Gbps, and would integrate with
existing networks. MCNC's Gigi Karmous-Edwards will present her research
on optical-fiber networks that could integrate scientists around the globe
through massive data sharing enabled by enhanced network transport
protocols and configurations. Infinera's Fred Kish will outline his
company's method for the mass production of integrated photonic chips with
data rates of 100 Gbps. Time Warner Cable's Bob Harris will detail the
architecture of 10G networks capable of delivering voice, video, and
Internet services to keep pace with Moore's Law. Other presentations
will feature a new laser-based manufacturing technique for flat-panel
displays and the world's first bismuth-doped silica fiber laser.
Women Engineers Share Advice
Daily Princetonian (02/13/06) Mu, Euphemia
Princeton University recently hosted the Women in Science and Engineering
Conference, a day-long, inter-university forum featuring leadership
workshops, panels on career choices, and advice on how to balance a career
and family. "We all thought that a lot of the issues discussed in the
leadership workshops are relevant to women," said Melissa Carroll, a
graduate student in computer science and neuroscience at the university.
"These women often don't have people to talk to." The conference was
created to promote discussion and provide networking opportunities for
female undergraduate and graduate students. Several leadership workshops
were held on time management skills, negotiation skills, methods of dealing
with difficult people, and the characteristics of a successful leader. The
workshop discussions were moderated by professors Jennifer Rexford, Kyle
Vanderlick, Catherine Peters, and former ACM president Maria Klawe. Klawe
led the workshop on characteristics of a successful leader, which include
being an attentive listener, never losing sight of the big picture,
encouraging others to follow, and communicating effectively. "The first
rule of success is to fail openly and often," said Klawe. "If you don't
fail often, you are not setting your standards high enough." The
conference was attended mostly by women, and sponsored by the Graduate
Women in Science and Engineering group.
For information on ACM's Committee on Women and Computing, visit
http://www.acm.org/women
Virtual Reality Prepares Soldiers for Real War
Washington Post (02/14/06) P. A1; Vargas, Jose Antonio
For a generation of soldiers raised on video games, real-life combat can
seem more like an outsized simulation of "Halo" or "Full Spectrum Warrior"
than a physical reality. The Army acknowledges that Ctrl+Alt+Del is as
ingrained as the alphabet in today's soldiers, and it has capitalized on that
fact in its training. While they cannot replace field experience,
simulations have become an integral part of today's military training, and
have indeed changed warfare itself. "The technology of games has
facilitated a revolution in the art of warfare," said David Bartlett, who
heads the Pentagon's computer-related training, pointing to an increase in
battle preparedness that comes from playing first-person shooter games.
Objective comparisons between soldiers of one generation and another are
typically anecdotal and inherently problematic, though military experts
agree that modern soldiers do have a more thorough knowledge of weapons
than previous generations. Video games are a favorite pastime for many
soldiers in Iraq and Afghanistan on their off hours. "Over there in Iraq,
I think playing those games helped," said Sgt. Sean Crippen. "It kept me
on my toes. It taught me what to do and what not to do." Other observers
point to the reality gap between video games and real combat, claiming that
many soldiers weaned on first-person shooting scenarios find the genuine
article to be much more wrenching. Some battle-hardened soldiers abandon
shooting games when they return home, claiming that the games only bring
them back to a level of violence they had hoped to forget. By contrast,
others play them as avidly as they did before the war, admiring features
such as the realistic simulation of a soldier's heartbeat as the enemy
approaches.
From Cattle to Chemicals: Colorado School Seeks to Expand Grid Computing Efforts
Network World (02/13/06) Brown, Bob
Colorado State University's grid computing program was founded in 2004 to
advance the technology used to track animals, drawing on more than $2
million in grants from the Colorado Institute of Technology, as well as
contributions from Sun, the Department of Homeland Security, and others.
The Colorado Grid Computing Initiative's (COGrid) first project was to
produce animal tracking data to help identify cattle in the event of an
outbreak of mad cow disease. Patrick Burns, CSU's associate vice president
for information and instructional technology, has taught classes in grid
computing using the system, and has overseen its application to more than a
dozen departments throughout the school. In the project's second phase,
Burns plans to help educate other Colorado institutions on how to use the
system. Then, he hopes that other schools will receive donations for
similar systems to expand the grid. Burns' experience with the grid has
taught him that "it still takes a lot of work to map an algorithm onto an
advanced computing system and after that get good performance out of it."
Burns adds that "the human element is still very much required to design
algorithms, implement them, and test ways of doing things."
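A small sketch can suggest the kind of tuning Burns means; the per-record
workload below is invented, and the chunk size is the sort of mapping
decision that separates a working parallel program from a fast one:

    from multiprocessing import Pool

    def score(record):
        # Stand-in for real per-record analysis (e.g., animal tracking data).
        return sum(ord(c) for c in record)

    if __name__ == "__main__":
        records = ["cow-%06d" % i for i in range(100000)]
        with Pool(4) as pool:
            # chunksize batches work across workers: too small and scheduling
            # overhead dominates; too large and load imbalance creeps in.
            results = pool.map(score, records, chunksize=1000)
        print(len(results), max(results))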
In the Key of 'T' (Technology)
Knox College News (02/10/06) McGaughey, Alison
Aaron Lepkin, a senior at Knox College, recently designed software for one
of his professors to use in the classroom. Lepkin, a computer science
major with a minor in vocal performance, developed the software as part of
an independent study project. The software will be used to help Jeremy
Day-O'Connell's Introductory Music Theory students enhance their basic
music-reading skills. "I really like the interdisciplinary nature of
this," says Lepkin. "My interests are really varied, from computer science
to music to linguistics to cognitive science. So this project is combining
all of those things." Day-O'Connell says a lot of students lack the
foundational skills needed to read music fluently, which is a necessary
skill in analyzing musical compositions. He says, "The software will offer
supplemental drills for the less experienced students and will help me make
more room in the classroom for really talking seriously about music, which,
after all, is the point of the class." Lepkin's project consists of
designing a framework for posing problems for students to solve. After a
student has selected a particular category of question, the software
generates random musical questions from that category.
The user solves the problem, and then the software lets the user know
whether the solution is right or wrong.
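A minimal sketch of such a drill loop, with one hypothetical question type
standing in for Lepkin's actual exercises, might look like this:

    import random

    NOTE_NAMES = ["C", "D", "E", "F", "G", "A", "B"]

    def interval_question():
        # Generate a random diatonic-interval question and its answer.
        a, b = sorted(random.sample(range(7), 2))
        prompt = "What interval number is %s up to %s?" % (NOTE_NAMES[a],
                                                           NOTE_NAMES[b])
        return prompt, str(b - a + 1)

    CATEGORIES = {"intervals": interval_question}

    def drill(category):
        prompt, answer = CATEGORIES[category]()
        reply = input(prompt + " ")
        print("Correct!" if reply.strip() == answer
              else "No, the answer is " + answer)

    drill("intervals")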
The Laptop Tug of War: Speed Versus Battery Life
Electronic Design (02/02/06) Vol. 54, No. 3, P. 37; Tuite, Don
The dramatic performance gains of recently unveiled laptop technology from
the likes of Intel, Hewlett-Packard, Dell, and Apple raise novel power
management issues. For example, Apple CEO Steve Jobs claimed Intel's new
Centrino Core Duo mobile technology platform quadruples the performance of
previous Apple laptops with PowerPC processors; but compared with the
Pentium M, the highest-performance mobile processor prior to the Core Duo,
the dual-core chip draws roughly 20 percent more maximum power and needs
voltage regulation that responds about 16 percent faster. Still, the
combination of the Intel Core Duo processor, the mobile 945 Express
chipset line, and the PRO/Wireless 3945ABG network connection
significantly reduces power consumption in the Core Duo platform. The
processor can function at very low voltages, and power dissipation is
decreased in the active state through advanced methods that keep clock and
signal switching to a minimum, enabling the chip to rapidly enter and exit
low-power states in order to save power while maintaining responsiveness.
The chipset can power down with the processor in its low-frequency
power-conserving states via "dynamic bus parking." The Core Duo processor
is also equipped with the Advanced Thermal Manager that outfits each core
of the processor with a digital temperature sensor and a thermal monitor to
facilitate more refined fan control. Although such technologies will give
the new laptops as much battery life as their earlier iterations, they do
not eliminate the need for their power supplies to support instantaneous
demand.
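The flavor of that per-core thermal management can be suggested with a
simplified loop; the sensor readings, thresholds, and fan steps below are
invented for the sketch, since the real logic lives in the processor and
platform firmware:

    import random

    def read_core_temp(core):
        # Stand-in for a per-core digital temperature sensor, in degrees C.
        return 45 + random.random() * 40

    def fan_speed_for(temp_c):
        if temp_c < 60:
            return 0.3   # quiet operation
        if temp_c < 75:
            return 0.6
        return 1.0       # full speed before any throttling

    for step in range(5):
        hottest = max(read_core_temp(core) for core in (0, 1))  # two cores
        print("hottest core %.1f C -> fan at %.0f%%"
              % (hottest, 100 * fan_speed_for(hottest)))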
A Pill, a Scalpel, a Database
InformationWeek (02/13/06) No. 1076, P. 38; McGee, Marianne Kolbasuk
Information technology is making strides in three critical areas of
medicine: The filtering and delivery of information to the patient's
bedside, allowing for personalized care; formatting existing data to obtain
a richer, more helpful picture of the patient's condition; and the use of
analytics to integrate data that yields new insights. IBM Healthcare and
Life Sciences' Brett Davis says the interim between the discovery of new
medical breakthroughs and their standard application--which can take as
long as 17 years--is decreasing thanks to the use of IT and other new tools
for research and collaboration. In addition to helping enable more
customized patient treatments, health-care IT can cut the time and cost of
testing new drugs and improve the development of safer, more targeted drugs
via data mining and analysis. Analytic, pattern-recognition, and
decision-support software can examine data from countless sources, and
could emerge as some of the most critical health-care tools. But
delivering more timely and customized bedside care requires a national
infrastructure for electronic health data that facilitates the exchange of
standardized medical records, which President Bush flagged as a national
goal to be realized by 2014. "The key tipping point will be in getting the
national health IT infrastructure in place," notes Davis. Other challenges
include the increasingly pressing issues of security, privacy, and ethical
data usage as more and more health-care information becomes electronically
accessible. Progress can also be hindered by hesitancy among some
researchers to share information.
Miniaturized Power
Scientific American (02/06) Vol. 294, No. 2, P. 72; Choi, Charles Q.
Bell Labs aims to develop a mass-produced "nanobattery" that can be built
in with other circuitry on a chip; outfitted with nanometer-scale
electrodes, the nanobattery could remain inert for long periods, providing
energy only when needed. The nanobattery concept stems from Bell Labs
researcher Tom Krupenkin's work with "nanograss," in which miniature
superhydrophobic pillars exhibit hydrophilic properties when a voltage is
applied to a liquid electrolyte, causing the pillars to draw droplets down
between them, where the electrolyte reacts with any compound at the bottom.
Krupenkin reasoned that a nanobattery could be powered by the liquid, and
he notes that nanograss would not only ease the miniaturization of reserve
batteries, but also permit the design of batteries that activate only
sections of the nanograss field at one time instead of the entire field.
Bell Labs' corporate parent, Lucent Technologies, is jointly developing the
nanobattery with mPhase, and the fruits of their labors include a prototype
that generates current. The model uses zinc anodes and manganese dioxide
cathodes, while a silicon dioxide/fluorocarbon nanomembrane rests on a
zinc-coated silicon floor. A zinc chloride electrolyte solution lies above
the porous, electrowetting nanomembrane. The anode and cathode patches lie
separate from one another in the unactivated state, but once immersed in
the electrolyte in the activated state, the patches physically connect and
react to produce electricity. Nanobattery technology's potential
breakthrough applications include more environmentally friendly power
sources, infrequently used sensors, environmental monitoring devices that
can send data over longer distances, and enhancement of products such as
cell phones, radio-transmitting pet collars, and medical implants.
Denial-of-Service Attack-Detection Techniques
Internet Computing (02/06) Vol. 10, No. 1, P. 82; Carl, Glenn; Kesidis,
George; Brooks, Richard R.
A survey of methods for detecting denial-of-service (DoS) attacks points
to the need to distinguish between network-based flooding attacks and
abrupt but legitimate surges in traffic known as flash events. In a flooding
attack scenario, the attacker sends the victim a large volume of network
traffic, causing bottlenecks that can severely hamper legitimate
workloads; no software vulnerability or specific conditions are needed
to execute such an attack. Locally installed DoS attack-detection
strategies can shield potential victims, while remotely installed
approaches can be used to spot propagating attacks; most IT departments opt
for local detection in which detectors are located at the potential victim
resource or at a router or firewall inside the victim's sub-network. A
variety of detectors distributed across three attack-detection method
categories--activity profiling, change-point detection, and wavelet
analysis--were analyzed, and several core problems were outlined. Rigorous
testing of the surveyed detectors was impossible partly because
comprehensive test data, testing environments, and standards are
unavailable, although the authors hope efforts such as the Cyber Defense
Technology Experimental Research Project will solve this problem. Also,
none of the detector schemes have nominal-traffic measures covering the
whole of potential network conditions, while researchers for the most part
offer no guidance on how much each detector's multiple operating parameters
can vary and thus impact performance. Real-world implementation issues
were also omitted in the studies. The authors conclude that though all the
surveyed detectors yield promising results in limited testing, none can
completely address the detection challenge; they reason that the optimum
solution is to combine various strategies and supplement them with the
participation of seasoned network operators.
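As an illustration of the change-point category only (the survey's
detectors are considerably more sophisticated), a minimal CUSUM-style test
flags a sustained jump in traffic rate:

    def cusum(samples, mean, drift, threshold):
        # Accumulate positive deviations from the nominal mean; a flood keeps
        # pushing the statistic up until it crosses the alarm threshold.
        s = 0.0
        for t, x in enumerate(samples):
            s = max(0.0, s + (x - mean - drift))
            if s > threshold:
                return t  # index at which the change is flagged
        return None

    # Invented packets-per-second trace: nominal load, then a sudden flood.
    trace = [100, 104, 98, 101, 99, 102, 100, 400, 420, 410, 415]
    print(cusum(trace, mean=100.0, drift=5.0, threshold=300.0))  # -> 8

A flash event would also trip such a detector, which is why the authors
stress combining strategies and operator judgment rather than relying on
any single scheme.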