USACM Urges Congress to Build in Safeguards for Automated
Employment Checks
AScribe Newswire (05/06/08)
Testifying Tuesday at a Congressional hearing on employment verification
systems and their impact on the Social Security Administration, ACM U.S.
Public Policy Committee Chairman Eugene H. Spafford highlighted several
potential problems in a pilot system operated by the U.S. Department of
Homeland Security intended to allow employers to electronically check
employee work eligibility. Spafford urged Congress to include sufficient
safeguards to ensure that employers and employees are adequately protected
from technical failure and system abuse. Congress is considering several
proposals to expand the DHS E-Verify automated employment verification
system, including requiring employers to verify all new hires and existing
employees using an expanded version. Verification is now optional for
employers. "As technologists, we are acutely aware of the limitations and
failure modes of current information technology," Spafford said. "Any
system must take the extreme failure modes into account and provide
appropriate safeguards to avoid injury to the blameless seeking gainful
employment to better themselves." Spafford said the three biggest concerns
are the accuracy and timeliness of system results, the security and privacy
protection afforded to information kept in the system, and the technical
feasibility of multiple approaches to creating such a system. He said
those same concerns apply to the REAL ID Act, US-VISIT, and a U.S.
immigration and border management system.
Delaying Data Could Cut Net's Carbon Footprint
New Scientist (05/05/08) Inman, Mason
Unlike most PCs, which adjust their energy consumption to their workload and
shut down when idle, network hardware draws about the same amount of power
whether active or idle. U.S. academics and researchers at Microsoft and
Intel are working on techniques to reduce this consumption by subtly
modifying the flow of network traffic to ease the labor of routers and
servers. Microsoft Research scientists are focusing on the energy
consumption of Internet servers, having found that activities such as
instant messaging and online gaming hold connections open for long periods,
which keeps the servers active and power-hungry. Study leader Jie Liu says
that routing new connections to servers that are already busy leaves other
servers bearing a light workload, and those "are the candidates for shutting
down when the total load is low." Carnegie
Mellon University's Diana Marculescu says Liu and colleagues successfully
demonstrated that power can be reduced substantially using this technique
without affecting the user experience in any perceptible way. Meanwhile,
delaying data flowing into a network by only a few milliseconds can yield
energy savings of about 50 percent, according to the University of
California at Berkeley's Sergiu Nedevschi and colleagues at Intel Research
labs. Information can also be clustered into fewer, larger bursts so that
the hardware can rest between chunks, and the researchers' simulations
indicate that between 40 percent and 80 percent of the energy consumed by
network hardware can be saved with either strategy using current
hardware.
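
The batching strategy is simple enough to sketch. The following Python
fragment is a minimal illustration of the coalescing idea, not the
Berkeley/Intel researchers' actual code; the delay budget and the packet
stream are invented for the example.

```python
# Minimal sketch of traffic coalescing: hold arriving packets briefly and
# forward them as one burst, so link hardware can drop into a low-power
# state between bursts. Delay budget and packets are invented values.
DELAY_BUDGET_MS = 10          # max extra latency any packet may incur

def batch(packets):
    """packets: time-ordered iterable of (arrival_ms, payload) pairs."""
    batches, current, deadline = [], [], None
    for t, payload in packets:
        if deadline is None:
            deadline = t + DELAY_BUDGET_MS   # first packet opens a window
        current.append(payload)
        if t >= deadline:                    # window expired: flush burst
            batches.append((t, current))
            current, deadline = [], None
    if current:                              # flush whatever remains
        batches.append((deadline, current))
    return batches

stream = [(0, "p0"), (2, "p1"), (4, "p2"), (11, "p3"), (30, "p4")]
for send_time, burst in batch(stream):
    print(f"t={send_time}ms: burst of {len(burst)} packets")
```

The trade-off is a small, bounded increase in latency (here roughly 10 ms
per packet) in exchange for longer idle stretches in which routers and line
cards can sleep.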
Pentagon Wants Cyberwar Range to 'Replicate Human
Behavior and Frailties'
Wired News (05/05/08) Shachtman, Noah
Congress has told the Defense Advanced Research Projects Agency (DARPA) to
create a National Cyber Range as part of a $30 billion governmentwide
effort to prepare for digital warfare. To make the facility as realistic
as possible, DARPA has released a request for proposals that requires
contractors to provide robust technologies that emulate human behavior on
all nodes for testing all aspects of behavior. The range should produce
realistic chains of events between multiple users without scripting
behavior, implement multiple user roles similar to roles found on
operational networks, and change replicant behavior as the network
environment changes. Replicants also must simulate physical interactions
with peripherals such as keyboards and mice, drive all common applications
on a desktop environment, and interact with authentication systems,
including Defense Department authentication systems. The digital replicants
must exhibit believably human behavior 80 percent of the time. The facility should
also include realistic offensive and defensive opposition forces capable of
fighting military cyberwarriors in simulated combat. Contractors must
create 10,000-node tests using government-provided configuration files and
network diagrams in under two hours, and the nodes must be more than just
computers connected to a faux Internet.
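
The request for proposals specifies behavioral requirements rather than an
implementation. One plausible way to generate unscripted, role-dependent
behavior is a weighted-choice state machine; the sketch below is purely
illustrative, and its roles, actions, and weights come from neither DARPA
nor any contractor.

```python
# Purely illustrative: replicant behavior as weighted random choices whose
# weights depend on a user role and on network conditions, so action
# sequences are unscripted yet react to the environment. Values invented.
import random

ACTIONS = ["read_email", "browse_web", "edit_document", "idle", "retry_login"]

ROLE_WEIGHTS = {                 # baseline tendencies per role (assumed)
    "analyst":  [4, 3, 5, 2, 0],
    "sysadmin": [2, 2, 1, 1, 0],
}

def next_action(role, network_degraded):
    weights = list(ROLE_WEIGHTS[role])
    if network_degraded:         # behavior changes as the network changes
        weights[ACTIONS.index("retry_login")] = 5
        weights[ACTIONS.index("browse_web")] = 1
    return random.choices(ACTIONS, weights=weights)[0]

random.seed(7)                   # reproducible demo run
trace = [next_action("analyst", network_degraded=(t > 5)) for t in range(10)]
print(trace)
```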
IBM Launches Program to Attract More Hispanics Into
Technology Jobs
InformationWeek (05/06/08) McGee, Marianne Kolbasuk
Currently, about 5 percent of the U.S. population works in science,
technology, engineering, and math (STEM)-related jobs, but only 2 percent
of Hispanics work in those areas, says IBM's Adalio Sanchez. IBM has
launched an initiative to attract more Hispanics to STEM jobs. Schools,
government, corporations, and nonprofits can play a role in preparing more
Hispanics for STEM careers, but Sanchez says that many corporations are
trying to reinvent the wheel individually when a greater collaborative
effort is needed. IBM's efforts include programs to spark and nurture an
interest in STEM subjects among school-age Hispanics, including helping
Hispanics overcome the language barrier and providing two-way
English-Spanish email translation software to all U.S. schools. IBM is
also targeting problems faced by older students, such as the absence of
calculus instruction at many community colleges Hispanics attend and the
fact that 70 percent of U.S. science teachers are not certified in math and
science. To address these gaps, IBM is expanding its MentorPlace program so
that more of its employees provide online mentoring to students in U.S.
school districts with large Hispanic populations.
Discovery May Lead to Faster, More Powerful
Processors
Computerworld (05/06/08) Gaudin, Sharon
Princeton University researchers developed a way to automatically erase
tiny defects in computer chips, which would allow manufacturers to create
even smaller and more powerful processors. Princeton professor Stephen
Chou says the process, called Self-Perfection by Liquefaction, melts the
structures on the chip in a fraction of a millionth of a second, only long
enough for the resulting flow of liquid to be guided so it reforms into the
proper shapes. A pulse from a laser, similar to the one used in laser eye
surgery, is used because it heats only a thin top layer of the flawed
structures without causing any damage to interior structures. The pulse is
designed so it only melts semiconductor and metal materials and leaves
other parts of the chip untouched. In one experiment, the technique made
the edges of 70 nanometer-wide chromium lines more than five times
smoother, Chou says. The next step for Chou and his researchers is to try
the technique on 8-inch wafers.
Berkeley Lab Researchers Propose a New Breed of
Supercomputers for Improving Global Climate Predictions
Lawrence Berkeley National Laboratory (05/05/08) Wang, Ucilia
Researchers at the U.S. Department of Energy's Lawrence Berkeley National
Laboratory have proposed a new way to improve global climate change
predictions through the use of a supercomputer with low-power embedded
microprocessors. Michael Wehner and Lenny Oliker of the Berkeley Lab's
Computational Research Division, and John Shalf of the National Energy
Research Scientific Computing Center, say a new class of supercomputers
that uses embedded processor technology could create a cost-effective
machine for modeling climate conditions to better understand climate
change. In their paper, "Towards Ultra-High Resolution Models of Climate
and Weather," the researchers conclude that a supercomputer using about 20
million embedded microprocessors would cost $75 million to construct, but
would consume less than 4 megawatts of power and have a peak performance of
200 petaflops. Wehner, Oliker, and Shalf, along with researchers from the
University of California, Berkeley, and Colorado State University, are now
working to build a prototype system capable of running a new global
atmospheric model developed at Colorado State. "What we have demonstrated
is that in the exascale computing regime, it makes more sense to target
machine design for specific applications," Wehner says. "It will be
impractical from a cost and power perspective to build general-purpose
machines like today's supercomputers."
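
The quoted figures imply striking per-processor numbers, which simple
division makes concrete (only the aggregate values come from the paper; the
breakdown below is arithmetic, not a quotation):

```python
# Back-of-envelope breakdown of the proposed machine's stated figures.
procs = 20e6    # embedded microprocessors
cost  = 75e6    # dollars
power = 4e6     # watts ("less than 4 megawatts", so an upper bound)
peak  = 200e15  # flops (200 petaflops peak)

print(f"peak per processor:  {peak / procs / 1e9:.0f} gigaflops")   # 10
print(f"power per processor: {power / procs:.2f} W at most")        # 0.20
print(f"cost per processor:  ${cost / procs:.2f}")                  # 3.75
print(f"efficiency: at least {peak / power / 1e9:.0f} gigaflops/W") # 50
```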
Better Reading on the Small Screen
Technology Review (05/06/08) Greene, Kate
Researchers at the Fuji Xerox Palo Alto Laboratory (FXPAL) recently
demonstrated the Seamless Documents project, mobile phone technology that
can store a scanned document in a database and analyze its structure and
content. The analysis identifies sections and paragraphs and automatically
extracts key phrases that summarize each section, enabling users reading the
document on a mobile phone to jump to a section labeled with a key word or
to skip to the last paragraph on a page. The software also
automatically resizes images, section headers, and plain text when a user
is scrolling through the document. The first part of the Seamless
Documents project focuses on converting analog documents into digital
information that can be stored in a database and accessed using the
Internet and cell-phone networks. FXPAL software analyzes the document's
structure to find paragraph breaks, pictures, and section titles. The
software then automatically summarizes text and chooses key words and
concepts from each section to highlight for the user. The second part of
the project involves software that runs on mobile phones. The software
opens the document and displays extracted information. The user can see a
view of the document with key phrases in a large font, overlaid on top of
paragraphs and segments.
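
The article does not describe FXPAL's extraction method in detail. As a
rough stand-in, a simple frequency-based extractor conveys the flavor of the
section-labeling step; the stopword list, length cutoff, and sample text
below are invented:

```python
# Generic frequency-based key-phrase stand-in (not FXPAL's algorithm):
# rank a section's non-stopword terms by count and keep the top few.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "on", "for"}

def key_phrases(section_text, n=3):
    words = re.findall(r"[a-z]+", section_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 3)
    return [word for word, _ in counts.most_common(n)]

section = ("The software analyzes the document structure to find paragraph "
           "breaks, pictures, and section titles, then summarizes the text.")
print(key_phrases(section))   # e.g. ['software', 'analyzes', 'document']
```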
There's a Hole in My Bucket--and in the Data as
Well!
UCSD News (05/05/08) Zverina, Jan
University of California, San Diego researchers at the San Diego
Supercomputer Center (SDSC) are working with four other universities on the
Hydrologic Information System (HIS), an initiative to create a universally
accepted cyberinfrastructure for studying the nation's water resources.
HIS, backed by a five-year National Science Foundation (NSF) grant, is
being developed with the Consortium of Universities for the Advancement of
Hydrologic Science, a joint effort involving more than 100 universities
funded by NSF to advance research in hydrology. SDSC Spatial Information
Systems Laboratory director Ilya Zaslavsky, a key architect of HIS, says
a flood of data on water quality and quantity is collected every day from
thousands of sensor stations run by a variety of government agencies.
However, he says that despite this wealth of data, most of the
databases are incompatible with each other. HIS is currently in the first
phases of creating a Web-based cyberinfrastructure that will provide broad
and uniform access to comprehensive distributed collections of water data
from federal, state, and local repositories, and enable users to publish
new observation datasets. HIS will also enable better cross-scale analysis
of hydrologic cycles and processes on either a regional or continental
scale by combining a variety of climate models and integrating data from
neighboring disciplines.
Extracting the Structure of Networks
ZDNet (05/03/08) Piquepaille, Roland
Santa Fe Institute researchers Aaron Clauset, Cris Moore, and Mark Newman
have developed an algorithmic method that enables the automatic extraction
of the hierarchical structure of networks, and they say the results
"suggest that hierarchy is a central organizing principle of complex
networks, capable of offering insight into many network phenomena." The
researchers suggest a direct yet flexible hierarchical structure paradigm
that is applied to networks via machine learning and statistical physics
tools. Analysis of networks from three distinct disciplines shows that
hierarchical structures can predict missing network links with up to 80
percent precision, even in scenarios where only 50 percent of connections
are exposed to the algorithm. The May 1 issue of Nature details Clauset,
Moore, and Newman's work, and notes in the editor's summary that the data
describing complex networks is frequently biased or incomplete. An
accompanying article by Boston University's Sid Redner says that "focusing
on the hierarchical structure inherent in social and biological networks
might provide a smart way to find missing connections that are not revealed
in the raw data--which could be useful in a range of contexts." The SFI
researchers think that their algorithms are applicable to nearly all
network categories, ranging from biochemical networks to social network
communities.
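
Clauset, Moore, and Newman actually fit hierarchical random graphs by
sampling dendrograms, which is too involved to reproduce here. The
drastically simplified sketch below conveys only the core scoring intuition,
substituting a fixed two-level grouping and toy data for sampled
dendrograms:

```python
# Simplified illustration: a missing link is more plausible when the
# smallest hierarchical group containing both nodes is densely connected.
# Groups and edges are toy data, not from the Nature paper.
from itertools import combinations

groups = {"g1": {"a", "b", "c"}, "g2": {"d", "e", "f"}}
edges = {("a", "b"), ("b", "c"), ("d", "e")}

def group_density(members):
    pairs = list(combinations(sorted(members), 2))
    present = sum(1 for p in pairs if p in edges or p[::-1] in edges)
    return present / len(pairs)

def score_missing_link(u, v):
    """Score a candidate link by the density of the group holding both ends."""
    for members in groups.values():
        if u in members and v in members:
            return group_density(members)
    return group_density(set().union(*groups.values()))  # fall back to root

print(score_missing_link("a", "c"))  # 0.67: inside the dense group g1
print(score_missing_link("a", "d"))  # 0.20: cross-group, sparse at the root
```

Pairs that fall inside a densely connected group score high, which is what
makes hierarchical structure useful for flagging links missing from biased
or incomplete data.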
Are We Closer to a 'Matrix'-Style World?
MSNBC (05/05/08) Nelson, Bryn
Virtual reality technology is progressing rapidly thanks to advances in
computing power and graphics, and some researchers believe a "Matrix"-style
world where it is difficult to distinguish the real from the virtual is
right around the corner. "We've reached a level now where we can make very
realistic images: Five to 10 hours to make images more or less perfect,
where people say, 'Wow, that's a photograph!'" boasts University of
California at San Diego professor Henrik Wann Jensen. He says the same
level of photorealism in real-time animation is within reach thanks to
new graphics processors. Jensen is tackling the challenge of power
efficiency by slashing the computational costs of photon mapping and
ray-tracing algorithms. Whereas previous techniques sampled photons
randomly across a light source, Jensen's method maps the relevant photons
along the light's entire route, allowing a graphics interface to follow the
light around a scene and measure the degree of light absorbed, reflected,
or scattered by other objects. In touch-based interface technology, a
professor at Carnegie Mellon University's Robotics Institute has achieved a
notable advance using magnetic levitation: a device hovers above its base on
magnetic fields while a handle lets the user manipulate the position and
orientation of a virtual object on a computer display. The object's
interaction with obstacles is simulated tactilely through haptic feedback
generated by electrical coils. To advance technology
that could lead to empathetic virtual characters, researchers have
developed the Sensitive Artificial Listener, which can maintain a
human-computer dialogue for prolonged periods by employing its sensitivity
to non-linguistic signals as well as a repertoire of verbal and non-verbal
cues and statements.
RFID Testbed Rapidly Assesses New Antenna Designs
Georgia Institute of Technology (05/05/08)
Georgia Institute of Technology (GIT) researchers have designed an RFID
data collection system that can read hundreds of RFID tags simultaneously.
GIT professor Gregory Durgin says the researchers have designed a simple
anti-collision system that transmits multiple, unique signals
simultaneously, eliminating the back-and-forth process of existing RFID
systems. The system includes a transmitter, receiver, and emulator. The
emulator simulates the activity of an integrated circuit, and the
transmitter sends a signal to the antenna. Attaching the emulator to the
antenna allows the system to send a unique spread spectrum signal to the
receivers. Each antenna signal can be separated from the others, allowing
multiple tags to be read at once. Experiments found that the system can
measure the power strength and phase of up to 256 tags in the field of
view, which is an area in front of the reader of approximately 20 feet by
20 feet. The system's design enables the researchers to test new signaling
schemes and frequencies without having to design new chips. Durgin says
the system also can evaluate multiple custom antennas in a variety of
configurations in realistic tag environments more quickly and at less cost
than previous methods.
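
The release does not detail Durgin's signaling scheme, but the general
principle behind separating simultaneous transmissions with unique
spread-spectrum signatures is textbook code-division multiplexing. The
sketch below, with invented codes and data bits, shows how correlation
recovers each tag's bit despite the overlap:

```python
# CDMA-style separation: each tag spreads its bit with an orthogonal code;
# the receiver correlates the summed signal with each code to recover each
# tag's bit. Codes and bits are invented, not Georgia Tech's design.
import numpy as np

codes = np.array([[ 1,  1,  1,  1],    # rows of a 4x4 Hadamard matrix
                  [ 1, -1,  1, -1],
                  [ 1,  1, -1, -1],
                  [ 1, -1, -1,  1]])

bits = np.array([1, -1, 1, -1])             # one data bit per tag
received = (bits[:, None] * codes).sum(0)   # all four tags transmit at once

for i, code in enumerate(codes):
    print(f"tag {i} decoded bit: {int(np.sign(received @ code))}")
```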
Bringing Down the Language Barrier...
Automatically
ICT Results (05/02/08)
The European Union funded TC-STAR project is developing automatic
speech-to-speech translation technology. "For humans, translation is
difficult. We have to master both the source language and the target
language, and machine translation is significantly more difficult than
that," says FBK-irst researcher Marcello Federico. "To our knowledge,
TC-STAR has been the first project in the world addressing unrestricted
speech-to-speech translation." The TC-STAR project included the
development of three technologies. Automatic Speech Recognition (ASR) is
used to transcribe the spoken word, Spoken Language Translation (SLT)
translates the source language to the target language, and Text to Speech
(TTS) generates the spoken output. All of these technologies still need to
be perfected. A key innovation of the project was to combine the output of
several ASR and SLT systems to make the transcription and translations
phases more accurate. Based on the Bilingual Evaluation Understudy method
used to compare machine and human translations, the quality of the
translations improved by between 40 percent and 60 percent over the course
of the project, and 70 percent of the words were translated correctly,
though they were sometimes misplaced in the sentence. Components developed
by the TC-STAR project have been made available under an open source
license.
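
The Bilingual Evaluation Understudy (BLEU) method mentioned above scores a
machine translation by its clipped n-gram overlap with human reference
translations. The sketch below shows unigram precision only (full BLEU
combines several n-gram orders with a brevity penalty) and uses invented
sentences; note how it credits correct words even when they are misplaced,
echoing the project's results:

```python
# Clipped unigram precision, the simplest ingredient of BLEU. Each candidate
# word counts only up to the number of times it appears in the reference.
from collections import Counter

def clipped_unigram_precision(candidate, reference):
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum(min(count, ref[word]) for word, count in cand.items())
    return overlap / sum(cand.values())

mt_output = "the session is open now"
reference = "the session is now open"
print(clipped_unigram_precision(mt_output, reference))  # 1.0: right words,
                                                        # wrong order
```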
Internet in Danger of Losing Innovation
Network World (04/29/08)
Technologist and Oxford Internet Institute professor Jonathan Zittrain is
worried that the open technologies of the Internet and PC will encourage
malicious exploitation, and unless action is taken the abuse will provoke a
market response "that puts us that much more into the iPhone zone," by
which he means a migration toward isolationist closed-systems devices that
will choke off the Internet's innovation. "If [the iPhone and things like
it, including Web 2.0 app platforms] start substituting for the PC instead
of complementing it, we're in trouble," he warns. Zittrain says the
general challenge is finding a way to function successfully in an open
environment. He notes that standards are necessary, but the Net's
experimentalist architecture must be preserved for everyone. "I'd like to
make sure we maintain a hardware infrastructure that allows nerds to come
up with new stuff on their own and deploy it to the rest of us--a safety
valve against the more formalized/proprietary systems that will naturally
be competing too," Zittrain says. He distinguishes between the use of
appliances and a migration to appliances, pointing out that he is for the
former and against the latter. Though Zittrain acknowledges that
innovators will keep innovating even if everyone else opts for appliances,
he says the key issue is whether the innovators can make their innovations
readily available to the masses. Zittrain says he would like "a critical
mass of generative PCs" to still be present.
Quickies: Intelligent Sticky Notes
The Future of Things (04/30/08) Gingichashvili, Sarah
Massachusetts Institute of Technology Ambient Intelligence Group
scientists have developed Quickies, intelligent Post-it notes that combine
artificial intelligence, RFID, and ink-recognition technologies. Quickies
can communicate with PCs to relay any information written on them to a
computer for display on a variety of electronic devices. The Quickie
writer uses digital-pen hardware that translates the movement of the pen on
the surface of the paper note into digital information. The information
can be viewed at any time using Quickie software, which stores the notes as
images and converts the handwritten notes into computer text using
handwriting recognition algorithms. The Quickie software allows users to
browse through their notes and search for specific information or keywords.
Using a commonsense knowledge engine and computational AI techniques, the
software analyzes the notes and categorizes them to provide users with
reminders, alerts, messages, and relevant information. Each Quickie note
carries a unique RFID tag, so notes placed around a house or office can be
tracked, preventing users from losing a book or other object marked with a
Quickie.
Users can tell the software to remind them of important notes at specific
times, and the software can synchronize Quickie to-do lists with task lists
on mobile phones and laptops.
Eye-Tracking Interface Means Gamers' Looks Can
Kill
New Scientist (05/05/08) Perkins, Ceri
The European Union-funded Communication by Gaze Interaction (COGAIN)
project is developing eye-gaze software that will enable people with severe
motor disabilities to play 3D computer games at the same level as regular
gamers. The eye-gaze software helps disabled users protect their privacy
online by enabling them to function normally in virtual worlds, says lead
researcher Stephen Vickers of De Montfort University in Leicester, United
Kingdom. Eye-gaze systems shine infrared light from LEDs at the bottom of
a computer monitor and use infrared cameras to track the reflections from a
person's eyes. The systems can determine where a person is looking with
an accuracy of about 5 mm. COGAIN software includes a traditional
point-and-click interface as well as extra functions to speed up certain
commands. For example, glancing off screen in a particular direction
switches between functions, such as a mode that rotates the avatar or
viewpoint and one for selecting transparent icons that can be dragged onto
game objects to perform actions. A "gaze gesture" has also been added to temporarily turn
off the eye-gaze functions to prevent unintentionally selecting an item
while looking around the screen. "The eyes are perceptual organs, not
designed for pointing and selecting," Vickers says. "You can't turn them
off, like you can lift your hand off the mouse."
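
The article does not give COGAIN's selection parameters, but the usual
substitute for a mouse click in gaze interfaces is dwell selection: a click
registers when the gaze rests within a small radius for a set time. A
minimal sketch with illustrative thresholds:

```python
# Dwell-based "click" detection for a gaze interface. The dwell time and
# jitter radius are illustrative assumptions, not COGAIN's parameters.
import math

DWELL_MS = 500    # gaze must rest this long to count as a click
RADIUS_PX = 30    # jitter tolerance: eyes are never perfectly still

def detect_dwell_click(samples):
    """samples: time-ordered list of (timestamp_ms, x, y) gaze points."""
    anchor_t, anchor_x, anchor_y = samples[0]
    for t, x, y in samples[1:]:
        if math.hypot(x - anchor_x, y - anchor_y) > RADIUS_PX:
            anchor_t, anchor_x, anchor_y = t, x, y   # gaze moved on: reset
        elif t - anchor_t >= DWELL_MS:
            return (anchor_x, anchor_y)              # click at the anchor
    return None

gaze = [(0, 100, 100), (200, 104, 98), (400, 101, 103), (600, 99, 100)]
print(detect_dwell_click(gaze))   # (100, 100): gaze rested long enough
```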
Purdue Supercomputer Unboxed and Built by
Lunchtime
Purdue University News (05/05/08) Tally, Steve
Purdue University employees came together on May 5 to build the largest
supercomputer on a Big Ten campus, aiming to finish in a single day; they
were done by lunch. "The assembly
was finished much faster than we expected, and by noon we were doing
science," says Purdue chief information officer Gerry McCartney. By 1
p.m., more than 500 of the supercomputer's 812 quad-core nodes were running
1,400 research jobs from around the campus. The supercomputer consists of
812 servers and is capable of performing 60 trillion operations per second,
which would place it in the top 40 of the current list of the world's most
powerful supercomputers. McCartney says the computer leverages the
commodity nature of cluster computing by using standard computing parts.
"By using commodity computer servers to build our supercomputer, we didn't
have to fly in engineers or hire specialized technicians," he says. "We
were able to do it with our own IT staff in about four hours." The
supercomputer was funded by Purdue faculty members who contributed research
funds to the effort instead of purchasing equipment for their own labs.
Huge Databases Offer a Research Gold Mine--and Privacy
Worries
Chronicle of Higher Education (05/09/08) Vol. 54, No. 35, P. A10; Glenn,
David
Congress's rejection of the notion of a national "unit-record tracking"
system for student data has provoked speculation that states will bolster
their own education-data centers, which many researchers say would be
valuable resources for evaluating schools and colleges and helping them to
improve. However, there is a darker aspect to this possibility in the form
of potential privacy violations, and this is one reason why many states'
efforts to build such data clearinghouses have been sluggish. The
development of additional state data centers was advocated by a group of
scholars attending a recent conference organized by the National Academies
and the American Educational Research Association, who nevertheless
acknowledged that the trustworthiness of the systems would be undone by a
single serious breach of anonymity. Pending changes in Family Educational
Rights and Privacy Act (FERPA) regulations incorporate several
clarifications about how states, school districts, and colleges should
safeguard student confidentiality when working with databases, such as
requiring educational agencies to sign written agreements when they provide
data to outside researchers and mandating that the researchers return or
destroy the data when they are finished using it. The commentary
accompanying the draft regulations notes that privacy issues can remain
even with the total removal of names, Social Security numbers, and birth
dates from the data, so the regulations instruct each state to identify a
number below which data may not be disclosed for a specific "cell" of
students. "Even if FERPA did not exist, many of these challenges would
still be with us," says Thomas R. Bailey with Columbia University Teachers
College's Community College Research Center. "Colleges' IT systems aren't
set up to analyze this stuff. The data generally aren't stored in a way
that's ideal for research, because that's not the purpose for which the
system was designed."
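
The minimum-cell-size rule is straightforward to illustrate. In the
hypothetical sketch below, any cell of students sharing the same reported
attributes is suppressed if it falls under a state-chosen threshold; the
threshold and records are invented:

```python
# Hypothetical minimum-cell-size suppression: cells smaller than the
# state's threshold are withheld so small groups cannot be re-identified.
from collections import Counter

MIN_CELL_SIZE = 5    # invented threshold; each state would pick its own

records = [("District A", "grade 9"), ("District A", "grade 9"),
           ("District A", "grade 9"), ("District A", "grade 9"),
           ("District A", "grade 9"), ("District B", "grade 9")]

cells = Counter(records)
released = {cell: (n if n >= MIN_CELL_SIZE else "suppressed")
            for cell, n in cells.items()}
print(released)
# {('District A', 'grade 9'): 5, ('District B', 'grade 9'): 'suppressed'}
```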