E-Voting Systems 'Hacked' for Flaws
	San Jose Mercury News (CA) (07/23/07) Harmon, Steven
	
	As part of a "top-to-bottom" review ordered by California's Secretary of 
State Debra Bowen, several computer scientists recently finished two months 
of testing to see if the state's touch-screen voting machines should be 
certified for use in the upcoming elections.  The testing included general 
hacking and attempts to manipulate the voting systems.  Bowen is expected 
to give a report on Aug. 3, six months before the Feb. 5 presidential 
primaries, but election officials are worried that there may not be enough 
time if the systems are determined to be vulnerable.  The level of testing 
Bowen's hackers put the machines through is unprecedented and went farther 
than any other state or federal testing of electronic voting machines, 
according to Kim Alexander, president of the California Voter Foundation.  
"Previous testing looked at whether the systems work the way vendors said 
they're supposed to work," Alexander says.  "It didn't include scenarios 
that would crop up in real elections, such as a software attack or the 
taking down of a polling place through technical manipulation."  County 
registrars are worried that the decertification of any of the machines 
could lead to a shortage of machines on election day and some criticized 
the testing process as unnecessary.  "Show me where the systems have 
actually been hacked and where votes have been changed," says Contra Costa 
County registrar and California Association of Clerks and Election 
Officials president Stephen Weir.  "There's no evidence of it."  Weir also 
says the tests did not account for the defenses that clerks set up to 
prevent security breaches.  For information on ACM's e-voting activities, 
visit 
http://www.acm.org/usacm
	Humans Narrowly Beat Computer in Poker Battle
	Middle East Times (07/25/07) 
	
	Computer scientists are lauding the performance of the artificial 
intelligence program Polaris in a poker competition against the best poker 
players in the world, even though it lost.  Machines are able to regularly 
defeat humans in chess, checkers, and backgammon, but poker is viewed as 
more of a challenge because of its psychological nature, which involves 
intentional deception, the influence of unpredictable emotions and chance, 
as well as mathematics.  Phil Laak and Ali Eslami narrowly defeated Polaris, 
taking the fourth and final game by 570 points after one draw and one 
victory apiece for the humans and the machine.  Darse Billings, lead 
architect of the Polaris team at the University of Alberta, says the program 
played exceptionally well.  "I wouldn't be surprised if we can beat them 
tomorrow," says Billings, whose team will continue to improve Polaris.  
Eslami, a former computer consultant, says he has never had a more 
exhausting match.  "I'm surprised we won ... it's already so good it will 
be tough to beat in the future."  The championship took place during an 
artificial intelligence conference in Vancouver that was attended by 
approximately 1,000 scientists.
	From UF and IBM, a Blueprint for "Smart" Health 
Care
	University of Florida News (07/24/07) Hoover, Aaron
	
	New technology from the University of Florida and IBM creates what is 
being called the first roadmap for the widespread deployment of "smart" 
medical devices that, for example, monitor a person's blood pressure, 
temperature, respiration rate, and any other important medical information. 
 Electronically monitoring patients could eliminate the need for many 
visits to the doctor, which can be difficult for the elderly or sick, and 
could help doctors determine which patient should receive treatment first.  
"We call it quality-of-life engineering," says University of Florida 
professor of computer science and lead researcher on the project Sumi 
Helal.  The project provides the technological foundation for a company to 
manufacture and sell smart, networked, and user-friendly devices.  "UF and 
IBM both see the need and the opportunity to integrate the physical world 
of sensors and other devices directly into enterprise systems," says IBM's 
Richard Bakalar.  "Doing so in an open environment will remove market 
inhibitors that impede innovation in critical industries like health care 
and open a broader device market that's fueled by uninterrupted 
networking."  Helal previously created several devices that can provide 
caregivers with information on a patient's activity and other health 
indicators, including a microwave that can monitor the salt content of food 
and a device that records how many steps a person takes, but these devices 
needed to be installed by a team of engineers.  To create a device that is 
ready to use out of the box, Helal created middleware based on open 
standards that "self integrates" to provide a standard connection for any 
health care device to use.  "When you bring it in to the house and plug it 
in, it automatically provides its service and finds a path to the outside 
world," Helal says.
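The "self-integrating" behavior Helal describes, a device announcing its own service the moment it is plugged in, can be illustrated with a minimal sketch. All of the names below (DeviceRegistry, announce, BloodPressureCuff) are hypothetical stand-ins, not the project's actual middleware API.

```python
class DeviceRegistry:
    """Sketch of a self-integrating middleware hub (hypothetical API)."""
    def __init__(self):
        self.services = {}

    def plug_in(self, device):
        # On connection the device announces its service descriptor;
        # the hub registers it under a standard name, no engineers needed.
        descriptor = device.announce()
        self.services[descriptor["name"]] = device
        return descriptor

class BloodPressureCuff:
    def announce(self):
        return {"name": "blood_pressure", "unit": "mmHg"}

    def read(self):
        return {"systolic": 118, "diastolic": 76}

registry = DeviceRegistry()
registry.plug_in(BloodPressureCuff())
reading = registry.services["blood_pressure"].read()
```

Any device exposing the same announce/read contract would "provide its service" the same way, which is the point of building the middleware on open standards.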
	Submissions Sought for Reconfigurable Computing 
Workshop
	HPC Wire (07/18/07) 
	
	A workshop on High-Performance Reconfigurable Computing Technology and 
Applications will be held in conjunction with SC07, and submissions of 
papers on high-performance reconfigurable computing (HPRC) related topics 
will be accepted through Sept. 15, 2007.  Topics of interest include 
architecture of HPRC devices and systems; languages, compilation 
techniques, and tools for HPRC; algorithms, methodology, and best practices 
in application development for HPRC; applications of HPRC in science and 
engineering; and trends and latest developments in HPRC.  The best papers 
could be included in a special issue of the ACM Transactions on 
Reconfigurable Technology and Systems (TRETS).  The workshop is scheduled 
for Nov. 11, in Reno, Nev., and will give academic researchers and industry 
representatives a chance to learn more about trends and developments, and 
establish a research agenda for the years to come.
	Antique Engines Inspire Nano Chip
	BBC News (07/24/07) Fildes, Jonathan
	
	U.S. researchers have designed a nano computer that casts aside modern 
high-speed silicon chips in favor of a computing idea that was first 
proposed nearly 200 years ago.  In a paper published in the New Journal of 
Physics, the scientists said the mechanical computer would be built from 
nanometer-sized components and could be used in places that would damage 
silicon components.  "What we are proposing is a new type of computing 
architecture that is only based on nano mechanical elements," says 
University of Wisconsin-Madison professor Robert Blick, one of the authors 
of the paper.  "We are not going to compete with high-speed silicon, but 
where we are competitive is for all those mundane applications where you 
need microprocessors which can be slow and cheap as well."  The tiny, 
hypothetical computer could be built out of ultra-hard material such as 
diamond or piezoelectric material, which changes shape when exposed to an 
electrical current.  Unlike current computers, which use the movement of 
electrons on circuits to solve problems, the nano mechanical computer would 
use the push and pull of tiny parts to perform calculations.  The military 
is interested in a nano mechanical computer because, unlike electronic 
silicon computers, nano mechanical devices would not be vulnerable to 
electromagnetic pulses that would disable traditional computing systems.  
The researchers also believe that nano mechanical chips would be better at 
maintaining Moore's law than silicon chips because they run much cooler 
than silicon.  The University of Southampton's Michael Kraft says nano 
mechanical research may lead to hybrid chips because nano mechanics consume 
less power, which is becoming increasingly important for mobile devices.  
"The battery is the big bottleneck, so anything that reduces the power 
consumption is a real advantage," Kraft says.
	A Baby Step for Computer Learning
	ScienceNOW (07/23/07) Cevallos, Marissa
	
	Stanford University researchers have developed a program that is able to 
determine vowel categories from human sounds.  James McClelland, a 
cognitive neuroscientist, and colleagues modeled the project, which used 
so-called neural networks, on infants' ability to sort out vowel sounds on 
their own.  The researchers recorded 30 
mothers reading aloud to their infants, then fed the audio clips into a 
computer and limited the categories to "beet," "bait," "bit," and "bet" 
vowel sounds, but did not tell the neural network how many categories there 
would be.  The neural network analyzed thousands of sound clips to 
determine the number of categories, then quickly placed them into the vowel 
categories.  According to a report this week in the Proceedings of the 
National Academy of Sciences, the program placed the vowel sounds into the 
four categories more than 80 percent of the time.  With a more powerful 
neural network, McClelland wants to develop a program that can "lip-read" 
by analyzing sounds and a picture of a mouth.
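The notable trick in the summary, discovering the number of vowel categories without being told it, can be illustrated with a simple "leader" clustering sketch (not the researchers' actual neural network): a new category is opened whenever a sound falls far from every existing category center.

```python
import math

def leader_cluster(points, radius):
    """Open a new cluster whenever a point falls outside `radius`
    of every existing centroid; otherwise update the nearest one."""
    centroids, counts = [], []
    for p in points:
        dists = [math.dist(p, c) for c in centroids]
        if dists and min(dists) < radius:
            i = dists.index(min(dists))
            counts[i] += 1
            # Nudge the winning centroid toward the point (running mean).
            centroids[i] = tuple(c + (x - c) / counts[i]
                                 for c, x in zip(centroids[i], p))
        else:
            centroids.append(tuple(p))
            counts.append(1)
    return centroids

# Toy stand-ins for vowel measurements: two tight groups of points.
sounds = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.1), (5.0, 5.0), (5.1, 4.9)]
centroids = leader_cluster(sounds, radius=1.0)  # finds 2 groups itself
```

The number of clusters emerges from the data and the distance threshold, mirroring how the program settled on the vowel categories without being given their count.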
	UI Professor Seeks to Harness Strength of Amoeba Into 
Computer Cores
	News-Gazette (07/23/07) Kline, Greg
	
	University of Illinois professor of electrical and computer engineering 
Rakesh Kumar believes that the single-celled amoeba may be a good model for 
how super-fast computers handle parallel processing.  Kumar's "amoebic 
computing" is a way to take better advantage of the growing number of cores 
in computer processors.  As technology advances, single-core processors are 
being replaced by multi-core processors.  The problem, however, is that 
because humans think sequentially, they also program that way, so only one 
task is being processed at a time.  Kumar's solution looks to the amoeba to 
improve processing speed.  An amoebic computing system could, like the 
amoeba, replicate tasks waiting to be processed and run them on other cores 
as needed.  The idea is to break sequential programs into component parts, 
or services, and send them to available cores instead of having them wait 
in line.  In addition to replication, amoebic computing mimics the amoeba's 
ability to adapt to its environment.  For example, the system would be able 
to change how it manages tasks as new ones surface and old ones are 
completed, in the same way an amoeba can change its shape in response to 
the conditions around it.  The system could also send work to the processor 
that would be most advantageous, be it the closest processing space, or one 
with more available resources.  It could even set aside or create new space 
to handle tasks as needed, much like how the human brain handles tasks.
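The core idea, breaking a sequential program into independent services and dispatching them to whichever cores are free, can be sketched with Python's standard thread pool. This is a stand-in for illustration only; Kumar's actual system is not described in detail, and the amoeba-style replication is omitted here.

```python
from concurrent.futures import ThreadPoolExecutor

# Three independent "services" split out of a sequential program.
def fetch(x):      return x + 1
def transform(x):  return x * 2
def summarize(xs): return sum(xs)

# Instead of waiting in line, independent tasks are dispatched to
# whichever workers (cores) are free; a thread pool stands in here.
with ThreadPoolExecutor(max_workers=4) as pool:
    fetched = list(pool.map(fetch, range(8)))
    transformed = list(pool.map(transform, fetched))
result = summarize(transformed)  # cheap serial combine step
```

The decomposition into fetch/transform/summarize is the "component parts, or services" step; the pool plays the role of routing each part to an available core.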
	Intel Scores Speed Breakthrough
	Wall Street Journal (07/25/07)  P. B4; Clark, Don
	
	Intel researchers say they have developed the first modulator made from 
silicon that can encode data onto a beam of light at a rate of 40 gigabits 
per second, which the company says is a major milestone on the way to 
creating inexpensive optical components that could provide drastically 
faster communication speeds.  Modulators are necessary when using lasers to 
send data down fiber-optic cable.  Obtaining speeds of 40 gigabits per 
second, which is currently about 40 times faster than the most 
sophisticated corporate data networks, requires expensive materials, and 
available 40-gigabit modulators can cost thousands of dollars.  Intel wants 
to use the material to create less-expensive components as part of an 
effort the company calls "silicon photonics."  Intel has been making 
increasingly faster silicon-based laser components, including a 1-gigabit 
modulator in 2004 and a 10-gigabit modulator in 2006.  Intel's photonics 
technology lab director Mario Paniccia says, "It's been a phenomenal ride 
in terms of the rate of advancement in silicon photonics."  Paniccia 
described components that could send data between computers or circuit 
boards at a rate of one trillion bits per second, data transfer speeds that 
are well beyond current demands on computing systems but likely will be 
necessary eventually.  Paniccia says Intel is committed to commercializing 
silicon photonics products by the end of the decade.
	Playing Piano With a Robotic Hand
	Technology Review (07/25/07) Singer, Emily
	
	Scientists at Johns Hopkins University have demonstrated that it is 
possible to control fingers on a robotic hand by directly tapping into the 
brain's electrical signals using a neural interface.  To create the neural 
interface, researchers recorded brain-cell activity from monkeys as they 
moved their fingers.  Previous research showed that a particular part of 
the motor cortex controls finger movement.  The recorded brain activity was 
used to create algorithms that decode the brain signals by identifying the 
specific activity patterns associated with specific movements.  When the 
algorithm was connected to the robotic hand and given a new set of neural 
patterns, the robotic hand performed the correct movement 95 percent of the 
time.  These initial experiments were performed "off-line," meaning the 
system was receiving pre-recorded neural activity, but the researchers are 
planning a demonstration with a live neural feed within the next six 
months.  Monkeys implanted with an array of recording electrodes will be 
connected to a virtual version of the prosthetic arm and monitored to see 
how well they can use brain activity to control the virtual hand.  The 
preliminary results are encouraging, but the scientists know it will be a 
long time before the system has the dexterity of a real hand and that a 
practical human version of the neural interface is still a long way off.  
"We would hope that eventually, we'll be able to implant similar arrays 
permanently in the motor cortex of human subjects," says University of 
Rochester neurologist and project researcher Mark Schieber.  Schieber says 
the long-term objective is to get the robotic hand to move however the user 
wants it to in real time, but getting the decoding algorithm to understand 
unscripted and general movements will be the challenge.
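The decoding step described above, matching a new activity pattern to the movement whose recorded pattern it most resembles, can be sketched as a nearest-template classifier. The firing-rate numbers and movement names below are invented for illustration; the Johns Hopkins algorithms themselves are not given in the article.

```python
import math

# Hypothetical mean firing-rate templates recorded during known movements.
templates = {
    "index_flex": (0.9, 0.1, 0.4),
    "thumb_flex": (0.2, 0.8, 0.1),
}

def decode(pattern):
    """Return the movement whose stored template is closest (Euclidean
    distance) to the incoming neural activity pattern."""
    return min(templates, key=lambda m: math.dist(pattern, templates[m]))

movement = decode((0.85, 0.15, 0.35))  # nearest to the index-finger template
```

A scheme like this works well on scripted movements with clean templates, which hints at why unscripted, general movements remain the hard part.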
	Graphene Nanoelectronics: Making Tomorrow's Computers 
From a Pencil Trace
	Rensselaer News (07/23/07) 
	
	Rensselaer Polytechnic Institute associate professor Saroj Nayak, working 
with graduate student Phillip Shemella and other students, made a key 
discovery that could advance the use of graphene as a possible replacement 
for copper and silicon in nanoelectronics.  After two years of research and 
dozens of computer simulations, the researchers were able to demonstrate 
that the length and width of the graphene directly impacts the material's 
conduction properties.  Graphene has unique electrical properties that 
include either metallic or semiconducting behavior.  Generally, the process 
of synthesizing graphene causes both metallic and semiconductor materials 
to be produced, but the researchers' findings create a blueprint that 
should allow entire batches of either to be produced as needed.  Computer 
chips have gotten increasingly smaller over the past decade, but as copper 
interconnects continue to shrink, the copper's resistance increases and its 
ability to conduct electricity degrades.  As a result, fewer electrons can 
pass through and more electrons get caught in the copper, creating heat 
that can hinder a computer chip's speed and performance.  Graphene would be 
a good choice as a replacement for copper because graphene has excellent 
conductivity and has an extremely low resistance, meaning electrons could 
pass effortlessly and create almost no heat.  It will likely be several 
years before graphene interconnects become a reality, but Nayak says 
graphene shows serious potential for use in interconnects, transistors, and 
as a replacement for silicon as the primary semiconductor used in all 
computer chips.
	Q&A: Google Finds R&D Opportunities, Pitfalls 
Abroad
	IDG News Service (07/23/07) Lemon, Sumner
	
	Kannan Pashupathy, Google's director of international engineering 
operations, oversees the company's rapidly expanding network of 
international R&D centers.  Over the past three years Pashupathy has 
increased the number of international research and development centers from 
three to more than 20.  Pashupathy says Google has made such a rapid push 
in establishing international R&D centers because the company is doing 
simultaneous releases in multiple languages.  The various R&D centers stay 
in touch with each other using fairly traditional telecommunications such 
as telephone and video, and every center is connected to a corporate 
videoconferencing system so teams can constantly engage each other.  To 
avoid time-zone problems, Google has a simple set of rules that keeps it 
from distributing work across too many far-flung locations and minimizes 
day-to-day communications between teams in different areas.  When hiring, 
the company looks for people 
with strong cultural backgrounds and submits applicants to a test to see if 
they will fit culturally.  Part of the test is to see if applicants have an 
open mind and believe there is a richness in different cultures.  Google 
avoids the traditional hierarchy and someone who is fresh out of college is 
given the same say as a 20-year veteran, which can be difficult for senior 
management types who are used to one way of doing business and have a 
difficult time adjusting.  This freedom and autonomy for new hires has 
led to problems, such as when Google released a Chinese software tool that 
included a database developed by a competitor, but Pashupathy believes 
Google is very innovative in how it manages its employees and as such 
everything is a learning experience, for employees and managers.
	Environmental Songlines for IT Systems
	IST Results (07/18/07) 
	
	Effective public-private collaboration to manage environmental dangers 
such as floods and forest fires is often hindered by incompatibility 
between IT systems, but the EU-funded ORCHESTRA project seeks to address 
this challenge through an IT architecture that defines the interaction of 
proprietary IT systems.  "You can't expect everyone to throw away their 
legacy systems and invest huge resources into a common IT infrastructure," 
explains ORCHESTRA project coordinator and Atos Origin operations manager 
Jose Esteban.  "ORCHESTRA allows all these different systems to 
interoperate with the minimum of investment."  The ORCHESTRA architecture's 
reference model specifies a set of functional "modules" or services 
and the manner in which they must be "plugged together" to produce 
compatible risk management applications.  The architecture is oriented 
around ISO, W3C, and Open Geospatial Consortium standards, and Esteban 
hopes the OGC will adopt the ORCHESTRA reference model as an example of 
best practice for interoperability in the risk management domain.  The 
model has already been adopted by the SANY and DEWS projects in Europe.  
Esteban notes that remote interoperability between disparate systems is 
currently facilitated by Web services, but says the ORCHESTRA platform's 
universal application to any IT technology will keep the architecture 
relevant even if the means of enablement changes in the future.
	Evaluations Aim to Advance Translation Technology
	NIST Tech Beat (07/20/07) Blair, John
	
	To help American military forces communicate with the local population, 
the National Institute of Standards and Technology is evaluating prototype 
translation systems for the Defense Advanced Research Projects Agency 
(DARPA).  DARPA's TRANSTAC (Spoken Language Communication and Translation 
System for Tactical Use) project is intended to produce real-time, two-way 
translation systems, particularly for translating Iraqi Arabic.  NIST 
recently ran a series of laboratory and outdoor evaluation tests on 
prototype systems to determine their abilities in speech recognition, 
machine translation, noise robustness, user interface design, and efficient 
performance on limited hardware platforms.  "Effective two-way translation 
devices would represent a major advance in field translators," says NIST 
evaluation project leader Craig Schlenoff.  During the tests, 
English-speaking Marines and Iraqi Arabic speakers acted out 10 different 
scenarios 
requiring verbal communication.  Participants looked directly at each other 
during the question-and-answer sessions.  The conversation was recorded on 
a laptop, and background noises were precisely controlled so the system 
could be evaluated in a predictable environment.  In the outdoor tests, 
background noises included other speakers, generators, garage doors, 
running vehicles, radio broadcasts, and other simulated conditions.  DARPA 
hopes that once the technology is fully developed, it will be able to 
deploy an automatic translator system in a new language within 90 days of 
receiving a request for that language.
	Technology Summer Camp Welcomes Disabled High-School 
Students
	University of Washington News and Information (07/17/07) Hickey, Hannah; 
Bellman, Scott
	
	The University of Washington program DO-IT, which stands for Disabilities, 
Opportunities, Internetworking and Technology, will allow more than 50 
college-bound high-school students with disabilities the opportunity to 
participate in an intensive program designed to promote college and career 
success.  DO-IT program participants will learn about careers in fields 
such as technology, science, engineering, and mathematics.  "DO-IT scholars 
learn about college life by living in a dorm, getting along with a roommate, 
participating in academic classes, preparing for challenging careers, and 
having fun," says DO-IT founder and director Sheryl Burgstahler.  "After 
the summer study ends, they communicate via the Internet with their new 
friends and are mentored by successful adults with disabilities. Year after 
year, they connect through DO-IT activities and are supported as they 
transition to college and careers."  The DO-IT program targets high-school 
sophomores and juniors with disabilities who are interested in attending 
college.  After attending the summer program, students are loaned 
computers, software, and adaptive technology to be used at home on 
additional DO-IT activities, including independent projects, and online 
interaction with mentors, teachers, and fellow students.  "Many successful 
DO-IT scholars continue in the program as mentors to younger participants," 
Burgstahler says.
	On the Trail of Servers Gone Bad
	Government Computer News (07/16/07) Vol. 26, No. 17, Dizard, Wilson P. III
	
	Cybersecurity experts say that federal agencies are increasingly pursuing 
"honeyclient" technology to detect and analyze Web sites that store and 
distribute malware.  Honeyclients are virtual machines that travel over the 
Web searching for sites that show signs of being infected with malware, 
says Mitre computer scientist Kath Wang.  Wang says honeyclients "provide 
the capability to potentially detect client-side exploits" that can be used 
in malware attacks.  The exploits on malicious sites often allow the site's 
server to capture the visiting computer to be used as part of a bot herd of 
zombie computers.  Botnet herders then rent out hijacked computers to 
launch spam and other attacks, with prices ranging from a few cents a month 
for a home computer to several dollars a month for a computer inside a 
corporate network.  Wang says online criminals are already starting to 
install honeyclient avoidance technology on malicious servers, so Mitre, 
which operates six autonomous honeyclients, is building a honeyclient 
prototype that mimics human behavior by displaying the same delays and 
bandwidth footprint as a human visitor.  The Department of Homeland 
Security's assistant secretary for cybersecurity and communications Greg 
Garcia says his department has received more than 21,000 reports of 
cyberincidents through May of this fiscal year, as opposed to only 24,000 
for the entire 2006 fiscal year.  Garcia says DHS will be working more 
closely with Information Technology and Communications information sharing 
and analysis centers.  "Increasingly, we are finding that IT and 
communications are one and the same," he says.
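The human-mimicry idea Wang describes, pacing page visits with randomized, human-scale delays, reduces to a small loop. The fetch function below is a stub for illustration; a real honeyclient like Mitre's renders each page in an instrumented client and watches for exploit side effects.

```python
import random
import time

def humanlike_visit(urls, fetch, min_pause=2.0, max_pause=8.0):
    """Fetch each URL with a randomized, human-scale pause in between,
    so the timing footprint looks like a person reading, not a scanner."""
    pages = []
    for url in urls:
        pages.append(fetch(url))
        time.sleep(random.uniform(min_pause, max_pause))  # dwell time
    return pages

# Stub fetcher for illustration; tiny pauses keep the demo fast.
fetched = humanlike_visit(["a", "b"], fetch=lambda u: f"<html>{u}</html>",
                          min_pause=0.0, max_pause=0.01)
```

A matching bandwidth footprint (fetching images, stylesheets, and scripts the way a browser would) is the other half of the disguise the prototype aims for.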
	True Random Number Generator Goes Online
	PressEsc.com (07/18/07) Panditaratne, Vidura
	
	Academics and members of the scientific community will not be able to 
accurately predict the next number that comes out of the Quantum Random Bit 
Generator Service (QRBGS).  The QRBGS is unlike the random number 
generators in most computers, which either run deterministic algorithms or 
draw on large databases of numbers compiled by methods such as dice rolls. 
Such generators deliver essentially pseudo-random numbers, whereas QRBGS 
uses photon emission, an unpredictable quantum process, to produce true 
random numbers.  QRBGS makes use of a fast 
non-deterministic random bit generator, and its random quality comes from 
the quantum physical process of photonic emission in semiconductors, 
followed by detection from the photoelectric effect.   Developed by 
computer scientists at the Ruder Boskovic Institute (RBI) in Zagreb, 
Croatia, QRBGS has been made available online free of charge, including to 
computer clusters and grid networks.  Potential applications include 
advanced scientific simulations, cryptographic data protection, security 
applications, and virtual entertainment.
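The pseudo-random versus true-random distinction the article draws can be demonstrated locally: a seeded algorithmic generator is perfectly reproducible, while an entropy source is not. Here os.urandom taps the operating system's entropy pool as a rough local analogue of a physical source like the QRBGS photon detector; the service's own API is not shown.

```python
import os
import random

# Pseudo-random: a deterministic algorithm. The same seed replays the
# same "random" sequence exactly, so the output is predictable in
# principle to anyone who knows the seed and algorithm.
a = random.Random(42)
b = random.Random(42)
replayable = [a.random() for _ in range(5)] == [b.random() for _ in range(5)]

# "True" randomness has no seed to replay; the OS entropy pool is a
# local analogue of a physical process such as photonic emission.
true_bits = os.urandom(16)  # 16 bytes, not reproducible on demand
```

That reproducibility is exactly what makes pseudo-random numbers unsuitable for the cryptographic uses the article lists.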
	How to Forecast the Future
	Computerworld (07/16/07) Vol. 41, No. 29,  P. 32; Melymuka, Kathleen
	
	Seasoned forecaster Paul Saffo explains that forecasting "gives a context 
for decision-makers to act in the face of uncertainty."  He says 
forecasting should be a concern of CIOs because they play a central role in 
corporate strategy enablement, and the tools they employ are undergoing 
changes that are commensurate with the pace of Moore's Law.  Saffo says 
intuition can be informed through forecasting, and he cites the value of 
the "cone of uncertainty" visual aid as helpful in forecasting in that it 
forces one to consider all potential outcomes.  The forecaster defines wild 
cards as an occurrence or trend whose likelihood is either very low or 
unquantifiable, and he uses the timetable of quantum computing's arrival as 
an example of a wild card.  Saffo notes that change only has the illusion 
of rapidity because people tend to ignore precursors.  "Most ideas take 20 
years to become overnight successes," he exclaims.  Saffo recommends that 
CIOs keep an eye on sensor technology, which he predicts will become the 
source of most information over the next decade.  Information, in other 
words, will become ubiquitous.
	Petascale Era Will Force Software Rethink
	HPC Wire (07/20/07) Vol. 16, No. 26, Sexton, Jim
	
	A key challenge of the petascale age is designing software that aligns 
well to petascale architectures so that previously unsolvable scientific 
and business problems can be tackled by the community, writes Jim Sexton, 
lead for IBM Blue Gene applications at the IBM T.J. Watson Research Center. 
 Although Sexton projects that Moore's Law will continue to progress 
through the petascale era, he notes that "performance increases will now 
come through parallelism and petascale systems will deliver performance by 
deploying hundreds of thousands of individual processor cores."  The 
inherent programming challenge involves the concurrent management of 
algorithmic and systems architectures, which Sexton likens to "a creative 
art form."  He points out that justifying the investment needed to sponsor 
the construction of a complete parallel programming infrastructure from the 
ground up requires more programs to be running on parallel systems and 
yielding significant results.  The community will need to see a definite 
cost/benefit to parallelism so that mainstream/commercial adoption can be 
encouraged.  Mindful of this goal, the Scientific Discovery through Advanced 
Computing (SciDAC) program is setting up nine Centers for Enabling 
Technologies to address major petascale computing problems.  Mainstream 
adoption of petascale computing could also be helped along by industrial 
applications, Sexton speculates.
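Sexton's point that performance now comes through parallelism usually means domain decomposition: split the data so each of many cores works on its own piece, then combine cheap partial results. A toy sketch of the idea follows; in a real petascale code the chunk sums below would be dispatched to separate cores or nodes rather than computed in a list comprehension.

```python
def chunked(data, nworkers):
    """Domain decomposition: split the data so each core gets a piece."""
    k, r = divmod(len(data), nworkers)
    chunks, start = [], 0
    for i in range(nworkers):
        end = start + k + (1 if i < r else 0)
        chunks.append(data[start:end])
        start = end
    return chunks

def parallel_sum(data, nworkers=4):
    # Each partial sum is independent and could run on its own core or
    # node; combining the partials is the cheap serial step.
    partials = [sum(c) for c in chunked(data, nworkers)]
    return sum(partials)

total = parallel_sum(list(range(100)))  # same answer as a serial sum
```

Keeping the serial combine step small relative to the parallel work is the balancing act Sexton frames as managing algorithmic and systems architecture at once.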