Read the TechNews Online at: http://technews.acm.org
ACM TechNews
June 28, 2006


Welcome to the June 28, 2006 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Sponsored by Information, Inc.

http://www.infoinc.com/sponsorthenews/contactus.html


HEADLINES AT A GLANCE:

 

Analysis Finds e-Voting Machines Vulnerable
USA Today (06/27/06) P. 14A; Stone, Andrea

The majority of the electronic voting machines that states have been purchasing since the 2000 presidential election "pose a real danger to the integrity of national, state, and local elections," according to a report issued by the Brennan Center for Justice. The report cites more than 120 vulnerabilities in the three most popular systems: touch-screen machines and optical-scan systems with and without paper trails, which together account for 80 percent of the machines that will be used in the upcoming November elections. Though there has yet to be a reported case of voting machines being hacked in an actual election, the Brennan Center's Lawrence Norden notes incidents of similar software attacks on computerized slot machines. "It is unrealistic to think this isn't something to worry about," he said. The report comes amid primary season as concerns about the security of e-voting machines have been mounting. At least six states have seen lawsuits filed attempting to block the purchase or use of electronic systems. The report does not target specific machines, but rather argues more broadly that the e-voting systems in use today are inherently problematic. It finds that the easiest form of attack would be to switch votes from one candidate to another using corrupt software, and that machines that use wireless components are the most vulnerable. Without regular audits, machines with paper trails are just as vulnerable as those without, the report finds. It concludes that states should ban wireless components (a measure so far implemented only by California, New York, and Minnesota) and routinely conduct audits to compare voter-verified paper trails with electronic records. For more on the vulnerabilities of e-voting, visit http://www.acm.org/usacm
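The audit the report recommends amounts to comparing two independent tallies and flagging any divergence. A minimal sketch of that comparison (the function name and `tolerance` parameter are illustrative, not taken from the report):

```python
from collections import Counter

def audit_precinct(paper_ballots, electronic_tally, tolerance=0):
    """Compare a hand count of voter-verified paper records against the
    machine's electronic tally; return per-candidate discrepancies."""
    paper_tally = Counter(paper_ballots)
    candidates = set(paper_tally) | set(electronic_tally)
    discrepancies = {}
    for cand in candidates:
        diff = electronic_tally.get(cand, 0) - paper_tally.get(cand, 0)
        if abs(diff) > tolerance:
            discrepancies[cand] = diff
    return discrepancies

# A clean precinct produces no discrepancies...
clean = audit_precinct(["A", "A", "B"], {"A": 2, "B": 1})
# ...while corrupt software that switched one vote is exposed by the mismatch.
switched = audit_precinct(["A", "A", "B"], {"A": 1, "B": 2})
```

The point of the report's pairing of recommendations is visible here: the paper trail alone detects nothing unless such a comparison is actually run.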


U.S. Cybersecurity Chief Abruptly Resigns
Associated Press (06/28/06)

Giving just one day of notice, the U.S. government's cybersecurity chief resigned from the Department of Homeland Security after holding the post for one year. Amit Yoran's resignation comes amid strident calls from technology leaders and some lawmakers to broaden the range of his authority and give him more money for protection initiatives. Yoran had previously shared with colleagues his frustration over the perceived lack of attention that the department paid to cybersecurity. It is unclear who will step in for Yoran even on an interim basis. Yoran said that he "felt the timing was right to pursue other opportunities." A department spokeswoman praised Yoran for his contributions and said that cybersecurity remains a high priority, and that the department will work quickly to find a replacement. Industry leaders were unhappy that as a director, Yoran, who led a division with 60 employees and an $80 million budget, was at least three bureaucratic levels removed from the Homeland Security Secretary, and had unsuccessfully lobbied to elevate Yoran's position to the level of assistant secretary. A bill to that end has stalled in Congress. Department officials argue that cybersecurity issues should command as much attention as threats to physical structures, and that they should be addressed in a coordinated fashion because the vulnerabilities to each often have a common source. Under Yoran's tenure, the department implemented a cyber-alert system to notify subscribers about significant Internet attacks as they arise, along with directions to help users protect themselves. The department also created a blueprint of the government's nexus of interconnected electronic devices so that they could be routinely scanned for weaknesses that intruders could exploit.


U.S. to Spend Millions on Massive, Ultrafast Supercomputers
Computerworld (06/27/06) Thibodeau, Patrick

The U.S. government intends to invest hundreds of millions of dollars developing supercomputers of unprecedented power over the next several years to tackle some of the most complex problems facing science and national security. The planned systems will be able to sustain petascale computing speeds, dwarfing the fastest machines in existence today. Cray has reportedly signed a $200 million contract with the Department of Energy to deliver a petascale system by 2008, while DARPA and the NSF are both shopping around for supercomputers that will carry price tags in the hundreds of millions of dollars. The magnitude of the computing power of the new systems will be such a departure that computer scientists will need to radically change their methods in order to take full advantage of their capacity, according to the Energy Department's Dimitri Kusnezov. "The question is what they would do with an infinite amount of computing speed," he says. "What would they calculate? And I'll wager that they don't have an answer for you. Because people think about their problems within the constraints of what they can calculate, and once you remove that constraint, people are lost." The Department of Energy maintains the world's fastest supercomputer, the IBM BlueGene/L, which broke a record for sustained speed earlier this month by running a scientific code at 207 TFLOPS. DARPA has been working with Cray, Sun Microsystems, and IBM in a multiyear program to develop an economically feasible supercomputer, and is likely to select two of the three vendors for the next phase of the program. The NSF is interested in a petascale computer for climate modeling that it hopes to have operational by 2011. While much of the attention paid to supercomputers centers on processor count, other concerns arise from issues such as memory, storage, and energy consumption.
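For scale, sustained petaflop performance would be roughly a fivefold jump over the BlueGene/L record run cited here:

```python
# Back-of-envelope comparison using the figures in the article.
bluegene_sustained_tflops = 207        # BlueGene/L's record sustained run
petascale_tflops = 1000                # 1 petaflop = 1,000 teraflops
speedup = petascale_tflops / bluegene_sustained_tflops  # roughly 4.8x
```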


Students Design Sensor Network to Protect Forest
TechWeb (06/26/06) Sullivan, Laurie

A team of Romanian college students took first place in the recent Windows Embedded Student ChallengE for designing a sensor network that provides protection for forests. The low-power application uses a network and routing protocol to connect sensors to a central server, and alerts can be sent to PDAs when a tree poacher, flooding, or a fire is detected. The network of sensors monitors humidity, sound, temperature, and carbon monoxide levels, and the data can be accessed from a Web site. "We used Microsoft's eBox as the central unit, the brain of the system, and the sensors are the ears and the nose," says Christian Pop, a 22-year-old computer science major who led the team from Politehnica University of Bucharest. "We can listen to sounds of chainsaws to stop loggers from cutting down trees, or try to prevent fires by analyzing the data from sensors that monitor carbon monoxide, temperature and humidity." CaLamp principal engineer James Y. Wilson, who served as a judge, says the project shows that the full potential of radio frequency (RF) technology and sensor networks has yet to be realized. Second place went to a team from the University of South Florida that developed an application to monitor fish farms, and protect fish from birds by spraying water as they approach in order to scare them away. The application makes use of cameras, motion detectors, and algorithms for studying changes in images.
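The server-side logic of such a system reduces to checking incoming sensor readings against alarm thresholds. A minimal sketch of that step (the threshold values and channel names are invented for illustration; the article does not give the team's actual parameters):

```python
# Hypothetical alarm thresholds -- not the Bucharest team's real values.
THRESHOLDS = {
    "temperature_c": 60.0,        # possible fire
    "carbon_monoxide_ppm": 35.0,  # combustion byproduct
    "sound_db": 85.0,             # chainsaw-level noise
}

def check_reading(node_id, reading):
    """Return alert messages for any sensor channel over its threshold."""
    alerts = []
    for channel, limit in THRESHOLDS.items():
        value = reading.get(channel)
        if value is not None and value > limit:
            alerts.append(f"node {node_id}: {channel}={value} exceeds {limit}")
    return alerts

# A hot reading from node 7 raises one alert; a quiet forest raises none.
alerts = check_reading(7, {"temperature_c": 72.0, "sound_db": 40.0})
quiet = check_reading(3, {"sound_db": 40.0})
```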


Sun Says Open-Source Java Possible in 'Months'
IDG News Service (06/27/06) Kirk, Jeremy

Sun Microsystems says it could release its proprietary Java language under an open-source license within a matter of months. The two remaining issues facing Sun are ensuring that Java remains universally compatible and preventing any one company from asserting its own implementation of the language as the dominant one on the market, says Simon Phipps, the company's chief open-source officer. "Maintaining both of those dimensions of compatibility is imperative because the Java market is a huge successful open market in which many companies are serving many other companies," Phipps says. The transition from outgoing CEO Scott McNealy to his replacement Jonathan Schwartz hastened Sun's decision to make Java open source. Complicating matters is the dedicated core of old-line employees who still vigorously oppose open sourcing Java, according to a former Sun employee. Though no decisions have yet been made, the keys to compatibility will be licensing and governance, Phipps says. He does not predict a major impact from the move in the short term, though over time, Sun will benefit from wider adoption of the platform and increased innovation. Some observers say that Sun waited too long to make the move, citing the growing popularity of Microsoft's C# language and .NET platform. Community debugging and broad-based innovation would have helped Java's popularity earlier, particularly in the server space, says Brian Behlendorf, co-founder of the Apache Web Server Project. Another developer says the Java Community Process, which defines Java standards, will have to change.


Securing America's Power Grid
Iowa State University News Service (06/26/06)

Researchers at Iowa State University believe a network of wireless sensors could be used to monitor America's power lines for suspicious activity. Arun Somani and Jerry R. Junkins are leading a team of computer and engineering experts who are developing a monitoring system that makes use of a network of wireless sensors mounted with a tiny camera to watch movements along power lines. The monitoring system could improve national security, and also be used to watch for conductor failures, tower collapses, hot spots, and extreme conditions. "Power companies would have additional abilities to view their systems and that would assist in disaster recovery," Somani says. The team continues to make progress on a prototype system, and recently showed off their work at Iowa State's Wireless and Sensor Networking Laboratory. They are now in talks with power companies to test the monitoring system on the electrical grid. The project includes designing a diagnosis algorithm to determine fault conditions and predict faults, and a decision algorithm to reconfigure power networks to prevent cascading blackouts. The researchers received a $400,000 grant from the National Science Foundation and a $150,000 grant from Iowa State's Information Infrastructure Institute for the project.
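The diagnosis algorithm's job, detecting fault conditions in a stream of line measurements, can be sketched with a simple statistical anomaly check (this is a toy stand-in, not the Iowa State team's actual method):

```python
from statistics import mean, stdev

def flag_faults(readings, window=5, sigma=3.0):
    """Flag indices whose reading deviates more than `sigma` standard
    deviations from the trailing window of recent readings."""
    faults = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, s = mean(hist), stdev(hist)
        if s and abs(readings[i] - mu) > sigma * s:
            faults.append(i)
    return faults

# A steady line current with one spike: the spike at index 6 is flagged.
series = [100, 101, 99, 100, 102, 101, 250, 100]
faults = flag_faults(series)
```

A production diagnosis algorithm would fuse many sensor channels and feed a separate decision algorithm for reconfiguring the network, as the article describes; the sketch only shows the detection step.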


High School Computer Science: Does Not Compute?
MC Press Online (06/26/06) Stockwell, Thomas M.

The historic gap between the version of computer science that is taught in high schools and the practical skills that employers expect from their IT workers could lead to a severe shortage of qualified workers, analysts report. Computer science education varies widely among high schools throughout the United States, as some classes actually teach programming languages while others focus more on word processing and spreadsheet skills. For many teenagers, the bulk of their knowledge about computers comes from casual Internet usage, and a declining number are enrolling in formal computer science programs once they get to college. A recent report by the Computer Science Teachers Association offers a detailed analysis of the current state of secondary school computer science education, emphasizing the need for students to develop a comprehensive understanding of computer science as a legitimate discipline unto itself. The study also debunks common myths about computer science, including the conflation of computer science with computer literacy, the assumption that computer science and programming are the same thing, and the perception that computer science is only a field for men. The study concludes that students should understand not only the theoretical foundation of computer science, but also the practical implications of that theory, and that instruction should focus on problem solving. Rather than concentrating on specific applications, the study calls for education to take a more conceptual approach that prepares students for what their future employers will expect of them. The report argues that by creating a clearer definition of the theoretical and practical elements of computer science, high schools can do a better job of preparing students to meet the growing demand for IT workers and stave off a potentially critical shortage. To learn more about CSTA, and to view The New Educational Imperative report, visit http://csta.acm.org


Prospects Fading for European Patent Agreement
IDG News Service (06/26/06) Meller, Paul

Industry groups appear likely to scuttle efforts to create a single patent system for the entire European Union as they step up lobbying efforts urging the European Commission to give up its attempts to enact the Community patent. The Community patent would lower the cost of registering inventions throughout the Union and give inventors more legal certainty about the status of their patents. Proponents of the measure argue that it is an essential instrument for the Union to remain competitive with the United States and Japan, and the Commission had promised in January to make one last attempt to negotiate with industry and break the deadlock. But far from reaching a consensus, three powerful industry groups have called on the Commission to abandon the Community patent, arguing instead that it should concentrate on improving the existing patent regime. Some are concerned that further efforts to enact the Community patent could touch off a debate similar to the one that erupted last year over the issue of software patents. "To start a debate about the Community patent now would be like opening a Pandora's box," said the Business Software Alliance's Francisco Mingorance. "Looking at the debacle over the proposed law on computer-implemented inventions, a lot of companies fear this could happen all over again but on an even broader scale in a debate about the Community patent." The Commission first proposed draft legislation for the Community patent in 2001 that called for patents to be written only in three languages, thereby reducing the cost of translation, but it was significantly altered by national governments reluctant to give up the use of their own languages. Two alternative measures backed by industry would create a Union-wide legal system and require countries with an official language other than French, English, or German to provide a translated version of their patents.


Coming Soon: Mind-Reading Computers
Reuters (06/26/06) Reaney, Patricia

British researchers plan to use a four-day science exhibition starting Monday in London to gather more data for an application that would allow computers to read people's minds. Peter Robinson, a professor at the University of Cambridge in England, and his colleagues will allow attendees at the Royal Society-organized event to participate in a study to analyze their facial movements for signs of boredom, interest, confusion, agreement, or disagreement. The system makes use of a video camera pointed at someone to determine their emotional state, says Robinson. "Imagine a computer that could pick the right emotional moment to try to sell you something, a future where mobile phones, cars, and Web sites could read our mind and react to our moods," he explains. "For example, a Web cam linked with our software could process your image, encode the correct emotional state, and transmit information to a Web site." He adds that the application could be used to improve online learning and road safety because it could determine when a person is confused or tired. The British scientists have teamed up with researchers at MIT for the project, and they are considering other input for the application, such as posture and gestures.


When Robots Learn Social Skills
IST Results (06/22/06)

While most attempts at creating intelligent machines have centered on programming a set of predefined rules, a team of European researchers is attempting to develop machines capable of learning from their interactions with the environment, more closely resembling the evolutionary way in which humans learn. "The result is machines that evolve and develop by themselves without human intervention," said Stefano Nolfi, coordinator of the ECAgents program that combines the efforts of a diverse range of disciplines, including robotics, linguistics, and biology. The project's technology, termed Embedded and Communicating Agents, has enabled researchers at Sony's Computer Science Laboratory in France, for instance, to give the AIBO dog an added layer of intelligence that enables it to develop a language and interact with its fellow robots. Instead of being pre-programmed with human language, the robots agree on their own common terms to describe their environment. The Sony researchers instilled in their robots a sense of curiosity that drives them to learn. The so-called "metabrain" forces the robots to pursue more challenging activities and perform more complex tasks. The success of the social-learning technique has been demonstrated in other projects, such as the filesharing technology developed at the Viktoria Institute that automatically shares music files among users on the street or in a cafe. Technology that uses intelligent software agents to search and categorize information based on users' preferences could also be applied to the Internet, and robots capable of learning, communicating, and adapting to their environment should be a reality within a few years, according to Nolfi.
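The process by which the robots "agree on their own common terms" can be illustrated with a minimal naming game, a standard model of convention formation; the sketch below is a toy illustration of that dynamic, not Sony's actual implementation:

```python
import random

def naming_game(n_agents=10, rounds=5000, seed=1):
    """Minimal naming game: agents start with no shared vocabulary and
    converge on a common word purely through pairwise interactions."""
    rng = random.Random(seed)
    vocab = [set() for _ in range(n_agents)]   # each agent's candidate names
    for _ in range(rounds):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not vocab[speaker]:                  # invent a name if needed
            vocab[speaker].add(f"w{rng.randrange(10**6)}")
        word = rng.choice(sorted(vocab[speaker]))
        if word in vocab[hearer]:               # success: both commit to it
            vocab[speaker] = {word}
            vocab[hearer] = {word}
        else:                                   # failure: hearer learns it
            vocab[hearer].add(word)
    return vocab

vocab = naming_game()
converged = all(v == vocab[0] for v in vocab)  # one shared word emerges
```

No agent is ever told which word is "correct"; agreement is an emergent property of repeated local interactions, which is the point the article makes about learning without human intervention.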


Net Defenses May Be in Danger
Dallas Morning News (06/22/06) Harrison, Crayton

The simple test that asks a user to type in a series of squiggly letters or numbers when making a purchase or performing some other sensitive transaction on the Internet, long a stalwart of Web-based security, is "getting to the point where it's almost defeated," according to Luis von Ahn, a post-doctoral fellow at Carnegie Mellon University's computer science department. "The ones not yet defeated by computers are really hard to read for humans. But they'll be defeated pretty soon." Computers now have the sophisticated programming required to read all but the messiest distorted-letter tests, sending computer scientists in search of a replacement. The tests are failing because computer scientists are working to improve the text-recognition ability of computers for benign applications, though once the technology exists, hackers will not be far behind. The current tests, known as Captchas, a term coined by Carnegie Mellon researchers for "Completely Automated Public Turing test to tell Computers and Humans Apart," date back to the late 1990s. Some Captchas display letters that are only slightly distorted, but are crisscrossed with lines overlain on a grainy background, while others make the letters float and swirl. To beat the text-based Captchas, some hackers have reportedly paid users to enter the correct information, while others have been able to detect patterns in the characters or the code used to create them. Still others have developed computers that are advanced enough to actually read the symbols in the same way that supercomputers have been able to defeat chess masters. Image-based Captchas, where a series of otherwise unrelated pictures might have one common element, can be harder to crack, and Google's Blogger service has begun offering an audio Captcha for sight-impaired users.
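The challenge-response pattern underlying every Captcha is simple: generate a secret, render it in a form meant to be hard for software to parse, and check the reply. A toy text-only sketch (real Captchas distort a rendered image; the "noise characters" here merely stand in for that warping):

```python
import random
import string

def make_captcha(length=6, seed=None):
    """Generate a secret answer plus a crudely 'distorted' rendering:
    noise characters interleaved between the real letters."""
    rng = random.Random(seed)
    answer = "".join(rng.choice(string.ascii_uppercase) for _ in range(length))
    noise = "~-/\\|"
    distorted = "".join(ch + rng.choice(noise) for ch in answer)
    return answer, distorted

def check_captcha(answer, response):
    """Accept the response case-insensitively, ignoring surrounding space."""
    return response.strip().upper() == answer

answer, distorted = make_captcha(seed=42)
```

The arms race the article describes plays out entirely in the rendering step: as soon as software can strip the distortion (here, trivially, by taking every other character), the test is defeated.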


U.S. Unprepared for Net Meltdown, Blue Chips Warn
CNet (06/23/06) Broache, Anne

The U.S. government is ill-prepared to coordinate an effective recovery response to a major Internet shutdown caused by natural or manmade disruptions, according to a new Business Roundtable report. "A massive cyberdisruption could have a cascading, long-term impact, without adequate coordination between government and the private sector," said Cyber Security Industry Alliance executive director Paul Kurtz. "The stakes are too high for continued government inaction." Given the devastating effects many sectors of the economy would feel in the event of a massive Internet outage, the government should be better prepared, the Business Roundtable concluded. "There is no national policy on why, when and how the government would intervene to reconstitute portions of the Internet or to respond to a threat or attack," the report noted. The U.S. Computer Emergency Readiness Team is primarily responsible for coordinating cyberattack responses. The report recommends that the government establish a means for providing global-scale advance warnings of Internet disruptions, and release a policy that specifically assigns roles to business and government representatives should such an emergency occur. Other recommendations include the setup of formal cyberdisaster response training programs, and the allocation of more funding for cybersecurity protection.


Cyberprotection Takes Center Stage
Washington Technology (06/26/06) Vol. 21, No. 12, P. 48; Lipowicz, Alice

The National Infrastructure Protection Plan should address the vulnerabilities of the nation's virtual assets, such as the networks that relay data between major power plants, as well as physical assets, leaders from the computing industry argue. Meeting under the auspices of the IT Sector Coordinating Council, IT executives and cybersecurity experts are developing recommendations that they hope to discuss with the Homeland Security Department's National Cyber Security Division to develop a protection plan for critical IT infrastructure. IT is one of 17 economic sectors expected to complete preparedness plans, alongside sectors such as energy, food, water, and telecommunications. Traditional infrastructure defense plans have focused on physical structures, but assets in cyberspace are different, and one of the critical challenges is evaluating how well the Internet could withstand a major attack on a national scale. There are also the questions of which assets overlap with telecommunications, and who should shoulder the cost of protecting resources that could have multiple classifications. "We're defining [IT critical assets] by critical functionality," said Paul Kurtz of the Cyber Security Industry Alliance. "We're asking: What is the top-level functionality that needs to be there? What needs to be there reliably 99.9 percent of the time?" Ultimately, the list of critical assets will likely include some physical resources, such as servers, routers, and the Internet exchange sites MAE-East and MAE-West, though those sites could also be included in the telecom list. Physical assets are less important today than they were five years ago, because operators have widely disseminated them throughout the country to minimize the disruptive impact of a localized attack. The ongoing convergence of IT and telecom may eventually eliminate the need to divide assets between the two industries.


Heterogeneous Processing Needs Software Revolutions
HPC Wire (06/23/06) Vol. 15, No. 25,

Heterogeneous processing in high-end computing is doomed to failure without a system software revolution, writes the High-End Crusader. Obstacles in the way of heterogeneous processing include a complete lack of academic, governmental, and vendor leadership; an absence of compelling heterogeneous processing visions; and even the lack of simple heterogeneous processing guidelines. "To anticipate, the key task for heterogeneous-system system software lies in scheduling strategies and other system functions that maximize the performance extracted from scarce system resources, notably the heterogeneous system's limited global system bandwidth," the author states. The High-End Crusader calls for a blend of latency-avoidance and latency-tolerance methods, and for tasking the system software with mixing and matching preemptive (agile) and nonpreemptive (cumbersome) threads. Limited global system bandwidth can be more productively used via the exploitation of these threads by heterogeneous systems. The author recommends that heterogeneous multicore processors should be designed with one big out-of-order core and 10 small in-order cores. "If we agree that thread migrations are bite-sized dependence requests, and that dependence satisfiers are bite-sized responses to thread migrations, then HPCS should demand 1) global system bandwidth sufficient to carry both dependence and operand traffic, and 2) sophisticated system software that extracts appropriate heterogeneous threads from difficult applications and dynamically schedules them onto heterogeneous execution resources in order to use limited global system bandwidth well," the High-End Crusader concludes.
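The "one big core plus many small cores" proposal implies a routing decision in the scheduler: serial, latency-sensitive (cumbersome) threads go to the big out-of-order core, while parallel, latency-tolerant (agile) threads fan out across the small in-order cores. A toy sketch of that routing step, with all names invented for illustration:

```python
from collections import deque

def schedule(threads, n_small=10):
    """Route each thread (by kind) to a queue: cumbersome threads to the
    single big core, agile threads round-robin across the small cores."""
    big_queue = deque()
    small_queues = [deque() for _ in range(n_small)]
    for i, kind in enumerate(threads):
        if kind == "cumbersome":
            big_queue.append(i)
        else:
            small_queues[i % n_small].append(i)  # simple round-robin
    return big_queue, small_queues

big, small = schedule(["agile", "cumbersome", "agile", "agile"])
```

The column's real argument is about what a production scheduler must additionally weigh, chiefly the limited global system bandwidth consumed by thread migrations and dependence traffic, which this static sketch ignores.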


Got a Minute?
New Scientist (06/24/06) Vol. 190, No. 2557, P. 46; Motluk, Alison

Constant interruptions impair office productivity, with a Basex survey determining that more than two hours of the working day--and $588 billion a year in the United States--are consumed by such distractions, even when they are work-related. This situation is caused, and some researchers say can be cured, by technology. Computers that can rate the value of a communication and ascertain whether the recipient is in the right state of mind to receive it--essentially predicting his or her interruptability--could be key to the solution. This will require training a system to know its user's disposition, work habits, and signs of availability. The prototype Bestcom system is designed to assess the advantages of keeping its user informed and compare them to the cost of interruption. Reservations with such a setup include a lack of certainty that the system will not be used to monitor employees intrusively, the possibility that the system's checking might become an annoyance for workers and a distraction in and of itself, and the inevitability of error. A more considerate computer that apologizes for its mistakes and appears to empathize can help address the potential for irritation, according to the MIT Media Lab's Rosalind Picard. The difficulty of getting back into the flow of work after an interruption is another cause of lost productivity, and Gloria Mark of the University of California, Irvine, thinks technology can help in this regard by keeping people apprised of each other's projects or "working spheres" to lower the chance of interrupting workers for tasks that are outside those spheres. Ultimately, the responsibility for controlling workplace interruptions lies with people and how they use technology, not the technology itself.
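At its core, a Bestcom-style system weighs the value of delivering a message against the cost of interrupting the recipient in their current state. A toy version of that tradeoff (the states, weights, and function are illustrative; the real system's model is far richer):

```python
# Illustrative interruption costs per user state -- invented values.
INTERRUPTION_COST = {"in_meeting": 0.9, "typing": 0.6, "idle": 0.1}

def should_interrupt(message_value, user_state):
    """Deliver a notification only when its estimated value exceeds the
    estimated cost of breaking the user's concentration."""
    cost = INTERRUPTION_COST.get(user_state, 0.5)  # default for unknown states
    return message_value > cost

# The same message is held back during a meeting but delivered when idle.
decisions = [should_interrupt(0.8, "in_meeting"),
             should_interrupt(0.8, "idle")]
```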


How to Build a Babel Fish
Economist Technology Quarterly (06/06) Vol. 379, No. 8481, P. 20

Scientists are getting closer to developing practical machines that can understand any language thanks to emerging statistical machine translation software that, unlike earlier attempts at machine translation, does not require a linguist to identify rules. Carnegie Mellon researchers have launched a project that they hope will result in a machine that can learn language simply by listening to a foreign speaker or, ultimately, by watching television. The next few years will see an explosion in translation applications, including real-time automatic dubbing, search engines that comb through multilingual resources, and even in-ear devices that instantly translate speech from a foreign language. Though some of those technologies may seem remote, researchers have already developed a system that can translate speeches or lectures through a statistical analysis of a large built-in repository of speeches taken from the United Nations and the European Parliament. Though there are a variety of statistical-translation techniques, they all rely on statistical analysis rather than predefined rules. Most begin with a bilingual body of text and identify the relationships between words by evaluating the frequency that words are clustered near each other, offering much greater flexibility than systems based on rules, which are unable to deal with similes, poor grammar, or ambiguous meanings. The statistical method is closely linked with the way that humans learn language, where comprehension is gradually acquired through extensive exposure. In the case of machines, that exposure comes from a vast repository of sample texts. The Defense Advanced Research Projects Agency has been funding numerous mobile machine-translation projects, such as Babylon, a technology that provides two-way translation between English and Iraqi Arabic.
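The co-occurrence counting the article describes, identifying word relationships from how often words cluster together across a bilingual corpus, can be shown in miniature (the corpus below is a made-up three-sentence example, nothing like the UN-scale repositories real systems train on):

```python
from collections import defaultdict

# A tiny invented parallel corpus (English / French).
corpus = [
    ("the house", "la maison"),
    ("the car", "la voiture"),
    ("green house", "maison verte"),
]

def cooccurrence_table(pairs):
    """Count how often each source word appears in the same sentence
    pair as each target word -- the raw signal statistical MT starts from."""
    counts = defaultdict(lambda: defaultdict(int))
    for src, tgt in pairs:
        for s in src.split():
            for t in tgt.split():
                counts[s][t] += 1
    return counts

def best_translation(word, counts):
    """Pick the target word that co-occurs with `word` most often."""
    return max(counts[word], key=counts[word].get)

counts = cooccurrence_table(corpus)
guess = best_translation("house", counts)
```

Because "house" co-occurs with "maison" in two sentence pairs but with "la" and "verte" in only one each, the counts alone single out the right translation; production systems refine this raw signal with iterative alignment models rather than a simple maximum.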


A Conversation With Leo Chang of Clickshift
Queue (06/06) Vol. 4, No. 5, P. 12; Coatta, Terry

In an interview with GPS Industries software development director Terry Coatta, Clickshift founder and CTO Leo Chang explains that simplicity rather than standards might be central to the adoption of component software. "I have pretty much given up on the success of the standard idea or standard language that everybody is going to speak," he states. "I would just like for each one to be relatively simple to use." Chang associates a software component with "a library that's distributable, that's published, that's versionable," and he believes Web services align well with that classification. Chang attributes many standards' lack of penetration to a cultural resistance organizations have toward third-party components, and he also cites a deeply entrenched disparity between designing a completely flexible yet simple-to-use standard. He says simplicity and functionality are key factors in his decision to take up a new standard. Chang notes that the development of an open-source software component standard is a major influence on his selection of that standard, given that the collective effort of its creation is often reflected in its robustness. He thinks it is possible that a diverse collection of Web services can be integrated into a bigger application.


To submit feedback about ACM TechNews, contact: technews@hq.acm.org

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to listserv@listserv.acm.org with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: technews-request@acm.org

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: technews-request@acm.org


News Abstracts © 2006 Information, Inc.


© 2006 ACM, Inc. All rights reserved. ACM Privacy Policy.
