Read TechNews Online at: http://technews.acm.org
ACM TechNews
January 8, 2007


Welcome to the January 8, 2007 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

Changes in E-Voting Likely Coming, Experts Say
Attack of the Zombie Computers Is a Growing Threat, Experts Say
Researchers Use Wikipedia to Make Computers Smarter
PCs Get Set to Scream in 2007
A Personal Computer to Carry in a Pocket
Emotion-Aware Teaching Software Tracks Student Attention
Augmented Cognition: Science Fact or Science Fiction?
When Is a D Better Than C? When It's a Language
System X Used to Model Behavior of Entire Structures
Scientist Trying to Figure Out 'Space Weather'
DARPA Pushes to Bring Supercomputers to the Masses
2007 U.S. Budget: NSF Braces for Opportunities Lost
Web Users Driving Change in 2007
Online Tests Click With Students
Sierra: A Brain That Thinks About Thinking
Of Mice and Multimedia
The Logic of Privacy

Changes in E-Voting Likely Coming, Experts Say
IDG News Service (01/05/07) Gross, Grant

Several advocacy groups, including Common Cause and the Electronic Frontier Foundation, hosted a panel of election experts who agreed that something must be done about the state of the nation's e-voting systems, but admitted that no widely accepted solution currently exists. Kentucky Secretary of State Trey Grayson said, "We're at this point ... where I believe there's a consensus that we need to do something. However, the consensus is ahead of the solution." Grayson added that since Kentucky has used e-voting machines without incident since the 1980s, many districts would be reluctant to change. Transparency is a widely accepted goal for e-voting, but experts disagree over whether audits of both machines and ballots, or attaching printers to voting machines, can achieve it. Twenty-seven states currently require paper-trail systems to accompany e-voting, but only 11 of them require officials to conduct audits to ensure that electronic votes match the paper ones. U.S. House of Representatives Administration Committee staffer Thomas Hicks says it would be very difficult to complete wide-ranging changes to e-voting systems before the 2008 elections; 2010 or 2011 is a more realistic goal, he says. Hicks predicts that paper-trail audit legislation will be introduced in the next two years, and that another bill could allow independent researchers to see the source code used by e-voting machines.
Click Here to View Full Article
to the top


Attack of the Zombie Computers Is a Growing Threat, Experts Say
New York Times (01/07/07) P. 1; Markoff, John

With botnet-enabled Internet crime on the rise, computer security experts are admitting that the Internet is increasingly at the mercy of cybercriminals, and that a new approach must be adopted. Gadi Evron, a computer security researcher for Beyond Security, says, "It's the perfect crime, both low-risk and high-profit. The war to make the Internet safe was lost long ago, and we need to figure out what to do now." Georgia Institute of Technology researcher David Dagon, who co-founded a startup that concentrates on botnets, estimates that botnet programs exist on 11 percent of the over 650 million computers connected to the Internet. Internet pioneer and Carnegie Mellon computer scientist David J. Farber laments, "It's an insidious threat, and what worries me is that the scope of the problem is still not clear to most people. The popular [Windows-based] machines are so easy to penetrate, and that's scary." A voluntary organization known as ShadowServer is observing botnet activity on about 400,000 infected machines. Computer security firm MessageLabs estimates that over 80 percent of spam currently being sent comes from botnets. A program known as "rustock" recently gained attention for its ability to secretly add machines to a botnet and use them for "pump and dump" schemes, yet it could also be used for a wide array of Internet crimes. Rustock is able to conceal infecting agents so that no digital fingerprints can be detected. Despite their best efforts, computer scientists cannot keep up with the improvements being made to botnet programs and are even beginning to fear for the commercial viability of the Internet; most ISPs are only making the situation worse, they say, by either ignoring or downplaying the problem. San Diego Supercomputer Center Internet researcher K.C. Claffy says, "It's a huge scientific, policy, and ultimately social crisis, and no one is taking any responsibility for addressing it."
Click Here to View Full Article
to the top


Researchers Use Wikipedia to Make Computers Smarter
American Technion Society (01/04/07) Hattori, Kevin

The accumulated knowledge of Wikipedia could soon allow search engines, spam filters, and other applications to better understand context when analyzing a document, thanks to research at the Technion-Israel Institute of Technology. The program created by the research team uses a concept database, constructed from Wikipedia, to understand single words and phrases. The type of "background knowledge" the researchers want computers to utilize is a vital part of human problem-solving ability, "but we [previously] didn't know how to have computers access such knowledge," says Technion Faculty of Computer Science researcher Shaul Markovitch. Whereas current programs simply treat documents as a group of words, the new system aims to understand the meaning of the words it encounters. One example given by the researchers is a spam filter set to block messages in which the word "vitamin" appears. If "B12," a type of vitamin, appeared in an email instead of "vitamin," such a filter would not block the message; the Technion system, however, would consult its database, find that B12 is a type of vitamin, and block the email. The system is also adept at discerning the meaning of ambiguous terms, which would be very valuable when dealing with translated documents.
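The article does not describe the Technion implementation, but the vitamin/B12 example can be illustrated with a minimal Python sketch of concept-aware filtering; the tiny concept map and function names below are invented stand-ins for a database actually derived from Wikipedia.

# Illustrative sketch only: a hand-built concept map standing in for a
# database constructed automatically from Wikipedia articles.
CONCEPT_DB = {
    "vitamin": {"vitamin", "b12", "riboflavin", "folic acid"},
}

def mentions_concept(message, concept):
    """Return True if any term linked to the concept appears in the message."""
    text = message.lower()
    return any(term in text for term in CONCEPT_DB.get(concept, set()))

# A plain keyword filter on "vitamin" misses this message; a concept-aware
# filter consulting the database still flags it.
email_body = "Special offer: cheap B12 supplements shipped overnight!"
print(mentions_concept(email_body, "vitamin"))   # True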
Click Here to View Full Article
to the top


PCs Get Set to Scream in 2007
Wired News (01/05/07) Gain, Bruce

PCs will become faster in 2007 as a result of flash-memory-assisted hard drives and software that utilizes the strength of multi-core CPUs. Analyst Rob Enderle, who is currently testing a hybrid flash/hard-drive memory system, says, "You take all of that stuff from the hard drive and put it in high-speed memory, and the applications just pop. Not only does the system come up faster, but the applications within the system come up a lot faster." Flash memory will boost the demand for laptops by increasing their battery life and decreasing the time machines spend coming out of "hibernate" mode. Also sure to bring about faster computing are the newly introduced multithreaded applications that make use of parallelism in quad-core CPUs. Gabe Newell, co-founder of software developer Valve, believes that these advancements mean quad-core processing "will have good benefits near-term and huge benefits over the next several generations." Wireless capabilities, from 802.11 Wi-Fi antennas and access points to cellular networks, should also receive a boost during 2007. Along with 40-inch LCD monitors, manufacturers are working on small second screens for laptops, which could allow users to view email and other information via Microsoft's SideShow platform without needing to start up their machine.
Click Here to View Full Article
to the top


A Personal Computer to Carry in a Pocket
New York Times (01/08/07) P. C1; Markoff, John

The realm of the mobile phone is being invaded by the personal computer, as several new devices are bringing the capabilities of desktop and laptop computers to America's hip pockets. Symbian's Jerry Panagrossi calls this development the "emergence of a fourth screen," referring to the "movie, television, computer, and now the smart-phone screen." Steve Jobs is expected to unveil Apple's new mobile communications device, which many anticipate will be one of the first of a new breed of devices that can handle music, entertainment, productivity tasks, and communications over cellular and other wireless networks. Many expect Apple's product to signal the beginning of intense competition between hardware vendors to create increasingly capable hybrid devices. One company, Oqo, has already shown its Model 2 to a generally positive response. The Microsoft Vista-equipped Model 2 is a fully equipped computer with a slide-out keyboard that can connect to Wi-Fi and cellular networks. Oqo co-founder and computer designer Jory Bell says the company's "main goal is to reinvent the PC in a pocketable form." Korean and Japanese companies have already converted half of their customers to the new generation of mobile computing devices that use data-oriented 3G wireless networks. Analysts say the U.S., once far behind, is now only a year behind these countries. Analyst Chetan Sharma says, "This is happening because of a number of factors. Some have to do with culture and the others are purely business-related. The carriers are now realizing that wireless data is a substantial part of the business."
Click Here to View Full Article
to the top


Emotion-Aware Teaching Software Tracks Student Attention
New Scientist (01/05/07) Simonite, Tom

Researchers in the United Kingdom and China believe software that is able to determine the emotional state of students will improve the effectiveness of e-learning systems. Universities around the world are turning to the Internet and other technologies to deliver lectures and presentations to students over large distances. "But these systems are unable to take into account the needs and responses of the student in the same way a teacher in a classroom can," says Vic Callaghan of Essex University, adding that "that's what we are trying to do, by making a system that can sense emotion." Callaghan has teamed up with Liping Shen, from Shanghai Jiao Tong University, to develop the software, which is designed to determine whether a student is showing interest in their studies and comprehending the subject matter. Their system would have users wear a ring fitted with a sensor to monitor heart rate, blood pressure, and changes in electrical resistance resulting from perspiration, and would transmit the data to a computer via Bluetooth. The system would also be able to slow down a learning tutorial or switch topics, and even change the format, such as from text to video, to help the user. Callaghan and Shen hope to test the system on students in China.
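New Scientist gives no implementation details, but the adaptation loop described might look roughly like the following Python sketch; the thresholds and the read_sensors() stub are invented for illustration, not taken from Callaghan and Shen's system.

# Illustrative only: the thresholds and the sensor stub are made up for this sketch.
def read_sensors():
    # Stand-in for readings arriving over Bluetooth from the sensor ring:
    # heart rate (beats/min) and skin conductance (a proxy for perspiration).
    return {"heart_rate": 88, "skin_conductance": 7.5}

def choose_action(sample):
    stressed = sample["heart_rate"] > 100 or sample["skin_conductance"] > 10.0
    disengaged = sample["heart_rate"] < 60 and sample["skin_conductance"] < 2.0
    if stressed:
        return "slow the tutorial down or switch from text to video"
    if disengaged:
        return "switch topics to regain the student's attention"
    return "continue at the current pace"

print(choose_action(read_sensors()))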
Click Here to View Full Article
to the top


Augmented Cognition: Science Fact or Science Fiction?
Institute for Ethics and Emerging Technologies (01/03/07) Costandi, Moheb

The immense volume of information coming in from multiple sources has forced us to continuously switch back and forth between these sources, and some researchers are pursuing augmented cognition as an enhancement to our processes of attention and working memory. Augmented cognition uses computational technology to ascertain a person's cognitive state with the goal of improving it, and the field has drawn interest from the military and the private sector. Researchers hope augmented cognition can optimize the allocation of attention and integrate multiple sources of information for more efficient data usage, and attention-augmenting devices could prove especially helpful in scenarios that require fast decision-making. Under development is a "smart" cockpit that reads the electrical activity of the pilot's brain and monitors mental stress levels via an electroencephalogram; the system automatically filters out irrelevant data to reduce stress so the pilot can pay better attention, or takes over the aircraft completely when the stress is overwhelming. Dylan Schmorrow, director of the Defense Advanced Research Projects Agency's AugCog program, believes human memory capacity will be enhanced by augmented cognition, which will "circumvent fundamental human limitations by engineering work environments that will make it easier for people to encode, store and retrieve the information presented to them [and] develop interfaces that are context-sensitive by presenting material in relation to the context in which it is encountered." Meeting this challenge involves incorporating information in unique, image-able, and multisensory contexts. Schmorrow describes the AugCog program's mission as "to extend, by an order of magnitude or more, the information management capacity of the human-computer warfighting integral by developing and demonstrating quantifiable enhancements to human performance in diverse, stressful, operational environments...[and to] empower one human's ability to successfully accomplish the functions currently carried out by three or more individuals."
Click Here to View Full Article
to the top


When Is a D Better Than C? When It's a Language
InternetNews.com (01/05/07) Patrizio, Andy

A newly developed free programming language aims to address the shortcomings of C/C++, though some question the likelihood of the language's success since it is not being released in conjunction with any other product or platform. The language, D, was developed by Walter Bright, who created the first native C++ compiler, Zortech C++. He has developed a D compiler and standard libraries for both Windows and Linux. The Phobos standard library and the compiler front end are open source, and a D front end is available for the widely used open-source compiler GCC. D produces natively compiled code without the need for a virtual machine. The language is also somewhat backwards compatible with C: It can interface with any C API without the need for a call interface, but it adds features such as garbage collection (as in Java and Microsoft's C#), an inline assembler, and Java-like single inheritance. In development since 2001, D has received considerable contributions from the Slashdot/open source developer community. Bright designed the language with the experience of C++ programmers in mind, not the sale of a product. He says, "The idea is to make programming in D the most productive possible. Quicker to learn, quicker to write code in, quicker to debug, and quicker to maintain." However, Forrester analyst Jeff Hammond questions the programming language's market potential. He says, "To make a technology viable, the technology has to be more than just good. You have to build a business model around it. What's the business model here?"
Click Here to View Full Article
to the top


System X Used to Model Behavior of Entire Structures
PhysOrg.com (01/08/07)

Elisa Sotelino is using the Virginia Tech supercomputer System X to model how structures will hold up under various conditions, such as fires or earthquakes. "What I do is take the initial information that the structural engineer obtains in the lab and create mathematical models to predict loads for similar structures," says Sotelino, a professor of civil and environmental engineering at Virginia Tech. "My models are very large and I need the advantages of the parallel computing capabilities of System X to run these models." System X, which demands new algorithms, can be used to solve structural problems mathematically, but real-world issues such as cost and visual appeal must also be taken into consideration. Sotelino is credited with developing a family of parallel algorithms called the Group Implicit Algorithms, which have contributed enormously to the area of nonlinear dynamic analysis of structures. The highly regarded specialist in the field was also the force behind the creation of the Structural Engineering Concurrent Software Development Environment (SECSDE), an environment for reusing, rapidly prototyping, and porting parallel finite element analysis software used to program and execute complex structural engineering applications.
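The article does not detail Sotelino's algorithms, but readers unfamiliar with structural dynamic analysis can get a feel for the kind of computation involved from the Python sketch below, which steps a toy two-degree-of-freedom model M*a + C*v + K*u = f through time with the standard Newmark average-acceleration method; all numbers are invented, and this is not the Group Implicit Algorithms.

import numpy as np

# Toy two-degree-of-freedom structural model: M a + C v + K u = f(t).
M = np.diag([2.0, 1.0])                            # mass
K = np.array([[400.0, -200.0], [-200.0, 200.0]])   # stiffness
C = 0.02 * K                                       # light damping
dt, beta, gamma = 0.01, 0.25, 0.5                  # Newmark average acceleration

u = np.zeros(2); v = np.zeros(2)
f = np.array([0.0, 10.0])                          # constant load on the top mass
a = np.linalg.solve(M, f)                          # initial acceleration

# Effective stiffness is constant for a linear model, so it is formed once.
K_eff = K + gamma / (beta * dt) * C + M / (beta * dt**2)

for _ in range(1000):                              # integrate over 10 seconds
    rhs = (f
           + M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
           + C @ (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                  + dt * (gamma / (2 * beta) - 1.0) * a))
    u_new = np.linalg.solve(K_eff, rhs)
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
    v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new

print("displacements after 10 s:", u)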
Click Here to View Full Article
to the top


Scientist Trying to Figure Out 'Space Weather'
News-Gazette (01/02/07) Kline, Greg

University of Illinois electrical and computer engineering professor Jonathan Makela studies the ionosphere, the region of the atmosphere in which molecules are ionized by solar radiation, in hopes of being able to predict the effects its activity will have on life on Earth. A 1989 blackout affecting 9 million people in Quebec was linked to geomagnetic phenomena occurring on the Sun, and is exactly the type of event Makela wishes to prevent, or at least predict. The ionosphere can also affect satellite signals traveling through it, airplane communication systems, and electricity moving through the power grid. Makela uses his "narrow-field ionospheric airglow imager," an especially sensitive digital camera equipped with special filters and an exposure time of a few minutes, to observe the "airglow" over Earth's equatorial areas. The airglow, which is difficult to see with the naked eye, is caused by electrons that were stripped from the atoms and molecules of ionospheric gases recombining after sunset. The imaging equipment transmits data to Makela via the Internet. Along with colleagues, he has been able to create images of conditions in the ionosphere resembling TV weather maps. Based on how the airglow changes, the space weather causing the changes can be studied. Makela admits that his research is not yet at the point of being able to predict events. He says, "We're at the stage that meteorology was 50 (years ago)."
Click Here to View Full Article
to the top


DARPA Pushes to Bring Supercomputers to the Masses
Computerworld (01/08/07) Thibodeau, Patrick

The Defense Advanced Research Projects Agency has embarked on a $650 million program to develop easy-to-use, high-performance supercomputers for both commercial and national security applications; the focus for DARPA is not just on the design of hardware, but also on the creation of programming languages, development tools, and techniques for scaling applications across immense numbers of processors. William Harrod, the manager of DARPA's High Productivity Computing Systems program, says constructing such systems calls for a programming environment that "is easier to use and that has less of a learning curve than the environments on today's HPCs." In addition, the new systems will require an operating system and an architecture that yield efficient performance. Five vendors were vying for the development contract when the program started in 2002, and the list has been whittled down to two: Cray and IBM won contracts totaling $500 million to devise "economically viable high-productivity" supercomputers by the close of the decade. The vendors are developing competing programming languages--Chapel from Cray and X10 from IBM--to address current limitations. "There will be one language at the end of the day, and the government, multiple companies and HPC user communities are going to have to put in some effort to adopt [it]," said Cray CTO Steve Scott.
Click Here to View Full Article
to the top


2007 U.S. Budget: NSF Braces for Opportunities Lost
Science (01/05/07) Vol. 315, No. 5808, P. 24; Mervis, Jeffrey

The congressional Democratic desire to freeze the 2007 federal budget at current spending levels until October could jeopardize potential projects at the National Science Foundation (NSF), such as the development of a petascale computer. The NSF is sponsoring a competition to build the $200 million next-generation supercomputer, but a flat budget could force the agency to delay a $50 million down payment until the start of the 2008 fiscal year. The outgoing Republican Congress never completed work on a federal budget in which the NSF had hoped to see an 8 percent increase from its $5.6 billion budget. The increase was part of the Bush administration's American Competitiveness Initiative (ACI), which called for doubling federal spending on basic research in the physical sciences over the next 10 years, with President Bush proposing that the first increase occur in the 2007 budget. Some industry lobbyists plan to start focusing on the 2008 budget. Sens. Pete Domenici (R-N.M.), Lamar Alexander (R-Tenn.), and Jeff Bingaman (D-N.M.) in late December sent a letter to President Bush asking him "to continue to make [ACI spending levels] a top priority in your budget and for your administration." They stressed that such a commitment to technology's role in innovation will be key to the nation's ability to remain the world's economic leader in the years to come.
Click Here to View Full Article - Web Link to Publication Homepage
to the top


Web Users Driving Change in 2007
BBC News (01/01/07) Ward, Mark

Three technology veterans interviewed by BBC News each identified a trend to watch for in 2007. Consultant Kathy Johnson says that as social networks grow on the Internet, companies will strive for the "actualization of personalization," meaning they will seek ways to mine the information contained in these communities to enhance retailers' ability to make personalized recommendations to users. She points out that users have historically been more likely to act on suggestions made by online interest groups than by retailers. Entrepreneur Philippe Courtot believes that the simplicity and speed with which Web programs can be constructed will change the way businesses purchase and create software: "You cannot keep on developing software the old ways. The costs of distribution and support are higher and higher and the customers are less and less satisfied." Courtot expects many mergers and acquisitions among traditional software firms as customers increasingly choose software and services delivered over the Web. Accenture European research labs director Dr. Martin Illsley says customers will increasingly use camera-equipped mobile phones to provide businesses with negative feedback. Illsley also points to service robots and wireless sensor networks as trends that are ready to explode in 2007.
Click Here to View Full Article
to the top


Online Tests Click With Students
Canadian University Press (01/03/07) Heise, Ryan

University of Alberta researchers are developing new ways of administering tests, replacing pencil and paper with interactive multimedia components designed to better hold students' attention and more accurately evaluate their knowledge. The work is being led by computer science professor and chairman of the Alberta Informatics Circle of Excellence Anup Basu, who says, "The idea is to ... be very interactive. People can drag and drop things; they can play games and in the process get tested." The software currently runs on computers, but other devices, such as mobile phones, are being considered. The testing would also be scalable on an individual basis. Basu says, "We want to tailor the test to a student's ability. If the questions are too hard we make it easier; if the questions are too easy we make it a little more difficult." Basu does not expect this approach to raise scores so much as to ensure that students remain engaged with the test. The program's research director, Irene Cheng, says the tests would use a new scoring method based on item response theory, which lets educators tell whether students guessed. "If you just count the number of correct answers, it's actually quite misleading because now we actually monitor the response curve," Cheng says. "We can see whether students are guessing or if the student has improved--we can know their behavior." Cheng and Basu expect this type of testing to be implemented initially in K-12 settings.
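The article does not say which model the Alberta system uses; as a rough illustration of difficulty adaptation, the Python sketch below selects each question with the one-parameter (Rasch) item response model, picking the unasked item closest to the current ability estimate. The item bank, responses, and update rule are invented for the example.

import math

def p_correct(theta, b):
    # Rasch model: probability that a student of ability theta answers an
    # item of difficulty b correctly.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, item_bank, asked):
    # The unasked item whose difficulty is closest to the ability estimate
    # is the most informative one under the Rasch model.
    return min((b for b in item_bank if b not in asked), key=lambda b: abs(b - theta))

theta = 0.0                                        # start from an average ability estimate
item_bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
asked = []

for answer_correct in [True, True, False, True]:   # pretend responses
    b = next_item(theta, item_bank, asked)
    asked.append(b)
    print(f"item b={b:+.1f}  model p(correct)={p_correct(theta, b):.2f}  "
          f"student correct={answer_correct}")
    # Crude update: nudge the estimate up after a correct answer, down after
    # a miss (a real system would use maximum-likelihood estimation).
    theta += 0.5 if answer_correct else -0.5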
Click Here to View Full Article
to the top


Sierra: A Brain That Thinks About Thinking
Computerworld Australia (01/05/07) Tay, Liz

Author Kathy Sierra, a former game developer, says she has a dual role in the IT industry: She helps people learn difficult technical subjects with a minimum of frustration, and helps developers generate passionate users irrespective of the product. "The way most learning happens today is so inefficient, that we can make a dramatic improvement--orders of magnitude improvement--with the technology we do have, simply by applying a set of principles that have come from cognitive science, neurobiology, game design, psychology, entertainment, and learning theory," Sierra notes. She draws a distinction between the brain and the mind, arguing that the dichotomy between "belief" (the brain) and conscious knowledge (the mind) lies at the root of our problems with memory, learning, focus, engagement, attention, and so on. "I think the biggest place to make a difference is learning to create more brain-friendly materials including products and user manuals," she explains, noting that the trick is to associate things the brain does not respond to (such as computer code) with things it does through the use of intriguing visuals, unique presentation, and interesting narratives. Sierra authored the Head First books as brain-friendly programming texts to give readers and learners a better learning experience; show teachers that there are more engaging and fun ways to teach technical subjects; increase awareness of brain-friendly principles; and help people better comprehend the processes of thinking and memory. Her topic of discussion at linux.conf.au 2007 will be applying lessons from game design and other domains to product design, user documentation, user community building, and other things with the goal of creating passionate users, and Sierra maintains that instilling passion in IT employees is crucial to this approach.
Click Here to View Full Article
to the top


Of Mice and Multimedia
Newsweek (01/04/07) Braiker, Brian

IDEO founder and industrial designer Bill Moggridge attributes bad user interface design to the failure to imbue a product with both intuitiveness and wide appeal. Detailing the principles of interactive design is the purpose of his new book, "Designing Interactions." The book attempts to explain the operational parameters and relevancy of design through interviews with industry pioneers. Moggridge notes that the computer mouse went through an exhaustive evolutionary phase in which a wide variety of iterations were designed and tried out before the current incarnation, widely accepted by consumers, emerged. He admits that he does not know why designing highly intuitive interfaces is so tough, but he uses the lack of competition for Apple's iPod to illustrate the value of a systemic approach. Apple first acquired the company with the iTunes music technology and then added attractiveness and ease of use; the company did not roll out a physical product until the ability to manipulate music on a computer was well established and the iTunes music store was set up. "I believe you have to have both excellent design that people like or fall in love with, and you also have to have the systemic approach that looks at the entire experience," Moggridge contends. He points out that the move toward Web 2.0 is progressing, and the time may soon be coming when a new kind of interaction that combines locomotion, manipulation, and community activities becomes available.
Click Here to View Full Article
to the top


The Logic of Privacy
Economist (01/04/07)

Stanford University computer scientists John Mitchell, Adam Barth, and Anupam Datta are using the theory of contextual integrity to address the tension between the wide availability of personal data and the demand for privacy. The theory holds that people do not require total privacy; information sharing can proceed provided certain social norms are adhered to. Mitchell and colleagues believe contextual integrity can be used to represent codes of privacy in the formal phraseology of a computer language. Contextual integrity depends on four classes of variable: the context of an information flow, the capacities in which the individuals sending and receiving the information are acting, the types of information involved, and the transmission principles that serve as the foundation of the information flow. Working with New York University's Helen Nissenbaum, who developed contextual integrity, Barth is using linear temporal logic, a mathematical logic system that expresses elaborate constraints on the past and future, to transform descriptions of the contextual integrity variables into formal expressions that can be employed in computer programs. Unlike programs written in other programming languages, those based on linear temporal logic describe the desired vision of the world. Mitchell's team has crafted logical formulas to represent American privacy laws, including those that cover children's activities online, financial institutions, and health care; transmission principles can be expressed in logical terms by using temporal concepts such as "previously" and "eventually" as operators, much as arithmetic uses "plus," "minus," "multiply," and "divide." According to Nissenbaum, applying contextual integrity allows questions of privacy to be handled better and makes it easier to determine why new information-gathering techniques arouse anger.
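The Economist stays at the conceptual level, and the Stanford formalism is not reproduced here; purely to give a flavor of reasoning with a "previously" operator over a history of information flows, the Python sketch below checks one invented transmission principle: medical data may be sent only to a party previously assigned the sender's physician role. The norm, roles, and event format are all made up for the example.

# Invented norm and event format, for illustration only.
events = [
    ("assign_role", "dr_bob", "physician_of_alice"),
    ("send", "alice", "dr_bob", "medical"),
    ("send", "alice", "acme_ads", "medical"),
]

def held_role_previously(party, role, history):
    # The "previously" operator: was this role assigned at an earlier point?
    return ("assign_role", party, role) in history

def allowed(event, history):
    if event[0] != "send" or event[3] != "medical":
        return True                      # this toy norm constrains only medical sends
    _, sender, receiver, _ = event
    return held_role_previously(receiver, f"physician_of_{sender}", history)

history = []
for e in events:
    if e[0] == "send":
        print(e, "->", "allowed" if allowed(e, history) else "violates the norm")
    history.append(e)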
Click Here to View Full Article
to the top


To submit feedback about ACM TechNews, contact: [email protected]

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to [email protected] with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: [email protected]

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: [email protected]

to the top

News Abstracts © 2007 Information, Inc.


© 2007 ACM, Inc. All rights reserved. ACM Privacy Policy.