ACM TechNews (HTML). Read TechNews online at: http://technews.acm.org
ACM TechNews
December 3, 2007


Welcome to the December 3, 2007 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

Faster Computers Accelerate Pace of Discovery
Secretary of State Casts Doubt on Future of Electronic Voting
Government-Sponsored Cyberattacks on the Rise, McAfee Says
Forum to Upgrade MPI Standard
The Transistor at 60
Scientist: 'Hybrid' Computers Will Meld Living Brains With Technology
Carnegie Mellon's National Robotics Engineering Center Receives $14.4 Million to Develop and Demonstrate Next-Generation Autonomous Ground Vehicle
Cryptic Messages Boost Data Security
Intelligent Software Helps Build Perfect Robotic Hand
Open Source's Future: More Microsoft, Bigger Talent Shortages
Computer Research Might Land Senior $100,000 Scholarship
Comparing Collaborative Interaction in Different Virtual Environments
Taking Legos a League Beyond: U-32 Students Program Robots for National Competition
Concurrency: The Compiler Writer's Perspective
Toward a Social Semantic Web

Faster Computers Accelerate Pace of Discovery
Washington Post (12/03/07) P. A7; Lee, Christopher

The first petascale computer is expected to become operational sometime next year, continuing the trend of ever-faster supercomputers. A petascale supercomputer will complete in two hours a calculation that would take five hours on what is currently the world's fastest computer. Scientists say that powerful supercomputers will help solve long-standing mysteries in climate change, geology, drug development, dark matter, and other fields where direct experimental observation is too time-consuming, costly, dangerous, or simply impossible. Supercomputers have grown so powerful that they have effectively introduced a new step into the scientific method, with computation joining theory, hypothesis, and experimentation as a standard procedure of scientific discovery. The massive increases in computing power rely not only on increases in processor speed and memory, but also on the ability of scientists to "gang" hundreds of thousands of nodes into a single machine and to create better ways of having the nodes communicate with one another when solving a problem. "If you ran today's code on yesterday's computers, they would be much faster," says Raymond Bair, director of the Argonne Leadership Computing Facility at Argonne National Laboratory. "People have figured out how to solve the problems faster." Although petascale computing is not yet available, scientists are already looking forward to the exascale machine, capable of 1 million trillion calculations per second and about a thousand times more powerful than a petascale system. Exascale machines could be available around 2018.
Click Here to View Full Article


Secretary of State Casts Doubt on Future of Electronic Voting
San Francisco Chronicle (12/02/07) P. C7; Wildermuth, John

California Secretary of State Debra Bowen says that electronic voting systems used in California are still too unreliable and untrustworthy to be used in the state's elections, and she doubts whether such systems will ever meet the standards she believes California needs. Although computer scientists may one day develop reliable systems, Bowen says, today's machines are not as transparent or auditable as the paper ballot systems they replaced. A rigorous inspection of the state's voting systems found that most of the voting machines were vulnerable to hackers looking to change results or cause mischief, which led Bowen to decertify almost all of the touch-screen systems used in California. Bowen says she would like to see California use optical scan systems, which use a paper ballot and a tallying machine and are already used to count mail ballots in California. Optical scan systems are "old and boring, but cheap and reliable," Bowen says, because the paper ballots make it easy to conduct a recount. While Bowen's investigation and decisions involve only California, they have had a nationwide impact because many of the same systems are used in other states. "I want to make sure the votes are secure, auditable, and transparent and that every vote is counted as it was cast," Bowen says.
Click Here to View Full Article


Government-Sponsored Cyberattacks on the Rise, McAfee Says
Network World (11/29/07) Brodkin, Jon

Governments and groups across the world are harnessing the Internet to mount cyberattacks on their enemies by attacking key systems such as financial markets, electricity grids, and government computer networks, according to a new report from McAfee. The report, which was created with input from the FBI, NATO, and other intelligence groups, notes that China has been accused of launching attacks against four countries in 2007. The United States and 119 other nations are also believed to be conducting Web espionage operations, McAfee reports. Such assaults are well-organized, well-funded, and can operate on technical, economic, political, and military fronts. Moreover, the attacks have grown so sophisticated that they can evade government cyberdefenses, according to McAfee. David Marcus of McAfee anticipates the eventual creation of a privatized model, under which governments will authorize cybercriminals to attack enemies, noting that state-sponsored malware has already emerged. Meanwhile, cyberattacks are also a growing threat to online services such as banking, and new targets include VoIP and social-networking applications such as Facebook. Malware is also getting more flexible and robust, as demonstrated by the "Storm Worm," and McAfee researchers have seen "the emergence of a complex and sophisticated market for malware." Finally, the report notes that cybercrime tools such as custom-written Trojans and software flaws are available for sale, and that the underground economy that distributes these tools is so competitive that customer service has become a selling point.
Click Here to View Full Article


Forum to Upgrade MPI Standard
HPC Wire (11/29/07)

The MPI Forum has been relaunched and is calling on end users, hardware and software vendors, researchers, and MPI implementers to take part in its effort to bring the Message Passing Interface (MPI) standard up to date. A decade old, MPI is the ubiquitous application programming interface for parallel computing, and it has made it easier to exchange data between processes and to adjust the number of processes in a single parallel job. The MPI Forum plans to clarify the current MPI 2.0 standard and make corrections to the document by mid-2008, and to address errors and omissions in the standard by early 2009. The final stage of reassessing the standard, with regard to its support for current and future applications, could be completed by 2010 and could involve changes for generalized requests, non-blocking collectives, new language support, and fault tolerance. "MPI has been extremely successful in enabling advances in simulation over the past decade and will continue to play a key role in this arena," says Rich Graham of Oak Ridge National Laboratory's National Center for Computational Sciences, who is coordinating the project. "However, with a large body of hands-on experience and a rapidly changing computing ecosystem, it is time to take a look at adjusting the standard to meet this ever-changing environment."
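
For readers who have not worked with the standard, the sketch below shows the style of code MPI governs: each process in a parallel job contributes a value, and a collective operation combines the contributions across all processes. It is a minimal, hypothetical example, assuming a C compiler and an MPI implementation such as MPICH or Open MPI; the non-blocking collectives under consideration for the revised standard would allow an operation like this to overlap with other computation.

    /* Minimal MPI sketch: each rank contributes a value and a blocking
     * collective (MPI_Allreduce) combines them across the whole job.
     * Build and run with, e.g.: mpicc sum.c -o sum && mpirun -np 4 ./sum */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);                 /* start the MPI runtime */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id */
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* processes in the job */

        int local = rank + 1;                   /* this rank's contribution */
        int global = 0;

        /* Every rank receives the sum of all contributions. */
        MPI_Allreduce(&local, &global, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum over %d ranks = %d\n", size, global);

        MPI_Finalize();
        return 0;
    }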
Click Here to View Full Article


The Transistor at 60
Sydney Morning Herald (Australia) (11/27/07) Head, Beverley

Since its debut six decades ago, transistor technology has advanced to the point where 820 million transistors can be housed on Intel's new Penryn processor. However, the shrinking of transistors to boost processing speed and manage power efficiency has Intel co-founder Gordon Moore convinced that a physical barrier will be reached within the next 10 to 15 years. Not everyone agrees with Moore's assessment. "What's happened again and again when you come upon the physical limits is we've been able to advance around them, and I think that will continue for at least the next several generations," says Glenn Wightwick, director of IBM's Australia Development Laboratory. Intel CTO Justin Rattner forecasts that within 10 years electronics will shift from reliance on an electron's electrostatic charge to its "spin," and perhaps usher in molecular devices. Wightwick says many research labs are investigating potential replacements for the transistor, such as molecular cascades or carbon nanotubes. The trade-off with a switch to new electronic components is the cost and effort of facilitating such a transition, but users would benefit enormously because their interaction with technology would be easier thanks to single-system chips, Rattner says. He says these advances could lead to innovations such as practical machine translation, continuous speech recognition, and personal robots.
Click Here to View Full Article


Scientist: 'Hybrid' Computers Will Meld Living Brains With Technology
Computerworld (12/03/07) Gaudin, Sharon

University of Arizona professor Charles Higgins believes that in 10 to 15 years "hybrid" computers combining conventional technology with living organic tissue will be common consumer products. Higgins has successfully connected a moth's brain to a robot, using the moth's sight to tell the robot when something is approaching so it can move out of the way. Higgins says he started out trying to build a computer chip that could simulate how a brain processes visual images, but found that the chip would cost an estimated $60,000. "At that price I thought I was getting lower quality than if I was just accessing the brain of an insect which costs, well, considerably less," Higgins says. "If you have a living system, it has sensory systems that are far beyond what we can build." The 12-inch-tall robot that relies on a moth's sight may be considered cutting-edge right now, but Higgins believes it is only the beginning of organically enhanced computers. "In future decades, this will not be surprising," says Higgins. "Most computers will have some kind of living component to them."
Click Here to View Full Article


Carnegie Mellon's National Robotics Engineering Center Receives $14.4 Million to Develop and Demonstrate Next-Generation Autonomous Ground Vehicle
Carnegie Mellon News (11/29/07)

The U.S. Army Tank-Automotive Research, Development, and Engineering Center (TARDEC) has awarded Carnegie Mellon University's National Robotics Engineering Center (NREC) a $14.4 million contract to develop a more advanced version of Crusher, its autonomous, unmanned ground vehicle. The updated version of Crusher will have the latest suspension, vehicle frame, and hybrid-electric drive technologies, which should give its performance a boost. TARDEC also wants Carnegie Mellon to develop an end-to-end control architecture and demonstrate its viability for autonomous UGV operations in settings that have the most difficult terrain. "We're delighted that NREC will play a key role in showing how advanced autonomous vehicles work in FCS [future combat systems] settings," says NREC director John Bares. "Our goal will be to develop, integrate, and test a high-performance UGV with the most up-to-date mobility and autonomy technologies."
Click Here to View Full Article


Cryptic Messages Boost Data Security
ICT Results (11/28/07)

The first "real-life" application of quantum cryptography was the use of id Quantique's unbreakable data code in the Swiss national elections in October 2007. "Protection of the federal elections is of historical importance in the sense that, after several years of development and experimentation, this will be the first use of a 1 GHz quantum encrypter, which is transparent for the user, and an ordinary fiber-optic line to send data endowed with relevance and purpose," said id Quantique co-founder Nicolas Gisin. Through quantum cryptography, two communicating parties can generate a shared random bit string known only to them, which can be used as a key to encode and decode messages. Furthermore, the parties can be tipped off almost immediately when an unauthorized third party attempts to gain access to the key, and can take action to counter the intrusion. Accidental data corruption can also be detected, an important consideration in the Swiss elections. The elections are just the first step of a plan to set up a pilot quantum communications network in Geneva called SwissQuantum, whose next phase will be the provision of a platform for testing and validating the quantum technologies that will help safeguard future communications networks. id Quantique is also a partner in the SECOQC project, and co-founder Gregoire Ribordy says the initiative "makes it possible for id Quantique's engineers to interact with some of the best groups worldwide in the field of quantum cryptography." SECOQC's partners plan to lay the groundwork for a high-security communication network that melds quantum key distribution with elements of classical computer science and cryptography.
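
The key-agreement idea described above can be illustrated with a toy classical simulation in the style of the BB84 protocol, the scheme underlying most commercial quantum key distribution; the article does not say which protocol id Quantique's product implements, so the sketch below is a generic illustration, not the company's method. Bits are kept only at positions where sender and receiver happen to choose the same basis, and an eavesdropper who measures in the wrong basis randomizes the bit, which the parties can detect by publicly comparing a sample of the sifted key.

    /* Toy BB84-style simulation (classical, for illustration only):
     * Alice encodes random bits in random bases, Bob measures in random
     * bases, and both keep only the positions where the bases matched. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 32  /* photons sent */

    int main(void)
    {
        srand((unsigned)time(NULL));
        int key[N], klen = 0;

        for (int i = 0; i < N; i++) {
            int bit = rand() % 2;         /* Alice's random bit */
            int a_basis = rand() % 2;     /* Alice's encoding basis */
            int b_basis = rand() % 2;     /* Bob's measurement basis */

            /* Matching basis yields the exact bit; otherwise the outcome
             * is random, as it would be for an eavesdropper guessing. */
            int measured = (a_basis == b_basis) ? bit : rand() % 2;

            if (a_basis == b_basis)       /* bases are compared publicly */
                key[klen++] = measured;
        }

        printf("sifted key (%d of %d bits): ", klen, N);
        for (int i = 0; i < klen; i++)
            putchar('0' + key[i]);
        putchar('\n');
        return 0;
    }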
Click Here to View Full Article


Intelligent Software Helps Build Perfect Robotic Hand
Innovations Report (11/29/07) Egan, Lisa

Researchers in Portsmouth and Shanghai plan to use artificial intelligence to teach a robotic glove to move with the dexterity of a human hand. A cyberglove will learn human hand movements from software being developed by Dr. Honghai Liu, senior lecturer at the University of Portsmouth's Institute of Industrial Research, and professor Xiangyang Zhu of the Robotics Institute at Jiao Tong University in Shanghai. The device uses motion-capture, sensor, and infrared-illumination camera technology to capture data, and measures movements with an accuracy of a few millimeters. The artificial intelligence and robotics experts believe their research could ultimately lead to the development of the perfect artificial limb. "Humans move efficiently and effectively in a continuous flowing motion, something we have perfected over generations of evolution and which we all learn to do as babies," Zhu says. "Developments in science mean we will teach robots to move in the same way."
Click Here to View Full Article


Open Source's Future: More Microsoft, Bigger Talent Shortages
Network World (11/27/07) Brown, Bob

Raven Zachary, open source research director for The 451 Group, believes the open source industry in 2008 will see more news from Microsoft, IBM, Oracle, and other big IT vendors, less startup funding, more M&A activity, and an increasingly damaging shortage of talent. During The 451 Group's second Annual Client Conference in Boston, Zachary said he is optimistic about the market, largely because the traditional bottom-up adoption of open source by developers and systems management professionals is being coupled with top-down adoption driven by CIOs and executive committees who believe in the cost-reduction potential of open source software. Additionally, companies are switching to open source for applications as well as for browsers and operating systems. Zachary says big-name IT vendors known for their proprietary technologies are embracing open source and collaborative development systems, and that big vendors are re-evaluating their business models, licensing schemes, and product plans because of open source's "disruptive force." However, the current shortage of open source talent is only expected to get worse as demand skyrockets for internal open source support and developers. Zachary also expects a wave of failed open source businesses in 2008 as companies learn how to monetize open source products.
Click Here to View Full Article


Computer Research Might Land Senior $100,000 Scholarship
Washington Post (12/03/07) P. B2; Chandler, Michael Alison

One of the top math and science contests for high school students in the country will announce its winners on Monday. Over the weekend, six individual and six team finalists presented their projects before a panel of judges in New York City. Computer networks were the focus of Jacob Steinhardt, an 18-year-old senior at Thomas Jefferson High School for Science and Technology in Fairfax, Va. The research has the potential to contribute to the development of faster computer networks. Steinhardt turned his attention to studying computer networks algebraically in order to improve their efficiency. "It was me just doing stuff for my own edification, and it evolved," he says. Steinhardt has a chance to win a $100,000 scholarship, and he intends to continue to pursue studies in math and computer science next year at MIT. The competition, sponsored by the Siemens Foundation, drew 1,600 entries.
Click Here to View Full Article - Web Link May Require Paid Subscription


Comparing Collaborative Interaction in Different Virtual Environments
SPIE (11/28/07) Shahab, Qonita; Kwon, Yong-Moo; Ko, Heedong

Significant strides have been made in the development of haptics hardware and software. Such advances have significantly altered collaborative virtual environments (CVEs), where multiple users can work together and interact with objects in a virtual world. In CVEs, all of the users' inputs need to be combined in real time to determine how an object will be affected or behave. Several research efforts have examined interaction techniques common to CVE users, particularly when multiple users are handling the same object. One project, called the Virtual Dollhouse, examined two people working together to build a virtual dollhouse using virtual building blocks, a hammer, and nails. Network support allowed participants in different places to work in the same simulation and to see the results of each other's actions. The study also examined the effect of haptics on collaborative virtual interaction, as well as how collaboration changed when the users were in different types of virtual environments. Even during voiceless collaboration, users were able to understand the status of an object being operated by other users based on color feedback, and users could still work together effectively. Meanwhile, 3D devices such as the space interface device for artificial reality proved more intuitive than a traditional joystick for tasks where users are asked to select and move objects. Immersive display environments such as the cave automatic virtual environment were also found to be more suitable than non-immersive displays such as normal PC monitors for simulating object manipulation that requires force and the feeling of weight.
Click Here to View Full Article


Taking Legos a League Beyond: U-32 Students Program Robots for National Competition
Rutland Herald (VT) (12/03/07) Larkin, Daphne

The For Inspiration and Recognition of Science and Technology (FIRST) Lego League 2007 Challenge brought together students aged 9 to 14 for a worldwide robotics programming competition centered on global energy issues. The competition involved programming robots to perform certain tasks, such as replacing solar panels on a Lego house. The Lego Mindstorms robots used in the competition are the same toolkits used by Tufts University engineering programs, and can be as accessible or as challenging as the students need. Student teams are given eight weeks to design, build, and program a robot to complete specific tasks, which are announced each year on Sept. 1. The competition also features workshops for FIRST mentors to encourage the development of more FIRST teams. "The combination of the public speaking presentation, computer programming and engineering adds up to a brilliant piece," says Randy Brown, a FIRST team coach and physics and computer science teacher in Vermont. "If you're going to have a soccer team, you should have a FIRST team."
Click Here to View Full Article


Concurrency: The Compiler Writer's Perspective
Software Development Times (11/15/07) No. 186, P. 26; Morales, Alexandra Weber

Google compiler writer Brian Grant acknowledges in an interview that concurrency is a challenge for developers, but not an overwhelming one. He notes that developers can address nondeterminism through tools and methodologies, but that other language features carry inherent complexities. "The key is managing the complexity by imposing some kind of discipline," Grant explains. "As with all software, you need to break it into components, layers, and well-defined interfaces." Grant says we are a long way from automatic concurrency or parallelizing compilers, and adds that OpenMP works well for an extremely small set of applications, such as parallelizing hot kernels in numeric and scientific computing; partitioning larger codebases is a far more daunting challenge that OpenMP is not suited for. In contrast to existing languages such as C++, higher-level languages tend to favor ease of correctness over ease of performance, Grant says. He contends that GPUs and the Cell processor do not necessarily tackle the problem of increasing performance while decreasing concurrency: such processors run some applications at high speed but not others, and are much more difficult to program. He concludes that he has yet to see a compelling or complete concurrency solution, and posits that "concurrency needs to be well supported throughout the whole software ecosystem: languages, tools, libraries and legacy code."
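
As an illustration of the narrow case where Grant says OpenMP does work well, the sketch below parallelizes a "hot" numeric kernel with a single pragma; the reduction clause combines per-thread partial sums without a data race. This is a generic example rather than code from the interview, and it assumes a compiler with OpenMP support (for example, gcc -fopenmp).

    /* A "hot kernel" of the kind OpenMP handles well: independent loop
     * iterations, parallelized with one pragma. */
    #include <omp.h>
    #include <stdio.h>

    #define N 1000000

    int main(void)
    {
        static double a[N], b[N];
        double dot = 0.0;

        for (int i = 0; i < N; i++) {   /* fill with sample data */
            a[i] = 0.5 * i;
            b[i] = 2.0;
        }

        /* Each thread sums a chunk of the iterations; the reduction
         * clause merges the per-thread partial sums safely. */
        #pragma omp parallel for reduction(+:dot)
        for (int i = 0; i < N; i++)
            dot += a[i] * b[i];

        printf("dot = %f (up to %d threads)\n", dot, omp_get_max_threads());
        return 0;
    }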
Click Here to View Full Article


Toward a Social Semantic Web
Computer (11/07) Vol. 40, No. 11, P. 113; Mikroyannidis, Alexander

Although there may appear to be a natural conflict between the structural demands of the Semantic Web and the open availability of Web 2.0 and social networking, the two areas are essentially compatible and can coexist, writes University of Leeds research assistant Alexander J. Mikroyannidis. Internet monitoring firm Netcraft reports that the Web currently comprises more than 100 million Web sites, and despite the rapid growth in the amount of information available, serious efforts to manage that information have been lacking. Most published information is not structured to allow for logical reasoning, and finding information that requires more than a keyword search can be difficult. Meanwhile, the rise of Web 2.0 has seen the rapid development of some incredibly popular and innovative technologies, including social networking sites, communication tools, and wikis. The widespread appeal of blogging, syndication, and tagging, which allows users to easily share and publish content, has made Web 2.0 incredibly popular and has led to the Social Web, a medium for online communication and collaboration. A popular way of organizing content on the Social Web is with keywords, which creates sets of categories derived from tagging that are commonly referred to as folksonomies. Folksonomies could potentially bridge the gap between the Social and Semantic Web because of their ability to form stable structures. For example, folksonomies' collective categorization scheme could serve as an initial knowledge base for constructing ontologies, which could then use the most common tags as concepts, relations, or instances. Essentially, creating a Social Semantic Web would mean allowing Web 2.0 users to tag as usual and then exploiting the folksonomies to construct and evolve the ontologies and metadata that deliver Semantic Web products and services to users.
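
The bootstrapping step Mikroyannidis describes, promoting the most common tags to candidate ontology concepts, can be sketched in a few lines; the tag list and frequency threshold below are invented sample data, not drawn from the article.

    /* Tally folksonomy tags and promote frequent ones to candidate
     * ontology concepts. Tags and threshold are illustrative only. */
    #include <stdio.h>
    #include <string.h>

    #define MAX_DISTINCT 64
    #define THRESHOLD 3   /* tags at least this common become concepts */

    int main(void)
    {
        const char *tags[] = {        /* tags as users applied them */
            "photo", "travel", "photo", "paris", "photo",
            "travel", "paris", "photo", "travel", "food"
        };
        int n = sizeof tags / sizeof tags[0];

        const char *names[MAX_DISTINCT];
        int counts[MAX_DISTINCT], distinct = 0;

        for (int i = 0; i < n; i++) { /* count each distinct tag */
            int j;
            for (j = 0; j < distinct && strcmp(names[j], tags[i]); j++)
                ;
            if (j == distinct) {
                names[distinct] = tags[i];
                counts[distinct++] = 0;
            }
            counts[j]++;
        }

        puts("candidate ontology concepts:");
        for (int j = 0; j < distinct; j++)
            if (counts[j] >= THRESHOLD)
                printf("  %s (%d occurrences)\n", names[j], counts[j]);
        return 0;
    }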
Click Here to View Full Article - Web Link to Publication Homepage


To submit feedback about ACM TechNews, contact: [email protected]

To be removed from future issues of TechNews, please enter the email address at which you receive TechNews alerts, at:
http://optout.acm.org/listserv_index.cfm?ln=technews

To re-subscribe in the future, enter your email address at:
http://signup.acm.org/listserv_index.cfm?ln=technews

As an alternative, log in at myacm.acm.org with your ACM Web Account username and password, and follow the "Listservs" link to unsubscribe or to change the email address to which future issues should be sent.


News Abstracts © 2007 Information, Inc.


© 2007 ACM, Inc. All rights reserved. ACM Privacy Policy.