Read TechNews Online at: http://technews.acm.org
ACM TechNews
December 19, 2007


Welcome to the December 19, 2007 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

 

IBM Thinks Green With Supercomputer-On-A-Chip
IT World Canada (12/19/07)

A recent IBM research project that aims to replace electrical signals with pulses of light, making data transfer between processor cores up to 100 times faster, could lead to laptop-sized supercomputers and drastically lower power consumption. The technology, called silicon nanophotonics, replaces electronic wires with pulses of light in optical fibers for faster and more efficient data transfers between cores on a chip, according to IBM research scientist Will Green. Green says silicon nanophotonics can transfer data over distances of up to a few centimeters, is about 100 times faster than wires, and consumes one-tenth as much power, which should reduce operational costs for supercomputers. "The silicon nanophotonic effort is a high-bandwidth, low-power technology for cores to communicate," says Green, who adds that the improved data bandwidth and power efficiency will allow for massive computing power in consumer products. "We'll be able to have hundreds or thousands of cores on a chip," says Green. Silicon nanophotonics is based on the same science that led to the development of optical fibers and Internet communications, and could be incorporated in chips within 10 to 12 years, according to Green. Copper wire is still essential for transistors within a chip to communicate, but silicon nanophotonic technology could be used for communication between cores. "We're complementing the capabilities of copper with our optical technology," says Green.


E-Voting Machines Rejected in Colo.
Associated Press (12/19/07) Merritt, George

Colorado Secretary of State Mike Coffman recently declared many of Colorado's electronic voting machines to be unreliable, decertifying three of the four voting-equipment manufacturers certified in the state. Coffman said some of the machines could still be used in November if a software patch can be installed, and other machines could be replaced with equipment certified for use in other states, but both of these solutions would require the approval of the state legislature. Six of Colorado's 10 most populous counties, including Denver, will be affected by Coffman's decision, which cited problems with accuracy and security. The machines previously certified in Colorado were built by Premier Election Solutions, formerly known as Diebold Election Systems; Hart InterCivic; Sequoia Voting Systems; and Election Systems & Software. Only Premier had all of its equipment pass the recertification process. ES&S's Ken Fields said the decertification was based on recently imposed additional requirements, and Hart InterCivic's Peter Lichtenheld said the company plans to appeal based on how Colorado conducted the tests and maintained its machines. In his announcement decertifying the machines, Coffman said Colorado's actions will have national repercussions, and that the federal certification process is inadequate.


Intel's Ultrasmall Flash Hard Drive
Technology Review (12/18/07) Greene, Kate

Intel has announced one of the smallest flash-memory drives yet, a chip that could help give handheld devices the power of desktop computers. The chip will compete with similar chips from Samsung, which are used in gadgets such as Apple's iPod and iPhone, but Intel's chip comes with a built-in standard electronics controller, which makes it easy and inexpensive to combine multiple chips into a single, high-capacity drive. Since being introduced in the late 1990s, flash memory has revolutionized consumer electronics because flash-memory chips are smaller, more durable, and more energy efficient than magnetic hard disks, making them an ideal replacement for hard drives in handheld devices such as MP3 players, mobile phones, and even some high-end laptops. Intel's new chip, the Z-P140, is about the size of a thumbnail, weighs less than a drop of water, and is available in 2 GB and 4 GB versions. The built-in electronics controller allows the Z-P140 to be combined with up to three other Intel chips that do not have controllers, allowing for a maximum of 16 GB of storage, according to Intel's Troy Winslow. Though 16 GB is nowhere near the 160 GB hard drives in desktops, Don Larson, marketing manager of NAND products at Intel, says 2 GB is enough to run some operating systems, such as Linux, as well as some software applications. Larson expects to be able to fit 64 GB of storage into a piece of silicon about the size of the Z-P140 by 2010. Researchers at Intel and other companies are already looking for the next solid-state technology to replace flash, including phase-change memory, which stores data by using heat to change the crystal structure of a material and is extremely rugged, compact, and able to be written several thousand times faster than flash.


Energy Usage Benchmark May Help IT Buyers Find Greener Servers
Computerworld (12/17/07) Thibodeau, Patrick

Standard Performance Evaluation Corp. (SPEC), a nonprofit organization that develops performance benchmarks, has released a test suite that will enable system buyers to compare products on the basis of energy efficiency. The authors of the benchmark say the test reflects a change in thinking about hardware buying priorities by many corporate users. "The result is that in a growing number of cases, 'satisfactory performance at lower energy cost' may be a better business answer than 'always the best performance possible,'" the authors write in a report on the benchmark. By delivering the benchmark now, the IT industry is a step ahead of the EPA, which is developing an Energy Star rating for servers as part of an effort to reduce data center power consumption. The SPEC and EPA ratings are not necessarily competing efforts, as the federal agency and IT vendors have been cooperating on data center energy usage since last year. Klaus-Dieter Lange, a Hewlett-Packard senior performance engineer and head of the SPEC subcommittee that developed the benchmark, says he expects the EPA Energy Star rating will be similar to SPEC's benchmark, and that HP is already using the benchmark on its servers. Other vendors involved in the benchmark's development include Dell, Fujitsu Siemens Computers, IBM, and Sun Microsystems. The benchmark uses a Java-based application workload to "exercise" a server's CPU, caches, memory, and other system components, measuring power consumption at utilization rates ranging from idle to 100 percent of system resources. Developers of the benchmark acknowledge that it will have to be expanded to include other types of workloads, such as database activities, but emphasize that it is the first server energy rating of any type.
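For illustration only, the sketch below shows how throughput and power readings taken at several load levels might be rolled up into a single performance-per-watt figure. The load points and numbers are hypothetical and this is not SPEC's published formula, just a minimal Python example of the general idea.

    # Illustrative only: combine throughput and power readings taken at several
    # load levels into a single performance-per-watt figure. The load points and
    # numbers below are hypothetical, not SPEC's published results or formula.
    measurements = [
        # (load level, throughput in operations/sec, average power in watts)
        (1.00, 280_000, 260.0),
        (0.80, 224_000, 235.0),
        (0.60, 168_000, 210.0),
        (0.40, 112_000, 185.0),
        (0.20,  56_000, 160.0),
        (0.00,       0, 140.0),   # "active idle" still draws power
    ]
    total_ops = sum(ops for _, ops, _ in measurements)
    total_watts = sum(watts for _, _, watts in measurements)
    print(f"overall ops/watt: {total_ops / total_watts:.1f}")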


Dec. 18, 1987: Perl Simplifies the Labyrinth that Is Programming Language
Wired (12/07) Long, Tony

In 1987, the first version of the Perl programming language was released. Larry Wall, the creator of Perl, borrowed from existing languages, particularly C, to create a general-purpose language intended to simplify text manipulation. Constant upgrades have made Perl a widespread language that is used for many purposes, including all aspects of Web development, system administration, and networking. Perl quickly went through numerous upgrades, with less than seven years passing between Perl 1.0 and Perl 5.0, and since then Perl 5 has been continuously modified, with additional features keeping Perl one of the most widely used programming languages. Wall designed Perl to reflect the realities of modern-day computer programming: as the cost of hardware dropped and computers became a more common tool, the cost of programmers rose sharply. Perl's relative simplicity and flexibility helped organizations get the most out of their highly paid programmers.


Computing in a Parallel Universe
American Scientist (12/07) Vol. 95, No. 6, P. 476; Hayes, Brian

Multicore chips that facilitate parallel processing will require a major rethinking of program design, writes Brian Hayes. Software for parallel processors is vulnerable to subtle bugs that cannot manifest themselves in strictly sequential programs. Writing correct concurrent programs is possible, but a key challenge is that running the same program on the same inputs can produce different results depending on the precise timing of events. One approach to this problem is to have the operating system manage the allocation of tasks to processors and balance the workload, which is currently the chief strategy with time-sliced multiprocessing and dual-core chips. Another is to assign this responsibility to the compiler; like the first strategy, this leaves concurrency as the job of a small group of expert programmers. But making the most of parallel computing requires all programmers to deal with the problems of creating programs that run efficiently and correctly on multicore systems. "We have a historic opportunity to clean out the closet of computer science, to throw away all those dusty old sorting algorithms and the design patterns that no longer fit," Hayes concludes. "We get to make a fresh start."
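A minimal Python illustration of the timing-dependent behavior described above: several threads update a shared counter without synchronization, so the final value is usually wrong and changes from run to run. The explicit read-modify-write split is there to make the race easy to see; adding a lock around the three steps restores determinism.

    import threading

    counter = 0

    def worker(iterations):
        global counter
        for _ in range(iterations):
            value = counter      # read
            value += 1           # modify
            counter = value      # write; another thread may have updated in between

    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(counter)  # often less than 400000, and different on every run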


Getting a Grip: Building the Ultimate Robotic Hand
Wired (12/07) Vol. 15, No. 12, Mone, Gregory

Enabling robots to handle physical objects means imbuing them with the "hand-eye" coordination needed to recognize targets, guide their appendages toward them, and finely manipulate the objects. Such robots must be designed to learn from the errors they make, and this is the goal of a number of roboticists building machines that are motivated to explore, fail, and learn through tactile manipulation, much as a human infant does. The latest robot developed by the Stanford AI Robot project, Stair 2.0, sports a more advanced hand than its Stair 1.0 predecessor along with algorithms that allow the machine to learn without human intervention, recording unsuccessful attempts to manipulate objects so it will not repeat those actions. The University of Massachusetts Amherst's UMan robot features an algorithm that lets the machine work out, through experimentation, how to operate its hand to manipulate objects it does not recognize, overlaying its internal perception of the object with a series of points. The machine measures changes in the distances between those points as it gets a feel for the target and deduces how to manipulate it. Meanwhile, the University of Genoa's Laboratory for Integrated Advanced Robotics has created a humanoid, five-fingered robot that is programmed to learn to manipulate objects by studying and mimicking humans performing the same actions, using mirror neurons as a template for the device's cognitive architecture. Areas with a demonstrated need for such machines include elder care.
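As a rough illustration of the point-tracking idea, a simplified sketch follows; it is not UMan's actual algorithm, and the point names and coordinates are invented. It compares pairwise distances between tracked points before and after an object is nudged: pairs whose separation changes hint at an articulation between parts.

    import itertools
    import math

    def pairwise_distances(points):
        # Distance between every pair of tracked points, keyed by point ids.
        return {
            (i, j): math.dist(points[i], points[j])
            for i, j in itertools.combinations(sorted(points), 2)
        }

    def changed_pairs(before, after, tolerance=1e-3):
        # Pairs whose separation changed after the nudge hint at an
        # articulation (e.g., a hinge) between the parts carrying those points.
        d0, d1 = pairwise_distances(before), pairwise_distances(after)
        return [pair for pair in d0 if abs(d0[pair] - d1[pair]) > tolerance]

    # Hypothetical tracked points before and after the robot nudges an object;
    # point "c" sits on a part that swings relative to "a" and "b".
    before = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (2.0, 0.0)}
    after = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (1.7, 0.7)}
    print(changed_pairs(before, after))  # [('a', 'c'), ('b', 'c')]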


WearIT@work: Toward Real-World Industrial Wearable Computing
IEEE Pervasive Computing (12/07) Vol. 6, No. 4, P. 8; Lukowicz, Paul; Timm-Giel, Andreas; Lawo, Michael

The deployment of real-life industrial wearable technology is the goal of the EU-funded wearIT@work project, and the consortium behind the initiative represents the biggest civilian wearable-computing effort worldwide. The project is organized around a quartet of pilot applications--car production, emergency response, health care, and aircraft maintenance--that drive the work in a bottom-up, user-centered approach. The wearIT@work project focuses on context and sensing as enabling technologies, and one objective is to track the activities of maintenance and production workers via sensors worn on the body so that customized information can be generated, the correctness of workers' performance can be confirmed, and trainees' progress can be evaluated. The project emphasizes providing professional, industrial-grade development environments for wearable applications, which involves the development of the Open Wearable Computing framework, designed to support simple, hardware-independent wearable application development. Information for anyone interested in wearable system development will be available through a single access point, the Open Wearable Computing Group. Thus far, the most substantial finding of the wearIT@work project is its exposure of the heterogeneity of wearable applications. Among the technologies researchers have experimented with for the aircraft maintenance pilot is a next-generation vest equipped with a computing system and a gesture interface comprising a radio frequency identification reader, a battery, an acceleration sensor, a microcontroller board, and a Bluetooth interface. The emergency response pilot entails establishing three key functionalities--communication, sensing, and navigation--which could be supported by technologies that include miniature wireless sensor nodes, see-through head-mounted displays, pedestrian dead reckoning, and mobile outside beacons.


Malware Flood Driving New AV
InfoWorld (12/14/07) Hines, Matt

Symantec security experts watched as customers participating in a research project downloaded approximately 65,000 new applications during a week-long period in November 2007. The experts analyzed the software and identified as many as 60 percent of the applications as malicious. The statistics illustrate a worrying trend--malicious applications are outstripping legitimate programs on the Web--which may compel Symantec to alter its strategy for fending off threats. Malware authors use fuzzing tools to find gaps in popular applications such as Web browsers, and then check their attacks against anti-virus products to ensure their efficacy. As a result, "most new malware is going undetected by commercial security products," explains Carey Nachenberg of Symantec. Moreover, malware authors are increasingly using server-side polymorphism, which hooks many victims by "producing a copy for as few as two or three people and then re-writing it; so, if we get one version we can remove it from a few computers, but not all the variants," says Nachenberg. Accordingly, standard countermeasures will have to be supplemented with new strategies. One new tactic involves using distributed data collection capabilities to study usage patterns of various applications. Such patterns could help security vendors distinguish malware from valid software, and would allow vendors to suggest that individuals avoid questionable programs. However, if the number of new malware programs continues to exceed the production of legitimate programs, anti-virus vendors may have to adopt a white-listing approach that spots good applications rather than attempting to track all the bad ones.
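A minimal sketch of what white-list style triage might look like, assuming a hash database of known-good applications and a prevalence count supplied by distributed data collection; the digest, threshold, and function names below are hypothetical and are not drawn from any vendor's product.

    import hashlib

    # Hypothetical database of SHA-256 digests for known-good applications.
    KNOWN_GOOD = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def classify(path, prevalence):
        # Trust files on the white list; treat rare, unknown binaries that have
        # been seen on only a handful of machines as candidates for analysis.
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest in KNOWN_GOOD:
            return "allow"
        if prevalence < 5:  # threshold chosen arbitrarily for illustration
            return "quarantine for analysis"
        return "monitor"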


The Incredible Shrinking Computer Chip
Scientific American (12/07) Greenemeier, Larry

ASM International says its atomic layer deposition technology will allow chipmakers to build microprocessors with gates as little as 45 nanometers apart, and even at distances of 32 and 22 nanometers. The development would enable chipmakers to meet the demand for more powerful processors as the technology continues to shrink. Last week, ASM America, a subsidiary of the Netherlands-based semiconductor equipment maker, said technology and processes for making 45-nanometer chips in large volumes are now in production. They include hafnium oxide, which would replace silicon dioxide as the insulating material to help prevent leakage and improve the control of the current through transistors; and lanthanum oxide, which would be placed between the hafnium oxide and the actual metal gate on each transistor to help control the interface between the metal gate and the hafnium dielectric. Meanwhile, researchers at Clemson University are using a hafnium oxide gate dielectric to lower microprocessor heat generation and boost data transmission rates to more than 5 GHz. "We should have machines running at these speeds in two to three years," says Rajendra Singh, director of Clemson's Center for Silicon Nanoelectronics.


UMaine Widens Research Access
Morning Sentinel (ME) (12/17/07) Crowell, Alan

The University of Maine will use a $200,000 grant from the National Science Foundation to purchase a new supercomputer and another $300,000 grant to improve the transfer of massive data files. Also, the university plans to make the world-class research tools available to the state's research firms and schools. For example, large research companies will be able to boost their computing power and quickly move enormous amounts of data between their facilities and the university, while small institutions will be able to use Maine's computers rather than purchase their own technology. Opening up the computing resources will give the state a competitive advantage. Maine officials are even more excited about making the technology widely available to schoolchildren. Next year, university officials plan to work with about 20 teachers to introduce programs that scientists use to predict climate changes. They say the modeling technology will give students a better understanding of science than their schoolbooks, and could ultimately encourage them to pursue careers in math or science.


Chip-Shrinking May Be Nearing Its Limits
Associated Press (12/16/07) Robertson, Jordan

Sixty years after the invention of the transistor, and nearly five decades after transistors were first integrated into silicon chips, they are approaching their physical limits in size and soon will not be able to be made any smaller. "Things are changing much faster now, in this current period, than they did for many decades," says Intel CTO Justin Rattner. "The pace of change is accelerating because we're approaching a number of different physical limits at the same time. We're really working overtime to make sure we can continue to follow Moore's Law." Earlier this year, IBM and Intel both announced a new way to boost transistor efficiency that involves replacing the silicon dioxide insulator and introducing new metals into the gate and the gate dielectric, which could improve transistor performance and conserve energy. Other possible solutions include some "highly speculative" alternative technologies, including quantum computing and optical switches. Intel predicts that such innovations will be necessary to continue Moore's Law beyond 2020. While modern transistors may be reaching their limits, no one is predicting that the advancement of technology will slow down or stop. "The only thing that's been predicted more frequently than Moore's Law has been its demise--everybody's been wrong," says Sun Microsystems CTO Greg Papadopoulos.
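A back-of-the-envelope sketch of why the 2020 timeframe keeps coming up: shrinking the process node by the historical factor of roughly 0.7 per two-year generation brings feature sizes down toward the width of a few silicon atoms within about a decade and a half. The dates and the 1 nm cutoff below are illustrative assumptions, not predictions from the article.

    # Shrink the process node by the historical ~0.7x per generation (roughly
    # one generation every two years) until features approach the width of a
    # few silicon atoms. Dates and the 1 nm cutoff are illustrative only.
    node_nm, year = 45.0, 2007
    while node_nm > 1.0:
        node_nm *= 0.7
        year += 2
        print(f"{year}: ~{node_nm:.1f} nm")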


Neuronal Circuits Able to Rewire on the Fly to Sharpen Senses
EurekAlert (12/16/07)

Researchers from the Center for the Neural Basis of Cognition (CNBC), a joint project of Carnegie Mellon University and the University of Pittsburgh, have created an algorithm for the mechanism called "dynamic connectivity," which helps explain how scents are picked up. In describing dynamic connectivity for the first time, the researchers also used a computer model to show that enhancing the sharpness of the stimuli, independent of the spatial patterns of the active neurons, makes it easier to differentiate between stimuli. Connections made by neurons in the olfactory bulb change dynamically as a result of the specific patterns of stimuli, according to the researchers. "If you think of the brain like a computer, then the connections between neurons are like the software that the brain is running," says Carnegie Mellon professor Nathan Urban. "Our work shows that this biological software is changed rapidly as a function of the kind of input that the system receives." The January 2008 issue of Nature Neuroscience includes a paper that addresses the process.
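As a textbook-style illustration of how sharpening a stimulus representation can aid discrimination, the sketch below applies simple lateral inhibition: each unit's output is its input minus a fraction of its neighbors' activity, so a broad activity pattern becomes more sharply peaked and easier to tell apart from similar patterns. This is an assumed toy model for illustration, not the CNBC group's dynamic-connectivity algorithm.

    def sharpen(activity, inhibition=0.4):
        # Each unit's output is its input minus a fraction of its immediate
        # neighbors' activity, clipped at zero, so broad patterns become peaked.
        n = len(activity)
        output = []
        for i, a in enumerate(activity):
            neighbors = activity[max(0, i - 1):i] + activity[i + 1:min(n, i + 2)]
            output.append(max(0.0, a - inhibition * sum(neighbors)))
        return output

    stimulus = [0.2, 0.5, 1.0, 0.5, 0.2]  # a broad, overlapping "odor" pattern
    print(sharpen(stimulus))              # flanks are suppressed, peak stands out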


Hackers Have Poor Nations' PCs in Their Sights
New Scientist (12/15/07) No. 2634, P. 22; Reilly, Michael

Cybersecurity remains an untamed frontier in developing countries, allowing hackers to operate and wreak havoc with near-total impunity. "All in all, you have a perfect recipe for botnet attacks in the developing world," notes Ethan Zuckerman of the Berkman Center for Internet and Society. He observes that hacker activity rises dramatically once a country achieves 10 percent to 15 percent Internet penetration. The International Telecommunication Union (ITU) is rolling out a global effort to bring the cybersecurity measures used in the developed world to developing countries, but it will be a formidable challenge. Poorer nations possess neither the funds for countermeasures nor the technical training to erect effective cyberdefenses, partly because the cost of Internet connectivity is much higher than in industrialized countries. Africa, which is already beset with economic turmoil and computer vulnerability, could become even more ripe for cyber-exploitation as cheap, streamlined computers become widely available through initiatives such as the One Laptop Per Child program. International cooperation is essential to the improvement of developing nations' cyberdefenses, says the University of Cologne's Marco Gercke. Seymour Goodman of the Georgia Institute of Technology cites the importance of organizing national computer emergency response teams (CERTs), which would analyze the type of attack and the required countermeasures while also informing ISPs, and the ITU wants to supply the expertise and training to set up CERTs in all developing countries.


Spreading the Load
Economist Technology Quarterly (12/07) Vol. 385, No. 8558, P. 19

Mining radio-telescope data for signs of extraterrestrial intelligence, designing new drugs, and modeling weather systems are just some of the projects that are enlisting the formidable computing muscle of volunteers' PCs through efforts such as the Berkeley Open Infrastructure for Network Computing, an open-source platform that coordinates over 40 initiatives. Another platform, the Berkeley Open System for Skill Aggregation, was designed with "distributed thinking" in mind. The growth of volunteer computing is also being helped along by the use of non-PC devices such as games consoles and the processors they are equipped with. In addition, friendly competition among volunteers and significant refinements to the volunteer networking software are aiding these efforts. The rapid expansion of volunteer computing's potential is supported by the doubling of processor power every 18 months or so and the concurrent growth of bandwidth accessible to ordinary Internet users. These projects recruit volunteers primarily through word of mouth, and a major challenge to such efforts is convincing researchers that volunteer computing is not just a publicity stunt, but a valid, massive, and mostly unharnessed resource. Working in the initiatives' favor is the fact that the immensity of the Internet user population virtually guarantees interest in any project.


Sex, Math and Scientific Achievement
Scientific American (11/07) Halpern, Diane F.; Benbow, Camilla P.; Geary, David C.

The reason the fields of science, mathematics, and engineering are male-dominated is multifaceted and cannot be narrowed down to any one factor. The argument that gender is a determining factor falls apart when one considers that "scientific ability" is not a single intellectual capacity, while demonstrated differences between men and women in the various talents needed for scientific achievement would not indicate immutability. Research shows that males' and females' cognitive skill performance progresses more or less equally until past grade school, when females perform better on most verbal skill assessments as well as episodic memory evaluations; boys' "visuospatial" abilities--the skills for mentally navigating and modeling the three-dimensional movement of objects--tend to be better than girls', which gives them an advantage in solving math problems that require the creation of a mental image. Boys' and girls' mathematical prowess is equal, on average, yet there are greater numbers of mathematically gifted boys because males, for reasons that remain unclear, are much more heterogeneous in their mathematical skill. Still, the relative population of mathematically gifted females has been increasing significantly, concurrent with various societal changes that have unfolded over the past several decades, including the advent of special programs and mentoring for girls. A 10-year study of more than 300 extremely gifted individuals found that those with stronger mathematical than verbal skills tended to prefer math and science courses and were interested in acquiring degrees in those areas, while kids with stronger verbal talents favored humanities courses and usually pursued educational credentials in the humanities and law. This supports the theory that gifted kids make their career choices based on what they are best at rather than on everything they are capable of learning. Another factor is gender-based discrimination in science, math, and engineering fields, which has not disappeared but has become an unconscious rather than a conscious bias.


To submit feedback about ACM TechNews, contact: [email protected]

To be removed from future issues of TechNews, please submit the email address at which you are receiving TechNews alerts, at:
http://optout.acm.org/listserv_index.cfm?ln=technews

To re-subscribe in the future, enter your email address at:
http://signup.acm.org/listserv_index.cfm?ln=technews

As an alternative, log in at myacm.acm.org with your ACM Web Account username and password, and follow the "Listservs" link to unsubscribe or to change the email where we should send future issues.


News Abstracts © 2007 Information, Inc.


© 2007 ACM, Inc. All rights reserved. ACM Privacy Policy.