Association for Computing Machinery
Welcome to the August 19, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


Carnegie Mellon, Pitt Involved in $10 Million 'Model Checking' Research
Pittsburgh Tribune-Review (PA) (08/19/09) Cronin, Mike

The National Science Foundation has awarded a $10 million grant to Carnegie Mellon University (CMU), the University of Pittsburgh, and six other institutions to develop computational techniques that could be used to save lives. CMU professor Christopher Langmead says the researchers will study extremely complex systems and identify the events that may result in mission-critical failures. The researchers will integrate the techniques of model checking and abstract interpretation, search for bugs in automobile and aerospace computer software, and examine the causes of pancreatic cancer and atrial fibrillation. Model checking explores all possible states of a system and looks for irregular or unusual conditions. However, sometimes too many states exist to find flaws, so abstract interpretation is used to simplify a system by grouping similar states together, says CMU professor and grant team leader Edmund M. Clarke, winner of ACM's 2007 A. M. Turing Award. This type of analysis is particularly important in situations where software interacts with the real world, such as software that helps ensure the safety of vehicle brakes or the success of spacecraft launches, says project participant and University of Maryland professor Rance Cleaveland. The project also could help improve the diagnosis of some diseases because physicians could group cell proteins that perform similar functions to identify cancer cells that behave differently or stop growing. "Using a computational method, we hope to provide an answer to what's happening at the very early stages," says Langmead. Other schools involved in the effort include Cornell University, the State University of New York at Stony Brook, New York University, and the City University of New York.
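The contrast between exhaustive model checking and abstract interpretation can be sketched in a few lines. The following is a hypothetical illustration, not the grant team's software: a breadth-first search over a toy system's states, run once concretely and once on an abstraction that groups states together (here, by parity).

```python
from collections import deque

def model_check(initial, successors, is_bad):
    """Exhaustively explore reachable states; return a bad state if found.

    initial: iterable of start states
    successors: function mapping a state to its next states
    is_bad: predicate flagging an irregular or unusual condition
    """
    seen = set()
    frontier = deque(initial)
    while frontier:
        state = frontier.popleft()
        if state in seen:
            continue
        seen.add(state)
        if is_bad(state):
            return state          # counterexample found
        frontier.extend(successors(state))
    return None                   # property holds on all reachable states

# Toy system: a counter that wraps at 8; "bad" if it ever reaches 5.
print(model_check([0], lambda s: [(s + 1) % 8], lambda s: s == 5))  # 5

# Abstract interpretation groups similar states: track only parity.
# The abstract system has 2 states instead of 8, so exploration is
# cheaper, at the cost of precision (parity cannot distinguish 5 from 3).
print(model_check([0], lambda s: [(s + 1) % 2], lambda s: s == 1))  # 1
```

The trade-off in the second call is the one Clarke describes: grouping states shrinks the search space enough to make checking feasible, but an abstract alarm must still be checked against the concrete system to rule out false positives.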


Computer Scientists Scale 'Layer 2' Data Center Networks to 100,000 Ports and Beyond
UCSD News (08/18/09) Kane, Daniel

University of California, San Diego (UCSD) researchers have developed software designed to enable data centers to function as single, scalable, plug-and-play networks. The PortLand software system is a fault-tolerant, layer 2 data center network fabric capable of scaling to more than 100,000 nodes. The researchers say the software could help support large-scale data center networks by increasing inherent scalability, providing baseline support for virtual machines and migration, and significantly reducing administrative costs. PortLand also removes the reliance on a single spanning tree and can natively leverage multipath routing and improve fault tolerance. "With PortLand, we came up with a set of algorithms and protocols that combine the best of layer 2 and layer 3 network fabrics," says UCSD professor Amin Vahdat. "Ideally, we would like to have the flexibility to run any application on any server while minimizing the amount of required network configuration and state." One of PortLand's key innovations is its location discovery protocol, which allows for the possibility of a scalable layer 2 network. Switches automatically learn their location within the data center topology without human assistance, and can then assign Pseudo MAC addresses to each of the servers they connect to. "The goal is to create a network fabric that allows you to buy any server or switch, plug it in, and have it just work," says UCSD computer science graduate student Radhika Niranjan Mysore. The PortLand system was presented at ACM's SIGCOMM 2009 conference, which takes place August 17-21 in Barcelona, Spain.
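The location-encoding idea behind Pseudo MAC addresses can be illustrated with a short sketch. The field layout below (pod, position within the pod, switch port, virtual machine ID) is a plausible packing into a 48-bit address; the function name and field widths are assumptions for illustration, not PortLand's actual implementation.

```python
def pmac(pod, position, port, vmid):
    """Encode a server's topology location as a 48-bit Pseudo MAC string.

    Hypothetical field widths: pod (16 bits), position (8), port (8),
    vmid (16). Because the address encodes location, switches can forward
    on it hierarchically instead of keeping per-host flat tables.
    """
    assert pod < 2**16 and position < 2**8 and port < 2**8 and vmid < 2**16
    value = (pod << 32) | (position << 24) | (port << 16) | vmid
    return ":".join(f"{(value >> shift) & 0xFF:02x}" for shift in range(40, -8, -8))

# A switch that has learned it sits in pod 2, position 1 assigns the
# first VM on its port 3 a location-encoding address:
print(pmac(pod=2, position=1, port=3, vmid=1))  # 00:02:01:03:00:01
```

The point of the encoding is that forwarding state scales with the topology, not the host count: a core switch only needs to match the pod prefix, which is what lets the fabric grow past 100,000 ports.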


Wi-Fi via White Spaces
Technology Review (08/18/09) Naone, Erica

The transition from analog to digital broadcasts has opened up radio spectrum that could be used to deliver long-range, low-cost wireless Internet service using white spaces, which are empty fragments of the spectrum scattered between used frequencies. White space frequencies could be used to provide broadband Internet access in rural areas and fill in gaps in city Wi-Fi networks. For example, Microsoft Research's Ranveer Chandra says white space frequencies could be used to allow people to connect to their home network from up to a mile away. Last November, the Federal Communications Commission (FCC) ruled that companies could build devices that transmit over white spaces, but also required that those devices should not interfere with existing broadcasts. Microsoft researchers have designed a series of protocols, called White Fi, to account for the restrictions involved in using white spaces. Chandra says wireless networking has traditionally used an open spectrum with all users being equal shareholders, but in white spaces some users are primary users. Chandra says his research team recently received an experimental license from the FCC allowing them to build a prototype White Fi system on the Microsoft Research campus. The researchers will send their findings to the FCC in the hope that the data will help establish future white-space regulations. The blueprints for a computer network that uses white spaces were presented at ACM's SIGCOMM 2009 conference, which takes place August 17-21 in Barcelona, Spain.
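The core constraint the FCC imposed, that secondary devices must steer around primary users, can be sketched as a simple channel filter. Everything below (the function, the adjacent-channel guard band) is a hypothetical simplification; White Fi's actual protocols also handle variable channel widths and ongoing sensing.

```python
def free_channels(total_channels, occupied):
    """Return channels safe for secondary (white-space) use.

    Primary users (e.g. TV broadcasts, wireless microphones) own their
    channel outright, so a white-space device must skip them -- here we
    also skip adjacent channels as a conservative guard band.
    """
    blocked = set()
    for ch in occupied:
        blocked.update((ch - 1, ch, ch + 1))
    return [ch for ch in range(total_channels) if ch not in blocked]

# With TV broadcasts on channels 3 and 7, a device may transmit on:
print(free_channels(10, occupied={3, 7}))  # [0, 1, 5, 9]
```

This asymmetry is what Chandra means by some users being primary: unlike conventional Wi-Fi, where all devices contend as equals, a white-space device must vacate a channel the moment a primary user appears on it.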


Home, James--Public Transport Gets Personal
ICT Results (08/19/09)

The CyberCars2 project has developed technology that could lead to highly efficient, unmanned public transportation systems. "At the moment, most public transport needs drivers to control the vehicles, and that makes it suitable only for mass transportation," says CyberCars2 coordinator Michel Parent, director of the R&D program for automated transportation at the French National Institute for Research in Computer Science and Control. "But the more you can automate vehicles and make them work on the existing infrastructure ... then personal, rapid transit becomes feasible." Parent says automation offers a major improvement in efficiency for public transportation and could be an ideal solution for congested city centers. CyberCars2 is developing various technologies that will make road-based automated transportation systems a reality. Parent says the project's main objective was to operate and coordinate several different vehicles at a high throughput. The project developed a routing layer so vehicles can communicate even when they cannot "see" each other, as well as routing protocols capable of performing multi-hop data exchanges between two vehicles. After creating communications channels, the project focused on control software to allow cars to cooperate. The objective was for several different cybercars, using a variety of sensing and control technologies, to approach one another safely and avoid collisions. Using simulations of intersections and merges, the project developed rules for how vehicles must negotiate with one another in close proximity. The project succeeded in having six cybercars and three unmanned buggies navigate a figure-eight track in a real-life test, with the vehicles successfully navigating the circuit, communicating with each other, slowing down at the four-way crossover, and safely navigating the intersection.
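The article does not detail the negotiation rules, but a minimal stand-in conveys the idea: vehicles approaching a shared conflict zone are granted access one at a time, earliest arrival first. The function and its first-come-first-served policy are hypothetical, not CyberCars2's actual protocol.

```python
def crossing_order(vehicles):
    """Schedule vehicles through an intersection's conflict zone.

    vehicles: list of (vehicle_id, arrival_time, crossing_time) tuples.
    Returns (vehicle_id, enter_time) pairs. Only one vehicle occupies
    the zone at a time; later arrivals wait until it is clear.
    """
    schedule = []
    free_at = 0.0  # when the conflict zone next becomes empty
    for vid, arrival, crossing in sorted(vehicles, key=lambda v: v[1]):
        enter = max(arrival, free_at)   # wait if the zone is occupied
        schedule.append((vid, enter))
        free_at = enter + crossing
    return schedule

# Three cybercars converge on the four-way crossover; B and C slow
# down and wait their turn:
print(crossing_order([("A", 0.0, 2.0), ("B", 1.0, 2.0), ("C", 1.5, 2.0)]))
# [('A', 0.0), ('B', 2.0), ('C', 4.0)]
```

A real system must negotiate this schedule over the multi-hop radio links described above rather than at a central scheduler, which is why the communications layer came first in the project.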


Momentum Building on STEM Education
eSchool News (08/14/09) Devaney, Laura

Policy makers are focusing on equipping more students with global skills through science, technology, engineering, and math (STEM) education, and some education experts say momentum is building for greater attention to the technology and engineering components. A recently completed study from the National Research Council's National Academy of Engineering (NAE) characterizes the extent and nature of initiatives to teach engineering to K-12 students in the United States. The report assigns a definition to engineering because many people have little understanding of the profession, and also discusses research and evidence on the effects of engineering education on areas such as improved science and math learning and improved technological literacy, says NAE program officer Greg Pearson. "One of the findings is that discussions of STEM tend to be focused on science, sometimes math, rarely both together--usually they're siloed, and the T and especially the E are really just left out of the discussion in policy, education, and classroom practice," he says. The University of Illinois at Chicago's Donald Wink says that schools must deploy rigorous and open learning programs to enhance the effectiveness of STEM teaching, as well as secure the technology appropriate for teaching what is current and relevant in these disciplines. The National Science Foundation (NSF) estimates that although women earned more than 50 percent of all science and engineering bachelor's degrees in 2006, they earned only about 20 percent of degrees in engineering, computer science, and physics. NSF's Innovation through Institutional Integration (I3) initiative strives to connect institutions' NSF-funded STEM education projects and to exploit their collective strengths. I3 promotes greater collaboration within and among institutions and deals with important initiatives, including broader participation of underrepresented minorities in STEM fields and the merger of research and education.


Desktop Multiprocessing: Not So Fast
Computerworld (08/18/09) Wood, Lamont

The continuance of Moore's Law--the axiom that the number of devices that can be economically installed on a processor chip doubles every other year--will mainly result in a growing population of cores, but exploiting those cores requires extensive rewriting of software. "We have to reinvent computing, and get away from the fundamental premises we inherited from [John] von Neumann," says Microsoft technical fellow Burton Smith. "He assumed one instruction would be executed at a time, and we are no longer even maintaining the appearance of one instruction at a time." Although vendors offer the possibility of higher performance by adding more cores to the central processing unit, realizing that gain assumes the software is aware of those cores and will use them to run code segments in parallel. However, Amdahl's Law dictates that the overall speedup from parallelization is 1 divided by the sum of the fraction of the task that cannot be parallelized and the reduced run time of the parallelized fraction. "It says that the serial portion of a computation limits the total speedup you can get through parallelization," says Adobe Systems' Russell Williams. Consultant Jim Turley maintains that consumer operating systems overall "don't do anything very smart" with multiple cores, and he points out that the ideal tool--a compiler that takes older source code and distributes it across multiple cores--remains elusive. The public is adjusting to multicore faster than application vendors are, with hardware vendors saying that today's buyers are counting cores rather than gigahertz.
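Amdahl's Law is compact enough to state directly in code. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's Law: overall speedup when only part of a task parallelizes.

    parallel_fraction: share of the runtime that can be spread across
    all cores. The serial remainder (1 - parallel_fraction) caps the
    total speedup at 1 / (1 - parallel_fraction) no matter how many
    cores are added.
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 90% of the work parallelized, 64 cores give under 9x:
print(round(amdahl_speedup(0.9, 64), 2))  # 8.77
# ... and infinitely many cores could never exceed 10x:
print(round(1 / (1 - 0.9), 1))            # 10.0
```

This is Williams' point in numbers: the 10 percent of the task that stays serial, not the core count, is what limits the total speedup.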


International Win for Clever Dataminer
University of Waikato (08/17/09)

The first place finisher in the 2009 Student Data Mining Contest, run by the University of California, San Diego, used the Weka data-mining software to predict anomalies in e-commerce transaction data. Quan Sun, a University of Waikato computer science student, says it took about a month to find the answer. The contest drew more than 300 entries from students in North America, Europe, Asia, and Australasia. "I couldn't have done it without Weka," Sun says of the open source software that was developed at Waikato. "Weka is like the Microsoft Word of data-mining, and at least half of the competitors used it in their entries." ACM's Special Interest Group on Knowledge Discovery and Data Mining gave the Weka software its Data Mining and Knowledge Discovery Service Award in 2005. Weka has more than 1.5 million users worldwide.


Student's 'Green' Use for Online Social Networking
University of York (08/17/09) Garner, David

Derek Foster, a University of York computer science student, has created WattsUp, software that works with Facebook to track household energy consumption as part of online social profiles. WattsUp is capable of showing live and historical household energy consumption and carbon dioxide emissions of the users of the social networking Web site. The application uses the WATTSON energy monitor to record household energy consumption and carbon dioxide emissions. Foster says the WattsUp application helps improve awareness of energy consumption and has the potential to help reduce energy usage. "By using Facebook as the delivery platform, it introduces social psychology elements such as peer pressure and normative behavior between friends, which can introduce competitiveness to reduce energy usage," he says. WattsUp has been selected as a novel application finalist by the developers of the Microsoft Facebook Development Toolkit.


New Nanolaser Key to Future Optical Computers and Technologies
Purdue University News (08/17/09) Venere, Emil

Researchers at Purdue, Norfolk State, and Cornell universities have developed a new type of laser called a spaser, which they say could make possible a variety of innovations, including light-based, ultra-fast computers and advanced sensors and imaging technology. The spaser is the first of its kind to emit visible light, possibly making it a critical component for future technologies based on nanophotonic circuitry, says Purdue professor Vladimir Shalaev. Nanophotonic circuitry will require a laser-light source, but modern lasers cannot be made small enough to integrate them into electronic chips. However, the spaser uses clouds of electrons called surface plasmons instead of the photons that make up light. Nanophotonics could result in a variety of radical technologies, including hyperlenses that lead to new sensors and microscopes 10 times more powerful than those available today and are capable of seeing objects as small as DNA, or computers and consumer electronics that use light instead of electrical signals to process information. "Here, we have demonstrated the feasibility of the most critical component--the nanolaser--essential for nanophotonics to become a practical technology," Shalaev says. The researchers developed spaser-based nanolasers as spheres that measure 44 nm in diameter. The spasers contain a gold core surrounded by a glass-like shell filled with green dye. When a light is shined on the spheres, plasmons generated by the gold are amplified by the dye and are converted into photons of visible light, which is emitted as a laser. Future efforts may involve creating a spaser-based nanolaser that uses an electrical source instead of a light source, which would make spasers more practical for computer and electronics applications.


New Material for Nanoscale-Computer Chips
University of Copenhagen (08/17/09) Frandsen, Gitte

A collaborative effort involving Chinese and Dutch researchers has demonstrated that organic nanoscale wires could be used instead of silicon in computer chips. Nanochemists from the Chinese Academy of Sciences and the University of Copenhagen have developed nanoscale electric contacts from organic and inorganic nanowires, and have coupled several contacts together to create an electric circuit. The new research demonstrates how advanced devices made from organic materials could be built at the nano scale. "It is a first step toward [the] realization of future electronic circuitry based on organic materials--a possible substitute for today's silicon-based technologies," says the University of Copenhagen's Thomas Bjornholm. "This offers the possibility of making computers in different ways in the future." The researchers used organic nanowires and tin oxide nanowires to create a hybrid circuit. The device has a low operational current, high mobility, and good stability, all of which are necessary to be considered a replacement for silicon.


RFID Tags Get an Intelligence Upgrade
New Scientist (08/14/09) Kleiner, Kurt

Researchers are working on adding intelligence to radio frequency identification (RFID) tags by incorporating microcomputers, opening the tags to a variety of much smarter applications. The limited power available to RFID tags makes computation a challenge, but also could have the advantage of making computational RFID tags (CRFID) inexpensive, robust, and long-lasting. "Ten years ago we would have thought this was science fiction--doing programming without a battery," says Kevin Fu, a CRFID researcher at the University of Massachusetts Amherst. Fu and colleagues are working on CRFIDs using hardware from Intel. Intel's smarter tags have a 16-bit microcontroller capable of storing programs up to 32 kilobytes in size. The tags can store small amounts of electricity taken from a reader in a capacitor for short periods. Because power storage is limited, CRFIDs need a way to back up computations often so they do not lose all their work when the power runs out. Fu and colleagues have developed a way to enable long-running computations to occur despite continual power interruptions. Fu says CRFIDs could eventually be used to make passports or credit cards more secure or as sensors. Fu's team has tried embedding CRFIDs in concrete to report moisture content, which could give engineers early warning of structural faults.
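The checkpointing idea, saving progress to nonvolatile storage often enough that a power loss costs little rework, can be sketched as a simulation. This is a hypothetical illustration, not Fu's firmware, which runs on a 16-bit microcontroller and decides when to save based on the tag's remaining stored energy.

```python
def checkpointed_sum(numbers, storage, budget):
    """Advance a running sum, surviving power loss via checkpoints.

    storage: dict standing in for nonvolatile flash, holding the last
    saved {'index', 'total'}. budget: how many items the capacitor's
    charge allows before power dies. Returns True once the sum is done.
    Checkpointing after every item means a power loss never discards
    more than the current item's work.
    """
    i = storage.get("index", 0)
    total = storage.get("total", 0)
    for _ in range(budget):
        if i == len(numbers):
            break
        total += numbers[i]
        i += 1
        storage["index"], storage["total"] = i, total  # save to "flash"
    return i == len(numbers)

# The reader's field powers the tag for only 3 items per charge, yet
# the computation completes across repeated power interruptions:
flash = {}
power_losses = 0
while not checkpointed_sum(list(range(10)), flash, budget=3):
    power_losses += 1  # power died; the next pass resumes from flash
print(flash["total"], power_losses)  # 45 3
```

The engineering trade-off hidden in this sketch is checkpoint frequency: saving after every step maximizes safety but spends scarce harvested energy on writes, which is why the real system saves only when the voltage measurement says power is about to run out.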


Engineering Professors Awarded $1 Million Grant to Design More Robust Computer Systems
Colorado State University (08/12/09) Wilmsen, Emily

Colorado State University engineering professors Tony Maciejewski, Arnold Rosenberg, and H.J. Siegel have received more than $1 million from the National Science Foundation to design more robust and dependable computing and communications systems. "Uncertainty is the enemy of a robust computer system, but this grant will help us minimize damaging failures and work to build computer systems that perform well through crises," Maciejewski says. "As computer systems become more integrated with everyday life, it's really important that they continue to perform critical functions even when there's an unpredicted circumstance." DigitalGlobe, which supplies images to Google Maps and Microsoft Virtual Earth, and the National Center for Atmospheric Research also will be participating in the project, along with the University of Colorado at Boulder. The researchers will design models and mathematical and algorithmic tools to derive robust resource management schemes and quantify the probability of system failures. "The robustness concepts being developed have broad applicability, and will significantly contribute to meeting national needs to build and maintain robust information technology infrastructures," says DigitalGlobe's Jay Smith.


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



