Association for Computing Machinery
Welcome to the February 7, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).


The Public Eye?
University of Cambridge (02/06/14)

In 2011, the Indian government launched the Unique Identification Authority of India (UIDAI), a program to collect the iris patterns and fingerprints of all 1.2 billion of its citizens within three years. To date, more than 540 million people have enrolled in the optional program, with one million more joining every day. The program relies on algorithms developed by University of Cambridge professor John Daugman. Anonymity is a serious problem in India, as many citizens have never had any form of government identification, which makes them ineligible for government benefits. Each person who enrolls with UIDAI is issued a unique 12-digit number called an Aadhaar. The major challenge is that every new entrant must be checked against every existing entry in the database to detect and prevent the acquisition of duplicate or multiple identities, resulting in almost 500 trillion iris comparisons every day. The algorithms convert an iris pattern into a numeric code, which takes just a few milliseconds, and then measure the amount of dissimilarity between the new iris code and every other iris code. The researchers have found that an average central-processing-unit core can complete about 1 million comparisons per second.
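The core comparison in Daugman's published method is a fractional Hamming distance between fixed-length binary iris codes. The sketch below illustrates that idea only; the bit length and the decision threshold are illustrative assumptions, and `is_duplicate` is a hypothetical helper, not UIDAI's actual pipeline.

```python
N_BITS = 2048  # illustrative iris-code length

def fractional_hamming(code_a: int, code_b: int, n_bits: int = N_BITS) -> float:
    """Fraction of bits on which two iris codes disagree (0.0 = identical)."""
    diff = (code_a ^ code_b) & ((1 << n_bits) - 1)
    return bin(diff).count("1") / n_bits

def is_duplicate(new_code: int, database: list[int], threshold: float = 0.33) -> bool:
    """De-duplication check: compare the new code against every enrolled code."""
    return any(fractional_hamming(new_code, old) < threshold for old in database)
```

Because the XOR-and-count comparison is so cheap, a single core can run on the order of a million such comparisons per second, which is what makes the all-against-all duplication check feasible at this scale.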

Carnegie Mellon Model Predicts Growth, Death of Facebook and Other Membership-Based Websites
Carnegie Mellon News (PA) (02/04/14) Byron Spice

Carnegie Mellon University (CMU) researchers have developed a computational model to assess the viability of websites and social networks and predict which sites are sustainable and which are not. The model aims to replicate the dynamics of membership sites such as Facebook, LinkedIn, and TeaPartyNation, including the role of active users as catalysts of site activity, turning dormant site members into active users and keeping them active. The researchers say the model was able to reliably predict which sites will be sustainable for the foreseeable future. The model could help investors understand which sites are self-sustaining and which are likely to fail, and help website managers identify and correct problems in the dynamics of attention to their sites, according to CMU researcher Bruno Ribeiro. The model also accounts for several other factors, including the tendency of active members to become inactive, the influence that active members can have in encouraging friends to join or become active members, and the role of marketing and media campaigns in convincing people to join. "If this model is correct, social network sites will try to make your friends' lives seem more interesting and your feedback on their posts more urgent," Ribeiro says.
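The dynamics described can be caricatured in a toy discrete-time simulation. This is not Ribeiro's actual model; every parameter name and value below is an illustrative assumption, meant only to show how reactivation and recruiting trading off against attrition determines whether activity is self-sustaining.

```python
def simulate(active: float, dormant: float, steps: int,
             p_inactive: float = 0.10,    # chance an active member goes dormant per step
             p_reactivate: float = 0.002, # per-active-member pull on each dormant member
             p_recruit: float = 0.01) -> list[float]:
    """Track the active-member count over time in a toy membership-site model."""
    history = []
    for _ in range(steps):
        woken = dormant * min(1.0, p_reactivate * active)  # reactivated by active users
        lost = active * p_inactive                          # attrition to dormancy
        joined = active * p_recruit                         # word-of-mouth recruiting
        active = active - lost + woken + joined
        dormant = dormant - woken + lost
        history.append(active)
    return history
```

With these default rates attrition dominates and activity decays; raising the recruiting rate flips the site into self-sustaining growth, which is the qualitative distinction the CMU model is designed to detect.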

Cryptography Breakthrough Could Make Software Unhackable
Wired News (02/03/14) Erica Klarreich

University of California, Los Angeles professor Amit Sahai and his colleagues have created an "indistinguishability" program obfuscator that could address many challenges that have existed in cryptography for the past 40 years. Obfuscators conceal the inner functioning of a program while still allowing use, which could protect software patches, chips that read encrypted DVDs, and software controlling military drones. In the future, obfuscators could potentially enable people to use autonomous virtual agents to act on their behalf. Many computer scientists, including Sahai, at one time believed obfuscation to be impossible, but Sahai and his colleagues last year published two papers demonstrating a protocol and describing its applicability to cryptography. "This is the first serious positive result" in discovering a universal obfuscator, says Microsoft Research's Boaz Barak. "The cryptography community is very excited." The indistinguishability obfuscator can be used to establish public key encryption, digital signatures, and other fundamental cryptographic protocols, including functional encryption and deniable encryption. The method mixes carefully selected random elements into each piece of the program so that if the resulting program is run in the intended way, the randomness is negated and the output is correct. Although the team believes the obfuscation scheme is unbreakable, its mathematical approach must be tested over time and the concept must be refined to enable commercial applications.

Indiana University Researchers Help Develop State-of-the-Art Cybersecurity Resource
IUB Newsroom (02/04/14) James Boyd

Indiana University researchers have developed the Software Assurance Marketplace (SWAMP), a tool designed to help software developers close security holes in their products. The SWAMP, which is supported by a $23.4 million U.S. Department of Homeland Security (DHS) Science and Technology Directorate grant, provides a state-of-the-art facility that serves as an open resource for software developers, software assurance tool developers, and software researchers who want to collaborate and improve software assurance activities in a safe, secure environment. The SWAMP provides a suite of assurance tools and software packages that identify vulnerabilities and reduce false positives. "The magnitude of our national software assurance problem requires a comprehensive approach backed by a powerful facility that addresses all dimensions of the problem--integrated education, better tools, and wider adoption," says SWAMP director Miron Livny. The SWAMP can assess Java, C, and C++ software against five static analysis tools. "We see widespread adoption of the SWAMP as having a profound, positive impact on software systems and applications that powers our critical infrastructure," says DHS software assurance program manager Kevin Greene. "The SWAMP collaboration is a great example of the public and private sector coming together to advance improvements in software assurance activities to deal with emerging cyber threats."

Robots With Insect Brains
Free University of Berlin (02/03/14)

Researchers at the Free University of Berlin (FU Berlin) have developed a robot with an artificial mini-brain that can perceive environmental stimuli and learn to react to them. The researchers say the artificial brain is able to learn according to simple principles. "The network-controlled robot is able to link certain external stimuli with behavioral rules," says FU Berlin professor Martin Paul Nawrot. "Much like honeybees learn to associate certain flower colors with tasty nectar, the robot learns to approach certain colored objects and to avoid others." He notes the relatively simple nervous system of honeybees is the model for the working principles. Components of the small robotic vehicle include a camera for receiving and projecting visual information, software that replicates in a simplified way the sensorimotor network of the insect brain, and a neural network that operates the motors of the robot wheels and can control its motion direction. The team plans to introduce more learning principles, which will make the mini-brain more powerful and the robot more autonomous.
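The honeybee-style association described above, linking a color stimulus to approach or avoidance through reward, can be sketched with a simple reward-modulated update rule. This is not FU Berlin's actual network; the class, rates, and update rule are illustrative assumptions.

```python
class ColorLearner:
    """Toy associative learner: one approach/avoid weight per color stimulus."""

    def __init__(self, learning_rate: float = 0.5):
        self.lr = learning_rate
        self.weights: dict[str, float] = {}  # color -> approach tendency

    def experience(self, color: str, reward: float) -> None:
        """Nudge the association toward the reward (positive = nectar-like)."""
        w = self.weights.get(color, 0.0)
        self.weights[color] = w + self.lr * (reward - w)

    def action(self, color: str) -> str:
        return "approach" if self.weights.get(color, 0.0) > 0 else "avoid"
```

After a few rewarded encounters with one color and punished encounters with another, the learner approaches the first and avoids the second, mirroring the behavior the researchers describe.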

Researchers Develop 'Envy-Free' Algorithm for Settling Disputes From Divorce to Inheritance
NYU News (02/03/14) James Devitt

Three New York University researchers have developed a pair of algorithms based on principles of fairness that can help resolve conflicts over assigning belongings in a divorce or inheritance. In the first algorithm, two people make their choices independently, either simultaneously or in sequence, starting with their most preferred item and progressively descending to the least desired belongings that have not already been allocated. In the second algorithm, the participants submit their complete preference rankings in advance to an arbitrator or referee. The researchers say this algorithm is "envy free" because each participant prefers each of their items to a corresponding item of the other party. Potential conflicts can still arise, for example when both players covet the same items at the same time, but the algorithm produces an efficient allocation based on each party preferring its own pairs of items to the other's and on the absence of an alternative allocation the participants would prefer.
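The sequential-choice procedure in the first algorithm can be sketched as alternating picks from two preference rankings. This shows the basic mechanism only; the NYU algorithms include envy-resolution steps for contested items that are not reproduced here, and the function name is a hypothetical label.

```python
def sequential_picks(prefs_a: list[str], prefs_b: list[str]) -> tuple[list[str], list[str]]:
    """Two parties alternate picks; prefs_* rank the same items, most-preferred first."""
    taken: set[str] = set()
    items_a: list[str] = []
    items_b: list[str] = []
    turn_a = True
    while len(taken) < len(prefs_a):
        prefs = prefs_a if turn_a else prefs_b
        pick = next(item for item in prefs if item not in taken)  # best remaining item
        (items_a if turn_a else items_b).append(pick)
        taken.add(pick)
        turn_a = not turn_a
    return items_a, items_b
```

When the parties' top choices differ, each side gets its favorites and neither envies the other's haul; when top choices clash, which picker moves first matters, and that is exactly the kind of conflict the full algorithms are designed to arbitrate.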

WPI Researchers Tackle the Challenges of Putting Robots on the Shop Floor and in Our Homes
Worcester Polytechnic Institute (02/03/14) Eileen Brangan Mell

Worcester Polytechnic Institute (WPI) researchers are studying the challenges involved with robots working alongside people in settings such as manufacturing plants and the homes of the elderly. The researchers are developing algorithms that enable robots to collaborate with people on manufacturing operations, and allow common users to teach robots how to complete everyday tasks. By helping to make smaller manufacturers more competitive, the researchers hope to play a role in revitalizing American manufacturing, according to WPI professor Dmitry Berenson. "A robot needs to understand that process and be able to look at the state of things--where the parts are and what has already been done--and anticipate what the person will do next, so it can do something that helps and doesn't interfere," says Massachusetts Institute of Technology professor Julie Shah. In laboratory studies, volunteers will carry out a variety of assembly tasks as their actions are captured with cameras and laser scanners. The recordings will be used to create a library of human task behaviors that can be integrated with algorithms that will enable the robots to better understand how humans complete tasks. "Task planning and motion planning have not really been brought together in this way before," Berenson says.

Quantum Dots Provide Complete Control of Photons
Linkoping University (01/31/14)

Researchers at Linkoping University say they have used quantum dots to achieve light polarization that is strong and easy to control. Their method involves asymmetrical quantum dots of a nitride material with indium formed at the top of microscopic six-sided pyramids. The team created light with a high degree of polarization, on average 84 percent. Polarized light, in which all light waves oscillate on the same plane, is the foundation for technology such as liquid-crystal displays (LCD) in computers and TVs, and advanced quantum encryption. However, there are efficiency issues with the process of generating polarized light. "Our theoretical calculations point to the fact that an increased amount of indium in the quantum dots further improves the degree of polarization," says Linkoping's Fredrik Karlsson. He says the research could lead to more energy-effective polarized light-emitting diodes in the light source for LCD screens. The results also are very promising for the use of quantum encryption for wiretap-proof communications.

Edison Electrifies Scientific Computing
Berkeley Lab News Center (01/31/14) Margie Wylie

The U.S. National Energy Research Scientific Computing (NERSC) Center, a division of the Lawrence Berkeley National Laboratory, recently accepted Edison, a new flagship supercomputer designed for scientific productivity. "We support a very broad range of science, from basic energy research to climate science, from biosciences to discovering new materials, exploring high energy physics and even uncovering the very origins of the universe," says NERSC director Sudip Dosanjh. Edison can execute nearly 2.4 quadrillion floating-point operations per second at peak theoretical speeds. However, Dosanjh says what is really important is the scientific productivity of the users, which is why Edison was configured to handle data analysis and simulation and modeling equally well. He notes that both types of computing rely heavily on moving data. "So Edison has been optimized for that: it has a really high-speed interconnect, it has lots of memory bandwidth, lots of memory per node, and it has very high input/output speeds to the file system and disk system," Dosanjh says. Since Edison does not utilize accelerators, researchers have been able to move their codes from NERSC's old system to Edison with few or no changes.

The Promise of 'Big Data'
Harvard Gazette (01/31/14) Caroline Perry

A recent symposium called "Weathering the Data Storm: The Promise and Challenges of Data Science," hosted by the Institute for Applied Computational Science at the Harvard School of Engineering and Applied Sciences (SEAS), addressed the potential and challenges of big data. Academic and business leaders discussed the applications of big data to real-world problems. For example, IBM’s T.J. Watson Research Lab worked with UNICEF on a system to sort text messages reporting famines, floods, Ebola outbreaks, evictions, and dried-up water sources. Because UNICEF lacks enough workers to read all of the text messages it receives, the system helps prioritize critical information to direct resources. The system parses spelling errors, uses common word associations to understand synonyms, and uses conditional probability techniques to determine the most urgent messages. As computing power increases and organizations increasingly understand the benefits of big data, new challenges are emerging, including privacy and security issues as well as a shortage of big data software and data scientists. SEAS is working to educate not only computer science majors but all students in data science, and SEAS dean Cherry A. Murray says other universities have an obligation to do the same. Data science "cannot become its own narrow discipline, but will need to be intrinsically transdisciplinary," says Murray, noting the impact of ubiquitous computing and data on everyday life.
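The triage pipeline described, correcting spellings, collapsing synonyms, and ranking messages by conditional probability of urgency, can be caricatured in a few lines. This is a toy sketch, not IBM's system; the dictionaries and probabilities are invented for illustration.

```python
# Illustrative lookup tables (all entries are made-up assumptions)
SPELLING = {"flod": "flood", "famin": "famine"}            # misspelling -> word
SYNONYMS = {"deluge": "flood", "starvation": "famine"}     # synonym -> canonical term
P_URGENT = {"flood": 0.9, "famine": 0.95, "eviction": 0.7, "hello": 0.05}

def urgency(message: str) -> float:
    """Score a message by the most urgent recognized word it contains."""
    score = 0.0
    for raw in message.lower().split():
        word = SPELLING.get(raw, raw)      # normalize spelling errors
        word = SYNONYMS.get(word, word)    # map synonyms to one term
        score = max(score, P_URGENT.get(word, 0.0))
    return score

def triage(messages: list[str]) -> list[str]:
    """Most urgent messages first, so scarce workers read critical reports first."""
    return sorted(messages, key=urgency, reverse=True)
```

Even this crude ranking captures the point of the UNICEF collaboration: when the message volume exceeds what staff can read, an automatic ordering puts the famine and flood reports ahead of the chatter.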

SequenceL Language Takes the Pain Out of Multicore Coding
InfoWorld (01/31/14) Paul Krill

Researchers at Texas Tech University and the U.S. National Aeronautics and Space Administration have developed SequenceL, a declarative, functional language that is geared to multicore programming. SequenceL also provides automatic parallelization, and the compiler outputs C++ code. The researchers chose C++ because it gave them the ability to run on any platform and leverage all of the available C++ tools and interoperate with C++, so that rewriting an entire program was unnecessary, says Texas Multicore Technologies' Doug Norton. "You could just take a piece of it, rewrite that in SequenceL, and run it through the compiler, and now you have massively parallel C++ for that portion of the code, which you can link right in with the rest of your C++ [code]," he says. "That work began in 2005-2006, to build that prototyping compiler." SequenceL also is applicable to multiprocessor programming. The researchers want to focus primarily on x86-based systems, but Norton notes it would be relatively simple to get the language ported to other platforms. SequenceL already has been applied to applications for wireless mesh networking, oil and gas, image processing, and hearing aid audio algorithms.

An Integrated Computer Modeling System for Water Resource Management
National Science Foundation (01/31/14) Marlene Cimons

The U.S. National Science Foundation is funding research to design an integrated computer modeling system that will seamlessly connect all of the different water resource management models, including hydrology, engineering, economics, public policy, chemistry, ecology, and agriculture, among others. The researchers hope "to take all these models from different groups and somehow glue them together," says University of Virginia professor Jonathan Goodall. The researchers were motivated by an initiative funded by the European Union called the Open Modeling Interface, which was originally designed to facilitate the simulation of interacting processes, particularly environmental ones, by enabling independent computer models to exchange data as they ran. "One of the ways we are trying to strengthen the software is by trying to understand which kinds of problems it can handle," Goodall says. The researchers are applying the work specifically to the challenge of modeling water and nutrient transport within watersheds. "The modeling framework system will then be used to go beyond the capabilities of current models by including new disciplines into the watershed modeling process, and then eventually allowing specialized groups to advance components of the overall modeling system," Goodall says.
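The OpenMI-style coupling described, in which independent models exchange data as they run, can be sketched as two toy components advancing in lockstep, with one model's output feeding the other's input each time step. The component classes, coefficients, and units below are illustrative assumptions, not real hydrology or nutrient models.

```python
class RainfallRunoff:
    """Toy hydrology component: rainfall in, runoff out."""
    def step(self, rainfall_mm: float) -> float:
        return 0.4 * rainfall_mm  # illustrative runoff coefficient

class NutrientTransport:
    """Toy water-quality component: runoff in, nutrient load accumulated."""
    def __init__(self) -> None:
        self.total_load = 0.0
    def step(self, runoff: float) -> float:
        load = 0.1 * runoff  # illustrative load per unit runoff
        self.total_load += load
        return load

def run_coupled(rainfall_series: list[float]) -> float:
    """Advance both models together, exchanging data at every time step."""
    hydrology = RainfallRunoff()
    nutrients = NutrientTransport()
    for rain in rainfall_series:
        runoff = hydrology.step(rain)  # hydrology output...
        nutrients.step(runoff)         # ...becomes the nutrient model's input
    return nutrients.total_load
```

The "glue" the researchers describe is the loop in `run_coupled`: each specialized group can maintain its own component as long as it exposes a step-and-exchange interface like this.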

Google's Ray Kurzweil Envisions New Era of Search
CIO Journal (02/04/14) Steve Rosenbush

In an interview, Google engineering chief Ray Kurzweil discussed a new type of search engine he is developing, noting that search engines will gain increasingly human-like problem-solving capabilities in the years to come. Kurzweil says future search engines will be able to understand documents, answer complex questions, and watch for new information that could be helpful to users. "Google has already taken steps toward actually understanding the meaning and it can read with a little bit of understanding. That's basically what I'm working on, to actually understand the content of the Web pages," Kurzweil says. "You can ask it more complex questions that might be a whole paragraph, and it might engage in a dialogue with you to find out what you need." Furthermore, he says people could assign research projects to a search engine, which would continuously scan for relevant information. Search engine advances will be noticeable within five to eight years, and human-like search abilities will emerge by 2029, when the Singularity between human and artificial intelligence occurs, Kurzweil says.

Abstract News © Copyright 2014 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe