Association for Computing Machinery
Welcome to the March 25, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

HEADLINES AT A GLANCE


ACM Turing Award Goes to Pioneer in Database Systems Architecture
PRWeb (03/25/15)

ACM has named Massachusetts Institute of Technology scientist Michael Stonebraker recipient of the 2014 ACM A.M. Turing Award for his contributions to the concepts and practices underlying modern database systems. Stonebraker pioneered the engineering of database systems that support these concepts and released them as open source software, and many modern database systems incorporate source code from his systems. Stonebraker's accomplishments include the development of Ingres, which demonstrated the viability of relational database theory. Ingres enabled him to make many key contributions, including query language design, query processing techniques, access methods, and concurrency control, and to demonstrate the use of query rewrite techniques to implement relational views and access control. Another achievement was his introduction of the object-relational model of database architecture with the release of Postgres, which integrated important concepts from object-oriented programming into the relational database context. More recently, Stonebraker has advocated for the "one size does not fit all" model of database systems architecture and devised architectures for specialized functions. "Michael Stonebraker's work is an integral part of how business gets done today," says ACM President Alexander L. Wolf. "Moreover, through practical application of his innovative database management technologies and numerous business start-ups, he has continually demonstrated the role of the research university in driving economic development."


Obama, Wowed by Young Scientists, Announces New STEM Pledges
The Associated Press (03/23/15) Jim Kuhnhenn

U.S. President Barack Obama on Monday announced more than $240 million in pledges to boost the study of science, technology, engineering, and math (STEM). The pledges include a $150-million philanthropic effort to encourage promising early-career scientists to stay on track and a $90-million campaign to expand STEM opportunities to underrepresented youth. The new commitments bring total financial and material support for these programs to $1 billion. In addition, more than 100 colleges and universities have committed to training 20,000 engineers, and a group of CEOs has pledged to expand high-quality STEM educational programs to an additional 1.5 million students in 2015. Obama has emphasized STEM education since the beginning of his presidency, when he launched Educate to Innovate, an effort to encourage STEM studies, in 2009. Obama made the announcement at a science fair event at the White House, where 35 young winners showcased a range of breakthroughs. "It's not enough for our country just to be proud of you," Obama told the students. "We've got to support you." He said the science fair was one of the White House's most fun annual events. "Every year I walk out smarter than when I walked in," Obama said.


Better Debugger
MIT News (03/24/15) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers last week presented a new algorithm for identifying integer-overflow bugs at ACM's International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS) in Istanbul, Turkey. The researchers tested the algorithm on five common open source programs and found three known bugs, as well as 11 new ones. The system, called Directed Integer Overflow Detection (DIODE), starts by feeding the program a single sample input. As that input is processed, DIODE records each of the operations performed on it by adding new terms to a "symbolic expression" that describes the resulting value as a function of the input. "This 32-bit integer has been built up of all these complicated bit-level operations that the lower-level parts of your system do to take this out of your input file and construct those integers for you," says MIT professor Martin Rinard. When the program reaches a point at which an integer is involved in a potentially dangerous operation, such as a memory allocation, DIODE records the current state of the symbolic expression. Although the initial test input will not trigger an overflow, DIODE can analyze the symbolic expression to calculate an input that will. "DIODE provides an effective mechanism for finding dangerous integer overflows that affect memory allocation sites, the source of many critical security vulnerabilities," says Imperial College London senior lecturer Cristian Cadar.
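To make the approach concrete, here is a minimal Python sketch of the same idea; it is not MIT's actual DIODE implementation. It builds a textual "symbolic expression" describing how an allocation size is computed from an input value, then searches for an input that pushes the 32-bit arithmetic past its maximum. The parse_header routine and its constants are hypothetical stand-ins for real input-handling code.

# Simplified sketch of DIODE's idea (assumption: not the actual MIT code).
INT32_MAX = 2**31 - 1

class Sym:
    """Carries a concrete value plus a textual symbolic expression for it."""
    def __init__(self, value, expr):
        self.value, self.expr = value, expr
    def __mul__(self, k):
        return Sym(self.value * k, "(%s * %d)" % (self.expr, k))
    def __add__(self, k):
        return Sym(self.value + k, "(%s + %d)" % (self.expr, k))

def parse_header(width):
    # Hypothetical input-handling code: compute a buffer size from the input.
    pixels = Sym(width, "width") * 4          # 4 bytes per pixel
    return pixels + 64                        # plus a fixed header

def overflowing_input(size_of, lo=1, hi=INT32_MAX):
    """Binary-search for the smallest input whose computed size exceeds INT32_MAX."""
    while lo < hi:
        mid = (lo + hi) // 2
        if size_of(mid).value > INT32_MAX:
            hi = mid
        else:
            lo = mid + 1
    return lo if size_of(lo).value > INT32_MAX else None

seed = parse_header(640)                      # benign seed input, as in DIODE
print("symbolic expression at the allocation site:", seed.expr)
print("input that overflows 32-bit arithmetic:", overflowing_input(parse_header))

A real tool extracts the expression from the running binary and hands it to a constraint solver rather than searching a hand-written function, but the structure is the same: record the expression at the dangerous site, then solve for an input that breaks it.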


Superfast Computers a Step Closer as a Silicon Chip's Quantum Capabilities Are Improved
University of Surrey (03/20/15) Amy Sutton

A British-Dutch-Swiss team has demonstrated laser control of quantum states in an ordinary silicon wafer and observed those states via a conventional electrical measurement. The researchers achieved a quantum on/off switching time of about a millionth of a millionth of a second (a picosecond), which they say is the fastest-ever quantum switch achieved with silicon and more than 1,000 times faster than previous attempts. The team found silicon provides a clean environment for the phosphorus atoms trapped inside it, where the quantum information is stored. "We put the atoms into a superposition state with a very short [a few trillionths of a second] laser pulse from the FELIX laser facility, and then we showed we can create a new superposition, which depends on the exact time at which a second laser pulse arrives," says University of Surrey researcher Ellis Bowyer. The superposition state survived even when electrons were flying around the trapped atom while current flowed through the chip. Moreover, the current depended on the superposition state. The next phase of the research could lead to the creation of fast quantum silicon chips and devices such as super-accurate clocks and ultra-sensitive biomedical sensors.


Computer Science Surge Sparks Campus Building Boom
Network World (03/23/15) Ann Bednarz

Colleges and universities across the U.S. have been building new facilities to keep up with expanding science, technology, engineering, and math (STEM) programs. As they add capacity, these institutions also are rethinking how STEM facilities should be designed. "The buildings that we're designing now are really about engaging people in the sciences," says EYP Architecture & Engineering's Leila Kamal. Fewer traditional lecture halls, easily accessible labs, and more circulation spaces are hallmarks of modern STEM architecture. STEM always has been an early proving ground for ideas about interdisciplinary learning because of the natural relationships among the subjects. Students want "opportunities to meet with other interdisciplinary groups, to formulate ideas, to fabricate ideas, to test and prototype and, ultimately, to meet with individuals outside of the university setting that could help them realize implementations of these ideas," says Mark Thaler, education practice leader at global design firm Gensler. "We're investigating ways to create spaces that are adaptable, multimodal, and flexible." State-of-the-art research facilities and lab space also can be a major draw for prospective students. In 2014, colleges and universities spent $9.8 billion on the construction of new facilities, building additions, and renovations, up 20 percent from 2013. There has been particular growth in STEM projects over the last five years.


Rewriting the Rules of Turing's Imitation Game
Technology Review (03/17/15) Simon Parkin

Ever since Alan Turing proposed his "imitation game," artificial intelligence (AI) researchers have tried to formulate a way of determining just how intelligent their creations are. There already are AI programs that far outstrip human beings in certain areas, such as playing games and doing complex math. However, these AI systems are all extremely specialized and, as AI expert Leora Morgenstern notes, often fail to translate their skill at one task into skill at another. When Turing addressed the problem of defining machine intelligence, he said a computer would have to be able to convince human beings that it, too, was human. Just last year, a chatbot dubbed Eugene Goostman was able to do so, passing a Turing test organized by the University of Reading. Meanwhile, alternatives to Turing's test have been proposed that incorporate a capacity for creativity, such as Mark Riedl's Lovelace 2.0 test. However, even Riedl suspects there is something futile about trying to formulate a test for AI. "I think it is ultimately futile to place a definitive boundary at which something is deemed intelligent or not," he says. "Who is to say being above a certain score is intelligent or being below is unintelligent? Would we ever ask such a question of humans?"


Images That Fool Computer Vision Raise Security Concerns
Cornell Chronicle (03/20/15) Bill Steele

Cornell University researchers have created images that look to humans like white noise or random geometric patterns but that computers identify as common objects. The researchers say the fact that computers can be fooled by these optical illusions raises security concerns and opens new channels for research in computer vision. The results are important because "they highlight the extent to which computer vision systems based on modern supervised machine learning may be fooled, which has security implications in many areas," says Cornell graduate student Jason Yosinski. He also says "the methods used in the paper provide an important debugging tool to discover exactly which artifacts the networks are learning." The researchers arrived at their results by evolving images toward the features a deep neural network (DNN) considers significant. Starting with a random image, they slowly mutated it, showing each new version to the DNN. If a new image was identified as belonging to a particular class with more certainty than the previous one, the researchers kept it and continued mutating it; otherwise they discarded it. Over time, the process produced images the DNN recognized with more than 99-percent confidence but that were unrecognizable to human vision.
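As a rough illustration of that keep-if-more-confident loop, the Python fragment below mutates a random image and keeps each mutation only if a classifier scores it higher for the target class. This is a simplified hill-climbing sketch, not the researchers' evolutionary algorithm, and classifier_confidence is a made-up stand-in for a trained DNN's output.

# Simplified hill-climbing sketch of the fooling-image loop (not the Cornell code).
import numpy as np

def classifier_confidence(image, target_class):
    # Stand-in for a trained deep neural network's confidence in target_class;
    # a real experiment would query the DNN here instead.
    template = np.linspace(0.0, 1.0, image.size).reshape(image.shape)
    return 1.0 / (1.0 + np.abs(image - template).mean())

def evolve_fooling_image(target_class, shape=(28, 28), steps=5000, rate=0.1):
    rng = np.random.default_rng(0)
    image = rng.random(shape)                      # start from random noise
    best = classifier_confidence(image, target_class)
    for _ in range(steps):
        candidate = np.clip(image + rate * rng.standard_normal(shape), 0.0, 1.0)
        score = classifier_confidence(candidate, target_class)
        if score > best:                           # keep only improvements
            image, best = candidate, score
    return image, best

fooling_image, confidence = evolve_fooling_image(target_class=0)
print("final classifier confidence: %.3f" % confidence)

With a real network in place of the stand-in scorer, the same loop gradually produces inputs the network labels with very high confidence even though they look like noise or abstract patterns to a person.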


Lack of Effective Timing Signals Could Hamper 'Internet of Things' Development
NIST News (03/19/15) Chad Boutin

The U.S. National Institute of Standards and Technology (NIST) has released a report that stresses the importance of marrying computers and networks with precise timing systems. The Internet of Things (IoT) is expected to encompass many applications that depend on precise timing in computers and networks, but most modern data systems were designed to work with only probabilistic expectations about execution times rather than guaranteed timing. For example, a driverless car would need precise synchronization across systems such as its safety and decision-making programs. The paper reviews the state of the art in areas that are key to the use of timing signals, such as clock design, the employment of timing in networking systems, hardware and software architecture, and application design. The report says cross-cutting research is needed to improve the current approach and technology. Networked components need a way to combine time-sensitive processes with those that can be done whenever the system gets around to them, says NIST's Marc Weiss. "The kind of growth in the IoT that is expected to happen will be severely hampered without these improvements," Weiss warns.


The Science of Magic: Rensselaer and Walt Disney Imagineering Research & Development Advance the Frontiers of Cognitive Computing
RPI News (03/19/15) David Brond

Rensselaer Polytechnic Institute (RPI) researchers are studying how cognitive computing technology can augment the experience of visitors at theme parks, cruise ships, and other venues. The researchers are focusing on information-extraction techniques to help computers better understand words written or spoken by a human, as well as agent-based methods for investigating how computers and humans can engage in more natural conversations. The research relies on unstructured data, which is an increasingly important part of big data technology. "Our goal in this project is to work with Walt Disney Imagineering Research & Development to transform the leading-edge tools and techniques into fully developed applications that will help make the Disney experience even more enjoyable for people and families around the world," says RPI professor James Hendler, director of The Rensselaer Institute for Data Exploration and Applications (IDEA). He says the research is an important step forward for all of the data-related studies taking place at IDEA. "We believe Rensselaer's world-class text and language-processing tools, in conjunction with Walt Disney Imagineering Research & Development's cutting-edge autonomous character platforms, will enable a new class of guest/character experiences," says Disney Imagineering's Jonathan Snoddy.


Smart Machines at Work: AI Gets a Job
Computerworld (03/19/15) Lamont Wood

The advent of deep-learning technology over the last decade has led to a revolution in the use of artificial intelligence (AI), with such systems finding their way into workplaces across the economy. Memorial Sloan Kettering Cancer Center, for example, is using IBM's Watson AI platform to help its doctors decide on initial treatments for lung cancer patients. Insurance company USAA is using Watson to create what vice president Neff Hudson calls "question-answer machines" that members can use to get advice on everything from what sort of car to buy to how to handle personal finances. Medical billing service Zotec Partners is using a natural-language processing system to automatically generate financial summaries for its clients, while public relations firm Waggener Edstrom Communications has created a similar system that monitors social media to gauge public sentiment toward its clients. Rob High, CTO of IBM's Watson group, sees the creation of such "smart machines" becoming a cottage industry, with companies in any number of sectors using platforms such as Watson to build AI systems that meet their unique needs. Many worry the technology will inevitably make human employees obsolete, but High argues AI technology is still very limited and will likely act as a tool used by human workers, rather than as their replacement.


Me, Myself, and iCub: Meet the Robot With a Self
New Scientist (03/18/15) Tony Prescott

Researchers at the University of Sheffield's Sheffield Robotics program are trying to give their iCub robot an artificial "self" by emulating the five elements of the self first articulated by psychologist Ulric Neisser in the 1990s: the ecological or physically situated self, the interpersonal self, the temporally extended self, the conceptual self, and the private self. So far, the Sheffield researchers have been able to emulate three of the five using a variety of processes meant to replicate the functions of the human brain. To create a physically situated self, the researchers have developed a "body schema" that enables the iCub to be aware of its body and how to move it through space. To create an interpersonal self, they have devised a means by which the iCub can observe and then imitate human behaviors, such as learning hand gestures. Meanwhile, they created a temporally extended self by giving the iCub the means to remember things that have happened and to use that knowledge to predict what might happen in the future. The researchers say the two remaining elements, the conceptual and private selves, are likely to be much harder to emulate, but could be the key to granting the iCub self-awareness.


Smarter Smart Grids
National Science Foundation (03/18/15) Aaron Dubrow

North Carolina State University (NCSU) researchers are using cloud computing resources to analyze smart grid data from thousands of sensors, called phasor measurement units, distributed across the transmission grid that connects a wide range of electrical generating plants. The research leverages resources developed through the U.S. National Science Foundation's ExoGENI project, which is part of the Global Environment for Network Innovations initiative. The ExoGENI platform combines computation, storage, and network capabilities with open cloud computing and dynamic circuit fabrics to address complex scientific and network engineering problems. The NCSU researchers linked real-time sensor data to on-demand virtual computing resources at nodes across the U.S. "We want to show how processing, analyzing, and monitoring power system data can be done using a distributed architecture instead of traditional centralized methods," says NCSU professor Aranya Chakrabortty. The researchers currently are extending the testbed into a completely closed-loop sensing and control system for wide-area control of power grids. "As the number of phasor measurement units in the North American grid grows exponentially over the next five years, such a distributed data processing architecture will become imperative for monitoring and control, and eventually for initiating actions to solve problems," Chakrabortty says.


Apple Co-Founder Steve Wozniak on the Apple Watch, Electric Cars, and the Surpassing of Humanity
Australian Financial Review (03/24/15) Paul Smith

In an interview, Apple co-founder Steve Wozniak said he is concerned about what rapid advances in artificial intelligence will mean for the future of humanity. Wozniak said he agrees with fellow tech luminary Elon Musk and physicist Stephen Hawking that "the future is scary and very bad for people." He warned against building "devices to take care of everything for us," noting that such devices might eventually think faster than people and do away with them, especially if they are put to the task of running corporations efficiently. Wozniak also pondered what role humans would take in a world of superior machines, wondering whether they would be revered as gods, treated as pets, or dismissed as insignificant, like ants. Wozniak suggested such a future could be forestalled by the impending end of Moore's Law, as transistor density approaches the atomic scale. Quantum computing could provide further advances in computing power beyond that limit, but Wozniak did not express high hopes for the technology, saying, "for all the time they've been working on quantum computing they really have nothing to show that's really usable for the things we need."


Abstract News © Copyright 2015 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe