Association for Computing Machinery
Welcome to the November 4, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE

Summit at Stanford Addresses How to Attract Underrepresented-Minority Students to Careers in Science and Technology
Germany Looks at Keeping Its Internet, Email Traffic Inside Its Borders
Berners-Lee Demands Countries Deliver on Open Data Promises
A Gestural Interface for Smart Watches
Get a Security Boost: Add More Women to Your Cyber Team
Microsoft Uses Kinect to Interpret Sign Language From Deaf People
Out in the Open: Palm Pilot Inventor Wants to Open Source the Human Brain
Forget the Needle, Consider the Haystack: Uncovering Hidden Structures in Massive Data Collections
EarthCube: NSF Funds $14.5 Million in Grants to Improve Geosciences Cyberinfrastructure
The Status of Moore's Law: It's Complicated
Computer Science Helps Astronomers Explore the Sky
Chips 'Inspired' by the Brain Could Be Computing's Next Big Thing

Summit at Stanford Addresses How to Attract Underrepresented-Minority Students to Careers in Science and Technology
Stanford Report (CA) (11/04/13) Tom Abate

The United Negro College Fund's HBCU Innovation Summit last week was aimed at finding ways to get more young people, especially underrepresented minorities, to pursue careers in science, technology, engineering, and math (STEM) fields. The first-of-its-kind gathering brought together leaders from historically black colleges and universities (HBCUs) such as Spelman College, Clark Atlanta University, and Howard University, institutions that have traditionally focused on educating African Americans. The program was organized by Stanford University's Center for Engineering Pathways to Innovation (Epicenter) and the Center for Professional Development, with support from the U.S. National Science Foundation. "We think that kind of skill can be built in any person who has the drive, who has the excitement, who wants to change the world," says Stanford president John Hennessy. Stanford Dean of Engineering Jim Plummer urged his fellow educators to help change common practices that have unintentionally turned students off, such as asking high school students to declare their interest in STEM careers before they reach college. "We actually lose a lot of young people who could be potential engineers and potential scientists and potential mathematicians because they are asked to make that choice as seniors in high school," Plummer says. The summit was designed to show how HBCUs can help boost minority representation in STEM fields.


Germany Looks at Keeping Its Internet, Email Traffic Inside Its Borders
Washington Post (11/01/13) Michael Birnbaum

An alliance of German phone and Internet companies wants to create a network of German email and Internet systems transmitted strictly within German borders. The proposals aim to boost the security of Germany's internal communications by preventing them from leaving the country. Although the plan appears popular in Germany, its practical effect has come into question: Germans would still want to visit American Web pages, and the U.S. National Security Agency could still theoretically access German data on German soil. Nevertheless, U.S.-based technology firms are preparing for tough competition from foreign companies that claim to be freer from U.S. intrusion and monitoring than their U.S. counterparts. "Germans tend to be very sensitive to the use of their data, I think due to German history," says Jan Oetjen, CEO of GMX, which is collaborating with two other German email firms to offer a service called "E-Mail made in Germany." Meanwhile, other countries also are considering ways to nationalize their Internet traffic. For example, Brazil wants to require U.S. companies to store data about Brazilian customers within Brazil, and European Union leaders have called on member states to develop cloud data storage options that are independent of the United States.


Berners-Lee Demands Countries Deliver on Open Data Promises
Telegraph.co.uk (10/31/13) Matt Warman

Speaking at the Open Data Institute, Sir Tim Berners-Lee called on world leaders to back their talk of transparency and accountability with action, arguing that releasing publicly held data to citizens and software developers can help fight poverty, accelerate industry and innovation, and reduce corruption. "Governments and companies must not shy away from publishing contentious datasets if they contain information that could be used to dramatically improve people's lives," Berners-Lee says. The United Kingdom is the most advanced country when it comes to releasing data, with the United States, Sweden, New Zealand, and Denmark rounding out the top five, according to Berners-Lee's new report. The report says 55 percent of the countries surveyed have formal open data policies in place, but it notes that when government datasets are released, they are often issued in inaccessible formats. The study aims to encourage entrepreneurs, the public, and organizations to actually use the data, rather than have governments simply publish it. "The open data movement has made a promising start, but many Open Government Data initiatives are presently resting on shallow foundations, at risk of falling backwards if political will or pressure from campaigners subsides," Berners-Lee warns.


A Gestural Interface for Smart Watches
Technology Review (11/01/13) Rachel Metz

Researchers at the University of California, Berkeley, and UC Davis are developing Chirp, a computer chip that uses ultrasound waves to detect a wide range of gestures in three dimensions and could be embedded in wearable devices. The researchers say Chirp eventually could be used in devices ranging from helmet cameras to smart watches. Chirp relies on sonar: an array of ultrasound transducers sends ultrasonic pulses outward in a hemisphere, and the pulses echo off objects in their path. The echoes can be used to detect a range of hand gestures in three dimensions within a distance of about a meter. Berkeley's Richard Przybyla says Chirp's basic set of gesture commands could be programmed into Chirp-enabled devices. Because the system uses sound, which travels considerably slower than light, it can use low-speed electronics for sensing, which significantly decreases overall power consumption and enables Chirp to run continuously off a watch battery for up to 30 hours, Przybyla says. In the future, the researchers hope to develop the technology to recognize individual finger movements, instead of just hand gestures.
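
Because the sensing is sonar, the underlying range calculation is simple time-of-flight: a pulse goes out, an echo comes back, and distance falls out of the round-trip time and the speed of sound; the slowness of sound relative to light is also what lets slow, low-power timing electronics suffice. A minimal sketch of that calculation, with illustrative names and numbers rather than anything from the Chirp design:

```python
# Time-of-flight ranging as an ultrasonic sensor might apply it.
# Illustrative sketch only; names and values are not from the Chirp chip.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 C

def echo_distance_m(round_trip_s: float) -> float:
    """One-way distance to the reflecting hand: the pulse travels out
    and back, so halve the round-trip path."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A hand half a meter away returns an echo after about 2.9 milliseconds:
print(round(echo_distance_m(0.0029), 2))  # ~0.5 (meters)
```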


Get a Security Boost: Add More Women to Your Cyber Team
NextGov.com (10/30/13) Brittany Ballenstedt

A dearth of women in IT and cybersecurity positions may be playing a role in the frequent failure of enterprise cybersecurity strategies and defenses, according to a new (ISC)2/Frost & Sullivan/Symantec report. Just 11 percent of the cyber workforce is female despite double-digit yearly increases in the cybersecurity profession, a trend the report calls worrisome given that women often bring more diverse academic backgrounds and viewpoints than men, traits that could help expedite needed change in the information security sector. A new, more diverse skill set is required because the cybersecurity discipline has evolved to encompass challenges such as a competitive global marketplace, conflicting regulatory requirements, and the adoption of new technologies. (ISC)2 determined that men and women diverge in how they define the proficiencies needed to be an effective cybersecurity professional, and the novel skills women contribute may be essential in addressing that threat evolution. The report also found little variation in average job tenure, median and average annual salary, and academic background among men and women serving in senior cybersecurity capacities. However, (ISC)2 Foundation director Julie Peeler says these differences become more pronounced among junior-level workers, where women tend to be more educated but less well-compensated than men.


Microsoft Uses Kinect to Interpret Sign Language From Deaf People
IDG News Service (10/31/13) Michael Kan

Microsoft Research developers are using Kinect technology to develop a system that can read sign language from deaf users and translate it into speech or text. Developers have been training the Kinect translator to recognize sign language, and so far it can recognize 370 of the most common words in American Sign Language and Chinese Sign Language. The system can turn sign language into words spoken by a computer and do the reverse: a hearing user can speak or type words into the Kinect translator, which renders them in sign language through a virtual avatar shown on a display. Microsoft has been working on the technology for about 18 months, and is collaborating with the Chinese Academy of Sciences and Beijing Union University. The system could potentially enable deaf users to communicate more easily with people who do not know sign language. Microsoft researchers say they will continue to improve language recognition and expand the system's vocabulary.
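
Conceptually, each direction of the translator reduces to a mapping between recognized gestures and words; the toy sketch below illustrates only that idea, with gesture capture and avatar rendering stubbed out and all identifiers hypothetical rather than taken from Microsoft's system:

```python
# Toy bidirectional word <-> sign lookup; purely illustrative of the idea.
SIGN_VOCAB = {               # recognized gesture id -> word
    "gesture_017": "hello",
    "gesture_203": "goodbye",
}
WORD_TO_SIGN = {word: gid for gid, word in SIGN_VOCAB.items()}

def sign_to_words(gesture_ids):
    """Deaf user signs -> recognized gesture ids -> words to speak/display."""
    return " ".join(SIGN_VOCAB.get(g, "<unknown>") for g in gesture_ids)

def words_to_signs(text):
    """Hearing user speaks/types -> gesture ids for an avatar to render."""
    return [WORD_TO_SIGN.get(w, "<no-sign>") for w in text.lower().split()]

print(sign_to_words(["gesture_017", "gesture_203"]))  # "hello goodbye"
print(words_to_signs("hello"))                        # ["gesture_017"]
```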


Out in the Open: Palm Pilot Inventor Wants to Open Source the Human Brain
Wired News (10/28/13) Klint Finley

Computer scientist and Grok co-founder Jeff Hawkins has devised a unified theory of the brain's inner workings and produced algorithms for applying the theory to computer science. He also has open-sourced his work so anyone can freely apply the algorithms and software to building their own machine-learning systems. This effort included publishing a white paper outlining Hawkins' theory and its underlying math, and releasing the NuPIC open source platform, which includes Grok's algorithms and a software framework for building prediction systems. Grok produces a cloud-based service for monitoring IT infrastructure, and its ability to detect abnormal occurrences or alert the IT team to an impending failure is based on its mimicry of the brain's pattern-recognition systems. Grok's foundational cortical-learning algorithms attempt to realistically simulate the human neocortex and emulate its six-layer hierarchy. Grok's Matthew Taylor says NuPIC's uniqueness partly stems from its online learning capabilities, in which "as patterns change, it will forget the old patterns and remember the new patterns" in much the same way the brain adapts to change. Beyond IT infrastructure monitoring, NuPIC's potential applications include natural-language processing, machine vision, and robotics.
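
Taylor's description of online learning, forgetting old patterns as new ones arrive, can be illustrated with a toy decaying-count model. The sketch below is conceptual only and is not NuPIC's cortical-learning algorithm; the names and the scoring rule are made up for illustration:

```python
# Toy online learner: familiar patterns score low, novel ones score high,
# and old patterns fade as new data arrives. Not the NuPIC algorithm.

class OnlinePatternModel:
    def __init__(self, decay: float = 0.99):
        self.decay = decay   # how quickly old patterns are forgotten
        self.counts = {}     # pattern -> decayed observation count

    def observe(self, pattern: str) -> float:
        """Record one observation; return 1.0 for a never-seen pattern,
        values near 0.0 for familiar ones."""
        total = sum(self.counts.values())
        familiarity = self.counts.get(pattern, 0.0) / total if total else 0.0
        # Decay all counts so patterns that stop appearing are forgotten.
        self.counts = {p: c * self.decay for p, c in self.counts.items()}
        self.counts[pattern] = self.counts.get(pattern, 0.0) + 1.0
        return 1.0 - familiarity

model = OnlinePatternModel()
for reading in ["cpu_ok", "cpu_ok", "cpu_ok", "cpu_spike"]:
    print(reading, round(model.observe(reading), 2))  # the spike scores 1.0
```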


Forget the Needle, Consider the Haystack: Uncovering Hidden Structures in Massive Data Collections
Princeton University (10/28/13) John Sullivan

Princeton University computer scientists have developed a mathematical method for leveraging big data that determines the probability of a pattern found in a data subset repeating itself throughout the full collection. The researchers say their method significantly reduces the time required to uncover patterns in large data collections such as social networks, enabling researchers to pinpoint links between seemingly unrelated groups. "The data we are interested in are graphs of networks like friends on Facebook or lists of academic citations," says Princeton professor David Blei. The researchers developed an algorithm to analyze a subset of a large database, determining the likelihood that nodes belong to various groups in the database. They then created an adjustable matrix that accepts the subset's analysis and assigns weights to each data point based on its probability of belonging to different groups. The approach is based on a stochastic optimization method that identifies a central pattern in a group of seemingly random data. Blei compares the technique to navigating from New York to Los Angeles by asking random people for directions, which would eventually succeed given the right questions and interpretations. The researchers used the method to find patterns in connections between patents, using public data from the U.S. National Bureau of Economic Research.
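
The loop the summary describes, sample a small subset, compute a cheap local estimate, and fold it into a global estimate with a shrinking step size, is the general shape of stochastic optimization. The sketch below illustrates that loop in the abstract; it is not the Princeton team's actual inference algorithm, and all names are illustrative:

```python
import random

# Generic stochastic-optimization loop: cheap noisy estimates from small
# random subsets converge on a global quantity as the step size shrinks.
def stochastic_estimate(nodes, local_stat, steps=1000):
    estimate = 0.0
    for t in range(1, steps + 1):
        subset = random.sample(nodes, k=min(10, len(nodes)))
        noisy = local_stat(subset)           # cheap estimate from the subset
        estimate += (noisy - estimate) / t   # decreasing step -> converges
    return estimate

# Example: estimate the average "group membership" score of a huge network
# without ever touching all of it at once.
nodes = [random.gauss(0.3, 0.1) for _ in range(100_000)]
print(round(stochastic_estimate(nodes, lambda s: sum(s) / len(s)), 2))  # ~0.3
```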


EarthCube: NSF Funds $14.5 Million in Grants to Improve Geosciences Cyberinfrastructure
National Science Foundation (10/28/13) Cheryl Dybas

The U.S. National Science Foundation's (NSF) EarthCube initiative intends to enable researchers to plot geoscientific data from any source, visualize it in any manner, and model results and investigate concepts from a desktop, a lab, or the field. EarthCube's purpose is to devise new ways of comprehending and predicting the Earth system, and NSF has awarded 13 grants totaling $14.5 million to cultivate a dialogue among geo-, bio-, and cyberscientists to create an EarthCube architecture. NSF's Directorate for Geosciences and its Directorate for Computer and Information Science and Engineering are sponsoring the EarthCube effort. "As the Internet revolutionized the way we lead our daily lives, scientists are searching for technologies that will advance the ability to discover, collaborate, and conduct research at all levels," says the Geosciences Directorate's Roger Wakimoto. "Through EarthCube, NSF has made investments in these technologies and the infrastructure that will be the foundation of addressing challenges in studying the Earth system." The project includes specialists in governance, workflows, data discovery, mining, and access, among other fields. A core objective of EarthCube is setting up a computing system that can help locate, extract, and aggregate data, as well as process, summarize, and synthesize it in ways that help geoscientists better understand and simulate Earth systems.


The Status of Moore's Law: It's Complicated
IEEE Spectrum (10/28/13) Rachel Courtland

As computer chips grow denser, it becomes increasingly difficult to measure the progression of Moore's Law. Exacerbating this situation is the mutability of node-name definitions, especially as manufacturers prepare to launch 14nm and 16nm chips. Some analysts suggest that, irrespective of the next chip generation, the migration from old to new no longer guarantees the kind of price or performance improvements it once did. The link between performance and node name began to break down around the mid-1990s, as chipmakers not only continued to use lithography to pattern circuit components and wires on the chip, but also started etching away the ends of the transistor gate to make the devices shorter and faster. Eventually, "there was no one design rule that people could point to and say, 'That defines the node name,'" says Intel fellow Mark Bohr. Despite this change, manufacturers persisted in packing the devices closer and closer together, assigning each successive chip generation a number about 70 percent that of the previous one. Today the node names no longer correspond to the size of any specific chip dimension. No matter what definition is applied, the numbers in node names have steadily declined, as has the distance between transistor gates.
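
The 70 percent figure follows from simple geometry: shrinking every linear dimension to 0.7 of its previous value cuts the area a device occupies roughly in half (0.7 squared is about 0.49), which doubles transistor density per generation. A quick sketch of the arithmetic, with an illustrative starting node:

```python
# Each generation's node name is ~0.7x the last because halving area
# (0.7^2 ~= 0.49) doubles density. Starting node chosen for illustration.
scale = 0.7
node_nm = 22.0
for gen in range(1, 4):
    node_nm *= scale
    print(f"gen {gen}: ~{node_nm:.0f} nm, density x{1 / scale**2:.1f}")
# gen 1: ~15 nm, gen 2: ~11 nm, gen 3: ~8 nm; density doubles each step
```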


Computer Science Helps Astronomers Explore the Sky
Heidelberg Institute for Theoretical Studies (10/28/13) Peter Saueressig

The new Astroinformatics research group at the Heidelberg Institute for Theoretical Studies will develop methods and software to help astronomers bring order to vast quantities of astronomical data. Led by physicist and computer scientist Kai Polsterer, the group will develop approaches for analyzing and processing the continuously growing amount of data in astronomy, which is stored in various archives. "We focus on using new approaches to support observing scientists with their research," Polsterer says. Computers have transformed astronomy, with new detectors and telescopes constantly collecting data. Polsterer aims to make this data accessible to astronomers in a way that fosters intuitive research. For example, the Sloan Digital Sky Survey provides digital data of its sky mapping, with images in five wavelength bands and detailed spectroscopy. "There are many treasure chests full of data to be unearthed, but it is not easy for astronomers to actually browse the data, i.e. to do exploratory work," Polsterer says. The research group initially will focus on tools that automatically extract object features from data, and it also intends to advance machine-learning approaches in astronomy.


Chips 'Inspired' by the Brain Could Be Computing's Next Big Thing
ReadWrite (10/25/13) Dan Rowinski

Technology companies are getting closer to creating computers modeled on the human brain, after decades of disappointing efforts. Chipmakers such as Qualcomm, IBM, and Intel are now merging their knowledge of microprocessors and neuroscience to develop a new generation of brain-inspired products. For example, Qualcomm next year plans to release its Zeroth chip, based on a type of processor called a neural-inspired processing unit (NPU). Although an NPU is still a silicon chip patterned with transistors, it can perform qualitative functions. Theoretically, an NPU based on mathematical modeling and human biology could learn like a human, and Qualcomm hopes that in the next few years NPUs will appear in devices ranging from smartphones to servers in a cloud architecture. IBM is working on a cognitive computing paradigm called TrueNorth, which attempts to model computing on the brain and could alter computer vision, motion detection, and pattern matching. TrueNorth can be programmed with "corelets" that teach the computer different functions. Meanwhile, Intel's neuromorphic chip architecture is based on devices that use the spin of an electron to perform a variety of computing functions. Intel says neuromorphic hardware based on spin devices can perform analog data-sensing, data conversion, cognitive computing, associative memory, programmable logic, and analog and digital-signal processing.


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe