Association for Computing Machinery
Welcome to the January 13, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Please note: In observance of the U.S. Martin Luther King, Jr. Day holiday, TechNews will not be published on Monday, Jan. 16. Publication will resume on Wednesday, Jan. 18.

ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.

HEADLINES AT A GLANCE


SMU School of Information Systems Awarded Two Cybersecurity Research Projects
Singapore Management University (Singapore) (01/12/17) Teo Chang Ching

Singapore Management University's (SMU) School of Information Systems has been awarded two cybersecurity research projects under a collaboration program with Singapore's National Research Foundation and Tel Aviv University's (TAU) Blavatnik Interdisciplinary Cyber Research Center in Israel. In the first project, SMU researchers will work with TAU colleagues to characterize the interdependency of cyberattacks and to balance openness and security when implementing international enforcement actions and sharing technology information to counter cyberattacks. "As international collaborations in technical and legal strategies are recognized as being central to efforts to tackle globalized cybersecurity threats, this project will add to the knowledge base on cybersecurity interdependency, and the policy analytics about the legal measures and technology mitigations against cyberattacks," says SMU professor Wang Qiuhong. The second project aims to protect the safety and privacy of people who use mobile apps to access smart city services. The project will design a system that detects anomalous and potentially harmful behaviors in apps and issues suitable alerts. "This project addresses the need to empower users of various demographics to better protect themselves against cyberattackers and secure their private data and information," says SMU professor David Lo.


5 State Policies to Sustain Computer Science Education
eSchool News (01/13/17) Laura Ascione Devaney

The Southern Regional Education Board (SREB) convened a group of U.S. state legislators and secondary and postsecondary education leaders to develop policies and actions that support computer science education. ACM estimates that by 2020, as many as 4.6 million of 9.2 million science, technology, engineering, and math jobs will be computer-related. The SREB report outlines five actions to help states develop strong, effective computer science education policies. States first need to develop K-12 computer science standards in partnership with secondary and postsecondary educators, experts, and industry leaders, and to develop or adopt standards-based, developmentally appropriate computer science curricula that appeal to diverse learners. States also need to lay the groundwork for learning computer science by requiring students to take four years of math aligned with their career and college goals. In addition, states should create clear pathways to computing careers, and the senior year of high school could be redesigned to enable students who meet college-readiness benchmarks to earn college credits that transfer to associate and bachelor's degrees. Finally, the report says states should prepare great computer science teachers and educate communities about computer science and computing careers.


Project Aims for Wearable Systems in People-centered Smart Cities
The Engineer (United Kingdom) (01/10/17) Helen Knight

Researchers at the University of Southampton in the U.K. are developing low-energy sensors and artificial intelligence that would enable people's clothing to communicate with smart city networking systems. "Most of the sensors that are used in smart cities are fixed, on lampposts for example, and there has been some work to put them on vehicles such as buses, but we are looking to put them on people, in something they can just put on and wear," says Southampton professor Steve Beeby. He says the technology offers a better distribution of sensors throughout a smart city, and could let researchers influence the density of information they receive by offering incentives to wearers. "It provides a flexible, movable input to the smart city, which isn't possible with the existing fixed sensor network," Beeby says. The project builds on earlier research on embedding electronics, including microprocessors, microcontrollers, and logic circuits, into textile yarns that can then be woven into wearable fabrics. The Southampton researchers now plan to incorporate microelectromechanical systems-style sensors alongside the electronics in the yarns, and will also study the use of energy-harvesting technologies to power the sensors and electronics.


Army of 350,000 Star Wars Bots Found Lurking on Twitter
New Scientist (01/12/17) Chris Baraniuk

Researchers at University College London (UCL) in the U.K. discovered a Twitter botnet that may comprise more than 350,000 accounts. They stumbled upon the botnet, whose accounts have tweeted thousands of random quotations from "Star Wars" novels, after taking a random sample of 1 percent of Twitter users, which produced about 6 million English-language accounts. The researchers plotted the locations of the accounts on a map and found more than 3,000 fell within two oddly uniform rectangles, one roughly covering Europe and North Africa and the other North America. Because so many of the accounts appeared to tweet from uninhabited deserts or oceans, the researchers concluded they were not real users. They then used the accounts as a training set for a machine-learning algorithm that identifies other bots across Twitter with the same characteristics. All of the accounts were created between June and July 2013, and none of them tweeted more than 11 times. If all of the accounts prove to be related bots, it would be an unusually large botnet for Twitter. Although no one knows why the botnet was set up, the researchers theorize the bots are being sold as fake followers, says UCL professor Shi Zhou.
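
The article does not include the researchers' code, but the two-step approach it describes, flagging accounts by location and behavior and then training a classifier on those seeds, can be sketched roughly as follows; the bounding boxes, features, and model below are illustrative assumptions rather than the UCL team's actual method.

```python
# Hypothetical sketch, not the UCL researchers' code: seed-label accounts that
# match the reported botnet signature, then train a classifier to find similar
# accounts elsewhere. Bounding boxes, features, and the model are assumptions.
from dataclasses import dataclass
from datetime import date
from sklearn.linear_model import LogisticRegression

# Illustrative rectangles (lon_min, lat_min, lon_max, lat_max); the study's
# exact coordinates are not given in the article.
SUSPECT_BOXES = [(-15.0, 25.0, 45.0, 60.0),    # roughly Europe / North Africa
                 (-125.0, 25.0, -65.0, 50.0)]  # roughly North America

@dataclass
class Account:
    lat: float
    lon: float
    created: date
    tweet_count: int

def in_suspect_box(acct: Account) -> bool:
    return any(lon_min <= acct.lon <= lon_max and lat_min <= acct.lat <= lat_max
               for lon_min, lat_min, lon_max, lat_max in SUSPECT_BOXES)

def looks_like_star_wars_bot(acct: Account) -> bool:
    # Signature reported in the article: geotags inside the odd rectangles,
    # account created June-July 2013, and no more than 11 tweets.
    return (in_suspect_box(acct)
            and date(2013, 6, 1) <= acct.created <= date(2013, 7, 31)
            and acct.tweet_count <= 11)

def train_bot_classifier(accounts: list[Account]) -> LogisticRegression:
    # Use the rule-based seed labels as training data; the sample must contain
    # both bot-like and normal accounts for the fit to succeed.
    X = [[a.lat, a.lon, a.tweet_count, a.created.toordinal()] for a in accounts]
    y = [int(looks_like_star_wars_bot(a)) for a in accounts]
    return LogisticRegression(max_iter=1000).fit(X, y)
```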


Scientists Use a Gaming Algorithm to Enhance a DNA Sequencing Android App
Agency for Science, Technology and Research (A*STAR) (01/11/17)

Researchers at the Agency for Science, Technology and Research's (A*STAR) Bioinformatics Institute in Singapore are using a new image-processing algorithm to improve the accuracy of a smartphone application designed to help analyze biomedical samples. GelApp analyzes and labels outputs from gel electrophoresis, a technique in which DNA and proteins are separated and identified by passing them through a gel under an electric charge. Molecules move through the gel at different speeds, creating a pattern of bands across the surface that are usually labeled by hand, an error-prone and time-consuming process. The first iteration of GelApp automated this labeling, but the A*STAR researchers wanted to improve the app's accuracy in the face of variations in laboratory setups and cameras. They selected a Monte Carlo Tree Search algorithm and trained it on hand-labeled gel band images; the algorithm then continued to learn from individual GelApp users' inputs. GelApp relies on chains of image filters to identify gel bands, with each filter increasing the sharpness and accuracy of the image by reducing noise and determining the edges of each band. The algorithm selects the five filters from a filter bank that best match the manually labeled images. GelApp 2.0 improved band-detection accuracy by 56 percent for proteins and 36 percent for DNA compared with the original version.
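
As a rough illustration of the filter-selection idea rather than GelApp's actual code, the sketch below scores candidate five-filter chains from an assumed filter bank against a hand-labeled band mask, with random rollouts standing in for a full Monte Carlo Tree Search.

```python
# Simplified illustration, not GelApp's actual code: search an assumed filter
# bank for the five-filter chain that best reproduces hand-labeled gel bands.
# Random rollouts stand in for a full Monte Carlo Tree Search to keep it short.
import random
import numpy as np
from scipy import ndimage

FILTER_BANK = {
    "denoise":  lambda img: ndimage.median_filter(img, size=3),
    "smooth":   lambda img: ndimage.gaussian_filter(img, sigma=1.0),
    "sharpen":  lambda img: img + (img - ndimage.gaussian_filter(img, sigma=2.0)),
    "edges":    lambda img: np.hypot(ndimage.sobel(img, 0), ndimage.sobel(img, 1)),
    "contrast": lambda img: (img - img.min()) / (np.ptp(img) + 1e-9),
}

def detect_bands(img):
    # Naive band detector: pixels well above the mean intensity.
    return img > img.mean() + img.std()

def score_chain(chain, image, labeled_mask):
    # Intersection-over-union between detected bands and the hand-labeled mask.
    out = image.astype(float)
    for name in chain:
        out = FILTER_BANK[name](out)
    pred = detect_bands(out)
    intersection = np.logical_and(pred, labeled_mask).sum()
    union = np.logical_or(pred, labeled_mask).sum() or 1
    return intersection / union

def search_best_chain(image, labeled_mask, rollouts=200, length=5):
    best_chain, best_score = None, -1.0
    names = list(FILTER_BANK)
    for _ in range(rollouts):
        chain = random.choices(names, k=length)
        score = score_chain(chain, image, labeled_mask)
        if score > best_score:
            best_chain, best_score = chain, score
    return best_chain, best_score
```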


How Northeastern Plans to Reach Equal Male-Female Computer Science Enrollment by 2021
TechRepublic (01/11/17) Alison DeNisco

Carla Brodley, dean of Northeastern University's College of Computer and Information Science, aims to reach an equal male/female balance in computer science (CS) enrollment by 2021. To attract new students, the university expanded its combined majors program to include more than 20 special majors that pair CS with subjects such as business administration, English, and biology. Northeastern also created a CS minor in which every student takes the same two introductory computing courses. After completing the requisite classes, students can choose their minor courses depending on their major. The CS introductory course was revamped to focus on a coding language the faculty designed, ensuring no students had an advantage. A separate accelerated section of the introductory course was created for students with prior coding experience. In addition, Northeastern offers two CS master's programs; one for people with an undergraduate CS degree, and one for students with degrees in other fields. "The goal is for 50/50 female/male majors, but the other goal is that every student should know some computer science before they leave the education system," Brodley says. "Until we fix that at the high school level, we need to fix it at the college level."


Split-Second Data Mapping
MIT News (01/11/17) Rob Matheson

A new database-analytics platform leverages graphics-processing units (GPUs) to process billions of data points in milliseconds. MapD was developed by Todd Mostak, a former researcher at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory. In addition to processing data faster than traditional database management systems, MapD visualizes all processed data points nearly instantaneously, and parameters can be modified to adjust the visualized display. In most GPU-powered databases, the data initially resides in central-processing unit (CPU) memory, is transferred to the GPU for a query, and the results are moved back to the CPU for storage. Instead, MapD caches as much data as possible in the memory of multiple GPUs, so if the database needs to query the same data point repeatedly, it can access that data point directly in the GPU's random-access memory. In a recent test, the system analyzed a 1.2-billion-record New York City taxi dataset; MapD ran 74 times faster than several advanced CPU database systems and completed several queries within milliseconds. The U.S. Central Intelligence Agency has invested in MapD to accelerate development of certain features for the intelligence community, and financial services firms, advertising agencies, and social media companies also are potential clients for the platform.
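
The article does not detail MapD's internals, but the caching idea it describes, keeping hot data resident in GPU memory so repeated queries avoid CPU-to-GPU transfers, can be illustrated with a small CuPy sketch; the column, data, and query below are invented for illustration.

```python
# Illustrative CuPy sketch, not MapD's implementation: cache a hot column in GPU
# memory once, then run repeated filter/aggregate queries against it so only
# scalar results cross back to the CPU. Column name and data are made up.
import numpy as np
import cupy as cp  # requires an NVIDIA GPU and the CuPy package

# Host-side "table" column, e.g., taxi trip distances in miles.
trip_distance_host = np.random.exponential(scale=2.5, size=10_000_000).astype(np.float32)

# One-time transfer: cache the column in GPU device memory.
trip_distance_gpu = cp.asarray(trip_distance_host)

def avg_distance_over(threshold: float) -> float:
    # Repeated queries operate directly on the GPU-resident column.
    selected = trip_distance_gpu[trip_distance_gpu > threshold]
    return float(selected.mean()) if selected.size else 0.0

# Each call reuses the cached column instead of re-copying it from the CPU.
for t in (1.0, 5.0, 10.0):
    print(f"avg distance of trips over {t} miles: {avg_distance_over(t):.2f}")
```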


Graphene Temporary Tattoo Tracks Vital Signs
IEEE Spectrum (01/11/17) Katherine Bourzac

University of Texas at Austin researchers are developing graphene-based health sensors that stick to a person's skin like a temporary tattoo and take measurements with the same precision as conventional medical equipment. The graphene tattoos are about 0.3 nanometers thick, making them the thinnest epidermal electronics ever created, and they can measure electrical signals from the heart, muscles, and brain, as well as determine skin temperature and hydration levels. The researchers want to develop a system that takes measurements of the same quality as or better than electrocardiogram monitoring technology, but that is unobtrusive. The researchers first grew single-layer graphene on a sheet of copper, coated the two-dimensional carbon sheet with a stretchy support polymer, and etched away the copper. The polymer-graphene sheet was then placed on temporary tattoo paper, and the graphene was carved to make electrodes with stretchy spiral-shaped connections between them. In a proof-of-concept experiment, the researchers used the graphene tattoos to take five kinds of measurements and compared the data with results from conventional sensors.


Air Force Tests IBM's Brain-Inspired Chip as an Aerial Tank Spotter
Technology Review (01/11/17) Andrew Rosenblum

The U.S. Air Force Research Lab (AFRL) is exploring whether brain-inspired computer chips could give satellites, aircraft, and drones the ability to automatically identify vehicles. IBM's neuromorphic TrueNorth chip has successfully detected military and civilian vehicles in radar-generated aerial imagery while using less energy than a regular high-performance computer. The chip processes data using a network of elements designed to mimic neurons and synapses that store and operate on data, making the chip more efficient than conventional systems, in which components that perform calculations are separate from memory. AFRL conducted contests between TrueNorth and the Jetson TX-1, a high-powered NVIDIA computer. The systems used different implementations of neural network-based image-processing software to identify 10 types of military and civilian vehicles. Both systems achieved about 95-percent accuracy, but the TrueNorth chip used between 3 percent and 5 percent of the power required by the NVIDIA computer. The conventional computer ran its neural network on chips with general-purpose hardware, while the IBM chip's hardware is hard-coded to represent artificial neural networks using 1 million neurons customized to the task. However, it is much easier to deploy neural networks on conventional computers due to a wider availability of software, and the IBM chip is much more expensive.


NRAM Set to Spark a 'Holy War' Among Memory Technologies
Computerworld (01/12/17) Lucas Mearian

A non-volatile memory technology based on carbon nanotubes should be more disruptive to enterprise storage, servers, and consumer electronics than flash memory when it is commercialized in 2018, according to a new BCC Research report. Nano random-access memory (NRAM), created by Nantero in 2001, reportedly offers up to 1,000 times the performance of dynamic RAM while retaining data when the power is off. NRAM is composed of an interlocking fabric matrix of carbon nanotubes that can be either touching or slightly separated. Each NRAM "cell" consists of a network of nanotubes between two metal electrodes, and the memory functions the same way as other resistive non-volatile RAM technologies. BCC Research's Chris Spivey expects NRAM to trigger a "holy war" with many new memory technologies, such as ferroelectric RAM and phase-change memory. Spivey says NRAM's most interesting feature is its transition from silicon to carbon-based memory, "which can evidently be carried out seamlessly on traditional [complementary metal-oxide semiconductor] foundries and also, it seems, even in logic foundries. This ushers in an era of potential mass customization." BCC Research says this customization will facilitate innovations such as inexpensive autonomous Internet of Things sensors and memory for the smartphone industry, embedded application-specific integrated circuits for cars, and headphones with built-in music storage.


Will Artificial Intelligence Help to Crack Biology?
The Economist (01/05/17)

There is growing interest in using artificial intelligence (AI) to solve the riddle of biological complexity and advance human health. Advocates say AI can "ingest" all kinds of content to accumulate knowledge, make connections, and formulate hypotheses. One example is BenevolentAI, a company that uses a natural-language processing machine-learning system to mine chemical libraries, medical databases, and scientific papers for potential drug molecules. Larger efforts include a partnership between IBM and Pfizer in which IBM's Watson system is being applied to accelerate drug discovery in immuno-oncology. Examples of AI in clinical care include a recent study published in the Journal of the American Medical Association demonstrating the use of AI to identify several causes of blindness in retinal images. AI also is proving helpful in dealing with complexity in experimental studies. A system at BERG Health, for example, studies all types of data to model a network of protein interactions underlying a specific disease, to be tested by researchers in a real biological system. Still, scientists caution AI has not yet reached a level of molecular refinement that would enable it to model the inner workings of a single cell.


Software System Labels Coral Reef Images in Record Time
UCSD News (CA) (01/10/17) Ioana Patringenaru

University of California, San Diego (UCSD) researchers have released a new version of a software system that processes images from the world's coral reefs up to 100 times faster than manual data processing. The new version, CoralNet Beta, includes deep-learning technology that uses large networks of artificial neurons to learn to interpret image content and process data. CoralNet Beta reduces the time needed to go through a typical 1,200-image diver survey of the ocean floor from 10 weeks to one week, while maintaining the same level of accuracy. Coral ecologists and government organizations use CoralNet to automatically process images from autonomous underwater vehicles. The system provides more than 2,200 labels, enabling researchers to distinguish different types of coral and whether they have been bleached, as well as different types of invertebrates and algae. "This will allow researchers to better understand the changes and degradation happening in coral reefs," says UCSD professor David Kriegman. CoralNet Beta runs on a deep neural network with more than 147 million neural connections. "We expect users to see a very significant improvement in automated annotation performance compared to the previous version, allowing more images to be annotated quicker," says CoralNet founder Oscar Beijbom.
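
CoralNet's own network and training setup are not described in the article; the sketch below shows the general pattern of fine-tuning a pretrained convolutional network on labeled image patches, with the folder layout, label names, and hyperparameters being assumptions for illustration rather than CoralNet's actual configuration.

```python
# Hypothetical sketch, not CoralNet's pipeline: fine-tune a pretrained CNN to
# classify labeled reef-image patches. The folder layout, label set, and
# hyperparameters are illustrative assumptions.
import torch
from torch import nn
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Expects patches organized as patches/<label_name>/<image>.png,
# e.g., assumed labels like "coral_bleached", "algae_turf", "invertebrate".
train_set = datasets.ImageFolder("patches", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the final layer
# with one output per reef label.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for patches, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(patches), labels)
        loss.backward()
        optimizer.step()
```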


Baidu's Andrew Ng Talks AI With Gigaom
Gigaom (01/11/17) Byron Reese

In an interview, Baidu chief scientist and Stanford University professor Andrew Ng says his team is developing basic artificial intelligence (AI) technology to support Baidu's internal businesses and to cultivate new opportunities for AI. Ng notes much of the team's early work in face recognition and neural machine translation was fundamental research that transitioned into practical commercial applications. Current AI challenges Ng finds exciting include transfer learning, in which knowledge learned for one task is applied to another. Ng says computer science principles have driven AI progress to a much greater degree than neuroscience principles. "It turns out that our tools for advancing a piece of AI technology tend to work better when we're trying to automate a task that a human can do, rather than try to tackle a task that even humans cannot do," he notes. Ng also stresses AI researchers should not overreach, but instead concentrate on tangible goals that he believes are better for the world in general. "That's actually more productive...than spending all of our time building toward a science fiction which might not come to fruition for maybe even hundreds of years," he says. Ng notes some current machine behavior could be categorized as "somewhat creative," and he expects AI to make incremental progress over the next several years.


Abstract News © Copyright 2017 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]