Welcome to the February 14, 2018 edition of ACM TechNews, providing timely information for IT professionals three times a week.


TPUs Go Public on Google Cloud
Top 500
Michael Feldman
February 13, 2018


Google has announced a beta program to make its in-house Tensor Processing Units (TPUs) available to cloud customers, offering a TPU board constructed from four custom application-specific integrated circuits that provide up to 180 machine-learning teraflops and 64 GB of high-bandwidth memory. The board is designed to run computationally intensive machine-learning algorithms that support Web applications including language translation, text searching, and ad serving. Google says the TPU can expedite these workloads while consuming far less power than conventional graphics processing units (GPUs) and central processing units (CPUs). Google has open-sourced a number of machine-learning models for the TPU, such as ResNet-50 for image classification, Transformer for language processing, and RetinaNet for object detection. For applications outside those domains, TPUs can be programmed at a lower level using the TensorFlow application programming interfaces.

Full Article

Technological Breakthrough for Monitoring and Predicting Landslides
Victoria University
February 13, 2018


Researchers at Victoria University in Australia say they have developed AccuMM, an automated solution for the long-term monitoring of landslides. The team notes AccuMM uses low-cost solar- or battery-powered wireless global-positioning system sensors and a specialized, cloud-based algorithm to calculate the location of each sensor relative to a fixed base station. This system enables daily measurements to be taken at multiple points on a landslide without the need for human site visits. In addition, the researchers say the system has no line-of-sight or cabling requirements, and requires no intervention at the site for five years or more. The researchers ran a pilot test of AccuMM in Taiwan, and now they are trialing the system in areas of Australia where landslides have occurred. "We can power the wireless network by energy harvesting, which means our system can operate for long duration to meet the monitoring needs of geotechnical engineers," notes Victoria University professor Winston Seah.
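The core comparison described above, daily sensor fixes checked against a fixed base station, can be sketched as a toy calculation. AccuMM's actual cloud algorithm is not public; the sensor IDs, coordinates, and 10 mm threshold below are all hypothetical.

```python
import math

def displacement_mm(baseline_enu, current_enu):
    """Straight-line displacement (mm) between two local east/north/up
    positions, each expressed in millimetres relative to the base station."""
    return math.dist(baseline_enu, current_enu)

def flag_moving_sensors(baselines, todays_fixes, threshold_mm=10.0):
    """Return IDs of sensors whose daily GPS fix has drifted more than
    threshold_mm from the surveyed baseline -- a candidate landslide signal."""
    return [sid for sid, base in baselines.items()
            if displacement_mm(base, todays_fixes[sid]) > threshold_mm]

# Two sensors on a slope; S2 has crept about 17 mm since the baseline survey.
baselines = {"S1": (0.0, 0.0, 0.0), "S2": (500.0, 120.0, 30.0)}
today = {"S1": (2.0, 1.0, 0.5), "S2": (512.0, 111.0, 22.0)}
print(flag_moving_sensors(baselines, today))  # ['S2']
```

The real system's difficulty lies in getting millimetre-level relative positions from low-cost GPS receivers in the first place; once those fixes exist, the monitoring logic is this simple.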

Full Article

Software Package Processes Huge Amounts of Single-Cell Data
Helmholtz Association of German Research Centers
February 13, 2018


Researchers at the Helmholtz Zentrum München (the German Research Center for Environmental Health) in Germany have developed Scanpy, a software program that can manage enormous single-cell gene-expression datasets. "For this project, and in a growing number of other projects in which databases are combined, it is important to have scalable software," says University of Munich professor Fabian Theis, noting it is therefore no surprise that Scanpy is a candidate for helping to analyze the Human Cell Atlas. Theis says the publication of Scanpy represents the first time software has been developed to enable comprehensive analysis of large gene-expression datasets with a broad range of machine-learning and statistical methods. Scanpy is based on Python, the dominant language in the machine-learning community. In addition, Theis says Scanpy relies on graph-based algorithms, differentiating it from other biostatistics programs, which are traditionally written in the R programming language.
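The graph-based idea mentioned above can be illustrated, without reproducing Scanpy itself, by a minimal standard-library sketch: build a k-nearest-neighbor graph over per-cell expression vectors, the structure on which clustering and trajectory methods then operate. The cell IDs and expression values below are invented toy data.

```python
import math

def knn_graph(cells, k=2):
    """Build a k-nearest-neighbor graph over per-cell expression vectors:
    link each cell to the k cells with the closest expression profiles
    (Euclidean distance). Returns {cell_id: [neighbor_ids]}."""
    graph = {}
    for cid, vec in cells.items():
        dists = sorted(
            (math.dist(vec, other), oid)
            for oid, other in cells.items() if oid != cid
        )
        graph[cid] = [oid for _, oid in dists[:k]]
    return graph

# Four cells measured on three genes, forming two clear expression clusters.
cells = {
    "c1": (5.0, 0.1, 0.2), "c2": (4.8, 0.0, 0.3),
    "c3": (0.1, 6.0, 5.9), "c4": (0.2, 5.8, 6.1),
}
print(knn_graph(cells, k=1))  # each cell links to its cluster partner
```

Real single-cell pipelines apply the same construction to tens of thousands of cells after dimensionality reduction, then run community detection on the resulting graph.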

Full Article
Universities Rush to Roll Out Computer Science Ethics Courses
The New York Times
Natasha Singer
February 12, 2018


U.S. universities are starting to offer ethics courses relating to computer science, with the hope of training next-generation technologists and policymakers to weigh the social and moral ramifications of innovations before they are commercialized. One factor driving this trend is the popularization of tools such as machine learning, which have the potential to significantly change human society. "We need to at least teach people that there's a dark side to the idea that you should move fast and break things," says New York University's Laura Noren. "You can patch the software, but you can't patch a person if you...damage someone's reputation." A joint Harvard University-Massachusetts Institute of Technology course concentrates on the ethical, policy, and legal implications of artificial intelligence. The course also covers the proliferation of algorithmic risk scores that use data to predict whether someone is likely to commit a crime.

Full Article
*May Require Free Registration
Swirly Skyrmions Could Be the Future of Data Storage
IEEE Spectrum
Dexter Johnson
February 13, 2018


Researchers at France's CNRS/Thales lab, a joint laboratory of industrial company Thales and the French National Center for Scientific Research, are moving toward commercializing skyrmion-based magnetic data storage by electrically detecting a single skyrmion at room temperature for the first time. CNRS' Vincent Cros says these signals are so tiny that the team had to be sure the measured electrical signal was really associated with the presence of a skyrmion. "That is exactly what we demonstrate here by a concomitant electrical measurement and magnetic imaging on the very same devices," Cros notes. His team suggests a "racetrack memory" would be the optimal technology. "We have employed a new approach in which we inject short current pulses into the materials, which allows us to create isolated skyrmions located in a strip [or track] designed by electron-beam lithography," Cros says. This enables the adjustment of the total nucleated skyrmions by tuning different parameters, such as the current pulse width or the intensity of the external magnetic field.

Full Article

Energy-Efficient Encryption for the Internet of Things
MIT News
Larry Hardesty
February 12, 2018


Researchers at the Massachusetts Institute of Technology have constructed a chip that performs public-key encryption while consuming only 1/400 as much power as software execution of the same protocols, using about 1/10 the memory, and executing 500 times faster. This general-purpose elliptic-curve chip has a modular multiplier that can handle 256-bit numbers, and a special-purpose inverter circuit enlarges the chip's surface area by 10 percent while halving power consumption. The datagram transport layer security protocol is hardwired into the chip, dramatically cutting the amount of memory required for its execution. Also integrated into the chip is a general-purpose processor that can be used in tandem with the dedicated circuitry to carry out other elliptic-curve-based security protocols, but it can be powered down when not in use so it will not compromise energy efficiency. The researchers envision the chip as a mechanism for efficient encryption of Internet of Things devices.
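The elliptic-curve operations such a chip accelerates can be sketched in a few lines. This toy uses a hypothetical 7-bit prime rather than the 256-bit modulus the chip's multiplier handles, and plain double-and-add rather than any hardened implementation; the curve parameters and base point are invented for illustration.

```python
# Toy curve y^2 = x^3 + Ax + B over GF(P); not a real cryptographic curve.
P = 97
A, B = 2, 3

def ec_add(p1, p2):
    """Add two curve points; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # p2 is the inverse of p1
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    """Double-and-add scalar multiplication: the core public-key operation."""
    result, addend = None, point
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result

G = (0, 10)  # on the curve, since 10^2 = 100 = 3 = 0^3 + 2*0 + 3 (mod 97)
print(ec_mul(5, G))
```

The modular inversions (`pow(..., -1, P)`) are exactly the operations the article's dedicated inverter circuit exists to make cheap in hardware.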

Full Article

Chat Tool Simplifies Tricky Online Privacy Policies
University of Wisconsin-Madison
Renee Meiller
February 13, 2018


Researchers at the University of Wisconsin-Madison, the University of Michigan, and the Swiss Federal Institute of Technology in Lausanne, Switzerland, have developed Pribot, an online chatbot that can answer questions about specific privacy policies in simple language. Pribot looks and acts like a text-messaging conversation, complete with blue and gray speech bubbles and emojis, and if it cannot find the website the user is asking about, it volunteers to search out and add it to its database, which currently includes more than 14,000 privacy policies. The researchers also created Polisis, a tool that shows users what types of data companies collect, how and why they share it, how they secure that information, what data they store, and how children's data is treated. Pribot and Polisis rely on artificial intelligence and natural language processing, and the researchers also are collecting data from user interaction with the two new tools in an effort to improve their performance.

Full Article
NCSA Allocates Over $2.4 Million in New Blue Waters Supercomputer Awards to Illinois Researchers
HPCwire
February 13, 2018


Fifteen research teams at the University of Illinois at Urbana-Champaign have been awarded computation time on the National Center for Supercomputing Applications' sustained-petascale Blue Waters supercomputer. The allocations range from 75,000 to 582,000 node-hours of compute time over either six months or one year, for a grand total of nearly 4 million node-hours. The research goals of these teams range from studies on tiny HIV capsids to massive binary star mergers. Blue Waters is one of the world's most powerful supercomputers and is capable of sustaining 1.3 quadrillion calculations every second, and at peak speed can reach a rate of 13.3 petaflops. The system's massive scale and balanced architecture provide researchers with the ability to address problems that could not be solved with other computing systems. To date, the Blue Waters system has allocated a total of 64.6 million node-hours to Illinois-based researchers through the Illinois General Allocations program.

Full Article
Stanford Researchers Develop New Method for Waking Up Small Electronic Devices
Stanford News
Taylor Kubota
February 12, 2018


Researchers at Stanford University are developing a wake-up receiver that can switch a shut-off device back on at a moment's notice. They say the new receiver could serve as a solution for extending the battery life of wireless devices. The receiver turns on a device in response to incoming ultrasonic signals outside the range of human hearing. Because it uses ultrasound rather than radio waves, the receiver works at a significantly smaller wavelength, which enables the new system to be much smaller while operating at extremely low power and with extended range compared to similar wake-up receivers. After it is attached to a mobile device, the wake-up receiver listens for a unique ultrasonic pattern that tells it when to turn on the device. The system requires only a very small amount of power to maintain this constant listening, meaning it can still save energy overall while extending the battery life of the host device.
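The "listen for a unique pattern" step can be illustrated with a toy digital sketch. Assume the incoming ultrasound has already been hardened into a 1-bit sample stream; the signature, the stream contents, and the error tolerance below are all hypothetical, and the Stanford receiver performs this matching in low-power analog hardware rather than software.

```python
def matches_signature(samples, signature, tolerance=1):
    """Slide the wake-up signature over a stream of 1-bit samples and
    report whether any window differs in at most `tolerance` positions."""
    n = len(signature)
    return any(
        sum(a != b for a, b in zip(samples[i:i + n], signature)) <= tolerance
        for i in range(len(samples) - n + 1)
    )

signature = [1, 0, 1, 1, 0, 0, 1, 0]
noise = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
stream = noise + signature + noise

print(matches_signature(stream, signature))  # True -- wake the device
print(matches_signature(noise, signature))   # False -- stay asleep
```

The design tension the article describes is exactly here: the matcher must run continuously, so every joule it consumes while listening is a joule it must save by keeping the host radio powered off.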

Full Article
New Process Allows 3D Printing of Nanoscale Metal Structures
Caltech News
Robert Perkins
February 9, 2018


Researchers at the California Institute of Technology (Caltech) say they have developed a new method to three-dimensionally (3D) print complex nanoscale metal structures. The team employed organic ligands to produce a resin, mostly polymer but also containing metal, that can be printed in the manner of a scaffold. Caltech's Andrey Vyatskikh notes he bonded nickel and organic molecules to create a liquid, then used computer software to design a structure, which was built by striking the liquid with a two-photon laser. Vyatskikh says the molecules also were bound to the nickel atoms, enabling the researchers to print a 3D structure that was initially a combination of metal ions and organic molecules. Vyatskikh then heated the structure to 1,000 degrees Celsius, vaporizing the organic materials so only the fused metal remained. The structure's dimensions shrank by 80 percent, but it kept its shape and proportions.

Full Article
Which Programming Language Is Best for Big Data?
Datanami
Alex Woodie
February 12, 2018


The key factor in choosing a programming language for a big data project is the goal at hand, and Python is currently the most popular language in the data science exploration and development stage. Also popular in this area is R, while the SAS environment is still popular among business analysts and MATLAB also is frequently used for the exploration and discovery phase. Another determinant of a data science language can be what notebook a researcher is using: Jupyter is the heir to the IPython notebook, and has close alignment with Python while also supporting R, Scala, and Julia. Coders often choose a different set of languages for developing production analytics and Internet of Things applications, as they frequently rewrite the app and re-deploy the machine-learning algorithms using different languages than those used during experimentation. When speed and latency are imperatives, many programmers opt for C and C++.

Full Article
New Data Reconfirms Well-Known Gender Problem in Tech
The Daily of the University of Washington
Mitali Palekar
February 8, 2018


A study of tech talent migration suggests technology workers were likely to gain significant pay increases when they moved. Although the data indicates a strong positive benefit to moving to large technology hubs, it also shows the benefits of the migration greatly favor men. The study found that for every four men who moved to Seattle for a new technology job, only one woman did. Only 32 percent of University of Washington computer science graduates are currently women, and only 25 percent of professional computing jobs in the U.S. are held by women. Many of today's college students recognize that the pipeline problem for women in computer science starts at the grassroots level, as there is a common perception among young girls that computer science and technology are not meant for them. In recent years, several initiatives have been fostered within computer science with the goal of improving diversity within the technology sector.

Full Article

Black in AI's Founder on What's Poisoning the Algorithms in Our Lives
Technology Review
Jackie Snow
February 14, 2018


In an interview, Microsoft researcher Timnit Gebru cites the urgent need for diversity to counter the infiltration of bias into artificial intelligence (AI) systems. "We are in a diversity crisis for AI," Gebru contends. "In addition to having technical conversations, conversations about law, conversations about ethics, we need to have conversations about diversity in AI." Diversifying datasets is one aspect of this movement, with Gebru also emphasizing the absence of clear guidelines on the application of machine learning. "AI is just now starting to be baked into the mainstream, into a product everywhere, so we're at a precipice where we really need some sort of conversation around standardization and usage," she notes. Gebru says of particular interest to her is persuading companies to give more information to users or scientists, noting, "we're in a place almost like the Wild West, where we don't really have many standards [about] where we put out datasets."

Full Article

Association for Computing Machinery

2 Penn Plaza, Suite 701
New York, NY 10121-0701
1-800-342-6626
(U.S./Canada)



ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]