Welcome to the June 7, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
First Quantum-Secured Blockchain Technology Tested in Moscow
Technology Review
June 6, 2017

Researchers at the Russian Quantum Center in Moscow have designed, constructed, and tested the world's first quantum blockchain system using a standard, off-the-shelf quantum cryptography solution. Evgeny Kiktenko and colleagues devised a method to thwart brute-force attacks by incorporating quantum key distribution, ensuring the transaction data sender and recipient can confirm each other's identities. Quantum key distribution transmits information as quantum particles that cannot be duplicated by an eavesdropper without destroying them. Kiktenko says the system attaches a quantum signature to each transaction, ensuring it is tamper-proof. The team notes the system uses Swiss company ID Quantique's quantum cryptography system. "We have developed a blockchain protocol with information-theoretically secure authentication based on a network in which each pair of nodes is connected by a quantum key distribution link," the researchers say. They also tested the system in a four-user network, and foiled double-spending attempts while permitting the generation of a block featuring only legitimate transactions.
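The "information-theoretically secure authentication" the researchers describe can be illustrated with a Wegman-Carter-style message authentication code, in which each transaction tag consumes fresh shared secret key material of the kind a QKD link supplies. This is a generic sketch of that idea, not the authors' actual protocol; the prime and key-handling details here are assumptions.

```python
# Illustrative sketch (not the Moscow team's protocol): information-
# theoretically secure authentication in the Wegman-Carter style. Each
# pair of nodes shares fresh secret key material (as QKD would supply);
# key values are used once per transaction and never reused.
P = (1 << 61) - 1  # a Mersenne prime; tags live in GF(P)

def poly_hash(message: bytes, r: int) -> int:
    """Evaluate the message's bytes as a polynomial at the secret point r."""
    h = 0
    for b in message:
        h = (h * r + b) % P
    return h

def make_tag(message: bytes, key_r: int, key_s: int) -> int:
    # key_r selects the hash function; key_s one-time-pads the result,
    # so even an attacker with unlimited compute learns nothing reusable.
    return (poly_hash(message, key_r) + key_s) % P

def verify(message: bytes, tag: int, key_r: int, key_s: int) -> bool:
    return make_tag(message, key_r, key_s) == tag
```

Because security rests on the one-time key rather than a hard computational problem, a quantum computer gains no advantage, which is the property the blockchain design relies on.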

Full Article

Cybersecurity Researchers Claim Every Network Router at Risk of Secretly Leaking Data
Conner Forrest
June 6, 2017

Researchers at the Ben-Gurion University of the Negev (BGU) Cyber Security Research Center in Israel have demonstrated a method for stealing information by exploiting a network router's light-emitting diodes (LEDs). The researchers say they employed their proprietary xLED malware to hijack the router and commandeer the LEDs to flash a pattern that transmits data. "Sensitive data can be encoded and sent via the LED light pulses in various ways," notes BGU's Mordechai Guri. "An attacker with access to a remote or local camera, or with a light sensor hidden in the room, can record the LED's activity and decode the signals." The researchers say xLED can make routers leak data at rates from 10 bits/second to more than 1 Kbit/second. The malware also can make the LEDs pulse at more than 1,000 flickers per second for each light. The researchers note they used a drone to steal data from LEDs on an air-gapped computer.
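The core encoding trick Guri describes can be sketched with simple on-off keying: data bits map one-to-one onto LED states per clock tick, and a camera or light sensor sampling the LED inverts the mapping. This is a hypothetical illustration of the principle, not the xLED malware itself.

```python
# Hypothetical illustration of the encoding idea (not the actual xLED
# malware): on-off keying maps each data bit to one LED state per clock
# tick; an observer sampling the LED reverses the mapping to recover bytes.
def bits_from_bytes(data: bytes):
    """Yield the bits of each byte, most significant bit first."""
    for byte in data:
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def encode_ook(data: bytes) -> list[int]:
    """1 -> LED on for one tick, 0 -> LED off for one tick."""
    return list(bits_from_bytes(data))

def decode_ook(states: list[int]) -> bytes:
    """Regroup sampled LED states into bytes, 8 states per byte."""
    out = bytearray()
    for i in range(0, len(states) - 7, 8):
        byte = 0
        for bit in states[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)
```

At the article's quoted rate of 1,000 flickers per second per LED, this naive one-bit-per-tick scheme would carry roughly 1 Kbit/second per light, matching the upper figure the researchers report.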

Full Article

Researchers Use AI to Dramatically Increase Image Clarity Under Severe Conditions
Tokyo Institute of Technology
June 5, 2017

Researchers at the Tokyo Institute of Technology in Japan have developed technology that uses artificial intelligence (AI) to derive greater image visibility from devices such as thermal and x-ray cameras. The researchers say the technology automatically selects highly visible parts from multiple images and combines them, while enhancing the smallest characteristics contained in non-visible images. After the AI carries out a detailed examination of each image to assess the degree of visibility of each part, the system automatically extracts the best areas from each image, taking environmental characteristics such as brightness, the direction of light, and obstacles into account. "This technology eliminates the need for...manual work, using AI to effectively and automatically combine images taken by different cameras," says Tokyo Institute professor Masatoshi Okutomi. "This also increases visibility by actively utilizing the strong points of each visible image and non-visible image, even when the images are difficult to visualize."
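The selection step described above can be sketched as tile-wise fusion: for each region, keep the tile from whichever registered image shows more local detail. This toy uses local standard deviation as its only visibility cue, whereas the actual system weighs richer cues (brightness, light direction, obstacles); the details here are assumptions for illustration.

```python
# Minimal sketch of visibility-based image fusion (assumed details, not
# the Tokyo Tech system): given two registered grayscale images as nested
# lists, keep each tile from whichever image has higher local contrast.
import statistics

def fuse_by_contrast(img_a, img_b, tile=2):
    h, w = len(img_a), len(img_a[0])
    out = [row[:] for row in img_a]
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            def block(img):
                return [img[j][i]
                        for j in range(y, min(y + tile, h))
                        for i in range(x, min(x + tile, w))]
            # Keep the tile from the image with more local detail.
            src = img_b if statistics.pstdev(block(img_b)) > statistics.pstdev(block(img_a)) else img_a
            for j in range(y, min(y + tile, h)):
                for i in range(x, min(x + tile, w)):
                    out[j][i] = src[j][i]
    return out
```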

Full Article
Scientists Develop Divide and Conquer Approach for More Stable Power Generation
June 7, 2017

Researchers at the University of Connecticut (UConn) and international engineering firm ABB have developed a two-pronged approach to keep power generation stable despite the variability of wind as a renewable resource. UConn professor Bing Yan says the idea is to couple each remote wind farm with a sufficiently large and not necessarily co-located conventional power generation unit. She notes the researchers use an algorithm to virtually relocate the traditional power generation units in relation to their wind counterparts. Yan says the approach computationally reduces the distance, which reduces the need for expensive high-capacity batteries to store wind-generated power. The team says the new technique splits power generation of conventional units into two components--one that estimates future wind states based only on current states, and another that provides limitations, based on global information, for extreme wind states. "Our approach provides an efficient way to dampen the effects of wind uncertainties," Yan says.

Full Article

An IBM Breakthrough Ensures Silicon Will Keep Shrinking
Brian Barrett
June 5, 2017

IBM researchers have announced a new transistor design they say will ensure the continued miniaturization of computer chips via horizontal layering of silicon nanosheets. The researchers say their approach overcomes the limitations of FinFET transistors, whose performance taps out around the 5-nanometer scale. "You can imagine that FinFET is now turned sideways, and stacked on top of each other," says IBM Research's Mukesh Khare. VLSI Research CEO Dan Hutcheson says this is a major milestone, noting, "If I can make the transistor smaller, I get more transistors in the same area, which means I get more compute power in the same area." IBM says the design yields either a 40-percent performance gain at the same power level, or a 75-percent cut in power required at the same performance. The researchers expect the breakthrough to meet mass manufacturing's demands in several years, powering autonomous cars, on-board artificial intelligence, and 5G sensor deployments planned by the tech industry.

Full Article
Why So Few Women Break Through Tech's Bro Culture
Laura Colby
June 2, 2017

Despite the technology industry's attempts to achieve gender equality, the sector suffers from chronic underrepresentation of women. Experts cite several possible reasons for this disparity, including the difficulty of changing a male-dominated culture, harassment, and the exclusion of women from the informal "buddy networks" that work to the advantage of male peers. It is unlikely the educational sector will correct this imbalance, because although women currently earn about half of all college degrees in science, technology, engineering, and math (STEM), the ranks of both female computer science and engineering graduates have declined significantly in recent years. Most tech-company executives say driving diversity initiatives is ultimately their responsibility, and experts suggest women can help counter the imbalance by volunteering for tough projects and accepting leadership positions when they are offered. Meanwhile, groups such as Girls Who Code and the Girl Scouts of America are focused on improving tech industry diversity by encouraging girls to study in STEM fields.

Full Article
Engineer Unveils New Spin on Future of Transistors With Novel Design
UT Dallas News Center
Amanda Siegfried
June 5, 2017

Researchers at the University of Texas at Dallas (UTD) have developed a computing system made exclusively from carbon, which they believe could replace the silicon transistor. The all-carbon spintronic switch functions as a logic gate that relies on the magnetic field generated when an electric current moves through a wire. In addition, the UTD researchers say a magnetic field near a graphene nanoribbon affects the current flowing through the ribbon. Transistors cannot exploit this phenomenon in silicon-based computers, but in the new spintronic circuit design, electrons moving through carbon nanotubes create a magnetic field that impacts the flow of current in a nearby graphene nanoribbon, providing cascaded logic gates that are not physically connected. Since the communication between each of the graphene nanoribbons takes place via an electromagnetic wave, the researchers predict communication will be much faster, with the potential for terahertz clock speeds.

Full Article
Catching the IMSI-Catchers: SeaGlass Brings Transparency to Cell Phone Surveillance
UW Today
Jennifer Langston
June 2, 2017

Researchers at the University of Washington (UW) have developed the SeaGlass system to spot cellular network anomalies that may signal where and when International Mobile Subscriber Identity-catchers are being used to pinpoint mobile phones, monitor conversations, or send spam. SeaGlass sensors built from commercially available components can be installed in vehicles, and they detect signals from existing cell tower networks, accumulating a baseline map of normal tower patterns. Using algorithms and other techniques devised to spot network irregularities that can reveal a simulator's presence, the UW team built statistical models to locate anomalies. SeaGlass sensors were deployed over two months in 15 ride-sharing vehicles in Milwaukee and Seattle to identify numerous anomalies consistent with patterns anticipated from cell-site simulators. "We're eager to push this out into the community and find partners who can crowdsource more data collection and begin to connect the dots in meaningful ways," says UW researcher Ian Smith.
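The baseline-then-flag approach described above can be sketched as a simple per-tower statistical model: accumulate normal signal observations for each tower, then flag a new reading whose deviation from its tower's baseline is extreme, or a tower never seen before. This toy z-score check stands in for the far richer models the UW team built.

```python
# Toy sketch of SeaGlass's baseline-then-flag idea (the UW statistical
# models are far richer): build per-tower baselines of observed signal
# strength, then flag readings that deviate sharply from the baseline.
import statistics

def build_baseline(observations):
    """observations: list of (tower_id, signal_dbm) pairs."""
    by_tower = {}
    for tower, dbm in observations:
        by_tower.setdefault(tower, []).append(dbm)
    # Guard against zero spread when a tower has near-constant readings.
    return {t: (statistics.mean(v), statistics.pstdev(v) or 1.0)
            for t, v in by_tower.items()}

def is_anomalous(baseline, tower, dbm, z_threshold=3.0):
    if tower not in baseline:      # a never-seen tower is itself suspicious
        return True
    mean, std = baseline[tower]
    return abs(dbm - mean) / std > z_threshold
```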

Full Article
Brain-Inspired Computing Pushes the Boundaries of Technology
R&D Magazine
James B. Aimone
June 2, 2017

The prospect of true brain-inspired computing capabilities is looming, and the sources of inspiration include cognitive function, brain circuitry, and the novel characteristics of neurons and synapses, writes Sandia National Laboratories' James B. Aimone. "Recent advances in each of these domains have made brain-inspired computing more attractive, particularly in light of the growing demands for computing to analyze and interpret vast amounts of data," Aimone says. However, he notes there is still a wide gap between neuroscience and computer science. "For this reason, many of the underlying research questions around neural computing center around what aspects of biological neural computing should be emulated, as how to build neural architectures is increasingly well understood," Aimone says. He also points to the computer science community's increasing appreciation for machine learning as the logical initial neural hardware implementation. Aimone thinks the technology could mimic the evolution of graphics-processing units, with specialized applications initially, followed by broader utility later.

Full Article
How Flattening Our Dimension Can Bring Better Graphics Into the Fold
Texas A&M Engineering News
Rachel Rose
May 31, 2017

Researchers at Texas A&M University are studying how to create more realistic graphics by changing how three-dimensional (3D) models are made. To create 3D models, thin sheets of material are layered together to form a single object, but Texas A&M professor Scott Schaefer wants to peel back those layers to create the most realistic graphics ever. The approach involves parameterization, or the flattening of a piece of a surface to the two-dimensional (2D) domain. "This flattening of a shape creates a map between the 2D image and the 3D shape," Schaefer says. Although this can introduce some distortion into the shape and most current parameterization methods attempt to reduce this distortion, Schaefer's research centers on how the distortion is measured and optimized. The researchers have developed a method for efficiently optimizing parameterizations that produce a bijection, a mathematical property in which triangles do not fold over or overlap in any way during the flattening process.
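The fold-over condition the bijection property rules out has a standard check: after flattening, a triangle whose 2D signed area flips sign (or collapses to zero) has folded over, so a valid parameterization keeps every signed area strictly positive. A minimal sketch of that check, under the usual counterclockwise-orientation convention:

```python
# Sketch of the fold-over test behind a bijective parameterization
# (standard computational-geometry check, not Schaefer's optimizer):
# every flattened triangle must keep strictly positive signed area.
def signed_area(a, b, c):
    """Twice the signed area of triangle (a, b, c) in the 2D plane."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def is_fold_free(vertices, triangles):
    """vertices: list of (x, y) after flattening; triangles: index triples."""
    return all(signed_area(vertices[i], vertices[j], vertices[k]) > 0
               for i, j, k in triangles)
```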

Full Article
Beyond Scaling: An Electronics Resurgence Initiative
June 1, 2017

The U.S. Department of Defense's proposed fiscal 2018 year budget includes $75 million for the Defense Advanced Research Projects Agency's (DARPA) "electronics resurgence" effort. The initiative aims to catalyze electronic performance upgrades via new materials, designs, and frameworks for microsystems, which will guarantee continued improvements beyond traditional scaling's benefits. For new materials, the initiative will explore the use of unconventional circuit components to boost performance without requiring smaller transistors. Circuit structures optimized for specific tasks will comprise the architectural segment, while the design portion will concentrate on rapid specialized circuit design and generation. DARPA's Bill Chappell says the purpose of the resurgence initiative is "to embrace progress through circuit specialization and to wrangle the complexity of the next phase of advances, which will have broad implications on both commercial and national defense interests." Chappell says DARPA looks forward to collaborating with commercial firms, academia, defense contractors, and other innovation producers on the resurgence.

Full Article

A Glove Powered by Soft Robotics to Interact With Virtual Reality Environments
UC San Diego News Center
Ioana Patringenaru
May 31, 2017

Researchers at the University of California, San Diego (UCSD) are using soft robotics technology to create light and flexible gloves that enable users to feel tactile feedback when they interact with virtual reality (VR) environments. The team says the gloves are equipped with a type of soft robotic component called a McKibben muscle, or latex chambers covered with braided fibers, which responds similarly to a spring to apply force when the user moves their fingers. The system involves three main components--a sensor that detects the position and movement of the user's hands, a custom fluidic control board that controls the gloves' movements, and soft robotic components in the glove that individually inflate or deflate to mimic the forces the user would encounter in the VR environment. "Our final goal is to create a device that provides a richer experience in virtual reality," says UCSD professor Michael Tolley.
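The three-component loop described above (hand tracker, fluidic control board, inflatable actuators) can be sketched as a per-tick mapping from tracked fingertip positions to actuator commands. The function and parameter names here are assumptions for illustration, not UCSD's actual software.

```python
# Hypothetical control-loop sketch (names are assumptions, not UCSD's API):
# each tick, the hand tracker reports fingertip positions; fingers that
# intersect a virtual object get their soft-robotic chamber inflated to
# simulate contact force, and all other chambers are deflated.
def control_step(finger_positions, inside_virtual_object):
    """Return per-finger actuator commands: 'inflate' or 'deflate'.

    finger_positions: dict finger_name -> (x, y, z) from the hand tracker
    inside_virtual_object: callable (x, y, z) -> True on virtual contact
    """
    return {finger: ("inflate" if inside_virtual_object(pos) else "deflate")
            for finger, pos in finger_positions.items()}
```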

Full Article

Natural Language Processing and Affective Computing
The Conversation
Mark Esposito; Kariappa Bheemaiah; Terence Tse
June 1, 2017

Natural-language processing (NLP) uses machine learning and artificial intelligence to comprehend unstructured text and context, enabling the parsing of information as it is received, write Harvard University professor Mark Esposito and colleagues. They note NLP functionality can be deconstructed into signal processing, syntactic analysis, semantic analysis, and pragmatics, and recent innovations in the last three areas have advanced the use of NLP for sentiment analysis, which "plays a very important role in decision-making and the ability of a machine to convert human language into machine readable code and convert it into actionable insights." Computerizing the interpretation of non-vocal cues and other human affects is the purpose of affective computing, which Esposito and colleagues describe as "the next step in the analysis of sentiments and emotions" into quantifiable and actionable results. "Automating this task to a computer is a true challenge for scientists trying to bridge the barrier between machine-human interfaces," they say.
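The sentiment-analysis step they describe can be illustrated with a deliberately tiny lexicon-based scorer that turns free text into a polarity number; the real semantic and pragmatic analysis the article discusses is of course far richer than this word-matching toy.

```python
# Deliberately tiny lexicon-based sentiment sketch -- a stand-in for the
# much richer semantic/pragmatic NLP the article describes -- mapping free
# text to a signed polarity score, with crude negation handling.
LEXICON = {"good": 1, "great": 2, "love": 2,
           "bad": -1, "terrible": -2, "hate": -2}
NEGATORS = {"not", "no", "never"}

def sentiment(text: str) -> int:
    score, negate = 0, False
    for token in text.lower().replace(".", " ").split():
        if token in NEGATORS:
            negate = True           # flip the polarity of the next hit
            continue
        if token in LEXICON:
            score += -LEXICON[token] if negate else LEXICON[token]
        negate = False
    return score
```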

Full Article

Association for Computing Machinery

2 Penn Plaza, Suite 701
New York, NY 10121-0701

ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]