Welcome to the January 18, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.
ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
China to Develop Prototype Super, Super Computer in 2017
Agence France-Presse (01/17/17)
China plans to develop a prototype exascale computer by the end of this year as it seeks to win the global race to build a supercomputer capable of a billion billion calculations a second. If successful, the achievement would solidify China's place as a leading power in supercomputing. "A complete computing system of the exascale supercomputer and its applications can only be expected in 2020, and will be 200 times more powerful than the country's first petaflop computer Tianhe-1, recognized as the world's fastest in 2010," says National Supercomputer Center researcher Zhang Ting. Zhang notes the exascale computer could have applications in big data and cloud computing research, and the prototype could lead the world in data-transmission efficiency as well as calculation speed. As of last June, China had more supercomputers in the Top500 ranking than the U.S., 167 to 165, according to Top500.org. Of the 10 fastest supercomputers, two are in China and five are in the U.S., with the others located in Japan and Switzerland.
Seeing the Quantum Future...Literally
University of Sydney (01/14/17) Vivienne Reiner
Researchers at the University of Sydney in Australia led by professor Michael J. Biercuk have used big data techniques to predict how quantum systems will change and then preemptively suppress decoherence. Vital to this milestone was devising a technique to predict the disintegration of a quantum system composed of quantum bits (qubits) formed from trapped ytterbium ions. Biercuk says the system's seemingly random behavior in fact contained enough information for a computer program to infer how it would change, enabling the program to predict future behavior without direct observation, which would otherwise trigger decoherence. The accuracy of these predictions enabled the research team to apply them preemptively to compensate for the expected changes. Performing this process in real time prevented decoherence and extended the longevity of the qubits. "We know that building real quantum technologies will require major advances in our ability to control and stabilize qubits--to make them useful in applications," Biercuk says. "Our techniques apply to any qubit, built in any technology, including the special superconducting circuits being used by major corporations."
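For illustration only, the following minimal Python sketch conveys the general idea of forecasting a slowly drifting qubit error from past measurements and applying a feed-forward correction before decoherence sets in; the drift model, the autoregressive predictor, and the correction step are assumptions for this example, not the Sydney team's actual method.

    # Illustrative sketch only: predict a qubit's slowly drifting phase error from
    # past measurements and apply a feed-forward correction before it decoheres.
    # The noise model, predictor, and correction are assumptions, not the published method.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(200)
    drift = 0.002 * t + 0.05 * np.sin(0.1 * t)          # hypothetical slow noise process
    measured = drift + rng.normal(0, 0.01, t.size)       # noisy phase-error observations

    order = 5                                            # autoregressive model order
    X = np.column_stack([measured[i:i - order] for i in range(order)])
    y = measured[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)       # fit an AR(5) predictor

    predicted_next = measured[-order:] @ coeffs          # forecast the next phase error
    correction = -predicted_next                         # pre-emptive compensation
    print(f"predicted drift {predicted_next:.4f}, apply correction {correction:.4f}")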
Scientists Propose a Novel Regional Path Tracking Scheme for Autonomous Ground Vehicles
Phys.org (01/16/17)
Researchers at Jilin University in China have proposed a model predictive control algorithm that enables autonomous ground vehicles (AGVs) to track a complex road by following the desired lateral position along the road centerline. The work is designed to overcome the drawbacks of the pure-pursuit tracking method, which ignores the size and shape of a fully automated vehicle and may therefore cause collisions on more complex roads. The algorithm also aims to prevent failures that arise when the centerline alone is used to describe the desired path and the width of the road is neglected. In the study, the road boundaries and the shape of the vehicle are taken into account. To follow the centerline within the given feasible region, the algorithm must minimize the difference between the predicted output and the road centerline while accounting for the saturation of the mechanical system and ensuring the AGV consumes less energy. The road boundaries and actuator saturation also are described as constraints. The system consists of an environment perception system and a driving control system. The driving control system runs on a single-board computer handling decision-making, planning, and control, while the environment perception system features lane-marking detection and preceding-vehicle recognition, which run on two different computers.
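As a rough illustration of the general model-predictive-control idea described above (not the Jilin algorithm or its vehicle model), the following Python sketch tracks a road centerline with a simplified kinematic bicycle model, a steering-saturation bound, and road boundaries treated as a soft constraint; the vehicle parameters and cost weights are assumed values.

    # Minimal sketch of one model predictive path-tracking step under assumed
    # parameters; illustrative only.
    import numpy as np
    from scipy.optimize import minimize

    v, L, dt, N = 10.0, 2.7, 0.1, 10          # speed (m/s), wheelbase (m), step (s), horizon
    delta_max = np.radians(25)                # steering saturation (actuator limit)
    y_ref = np.zeros(N)                       # desired lateral offset from the centerline
    half_width = 1.5                          # assumed free corridor around the centerline

    def rollout(deltas, y0=0.8, psi0=0.0):
        """Simulate lateral offset y and heading error psi over the horizon."""
        y, psi, ys = y0, psi0, []
        for d in deltas:
            y += v * dt * np.sin(psi)
            psi += v * dt / L * np.tan(d)
            ys.append(y)
        return np.array(ys)

    def cost(deltas):
        ys = rollout(deltas)
        tracking = np.sum((ys - y_ref) ** 2)          # stay on the centerline
        effort = 0.1 * np.sum(np.square(deltas))      # limit steering effort/energy
        boundary = 100.0 * np.sum(np.maximum(np.abs(ys) - half_width, 0) ** 2)
        return tracking + effort + boundary           # road boundaries as a soft constraint

    res = minimize(cost, np.zeros(N), bounds=[(-delta_max, delta_max)] * N)
    print("first steering command (rad):", res.x[0])  # apply, then re-plan at the next step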
Using Mathematics to Hunt for Computer Errors
Scilog (01/16/17)
The Austrian Science Fund is funding the development of methods, devised by researchers led by Institute of Science and Technology Austria professor Krishnendu Chatterjee, that apply mathematical analysis to enhancing software and hardware security. Chatterjee notes scientists use graph theory to mathematically analyze computer systems for verification, and he is particularly interested in the speed of these verification algorithms. "Some algorithmic problems in verification development have been stagnating since the 1990s," he says. "A new aspect of this project was to use cutting-edge approaches from graph theory to improve the algorithms." In collaboration with project partner Monika Henzinger at the University of Vienna, Chatterjee successfully surpassed several speed limits for certain verification algorithms, including those involving Markov decision processes. Chatterjee says these processes are models containing multiple options and an element of randomness, which arise, for example, in the development of robots. "The most efficient algorithm for addressing that problem available till now dated back to 1995 and had quadratic complexity," he notes. "In our project, we were able to overcome this limit by using graph algorithm techniques."
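For context, the sketch below shows the classical value-iteration approach to one such verification question, the maximum probability of reaching a target state in a Markov decision process; the toy MDP is invented for illustration, and the faster graph-based algorithms from Chatterjee's project are not reproduced here.

    # Classical value iteration for maximum reachability probability in a toy MDP.
    # mdp[s][a] is a list of (next_state, probability) pairs; illustrative example only.
    mdp = {
        0: {"a": [(1, 0.5), (2, 0.5)], "b": [(0, 1.0)]},
        1: {"a": [(3, 0.9), (0, 0.1)]},
        2: {"a": [(3, 0.3), (2, 0.7)]},
        3: {"a": [(3, 1.0)]},
    }
    target = {3}

    V = {s: (1.0 if s in target else 0.0) for s in mdp}
    for _ in range(1000):                          # iterate until (approximate) convergence
        V_new = {}
        for s in mdp:
            if s in target:
                V_new[s] = 1.0
            else:
                V_new[s] = max(sum(p * V[t] for t, p in dist)
                               for dist in mdp[s].values())
        if max(abs(V_new[s] - V[s]) for s in mdp) < 1e-9:
            V = V_new
            break
        V = V_new

    print(V)  # maximum probability of eventually reaching state 3 from each state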
Contributions of PCAST
CCC Blog (01/17/17) Helen Wright
The Obama administration's President's Council of Advisors on Science and Technology (PCAST) was the most diverse and distinguished in U.S. history. PCAST, which dates back to the Eisenhower administration, achieved many milestones over the last eight years, including a report on Independence, Technology, and Connection in Older Age, which examined technologies and policies that will help U.S. citizens live independently as they get older. Another PCAST effort was the 2015 follow-up study on the progress of the Federal Networking and Information Technology Research and Development program, which concentrated on cybersecurity, health, big data and data-intensive computing, information technology and the physical world, privacy protection, cyber-human systems, high-capability computing, and foundational computing research. The final PCAST report, published this month, acknowledges the importance of semiconductors--and semiconductor leadership--to modern life in a competitive world, and says future leaders must transcend Moore's Law to leverage many advances. The Computing Community Consortium followed up the report with its Nanotechnology-inspired Information Processing Systems of the Future workshop.
LATTICE Connects Women Engineers in Early Academic Careers With Peers, Support
UW Today (01/12/17) Jennifer Langston
A new national program aims to broaden U.S. participation in computing and engineering fields by building supportive networks for early-career and pre-tenure women in academia. Launching Academics on the Tenure-Track: An Intentional Community in Engineering (LATTICE) is sponsored by the University of Washington, North Carolina State University, and California Polytechnic State University. The program focuses on supporting early-career women and underrepresented minority women during the transition from graduate studies to tenure-track positions through a combination of symposia, networks, and other support structures. The first symposium in May will focus on women in electrical engineering and computer science who hold pre-tenure positions such as postdoctoral researcher, assistant professor, and assistant research professor. A second symposium in 2019 will focus on women in all engineering fields who are members of racial or ethnic minorities or who have disabilities. Each symposium will center on academic career skills such as teaching, proposal writing, and preparing for tenure, and senior engineers and faculty will be present for mentoring and networking. The networks established at the symposia will be expanded through peer-mentoring circles and online connections. With funding from a five-year U.S. National Science Foundation grant, LATTICE will combine professional development programs with an ethnographic research study to identify successful strategies for future diversity initiatives.
Nim Language Draws From Best of Python, Rust, Go, and Lisp
InfoWorld (01/16/17) Serdar Yegulalp
The under-development Nim programming language is advertised as blending the compilation speed and cross-platform targeting of Google's Go language, the safe-by-default behaviors of Rust, the readability and ease of development of Python, and the metaprogramming capabilities of Lisp. Nim's syntax bears a strong resemblance to Python's, as it employs indented code blocks and some of the same constructs, while Go- and Rust-like features include first-class functions, distinct types, and object-oriented programming with composition favored over inheritance. Nim supports templates and generics, compiles to C by default, and also can generate C++, Objective-C, or JavaScript. Compiled-code caching means large projects with small changes to one module recompile only that module. Nim's memory management defaults to garbage collection via deferred reference counting, which can be disabled entirely in favor of manual management when necessary. Nim aims to provide both a strong standard library and a solid assortment of third-party modules, and its biggest current drawback is the relatively small user and developer community around the language. Nevertheless, Nim helps create software that ultimately must be fast and robust, with a gentler learning curve and less of the cognitive overhead typically associated with existing languages.
4G Network Infrastructure Could Mean Fewer Accidents by Drivers
University of Bristol News (01/13/17)
Researchers at the University of Bristol in the U.K. have found that existing 4G network infrastructure could help drivers make safe decisions in or near accidents. A major factor in vehicle-related accidents is that drivers lack information about their surroundings and road conditions. A cost-effective solution to this problem is for city-owned base stations to form a single frequency network (SFN) that gives drivers the information they need to make safe decisions in or near accidents, according to the Bristol researchers. To ensure reliable transmissions, tight bounds on the outage probability must be established when the SFN is overlaid on an existing cellular network. In addition, the researchers developed a highly efficient transmission power allocation algorithm that can reduce the total SFN transmission power by up to 20 times compared with a static uniform power allocation. "We have shown that our proposed power allocation [PA] model can help to significantly reduce the transmission power of the proposed network while target signal-to-noise and interference ratio [SINR] outage constraints are met," says Bristol's Andrea Tassi.
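As an illustration of power allocation under SINR constraints in general (not the Bristol team's model), the following Python sketch uses the classic iterative power-control update to drive each link to a target SINR with the minimum power that meets it; the channel gains, noise level, and target are assumed values.

    # Iterative power control toward a target SINR; channel gains are assumptions.
    import numpy as np

    G = np.array([[1.0, 0.1, 0.05],          # G[i, j]: gain from transmitter j to receiver i
                  [0.08, 1.0, 0.1],
                  [0.06, 0.12, 1.0]])
    noise = 1e-3
    target_sinr = 5.0                        # required signal-to-interference-plus-noise ratio

    p = np.ones(3)                           # start from a uniform power allocation
    for _ in range(200):
        interference = G @ p - np.diag(G) * p            # interference at each receiver
        p_new = target_sinr * (interference + noise) / np.diag(G)
        if np.allclose(p_new, p, rtol=1e-9):
            break
        p = p_new

    sinr = np.diag(G) * p / (G @ p - np.diag(G) * p + noise)
    print("powers:", p, "achieved SINR:", sinr)          # far below the uniform allocation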
Carnegie Mellon Receives More Than $250 Million From DoD and Partners to Launch Robotics Initiative
TechRepublic (01/13/17) Hope Reese
American Robotics, a nonprofit venture founded by Carnegie Mellon University (CMU), has been awarded more than $250 million by the U.S. Department of Defense (DoD) and other groups to build an advanced robotics manufacturing center in Pittsburgh. The institute received $80 million from the DoD and $173 million from its other partners to lead the Advanced Robot Manufacturing Group (ARM), which will work on defense- and industry-related technology in the aerospace, automotive, electronics, and textile fields. The award creates opportunities for CMU researchers to explore emerging technologies in manufacturing, artificial intelligence, autonomy, three-dimensional printing, and other applications. Robots are essential for defense and manufacturing, yet they are often expensive and difficult to program; they are important for smaller manufacturers as well, and ARM will attempt to address this barrier by finding solutions to better integrate robots across different fields. CMU professor Howie Choset says ARM will have four goals--to empower American manufacturing workers, create new jobs, enable companies of all sizes to adopt robotics, and assert U.S. leadership in advanced manufacturing. Choset says ARM also will join the Manufacturing USA network of industry, academic, and government institutes that aim to ensure the U.S. remains competitive in manufacturing.
New Fingerprinting Techniques Identify Users Across Different Browsers on the Same PC
BleepingComputer (01/12/17) Catalin Cimpanu
A U.S.-based multi-university research effort has yielded distinct fingerprinting methods to track users when they use different browsers installed on the same device. Cross-browser fingerprinting (CBF) techniques rely on making browsers execute operations that use the underlying hardware components to process the desired data. By measuring the response to these operations, researchers applied the resulting data to flag the different hardware configurations, specific to distinct users, irrespective of the browser accessing a test website. Browser features that could be misused for CBF operations include screen resolution, the number of central-processing unit virtual cores, AudioContext, the list of fonts, vertex shaders, fragment shaders, installed writing scripts, transparency via the alpha channel, modeling and multiple three-dimensional (3D) models, lighting and shadow mapping, operations for rendering 3D models in two dimensions, clipping planes, and line, curve, and anti-aliasing operations. All of these methods were utilized in combination to test how many users the researchers could track to the same computer. CBF techniques correctly identified 99.24 percent of all test users, up from 90.84 percent realized by previous research methods. The researchers advise users to either employ the Tor Browser to avoid CBF, or deploy virtualization layers for other browsers.
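To illustrate the underlying idea (not the researchers' actual code), the following Python sketch combines hardware-linked measurements, assumed here to have been collected by a site via JavaScript and WebGL, into a single identifier that comes out identical whichever browser on the device reported them.

    # Toy cross-browser fingerprint: hash only hardware-dependent measurements so the
    # result is stable across browsers on the same machine. Feature values are made up.
    import hashlib, json

    def fingerprint(features: dict) -> str:
        # Serialize deterministically so identical hardware yields an identical hash.
        canonical = json.dumps(features, sort_keys=True).encode()
        return hashlib.sha256(canonical).hexdigest()

    chrome_report = {"screen": "2560x1440", "cpu_cores": 8,
                     "audio_context_sum": 124.0434806, "gpu_render_hash": "9f2c1e"}
    firefox_report = {"screen": "2560x1440", "cpu_cores": 8,
                      "audio_context_sum": 124.0434806, "gpu_render_hash": "9f2c1e"}

    print(fingerprint(chrome_report) == fingerprint(firefox_report))  # True: same device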
Will Blockchain-Based Election Systems Make E-Voting Possible?
Government Technology (01/11/17) Adam Stone
Kaspersky Lab held a competition among university teams to determine whether blockchain technology can accurately count votes and ensure the integrity of an electronic voting system. Kaspersky's Juan Guerrero says that in the blockchain model, different peers in different systems vet each other's transactions. "If one of them gets hacked or one of them gets altered, all the others would be able to notice that change," he notes. Three of the 19 submissions were named winners of the Kaspersky contest, including a "permissioned blockchain" model in which a central authority admits voting machines to the network and produces a distributed ledger of votes. The other winning submissions were a model founded on global public keys that encrypt ballots and provide voter receipts, and a solution based on the Open Vote Network and the DRE-i and DRE-ip encryption schemes. To balance vote auditability and privacy, one solution would match voters with random identity numbers so those numbers could be exposed by an audit without compromising individual voters. To address the threat of voting under duress, most teams chose to keep traditional polling places instead of remote voting. Guerrero says the results of the contest should help spark discussions among stakeholders--and U.S. voters--on developing proof-of-concept e-voting systems.
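As a toy illustration of the "permissioned blockchain" idea above, in which admitted machines append ballots to a tamper-evident ledger, the following Python sketch hash-chains vote records; the block fields and ballot format are illustrative assumptions, not any team's winning design.

    # Minimal hash-chained vote ledger: altering any sealed ballot breaks the chain.
    import hashlib, json, time

    def make_block(prev_hash: str, ballot: dict) -> dict:
        body = {"prev": prev_hash, "time": time.time(), "ballot": ballot}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        return body

    def chain_is_valid(chain: list) -> bool:
        for i, block in enumerate(chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if block["hash"] != expected:
                return False                   # block was altered after being sealed
            if i > 0 and block["prev"] != chain[i - 1]["hash"]:
                return False                   # chain linkage broken
        return True

    ledger = [make_block("0" * 64, {"voter_id": "anon-7421", "choice": "A"})]
    ledger.append(make_block(ledger[-1]["hash"], {"voter_id": "anon-9188", "choice": "B"}))
    print(chain_is_valid(ledger))              # True; tampering with any ballot flips this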
Neuroscience Can't Explain How an Atari Works
Technology Review (01/12/17) Jamie Condliffe
To help address a lack of neuroscientific datasets needed to understand brain functions via algorithmic analysis, the University of California, Berkeley's Eric Jonas and Northwestern University's Konrad Kording used analytical software to see if it could accurately explain the workings of a simpler system--the MOS 6502 microprocessor. Their experiments involved recording the behavior of the chip's transistors as it ran different video games, and then applying their software to the resulting dataset. Although Jonas and Kording were able to determine that different transistors had different roles, they say the results "still cannot get anywhere near an understanding of the way the processor really works." In another test the researchers removed a transistor to see its effect on a game, but the software analysis failed to explain why the game was interrupted. "While some of the results give interesting hints as to what might be going on, the gulf between what constitutes 'real understanding' of the processor and what we can discover with these techniques was surprising," Jonas says. He and Kording note the experiments' outcomes raise issues for neuroscience; for example, they say a small collection of high-quality datasets of the brain may be insufficient for explaining neural processes.
New Duke Professor Helen Li Explains Research in Brain-Inspired Computing, Future of Artificial Intelligence
Duke Chronicle (01/11/17) Grace Mok
In an interview, new Duke University professor Helen Li discusses her research in "brain-inspired" computing and the future of artificial intelligence (AI). Her field involves developing hardware architectures that mimic biological neural structures to enhance computation speed and energy efficiency. "Computers can't think by themselves and can't find problems by themselves," Li says. "They do not have curiosity. They can't do anything without specific algorithms." Li believes conscious AI will eventually arrive, with the foundational technologies, theories, statistics, machine learning, and databases already developed. "All the components are there, but our major challenge is putting them together to enable artificial intelligence," she says. Li also cites Duke's strength in bio-engineering, machine learning, nanotechnologies, data centers, and computer engineering. "Duke is at the learning position and has a lot of potential," Li notes. "In each individual area, Duke is very, very competitive, but since artificial intelligence is not one discipline, it requires a multidisciplinary effort." Li says her industry experience also has informed her research into neuromorphic computing.
Abstract News © Copyright 2017 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]