Welcome to the October 14, 2016 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
T-Rays Could Push Computer Memory Into Overdrive by Factor of 1,000
Silicon Republic (10/13/16) Colm Gorey
Terahertz radiation (t-rays) could boost computer memory speeds by a factor of 1,000, according to computer scientists from the Moscow Institute of Physics and Technology (MIPT). The researchers say t-rays offer a new way to control magnetization in computer memory cells. They note that in modern computers every complete reset of a magnetic memory cell takes time, and that shortening this cycle has proved very difficult. The team proposed using electromagnetic pulses at terahertz frequencies, instead of external magnetic fields, to switch memory at faster speeds. To determine whether t-rays could be used to store magnetic bits of information, the team performed an experiment with thulium orthoferrite that achieved higher speeds but required an external magnetic field to control the spins. However, the researchers found the t-rays could control spins without this external field, and at a much faster rate. "We have demonstrated an entirely new way of controlling magnetization, which relies on short electromagnetic pulses at terahertz frequencies," says MIPT's Anatoly Zvezdin. "This is an important step towards terahertz electronics."
White House: AI Will Be Critical Driver of U.S. Economy
Computerworld (10/12/16) Sharon Gaudin
A new White House report prepared by the U.S. National Science and Technology Council's subcommittee on Machine Learning and Artificial Intelligence (AI) says AI can be a critical economic and societal driver if its development is supported by collaboration among industry, government, civil society, and the public. "Used thoughtfully, AI can augment our intelligence, helping us chart a better and wiser path forward," the study notes. Examples of AI's societal benefits cited by the council include a military hospital using AI to better predict medical complications and enhance treatment of combat injuries, and smart traffic management software that helps reduce congestion. Although the White House acknowledges the likelihood that advancing AI could replace human workers in lower-wage jobs, the report also predicts the technology will give rise to new, higher-level jobs in computer science, engineering, and math. "Public policy can address these risks, ensuring that workers are retrained and able to succeed in occupations that are complementary to, rather than competing with, automation," the report says. The study also stresses the federal government must assume responsibility for positively advancing AI. It can set the agenda for public debate, monitor AI applications' fairness and safety, and adjust regulations to spur innovation while protecting the public.
World's First Conference for Research Software Engineers Hailed Phenomenal Success
University of Southampton (10/12/16)
The world's first conference for Research Software Engineers (RSEs) in Manchester, U.K., was called a "phenomenal success" by Simon Hettrick, who helped organize the event. Hettrick, from the University of Southampton's Web and Internet Science research group, says RSEs "work with researchers to gain an understanding of the problems they face, and then develop, maintain, and extend software to provide the answers." More than 200 RSEs from 14 different countries attended the conference at the U.K.'s Museum of Science and Industry. "We wanted to give them the opportunity to share their methods and best practice on a much wider scale than they can at the moment," Hettrick says. He notes 70 percent of RSEs report their work has a fundamental reliance on software. The RSE conference "created a huge number of new collaborations and brought further attention to the fundamental importance of the RSE role in academia," Hettrick says. Conference attendees learned about the cutting-edge techniques being used in research and listened to a broad range of speakers, including Microsoft Research's Matthew Johnson and University of Southampton professor Susan Halford.
Encrypting Medical Photos With Chaos
Researchers from the University of Batna in Algeria say they have developed an algorithm that generates pseudo-random sequences to transform a plain image into a ciphered image in a single step, producing a file they claim cannot be cracked. University of Batna researchers Assia Beloucif, Oussama Noui, and Lemnouar Noui say growing concerns about personal data and privacy create a need for encryption technology that is easy to implement but tough to crack and that can protect images. They note that although there are powerful encryption tools for text documents, these are not ideal for color photos, because of the strong correlation between the data in the original image and the encrypted one, particularly given the bulk data volumes involved. The researchers developed encryption techniques that exploit confusion and diffusion, removing redundancies to reduce file sizes and spreading pixel values so it becomes difficult to extract parts of the original image with data-recovery and cracking tools. In addition, the researchers say chaos theory can be used to further blur the correlation between the original and encrypted images. They note the algorithm uses a chain of "tweaks" spread randomly between pixels in the conversion by the encryption key.
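The general idea behind chaos-based image ciphering can be illustrated with a minimal sketch (this is not the Batna researchers' algorithm): a chaotic logistic map, seeded by the secret key, generates a pseudo-random keystream that is XORed with the pixel bytes, so the same operation both encrypts and decrypts.

```python
# Minimal sketch of chaos-based image ciphering (illustrative only, not
# the University of Batna algorithm): a logistic map seeded by the key
# produces a pseudo-random keystream that is XORed with pixel bytes.

def logistic_keystream(x0, n, r=3.99):
    """Generate n pseudo-random bytes from the chaotic logistic map."""
    stream, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)           # chaotic iteration
        stream.append(int(x * 256) % 256)
    return stream

def chaos_cipher(pixels, key):
    """XOR pixels with a key-derived chaotic keystream (self-inverse)."""
    ks = logistic_keystream(key, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]

image = [12, 200, 34, 34, 34, 99]       # toy grayscale pixel values
key = 0.6180339887                      # secret initial condition in (0, 1)
cipher = chaos_cipher(image, key)       # encrypt
plain = chaos_cipher(cipher, key)       # the same operation decrypts
assert plain == image
```

Because the logistic map is extremely sensitive to its initial condition, even a tiny change to the key yields a completely different keystream, which is what blurs the correlation between the original and encrypted images.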
In a Medical First, Brain Implant Allows Paralyzed Man to Feel Again
The Washington Post (10/13/16) Amy Ellis Nutt
For the first time, a paralyzed man has regained his sense of touch via a thought-controlled robotic arm thanks to experiments conducted by the University of Pittsburgh and the University of Pittsburgh Medical Center (UPMC). Electrodes implanted in the sensory cortex of Nathan Copeland's brain receive transmissions from the prosthesis so he can feel tactile sensation in his paralyzed right hand when the robot arm's fingers are pressed. Chips previously implanted in Copeland's brain link him to the arm so he can control its movements with his mind. Enabling two-way feedback between a prosthetic limb and the patient's brain is critical if the appendage is to truly mimic a human limb's functionality, and electric stimulation of nerves in amputees' bodies facilitates control of artificial limbs, but not actual sensation. "With Nathan, he can control a prosthetic arm, do a handshake, fist bump, move objects around," says UPMC biomedical engineer Robert Gaunt. "And in this [experiment], he can experience sensations from his own hand. Now we want to put those two things together so that when he reaches out to grasp an object, he can feel it. He can pick something up that's soft and not squash it or drop it."
Just Give Me Some Privacy
Drexel Now (10/12/16) Britt Faulstick
The reasons Internet users turn to tools that obfuscate their Internet Protocol addresses to protect their privacy are explored in a new study by Drexel University researchers to be presented in February at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW 2017) in Portland, OR. The study found people such as Wikipedia editors and The Onion Router (Tor) users often feel they, their reputations, and their families are threatened by other individuals, groups of people, and governments. These feelings are most pronounced among people participating in open collaboration efforts to make information freely and universally available online, as well as those expressing political opinions. Those wishing to be online without being traced frequently use Tor, which is popular among reporters and political dissidents in the wake of Edward Snowden's revelations about the U.S. National Security Agency's online surveillance. Although much content is hidden or blocked from Tor users, they say the network's anonymity is worth the limited access. The researchers say administrators of open-collaboration sites must recognize the value of anonymous users' contributions and permit anonymous participation; otherwise, they risk discouraging the diversity needed to generate "the sum of all human knowledge," according to the researchers.
Two-Dimensional Materials Combined to Produce 'Quantum LED'
IEEE Spectrum (10/11/16) Dexter Johnson
Devices made from thin layers of graphene, boron nitride, and transition metal dichalcogenides (TMDs) can generate a single photon entirely by electricity. Built by researchers at the University of Cambridge in the U.K., the devices are all-electrical ultra-thin quantum light-emitting diodes (LEDs). The TMD layers provide a tightly confined area in two dimensions where electrons fill in holes. When an electron moves into one of the holes that reside at a lower energy, the difference in energy produces a photon. The researchers demonstrated that TMDs of tungsten diselenide can operate electrically as a quantum emitter. Moreover, they demonstrated that TMDs of tungsten disulfide are a new class of quantum emitter that can offer all-electrical single-photon generation in the visible spectrum. The team believes the device will bring on-chip single-photon emission for quantum communication closer to reality. "Ultimately, we need fully integrated devices that we can control by electrical impulses, instead of a laser that focuses on different segments of an integrated circuit," says Cambridge professor Mete Atature. "For quantum communication with single photons, and quantum networks between different nodes, we want to be able to just drive current and get light out."
Self-Learning Computer Tackles Problems Beyond the Reach of Previous Systems
Phys.org (10/10/16) Lisa Zyga
Researchers from Belgium's Free University of Brussels have developed a self-training neuro-inspired analog computer based on the "reservoir computing" algorithm. They say the system not only outperforms experimental reservoir computers that do not use the new algorithm at solving complex computing tasks, but also meets challenges beyond the reach of traditional reservoir computing. "Our work shows that the backpropagation algorithm can, under certain conditions, be implemented using the same hardware used for the analog computing," says researcher Piotr Antonik. He suggests such a system could be harnessed to bypass the end of Moore's Law. Backpropagation systems execute thousands of iterative calculations that reduce errors a little each time, bringing the computed value closer to the optimal value. Ultimately, the repeated computations teach the system a better way of computing a solution to a problem. The Belgian researchers showed the algorithm can perform three complex tasks--a speech recognition task, an academic task for testing reservoir computers, and a nonlinear task--better than reservoir computing systems that do not use backpropagation. Their current system has data coded as the intensity of light pulses propagating in an optical fiber, with both the reservoir computer and the backpropagation algorithm physically deployed on the same photonic setup.
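The reservoir computing principle underlying the Brussels system can be sketched digitally with an echo state network (the sizes, task, and ridge-regression readout below are illustrative choices, not the authors' photonic setup): a fixed random "reservoir" transforms the input into a high-dimensional state, and only a linear readout is trained.

```python
import numpy as np

# Minimal digital sketch of reservoir computing (an echo state network).
# Illustrative only: the Brussels system implements the reservoir and the
# backpropagation training optically, on the same photonic hardware.

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed random reservoir weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 for stability

def run_reservoir(u):
    """Drive the fixed random reservoir with input sequence u; collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave.
t = np.arange(400) * 0.1
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Train only the linear readout via ridge regression; the reservoir stays fixed.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

The design choice that makes reservoir computing hardware-friendly is visible here: the random weights W_in and W are never updated, so a physical system (such as light pulses in an optical fiber) can play the role of the reservoir, with training confined to the readout.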
Big Data for Little Creatures
UCR Today (10/10/16) Sarah Nightingale
University of California, Riverside (UCR) researchers have received $3 million from the U.S. National Science Foundation Research Traineeship (NRT) program to prepare next-generation scientists and engineers. The NRT in Integrated Computational Entomology (NICE) program will serve as a replicable education and training model for other institutions interested in developing computational entomology programs. The UCR researchers are developing big data systems to track mosquitoes that spread the Zika virus, enable rapid intervention, and help governments plan for future outbreaks. They also are developing low-cost, wireless insect sensors that classify species with up to 99.9-percent accuracy and generate data that can be incorporated into a classification algorithm. In past years the researchers have collected massive volumes of data to help farmers determine when to apply pesticides and to help public health officials stop the spread of insect-borne diseases. NICE will launch next summer and fund at least 80 researchers in engineering or the life sciences. The researchers will continue work on sensors and develop real-time tracking devices that text farmers when a harmful insect is multiplying. The training program will create "endless research possibilities and a new way to address some of the most critical challenges of our time," says UCR professor Eamonn Keogh.
Cambridge Cyber Summit Convenes Industry, Academia, and Government
MIT News (10/12/16)
The Massachusetts Institute of Technology's (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL) last week hosted a summit that focused on key issues in privacy and security. The "Cambridge Cyber Summit" featured discussions with leaders such as Admiral Michael Rogers, director of the U.S. National Security Agency, and Andrew McCabe, deputy director of the U.S. Federal Bureau of Investigation. Rogers discussed the growing threats to public and private companies, and how hacking has changed over time. "The challenge for us is how to access content in a way that will protect [people's] rights, but still allow us to generate the answers to protect our citizens," he said. During a live demonstration of the Dark Web and ransomware, CSAIL's Srini Devadas discussed the good and bad side of anonymity. White-hat security hacker David Kennedy, who had previously penetrated the healthcare.gov website in only four minutes, demonstrated how easy it is to uncover personal information. MIT researchers noted cybersecurity continues to be a focus of CSAIL. Last year, CSAIL launched Cybersecurity@CSAIL, the Cybersecurity Policy Initiative (CPI), and MIT Sloan's Interdisciplinary Consortium for Improving Critical Infrastructure Cybersecurity. These initiatives span multiple labs and departments and aim to provide a cross-disciplinary strategy for tackling the problem of cybersecurity.
Wireless Data Center-on-a-Chip Aims to Cut Energy Use
WSU News (10/06/16) Tina Hilding
Washington State University (WSU) researchers say they have designed a tiny, wireless data-center-on-a-chip that could dramatically reduce the energy needed to run data centers. The researchers previously developed a wireless network on a computer chip, which includes a tiny, low-power transceiver, on-chip antennae, and communication protocols that enable wireless shortcuts. The new research expands these capabilities into a wireless data-center-on-a-chip. The researchers are moving from two-dimensional chips to a highly integrated, three-dimensional wireless chip at the nano- and microscales that can move data more quickly and efficiently. They say they will use the new system to run big data applications three times more efficiently than the best data center servers. In addition, they will evaluate ways to increase the wireless data center's energy efficiency while maintaining fast on-chip communication. The tiny chips, which are composed of thousands of cores, could run data-intensive applications orders of magnitude more efficiently than existing platforms. The researchers say the new system has the potential to achieve performance comparable to a conventional data center's while using much less space and power. "This project is redefining the foundation of on-chip communication," says WSU professor Partha Pande.
Auto 'Finprinting' Identifies Individual Sharks as They Migrate
New Scientist (10/05/16) Chris Baraniuk
Researchers from the University of Bristol in the U.K. have developed a system that can automatically identify sharks from photographs of their fins. The technique, known as "finprinting," uses the unique contours of a shark's dorsal fin as a biometric, similar to a human fingerprint. The system was trained on 240 photographs of shark fins, picking out recognizable portions of the fin contour so images of fins that later become partially damaged might still be used for identification. During testing, the software was able to analyze a picture of a shark fin and determine if it belongs to a known individual shark with 81-percent accuracy. The researchers say their approach could help monitor the behavior of shark migrations. They plan to have the tool completely optimized and available to researchers by the first quarter of 2017, using photos collected by scientists in the field. The system still needs some tweaking before it can be used to identify large numbers of individuals, which would mean training the tool on a larger set of photographs. The researchers want to feed the artificial intelligence system photos contributed by members of the public who go shark watching or cage diving.
Omnidirectional Mobile Robot Has Just Two Moving Parts
Carnegie Mellon University (10/05/16) Byron Spice
The creators of the ballbot, a robot that glides atop a single sphere, have improved on the design by replacing the mechanical drive with a spherical induction motor (SIM), resulting in a robot with just two moving parts: the ball and the body. Previous ballbots maintained balance atop the sphere through mechanical means that required frequent replacements and recalibrations. The new SIMbot, engineered by researchers at Carnegie Mellon University (CMU) and Tohoku Gakuin University in Japan, has an induction motor that can move the ball in any direction using electrical currents alone. The SIM rests on a hollow ball with a copper shell, while steel stators alongside the ball produce traveling magnetic waves that guide the ball in the direction of each wave. The SIMbot is steered by altering the currents in the stators. Eliminating the mechanical drive system cuts friction, making the machine more efficient and reducing routine maintenance, says CMU professor Ralph Hollis. He notes the SIMbot matches the speeds of the mechanically driven ballbot, but is not yet as efficient. Hollis says the new technology should make ballbots more practical for wider adoption.
Abstract News © Copyright 2016 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]