Welcome to the August 19, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.
HEADLINES AT A GLANCE
Could Hackers Take Down a City?
The Washington Post (08/18/15) Andrea Peterson
Researchers such as David Raymond, deputy director of Virginia Polytechnic Institute and State University's IT Security Lab, warn of the possibility of cyberattackers crippling a city because of urban centers' increasing reliance on technology and the frail, messy connections that bind those systems together. "The digital pathways between all of the entities and organizations in a city [are] often not well managed," Raymond cautions. "In many cases, there's no overarching security architecture or even understanding of holistically what the city looks like." Raymond, U.S. Military Academy at West Point professor Gregory Conti, and Drawbridge Networks' Tom Cross presented research at this month's Black Hat USA conference on cities' cyber-vulnerabilities. They speculate transportation systems are one area that may be susceptible to a targeted attack, given they are places where otherwise well-shielded technology may converge in ways that are not well protected, leading to a cascade effect that impacts the entire city. Other researchers presenting at Black Hat detailed how security vulnerabilities involving Ethernet switches could be exploited to cause a nuclear plant shutdown. Conti also notes cities concerned about hacking vulnerabilities often have difficulty attracting the right specialists and securing the resources needed for a long-term solution. Cross argues cities should apply to the digital domain the same types of risk-management tactics they use against traditional attacks.
IBM's 'Rodent Brain' Chip Could Make Our Phones Hyper-Smart
Wired (08/17/15) Cade Metz
IBM researchers say they have built the digital equivalent of a rodent brain encompassing 48 TrueNorth chips, an experimental processor designed to emulate neurons. IBM is holding a "boot camp" for government researchers and academics, where TrueNorth software is being developed. Some researchers have crafted software that can run deep-learning algorithms capable of recognizing spoken words, identifying images, and understanding natural language. "The chip gives you a highly efficient way of executing neural networks," says University of Michigan professor Jason Mars. TrueNorth chips promise to run algorithms in smaller spaces while consuming less electricity, and Lawrence Livermore National Laboratory's Brian Van Essen says this "lets us tackle new problems in new environments." TrueNorth is currently suitable for only one aspect of deep learning--enabling the neural network to execute models it has been trained for--but IBM's Dharmendra Modha, a recipient of the ACM Gordon Bell Prize in 2009, notes this is appropriate. "We're trying to lay the foundation for significant change," he says. The chip is equipped with 5.4 billion transistors, yet it draws only about 70 milliwatts of power. TrueNorth's small size and low power consumption enable faster data processing because information is not routed over a network, and Mars thinks this will help embed more processing within devices.
Intel's Reinvention of the Hard Drive Could Make All Kinds of Computers Faster
Technology Review (08/18/15) Tom Simonite
Intel has announced plans to launch a line of hard drives next year that use new memory technology that can operate as much as 1,000 times faster than current flash-based hard drives. The new hard drives will be sold under the name Intel Optane and will use the 3D Xpoint memory technology Intel developed in collaboration with memory chip company Micron Technology. Current flash memories store bits of data as electric charges, while the 3D Xpoint memory stores data by using electricity to change the arrangement of atoms inside materials at the junctions of grids of metal wires. Like flash memory, 3D Xpoint memory is non-volatile, retaining its data when powered down. It has been speculated the new memory is similar to the memristors being developed by Hewlett-Packard. It cannot currently offer the same density as flash memory, but Intel says it can be stacked vertically, which would provide one way of packing more storage onto a single chip. Early prototypes of the Optane drive demonstrated at Intel's annual developer conference performed seven times as fast as top-of-the-range flash-based hard drives, and the drives that go on sale next year could be even faster. Intel sees applications for the new drives in corporate data centers, consumer PCs, and laptops.
New Optical Chip Lights Up the Race for Quantum Computer
University of Bristol News (08/14/15)
Researchers at Britain's University of Bristol and Nippon Telegraph and Telephone in Japan say they have made a major breakthrough in quantum computing: the development of a single multipurpose, reprogrammable optical chip that can carry out experiments that would have taken months in just hours. One of the major factors holding back the development of quantum computers is the need to build out the apparatus for every experiment, a time-consuming process. However, the researchers say their new chip will do away with this tedious chore. "A whole field of research has essentially been put onto a single optical chip that is easily controlled," says lead researcher Anthony Laing with the University of Bristol. "The implications of the work go beyond the huge resource savings. Now anybody can run their own experiments with photons, much like they operate any other piece of software on a computer. They no longer need to convince a physicist to devote many months of their life to painstakingly build and conduct a new experiment." The chip and others like it also are going to be put to work in the university's Quantum in the Cloud service, which will make a quantum processor publicly accessible.
Drawing in the Third Dimension
National Science Foundation (08/13/15) Sarah Bates
A software platform developed by U.S. National Science Foundation-funded startup Mental Canvas could soon enable users to sketch a two-dimensional (2D) image and then look around it in three dimensions (3D). The tool is the brainchild of Julie Dorsey, a Yale computer scientist and founder of Mental Canvas, who developed the tool over nearly a decade. The 3D sketching software incorporates elements of several visual design technologies from 2D and 3D graphics to software engineering and human-computer interaction. "In other systems, you can create drawings or paintings on individual layers and those layers slide around in a single plane," Dorsey says. "Our technology involves drawing in space, so the underlying representation is 3D rather than 2D, and it's really fast and fluid." The tool could eventually be used by scientists to create 3D visualizations of molecular structures, by architecture students to rough out a design, or by filmmakers for visual storyboarding. "Imagine picture books and graphic novels--and illustrations in general--in the future aren't flat but have active 3D qualities to move around the story," Dorsey says.
Unusual Magnetic Behavior Observed at a Material Interface
MIT News (08/18/15) David L. Chandler
Researchers at the Massachusetts Institute of Technology (MIT) and other institutions have identified an exotic magnetic behavior driven by the proximity of two materials that could be used as the basis for building some quantum computer components and open the way to study theoretical areas of physics. The phenomenon observed by the researchers is called proximity-driven magnetic order, and it was detected using a technique called spin-polarized neutron reflectometry. The phenomenon was produced by bonding a layer of topological insulator, a material that allows electricity to flow along its surface but not through its bulk, to a ferromagnet. The proximity-driven magnetic effect is observed where the two materials meet and produces a localized and controllable magnetic pattern at the interface. Possible applications include the creation of spintronics, transistors based on a particle's spin rather than its charge; such devices are expected to have low energy dissipation. Another possible application, pointed out by MIT professor Ju Li, is components of quantum computers. Li says the interface effect could be used to make "a perfect quantum wire" and "novel quantum spintronics." The new effect also could open up study of phenomena such as Majorana fermions, a type of theoretical particle that has not yet been observed.
Programming and Prejudice
University of Utah (08/14/15) Suresh Venkatasubramanian
University of Utah researchers have developed a technique to determine if software programs used for hiring decisions, loan approvals, and other weighty tasks discriminate unintentionally and violate the legal standards for fair access to employment, housing, and other opportunities. "If there are structural aspects of the testing process that would discriminate against one community just because of the nature of that community, that is unfair," says University of Utah professor Suresh Venkatasubramanian. The researchers also determined a method to fix these potentially troubled algorithms. "The irony is that the more we design artificial intelligence technology that successfully mimics humans, the more that AI is learning in a way that we do, with all of our biases and limitations," Venkatasubramanian says. The researchers test whether an algorithm can be biased using the legal definition of disparate impact: they train a separate machine-learning algorithm to predict a person's race or gender from the data being analyzed, even though race and gender are hidden from the data. If that prediction is accurate, it indicates a potential problem of bias under the definition of disparate impact.
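The audit idea described above can be sketched in a few lines: if a classifier can recover a hidden protected attribute from the remaining features, the data set carries a potential for disparate impact. The toy data, the 1-nearest-neighbor "auditor," and the 0.8 threshold below are all illustrative assumptions, not the researchers' actual method.

```python
def nearest_neighbor_predict(train, labels, point):
    """Predict the hidden attribute of `point` from its closest training row."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, point)) for row in train]
    return labels[dists.index(min(dists))]

def audit_disparate_impact(features, hidden_attr, threshold=0.8):
    """Leave-one-out accuracy of predicting the hidden attribute.

    Accuracy far above chance means the visible features encode the
    protected attribute, flagging a potential disparate-impact problem."""
    hits = 0
    for i in range(len(features)):
        train = features[:i] + features[i + 1:]
        labels = hidden_attr[:i] + hidden_attr[i + 1:]
        hits += nearest_neighbor_predict(train, labels, features[i]) == hidden_attr[i]
    accuracy = hits / len(features)
    return accuracy, accuracy > threshold

# Toy data in which the first feature strongly tracks the hidden group:
features = [[0.1, 0.5], [0.2, 0.3], [0.9, 0.4], [0.8, 0.6], [0.15, 0.7], [0.85, 0.2]]
groups = ["a", "a", "b", "b", "a", "b"]
accuracy, flagged = audit_disparate_impact(features, groups)
```

Here the hidden group is perfectly predictable from the other columns, so the audit flags the data set even though `groups` never appears among the features.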
Researchers at RIT Seek to Solve the Problem of Looping with Meshed Tree Protocol
Rochester Institute of Technology (08/13/15) Scott Bureau
Rochester Institute of Technology (RIT) researchers are developing Meshed Tree Protocol, the next standard for loop avoidance that will make our computer networks more reliable, faster, and more secure against cyberattacks. The new standard is designed to solve the looping problem, a form of miscommunication common in computer networks. All large computer networks must use some form of loop avoidance to fix this miscommunication and function properly. "The significant reduction in convergence time, combined with its simplicity and security, indicates that Meshed Tree Protocol would be a superior candidate to resolve looping issues in switched networks," says RIT professor Nirmala Shenoy. Meshed trees do not use the traditional single tree from one root concept. Instead, the Meshed Tree Protocol is a collection of all possible paths, and uses knowledge of the incoming ports and the structure of the meshed trees to detect attempts to modify or interfere with the topology. The Meshed Tree Protocol will provide a mechanism to authenticate valid members of the meshed tree switch group, and will feature four levels of security with different levels of authentication and encryption.
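The contrast with a single spanning tree can be illustrated with a sketch: instead of one path from the root to each switch, a meshed tree records every loop-free path, so a backup is available the moment the primary link fails. This is an assumption-laden toy, not the RIT protocol itself; the network and the choice of the shortest path as primary are illustrative.

```python
def loop_free_paths(graph, root):
    """Map each node to every simple (hence loop-free) path from the root."""
    paths = {node: [] for node in graph}

    def walk(node, path):
        paths[node].append(path)
        for nxt in graph[node]:
            if nxt not in path:  # never revisit a node, so no loops form
                walk(nxt, path + [nxt])

    walk(root, [root])
    return paths

# Toy switched network with redundant links; "A" plays the root bridge.
network = {"A": ["B", "C"], "B": ["A", "C", "D"],
           "C": ["A", "B", "D"], "D": ["B", "C"]}
meshed = loop_free_paths(network, "A")
primary = min(meshed["D"], key=len)                 # shortest path as primary
backups = [p for p in meshed["D"] if p != primary]  # instantly usable spares
```

A classic spanning tree would keep exactly one of the four paths to switch D and block the rest; the meshed-tree view keeps all of them, which is what allows the fast convergence the researchers describe.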
Algorithm Clarifies 'Big Data' Clusters
Rice News (08/12/15) Mike Williams
Rice University researchers could impact healthcare via a new big data algorithm called "progeny clustering." Rice bioengineer Amina Qutub notes the algorithm attempts to cluster populations by data about numerous traits in each subject, such as proteins in patients' blood. She reports the algorithm filters out characteristics about patients from a data set, randomly mixing and matching them to generate artificial populations representing the descendants of the parent data. The characteristics manifest in approximately the same ratios in descendants as they do among parents. There can be hundreds or thousands of such traits, or dimensions, for a small population. By creating progeny with the same dimensions of features, the Rice algorithm grows the size of the data set, making distinct patterns more apparent and enabling the algorithm to optimize the number of clusters that warrant attention from doctors and scientists. Qutub and lead researcher Wendy Hu say the method's reliability is on a par with cutting-edge clustering evaluation algorithms, while costing far less computationally. The algorithm is being used in a hospital study to determine which drug treatments should be administered to children with leukemia, and Qutub says the technique could apply to any data set.
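The mix-and-match step can be sketched as follows: within a candidate cluster, each artificial descendant draws every feature independently from the values observed in that cluster, so feature ratios are preserved while the data set grows. The patient values are made up, and the full method also re-clusters the progeny to score cluster stability, which this fragment omits.

```python
import random

def make_progeny(cluster_rows, n_progeny, rng):
    """Mix and match feature values within one cluster to create descendants."""
    n_features = len(cluster_rows[0])
    progeny = []
    for _ in range(n_progeny):
        # Each feature of the child is drawn from a (possibly different)
        # parent, so per-feature value ratios match the parent cluster.
        child = [rng.choice(cluster_rows)[j] for j in range(n_features)]
        progeny.append(child)
    return progeny

rng = random.Random(0)
patients = [[5.1, 0.8], [4.9, 0.9], [5.3, 0.7]]  # e.g. two blood-protein levels
children = make_progeny(patients, 100, rng)
```

Because every progeny value is sampled from the parents, the enlarged population sharpens whatever structure the original small cohort contained instead of inventing new structure.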
Computer Scientists Find Mass Extinctions Can Accelerate Evolution
University of Texas at Austin (08/12/15) Marc Airhart
Robots evolve more quickly and efficiently after a virtual mass extinction modeled after real-life disasters such as the one that killed off the dinosaurs, according to computer science researchers at the University of Texas. Their research shows how simulations of mass extinctions promote novel features and abilities in surviving lineages. Researchers Risto Miikkulainen and Joel Lehman created computer simulations by connecting neural networks to simulated robotic legs with the goal of evolving a robot that could walk smoothly and stably. They introduced random mutations so a wide range of features and abilities would result. After hundreds of generations, in which a broad spectrum of robotic behaviors had evolved, the researchers mimicked a mass extinction by randomly killing off 90 percent of the development niches. They discovered the surviving lineages had the greatest potential to produce new behaviors, and better solutions to walking were evolved in simulations with mass extinctions. The researchers think their work could be used to develop robots that are better able to overcome obstacles, as well as human-like game agents.
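The extinction step described above can be sketched with a toy simulation: individuals occupy behavioral niches, an extinction event randomly wipes out 90 percent of them, and the survivors repopulate with mutation. The genome representation, mutation rule, and niche bookkeeping here are illustrative assumptions, not the researchers' neural-network setup.

```python
import random

def extinction_event(niches, survival_rate, rng):
    """Randomly keep only `survival_rate` of the occupied niches."""
    keep = max(1, int(len(niches) * survival_rate))
    survivors = rng.sample(list(niches.values()), keep)
    return {i: genome for i, genome in enumerate(survivors)}

def repopulate(niches, target_size, mutate, rng):
    """Surviving lineages refill the population; each child is mutated."""
    pool = list(niches.values())
    out = dict(niches)
    while len(out) < target_size:
        out[len(out)] = mutate(rng.choice(pool), rng)
    return out

def mutate(genome, rng):
    # Small random perturbation of each gene, standing in for the
    # random mutations applied to the simulated robots' controllers.
    return [gene + rng.gauss(0, 0.1) for gene in genome]

rng = random.Random(42)
population = {i: [rng.random() for _ in range(4)] for i in range(100)}
population = extinction_event(population, 0.1, rng)  # 90 percent die off
population = repopulate(population, 100, mutate, rng)
```

Repeating the extinct-then-repopulate cycle over many generations is what let the researchers compare the evolvability of surviving lineages against runs with no extinctions.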
How Bioinformatics Could Find the Next Breakthrough Cancer Drug
Forbes (08/12/15) Emily Mullin
Immunotherapy currently is one of the most promising areas of cancer treatment, and involves tweaking the body's immune system so it attacks and kills cancer cells. Advancing immunotherapy requires the ability to study various immune cells' interactions with cancer cells on a large scale, and new software from the University of Houston and the University of Texas M.D. Anderson Cancer Center can achieve this task. The tool, Time-lapse Imaging Microscopy in Nanowell Grids (TIMING), focuses on studying various types of T-cells and other white blood cells and how they can kill cancer cells. The researchers say TIMING can study tens or hundreds of thousands of these cells at a time, while conventional analysis involves studying just 10 to 100 samples at a time. TIMING accomplishes this by tracking cell-to-cell interactions using time-lapse video recordings of isolated samples of immune and cancer cells housed in an expandable structure called a nanowell grid. So far the researchers have used TIMING to study leukemia and melanoma tumor tissue. Lead study author Badri Roysam says the next goal is to make TIMING faster. It currently takes hours or days to complete an analysis, but Roysam wants to get that number down to minutes.
Researchers Tinker With Flying, Rolling RFID-Sensorized Robots
RFID Journal (08/10/15) Beth Bacheldor
A team of researchers from Google and the University of Washington has developed a prototype robot-based environmental-measuring system comprising commercially available components. The components include passive ultrahigh-frequency radio-frequency identification (RFID) sensor tags, robots, and ground control software running on a laptop to plan the robots' missions, send them commands, and receive and display the data they collected. "This was ideally matched to mobile robots carrying RFID readers; the robot could help deploy and read the sensor tags in remote, hard-to-reach locations," says Google's Travis Deyle. The researchers tested ground-based as well as flying robots equipped with RFID sensors. They found the system could be used to automatically measure a soil's moisture level. The researchers also think the system could be used to monitor water quality and crops. The robots could be used to measure the stress, strain, corrosion, and wear levels on infrastructure, according to the scientists. They note there are still limitations regarding tag read range, battery life, and robot control, all of which could pose challenges for larger deployments.
Quantum Computing Advance Locates Neutral Atoms
Penn State News (08/12/15) A'ndrea Elyse Messer
Pennsylvania State University researchers have developed a method for addressing individual neutral atoms without changing surrounding atoms. The researchers want to use neutral atoms for quantum computing, and focused on ways to individually locate and address an atom to store and retrieve information. They first used laser light to create a three-dimensional lattice of traps for neutral cesium atoms with no more than one atom at each lattice site. "We are studying neutral atom qubits because it is clear that you can have thousands in an apparatus," says Penn State professor David S. Weiss. However, the researchers note neutral atoms cannot be held in place as well as ions, because background atoms in the near vacuum occasionally knock them out of the traps. After the cesium atoms were in place, the researchers set them to their lowest quantum state by cooling them. They then shifted the internal quantum state of the atoms using two perpendicular circularly polarized addressing beams. The targeted atom was shifted about twice as much as any other atom, enabling the researchers to use microwaves to change the target atom's qubit state without affecting the states of any other atoms. The researchers can only fill about 50 percent of the laser atom traps with atoms, but they can perform quantum gates on those atoms with 93-percent fidelity and cross-talk that is too small to measure.
Abstract News © Copyright 2015 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: email@example.com
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.