Welcome to the January 13, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
Designing the Next Wave of Computer Chips
The New York Times (01/09/14) John Markoff
Although experts warn that the ability to shrink semiconductors in accordance with Moore's Law is reaching its limit, a new class of nanomaterials that can self-assemble might enable the creation of nearly molecule-scale circuits. These new materials include metals, ceramics, polymers, and composites that organize themselves from the bottom up rather than the top down. Emerging chemical processes can cause these materials to self-assemble into circuits by forming patterns of ultrathin wires on a semiconductor wafer. Semiconductor designers believe that nanomaterials, combined with conventional chip-making techniques, will enable a new class of computer chips that maintains Moore's Law while lowering the cost of chip making. Silicon Valley researchers are leading the transition from silicon to computational materials, using supercomputers to simulate the behavior of new materials before they are fabricated. Economic factors are helping to drive this research: Gartner predicts that next-generation semiconductor factories will cost $8 billion to $10 billion, more than double the current cost. This staggering expense creates a huge risk of failure for chip makers, encouraging them to turn to new self-assembling materials. For example, Sandia National Laboratories researchers in December released a paper on metal-organic frameworks, crystalline structures of metal ions and organic molecules whose behavior has been simulated on high-performance computers and verified experimentally.
Gender Bias in Tech Professions Called a Reality
Investor's Business Daily (01/10/14) Sheila Riley
Although the technology industry is often a world leader and innovator, it is behind the times in terms of gender equality in the workplace, according to Stanford Law School fellow Vivek Wadhwa. "Things overall are moving in the right direction, except for pockets of resistance in the tech world," says Wadhwa, who plans to publish research about the experiences of 500 women in tech jobs. Women held 57.2 percent of U.S. professional jobs in 2012, according to Department of Labor statistics, yet accounted for just 25.6 percent of workers in what the agency calls "computer and mathematical occupations." Gender bias in tech is not an issue of overt discrimination, but of underrepresentation and subtle biases, says the National Center for Women & Information Technology's Catherine Ashcraft. She notes other reasons women are often left behind in tech include a lack of mentors, a lack of access to professional networks, and the stereotypical messages girls receive from the media, teachers, parents, and peers. Ashcraft says employers also often treat women differently, such as in performance evaluations. "For women, they often tend to be shorter and attribute success to stereotypical feminine characteristics such as collaboration and teamwork," she says.
Reddit, Mozilla, EFF to Hold Day of Protest, Activism in Memory of Aaron Swartz
Network World (01/10/14) Jon Gold
Reddit, the Electronic Frontier Foundation, Free Press, Mozilla, and several other organizations are planning an online protest on Feb. 11 against wide-ranging government surveillance of Internet users, in memory of activist Aaron Swartz. The organizations will urge people to contact lawmakers and pressure them to help end the U.S. National Security Agency's aggressive information-collection activities. "These programs attack our basic rights to connect and communicate in private, and strike at the foundations of democracy itself," says Free Press' Josh Levy. The announcement came just before the first anniversary of Swartz's suicide on Jan. 11, 2013. Swartz was under indictment for allegedly illegally downloading a large number of academic journal articles, and prosecutors had repeatedly stated their intention to jail him. "If Aaron were alive he'd be on the front lines, fighting back against these practices that undermine our ability to engage with each other as genuinely free human beings," says Demand Progress co-founder David Segal.
Eureka! New Tech Shrinks Cloud Computing's Carbon Footprint
InfoWorld (01/10/14) David Linthicum
New algorithms designed to simulate a worldwide network of connected data centers can predict how best to limit carbon emissions from cloud computing. A team of computer scientists at Trinity College Dublin and IBM Research developed the algorithms, collectively called Stratus, which also carry out the necessary computing and deliver the required data. The researchers used Stratus to simulate a scenario inspired by the setup of Amazon's Elastic Compute Cloud (EC2), which operates data centers in Virginia, California, and Ireland. The experimental model placed data centers in these locations and tested them with queries from 34 sources in different parts of Europe, Canada, and the United States. The team factored in carbon emissions, the cost of electricity, and the time needed for computation and data transfers to optimize the workings of the network. The Stratus algorithms reduced the simulated EC2 cloud's emissions by 21 percent, largely by routing more requests to the Irish data center, which had faster servers, used less power, and generated fewer emissions.
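The paper's actual algorithms are not reproduced in this summary, but the core trade-off Stratus makes, weighing carbon intensity against electricity price and latency when routing requests, can be sketched in a few lines of Python. The per-site figures, normalization, and weights below are invented for illustration and are not the researchers' implementation:

    # Hypothetical per-site metrics: carbon intensity, electricity price,
    # and network latency from a given client (all figures invented).
    DATA_CENTERS = {
        #              gCO2/kWh  $/kWh   latency (ms)
        "virginia":   (520,      0.07,    20),
        "california": (300,      0.11,    80),
        "ireland":    (450,      0.09,   110),
    }

    def route(weights=(1.0, 1.0, 1.0)):
        """Pick the data center minimizing a weighted sum of normalized
        carbon intensity, electricity cost, and latency."""
        cols = list(zip(*DATA_CENTERS.values()))
        norm = lambda v, col: (v - min(col)) / (max(col) - min(col))
        score = lambda metrics: sum(w * norm(v, col)
                                    for w, v, col in zip(weights, metrics, cols))
        return min(DATA_CENTERS, key=lambda dc: score(DATA_CENTERS[dc]))

    # Raising the carbon weight shifts traffic toward greener sites.
    print(route(weights=(5.0, 1.0, 1.0)))

In this toy version, increasing the first weight makes the router favor low-emission sites, mirroring the reported behavior of steering more requests to the greener Irish data center.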
The 'Personalized Advantage Index,' a New Decision-Making Tool Developed at Penn
Penn News (01/08/14) Evan Lerner
Researchers at the universities of Pennsylvania and Pittsburgh have created a decision-making model called the "personalized advantage index," which compares and weighs multiple variables to help optimize treatment choices. The team tested the model on data from a study of patients seeking depression treatment, who received either cognitive behavioral therapy or medication. The model assigned each patient a score suggesting which treatment was likely to be more effective. The results provided an advantage equivalent to that of an effective treatment versus a placebo, indicating the technique has clinical value. The tool could have applications in any decision-making scenario with complex and sometimes contradictory variables. Whereas previous randomized controlled trials have addressed single relevant variables such as negative life events, the personalized advantage index gauges the degree to which each variable affects the treatment outcome. The algorithm maximizes the predictive value of multiple variables by assigning weights to each one to offer a useful context. Machine learning and other artificial intelligence techniques could further refine the tool by comparing variables in complex, nonlinear ways. "This is a way to begin to close the chasm between the wealth of information on how to improve outcomes and how that information is actually applied," says the University of Pennsylvania's Robert DeRubeis.
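The summary does not give the model's exact form; one common construction, sketched below, fits a separate outcome model to each treatment arm and scores each patient on the predicted difference between arms. The synthetic data, features, and linear models are illustrative assumptions, not the researchers' code:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Hypothetical patient features (e.g., symptom severity, life events)
    # and outcomes under two treatments, with a simulated interaction.
    X = rng.normal(size=(200, 5))
    treatment = rng.integers(0, 2, size=200)   # 0 = therapy, 1 = medication
    outcome = X @ rng.normal(size=5) + 0.5 * treatment * X[:, 0] \
              + rng.normal(size=200)

    # Fit one outcome model per treatment arm.
    model_a = LinearRegression().fit(X[treatment == 0], outcome[treatment == 0])
    model_b = LinearRegression().fit(X[treatment == 1], outcome[treatment == 1])

    # The index for a new patient is the predicted outcome difference:
    # its sign suggests a treatment, its magnitude the expected advantage.
    new_patients = rng.normal(size=(5, 5))
    pai = model_b.predict(new_patients) - model_a.predict(new_patients)
    print(pai)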
Computer Science: The Learning Machines
Nature (01/08/14) Nicola Jones
Deep-learning computers are advancing toward true artificial intelligence (AI) that will enable them to think as humans do. The approach relies on tremendous data sets and vast computing power to solve problems that humans handle easily, such as sorting through millions of images to pick out categories such as cats and people. Deep learning is based on neural networks, which are modeled loosely on the interconnected neurons of the human brain. Progress in deep learning is bringing consumers software that is better able to sort through photos, understand spoken commands, and translate text from foreign languages. Furthermore, scientists are using deep-learning computers to identify potential drug candidates, map neural networks in the brain, and predict the functions of proteins. Deep learning in neural networks appeared promising in the 1980s, but enthusiasm dwindled when the approach proved challenging. However, interest resumed around 2000 due to increases in computing power and a sudden abundance of digital data, with many researchers focusing on speech and image recognition. Now the emphasis in deep learning is shifting to natural language understanding to enable computers, for example, to comprehend human speech well enough to rephrase and answer questions, or to translate languages. Although the results show great potential, deep learning is still a nascent field.
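As a concrete illustration of the neural-network idea behind deep learning, the sketch below trains a tiny one-hidden-layer network with backpropagation on the XOR pattern, a problem no single linear unit can solve. It is a toy example, not any of the systems the article describes:

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy dataset: XOR, a pattern no single linear layer can learn.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # One hidden layer of "neurons" with randomly initialized weights.
    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(5000):
        # Forward pass: each layer transforms the previous layer's output.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass: propagate the error to adjust every weight.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= 0.5 * h.T @ d_out
        b2 -= 0.5 * d_out.sum(0)
        W1 -= 0.5 * X.T @ d_h
        b1 -= 0.5 * d_h.sum(0)

    print(out.round(2))  # should approach [[0], [1], [1], [0]]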
Emory, Georgia Tech Team Up on High-Performance Computing Cluster
Campus Technology (01/08/14) Dian Schaffhauser
Researchers at Emory University and Georgia Tech will share the computing power of TARDIS, a new high-performance computing cluster. TARDIS replaces Ellipse, a legacy cluster acquired by Emory in 2007. The new cluster will occupy only one-twentieth of the space used by Ellipse, consume less energy, and generate less heat, while delivering faster performance and providing more storage with backup. "The performance advantages will be significant, and the power savings are tremendous," says Emory professor Dieter Jaeger. The new cluster will enable the processing of 20 exomes per hour, a 60-fold increase in speed, according to Emory professor Michael Zwick. "This is a dramatic improvement and will allow members of the Emory community to perform larger experiments faster and for less money," Zwick says. "We will be a significant user of the new cluster and our computational services will be taking advantage of this exciting new capability." Georgia Tech's Rich Computer Center will host TARDIS in its Partnership for an Advanced Computing Environment, and latency will be insignificant because the two sites share a 10-Gbps connection. "We are very excited to begin this next phase of collaborations between Georgia Tech and Emory, and look forward to strengthening this partnership for years to come," says Georgia Tech's Neil Bright.
MIT Debuts Online Big Data Course for Tech Pros
Network World (01/09/14) Ann Bednarz
The Massachusetts Institute of Technology (MIT) will offer an online big data course for technology professionals as part of its new lineup of Online X professional programs. The course, "Tackling the Challenges of Big Data," will run from March 4 to April 1, and will cover data collection from smartphones, sensors, and the Web. The course also will address data storage and processing, including scalable relational databases, Hadoop, and Spark; analytics, such as machine learning, data compression, and efficient algorithms; visualization; and a range of applications. MIT will use the Open edX platform to deliver the course, which will include learning assessments, case studies, discussion forums, and a community wiki. Faculty members from the Computer Science and Artificial Intelligence Laboratory will teach the course. Participants will receive an MIT Professional Education certificate for successfully completing the course, and will gain access to the group's professional alumni network.
Lawrence Livermore Explores the Shape of Data, Expanding Query-Free Analytics
Government Computer News (01/09/14) William Jackson
Lawrence Livermore National Laboratory is using topological data analysis research to explore new ways of obtaining useful information from extremely large, complex data sets. The lab is working with Stanford University spinoff Ayasdi, which is funded by the U.S. Defense Advanced Research Projects Agency and the National Science Foundation. Ayasdi's Insight Discovery software handles big data problems by extracting information from very large data sets. The lab, which uses high-performance computing for modeling and simulation in energy, climate change, biological defense, and national security, has developed its own tools in the past but is now looking to commercial technologies, as big data gains momentum in the technology world. In particular, the lab believes its work could benefit from topological data analysis, which studies the shapes and meanings of vast, high-dimensional data sets. Topological methods focus geometrically on pattern and shape recognition within data, which enables scientists to make discoveries without specifically seeking them. The Insight Discovery software analyzes data to produce dimensional shapes and uses algorithms to extract relationships, without querying databases. The U.S. Department of Agriculture is using the software to study E. coli bacteria. One particular topological data analysis focus for the Lawrence Livermore lab will be bioinformatics.
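Topological data analysis is often implemented with the published "mapper" algorithm, which is generally cited as the basis of Ayasdi's approach; the sketch below runs that pipeline on synthetic data. The lens, cover, and clustering choices are illustrative assumptions, not the Insight Discovery implementation:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    points = rng.normal(size=(500, 10))     # stand-in high-dimensional data

    # 1. A "lens" (filter) maps each point to a value; here, one coordinate.
    lens = points[:, 0]

    # 2. Cover the lens range with overlapping intervals.
    edges = np.linspace(lens.min(), lens.max(), 6)
    nodes = []
    for a, b in zip(edges[:-1], edges[1:]):
        pad = 0.25 * (b - a)                # 25 percent overlap between bins
        idx = np.where((lens >= a - pad) & (lens <= b + pad))[0]
        if len(idx) < 2:
            continue
        # 3. Cluster the points in each bin; every cluster becomes a node.
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(points[idx])
        for lab in set(labels):
            nodes.append(set(idx[labels == lab]))

    # 4. Nodes that share points are linked; the resulting graph is the
    #    "shape" of the data, obtained without writing any database query.
    links = [(i, j) for i in range(len(nodes))
             for j in range(i + 1, len(nodes)) if nodes[i] & nodes[j]]
    print(len(nodes), "nodes,", len(links), "links")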
Eye-Catching Electronics
ETH Zurich (01/07/14)
ETH Zurich researchers are developing electronic components that are thinner and more flexible than previous technologies, opening up new possibilities for ultra-thin, transparent sensors. "These new thin-film transistors adhere to a wide range of surfaces and adapt perfectly," says ETH Zurich's Niko Munzenrieder. The membrane consists of the polymer parylene, which the researchers evaporate layer by layer onto a conventional two-inch wafer. The researchers then release the parylene film with its attached electronic components from the wafer. They say an electronic component fabricated in this way is extremely flexible, adaptable, and transparent. The researchers say "smart" contact lenses are one potential use of the technology and, during testing, they attached the thin-film transistors to standard contact lenses. The lenses were placed on an artificial eye and tested to see whether the electronics could withstand the bending radius of the eye and continue to function. Although the tests were successful, the researchers say they must still overcome a few technical problems before a commercially viable solution can be considered. For example, the way in which the electronics are attached to surfaces such as a contact lens has to be optimized to account for the effects of the aqueous ocular environment.
NASA Robots Blaze the Trail for Humans on Mars
Computerworld (01/07/14) Sharon Gaudin
The U.S. National Aeronautics and Space Administration (NASA) launched the Mars rovers Spirit and Opportunity in the summer of 2003 with the goal of having the machines survive and transmit data from the Martian surface for three months. Spirit, which sent more than 128,000 images of Mars back to Earth, worked for about seven years before it became stuck in soft sand and was abandoned in 2010, but Opportunity is still operational more than 10 years after its launch. "The robots we put down on Mars are our avatars right now," says NASA scientist John Connolly. "They are our eyes, our feet, our hands on the ground that inform us before we get there. Without these robots doing this work, it would be a very risky endeavor getting to Mars." In addition, sending humans to Mars largely depends on the information the robotic rovers and the Mars orbiters send back about the planet's geology, mineral makeup, water reserves, and atmosphere. NASA wants to send humans to Mars in the 2030s, but first the space agency plans to send at least one more rover and more orbiters to make more discoveries.
Using Psychology to Create a Better Malware Warning
Threatpost (01/07/14) Chris Brook
Cambridge University cryptography and computer science experts have released a report on the psychology of malware warnings, suggesting that using different language could make the messages more effective. Studies show that average users tend to ignore computer warnings due to daily overexposure and the difficulty of separating actual threats from inconveniences. "There is a need for fewer but more effective malware warnings...particularly in browsers," the report says. The researchers presented more than 500 men and women with variations of a Google Chrome warning. Concrete rather than general warnings were the most effective, and invoking authority also improved attentiveness. "Warning text should include a clear and nontechnical description of potential negative outcome or an informed direct warning given from a position of authority," the researchers say. Social influence also appears to sway users: study participants often clicked through warnings when friends told them it was safe. In addition, nine out of 10 respondents said they leave computer warnings turned on, with just one in 10 preferring to turn them off.
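As a rough illustration of the report's guideline, the snippet below assembles a warning that names a concrete, nontechnical outcome and speaks from a position of authority. The wording and function are invented for illustration; the study tested variants of Chrome's existing warning, not this text:

    # Invented example applying the report's advice: concrete outcome,
    # nontechnical wording, and an authoritative source.
    def build_warning(site, authority="Your browser's security team"):
        return (f"{authority}: The site '{site}' may try to steal your "
                "passwords or credit card numbers. We strongly advise "
                "you not to continue.")

    print(build_warning("example.com"))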
Abstract News © Copyright 2014 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe