Association for Computing Machinery
Welcome to the January 31, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.

HEADLINES AT A GLANCE


Storage System for 'Big Data' Dramatically Speeds Access to Information
MIT News (01/31/14) Helen Knight

Massachusetts Institute of Technology (MIT) researchers have developed BlueDBM, a storage system based on a network of flash storage devices that can dramatically accelerate access to information for big-data analytics. In BlueDBM, each flash device is connected to a field-programmable gate array (FPGA) chip to create an individual node. The FPGAs control the flash devices and can also perform processing operations on the data itself, says MIT researcher Sang-Woo Jun. "This means we can do some processing close to where the data is [being stored], so we don't always have to move all of the data to the machine to work on it," he notes. The FPGA chips also can be linked together using a high-performance, very low-latency serial network. Using multiple nodes enables the storage network to match the bandwidth and performance of much more expensive systems, the researchers report. They have already built a four-node prototype network and are currently working on a much faster 16-node prototype in which each node will operate at 3 Gbps. "If we're fast enough, if we add the right number of nodes to give us enough bandwidth, we can analyze high-volume scientific data at around 30 frames per second, allowing us to answer user queries at very low latencies, making the system seem real-time," says researcher Ming Liu.
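The arithmetic behind that claim is straightforward to check. The sketch below is a back-of-the-envelope estimate only: the 16-node count and 3-Gbps per-node rate come from the article, while the 200-MB frame size is a hypothetical value chosen to show how roughly 30 frames per second falls out of the aggregate bandwidth.

```python
# Back-of-the-envelope estimate for a BlueDBM-style network of flash nodes.
# Node count and per-node rate are from the article; the frame size is an
# illustrative assumption, not a figure reported by the MIT researchers.

def aggregate_gbps(num_nodes: int, per_node_gbps: float) -> float:
    """Total streaming bandwidth if every node contributes its full link rate."""
    return num_nodes * per_node_gbps

def frames_per_second(total_gbps: float, frame_size_mb: float) -> float:
    """Frames of a given size (in megabytes) the network can stream each second."""
    bits_per_frame = frame_size_mb * 8e6
    return (total_gbps * 1e9) / bits_per_frame

total = aggregate_gbps(num_nodes=16, per_node_gbps=3.0)   # 48 Gbps aggregate
print(frames_per_second(total, frame_size_mb=200.0))      # -> 30.0 frames per second
```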


Researchers Implement HPC-First Cloud Approach
HPC Wire (01/29/14) Tiffany Trader

North Carolina State University researchers have demonstrated a proof-of-concept for a novel high-performance cloud computing platform by merging a cloud computing environment with a supercomputer. The implementations show that a fully functioning production cloud computing environment can be completely embedded within a supercomputer, allowing users to benefit from the underlying high-performance computing (HPC) hardware infrastructure. The supercomputer's hardware provided the foundation for a software-defined system capable of supporting a cloud computing environment. This "novel methodology has the potential to be applied toward complex mixed-load workflow implementations, data-flow oriented workloads, as well as experimentation with new schedulers and operating systems within an HPC environment," the researchers say. The software utility package Kittyhawk serves as a provisioning engine and offers basic low-level computing services within a supercomputing system; it is what allowed the researchers to construct an embedded elastic cloud computing infrastructure within the supercomputer. The HPC-first design approach to cloud computing leverages the "localized homogeneous and uniform HPC supercomputer architecture usually not found in generic cloud computing clusters," according to the researchers. This type of system has the potential to support multiple workloads, including traditional HPC simulation jobs, workflows that combine HPC and non-HPC analytics, and data-flow oriented work.


Quantum Engineers Take a Major Step Towards a Quantum Computer
University of Bristol News (01/30/14)

Researchers at the University of Bristol have generated and manipulated single particles of light on a silicon chip, which they say represents a major breakthrough in the race to build a quantum computer. The researchers say they are exploiting state-of-the-art engineering processes and principles to make significant strides in a field previously dominated by scientists. The new chip integrates components that can generate photons inside the chip itself. "This could eventually lead to an optical quantum computer capable of performing enormously complex calculations," says Bristol researcher Joshua Silverstone. The researchers next plan to integrate the remaining necessary components onto a chip and show that large-scale quantum devices using photons are possible. "We hope to have, within the next couple of years, photon-based devices complex enough to rival modern computing hardware for highly specialized tasks," says Bristol researcher Mark Thompson. However, the researchers say that realizing the full potential of quantum machines will require a new field of engineering.


Scientist Developing 3D Chips to Expand Capacity of Microprocessors
National Science Foundation (01/29/14) Marlene Cimons

Stony Brook University researchers say they are developing new technology, circuits, and algorithms for the next generation of microprocessors, mobile computing devices, and communication chips. They hope to overcome the fundamental limitations of current electronic systems, such as high power consumption, says professor Emre Salman. The researchers are working on three-dimensional (3D) integration, an emerging technology that vertically stacks multiple wafers and could significantly enhance the capability of current two-dimensional chips. In 3D integration technology, discrete chips or tiers are stacked on top of each other before they are packaged. A single 3D chip would be able to detect data from the environment, process and store the data using advanced algorithms, and wirelessly transmit the data to a remote center. The researchers say they are focusing on an approach with the potential to expand the 3D domain from high-performance computing to relatively low-power systems-on-chip, which integrate multiple functions into a single 3D chip. "Our fundamental objective is to develop a reliable 3D analysis and design platform for these applications which will host future electronics systems that are increasingly more portable, can interact with the environment, consume low power, yet still offer significant computing capability," Salman says.


How a Database of the World's Knowledge Shapes Google's Future
Technology Review (01/27/14) Tom Simonite

In an interview, Google vice president of engineering John Giannandrea discusses Google's Knowledge Graph and the company's efforts to teach computers to better understand the meaning of data. Google plans to create products that better understand users and their search needs via Knowledge Graph, a cross-company effort that encompasses nearly all of the structured data from Google's products. Knowledge Graph builds a symbolic structure of ideas and common sense that captures relationships among entities such as food, sports, or famous people. Giannandrea says the technology will take Google's Web search a step closer to understanding the context of documents, for example, knowing that a document about famous tennis players also is about tennis and sports. Google.com now performs some question answering and offers a new exploration feature, the carousel, for browsing categories of entities. Knowledge Graph is continuously updated with new information and even contains some subjective information. "One of the main areas is to try and understand at a slightly higher level what text is about," Giannandrea says. "Words that you see in a text are fundamentally ambiguous [to a computer] but if you have Knowledge Graph and can understand how the words are related to each other, then you can disambiguate them."
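As a rough illustration of the disambiguation idea Giannandrea describes, the toy sketch below scores two candidate senses of an ambiguous word by how much their related concepts overlap with the surrounding context. It is not Google's Knowledge Graph or its API; the entities and related terms are invented for the example.

```python
# Toy entity disambiguation against a tiny hand-built graph of related concepts.
# Purely illustrative; the candidate senses and their related terms are made up.

CANDIDATES = {
    "jaguar": {
        "Jaguar (animal)": {"cat", "rainforest", "predator", "wildlife"},
        "Jaguar (car)":    {"car", "engine", "luxury", "sedan"},
    }
}

def disambiguate(mention: str, context_words: set) -> str:
    """Pick the candidate sense whose related concepts overlap the context most."""
    senses = CANDIDATES[mention.lower()]
    return max(senses, key=lambda entity: len(senses[entity] & context_words))

print(disambiguate("Jaguar", {"the", "engine", "purred", "luxury"}))
# -> "Jaguar (car)"
```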


Menthal App Reveals Just What a Screen Addict You've Become
Wired.co.uk (01/27/14) Olivia Solon

Smartphone owners will be able to determine how much time they spend on their devices using an app developed by computer scientists and psychologists at the University of Bonn. The free Android app, called Menthal, is designed to enable people to monitor and measure their smartphone use. "This app can show us in detail what someone's average cellphone consumption per day looks like," says Bonn professor Alexander Markowetz. The researchers want to determine what constitutes normal mobile phone use and at what point use becomes excessive. The app also could potentially be used to measure the progression and severity of depression, because smartphones can record when someone makes fewer phone calls and, thanks to their built-in global positioning system, goes outside less frequently. The Bonn team tested Menthal on 50 students over six weeks and found that 25 percent used their phones for more than two hours a day, and that participants activated their phones more than 80 times a day. Typical users spoke for eight minutes a day, wrote 2.8 text messages, and spent half of their phone time on messenger apps or social networking sites.
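For a sense of what such an app has to compute, the sketch below aggregates a hypothetical event log of screen unlocks and per-app sessions into a daily summary. The event format, categories, and numbers are invented; this is not Menthal's code or data model.

```python
# Hypothetical aggregation of a phone-usage event log into daily statistics.
# The log format and values are invented for illustration only.

from collections import Counter

EVENTS = [  # (day, event_type, detail, duration_minutes)
    ("2014-01-20", "unlock", None, 0.0),
    ("2014-01-20", "app", "messenger", 12.5),
    ("2014-01-20", "unlock", None, 0.0),
    ("2014-01-20", "app", "call", 3.0),
]

def daily_summary(events, day):
    """Count unlocks and sum per-app minutes for one day."""
    unlocks = sum(1 for d, kind, _, _ in events if d == day and kind == "unlock")
    minutes = Counter()
    for d, kind, detail, duration in events:
        if d == day and kind == "app":
            minutes[detail] += duration
    return {"unlocks": unlocks, "minutes_by_app": dict(minutes)}

print(daily_summary(EVENTS, "2014-01-20"))
# -> {'unlocks': 2, 'minutes_by_app': {'messenger': 12.5, 'call': 3.0}}
```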


Inside the First-Year Data From MITx and HarvardX
Campus Technology (01/27/14) Rhea Kelly

In an interview, Harvard University professor Andrew Ho discusses the recent Harvard and Massachusetts Institute of Technology (MIT) study of edX massive open online course (MOOC) data. The project, for which Ho served as a lead researcher, aimed to improve the understanding of how students learn and how technology can aid the teaching process both online and in the classroom. The researchers studied an average of 20 gigabytes of data for each of 17 courses offered on the edX platform from 2012-2013. The study found that 50 percent of MOOC registrants dropped out within a week or two of enrolling, but attrition rates fell significantly thereafter. "Our results show considerable differences on all demographic variables, from gender to age to prior educational attainment," Ho says. He also says MOOC course completion rates are misleading, because many registrants view course material as Web content to surf, and learn without completing the course. Asynchronicity presents another difficulty in interpreting MOOC data, because students sometimes enroll in courses months after the certification window has closed, and thus are automatically tallied as dropouts. Ho says further MOOC research is needed, and notes that Harvard and MIT are already preparing their Year 2 research, which includes randomized experiments to estimate causal impacts on performance and persistence.
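The asynchronicity point is easy to see with a small calculation. The sketch below is not the HarvardX/MITx analysis code; it uses invented registration records to show how counting late registrants as dropouts deflates the apparent completion rate.

```python
# Illustrative comparison of two completion-rate definitions using made-up data.
# Registrants who enrolled after certification closed cannot complete, so a raw
# rate counts them as dropouts even if they only came to browse the material.

from datetime import date

REGISTRANTS = [  # hypothetical (enroll_date, completed) records
    (date(2012, 10, 1), True),
    (date(2012, 10, 5), False),
    (date(2013, 2, 1),  False),   # enrolled after certification closed
    (date(2013, 3, 15), False),   # enrolled after certification closed
]
CERTIFICATION_CLOSED = date(2013, 1, 15)

def completion_rate(records):
    """Fraction of the given registrants who completed the course."""
    return sum(1 for _, completed in records if completed) / len(records)

eligible = [r for r in REGISTRANTS if r[0] <= CERTIFICATION_CLOSED]
print(completion_rate(REGISTRANTS))   # 0.25 -- late browsers counted as dropouts
print(completion_rate(eligible))      # 0.5  -- only registrants who could certify
```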


IU Scientist Exploring Artificial Intelligence
Bloomington Herald-Times (IN) (01/27/14) Dan Denny

Indiana University researchers want to use artificial intelligence (AI) to give doctors a tool to help them diagnose and treat patients. "Artificial intelligence can detect patterns matching a patient's disease progression and recommend up-to-date, cost-effective treatment plans to a human doctor," says Indiana professor Kris Hauser. The researchers have received a U.S. National Science Foundation grant to develop and test prototype decision-support systems designed to help physicians diagnose and treat patients with heart problems and clinical depression. "AI systems can digest relevant information and put it all on the table, ultimately making healthcare more transparent and cost-effective," Hauser says. The researchers are using statistical relational-learning techniques to find patterns in large electronic healthcare databases, and then feeding those patterns into a mathematical framework that produces a set of possible treatment sequences. "The system will present information to doctors about what might be the best treatment plan, along with the plan's expected outcome and costs," Hauser says. He also notes the systems can help physicians make the best diagnoses and order the most appropriate and effective tests.


We the Internet: Bitcoin Developers Seed Idea for Bitcloud
Phys.Org (01/27/14) Nancy Owano

Bitcoin developers want to decentralize the current Internet and replace it with a new one. The Bitcloud developer group has proposed a peer-to-peer system for sharing bandwidth in which individual users perform tasks such as routing, storage, or computation in exchange for cash. The approach would financially reward nodes for routing traffic on a brand-new mesh network, and the researchers note their cash model would eliminate the need for Internet service providers. The proposal is based on the ideas of Bitcoin, MediaGoblin, and Tor, and the researchers are looking for more developers to assist with the project. They point out that the initiative demands a "massive amount of thought and development in many different parts of the protocol, so we need as many people helping as possible." Cloudcoins would be the system's currency, operating much as bitcoins do for the Bitcoin protocol. "If you're interested in privacy, security, ending Internet censorship, decentralizing the Internet, and creating a new mesh network to replace the Internet, then you should join or support this project," according to a recent appeal on Reddit.
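The incentive mechanism can be pictured as a shared ledger that credits nodes for the traffic they relay. The sketch below is a toy illustration only, not the Bitcloud protocol; the credit rate, ledger structure, and node names are assumptions made for the example.

```python
# Toy ledger crediting mesh nodes for routed traffic, redeemable as "cloudcoins".
# The rate and bookkeeping are invented; Bitcloud's actual mechanism may differ.

from collections import defaultdict

CREDIT_PER_GB = 0.05           # hypothetical cloudcoins earned per gigabyte routed
ledger = defaultdict(float)    # node id -> cloudcoin balance

def record_routing(node_id: str, gigabytes_routed: float) -> None:
    """Credit a node for traffic it relayed across the mesh."""
    ledger[node_id] += gigabytes_routed * CREDIT_PER_GB

record_routing("node-a", 120.0)
record_routing("node-b", 45.5)
print(dict(ledger))            # {'node-a': 6.0, 'node-b': 2.275}
```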


IST Professor, CSE Grad Develop Tools to Access 'Scholarly Big Data'
Penn State News (01/24/14) Stephanie Koons

Pennsylvania State University (PSU) researchers have developed recommendation systems for expert and collaborator discovery that enable users to access scholarly big data. "For data-driven research, we have a very good opportunity for big data research because we are one of the few groups that have such a large volume of data," says PSU's Hung-Hsuan Chen. The researchers propose CSSeer, a free and publicly available key phrase-based recommendation system for expert discovery that is built on the CiteSeerX digital library and uses Wikipedia as an auxiliary resource. The system generates key phrases from the title and abstract of each document in CiteSeerX, and those key phrases are used to infer the author's expertise. "The system automatically figures out who are the experts of a given area," Chen says. CiteSeerX aims to improve the dissemination of scientific literature and to provide improvements in functionality, usability, availability, cost, comprehensiveness, efficiency, and timeliness in access to scientific and scholarly knowledge. PSU professor C. Lee Giles says the long-term plans for CiteSeer include making the system easier and more efficient to run, improving data-extraction methods, increasing the number of documents, and automating the extraction of tables and figures in documents.
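The core idea, attributing key phrases from a paper's title and abstract to its authors and then ranking authors for a query phrase, can be sketched in a few lines. The toy below is not CSSeer's implementation; the corpus, the crude phrase extraction, and the frequency-based scoring are simplified placeholders.

```python
# Minimal key-phrase-based expert discovery in the spirit of CSSeer.
# Corpus, phrase extraction, and scoring are simplified stand-ins, not CSSeer code.

from collections import Counter, defaultdict

PAPERS = [  # hypothetical (authors, title + abstract) records
    (["A. Liu"],            "deep learning for image retrieval using neural networks"),
    (["A. Liu", "B. Chen"], "neural networks for large scale image classification"),
    (["C. Park"],           "database indexing techniques for scholarly big data"),
]

STOPWORDS = {"for", "using", "of", "and", "the", "a"}

def key_phrases(text):
    """Crude stand-in for key-phrase extraction: keep non-stopword unigrams."""
    return [word for word in text.lower().split() if word not in STOPWORDS]

expertise = defaultdict(Counter)            # author -> key-phrase frequency profile
for authors, text in PAPERS:
    for phrase in key_phrases(text):
        for author in authors:
            expertise[author][phrase] += 1

def recommend_experts(query, top_k=2):
    """Rank authors by how often the query phrase occurs in their papers."""
    scores = {author: profile[query.lower()] for author, profile in expertise.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

print(recommend_experts("neural"))   # -> ['A. Liu', 'B. Chen']
```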


Air Force Researchers Plant Rootkit in a PLC
Dark Reading (01/27/14) Kelly Jackson Higgins

Researchers at the U.S. Air Force Institute of Technology's Center for Cyberspace Research have demonstrated a prototype rootkit capable of infiltrating the firmware and corrupting the operations of programmable logic controllers (PLCs), which are used in many different sectors to operate industrial and other machinery. Center director Jonathan Butts says he and fellow researcher Stephen Dunlap did not exploit bugs to insinuate their code into PLCs. "We just used methods to code the system up where you take advantage of and embed your own malicious software to run on top of the firmware," he says. The rootkit had two payloads: one that would make the PLC shut down seemingly at random and another that would permanently brick the device so that it would have to be replaced. The researchers say they were able to develop the rootkit in less than four months and for a cost of about $2,000. Butts says the rootkit could infect a PLC through a malicious firmware update or via an infected USB drive connected to the PLC via a laptop. Butts and Dunlap suggest that vendors, integrators, and operators can all take steps to improve the security of PLCs.


Augmented Reality Lifts Awareness of Nature Preservation
UC Newsroom (01/21/14) Chris Casey

University of Colorado Denver digital design students created an augmented reality app for the U.S. Fish and Wildlife Service (USFWS) that is designed to enable park visitors to enjoy virtual wildlife sightings in nature. The USFWS approached professor Michelle Carpenter's Design Studio 3 class about enhancing its "Get Your Goose On" program, aimed at increasing awareness of the National Wildlife Refuge System, especially among young people. "We thought maybe we could use augmented reality to increase interest from younger people," Carpenter says. "That way people could have a phone or tablet and see animals, including endangered animals, which might appear there." The students say the interactive quality of augmented reality, as well as the ability to use smartphones and other devices, will appeal to young people. Thus far, the team has created a virtual deer and a bald eagle that appear when a device is held up to the "Get Your Goose On" logo. The USFWS is considering using the app at the Rocky Mountain Arsenal National Wildlife Refuge. Carpenter says the app also could enable users to take photos with their devices and view a drop-down screen with information about a particular animal.


Why Aren't More Girls Interested in Computer Science?
CCC Blog (01/27/14) Shar Steed

In an interview on HLN Weekend Express, Barbara Ericson, director of computer outreach at the Georgia Institute of Technology, discusses the alarming findings of a recent study on girls in computer science. The study found that boys outnumber girls in high school Advanced Placement (AP) Computer Science classes four to one. In addition, no girls took the AP Computer Science test in Mississippi, Montana, and Wyoming, and the highest percentage of girls taking the test in any state was 29 percent for Tennessee. Computer science is an elective, and Ericson sees that as contributing to the gender gap in participation. She cites stereotypes that discourage girls and the abstract, individualized nature of computer science classwork as other reasons. Girls tend to prefer more practical and social activities, Ericson says. One way to help close the gender gap is to make computer science a core requirement. Ericson also suggests training more teachers, which would provide greater access for students.


Abstract News © Copyright 2014 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe