Welcome to the October 27, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).
HEADLINES AT A GLANCE
How Big Data, Mobile, and Cloud Are Fighting Ebola in Sierra Leone
ComputerWeekly.com (10/27/14) Alex Scroxton
IBM is using data analytics, mobility, and cloud computing technologies to control the spread of Ebola in Sierra Leone. "Africa is ground zero, and we want to be essential to the continent of Africa so, when this disease became pervasive in West Africa, we felt we had no choice," says IBM Africa research lab chief scientist Uyi Stewart. IBM, in conjunction with Sierra Leone's Open Government Initiative (OGI) and its technology partners, has implemented a system to enable citizen reporting of Ebola through SMS and voice. "This is our core strength, to collect data through innovative approaches and analyze it to generate actionable innovation," Stewart says. IBM uses cloud-based supercomputers to aggregate and correlate a wide range of data, and the results are shared with people on the ground through OGI. In addition, IBM has deployed SoftLayer cloud technology to set up an Ebola Open Data Repository, providing governments, aid agencies, and researchers with free and open access to data. IBM's Africa research lab aims to use science and technology to deliver commercially viable innovation that improves the lives of people across the continent. "Our ultimate goal is to help transform the African continent and the human condition," Stewart says.
Using Cash and Pressure, China Builds Its Chip Industry
The New York Times (10/26/14) Paul Mozur
China wants to transform its chip industry into a world leader by 2030 and become less reliant on foreign-sourced technology. McKinsey & Co. reported in June the government could spend $170 billion over the next five to 10 years to support promising chip makers. "There is a clear sense of urgency nowadays about semiconductors and chips in particular," says Rhodium Group founding partner Daniel H. Rosen. "There is a sense that since China is overwhelmingly still dependent on imports--especially for higher-end chips that go into everything made in the country--there is a national security vulnerability." Government subsidies have helped the Semiconductor Manufacturing International Corporation (SMIC) become a major chip producer since its founding in 2000. Chinese officials have a deeper concern about U.S. government surveillance and the use of foreign-sourced technology components such as chips due to the fallout over the disclosures by former U.S. National Security Agency contractor Edward Snowden. In recent months, China has increased pressure on multinational companies facing antitrust and price-fixing probes, and resolutions with tech companies that involve lower licensing fees or other technology-sharing arrangements would fit officials' industrial policy goals. Meanwhile, some tech security analysts say Chinese hackers are targeting the chip technology of foreign companies.
Twitter Grants Select Researchers Access to its Public Database
The Washington Post (10/26/14) Mohana Ravindranath
In February, Twitter announced a data grant program offering a handful of research teams free access to its database. The academic researchers making the requests want to study almost a decade of historical data, according to Twitter's Chris Moody. He notes researchers want to develop models to predict the success of political campaigns, the spread of public health crises, and other phenomena. "They call it 'back-testing'--they needed to back-test their hypotheses," Moody says. A project from Harvard University and Boston Children's Hospital researchers was chosen as one of six for the data grant pilot program. The project combines food-poisoning reports from the U.S. Centers for Disease Control and Prevention with content from Internet users to better understand the spread of food-borne illnesses. Meanwhile, a University of California, San Diego team is examining whether happy people are likely to post happy images on Twitter, enabling the group to measure the relative happiness of cities' citizens. University of Twente researchers are assessing the effectiveness of social-media campaigns encouraging early cancer detection, while University of East London scientists are studying a potential link between public tweets and sports team performance. Separately, Twitter has committed $10 million over five years to establish a social-media research lab at the Massachusetts Institute of Technology.
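The article does not describe the researchers' methods, but the "back-testing" Moody mentions can be illustrated with a minimal sketch: compare a historical tweet-volume series against an outcome series and measure how strongly they move together. All numbers and the food-poisoning example below are invented for demonstration.

```python
# Illustrative "back-testing" sketch: correlate a hypothetical weekly
# tweet-count series with an outcome series. A strong correlation on
# historical data would support a predictive hypothesis.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly counts: tweets mentioning "food poisoning" vs.
# reported cases in the same weeks (made-up data).
tweets = [120, 95, 210, 340, 180, 400]
cases = [10, 8, 19, 30, 15, 37]

r = pearson(tweets, cases)
print(f"correlation r = {r:.3f}")  # near 1.0 here by construction
```

In practice researchers would use far longer series, control for confounders, and hold out data, but the core idea of testing a hypothesis against archived tweets is the same.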
U.S. Fights Critiques of How Web Is Managed
The Wall Street Journal (10/26/14) Drew FitzGerald
U.S. officials are tackling opposition from countries over how the Internet is overseen. At the International Telecommunication Union's (ITU) conference in South Korea, which began last week, more than 190 nations are debating whether the ITU's mandate on "information and communication technology" includes the Internet, which U.S. officials and their allies want to keep separate. The charter for the ITU, the United Nations agency in charge of radio and telephone standards, doesn't specifically cover the Internet, and U.S. delegates to the conference say Internet governance is best handled by organizations such as the Internet Corporation for Assigned Names and Numbers, which do not answer to any government. However, some countries, including Russia and several Arab states, want the ITU to have more say in Internet governance. Although the debate has been ongoing for more than 10 years, the issue intensified following the reports of widespread U.S. National Security Agency network surveillance. Some Western diplomats worry giving the ITU authority over the Internet could give authoritarian governments increased control over the Internet beyond their borders, while U.S. officials are concerned the issues under discussion at the ITU conference could influence the way smaller countries create laws about privacy and censorship.
Ghosts in the Machine Language
The Economist (10/24/14)
Four of this year's most dangerous software exploits--Heartbleed, Goto Fail, Shellshock, and POODLE--were all examples of vulnerabilities that lay undiscovered in widely utilized code for years. When they eventually were discovered, their impact was deep and far-reaching, a phenomenon that likely will only get worse as the use of open source software proliferates. Meanwhile, the wide-ranging use of software and operating systems (OSes) based on decades-old code is potentially more dangerous. For example, both Android and the Apple OSes have roots in the 1970s-era UNIX OS, as do numerous embedded devices such as set-top boxes, routers, and game consoles. Along with the useful elements of UNIX, these devices and OSes often include vestigial bits of code that are decades old and widely available, creating innumerable opportunities for new exploits to emerge in code people have forgotten to monitor. The threat is especially insidious in simpler embedded devices, which often require only enough code to carry out a few simple tasks, but nevertheless contain a complete operating system. Johns Hopkins cryptographer Matthew Green says this creates a situation in which "nobody knows all the [device's] features, let alone all the bugs."
New Research Center Aims to Develop Second Generation of Surgical Robots
The New York Times (10/23/14) John Markoff
University of California (UC), Berkeley scientists are establishing the Center for Automation and Learning for Medical Robots, a research center intended to help develop medical robots that can perform low-level and repetitive surgical tasks, enabling human doctors to concentrate on the most challenging and complex aspects of operations. "Our goal is to help surgeons focus on the critical aspects of surgery, rather than having to perform each tedious and repetitive subtask," says UC Berkeley professor Ken Goldberg. In May, the researchers presented a paper detailing the first example of a robot automating surgical tasks involving soft tissue. The da Vinci surgical robot is normally operated by surgeons at a workstation who remotely control instruments inserted through small incisions during minimally invasive procedures. In a series of experiments, the researchers "taught" the da Vinci system to cut away small fragments of cancerous tissue on its own and to make a circular incision without human guidance. "I think this is a small but good step toward a hard-to-reach goal of enabling us to show a robot how to do something," says Stanford University researcher Kenneth Salisbury.
Precise and Programmable Biological Circuits
ETH Zurich (10/22/14) Fabio Bergamin
ETH Zurich researchers have developed several new components for biological circuits they say are important for constructing precisely functioning and programmable bio-computers. The researchers want to create small circuits made from biological material that can be integrated into cells to change their functions. The researchers developed a biological circuit that controls the activity of individual sensor components using an internal "timer." The circuit prevents a sensor from being active when not required by the system. In the new biosensor, the gene responsible for the output signal is not active in its basic state because it is installed in the wrong orientation in the circuit DNA. "The input signals can be transmitted much more accurately than before thanks to the precise control over timing in the circuit," says ETH Zurich professor Yaakov Benenson. The researchers also have developed a signal converter that changes one signal into another, and also can be used to convert multiple input signals into multiple output signals. Benenson says the technology will increase the number of applications for biological circuits. "The ability to combine biological components at will in a modular, plug-and-play fashion means that we now approach the stage when the concept of programming as we know it from software engineering can be applied to biological computers," he says.
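Benenson's "plug-and-play" analogy can be made concrete with a software sketch. This is not a model of the actual biochemistry; it simply shows the composition idea the article describes: a timer gate that keeps a sensor inactive outside its window, and a converter that maps one signal to another. All component behaviors here are invented for illustration.

```python
# Software analogy for modular biological circuit components.

def timer_gate(sensor, start, stop):
    """Wrap a sensor so it only responds within the time window [start, stop)."""
    def gated(t, signal):
        return sensor(signal) if start <= t < stop else 0
    return gated

def converter(mapping):
    """Convert one input signal into another via a lookup table."""
    return lambda signal: mapping.get(signal, 0)

# Hypothetical components: a pass-through sensor active only during
# time steps 2-4, and a converter that maps signal 1 to signal 5.
sensor = timer_gate(lambda s: s, start=2, stop=5)
convert = converter({1: 5})

print(sensor(1, 1))           # 0: sensor is inactive outside its window
print(sensor(3, 1))           # 1: sensor passes the signal inside the window
print(convert(sensor(3, 1)))  # 5: converted output signal
```

The point of the analogy is that components compose: the converter's input can be any component's output, much as the researchers envision wiring biological parts together in a modular fashion.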
Data Mining Reveals How News Coverage Varies Around the World
Technology Review (10/23/14)
Qatar Computing Research Institute researchers Haewoon Kwak and Jisun An analyzed news agendas in different parts of the world to see how the coverage reflects actual international events. They developed a cartogram by forming a database of 195,000 disasters occurring between April 2013 and July 2014. The disasters were reported by more than 10,000 news outlets, and Kwak and An noted the countries in which each news outlet was based and counted the published stories from other parts of the world. They then created a map of the world showing where the news was from. The researchers found that people in South Asia consumed more news about disasters in that region than people in North America, and people in Latin America consumed significantly more news from Argentina than people in Europe did. However, the cartogram also revealed that people everywhere consumed relatively large amounts of news from Egypt and Syria, chiefly about the unrest in those countries and the accompanying humanitarian crises. Kwak and An also found that population size is significant: people in all regions were more likely to see disaster news from other large countries, probably because immigrants from those large countries create demand for that kind of coverage.
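The counting step behind such a cartogram can be sketched briefly: for each story, record the outlet's home country and the country where the disaster occurred, then tally the pairs. The records and country codes below are invented for illustration; the real study used 195,000 disasters and over 10,000 outlets.

```python
# Minimal sketch of tallying news coverage by (outlet country, disaster country).
from collections import Counter

# Hypothetical (outlet_country, disaster_country) pairs, one per article.
stories = [
    ("IN", "BD"), ("IN", "BD"), ("IN", "EG"),
    ("US", "EG"), ("US", "SY"), ("AR", "AR"),
]

coverage = Counter(stories)

def stories_about(target):
    """Total stories published about disasters in `target`, across all outlets."""
    return sum(n for (_, dst), n in coverage.items() if dst == target)

print(coverage[("IN", "BD")])  # 2: Indian outlets' stories about Bangladesh
print(stories_about("EG"))     # 2: all stories about Egypt
```

Scaling each country on the map by counts like `stories_about(...)`, per consuming region, yields the kind of cartogram the researchers produced.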
Google Teams Up With Oxford Academics to Bring Human-Like Robots Closer to Reality
Daily Mail (United Kingdom) (10/23/14) Victoria Woollaston
Google is collaborating with Oxford University artificial intelligence (AI) researchers to help machines better understand users, and to improve visual-recognition systems using deep learning. "It is a really exciting time for AI research these days, and progress is being made on many fronts, including image recognition and natural language understanding," says Demis Hassabis, co-founder of DeepMind and vice president of engineering at Google. "We are delighted to announce a partnership with Oxford University to accelerate Google's research efforts in these areas." Four Oxford researchers, who together co-founded Dark Blue Labs earlier this year, will work with Google on machine-learning technology. Three other Oxford professors, who co-founded Vision Factory, will work on visual-recognition systems. "These exciting partnerships underline how committed Google DeepMind is to supporting the development of U.K. academia and the growth of strong scientific research labs," Hassabis says. Google also is working with physicist John Martinis to build processors based on quantum theories. His hire is part of a hardware initiative to design and build chips operating on sub-atomic levels in ways that make them much faster than existing processors, according to Google.
'Wearable Technology' Curriculum Aims to Fuel Interest in STEM
University of Nebraska Omaha (10/22/14) Charley Reed
Researchers at the University of Nebraska Omaha (UNO) and the University of Nebraska-Lincoln (UNL) are developing a curriculum that will enable all students to learn the science behind "wearable technology." The three-year project, backed by a U.S. National Science Foundation grant, will bring inquiry-based activities to about 900 students in grades 4-6 who attend public school in Nebraska. Participating students will receive kits featuring conductive thread, light-emitting diode (LED) lights, sensors, and other components commonly found in high-tech clothing. The students also will work with microcontrollers--tiny circuit boards that can be programmed to direct the various devices. "It's hard to name an industry that isn't impacted by technology today and so the earlier that we can introduce students to the different ways technology is used today, the more options they will have available to them when they go to college," says UNO researcher Neal Grandgenett. The curriculum will help students learn basic principles of engineering design they can use to create LED-enabled bracelets and other apparel. "We're hoping to teach these students to think like engineers, and wearable technology is the vehicle that we're using to do it," says UNO professor Brad Barker. UNO also will integrate wearable technologies into graduate-level education offerings for current and future science, technology, engineering, and mathematics teachers.
Quantum Internet Could Cross Seas by Container Ship
New Scientist (10/21/14) Jacob Aron
Container ships could be used to create a kind of international quantum Internet, according to Simon Devitt from Ochanomizu University in Tokyo and colleagues. No one yet knows how to build quantum repeaters, the devices proposed to send quantum data over long distances, and the technology needed to store quantum bits also does not exist. The team examined the research of groups working on quantum hard drives and calculated that for diamond-based drives, a single shipping container could hold the equivalent of 125 bytes of quantum data, while for silicon-based drives the same space would hold nearly 200 terabytes. Most of the container space would be used for cooling and power to keep the drives functioning. Considering large ships can carry 10,000 containers, a fully loaded vessel on a 20-day voyage between Japan and the United States would have a transfer rate equivalent to between 10 bytes per second and 1 terabyte per second, depending on the memory used.
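The upper end of that range follows from a simple back-of-envelope calculation using the article's silicon-drive figure, assuming decimal terabytes (10^12 bytes):

```python
# Back-of-envelope check of the silicon-drive transfer rate:
# ~200 TB per container, 10,000 containers, 20-day voyage.
TB = 10**12                       # bytes per terabyte (decimal)

per_container = 200 * TB          # bytes of quantum data per container
containers = 10_000
voyage_seconds = 20 * 24 * 3600   # 20 days in seconds

total_bytes = per_container * containers
rate = total_bytes / voyage_seconds
print(f"{rate / TB:.2f} TB/s")    # roughly 1.16 TB/s, matching the ~1 TB/s figure
```

Two exabytes delivered over 20 days works out to just over a terabyte per second, consistent with the article's upper bound.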
Is the City of the Future Finally Here?
HPC Wire (10/21/14) Tiffany Trader
Argonne National Laboratory recently hosted the Argonne OutLoud series, during which grid computing pioneer and big data visionary Charlie Catlett delivered a presentation on "Big Data and the Future of Cities." The presentation examined how emerging technologies in high-performance computing, embedded systems, and data analytics can help mitigate some of the challenges associated with increased urbanization. Catlett says with data sources and technologies catalyzing new applications and services, there is an opportunity to make policy that is proactive rather than reactive. He says Argonne researchers are developing tools and methods to help social scientists, economists, policymakers, and climate scientists study cities. The primary goal is to get different fields that use computing and computer scientists that develop the systems to work together. "One of the ways to think about the laboratory is we run national and international instruments [such as Mira, the world's fifth fastest supercomputer] to do the kind of science you couldn't do in the laboratory of a university or a small company," Catlett says. He currently is working with the City of Chicago on the Array of Things project, which aims to build a platform that will enable researchers to collect data for various kinds of scientific inquiries into the sustainability and operation of cities.
Abstract News © Copyright 2014 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: email@example.com
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.