Welcome to the December 12, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.
ACM TechNews mobile apps are available for Android phones and tablets, for iPhones, and for iPads.
HEADLINES AT A GLANCE
By 2020, There Will Be 5,200 GB of Data for Every Person on Earth
Computerworld (12/11/12) Lucas Mearian
Over the next eight years, more than 40 zettabytes of digital data will be produced, which equals 5,200 GB of data for every person on Earth, according to IDC's latest Digital Universe study. A majority of the data will be produced by machines, such as sensors and smart devices, as they communicate with each other over data networks. By 2020, up to 33 percent of all data will contain information that might be valuable if analyzed, according to IDC. The study notes that business intelligence techniques could help analyze the data, revealing patterns in social media use, correlations in scientific data from discrete studies, and medical information combined with sociological data. "Herein is the promise of 'big data' or MapReduce technology--the extraction of value from the large untapped pools of data in the digital universe," IDC says. The study also predicts that the number of servers worldwide will grow 10-fold by the end of the decade and that the amount of information managed directly by enterprise data centers will grow by a factor of 14. In addition, the study estimates that about 33 percent of all data requires some type of protection, whether to safeguard personal privacy, adhere to regulations, or prevent digital theft.
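The MapReduce approach IDC alludes to can be illustrated with a minimal sketch. This is not a distributed implementation, just the two-phase pattern in plain Python; the record format and device names are invented for the example.

```python
from collections import defaultdict

def map_phase(records):
    # Map step: emit (key, 1) pairs -- here, one per device type in a log record
    for record in records:
        yield (record["device"], 1)

def reduce_phase(pairs):
    # Reduce step: aggregate the emitted values per key
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = [{"device": "sensor"}, {"device": "phone"}, {"device": "sensor"}]
counts = reduce_phase(map_phase(logs))  # {'sensor': 2, 'phone': 1}
```

In a real framework such as Hadoop, the map and reduce phases run in parallel across many machines, which is what makes the pattern suitable for zettabyte-scale data.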
Study: Tech Job Growth Strong
Government Technology (12/10/12)
Engine Advocacy and the Bay Area Council Economic Institute have released a report that details key findings on how technology sector jobs impact the overall U.S. economy. The report, "Technology Works: High-Tech Employment and Wages in the United States," says that since 2004, tech employment growth has outpaced the performance of the private sector by a ratio of three to one, and tech jobs have been more resistant to fluctuations in the economy. From 2002 to 2011, job growth was faster in science, technology, engineering and mathematics (STEM) fields than in all other occupations by a ratio of 27 to one. The high demand is expected to continue through 2020 and possibly beyond, with high-tech employment projected to continue to lead other sectors in growth. The report says high-tech and STEM jobs paid 17 percent to 27 percent more than other jobs. The report also notes that tech jobs are key to regional economic development. "The creation of one job in the high-tech sector of a region is associated with the creation of 4.3 additional jobs in the local goods and services economy of the same region in the long run," the report says.
U.N. Proposal Renews Concerns of Internet Power Grab
CNet (12/11/12) Declan McCullagh
During the ongoing World Conference on International Telecommunications, the International Telecommunication Union (ITU) has circulated draft language, known as DT/51-E, which would give the organization a more active role in the Internet's future. The proposal calls for the ITU to become involved in "Internet-related technical, development, and public policy issues." Other portions of DT/51-E deal with regulating the security of networks and bulk electronic messages. DT/51-E also states that national governments can manage the Internet's naming, numbering, addressing, and identification resources, a move that would take power away from the Internet Corporation for Assigned Names and Numbers. However, the White House is against increased ITU involvement in Internet-related issues. "Millions in the United States and around the world have already added their voices to this conversation, and their position is clear: they do not want the (ITU summit) to govern the Internet or legitimize more state control over online content," according to a White House statement. In addition, the Internet Society has told the ITU that some of the proposals could harm the long-term prospects of a global, open Internet. Meanwhile, a Russian-led coalition that includes China, Saudi Arabia, and the United Arab Emirates recently withdrew a proposal that would have gone even further than DT/51-E.
Is the Pixel About to Die?
University of Bath (12/11/12)
University of Bath researchers say they have developed a vector-based video codec that could lead to the death of the pixel within five years. The codec represents an image using contoured colors and fills in the regions between the contours. The result is a resolution-independent form of movie and image, capable of the highest visual quality but without using pixels. "This is a significant breakthrough, which will revolutionize the way visual media is produced," says Bath professor Phil Willis. "At the moment we're focusing on applications in post-production and we're working directly with leading companies in this area; however, there are clear applications in Web, tablets, and mobile which we haven't explored in detail yet." Willis notes that his group is working with Root6 Technology, Smoke & Mirrors, and Ovation Data Services on the project. "Involvement from a greater variety of companies with different interests will extend the project in a variety of ways and increase the potential applications of this game-changing research," he says.
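The core idea of resolution independence can be shown with a toy sketch (this is not the Bath codec, whose details are unpublished here): instead of storing a grid of pixels, store a function of continuous position and sample it at whatever resolution the display needs.

```python
def color_at(t):
    # A "vector" description: a continuous blend between two contour colors.
    # t is a position in [0, 1], not a pixel index.
    start, end = (255, 0, 0), (0, 0, 255)
    return tuple(round(s + (e - s) * t) for s, e in zip(start, end))

# The same description renders cleanly at any resolution -- no rescaling,
# no interpolation artifacts, because nothing was ever stored as pixels.
low_res = [color_at(i / 3) for i in range(4)]    # 4 samples
high_res = [color_at(i / 7) for i in range(8)]   # 8 samples, same image
```

A pixel-based image would have to be resampled (and degraded) to change resolution; the functional description simply gets evaluated at more points.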
NSF Joins in Commemorating Computer Science Education Week 2012
National Science Foundation (12/09/12) Lisa-Joy Zgorski
The U.S. National Science Foundation (NSF) is joining the effort to promote Computer Science Education Week, which commemorates the birthday of computing pioneer Grace Hopper and highlights the need to support computer science at all education levels. "Computer Science--or more broadly information technology or computing--drives our economy, ensures global competitiveness, accelerates the pace of discovery, and is crucial to achieve many of America's national and societal priorities," says NSF's Jan Cuny. "Yet, despite the growing demand for IT specialists and professionals with computer science skills in all disciplines, we are teaching less computer science in our schools." Computer science is the only science, technology, engineering, and math field that has more job openings than there are college graduates to fill them. The NSF directorate for Computer and Information Science and Engineering is working to address this problem by promoting ways to make computer science more available and engaging to K-12 students. Last year NSF began publishing its "Bits & Bytes" newsletter, which is designed to engage students in computer science by highlighting cutting-edge scientific research, videos, interactive activities, and profiles of inspiring computer scientists.
Crowdsourcing Site Compiles New Sign Language for Math and Science
U.S. and European researchers are developing sign language versions of specialized terms used in science, technology, engineering, and math (STEM) fields, such as "light-year," "organism," and "photosynthesis." The effort builds on a University of Washington project, launched in 2008, that uses crowdsourcing to enable members of the deaf and hard-of-hearing community to build their own sign language guide for STEM terms. With funding from Google and the U.S. National Science Foundation, Washington professor Richard Ladner helped launch the ASL-STEM Forum, an online compilation of signs used in STEM fields that is similar to Wikipedia. "The goal was to have one place where all these signs could be," Ladner says. "The goal of the forum is to be constantly changing, a reflection of the current use," he adds. The site lists 6,755 terms from biology, chemistry, engineering, math, and computer science textbooks. Ladner hopes a recent article in the New York Times outlining the effort will spur interest and encourage people to suggest more entries among the remaining terms. "I hope ASL-STEM Forum helps more deaf students become scientists and engineers," Ladner says.
Tiny Compound Semiconductor Transistor Could Challenge Silicon’s Dominance
MIT News (12/10/12) Helen Knight
Massachusetts Institute of Technology (MIT) researchers have developed an indium gallium arsenide transistor that measures 22 nanometers in length. MIT professor Jesus del Alamo says the breakthrough makes the compound material a promising candidate to eventually replace silicon in computing devices. "The more transistors you can pack on a chip, the more powerful the chip is going to be, and the more functions the chip is going to perform," del Alamo says. However, as silicon transistors are reduced to the nanometer-scale, the amount of current that can be produced by the devices also is shrinking, which limits their performance and makes it necessary to find alternatives. Indium gallium arsenide, which is already used in fiber-optic communication and radar technologies, is one possibility. The researchers say they built a nanometer-sized metal-oxide semiconductor field-effect transistor (MOSFET). "We have shown that you can make extremely small indium gallium arsenide MOSFETs with excellent logic characteristics, which promises to take Moore’s Law beyond the reach of silicon," del Alamo says. The researchers are now working to improve the electronic performance of the transistor by eliminating unwanted resistance within the device.
Do We Live in a Computer Simulation? UW Researchers Say Idea Can Be Tested
UW Today (12/10/12) Vince Stricherz
University of Oxford philosophy professor Nick Bostrom has suggested that the universe could be a computer simulation run by humanity's descendants, but limited computing resources have restricted scientists' ability to test that theory. Supercomputers currently use a technique called lattice quantum chromodynamics to simulate a portion of the universe slightly larger than the nucleus of an atom, and it could be years before they are powerful enough to simulate large parts of the universe, notes University of Washington professor Martin Savage. However, he says there are tests that supercomputers could perform now, or in the near future, that might uncover signatures of resource constraints in current simulations that would indicate whether people are living in a computer model. Savage notes that if the simulation is big enough, something like the universe should emerge. He says researchers could then look for a "signature" in the universe that has an analog in the current small-scale simulations. Savage and colleagues say the signature could show up as a limitation in the energy of cosmic rays. "This is the first testable signature of such an idea," Savage says.
How Technology May Help Cut Meat Consumption
Bloomberg BusinessWeek (12/10/12) Kevin Fitchard
Tech entrepreneurs, software developers, butchers, farmers, food industry executives, and health policy officials will meet at the upcoming Hack//Meat hackathon, sponsored by Food+Tech Connect, with the goal of finding a technological answer to the problems of meat supply, processing, distribution, health, and consumption. One of the biggest problems the group will address is how to get people to eat less meat. Foodpairing, a food industry research company and app developer, has created technology that can identify vegetable or seafood ingredients that reinforce the flavor of different meats, and in some cases act as a substitute for a meat. Foodpairing has isolated the molecular components that make up a meat's flavor and compiled databases that match those components with different ingredients. Foodpairing's Bernard Lahousse notes, for example, that "the flavor of chicken contains about 20 important flavor molecules, some of them you can find in coffee, bread, potato, mushroom, etc." Lahousse believes that by adjusting recipes and cooking techniques to use alternative ingredients on a wide scale the use of meat could be minimized or eliminated.
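Foodpairing's approach of matching ingredients by shared flavor molecules can be sketched with a simple set-overlap score. The molecule lists below are invented for illustration (real flavor databases such as Foodpairing's are proprietary), and Jaccard similarity stands in for whatever scoring the company actually uses.

```python
# Hypothetical flavor-molecule sets; names are illustrative, not real data
flavors = {
    "chicken":  {"2-methylpyrazine", "furfural", "hexanal", "methional"},
    "mushroom": {"hexanal", "methional", "1-octen-3-ol"},
    "coffee":   {"2-methylpyrazine", "guaiacol", "2-furfurylthiol"},
    "banana":   {"isoamyl acetate", "eugenol"},
}

def pairing_score(a, b):
    # Jaccard similarity: shared molecules relative to all molecules involved
    shared = flavors[a] & flavors[b]
    return len(shared) / len(flavors[a] | flavors[b])

# Find the best non-meat stand-in for chicken by flavor overlap
best = max((item for item in flavors if item != "chicken"),
           key=lambda item: pairing_score("chicken", item))
```

With these toy sets, mushroom scores highest against chicken, mirroring Lahousse's point that some of chicken's flavor molecules also occur in mushroom, coffee, and bread.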
Device Helps Children With Disabilities Access Tablets
Georgia Institute of Technology (12/10/12) Liz Klipp
Georgia Tech researchers have developed Access4Kids, a wireless input device that uses a sensor system to translate physical movements into fine-motor gestures to control a tablet. The device, when used in conjunction with open source applications and new software, enables children with fine motor impairments to access off-the-shelf apps, as well as custom-made apps for therapy and science education. The research helps children with disabilities "to use what’s in their mind so they have an outlet to impact the world," says Georgia Tech professor Ayanna Howard. The current Access4Kids prototype includes three force-sensitive resistors that measure pressure and convert it into a signal that instructs the tablet. "The real goal is to make it safe and efficient so someone can make it into a commercial product," Howard says. A more advanced prototype will include wireless sensors that can be placed anywhere a child is capable of hitting them.
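The conversion from force-sensitive-resistor readings to tablet gestures can be sketched as simple edge detection on a pressure stream. The threshold value and event names here are illustrative, not Georgia Tech's actual parameters.

```python
TAP_THRESHOLD = 0.6  # normalized pressure; illustrative value

def classify(readings):
    # Turn a stream of force-sensitive-resistor samples into tap events:
    # a tap fires on the rising edge that crosses the threshold, and the
    # sensor must drop below the threshold before it can fire again.
    events = []
    pressed = False
    for sample in readings:
        if sample >= TAP_THRESHOLD and not pressed:
            events.append("tap")
            pressed = True
        elif sample < TAP_THRESHOLD:
            pressed = False
    return events

gestures = classify([0.1, 0.7, 0.8, 0.2, 0.9])  # two taps
```

The rising-edge check matters for motor-impaired users: sustained or tremulous pressure registers as a single intentional tap rather than a burst of spurious ones.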
Researchers Develop Featherweight Chips That Dissolve in Water
IDG News Service (12/10/12) James Niccolai
University of Illinois at Urbana-Champaign researchers have developed integrated circuits that can stick to the skin and in some cases dissolve in water when they are no longer needed. University of Illinois professor John Rogers says the "bio chips" can be worn comfortably on the body to help diagnose and treat illness. He notes the circuits are made from silicon sliced to nanometer thickness, so it becomes a "floppy" membrane that can bend and twist. The circuits can be applied like a child's temporary tattoo by laying them on the skin and washing off a thin, soluble backing. The researchers also say they are developing "transient" circuits that are 35 nanometers thick and will dissolve in about two weeks. Rogers says the bio chips contain less silicon, magnesium, and other minerals than are in a daily vitamin pill, so they are safe in the body. The researchers suggest that soluble electronics could be used to help prevent infections from forming at surgical sites, or for non-medical purposes, such as environmental monitors at a chemical spill.
Storytelling Software Learns How to Tell a Good Tale
New Scientist (12/08/12) Hal Hodson
University of Central Florida professor Lotzi Boloni has developed storytelling software with an architecture that is based on a narrative structure. Boloni designed Xapagy to keep stories as a series of interconnected events, rather than use them to build rigid logic rules for future actions. He manually translates stories into a computer language called Xapi. The software looks for familiar connections in its database when it comes across words in new stories, and if it finds any, it uses them to predict what will happen next and then tells the story. Each word can have many different associations in Xapagy's database, depending on the stories it has read. When the software does not find a clear connection, it substitutes in its own word that makes grammatical sense, and continues the story in a way that makes narrative sense. Boloni says that once Xapagy has a large enough back catalog of stories the software should be able to think up new stories on its own. "If Boloni is successful, it would result in a much more flexible way of learning," says University of Portland artificial intelligence researcher Andrew Nuxoll.
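Xapagy's strategy of predicting the next event from familiar connections can be caricatured as a successor model over event sequences. This toy (with invented story events) only captures the "use past stories to predict what happens next" idea, not Xapagy's actual architecture or the Xapi language.

```python
from collections import defaultdict, Counter

class StoryModel:
    """Toy analogue of Xapagy's approach: remember which event tends to
    follow which, then continue a new story from the most familiar link."""

    def __init__(self):
        self.successors = defaultdict(Counter)

    def read(self, story):
        # Record every adjacent pair of events in the story
        for current, nxt in zip(story, story[1:]):
            self.successors[current][nxt] += 1

    def predict(self, event):
        # Continue with the most frequently seen successor, if any
        options = self.successors.get(event)
        return options.most_common(1)[0][0] if options else None

model = StoryModel()
model.read(["wolf appears", "girl runs", "hunter arrives"])
model.read(["bear appears", "girl runs", "hunter arrives"])
nxt = model.predict("girl runs")  # 'hunter arrives'
```

Where this toy returns None for an unseen event, Xapagy instead substitutes a word that makes grammatical sense and carries the narrative forward, which is the harder part of the problem.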
Buggy Software: Achilles Heel of Big-Data-Powered Science?
The Atlantic (12/07/12) Edward Tenner
Software defects are a growing concern in the scientific computing community. A recent workshop focusing on maintainable software practices discussed how software code errors caused retractions in major research papers. Kingston University professor Leslie Hatton addressed the issue in a research paper. "The defects themselves arise from many causes, including: a requirement might not be understood correctly; the physics could be wrong; there could be a simple typographical error in the code, such as a + instead of a - in a formula; the programmer may rely on a subtle feature of a programming language which is not defined properly, such as uninitialized variables; there may be numerical instabilities such as over-flow, under-flow or rounding errors; or basic logic errors in the code," the paper says. Although most of the defects are caused by human error, they are facilitated by the complexity of programming languages and algorithms, and the sheer size of the computations, Hatton adds. Columbia University professor Victoria Stodden recently launched RunMyCode, a Web site that helps scientists discover errors by sharing code and data, and accelerating the replication of experiments.
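One of Hatton's defect classes, rounding error, is easy to demonstrate in miniature: naive floating-point summation silently accumulates error, while a compensated summation does not.

```python
import math

# 0.1 has no exact binary representation, so each addition rounds
naive = sum([0.1] * 10)        # accumulates binary rounding error
exact = math.fsum([0.1] * 10)  # compensated (Shewchuk) summation

naive == 1.0   # False: the ten rounding errors add up
exact == 1.0   # True: fsum rounds only once, at the end
```

In a short script the discrepancy is harmless; across the billions of operations in a large scientific computation, exactly this kind of drift can change a published result.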
Abstract News © Copyright 2012 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: email@example.com
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.