Welcome to the March 6, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.
ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
EU Commission Launches 'Grand Coalition' to Tackle IT Shortage
BBC News (03/05/13)
A grand coalition launched by the European Commission (EC) will address the information technology (IT) skills shortage in the European Union. The region is not producing enough skilled IT graduates to keep pace with the approximately 100,000 IT jobs being added every year. The EC plans to spend 1 million euros on the coalition. "I want people to be open in their commitments, join forces where they see the chance, and recognize we need to do things differently," says digital agenda commissioner Neelie Kroes. "Quite simply, facing hundreds of thousands of unfilled vacancies, we cannot continue as we were, and we must all do our bit." The EC estimates there will be 900,000 openings for IT-related roles within two years, and it is considering simplifying the certification system and making it easier to determine the skills of a graduate, regardless of where they studied or worked in the region. Kroes also says companies need to focus more on what they can do to address the problem, and highlighted some promising initiatives.
Hybrid Technology Degrees Emerging
Network World (03/05/13) Carolyn Duffy Marsan
College students today have an opportunity to pursue new majors that are hybrids of computer science, information systems, and computer engineering. At Pennsylvania State University, for example, engineering and business courses have been combined in an interdisciplinary program offering a B.S. and B.A. in Information Sciences and Technology (IST) and a B.S. in Security and Risk Analysis (SRA). About 78 percent of last year's graduates were placed in May 2012, but the placement rate was 91.6 percent for those with the dual major of IST and SRA, notes Penn State's Mary Beth Rosson. Recruiting is strong for IST graduates this year, and about 800 undergraduates are majoring in IST and SRA. "If you want to work with computing technology in the context of people and problems, that's why you want to come here," Rosson says. "Our programs emphasize the human, social and cultural context of computing." She notes that students and recruiters have shown less interest in the B.A. program, and says students who prefer math and algorithms should focus on computer science, while those who enjoy working with people should consider an IST major.
Big Blue, Big Bang, Big Data: Telescope Funds Computing R&D
CNet (03/05/13) Stephen Shankland
IBM is attempting to advance supercomputing technology in processing, optical communications, and memory through an international initiative to study the Big Bang's radio remnants using the Square Kilometer Array radio telescope. Before construction of the telescope begins, IBM is working to devise the required computing technology through a five-year alliance with the Netherlands Institute for Radio Astronomy. The telescope will generate 14 exabytes of data daily, and this data must be reduced to about one-thousandth of its size for further processing. Processing functions will be handled by IBM microservers, packed together densely and enhanced by hot-water cooling. The microservers will communicate over a system data pathway that can accommodate 10-gigabit Ethernet links and support communications with disks, USB devices, and other system plug-ins. IBM intends to use optical interconnects instead of copper wiring to transmit data to the processors. The company also is exploring phase-change memory for the project's data storage, as it is faster and more durable than flash memory and retains data even when power is off. In addition, the project is probing the use of programmable accelerator chips specialized for extremely rapid performance on jobs such as pattern recognition, data filtering, and mathematical transformation.
Web-Connected Cars Bring Privacy Concerns
Washington Post (03/05/13) Craig Timberg
In the near future, automobiles are expected to be linked into wireless networks capable of tapping into the large volumes of data that vehicles will produce about themselves and their drivers. Privacy advocates fear that new connectivity in the form of calling systems, streaming video, cameras, and apps may give car companies, software developers, and even police officers access to such information. More than 60 percent of vehicles worldwide will be directly connected to the Internet by 2017, up from 11 percent last year, predicts ABI Research; in North America and Europe, that share could reach 80 percent. Proposed U.S. federal highway-safety rules would require all new cars to be equipped by 2014 with black boxes that record crash data, prompting privacy groups to urge that the data be declared the property of the motorist, with authorities having access to it only under certain conditions. "The cars produce literally hundreds of megabytes of data each second," says Ford technologist John Ellis. "The technology is advancing so much faster than legislation or business models."
Study Maps Human Metabolism in Health and Disease
University of Manchester (03/04/13) Aeron Haworth
An international team of researchers say they have produced the most complete model of the human metabolic network available. The researchers mapped 65 human cell types and half of the 2,600 enzymes that are known drug targets. They say the research marks a key step toward personalized medicine, or tailoring treatments to the genetic information of a patient. "By converting our biological knowledge into a mathematical model format, this work provides a freely accessible tool that will offer an in-depth understanding of human metabolism and its key role in many major human diseases," says the Biotechnology and Biological Sciences Research Council's Douglas Kell. The model contains more than 8,000 molecular species and 7,000 chemical reactions. “The results provide a framework that will lead to a better understanding of how an individual’s lifestyle, such as diet, or a particular drug they may require is likely to affect them according to their specific genetic characteristics," notes University of Manchester professor Pedro Mendes. The Babraham Institute's Nicolas Le Novere says "having large collaborations like these, using open standards and data-sharing resources, is crucial for systems biology."
Video Game Invades Classroom, Scores Education Points
USA Today (03/04/13) Greg Toppo
GlassLab is an effort by Electronic Arts to use video games to inspire students to embrace science, technology, engineering, and mathematics careers. GlassLab developers have created a free online community based on the newly redesigned world-building game SimCity, as well as free lesson plans and an online teacher's network. In the game, players act as the builder and mayor of a fictional town, building infrastructure, industry, and housing to attract residents. In the SimCityEDU lessons, users must solve specific problems that plague the fictional town. "Being able to touch something, being able to experiment with it, I think, really can make a difference in a kid's life," says Maxis Games' Lucy Bradshaw. SimCity's popularity has encouraged the Bill & Melinda Gates Foundation and the John D. and Catherine T. MacArthur Foundation to spend millions of dollars researching how video games help kids learn. GlassLab plans to work with other game developers and educational-testing experts to create new games and analyze existing ones, to determine how students learn to play them and how they can learn from them. MacArthur Foundation educational director Connie Yowell, one of GlassLab's originators, notes that video games upend existing learning models.
Demand for Cybersecurity Jobs Is Soaring
CIO Journal (03/04/13) Steve Rosenbush
Demand for cybersecurity specialists is expanding at 3.5 times the pace of the overall information technology (IT) job market, and grew 73 percent from 2007 to 2012, according to a Burning Glass International study. Burning Glass CEO Matthew Sigelman sees a close correlation between this trend and the growth in demand for big data skills. "As companies are focusing more and more on big data and the value that's accrued within their customer databases...they have also come to focus more attention on managing the risks and the vulnerabilities," he notes. Cybersecurity experts also earn considerably more than most IT workers, averaging $101,000 in salary compared with approximately $89,000. Engineers accounted for the largest share of cybersecurity jobs at 32 percent, followed by analysts at 24 percent. Sigelman observes that although large IT firms and defense contractors still account for a large share of demand for cybersecurity expertise, "much of the growth in demand...is driven by a more diversified range of businesses," including healthcare, education, public administration, and retail. The greatest growth in cybersecurity jobs is concentrated in the Washington, D.C., region, with growth of nearly 50 percent in Washington itself.
Pixels Guide the Way for the Visually Impaired
Institute of Physics (03/01/13)
Using a mathematical algorithm and a video camera, University of Southern California (USC) researchers created pixels that are projected onto a headset to improve vision for people with retinal implants. The headset can assist with tasks such as navigation, route planning, and object finding. Retinal implants are currently of low resolution, but they allow blind people to sense motion and large objects, thereby improving orientation for walking and enabling most users to read large letters. The algorithm will supply retinal implant users with more information when they are looking for a specific item, says USC's James Weiland. The researchers mounted a video camera on a headset to collect real-world information in the view of the subject, then transformed the images into pixels that were displayed on a screen in front of the subject. The algorithms use intensity, saturation, and edge-information from the camera’s images to pick out the five most salient locations in the image. Additional directional cues are supplied via blinking dots at the side of the display. To test the headset, researchers asked subjects to walk an obstacle course, find objects on a table, and look for specific items in a cluttered environment. The directional cues drastically lowered errors, head movements, and time needed to complete tasks.
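The article describes the saliency step only in outline. As a rough, hypothetical sketch (not USC's actual implementation), the pure-Python function below scores each pixel of a tiny image by combining saturation, intensity, and local edge strength, then greedily selects the five highest-scoring locations while suppressing picks that fall too close together:

```python
def top_salient_points(img, k=5, min_dist=2):
    """Toy saliency ranking: score each pixel by saturation, intensity,
    and local edge strength, then greedily pick the k highest-scoring
    locations, skipping any that fall too close to an earlier pick.
    img is a 2-D grid of (saturation, intensity) pairs in [0, 1]."""
    h, w = len(img), len(img[0])
    scores = {}
    for y in range(h):
        for x in range(w):
            s, v = img[y][x]
            # Edge strength: intensity difference with right/down neighbors.
            edge = 0.0
            if x + 1 < w:
                edge += abs(v - img[y][x + 1][1])
            if y + 1 < h:
                edge += abs(v - img[y + 1][x][1])
            scores[(y, x)] = s + v + edge  # equal weights, an arbitrary choice
    picks = []
    for pt in sorted(scores, key=scores.get, reverse=True):
        # Manhattan-distance suppression keeps picks spread apart.
        if all(abs(pt[0] - q[0]) + abs(pt[1] - q[1]) >= min_dist for q in picks):
            picks.append(pt)
        if len(picks) == k:
            break
    return picks
```

In the real system the five locations would then drive the blinking directional cues at the edge of the display; the weights and the neighborhood used here are illustrative assumptions only.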
Short Algorithm, Long-Range Consequences
MIT News (03/01/13) Larry Hardesty
Massachusetts Institute of Technology (MIT) researchers have developed an algorithm for solving graph Laplacians that is faster and simpler than its predecessors. A graph Laplacian is a matrix that describes a graph, and the researchers say their algorithm's simplicity should make it both faster and easier to implement in software than previous solutions. The algorithm begins by finding a spanning tree of the graph, a subgraph that connects all of the nodes but contains no closed loops. It then adds one of the graph's remaining edges back, creating a loop in which two nodes are connected by two different paths. The researchers showed that this process of adding edges and rebalancing converges on the solution of the graph Laplacian. The new algorithm is "substituting one whole set of ideas for another set of ideas, and I think that's going to be a bit of a game-changer for the field," says Yale University professor Daniel Spielman.
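The core idea the article describes can be sketched in a few dozen lines. The toy solver below (a simplification, not the MIT code) routes the demand vector on a spanning tree and then repeatedly cancels the imbalance around the cycle closed by an off-tree edge; it assumes unit edge weights and uses uniform random cycle selection rather than the researchers' carefully weighted sampling, which is what yields their speed guarantees:

```python
import random

def solve_laplacian(n, edges, b, iters=5000, seed=1):
    """Approximately solve L x = b for a connected unit-weight graph via
    cycle updates: route the demands b on a spanning tree, then repeatedly
    pick an off-tree edge and cancel the net 'voltage drop' around the
    unique cycle it closes with the tree. Returns potentials with x[0] = 0."""
    adj = {v: [] for v in range(n)}
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    # BFS spanning tree rooted at node 0.
    parent, parent_edge, order = {0: None}, {}, [0]
    for u in order:
        for v, i in adj[u]:
            if v not in parent:
                parent[v], parent_edge[v] = u, i
                order.append(v)

    def path_to_root(v):
        # (edge index, +/-1) pairs; the sign is the traversal direction
        # relative to the edge's stored orientation.
        out = []
        while parent[v] is not None:
            i = parent_edge[v]
            out.append((i, 1.0 if edges[i][0] == v else -1.0))
            v = parent[v]
        return out

    flow = [0.0] * len(edges)
    for v in range(n):  # initial flow: push each demand to the root
        for i, s in path_to_root(v):
            flow[i] += s * b[v]
    off_tree = [i for i in range(len(edges))
                if i not in set(parent_edge.values())]
    rng = random.Random(seed)
    for _ in range(iters):
        i = rng.choice(off_tree)
        u, v = edges[i]
        signs = {i: 1.0}  # cycle: u -> v, then v -> root -> u in the tree
        for j, s in path_to_root(v):
            signs[j] = signs.get(j, 0.0) + s
        for j, s in path_to_root(u):
            signs[j] = signs.get(j, 0.0) - s
        cycle = [(j, s) for j, s in signs.items() if s != 0.0]
        gap = sum(s * flow[j] for j, s in cycle)  # net drop around the cycle
        for j, s in cycle:                        # push flow to zero the gap
            flow[j] -= s * gap / len(cycle)
    x = [0.0] * n  # read potentials off the tree (unit resistances)
    for v in order[1:]:
        i = parent_edge[v]
        x[v] = x[parent[v]] - flow[i] if edges[i][1] == v else x[parent[v]] + flow[i]
    return x
```

Each cycle update is cheap and local, which is the appeal of the approach: the heavy linear algebra of earlier solvers is replaced by bookkeeping on tree paths.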
Challenges Remain as Internet of Things Becomes a Reality
An IEEE survey of more than 1,200 Facebook users on the emerging Internet of Things found that a majority of respondents are most interested in using connected devices to improve their work productivity. Given its swift development, the Internet of Things has yet to be given a clear industry definition. Still, the survey hints that consensus may be closer than currently thought: close to 70 percent of respondents believe connected devices can be defined as those that are either directly linked to the Internet, or indirectly linked through another component within the ecosystem, while only 30 percent think a device must be directly tied to the Internet, such as a smartphone or laptop, to be deemed connected. Nevertheless, several pressing challenges must be tackled for the Internet of Things to continue its present growth trajectory; issues cited by survey respondents include concerns about privacy and data security. "However, worthwhile innovation does not come without obstacles," says IEEE member Roberto Saracco. "I believe that the great minds of technology and engineering can collaborate to find a solution that will alleviate some of these development concerns."
MIT Panel Warns of Challenges of Hyper-Networked World
Network World (03/01/13) Jon Gold
The Internet is facing challenges due to the proliferation of devices and data, as well as a lack of security, according to experts at a Massachusetts Institute of Technology forum. As the concept of an Internet of Things becomes a reality, the number of network devices installed globally could reach 30 billion by 2020, with connected devices continuing to outpace Internet-connected people, says IDC analyst Rohit Mehra. From 2010 to 2015, the number of Internet users will increase from 1.8 billion to 2.9 billion, but over the same period the number of connected devices will mushroom from 5 billion to 15 billion, Mehra predicts. Challenges arising from hyper-connectivity include a shortage of Internet Protocol addresses, and the fact that the Transmission Control Protocol (TCP) copes poorly with the packet loss common on wireless networks, says Boston University professor Mark Crovella. In addition, he says there is a lack of strong security for the Border Gateway Protocol (BGP), which controls routing among large ISPs. Crovella also notes that the wireless spectrum available for large-scale network projects is insufficient, so frequencies may have to be repurposed and new auctions held.
Brown Unveils Novel Wireless Brain Sensor
Brown University (02/28/13) David Orenstein
A team of neuroengineering researchers based at Brown University has developed a wireless, broadband, rechargeable, fully implantable brain sensor that has performed well in animal models for more than a year. "This has features that are somewhat akin to a cell phone, except the conversation that is being sent out is the brain talking wirelessly," says Brown professor Arto Nurmikko. Researchers can use the device to observe, record, and analyze the signals emitted by the neurons in specific parts of an animal model's brain. The device is a pill-sized chip of electrodes implanted on the cortex that sends signals through electrical connections. "Most importantly, we show the first fully implanted neural interface microsystem operated wirelessly for more than 12 months in large animal models--a milestone for potential [human] clinical translation," says Ecole Polytechnique Federale de Lausanne's David Borton. The device transmits data at 24 Mbps via microwave frequencies to an external receiver. One of the major challenges the researchers faced was optimizing the device's performance given the requirements that the implant be small, low power, and leak-proof. Although the device has not been approved for use in humans, it "was conceived very much in concert with the larger BrainGate team," Nurmikko says.
Virtual Body Double Gets Ill So You Don't Have To
New Scientist (02/28/13) Douglas Heaven
University College London (UCL) researchers are using HECToR, Britain's fastest supercomputer, to create simulations of the human body as part of the Virtual Physiological Human project. As the human models become more detailed, they can be combined to mimic larger biological systems. "At some point, we will have a virtual human," says UCL's Peter Coveney. Individual models are already useful; the blood-flow model, for example, builds a three-dimensional model of a patient's blood vessels from a static image and then simulates blood flow through them to see where the forces are strongest. Researchers already have developed models of many of the major human organs. "The virtual human is the ultimate goal of computational biology," says Bayer Technology Services researcher Lars Kupfer. "Current models, including the ones we are using, are important milestones along the way." The Virtual Physiological Human project aims to let patients participate in their own medical care by simulating the outcomes of treatment choices and even diagnosing themselves.
Abstract News © Copyright 2013 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: email@example.com
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.