Welcome to the March 21, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.
HEADLINES AT A GLANCE
Alan Turing Institute to Be Set Up to Research Big Data
BBC News (03/19/14)
British Chancellor George Osborne recently announced that the government will provide 42 million UK pounds (nearly $70 million US) to fund a research center that will carry the name of computer pioneer Alan Turing. The Alan Turing Institute will focus on new ways of collecting, organizing, and analyzing big data. Britain's government says big data "can allow businesses to enhance their manufacturing processes, target their marketing better, and provide more efficient services." Turing's work at Bletchley Park during World War II helped accelerate Allied efforts to read German naval messages enciphered with the Enigma machine. However, in 1952 Turing was convicted of gross indecency in connection to homosexual activity, lost his security clearance, and had to stop the code-cracking work. He received a posthumous royal pardon in 2013. "Now, in his honor, we will found the Alan Turing Institute to ensure Britain leads the way again in the use of big data and algorithm research," Osborne said. "I am determined that our country is going to out-compete, out-smart, and out-do the rest of the world."
Facebook's Hack, a Language of Tech Competition
The New York Times (03/20/14) Quentin Hardy
Facebook on Thursday unveiled Hack, an open source programming language designed to let developers program interactive Web pages faster and more effectively while avoiding errors that cause system crashes. Facebook says Hack is open source for several reasons, including to encourage broad use, to quickly detect errors in the language itself, and to let others build on it to strengthen it and diversify its applications. Facebook has a corporate stake in releasing Hack, as developers working in one firm's framework have a tendency to adhere to the firm's objectives, even if that framework is open source. Numerous interactive websites already link to Facebook for customer sign-ups and similar services, and it appears likely that even more will do so with the launch of Hack. Facebook is increasingly using an open source strategy in approaching developers, initially for things such as games tied to the social network; examples include Hack and the Open Compute Project, a collaborative effort to design data center equipment. Facebook CEO Mark Zuckerberg says the goal of opening projects to developers is to enable his company to catch up to the competition.
Private Colleges Produce Prepared STEM Graduates
U.S. News & World Report (03/18/14) Vanessa Denice
A recent report by the Council of Independent Colleges says small and midsized colleges are just as capable of educating and supporting science, technology, engineering, and mathematics (STEM) students as larger public research institutions. The report says a significant percentage of STEM graduates from smaller, private colleges elect to continue their education and earn a master's or doctoral degree, and students at such institutions are more likely to complete their degrees in a timely fashion. About 300,000 students graduate from U.S. colleges with bachelor's or associate's degrees in STEM fields every year, according to a 2012 report from the President's Council of Advisors on Science and Technology. However, only 40 percent of students who intend to major in a STEM field currently complete the degree program. The report found that if the nation is to sustain its science and technology leadership, the U.S. will need about 1 million more STEM professionals over the next 10 years than it will produce at the current rate. The Obama administration is striving to make STEM degrees more attainable and appealing to students, partly by establishing different paths to a degree, such as ensuring that associate's degree programs sufficiently prepare students for jobs and diversifying courses to draw students to the subjects early on.
Data Mining Reveals How Conspiracy Theories Emerge on Facebook
Technology Review (03/18/14)
Computational social scientists at Northeastern University studied the Facebook interactions of more than 1 million people to examine the spread of misinformation on the Internet. The researchers specifically examined Facebook posts of political information from mainstream and alternative news organizations, as well as pages devoted to political commentary during the 2013 elections in Italy. The team also studied how the same people responded to false news that "trolls" released into common circulation. By measuring the time between the first and last comments about a post, the group assessed the duration of debate and found it to be the same regardless of source. In addition, the researchers studied participation in debates on posts that are known to be untrue, and found that some people are more likely to engage with false content than others. People who participated in debates on alternative news posts are significantly more likely to engage in debate over false news posted by trolls. Although people often turn to alternative news media because they are wary of conventional news sources, alternative news consumers "are the most responsive to the injection of false claims," the researchers say. The research suggests conspiracy theories emerge when satirical commentary or false content becomes credible, as groups of people seek out alternative news sources.
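The persistence measure described above — a debate's duration as the gap between the first and last comment on a post — can be sketched as follows. This is a minimal illustration, not the researchers' actual pipeline, and the field names are invented:

```python
from datetime import datetime

def debate_lifetime(comments):
    """Return a post's debate duration: the time between its
    earliest and latest comment timestamps."""
    times = sorted(datetime.fromisoformat(c["timestamp"]) for c in comments)
    return times[-1] - times[0]

post = [
    {"user": "a", "timestamp": "2013-02-01T09:00:00"},
    {"user": "b", "timestamp": "2013-02-03T18:30:00"},
    {"user": "c", "timestamp": "2013-02-02T12:00:00"},
]
print(debate_lifetime(post))  # spans first to last comment, regardless of order
```

Comparing this lifetime across mainstream, alternative, and troll posts is what let the researchers conclude the duration of debate was independent of source.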
Student Wins 2014 Cyber Security Challenge as U.K. Seeks Top IT Talent
V3.co.uk (03/18/14) Alastair Stevenson
More than 40 competitors participated in the Masterclass Final of Britain's Cyber Security Challenge, and 19-year-old student William Shackleton emerged as the winner. The finalists competed to defend London from a simulated cyberattack, in a challenge developed by cybersecurity experts from BT, the Government Communications Headquarters, the National Crime Agency (NCA), Juniper Networks, and Lockheed Martin. Shackleton will have the opportunity to take advantage of prizes worth 100,000 UK pounds, including training courses, industry events, paid internships, and scholarship money. The government-sponsored challenge is designed to help attract more people to the information security industry, as the United Kingdom currently faces a shortage of cybersecurity skills. "Events such as the Cyber Security Challenge provide a fantastic opportunity for us to not only test the skills of those taking part but also provide them with pathways which allow them to exploit their sought-after cyber skills," says NCA's Kevin Williams. "As we modernize our workforce by welcoming new people and new ideas into the NCA, we want roles at the agency to be the career of choice for people wanting a future in tackling cybercrime and, more broadly, in law enforcement."
Smartphone to Become Smarter With 'Deep Learning' Innovation
Purdue University News (03/18/14) Emil Venere
Purdue University researchers are developing a deep-learning method to enable smartphones and other mobile devices to understand and immediately identify objects in a camera's field of view, overlaying lines of text that describe items in the environment. The researchers' method requires layers of neural networks that mimic how the human brain processes information. "The deep-learning algorithms that can tag video and images require a lot of computation, so it hasn't been possible to do this in mobile devices," says Purdue University professor Eugenio Culurciello. The Purdue researchers have developed software and hardware that could be used to enable a conventional smartphone processor to run deep-learning software. "Now we have an approach for potentially embedding this capability onto mobile devices, which could enable these devices to analyze videos or pictures the way you do now over the Internet," Culurciello says. The deep-learning software works by performing processing in layers. "For facial recognition, one layer might recognize the eyes, another layer the nose, and so on until a person's face is recognized," Culurciello says.
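The layer-by-layer processing Culurciello describes can be sketched as a minimal feed-forward network in which each layer transforms the output of the one before it. This is an illustrative toy with random weights, not Purdue's software:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Simple nonlinearity applied between layers."""
    return np.maximum(x, 0)

# Three layers detect progressively higher-level features,
# analogous to "eyes, then nose, then whole face" in the article.
layers = [rng.standard_normal((64, 32)),   # raw input -> low-level features
          rng.standard_normal((32, 16)),   # low-level -> parts
          rng.standard_normal((16, 4))]    # parts -> object scores

def forward(x, layers):
    for w in layers:
        x = relu(x @ w)   # each layer feeds the next
    return x

scores = forward(rng.standard_normal(64), layers)
print(scores.shape)  # one score per candidate object label
```

The computational cost Culurciello mentions comes from these repeated matrix multiplications, which is why the work pairs the software with hardware suited to running them on a phone.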
Hold that RT: Much Misinformation Tweeted After 2013 Boston Marathon Bombing
UW News (WA) (03/17/14) Michelle Ma
University of Washington (UW) researchers have found that false information had a far reach on Twitter following the 2013 Boston Marathon bombing, even though users tried to dispel inaccuracies. Corrective tweets were outmatched by the volume of tweets spreading misinformation, driven largely by retweets. "We could see very clearly the negative impacts of misinformation in this event," says UW professor Kate Starbird. "Every crisis event is very different in so many ways, but I imagine some of the dynamics we're seeing around misinformation and organization of information apply to many different contexts." Researchers from UW and Northwestern University studied the text, timestamps, hashtags, and metadata in 10.6 million tweets to pinpoint rumors. They then categorized tweets related to the rumors as misinformation, correction, or other. The researchers plan to create a real-time tool that alerts users when a tweet's credibility is questioned by another tweet. "It wouldn't necessarily affect that initial spike of misinformation, but ideally it would get rid of the persisting quality that misinformation seems to have where it keeps going after people try to correct it," says UW undergraduate student Jim Maddock.
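The categorization step — labeling rumor-related tweets as misinformation, correction, or other — can be sketched with a simple keyword rule. This is a toy: the study relied on careful human coding, and both the marker words and the sample tweets below are invented:

```python
# Phrases that typically signal a tweet is debunking rather than spreading.
CORRECTION_MARKERS = ("false", "debunked", "not true", "rumor")

def categorize(tweet_text, rumor_keywords):
    """Label a tweet relative to one rumor: misinformation, correction, or other."""
    text = tweet_text.lower()
    if not any(k in text for k in rumor_keywords):
        return "other"            # unrelated to this rumor
    if any(m in text for m in CORRECTION_MARKERS):
        return "correction"       # mentions the rumor to dispute it
    return "misinformation"       # repeats the rumor

rumor = ("girl running",)
tweets = [
    "RT: a girl running for her family was killed today",
    "That girl running story is FALSE - please stop spreading it",
    "Thoughts with everyone in Boston",
]
print([categorize(t, rumor) for t in tweets])
# ['misinformation', 'correction', 'other']
```

Counting the first two labels over time is what reveals the pattern in the study: the misinformation volume dwarfs the corrections and persists after them.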
How a Laser Beam Could Quadruple the Speed of the Internet
The Washington Post (03/17/14) Brian Fung
California Institute of Technology (Caltech) researchers say they have developed a new kind of laser that can quadruple the bandwidth on the fastest fiber-optic networks. "Our first-run lasers, fabricated at Caltech, are capable of a 4x increase in the number of bytes-per-second carried by each channel," says Caltech professor Amnon Yariv. "This number will increase with our continuing work, but even at this level, the economic advantages are very big." He says the new laser is an improvement over conventional lasers because it operates closer to a single frequency than any other yet created, and the purity of the beam allows it to carry more data. However, the breakthrough is not likely to benefit individual Internet users directly, because they are limited by the plans they have purchased from their Internet service providers (ISPs). Nevertheless, dramatically expanding the rate at which data can be routed through the Internet to the ISPs could have implications for companies that stream a lot of data, and it could help pave the way to smarter homes.
Lawmakers Call for More Computer Science in California Schools
EdSource (03/18/14) Lillian Mongeau
The California state Legislature is considering six bills that address the growing concern that California students do not have the computer science skills necessary to succeed in the modern workforce. If all six bills become law, the California State Board of Education would have to develop computer science standards for grades 1 through 12, and the state higher education systems would be asked to create guidelines for courses they would be willing to accept for admission credit. One of the bills would allow school districts to offer students a third year of math credit for a computer science course, which is currently considered an elective. "Right now there is a disincentive for schools to offer computer science [courses] and a disincentive for students to take them," says Assemblywoman Kristin Olsen (R-Modesto). Most California high schools currently do not offer high-level computer science courses, and just 13 percent of the state's high school seniors took the Advanced Placement computer science exam last year. For students entering college with a solid foundation in programming, the job opportunities upon graduation should be plentiful, says Computer Science Teachers Association executive director Chris Stephenson. Computer science industry leaders expect to add 1.4 million new jobs by 2020, according to Code.org.
Digital Warfare: TAU on the Frontline
Tel Aviv University (03/17/14)
Tel Aviv University (TAU) computer science, engineering, and national security faculty are working together to conduct research projects ranging from doomsday cryptography, secure cloud computation and more efficient verifiability, to data anomaly and malware detection, user-controllable privacy, and recognition technologies for video surveillance. The researchers also are establishing a national cyber center for coordinating interdisciplinary study programs, research, policy analysis, industry partnerships, and international collaborations in cybersecurity. For example, TAU professor Avishai Wool is developing a system that could help identify potentially malicious intrusions on the grid and prevent them. The system uses TAU's independent electricity grid to automatically evaluate communication patterns on the grid and distinguish normal incidents from potentially malicious ones. During testing, Wool says the system has achieved a much lower false alarm rate than conventional systems. Meanwhile, TAU researcher Shir Landau-Feibish has developed a tool for revealing the footprint of certain distributed denial-of-service attacks and preventing them from being repeated. Landau-Feibish led the development of an algorithm that can find the smallest set of signatures required to detect 99 percent of the attack messages. "The TAU algorithm is innovative in that it can be applied to extracting malicious signatures from textual data as well as numerical data," she says.
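Finding a smallest signature set that still detects a target fraction of attack messages is an instance of the set-cover problem. The standard greedy approximation below is a sketch under that interpretation, not the TAU algorithm itself, and the sample messages and signatures are invented:

```python
def greedy_signatures(messages, candidate_signatures, target=0.99):
    """Greedily pick signatures until `target` of the messages are matched."""
    uncovered = set(range(len(messages)))
    chosen = []
    allowed_residue = len(messages) * (1 - target)
    while len(uncovered) > allowed_residue:
        # Pick the signature matching the most still-uncovered messages.
        best = max(candidate_signatures,
                   key=lambda s: sum(1 for i in uncovered if s in messages[i]))
        hits = {i for i in uncovered if best in messages[i]}
        if not hits:
            break  # no remaining signature covers anything further
        chosen.append(best)
        uncovered -= hits
    return chosen

msgs = ["GET /a?cmd=evil", "GET /b?cmd=evil", "POST /c token=bad", "GET /d"]
print(greedy_signatures(msgs, ["cmd=evil", "token=bad", "GET"]))
# ['GET', 'token=bad']
```

Exact minimum set cover is NP-hard, which is why practical tools settle for small (not provably smallest) signature sets that hit a coverage threshold such as 99 percent.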
Agencies Experiment With Software-Defined Networks
Government Computer News (03/17/14) John Moore
Many government agencies and universities are studying software-defined networking (SDN), which could dramatically change the way governments deploy communications systems, according to computer science researchers. "SDN...provides an architectural path forward, cutting through the complexity in networks today and providing more programmability," says University of Illinois at Urbana-Champaign professor P. Brighten Godfrey. SDN enables software to take on the networking responsibilities normally embedded in hardware, reducing the time administrators need to perform network management tasks. However, SDN is far from mainstream, and SDN-capable networking equipment has only recently arrived. Although SDN has some security issues, users can boost SDN security by ensuring that traffic between the controller and the devices it manages takes place in a segment of the network not immediately accessible to end users, says Red Hat's Chris Wright. "We need to be really clear on what the security threats are to this new model and just engineer around those," Wright says. A test network could show how best to deploy SDN, which could depend on the type of organization planning to adopt the architecture.
NASA Designs a Robot For Mars
Product Design & Development (03/17/14) Melissa Fassbender
U.S. National Aeronautics and Space Administration (NASA) researchers have developed Valkyrie, a humanoid robot that will one day explore and study Mars. Valkyrie is more than six feet tall, weighs 286 pounds, and has an 80-inch wingspan. "It feels human-like, you can look her in the eyes," says NASA's Reg Berka. The researchers currently are preparing Valkyrie for the U.S. Defense Advanced Research Projects Agency Robotics Challenge Finals. "We knew it was going to take until the finals until this complex machine was going to be able to really perform," Berka says. The researchers developed Valkyrie's hardware and software separately, not phasing the software into the system until the very end of the development process. "In order for that to happen, it was a testament to the automation tools, both hardware and software," Berka says. The researchers control the robot using a series of video cameras, laser scanners, and ultrasonics that provide them with a detailed view of the surroundings. In addition, three computers inside Valkyrie's chest serve to determine what the robot should be doing, and then command the joints to carry out those movements. A person controls Valkyrie's computers behind the scenes.
Three Questions for Leslie Lamport, Winner of Computing's Top Prize
Technology Review (03/18/14) Tom Simonite
In an interview, 2013 ACM A.M. Turing Award winner Leslie Lamport discusses distributed computing, with particular emphasis on its importance and longevity. Lamport notes his Byzantine Generals work on imbuing software with fault tolerance stemmed from a contract with the U.S. National Aeronautics and Space Administration to build a reliable prototype flight-control computer for aircraft. The possibility of multiple systems failing made a distributed system necessary, and Lamport notes computers with multiple processors also constitute distributed systems. He attributes the long-term endurance of his distributed-computing algorithms to consistent basic ideas about synchronization. "Running multiple processes on a single computer is very different from a set of different computers talking over a relatively slow network, for example," Lamport says. "[But] when you're trying to reason mathematically about their correctness, there's no fundamental difference between the two systems." Lamport stresses that prior to coding, programmers should understand what they are doing and write it down. He says most code is written without a blueprint or specification, and he blames the software culture for this gap and suggests the use of mathematics to correct it.
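Lamport's advice — write down precisely what a program must do before writing the program — can be illustrated even without TLA+, the specification language he created, by stating a function's contract first and checking the implementation against it. A minimal Python sketch, not Lamport's own method:

```python
# Specification, written first: a sort must return a permutation of
# its input whose elements are in nondecreasing order.
def satisfies_sort_spec(xs, ys):
    return (sorted(xs) == sorted(ys)                 # permutation (built-in sort as oracle)
            and all(a <= b for a, b in zip(ys, ys[1:])))  # ordered

# Implementation, written second: insertion sort.
def insertion_sort(xs):
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:
            i -= 1
        out.insert(i, x)
    return out

data = [3, 1, 2, 1]
result = insertion_sort(data)
assert satisfies_sort_spec(data, result)
print(result)  # [1, 1, 2, 3]
```

The point is the ordering of the work: the contract exists before, and independently of, any particular implementation, so the implementation can be judged against it mathematically.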
Abstract News © Copyright 2014 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.