Association for Computing Machinery
Welcome to the August 28, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

HEADLINES AT A GLANCE


Searching Big Data Faster
MIT News (08/26/15) Larry Hardesty

A group of researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed an algorithm for determining whether a given data set possesses attributes that make it amenable to compression. By feeding the algorithm measurements of those properties, the researchers also can calculate the search-efficiency gains their compression methods will yield. As applied to biological data, the compression framework relies on redundancy in the genomes of organisms: of all the possible sequences of the four DNA letters, only a very small subset is embodied in the genomes of real organisms. Moreover, search efficiency is improved by clustering similar genomic sequences together and then choosing a representative sequence for each cluster, so that a query is compared against the representatives first and only the matching clusters are searched in full. Data sets that lend themselves to compression meet several criteria, including that the data occupies only a small portion of the larger space of possibilities, and that the density of the data points varies little as a searcher moves through the data. The researchers think the compression techniques may be applicable to a broad spectrum of non-biological data sets, such as those concerning the behavior of Web users.
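As a rough illustration, the cluster-and-representative idea fits in a short Python sketch. This toy (hypothetical code, not the CSAIL implementation; the similarity measure and threshold are assumptions) groups similar sequences, compares a query against each cluster's representative first, and searches only the matching clusters in full:

```python
# Toy compressive search: coarse pass over representatives, fine pass
# only inside clusters whose representative matches the query.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude stand-in for a real sequence-similarity measure."""
    return SequenceMatcher(None, a, b).ratio()

def build_clusters(sequences, threshold=0.8):
    """Greedy clustering: the first member of each cluster is its representative."""
    clusters = []  # list of (representative, members) pairs
    for seq in sequences:
        for rep, members in clusters:
            if similarity(seq, rep) >= threshold:
                members.append(seq)
                break
        else:
            clusters.append((seq, [seq]))
    return clusters

def compressive_search(query, clusters, threshold=0.8):
    hits = []
    for rep, members in clusters:
        if similarity(query, rep) >= threshold:  # coarse pass on representatives
            hits += [m for m in members if similarity(query, m) >= threshold]
    return hits

seqs = ["ACGTACGT", "ACGTACGA", "TTTTCCCC", "TTTTCCCG"]
print(compressive_search("ACGTACGT", build_clusters(seqs)))
```

Because most comparisons stop at the representatives, the fine-grained pass touches only the small fraction of the data that plausibly matches.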


How Creative Computers Will Dream Up Things We'd Never Imagine
New Scientist (08/26/15) Paul Marks

Computer-aided invention is being furthered by innovations such as genetic algorithms, which mimic evolution.  One example is the winner of this year's human competitiveness award at the Genetic and Evolutionary Computation Conference, which Zdenek Vasicek of the Brno University of Technology earned by evolving smart software routines that allow approximate computers to correct errors.  A key shortcoming of genetic algorithms is their tendency to optimize pre-existing inventions of little value, and Innovation Accelerator's Tony McCaffrey tries to solve this problem with software that helps inventors spot easily overlooked features of a problem that, if addressed, could lead to a novel invention.  The program first lets you describe a problem in plain language, then "explodes" it into a large number of related phrases and uses these to search the U.S. Patent and Trademark Office database for inventions that solve similar problems; in effect, the system looks for analogs to the problem in other domains.  Meanwhile, Iprova employs a method of interrogating all kinds of sources--including blogs and social networks--to make inventive suggestions, which it modifies as online technology trends shift.  "When computers are used to automate the process of inventing, they aren't blinded by the preconceived notions of human inventors," notes patent lawyer Robert Plotkin.  "So they can produce designs that a human would never dream of."
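For readers unfamiliar with the technique, the mutate-and-select loop at the heart of a genetic algorithm fits in a few lines of Python. This toy example (illustrative only, unrelated to Vasicek's system) evolves a bit string toward a fixed target:

```python
# Toy genetic algorithm: keep the fittest genomes, mutate them, repeat.
import random

TARGET = [1] * 20  # the "invention" we want evolution to reach

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[:10]                       # selection
    population = [mutate(random.choice(parents)) for _ in range(30)]
print("solved in generation", generation)
```

Real systems evolve much richer structures (circuits, program trees) and add crossover, but the select-vary-repeat cycle is the same.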


STEM Fields Dominate Ranking of College Majors
Network World (08/26/15) Ann Bednarz

Science, technology, engineering, and mathematics (STEM) degree holders were among the top earners in a new ranking of bachelor-level majors by PayScale. Computer science degrees performed especially well in the ranking of 319 bachelor-level majors, which is based on how much money graduates make in their early and mid-careers: several computer science degrees landed in the top 25, all of them with mid-career median wages of more than $100,000. Petroleum and nuclear engineering were the top-earning majors, with mid-career median wages of $168,000 and $121,000, respectively. Computer science & engineering took the number six spot, with mid-career earnings of $115,000. It was followed by systems engineering and electrical & computer engineering, which tied for seventh place with mid-career earnings of $114,000. Computer engineering tied with mining engineering at number 10 with mid-career earnings of $109,000, while computer science & mathematics tied at number 14 with aerospace engineering, both offering mid-career earnings of $107,000. Basic computer science degrees ranked at number 18, offering early-career median earnings of $63,100 and mid-career median pay of $105,000.


The Growing Need for More Women Cybersleuths
CNBC News (08/26/15) Jennifer Schlesinger

The recent rash of online hacks highlights a mounting need for cybersecurity professionals, including women. "Fifty percent or more of those graduating from college are women, and 11 percent only are in the cybersecurity field," notes IBM Security vice president Shelley Westman. "So what we see is as an industry, we're leaving a lot of talent on the table." A July conference in New York City sought to recruit more women to cybersecurity, mainly by appealing to high school and college students as well as those looking for a career change. Westman reasons women can not only fill several hundred thousand open cybersecurity positions in the U.S., but they also can bring a fresh perspective. She says balanced teams of both genders "are able to look at things a little bit differently and make sure that you're really looking at all causes, all effects, and really get to the heart of the problem." Obstacles to bringing more women into the cybersecurity fold include a long-entrenched perception of women as less technically smart than men, and stereotypical views of hackers that can be off-putting to women. New York University professor Nasir Memon says dispelling such stereotypes of professionals and the cybersecurity work environment is essential.


Microsoft Says Programmable Chips Will Make AI Software Smarter
Technology Review (08/25/15) Tom Simonite

Microsoft researchers are seeking a practical way to power up deep-learning software, with the goal of significantly advancing the intelligence of machines. They are investigating field-programmable gate arrays (FPGAs) as a route to running deep learning at much greater scale, and have measured a nearly 10-fold increase in the performance of a neural network attempting to identify images, compared with conventional processors not equipped with graphics-processing units. "It could be a game-changer if we eventually manage to deploy FPGAs widely at scale, which will provide an aggregate capability that exceeds what's possible today," says Microsoft researcher Eric Chung. He predicts the new technique will enable the training of neural networks of never-before-seen size and quality, and he notes that type of advance could lead to improvements in software that can describe the content of images, understand language, and show a form of common sense.


Tech Nightmares That Keep Turing Award Winners Up at Night
IDG News Service (08/27/15) Katherine Noyes

Three A.M. Turing Award-winning scientists discuss technology trends they find deeply troubling, with Google chief Internet evangelist Vint Cerf particularly worried about "the potential loss of openness and freedom on the Internet." He says such freedom has enabled unprecedented levels of information sharing, as well as new business models, and he considers the debate over the "right to be forgotten" especially complex. "The other side of that coin is the freedom of people to know things they should know, and I don't think that side is getting as much visibility as it should," Cerf maintains. He also is worried about a looming "digital dark age" stemming from the lack of a regime for long-term preservation of digital content and the software needed to render it. Meanwhile, cryptography pioneer Manuel Blum sees the slow pace of computerizing transportation as a concern, because he believes such automation will make people safer. Taking the opposite view is RSA encryption algorithm co-creator Leonard Adleman, who worries about computers evolving so quickly they overtake humans. He projects computers will "probably have their own destinies and find their own ways to evolve. They may not need artificial intelligence to become independent."


Super-Intelligent Machines Spawned by AI? Execs Aren't Worried
IDG News Service (08/27/15) Zach Miners

Technology experts and executives at an artificial intelligence (AI)-themed event this week in Silicon Valley were generally dismissive of scenarios in which super-intelligent machines eventually surpass human intelligence. Participants in one panel discussion felt overall that although AI might one day spawn a super-intelligent computer or machines, that day is so far off that preparing for it now is not a pressing priority. Sentient Technologies' Nigel Duffy noted that how online algorithms determine whether people qualify for a mortgage or a line of credit is a more pressing issue. Meanwhile, Viv co-founder Adam Cheyer, whose company is building an AI platform that other developers can use for their own software, said the odds of an asteroid colliding with Earth are greater than those of AI subverting mankind. The panelists' comments stand in contrast to AI and robotics researchers' recent warning that technological advances threaten to make the weaponization of systems that can select and destroy targets without human intervention a reality in only a few years. "The endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," they contended.


Eureka Prize Honors Quantum Computing Pioneers in Their Quest for 'Super Computers'
Australian Broadcasting Corporation (08/27/15)

University of New South Wales (UNSW) professor Michelle Simmons recently was honored with the Commonwealth Scientific and Industrial Research Organization (CSIRO) Eureka Prize for Leadership in Science for her pioneering work in quantum computing.  Simmons led the development of the world's smallest transistor, made from a single atom, and the world's smallest silicon wires, which are 1,000 times narrower than a human hair.  "It will allow us to do calculations we simply can't do with our classic computer," Simmons says. "It's a massive shift, something that people are predicting is going to completely change computing."  Meanwhile, University of Sydney professor Michael Biercuk received the Macquarie University Eureka Prize for Outstanding Early Career Research for his contributions at the leading edge of quantum science research.  Biercuk and his colleagues have developed a method of error suppression described as quantum computing's "Rosetta Stone" for the transformational effect it will have on the field. Biercuk used quantum effects on trapped ions to set the record for the smallest force ever measured, at the level of yoctonewtons, or 1 million-million-billion times smaller than the force of a feather pressing down on a table.  Biercuk also is using quantum simulation to look for the key to room-temperature superconductivity.


Tool Makes It Easier to Evade Online Censors
Technology Review (08/25/15) Tom Simonite

Improved censorship technologies have enabled repressive governments to greatly restrict the Internet content their citizens can see. For example, regimes in China, Iran, and elsewhere can now block anti-censorship tools such as Tor and encrypted virtual private network (VPN) connections. However, Marionette, a new tool developed by researchers at security firm RedJack and Portland State University (PSU), could offer a way around censorship technologies. Marionette disguises traffic that would normally be blocked as ordinary traffic, such as that from online games or Skype. It also can be programmed with the right responses to more active probes of Internet traffic, a step sometimes taken by Chinese censors to investigate suspicious connections. RedJack researcher Scott Coull, who developed Marionette with PSU's Kevin Dyer and Thomas Shrimpton, hopes the new tool eventually can be integrated into the Tor and Lantern anonymous browsing networks; he already has discussed Marionette's open source code with Tor's developers. Although Tor currently supports format-transforming encryption, a censorship-evasion method that attempts to make its data look like legitimate traffic, Coull says Marionette's capabilities are much deeper and more robust. However, Phillipa Gill, who is developing a censorship-evading system at Stony Brook University, cautions it takes time for new tools such as Marionette to be fully validated.
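The core trick, making blocked traffic look like innocuous traffic, can be sketched compactly. This Python toy illustrates the general mimicry idea only (it is not Marionette's engine or its programmable format language): an encrypted payload is wrapped so that, on the wire, it superficially resembles an ordinary HTTP request:

```python
# Toy traffic mimicry: hide ciphertext inside an HTTP-shaped cover message.
import base64, os

def wrap_as_http(ciphertext: bytes, host: bytes = b"example.com") -> bytes:
    body = base64.b64encode(ciphertext)
    return (b"POST /upload HTTP/1.1\r\n"
            b"Host: " + host + b"\r\n"
            b"Content-Type: application/octet-stream\r\n"
            b"Content-Length: " + str(len(body)).encode() + b"\r\n\r\n" + body)

def unwrap(message: bytes) -> bytes:
    _, _, body = message.partition(b"\r\n\r\n")  # drop the cover headers
    return base64.b64decode(body)

payload = os.urandom(32)  # stand-in for real encrypted application traffic
assert unwrap(wrap_as_http(payload)) == payload
```

A censor inspecting packets sees what looks like a routine upload; tools in Marionette's class go much further, modeling full protocol state machines so they can also answer active probes convincingly.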


Uncovering Forged Artwork With Neural Networks
IEEE-The Institute (08/26/15) Kathy Pretz

European researchers have developed PigeoNET, a convolutional neural network that can attribute artworks to specific artists with an accuracy rate of more than 70 percent.  Convolutional neural networks have been shown to outperform all existing learning algorithms on a variety of challenging image-classification tasks.  Working from high-resolution digital reproductions of artworks, PigeoNET automates the attribution process: it adapts its filters to respond to the presence of features in an image, such as painting techniques or the materials used, by adjusting its parameters until a suitable configuration is found.  A learning algorithm called back-propagation tunes the weights, requiring no previous knowledge other than the images and the names of the artists.  The researchers trained the system on more than 112,000 digital photographic reproductions of artworks by 6,600 artists; within this dataset there were more than 1,800 different types of artwork and 406 annotated materials. The researchers say PigeoNET can be further improved by expanding the data sets, and they call the system a "fruitful approach for future computer-supported examination of artworks."
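As a concrete sketch of this kind of architecture, the following PyTorch snippet (a hypothetical miniature, not the researchers' code; the layer sizes are assumptions) defines a small convolutional classifier over artist labels and runs one back-propagation step that adjusts its filter weights:

```python
# Miniature convolutional classifier for artist attribution (illustrative).
import torch
import torch.nn as nn

class ArtistNet(nn.Module):
    def __init__(self, num_artists):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),  # learned filters
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_artists)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = ArtistNet(num_artists=6600)      # 6,600 artists, as in the dataset
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 224, 224)     # stand-in for artwork reproductions
labels = torch.randint(0, 6600, (8,))    # stand-in artist labels
loss = loss_fn(model(images), labels)
opt.zero_grad()
loss.backward()                          # back-propagation computes gradients...
opt.step()                               # ...which adjust every filter weight
```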


Electrical Circuit Made of Gel Can Repair Itself
Phys.org (08/25/15) Lisa Zyga

University of Texas at Austin professor Guihua Yu and colleagues have used a new gel with a unique combination of properties to fabricate a self-healing electrical circuit.  When cut into two pieces, the flexible circuit can repair itself and fully restore its original conductivity.  The researchers note the gel offers high conductivity, flexibility, and room-temperature self-healing, properties that are not typically seen together.  The gel has a hybrid composition of two gels: a supramolecular gel, or "supergel," is injected into a conductive polymer hydrogel matrix.  This "guest-to-host" strategy enables the chemical and physical features of each component to be combined.  The supergel's supramolecular chemistry consists of large molecular subunits held together by weak, reversible interactions, which enables it to act like a "dynamic glue" and reassemble itself.  The researchers say the gel could be used for self-healing in flexible electronics, soft robotics, artificial skins, biomimetic prostheses, and energy-storage devices. To test the gel's electrical properties, they manufactured thin films of it on flexible plastic substrates; the tests showed the conductivity is among the highest reported for conductive hybrid gels, and that the self-healing property maintains it even after repeated bending and stretching.


Imaging Software Could Speed Breast Cancer Diagnosis
Rice University (08/21/15) Jade Boyd

Rice University researchers have developed software that could accelerate the diagnosis of breast cancer with 90-percent accuracy and without the need for a specialist. In addition, the software could improve breast cancer management, especially in developing countries. "We have developed a faster means to classify benign and malignant human breast tissues using fresh samples and thereby removing the need for time-consuming tissue preparation," says Rice professor Rebecca Richards-Kortum. The software allows for an automated histological assessment of breast cancer from tissue samples without the need for complex tissue-sample preparation or assessment by a pathologist. The approach uses high-speed optical microscopy of intact breast tissue specimens, which permits rapid diagnosis and could reduce subjectivity in the evaluation of breast histology. The software uses images from a confocal fluorescence microscope to analyze freshly cut human breast tissue samples for certain histological parameters that are normally used in breast cancer diagnosis. The parameter data is used to classify the tissue in each image and determine whether the tissue is benign or malignant. "We are excited about the possibility of using these imaging techniques to improve access to histologic diagnosis in developing regions that lack the human resources and equipment necessary to perform standard histologic assessment," says Rice researcher Jessica Dobbs.
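The final classification step can be pictured with a schematic example. In this sketch the data is synthetic and the feature names are invented for illustration; the Rice team's actual histological parameters and classifier are not specified in the summary:

```python
# Schematic tissue classification: extracted image features -> classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Each row: [nuclear_density, nuclear_area_variance, gland_fraction] (made up)
X = rng.random((200, 3))
# Synthetic benign(0)/malignant(1) labels tied loosely to the first two features
y = (X[:, 0] + 0.5 * X[:, 1] > 0.9).astype(int)

clf = LogisticRegression().fit(X[:150], y[:150])   # train on 150 samples
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```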


With Silicon Pushed to Its Limits, What Will Power the Next Electronics Revolution?
The Conversation (08/27/15) Mark Hopkinson

Silicon transistors have helped to make the modern world possible, but nearly 70 years after the first transistor was made in 1947, the technology is beginning to reach the limits of its usefulness, writes University of Sheffield professor Mark Hopkinson. The smallest current silicon circuits measure just 7nm wide, fast approaching the point at which the material's behavior becomes unstable. As silicon approaches this limit, its other faults are thrown into starker focus. Hopkinson says silicon's electron mobility, the property that governs how quickly electrons move through the material, has long been outclassed by that of other materials such as gallium arsenide, indium arsenide, and indium antimonide. Silicon also performs poorly at the high temperatures that intensive computation creates, necessitating extensive cooling mechanisms. Finally, it is very poor at conducting light, a problem when there is a growing need to combine photonics and electronics on a single chip. Although some materials stand out as promising candidates to replace silicon, such as germanium or the so-called III-V compound semiconductors, Hopkinson says silicon is such an established and integral part of the modern electronics supply chain that it is unlikely to be unseated any time soon. Instead, he predicts the development of hybrid chips that combine silicon with other, higher-performing semiconductors.


Abstract News © Copyright 2015 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe