
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 8, Issue 888:  Wednesday, January 11, 2006

  • "ACM Names 34 Fellows for Contributions to Computing and IT"
    ACM (01/10/06) Gold, Virginia

    ACM has recognized 34 of its members for their contributions to both the practical and theoretical aspects of computing and information technology. The new ACM Fellows, drawn from some of the world's leading companies, research labs, and universities, made significant advances that are having lasting effects on the lives of citizens throughout the world. Within the corporate sector, Intel garnered two Fellows, recognized for achievements in mobile and ubiquitous systems and in high-performance processors and multimedia architectures. AT&T Labs also had two Fellows, whose contributions were in algorithms and data structures and in the theory of e-commerce and market-based, decentralized computation. Other corporate research facilities with 2005 Fellows were Microsoft Research; IBM Almaden Research Center; Bell Labs, Lucent Technologies; and Thomson Paris Research Lab. Topping the list of universities with multiple winners was Carnegie Mellon University, with three 2005 ACM Fellows. Several other universities, including Stanford, Illinois, Georgia Tech, Washington, Berkeley, Wisconsin, and Brown, had two winners each. "These individuals deserve our acclaim for their dedication, creativity, and success in pursuing productive careers in information technology," said ACM President David Patterson. "By seizing these opportunities, they demonstrate the astonishing potential for innovation in the computing discipline, and the broad-based, profound and enduring impacts of their achievements for the way we live and work in the 21st Century. On a personal note, I am pleased that I've known and collaborated with many of these new fellows for several years." For more information, and a complete list of the 2005 Fellows, visit:
    Click Here to View Full Article

  • "U.S. Office Joins an Effort to Improve Software Patents"
    New York Times (01/10/06) P. C3; Markoff, John

    Bowing to criticism that it insufficiently researches existing patents before issuing new ones, the U.S. Patent and Trademark Office is partnering with the open-source software community on three endeavors intended to improve software patents and stem the tide of intellectual property litigation. Patent officials met with IBM, Red Hat, Novell, and other companies and universities that favor open-source software to discuss how to better research existing patents. They concluded that new search technologies could power a tool on the patent office Web site that would let users mine a database of existing patents. Another initiative, based on the work of University of Pennsylvania intellectual property expert R. Polk Wagner, would generate a patent quality index to guide patent applicants as they write their applications. IBM is leading the third initiative, which could use some of Google's search technologies to index existing open-source software so that patent examiners considering a new application can locate prior art. "This is a great example of how the patent office can reach out to the community and how they can help us where we have difficulty getting prior art," said John Doll, the commissioner for patents. Proponents argue that improving the quality of patents will reduce the cost of defending them, freeing that money for research and development and ultimately leading to fewer, but better, patents. Some critics of the patent office are skeptical about the benefit of providing it with more information, however, given that it has dealt ineffectively with the information it already has.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Homeland Security Helps Secure Open-Source Code"
    CNet (01/10/06); Evers, Joris

    The Department of Homeland Security has awarded $1.24 million to Stanford University, Coverity, and Symantec to search for vulnerabilities in open-source software and to improve Coverity's proprietary source-code analysis tool. Stanford is receiving the lion's share of the money, which the department will pay out over three years. Stanford and Coverity will construct an application that conducts daily inspections of code submitted to popular projects, compiling a database that will be open to developers once the system begins operating in March. Developers working on Linux, Firefox, or Apache, for instance, will be able to repair bugs before they are incorporated into a public release. "We're going to make automatic checking deeper and more thorough using the latest research and apply this to the open-source infrastructure to make it more robust," said Stanford professor Dawson Engler. Symantec will contribute security intelligence and test Coverity's analysis tool. The effort will help the open-source community catch up with its commercial counterparts, which regularly use such tools; open-source programmers have tended to check each other's work manually because the analysis tools are prohibitively expensive. Commercial software developers will also benefit from the program, as they will be able to apply Coverity's tool to their own code. Some in the open-source community have criticized the initiative for not going far enough, calling for Coverity to distribute its tool directly to developers so that they can check their own code. Among the open-source projects Stanford and Coverity plan to check are Apache, BIND, KDE, Linux, Firefox, OpenBSD, OpenSSL, MySQL, FreeBSD, and Ethereal.
    Click Here to View Full Article
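
    The automatic checking Engler describes is static analysis: software that scans source code for bug patterns without running it. The following is a minimal sketch of the idea in Python, illustrative only and far simpler than Coverity's path-sensitive data-flow analysis; it flags one classic C bug pattern, a pointer dereferenced before a later null check:

        import re

        def use_before_null_check(c_source):
            """Flag pointers dereferenced before a later null check (sketch)."""
            first_deref = {}   # pointer name -> line of first dereference
            findings = []
            for lineno, line in enumerate(c_source.splitlines(), start=1):
                for name in re.findall(r"(\w+)\s*->", line):
                    first_deref.setdefault(name, lineno)
                for name in re.findall(r"if\s*\(\s*(\w+)\s*[=!]=\s*NULL", line):
                    if name in first_deref and first_deref[name] < lineno:
                        findings.append((first_deref[name], name, lineno))
            return findings

        sample = """int f(struct node *p) {
            int v = p->value;          /* dereference on line 2 */
            if (p == NULL) return -1;  /* null check arrives too late */
            return v;
        }"""
        for deref_line, name, check_line in use_before_null_check(sample):
            print(f"'{name}' dereferenced on line {deref_line} "
                  f"but null-checked on line {check_line}")

    A real analyzer parses the code and tracks values across every execution path, which is what makes the daily, project-wide scans described above feasible.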

  • "Stricter Nanotechnology Laws Are Urged"
    Washington Post (01/11/06) P. A2; Weiss, Rick

    The Woodrow Wilson International Center for Scholars has released a study warning that if existing nanotechnology laws and regulations are not strengthened, the field will fail to realize its potential, due either to a disaster or to an erosion of public confidence. "We know from what happened with agricultural biotechnology and nuclear power that if you don't have public support, or at least public tolerance, a field's potential is not going to be realized," said J. Clarence Davies, an environmental policy expert at the nonpartisan think tank Resources for the Future and the author of the study. "For nanotechnology, I don't think existing systems or laws can serve this purpose." Government and industry members contested the findings, citing other reports that make conflicting claims. Nanotechnology, expected to become a trillion-dollar industry in the next decade, is already being applied to computers, cosmetics, and medicine. The technology capitalizes on the unusual properties materials exhibit at the atomic scale, but in lab tests nanoparticles have been found to obstruct animals' airways and to travel to the brain, where they could disrupt metabolism. Other studies point to environmental damage caused by nanoparticles that would be difficult to repair once the materials made their way into soil and water. The report alleges that existing laws, such as the Toxic Substances Control Act, contain loopholes or language too vague to address the environmental hazards nanomaterials pose. Under that law, manufacturers are required to notify the EPA if they intend to use new chemicals, but many companies consider nanomaterials simply scaled-down versions of existing chemicals, and thus fail to report their use. Other laws governing nanomaterials fail to require safety testing of a product before it is marketed, while the agencies charged with enforcing the regulations are understaffed.
    Click Here to View Full Article

  • "Spin Doctors Create Quantum Chip"
    Wired News (01/11/06); Hudson, John

    University of Michigan researchers have developed what they describe as the world's first quantum microchip, a significant milestone on the road to replacing transistors and developing the next generation of supercomputers. The Michigan researchers built a device called an ion trap on a semiconductor chip, which enabled them to manipulate the quantum states of isolated, charged ions. Quantum systems of this kind are built from ions, atoms that carry a positive or negative charge depending on their electron count. "The cadmium atom that has lost an electron becomes a positively charged ion, which can then be controlled with an electrical field," said Daniel Stick, a doctoral student in Michigan's physics department. Scientists apply electric fields to an ion once it has been confined in the trap, while laser light controls the spin of the ion's free electron to switch it between quantum states. The qubit's binary value is a function of its spin, as different manipulations can make it represent a one, a zero, or both at the same time. The ability to simultaneously occupy two quantum states, known as quantum superposition, is what allows quantum computers to perform certain calculations exponentially faster than existing machines. The scientists made their gallium arsenide chip through the conventional technique of microlithography, which they say should make mass production fairly easy.
    Click Here to View Full Article
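
    Numerically, a qubit is just a pair of complex amplitudes over the basis states |0> and |1>. The following is a minimal sketch of superposition, a generic illustration rather than a model of the Michigan ion trap:

        import numpy as np

        ket0 = np.array([1, 0], dtype=complex)             # basis state |0>
        ket1 = np.array([0, 1], dtype=complex)             # basis state |1>

        # A Hadamard rotation drives |0> into an equal superposition,
        # the analogue of the laser pulses that rotate the trapped ion's spin.
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        psi = H @ ket0                                     # (|0> + |1>) / sqrt(2)

        print("amplitude on |1>:", np.vdot(ket1, psi))     # (0.707...+0j)
        probs = np.abs(psi) ** 2                           # Born rule
        print("P(0) =", probs[0], " P(1) =", probs[1])     # 0.5 each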

  • "The Innovation Burden Must Be Shared"
    Financial Times (01/11/06) P. 15; Segal, Adam; Yochelson, John

    Congress is taking its most serious look at U.S. competitiveness since the ascendancy of the Japanese economy in the 1980s, but in today's radically different global environment, simply spending more federal dollars on research and development will not be sufficient to keep the United States ahead of the pack, write Adam Segal, senior fellow in China studies at the Council on Foreign Relations, and John Yochelson, former president of the Council on Competitiveness and current president of Building Science and Engineering Talent. Every member of the engineering community must redouble its commitment to innovation to survive in a world economy with reduced trade barriers, where well-trained and talented workers move freely across international borders. Unlike in the 1980s, U.S. technology corporations today are the undisputed leaders in their fields; the concern lies in the education system, where fewer students are pursuing technical courses of study even as foreign countries experience the opposite trend. A $9 billion legislative package has been introduced as a top-down solution to the problem, contributing more money for basic research, recruiting 10,000 math and science teachers, and creating scholarships for 25,000 undergraduates and 5,000 graduate students in math, physical science, and engineering. More money is not the only answer, though: federal agencies must do a better job of getting the most out of the money they already spend. Luring talented teachers to primary and secondary schools will remain a challenge as long as compensation is low. While teachers, employers, and professional societies must all work together to preserve U.S. innovation, research universities and corporations bear the greatest responsibility for reversing declining interest in technical majors and investing in education.
    Click Here to View Full Article

  • "Microsoft Research India to Work on Cryptography"
    Infoworld Netherlands (01/10/06); Ribeiro, John

    Microsoft Research plans to take advantage of the superior mathematical skills available in India by establishing a cryptography group at its lab in Bangalore. Cryptography for mobile phones, radio frequency identification (RFID) tags, and other small devices will be the focus of the group at Microsoft Research India, according to Ramarathnam Venkatesan, senior researcher in cryptography at the lab. "We will be researching methods that don't assume a lot of computation power," says Padmanabhan Anandan, managing director of Microsoft Research India. More specifically, the lab will develop new cryptographic primitives involving encryption, decryption, and authentication algorithms, as well as analyze and attempt to break current algorithms. The cryptography group will have an opportunity to collaborate with experts at the Indian Institutes of Technology and the Indian Institute of Science. The group will also work with researchers in Israel and other countries, says Anandan.
    Click Here to View Full Article

  • "IPv6: World's Largest Technology Upgrade on Deck"
    TechWeb (01/09/06); Sullivan, Laurie

    A panel of four experts at the 2006 Consumer Electronics Show in Las Vegas last week generally agreed that IPv6 must replace IPv4, but when and how were bones of contention. Dramatic increases in the use of mobile IP, IP telephony, IPTV, and related technologies have created a need for more address space, though the United States has been slow to act. Japan already has up to 500,000 IPv6 users, compared to 2,000 at the most in the United States, says IPv6 Summit CEO Alex Lightman. But already, says Lightman, some industries are feeling the pinch of dwindling address availability. "Telecommunication carriers periodically check to see if you're done using the dynamic address they loan you when making a call," he says. "The dirty little secret is carriers take the address back when they need it even if it ends the call, leaving you to think it's a bad cell zone." Adoption is being hampered in the United States by onerous regulations and competition, say experts. The federal government is expected to spend billions of dollars to make the transition to the new protocol eventually. The White House has already ordered agencies to develop transition plans by February and to be IPv6-compliant by June 2008. "I predict the U.S. government won't use or accept IPv4 packets after 2017," says Lightman, who calls U.S. efforts on that front thus far "pathetic." Two technologies are slowing the need for the next-generation protocol: Network Address Translation (NAT), which allows a large organization to connect numerous devices through a single public IP address, and name-based virtual hosting, which allows multiple DNS names to share the same address. Nevertheless, experts say IPTV will eventually be the key driver of a switchover. The shift will be gradual, with security vulnerabilities expected during the transition, but it should ultimately improve online security by making address spoofing much harder.
    Click Here to View Full Article
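
    The arithmetic behind the address shortage is stark, and easy to check with Python's standard ipaddress module (a quick sketch; the addresses below are documentation examples):

        import ipaddress

        print(2 ** 32)     # total IPv4 addresses: 4,294,967,296
        print(2 ** 128)    # total IPv6 addresses: roughly 3.4 x 10**38

        # The same host numbered both ways:
        print(ipaddress.ip_address("192.0.2.10").version)   # 4
        print(ipaddress.ip_address("2001:db8::a").version)  # 6

        # A single standard IPv6 subnet (a /64) already holds 2**64
        # addresses, over four billion times the entire IPv4 space.
        print(ipaddress.ip_network("2001:db8::/64").num_addresses)

    NAT exists precisely because the first number above ran short; IPv6 makes the workaround unnecessary by giving every device its own address.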

  • "Is Java Getting Better With Age?"
    CNet (01/09/06); LaMonica, Martin

    Sun CTO James Gosling, widely considered the father of Java, discussed the ongoing debate over programming languages in a recent interview. While many programmers are convinced that scripting languages such as PHP and Python are replacing Java, Gosling believes that Java use is as widespread as it has ever been, citing the growing popularity of Sun's developer education initiatives around the world. Gosling looks at languages such as PHP and sees their underlying functionality as Java-based. PHP is well suited to straightforward Web page development, though it is ineffective for more complex applications. Gosling points out that Java is a language with two levels: one is the ASCII source syntax, the other a virtual machine that has been the springboard for scores of scripting languages. Because Sun wants Java to remain useful for high-end applications, such as large transaction servers for financial institutions, it has developed tools to improve its ease of use, such as Java Studio Creator, which enables users to drag and drop chunks of AJAX code and very quickly build Web sites. Rather than viewing the emergence of LAMP as a threat to Java's market share, Gosling believes that Java is complementary to the LAMP stack and welcomes diversity in the market. Java has the advantage of providing developers with a portable skill set: Gosling notes that a programmer steeped in Java can work on application and transaction servers, networking protocols, cell phones, and in a host of other environments.
    Click Here to View Full Article

  • "Security Conference Focuses on Collaboration"
    Telephony Online (01/10/06); McElligott, Tim

    Keynote speakers from the federal government and the communications, transportation, and utility sectors shared their horror stories, success stories, and hard-won lessons this week at the inaugural Homeland Security for Networked Industries conference in Orlando, Fla., where collaboration was a central theme. Department of the Interior CIO W. Hord Tipton urged more collaboration among the various sectors of private industry and held up his department, with its oversight of 70,000 employees and 200,000 volunteers spread across several agencies, as an example of the challenges that lie ahead. "This is about securing our nation's infrastructure for a better prepared America," said Tipton, who along with others stressed the need for risk assessment, given that not every contingency can be covered. "All assets are not created equal, so you have to be able to pick the ones that matter," said McAfee's Eric Winsborrow, explaining that companies should determine which assets are most vital to them before formulating a policy. AT&T vice president of operations Roberta Bienfait called for a proactive rather than reactive approach, citing a potential bird flu outbreak as an example. Such an outbreak "could take 40 percent of our workforce. So we have to know how we would run our network without our employees." Also calling for collaboration between the public and private sectors was Donald Purdy, acting director of the Department of Homeland Security's National Cyber Security Division. "Information sharing is critical, but it has to move beyond that to real collaboration to mitigate risks within and between these sectors," Purdy said.
    Click Here to View Full Article

  • "Fake Fingers No Match for Scanner's Electronic Nose"
    New Scientist (01/07/06) Vol. 189, No. 2533, P. 22; Biever, Celeste

    Italian biometrics researcher Davide Maltoni plans to present his work on liveness detection this week at the International Conference on Biometrics in Hong Kong. Maltoni, of the University of Bologna, has connected an off-the-shelf "electronic nose" to specially designed computer software in an effort to improve the security of fingerprint scanners. Electronic noses often serve as detectors for pollution or spoiled food. By recording the sensor's voltage changes for a few seconds, the system is designed to distinguish the smell of a real finger from fake fingers made of silicone, gelatine, latex, or Play-Doh. The electronic nose is based on a single metal-oxide film whose electrical response changes as different gas molecules pass over it. Maltoni says the software has achieved a 92 percent success rate in tests. Commercial scanners can be beaten by fraudsters, as researchers from Yokohama National University in Japan demonstrated in 2002 using a gelatine fake finger bearing fingerprints lifted from glass.
    Click Here to View Full Article
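
    Stripped of the sensor physics, liveness detection of this kind is a small classification problem: summarize the voltage trace with a few features and compare against a template recorded from real fingers. A minimal sketch follows; the features, data, and threshold are hypothetical, not Maltoni's:

        import numpy as np

        def features(trace):
            """Summarize a voltage trace: overall drift and variability."""
            return np.array([trace[-1] - trace[0], trace.std()])

        def is_live(trace, live_template, threshold=0.5):
            """Accept if the response resembles the real-finger template."""
            return float(np.linalg.norm(features(trace) - live_template)) < threshold

        # Toy data: a real finger's odor drives the sensor voltage upward;
        # a silicone fake leaves it nearly flat.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 3, 100)
        real_finger = 0.8 * (1 - np.exp(-t)) + rng.normal(0, 0.02, 100)
        silicone_fake = rng.normal(0, 0.02, 100)

        template = features(real_finger)
        print(is_live(real_finger, template))    # True
        print(is_live(silicone_fake, template))  # False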

  • "Ray Kurzweil: IT Will Be Everything"
    Computerworld (01/09/06) P. 28; Anthes, Gary

    In a recent interview, futurist Ray Kurzweil outlined his vision of the development of computer intelligence, predicting that human and machine intelligence will eventually become indistinguishable thanks to new brain-scanning technologies that can record the process of generating thoughts. With the amount of data collected about the brain doubling every year, scientists are creating complex mathematical models of different regions, and Kurzweil estimates that there will be a detailed simulation of the whole brain by the late 2020s. To simulate the entire brain, computers must be able to process 10 quadrillion calculations per second, which two Japanese supercomputers should be able to do by 2010. Kurzweil believes that once conventional chip scaling reaches its limits, three-dimensional computing will take over, and a cubic inch of circuitry will have 100 million times the processing power of a human brain. Computers will also be able to read and alter their own source code, continually improving it and eventually overtaking human intelligence. Tiny displays on our eyeglasses will eventually beam images directly onto our retinas, creating the perception of a very large screen from a very small one. Computers will be miniaturized and embedded in clothing, each a node in a universal, self-organizing network where communications automatically find the most efficient path of transmission. People and technology will converge as intelligent machines known as nanobots are infused into the bloodstream to supplement human intelligence and fight disease. Kurzweil argues that computer intelligence will increase by a factor of 1,000 each decade, and that biological intelligence will eventually become comparatively insignificant. As machines progress, Kurzweil contends that everything of importance will eventually be governed by IT, and that the field will move increasingly toward specialization.
    Click Here to View Full Article
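
    The two growth rates Kurzweil cites are one claim at two scales: a quantity that doubles every year grows over a decade by

        2^{10} = 1024 \approx 10^{3},

    which is where the factor of 1,000 per decade comes from.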

  • "Wanted: Female Computer-Science Students"
    Chronicle of Higher Education (01/13/06) Vol. 52, No. 19, P. A35; Carlson, Scott

    Debate is brewing over how to account for shrinking computer science enrollments in colleges, particularly among women. One theory posits that computer science programs are unsuited to women and their style of learning, but some successful women discount this suggestion, arguing instead for more social support for women in the field, reinforced by the elimination of gender-based discrimination by peers and parents. Since the publication of "Unlocking the Clubhouse: Women in Computing," which concluded that girls' motivation for learning computer science is more application-oriented than boys', Carnegie Mellon University has changed its admissions policy to emphasize ambition, high grades, and leadership skills rather than prior programming experience. Carnegie Mellon computer science professor Lenore Blum has also established Women@SCS, a program that provides role models and mentoring services for female computer science students. Blum does not believe there are innate differences between men and women in this regard, and argues that re-orienting the curriculum for women would only marginalize them. Meanwhile, the University of Maryland-Baltimore County founded the Center for Women and Information Technology (CWIT) to provide a support structure for female students and to mitigate some of the discouraging factors that can prey on them, such as the perception of computer professionals as geeky and socially maladroit, and gender discrimination that often starts as early as middle school. CWIT executive director Claudia Morrell is developing extracurricular programs designed to stimulate interest in science and technology among middle-school girls, as well as to advise parents on how to nurture that interest.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)
    For information about ACM's Committee on Women and Computing, visit http://www.acm.org/women

  • "The Legal and Practical Implications of Recent Attacks on 128-bit Cryptographic Hash Functions"
    First Monday (01/02/06) Vol. 11, No. 1; Gauravaram, Praveen; McCullagh, Adrian; Dawson, Ed

    Attacks on the 128-bit hash functions MD4, MD5, RIPEMD, and HAVAL-128, presented at Crypto 2004 by Xiaoyun Wang et al., demonstrate that these functions are insecure in any application that depends on collision resistance, disqualifying them as collision-resistant hash functions. MD5's vulnerability is of particular concern, given its wide use in applications that include digital signature generation and verification and software integrity assurance for numerous products. The discovery puts the future use of MD5 in digital signature generation and other applications in doubt; the National Institute of Standards and Technology (NIST) intends to phase out SHA-1 by the end of the decade, and urges that SHA-1 and other 160-bit hash functions be replaced with the stronger algorithms available in the NIST-approved Federal Information Processing Standard. Digital signature technology depends critically on the non-repudiation property, and the Wang attacks allow both the signer and the verifier of a digitally signed message that uses MD5 to circumvent this property and thus cheat each other. This undermines the evidential value of digital signatures wherever the broken hash functions are used for signing. The attacks on MD5 can also enable a third party to alter the contents of a digital certificate without changing the certification authority's digital signature. Collision attacks apparently have no significant impact on password verification schemes, which do not rely on collision resistance. In addition to the SHA-1 phaseout and the retirement of 128-bit and 160-bit hash functions, the authors recommend that new approaches to hash function design be devised, imposing tougher constraints on the basic security properties to defeat current and future cryptanalysis methods.
    Click Here to View Full Article
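
    The non-repudiation failure is mechanical: a signature is computed over the digest of a message rather than the message itself, so two messages with the same MD5 digest share one valid signature. Below is a minimal sketch of the verification logic using Python's standard hashlib; the signature step is a placeholder, and the actual colliding message pairs from Wang's attack are not reproduced here:

        import hashlib

        def md5_digest(message):
            return hashlib.md5(message).digest()   # 128-bit digest

        def sign(message, private_key=None):
            # Real schemes (e.g., RSA) apply the private key to the digest;
            # what matters here is that only the digest gets signed.
            return md5_digest(message)             # placeholder "signature"

        def verify(message, signature, public_key=None):
            return md5_digest(message) == signature

        contract = b"I agree to pay $100."
        sig = sign(contract)
        print(verify(contract, sig))               # True

        # If an attacker crafts a second message with the same MD5 digest
        # (exactly what Wang et al.'s collision attack produces), verify()
        # would also return True for a contract the signer never approved.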

  • "A Conversation With Phil Smoot"
    Queue (01/06) Vol. 3, No. 10; Fried, Ben

    Hotmail engineer Phil Smoot explains that managing the service's billions of daily email transactions with fewer than 100 system administrators involves meeting a variety of challenges, such as contending with abuse and spam, maintaining an Internet-level pace of shipping features and functionality every three to six months, and working out strategies for rolling out complex changes over a series of releases. "When you ultimately get down to the cost of these big, huge megasystems, there's going to be a good chunk of it for your hardware and another big chunk for human costs, so the more you automate processes and the more resilient you can be with handling failures, the more competitive you can be," Smoot says. He believes engineers should build automation and instrumentation into a service from the beginning. Smoot notes that Hotmail uses as many off-the-shelf Microsoft products as the public can buy, while custom tools are more likely to be built for deployment, ticketing, metrics gathering, code coverage, bug tracking, inventory, failure detection, monitoring, and build systems. He describes Hotmail as nearly 98 percent effective at spotting spam messages and sources, and says responding to spam is a daily battle. Smoot says backup to tape is no longer workable, and he foresees a move toward cheap disk-based backups. "What you'll start to see is the emergence of the use of data replicas and applying changes to those replicas, and ultimately the requirement that these replicas be disconnected and reattached over time," he predicts. Operational consistency means the operations staff does not have to grow in step with the service, which yields economic advantages, Smoot says.
    Click Here to View Full Article

  • "What Is Speech Usability, Anyway?"
    Speech Technology (12/05) Vol. 10, No. 6, P. 22; Polkosky, Melanie D.

    Many facets of human-computer interaction technologies are frustratingly difficult to define, and usability in speech applications is among them. Usability has been defined as a function of how easily users can learn a system, how efficiently they can use it, and how well they can remember it, along with the number of errors they typically make. Conventional definitions of usability fail to describe speech technology accurately, since it is not conveyed visually and is further complicated by issues surrounding the persona of the interface. One of the first measures of speech usability gauged the intelligibility and naturalness of a human voice heard over a telephone; this test failed to account for how a user reacted to a synthetic voice and was expanded to include the social impression and the musical quality of speech. Even this expanded Mean Opinion Scale failed to capture the linguistic cues assumed to influence people's perceptions of speech technology. Subsequent definitions have had similar shortcomings, leading experts to the current four-factor definition comprising user goal orientation (which measures a user's control, confidence, and efficiency), customer service behavior, speech characteristics, and verbosity. In measuring the effectiveness of a system, speech technologists aim for a strong correlation between usability and customer satisfaction. System designers must also learn the priorities of the intended users through focus groups and surveys. It is important to consider a system's persona, which is not merely a function of being male or female, though the persona must remain subordinate to the goals of the system's users. Developers must also realize that users judge the quality of a system by comparison with a host of other customer service environments, such as human interaction, the Web, and the mass media of television and radio.
    Click Here to View Full Article

  • "Innovations From a Robot Rally"
    Scientific American (01/06) Vol. 294, No. 1, P. 64; Gibbs, W. Wayt

    The Defense Advanced Research Projects Agency's (DARPA) Grand Challenge 2005--an off-road race in which automated vehicles had to travel 132 miles in less than 10 hours to claim a $2 million prize--inspired the engineering community to develop technical innovations enabling computer-controlled vehicles to rapidly traverse rough and unfamiliar terrain. The unmanned robots demonstrated advanced location tracking, high-speed path planning, and road and obstacle perception technology, which may eventually be incorporated into military, industrial, agricultural, and consumer vehicles. Carnegie Mellon University's Red Team carefully planned the paths its entrants would follow with a "preplanning" system that used satellite, aerial, and laser-scanned imagery to map out routes and divide them into manageable segments reviewed by experts; the task was made all the more challenging by the fact that the race course was not unveiled until two hours before the contest began. The Grand Challenge suggests laser scanning may be robot vehicles' best method for road and obstacle perception, although approaches varied. Each of the Red Team's vehicles was equipped with a single, movable long-range laser, while Team DAD tackled the laser-scanner calibration problem with an innovative sensor that packed 64 lasers into a motorized circular platform spinning 10 times a second. The winning vehicle was Stanford University's Stanley robot, which used a combination of laser and video sensors and machine learning to apply the gas whenever it detected a smooth road extending into the distance. Team TerraMax's "trinocular" vision system constructed a 3D stereo view of its surroundings from any of three pairs of color video cameras.
    Click Here to View Full Article
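
    Stanley's speed rule can be caricatured in a few lines: if the laser returns ahead look flat, accelerate; if they look rough, slow down. The sketch below is a hypothetical simplification with invented thresholds; the real system learned road classification from laser and video data:

        import numpy as np

        def choose_speed_mph(laser_heights, smooth_threshold_m=0.05,
                             fast=35.0, slow=8.0):
            """laser_heights: estimated terrain heights (meters) along the
            planned path ahead, from the scanning laser."""
            roughness = np.ptp(laser_heights)   # peak-to-trough variation
            return fast if roughness < smooth_threshold_m else slow

        smooth_road = np.array([0.00, 0.01, 0.00, 0.02, 0.01])
        rocky_trail = np.array([0.00, 0.10, 0.35, 0.05, 0.20])
        print(choose_speed_mph(smooth_road))   # 35.0
        print(choose_speed_mph(rocky_trail))   # 8.0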
