Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 872:  Wednesday, November 30, 2005

  • "Overhaul of Linux License Could Have Broad Impact"
    New York Times (11/30/05) P. C3; Lohr, Steve

    A forthcoming overhaul of the General Public License (GPL) could have a seismic impact on the software industry, which has seen open source initiatives such as Linux become viable alternatives to traditional commercial products. Linux alone accounts for an estimated $40 billion worth of software and hardware. The work of overhauling the GPL begins today when a document outlining the process is posted at www.gplv3.fsf.org. The first revision to the license in 15 years, expected to be completed by summer or fall of next year, will also likely reexamine software patents. The driving force behind the GPL is Richard Stallman, founder of the Free Software Foundation, who has sought to use copyright law to protect the unfettered right to use, study, copy, and modify software. Stallman authored much of the free Unix-like software of the GNU project, though it was not until 1991 that Linus Torvalds supplied the kernel that, combined with that work, would eventually become known as Linux. Stallman, like much of the open source community, believes that proprietary software restricts the free flow of information, describing software patents as "utterly insane." He does, however, acknowledge that commercial software is necessary for Linux's continuing success, since it is not yet economically practical to run a computer solely on open source software.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Another Blow to E-Voting Company"
    Associated Press (11/29/05); Robertson, Gary D.

    A North Carolina judge has ruled that electronic voting machine manufacturer Diebold will not be protected from criminal prosecution in the event that it does not make its software code available, as required by state law. Due to the ruling, Diebold could halt sales of new voting equipment in that state, where lost votes cast doubt on the results of a statewide election last year. "We will obviously have no alternative but to withdraw from the process," said Doug Hanna, a lawyer for Diebold, which supplies voting machines to roughly 20 North Carolina counties. The dispute originates from the requirement that suppliers make available the code that powers voting machines and the programmers who design it. However, Hanna says that because Diebold uses Windows, it does not have the right to disclose Windows code, nor is it possible to provide the names of every programmer who designed Diebold's software. No criminal charges have been brought against Diebold yet, though the company's reputation was further tarnished for supplying voting machines that were responsible for election disruptions in California last year.
    Click Here to View Full Article
    For information regarding ACM's e-voting activities, visit http://www.acm.org/usacm

  • "Supreme Court to Hear eBay Patent Case"
    IDG News Service (11/28/05); Gross, Grant

    The Supreme Court will hear a case emerging from a suit involving the restriction placed on eBay against using technologies patented by MercExchange. An appeals court earlier ruled that eBay had acted in violation of two of MercExchange's patents concerning search software and online auctioneering. An original award of $35 million was eventually pared down to $25 million, and one of the patents was ruled invalid, though the appeals court upheld the infringement concerning eBay's "buy it now" feature. EBay's Hani Durzy said that "MercExchange, which doesn't practice its own patents and only exists to sue others, shouldn't be allowed a permanent injunction." The software and pharmaceutical industries could both feel the impact of the ruling, as they depend heavily on patents. The ruling could also be a broad referendum on a patent system that critics allege has become bloated and corrupt, as a growing number of companies are obtaining bogus patents and suing other companies for infringement. Pharmaceutical companies, which depend on patents to recoup the expense of developing drugs, could step in to oppose revisions to current patent law, which dates back to 1908, though some analysts believe that substantial changes to the existing laws are unlikely.
    Click Here to View Full Article

  • "Next-Generation Networks"
    Technology Review (11/29/05); Talbot, David

    In a recent interview, Internet2 CEO Doug Van Houweling likened today's Internet to second-class mail and outlined his vision of how it could be improved. Internet2 serves more than 240 academic and research institutions throughout the United States, as well as corporate and international organizations, using state-of-the-art technology to handle the most data-intensive applications. Van Houweling cites the emergence of closed architectures, scalability issues, security concerns, and the reluctance of companies to support standards-based solutions as limitations on today's Internet technology. He expects consumers to demand increased capacity to power the multimedia applications that are straining the capacity of the current Internet. As Van Houweling envisions it, the network of the future would support high-quality video and resource sharing, and ultimately lead to a fundamental shift in how we live our lives. To unlock the real potential of the Internet, Van Houweling identifies the three key areas of focus as optical networking, federated authentication, and consistent network performance. To address the escalating bandwidth requirements, the Internet2 group has developed a hybrid approach in which conventional IP technology is augmented with optical circuit paths. Trust will be the foundation of the next-generation Internet, said Van Houweling, where authentication will be built into every transaction, so as to curb the mounting problem of Internet fraud. To improve on the inconsistencies of today's Internet, diagnostic technology would be embedded in the future Internet, so the average user would be able to troubleshoot their system when performance was slumping. Van Houweling anticipates the research of the Internet2 group trickling down to improve the everyday user's experience over the next five to 10 years.
    Click Here to View Full Article

  • "Human-Centered Intranet Design"
    Intranet Journal (11/28/05); Chin, Paul

    The difficulty humans have interacting with technology is epitomized in poorly designed or needlessly complex intranets, the intended engines of integration that can paradoxically drive a wedge between a user and his computer. The critical lesson for developers is that regardless of their technological capabilities, the end product must be compatible with a human's innate work habits. As software grows more sophisticated, it frequently becomes more difficult for the layman to use, despite the assurances of simplicity proffered by marketing teams. Human-centered design is not a new concept, but it is too frequently given short shrift in a development process that focuses narrowly on technology. ACM's SIGCHI has offered this definition of human-computer interaction (HCI): "A discipline concerned with the design, evaluation, and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them." Factors undermining HCI are poorly designed software, prior experience, cultural biases, and individual limitations and abilities, though irrespective of how user-friendly a program claims to be, any new user will need a period of acclimation in which their first impressions are formed. It is the responsibility of developers to establish an initial connection with all users, rather than frustrating those who are not technologically inclined and sending them off in search of a simpler system. Developers are well-advised to avoid loading up a system with unnecessary features that inhibit, rather than facilitate, usability. To match a system to human work habits, designers should arrange content in a linear fashion, and make sure that meaning can be inferred at a glance. Clutter should be avoided, and the systems should be self-explanatory and avoid jargon.
    Click Here to View Full Article

  • "Stolen Passwords and Lost Laptops Among Top PC Concerns"
    Business Wire (11/29/05)

    Lost laptops and stolen passwords are the principal concerns among PC users, according to a recent survey, while 78 percent of those surveyed feel that biometrics will improve security. The survey, sponsored by AuthenTec, found that most users believe security has improved in the last five years, though many are still concerned about viruses and hackers, in addition to password issues. Despite the recommendations of security experts, 52 percent of users have the same password for all their accounts, and 77 percent report having trouble remembering their password. Almost three-quarters would opt for a laptop with biometric recognition when purchasing a computer in the future. "The survey results show that many computer users continue to use password techniques that put their systems at risk at work and at home," said Tom Aebli of AuthenTec, which has become a sponsor of today's National Computer Security Day, November 30, along with ACM and other organizations. The annual event began in 1988 to heighten awareness of issues pertaining to computer security.
    Click Here to View Full Article

  • "Security Flaw Allows Wiretaps to Be Evaded, Study Finds"
    New York Times (11/30/05) P. A21; Schwartz, John; Markoff, John

    Telephone wiretapping systems used by law enforcement agents can be defeated by off-the-shelf equipment due to a security flaw in the systems, according to computer security researchers at the University of Pennsylvania. The finding means that people who are being wiretapped can use "devastating countermeasures" to disrupt the recording, says lead researcher Matt Blaze. The wiretapping technology can be foiled by stopping the recorder remotely, and the numbers dialed can even be falsified, raising "implications not only for the accuracy of the intelligence that can be obtained from these taps, but also for the acceptability and weight of legal evidence derived from it," according to the researchers' report. The most vulnerable wiretapping systems are older ones, many of which are still used by state and local law enforcement, but an FBI spokeswoman says that only about 10 percent of state and federal wiretaps have the security flaw, making it a non-issue. The researchers' paper recommends that the FBI conduct a comprehensive review of its wiretapping technologies for security threats because the discovered security flaw threatens "law enforcement's access to the entire spectrum of intercepted communications." The researchers defeated the wiretapping systems by having the wiretapping target send the same "idle signal" that the tapping equipment transmits to the recorder when the telephone is not being used, which turns the recorder off and allows the targets to continue their conversation while sending the signal.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
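The vulnerability stems from in-band signaling: the tap's control tones share the audio channel with the conversation, so anything the target plays into the call can forge the "line is idle" marker. As a minimal sketch, the following generates a continuous sine tone of the general kind involved; the frequency and sample rate are illustrative placeholders, not the actual signaling values from the researchers' report.

```python
import math

SAMPLE_RATE = 8000   # telephone-grade sampling rate
TONE_HZ = 2600.0     # placeholder frequency; the real idle signal differs

def idle_tone(seconds, freq=TONE_HZ, rate=SAMPLE_RATE):
    """Generate PCM samples of a continuous sine tone.

    A wiretap target playing a tone like this over the voice channel
    would mimic the in-band "line is idle" signal that tells the tap's
    recorder to stop, while the call itself continues.
    """
    n = int(rate * seconds)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

samples = idle_tone(1.0)
print(len(samples))  # 8000 samples for one second of tone
```

The key point, which the sketch makes concrete, is that no special hardware is needed: any audio source on the tapped line can emit the control signal.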

  • "Smoothing the e-Media Delivery Chain"
    IST Results (11/25/05)

    To overcome the incompatibility and standardization obstacles that impede the exchange of digital content, the IST-funded MediaNet group is working to broaden the linkage between IP networks and other networks to facilitate new online services such as multi-channel publishing and interactive television. MediaNet members identified the tripartite package of voice, data, and video as a model ideal for commercial ventures, but ill-suited for the broader requirements of the future. They also decided that current video technologies, such as MPEG2, are not efficient enough to deal with the growing demands of multimedia formats. MediaNet partners also found content security and copyright protection to be inadequate, areas critical to an e-media value chain that consists of content creators, providers, and aggregators, application service providers, access network service providers, and the in-home setting. In the home entertainment arena, the group identified bandwidth and quality of service as principal areas of concern. To create a smooth flow of content throughout the e-media chain, MediaNet recommends improving each segment's interface, with the ultimate benefits of greater choice at lower cost and better quality of service in the audiovisual sphere, aided by technologies such as content compression that would assure high broadcast quality as television moves toward delivery over IP. Some of the project's most significant work was in video compression with the H.264 standard, which is capable of encoding video using a third of the bits required by existing MPEG2 encoders. While it remains uncertain whether the standard will be applied commercially, it has been deployed across Europe for broadband video services and digital video broadcasting.
    Click Here to View Full Article

  • "Online Maps: The Next Generation"
    University of Southern California (11/21/05); Ainsworth, Diane

    University of Southern California computer scientist Cyrus Shahabi says the next generation of data management and visualization technology is needed to help decision-makers integrate the wealth of geospatial information now available to them. Shahabi, a specialist in databases and information management, and his colleagues at USC's Integrated Media Systems Center are doing their part by developing technology as part of the GeoDec (Geospatial Decision Making) project. "GeoDec is designed to enable an information-rich and realistic three-dimensional visualization and/or simulation of geographical locations, such as cities or states, rapidly and accurately," says Shahabi. The GeoDec technology, similar to Google Earth and MSN Virtual Earth, allows for quick building of accurate 3-D models, mapping of images and live video textures to the models, and automatic integration of spatial and temporal data such as road networks and GPS data into the model. "The idea is not just to allow navigation through a 3-D model, but to be able to submit queries and get information about the area seamlessly and effortlessly," says Shahabi. The technology would benefit city managers, city planners, emergency response planners, and first responders, among others. The GeoDec team also includes artificial intelligence expert Craig Knoblock, computer vision specialist Ram Nevatia, and graphics professors Ulrich Neumann and Suya You.
    Click Here to View Full Article
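The queries Shahabi describes combine a spatial filter with a temporal one. As a generic illustration of that idea (the data model, field names, and values below are invented; GeoDec's actual schema is not described in the article), a bounding-box-plus-time-window query can be sketched as:

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A point of interest with a location and an observation time."""
    name: str
    lat: float
    lon: float
    timestamp: int  # e.g., seconds since some epoch

def query(features, lat_range, lon_range, time_range):
    """Return features inside the bounding box and the time window."""
    (lat_lo, lat_hi) = lat_range
    (lon_lo, lon_hi) = lon_range
    (t_lo, t_hi) = time_range
    return [f for f in features
            if lat_lo <= f.lat <= lat_hi
            and lon_lo <= f.lon <= lon_hi
            and t_lo <= f.timestamp <= t_hi]

pois = [
    Feature("fire station", 34.02, -118.28, 100),
    Feature("hospital",     34.05, -118.25, 200),
    Feature("school",       33.90, -118.40, 150),
]
hits = query(pois, (34.0, 34.1), (-118.3, -118.2), (0, 250))
print([f.name for f in hits])  # ['fire station', 'hospital']
```

A production system would answer such queries against spatial indexes rather than a linear scan, but the interface, a region plus a time window in, matching features out, is the same.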

  • "It's Life, But Not as We Know It"
    Age (AU) (11/29/05); Head, Beverley

    Increasing numbers of computer scientists believe analysis of real-life phenomena will lead to the development of tools that can explain and more efficiently manage complex systems. Monash University scientist David Green notes that, in many respects, advanced computing is almost identical to biology, and he says there is a wealth of knowledge to extract from studying how nature addresses complex problems. University of Queensland professor Peter Lindsay, director of the Australian Research Council's Center for Complex Systems, is studying bird flocking patterns via swarm analysis in the hopes of improving air-traffic management. Airservices Australia is participating in a joint study with the ARC Center and NASA to chart air traffic management to 2020, and part of this effort involves gaining a better understanding of individual controllers' workloads; study manager Gerard Champion says such an understanding helps expedite the introduction of free flight, in which planes fly "semi-randomly and autonomously," and where the controller functions as a monitor. Other areas the ARC Center is researching through multidisciplinary programs include genetic regulatory networks and evolutionary economic systems. ARC has awarded a three-year, $244,000 grant to Monash University ecologist Dr. Martin Burd to study the organizational behavior of ants as a jumping-off point for improving human traffic flow, while the construction of networks that exhibit a degree of creativity could be helped along by swelling volumes of data from advanced brain imaging. Awareness of complex systems research's advantages is rare in Australia's private industry, according to Lindsay. "We need industry pioneers to take it forward, but the most likely immediate users are governments and agencies," he says.
    Click Here to View Full Article
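The flocking behavior mentioned above is classically modeled with three local rules per agent: steer toward the group (cohesion), match neighbors' headings (alignment), and avoid crowding (separation). The sketch below is a generic boids-style step, not the ARC Center's actual model; all parameters are invented for illustration.

```python
import random

def step(agents, cohesion=0.01, alignment=0.05, separation=0.05,
         min_dist=1.0, drag=0.1):
    """Advance a flock one time step using the three boids rules."""
    n = len(agents)
    cx = sum(a["x"] for a in agents) / n      # flock centroid
    cy = sum(a["y"] for a in agents) / n
    avx = sum(a["vx"] for a in agents) / n    # average heading
    avy = sum(a["vy"] for a in agents) / n
    new = []
    for a in agents:
        vx = a["vx"] + cohesion * (cx - a["x"]) + alignment * (avx - a["vx"])
        vy = a["vy"] + cohesion * (cy - a["y"]) + alignment * (avy - a["vy"])
        for b in agents:                       # separation: push off close neighbors
            if b is not a:
                dx, dy = a["x"] - b["x"], a["y"] - b["y"]
                if 0 < dx * dx + dy * dy < min_dist ** 2:
                    vx += separation * dx
                    vy += separation * dy
        vx *= 1 - drag                         # damping keeps the flock stable
        vy *= 1 - drag
        new.append({"x": a["x"] + vx, "y": a["y"] + vy, "vx": vx, "vy": vy})
    return new

random.seed(0)
flock = [{"x": random.uniform(0, 10), "y": random.uniform(0, 10),
          "vx": 0.0, "vy": 0.0} for _ in range(20)]
for _ in range(50):
    flock = step(flock)
print(len(flock))  # 20 agents, now clustered and co-moving
```

Global order (a coherent flock) emerges from purely local rules, which is exactly the property that makes swarm models attractive for decentralized problems like free-flight air traffic.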

  • "Quantum Bubbles Are Key to Extreme Computing"
    New Scientist (11/26/05); Chown, Marcus

    Using electron bubbles to encode bits in several quantum states at once could be a strategy for building quantum computers, according to Weijun Yao of Brown University in Providence, R.I. An electron bubble is produced by injecting a fast-moving electron into liquid helium that has been cooled below 2.17 kelvin; the collisions made with the helium atoms eventually slow down the electron and create a cavity about 3.8 nm across. The cavity serves as the method for isolating quantum entities from their surroundings to protect their fragile states. The electrons' spin would carry 0s and 1s, and the qubit can be both 0 and 1 because the spin can be parallel or anti-parallel to the magnetic field. A linear quadrupole trap and a set of conducting rings can be used to cage an electron in its bubble, spins can be initialized by cooling the unit to 0.1 kelvin, and the electrons can be manipulated by altering the spin so that they interact as logic gates. "I see no major technical obstacles to the system I envisage working with 100 qubits," Yao says. "That means it could do 1,000 billion billion billion operations all at once."
    Click Here to View Full Article
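Yao's closing figure can be checked directly: a register of 100 qubits spans 2^100 simultaneous basis states, and "1,000 billion billion billion" is 10^3 × 10^9 × 10^9 × 10^9 = 10^30.

```python
# Sanity check on the quoted figure: 2**100 basis states for 100 qubits
# versus "1,000 billion billion billion" (10**30).
states = 2 ** 100
claim = 1000 * 10 ** 9 * 10 ** 9 * 10 ** 9
print(states)          # 1267650600228229401496703205376
print(states / claim)  # ~1.27, so the quote is right to order of magnitude
```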

  • "...And Now Cyber Hugging"
    Techtree (11/29/05)

    James Teh of the Nanyang Technological University in Singapore says his "Poultry Internet" technology will lead to "human-to-human virtual hugging." For the past two years, Teh and Adrian David Cheok and Lee Shang Ping, director and manager, respectively, of the Interaction and Entertainment Research Center, have been working on developing a "hug suit" for chickens, as part of a system for tracking activity in the coop by video camera, and transmitting information via the Internet for a 3D simulation of movements. The "hug" technology that a chicken wears is a wireless, sensor-rigged "jacket." The idea is for the owner to touch the 3D model of the chicken, to translate the instruction into data, and reproduce the information as a series of vibrations via the jacket worn by the animal. The researchers are currently testing the technology. Teh says children could soon wear "pajama suits" studded with sensors that are able to pick up signals via the Internet, and interpret the data to adjust to changes in pressure and temperature to deliver a hug from their mom or dad, and parents who wear the suits will be able to receive a hug from their kids. He believes such hugging and touching will become a key element of future communication.
    Click Here to View Full Article

  • "Universities Say New Rules Could Hurt U.S. Research"
    New York Times (11/26/05) P. A11; Shane, Scott

    Recently proposed restrictions on the access to sensitive technology granted to foreign students and workers have raised the fear among universities that their research efforts will be severely curtailed. The restrictions, already proposed by the Defense Department and anticipated from the Commerce Department, appear to be aimed particularly at China, which has 60,000 nationals in the United States, and is unabashedly eager for U.S. technology for use in military applications. The academic community counters that impeding scientific research will actually erode national security. A foreign citizen needs formal permission, known as a deemed export license, to work on technology with military applications in a U.S. lab, though many foreign researchers are exempt from the license if it is understood that their work is intended for public knowledge. The Commerce Department issued a report last year alleging that current regulations were too lax, and that they open the door to spies working in U.S. labs. The report called for using a foreigner's country of origin, rather than his current citizenship, as the criterion for whether he should need a license, but the Association of American Universities' Tobin Smith counters that an individual's allegiances cannot be determined by his country of birth, and that the recommendations would impose on universities the prohibitive expense of tracking down every researcher's native country and wading through the labor-intensive process of determining which researcher could work with which machine. The proposed restrictions also call for creating a segregated work environment where sensitive equipment is cordoned off and made unavailable to certain researchers. "That's not really realistic in a campus environment," Smith said of universities' shared facilities.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Baking Privacy Into Personal Data"
    CNet (11/28/05); LaMonica, Martin

    In a recent interview, IBM's Jeff Jonas described the identity resolution technology that he developed more than 20 years ago. Jonas believes the technology has reached an acceptable level of maturity to be adopted by industry and government organizations. Jonas' company, SRD, which got its start in fraud detection, was acquired by IBM in January, due largely to IBM's promise that Jonas would be able to expand the scope of his software to other industries. Jonas credits much of the system's development to the time he spent working with casinos in Las Vegas to spot scammers using multiple identities. His work with the casinos helped Jonas develop his relationship resolution software, an anonymization system that enables users to extract meaningful relationships out of an existing body of data, such as whether or not a vendor and a purchasing corporation have any connection, without compromising the security of a database. The anonymization software could have substantial value for companies increasingly assailed by concerns from customers and employees that their personal information is not secure. Anonymization technology could also have a sweeping impact on a wide variety of industries, from health care to government. Unlike conventional approaches, the technology runs its data analysis on encrypted information, rather than decrypting the information before it performs analytics.
    Click Here to View Full Article
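The general idea behind matching on encrypted data can be sketched with one-way hashes: each party normalizes and hashes its identifiers before sharing, so overlaps can be found without either side revealing its raw records. This is a generic illustration, not IBM's actual algorithm, which also handles fuzzy variants of names and addresses.

```python
import hashlib

def anonymize(identifier):
    """One-way hash of a normalized identifier.

    Normalization (here just trim + lowercase) ensures trivially
    different spellings of the same name hash to the same value.
    """
    canonical = identifier.strip().lower()
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical example: does any vendor also appear on the employee list?
vendors   = {anonymize(n) for n in ["Acme Supply Co", "Globex LLC"]}
employees = {anonymize(n) for n in ["acme supply co ", "Initech Inc"]}

# The intersection is computed on hashes alone; neither raw list is exposed.
overlap = vendors & employees
print(len(overlap))  # 1: "Acme Supply Co" matches after normalization
```

Because the hash is one-way, a compromised copy of the hashed list does not directly leak the underlying identities, which is what makes this style of analysis attractive for the privacy concerns the article describes.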

  • "Thank You for the Music"
    The Engineer Online (11/21/05)

    Researchers at Sun Microsystems Laboratories in Burlington, Mass., are working to give people more power to find and organize music in their personal collections. The "Search Inside the Music" project comes at a time when digital music collections have expanded substantially over the past decade, such as the capacity for MP3 players to hold 10,000 songs or more. However, the tools available for finding songs have remained largely the same in that the search options of traditional music software still consist of music genre, artist, album, or song title. Engineer Paul Lamere and his colleagues envision people having the ability to organize music based on acoustic similarity, mood, lyrics, musical theme, tempo, rhythm, and instrumentation, so that they will be able to search music by content and context. Lamere's team is currently involved in indexing music that "sounds similar" to songs the listener favors, and using social data on musical preferences to recommend and organize songs the listener would likely want to hear.
    Click Here to View Full Article

  • "Version Control in Online Software Repositories"
    University of Southampton (ECS) (11/28/05); Watkins, E. Rowland; Nicole, Denis A.

    Existing open-source version control repositories supply an architecture for tracking documents as they evolve and clustering them into projects and releases, but they cannot reason well enough to make intelligent inferences about the content and audit its development. E. Rowland Watkins and Denis Nicole of the University of Southampton's School of Electronics & Computer Science combine the concepts of the Semantic Web, the WikiWikiWeb, and XML Signature-based document signing in an online collaborative tool that provides rule-based semantic deductions along with basic version control. This in turn has yielded a small group of extensions based on broadly used ontologies. Watkins and Nicole have employed these extensions as a description logic to augment a basic wiki with semantic content that specifies documents and how they relate to differing versions. This was followed by the creation of a cryptographic validation tool that uses digital signatures for RDF as its foundation. The researchers note that the tools were created mainly to support the management and execution of ongoing software development projects, but they can also be used to assess how effective the semantic elements are by bulk loading existing projects' code repositories and drawing new conclusions about their historical behavior. The online collaboration tool's value to the developer community hinges on how well inferences can address complex queries that are tough or outside the range of simple relational database queries. Watkins and Nicole express their enthusiasm for companies' growing interest in employing wikis as a tool to spur knowledge sharing and collaboration overall.
    Click Here to View Full Article
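The rule-based deduction described above can be illustrated with RDF-style (subject, predicate, object) triples: only direct version links are stored, and a transitivity rule infers the full derivation history on demand. The predicate and document names below are invented for illustration, not taken from Watkins and Nicole's ontology.

```python
def transitive_closure(triples, predicate):
    """Infer (s, p, o) whenever a chain of `predicate` links connects s to o."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (s1, p1, o1) in list(inferred):
            for (s2, p2, o2) in list(inferred):
                if p1 == p2 == predicate and o1 == s2:
                    new = (s1, predicate, o2)
                    if new not in inferred:
                        inferred.add(new)
                        changed = True
    return inferred

# Only direct version links are asserted...
history = {
    ("spec-v3", "revisionOf", "spec-v2"),
    ("spec-v2", "revisionOf", "spec-v1"),
}

# ...the rest of the derivation history is deduced, not stored.
facts = transitive_closure(history, "revisionOf")
print(("spec-v3", "revisionOf", "spec-v1") in facts)  # True
```

This is the kind of query that is awkward in a plain relational store (arbitrary-depth chains) but natural for a rule engine layered over triples.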

  • "Agency Weighs Single-Letter Web Addresses"
    Associated Press (11/28/05); Jesdanun, Anick

    As Internet name space becomes increasingly cluttered, ICANN is considering lifting its prohibition on single-letter domain names, which could ignite a bidding frenzy on prized Internet real estate. ICANN is expected to develop a plan for the release of the coveted domain names, which could include auctions netting six-figure bids for a given name. It will also have to determine whether to open single-letter addresses, which were held aside in 1993 out of fear that they could not fulfill growing demand, under every suffix. A few companies have already requested that ICANN make single letters available, such as Overstock.com, which would prefer to brand itself as o.com, and Yahoo!, which has put itself at the front of the line for y.com. Sedo.com CEO Matt Bentley expects the bidding frenzy that would ensue should ICANN release the names to be enormously profitable: "They would have a lot of cachet as a brand name," he says. "I could see there would be tons of demand."
    Click Here to View Full Article

  • "The Beauty of Simplicity"
    Fast Company (11/05) No. 100, P. 52; Tischler, Linda

    Google director of consumer Web products Marissa Mayer says the search engine's continued popularity is largely due to the maintenance of a simple, easy-to-use, user-friendly Web site that masks the complexity of the underlying technology. She says the Google home page is successful because "It gives you what you want, when you want it, rather than everything you could ever want, even when you don't." Mayer, a specialist in artificial intelligence, not design, keeps Google developers compliant with strict standards designed to sustain the site's ease of use even with the addition of new and enhanced services. For instance, new services will not be included on Google unless they are compelling enough to solicit millions of page views a day, and Google promotes only a half-dozen services on its home page and sells no advertising. Factors contributing to a surfeit of frustrating, overcomplicated consumer technologies include the narrow-mindedness of engineers and industrial designers; a competitive environment where new features are the key measure of product differentiation; and marketers' failure to bring "ease of use" into vogue. Sometimes the first step to simplifying products is simplifying the company; for example, Royal Philips Electronics has earned kudos and additional sales revenue by dramatically reducing its businesses and divisions and adopting a more end-user-focused approach to business and product development. John Maeda, who runs MIT Media Lab's Simplicity Consortium, says technological simplicity is in part an issue of scale, as the advent of the microchip made it possible for very small devices to be very complex but with little room for instruction. Maeda is working on software dubbed OPENSTUDIO that would connect designers to customers in order to create products more tailored to individual users' needs.

  • "Designing Data-Centric Software"
    Embedded Systems Design (11/05) Vol. 18, No. 11, P. 16; Buechele, Eugene K.

    A fresh perspective on embedded software development is needed as devices transform into data-rich systems with data-driven applications, which is where data-centric software design comes in. A data-centric development approach shifts centrality from procedural functions and interfaces to data, and various alternatives for designing data-centric software exist. The use of software-design approaches and data-abstraction techniques to simplify applications programming and data management is one option, while the use of an embedded database is another. However, the computing power, memory, and storage requirements of database technologies often exceed the capabilities of most embedded systems. A data-centric development framework is a third option, and such a framework must employ a data-centric design approach; convert data-centric language into source code and data service libraries; possess a data management service architecture that would include storage management services, data-stream services, table-management services, memory-management services, and transaction services; and devote special attention to code portability. The provision of these various components would allow data-management problems to be addressed more efficiently with a domain-specific language, and permit the remaining framework elements to produce the code and supply the low-level deployment services required for optimum execution.
    Click Here to View Full Article
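Two of the services the article's framework enumerates, table management and transactions, can be sketched together: a tiny in-memory table whose changes either commit or roll back atomically. The interface below is invented for illustration (a real embedded framework would generate such a layer from a data-centric description language, typically in C, with careful memory management).

```python
import copy

class Table:
    """Minimal table-management service with a snapshot-based transaction service."""

    def __init__(self):
        self._rows = {}
        self._snapshot = None

    def begin(self):
        # Transaction service: remember the pre-transaction state.
        self._snapshot = copy.deepcopy(self._rows)

    def put(self, key, row):
        self._rows[key] = row

    def get(self, key):
        return self._rows.get(key)

    def commit(self):
        self._snapshot = None          # discard the undo state

    def rollback(self):
        if self._snapshot is not None:
            self._rows = self._snapshot  # restore the pre-transaction state
            self._snapshot = None

sensors = Table()
sensors.begin()
sensors.put("temp0", {"celsius": 21.5})
sensors.commit()

sensors.begin()
sensors.put("temp0", {"celsius": 999.0})  # implausible reading
sensors.rollback()                         # the transaction service undoes it
print(sensors.get("temp0"))                # {'celsius': 21.5}
```

The point of the data-centric framework is that application code calls services like these instead of hand-managing buffers and consistency, so the data-management logic is written once and reused.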