Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 836:  Wednesday, August 31, 2005

  • "State Law Would Mandate Following E-Vote Paper Trail"
    Inside Bay Area (CA) (08/30/05); Hoffman, Ian

    The California Senate voted unanimously on Aug. 29 to pass a bill requiring e-voting systems to include a paper trail, a safeguard conceived by computer scientists against election fraud and voting errors. "People need and deserve to know their votes have been counted accurately, and the best way to ensure that is to make sure there's a paper printout of every electronic ballot," declared bill author Sen. Debra Bowen (D-Marina Del Rey). Her legislation would accord electronic ballots the status of a legal record, but rely on the paper trail for the state's mandatory hand count of ballots in 1% of districts. University of California, Berkeley, computer science professor David Wagner said touch-screen voting machine vendors have made the paper rolls used for the printed ballots unnecessarily clunky and prone to jams, while California Secretary of State Bruce McPherson said the paper rolls should not be treated as ballots because they do not look like ballots. If the measure is approved by Gov. Arnold Schwarzenegger, paper trails would likely be a fact of life for California voters for quite a while, as a more effective method of e-vote confirmation remains elusive. "My guess is for the next five to 10 years paper will be the preferred alternative," said Wagner.
    Click Here to View Full Article

  • "Universal Software, Universal Appeal"
    IST Results (08/30/05)

    Addressing the inefficiencies of designing software for devices that run on heterogeneous networks is the underlying motivation of the IST-funded DEGAS project, which devised a theory for managing such networks and created tools to write software that can work on a wide spectrum of devices. Demos of e-commerce software and a mobile adventure game were developed by the DEGAS team, using the Unified Modeling Language (UML) to design the core components. Currently, applications are differentiated exclusively by the system architecture they are specifically designed for, but the DEGAS team deferred differentiation as long as possible by using a process calculi intermediate framework, according to project coordinator Corrado Priami of Italy's University of Trento. The researchers created a central program that remains consistent across all devices, and the intermediate language was adapted to individual devices via compilers. A major breakthrough for the DEGAS team was the placement of most analysis tasks at the logical or universal layer, which significantly streamlines application development. The development of formal analysis and validation tools was another substantial accomplishment, one that guarantees the proper operation of the software and boosts program security. The architecture behind the mobile multiplayer adventure game addresses latency through the use of a small central server while most game controls reside on the specific device, and Priami says the team also developed a peer-to-peer protocol enabling communication between devices without central control. An upcoming IST-funded project, SENSORIA, will focus on extending applications developed with the DEGAS tools.
    Click Here to View Full Article

  • "Sampling Finds Federal Data Mining Fails to Assure Privacy Protections"
    Associated Press (08/30/05); Sniffen, Michael J.

    The Government Accountability Office (GAO) conducted a study of five federal agencies that employ data mining and issued a report on Aug. 29 concluding that none of the agencies fully comply with the Privacy Act, federal information security statutes, or government directives concerning the collection of information on citizens. Consequently, the agencies offer no guarantees that individual privacy rights are properly safeguarded. The GAO investigated an Agriculture Department Risk Management Agency initiative to detect fraud in federal crop insurance; a State Department-General Services Administration effort to monitor employees' use of government charge cards; the IRS Reveal System to spot terrorist activity, financial crimes, and fraud; the FBI Foreign Terrorist Tracking Task Force's attempts to locate terrorists in the U.S.; and the Small Business Administration's (SBA) use of its risk measurement/management system in two loan programs. The office discovered that only three agencies had prepared privacy impact assessments of their data programs, none of which were fully compliant with Office of Management and Budget (OMB) guidelines. The report found evidence indicating that all five agencies had made some progress toward security, but none had adhered to all federal and OMB privacy regulations: State had failed to perform a risk assessment to ascertain vulnerabilities and develop countermeasures; FBI and Agriculture did not test contingency plans; SBA and Agriculture did not fully document their incident response capabilities; and the IRS system was still being tested.
    Click Here to View Full Article

  • "Don't Fear Software Patents"
    Wall Street Journal (08/30/05) P. B2; Lehman, Bruce A.

    International Intellectual Property Institute Chairman Bruce Lehman says the concern over software patents is overblown, and sees no evidence that such patents have had a detrimental effect on the U.S. software industry in the 24 years since the landmark Supreme Court decision that legitimized software patenting. He points out that the U.S. currently owns at least 80% of the global software market, and notes that the accumulation of patents by major computer manufacturers in the early days did not stop former start-up companies such as Microsoft and Cisco from becoming global behemoths. Lehman frowns on the practice of keeping inventions secret in the absence of patents, as it hinders development of new products and services because of a lack of cross-vendor product compatibility. Lehman recalls that shortly after his appointment as U.S. Commissioner of Patents & Trademarks he held public hearings on software patentability in which opposition was a minority view, hearings that led to the decision to concentrate on improved training for patent examiners, the recruitment of trained computer scientists, bolstering the "prior art" database, and the release of written guidelines on examination standards. Lehman thinks opposing software patents is the wrong strategy, and instead recommends that the U.S. Patent and Trademark Office be allowed to examine patent applications faster and with better quality, as well as not be required to divert its fee revenue to the general Treasury. Although Lehman laments the European Parliament's recent failure to reach consensus on a pan-European software patenting standard, he expects the U.S. software development job market to grow as a result.

    For information regarding ACM's work in the area of intellectual property, visit http://www.acm.org/usacm/copyright.

  • "Hollywood, Microsoft Align on New Windows"
    CNet (08/30/05); Borland, John

    As part of Microsoft's promise to Hollywood studios to shield their content from video piracy, the next version of the Windows operating system, called Vista, will feature unprecedented protections. The most fundamental change will be the management of some audio and video in a new "protected environment" that will segregate applications such as plug-ins or media players from the actual media data, while "Protected Video Path" technology will try to guarantee the encryption of a video stream until it reaches the monitor or other display devices. If such encryption is not possible, the computer could completely shut down video output, if so required by content owners. As an alternative, Vista will also boast a "constriction" feature that can reduce the resolution of high-definition video rather than block the connection altogether. Copy-protected audio files will be treated to a similar process. These safeguards could reduce interoperability between Vista computers and older monitors and TVs, as well as complicate casual duplication. Digital activists and computer programmers fear that such concessions could hinder programming innovation or even prevent computer owners from accessing parts of their own systems. Microsoft says it and other consumer electronics suppliers must assure the entertainment industry that their content is adequately protected if their devices are to run the studios' planned HD content and be central elements in new digital home networks.
    Click Here to View Full Article

  • "Distance Detection May Help Secure Wi-Fi"
    IDG News Service (08/29/05); Lawson, Stephen

    At last week's Fall Intel Developer Forum, Intel senior fellow Justin Rattner unveiled a method of user identification that could lead to more secure Wi-Fi networks. The technology, known as precision location, measures the time it takes for packets to travel back and forth from an access point and uses that information to derive the user's proximity. As an alternative to GPS tracking and Wi-Fi triangulation, precision location could help administrators locate and fix hardware malfunctions in data centers. Because of the accuracy it offers, Intel says the new technology should secure home and enterprise networks better than current methods of encryption; the technology could also be used to transfer a video program from one device to another, as Rattner demonstrated the seamless transition from a television to a stationary PC to a notebook computer. Although modifications would have to be made to both access points and clients, Intel intends to submit the technology to the IEEE 802.11 group for standardization. During the same presentation, Rattner exhibited a 100MHz CMOS voltage regulator that modulates a CPU's power more quickly than current analog devices located on the opposite side of the motherboard; when coupled with a graphics and memory controller hub, the voltage regulator offers a power savings between 15% and 30% in a notebook computer, without compromising performance. Among the other forthcoming innovations from Intel is a technology that promises to detect and isolate worm attacks on PCs by analyzing traffic patterns. Intel is also at work on a new digital image search program and a method to identify environmental conditions such as heat and humidity that threaten to cause servers to overheat.
    Click Here to View Full Article
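
    The core of precision location is simple physics: if an access point can time a packet exchange precisely and subtract the client's known turnaround delay, half the remaining round-trip time multiplied by the speed of light gives the distance. The sketch below illustrates that arithmetic; the function names and the fixed turnaround delay are illustrative assumptions, not details of Intel's design.

    ```python
    # Illustrative sketch of round-trip-time ranging: convert a measured
    # packet round trip into a distance estimate, then use that estimate
    # as a crude admission check for a wireless network perimeter.

    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def estimate_distance_m(rtt_s: float, turnaround_s: float) -> float:
        """Estimate client distance from a packet round-trip time.

        rtt_s        -- measured round-trip time in seconds
        turnaround_s -- known processing delay at the client before it replies
        """
        flight_time = (rtt_s - turnaround_s) / 2.0  # one-way travel time
        if flight_time < 0:
            raise ValueError("turnaround delay exceeds measured RTT")
        return SPEED_OF_LIGHT_M_PER_S * flight_time

    def inside_perimeter(rtt_s: float, turnaround_s: float, max_m: float) -> bool:
        """Admit a client only if its estimated distance is within max_m meters."""
        return estimate_distance_m(rtt_s, turnaround_s) <= max_m
    ```

    Because radio travels about 30 cm per nanosecond, the accuracy of such a scheme hinges entirely on sub-nanosecond timestamping, which is why both access points and clients would need hardware modifications.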

  • "Research Uncovers Genetic Instructions to Build Life"
    University of Toronto (08/29/05); Kelly, Karen

    The genetic instructions for building mammalian life have been discovered by University of Toronto researchers, who accomplished this breakthrough by feeding biological data into an artificial intelligence program, as detailed in the Aug. 28 issue of Nature Genetics. University of Toronto electrical and computer engineering professors Brendan Frey and Timothy Hughes teamed up with researchers from Toronto's Hospital for Sick Children and Mount Sinai Hospital to probe samples from 37 mouse tissues for genetic instructions. Microarrays were used to illuminate DNA regions that were being read by cells in various body parts; the likely influence of a gene is indicated by similar patterns of activity in nearby regions. Frey says his group developed an AI program into which these patterns were fed, resulting in the identification of thousands of genetic instructions. Among the revelations the researchers made with this technique was the presence in a particular region of the fourth chromosome of one very long gene rather than four short genes. The long gene is now believed to play a role in the assembly of large protein molecules in the cell nucleus, and the scientists hope to uncover new insights into genetic malfunctions and the roots of disease by better understanding this gene. It was also learned through analysis with the AI program that no more protein-coding genes remain to be discovered, which Hughes says contradicts prevailing research. The discovery was made by studying the data and surmising the most probable genes based on user-program variables.
    Click Here to View Full Article

  • "Intel Helps UCSD Teach Students About Wireless, Multimedia Embedded Systems"
    Jacobs School of Engineering (UCSD) (08/26/05)

    Intel is donating $193,638 worth of microprocessor development kits typically reserved for its own developers or partner companies to the University of California, San Diego. The kits are designed for the creation of embedded systems with multimedia and wireless applications in mind, signaling a departure from the PC research that has historically claimed the lion's share of funding. The donation of the 40 PXA27x Processor Developer's Kits and attendant computer equipment comes out of a desire to "jump-start relevant curriculum and research that prepares students for tomorrow's applications," said Intel's Jerry Kissinger. The kits will be used in a biannual embedded systems course with an approximately one-to-one kit-to-student ratio. The students' projects will focus on application and systems programming with a particular concentration on embedded systems on wireless networks, addressing the power limitations of mobile devices. One proposed project will help seismologists at the Scripps Institution of Oceanography receive data instantly from seismic sensors when an earthquake occurs. Intel is hopeful that this grant will help encourage students to specialize in an area that could influence the next generation of wireless computing. The donation contributes to UCSD's long-term goal of raising $1 billion for the school's Imagine What's Next campaign.
    Click Here to View Full Article

  • "Scientists Reignite Open Access Debate"
    Financial Times (08/31/05) P. 4; Cookson, Clive

    A group of computer scientists that includes World Wide Web inventor and Southampton University professor Tim Berners-Lee yesterday issued a rebuttal to journal publishers' allegations that freely releasing publicly funded research on the Internet will ultimately destroy scientific journals. This rebuttal took the form of a letter to Research Councils UK (RCUK), urging the organization to continue supporting its proposal to mandate the deposit of research papers in open-access databases as expeditiously as possible. In its analysis, leading scientific journal publisher Reed Elsevier charged that "if the RCUK proposal was implemented, access would not increase beyond current levels; current quality assurance levels could be reduced; U.K. higher education institutes would end up paying more for articles they can already access; the continuity and completeness of the scientific record would be threatened; and the productivity of multiple stakeholders in the U.K. science research community would be reduced." The computer scientists countered that Reed Elsevier's claims have not been validated, citing evidence indicating that author self-archiving and journals can coexist. In addition, freely available, self-archived research articles can give increased exposure to authors, institutions, funders, and publishers, the computer scientists claimed.

  • "Open Source Project Aims at Middleware"
    InformationWeek (08/29/05); Babcock, Charles

    Companies' increasing dependence on middleware has led to an open source movement from the Apache Software Foundation, known as Synapse, that could eventually challenge commercial applications such as IBM's WebSphere MQ and Tibco's Rendezvous. Though Synapse is still in its embryonic stages, and has yet to be upgraded by Apache to a full-fledged project, the prospect of a universal, standardized, and open source method of brokering services on a network could alter the commercial landscape considerably. If Apache proceeds with Synapse, the project will build on Apache Axis and Infravio's X-Broker, which will lend "two mature pieces of code" to help launch the initiative and attract the attention of a community of developers, says Infravio's Miko Matsumura. Tibco's Rob Meyer claims the open source project has no support from major vendors, but Synapse has garnered backing from Iona Technologies, Blue Titan Software, and Sonic Software. Synapse could help convert legacy systems to Web services if it succeeds in creating a standardized message and brokering system that builds on current standards, such as WS-Security and WS-Policy. As the demand for service-oriented products increases, Apache may emerge as the preeminent provider of open source code if it manages to add Synapse to an open source stack of interrelated programs. Forrester Research's Michael Gouldes says, "Apache is setting a lot of the direction of where open source and Java is going."
    Click Here to View Full Article

  • "Faster Supercomputers Aiding Weather Forecasts"
    National Geographic News (08/29/05); Handwerk, Brian

    Accurate weather forecasts can mean the difference between life and death when killer storms and other dangerous meteorological conditions arise, and they also play a critical role in the global economy; for example, up to one-third of the American GDP--$3 trillion worth of goods and services--is at least partially reliant on weather. The accuracy of weather forecasts receives a boost from faster and more powerful supercomputers crunching vast amounts of data received from satellites, weather balloons, ocean buoys, mountaintop observation stations, and other sources around the clock. The National Oceanic and Atmospheric Administration (NOAA) generates approximately 200,000 distinct meteorological products daily from these observations, the number of which is expected to soon exceed 200 million per day. NOAA's primary forecasting supercomputer is IBM's Blue in Gaithersburg, Md., while the IBM White supercomputer in West Virginia serves as backup. The TOP500 List of Supercomputers ranks Blue and White as the 69th and 70th most powerful computers in the world, respectively. IBM's Dion Rudnicki reports that NOAA's machines would probably be even more capable if they were freed from their critical responsibilities. "The main characteristic for NOAA is that they live and die by a 24-by-7-by-365 clock--[and are] concerned about reliability and availability," he explains. Todd Gross, chief meteorologist for WHDH-TV in Boston, notes significant increases in forecasting precision, and remarks that the accuracy of four-day forecasts is now on a par with that of the old two-day forecasts.
    Click Here to View Full Article

  • "LANL Computers Weather Daily Cyber Assaults"
    Los Alamos Monitor (NM) (08/26/05); Snodgrass, Roger

    Los Alamos National Laboratory (LANL) runs 25,000 computers that process 850GB of data in 20 million legitimate sessions per day. Up to 15 million malicious sessions occurred during peak traffic between May and mid-August, with more than 90% of weekend activity coming from malicious sources, according to LANL statistics. Security consists of firewall networks for public areas and compartmentalization for the classified network, with passwords cryptographically generated and used only once. Other key security measures include user education, intrusion detection and prevention, rapid software patching, traps and alarms, and a standing response team that works with law enforcement and counterintelligence organizations. New Mexico Rep. William Payne (R-Bernalillo) believes LANL is too focused on defensive security and should plan some offensive maneuvers, such as sending attackers messages saying they are being monitored by the FBI or using a worm to destroy their computers. Cybersecurity is a growing issue, with the President's Information Technology Advisory Committee reporting a more than 20% annual rise in cyberattacks and 92% of companies reporting virus infections in 2003. FBI director Robert Mueller says recent cyberattacks caused a sewage system's computers to release more than 250 million tons of raw sewage onto the property of a luxury hotel, while the Slammer worm disrupted safety systems at an Ohio nuclear power plant, and hackers took control of a Russian gas pipeline for an entire day after breaking into its electronic control systems.

  • "Google Talk Gives Boost to XMPP"
    InfoWorld (08/26/05); Moore, Cathleen

    Google's venture into the IM sector will lend considerable support to the emerging protocol on which it is based: the Extensible Messaging and Presence Protocol (XMPP). XMPP and SIMPLE appeared a few years ago as the two major protocols competing for hegemony in the IM space; XMPP is based on XML, and SIMPLE, which gained an early lead with its adoption by Microsoft and IBM, is based on Session Initiation Protocol (SIP). SIMPLE is still in the development stages with the IETF, whereas XMPP was ratified last year. Google's decision to base its Talk program on XMPP is a nod toward its overall commitment to open technologies, in contrast to AOL, Yahoo!, and MSN, which have all rejected standards in an attempt to protect the users of their IM networks. Although those companies have made some overtures toward interoperability, Google Talk goes further, offering a completely open program that welcomes users of any client supporting XMPP, including Adium, iChat, GAIM, and Trillian Pro. Google Talk also supports voice communication, a capability typically reserved for SIP, and plans to include SIP support in a future update. Google's considerable market share could significantly alter the IM landscape, particularly if its Gmail application gains popularity and the other major players are forced into accepting interoperability. Though its impact on the overall IM landscape is still uncertain, Google Talk's adoption of XMPP, following on the heels of a similar move by Apple and the protocol's IETF acceptance, clearly improves the standard's position in the marketplace.
    Click Here to View Full Article
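
    XMPP's grounding in XML is what makes the protocol so easy for third-party clients to support: an instant message is just a <message> stanza that any XML tooling can build or parse. The sketch below illustrates the idea with Python's standard XML library; the addresses are made-up examples, and a real client would of course also handle authentication and streaming.

    ```python
    # Build and parse a minimal XMPP-style <message> stanza using only
    # the standard library, to show that XMPP messages are plain XML.
    import xml.etree.ElementTree as ET

    def make_message_stanza(sender: str, recipient: str, text: str) -> str:
        """Serialize a basic XMPP <message> stanza to a string."""
        msg = ET.Element("message", {"from": sender, "to": recipient, "type": "chat"})
        body = ET.SubElement(msg, "body")
        body.text = text
        return ET.tostring(msg, encoding="unicode")

    def extract_body(stanza_xml: str) -> str:
        """Pull the human-readable body text back out of a stanza."""
        return ET.fromstring(stanza_xml).findtext("body")

    stanza = make_message_stanza("alice@example.org", "bob@example.org", "hello")
    # stanza now contains something like:
    # <message from="alice@example.org" to="bob@example.org" type="chat">
    #   <body>hello</body></message>
    ```

    This textual, self-describing format is a key reason clients such as Adium, iChat, GAIM, and Trillian Pro could interoperate with Google Talk without any proprietary reverse engineering.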

  • "The Future of Computer Worms"
    IT Observer (08/30/05); Sancho, David

    Trend Micro research engineer David Sancho outlines possible future attack strategies of bot worms and what steps can be taken to counter them. He says the modular design of bot worms enables them to exploit vulnerabilities faster, which means the interim between the disclosure of a vulnerability and its exploitation will shrink in the very near future; countermeasures Sancho suggests include the immediate patching of home systems as soon as updates are available, and the deployment of software and hardware designed as protective measures against malware in corporate environments. The author thinks future worms could employ polymorphic shellcode exploit attacks, a method in which bot authors create a module that alters the exploit code so that it always varies, which could thwart vulnerability and intrusion detection systems whose effectiveness hinges on the exploit code never changing. A solution to this threat would be a tool that detects the unique compression methods used by each worm variant, and Trend Micro has a scan engine in the works that promises to spot different compression techniques before isolating specific detection patterns. Sancho also expects future worms to perform RSS feed hijacking, in which worms commandeer the existing configured RSS-feed clients to automatically download new worms and other kinds of malware. The author believes the release of Internet Explorer 7 could make RSS feed hijacking a legitimate threat, and recommends that companies implement a method to scan HTTP traffic as a protective measure.
    Click Here to View Full Article
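
    The reason polymorphic shellcode defeats fixed-signature scanners can be shown with a toy example: the same payload encoded under different keys produces entirely different byte patterns on the wire, yet each variant decodes back to identical bytes at run time. The snippet below is a harmless illustration of that principle, not real exploit code; the payload string is an arbitrary placeholder.

    ```python
    # Toy demonstration of polymorphic encoding: XOR the same payload
    # with different one-byte keys and observe that the encoded bytes
    # (what a signature scanner sees) differ, while decoding recovers
    # the identical original payload in every case.

    def xor_encode(payload: bytes, key: int) -> bytes:
        """Encode a payload by XORing every byte with a one-byte key."""
        return bytes(b ^ key for b in payload)

    def xor_decode(encoded: bytes, key: int) -> bytes:
        """XOR is its own inverse, so decoding reuses the same operation."""
        return bytes(b ^ key for b in encoded)

    payload = b"SAMPLE-PAYLOAD"
    variant_a = xor_encode(payload, 0x1F)
    variant_b = xor_encode(payload, 0x2C)

    # A scanner matching the raw bytes of variant_a will miss variant_b,
    # even though both decode to the same payload when executed.
    ```

    This is why Sancho's proposed countermeasure targets the decoder rather than the payload: the small stub that reverses the encoding is far harder to vary than the encoded bytes themselves.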

  • "Senator Coleman Denounces U.N. Internet Governance Report"
    Government Technology (08/26/05)

    Sen. Norm Coleman (R-Minn.) recently decried a U.N. proposal to wrest away governance of the Internet from the United States. The United Nations' Working Group on Internet Governance (WGIG) has issued a report suggesting that it supplant the U.S. in overseeing the evolution of the Internet, and that the responsibilities of ICANN be transferred from the State Department to the United Nations. Claiming that it is plagued by incompetence and corruption, Coleman charges that "the first priority for the United Nations must be the fundamental reform of its management and operations rather than any expansion of its authority and responsibilities." Coleman particularly objects to the notion that countries such as China and Cuba, which he criticizes as established opponents to the free flow of information, would have the same steering authority as the U.S. over one of the world's most powerful economic drivers. Coleman has introduced legislation with Sen. Dick Lugar (R-Ind.) to explore what he characterizes as the culture of corruption in the United Nations. Coleman maintains that he will carry on a dialogue with technology leaders and other countries to determine the best course of development for the Internet, and whether more legislation is appropriate, though U.N. control is "out of the question."
    Click Here to View Full Article

    For information regarding ACM's Internet governance work related to ICANN, visit http://www.acm.org/serving/IG.html.

  • "Fewer CS Majors Not a Big Concern"
    Computerworld (08/29/05) P. 19; Robbins, Virginia

    Chela Education Financing CIO Virginia Robbins argues that most corporations have little use for people with computer science degrees per se. Rather, business-oriented personnel (marketers, accountants, and the like) with computer skills are desired. Robbins does not think the decline of students graduating with degrees in computer science is a worrisome trend. "If we assume that a capitalist society will continue to reward innovation and protect intellectual property, then the best minds will continue to migrate to the best research centers, regardless of where one graduates," she reasons. Robbins cites her 21-year-old nephew and his college roommate as examples of why computer science has limited appeal. Her nephew said he only entered computer science on the off-chance he could convert such skills into a well-paying career in gaming; his roommate also chose computer science out of a love for gaming. Robbins acknowledges that not all computer science students are gamers, but says her nephew and his roommate typify a common attitude: wanting to make money from something they enjoy.
    Click Here to View Full Article

  • "Supernets for Global Research to Shine at iGrid"
    EE Times (08/29/05) No. 1386, P. 54; Brown, Chappell

    The iGrid alliance will meet next month to demonstrate global-computing applications of 10Gbps optical networks developed by researchers from 21 countries. The group has been meeting every two years or so since 1998 to prototype the transnational use of existing 10Gbps networks for scientific research, and California Institute for Telecommunications and Information Technology (Calit2) director Larry Smarr said the September conference will be a momentous event. Three grand-challenge problems to advance the global lambda grid will be presented at the meeting: constructing a terabit local-area network to serve as a test bed for terabit global networks, creating secure streaming media for super-high-definition digital cinema, and experimenting with high-definition video for virtual reality systems. The international lambda grid deviates from the current Internet by permitting direct end-to-end links at very high data rates, which must be supported by cross-continental optical backbones as well as direct fiber-optic connections to computing centers. The iGrid participants are contributing the high-data-rate optical network to grid computing, and Smarr's Calit2 center is working on the OptIPuter, an IP-based general optical-computing architecture. Smarr characterized the emergence of the lambda supernetwork as "a once-in-20-year kind of transition and...a worldwide phenomenon," noting that U.S. efforts in this vein have trailed behind those of other countries.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "New Legal Code"
    IEEE Spectrum (08/05) Vol. 42, No. 8, P. 60; Klemens, Ben

    Brookings Institution guest scholar and author Ben Klemens suggests that software developers can create innovative new products without running afoul of patent-holders by having software covered by copyright law rather than patent protection. Copyrighting inventions has the advantage of avoiding paperwork while also allowing independent invention to be used as a defense against charges of infringement. "A copyright is much less likely to stifle innovation than a patent or to impose the cost of hiring a standing army of lawyers," Klemens writes. There is no clear evidence to support the argument that the absence of patent protection would have a detrimental effect on innovation, and Klemens also notes that a copyright regime offers protection against underhanded practices such as piracy, code duplication by competitors, or theft of code by insiders. The author writes that patents cannot be applied to the decentralized software industry; most industry players realize this, which is why patent law is largely ignored. Most companies prefer to pay royalties rather than spend money to keep track of all relevant patents, or pay enormous sums to defend themselves against patent infringement claims. This has cultivated a business model in which parties file or purchase vaguely worded patents so they can make money solely by exacting royalties and infringement damages.

    For information regarding ACM's work in the area of intellectual property, visit http://www.acm.org/usacm/copyright.

  • "Visions of VoIP"
    Software Development (08/05) Vol. 13, No. 8, P. 42; Ravella, John; Falcone, Joe; Meyrick, Gareth

    Although the vision of voice over Internet Protocol (VoIP) continues to outclass the reality, the technology's potential for digital convergence, cost savings, and cool experimentation is spurring experts to predict wide consumer adoption. Since the emergence of Asynchronous Transfer Mode (ATM), the phone system has relied on virtual circuits using packet-switched architectures. Digital transmission of voice data between two points has evolved across three environments: telephony, network, and peer-to-peer (P2P). International Telecommunication Union (ITU) telephony engineers began developing standards for converged digital telephony in the 1990s, and the ultimate product was the H.323 standard for Ethernet. Internet engineers developed a more elegant standard, SIP, which together with H.323 employs standard IETF protocols such as Real-Time Transport Protocol, Resource Reservation Setup Protocol, Telephony Routing over IP, Common Open Policy Service, the Diameter protocol, and the Media Gateway Control Protocol. VoIP systems based on SIP and H.323 have done well in corporate settings. P2P-based VoIP is being developed especially for the residential sector, where installation and reliance on multiple protocols can be intimidating. The P2P Skype system, which permits all peers to dynamically participate in traffic routing, processing, and bandwidth-heavy operations, is superior to conventional VoIP in terms of firewall and network address translation traversal in the absence of end-user reconfiguration; multitiered networking whereby supernodes guarantee that each network node has full and rapid knowledge of all available resources and users; and intelligent routing for reduced latency and better call quality.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)