Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to technews@hq.acm.org.

Volume 5, Issue 522:  Monday, July 21, 2003

  • "Senate Votes to Deny Funding to Computer Surveillance Effort"
    Washington Post (07/19/03) P. E1; Partlow, Joshua

    The Senate has voted against funding a controversial computer surveillance program backed by the Pentagon. In a setback for what is now called the Terrorism Information Awareness program, the Senate added a provision denying funding for the effort to its $369 billion military spending bill, which it passed unanimously. Both the Bush administration and the Defense Advanced Research Projects Agency, the Pentagon research arm that oversees the program, had called on the Senate to remove the provision, which denies $54 million for research and development of the initiative over three years. The Pentagon wants to create a database of credit card bills, airline records, and other health, educational, and personal information that it could comb to identify patterns of terrorist-related activity. The initiative also would include long-distance surveillance technology that could identify people by their gait, or even by the irises of their eyes. However, bipartisan opposition to the program, which features an all-seeing-eye logo and the slogan "knowledge is power" in Latin, has grown in Congress and among privacy groups. The House, in its defense appropriations bill, gives the Pentagon more leeway to pursue the program, but a provision would prevent it from using the technology on U.S. citizens without congressional permission.
    Click Here to View Full Article

    For more information on TIA, visit http://www.acm.org/usacm/Issues/TIA.htm.

  • "Apple Co-Founder Creates Electronic ID Tags"
    New York Times (07/21/03) P. C3; Markoff, John

    Steve Wozniak's Wheels of Zeus company has unveiled its first product, a wireless tracking system for personal property, pets, and people. WozNet, as the system is called, involves tags attached to the person or thing to be tracked, base stations that broadcast one or two miles on the unlicensed 900 MHz band, global positioning satellite signals, and a back-end system that can send users email notices about errant dogs or students arriving at school. Wozniak says the idea was to build an inexpensive system that could keep track of things simply. WozNet data is encrypted for each individual user, but the system can be adjusted to accommodate neighborhood watch-type applications. Wheels of Zeus plans to install multiple base stations connecting to one another, thereby extending WozNet's range and usefulness, since chips can communicate with any WozNet base station. Each tracking chip costs less than $25 to make, and the simple 20,000 bits per second signal is meant to cut through crowded radio environments. Those familiar with the development of WozNet say Wozniak was integrally involved in the hardware design. He is known for simple and elegant solutions, such as Apple's early graphics and display innovations and disk controller design. After a serious plane crash in 1981, Wozniak left Apple to start Cloud 9, a technically astute company that failed commercially. Although happy as an elementary school teacher during the 1990s, he admits, "I was itching a little for another startup experience. Also, it was easy to laugh and enjoy the ideas that the first few of us were coming up with, this time."
    http://www.nytimes.com/2003/07/21/technology/21ZEUS.html
    (Access to this site is free; however, first-time visitors must register.)

  • "Hard-Disk Drive Industry Braces for Technology Changes"
    EE Times (07/18/03); Ojo, Bolaji

    New serial technologies are set to replace standard SCSI and ATA (Advanced Technology Attachment) interfaces over the next two years, even as hard-disk drive manufacturers prepare for an entirely new form of bit storage. Perpendicular recording will replace longitudinal recording in storage devices, placing bits on end instead of laying them flat on the disk surface, thus dramatically increasing the possible storage density. Whereas longitudinal recording will max out at about 100 Gbits to 200 Gbits per square inch, perpendicular recording will enable up to 1 Tbit per square inch, according to Seagate Technology's Gary Gentry. Working prototypes for perpendicular recording have been made, but disk-drive makers Seagate and Hitachi Global Storage Technologies continue to invest in longitudinal recording advances. Gentry explains that in the hard-disk industry, reliability is of paramount importance and that technologies are introduced only at a mature stage. Meanwhile, new serial interfaces promise higher transmission speeds for today's hard disks while eliminating some of the electrical interference and bandwidth bottlenecks that have plagued parallel interfaces. In the PC market, Apple's new G5 system uses Serial ATA (SATA), while Intel systems are expected to use the new interface by year's end. Although it got a later start, Serial Attached SCSI (SAS) technology is making faster headway because developers can leverage preliminary work done on SATA. Gentry expects SATA to result in 2.5-inch disk drives in the enterprise market, which will improve performance density and lower the overall cost of ownership.
    http://eetimes.com/sys/news/OEG20030718S0038
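The areal-density figures above translate directly into platter capacity. As a rough back-of-the-envelope sketch (the platter dimensions below are illustrative assumptions, and formatting overhead is ignored), capacity per recording surface is just areal density times the usable annulus area:

```python
import math

def surface_capacity_gb(areal_density_gbit_per_in2, outer_d_in=3.74, inner_d_in=1.0):
    """Rough capacity of one platter surface in gigabytes: areal density
    times the usable recording area (an annulus), converted bits -> bytes."""
    area_in2 = math.pi * ((outer_d_in / 2) ** 2 - (inner_d_in / 2) ** 2)
    return areal_density_gbit_per_in2 * area_in2 / 8

# At longitudinal recording's ~100 Gbit/sq-inch ceiling, one surface of a
# hypothetical 3.74-inch platter holds on the order of 125 GB; at the
# 1 Tbit/sq-inch promised for perpendicular recording, roughly ten times that.
print(round(surface_capacity_gb(100)), round(surface_capacity_gb(1000)))
```

Real drives yield less than this idealized figure because of servo, ECC, and formatting overhead, but the tenfold density jump carries over directly.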

  • "Two Men, Two Ways to Speak Computerese and Two Big Successes"
    Wall Street Journal (07/21/03) P. B1; Gomes, Lee

    Larry Wall and Guido van Rossum each created their open source scripting languages, Perl and Python, respectively, in the 1980s and earned the loyalty of users by providing an easy way to write code without extensive computer science expertise. Despite the lack of support among many university computer science programs, Perl and Python--and other scripting languages such as PHP, Ruby, and TCL--still appeal to software developers who are looking for a free-and-easy programming language. Although originally used for smaller tasks, scripting languages are now capable of handling the same robust applications as Java or C, though a graphics-intensive computer game, for example, might run more slowly. The autoexec.bat batch file found in Windows and DOS systems is an example of script code's humble origins. Perl and Python each represent a certain programming mindset, with Perl being more flexible and permissive and Python requiring strict discipline. Wall finds Perl analogous to his evangelical Christian faith, saying, "God doesn't enforce morality on us, but prefers to let us choose it. And if we are created in God's image, we ought to extend the same courtesy." Now looking for a new sponsor after a stint with the nonprofit Perl Foundation, Wall says he is working on a major redesign of Perl to compensate for bad decisions made early on.

  • "CMU Lands $7 Million Federal Contract for Software That Thinks"
    Pittsburgh Business Times (07/16/03)

    Carnegie Mellon University has won a $7 million contract to develop software that will enable computer systems to learn by receiving advice and instruction from human masters. Awarded by the Defense Advanced Research Projects Agency (DARPA), the contract calls for the university's School of Computer Science to develop the Reflective Agents with Distributed Adaptive Reasoning (RADAR) system, which will move computers closer to operating as thinking machines, transforming how people interact with them. Carnegie Mellon intends to have RADAR help sort email, plan meetings, allocate resources, maintain Web sites, and write reports. RADAR will be part of DARPA's Perceptive Assistant that Learns (PAL) program, which benefits cognitive information processing research such as artificial intelligence, machine learning, and human-computer interaction. The PAL program will produce technology that will be helpful to the military, businesses, and academia. DARPA, a federal agency, also awarded a $22 million contract to SRI International, which will use the money to develop software that enables computers to apply reason to unexpected circumstances.
    Click Here to View Full Article

  • "Voting Machines Need Paper Trails"
    SiliconValley.com (07/20/03); Gillmor, Dan

    California Secretary of State Kevin Shelley is set to issue rules for so-called "Direct Recording Electronic" (DRE) machines that tally election votes digitally. The case represents the ongoing battle between eager advocates of updated technology and those who believe more security needs to be built into the system, writes Dan Gillmor. Computer experts have largely come down on the side of slower adoption, saying paper trails need to be created, such as with confirmation tickets issued to voters immediately after they cast their votes. Meanwhile, groups such as the American Civil Liberties Union and the Leadership Conference on Civil Rights have either filed lawsuits or taken an aggressive stance against slow adoption of DRE equipment. Critics of current DRE systems say no technology is completely failsafe and that private companies' assurances should not be trusted blithely. Secretary Shelley commissioned a task force to study DRE machines, which reported that DRE systems did need improved testing and certification processes. However, only three of the panel's nine members insisted on a paper ticket, which could be useful in validating any contested election results, as well as in increasing voter trust. Another argument critics make against undocumented DRE voting is that any failure could significantly undermine voter confidence. On the national level, U.S. Rep. Rush Holt (D-N.J.) has introduced the "Voter Confidence and Increased Accessibility Act of 2003," which would require that DRE systems leave a paper trail for the 2004 elections. Gillmor writes that even a hint at the possibility of fraud is justification for proceeding slowly with new voting technology and heeding the warnings of computer experts who argue that deploying untraceable computer-based voting systems could lead to an "electoral train wreck."
    http://www.siliconvalley.com/mld/siliconvalley/6344554.htm

    For more information on e-voting technologies and standards, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "Researchers Study Internet Trust Issues, Create Solution"
    Purdue Exponent (07/18/03); Bennett, Kathryn

    Purdue University computer science researchers are working on creating a more trustworthy environment for online transactions. Using mathematical models, the researchers are examining how to collect more reliable information about a given user's trustworthiness, based on buying history instead of nominal required identification, for instance. While such problems plague common online transactions, such as those completed over eBay, Purdue computer scientist Leszek Lillien says trust between organizations is also important, and often breaks down because of mechanical errors. For example, network connections can break down and leave database security holes. Formal schemes can minimize the participants' vulnerability by requiring more background information from transacting parties. Purdue communications professor Glenn Sparks says average American users still have serious doubts about their security in an online environment, but the growing success of online auctions and improving technology will overcome those hindrances. Mike Atallah, professor of computer science, says computer-related trust can be undermined not by technology, but by human interactions. He says, "The fact that [businesses or individuals] don't trust each other has nothing to do with the data...There is this fear that the other party will leak the information." Research led by Purdue professor Bharat Bhargava aims to decrease an entity's vulnerability to leaked information, thereby increasing trust in computer systems.
    Click Here to View Full Article

  • "Universities Compete for NSF Nanotech Program Worth Millions"
    Small Times (07/03); Kelly, Matt

    The National Science Foundation (NSF) is ready to award funding to a new university consortium that will carry forward nanotechnology research for the next decade. The new National Nanotechnology Infrastructure Network (NNIN) will replace the previous consortium, comprising five schools and led by Cornell and Stanford universities. The NSF is required to seek new contracts every 10 years, and is taking the opportunity to expand the scope of its university nanotechnology research program, focusing more on life sciences and energy fields, for example, and researching social implications. "The nanotech community wants to avoid the huge problems the genetically modified food people got themselves into," says Arizona State professor Trevor Thornton, whose school is part of a bidding group led by MIT. Control and measurement techniques will also be pursued, instead of only the basic creation of nanostructures. So far only two bidding groups have been confirmed--one comprising the original group plus five other universities, and another led by MIT and the University of California, Berkeley. Links to industry are a significant factor in consideration, and NSF officials can pick out components of different bids to create a new contract. Funding for the new group will also be doubled, from $5.6 million each year previously to about $14 million per year for the NNIN.
    Click Here to View Full Article

  • "The Merging of GPS and the Web"
    TechRepublic (07/14/03); Cagle, Kurt

    Corporate technology leaders still have yet to exploit the integration of global positioning system (GPS) technology and the Web, though the technical standards for doing so are emerging. Map applications, for example, could be integrated with GPS through scalable vector graphics (SVG), where doors, walls, escalators, and other features are assigned specific coordinates. The GPSml markup language proposed to the W3C by Charon, Inc. and based on XML would allow developers to assign Web services applications to a specific physical location, so people with a tracking application could get updated directions based on their location, among other things. Because each GPS unit has a unique identifier, there is the potential that Web services tied to GPS could be exploited by marketers, pulling up a customer profile associated with the identifier each time its owner nears a store. An identifier's importance could be better understood by associating it with a Uniform Resource Identifier (URI), which could be used to assign applications to specific locations so they will be triggered when those locations are referred to. GPSml would provide the Web services document for a number of possible context-relevant applications. The Really Simple Syndication (RSS) format currently popular in the blogging community could be a way to publish Geeplogs, or GPS-enabled blogs that describe an area, or perhaps a person's travel through an area.
    Click Here to View Full Article
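The core mechanism described above, triggering an application when a GPS fix enters a registered location, can be sketched in a few lines. This is not GPSml or any real Web services API; the URI names, coordinates, and registry below are hypothetical, and the geofence test is just a great-circle distance check:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical registry mapping a location URI to a geofenced service.
SERVICES = {
    "urn:example:store-entrance": {"lat": 40.7580, "lon": -73.9855, "radius_m": 100},
}

def triggered_services(lat, lon):
    """Return the URIs of every registered service whose geofence contains the fix."""
    return [uri for uri, s in SERVICES.items()
            if haversine_m(lat, lon, s["lat"], s["lon"]) <= s["radius_m"]]
```

A GPSml-style system would presumably describe such regions in an XML document and dispatch to actual Web services; the sketch shows only the location-triggering logic that makes the rest possible.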

  • "Sensors Guard Privacy"
    Technology Research News (07/23/03); Patch, Kimberly

    Tracking the location of people accurately via sensor networks while keeping their identities private is the goal of software designed by researchers at the University of Colorado at Boulder. "Our work applies...anonymity or depersonalization on-the-fly to a stream of location data before it can be stored in a database that might be exposed to inside attacks or...inadvertent data disclosures," notes researcher Marco Gruteser, who says the technique could be ready for practical applications in three to six years. The software devised by Gruteser and colleagues allows the accuracy of location data to be automatically adjusted, while overt identifiers such as names can be removed thanks to current sensors' computational abilities. "The algorithm monitors the overall number of people and adaptively changes the precision of reported locations--say, from a room level to a building floor level--to maintain the predefined minimum level of privacy," Gruteser explains. Furthermore, data gathered by the network is distributed among multiple sensor nodes, thus preventing hackers from learning private information even if they compromise individual nodes. However, Gruteser concedes that the scheme is only appropriate for applications that are not dependent on user identification and can accommodate less accurate location information--applications such as facility usage surveillance, vehicular traffic surveillance, monitoring available meeting rooms and offices, and gathering retail store traffic statistics. Gruteser adds that organizations should have sensor networks certified by third-party agencies if they wish to collect data gathered by those networks. Purdue University computer science professor Gene Spafford cautions that the privacy-aware location sensor network approach may be too opaque for numerous applications, and adds that ensuring all sensors are tamper-proof may be an unrealistic expectation.
    Click Here to View Full Article

    Gene Spafford is co-chair of ACM's U.S. Public Policy Committee, http://www.acm.org/usacm.
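The adaptive-precision idea Gruteser describes can be illustrated with a minimal sketch. This is not the researchers' actual algorithm: their system reasons about rooms, floors, and buildings, whereas the grid here is a naive decimal-rounding scheme, and all parameter names are invented. The loop simply coarsens the reported cell until at least k people fall inside it:

```python
def cloak(lat, lon, others, k=5, max_digits=4):
    """
    Report a coarsened (lat, lon) cell for one subject, dropping decimal
    digits until the cell covers at least k people (the subject plus any
    other known fixes), or None if no precision level suffices.
    """
    for digits in range(max_digits, -1, -1):
        cell = (round(lat, digits), round(lon, digits))
        # Count the subject plus every other fix that rounds into the same cell.
        population = 1 + sum(1 for (la, lo) in others
                             if (round(la, digits), round(lo, digits)) == cell)
        if population >= k:
            return cell
    return None  # privacy constraint cannot be met at any supported precision
```

Coarser cells trade location accuracy for anonymity, which is why such a scheme suits aggregate applications like traffic or facility-usage statistics rather than services that must pinpoint an identified user.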

  • "Your House May Track Your Movements"
    ZDNet Australia (07/17/03); Pearce, James

    Pervasive computing could benefit from research in Australia involving the use of sound to track people as they move around a building. Dr. Arkady Zaslavsky, associate professor at the School of Computer Science and Software Engineering at Monash University, supervised a research project that sought to track and even predict the movements and direction of a person according to their voice. "The main aspect [of the project was] to address some aspects of pervasive computing, namely, use the ability of mobile software agents to track objects of interest, and in this particular case, using sounds those objects make," says Zaslavsky. The Australian researchers used Voice Micro-edition to demonstrate and implement the program, which was done on two IBM computers in different rooms. According to IBM, pervasive computing turns everything from cars, refrigerators, and desk lamps into computing devices, as a result of having computing chips installed in them and a connection to the Internet. In addition to mobile software agents, the research project made use of active microphones for computers. Zaslavsky envisions the technology allowing an email window to follow someone throughout the office. Similarly, a home-based worker who leaves her office to get a snack would have an incoming document delivered to a unit in the kitchen as a result of the technology.
    Click Here to View Full Article

  • "Grid Computing: Fulfilling the Promise of the Internet"
    EarthWeb (07/14/03); Chui, Willy

    Grid computing allows the Internet to realize its full potential, turning the network itself into a powerful computing platform available anytime, anywhere. On its 30th anniversary, the Internet allows collaboration and sharing throughout the world through the Web and email, and is becoming an efficient platform for older communication systems such as radio, television, and telephone. All of these innovations are contingent upon the open standards and collaborative culture that continue to drive Internet development. Now, increases in processing and bandwidth capacity, together with open standards and collaboration, are making grid computing a reality. With grid computing, the Internet makes available the idle capacity, applications, and data on connected machines to the entire network. As with the creation of the Internet itself, grid computing technology began in academic circles for high-end applications such as physics and life sciences modeling. And in the same way businesses caught onto the promise of the Internet, commercial applications for grid computing are emerging in the financial services, manufacturing engineering, and even gaming sectors. Butterfly.net, for instance, utilizes a grid infrastructure to provide a platform for massive multiplayer online games. Key to the maturation of grid computing, however, are open standards and autonomic infrastructure, one example of which is IBM's Blue Gene supercomputer. In that experiment, researchers are making 8 million discrete computing elements work together autonomously.
    http://itmanagement.earthweb.com/erp/article.php/2234691

  • "Doctoral Student Developing New Facial Recognition System"
    Newswise (07/16/03)

    Virginia Tech Ph.D. engineering student and Sagem Morpho senior product engineer Creed Jones is developing software involving highly complicated mathematics in order to improve face recognition systems, whose accuracy can be affected by varying lighting conditions, camera angles, or facial expressions. "We need new and better algorithms and theories to focus on the problem, and we need better software," argues Lynn Abbott of Virginia Tech's Bradley Department of Electrical and Computer Engineering, who serves as Jones' thesis advisor. Abbott explains that the hardware component is relatively easy, but the software needs to meld statistical analysis and artificial intelligence analysis. Jones is working on a face recognition tool that is adapted for color by modifying Gabor filters, which are designed to handle black and white images. "Everyone currently uses black and white images in biometrics and this presents a problem with skin tone," the doctoral student notes. Biometrics technology has become ubiquitous, and comprises a large component of the security push that has thus far characterized the 21st century. Jones says that legislation on the use and distribution of biometric surveillance data will need to be devised to quell societal concerns.
    http://www.newswise.com/articles/view/?id=500150
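The Gabor filters mentioned above have a standard textbook form: a sinusoid windowed by a Gaussian envelope, convolved with the image to pick out oriented texture at a chosen scale. The sketch below shows that generic monochrome building block, not Jones' color-adapted version; the parameter values are arbitrary, and filtering each color channel separately is only one naive way to extend it to color:

```python
import math

def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0):
    """Build a real-valued (cosine-phase) Gabor kernel: a sinusoid at angle
    theta, windowed by a Gaussian. Returned as a size x size list of floats."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates so the sinusoid runs along angle theta.
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            carrier = math.cos(2 * math.pi * xr / wavelength)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

def convolve_channel(image, kernel):
    """Valid-mode 2-D convolution of one image channel (list of row lists)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        out.append([
            sum(kernel[u][v] * image[i + u][j + v]
                for u in range(kh) for v in range(kw))
            for j in range(len(image[0]) - kw + 1)
        ])
    return out
```

A recognition pipeline would typically apply a bank of such kernels at several orientations and wavelengths and feed the filter responses to a statistical classifier, which is where the heavy mathematics the article alludes to comes in.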

  • "Survival Guide: John McCarthy, Executive Director of the Critical Infrastructure Protection Project"
    Washington Technology (07/07/03) Vol. 18, No. 7, P. 38; Wait, Patience

    The Critical Infrastructure Protection Project, a joint venture between James Madison and George Mason universities in Virginia, seeks to shield the United States' critical infrastructure and solve cybersecurity problems through an interdisciplinary research initiative, and former Critical Infrastructure Assurance Office staff member John McCarthy serves as the project's executive director. McCarthy, who also coordinated cybersecurity readiness planning as an auxiliary to the Assistant to the President for Y2K, says the joint university program is unique in that it attempts to meld law, policy, governance, and technology under one banner. Over 30 research projects are being conducted under McCarthy's program, and McCarthy insists that all of them are important, although categorizing their importance is difficult, given that there are several research layers to consider, such as awareness. Among the projects being undertaken is an analysis of Internet infrastructure in order to map out the country's telecommunications framework, which will help researchers understand how the destruction of key facilities would disrupt Internet connectivity and performance. McCarthy is impressed by the cybersecurity movement's progress over the past seven years, from a relatively obscure issue to a leading point of the Clinton and Bush administrations' agendas. "At this point, it would be almost awkward to have a highly centralized White House element with a department trying to build a cyber and physical security structure," McCarthy notes. "We applaud Dick Clarke for getting this agenda moving; now it's time to put it back in a departmental structure and let it grow from there." McCarthy agrees with the national cybersecurity strategy, but says that a vital element is missing--an extensive conceptual architecture that is universally accepted.
    Click Here to View Full Article

  • "Digital Homes"
    Business Week (07/21/03) No. 3842, P. 58; Edwards, Cliff; Weintraub, Arlene; Kunii, Irene M.

    The concept of the networked home--a living space characterized by "ambient intelligence" where digital devices intuit and respond to user needs--is growing in popularity thanks to a surge in broadband adoption, the development of more user-friendly technology, and a shift to more entertainment-oriented PCs concurrent with the rollout of more PC-like consumer electronics. Networking a home extensively is an expensive proposition, and hurdles such as standards incompatibility and navigating through a mire of wires or wireless software codes still need to be overcome. Nokia, Sony, Hewlett-Packard, and other companies founded the Digital Home Working Group in June to deploy guidelines to establish interoperability between networked devices. Digital home residents interviewed by BusinessWeek explained that their desire for technology depends on its ease of use and how it empowers the user. The living room is becoming the hub of the digital home: The growth of DVDs is spurring consumers to upgrade their living room technology to create a more movie theater-like experience; cable and satellite companies and content suppliers hope to make a mint by supplementing this experience with on-demand movies and games delivered via the Internet, while PC makers think that "entertainment PCs" will reenergize sales. Wi-Fi, in conjunction with mobile devices, is allowing Internet access to migrate out of the study and into the bedroom, while automated climate control currently has little appeal for modestly-budgeted consumers. Manufacturers say there has been less enthusiasm for a digital kitchen because women are less willing than men to deal with technology implementation headaches; nevertheless, Sears and other retailers see potential in enhancing kitchen appliances such as ovens and refrigerators with radio-frequency identification tags, remote diagnostic systems, and other technologies. 
    Another trend poised to affect the move toward the digital home is e-health, in which high-tech gadgets and network connections are being tested and used for assisted living: Certain e-health applications may show up in the bathroom, such as toilets that measure sugar levels in urine and transmit the data to physicians over the Net.
    Click Here to View Full Article

  • "Every Move You Make"
    New Scientist (07/12/03) Vol. 179, No. 2403, P. 40; Knight, Will

    Despite his crushing defeat by Deep Blue six years ago, world chess champion Garry Kasparov insists that computers have given chess new life--indeed, they have effected a radical transformation of the game. The addition of computers and more sophisticated programs has changed the way players can prepare for matches--Kasparov notes, for instance, that the ChessBase program, which he encouraged engineers to design, now features a database of over 2.7 million games, providing players with a wealth of information that can aid them in their strategies. Five years ago Kasparov proposed a new chess concept in which a human player and a computer would team up to combat another man-machine duo. "The way the game is designed we can compare human intuition and computer calculating power," he exclaims. Kasparov believes his tournament with Deep Junior, which ended in a draw, was valuable from both a scientific and a societal standpoint, while demonstrating that humans are still superior to machines, provided they do not make large errors. "The long-term experiment will be whether, on his best day, the best human player assisted by a machine to prepare the game could beat the best machine," Kasparov explains. He says the game of chess helps frame the debate over whether artificial intelligence should be evaluated by performance or results, and adds that he thinks results are the more important factor. Kasparov acknowledges that machines are starting to exhibit personalities--for example, he is able to immediately recognize the Deep Junior and Deep Fritz programs based on the moves they make.

  • "Security Lockdown"
    InfoWorld (07/14/03) Vol. 25, No. 27, P. 42; Yager, Tom

    The 2003 InfoWorld Security Survey of over 500 IT executives and strategists finds that readers' security tactics have changed, with greater emphasis on internal security and less investment in outsourcing. Forty-nine percent of respondents report that their current security solutions are very effective--52 percent say fewer than 100 intrusions were attempted on their networks in the past 12 months, while 63 percent estimate that fewer than 10 attacks penetrated their security. Readers concur that security does not rely on technology alone, but rather on a mix of technology, corporate policy, and education. Eighty-four percent of respondents list viruses and worms as the No. 1 security threat because of the annoyance factor, while the No. 1 internal security problem is unintentional employee error. The poll indicates that hacking is losing its appeal in the face of tougher penalties for computer crimes, more vigorous law enforcement, and the roster of talent that agencies employ to interrupt, track down, and apprehend hackers. Fifty-one percent of respondents report that their firms handle all security matters internally, while just 2 percent have fully outsourced security to third parties. However, 79 percent say the biggest security hurdle they will have to overcome in the next year is that employees generally downplay the importance of complying with corporate security policies, a problem reinforced by a lack of enforcement staff. Author Tom Yager offers several recommendations for establishing an effective security strategy: Employees need to be kept apprised of security policies, wireless networks should be thoroughly planned and tested prior to live deployment, and WLAN security must be upheld even at the cost of user convenience; a heavier concentration on qualified security personnel than on software and hardware will reduce accidental security breaches.
    http://www.infoworld.com/article/03/07/11/27FErrintro_1.html

  • "Holographic Data Storage: When Will It Happen?"
    Photonics Spectra (06/03) Vol. 37, No. 6, P. 54; Anscombe, Nadya

    The commercialization of holographic data storage (HDS) is being held back by a lack of funding as well as problems in finding the best recording materials, be they crystals, polymers, or glasses. Photorefractive crystals were highly touted as an HDS medium for a time, but low sensitivity and the need for holograms to be embedded within the crystals discouraged investors from funding startups such as Optostor. Polymers are divided into two classes for HDS--photopolymers and photochromic polymers; photopolymers, in which bonds coalesce or break in the presence of light, are the most promising technology, while photochromic polymer technology can support rewritable HDS because it is reversible. InPhase Technologies chief scientist Bill Wilson expects his company to launch its first commercial HDS product in 2005. Polymers, however, are highly susceptible to shrinkage, a defect that glasses do not have; moreover, glasses have better optical properties and a smaller thermal coefficient compared to polymers. InPhase is developing both the HDS material and the drive technology itself. Holographic drive development is being led by Asian companies such as Optware, which has designed a storage system that eliminates much of the clunkiness associated with the technology. Many companies expect to have HDS products on the market within two years, but Barry Schechtman of the Information Storage Industry Consortium has his doubts. "If HDS does not come onto the market by 2005, other technologies will overtake, even though HDS looks better on paper," he says.