ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 448: Wednesday, January 22, 2003
- "Recording Firms Win Copyright Ruling"
Washington Post (01/22/03) P. E1; Krim, Jonathan
In a triumph for music labels, U.S. District Judge John D. Bates upheld the 1998 Digital Millennium Copyright Act (DMCA) yesterday when he ruled that Verizon Communications had to disclose to the Recording Industry Association of America (RIAA) the name of a customer who had downloaded a large volume of songs using the Kazaa music-file-sharing service. The RIAA has been embroiled in a long-running fight against such file-swapping services, which it claims infringe on copyrights; it is estimated that online file sharing cost the industry $5 billion worldwide in 2002. Washington lawyer Jonathan Band says the decision will give the entertainment industry the clout "to reach a class of users that they have not been able to reach until now." Verizon insisted that it is against digital piracy, but maintained that ISPs can only legally disclose customers' names if they store the unlawfully copied material on their networks. However, Bates declared that the DMCA invalidates Verizon's arguments, writing that the ISP "has provided no sound reason why Congress would enable a copyright owner to obtain identifying information from a service provider storing the infringing material on its system, but would not enable a copyright owner to obtain identifying information from a service provider transmitting the material over its system." Verizon associate general counsel Sarah B. Deutsch said that Bates has misinterpreted the law and that her client plans to file an appeal. Meanwhile, EarthLink's David Baker noted that copyright owners have the right to protect their material, but declared that the music industry "is misusing the DMCA as a sword instead of a shield."
- "As Linux Nips At Microsoft, Its Advocates Talk Numbers"
New York Times (01/20/03) P. C1; Lohr, Steve
The lineup of speakers at this year's Linux World conference in New York shows how the business community has embraced the open-source operating system, spawned by programmers devoted to a share-and-share-alike ideology. In previous years, open-source luminaries held the spotlight at Linux World, but more and more companies are now aware of the business benefits of the operating system, namely its flexibility and low cost. Linux has already proven a success on Web servers and in supercomputing clusters, and is slowly making its way further into everyday enterprise applications. Although analysts say this encroachment poses a real threat to Microsoft, the more imminent danger is to proprietary Unix flavors, such as those that power mainframe servers made by IBM, Hewlett-Packard, and especially Sun Microsystems. Developed at Bell Laboratories in the late 1960s, Unix is tailored by major technology firms for their high-end machines. Linux, on the other hand, was specially designed to run on commodity Intel-based hardware, and allows Unix applications to be easily ported to its platform. E*Trade chief technology officer Joshua S. Levine says his company last year decided to switch two-thirds of its data center infrastructure over to Linux-based Intel servers, which he says gives him more control over relationships with hardware vendors since he is not locked into a proprietary technology. Goldman Sachs recently released a report predicting that "Linux-on-Intel appears likely to emerge as the dominant platform in corporate data centers." But Goldman analyst Thomas Berquist says "what is really at risk is the concept of a proprietary operating system."
(Access to this site is free; however, first-time visitors must register.)
- "Profiling the Hackers"
Associated Press (01/20/03); Dobbin, Ben
State University of New York (SUNY) at Buffalo researchers are working on a system that can profile network users in real time and catch cybercriminals in the act. These profiles are built by tracking each command a user executes at each computer terminal; the way a person opens files, sends email, searches archives, and performs other routines is continuously watched for even the smallest deviations, and anomalous behavior--entering a zone that is off-limits or using a stolen password, for example--is reported to network administrators. "The advantage offered by this approach is that an intruder with malicious intent can be identified very early, and a system operator can contain the damage, repair it in real time, and shut out the intruder," notes Harris information security specialist Mike Kurdziel. The idea for the software originated when SUNY Buffalo computer science professor Shambhu Upadhyaya reasoned that faster, more effective monitoring was possible if user commands rather than network traffic were tracked. He says that user-profiling tools that analyze network traffic for deviations are usually 60 percent to 80 percent reliable, but the program developed at SUNY Buffalo has demonstrated a maximum reliability of 94 percent in simulation. Since 1998, the National Security Agency has designated 36 research and teaching centers, including SUNY Buffalo, to protect U.S. information technology systems.
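The article does not describe the internals of the SUNY Buffalo software; as a rough sketch of the underlying idea, a per-user profile of command frequencies could flag any command the user has essentially never issued before. All names and the threshold below are invented for illustration:

```python
from collections import Counter

class CommandProfiler:
    """Toy per-user profile of command usage; flags commands whose
    observed frequency falls far below the learned baseline."""

    def __init__(self):
        self.baseline = Counter()
        self.total = 0

    def train(self, commands):
        # Learn the user's normal command mix from a command history.
        self.baseline.update(commands)
        self.total += len(commands)

    def is_anomalous(self, command, threshold=0.01):
        # A command this user essentially never issues counts as a deviation.
        if self.total == 0:
            return False
        return self.baseline[command] / self.total < threshold

profiler = CommandProfiler()
profiler.train(["ls", "cd", "ls", "vim", "ls", "cd", "make", "vim", "ls", "cd"])
print(profiler.is_anomalous("ls"))       # False: a routine command for this user
print(profiler.is_anomalous("userdel"))  # True: never seen from this user
```

A production system would of course track far richer features (timing, file-access patterns, command sequences) rather than raw frequencies, which is presumably where the reported 94 percent reliability comes from.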
- "Uh-Oh: Spam's Getting More Sophisticated"
Computerworld Online (01/17/03); Machlis, Sharon
- "In Software Industry, a Passage to India"
Boston Globe (01/20/03) P. C1; Kirsner, Scott
Software programming is set to follow the pattern of the textile industry in the United States as the low-end, hands-on work of coding gets done overseas. But, just as with apparel manufacturers today, design, marketing, and retailing operations are likely to stay here, according to experts. Forrester Research estimates that almost a half-million U.S. tech jobs will be siphoned to other countries by 2015, mostly to India. A number of companies in Massachusetts are already making the switch, in part because of the general tech slowdown. Shikhar Ghosh, founder of e-commerce pioneer Open Market and former Massachusetts Software & Internet Council chairman, is currently selling the assets of Verilytics, his most recent effort. He decided to hold onto his company's 20 programmers in India while laying off workers in Burlington, Mass., for simple cost reasons: Whereas each U.S.-based worker costs the company $10,000 per month, the entire Indian operation, including rent, costs Verilytics just $25,000 per month. PTC, the largest software firm in the state, has over 200 Indian programmers in Pune, but vice president of research and development Rich Butler says overseas programmers in general are less productive than those close to the company's corporate base, and that complex development projects are better done in the United States. However, some firms are looking overseas for high-end skills as well, such as GGA Software in Cambridge, which keeps a staff of 120 in St. Petersburg, Russia, working on statistical modeling, drug discovery, and other scientific applications. John Russo, who heads the computer science department at Wentworth Institute of Technology, says his school is repositioning its curriculum to ensure that its graduates have skills that will remain valuable, such as an understanding of how business and IT intersect.
- "IBM Aims to Get Smart About AI"
CNet (01/20/03); Kanellos, Michael
The demand for machines with a relative sense of autonomy is growing thanks to the proliferation of the Internet, the increasing numbers of computers people are using, and the burgeoning types of data the Net supports. To help meet this demand, IBM in the near future plans to introduce the Unstructured Information Management Architecture (UIMA), an XML-based data retrieval structure designed to integrate various schools of thought on artificial intelligence. Key to UIMA is the Combination Hypothesis, which theorizes that within a relatively short time it will be possible to unify various AI approaches, including statistical machine learning and syntactical artificial intelligence. "If we apply in parallel the techniques that different artificial intelligence schools have been proponents of, we will achieve a multiplicative reduction in error rates," declares Alfred Spector of IBM Research. UIMA could, for instance, enable vehicles to automatically collate and display data on traffic conditions and automobile speed in real time, while computers could add automated language translation and natural language processing to their arsenal of capabilities. Meanwhile, Web searching could be narrowed down so that computers can refine queries to only match relevant data. AI can be whittled down to two fundamental schools: One subscribes to the statistical learning theory, which advocates memory-based learning, while the other supports rules-based intelligence that stresses contextual awareness. The basic AI architecture would have sensors gather external data and transmit it to a computer, which would coordinate the best course of action and notify human owners only when it has no other choice.
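Spector's "multiplicative reduction in error rates" reflects a standard property of combining independent, error-prone methods. The toy simulation below (a generic ensemble sketch, not IBM's UIMA) shows three independent analyzers, each wrong 10 percent of the time, whose majority vote errs far less often than any one of them:

```python
import random

def noisy_vote(truth, error_rate, rng):
    """A simulated analyzer that reports the true label except with
    probability error_rate (labels are 0 or 1)."""
    return truth if rng.random() > error_rate else 1 - truth

def majority(votes):
    """Majority vote over binary labels."""
    return 1 if sum(votes) > len(votes) / 2 else 0

rng = random.Random(42)
trials = 100_000
single_errors = combined_errors = 0
for _ in range(trials):
    truth = rng.randint(0, 1)
    votes = [noisy_vote(truth, 0.10, rng) for _ in range(3)]
    single_errors += votes[0] != truth          # error rate of one analyzer
    combined_errors += majority(votes) != truth  # error rate of the ensemble

print(single_errors / trials)    # close to 0.10
print(combined_errors / trials)  # close to 0.028 = 3e^2(1-e) + e^3 for e = 0.10
```

The ensemble errs only when at least two independent analyzers err at once, so the rates multiply; the Combination Hypothesis is essentially a bet that different AI schools' errors are independent enough for this effect to hold.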
- "Scientists Giddy About the Grid"
Wired News (01/20/03); Dotinga, Randy
Scientists who want to connect supercomputers for collaborative research have long been stymied by incompatible standards, but grid computing offers them new hope. "The assumption is that people will buy into this and we'll be one unified community," notes Dan Abrams of the University of Illinois at Urbana-Champaign. Grid computing projects such as the one University of Illinois researchers are working on will pave the way for the "democratization of computing," according to a booster at last week's GlobusWorld conference. Abrams is participating in a $10 million federal initiative to link national earthquake study centers with Globus software, a standard grid computing language that is rapidly being adopted worldwide. Meanwhile, several U.S. hospitals are planning to make thousands of mammograms available to doctors by digitally storing them in a grid system. NASA is combining supercomputer models of myriad commercial airplane engine components into a single simulation via grid computing, and astronomers are developing a virtual universe to be used as a shared resource. A National Science Foundation committee recently released a draft report recommending a $650 million investment in improved U.S. computer infrastructure and grid expansion. "The grid has the potential to transform the process of science," declares William Johnston of the Lawrence Berkeley National Laboratory.
- "Job-Rich Silicon Valley Has Turned Fallow, Survey Finds"
New York Times (01/20/03) P. C4; Fisher, Lawrence M.
Jobs in Silicon Valley fell 9 percent between the first quarter of 2001 and the second quarter of 2002, estimates a report that Joint Venture Silicon Valley will publish on Monday; the 127,000 jobs lost in this period amounted to more than half of the jobs the region had gained during the preceding boom. Industries categorized as "driving" clusters--software, chips, and computer and communications hardware--incurred a 22 percent job loss. Collaborative Economics President Doug Henton says the data conclusively shows that the valley has gone through a boom-bust cycle, but determining the next major innovation to spur economic growth is difficult. He believes it will emerge from the biomedical industry, although there are no guarantees. Nor is there a guarantee that the next big thing will come from Silicon Valley--AnnaLee Saxenian of the University of California at Berkeley cites growing competition from global centers that threatens the valley's monopoly on innovation. The survey also notes that average pay in the region fell for the second consecutive year, while demographics have undergone a radical change: Whereas 83 percent of valley residents were Caucasian in 1970, today they account for 45 percent; Asians or Pacific Islanders account for 26 percent, Hispanics 21 percent, and African Americans 3 percent. Henton points out that a shift from hardware to software production is wreaking changes throughout corporate architecture: Software firms have an average of 20 people, compared with 200 for hardware companies. The long-term prospects of these small companies once they enter the global arena are difficult to predict.
(Access to this site is free; however, first-time visitors must register.)
- "High-Tech Voting Raises Questions"
Insight on the News (01/21/03); Cherry, Sheila R.
North America's transition to computer-based voting systems has raised a number of issues, including concerns about machine failure, flawed software, and code tampering. Rebecca Mercuri of Bryn Mawr College notes that election officials who bought the new voting systems understand that the systems are subject only to internal audits, which are kept from the public unless disclosure is authorized by court order, because the companies that make the software want to protect their trade secrets. Meanwhile, AccuPoll Holding's Dennis Vadura adds that paper-verified ballots will leave an audit trail but will not guarantee the elimination of software errors or unverified code. Election Center director R. Doug Lewis says that much election software is programmed by the jurisdictions that use it, and code accuracy is only as good "as the attention span of the person on the day that they [typed it in]." He also notes there is resistance from elected officials to switch over to electronic voting, while the general public has only begun to accept the transition. Lewis also explains that, although most election errors can be traced back to the equipment, programming errors can crop up because of differing ballot styles caused by the way a specific jurisdiction groups its candidates. Eva Waskell of Computer Professionals for Social Responsibility cautions that election staff have little knowledge of fundamental security standards and procedures, so they often look to voting-system vendors for technical support. Another issue is what to do when a politician on the ballot is an owner of or stockholder in the company that makes the voting-system software, a situation that critics claim creates a conflict of interest. Election Systems & Software's Bonnie Cuellar says the voting-system certification process also leaves room for error, because independent testing authorities only test a sample of voting machines and software cartridges.
- "Digital Defenses"
Red Herring (01/13/03); Bruno, Lee
For a business's electronic defenses to continue to offer maximum network protection, adaptation is key. The challenge lies in keeping sensitive company information secure while still maintaining network openness in order to sustain commerce and communications--an impossible situation, according to most experts. Bob Blakely of IBM's Tivoli division thinks there are other, more efficient security options besides erecting firewalls; he says, "The industry is moving out of the fortress model of information security, with tall walls and moats, to a more sophisticated early-detection system that spots symptoms." In such a scenario, resilient components capable of intelligently responding to intrusions and blocking access to network resources are thought to be the most desirable security solution. Proposed Web services security measures are expected to offer a way to keep data protected during transit, but similar Web services promises have yet to be delivered. Meanwhile, a number of companies are refocusing on security products in order to get a piece of $37 billion in information and anti-bioterrorism technology funding allocated by the newly created Homeland Security Office. Some experts caution that without a coordinated, methodical development strategy, much of the government money will be spent on overlapping research efforts, diluting the value of the results. Jonathan Silver of Core Capital Partners expects more federal money to be poured into bioscience research than into information technology.
- "Where the Girls Aren't"
New York Times--Education Life (01/12/03) P. 35; Stabiner, Karen
Opinions are divided as to why computer programming is unpopular among girls: One camp subscribes to the theory that girls are socially conditioned to avoid computer science, while another reasons that they are naturally disinclined toward the field. "The wanting to know how things work, that's often what boys want to know," observes Hope Chafiian, director of technology and curriculum at Spence. Westover School principal Ann Pollina estimates that women account for fewer than one-third of all computer and information science bachelor's degrees, and just 18 percent of advanced degrees; the ratio of male computer programmers to female programmers in industry is four to one. Girls' reluctance to study programming could threaten the U.S. domination of the programming industry, according to Kurt Schleunes of the Marlborough School in Los Angeles. He believes that a lot of women are put off by the Advanced Placement curriculum, and suggests that it be revamped so that it is more girl-friendly--such revisions include a de-emphasis on mathematics and a greater concentration on practical applications. Pollina thinks that the computer curriculum must undergo a similar user-friendly retooling, and also believes the number of female computer science graduates could improve if their adult peers change their expectations. Yale freshman Kaitlyn Trigger, who studied under Schleunes, says that girls must learn programming if they are to have a successful technology career.
(Access to this site is free; however, first-time visitors must register.)
To learn more about ACM's Committee on Women in Computing, visit http://www.acm.org/women.
- "After the Copyright Smackdown: What Next?"
Salon.com (01/17/03); Vaidhyanathan, Siva
Siva Vaidhyanathan of New York University writes that the Supreme Court's recent decision to extend the term of copyright by 20 years may be disheartening for advocates of copyright reform, but notes that their movement is gaining momentum, thanks to greater public interest in the issue. Justice Stephen Breyer opined that he saw no "constitutionally legitimate, copyright-related way" in which the public would benefit from the extension, and observed that the chief beneficiaries are the copyright owners. Even Justice Ruth Bader Ginsburg, who supported the ruling, gave public interest activists fuel for their cause by expressing her faith in fair use and the idea/expression dichotomy when, ironically, both these rights are being assaulted by legislation. Fair use is being threatened with a bill that would allow copyright holders to hack into and interfere with computers suspected of unlawfully distributing content, while another bill threatens the idea/expression dichotomy by calling for the institution of a new form of intellectual property so that databases are protected. Furthermore, Ginsburg's statement that "traditional contours of copyright protection" have not been altered by Congress is false, in light of the 1998 Digital Millennium Copyright Act and its provisions, notes Vaidhyanathan. He adds that public awareness of copyright has swelled throughout the media, while legislation from Rep. Richard Boucher (D-Va.) calls for the restoration and illumination of fair use for research and education, and the labeling of materials that restrict fair use. Growing dissatisfaction among religious communities, family groups, scholars, and consumers is finding voice through organizations such as the Electronic Frontier Foundation, while the legal community has added its support with clinics offering legal aid and documentation.
"X11: Apple's Secret Formula"
CNet (01/22/03); Shankland, Stephen; Wilcox, Joe
Apple Computer this month released a beta version of the Unix windowing environment X11, which allows Unix applications to run alongside native Mac OS X applications, and provides a friendlier Unix development environment. Analysts say the Apple X11 version will help the company make inroads in the business sector, especially where users are already using Macs or PCs alongside their Unix machines. Apple X11 allows users to toggle between X11 windows and those running day-to-day software, such as Microsoft Office. In addition, users have the option of going portable with a Mac notebook computer, which is definitely not an option with larger server hardware. Apple's offering makes sense for specialized businesses that can also benefit from Apple's other strengths, such as its graphics power and relative low cost compared to Sun Microsystems servers. But Linux is already quickly taking market share away from Unix, and this year will be the first with more Intel-powered servers shipping than servers based on specialized reduced instruction set computer (RISC) processors, according to Gartner Dataquest research. Ed Peterlin, an open-source programmer working on a Mac version of OpenOffice, says Apple's X11 version adds "an air of legitimacy" to the platform and also works 10 percent to 25 percent better than other versions. The OpenOffice group expects to release a version of its productivity suite for Apple X11 this spring.
- "Cell Phone, PDA Makers Work to Find Ideal Mix of Features"
Investor's Business Daily (01/22/03) P. A1; Seitz, Patrick
The global market for converged devices--devices that combine cell phone and handheld computer functionality--will boom from 4 million units sold in 2002 to around 59 million units in 2006, predicts International Data (IDC). However, equipment manufacturers agree that no single type of device will be able to accommodate all consumers: As a result, they are designing devices that fulfill a primary use, such as voice calls, Web browsing, video games, text messaging, or data storage. At the recent Consumer Electronics Show, Microsoft Chairman Bill Gates said he expects manufacturers to tinker with form factors, while Dell Computer CEO Michael Dell announced plans to enhance future generations of his company's Axim Pocket PC with wireless communications. Leading handheld computer maker Palm is convinced that consumers favor keeping cell phones and personal digital assistants (PDAs) separate, while senior VP of marketing Kenneth Wirt maintains that attempts to converge the two led to design and usability trade-offs. His company believes the largest current market opportunity is for PDAs that use Bluetooth to connect to small phones, and Wirt insists such devices make no compromises. Meanwhile, handheld computers equipped with larger displays are being prepared for market by Samsung and BSquare; both will offer products with built-in high-speed wireless Internet communications. Top cell-phone manufacturer Nokia is developing an array of products with assorted bells and whistles, including a mobile video game device with cell-phone capability as a secondary feature.
- "Multimedia Programming Comes in New FLAVOR"
NewsFactor Network (01/21/03); Martin, Mike
Formal Language for Audio-Visual Object Representation (FLAVOR) is an open-source extension of C++ and Java that can be used to formally describe coded multimedia bitstreams, such as those underlying the JPEG, GIF, and MPEG formats. "Most multimedia and network data are coded into bitstreams in order to minimize the cost of storage and transmission," notes FLAVOR inventor and Columbia University electrical engineering professor Alexandros Eleftheriadis. He says the purpose of his project is to let developers manipulate bitstream data with a minimum of time and effort. Eleftheriadis observes that C++ and Java lack native facilities for handling such data, and outlines the two-part procedure that codec or application developers must currently follow to provide this capability: They must create software that accommodates bitstreams, and then deploy code that adheres to the syntax of the format at hand. FLAVOR descriptions can be converted by built-in translation software into standard C++ or Java code that reads and writes the corresponding bitstreams. FLAVOR enables programmers to replace "ad hoc descriptions of bitstream syntax with a well-defined and concise language," according to Columbia University computer scientist Danny Hong. Thanks to FLAVOR, any application developer can immediately access multimedia content for the purposes of editing, indexing, filtering, or searching, Eleftheriadis says.
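FLAVOR's own declarative syntax is not shown in the article, but the "ad hoc" hand-written parsing it is meant to replace looks like the sketch below, which decodes the fixed header fields of a GIF stream with Python's struct module (Python stands in here for the C++/Java code a developer would otherwise write by hand):

```python
import struct

# A minimal GIF89a header: 6-byte signature/version, then the logical
# screen descriptor: width, height (little-endian), packed flags,
# background color index, pixel aspect ratio.
sample = b"GIF89a" + struct.pack("<HHBBB", 320, 240, 0xF7, 0, 0)

def parse_gif_header(data):
    """Hand-written bitstream parsing of the kind FLAVOR replaces
    with a declarative format description."""
    signature, version = data[:3], data[3:6]
    if signature != b"GIF":
        raise ValueError("not a GIF stream")
    width, height, packed, bg_index, aspect = struct.unpack_from("<HHBBB", data, 6)
    return {
        "version": version.decode("ascii"),
        "width": width,
        "height": height,
        "has_color_table": bool(packed & 0x80),  # top bit of the packed byte
    }

print(parse_gif_header(sample))
# {'version': '89a', 'width': 320, 'height': 240, 'has_color_table': True}
```

Every format demands code like this, with byte offsets and bit masks transcribed by hand from the specification; FLAVOR's pitch is that a concise description of the syntax can generate the equivalent read/write code automatically.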
- "Reaching for the W-Band"
eWeek (01/13/03) Vol. 20, No. 2, P. 35; Carlson, Caron
The FCC is considering licensing the upper-millimeter wave band, or W-band, to enterprises as well as carriers. Industry advocates claim that this would boost bandwidth capabilities while lowering costs, while others characterize the technology as "personal broadband" that would enable companies to set up alternative Internet routes, accelerate connection speed at less cost, and implement broadband on demand. John Ernhardt of Cisco Systems says the W-band would "allow more gigabit speeds," while BGI and its affiliate i-Fi are among the private companies urging the FCC to create an affordable site-by-site W-band licensing scheme. Supporters say site-by-site licensing will benefit communications currently hampered by bandwidth strictures, boost efficient spectrum usage by letting enterprises sell idle bandwidth, improve security, and promote operational expansion and disaster recovery processes. The commission has suggested that the W-band be auctioned off, although private-sector support for this proposal appears to be nonexistent. Still, the FCC's Lauren Kravetz Patrich maintains that "We're trying to get away from where we're telling users what they can do on what spectrum." Cisco is trying to persuade the FCC to consider all industry enterprises in its W-band deliberations, not just service providers. The FCC could also allow anyone to use the W-band free of licensing, but most industry boosters argue that a certain degree of regulation is necessary to ensure W-band reliability.
- "Security's Next Steps"
InfoWorld (01/13/03) Vol. 25, No. 2, P. 1; Fonseca, Brian
New security tools are being developed to face new kinds of electronic threats. Security experts such as Raleigh Burns of Northern Kentucky's St. Elizabeth Medical Center expect future security products to have simpler features and smoother integration, a trend that Carrie Jensen-Badaa of Barclay's Global Investors also expects to see. Meanwhile, quantum encryption products such as Navajo from MagiQ Technologies promise a new level of security governed by the laws of quantum physics. The act of reading quantum-encrypted information disturbs the quantum states that carry it, so any eavesdropping attempt is detectable. There are, however, drawbacks: Navajo can only span a distance of 30 km between two devices, although MagiQ CEO Bob Gelford plans to extend that range to 100 km within a year; meanwhile, Gartner's Ray Wagner argues that the application of quantum encryption will be mostly limited to the U.S. government because of affordability issues. Several companies are planning to debut additional security in their firewall platforms--NetScreen Technologies CTO Nir Zuk says his company is combining its platform with intrusion detection safeguards from OneSecure, while Microsoft's recently released Feature Pack 1 for ISA server offers intuitive security management by embedding application-layer security into the firewall. Future ISA server versions will include SOAP and XML filtering, and .Net Framework integration in order to accommodate Web services and address myriad nuisances that plague customers. A number of vendors are trying to penetrate the market with biometric security products, which have yet to be widely accepted because of concerns about accuracy. One product that could do well comes from Ultra-Scan--a fingerprint scanner that uses sound waves so that dirt, grime, oil, and other contaminants do not affect the scan.
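MagiQ has not published Navajo's internals, but the eavesdropping-detection property described here is the basis of the BB84 quantum key distribution protocol. The classical toy simulation below illustrates the mechanism: an intercept-and-resend eavesdropper who must guess measurement bases unavoidably corrupts a measurable fraction of the bits the legitimate parties later compare:

```python
import random

def qkd_error_rate(n_bits, eavesdrop, rng):
    """Toy intercept-and-resend simulation in the style of BB84:
    a photon measured in the wrong basis yields a random result, so
    an eavesdropper disturbs roughly a quarter of the bits on which
    the legitimate parties' bases agree."""
    errors = matched = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)        # sender's raw key bit
        a_basis = rng.randint(0, 1)    # sender's encoding basis
        b_basis = rng.randint(0, 1)    # receiver's measurement basis
        basis, value = a_basis, bit
        if eavesdrop:
            e_basis = rng.randint(0, 1)
            if e_basis != basis:       # wrong-basis measurement randomizes the bit
                value = rng.randint(0, 1)
            basis = e_basis            # photon is re-sent in the eavesdropper's basis
        outcome = value if b_basis == basis else rng.randint(0, 1)
        if a_basis == b_basis:         # only matching-basis positions are kept
            matched += 1
            errors += outcome != bit
    return errors / matched

rng = random.Random(7)
print(qkd_error_rate(4000, eavesdrop=False, rng=rng))  # 0.0 on an undisturbed channel
print(qkd_error_rate(4000, eavesdrop=True, rng=rng))   # roughly 0.25 with an eavesdropper
```

The parties simply sacrifice a sample of their shared bits and compare them in the open: an error rate near zero means the channel was untouched, while a rate near 25 percent exposes the interceptor.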
- "Hardware Hangover"
IEEE Spectrum (01/03); Goldstein, Harry
With spending on corporate hardware falling off as a result of the economic recession and the maturation of the IT market, tech companies are focusing on software and services that help enterprises unify and boost the efficiency of existing systems. New equipment is more likely to be purchased if companies can save operational costs by consolidating their IT assets. Software-oriented storage is becoming more prevalent--in fact, software and services will account for half of storage vendors' revenues by 2005. General Motors followed a consolidation strategy that merged four email systems into one and 14 computer-aided design (CAD) systems into a single platform, and cut its product development cycles from four years to 18 months. Others are using blade servers to reduce data center overcrowding that often results from the migration of standalone servers to a centralized facility. Meanwhile, major IT vendors are marketing enterprise management and maintenance automation software: Hewlett-Packard is helping businesses combine their computing and storage assets into data and storage centers, Sun Microsystems is promoting grid computing through its N1 initiative, and IBM set up an independent autonomic computing program in the fall. Companies such as GM are investing in storage-area networks (SANs) to increase storage utilization, but Gartner's Roger W. Cox says that there is little money to spend on more disks. Virtualization software, which consolidates separate storage pools into a single platform, has also begun to have an impact. Cox notes that IBM, HP, Veritas Software, and EMC are developing products that enable single-screen management of multiple storage assets.
- "How You'll Pay"
Technology Review (01/03) Vol. 105, No. 10, P. 50; Schwartz, Evan I.
Manufacturers are in a race to develop high-tech payment systems that offer superior security, versatility, and convenience. Smart cards, which come equipped with both microprocessors and memory chips, have become commonplace in Europe, but their adoption in the United States has been hindered by implementation costs; however, advances in computing power, falling prices, and the potential for even more applications are encouraging signs, as is the desire among large companies and government agencies to distribute smart cards to employees for security reasons. U.S. smart card initiatives include the SmarTrip Metrorail system in Washington, D.C., and Target Stores' distribution of Visa smart cards and smart-card readers to customers. Competing developers such as ExxonMobil are doubtful that smart card technology will catch on because of its complexity, and so have developed an alternate device outfitted with a radio transponder, enabling users to pay for gasoline by waving it in front of a pump equipped with a radio receiver, for example. ExxonMobil's Speedpass is used by almost 6 million people, and the company plans to branch the system out to other retailers. A third technology is Dallas Semiconductor's iButton, a microchip-equipped steel canister that can hold electronic cash, coupons, IDs, and other kinds of data; the manufacturer says the iButton is more robust and cheaper than plastic cards, and there are no transaction fees. It is unlikely that any one of these technologies will become predominant. Smart cards have made a huge splash in Europe since their introduction over 10 years ago: One of the technology's more notable advantages is that it precludes phone-based ID verification and stymies criminals who like to "clone" credit cards.