ACM TechNews is published three times a week: Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to email@example.com.
Volume 5, Issue 485: April 21, 2003
- "Cracking the Productivity Paradox"
Financial Times (04/21/03) P. 6; Abrahams, Paul
Although major IT players are convinced that IT boosts productivity, evaluating IT-based productivity improvement is difficult due to a lack of an accepted system of measurement. Cisco Systems, Intel, Xerox, Hewlett-Packard, and Microsoft are all founding members of the Information Work Productivity Council, an effort established to coordinate academic research dedicated to solving this dilemma. The project, which will be overseen by MIT's Sloan School of Management, includes the establishment of the Center for Information Work Productivity, which will be tasked with studying how to assess and improve IT-related productivity. The undertaking represents the IT industry's realization that customers are unconvinced that IT products and services can generate solid returns, and addressing the problem has become even more crucial because of the IT slump of the last two years. MIT professor Erik Brynjolfsson says that IT investment alone does not guarantee improved productivity; it must be accompanied by organizational restructuring. "Technology requires changes in the way humans work, yet companies continue to inject technology without making the necessary changes," notes Xerox CEO Anne Mulcahy. Brynjolfsson, who will head the new MIT center, will conduct a survey of best practices that will be organized into a publicly available library, as well as devise a technique to effectively measure IT-related productivity improvements. IBM is expected to join the Information Work Productivity Council soon, adding its weight to that of members such as BT Group, Accenture, and SAP.
- "Once-Dashing Chip Turns Out to Be Just A Face in the Crowd"
Wall Street Journal (04/21/03) P. B1; Gomes, Lee
The popularity of custom-designed application specific integrated circuits (ASICs) has waned in the last two years due to factors unrelated to the economic recession. The growing complexity of computer chips, as dictated by Moore's Law, has stifled hopes of achieving cheap and simple mass production, while rising costs and longer design cycles have also made a negative impact. The cost of ASIC "masks" used in factory production has risen from a few hundred thousand dollars per unit to around $1 million, while ASIC design now takes months instead of mere weeks. Meanwhile, markets for products that were thought to have been ideal for ASICs--DVDs, for instance--have grown so dramatically that new chip makers are creating specialized chips exclusively for them, allowing customers to eliminate the time and cost of ASIC design. Programmable logic devices (PLDs) that can be customized instantly on a PC are also gaining market share thanks to accelerated performance. iSuppli forecasts that ASICs will experience slight growth for a few years, and then begin a final plummet while PLD sales skyrocket. The decline of ASICs will not hit consumers particularly hard. Leading ASIC manufacturers such as IBM are refocusing on other areas, while smaller firms such as LSI Logic are developing hybrid ASICs that stress simplicity.
- "The Paradox of Privacy"
CNet (04/18/03); Kanellos, Michael
Faster processors, more abundant data storage, and better database technology all mean that those interested--governments, companies, and other people--have easier access to personal data. Personal vendettas can now result in an enemy's personal information being posted online, while job applicants can use the value of an interviewer's house, culled from appraisal databases, as a baseline for salary negotiations. Privacy is a topic of growing importance in the computing industry, and IBM focused on privacy at its recent Almaden Institute symposium. The RSA Security Conference followed the IBM symposium, giving experts another opportunity to discuss privacy concerns. IBM fellow Rakesh Agrawal said work is continuing on a randomizing technique that would render database information forever anonymous while retaining its value to those studying aggregate trends. Sun Microsystems distinguished engineer Whitfield Diffie spoke at the Almaden symposium about future surveillance systems that can recognize people by facial recognition and other identifiers, saying that such systems would obviate the need to carry ID, but warned that technology will advance so that computers would be able to do some "mind-reading" by comparing archived facial expressions. However, the technology that could destroy privacy is the same that is helping researchers find a cure for Severe Acute Respiratory Syndrome (SARS) by cross-referencing patient data. Cryptology expert Bruce Schneier also pointed out that the government and corporate entities still have a long way to go in terms of using collected data effectively, while CheckFree chief privacy officer John Tomaszewski noted that legal and regulatory threats would keep companies from treading on individual privacy excessively.
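The randomizing technique Agrawal describes can be sketched in miniature: perturb every individual record with independent zero-mean noise, so that no single stored value can be trusted, while aggregate statistics such as the mean survive because the noise averages out. The code below is only an illustrative sketch of that general idea, not IBM's actual algorithm; the salary data and noise scale are invented for demonstration.

```python
import random
import statistics

def randomize(values, noise_scale, rng):
    """Perturb each value with independent zero-mean Gaussian noise.

    Individual records become unreliable, but aggregates (e.g. the
    mean) are approximately preserved because the noise cancels out.
    """
    return [v + rng.gauss(0, noise_scale) for v in values]

rng = random.Random(42)  # fixed seed so the sketch is reproducible
true_salaries = [rng.uniform(30_000, 90_000) for _ in range(10_000)]
masked = randomize(true_salaries, noise_scale=20_000, rng=rng)

# Any one masked record may be off by tens of thousands of dollars,
# yet the aggregate mean barely moves.
print(round(statistics.mean(true_salaries)), round(statistics.mean(masked)))
```

The trade-off is tunable: a larger noise scale gives individuals stronger anonymity but demands more records before aggregate trends become reliable again.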
- "What's Holding Broadband Back?"
IDG News Service (04/18/03); Gross, Grant
IT and telecommunications executives gathered at the FCC's Technological Advisory Council meeting on Thursday discussed issues holding back broadband adoption. Attendees cited a lack of compelling content, the price of new infrastructure, and government regulation as hindrances to more people signing on. U.S. broadband penetration will lag behind eight other countries by 2005, according to a University of Colorado study that also put the number of connected U.S. homes at just 17.6 percent; the same survey found 44 percent of respondents had dial-up Internet. Intel Capital's Sriram Viswanathan said telemedicine applications could spur adoption, and demonstrated an application turning magnetic resonance imaging scans into 3D models of the body. Other council members noted that most of the current broadband demand is stimulated by free music downloads, which much of the industry is trying to stop. AT&T Broadband's Nick Frigo said the business case for laying new optical fiber infrastructure was very difficult to find. An alternative technology called PowerWiFi allows ISPs to provide Wi-Fi connections from transmitters attached to power lines. Phil Hunt, CEO of Amperion, which markets PowerWiFi, said the technology would not compete directly with either cable broadband or DSL, but that utility companies are interested in providing PowerWiFi as a value-added service.
- "Minority Women Perceive IT as Way to Promised Land"
Penn State researcher Dr. Lynette Kvasny says women in differing income brackets have markedly disparate views of IT: Minority women in low-income communities believe IT can be a ticket to upward mobility, while middle-class, predominantly white women think IT has few advancement options, indicating that IT and gender studies should consider women a heterogeneous group rather than a generalized, homogeneous one. Kvasny says, "Populations of women have different and competing perceptions about technology's potential impact on their life experiences." Kvasny's conclusions are based on interviews with African American women participating in a computer-training course two years ago, and were presented at ACM's "Freedom in Philadelphia: Leveraging Differences and Diversity in the IT Workforce" conference on April 12. Kvasny notes that IT skills could do more for minority women than allow them to earn higher salaries; they could also allow them to focus on their personal as well as community assets in order to improve their lot. They could, for instance, enable them to lobby for neighborhood bus stops, take legal action against delinquent landlords, or learn how to file for child support. In addition, minority women in Kvasny's study think they would be able to establish deeper relationships with their computer-savvy children through their IT training. Kvasny says that minority women view IT as a "promised land" that can allow them to overcome societal and economic hurdles.
Click Here to View Full Article
- "Modern Organizations Adapt and Respond in the Information Age"
Two of the eight projects that the National Science Foundation (NSF) is funding under the Management of Knowledge-Intensive Dynamic Systems (MKIDS) program are being overseen by Carnegie Mellon University's Kathleen Carley and Stanford University's Ray Levitt, who are trying to optimize organizational structures and processes that can adapt and respond rapidly to changing situations. The goal of the MKIDS program is to leverage IT to simplify processes for organizations--the intelligence community, news media, multinational research corporations, and the like--whose effectiveness depends on rapid response. The systems being investigated by MKIDS are expected to represent a quantum leap beyond data-mining systems; they would, for instance, use scheduling tools to enable decision-makers to fine-tune organizational processes by dynamically apportioning technology services as well as physical and human resources, and track the company's response to the scheduling changes and outline ways to refine the process and retain best practices. Also on MKIDS' agenda is developing methodologies to represent, store, and exchange knowledge acquired by the systems to ensure internal scalability, more effectively measure performance, and give managers more distributed decision-making capability. Levitt's team wants to design, from the bottom up, organizations that have no weaknesses, and is using the NSF funding to expand tools for building small-scale project teams to account for interaction- and relationship-based effects within global and multicultural activities. Carley's project constructs computational models out of an organization's external data sources that delineate the organization's internal architecture and can identify potential "failure points."
- "Light Bulbs Being Replaced by Microchips"
San Francisco Chronicle (04/15/03) P. B2; Feder, Barnaby J.
Experts believe that light-emitting diodes (LEDs) will eventually render conventional lighting technologies--lightbulbs, fluorescent lamps, neon tubes, etc.--obsolete and revolutionize lighting applications. "We are not talking about replacing lightbulbs," explains former Bell Labs researcher Arpad Bergh. "We are talking about a totally new lighting industry." Light-emitting microchips are more efficient and longer-lasting than their non-electronic counterparts, and gradually fade rather than suddenly burn out. Furthermore, they can be easily integrated with computers and software programs, which enables them to switch among more than a million shades of color instantaneously. Experts predict that the proliferation of LEDs will save billions of dollars in yearly energy and maintenance costs, and usher in the use of lighting systems to influence people's moods. Altering the lighting's color or intensity could, for example, enliven or calm nursing home residents, make movies more exciting, and give videogame users a more immersive experience. Building designers and architects hope LEDs will allow them to set up indoor lighting systems that deliver more natural illumination. Light-emitting chips are expected to penetrate the home and office by 2007.
Click Here to View Full Article
- "NeXT Still Stands Out in Its Mac Incarnation"
SiliconValley.com (04/20/03); Gillmor, Dan
NeXT technology has risen from the ashes as the cornerstone of Apple Computer's OS X operating system, writes Dan Gillmor. He characterizes NeXT as ahead of its time when it was launched about 10 years ago, and lists stability and a coherent graphical user interface among its key advantages. It also featured an advanced toolkit that enabled programmers to build applications with little difficulty, but NeXT's high costs prevented it from becoming a hot seller. Apple's acquisition of NeXT in 1997 resurrected the technology for the Mac environment. Gillmor comments that NeXT's basic architecture has been retained as well as improved and expanded. Stone Design's Andrew Stone enthuses that the current Mac architecture and programming tools will support a "samurai" model of software development, in which the building of new applications will be opened up to individual programmers or small teams, rather than restricted to giant corporations. NeXT-based products that are currently being marketed include AquaMinds' NoteTaker software and Circus Ponies' NoteBook, which capture information--text files, audio recordings, Web pages, etc.--and arrange it into a coherent, searchable scheme. Stone Design sells Create for Mac OS X (formerly TextArt) as well as other products marketed for Web site designers and graphic artists.
Click Here to View Full Article
- "Distributed Computers Power New Search Engine"
New Scientist (04/17/03); Knight, Will
LookSmart's Grub distributed computing project is cataloging vast numbers of existing Web pages in the hopes of building an expansive, constantly updated search engine that may one day compete with Google. As with other distributed computing projects, Grub taps into idle computing power and Internet bandwidth to catalog pages via software deployed on roughly 1,000 PCs worldwide. The software trolls the Web and gathers data on thousands of pages every hour, and sends that information to a central database. The Grub homepage claims that more than 36 million Web pages have been cataloged in the last 24 hours. The public can also freely access the Web page data Grub collects and incorporate it into Web sites and desktop applications. Antony Rowstron of Microsoft's Cambridge research laboratory notes that the success of the Grub project rides on the generosity of the volunteers who install the Grub software on their computers. LookSmart has indicated that Grub may eventually include sophisticated mechanisms that Google uses to rank Web page significance. However, Search Engine Watch editor Danny Sullivan told Wired News that the Grub software could be changed to give certain sites preference over others.
- "Perception May be Nano's Biggest Enemy, Leaders Tell Congress"
Small Times (04/10/03); Brown, Douglas
Lawmakers and leading figures from industry and research convened to discuss the potential environmental, ethical, and societal ramifications of nanotechnology before the House Science Committee on Wednesday prior to its vote on the Nanotechnology Research and Development Act of 2003. A major issue for some speakers was the negative perception of nanotech popularized in the media: Vicki Colvin of Rice University's Center for Biological and Environmental Nanotechnology cited Michael Crichton's techno-thriller "Prey" as an example; she warned that this fictional account of swarms of killer nanorobots, and how seriously readers take it, could become "a public relations nightmare." Meanwhile, Langdon Winner of Rensselaer Polytechnic Institute was skeptical that nanotech was the best area to spend public R&D funding in these turbulent times. "Any broad attempt to relinquish nanotechnology will only push it underground, which would interfere with the benefits while actually making the dangers worse," declared artificial intelligence pioneer Ray Kurzweil, who added that the science will only flourish through open debate. Foresight Institute President Christine Peterson pushed for funding of a "feasibility review" where advocates and opponents of molecular manufacturing can present their technical cases to a panel of impartial physicists. The Nanotechnology Research and Development Act of 2003 would make the National Nanotechnology Initiative (NNI) a permanent component of the federal government, while Sens. Ron Wyden (D-Ore.) and George Allen (R-Va.) support similar legislation in the Senate. NNI Director Mike Roco estimated that the program's 2002 spending budget on research into nanotech's societal and educational impact was $30 million, while its projected 2003 and 2004 budgets are $35 million and $40 million, respectively.
Research spending on nanotech's environmental implications climbed from $50 million to $55 million between 2002 and 2003, and is on track to reach $60 million next year.
Click Here to View Full Article
- "NASA Hopes to Improve Computers With Tiny Carbon Tubes on Silicon Chips"
Scientists at the NASA Ames Research Center think they could extend Moore's Law thanks to a manufacturing breakthrough that allows copper interconnects on silicon chips to be replaced by carbon nanotubes. Meyya Meyyappan, director of Ames' Center for Nanotechnology, explains that the research effort that yielded the breakthrough was driven by high-performance computing needs for autonomous spacecraft. NASA researcher Jun Li notes that carbon nanotube interconnects, unlike their copper counterparts, can conduct high currents without corroding; he adds that the new process also eliminates the need to etch grooves in the chips for the copper conductors. The nanotubes are chemically "grown" on the surface of the wafer, and then a silica layer is added to fill the gaps between the tubes, after which the surface is buffed flat. Interconnected layers of electronics can be built up in a cake-like architecture using vertical nanotube "wires." "The bottom line is that computer chips with more layers and smaller components can do more for us," remarks Meyyappan. "While we are working on carbon nanotube-based chips for long-term needs, we also are indirectly helping industry to keep silicon-based computer chips in use as long as possible." The research center announced the breakthrough in the April 14 edition of Applied Physics Letters.
Click Here to View Full Article
- "Edges of Magnetic Tape Key to Boosting Data Density"
Newswise (04/17/03); Gorder, Pam Frost
A nine-month study conducted by Ohio State University engineers concludes that the data density of magnetic tape is significantly affected by a key manufacturing process. Ohio State professor Bharat Bhushan and graduate student Anton Goldade report in the April issue of Tribology Letters that microscopic study of magnetic tape shows evidence of tears and grooves along the edge that are likely caused by dull rotating blades used to cut the tape during the assembly phase. These tears may be minor at first, but repeated use could enlarge them enough to damage the coating of magnetic material, leading to a loss of data. The Ohio State researchers have devised methods to measure the quality of a tape edge and gauge factors that can impact tape's condition after prolonged exposure. Bhushan and Goldade believe that tape factories could incorporate these methods into their quality control processes. Certain magnetic tape cartridges can store 200 GB of data, but some industry experts expect hard disks to wrest control of the data storage market away from tape. Bhushan notes that competing with hard disks means increasing tape's storage capacity, which entails adding even more tracks while simultaneously reducing their width. The American magnetic tape industry is currently worth $6 billion.
- "Technology of Many"
Computerworld Singapore (04/17/03) Vol. 9, No. 22; Chua, Louis
Amorphous computing is what lies beyond today's distributed computing efforts, including Web services, grid computing, and peer-to-peer technology. Amorphous computing, also dubbed swarm computing, relies on multitudes of relatively weak nodes that are programmed with basic software instructions; together, these networks can achieve far more than stronger nodes could separately. The concept challenges existing frameworks, especially in software design, which has become too fragile and obscure in efforts to reduce computer operations. Today, hardware capabilities have far outstripped those of software, and new software concepts must be created with amorphous computing in mind: Sun Microsystems' Jini technology, for example, assumes the network will not always be working. Scientists have already begun using biological organisms, such as Escherichia coli bacteria, to create chemical components of biological computer devices. Biochemical gates in these bacteria can perform AND, NOT, and IMPLY logic operations. Amorphous computing is becoming reality in concept as well, demonstrated by projects such as Eyebees.com, which allows groups of like-minded Internet users to collaboratively surf the Internet. Networking technology has also made significant real-world impacts in societies where most people use cell phone text-messaging, in popular uprisings, and even in the current Iraq war; reports say Iraqi defenders used text-messaging to warn about a U.S. helicopter attack and prepare a defense that significantly damaged most helicopters in the group.
Click Here to View Full Article
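The AND, NOT, and IMPLY operations attributed to the engineered E. coli gates are ordinary Boolean functions; only the substrate is exotic. Below is a sketch of the logic those gates compute (the biochemistry itself is, of course, not modeled here):

```python
def gate_and(a: bool, b: bool) -> bool:
    # AND: true only when both inputs are true
    return a and b

def gate_not(a: bool) -> bool:
    # NOT: inverts its single input
    return not a

def gate_imply(a: bool, b: bool) -> bool:
    # IMPLY: false only when the premise holds and the conclusion fails
    return (not a) or b

# Truth table for IMPLY, the least familiar of the three
for a in (False, True):
    for b in (False, True):
        print(f"{a!s:5} IMPLY {b!s:5} = {gate_imply(a, b)}")
```

NOT and IMPLY together are functionally complete, so in principle any Boolean circuit could be composed from such biochemical primitives.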
- "Can Your PC Become Neurotic?"
Darwin (04/03); Georges, Thomas M.
Neurotic computers may seem like the stuff of science fiction, but intelligent, autonomous machines will be able to design and retool their programs in response to changing environments and situations. In his book, "Digital Soul: Intelligent Machines and Human Values," Thomas M. Georges writes that such a development could lead to unpredictable and undesirable computer behavior if internal directives come into conflict with external instructions, or if external instructions from different sources are inconsistent and confusing. The solution is to design monitoring programs that can identify and clear up conflicts. Such programs would serve the same function as cognitive therapy in humans, which is used to correct deeply embedded attitudes dictated by historical or genetic pressures that cause phobias and other mental disorders characterized by behavior inappropriate to the situation. There is a growing fear that self-replicating nanobots running amok will trigger a doomsday scenario in which humanity cannot hope to stop them from reproducing to overwhelming numbers. Such machines could be unleashed accidentally or be reprogrammed by malevolent parties. Countering such fears will depend on a clear effort to address control issues in nanotech development. Solutions include built-in programs that detect and correct deviant behavior, while nanobot populations could be maintained at safe levels through the use of "sterilization" programs.
- "Mission Impossible?"
IEEE Spectrum (04/03); Kumagai, Jean
Revelations that 9-11 could perhaps have been averted if the FBI's information resources and infrastructure were better organized have prompted Bureau director Robert S. Mueller III to institute a wide-ranging effort to modernize the agency's computer systems; key to the success of such an overhaul will be resolving myriad technical issues, as well as quelling critics who feel civil liberties and privacy are being compromised by such measures. Ronald Kessler, who wrote "The Bureau: The Secret History of the FBI," thinks the tech-savvy Mueller offers the FBI strong leadership, an element it has sorely lacked and which is chiefly responsible for its continued reliance on an antiquated computer system. Since the fall of 2001, Mueller and other agency leaders have been pushing an agenda that includes deploying the $534 million Trilogy network infrastructure, which aims to securely link all FBI personnel to headquarters and field and satellite branches; replacing the Bureau's Automated Case Support database with the Windows-based Virtual Case File System; recruiting 350 intelligence analysts and 900 special agents whose expertise covers counterterrorism, engineering, physical sciences, computer science, foreign languages, and military intelligence; and building a corps of officers tasked with detecting and gathering information from FBI probes and sharing it with the intelligence community. Perhaps the most controversial reform involves the use of data mining and data warehousing to find patterns that could signal criminal or terrorist activity. Such techniques would be used to search more than a billion records currently stored in multiple databases, while raw data would be automatically sorted and organized by text-mining software.
Experts such as the Federation of American Scientists' Steven Aftergood urge that the Bureau install "realistic error-correcting procedures" in order to reduce or eliminate the identification of innocent people as terrorists, which could be the result of intentional abuse as well as statistical or data-processing mistakes. However, one computer expert is convinced that the FBI will be "totally ineffective in its professed purpose [of catching terrorists] but too effective as a domestic police state tool."
Click Here to View Full Article
- "The Observant Computer"
Technology Review (04/03) Vol. 106, No. 3, P. 66; Zacks, Rebecca
Alex Waibel, director of Carnegie Mellon University's Interactive Systems Laboratories, is trying to eliminate forced interaction between humans and machines by developing an observant computer that can study human behavior and deduce how to serve people better. He characterizes the template of such a system as "A good butler or a good secretary--someone who invisibly hovers in the background, guesses your very needs, and serves them up before you even ask." Waibel notes that an observant computer must acknowledge someone's presence in the immediate vicinity, as well as track their movements and activities. To do this, a desktop computer has been enhanced with a camera and image-processing software so that it color-codes the people it observes and distinguishes faces by tracking a person's eyes, nostrils, and mouth corners with white marks. Waibel's lab has also developed a number of translation systems that could enable an observant computer to comprehend speech in multiple languages. A special meeting room has been set up to improve a computer's ability to observe people and discern their identity, emotions, and focus of attention through the use of microphones to record sound and a camera that furnishes a simultaneous view of everyone gathered around a conference table. So that the machine can determine a person's focus of attention, Waibel says that his team has equipped it with "a statistical model that combines head direction over time together with some notion of what potential targets of interest are--human faces and people speaking, for example." The visual and audio data the computer picks up is fed into an array of such models so it can infer what is going on.
- "Trends Shaping the Future: Technology Trends"
Futurist (04/03) Vol. 37, No. 2, P. 30; Cetron, Marvin J.; Davies, Owen
Technology's increasing influence on the economy and society will have a wide array of benefits over the next 20 years, including minimization of industrial pollution, new jobs and business opportunities, and greater profitability balanced by lower prices for goods and services thanks to automation. Computers will be more deeply embedded into the environment; robots will replace people in many high-risk professions; the engineering, technology, and health sectors will experience rapid growth; and superconductors and other sophisticated devices will shrink and become standard elements of commercial products. As research and development's economic benefits grow, low-wage laborers in low-wage countries will start earning more, high-tech talent in those same nations will continue to move to the United States and other higher-wage countries, and fields with a high rate of R&D return-on-investment will be a haven to scientists, engineers, and technicians. Reduction of design and marketing cycles, fostered by computer-aided design, means that early adopters of cutting-edge techniques will flourish. Advanced technologies are making air, rail, and road travel safer, and making vehicles more efficient and longer-lasting, thus accelerating travel and shipping; effects of this trend should include fewer auto accidents and plane crashes, less demand for oil, urban efforts to reduce traffic congestion, and investments in alternate modes of transport. Rapid medical advances--nanotechnology, prosthetics, designer drugs and drug-delivery systems, artificial blood, and so on--will have several negative consequences, but the benefits will include an increase in life span. The growth of the Internet will mean more job opportunities for technically inclined people over the next 15 years, and the erosion of cultural, political, and social borders.
- "Computer Science Prize to Honor 3 Forerunners of Internet Security"
New York Times (04/14/03); Markoff, John
ACM plans to announce that Ronald L. Rivest, Adi Shamir, and Leonard M. Adleman will receive the 2002 A. M. Turing Award for their development work in public-key cryptography. The award, which carries a $100,000 prize financed by the Intel Corporation, is given annually to leading researchers in the field of computer science. Working at the Massachusetts Institute of Technology in 1977, the three men developed the RSA algorithm, which is widely used today as a basic mechanism for secure Internet transactions, as well as in the banking and credit card industries. The strength of this approach is that it provides highly secure communications over distances between parties that have never previously been in contact. Dr. Rivest now teaches in the electrical engineering and computer science department at M.I.T. Dr. Shamir is a professor in the applied mathematics department at the Weizmann Institute of Science in Israel. Dr. Adleman is a professor of computer science and of molecular biology at the University of Southern California.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
For more information about the Turing award announcement, visit http://www.acm.org/announcements/turing_2002.html
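The RSA algorithm the award honors can be demonstrated with textbook arithmetic: a public key (n, e) lets anyone encrypt, while the private exponent d (the modular inverse of e) is needed to decrypt. The toy primes below are purely illustrative; real deployments use primes hundreds of digits long, and this "textbook" form omits the padding schemes practical systems require.

```python
# Textbook RSA with toy primes -- illustrative only, far too small to be secure.
p, q = 61, 53
n = p * q                   # public modulus (3233)
phi = (p - 1) * (q - 1)     # Euler's totient of n
e = 17                      # public exponent, chosen coprime to phi
d = pow(e, -1, phi)         # private exponent: modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)     # c = m^e mod n, computable by anyone

def decrypt(c: int) -> int:
    return pow(c, d, n)     # m = c^d mod n, requires the private key

message = 42
ciphertext = encrypt(message)
assert ciphertext != message
assert decrypt(ciphertext) == message
```

The strength the article cites, secure communication between parties who have never met, comes from the fact that publishing (n, e) reveals nothing practical about d unless the attacker can factor n into p and q.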