Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published every week on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 473: March 24, 2003
- "Warfare Enters the Digital Age"
USA Today (03/24/03) P. 3B; Acohido, Byron
The U.S. military is vastly changed from the force it was in 1991, when commanders had comparatively little visibility into the battlefield and little ability to coordinate strikes on small, moving targets. Today, however, the armed forces tout a style of "netcentric warfare" that allows fast response to more precise incoming data. U-2 spy planes and Global Hawk surveillance drones beam digital pictures of possible target sites to military leaders, while aircraft such as the Boeing AWACS and Joint STARS cruise at high altitudes and use radar to scan the air and ground for moving enemy forces. Unmanned aerial vehicles (UAVs) such as the Predator drone fly over targets and stream video and position coordinates to command centers, which then relay targeting information to fighter-bombers such as the F-15E. Those planes carry 2,000-pound JDAM ordnance, which has an in-flight GPS guidance system made from commercially available parts. Experts say the military's adoption of off-the-shelf components has helped keep its IT capabilities in line with commercial advances, which are characterized by fast-growth drivers such as Moore's Law. Previously, Pentagon leaders insisted on specialized technology and equipment that proved expensive and slow to develop. The armed forces are also employing commercially available IT behind the front lines, such as in the supply chain: Wi-Fi bar-code scanners are used in the field to track supplies and equipment moving to the front, similar to systems employed by UPS and Federal Express.
- "IT Burnout at Critical Level"
NewsFactor Network (03/21/03); Maguire, James
Employee burnout stemming from disenchantment, overwork, and the like is on the rise in the IT sector, according to analysts. A Meta Group report authored by Maria Schafer finds that most surveyed IT managers regard burnout as a major issue--enough so that many firms have started to seriously evaluate the problem and even take steps to remedy it. She estimates that 24 percent of the companies polled have improved employee retention programs, while 5 percent relocated to assuage worker dissatisfaction and bring in more IT talent. The report also indicates that 55 percent of the respondents have initiated training programs, though Schafer acknowledges that such programs are hard to afford, given worker layoffs. Meanwhile, tighter budgets are limiting salary growth, although 17 percent of companies are focusing on wages. Still, Schafer points out that average salaries for IT professionals remain higher than those of other company employees, and predicts that demand for IT workers will climb back up, given the trend's history. She comments that even before the recession hit, IT employees were feeling stressed out; in fact, the average IT professional clocks in between 55 and 60 hours each week. A subsequent study will focus on IT burnout's corporate ramifications, one of which could be a reduction in productivity, according to Schafer. "For some of the companies, who were at the mercy of engineers for three or four years [during the boom period], this is fair play in their eyes," notes Yankee Group analyst Zeus Kerravala.
- "War Images Give New Purpose to High-Speed Web"
New York Times (03/24/03) P. C1; Kirkpatrick, David D.
Broadband Internet usage is getting a big boost from war coverage, much as the CNN cable news network did during the first Persian Gulf war in 1991. Online news organizations such as Yahoo!, AOL, CNN.com, ABCNews.com, and RealNetworks have assembled online montages of live video feeds, interactive maps and displays, and other streamed multimedia. Many of the sites offer upgraded versions of their services for a monthly subscription fee, usually about $5. Reuters recently unveiled a "raw video" offering on its Web site where users could watch four Reuters camera feeds unedited, often before the footage was available on TV. One British journalist observing the fighting in Umm Qasr could be overheard commenting on his good fortune at being in a firefight just before he started an earnest-sounding broadcast. Internet research firms report a rise in traffic to news sites and say high-speed connections now represent 52 percent of all home Internet accounts and 90 percent of work connections. They also say the current war coverage online does not tap the unique capabilities of the Internet as did the text of the Starr report on President Clinton or video of the World Trade Center attacks on Sept. 11. For now, they point to the higher percentage of news Web site hits from work as an indication that TV still provides better-quality coverage than the Internet: people in the office often do not have access to a TV, whereas home viewers have a choice of media channels. However, New York Times Web editor in chief Leonard M. Apcar points out that Web sites offer users a more individualized and direct path to information than TV broadcasts.
(Access to this site is free; however, first-time visitors must register.)
- "Military's Use of GPS, A Civilian Mainstay, Is at Core of Its Might"
Wall Street Journal (03/24/03) P. B1; Gomes, Lee
The Global Positioning System (GPS), a network of 27 satellites often dedicated to civilian applications such as location services, is also a military tool, and a critical component of the U.S.-led war against Iraq. Thanks to GPS, bombs can hit targets more precisely, because GPS signals, unlike laser-guided targeting systems, are unaffected by smoke or clouds. One GPS feature that has not been reactivated is selective availability, which would give certain military users more accurate signals but could reduce the accuracy of GPS products in use by both civilians and military personnel. The biggest drawback of GPS is that its weak transmissions--kept deliberately faint to avoid interference with other forms of broadcast--can be disrupted with little difficulty. However, it is doubtful that the absence of GPS would seriously impair most American smart weapons, because they are equipped with alternative navigational systems. In fact, the Pentagon plans to replace GPS with the more advanced GPS III over the next 10 years, to the tune of $15 billion. Still, an even greater danger exists: It is theorized that the global proliferation of GPS-based weaponry could make going to war a more politically palatable option, if it reduces battlefield casualties as intended.
- "Rebuilding Plans for Postwar Iraq Depend on IT"
Computerworld (03/21/03); Verton, Dan; Hamblen, Matt
Assuming that the war against Iraq has a successful outcome and Saddam Hussein's regime is toppled, reconstruction of the country's infrastructure will follow a plan currently under development by the Pentagon, the State Department, and the private sector. The State Department has been consulting with exiled Iraqi IT experts on how this reconstruction should proceed since October, under the aegis of the Future of Iraq Project. Among the 17 working groups included in the program is an economic and infrastructure team tasked with setting postwar Iraqi IT and telecommunications guidelines, according to the project's David Staples. Iraq-born infrastructure team member Ahmed Al-Hayerdi says the existing military infrastructure could provide a foundation on which to build a pervasive, countrywide telecom framework, and notes that many Iraqi exiles eager to contribute to the rebuilding effort are senior executives in technology companies. Fellow infrastructure group member Rubar Sandi speculates that the modernization of Iraq's data and voice networks could take up to eight years and cost between $1 billion and $1.5 billion. Meanwhile, the U.S. Defense Department is planning the establishment of an IT and telecom network for occupation forces that could be leveraged later on to build a more permanent infrastructure. Without naming any specific companies, a representative for the Defense Information Systems Agency reports that his organization has contracted for commercial information systems support, including the deployment of OC3 terrestrial connections throughout Iraq.
- "Making Senders Pay the Price for Spam"
InformationWeek (03/20/03); Kontzer, Tony
Scott Fahlman of IBM's Watson Research Center has come up with a spam solution that he believes will be more effective than current anti-spam technology and legislative efforts: Making spammers pay for the right to send unsolicited commercial email. "The whole spam industry depends on spam being free to the sender," he observes. "If we change the social rules of email just a tiny bit, I think the whole problem of spam goes away." Fahlman's concept involves the establishment of a nonprofit "charity stamp" program in which participating recipients would only receive unsolicited email from senders who pay to obtain an authenticated 10-digit code, either from a charity stamp Web site or software featuring a special algorithm Fahlman has written. The software, which would reside between the recipient's desktop and the supporting mail server or ISP, would scan incoming messages to see if they are approved by the recipient's whitelist, and then check to see if they include the authentication code. Unauthenticated emails would be returned to the sender. The fee required to get the code would be affordable for legitimate emailers but too high for spammers, according to Fahlman. He is trying to drum up support within IBM for charity stamp software and the accompanying Web site, but he has not ruled out turning to outside companies for assistance.
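The checking sequence Fahlman describes--consult the recipient's whitelist first, then look for a paid authentication code, and bounce anything else--can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual IBM design: the function names, the 10-digit codes, and the simulated set of issued codes are all hypothetical.

```python
# Hypothetical sketch of a "charity stamp" filter: whitelisted senders
# pass, paid-for codes pass, everything else is returned to the sender.

WHITELIST = {"alice@example.com", "bob@example.com"}

def is_valid_stamp(code, paid_codes):
    """Accept a 10-digit code only if it was actually purchased
    (purchase is simulated here by membership in a set of issued codes)."""
    return len(code) == 10 and code.isdigit() and code in paid_codes

def filter_message(sender, stamp, paid_codes):
    # Known correspondents on the whitelist bypass the stamp check.
    if sender in WHITELIST:
        return "deliver"
    # Unsolicited mail must carry a valid, paid-for authentication code.
    if is_valid_stamp(stamp, paid_codes):
        return "deliver"
    # Unauthenticated mail is bounced, as in Fahlman's scheme.
    return "return-to-sender"

issued = {"1234567890"}
print(filter_message("alice@example.com", "", issued))           # deliver
print(filter_message("spammer@junk.net", "0000000000", issued))  # return-to-sender
print(filter_message("vendor@shop.com", "1234567890", issued))   # deliver
```

The economics, not the code, do the real work here: the fee behind each issued code is priced to be trivial for a legitimate sender but ruinous at spam volumes.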
- "On the Backs of Ants"
Technology Review (03/19/03); Patch, Kimberly
Taking a cue from communal organisms such as ants, bacteria, and slime molds, researchers at Germany's Humboldt University have developed a system in which electronic agents can assemble themselves into networks without a centralized communications infrastructure. The system mimics the way that individual organisms leave chemical pheromone trails that other members of the community can pick up on to coordinate growth processes. The system does not follow a top-down architecture of network construction, but relies on a bottom-up strategy in which randomly moving agents self-organize by homing in on unconnected network nodes and producing one of two simulated chemical trails that attract other agents. The network nodes are either blue or red; random green agents turn the color of whichever node they find, and generate a trail that lures agents of the opposite color. The eventual result is "a network that connects almost all neighboring nodes," explains Humboldt University associate professor Frank Schweitzer. He says the network can adapt to changes in node position and rapidly mend broken links. Schweitzer speculates that the research could be applied to self-assembling circuits, adaptive cancer treatments, and the coordination of groups of robots. Eötvös University's Tamás Vicsek acknowledges that the Humboldt model could be useful for network management, but says its complexity currently limits wide application.
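The bottom-up mechanism described above--green agents adopting the color of the first node they reach and laying a trail that attracts agents of the opposite color--can be sketched in a toy simulation. The grid, movement bias, and linking rules below are illustrative simplifications, not the published Humboldt algorithm.

```python
# Toy sketch of pheromone-style self-organization: agents wander a grid,
# adopt the color of the first node they find, lay trails, and link
# opposite-color nodes they subsequently reach.
import random

random.seed(42)

SIZE = 10
nodes = {(1, 1): "red", (8, 8): "blue", (1, 8): "red", (8, 1): "blue"}
trails = {}    # node position -> color of the trail laid there
links = set()  # frozensets of node-position pairs connected by agents

def visit(agent, node):
    """Handle an agent arriving at a network node."""
    if agent["color"] is None:            # green agent adopts the node's color
        agent["color"] = nodes[node]
    elif nodes[node] != agent["color"]:   # opposite-color node reached: link it
        if agent["home"] != node:
            links.add(frozenset((agent["home"], node)))
    agent["home"] = node
    trails[node] = agent["color"]         # this trail attracts the other color

def step(agent):
    """Move one cell, biased toward trails laid for this agent's color."""
    x, y = agent["pos"]
    moves = [(x + dx, y + dy) for dx, dy in ((0, 1), (0, -1), (1, 0), (-1, 0))
             if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE]
    attract = [m for m in moves
               if agent["color"] is not None
               and trails.get(m) not in (None, agent["color"])]
    agent["pos"] = random.choice(attract or moves)
    if agent["pos"] in nodes:
        visit(agent, agent["pos"])

agents = [{"pos": (random.randrange(SIZE), random.randrange(SIZE)),
           "color": None, "home": None} for _ in range(20)]
for _ in range(500):
    for a in agents:
        step(a)

print(len(links), "links formed among", len(nodes), "nodes")
```

Note that no agent has a global view: links emerge solely from local trail-following, which is what lets the real system mend broken links when nodes move.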
- "U.S. Offers Better Prospects for European PhDs"
The Work Circuit (03/19/03); Gordon, Stephanie
More and more European science and technology (S&T) graduates are choosing to live and work in the United States, mainly due to attractive career and hiring prospects, says EU commissioner of research Philippe Busquin. The data, from a new European Commission study, shows that one in 10 non-U.S. citizens employed in the U.S.'s high-tech industry was born in the EU. Moreover, 85,700 S&T researchers came from the EU in 1999: 28,400 from the United Kingdom, 25,200 from Germany, and 7,700 from Italy. The study also shows that Europe provides 14 percent of the overall American S&T workforce, and the numbers probably have increased since 1999. China and India have the highest numbers of residents in the U.S. with PhDs, 37,900 and 30,100 respectively. The United Kingdom trails with 13,100 researchers, followed by Taiwan's 10,900. The American-based population of European S&T workers is still relatively small compared with the overall number of S&T workers in Europe--400,000 in the United States versus 11 million in Europe--but it is nonetheless expected to have a big impact on Europe's future research prospects.
- "Spider Silk Delivers Finest Optical Fibres"
New Scientist (03/19/03); Penman, Danny
A team of engineers at the University of California, Riverside is using spider silk to make finer optical fibers that could carry light in nanoscale optical circuits. The fibers are made by first coating the thread with tetraethyl orthosilicate, then baking the silk away, leaving behind a hollow silica tube 1 micrometer in diameter. The silk used is from the giant orb-weaving spider of Madagascar, Nephila madagascariensis. The engineers plan to apply the process to the thinnest known spider silk, which is produced by Stegodyphus pacificus and is 10 nm in diameter; this should yield a conduit about 2 nm in diameter. Such a development would be a major breakthrough in the field of photonics, and Bath University physicist Philip Russell thinks the method will help in the construction of minute sensors that harness the unique "supramolecular" chemistry that takes place between substances enclosed in small spaces. Other potential applications include ramping up the resolution of optical microscopes, which Heriot-Watt University chemist Christopher Viney predicts "will open up whole new vistas for biologists." The UC Riverside engineers' method will be detailed in an upcoming issue of the Journal of Materials Chemistry.
- "Dell Sounds the Death Knell for the Venerable Floppy Disk"
Philadelphia Inquirer (03/20/03) P. C8; Chmielewski, Dawn C.
Dell Computer's announcement last month that it would exclude 3.5-inch floppy disk drives as a standard feature on its desktop PCs signaled the end of the line for the technology, analysts say. Eliminating floppy drives can cut approximately $10 to $15 in hardware costs for manufacturers. The disappearance of the floppy drive means computing has little use for the format, which cannot easily store today's digital multimedia. In contrast, the original IBM PC in 1981 had no internal disk drive, but depended on two floppies--one for MS-DOS and the other for data. The floppy disk remained a computing mainstay for years, used for restarting broken systems, backing up files, and other common tasks. The device's popularity peaked from 1995 to 1998 as PC users bought more than five billion floppies annually, but the portable format has been overshadowed by more robust data-storage tools such as writable CDs, USB thumb drives, micro drives, and flash memory chips. Corporate networks and the Internet, meanwhile, provide better file-sharing capabilities than floppy disks. In addition, software developers prefer to use CDs to distribute programs instead of floppies. But in China, Latin America, and even Japan, demand for floppy disks is still strong, says Imation, the current No. 1 manufacturer of floppy disks. The firm says it produces 2 million floppy disks each day. Imation's Mike Noer says they will be "around awhile," because of their simplicity and inexpensiveness.
- "Why We Should Lose the Back Button"
Computerworld New Zealand (03/18/03); Bell, Stephen
Zanzara owner Richard Mander counsels vendors and user groups on user interface design. He says Web designers should stop being dependent on the Web browser's back button and instead incorporate more obvious navigation links and indicators on Web sites. He says the action of the back button is too vague, since it can go to a page on the site or to another site altogether, and that the user should have an option of where he or she wants to go specifically. In addition, he says, systems should inform users upfront about what their different parts are and how they can be used, instead of leaving users to find out on their own. Icons should provide a hint as to what they give access to or what function they perform, similar to how buttons on a device are labeled, says Mander. As it is, many Web pages rarely define page links, application links, and static areas clearly. For intricate processes, Mander believes users should be guided step-by-step, much like software program "wizards." And if something fails to work properly, the user should be able to halt the job in a way that minimally impacts other phases of the process.
- "Memories as Heirlooms Logged Into a Database"
New York Times (03/20/03) P. E6; Eisenberg, Anne
Researchers at Microsoft's Bay Area Research Center are developing software that would allow users to search out information on everyday events, keepsakes, and other personal details archived in a single database. Project originator Gordon Bell says such a database, dubbed MyLifeBits, could prove even better than traditional memory aids, such as collecting photographs. Bell is scanning myriad objects and documents about himself into the database, while telephone conversations, Web page visits, and received email are also entered into the MyLifeBits archive. He notes that using the database allows him to eliminate all but the essential printed documents, and thus reduce the clutter in his life significantly. Bell advises potential MyLifeBits users that the items they wish to archive should be captured digitally. Microsoft computer scientist Jim Gemmell, who is collaborating with Bell on the MyLifeBits software, says the database's biggest selling point should be convenience--in other words, anyone should be able to access and use it regardless of skill. He adds that one particularly helpful feature of MyLifeBits allows users to attach verbal or typewritten annotations to photographs, while the use of graphs and timelines obviates extensive scrolling through items during a search. Bell is not concerned about MyLifeBits' potential to compromise personal security. "We are looking not at the dark side, but at all the potential it has as a surrogate memory," he explains.
(Access to this site is free; however, first-time visitors must register.)
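The annotation-and-timeline idea behind MyLifeBits can be illustrated with a toy archive. The schema, item names, and query function below are purely hypothetical, not the actual MyLifeBits design; they merely show how annotations plus timestamps replace scrolling with targeted search.

```python
# Toy personal archive: each item carries a timestamp and a free-text
# annotation, so retrieval can filter by keyword and date range.
from datetime import date

archive = [
    {"item": "photo_042.jpg", "when": date(2002, 7, 4), "note": "picnic with family"},
    {"item": "call_log_17",   "when": date(2003, 1, 9), "note": "phone call about patent"},
    {"item": "homepage.html", "when": date(2003, 3, 1), "note": "visited lab homepage"},
]

def annotate(item_name, text):
    """Append an extra annotation to an archived item."""
    for entry in archive:
        if entry["item"] == item_name:
            entry["note"] += "; " + text

def search(keyword, start, end):
    """Return items whose annotations mention keyword within a date range."""
    return [e["item"] for e in archive
            if start <= e["when"] <= end and keyword in e["note"]]

annotate("photo_042.jpg", "beach")
print(search("beach", date(2002, 1, 1), date(2002, 12, 31)))  # ['photo_042.jpg']
```

The date range stands in for the graphs and timelines Gemmell describes: instead of scrolling through everything, the user narrows to a slice of time and a word they remember.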
- "Study Suggests Spam-Stopping Tricks"
CNet (03/19/03); Bowman, Lisa M.
The Center for Democracy and Technology (CDT) concluded in a recent study that masking or hiding email addresses is the most effective tactic for avoiding the deluge of junk email, or spam, which accounts for up to 50 percent of all messages in a given corporate in-box, according to a December 2002 report from the Gartner Group. The CDT study, "Why Am I Getting All This Spam?," found that consumers who disguised their addresses in "human-readable" form received no spam, while the majority of companies covered in the analysis honored requests, made when subscribers signed up for new Web services, that they not be contacted for commercial purposes. The study determined that posting addresses on a public Web site was the surest way to attract spammers, but noted that doing so in certain locations--such as the Whois database--drew little spam, and that spam fell dramatically when publicly posted addresses were taken down. Strategies the CDT report suggests for thwarting spammers include disguising publicly posted email addresses; using multiple, disposable addresses; reading online forms carefully, particularly check boxes that ask for the right to share subscribers' email addresses; using a spam filter program; and choosing longer or more complex email addresses. Despite these findings, the report acknowledged that "currently there is no foolproof way to prevent spam." Furthermore, the EPrivacy Group's Ray Everett-Church noted that most spammers ignore addressees' requests to be taken off mailing lists. He also predicted that obscuring email addresses will become obsolete as spammers use increasingly sophisticated software.
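The simplest of the suggested tactics--posting an address in a disguised, human-readable form that harvesting software is less likely to parse--might look like this. The exact transformation is an illustrative assumption, not one prescribed by the CDT report.

```python
# Disguise an email address for public posting: readable to a person,
# harder for a naive address-harvesting script to match.
def disguise(address):
    return address.replace("@", " at ").replace(".", " dot ")

print(disguise("user@example.com"))  # user at example dot com
```

As Everett-Church warns in the summary above, this is a stopgap: harvesters can be taught to undo any fixed transformation once it becomes common.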
- "The Worldwide Code Rebellion"
Computerworld (03/17/03) Vol. 37, No. 11, P. 46; Thibodeau, Patrick
Open-source software has gone from a short-term phenomenon to a worldwide movement as an increasing number of governments turn to the technology. Governments in Europe, Singapore, Taiwan, and China are among those that are encouraging IT managers to consider deploying open-source systems. Government agencies often say open-source software provides independence from U.S. software vendors, but they are also looking for a way to save money. Although U.S. government officials at the federal and local levels have not adopted specific policies supporting open-source software as foreign government officials have, revenue shortfalls have some federal officials considering open-source, particularly at the midrange server level. Some large firms, such as IBM, have become advocates of open-source. Some observers even believe governments will eventually establish procurement policies that force software vendors to provide open and interoperable file formats as a prerequisite for contracts. Because the government market is so large, such a policy likely would spur a change in the commercial sector as well.
- "Open Secret"
New Scientist (03/15/03) Vol. 177, No. 2386, P. 30; Jamieson, Valerie
Carbon nanotubes' anticipated revolution has arrived more quietly than the hype suggested. They are incorporated in automobiles as conductive fuel lines insulated against sparking and as protective coatings that reduce pollution and increase the efficiency of spray painting; they are also employed in materials used to package hard drives and computer chips, although projects are underway to have nanotubes themselves act as transistors and electronic wires. Phaedon Avouris of IBM's T.J. Watson Research Center last year fashioned nanotube transistors that perform more efficiently than silicon devices, but their practical application is being hindered by a low-yield manufacturing process that is prohibitively costly, with the bulk of that cost attributed to impurity removal. Such nanotubes are single-walled, as opposed to the multiwall nanotubes that Massachusetts-based Hyperion Catalysis churns out by the ton for the automotive industry. Nanotubes' unique electrical properties allow them to be metallic or semiconducting; they are also significantly more conductive than copper, more than 50 times stronger than steel, and highly flexible. Harnessing nanotubes' mechanical properties has its own share of drawbacks: length limitations make visions of nanotube tethers spanning from the earth's surface into space seem more like science fiction than scientific possibility, while efforts to embed nanotubes in building materials such as concrete have only introduced structural imperfections. A research team led by David Carroll of South Carolina's Clemson University showed last year that the addition of just a few nanotubes makes piezoelectric plastics more sensitive to pressure, a development that could pave the way for new conductive fibers. Carroll's team also boosted the conductive efficiency of plastic solar cells by adding nanotubes: five percent of the light the cells soak up is converted into electricity, and their long lifespan makes them feasible for commercialization.
- "Knowledge Managing"
InfoWorld (03/17/03) Vol. 25, No. 11, P. 1; Angus, Jeff
The economic downturn has cleared a path for the rethinking of knowledge management (KM) and how it can be incorporated into the enterprise. Now companies face the daunting challenge of investing in KM and reorganizing their workflow and processes in order to accommodate it. Past KM projects were characterized by numerous failures attributed to companies' inability to integrate stored knowledge and human expertise, concludes Xerox Global Services CTO Bob Bauer. "They got lost because they got focused on trying to create transactions of data with data when the fact is that any decision process or action based on valuable transactions involves people," he explains. More recent KM projects are being driven by KM's maturity as well as the entrenchment and proliferation of technologies such as XML and better recognition technology. Open Text's Anik Ganguly believes that more robust integration is an important component of KM, in that it deconstructs input, which is perhaps the single biggest hindrance to KM implementations. Forthcoming technologies that promise great advancements in KM systems' primary functions--gathering, organizing, refining, and distributing--should spur KM adoption. Voice-mining technology will be especially crucial for the knowledge-gathering component, since experts reckon that 75 percent of all essential corporate knowledge is relayed verbally; improved pattern-recognition initiatives from ClearVision, Autonomy, and others will benefit organization; and Bauer says better pattern recognition and further XML adoption are helping the refinement function.
- "Taming Traffic"
Technology Review (03/03) Vol. 106, No. 2, P. 25; Talbot, David
Researchers at the National University of Singapore are working on a computerized traffic management system to keep traffic flowing efficiently in the event of a car accident or traffic jam. The system uses enhanced algorithms and data-mining technology to generate traffic-clearing strategies, and could be implemented in Singapore by August 2005, forecasts Der-Horng Lee, the civil engineer who led the project at the university. The goal of the system is to provide motorists with instructions as early as possible, giving them ample time to react. After a car accident, for example, the system would select within seconds the most appropriate strategy for clearing traffic, such as lane closings, modified traffic signals, or highway advisories. The initiative is geared toward cities already equipped with smart highway networks, such as Tokyo, Los Angeles, Houston, and Stockholm. But although these cities use cameras, magnetic loops, and other technologies to generate traffic information, interpreting that information is still susceptible to human error. Meanwhile, MIT is developing a similar system based on drivers' reactions to highway announcements at expected high-traffic zones. That system could be in use in Los Angeles and McLean, Va., by 2004.
- "The Meaning of Computers and Chess"
IEEE Spectrum (03/03); Ross, Philip
The last three major human vs. computer chess matches ended in draws, demonstrating both the continued refinement of the software and human players' inability to adapt their strategies against such programs; it also suggests either that computer intelligence is improving, or that playing chess may not be a sign of true intelligence after all. This is a significant development for artificial intelligence research, which is the reason a chess-playing computer program was conceived in the first place. Chess master Garry Kasparov accused IBM, the creator of his 1997 computerized opponent Deep Blue, of cheating, arguing that only a person could have prepared to exchange pawns the way Deep Blue did. Deep Blue designer Feng-hsiung Hsu, in his book "Behind Deep Blue: Building the Computer that Defeated the World Chess Champion," counters that the software was programmed, in consultation with Grandmaster Joel Benjamin, to consider files that were not just open but potentially open, and to let its if-this-then-that algorithm dictate the move based on those variables. Electrical engineer Claude Shannon, who proposed the chess-playing algorithm over half a century ago, conceived a search function as the first step, one that generates all possible move sequences to a certain depth, as determined by the computer's speed and memory. The rub is that a program with unrestricted search powers could play flawlessly with an evaluation function that discerns only between checkmate and draw, while a program with perfect evaluation powers would not need to look even one move ahead.
The Israeli chess program Kasparov battled in February, Deep Junior, was not as powerful as Deep Blue, but it had the advantage of greater knowledge of the game, and thus understood chess better, according to Israeli grandmaster Boris Alterman; in addition, Deep Junior distinguished itself by being willing to sacrifice material in order to reach intangible goals, such as freedom of movement or making its opponent's king more vulnerable to attack.
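Shannon's fixed-depth search can be sketched as a generic negamax routine. The game interface below is an illustrative assumption--a toy take-away game rather than chess--and real programs add pruning and far richer evaluation. Deliberately, the toy's evaluation function knows only win or loss at terminal positions, mirroring Shannon's point that deep search can compensate for a crude evaluator.

```python
# Fixed-depth negamax in the spirit of Shannon's proposal: enumerate
# all move sequences to a given depth, score the leaves, and pick the
# move whose subtree scores best for the side to move.

def negamax(state, depth, legal_moves, apply_move, evaluate):
    """Return (best_score, best_move) from the side-to-move's view."""
    moves = legal_moves(state)
    if depth == 0 or not moves:
        return evaluate(state), None
    best_score, best_move = float("-inf"), None
    for move in moves:
        # Score the child from the opponent's perspective, then negate.
        score = -negamax(apply_move(state, move), depth - 1,
                         legal_moves, apply_move, evaluate)[0]
        if score > best_score:
            best_score, best_move = score, move
    return best_score, best_move

# Toy game: a pile of counters; a move removes 1 or 2; whoever takes
# the last counter loses.
def legal_moves(n):
    return [m for m in (1, 2) if m <= n]

def apply_move(n, m):
    return n - m

def evaluate(n):
    # No counters left means the opponent just took the last one and
    # lost, so the side to move has won.
    return 1 if n == 0 else 0

score, move = negamax(4, 6, legal_moves, apply_move, evaluate)
print(score, move)  # -1 1 (four counters is a lost position)
```

Swapping in chess-legal move generation and a material-counting evaluator turns the same skeleton into the core of a Shannon-style chess engine; everything separating Deep Blue from this sketch lives in the speed of the search and the richness of the evaluation.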