Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to email@example.com.
Volume 5, Issue 470: March 17, 2003
- "Internet, Wireless to Play Key Role In an Iraq War"
Investor's Business Daily (03/17/03) P. A4; Tsuruoka, Doug
The U.S. military will showcase a number of digital war-fighting technologies in the event of an Iraqi war. Analyst Steve Sigmond says, for example, that systems will be melded to give carrier battle groups a consolidated, real-time view of the action, including radar, air traffic, and weapons systems. On the ground, infantry, tanks, and other forces will feed video images of the battle to commanders so they can more accurately assess the situation. Soldiers will even be able to use gun-mounted cameras to see what is around a corner, viewing the image on a display screen fitted to their helmets. Internet technology would also have a prominent role in an Iraq war, a realization of efforts at the Defense Advanced Research Projects Agency in the late 1970s. Cyberwar expert Winn Schwartau says the type of Internet used by the military will be highly secure, closed off from the commercial Internet, and running over dedicated satellite links. Just as with the larger Internet, bandwidth and traffic demands will strain the system, and infrastructure is vulnerable to physical attack, either by conventional weapons or electromagnetic pulse technology. The Iraqi military is also said to have acquired GPS-jamming technology from foreign sources. Iraq's own reliance on technology is limited in comparison to that of the United States, and experts plan to examine a possible war for insights into how technology affects the U.S.-Iraq engagement. Besides technology directly related to battle, the U.S. is already using email for psychological operations on Iraqi officers and is screening its own soldiers' email for information leaks.
- "Network Guardians Face Thorny Job"
Wired News (03/15/03); Grebb, Michael
A gathering of communications industry officials under the auspices of the Federal Communications Commission is discussing how the sector can improve reliability and security--and avoid government regulation. Representatives at the Network Reliability and Interoperability Council (NRIC) noted that recent computer viruses such as Code Red and the Slammer worm spread across the Internet in just minutes, while efforts to contain Slammer took three days. Many small firms with few resources find it difficult to install patches on time, and council officials said the profusion of devices with TCP/IP addresses has increased the number of targets. Cable & Wireless vice president of security and council chairman Bill Hancock said the lack of network security protocols hampered administrators, often forcing them to choose between keeping systems running and keeping them protected. Physical infrastructure also needs to be considered in case of an attack such as the ones on Sept. 11, said Bell Labs network reliability director and council focus group chairman Karl Rauscher. He suggested network providers have contingency plans for emergency power, transportation, and even measures to protect equipment from chemical attack. The 56 members of the NRIC will vote in March on best practices recommendations they will then try to implement across the industry in order to forestall government mandates.
- "Does Cyberterrorism Pose a True Threat?"
IDG News Service (03/14/03); Evers, Joris
A cybersecurity panel at the CeBIT technology show in Germany said the threat of cyberterrorism was overblown, and that terrorists would more likely use bombs than initiate an Internet attack. The representatives gathered from IT security firms, the European Union, and the North Atlantic Treaty Organization (NATO) blamed U.S. government agencies, IT vendors, and the media for hyping the threat. Counterpane Internet Security founder and chief technical officer Bruce Schneier said pointedly, "If I can't get to my email for a day I am not terrorized." He said today's critical infrastructure--water, electrical, transportation, and emergency services systems--was under little threat from terrorists on the Internet, and that Internet infrastructure was more likely to be hit by common criminals. NATO senior information security engineer Rainer Fahs agreed and noted physical systems ran on secure networks, but that terrorists use the Internet for communications. RSA CEO Arthur Coviello also said cyberterrorism was not as large a threat as general hacking activity but that companies should take precautions based on their specific risk. A recent Symantec survey showed less than 1 percent of Internet attacks emanated from countries labeled as terrorist threats.
- "Tech Firms Tackle Spam"
CNet (03/14/03); Olsen, Stefanie
In an effort to tackle the growing problem of spam, various technology firms on Friday gathered at the JamSpam forum to discuss the development of an "open, interoperable antispam specification" that would curtail what ePrivacy Group President Vincent Schiavone calls "the No. 1 problem on the Internet" today. Earthlink's Stephanie Fossan says the amount of junk email on the company's network has jumped 500 percent in the past 18 months; spam not only clogs networks and wastes users' time, it also impacts legitimate email, which can get caught in spam filters. Although the group has not formalized an anti-spam agenda, many solutions were discussed, such as developing email authentication standards, closing insecure "open relays" that spammers use to send bulk email, and creating a more "transparent" email system that makes it easy to determine the type of email being sent. Schiavone says, "The solution is not technological, not legal, not standardization, but a combination of all of them." Fossan says the problem needs to be solved in pieces. VeriSign, Sun Microsystems, Oracle, Dell Computer, and AOL were some of the firms attending JamSpam.
- "Paper Speeds Video Access"
Technology Research News (03/19/03); Patch, Kimberly
Ricoh Innovations and the University of California-Berkeley teamed up in a research project to augment digital video with traditional book interfaces. "While historians consider the primary research artifacts to be audio or...video recordings, the artifact that historians use in their own research is the printed transcript," observes UC Berkeley's Scott Klemmer, a member of the research team. The scientists enhanced a handheld computer with a hard drive and a barcode reader, and supplemented transcripts of oral history videos with barcodes; users can scan the barcode in a transcript, thus cueing the video on the handheld's screen to the correlating section of the recording. The devices were tested by 13 oral historians, who demonstrated interest in gestures, facial expressions, and other specific aspects of the video record, as well as checking for accuracy, according to Klemmer. Overall, the participating historians thought the devices were very useful for relaying the context of the video, but felt the handheld screen was too small. Another significant finding was that certain participants used the audio recording and paper transcript simultaneously. Klemmer predicts that cameras and wireless Internet access will become standard handheld components in three years, which will increase the accessibility of the researchers' technique. Meanwhile, he is busy developing Paper Mache, a tool designed to ease the construction of computer interfaces that blend paper and other kinds of physical interfaces with graphical user interfaces.
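The cueing technique described above amounts to an index from printed barcodes to offsets in the video recording. A minimal sketch of that idea follows; the barcode identifiers and method names are invented for illustration and are not from the Ricoh/UC Berkeley system.

```python
class TranscriptIndex:
    """Maps barcodes printed in a transcript to offsets in a recording."""

    def __init__(self):
        self._offsets = {}  # barcode id -> seconds into the video

    def register(self, barcode, seconds):
        """Associate a printed barcode with a point in the recording."""
        self._offsets[barcode] = seconds

    def cue(self, barcode):
        """Return the playback offset for a scanned barcode."""
        if barcode not in self._offsets:
            raise KeyError(f"unknown barcode: {barcode}")
        return self._offsets[barcode]

# A handheld's scanner callback would pass the decoded barcode to
# cue() and seek the video player to the returned offset.
index = TranscriptIndex()
index.register("p12-q03", 754)  # page 12, question 3 -> 12m34s
```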
- "U.S. Nanotech Funding Expected to Hit $1 Billion"
EE Times (03/14/03); Leopold, George
U.S. nanotechnology research is ramping up as government, commercial, and academic forces line up and gather steam. Government spending dedicated to nanotechnology has increased to $849 million approved by Congress, and Richard Russell of the White House Office of Science and Technology Policy says that total could jump to $1 billion soon. The National Research Council recently advised the government to formulate a strategy for nanotechnology, and a presidential advisory council met in early March to discuss such a plan. The government wants to stay ahead of international research efforts, such as an estimated $500 million program in Japan. Research targets include studying nanotechnology materials and how they interact, as well as "grand challenges" for nanotechnology, such as nanoscale electronics, photonics, magnetics, manufacturing, and instrumentation. Carbon nanotubes are also of tremendous interest because of potential military applications with micro-aircraft and robotics, and their ability to make electron-beam welds, vertical interconnects, and protective films. Rensselaer Polytechnic Institute's Richard Siegel says recent achievements include the assembling of carbon nanotubes in materials and devices.
- "Grid Computing Shows Mettle, ROI In Research-Focused IT Organizations"
InternetWeek (03/11/03); Turner, Mary Johnston
Early adopters of grid computing and virtualization architectures say that not only are the technologies working, but they are also reducing costs and boosting IT options. Phil Emer, chief architect of the North Carolina BioGrid Project, says grid computing can be offered as a utility to users at various universities, even across different desktop platforms and research disciplines. Grid computing saves money "because grid lets us rely on clusters of enterprise-class servers rather than supercomputers that are very expensive to maintain," he says. Moreover, the technology can be easily modified to satisfy the needs of the research, he says, thereby reducing cycle times. Similarly, the European pharmaceutical firm Inpharmatica says streamlining research using grid computing utility services has a positive effect on ROI, according to the firm's CIO Pat Leach. The firm uses a grid-computing utility service to improve computing flexibility at peak usage times for its scientists. Otherwise, the firm would end up having to continually replace outdated equipment, he says. In addition, the utility architecture allows scientists to concentrate on data management and interchange as well as obtain taxonomy support.
- "Programmers to Compete in Calif."
Cornell Daily Sun (03/14/03); Lane, Philip
Some 70 student teams from around the world will meet in Beverly Hills, Calif., next week (March 22-26) to compete in the 27th Annual ACM International Collegiate Programming Contest sponsored by IBM. The competition requires students to find solutions to eight or more computer science problems within five hours. Each group must work together using only one computer terminal. ACM reports the problems will be equivalent to a semester's worth of programming. The teams have been narrowed down from an original batch of 3,850 teams from 68 nations. Prizes range from computers to cash awards of $10,000.
- "PCI-X Marks the Spot for IBM, HP"
CNet (03/14/03); Shankland, Stephen
IBM and Hewlett-Packard have lined up behind the backwards-compatible PCI-X technology for new chipsets that connect network cards and other devices into servers. The PCI-X 2.0 technology will first appear next year in a PCI-X 266 version, to be followed the year after with a PCI-X 533 version, says Kimball Brown of ServerWorks. Meanwhile, Dell is backing the advanced PCI Express technology developed by Intel. PCI Express requires different hardware than PCI-X, which uses the same parallel design as the previous PCI technology. Experts, however, say that PCI-X will be more expensive because its legacy design is not suited for the high speeds brought on by new networking technologies such as InfiniBand, Ethernet, and Fibre Channel. PCI Express is expected to win out over PCI-X eventually, especially after it matures in the PC desktop arena. PCI Express uses serial connections and will initially utilize a familiar plug-in design, although the PCI-SIG, which manages PCI standardization, is considering different form factors in the future.
- "Lilith Stirs Interest in Technology Among Girls"
News@UW-Madison (03/14/03); Konicek, Kathy
The Lilith Computer Group is a local program in Madison, Wisconsin, that is working to encourage females to study information technology. Women hold just 20 percent of IT jobs, and groups such as Lilith have formed in an effort to get more females to seek careers in computers. Lilith targets girls in middle school, forming 10 clubs around Madison where girls meet after school once a week to work with editing digital video, creating logos with design software, and other computer activities. The group is sponsored by the University of Wisconsin-Madison, community groups, local schools, and foundations. UW-Madison's Kathy Konicek, who works part time as a Lilith Club coordinator, says, "About 86 percent of the kids who sign up come to club meetings regularly. Our retention rate is good." Lilith is developing a mentoring program using women mostly from the tech community to keep club members interested in technology once they leave middle school.
For more information on ACM's Committee on Women and Computing, visit http://www.acm.org/women.
- "Sustainable Computing Consortium Hosts Workshop on Trust and Dependability in Wireless Environments"
Carnegie Mellon University's Sustainable Computing Consortium (SCC) will host a two-day seminar beginning March 31 to discuss issues related to mobile, wireless, grid, and other "always-on" computing systems. The SCC seminar, to be held in Tempe, Arizona, will focus in particular on interoperability, dependability, and security. SCC director Dr. William Guttman says the seminar will also allow participating experts to look at new developments, challenges, and future prospects. Specifically, he says trust and interoperability are becoming larger issues as consumer adoption ramps up. The seminar will include such speakers as Walt Davis, chief scientist at Motorola; Dr. Fernando Fernandez, former director of the Defense Advanced Research Projects Agency; and Clifford Wilke, bank technology director at the Office of the Comptroller of the Currency, Committee on Bank Supervision. Topics scheduled for discussion include mobile payments, network standards, and pervasive enterprise applications. Participants also include representatives from firms such as AIG, Microsoft, Pfizer, and Raytheon.
- "Green Plans for Tiny Tech"
Nature (03/10/03); Gerstner, Ed
Environmentally safe nanotechnology is the goal of Rice University's Center for Biological and Environmental Nanotechnology, according to statements by executive director Kevin Ausman at this week's meeting of the American Physical Society. Although nanomaterials by themselves may not be detrimental, their environmental interactions could have unanticipated consequences, which is why the center is focused on nanotech's potential impact. "The traditional approach of new technologies to environmental concerns is to wait until there is a problem," noted Ausman at the conference. His group is currently studying whether titanium dioxide nanoparticles, which are known to sponge up heavy metals, cause toxicity and pollutant levels to rise when released into the environment. Nanotech's potential environmental benefits, such as water purification or industrial waste removal, are another area of research at the center. The Rice center receives its funding from the U.S. National Nanotechnology Initiative. Concerns about nanotech's dark side have been popularized by the media, and one of the most well-known scenarios posits that self-replicating "nanobots" could run amok and take over the planet. "An open dialogue between the scientists and engineers developing these technologies and the environmental activists and public policy-makers can lead to a sense of trust that the process will catch problems," declared Ausman.
- "Homeland Cybersecurity Efforts Doubted"
SecurityFocus (03/11/03); Fitzgerald, Michael
Experts worry that cybersecurity is taking a backseat at the new Department of Homeland Security (DHS), which has engulfed most of the government's computer protection centers. Of the five directorates built into the DHS, the Directorate of Information Analysis and Infrastructure Protection has been allotted less than 10 percent of DHS funding and is the only directorate without a leader. Moreover, in the recent months leading up to the DHS formation, several leading officials in charge of cybersecurity have left the government, including presidential advisor and Critical Infrastructure Protection Board director Richard Clarke, and National Infrastructure Protection Center director Ron Dick. Information Technology Association of America President Harris Miller says that if these officials are replaced with persons of lesser rank, then national cybersecurity is likely to suffer despite White House assurances. In February, industry observers were less than enthused about the federal government's National Strategy to Secure Cyberspace because it lacked substantive action, and they have recently been confused by the creation of a Terrorist Threat Integration Center that will coordinate intelligence from different security agencies. That group's function at least partly duplicates that of the DHS. Analysts do say DHS CIO Steve Cooper has a chance to improve national cybersecurity infrastructure, and hail a new federal law that strengthens non-disclosure protections for businesses reporting incidents to government.
- "Turning Out Quality"
eWeek (03/10/03) Vol. 20, No. 10, P. 22; Fisher, Dennis
Carnegie Mellon University fellow Watts Humphrey is espousing his Team Software Process (TSP) and Personal Software Process (PSP) as new software development methodologies that can help improve the quality of code while getting projects out quickly. Often, he says, management sets unrealistic goals for programming teams, which leads to disorderly plans and haphazard work. Projects should be started earlier in order to avoid this scenario, allowing programmers to use structured development methodologies such as TSP and PSP, which were developed at Carnegie Mellon's Software Engineering Institute. While blame for programming errors traditionally falls upon developers and testers, Humphrey says it is the development process itself that is often to blame for high error rates, which he estimates at about one defect per 10 lines of code, even with experienced programmers. Moreover, training regimens at many software firms simply re-emphasize outdated methodologies. Demand for usability and functionality trumps security when CIOs and IT managers decide to buy software, and software makers respond by focusing on those factors that produce commercial results. Microsoft uses Carnegie Mellon's methodologies for some of its internal programming, including one recent application with 24,000 lines of code. Microsoft senior program manager Carol Grojean says the methodologies should cut errors drastically, down from 350 errors in the previous version to about 25 in the latest iteration. Microsoft, however, does not plan to employ the methodologies for its commercial products.
New Scientist (03/08/03) Vol. 177, No. 2385, P. 42; Schechter, Bruce
Hackers despise junk email, or spam, with a vengeance, and programmer Paul Graham explains that this hatred stems from bruised egos. In the hopes of mobilizing hackers to combat spam, Graham issued "A Plan for Spam," an outline for a spam filter that is 99 percent effective. Rather than focus on spam giveaways in the email itself, which could lead to the deletion of ham (legitimate email mistaken for spam), Graham's proposed filter would automatically scan emails and compare them to samples of spam and ham in its database, statistically grading each word's "spamminess" using an algorithm derived from Bayesian theory. False positives would be reduced by counting each word occurrence in ham twice as heavily as occurrences in spam, biasing the filter toward letting legitimate mail through. The filter program would also be open-source, allowing anyone to modify it to widen its scope and effectiveness. However, POPFile creator John Graham-Cumming notes that even Bayesian filters can be fooled by crafty spammers using methods such as hiding text in the HTML or dividing the email into long, thin parallel strips. In anticipation of this development, certain programmers believe the next solution will be to increase the intelligence of anti-spam programs.
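The statistical scoring described above can be sketched in a few lines. This is a toy illustration of the Bayesian approach, not Graham's exact formula: per-word probabilities are estimated from labeled spam and ham corpora (with ham occurrences double-weighted, as outlined), then combined naively to score a message.

```python
from collections import Counter

def word_probs(spam_msgs, ham_msgs):
    """Estimate each word's 'spamminess' from labeled corpora.

    Ham occurrences are counted twice, biasing the filter against
    false positives, per the plan summarized above.
    """
    spam_counts = Counter(w for m in spam_msgs for w in m.lower().split())
    ham_counts = Counter(w for m in ham_msgs for w in m.lower().split())
    nspam, nham = max(len(spam_msgs), 1), max(len(ham_msgs), 1)
    probs = {}
    for w in set(spam_counts) | set(ham_counts):
        b = spam_counts[w] / nspam       # spam frequency
        g = 2 * ham_counts[w] / nham     # double-weighted ham frequency
        probs[w] = min(0.99, max(0.01, b / (b + g)))
    return probs

def spam_score(msg, probs, prior=0.4):
    """Combine per-word probabilities with a naive Bayesian product."""
    p_spam, p_ham = 1.0, 1.0
    for w in msg.lower().split():
        p = probs.get(w, prior)          # unseen words get a neutral prior
        p_spam *= p
        p_ham *= 1.0 - p
    total = p_spam + p_ham
    return p_spam / total if total else 0.5
```

In practice Graham's filter scores only the most "interesting" (extreme-probability) tokens in a message rather than all of them; this sketch omits that refinement for brevity.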
- "Presence Technology"
Computerworld (03/10/03) Vol. 37, No. 10, P. 36; Kay, Russell
Instant messaging (IM) is a form of presence technology that offers a better alternative for communicating with colleagues and customers than email because it lets users know if the other party is available first. With IM technology, users have a "buddy list," which reveals whether the computers of potential contacts are logged onto the network and whether they have been active recently. That presence capability has an increasing number of businesses interested in IM technology. The future of IM looks ever more promising now that more people are using handheld wireless devices and cellular telephones. Ericsson, Motorola, and Nokia have partnered on an m-presence capability initiative, the Instant Messaging and Presence Services Solution, which would let callers know before dialing if a recipient's phone is turned off because the person is in a meeting or at lunch. Global Positioning Systems technology, which would also let the caller know the location of the person, may even be incorporated into IM one day. However, interoperability and standards issues have to be addressed before presence technology reaches such lofty goals.
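The core of the buddy-list mechanism described above is a lookup that derives availability from a contact's logged-in state and recent activity. The following sketch is hypothetical (the class, field names, and idle threshold are invented for illustration, not drawn from any IM product):

```python
import time

IDLE_AFTER = 300  # seconds without activity before a contact shows as idle

class PresenceList:
    """A toy buddy list: derives a contact's status before a message is sent."""

    def __init__(self):
        self._contacts = {}  # name -> (logged_in, last_active_timestamp)

    def update(self, name, logged_in, last_active):
        """Record a presence event for a contact (e.g. login or keystroke)."""
        self._contacts[name] = (logged_in, last_active)

    def status(self, name, now=None):
        """Return 'available', 'idle', or 'offline' for a contact."""
        now = time.time() if now is None else now
        logged_in, last_active = self._contacts.get(name, (False, 0.0))
        if not logged_in:
            return "offline"
        return "idle" if now - last_active > IDLE_AFTER else "available"
```

An IM client would consult status() to decorate each buddy-list entry, so the sender knows whether the other party is reachable before typing a message.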
- "2003 and Beyond"
PC World (03/03) Vol. 21, No. 3; Captain, Sean
Technological developments that should emerge in the next several years and become widespread by 2010 include smaller PCs, pervasive Internet, intuitive handhelds, consumer devices capable of automatic wireless communication, and improved processing and graphics performance of cell phones and personal digital assistants, according to product researchers, developers, and analysts surveyed by PC World magazine. Slimmer desktops are expected to gain in popularity, especially in the corporate sector, due to a number of technologies that could debut later this year and next, such as new 64-bit chips from IBM, Intel, AMD, and Apple; 3.5-inch hard drives that can spin discs at 10,000 rpm and store 500 GB; and Serial ATA. Despite these gains, NPD Technologies' Stephen Baker expects notebooks to outsell desktops by 2004 or 2005, while Intel and IBM are planning to extend notebook battery life with their respective introduction and adoption of Centrino Mobile Technology. Wi-Fi technology will come into its own and be embedded in all portable devices by 2004, say analysts; HP Labs' John Ankcorn predicts that future handhelds and notebooks will feature seamless wireless connectivity and foster the growth of location-based services, while Microsoft's DirectBand Network will leverage Smart Personal Object Technology (SPOT) to make information accessible from everyday devices via FM radio frequencies. Meanwhile, cell phones, handheld video players, and other portables will be given PC capabilities thanks to new processors and platforms from Intel and Microsoft. 
New PC and hard-drive-based products will hasten the penetration of wireless digital technology throughout the household: Notable items include PCs with built-in TV tuner cards and software that allows consumers to carry out a variety of functions via remote control, Universal Plug and Play technology that can automatically configure a network of wired or wireless devices, and products based on the 802.11a or 802.11g standards that can support streaming video. The digital camera market should explode thanks to price reductions, but PCs are expected to remain the hub for digital content management. Finally, Christian Brantley of Eizo Nanao Technologies forecasts that sales of liquid-crystal displays (LCDs) will outpace those of CRTs by 2004.
- "Can Sensemaking Keep Us Safe?"
Technology Review (03/03) Vol. 106, No. 2, P. 42; Waldrop, M. Mitchell
The Sept. 11 attacks created a demand to leverage the United States' strength in analytical technology and networking to build what the Markle Foundation's Task Force on National Security in the Information Age calls a virtual analytic community threaded together by "sensemaking" technologies designed to mine vast quantities of data for signs of possible terrorist activity. The first step in building a virtual intelligence system is to establish an online forum where local officials can share and analyze intelligence, such as a virtual private network secured by standard encryption; the next major component is data-sharing between federal, state, and local agencies. The technical limitations of such a process could be partially alleviated through distributed computing, but information stored in antiquated databases can be difficult to access due to incompatible formats, while the organizations that control the databases are reluctant to disclose so-called "sensitive" information to outsiders. Oracle thinks all data should be changed to a standard format and stored in a common data warehouse, but IBM thinks "federated" information will solve the sensitive data problem as well as the compatibility problem: With such a system, a data owner can augment his database with a "wrapper" that allows outsiders to access portions of his data while keeping sources, methods, and other sensitive information secret. The key difficulty is extrapolating information important to security from the gargantuan database of unstructured information, and Stratify is one of a number of startups funded by the CIA's In-Q-Tel firm working on a solution. Stratify's Discovery System is programmed to automatically create a taxonomy of the received information so that it can be organized into categories representing specific subject matter and concepts, notes Stratify CTO Ramana Venkata. Meanwhile, U.S. intelligence agencies are making use of i2's Analyst's Notebook visualization toolkit, which can map out the progression of events while connecting related transactions, people, and entities via link analysis charts that also include supporting evidence tagged with associated data--sources, security levels, reliability, etc.
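The federated "wrapper" idea described above can be illustrated with a toy example: the data owner keeps full records, and outside agencies query through a wrapper that exposes only approved fields, so sources and methods never leave the owner's system. The field names here are invented for illustration.

```python
class FederatedWrapper:
    """A toy data-owner wrapper: answers outside queries over approved
    fields only, redacting sensitive fields from every result."""

    def __init__(self, records, public_fields):
        self._records = records          # full records stay with the owner
        self._public = set(public_fields)

    def query(self, **criteria):
        """Match records on public fields and return redacted copies."""
        for field in criteria:
            if field not in self._public:
                raise PermissionError(f"field not shareable: {field}")
        results = []
        for rec in self._records:
            if all(rec.get(k) == v for k, v in criteria.items()):
                results.append({k: v for k, v in rec.items()
                                if k in self._public})
        return results

# The owner decides which fields outsiders may see:
records = [{"name": "J. Doe", "city": "Denver", "source": "informant-7"}]
wrapper = FederatedWrapper(records, public_fields=["name", "city"])
```

A federated query engine would fan a request out to many such wrappers and merge the redacted results, sidestepping both the format-conversion and the sensitive-data problems noted above.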