HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 476: Monday, March 31, 2003

  • "Freedom, Technology and the Net"
    CNet (03/31/03); McCullagh, Declan

    Amid the dragging war against Iraq, the rollback of civil liberties through legislation, and copyright holders' push to expand the Digital Millennium Copyright Act, one of the few bright spots is this week's Computers, Freedom, and Privacy (CFP) conference, writes Declan McCullagh. The 12-year-old event, hosted by the ACM, brings together people concerned with technology and its impact on freedom for the purposes of serious debate. "We're trying to focus on two things: One is cyberliberties--computers, freedom and privacy--post 9/11," notes Barry Steinhardt, this year's CFP chairman. "Second, we're trying to add an international component to this. It's quite clear that these issues are not restricted to the U.S. border." Topics for debate at this year's CFP include the Pentagon's Total Information Awareness project and a passenger-profiling system being developed by the Transportation Security Administration.

    For more information about the conference, to be held April 2-4 in New York City, visit http://cfp2003.org.

  • "Ubiquitous Computing: Slow Going"
    EE Times (03/28/03); Merritt, Rick

    Computer and network engineers still envision rooms and buildings as ubiquitous computing environments, much as they did 15 years ago, but moving those theoretical designs to commercial reality has taken longer than expected. Intel senior researcher Joe McCarthy forecasts that computers will migrate from today's input-output framework to a sense-and-respond framework. McCarthy has experimented with computers that can sense the physical details of people working out in a gym, for instance, and then play music tailored for their type. McCarthy will chair this year's Ubicomp conference Oct. 12-15 in Seattle. Microsoft Research senior researcher Steven Shafer says that sensor-focused research is needed alongside general viability and functionality work. "One of the ironies of ubiquitous computing is [that] almost no ubiquitous-computing systems work ubiquitously, or even in two places," he notes. Shafer envisions ubiquitous computing letting someone enter a mall with a PocketPC and ask the device where a certain store is located, or walk into a grocery store and query the PDA for the location of milk or raisins.

  • "What Hyperthreading Can (and Can't) Do for You"
    NewsFactor Network (03/28/03); Ryan, Vincent

    Although many may view hyperthreading (simultaneous multithreading) as a marketing ploy for Intel, it does offer performance benefits to users, and is expected to become a standard feature of software applications, according to Aberdeen Group's Peter Kastner. Hyperthreading, which Intel's George Alfs predicts will be embedded into most Pentium 4 chips, enables an operating system to treat a single processor as two, facilitating the simultaneous rather than sequential processing of two software streams. Meta Group's Steve Kleynhans explains that hyperthreading-enabled software must support the concurrent implementation of two separate tasks, which can be accomplished either by writing a multithreaded application or by running two independent applications simultaneously. Kastner says that most computer users "will see a visible but not huge [performance] improvement" through hyperthreading, although Kleynhans doubts that application design will be significantly impacted for the next few years. Intel says that older desktop operating systems--Windows 2000, Windows 98, Windows NT 4.0, etc.--are not designed to exploit hyperthreading, and suggests that users of such systems deactivate the Pentium's hyperthreading feature in the system BIOS Setup program. Kleynhans adds that users of heavy-duty floating-point applications would also be best served by disabling the hyperthreading function. Hyperthreading's chief advantage lies in software design--it will give designers more creative latitude. "If we get to the point where we can stick a billion transistors on a piece of silicon, there's all kinds of things that will be done," Kleynhans proclaims.
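    The multithreaded style of application Kleynhans describes can be illustrated with a minimal sketch: two independent software streams written as two worker threads, which a hyperthreading-enabled OS is free to schedule on the two logical processors. The checksum workload here is invented for illustration (and note that in CPython the global interpreter lock means these particular threads interleave rather than run truly in parallel; the point is the program structure, not a benchmark).

```python
import threading

def checksum(data, results, slot):
    # Simple busy work standing in for one independent software stream.
    results[slot] = sum(data) % 65521

def run_two_streams(stream_a, stream_b):
    # Two concurrent tasks -- the pattern hyperthreading can overlap.
    results = [None, None]
    t1 = threading.Thread(target=checksum, args=(stream_a, results, 0))
    t2 = threading.Thread(target=checksum, args=(stream_b, results, 1))
    t1.start(); t2.start()
    t1.join(); t2.join()
    return results

run_two_streams(range(1000), range(2000))  # → [40853, 33370]
```

    The same effect can come for free from simply running two independent applications at once, as the article notes.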

  • "Out of the Shadows"
    Wall Street Journal (03/31/03) P. R6; Bulkeley, William M.

    Open-source software is now being perceived as more useful and of higher quality than in the past, and companies are making a profit by offering services to those who use open-source software, or by packaging the software with their own commercial products. Some corporate customers are installing free software, and the open-source community has made it work better with proprietary software. IBM encourages customers to run Linux, and some of its servers use that software, while the popularity of Linux has led more corporations to dip into other open-source programs. Linux is becoming the main operating system for some new technologies, such as blade servers and grid technology. Open-source users generally say they were first attracted by the cost advantage but made the switch because of performance. MySQL AB gives away its database software, but those who pay nothing must agree to share any improvements they make; those who do not want to share can get the software at a much lower price than regular software companies charge. Ximian's open-source offerings include Evolution, which provides group email and calendar management functionality, and a program that converts desktop Linux from a text-based interface to an icon-based one. Proprietary software vendors such as Microsoft are trying to dampen open-source software's impact by focusing on its potential hidden costs, but corporate buyers say the software's ease of use offsets such issues. Still, the use of free software varies across industries, with financial and retail firms leading the way while transportation and manufacturing firms have been slow to adopt the software.

  • "Battling 'Surveillance Society'"
    Associated Press (03/31/03); Jesdanun, Anick

    Barry Steinhardt of the ACLU has long championed the fight against a "surveillance society" in which the government is constantly privy to the movements, opinions, and thoughts of all citizens, a trend that could endanger people's right to free speech and expression. He cites a number of developments that constitute a serious threat to privacy and speech, including the Total Information Awareness (TIA) project, a database that would be used to track citizens' transactions in order to identify suspected terrorists; increased phone and email wiretapping privileges for law enforcement, as provisioned by the USA Patriot Act; the CAPPS II air traveler profile system, which, like TIA, would use data-mining to uncover suspicious activities; and proposed legislation from the Justice Department that would impose tougher penalties for criminals who use encryption and authorize the construction of a DNA database of terrorist suspects. These topics are likely to be subjects of debate at ACM's Computers, Freedom, and Privacy conference in New York City (Apr 2-4), which Steinhardt is chairing. As security becomes a deeper national obsession, especially in the wake of Sept. 11, Steinhardt's focus has shifted from striking a balance between security and civil liberties to judging the effectiveness of certain security technologies. However, he cautions that it may be too late for such considerations, given the country's rate of acceleration toward a surveillance society.

    For more information about CFP, visit http://cfp2003.org.

  • "An Engineer by Any Other Name"
    Houston Chronicle (03/30/03); Ratcliffe, R.G.

    Texas' narrow legal definition of what constitutes an engineer could hamper the growth of the state's high-tech industry because it shuts out a large portion of the workforce, according to Steven Kester of the American Electronics Association. Current state legislation only authorizes licenses for engineers who have studied engineering and passed a licensing exam, a stipulation that prevents many software programmers from calling themselves computer or software engineers. Texas computer companies are allowed by law to give personnel the title of engineer as long as it stays in-house. Certain high-tech experts received cease-and-desist letters from the Texas Board of Professional Engineers because they used the title of engineer in correspondence; a request for clarification was subsequently sent to former Attorney General John Cornyn, who said the Texas Engineering Practice Act forbids unlicensed engineers from using the title of engineer in any form of public discourse. Unlicensed Texas engineers who publicly refer to themselves as such could face a fine as high as $3,000, notes Applied Materials' Steve Taylor. He adds that such restrictions discourage out-of-state and overseas tech employees from working in Texas. Sen. Rodney Ellis (D-Houston) and Rep. Warren Chisum (R-Pampa) support legislation to relax the title requirement law. Meanwhile, Ken Rigsbee of the Texas Society of Professional Engineers says the engineer title limitation was imposed to protect the public after the mis-engineering of a heating pipe system led to an accident that claimed 300 lives in 1937.

    To read more about ACM's position on and studies regarding the licensing of software engineers, visit http://www.acm.org/serving/se_policy/.

  • "Software Bug May Cause Missile Errors"
    IDG News Service (03/27/03); Roberts, Paul

    Glitchy software may be the reason why Persian Gulf-based Patriot missiles targeted friendly aircraft twice in the past week, in one case with fatal results. A Patriot battery fired on and destroyed a British Royal Air Force Tornado GR-4 on Sunday, while an American F-16 was reportedly acquired as a target on Monday and averted disaster by taking out the battery's radar dish. The Washington Post reported on Tuesday that an anonymous source within the Pentagon attributed these mishaps to a software bug, noting that the battery involved in Monday's incident was in automated mode when it locked onto the F-16. One defense industry expert disputed this assertion, claiming that it is impossible for a Patriot to function automatically; yet this claim was contradicted by information on the Web page of defense contractor and Patriot manufacturer Raytheon, which states that "automated operations" is a major component of the missile system. A report posted on the Radio Australia Web site indicates that the Tornado was shot down because the Patriot battery misidentified it as an Iraqi missile. Victoria Samson of the Washington-based Center for Defense Information speculates that a combination of radar system failures and human error could lead to friendly fire incidents, and adds that software problems were evident during the testing of the latest Patriot missile, the PAC-3. General Vince Brooks of U.S. Central Command said on Thursday that all possibilities are under investigation, but an official statement must wait until more information is available. "When [investigators] do find a fault, they'll put it out to the rest of the world," assured Navy Lt. Commander Charles Owens.

  • "IEEE USA Presses Congress on Visa Curbs"
    Work Circuit (03/26/03); Quan, Margaret

    Cheap foreign labor continues to have a huge impact on the prospects of U.S. workers for gaining high-tech jobs, says IEEE-USA President-Elect John Steadman. The industry group has called on Congress to investigate abuses in the H-1B and L-1 visa programs that have occurred while the U.S. high-tech industry lost 560,000 jobs from Jan. 2001 to Dec. 2002. Steadman says 799,700 new or renewal H-1B visas were issued during roughly the same period, and cites a Business Week report that estimates the high-tech sector accounted for many of the 329,000 people working in the United States on L-1 visas in 2001. IEEE-USA wants Congress to investigate "why we continue importing thousands of new workers through the H-1B and L-1 visa programs," according to Steadman. His fear is that U.S. high-tech companies are trying to lower their costs by moving jobs overseas through the programs. Earlier in the month, IEEE-USA 2003 President Jim Leonard asked lawmakers to return the H-1B visa cap, which currently stands at 195,000, to the historical level of 65,000.

  • "University of Minnesota Researchers Develop Surveillance Software"
    Minnesota Daily (03/28/03); Peterson, Branden

    A new layer of security could be added to military computer systems thanks to the work of researchers at the University of Minnesota. The Minnesota Intrusion Detection System, which was developed by University computer scientists in collaboration with the Army High Performance Computing Research Center, is a surveillance program designed to detect computer break-ins by scanning network machines for unusual or suspicious activities, and then furnishing systems administrators with ranked summaries. The software was deployed throughout the University network system in August 2002, and has scanned between 30,000 and 40,000 networked computers for possible signs of compromised security. Through the program's performance, developers gained insight that was used to refine the software's speed and efficiency. The program's development would not have been possible without expertise offered by about 10 University professors and students. In addition to military systems, the software could be applied to telecommunication networks, online banking, and air traffic control. "The tool gives [a system administrator] a point of view that he doesn't have otherwise, and it's very powerful," asserts Army High Performance Computing Research Center director Vipin Kumar.
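    The article does not describe the system's internals, but the general shape of such a ranked summary can be sketched: score each machine by how far its activity deviates from the network-wide norm, then hand the administrator a list sorted from most to least suspicious. The host names and connection counts below are invented, and a real detector would use far richer features than a single count.

```python
from statistics import mean, stdev

def ranked_summary(conn_counts):
    """Rank hosts by a simple z-score of their connection counts."""
    counts = list(conn_counts.values())
    mu, sigma = mean(counts), stdev(counts)
    scores = {host: (n - mu) / sigma for host, n in conn_counts.items()}
    # Highest-scoring (most anomalous) hosts first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

observed = {"lab-01": 42, "lab-02": 39, "mail-gw": 45, "dorm-17": 980}
ranked_summary(observed)  # "dorm-17" surfaces at the top of the list
```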

  • "Flash Forward"
    CNet (03/27/03); Kanellos, Michael

    With flash memory chips expected to reach their physical limits within two years, Intel, Motorola, and other manufacturers are exploring alternative materials and designs, although their projected market impact is a matter of debate. Flash memory's ability to retain data in the absence of a power supply--thanks to a layer of silicon dioxide in the transistor--has made it a key feature in cell phones, digital cameras, and handheld computers, while Semico Research's Jim Handy expects flash revenue to nearly double between 2002 and 2003, and reach $43 billion by 2007. He believes flash revenue will keep pace with that of dynamic RAM (DRAM) by 2004, and outpace it by 2006. However, the silicon dioxide insulator is also flash's Achilles' heel--reducing its thickness below about 80 angstroms leads to electron leakage, and scalability difficulties for flash chips are likely to emerge when they transition to a 45-nm manufacturing process. To overcome such barriers, Intel is researching Ovonics Unified Memory, whose electrical resistance is determined by amorphous and crystallized areas on the chip, the result of heating and cooling the chip's chalcogenide substrate. In the meantime, Motorola is investigating silicon nanocrystals as a substitute for the silicon dioxide insulator; Motorola's Ko-Min Chang believes samples could become available to major clients in two years, with mass production introduced in mid-2005. Other technologies being considered include magnetic RAM (MRAM), polymer ferroelectric RAM (PFRAM), and ferroelectric RAM (FeRAM). However, Intel CTO Pat Gelsinger argues that no other memory technology is as yet capable of being mass-produced.

  • "Email Traffic Patterns Can Reveal Ringleaders"
    New Scientist (03/27/03); Muir, Hazel

    Hewlett-Packard researchers have devised a new method of analyzing the flow of email traffic for patterns that could reveal online communities and their leaders, and HP's Joshua Tyler says law enforcement officials could employ the technique to sniff out terrorists and other online criminals. The scientists used HP's research lab as a testbed, and were able to scope out various communities by mapping out connections between staff who had exchanged at least 30 emails with each other, while a computer algorithm searched for crucial links between separate groups. By comparing the members of each community with the company organization charts, the researchers discovered that 49 of the 66 outlined groups had members who all worked in the same department, while most of the remaining groups were organized around collaborative projects. In another test, the scientists plotted out the emails with an algorithm that attempts to frame the network with as few entanglements as possible. Tyler says the plot placed persons with the widest range of organizational contacts, usually the managers, in the center. "If the CIA or another intelligence agency has a lot of intercepted email from people suspected of being part of a criminal network, they could use the technique to figure out who the leaders of the network might be," he explains. Tyler acknowledges that privacy could become an issue, given how information is gathered and used.
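    The thresholding step described above can be sketched in a few lines: keep only the pairs who exchanged at least 30 emails, treat those pairs as edges in an undirected graph, and then rank people by how many distinct contacts their links reach. This degree-based ranking is a simplified stand-in for HP's actual algorithms (which the article does not detail), and the names and message counts are invented.

```python
from collections import Counter, defaultdict

def build_graph(emails, threshold=30):
    """Build an undirected graph from (sender, recipient) email records,
    keeping only pairs who exchanged at least `threshold` messages."""
    pair_counts = Counter(frozenset((s, r)) for s, r in emails)
    graph = defaultdict(set)
    for pair, n in pair_counts.items():
        if n >= threshold and len(pair) == 2:  # skip self-addressed mail
            a, b = tuple(pair)
            graph[a].add(b)
            graph[b].add(a)
    return graph

def central_people(graph):
    """Rank people by number of distinct heavy-traffic contacts."""
    return sorted(graph, key=lambda p: len(graph[p]), reverse=True)

emails = [("ann", "bob")] * 30 + [("ann", "cid")] * 35 + [("bob", "cid")] * 5
g = build_graph(emails)
central_people(g)  # "ann" ranks first; the bob-cid link falls below threshold
```

    HP's plotting algorithm went further, positioning the best-connected people (usually managers) at the center of the drawn network, but the counting-and-thresholding step above is where such analyses typically start.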

  • "HP Thinks in 3D for Web Browsing"
    InternetNews.com (03/25/03); Singer, Michael

    Hewlett-Packard has introduced a new tool for creating three-dimensional views of online stores, similar to Doom and other video games. Called the VEDA (virtual environment design automation) project, the application is used as a visualization database, and allows users to stroll through virtual rooms and corridors using their mouse. Items are arranged according to the user's chosen categories. The application was developed by HP Labs researchers Amir Said and Nelson Chang, who say the tool provides more interaction than a normal Web page, is visually appealing, and provides a little bit of the feel of a brick-and-mortar store. The back-end VEDA application is based on OpenGL and XML technologies, and supports robust audio, video, and 3D models that can be modified, says Chang. The researchers admit that the software's biggest hurdle is Internet bandwidth limitations. Said says lower speeds degrade the graphics and high-resolution photographs. But the tool could run on HDTV as well as the Web because of its advanced technologies, which are more developed compared to 2ce's CubicEye and other 3D Web browsers. HP Labs is now planning to test the system with such retailers as Wal-Mart.

  • "Wave Propagation"
    CommVerge (03/20/03); Miller, Matthew

    Innovations around the Wi-Fi standard 802.11 are still picking up pace, and World Wide Web Consortium founding member and IBM veteran John Patrick says Wi-Fi will emerge as a leading Internet access tool and leading mobile phone technology. In fact, Patrick believes that Wi-Fi is a technology in which "the reality exceeds the hype." Cellular carriers will have to adapt, he insists. Patrick believes Wi-Fi is a broader tool than the Internet while also tapping into the Internet, and predicts that global blanket Wi-Fi coverage will happen one day; such coverage will enable anytime/anywhere Internet access. He says, "Wi-Fi is to wireless as the Internet was to wired communications 10 years ago." Today, many notebooks have built-in Wi-Fi access, and one day it will be standard for handhelds. Patrick also notes that IP networks make no distinction between voice and data transmission, which means Wi-Fi devices can pick up email as well as conversations and spreadsheets. With Wi-Fi-enabled IP, long distance voice communication costs the same as short distance, which means Wi-Fi voice over Internet protocol (VoIP) devices will have a pricing advantage over cellular devices. The mesh network concept for Wi-Fi will extend Wi-Fi's power by using Internet access devices as routing devices and repeaters. Patrick says the emerging WirelessMAN 802.16 standard will further boost Wi-Fi's ubiquity and mesh networking by providing a high-bandwidth interface standard for linking Wi-Fi networks to fixed wireless systems and the Internet, making broadband wireless access available to remote areas.

  • "Are We Vulnerable to Cyber-Attacks?"
    Fortune (03/20/03); Kirkpatrick, David

    The current conflict with Iraq might spur more hacking and cyber-terrorism, forecasts David Kirkpatrick. U.K. consulting firm mi2g says that so far in 2003, confirmed digital attacks have caused $16 billion in losses, nearly double that of a year ago, and 64 percent of cyber attacks have been directed against North American entities, compared to 30 percent a year earlier. The nature of the cyber attacks has shifted as well, becoming more severe and sophisticated, says Jim Kollegger, CEO of BBX Technologies, a developer of security solutions for Windows networks. For example, hacker groups in Bulgaria and China have discovered flaws in Microsoft programs and are "creating toolkits to take advantage," so that even novice programmers can cause disruptions, Kollegger says. Protest hacking is also on the rise, and mi2g says hacking activities have emanated from Brazil, France, Indonesia, Mexico, Romania, Saudi Arabia, and other countries. Since protest hackers are focusing more on corporate targets rather than on the government or the armed forces, retired Lt. Col. James Emerson at ICG suggests multinational firms symbolizing America overseas need to boost their level of cyber-security. But most cyber attacks result from people familiar with a corporation such as former employees, consultants, and others with access, says mi2g, and such people need to be closely monitored. Finally, an Aberdeen Group study shows that open-source software such as Linux and Unix is as susceptible to cyber attacks as software from Microsoft, and Apple's new operating system, OS X, is vulnerable because of its reliance on IP and Unix utilities.

  • "ICANN Ready to Chart a New Course?"
    CNet (03/20/03); Pearce, James

    Incoming ICANN CEO Paul Twomey sees ICANN's primary mission as reaching out to all Internet stakeholders in order to involve them in the ICANN process, especially stakeholders from less-developed countries. ICANN will move forward with fulfilling its obligations under the U.S. Department of Commerce's memorandum of understanding to build itself globally. Twomey endorses keeping the domain name system a "public/private partnership," and believes that governments should be involved in DNS management when DNS policy overlaps with traditional areas of governmental purview. Twomey says the internationalized domain name (IDN) issue is complex not just technologically, but also commercially and linguistically; IDNs, the introduction of new top-level domains such as .health, and the adoption of IPv6 will be top challenges for ICANN. The new ICANN At-Large Committee will solicit advice from ordinary Internet users. In comparison, Twomey says the old mechanism of board elections was not a perfect gauge of community views because the election system can distort region-based results; North American representatives were elected with only a few thousand votes, while the Asia region saw millions of votes cast.

  • "Computing at the Atomic Scale--and Below"
    Business Week (03/17/03) No. 3824, P. 90

    In about 10 years it may not be possible for today's semiconductor components to continue to follow Moore's Law, but new computing techniques likely will allow engineers to keep improving performance dramatically. Spintronics researchers have already said that magnetic fields could manipulate the spin on a single electron, but work at the University of California at Santa Barbara and the University of Pittsburgh shows electric signals can work as well. Magnetic fields require fundamental changes in computer design, but using the same electric signals as in computer disk drives would be much easier to implement commercially. Another possibility is moletronics, computing based on single-molecule transistors. Donald M. Eigler, who previously led ground-breaking molecular work at IBM, is now heading a group building sets of carbon-monoxide molecules to perform logic functions. That circuitry is so small it would take four decades of improvements under Moore's Law for traditional chips to catch up.

  • "Cybersecurity Downgraded?"
    Washington Technology (03/24/03) Vol. 17, No. 24, P. 10; Wait, Patience

    The information technology industry has gone from having a cybersecurity czar in the White House to perhaps not having a cybersecurity representative in the Bush administration. After cybersecurity czar Richard Clarke left his post Feb. 21, President Bush signed an executive order to close the Critical Infrastructure Protection Board (CIPB) and move its responsibilities to the Department of Homeland Security. Although some IT experts acknowledge that folding the CIPB into the Department of Homeland Security allows the government to consolidate security activities, others point out that the law creating the new department does not mention cybersecurity and add that the agency's Web site still does not reveal who is responsible for IT issues. "The key question is does the issue of cybersecurity have a seat at the big table for policy questions," says Entrust's Dan Burton. And with cybersecurity losing its stature as a policy issue, some observers see the administration weakening the ties it established with companies and organizations that were willing to provide assistance in the event of a cyberattack. Although the new House Select Committee on Homeland Security has created a subcommittee for cybersecurity, science, and research and development, IT experts are still concerned about the closing of the CIPB.

  • "Snags Remain as Grid-Lock Eases"
    InformationWeek (03/24/03) No. 932; Ricadela, Aaron

    Grid computing could gain acceptance this year as new software makes the technology easier to use and implement and helps companies take advantage of excess computing power. WorldCom, for example, is investigating grid computing services as a way to generate new streams of revenue, and is in talks with the major IT vendors IBM, Hewlett-Packard, and Sun Microsystems concerning the technology. WorldCom's Darryl Shaw says his company wants to develop grid computing solutions that customers use to better manage their data centers. The other grid player WorldCom is talking to is the Globus Project, a collaboration of academic researchers who have been working on grid computing standards for some time. The group recently released the Globus Toolkit 3, and IBM is using a version of that toolkit to help Charles Schwab harness excess Web server capacity. The financial services firm wants to gain competitive advantage by analyzing clients' portfolios in just seconds, using computing power tapped from the grid. Companies see grid technology as a way to more efficiently use their IT resources during pressing economic times. With grid computing, the burden is shared evenly among servers and other IT components. Still, many of the grid technologies currently available are not too far removed from the academic setting, which deals with static batches of data instead of the more real-time, fluid data businesses need to leverage. The Global Grid Forum group plans to ease these adoption burdens with its Open Grid Services Architecture (OGSA). IT vendors are already building the Web services extensions into their software, and the expectation is that OGSA will allow grids to identify OGSA-equipped computers easily over the Internet just as the TCP/IP Internet stack allows commonality today.
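    The even spreading of work the article describes is, at its core, a scheduling problem: send each incoming job to whichever server currently carries the least load. The sketch below is illustrative only (it is not any vendor's or the Globus Toolkit's API), and the job names, costs, and server names are invented.

```python
import heapq

def schedule(jobs, servers):
    """Greedy least-loaded scheduling: each (job, cost) goes to the
    server with the smallest total assigned cost so far."""
    heap = [(0, name, []) for name in servers]  # (load, server, jobs)
    heapq.heapify(heap)
    for job, cost in jobs:
        load, name, assigned = heapq.heappop(heap)  # least-loaded server
        assigned.append(job)
        heapq.heappush(heap, (load + cost, name, assigned))
    return {name: assigned for load, name, assigned in heap}

schedule([("risk-report", 5), ("portfolio-scan", 3), ("backtest", 4)],
         ["hp-1", "ibm-1"])
```

    Real grid middleware must additionally handle heterogeneous machines, failures, and data movement, which is where toolkits such as Globus and standards such as OGSA come in.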

  • "Right Game, Wrong Team"
    Software Development (03/03) Vol. 11, No. 3, P. 30; Kerievsky, Joshua

    The test-driven approach followed by extreme programming (XP) teams can be applied to management, which often overlooks the important process of articulating financial and organizational objectives to the teams, resulting in software that misses these targets. In addition to boosting team confidence of the project's development, test-driven management solidifies connections between managers, customers, and programmers. Ideally, a management test should state in a binary way a measurable, time-limited external goal that is achievable and relevant without specifying how that goal should be reached--and in plain English. The test should avoid creating a climate of fear among the XP teams, nor should it set a deadline for specific features to be delivered, a task best left to Release Planning. The test should feature a charter that strikes a balance between a project's organizational objectives and available resources. Management tests can either be external, focusing on such variables as customer participation, team agility and productivity, and quality of release; or internal, with a concentration on factors such as knowledge transfer and learning and team satisfaction with XP. In a recent project, XP consultant Industrial Logic kept a company's management abreast of the project's progress by posting the tests in the XP team's work area. Even failing a management test can benefit the team, as it yields important data that could be used to re-evaluate test metrics.
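    The criteria above--binary, measurable, time-limited, and silent on how the goal is reached--can be made concrete with a small sketch. The metric, target, and dates below are invented for illustration; the shape of the check is what matters.

```python
import datetime

def management_test(metric, target, deadline, today, value):
    """A management test in the style described: pass iff the external,
    measurable goal is met by the deadline. It says nothing about how
    the team should get there."""
    return today <= deadline and value >= target

# Example: "At least 5 paying customers are using release 2.0 by June 30."
management_test(
    metric="paying customers using release 2.0",
    target=5,
    deadline=datetime.date(2003, 6, 30),
    today=datetime.date(2003, 5, 15),
    value=7,
)
```

    As the article notes, even a failing result is useful: it is data for re-evaluating the metric or the charter rather than a verdict on the team.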
