
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 486: Wednesday, April 23, 2003

  • "Internet Is Losing Ground in Battle Against Spam"
    New York Times (04/22/03) P. A1; Hansell, Saul

    Despite the push to eradicate unsolicited commercial email, spammers still have the upper hand, battling automated filters and other antispam measures with methods that range from simply rewording spam messages to hide their content to masking their point of origin by transmitting messages along labyrinthine routes. The proliferation of spam is outpacing the development of more advanced antispam technologies as well as legal attempts to crack down on mass emailers; AOL reports that spam accounts for over 70% of the email its 35 million subscribers receive, and estimates it now processes approximately two billion spam messages every day. Becoming a spammer is relatively simple and cheap: All that is required is a mailing list and a computer hookup. Many spammers claim they have users' permission to send spam, often through an opt-in policy, but antispam advocates dispute this assertion. Another tactic spammers use is to send email in bulk to addresses culled by software robots that troll through chat rooms, message boards, and Web sites; some of these techniques are easy to detect and can be effectively blocked by ISP systems, while others can be identified through bogus "honeypot" email accounts. However, spammers have responded to these strategies by "morphing" their messages via odd spellings, the addition of random words and characters, or the use of HTML format (see the filtering sketch after this item). Exacerbating the situation is the blurring line between spam and legitimate email--blacklisting computers and addresses known to be used by spammers can often lead to false positives, because crafty spammers can disguise their origins by commandeering undefended systems. Big marketers want to institute a "white list" of approved email senders, while the U.S. Senate is considering antispam legislation that would outlaw many deceptive email practices.
    http://www.nytimes.com/2003/04/22/technology/22SPAM.html
    (Access to this site is free; however, first-time visitors must register.)
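
    The following sketch (in Python, not from the article) illustrates the cat-and-mouse game described above: a literal keyword filter misses a "morphed" message, while a simple normalization pass that strips HTML, undoes common look-alike character swaps, and removes inserted punctuation still catches it. The keyword list and character substitutions are invented for illustration.

    import re

    # Hypothetical keyword list a naive filter might use.
    BLOCKED_TERMS = {"viagra", "mortgage", "winner"}

    def normalize(text):
        """Undo common 'morphing' tricks: HTML markup used to split words,
        look-alike characters, and punctuation inserted between letters."""
        text = re.sub(r"<[^>]+>", "", text)                       # drop HTML tags
        text = text.lower()
        text = text.translate(str.maketrans("013@$", "oieas"))    # look-alike swaps
        return re.sub(r"[^a-z]", "", text)                        # strip everything else

    def looks_like_spam(message):
        flat = normalize(message)
        return any(term in flat for term in BLOCKED_TERMS)

    # The raw text defeats a literal keyword match, but not the normalized check.
    sample = "V1AGRA at <b>l o w</b> pr1ces, you are a W.I.N.N.E.R!"
    print(looks_like_spam(sample))   # True

    Flattening away all spaces is crude and produces false positives of its own, which is part of why the article notes that filters keep losing ground.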

  • "Is Open Source Apple's Salvation?"
    NewsFactor Network (04/21/03); Brockmeier, Joe

    Although Apple's OS X operating system is based on Darwin, an open-source OS derived from BSD, and the core of its Safari Web browser is the KHTML rendering engine, this does not mean that Apple is wholly embracing open source. Only certain components of OS X and Safari are available under open-source licenses, raising the question of whether the company's adoption of open source truly represents a change in tactics. Apple's Brian Croll characterizes open source as "absolutely core to Apple strategy," while International Data's (IDC) Dan Kusnetzsky maintains that open source will enable the company to dedicate its research and development budget to "things that are unique to them," and is a good move from a public-relations standpoint. Less publicized is the fact that OS X's BSD core has an excellent track record in terms of security. Apple can also exploit a plethora of server and desktop applications that would be out of reach with a proprietary OS, including MySQL, Apache, Perl, Samba, and the GNU Compiler Collection. "They have a wealth of development tools and middleware that sit on top of either BSD or Linux which they can take advantage of without having to develop that themselves," notes Kusnetzsky. IDC's Roger Kay doubts that Apple's open-source strategy will significantly impact its market share or momentum. Kusnetzsky says that Apple could benefit by offering the bridge between Windows and Linux that many people desire.
    http://www.newsfactor.com/perl/story/21318.html

  • "Facing Up to the Threat From Cyber Terrorism"
    Financial Times (04/23/03) P. 9; Hunt, Ben

    Internet Security Systems CEO Tom Noonan says the United States' unparalleled reliance on computer automation has not only made it the most productive country in the world, but also the most vulnerable to cyberattacks. It is reasonable to assume that terrorist organizations such as al-Qaeda could resort to tools and methods that hackers such as the teenage miscreant "Mafia Boy" have used to cripple and shut down businesses and essential infrastructure services. Noonan is convinced that America's foes are engaged in a "quiet, deliberate, calculated compromise" of the nation's information networks. He says the U.S. government has recruited his company and other firms to help shield the national infrastructure because 75% to 80% of that infrastructure is controlled by the private sector. As a member of President Bush's National Infrastructure Advisory Council, Noonan is helping to develop and evangelize a security strategy that addresses immediate threats and can be refined to accommodate future requirements and threats. He says cybersecurity for the U.S. government and other entities is sorely lacking, even though many organizations have come to realize its importance. "This is one area where government needs to be progressive and not a laggard," Noonan insists. He calls the dynamic threat protection agent the most effective cybersecurity solution; such software not only constantly scans for threats, but enforces a usage policy so that behavior and interaction are appropriate and non-threatening, and automatically responds to threats and furnishes a report on its response strategy.
    http://search.ft.com/search/article.html?id=030423000828

  • "Report: College Grads Will Suffer From High-Tech Job Slowdown"
    Computerworld (04/21/03); Rosencrance, Linda

    Companies are devoting less money to technology investments, so hiring large numbers of high-tech college graduates is less of a priority, concludes a new Challenger, Gray & Christmas report. "Because of major cost-cutting, companies are not updating their technology as quickly," says Challenger CEO John Challenger. "What they updated at the time of Y2K will hold them for a while, at least until the economy turns up." He adds that new software is easier to use, allowing less computer-savvy employees to become proficient fairly quickly. Entry-level workers will find securing a job especially difficult because companies are chiefly pursuing candidates who have prior IT experience and are already adequately trained, Challenger says. Graduates who garner real-world experience by participating in internships or cooperative education programs will have an advantage when they hit the job market. "There will be problems for those students who just have a degree and get out and look for a job," predicts Resulte Universal CEO Ray Kurzius.
    Click Here to View Full Article

  • "Tech Forum Tackles Big Ideas"
    Wired News (04/23/03); Kahney, Leander

    Sci-fi author Cory Doctorow, one of the organizers of this week's O'Reilly Emerging Technology Conference, describes the event as "a three-day jam session for geeks," with its overarching theme being the future of technology. Attendees will be able to discuss tech topics with leading figures such as futurist Howard Rheingold and Amazon CEO Jeff Bezos at the conference, now in its second year. The conference's chief currency is ideas, so there will be no companies plugging or showcasing their latest products. The majority of the conference's discussions, seminars, and tutorials, covering subjects that range from competitive coding to virtual worlds, will be presented by newcomers to the technology scene--representatives of what O'Reilly and Associates CEO Tim O'Reilly calls the "alpha geeks," the hardworking tech enthusiasts and researchers who are the real force behind tech innovation. Major areas of concentration at the event will include social software, rich Internet applications, nanotechnology and hardware, and "untethered" and "emerging topics." Rheingold will talk about how the proliferation of cell phones and peer-to-peer connected computers will affect society, while personal computing expert Alan Kay will be on hand to discuss why computer-savvy kids are essential to the furtherance of the technology. Swarm intelligence expert Eric Bonabeau will lead a discussion about what the business sector can learn from social insects.
    http://www.wired.com/news/technology/0,1282,58588,00.html

  • "Machine-to-Machine Integration: The Next Big Thing?"
    InformationWeek (04/14/03); Zetie, Carl

    The potential advantages of machine-to-machine integration for conventional IT organizations include lower costs, better responsiveness, improved efficiency, a tighter supply chain, and even new business models, writes Forrester Research analyst Carl Zetie. The convergence of enabling technology advances, such as falling costs and standardization, may be making machine-to-machine integration an increasingly lucrative choice for enterprises. The business case for machine-to-machine integration is becoming stronger as GPS, cell phone-based location technologies, radio frequency identification transponders, and wireless technologies start to penetrate the mainstream. Such technologies are already being used to automate toll collection and the identification of toll violators, and to allow shoppers to use self-service checkouts. Initial enterprise applications include supply chain and inventory optimization, and the reduction of shrinkage. Other possible uses of machine-to-machine integration include interconnected and totally automated building-management systems. However, Zetie notes that machine-to-machine integration's ability to eliminate human intervention is generating resistance to the concept. Overcoming this reluctance will be key to the technology's adoption.
    Click Here to View Full Article

  • "Snaky Tape May Enliven Computer Interactions"
    NewsFactor Network (04/21/03); Martin, Mike

    A research team led by Ravin Balakrishnan of the University of Toronto has developed "ShapeTape," a flexible tool that can be used with specialized software to build computer-generated shapes. Calling ShapeTape a revolutionary form of computer interaction, Balakrishnan says, "It moves away from the 'one-size-fits-all' keyboard-and-mouse paradigm to more specialized tools for specialized tasks." The instrument consists of a spring steel cord encapsulated in a long rubber ribbon studded with fiber-optic sensors, and is used in conjunction with a foot pedal. The shapes on the computer screen deform in response to how the operator, using both hands, twists and bends the ShapeTape (a rough illustration of turning such bend readings into a curve follows this item). "The functionality of the ShapeTape is part of a much larger development that is taking place--the deployment of intelligent devices in the physical world around us to create so-called 'smart spaces,'" comments UCLA computer science professor Leonard Kleinrock. He adds that smart-space development will help endow the environment with an awareness of who and what resides in physical areas that are shared with Internet-connected devices. Balakrishnan expects at least several years to pass before ShapeTape technology is ready for commercialization, and notes that graphic and industrial designers could use the tool for the design and refinement of technical drawings for various products. ShapeTape research is detailed in ACM's Computer-Human Interaction Letters, Vol. 5, Issue 1.
    http://www.newsfactor.com/perl/story/21314.html
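
    The article does not describe ShapeTape's software interface; as a rough, invented illustration of how a chain of bend sensors along a ribbon can be turned into an on-screen curve, the Python snippet below integrates per-segment bend angles into 2D points. The real device also reports twist and operates in 3D.

    import math

    def curve_from_bend_readings(bend_angles_deg, segment_len=1.0):
        """Accumulate per-segment bend angles into 2D points tracing the ribbon."""
        x, y, heading = 0.0, 0.0, 0.0
        points = [(x, y)]
        for bend in bend_angles_deg:
            heading += math.radians(bend)       # each sensor adds to the ribbon's direction
            x += segment_len * math.cos(heading)
            y += segment_len * math.sin(heading)
            points.append((round(x, 3), round(y, 3)))
        return points

    # A constant bend at every sensor produces an arc of points.
    print(curve_from_bend_readings([10] * 8))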

  • "Will Code for Food"
    CNet (04/22/03); Bowman, Lisa M.

    The current hiring atmosphere for technology professionals is bleak, if Silicon Valley tech job fairs are any indication. Whereas most attendees of such events were gainfully employed at the height of the tech boom, today the majority are unemployed; the U.S. tech industry has experienced over 500,000 layoffs in two years. The number of hiring firms represented at such conferences has also shrunk--in fact, only 30 companies purchased exhibition space at last week's BrassRing event in Santa Clara, whereas over 500 were present three years ago. Attendees report that the BrassRing conference was crowded with applicants, some of whom waited as long as an hour in line for the opportunity to talk to a recruiter or add their resumes to a large pile. The dot-com implosion has raised the stock of defense and government contractors considerably: Defense firms were offering the most jobs at BrassRing, and conference spokesman Mike Jurs notes that a defense industry position is thought to be very stable. Experts argue that applying for a job online is no substitute for networking; a study conducted by DBM ranked Net surfing as the No. 4 source of new jobs, while networking was No. 1. Still, the relative cheapness of online recruiting means that companies will continue to exploit the method, while Jurs thinks job seekers should use a two-pronged search approach of both networking and Web surfing. Meanwhile, Monster.com Chairman Jeff Taylor believes the most effective job-search strategy involves a full-time commitment, constant resume updates, and applying right after finding an opening.

  • "Military Academies Face Off in Blunting Cyberattacks"
    Associated Press (04/22/03); Hill, Michael

    Computer experts in the top three U.S. military academies as well as the Coast Guard and other agencies participated in the third annual Cyber Defense Exercise last week in order to evaluate the military's ability to wage "network-centric warfare" as well as defend critical systems and information from enemy hackers. Cadets at the participating academies squared off against hackers from the National Security Agency (NSA), whose challenge was to penetrate the schools' Internet firewalls and find security flaws. The NSA awards a trophy to the academy with the most effective cyber-defenses. In the last four years, "information assurance" warfare at West Point has grown from nonexistent to essential, notes Lt. Col. Daniel Ragsdale, director of the academy's Information and Technology Operations Center. He also says that soldiers no longer consider a lack of technology skills to be a point of pride. Dan Goure of the Lexington Institute, a military think tank, says that modern information warfare involves spreading bogus information and infiltrating enemy networks as well as disabling them. He says, "We're doing network attacks, we are hacking into email systems of adversaries."
    Click Here to View Full Article

  • "Planning for the Next Cyberwar"
    Wired News (04/18/03); Borin, Elliot

    The U.S. victory in Iraq validates the concept of network-centric or digital warfare, which is poised to grow in scope and sophistication in anticipation of future conflicts. Part of the Pentagon's $500 billion budget for 2004 will be allocated for network-centric warfare research and development, which will be conducted by university and defense contractor laboratories and federal agencies such as the Defense Advanced Research Projects Agency (DARPA). "The end of the Cold War has produced an arena where threats are amorphous and evasive [and] not easy to attack," notes Dr. Allan Steinhardt of DARPA's Information Exploitation Office, who adds that accurate and unequivocal battlefield data "has never been more important." Among the initiatives to be funded by the 2004 Pentagon allocation are Blue-Force Tagging, a sensor data aggregation measure designed to more clearly distinguish friend from foe and reduce incidents of friendly fire; the Forester Project, which will employ low-frequency radar and slow-moving rotorcraft to keep track of troops and gear hidden by thick jungle foliage; Jigsaw, which will be able to holographically represent the information collected by Forester; SP-3D, which will increase Jigsaw's range and enable planners to apply simulations to the holograms to refine attack strategies; and the Army's Future Combat System, an integration of manned and unmanned ground units and drone aircraft that can network with troops and enable them to carry out missions faster, more effectively, and at reduced cost. However, network-centric warfare pioneer and researcher Dr. John Arquilla warns that such systems could be compromised by hackers. To prepare for such a contingency, the Pentagon is building systems that gradually degrade in response to cyberattacks, as well as servers capable of automatic repair and reconfiguration.
    http://www.wired.com/news/technology/0,1282,58422,00.html

  • "Next Mac OS X Puts User at the Center"
    eWeek (04/21/03); Rothenberg, Matthew; Ciarelli, Nick

    Apple is readying its new Mac OS X 10.3, or "Panther," expected to ship in September; the system is reported to include many personalization features, collectively called User at the Center, that promise to be competitive with those in development at Microsoft. User at the Center features include the ability to shift home directories for remote access and the ability to switch user logins without closing applications, similar to Windows XP's Fast User Switching capability. Sources close to development also say Apple will include a "piling" GUI technique for managing files: Files are represented in miniature and piled together on the screen, and users can separate piles based on content as well as automatically assign files of certain content types to piles. Microsoft's Longhorn operating system is expected to be ready by 2005 and will incorporate user-centric capabilities as well. Panther is also expected to include journaling capabilities, first released in the Mac OS X 10.2.2 server version, which allow data to be rebuilt from a journal log in case of a crash (the general write-ahead idea is sketched after this item). Because the Worldwide Developers Conference 2003 edition has been moved from May to June, Apple watchers speculate the company may unveil new Mac systems based on IBM's new PowerPC 970 64-bit processor, to be shipped with Panther in September. Panther will support 64-bit processing, according to sources.
    http://www.eweek.com/article2/0,3959,1036533,00.asp
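
    The journaling mentioned above follows the general write-ahead pattern: intended changes are logged before they are applied, so an interrupted update can be replayed after a crash. The toy Python sketch below shows only that general pattern; it is not a description of Apple's HFS+ implementation.

    journal = []    # pending log entries (kept on disk by a real file system)
    data = {}       # the "on-disk" data

    def write(key, value):
        journal.append((key, value))   # 1. record the intent first
        data[key] = value              # 2. then apply the change
        journal.pop()                  # 3. clear the entry once it is safely applied

    def replay_after_crash(journal, data):
        """Re-apply logged changes that may not have reached the data."""
        for key, value in journal:
            data[key] = value
        journal.clear()
        return data

    write("notes.txt", "hello")        # a normal, uninterrupted write
    print(data)                        # {'notes.txt': 'hello'}

    # Simulate a crash that happened between steps 1 and 2 of another write:
    journal.append(("report.txt", "draft v2"))
    print(replay_after_crash(journal, {"report.txt": "draft v1"}))
    # {'report.txt': 'draft v2'}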

  • "Artificial Intelligence Scopes Out Spam"
    Network World (04/14/03) Vol. 20, No. 15, P. 29; Strickler, Dave

    Spammers currently have the upper hand because they are always probing email filtering solutions for vulnerabilities, but artificial intelligence mail-filtering software that uses natural-language processors could outpace their rate of adaptation. Although it is unlikely that a single system will be able to block all spam, AI methods have a higher rate of success than any other technique. The natural-language algorithms of AI-based mail-filtering software deconstruct each email into sentences and deduce their meaning by studying fundamental elements, such as keywords, in the reverse order from which they were broken down. The software, which resides on an application service provider's network or outside a firewall, is programmed to accept all incoming email traffic, tagging suspected spam and routing it into a quarantine area (a simplified scoring-and-quarantine sketch follows this item). From there, a human administrator can check whether a message truly is junk email to be deleted or a legitimate message, mistaken for spam, that should be passed on. The AI software also employs transmission-pattern techniques that consider the time the email messages were sent, their origin, and the person who sent them in order to further clarify whether messages are legitimate.
    http://www.nwfusion.com/news/tech/2003/0414techupdate.html
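
    As a rough sketch of the tag-and-quarantine workflow described above (not the vendor's actual algorithm), the Python snippet below scores a message with crude sentence-level phrase checks plus invented "transmission pattern" hints, and routes anything above a threshold to quarantine for an administrator to review. All rules and thresholds are made up for illustration.

    from datetime import datetime

    SUSPECT_PHRASES = ["act now", "free money", "no risk"]   # invented examples

    def score_message(sender, sent_at, body):
        score = 0
        sentences = [s.strip().lower() for s in body.split(".") if s.strip()]
        for sentence in sentences:                     # crude sentence-level pass
            score += sum(phrase in sentence for phrase in SUSPECT_PHRASES)
        if 1 <= sent_at.hour <= 4:                     # odd sending time
            score += 1
        if sender.endswith(".invalid"):                # suspicious origin
            score += 2
        return score

    def route(sender, sent_at, body, threshold=2):
        """Tag suspected spam and hold it for a human administrator."""
        if score_message(sender, sent_at, body) >= threshold:
            return "quarantine"    # the administrator later deletes or releases it
        return "inbox"

    print(route("promo@offers.invalid", datetime(2003, 4, 23, 3, 0),
                "Act now. Free money with no risk."))   # quarantine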

  • "The Web's Next Leap"
    Computerworld (04/21/03) Vol. 37, No. 16, P. 34; Thibodeau, Patrick

    Tim Berners-Lee, inventor of the World Wide Web and leader of the World Wide Web Consortium (W3C), believes the Semantic Web is the next step in Web technology; he describes it as "webbing" the traditional relational database so that companies' back-end systems can read and exchange data with some intelligence. The Semantic Web essentially infuses data with tags computers can understand, allowing them to complete tasks automatically without specific pre-programming. An airline reservation system would be able to communicate with a customer's personal calendar to avoid scheduling conflicts, for example (a toy triple-based version of this scenario appears after this item). Experts say the precise types of applications that may arise out of the Semantic Web are difficult to predict, just as many did not foresee what the Web itself would spawn. But for companies, one immediate benefit that can be realized even now is the ability to integrate heterogeneous data stores. The W3C is working on Semantic Web standards such as the Resource Description Framework (RDF), which allows groups to define ontologies, the metadata vocabularies used within different industries. XML is another piece of the Semantic Web developed at the W3C. Some firms are already targeting customers that want to deploy Semantic Web technology for their own use while preparing for wider use in the future. Celcorp, for example, helps businesses link databases and applications without the creation of specific interfaces between components; using Semantic Web concepts and technologies, Celcorp creates a "semantic learner" for the company that analyzes how applications are used and then builds a new application with component features from existing applications.
    Click Here to View Full Article
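
    The airline-and-calendar example above comes down to expressing data as machine-readable subject-predicate-object triples, as RDF does, so that independent systems can merge and query each other's data without custom interfaces. The Python sketch below uses plain tuples and an invented vocabulary purely to illustrate the idea; real Semantic Web applications would use RDF serializations and shared ontologies.

    calendar_triples = [
        ("alice", "hasMeeting", "board-review"),
        ("board-review", "startsOn", "2003-05-02"),
    ]
    airline_triples = [
        ("flight-212", "departsOn", "2003-05-02"),
        ("flight-214", "departsOn", "2003-05-03"),
    ]

    def objects(triples, subject=None, predicate=None):
        """Return the objects of all triples matching the given subject/predicate."""
        return {o for s, p, o in triples
                if (subject is None or s == subject)
                and (predicate is None or p == predicate)}

    # A booking agent merges both data sets to avoid a scheduling conflict.
    busy_days = set()
    for meeting in objects(calendar_triples, "alice", "hasMeeting"):
        busy_days |= objects(calendar_triples, meeting, "startsOn")

    for flight, predicate, day in airline_triples:
        if predicate == "departsOn" and day not in busy_days:
            print("Suggest", flight)    # prints: Suggest flight-214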

  • "September 11 Information Failures: A Semiotic Approach"
    IT Professional (04/03) P. 64; Desouza, Kevin C.; Hensgen, Tobin

    The useful exploitation of collected data will depend on the integration of information-gathering systems and associative applications, write Kevin C. Desouza of the University of Illinois and Tobin Hensgen of Loyola University, who believe a semiotic approach is a feasible strategy, one the U.S. could have used to better prepare for the Sept. 11 terrorist attacks. They argue that data-gathering efforts must focus on necessary, universal information, upon which data-gathering agencies cannot impose their own reality. The semiotic model breaks down into five levels, each of which the authors illustrate with data gathered prior to the Sept. 11 attacks. The first level, morphological, includes general data, not all of which may be necessary: Examples include intelligence reports of an increase in communications among Islamic extremist groups between December 2000 and March 2001. The empirics level filters data without critically assessing it, so as to increase its significance and give it value; Sept. 11-related instances include a June 2001 memo from an FBI agent raising the possibility that al-Qaeda could be training pilots to fly planes for terrorist purposes. The syntactics level of the semiotic model is where collected data is interpreted to determine relationships: For example, an analytical memo written by an FBI agent just before the attacks theorized that Islamic terrorists could crash a plane into the World Trade Center, based on the case of "20th hijacker" Zacarias Moussaoui. The fourth level, semantics, focuses on measuring system uncertainty in order to infer higher-level meaning, which could have enabled the U.S. to respond to the attacks in a less reactive way, thus minimizing casualties. The pragmatics level yields actionable information, and involves debriefing members associated with each semiotic level as to how well they contributed to action decisions.
    http://www.computer.org/itpro/it2003/f2064.pdf

  • "Writing Software Right"
    Technology Review (04/03) Vol. 106, No. 3, P. 26; Roush, Wade

    Microsoft, Sun Microsystems, and IBM are trying to retool software engineering and eliminate "laissez-faire" attitudes toward software design that can allow potentially major bugs to slip through the testing process, while also saving programmers time, money, and debugging headaches. Microsoft is developing a more flexible iteration of the Prefix bug-finding program, which uses a defined list of common semantic errors to scan code for matching patterns. The new version can convert code into abstract structures that make it easier to spot flaws, and also allows company programmers to build personal plug-ins that search for software-specific errors. Meanwhile, Sun's Jackpot project is designed to circumvent problems that existing software-writing tools suffer from, such as the inability to detect larger structural flaws, says Jackpot group leader Michael Van De Vanter. His team is developing an "analysis engine" that can read and abstract a programmer's code into an internal software model, thus giving programmers feedback in real time. The Jackpot group is also working on visualization technology that can display nested structures as easy-to-read tables, maps, and text, as well as a debugger that can distinguish good programming from bad. Microsoft's experimental bug-finder, Slam, is programmed to identify any contravention of general programming rules by investigating all potential routes a program's execution may follow. IBM, meanwhile, has made a sizeable investment in Eclipse, a nonprofit network of open-source software writers who design software democratically.
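
    A miniature version of the pattern-scanning idea described above can be shown with Python's standard ast module: parse source code into an abstract structure, then look for shapes that match known mistakes. The single rule below (mutable default arguments) is an invented stand-in for the long lists of patterns that tools like Prefix maintain.

    import ast
    import textwrap

    SOURCE = textwrap.dedent('''
        def add_item(item, bucket=[]):   # classic bug: the default list is shared
            bucket.append(item)
            return bucket
    ''')

    def find_mutable_defaults(source):
        """Report functions whose default arguments are mutable containers."""
        findings = []
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.FunctionDef):
                for default in node.args.defaults:
                    if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                        findings.append("line %d: '%s' has a mutable default argument"
                                        % (node.lineno, node.name))
        return findings

    print(find_mutable_defaults(SOURCE))
    # ["line 2: 'add_item' has a mutable default argument"]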

  • "Recycling Not Easy for PC Makers"
    CNet (04/22/03); Fried, Ian

    PC manufacturers have started to study product recycling processes in detail, and are finding both pluses and minuses: They are learning how difficult it is to remove and dispose of certain materials, such as mercury filaments in scanners, but they are also learning how to design greener products as a result. Hewlett-Packard is one PC maker that has started taking back obsolete equipment, and Silicon Valley Toxics Coalition executive director Ted Smith says that if such practices become industrywide, then more environmentally friendly products will emerge. Smith's organization recently released a report that gave U.S. computer manufacturers a low grade for hazardous waste removal, while Japanese companies received high marks. With electronics recycling legislation looming, many companies are prioritizing end-of-life issues, instituting new programs, and giving customers incentives for recycling their discarded equipment. These companies' recycling programs are particularly successful when business goals are aligned with environmental agendas, and vice versa--for instance, Dell's simplified component integration has not only boosted efficiency and cut costs, but made its products easier to recycle. John Birkitt of HP's Design for Environment initiative admits that certain materials may make a product more biodegradable, but more power-consumptive at the same time. Dell's Don K. Brown says hazardous materials may be included as a trade-off for avoiding even more toxic compounds. For example, flat-panel displays are highly efficient and lead-free, although their backlighting systems include a small portion of mercury. Smith notes that one of the more formidable challenges PC makers face is designing products that do not use so many hazardous materials.
    http://news.com.com/2100-1041-997755.html

  • "The Grid: Computing Without Bounds"
    Scientific American (04/03) Vol. 288, No. 4, P. 78; Foster, Ian

    Grid computing is expected to "virtualize" general computational services and make processing, storage, data, and software so ubiquitous that computing will seem like just another utility. An extension of the Internet, grid computing melds computer systems through high-speed networks so that people can avail themselves of data-crunching capabilities and resources otherwise inaccessible from single or even multiple computers; shared languages and interaction protocols would give grid systems worldwide reach. Grid technology applications include large-scale scientific and business ventures between members of virtual organizations (VOs), remote experimentation, and high-performance distributed computing and data analysis. A pervasive computing grid would, for instance, enable e-commerce enterprises to customize information and computing systems according to demand while maintaining their connections to partners, suppliers, and customers; give physicians the ability to remotely access medical records for fast diagnosis; accelerate drug candidate screening; and allow civil engineers to test earthquake-proof designs much faster. Businesses are enthusiastic about grid computing because it promises to relieve them of the time and money spent installing, upgrading, and maintaining private computer systems that are often incompatible; the expected results are improved security, reliability, and economies of scale for producers, better resource optimization for distributors, and new remotely powered devices and applications for consumers. Argonne National Laboratory's Globus Project, one of the earliest grid computing efforts, involved software that connected far-flung systems into a VO scheme by standardizing ID authentication, activity request authorization, and other key processes. Its success and subsequent development have inspired work on other grid technology projects, such as the National Technology Grid. Grid computing can only succeed if it is widely adopted, and one way of ensuring this is to make the core technology freely available as well as easily and openly deployable.
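
    Grid middleware itself is out of scope here, but the core pattern the article describes (splitting a large job into independent pieces, farming them out to whatever compute resources are available, and gathering the results) can be sketched on a single machine with Python's standard library. The "drug-candidate screening" below is a made-up placeholder; real grids built on software such as Globus add cross-organization authentication, authorization, and resource discovery.

    from concurrent.futures import ProcessPoolExecutor

    def screen_candidate(compound_id):
        """Stand-in for one independent unit of scientific work."""
        return compound_id, (compound_id * 37 % 101) / 100.0   # placeholder score

    if __name__ == "__main__":
        candidates = range(1, 1001)
        with ProcessPoolExecutor() as pool:        # stand-in for grid worker nodes
            results = list(pool.map(screen_candidate, candidates))
        best = max(results, key=lambda pair: pair[1])
        print("Most promising compound:", best)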

 
                                                                             