Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 702:  Monday, October 4, 2004

  • "Next Big Thing: The Web as Your Servant"
    USA Today (10/01/04) P. 1A; Maney, Kevin

    The next big advance in the Internet, which some experts are calling the world network or Internet 2.0, is a far more interconnected communications network using wired and wireless networks, satellites, and new hardware and software to fundamentally change the way people interact with information. Instead of users seeking out information, data will follow users, producing a services-oriented world network that will track people and serve their needs. For example, the basic technology is in place to allow personalized travel software to alert someone when they need to leave the house in order to make a flight, based on traffic and flight time data automatically gathered online; if rebooking is needed, the travel software would be able to send emails to immediate family members and perhaps the rental car company at the destination. With the aid of complementary technology such as Wi-Fi, radio frequency identification (RFID), and the Global Positioning System (GPS), the Web will be able to deliver services far more powerful than those envisioned by Web services proponents. Web inventor Tim Berners-Lee is also working toward such a future, and says the Web will only achieve its full potential if automated tools are in place to share and process information. But the capabilities that are now within reach got here mostly by accident, as key technologies such as GPS were developed for reasons other than providing location-specific Web services. And while GPS provides a user's location, RFID puts items on the network so that services know what is available. Internet pioneer Marc Andreessen says the economics of the Internet have dramatically improved as well, since Web startups today can buy technology more cheaply, tap more advertising dollars, and draw on a larger pool of customers who are more willing to buy online.
    Click Here to View Full Article

  • "Genome Model Applied to Software"
    Wired News (10/04/04); O'Brien, Danny

    Open-source developers are looking to genomics research in an effort to reverse-engineer the secrets of private networking software, as explained by security analyst Marshall Beddoe at the recent ToorCon security and hacker conference in San Diego. He described his "Protocol Informatics" technique, which applies algorithms drawn from genomics to protocol analysis in order to simplify the reverse-engineering of closed, proprietary software. Protocol reverse-engineering involves analyzing many separate computer conversations, scanning for repeating patterns and determining their meaning, a process very similar to genomic research's search for DNA sequences separated by long gaps of unknown data. Beddoe used bioinformatics algorithms to remove "junk" data among patterns of repeated commands in much the same way that genomics researchers use them to eliminate the gaps between DNA sequences. Biologists have also created algorithms that ascertain whether two pieces of DNA share common ancestry by comparing genetic variations with established mutation rates of specific DNA elements, and Beddoe used these programs to find related fields in network exchanges. In addition, the security analyst was able to render Microsoft's SMB protocol as a phylogenetic tree using equations employed by geneticists to display a species' family tree. A remaining challenge facing Beddoe and other researchers is translating bioinformatics terminology into language that network engineers can understand. Rockefeller University professor Terry Gaasterland comments that "The problem of decoding the language of networks and the problem of finding signals in DNA are really two related instances of machine learning problems," and probing both problems will likely uncover universal information communication principles.
    Click Here to View Full Article
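Sequence alignment is the core of the borrowed toolkit: the same dynamic-programming algorithms that line up DNA strands can line up captured protocol messages, so conserved bytes (likely keywords) match up while variable-length fields appear as gaps. A minimal Needleman-Wunsch sketch in Python illustrates the idea; the messages and names are invented for illustration, not taken from Beddoe's actual Protocol Informatics code.

```python
# Minimal Needleman-Wunsch global alignment: conserved bytes line up
# as protocol keywords, gaps mark variable-length fields.
def align(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    # score[i][j] = best score aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            score[i][j] = max(diag, score[i-1][j] + gap, score[i][j-1] + gap)
    # Trace back through the score table to recover the aligned pair.
    out_a, out_b, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch):
            out_a.append(a[i-1]); out_b.append(b[j-1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i-1][j] + gap:
            out_a.append(a[i-1]); out_b.append('-'); i -= 1
        else:
            out_a.append('-'); out_b.append(b[j-1]); j -= 1
    return ''.join(reversed(out_a)), ''.join(reversed(out_b))

# Two captured messages from a hypothetical text protocol.
msg1, msg2 = "USER alice\r\n", "USER bob\r\n"
a, b = align(msg1, msg2)
```

Running the alignment over many message pairs makes the fixed "USER " keyword stand out while the user-name field dissolves into gaps, which is exactly the signal a reverse engineer is after.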

  • "Two Leaders in Their Fields Honored With Anita Borg Awards"
    Business Wire (10/04/04)

    The Anita Borg Institute for Women and Technology will honor Dr. Fran Allen and Karen Banks with the first Anita Borg Awards for Technical Leadership and Social Impact, respectively, at the Grace Hopper Celebration of Women in Computing Conference Awards Banquet on Oct. 7. Allen--the first woman named an IBM Fellow and a recipient of ACM SIGPLAN's Programming Languages Achievement Award--is recognized for her commitment to mentoring technical women employees, which so impressed IBM that the company created an award in her name to spotlight such women's contributions. Allen is regarded as an inspiration for the next generation of business-technology leaders. Banks, meanwhile, serves as the global coordinator for the APC Women's Networking Support Program, and has spearheaded the use of information communication technologies as tools to empower underserved women and communities around the world. Examples of the program's work include training African women's organizations in content and dissemination to spread awareness of violence against women, women's views of HIV/AIDS, and peace building among decision makers, policy makers, and the mainstream; carrying out and coordinating regional and national Women's Electronic Network Training workshops in the Asia Pacific region; and working with local telecenters and public access centers in Latin America to make sure that both men and women have equal access to public resources. "The recipients of these first Anita Borg Awards offer us a reminder of the substantial influence technology has on our lives and the potential for technology to have a significant and positive role in shaping the future," declared Anita Borg Institute President Dr. Telle Whitney. "This year's winners embody the qualities, understanding and compassion essential for a more positive future."
    Click Here to View Full Article

    For more information about the Grace Hopper Celebration or to register, visit http://www.gracehopper.org/.

  • "Slash of Reality: Computer Graphics Lab Creates the Visual and the Visceral"
    Montreal Gazette (10/04/04); Bruemmer, Rene

    Montreal is expanding as a center of academic and commercial activity in the area of computer graphics, a field that has exploded in recent years thanks to the increasing popularity and profitability of video games and computer-animated films. Over 700 people in Montreal are employed by video game makers Ubisoft and Electronic Arts, while students at the Universite de Montreal are developing new algorithms and software directed toward faster, more realistic rendering of computer-generated imagery. The school is often one of the select few to publish papers at ACM's SIGGRAPH, the leading forum for computer graphics research. Designers and academic programmers are pursuing the development of tools and methods that are published and adopted by the film or gaming industries, or are built upon by other researchers. Victor Ostromoukhov of the university's computer graphics lab notes that the lab's work is released into the public domain, while professor Pierre Poulin explains that students and professors may be awarded private or government funding and fellowships. He also notes that students could secure a fruitful career in computer graphics based on the merits of their work. "Computer graphics is one of those rare fields where students are coveted," Poulin says. "Where computer science has been slowing down in the last few years, computer graphics is speeding up." Getting published is more complicated now thanks to the computer graphics boom, which has spawned fervent competition; this makes true innovation all the more prized, according to Poulin.
    Click Here to View Full Article

  • "E-Cyclers Embrace Data Destruction"
    eWeek (10/01/04); Hachman, Marc

    Computer recyclers are taking measures to verifiably destroy data as well as hardware in order to comply with federal regulations such as the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act, which prohibit the public exposure of confidential data by financial and health care institutions; meanwhile, fears of civil suits are driving more traditional companies to pursue the same goal. Debate has sprung up over the best techniques to destroy data, which range from Department of Defense-compliant overwriting software to the physical shredding of disk platters. Software vendors say that overwriting a hard disk once either with other files or random bits of data is inadequate, as some or all of the information in a file can be revealed by latent magnetism. The DOD's 5220.22-M specification advises overwriting each disk sector several times with nonrandom and pseudorandom data. However, shredding is recommended for both nonfunctional drives and drives with more than 10 defects. A Sept. 30 teleconference between members of the National Association for Information Destruction (NAID) failed to resolve differences between supporters of software wiping and supporters of shredding, but attempts will be made to reach an accord before the NAID board's final recommendation on Nov. 29. Small-scale nonprofit recycling organizations are also joining the data destruction bandwagon, and a lack of certification procedures for compliance with the DOD's 5220.22-M spec is benefiting these firms by boosting competition in the data-destruction product market. Data destruction certification has been adopted by many recyclers as a saleable service, and there is little oversight in the negotiation of contracts and certifications between recyclers and clients.
    Click Here to View Full Article
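The overwriting side of the debate rests on a simple mechanism: write every sector several times with fixed, complemented, and pseudorandom patterns. The sketch below shows the classic three-pass pattern commonly associated with DoD 5220.22-M, applied to an ordinary file as a stand-in for a disk; it is illustrative only (the `wipe` name is invented), since real sanitization tools write to the raw device and verify every sector rather than trusting the filesystem.

```python
# Three-pass overwrite sketch: a fixed byte, its complement, then
# pseudorandom data.  A file stands in for the raw device here.
import os

def wipe(path):
    size = os.path.getsize(path)
    with open(path, 'r+b') as f:
        for pattern in (b'\x00' * size,        # pass 1: zeros
                        b'\xff' * size,        # pass 2: complement
                        os.urandom(size)):     # pass 3: pseudorandom
            f.seek(0)
            f.write(pattern)
            f.flush()
            os.fsync(f.fileno())               # push writes past OS caches
    # Crude check that the final pass covered the whole file.
    with open(path, 'rb') as f:
        return len(f.read()) == size
```

Note that overwriting a file does not touch remapped bad sectors or copies the filesystem may have left elsewhere, which is one reason shredding advocates argue software wiping alone is insufficient for defective drives.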

  • "Presidential Candidates Spend Little Time on Tech Issues"
    SiliconValley.com (10/03/04); Gillmor, Dan

    Dan Gillmor comments that despite the importance of technology issues such as communications policy and intellectual property, presidential candidates George W. Bush and John Kerry have remained mostly mute on the subject. Although Bush has pledged to extend broadband and eliminate bureaucratic obstacles to implementing universal broadband access, his tenure has witnessed a serious lag in this area. Particularly stinging is the appointment of Michael Powell as FCC chairman, whose performance has been bumpy: Though Powell and his associates have made significant progress in lobbying for more innovation and efficiency in wireless communications and broadening the use of unlicensed spectrum, other decisions have supported monopolization by cable and phone companies. Kerry's view toward broadband does not wildly diverge from Bush's, with promises of a tax credit for companies that deploy broadband service and the expansion of spectrum for wireless broadband. Gillmor also notes that Kerry opposes rules that permit media behemoths to expand their market dominance, and supports open Internet access and prevention of discrimination over permissible content and the speed of content delivery by data providers. Meanwhile, Bush's stance on intellectual property favors copyright and patent holders while trampling over consumers' fair-use rights, though Gillmor acknowledges that the Bush administration is merely continuing a policy started by the Clinton administration. Furthermore, the White House and Congress are funneling revenues generated by patent application fees into other initiatives instead of using them to improve patent quality, a practice that Kerry has vowed to halt; Kerry has also stated that file-sharing should be regarded as illegal if it occurs outside "normal" channels such as between friends, roommates, etc., noting that the tech industry bears partial responsibility for solving the problem.
    Click Here to View Full Article

  • "The Technologist Who Has Michael Powell's Ear"
    CNet (09/30/04); McCullagh, Declan

    FCC policy development chief Robert Pepper says the abundance of communications technologies could actually lead to a dramatic change in the role of his organization, with the government less involved in controlling technology than in the past. Currently, however, the FCC is overseeing the introduction of new voice over Internet Protocol (VoIP) services and wireless services that promise radically different architectures, services, and cost structures. The FCC recently ruled that the Communications Assistance for Law Enforcement Act (CALEA) did not apply to new VoIP services because the statutory definition did not cover that type of peer-to-peer service, which Pepper says includes other technologies such as Xbox Live and Yahoo! instant messaging with voice. The FBI and law enforcement agencies still have the means to wiretap suspect conversations with a court order, but the technology to capture that information is perhaps not yet powerful enough. Pepper says VoIP requires service providers and policy to adjust to new parameters where distance is no longer a major cost factor. New mechanisms are needed to ensure affordable phone service for everyone, such as charging fees according to a connection's bandwidth or by phone number, regardless of whether that number is for a traditional phone, VoIP from a cable provider, or a wireless phone account. Broadband over power lines is another promising technology that many state regulators are bullish about because of its ability to link electricity demand and consumption in real time: IP-enabled dishwashers could be programmed to turn on when electricity is cheapest, for example, saving power in states such as California. Pepper says interference concerns are legitimate, but that broadband over power lines actually limits normal radio emissions in many cases.
    Click Here to View Full Article

  • "Microchip Imperfections Could Cut Cloning"
    New Scientist (10/04/04); Biever, Celeste

    MIT electronics engineer Srini Devadas has designed software that can prevent microchip cloning by generating and verifying a unique chip ID code using imperfections that are singular to each chip. The metal tracks that link a chip's transistors have a unique thickness produced by pressure and temperature variations during the manufacturing process. Signals travel faster along thinner tracks, and Devadas suggests embedding a circuit in every chip that notes these differences and from them constructs a unique chip ID signature. The engineer says the circuit will apply a "challenge code" consisting of a 128-bit signal to certain metal tracks, and the first signals to span a track are shunted into a secret algorithm to produce a unique ID code that "is more secure than any system we have right now." This technique would be immune to hackers who reverse-engineer and clone smart cards such as those employed to decrypt pay TV signals, authenticate bank transactions, or give access to buildings. Copying the exact "physiology" of the chip to replicate its ID code would be impossible. However, some security experts note that Devadas' method could be circumvented by other ID spoofing techniques. Counterpane Internet Security's Bruce Schneier says that a counterfeiter could craft an algorithm that apes a chip's ID code response, while Secure Methods CTO Paul Clark asserts that chip cloning can only be effectively thwarted by enhanced encryption and other improved safeguards.
    Click Here to View Full Article
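The scheme described is essentially a delay-based physical unclonable function: random manufacturing variation acts as a secret key that the chip itself computes with. The toy Python model below is not Devadas' circuit; the seeded Gaussian "stage delays" merely stand in for process variation, and the response bit mimics an arbiter deciding which of two challenge-selected signal paths wins the race.

```python
# Toy delay-PUF model: each chip's random per-stage delay differences
# act as a hidden key, and a challenge selects which delays are summed.
import random

class ToyDelayPUF:
    def __init__(self, seed, stages=128):
        rng = random.Random(seed)      # the seed stands in for process variation
        # Two candidate track delays per stage; the challenge bit picks one.
        self.deltas = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(stages)]

    def respond(self, challenge):
        # Sum the challenge-selected delays across all stages.
        total = sum(pair[bit] for pair, bit in zip(self.deltas, challenge))
        return 1 if total > 0 else 0   # "arbiter": which signal arrives first

chip_a = ToyDelayPUF(seed=1)
chip_b = ToyDelayPUF(seed=2)           # a physically different chip

rng = random.Random(42)
challenge = [rng.randrange(2) for _ in range(128)]
```

The same chip always answers a given challenge the same way, while a different chip's delays produce a different response pattern over many challenges, which is what makes the signature verifiable yet hard to clone.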

  • "Data Management's Misconceptions"
    Government Computer News (09/27/04) Vol. 23, No. 29, P. 17; Jackson, Joab

    Database expert Fabian Pascal views himself as a contrarian who educates administrators and application developers on the myths and realities of database technology. He says Extensible Markup Language (XML), for example, is not a good technology for data management and is currently being misused; XML was meant to establish meaning in system exchanges, but through unnecessary repetition, XML tags often overwhelm the actual data being transmitted. XML provides structure for data management, but does not provide good integrity constraints and manipulation. XML's structure is basically hierarchic, but hierarchic database management was abandoned long ago. Pascal says the database management scene has been so dominated by vendors that even academic training is often little more than product certification. Many of the problems in database management today are rooted in non-adherence to the relational database model: Structured Query Language (SQL), for instance, fails because multiple views cannot be updated, which keeps administrators and users from realizing the centralized update benefits of views, or virtual tables; SQL is limited because it allows duplicates and null values for missing data, unlike the strictly two-valued, true-or-false relational model. These are defects that make products difficult to use and prone to error; Pascal says administrators and application developers need to re-learn database fundamentals, such as the difference between database and application functions, and the data, business, logic, and physical models.
    Click Here to View Full Article
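Pascal's two chief complaints about SQL's departure from the relational model, duplicate rows and nulls, are easy to demonstrate with the sqlite3 module from Python's standard library (the toy table below is invented for illustration):

```python
# Two SQL departures from the relational model: tables can hold
# duplicate rows, and NULL introduces a third "unknown" truth value
# that silently drops rows from both a predicate and its negation.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE emp (name TEXT, dept TEXT)")
con.executemany("INSERT INTO emp VALUES (?, ?)",
                [("ann", "it"), ("ann", "it"), ("bob", None)])

# Duplicates: a relation is a set, but this table counts ann twice.
dup = con.execute("SELECT COUNT(*) FROM emp WHERE name = 'ann'").fetchone()[0]

# Three-valued logic: bob matches neither predicate, because
# NULL = 'it' and NULL <> 'it' both evaluate to unknown.
in_it  = con.execute("SELECT COUNT(*) FROM emp WHERE dept =  'it'").fetchone()[0]
not_it = con.execute("SELECT COUNT(*) FROM emp WHERE dept <> 'it'").fetchone()[0]
```

Here `dup` is 2 even though only one fact was asserted, and bob's row vanishes from both the `=` query and the `<>` query, which is precisely the error-prone behavior Pascal attributes to the departure from two-valued relational logic.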

  • "Government Has a Vital Role in IT"
    Computing (09/30/04); Watson, James

    IT experts in the United Kingdom said the government's role in the IT sector involves providing access, ensuring proper education, exemplifying IT use, guiding standards, and funding the sciences. Experts discussed the variety of responsibilities the government has in a round-table discussion hosted by Computing. The Prince's Trust chief information officer Colin Heath said country-wide broadband was especially important because the government has a fundamental obligation to ensure equal IT access, while Corporate IT Forum Chairman Denise Plumpton said incoming Disability Discrimination Act legislation would boost accessibility for disabled users. The government should also recognize the importance of IT advancement and help academic IT discoveries reach the commercial sector. University College London computer science professor Anthony Finkelstein said the computing field has not done as good a job of lobbying for research funding as the medical or physical sciences. Nearly all the panel participants weighed in on the issue of IT education, and said it was a fundamental work skill along with mathematics and literacy: Finkelstein noted that the number of computer science students fell 20 percent last year and that women are still a small minority. Institute of Directors senior policy advisor Jim Norton said the government should stop paying teachers in high-demand fields the same wages as those who teach low-demand subjects. Another important role of the government is exemplifying IT use, since the majority of the public's contact with IT comes through interaction with the government, whether through the U.K. medical or education system.
    Click Here to View Full Article

  • "Akamai Strives for a Safer, Speedier Net"
    Washington Post (09/30/04) P. E1; Walker, Leslie

    Akamai Technologies co-founder Tom Leighton is very worried about the DNS attack that took place on June 15, significantly slowing traffic for prominent Akamai customers such as Yahoo!, Google, and Microsoft. "That was a very sophisticated attack; the nature of it was novel," he says, though he is not willing to go into specifics because he does not want copycat exploits. Akamai's command center in Cambridge, Mass., was the first to notice something was happening and alerted federal authorities and customers. Working with federal authorities, Akamai successfully shut down the zombie "botnets" that had launched the attack, though the perpetrators remain unidentified. Leighton's firm provides Web caching services for more than 1,000 government and commercial customers, but also helps protect Internet addresses from hacker attacks that siphon off Web traffic. The June 15 attack and subsequent assaults are worrying because they show hackers are learning techniques to compromise Akamai technology, which was developed by Leighton and graduate students at nearby MIT, where Leighton is an applied mathematics professor. Leighton also chairs the President's Information Technology Advisory Committee's subcommittee on cybersecurity, and says more federal funding is needed for research, along with difficult but technically feasible changes to the Internet address system. Akamai was conceived in 1995 when fellow MIT professor Tim Berners-Lee lamented that the Internet's open architecture made it vulnerable to traffic congestion. The MIT computer science department algorithms group tackled the problem, leading to the birth of the company during the dot-com boom.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Giving Computers the Jitters Helps Explain Human Behavior"
    Penn State Live (09/28/04)

    Penn State researchers recently presented a paper at the 48th Annual Meeting of the Human Factors and Ergonomics Society in New Orleans that details the development of a computer program simulating how people respond to a task that they view as threatening and that makes them worry before starting. The paper, "Using Cognitive Modeling to Study Behavior Moderators: Pre-Task Appraisal and Anxiety," simulated the performance of people who may experience anxiety when tackling a mathematical problem. "However, the program could also be used to study the effects of feeling threatened or worried before driving a car, using a computer or other stressful task--and to help develop remedial strategies," says Dr. Frank Ritter, associate professor in the School of Information Sciences and Technology. The computer program yielded results similar to published data showing that the speed of performing a task slows by about 25 percent when the task is considered threatening. "The correspondence suggests that the modifications we made to the program to simulate feeling threatened and worried provides a plausible explanation for the decreased performance in people on this task," says Ritter.
    Click Here to View Full Article
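The reported 25 percent slowdown can be caricatured as a fixed overhead on every cognitive step. The sketch below is illustrative only; the actual paper modifies a cognitive architecture rather than a simple loop, and the function name and parameters are invented.

```python
# Caricature of the reported effect: pre-task worry modeled as a
# constant overhead on every cognitive step slows the whole task
# by the same factor.
def task_time(steps, step_cost=0.1, anxiety_overhead=0.0):
    # anxiety_overhead = 0.25 means each step takes 25% longer,
    # e.g. because worry competes for processing resources.
    return sum(step_cost * (1 + anxiety_overhead) for _ in range(steps))

calm    = task_time(steps=40)                         # 4.0 time units
worried = task_time(steps=40, anxiety_overhead=0.25)  # 5.0 time units
```

The point of the comparison is that a single per-step overhead parameter reproduces a uniform 25 percent slowdown across the whole task, matching the shape of the published data the Penn State model was fitted against.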

  • "App Developers Need to Redouble Security Efforts"
    eWeek (09/30/04); Schindler, Esther

    The recent Gartner Application Development Summit included new statistics underscoring the need for development and quality assurance teams to increase their security efforts. Gartner research director Theresa Lanowitz says the problems of IT network and physical security have been solved for the most part, which means that the application layer is now the most vulnerable. Companies must take responsibility for security issues during development or face a higher risk of a catastrophic event. According to Gartner, if 50 percent of software vulnerabilities were dealt with before production use, enterprise configuration management costs and incident response costs would each be reduced by 75 percent. Lanowitz says someone in the organization must be responsible for security issues, such as an "application security architect." This person's primary focus is assessing the risk that a company faces and articulating that risk to staff and management. Lanowitz says government agencies and financial institutions have been leading the way in establishing application security architects who work on the same level as application architects and ensure that security testing is added to the quality assurance framework. Gartner predicts that 80 percent of development teams will incorporate application security architects by 2006. Lanowitz also expects to see a wave of development tools integrating security functions by 2007, although the market for now is in its infancy.
    Click Here to View Full Article

  • "Looking for Patterns"
    InformationWeek (09/27/04) No. 1007, P. 58; Babcock, Charles

    The father of object-oriented programming believes teamwork and collaboration, rather than working with an individual tool, will spur software development in the years to come. Grady Booch, chief scientist of IBM's Rational Software unit, says developers will produce "algorithmic snippets of code" that manipulate existing objects and create new ones, and he adds that there may be advances in "building new languages for connecting systems to systems." Booch says aspect-oriented programming is an early example of what system-to-system communication is capable of. There have been no major developments in object-oriented languages, such as Java, C++, and Microsoft's C#, in years. Although Booch has had a heavy influence on modeling, he plans to take a pattern-based approach to development, since underlying patterns sometimes exist between different systems. Java Blueprints, which shows best practices for building a Java application to interact with surrounding pieces of software, is an example of the practical use of patterns.
    Click Here to View Full Article

  • "Internet-Era Avionics"
    Aviation Week & Space Technology (09/27/04) Vol. 161, No. 12, P. 47; Hughes, David

    Driving the incorporation of information technology into commercial flight deck design is airline customers' desire for cheaper equipment with less wiring, lower power requirements, and reduced size and weight. Honeywell's Randy Robinson notes that a transition to wireless flight control is being considered by the industry, and projects that the feasibility of such a system will be more firmly determined in about five years. With more sophisticated memory devices and increased bandwidth over commercial Ethernet data buses, avionics system designers can process massive volumes of data to render precision displays that pilots can raise anytime during the flight; IT is seen as an enabling technology for graphic displays that pilots can interpret more easily. Improved computer processing capability allows single modules to carry out multiple functions and run various software applications while separating flight-critical functions from other operations. "Customers are demanding [global information network connections] in the business jet, regional and airline markets, and airlines are looking for more operational efficiency," observes Mark Harris with Rockwell Collins. "All the benefits of networking other departments [in a company] are there with similar benefits of eliminating paper and streamlining processes." Global airline information service provider Sita expects initial broadband satellite communication applications to serve passengers, and the company's Jeff Plumley and Mike Warnock anticipate that cockpit applications will follow shortly on the heels of such offerings. Broadband's penetration of the flight deck will hinge on its ability to ramp up efficiency while reducing overall operational costs, and Plumley and Warnock expect real-time graphical weather maps to be the first broadband-driven, data-rich ground-to-air application.

  • "Keeping Out the Wrong People"
    Business Week (10/04/04) No. 3902, P. 90; Ante, Spencer E.

    The welter of new visa rules and regulations that have sprung up in the wake of the Sept. 11 terrorist attacks has ensnared tens of thousands of foreigners seeking employment in the United States and led to sizable immigration declines. The Homeland Security Department's annual immigration report estimates a 34 percent decrease in immigrants to 705,827 last year, which represents the most precipitous fall-off since 1953, when McCarthyism was at its peak. The refusal rate for student visas rose from 20 percent in 1999 to 35 percent in 2003, while the percentage of professionals with advanced degrees or outstanding skills admitted into the country fell 65 percent last year. "The long-term costs in goodwill will be enormous," warns National Academy of Engineering President William Wulf. One alternative to bringing in foreign workers through visas is to farm the work out overseas, but people fear the outsourcing trend could accelerate because of America's failure to distinguish between legitimate immigrants and potential security risks. A higher visa refusal rate is not the only problem the new regulations have caused: A lack of technical skill among government visa reviewers has slowed the visa application process, resulting in a backlog that totaled 3.8 million pending applications in January. The continued discouragement of talented foreign workers and students could entail the loss of one of the nation's most treasured resources to more welcoming and aggressive countries, eroding the United States' economic and academic competitiveness. Many educators and economists agree that U.S. immigration policy needs to be rigorously overhauled, and suggest committing much larger budgets to the State Department's Educational & Cultural Exchange Program as well as setting virtually no visa limits on the most skilled foreign students or workers.
    Click Here to View Full Article

  • "Reality Mining: Browsing Reality With Sensor Networks"
    Sensors (09/04) Vol. 21, No. 9, P. 14; Boone, Gary

    Communications advances and the falling cost of networked sensor technology are driving the development of a global sensor network that can facilitate "reality mining," a concept in which organizations view real-time data gathered from sensor streams to glean usable insights not just about their physical surroundings, but also about their customer, supply-chain, economic, and competitive environments. Accenture Technology Labs has prepared three demos of reality-mining software platforms with Sensor Information Systems, a combination of commercially available software, real-time sensors, customized Web crawling agents, and data modeling that blends geographic information systems software, mission planning/terrain visualization, and sensor networks. Information from live sensor data, publicly accessible Web data, and commercial financial data corresponding to a physical location is rendered as a 3D map. The first demo is designed to detect fires and similar anomalies in remote regions in order to more accurately predict their spread and more rapidly plan response actions: Networks of Smart Dust sensors from Crossbow identify the fire's location upon its detection, and response planning and firefighter routing are enhanced through the use of 3D aerial and satellite imagery; the system can support various levels of planning via multiple altitude perspectives. Another demo models a 3D simulation of Accenture's office and grounds with browseable icons marking restaurants and other services whose menus, schedules, and so on can be accessed live from the Web. This is an example of "browsing reality," in which users tap high-fidelity views of their physical, social, and commercial environments through ubiquitous sensor networks and business Web services. The final demo, the Economic Weather Map, augments models of physical landscapes with a cloud-like, continuously updated overhead display of stock market performance corresponding to the office locations of the Bay Area's 200 biggest publicly traded companies.
    Click Here to View Full Article

  • "VoIP: What Is It Good For?"
    Queue (09/04) Vol. 2, No. 6, P. 48; Ahuja, Sudhir R.; Ensor, J. Robert

    Voice over IP (VoIP) is setting the foundation for a host of new multimedia applications, driven by the technology's promise to cut costs and boost revenues for network and service providers. VoIP offers more flexibility as a communication services platform than traditional telephony in a variety of ways: It removes interference between information flows by decoupling signaling and bearer traffic; enables multiple application and user endpoints to engage in service support by allowing all nodes to function as servers; and uses an array of foundational networks to facilitate IP transport, allowing different network technologies to support different sets of services. New converged services that can be built atop VoIP include call-center services accessible from Web browsers or IP phones via the integration of interactive voice response and Web components using SIP signaling. These services can control aspects of one set of services using features from other sets of services, and can be assembled out of multiple server clusters; this requires the industry to develop in-session service coordination methods. IP-based multimedia service providers lack guarantees that their services can scale up to satisfy the requirements of large customer bases while concurrently meeting the services' time constraints. Filling this gap requires appropriate transport performance as well as servers within the signaling and media transport streams that can respond to messages within time restrictions. Persistent SIP sessions can serve as a direct realization of a long-term group initiative by supporting long-term interactions, acting as the convergence point for multiple calls, and provisioning storage for data used in the calls. The refinement of VoIP entails finding ways to characterize the network needs of a specific application, map them to the multiservice network, and furnish those services and track their implementation to ensure delivery.
    Click Here to View Full Article