Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published every week on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either IBM or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 513: Friday, June 27, 2003
- "Michigan Senate Passes Anti-Spam Bill"
CNet (06/25/03); Festa, Paul
The Michigan state Senate on June 24 unanimously approved an anti-spam bill that would set up a do-not-spam registry and impose heavy fines on violators. In addition, the Michigan bill would establish a "parental block" that allows parents and guardians to filter the spam their children are exposed to. Sen. Michael Bishop (R-Mich.) hailed the bill as the strictest anti-spam measure in the U.S. to date, but the bill will not become law unless it is passed by the state House and signed by Gov. Jennifer Granholm. The bill is partially based on national legislation proposed by Sen. Charles Schumer (D-N.Y.) that also calls for the establishment of a do-not-email registry. Violating the Michigan bill carries a misdemeanor charge, a prison sentence of up to one year, and a maximum fine of $10,000. Furthermore, recipients could file civil lawsuits against spammers and demand damages of $500 per email message or $250,000 for each day they receive spam. However, Marquette University Law School professor Eric Goldman cautions that a do-not-spam registry could be a very tempting target for spammers who operate outside the jurisdiction of Michigan authorities. "If a spammer governed by the law gets his or her hands on the database and then accidentally leaks it, well, the horse is out of the barn," he says. Unspam CEO Matthew Prince claims his company has developed a system with a one-way encryption scheme to prevent this from happening.
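Goldman's worry and Prince's answer can be sketched concretely. The snippet below is an illustrative one-way registry in the spirit Prince describes, not Unspam's actual scheme: the registry stores only hashes, so senders can scrub their mailing lists without ever seeing subscribers' addresses. All names here are hypothetical, and a real deployment would need salting and rate-limiting, since email addresses have little entropy and can be guessed.

```python
import hashlib

def registry_token(email: str) -> str:
    """One-way token for an address; the registry keeps only these
    hashes, so a leaked copy contains no directly usable addresses."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def may_mail(email: str, registry: set) -> bool:
    """A sender scrubs its mailing list by checking each address's token."""
    return registry_token(email) not in registry

# Opt-out addresses go in as hashes, never as plaintext
registry = {registry_token(addr) for addr in ["parent@example.com"]}
```

Because the hash is one-way, even a spammer holding the leaked database cannot read addresses out of it; at best he can test addresses he already possesses.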
- "The Next-Generation Net Is on the Way"
Investor's Business Daily (06/27/03) P. A6; Howell, Donna
The Internet is expected to gradually shift from the current Internet Protocol version 4 (IPv4) to version 6 (IPv6), a standard that will offer more reliable security and more flexible data handling for operations such as videoconferences, real-time multiplayer online games, and battlefield actions. In anticipation of IPv6's deployment, ISPs, telecoms, and other organizations are building IPv6-enabled tools and applications: NTT Verio is planning to debut the first large-scale, commercial North American IPv6 service in the fourth quarter of 2003; the Defense Department told vendors that IPv6 support must be embedded in all network technology it contracts for by October; and Larry Smarr of the California Institute for Telecommunications and Information Technology says that universities are researching the protocol's possibilities via experimentation. Smarr notes that IPv6 is starting to play an important role in university research through its support of fiber-optics and a mobile Internet medium. Key to IPv4's inevitable obsolescence is the limited number of Internet addresses it can support, a problem that IPv6 rectifies with a vastly larger address space, enabling every device to sustain a constant Internet connection. "The Internet will work better if we have enough [Internet Protocol] addresses so that every person, place and valuable thing can have an address," explains Alex Lightman, who organized an IPv6 summit in San Diego this week. With devices linked to the Internet via IPv6 numbering in the billions, enormous revenues could be generated from the sale of products and services. Such devices could run the gamut from cell phones to refrigerators to weapons systems. Yanick Pouffary of Hewlett-Packard's IPv6 Forum Technical Directorate acknowledges that switching to the new protocol will require a considerable financial investment, but the cost should be minimal in light of the profits to be reaped.
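The scale of the address-space difference is easy to demonstrate. A quick sketch using Python's standard ipaddress module compares the 32-bit IPv4 space with the 128-bit IPv6 space:

```python
import ipaddress

# The entire IPv4 and IPv6 address spaces, expressed as single networks
ipv4_space = ipaddress.ip_network("0.0.0.0/0")
ipv6_space = ipaddress.ip_network("::/0")

print(ipv4_space.num_addresses)  # 4294967296 (2**32)
print(ipv6_space.num_addresses)  # 2**128, roughly 3.4e38
```

At 2**128 addresses there is room for a constant, globally unique address for every phone, refrigerator, and weapons system Lightman envisions, without the address-sharing workarounds IPv4 relies on.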
- "2 PC Makers Given Credit and Blame in Recycling"
New York Times (06/27/03) P. C3; Markoff, John
Leading U.S. PC manufacturers Hewlett-Packard and Dell Computer received praise and scorn, respectively, for their recycling programs in a report prepared by the Silicon Valley Toxics Coalition and the Computer Take Back Campaign. HP was lauded for partnering with the commercial recycling sector and employing "state-of-the-art" methods for reclaiming used products, while Dell was chastised for using prisoners as a cheap work force for its recycling effort. Silicon Valley Toxics Coalition executive director Ted Smith maintains that this practice, along with the export of U.S.-generated electronic waste to third-world countries, seriously impedes the building of a profitable recycling infrastructure. The recruitment of inmates to recycle old electronics is coordinated by the Unicor program, whose director challenged the report's findings, arguing that the prison effort handles only about 36 million pounds of the one billion pounds of used equipment discarded annually in the U.S. Dell says the recycling operation meets environmental standards and notes that the work program is voluntary for inmates. Meanwhile, HP has worked with Canadian mining company Noranda over the past nine years; the companies operate two plants that recycle as much as four million pounds of electronic equipment each month. The study was released at an EPA summit, currently underway, that aims to establish standards for recycling consumer electronics and computer technology components. Both HP and Dell have recycling programs in which they pick up used computers at offices and homes.
(Access to this site is free; however, first-time visitors must register.)
- "Inventor Looks to DNS's Past, Future"
IDG News Service (06/26/03); Lawson, Stephen
Domain Name System (DNS) inventor Paul Mockapetris is surprised the basic DNS framework is still in place 20-plus years after he ran the first successful test, but says the system is handling increasing demands fairly well. Additions have made DNS relevant to current technologies, such as IP phone calls and discrete device addresses, and Mockapetris expects new changes to include support for foreign language character sets and better security. More phone activity will be handled over the Internet in the future as well, making DNS stability even more critical. Currently, a major problem lies with hackers misdirecting Web users to mock-up pages where they collect sensitive financial information, Mockapetris says. He created DNS at the University of Southern California in 1983, when Jon Postel, his research manager, requested a system that would track Internet destinations; at the time, Internet host names and addresses were catalogued at SRI International. Mockapetris says the switch came gradually, though some organizations maintained backup host tables on their own systems in case DNS failed. Today, Mockapetris says approximately one billion DNS names connect to the Internet, far more than the 50 million names originally expected. While describing the system today as stable, Mockapetris says there were significant mix-ups coordinating worldwide Internet addressing, including the U.K. Name Registration System's reversed domain ordering, which placed the country code first, and computer science departments' hijacking of Czechoslovakia's "cs" domain.
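The forward lookup Mockapetris's system performs, name in, addresses out, can be exercised from any modern machine through the system resolver. A minimal sketch (the function name is ours):

```python
import socket

def resolve(hostname: str) -> list:
    """Forward lookup through the system resolver, which consults DNS
    and any local host tables, and return the unique addresses found."""
    infos = socket.getaddrinfo(hostname, None)
    return sorted({info[4][0] for info in infos})

# "localhost" is typically answered from a local hosts file, the very
# mechanism (SRI International's host tables) that DNS superseded
addresses = resolve("localhost")
```

The same call resolves any of the roughly one billion DNS names Mockapetris cites, with the answer assembled by the distributed hierarchy rather than a single catalog.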
- "A Look Inside Apple's New Push for Speed"
SiliconValley.com (06/26/03); Fortt, Jon
The major focus for the PC world this year appears to be speed and communication, as demonstrated by Apple Computer CEO Steve Jobs' recent announcement of his company's Power Mac G5 as "the fastest personal computer in the world." Although critics have called into question the G5's benchmark test scores, raw speed is less important than efficiency, and the G5 appears capable of holding its own with machines featuring Intel or Advanced Micro Devices processors, writes Jon Fortt. The G5 chip generates so much heat that nine fans have been installed, while the machine case is aluminum and perforated. Fortt sees several advantages in this approach--the design of the aluminum skin itself and the likely price benefit for aluminum-cased PowerBook laptops. In testing the Power Mac G5's iChat AV tool, Fortt found that the video chat was prone to crashing, but the audio chat was excellent; moreover, the audio tool can still pick up voices clearly even when the speaker is 10 feet away. Fortt thinks that iChat AV ought to support three-way audio and video conversation, while the iSight camera that ties into iChat AV has even more potential as a camcorder that can easily be adapted to a hard drive. Meanwhile, the Panther OS X upgrade supports a wide array of features, including management of fonts and high-quality video; mining of secure, high-volume files and accessing those files from several different machines; and a faxing application. However, Fortt notes that certain Panther features, such as Expose and the Preview PDF reader, can strain the processor.
- "Bill Seeks to Loosen Copyright Law's Grip"
Technews.com (06/25/03); Krebs, Brian
The Public Domain Enhancement Act introduced by Reps. Zoe Lofgren (D-Calif.) and John Doolittle (R-Calif.) on June 25 would require copyright owners to pay $1 to renew their copyrights 50 years after the material's first publication; failure to comply means that the unrenewed content would be released into the public domain. The legislation is a response to the provisions of the Sonny Bono Copyright Term Extension Act of 1998, which tacks on an extra 20 years of copyright protection following the death of a work's author. Lofgren and Doolittle claim that the law undercuts the value of the Internet as a medium for preserving content that might otherwise be forgotten and lost once its market value dissipates. "There are literally hundreds of millions of Web sites and other works that won't have economic value a few years from now but will continue to be copyrighted for 150 years, leaving archivists, historians and others interested in preserving these works shut out for no good reason," laments Electronic Frontier Foundation staff attorney Fred von Lohmann. He argues that the Public Domain Enhancement Act would reestablish equilibrium between copyright law and justifiable uses of material that is no longer profitable. In introducing the bill, Lofgren recalled Supreme Court Justice Stephen Breyer's January dissent from the Court's decision to uphold the copyright extension law, based on the fact that only 2% of copyrighted works are still marketable after 55 to 75 years. Rich Taylor of the Motion Picture Association of America counters that the availability of movies and other copyrighted works to the public actually increases when their preservation and marketing has economic value.
- "Hacker How-To Good Summer Reading"
Wired News (06/27/03); Delio, Michelle
The book, "Stealing the Network: How to Own the Box," mixes fictional scenarios with real-world technology and hacking methods, and having high-profile hackers such as Mark Burnett, Tim Mullen, and Ken Pfeil among the book's authors further raises its credibility. Capital IQ chief security officer Pfeil and book editor Jon Babcock argue that such a manual will help prevent hacking because it deeply explores the hacker mindset and ways to counteract hack attacks. "You need to know your enemy if you want to protect yourself from them, and getting to know that enemy from a variety of viewpoints is what this book is all about," Babcock asserts. Pfeil opines that keeping the state of network security under wraps, as many companies do, actually benefits the hacker community, which thrives on their targets' overconfidence. But some security experts allege that "Stealing the Network" comes dangerously close to crossing ethical boundaries by providing details on network exploitation. "If anything, [the book] greatly facilitates the real apprentice hackers out there in terms of speeding their learning curve," warns Sophos technology consultant Chris Wraight. Security consultants Mullen and Ryan Russell contribute material detailing the methodology of tracking and defending against computer worms, while Burnett analyzes social reverse-engineering, a technique in which morally ambiguous people are coaxed into exploiting their own networks. A chapter by Idea Studio founder Joe Grand involves a what-if scenario that exposes the poor state of airport network security.
- "Group Shakes SALT on Scalable Vector Graphics"
InternetNews.com (06/24/03); Boulton, Clint
The SALT Forum has released code for integrating SALT (Speech Application Language Tags) and the Scalable Vector Graphics (SVG) protocol from the World Wide Web Consortium (W3C). The new profile further enhances the capabilities of SALT 1.0, especially on mobile devices where SVG technology allows content to be displayed on screens of various sizes and resolutions. Application vendors represented in the SALT Forum say the new profile mixes visual and audio interfaces, allowing voice "hot spots" within a graphic or speech-activated display functions, such as zoom or scrolling. RedMonk senior analyst Stephen O'Grady says the release is another step in making voice a pervasive component of the user interface, especially on small mobile devices with otherwise limited input functions. SpeechWorks and SALT Forum representative Rob Kassel says the SALT effort grew out of the W3C's VoiceXML protocol, which was designed to voice-enable Web content but became a development tool for voice applications. SALT extends the capabilities of VoiceXML to other modes of input, including a mouse, keypad, keyboard, or stylus. In the future, Kassel imagines businesspeople speaking commands to a mobile device in their briefcase via Bluetooth. The W3C has yet to endorse either SALT 1.0 or the SVG profile.
- "Girls Less Confident on Computers: Study"
Globe and Mail (06/24/03); Akin, David
Canadian high school girls are not as confident as boys when it comes to working on computers and the Internet; nor do they use them as often as boys, according to a Statistics Canada study co-authored by Dalhousie University's Victor Thiessen and Acadia University's Dianne Looker. Access is apparently not a problem--97% of the 25,000 high school students polled in the study reported that they have used a computer in the 12 months prior to the survey, and 90% said they had accessed the Internet. Using this data, the study's authors conclude that the digital divide is no longer marked by geographical constraints, but by differing attitudes toward the use of technology. Girls surveyed in the study generally felt that computer skills were not as important as other things, while boys valued computer expertise more. In addition, the study finds that "Female youths and those from families with low levels of parental education are less likely to have access to computers in their homes [and] they tend to spend less time on the computer and they tend to report lower levels of computer skills competency." Computer and Internet use is also more diverse among males, according to the report, which also indicates that boys are more likely to use assorted software programs on a daily basis. These trends will put boys in an advantageous position career-wise, because computer literacy and technology skills are becoming more desirable as qualifications for high-level, high-income jobs.
- "New Finding Has Implications For Scientists Designing New Mobile Audio Interfaces"
OHSU News (06/24/03)
People who carry on conversations with text-to-speech (TTS) computer systems modify their speech patterns so they are more in keeping with the computer's mode of discourse, a technique known as speech convergence, concludes a new study led by Sharon Oviatt of Oregon Health & Science University's OGI School of Science and Engineering. This has a direct bearing on the work of researchers attempting to design new mobile audio interfaces. The study involved having children participating in the Immersive Science Education for Elementary Kids (I SEE!) Program use handhelds to converse with virtual marine animals that responded via TTS output. Four distinct TTS voices were employed to see if the kids tailored their own voices to sound like the animated characters. The results of the experiment are being modeled so that researchers can reliably anticipate how speech convergence will differ in specific situations and thus develop more adaptive computer interfaces. Oviatt and husband/collaborator Phil Cohen have made groundbreaking strides in the research and development of interfaces that support multiple modalities. Oviatt's study, which was funded by the National Science Foundation, is the first of its kind to analyze speech convergence as a function of human-computer interface design.
- "Mystery Net Traffic Linked to Attack Tools"
New Scientist (06/25/03); Knight, Will
Rogue Internet packets bearing a telltale TCP window size of exactly 55,808 have been circulating around the Web since mid-May and have been linked to one of three different computer attack tools: Stumbler, 55808 Trojan-Variant A, or sdbot. Although computer security experts say the mystery packets do not represent a significant security threat, the window value is unusually large and the packets are being sent to apparently randomly generated IP addresses, some of which do not exist. The packets also carry fake return Internet addresses and are likely being generated on computers without their owners realizing it. Internet Security Systems first traced the packets to Stumbler, a network-probing tool that looks for vulnerabilities in computer networks but which most experts say is not well written. On Tuesday, the U.S. government's Computer Emergency Response Team (CERT) issued an alert tracing the packets to Stumbler as well as to sdbot, an Internet Relay Chat remote-control program, and the 55808 Trojan, a stealth program, neither of which is widely used.
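Most published analyses identify the traffic by an oddity in the TCP header: a 16-bit window field set to exactly 55,808. Assuming that signature, a monitor could flag suspect packets by reading the field straight out of a raw header; the helper names below are ours, not from any particular detection tool.

```python
import struct

def tcp_window(tcp_header: bytes) -> int:
    """Read the 16-bit window field, which occupies byte offsets 14-15
    of a TCP header (after ports, sequence/ack numbers, and flags)."""
    (window,) = struct.unpack("!H", tcp_header[14:16])
    return window

def looks_like_55808(tcp_header: bytes) -> bool:
    """Flag the signature window value reported for this traffic."""
    return tcp_window(tcp_header) == 55808
```

Since the source addresses are forged, the window value is one of the few stable features a network operator can match on.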
- "Openness Makes Software Better Sooner"
Nature (06/25/03); Ball, Philip
Damien Challet and Yann Le Du of the University of Oxford report that open-source software can be debugged faster than closed-source software. Sharing software openly allows glitches to be detected and rectified sooner. The researchers have developed a theoretical model of software debugging that indicates more users and higher-quality programmers are needed to debug proprietary software in the same amount of time it takes to debug open-source software. This is because programmers and users of open-access software enjoy a better flow of information between them. The model formulated by Challet and Le Du has users revise their software once reported bugs have been resolved by the programmers. This is a continuous process for open-source software; in contrast, the closed-source model furnishes updates only after a certain amount of time has elapsed, during which programmers strive to correct the last assortment of errors. Challet and Le Du reckon that proprietary software debugging is slow in comparison to open-source debugging because the rate of reporting bugs declines rapidly after the release of each amended version. Furthermore, the researchers note that the open-source model keeps debugging moving forward even when any individual programmer may fail to fix a given error.
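Challet and Le Du's actual equations are not reproduced here, but the qualitative claim can be illustrated with a toy discrete-time model. The assumptions below are ours: users report a fraction of the not-yet-reported bugs each step, fixes only reach users when a version ships, and reporting interest decays between releases (the rapid decline the researchers describe) and is renewed by each release.

```python
def steps_to_debug(release_interval: int, bugs: float = 1000.0,
                   find_rate: float = 0.3, interest_decay: float = 0.5,
                   target: float = 1.0, max_steps: int = 10_000) -> int:
    """Steps until fewer than `target` bugs remain in the shipped version."""
    shipped = bugs      # bugs present in the version users actually run
    reported = 0.0      # reported but not yet shipped as fixes
    rate = find_rate
    for step in range(1, max_steps + 1):
        reported += rate * (shipped - reported)  # only unreported bugs get reported
        rate *= interest_decay                   # interest wanes until the next release
        if step % release_interval == 0:
            shipped -= reported                  # ship the accumulated fixes
            reported = 0.0
            rate = find_rate                     # a fresh release renews interest
        if shipped <= target:
            return step
    return max_steps

open_steps = steps_to_debug(release_interval=1)     # fixes ship continuously
closed_steps = steps_to_debug(release_interval=10)  # fixes ship in batches
```

With these (arbitrary) parameters the continuous-release model clears its bugs several times faster, matching the paper's qualitative conclusion.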
- "W3C Issues Key Web Services Standard"
CNet (06/25/03); Festa, Paul
The World Wide Web Consortium (W3C) this week announced its approval of Simple Object Access Protocol (SOAP) version 1.2 as a formal standard, allowing business IT workers and commercial software developers to use the specification without worrying about incompatibilities. Janet Daly of the W3C wrote in an email exchange, "SOAP version 1.2 is the first SOAP spec to go through any kind of independent development and review--one could say it's the first SOAP standard." SOAP was originally envisioned by the Internet Engineering Task Force as a method for implementing remote procedure calls, while Microsoft later redefined the standard as a tool that allows business software applications written in any programming language to interact over the Web. In issuing SOAP 1.2, the W3C reinforced its role as a steward of Web services standards that will provide the infrastructure for online interchange among commercial applications. Certain W3C members have objected to the consortium's decision to restrict the use of patented, royalty-bearing intellectual property, a policy that has prompted some Web services protocols to be presented to the Organization for the Advancement of Structured Information Standards (OASIS), a W3C rival. Organizations that helped develop the SOAP standard include IBM, AT&T, Macromedia, Microsoft, Oracle, Ericsson, Matsushita, Sun Microsystems, SAP, DaimlerChrysler Research and Technology, and Canon. W3C published SOAP 1.2 as a proposed recommendation in May.
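At the wire level a SOAP 1.2 message is just XML: an Envelope in the 2003/05 SOAP namespace whose Body carries application-defined elements. The sketch below assembles a minimal request; the application namespace and GetQuote operation are hypothetical, not part of the standard.

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://www.w3.org/2003/05/soap-envelope"  # SOAP 1.2 envelope namespace
APP_NS = "http://example.org/quotes"                  # hypothetical application namespace

def build_request(symbol: str) -> str:
    """Assemble a minimal SOAP 1.2 message: an Envelope whose Body
    carries a single application-defined request element."""
    ET.register_namespace("env", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    request = ET.SubElement(body, f"{{{APP_NS}}}GetQuote")
    ET.SubElement(request, f"{{{APP_NS}}}symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")
```

Because the envelope structure is fixed by the recommendation, any SOAP 1.2 processor, in any programming language, can parse the Body and dispatch the request, which is precisely the cross-language interoperability the standard promises.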
- "Wires Make Wireless Strain Gauge"
Technology Research News (06/25/03); Bowen, Ted Smalley
Japanese researchers have created simple embedded sensors that accurately measure a building's structural integrity as it is subjected to different forces. The sensors can be read wirelessly and are embedded in construction materials such as concrete and fireproof coating. The Keio University researchers developed several different sensors to measure distortion and shifting, horizontal displacement, and other movements. While current structural sensors also take such measurements, they are expensive, complex, and difficult to deploy and read. The Keio University devices, on the other hand, are mechanical, do not require power sources, and have tested favorably against laser sensor readings. The devices rely on thin aluminum wires that flex according to strain and measure electrical properties such as capacitance, resistance, and inductance. "Capacitance is our preference [because] the system is simple and cheap," explains Keio University's Akira Mita, who adds that the devices can be fabricated from any conductive material. Several versions of the sensors have been constructed by the Japanese researchers: One version features a thin, stretchable wire hooked up to a block of conductive material at one end and wedged between conductive layers on the other end; another version utilizes a variable capacitor, variable inductor, or variable resistance component that measures the strain the wire undergoes.
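The article does not give Mita's calibration formula, but the principle reduces to reading strain off a relative change in an electrical property. A sketch under an assumed linear small-strain model (relative capacitance change proportional to strain, with a gauge factor we invent purely for illustration):

```python
def strain_from_capacitance(c_ref: float, c_meas: float,
                            gauge_factor: float = 1.0) -> float:
    """Estimate strain from the relative capacitance change of an embedded
    wire sensor, assuming delta_C / C_ref = gauge_factor * strain."""
    return (c_meas - c_ref) / (c_ref * gauge_factor)

# With a unit gauge factor, a 1% capacitance rise reads as 1% strain
print(strain_from_capacitance(100.0, 101.0))  # 0.01
```

Real sensors would be calibrated against reference measurements, as the Keio team did against laser readings, rather than relying on an assumed gauge factor.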
- "Power Play"
InformationWeek (06/23/03) No. 945, P. 30; Ricadela, Aaron
The U.S. supercomputing push over the last 10 years has emphasized building PC clusters out of off-the-shelf parts to minimize costs, but experts argue that the efficiency and programmability limitations of such commoditized systems are a major hindrance to further development; their argument is strengthened by the emergence of the Earth Simulator, a vector architecture-based supercomputer funded by the Japanese government that outperforms all other supercomputers. This development has raised alarms among government users who are worried that the U.S. supercomputing effort is lagging, and who believe that the research and development strategy that government, IT vendors, and business have been following is in need of serious revision. Trying to overtake the Earth Simulator is necessary if budding computing problems in the fields of materials science, nanotechnology, artificial intelligence, and cosmology are to be addressed, contends Dan Reed of the University of Illinois' National Center for Supercomputing Applications. Cray, Hewlett-Packard, Sun Microsystems, Silicon Graphics, and IBM researchers are devising new supercomputer designs that focus more on real-world productivity than on peak performance gains. IBM won a $290 million contract with the Energy Department to construct the 100-teraflop ASCI Purple and the 360-teraflop Blue Gene/L for Lawrence Livermore National Laboratory, while both Cray and IBM have proposed or are designing vector systems that promise significant speed upgrades. HP, IBM, and the other three vendors are vying to land a $45 million Defense Department contract to build prototype systems for the department's High Productivity Computing Systems Project. There are, however, commodity supercomputing adherents who doubt that the government's R&D approach, which favors vector architecture, dovetails with private-sector objectives. 
"The problem proponents of vector architecture face is while they may find applications, how do you craft a business around that technology?" notes IBM deep computing VP Dave Turek.
Click Here to View Full Article
- "Uncrossed Wires"
Economist (06/21/03) Vol. 367, No. 8329, P. 72
Teleconferencing has two familiar weaknesses: it is hard to make out what one individual is saying when other chatter is taking place, and discussions tend to become a series of long, drawn-out formal dialogues. However, a solution based on social chit-chat, such as what occurs at a cocktail party, is on the way. A team led by Palo Alto Research Center (PARC) researcher Paul Aoki has developed a system called Mad Hatter that is able to determine who is talking to whom during a teleconference call, and even change the volume of sound to facilitate a more normal conversation. With a typical teleconferencing system, it is very difficult for more than one person to speak at a time. Even in formal, fast-paced situations, the monotonous volume of the conference call makes speaker comprehension difficult. Mad Hatter changes the volume of a conference call based on the overlap of dialogue between two individuals and the gap between the moment one person stops speaking and another starts. The system samples the conference-call conversation every 30 milliseconds, and renders background chatter one-fifth as loud as the immediate dialogue.
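The hard part of Aoki's system is deciding, from overlap and gap timing, who belongs to the listener's current conversation; that classifier is beyond a short sketch, so the code below assumes it has already labeled a foreground set and shows only the mixing rule the article describes, attenuating everyone else to one-fifth volume each 30 ms frame. The function and constant names are ours.

```python
BACKGROUND_GAIN = 0.2  # background chatter at one-fifth loudness
FRAME_MS = 30          # update interval reported for Mad Hatter

def mix_frame(streams: dict, foreground: set) -> list:
    """Mix one frame: speakers in the listener's current conversation
    stay at full volume; all other streams are attenuated."""
    mixed = None
    for speaker, samples in streams.items():
        gain = 1.0 if speaker in foreground else BACKGROUND_GAIN
        scaled = [s * gain for s in samples]
        mixed = scaled if mixed is None else [a + b for a, b in zip(mixed, scaled)]
    return mixed if mixed is not None else []
```

Re-running the classification every frame is what lets conversations form and dissolve fluidly, as they do across a crowded room.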
- "What's On the Technology Horizon? Six Perspectives--Part 1"
Computer Technology Review (05/03) Vol. 23, No. 5, P. 35; Durance, Mike; Dyson, Esther; Shepard, Steven
EDventure Holdings Chairman Esther Dyson, Shepard Communications Group President Steven Shepard, and Toshiba Telecommunications Systems' Mike Durance--all panelists on the advisory board for Deloitte & Touche's TMT Trends publication--offer their thoughts on anticipated next-generation "killer apps" for the TMT (technology, media, and telecommunications) sector. Dyson believes that applications will become enablement tools running on a killer platform, and foresees ubiquitous electronic information and greater traceability of people and assets as a result. Such technology will need technical and legal safeguards, and Dyson indicates that metadata will help improve knowledge and communications management. Shepard expects a major windfall for the telecommunications sector in the form of "killer access" furnished by the deployment of universal broadband, and notes that this can only happen if the FCC drastically revises the rules concerning line-sharing so that competitive local exchange carriers (CLECs) cannot tap into the resources of incumbent companies in order to vie with them for customers. This will create an incentive for providers to implement universal broadband, which could add between $400 billion and $600 billion annually to the U.S. economy, according to Forrester Research and other industry analysis firms. Durance declares that the convergence of voice and data, combined with the needs of mobile users for personalized communications, is laying the groundwork for new killer apps. He maintains that these applications should boast ease-of-use, affordability, and multimedia support. Durance anticipates the advent of "highly flexible, adaptable communications solutions that allow users to blend call handling and applications to fit their enterprise needs."
- "The Translation Challenge"
Technology Review (06/03) Vol. 106, No. 5, P. 35; Walter, Chip
First envisioned in the 1950s, software that translates text from one language into another now underpins a market expected to grow to $13 billion by 2007. Translation software is used to translate text for ads, corporate records, and Internet content for people overseas, although most translation is still done by human translators. Stephen Richardson at Microsoft Research believes the amount of text available today is too much to be translated without automated assistance. "It's growing exponentially," he says. But deciphering words with multiple meanings is problematic for translation software, which generally employs three approaches to tackle that challenge. The first approach is called knowledge-based machine translation, which involves matching words and phrases to lists of sentence structures created by humans in multiple languages, and is used by IBM in its WebSphere Translation Server for Internet applications. The other two approaches are "example-based systems" and "statistical techniques," which are based on algorithms and advanced mathematical models. Such software is 70% to 80% accurate, which is adequate for most applications and also saves time, says Michael McCord at IBM's Language Analysis and Translation division.
(Access to full article for paid subscribers only.)
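The multiple-meanings problem the article describes is easy to reproduce with a toy word-for-word system. The lexicon entries below are invented for illustration; real knowledge-based systems match whole phrases against human-built sentence structures rather than picking a word's first sense.

```python
# Toy English-to-French lexicon: each word maps to its possible senses
LEXICON = {
    "the": ["la", "le"],
    "bank": ["banque", "rive"],   # financial institution vs. riverbank
    "is": ["est"],
    "closed": ["fermée", "fermé"],
}

def naive_translate(sentence: str) -> str:
    """Always pick the first listed sense; with no model of context,
    an ambiguous word like 'bank' is simply a coin flip."""
    return " ".join(LEXICON.get(word, [word])[0]
                    for word in sentence.lower().split())

print(naive_translate("the bank is closed"))  # "la banque est fermée"
```

Here the first-sense guess happens to yield correct French, but the same code renders "the river bank" as nonsense; distinguishing the two senses is exactly the work that knowledge-based rules, example corpora, or statistical models must do.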