Volume 5, Issue 535: Wednesday, August 20, 2003
- "Next Big Thing in Computing: It's Not About the Technology"
Investor's Business Daily (08/20/03) P. A1; Deagon, Brian
The next computing revolution will not center on any single technology, but on how technology is implemented to better suit business needs. Aberdeen Group CEO Tom Willmott says that, since the dot-com bust, no so-called killer app or radical new idea has been introduced; there is, however, a noticeable change happening in how technology is put together. Whereas point solutions were the order of the day in the late 1990s, today companies are loath to introduce new components unless they reduce complexity. American Technology Research analyst Mark Stahlman says technological complexity is the No. 1 IT concern for businesses, and notes that IBM was first to directly address this with its "autonomic computing" initiative, in which it aims to create computer systems that can manage and protect themselves similar to how the human nervous system regulates breathing and body temperature. Other IT vendors have since adopted similar programs and each industry segment is seeing more consortium work to hammer out interoperability standards. Creative Strategies analyst Tim Bajarin says the standards pieces are coming together for the next platform shift in computing. PricewaterhouseCoopers' annual technology report notes movement toward real-time computing akin to what the technology-heavy financial services industry has largely achieved. Being able to monitor business as it happens would mean a huge improvement over the current situation, where reports are often made from months-old data. Technologies such as Linux and grid computing are also putting pressure on technology vendors by allowing customers to put together powerful, low-cost solutions; Willmott says IT vendors need to focus on how technology affects the business bottom line.
- "Head of FTC Opposes Bills To Curb Spam"
Washington Post (08/20/03) P. E1; Krim, Jonathan
In a speech to attendees at a yearly technology-policy forum in Colorado, FTC chief Timothy J. Muris sharply criticized a number of anti-spam measures currently being debated in Congress, describing them as "largely ineffective." He placed special emphasis on Sen. Charles E. Schumer's (D-N.Y.) proposal for establishing a do-not-spam registry, and argued that enforcing such a registry would be futile because the most notorious spammers conceal their identities. Schumer declared, "A do-not-spam list isn't going to solve all the problems with spam, but it's the most broad-based and aggressive approach we know." Thus far, a Senate committee has passed one of the bills Muris finds fault with, a proposal from Sens. Ron Wyden (D-Ore.) and Conrad Burns (R-Mont.) advocating an increase in penalties for deceptive or fraudulent spammers. Muris also harbors strong doubts about the anti-spam community's desire that no consumers should receive any unsolicited commercial email unless they specifically request it. ISPs and marketing companies prefer an opt-out policy in which consumers receive commercial email unless they request not to. Muris said that it is impossible to determine if most consumers would favor an opt-in or opt-out system, meaning proposals including such ideas are inappropriate for anti-spam legislation. Bills that companies favor also fell under Muris' scrutiny. He contended that the language in some of the proposals would blunt the FTC's ability to guarantee that marketers conform to consumers' requests to opt out. Such bills, he claimed, would make it the commission's responsibility to prove that companies using third-party marketing firms to distribute email ads were aware that consumer opt-out requests were being disregarded.
- "Skulls Gain Virtual Faces"
Technology Research News (08/20/03); Patch, Kimberly
Max Planck Institute for Computer Science researcher Kolja Kahler says that the reconstruction of human faces for forensic investigation and anthropological research could be significantly accelerated with a new computerized technique. Manually sculpting a traditional clay model of a human face based on bone structure often takes weeks; the computerized modeling process devised by the Max Planck researchers, in contrast, can be completed in less than 24 hours. Kahler notes that the technique builds the muscles and skin on top of the skull, and is "essentially the virtual counterpart to the de facto standard method used for manual facial reconstruction in police work." The software developed by the Max Planck researchers allows markers, or landmarks, to be placed on a 3D virtual model of a real skull derived from laser scanning; these landmarks correspond to statistical measurements of tissue depth. The movement of the model's muscles produces a corresponding change in the skin layer. The technique also incorporates rules that determine facial characteristics such as nose width and lip thickness. Computer reconstruction gives researchers the luxury of building alternate models to reflect varying facial expressions, weight gain, or weight loss. Kahler says that the next phase of the project involves working with forensic artists and anthropologists to make the technique more useful.
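The article's description — landmarks on a scanned skull carrying statistical tissue depths, with skin built outward over them — can be caricatured in a few lines: offset each skull vertex along its surface normal by an interpolated depth. The landmark values and the inverse-distance interpolation below are hypothetical illustrations, not the Max Planck group's actual statistics or algorithm.

```python
# Sketch of landmark-based tissue-depth reconstruction. The depths and
# the interpolation scheme are hypothetical, for illustration only.
import math

# Each landmark: an (x, y, z) position on the skull and a statistical
# soft-tissue depth (in millimetres) at that point.
LANDMARKS = [
    ((0.0, 0.0, 0.0), 4.5),   # e.g. glabella
    ((3.0, 0.0, 0.0), 6.0),   # e.g. cheek
    ((0.0, 3.0, 0.0), 3.5),   # e.g. nasal bridge
]

def tissue_depth(point, landmarks=LANDMARKS):
    """Inverse-distance-weighted interpolation of tissue depth at `point`."""
    num, den = 0.0, 0.0
    for pos, depth in landmarks:
        d = math.dist(point, pos)
        if d < 1e-9:          # exactly on a landmark: use its depth directly
            return depth
        w = 1.0 / d ** 2
        num += w * depth
        den += w
    return num / den

def skin_vertex(skull_vertex, unit_normal):
    """Offset a skull vertex outward along its unit normal by the
    interpolated tissue depth, giving a point on the skin surface."""
    depth = tissue_depth(skull_vertex)
    return tuple(v + depth * n for v, n in zip(skull_vertex, unit_normal))
```

Repeating the offset for every vertex of the scanned skull mesh yields a first skin surface, which the real system then refines with its muscle model.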
- "Bugging the World"
Business Week (08/25/03) No. 3846, P. 100; Green, Heather
Great Duck Island off the coast of Maine represents the next stage of computing. Intel is sponsoring a project on Duck Island involving sensor networks, in which 200 tiny computers are scattered across the island in an effort to track everything related to the movement of the storm petrel, an enigmatic bird. The simple machines act as the eyes and ears on Duck Island, serving as chemical, motion, and temperature detectors, relaying information from one node to the next, and finally to a control center with a more sophisticated computer that is able to analyze the data. Other first-generation sensor networks have been rolled out by British supermarket Tesco to track inventories, and by Shell Oil to check the status of pumps at gas stations. Within five years, however, sensor computers are expected to be the size of a grain of sand and to appear all over the globe, scattered across everything from farms to battlefields and serving as monitoring tools. Sensor networks are viewed as a more pervasive version of the Internet that could bring online anything that moves, grows, makes a noise, or heats up. However, there are some serious concerns about sensor networks acting as bugging devices, and how governments and companies would use the technology. While advocates of sensor networks seek to address privacy fears, researchers will need to find an energy supply that would allow the technology to operate for years. What is more, the cost of the technology would need to come down before there is a widescale rollout.
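The node-to-node relaying the article describes can be sketched as greedy forwarding over a neighbor graph: each node hands a reading to whichever neighbor is fewest hops from the control center. The topology, node names, and routing rule below are all illustrative assumptions, not the Duck Island network's actual protocol.

```python
# Toy multi-hop relay in a sensor network: each node forwards a reading
# toward the base station via the neighbor with the fewest hops remaining.
# Topology, hop table, and the greedy rule are illustrative assumptions.

NEIGHBORS = {
    "n1": ["n2", "n3"],
    "n2": ["n1", "n3", "base"],
    "n3": ["n1", "n2", "base"],
    "base": [],
}
# Hop counts to the base station (in a real network these would be
# learned, e.g. by flooding a beacon outward from the base).
HOPS = {"base": 0, "n2": 1, "n3": 1, "n1": 2}

def route_to_base(start):
    """Return the sequence of nodes a sensor reading traverses from
    `start` to the base station under greedy fewest-hops forwarding."""
    path, node = [start], start
    while node != "base":
        # Forward to the neighbor closest (in hops) to the base station.
        node = min(NEIGHBORS[node], key=HOPS.__getitem__)
        path.append(node)
    return path
```

A reading taken at an outlying node thus reaches the control center in a couple of hops without any node needing a direct radio link to the base.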
- "Grappling With Virus Invasion"
Wired News (08/20/03); Delio, Michelle
Security experts such as Sophos' Chris Belthoff speculate that the rapid spread of the Blaster worm has inspired other virus authors to wreak havoc on the Internet by unleashing their own malicious code, as evidenced by recent outbreaks. Analysts contend that the opportunity for hacker mischief has only been amplified by carelessly written Microsoft code and end users' slowness to patch their vulnerable systems. The experts concur that the only effective strategy for curbing these computer epidemics is to develop better applications, institute more ethical behavior, and boost threat awareness through education. The most critical step, they argue, is for Microsoft to dramatically improve the security of its applications and operating systems; for its part, Microsoft recently advised users in newspaper ads to bolster their PC protection with regular patching, firewalls, and antivirus software. Microsoft security program manager Stephen Toulouse admits that his company must "do a better job of educating and informing our users and delivering patches to them." However, security experts note that Microsoft has created a climate of distrust among users, which is why most are unlikely to accept automatic patch updates. Virus researcher George Smith considers antivirus companies' strategy of continuously advising users to update their security themselves to be pointless, arguing, "People who are not susceptible to viruses and worms don't need the advice and those who are susceptible just aren't reachable." Meanwhile, security researcher Robert Ferrell doubts that security issues will fully disappear, given people's creativity and the fact that morality and ethics will never be observed equally. Still, he believes a secure Internet is a reachable goal if the worldwide online community can unite to solve the security problem.
- "Are You a Good or a Bad Worm?"
Wired News (08/19/03); Delio, Michelle
Machines affected by the recently released MSBlaster worm are being cured and patched by a variant, AntiMSBlaster, but although many computer users welcome this development, experts warn that there is no reason to think the new worm is benevolent. "Some may call this a good worm, but it can cause all sorts of problems when patches are applied to a computer unbeknownst to the administrator of a network or the owner of that computer," notes iDefense's Ken Dunham, who adds that AntiMSBlaster could install back doors that leave computers vulnerable to future hacker intrusions. AntiMSBlaster and MSBlaster share a similar modus operandi: Both enter systems via a network connection rather than as an email attachment, and only Windows 2000 and Windows XP machines that have not been patched for the RPC DCOM buffer overflow security vulnerability are susceptible. MSBlaster was designed to exploit contaminated computers to launch a denial-of-service attack against Microsoft's Windows Update Web site on Aug. 16, but Microsoft was able to fend off the attack by removing the windowsupdate.com domain name, which was specified in the worm's code. MSBlaster is relatively easy to purge, leaving some security experts curious as to why users seem incapable of fixing their own computers. However, certain users bemoaned the lack of any clear-cut information about removing the worm. "These virus and worm removal advice I see are obviously written by nerds for nerds," says user Paul Pacifico. Systems administrator Mike Fergamo says that Microsoft needs to find a more effective way of notifying users of security flaws and distributing patches.
- "Media Groups Appeal P2P Ruling"
CNet (08/19/03); Borland, John
Movie studios and record labels are appealing a federal court ruling in April that supported the legality of certain file-swapping software--the first ruling ever to favor such a viewpoint. Recording Industry Association of America (RIAA) President Cary Sherman said in an Aug. 19 statement that Los Angeles federal court judge Stephen Wilson made the wrong decision by ruling in favor of peer-to-peer (P2P) firms "that were built for the exclusive reason of illegally exchanging copyrighted works, and they make money hand over fist from it." Wilson said Grokster and Streamcast Networks P2P software was a technology akin to VCRs or Xerox copiers, and argued those companies should not be held accountable for piracy committed by their customers. Attorneys for the National Music Publishers Association, which joined the RIAA and the Motion Picture Association of America in the appeal, wrote that "The district court incredibly equated [Streamcast] and Grokster to Xerox, rather than the more analogous comparison to the illegitimate Napster service." The judge's decision spurred the RIAA to step up its push to sue individual swappers for copyright infringement, and the organization has been firing off subpoenas to ISPs demanding that they reveal the identities of purported file traders. A decision from Wilson on Sharman Networks, which is also being sued for copyright infringement in his court, is still pending; Sharman is the company behind the Kazaa file-swapping service. The copyright holders have long promised that they would file an appeal, but it is only now that they were able to forge ahead following the resolution of certain procedural matters.
- "Nanotech Puts Tiny Chips in Reach, Researcher Says"
SiliconValley.com (08/19/03); Takahashi, Dean
Andre DeHon of the California Institute of Technology used Stanford University's Hot Chips conference to forecast that chips with wires 30 times smaller than current chips could emerge in three to five years thanks to progress in nanotechnology. Taken collectively, the last few years' academic and corporate nanotech breakthroughs could lead to the replacement of photolithographic methods used in today's semiconductor plants, DeHon declared. Such facilities can currently fashion 90-nm wires, while the silicon nanowires DeHon envisions--fabricated through chemical rather than photolithographic means--would measure just 3 nm across. The Caltech scientist said research has demonstrated that silicon nanowires can self-assemble by inducing a chemical reaction between gold and silane, and added that the wires can be directed to carry out memory or processing operations. Electronic Design editor-at-large Dave Bursky expressed doubt that the supplanting of photolithography will take place in so short a time as DeHon predicts. "Researchers are typically optimistic about the time it takes to commercialize something," he commented. Other attendees were unsure whether nanotechnologists will be able to control atomic-level chemical reactions or develop a workable technique for testing nanowires. DeHon responded that the wires could probably function in the gigahertz range, but problems could crop up at higher speeds due to electrical resistance.
- "Classical vs. Quantum Computers: And the Winner Is..."
NewsFactor Network (08/18/03); Martin, Mike
The long-running battle between classical and quantum computing may have taken a decisive turn with a paper indicating that classical computers boast far greater energy efficiency and lower error rates than quantum computers. This is in direct contrast to expectations of quantum computing espoused by the popular media, in which the quantum bit's ability to simultaneously exist as 0 and 1 could supposedly ramp up information processing to unheard-of speeds. Physical Review A quantum computing editor Julio Gea-Banacloche and Texas A&M electrical-engineering professor Laszlo Kish argue that this assumption may be overhasty in a paper recently accepted for publication. Kish states that their report "proves, by fundamental physical arguments, that for general-purpose computing, the classical computer will always perform much better than any possible quantum solution." Greater energy consumption gives rise to heat dissipation and "thermal noise" that can trigger random "bit flipping," leading to errors. Kish and Gea-Banacloche found that classical computers are more energy efficient than quantum machines by comparing the energy requirements of bit and quantum bit manipulation. "The main conclusion of the paper is very different from any earlier papers or popular media articles on quantum computing," notes Kish.
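The paper's derivation is not reproduced in the article, but the kind of comparison involved rests on standard thermodynamic limits. As textbook background (not the authors' own argument): erasing one classical bit costs at least Landauer's bound, while the chance that thermal noise flips a bit switched at energy E falls off exponentially.

```latex
% Textbook thermodynamic background, not the paper's own derivation:
% Landauer's bound on erasing one classical bit, and the probability
% of a thermal-noise bit flip at switching energy E.
E_{\min} = k_B T \ln 2 ,
\qquad
P_{\mathrm{err}} \sim e^{-E/(k_B T)}
```

Because the classical error probability shrinks exponentially as switching energy grows, a classical gate buys extremely low error rates for a modest energy budget; the paper weighs this against the cost of manipulating quantum bits at comparable fidelity.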
- "DNA Sparks a Computer Revolution"
Associated Press (08/18/03); Elias, Paul
NASA, the Pentagon, and other federal agencies are funding research projects that seek to tap DNA as the fundamental building block of a new generation of powerful computers. University of Southern California computer scientist Leonard Adleman realized 10 years ago that DNA molecules and computers have amazing similarities: The former store data using strings of molecules, while the latter do the same with sequences of 1s and 0s. An even more intriguing phenomenon was a living enzyme's ability to "read" DNA in much the same way as a machine reads data, as outlined by computer pioneer Alan Turing in 1936. DNA computing became reality when Adleman was able to calculate the solution to the "traveling salesman" problem by using the predictability of DNA interaction. Although current DNA computers are only capable of solving rudimentary problems, researchers hope they will be used for a variety of functions: Columbia University scientist Milan Stojanovic is working on a DNA machine that can compute without human assistance, which NASA researcher Paul Fung says could be helpful in maintaining the health of astronauts; the Weizmann Institute of Science's Ehud Shapiro has conceived of molecular machines that store medical information and can be injected into people; and another project seeks to exploit DNA's powers of self-replication to assemble processors. Before these visions can become reality, researchers will have to devise methods to shorten the time it takes DNA computers to calculate problems, and find a way to control biological processes to ensure accuracy.
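Adleman's experiment solved a small directed Hamiltonian path instance (a close relative of the traveling-salesman problem): DNA ligation generated candidate paths massively in parallel, and chemical steps filtered out the invalid ones. The same generate-and-filter scheme can be sketched conventionally; the graph below is an illustrative example, not Adleman's actual instance.

```python
# Generate-and-filter search for a directed Hamiltonian path -- the
# scheme Adleman's DNA experiment ran chemically. DNA ligation produced
# candidate paths in parallel; here we enumerate them explicitly.
# The example graph is illustrative, not Adleman's actual instance.
from itertools import permutations

EDGES = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
N = 4  # vertices 0..3; a valid path must start at 0 and end at 3

def hamiltonian_paths(edges=EDGES, n=N, start=0, end=N - 1):
    """All vertex orderings that survive the two filters Adleman applied
    chemically: correct endpoints, and every step along a real edge."""
    found = []
    for perm in permutations(range(n)):
        if perm[0] != start or perm[-1] != end:
            continue  # filter 1: wrong start or end vertex
        if all((a, b) in edges for a, b in zip(perm, perm[1:])):
            found.append(perm)  # filter 2: every consecutive pair is an edge
    return found
```

The point of the DNA version is that the "for perm in permutations" step happens simultaneously across trillions of molecules, which is why the approach attracts interest despite the slow chemistry.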
- "First Game-Playing DNA Computer Revealed"
New Scientist (08/18/03); Hogan, Jenny
MAYA, a DNA computer that plays unbeatable tic-tac-toe using enzymes, was devised by Columbia University's Milan Stojanovic and the University of New Mexico's Darko Stefanovic. The human player makes a move by depositing one of nine DNA strands into a three-by-three grid of nine wells, and the DNA enzymes in the mixture place an "X" or an "O" in a specific well depending on the strand that is added. The move is signaled by a green glow that results when the enzymes snip apart molecules in the mixture. Kobi Benenson of Israel's Weizmann Institute says Stojanovic and Stefanovic's breakthrough represents the most sophisticated use of molecular logic gates yet. MAYA can also be modified to always win or draw. However, the device's creators doubt that such a system will become a significant competitor to silicon computers--the system lacks reusability and cannot function without human action. The researchers are now concentrating on the development of simple decision-making solutions that can be carried out in vivo, including molecule-size solutions that can assess and fix problems among living cells.
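MAYA hard-wires its strategy into a network of enzyme logic gates. As a software illustration of what "unbeatable" means here, the same perfect play can be sketched with a minimax search; this models the strategy only, not the molecular gate network.

```python
# Unbeatable tic-tac-toe via minimax -- a software stand-in for the
# fixed strategy MAYA encodes in enzyme logic gates (illustration only;
# this is not how the molecular gates are wired).

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(b):
    """Return 'X' or 'O' if that player has three in a row, else None.
    The board is a 9-character string of 'X', 'O', and spaces."""
    for i, j, k in LINES:
        if b[i] != " " and b[i] == b[j] == b[k]:
            return b[i]
    return None

def minimax(b, player):
    """Return (score, move) for `player`: 'X' maximizes (+1 = X wins),
    'O' minimizes (-1 = O wins), 0 = draw under perfect play."""
    w = winner(b)
    if w:
        return (1 if w == "X" else -1), None
    moves = [i for i, c in enumerate(b) if c == " "]
    if not moves:
        return 0, None  # board full: draw
    results = []
    for m in moves:
        nxt = "O" if player == "X" else "X"
        score, _ = minimax(b[:m] + player + b[m + 1:], nxt)
        results.append((score, m))
    return (max if player == "X" else min)(results)

def best_move(board, player):
    return minimax(board, player)[1]
```

Running minimax from the empty board confirms the familiar result that perfect play on both sides always ends in a draw, which is why a fixed gate network like MAYA's can never be beaten, only drawn against.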
- "Patching Becomes a Major Resource Drain for Companies"
Computerworld (08/18/03); Vijayan, Jaikumar
Keeping computer systems secure against worms and viruses through regular software patching is putting a strain on companies' limited resources. Banner Health System security analyst Dave Jahne warns, "The thing about patching is that it is so darn reactive. And that can kill you." Art Manion of Carnegie Mellon University's CERT Coordination Center notes that larger and more expansive companies have the added burden of testing each new patch prior to deployment. Testing is important because patches do not always work properly and can interfere with the applications they are supposed to safeguard, according to TippingPoint Technologies CTO Marc Willebeek-LeMair. Ramping up patch testing and implementation is vital as virus proliferation is accelerating, argues Arlington County, Va., infrastructure technologies director Vivek Kundra. He explains that his county can no longer afford to spend three or four days to fully patch its networks; the job should be done in a matter of hours, if not minutes. Possible solutions Arlington County is investigating include handing the patch management process over to an outsourcer and adopting a more automated patch testing and deployment procedure. "There will be times when you may need to make a judgment call balancing risk, appropriate testing [and] mitigating factors," explains Online Resources security officer Hugh McArthur. Meanwhile, Tessenderlo Kerle CIO Bruce Blitch maintains that software patching is still the best strategy for companies, for lack of a way to guarantee absolute code security.
- "Project Searches for Open-Source Niche"
CNet (08/18/03); Olsen, Stefanie
The Nutch project is developing open-source software for finding documents on the Internet, but the Nutch methodology differs from those of major search providers in that it will not be kept secret, says lead architect Doug Cutting. He insists that "People have the right to know how their search engine works, so they can trust it." The nonprofit project, which has been operating quietly for about a year, was conceived as an open-source search engine for academic researchers; Nutch is being funded by Overture Services, while Cutting was instrumental in recruiting O'Reilly & Associates President Tim O'Reilly and Electronic Frontier Foundation founder Mitch Kapor for the board of directors. Written in Java, the Nutch engine is based on Lucene, a software library partly developed by Cutting that developers can use to give email and other technologies search capabilities. Cutting explains that academic researchers or developers will be able to download and adapt the Nutch software with little difficulty, and adds that foreign governments could employ Nutch to build a nonproprietary search site for citizens instead of licensing proprietary technology supported by advertising, while companies could use the technology as the core of a for-profit business. Nutch has unveiled its downloadable software for research, which typical Web surfers will likely find too esoteric. Search Engine Watch editor Danny Sullivan cautions that broader use of Nutch could be limited because it is open-source, and thus highly vulnerable to spammers. Observers note that implementing transparency is becoming a critical issue as the Internet search industry consolidates.
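Search libraries such as Lucene are built around an inverted index: a map from each term to the documents containing it, which makes multi-term queries a matter of set intersection. A minimal sketch of that core structure (illustrative only; this is not Nutch's or Lucene's actual code):

```python
# Minimal inverted index -- the core data structure behind search
# libraries like Lucene (an illustration, not Nutch's implementation).
from collections import defaultdict

def build_index(docs):
    """Map each lowercased term to the set of doc ids containing it.
    `docs` is a dict of {doc_id: text}."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of docs containing every term in the query (AND search)."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = set(index.get(terms[0], set()))
    for term in terms[1:]:
        results &= index.get(term, set())
    return results
```

A production engine layers ranking, tokenization, and crawling on top of this structure; the transparency Cutting argues for is precisely that all of those layers are open to inspection.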
- "Attack Reveals GNU Project's Vulnerability"
NewsFactor Network (08/14/03); Ryan, Vincent
A hack attack in March compromised the GNU Project's chief FTP download server, but the flaw was not detected until late July, according to a statement from the Free Software Foundation (FSF). The FSF advises anyone who has downloaded from the affected server to check their files for tampering, while the CERT Coordination Center of Carnegie Mellon University's Software Engineering Institute recommends that any sites using GNU software originating from the vulnerable system should verify the integrity of their distributions. FSF executive director Bradley Kuhn says that subsequent investigation has yielded no evidence of damage to GNU source code, but he attributes the intrusion's slow discovery to the fact that only one system administrator oversees GNU's infrastructure; the break-in was only revealed when the hacker accessed other machines on the GNU Project network. The attack exploited the Linux kernel's ptrace vulnerability to plant a Trojan horse on gnuftp.gnu.org. According to the FSF statement, the hacker wanted to use gnuftp "to collect passwords and as a launching point to attack other machines." An analysis of the victim machine revealed that the intruder replaced the SSH client and employed it to capture the passwords of other people who logged on. Kuhn says the fact that the incident was local indicates that the cracker could be someone who stole the password to a GNU maintainer's account or a person with legitimate access to the FTP server; the FSF has removed GNU maintainers' local shell access to the server for the time being. Since Aug. 2, the foundation has been authenticating established, trusted secure checksums of all files before restoring them to the FTP site.
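The FSF's remedy — comparing every file against a trusted checksum before restoring it — can be sketched with standard tools. The manifest format and the choice of SHA-256 below are illustrative assumptions, not the FSF's actual procedure.

```python
# Verify downloaded files against a manifest of trusted checksums --
# the kind of integrity check applied after the gnuftp break-in.
# The manifest format and SHA-256 choice are illustrative assumptions.
import hashlib

def sha256_of(path):
    """Hash a file incrementally so large tarballs fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(manifest):
    """manifest: {path: expected_hex_digest}.
    Return the list of paths whose contents do not match."""
    return [p for p, want in manifest.items() if sha256_of(p) != want]
```

Any path returned by `verify` must be treated as tampered with and re-fetched from a trusted source rather than used.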
- "Profile of the Superworm: SoBig.E Exposed"
TechNewsWorld (08/13/03); Germain, Jack M.
Internet security experts say that the SoBig.E variant poses a serious long-term threat to the Internet because it has opened up so many computers to hackers. SoBig.E, which is primarily spread via shared files on corporate networks and secondarily through email, opens a large back door on infected machines and contains a built-in maintenance channel through which the hacker can update code. Cable & Wireless chief security officer William Hancock says SoBig.E is the first worm to exploit hacking technology to deploy spam tools en masse. The maintenance channel poses an extra threat in that other hackers could reverse-engineer the code to create their own SoBig variants. To date, each version up to SoBig.E has been timed out by the worm's author and followed up by another version, but so far, SoBig.F has not appeared. Once the worm is written to network users' startup folders (possible only if write access is left open) or opened from a ZIP email attachment, SoBig.E sends itself to all contacts in the user's address book as well as to all email addresses stored in other documents on the computer. In the email mode, the live part of the virus is called details.pif, but it copies itself into winssk32.exe once opened and creates a MSRRF.DAT file some analysts say is a foothold for remote control. Hancock estimates that SoBig.E will increase the amount of spam sent over the Internet by a factor of 10 because of the number of unsophisticated victims whose computers have been hijacked.
- "Real-Time Java Takes Flight"
Software Development Times (08/03) No. 83, P. 26; Correia, Edward J.
The Goldengate Project is a joint venture between NASA's Jet Propulsion Laboratory, TimeSys, Sun Microsystems, and Carnegie Mellon University to develop a Java-enabled Mars Rover prototype by leveraging new and pending specifications designed to streamline embedded development and extend device interoperability. "We're trying to prove that Java and Linux can be flight-ready technologies and that you can build hard real-time solutions with them," explains CMU software engineer Brian Giovannoni. "In the Mars Rover, we have to control individual pieces of hardware at specific intervals on a periodic basis, otherwise you crash into a wall." TimeSys is contributing the Linux real-time operating system and the Real-Time Specification for Java (RTSJ) to the Goldengate Project; TimeSys technology VP Doug Locke notes that the advantages of the RTSJ include its ability to accommodate asynchronous events, while Java's type-safe object model can support more reliable code than other languages. A key ingredient of the RTSJ is a garbage collector that tracks every reference in the program, identifying and erasing anything that lacks a reference, according to Locke. He adds that uncontrolled garbage collection is checked via the insertion of immortal and scoped memory, as well as real-time and no-heap real-time thread types; the former thread type can access heap, immortal, or scoped memory, while the latter can only access immortal or scoped memory and can thus safely preempt the garbage collector. Locke says the RTSJ can handle all previous versions of Java thanks to its support of standard Java threads. Asynchronous transfer of control--forced exceptions whereby one Java thread can order another to halt its operation and start a different function--is also defined by the RTSJ.
- "Demystifying the Digital Divide"
Scientific American (08/03) Vol. 289, No. 2, P. 42; Warschauer, Mark
A widely shared view of a "digital divide"--a gaping socioeconomic chasm between those who have access to computers and the Internet and those who do not--fosters technological determinism, which assumes technology's very presence will lead to social change. Mark Warschauer of the University of California, Irvine, writes that failed attempts to improve people's lives by merely distributing technology to needy areas are a testament to the flaws in this reasoning. He observes that many people access and use technology differently, and these contextual differences are not taken into account by the binary description of the digital divide. Warschauer cites several examples illustrating how a lack of context can hobble well-intentioned initiatives: In an India-based experiment that he describes as typical of educational technology projects worldwide, outdoor computer terminals were set up in a poor urban area to give children an opportunity to familiarize themselves with the technology with "minimally invasive education"; however, most children used the machines as a plaything rather than a learning device. In academic circles, an alternative approach called social informatics is being proposed, in which consideration is given to the context of technology as it relates to hardware, software, support resources, and infrastructure, as well as people's relationships to one another and to the community at large. Studies show that computer use in educational institutions has the potential to either narrow or widen the digital divide because not all computers are used equally. Warschauer explains that high-income students in kindergarten through 12th grade employ computers more frequently for experimentation, research, and critical inquiry, while poorer students are limited to simple exercises.
"From a policy standpoint, the goal of bringing technology to marginalized groups is not merely to overcome a technological divide but instead to further a process of social inclusion," Warschauer concludes.
- "Patching Things Up"
CIO (08/01/03) Vol. 16, No. 20, P. 79; Violino, Bob
The growing number of software patches released every year is threatening to become a costly administrative nightmare for companies, which are turning to automated patch management products to ease the process--but these tools can only work in tandem with an organizational effort to bring computing environments under control. Patch management products are designed to search for and study new patches, check network-connected devices for security holes, and implement the appropriate fixes; expected benefits include less downtime due to software failures, reduced vulnerability to hack attacks, and lower costs than manually deploying patches. However, customers are often forced to buy multiple products to cover all software and systems, for lack of a one-size-fits-all patch management tool. Paccar CIO Patrick Flynn notes that coupling patch tools to existing software can be difficult, and advises companies to employ a patch management software supervisor to guard against inefficiencies. To stabilize computing environments and boost patch management software's effectiveness, Forrester Research analyst Laura Koetzle recommends that enterprises standardize on a handful of approved configurations. Some companies, Qualcomm being one, are attempting to spur vendors into streamlining patch management. "We're pushing Oracle to simplify the patching process and either help us provide a better patch solution or adopt [technology from a vendor] like Kintana as a standard," explains Qualcomm's Tom Fisher, who emphasizes the importance of logs that track patch distribution status. "Give me a log that tells me [the status of a patch distribution], so I know that it only happened on this machine, and I don't have to worry about the 18 other machines I pushed to today," he says.
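The per-machine visibility Fisher asks for amounts to scanning a distribution log for hosts whose latest push did not succeed. A small sketch of that check; the "host patch-id STATUS" log format is a hypothetical illustration, not any vendor's actual format.

```python
# Scan a (hypothetical) patch-distribution log for machines that still
# need attention -- the per-machine status visibility Fisher asks for.
# The "host patch-id STATUS" line format is an illustrative assumption.

def failed_hosts(log_lines, patch_id):
    """Return hosts whose most recent entry for `patch_id` is not SUCCESS."""
    status = {}
    for line in log_lines:
        host, pid, result = line.split()
        if pid == patch_id:
            status[host] = result  # later entries overwrite earlier ones
    return sorted(h for h, r in status.items() if r != "SUCCESS")
```

With such a log, an administrator knows exactly which of the machines pushed to today needs a retry, instead of re-auditing all of them.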