
Purchase a select ThinkPad® notebook and receive free double memory. Simply request this special offer or select it online when customizing your system. Offer ends July 21, 2003 or while supplies last. Call 1-800-426-7235, ext. 4098 or visit us online at www.ibm.com/businesscenter/acm. Offer valid from IBM in the US only from 7/1/2003 through 7/21/2003; while supplies last. Limit 10 per customer. Cannot be combined with any other offers or promotions. Shipping and handling not included.



ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either IBM or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 521: Friday, July 18, 2003

  • "Congress Questions U.S. Supercomputing Efforts"
    IDG News Service (07/16/03); Gross, Grant

    Witnesses testifying before the House Science Committee on July 16 warned that the United States' supercomputing initiative is lagging behind that of Japan, which launched the world's fastest supercomputer, the Earth Simulator, in March 2002. "Supercomputers help design our cars, predict our weather, and deepen our understanding of the natural forces that govern our lives," declared House Science Committee Chairman Rep. Sherwood Boehlert (R-N.Y.), who added that the Earth Simulator and its implications should demand the full attention of Congress. Committee members saw the Japanese machine as a sign that the country must step up its federal push for high-performance computing, as well as encourage more cooperation between federal agencies, although those who testified at the hearing debated whether supercomputing or grid computing should be a priority. The National Science Foundation's (NSF) Peter Freeman insisted that his organization still considers supercomputing to be a priority, although a February report from an NSF Advisory Panel on Cyberinfrastructure suggested that the agency devote research to other areas, such as grid computing. Freeman argued that supercomputers, along with networks, databases, and computing grids, should be integrated into the overall cyber-infrastructure if their full potential is to be reached. However, Daniel Reed of the University of Illinois at Urbana-Champaign countered that the government should pour more money into supercomputing R&D, claiming that "Many problems of national importance can only be solved by tightly coupled high-end computers." Meanwhile, Raymond Orbach of the Energy Department's Office of Science said the launch of the Earth Simulator has a bright side--it proves that sustained computation at speeds above 25 teraflops is an attainable goal.
    http://www.nwfusion.com/news/2003/0716congrquest.html

  • "Router Bug Threatens 'Internet Backbone'"
    New Scientist (07/17/03); Knight, Will

    Computer experts warn that a major software glitch can affect Internet routers that run the Cisco IOS operating system. Such routers essentially comprise the Internet backbone, according to Internet Security Systems consultant Gunter Ollman. If specially crafted data packets are repeatedly relayed to a vulnerable router, the router will attempt to continuously restart and become unavailable to direct valid traffic. Worse, Ollman reports that forcing a router to restart could prompt it to steer traffic in the wrong direction even after it has resumed operation. Cisco has released an alert recommending that routers be protected against possible hacking by updating the software to the latest version. Meanwhile, the U.S. Computer Emergency Response Team (CERT) has also issued an advisory, warning that the software flaw could permit hackers to launch denial-of-service attacks against vulnerable devices. The Cisco router bug was disclosed at the same time that Microsoft announced a software flaw that carries serious implications for all versions of the Windows operating system and that hackers could exploit to access computers. Ollman says that many organizations, ISPs especially, will be scrambling to upgrade their network security in response to the posting of the router bug, although Cambridge University computer security expert Richard Clayton declares, "I'm not saying that this isn't serious, but I've seen more things that looked like 'The End of The World As We Know It' than this does right now."
    http://www.newscientist.com/news/news.jsp?id=ns99993955

  • "Why Some Big Spammers Are Backing Spam-Control Laws"
    Wall Street Journal (07/18/03) P. B1; Dreazen, Yochi J.

    In an unusual paradox, bulk commercial emailers such as AOL Time Warner, Yahoo!, EarthLink, and eBay have come out in support of antispam legislation, while consumer groups are attempting to block the passage of such bills. "When you see some of the biggest spammers in the country backing legislation that is allegedly antispam, you really need to wonder about what these bills actually do," notes John Mozena of the Coalition Against Unsolicited Commercial Email. Among the proposals that major spammers wholeheartedly endorse is a popular bill from Sens. Ron Wyden (D-Ore.) and Conrad Burns (R-Mont.) that prohibits deceptive subject lines, requires commercial emails to include valid return addresses and opt-out policies, and imposes heavy penalties on spammers who willfully violate those edicts. The bill would also grant the FTC the authority to impose civil fines, and allow state attorneys general and private ISPs to file lawsuits directly against spammers. Sen. Charles Schumer (D-N.Y.) has introduced a separate proposal that the FTC build a "do not spam" list similar to the "do not call" list the commission recently created. Another bill sponsored by Reps. W.J. Tauzin (R-La.) and James Sensenbrenner (R-Wis.) would also establish an opt-out policy and forbid spammers from collecting random email addresses from the Internet, but state attorneys general would not be permitted to sue spammers who do not comply with opt-out requests, and the legislation gives more latitude over what constitutes permissible email. Consumer organizations' biggest gripes against these measures are that they do not allow consumers to take spammers to court, and that they make it consumers' responsibility to remove their names from mailing lists. Moreover, the bills still permit certain kinds of unsolicited email, instead of banning spam outright as many antispam advocates would like.

  • "Bill Aims to Curb Net Censorship"
    CNet (07/17/03); McCullagh, Declan

    The Global Internet Freedom Act passed by the House of Representatives on July 16 includes $16 million over two years to establish the Office of Global Internet Freedom, which would be responsible for developing technical methods for preventing oppressive foreign governments from censoring the Internet while allowing citizens to access the Web without fear of punishment, according to bill sponsor and Homeland Security Committee Chairman Rep. Chris Cox (R-Calif.). Under the bill, the new office would "develop and implement a comprehensive global strategy to combat state-sponsored and state-directed Internet jamming, and persecution of those who use the Internet." The measure is part of a larger bill, also passed by the House, that approves multi-year State Department funding. If the Senate approves the Global Internet Freedom Act, the new office would be founded under the Broadcasting Board of Governors. Regimes singled out for Internet censorship in a June report from Reporters Without Borders include the government of Myanmar, which imposes heavy restrictions on outside Internet access and monitors email traffic. Cox told the U.S.-China Economic and Security Review Commission in June that the Chinese government is also guilty of suppressing Net access, instituting broad online activity surveillance, and penalizing people who merely want to share information. Anonymizer.com President Lance Cottrell praises Cox's legislation, and believes such measures can give the Internet "an opportunity to live up to its billing as the single greatest democratizing technology ever invented." An earlier version of the Global Internet Freedom Act introduced in October 2002 would have earmarked a two-year grant of $100 million for the Office of Global Internet Freedom.
    http://news.com.com/2100-1029_3-1026690.html

  • "Purdue Software Promises Better Animation for Movies, Games"
    Newswise (07/16/03)

    The Swell software program designed by Purdue University engineering student Joshua Schpok is a tool that artists can use to more realistically animate volumetric cloud formations, smoke, steam, fog, explosions, and other gaseous phenomena for movies and video games, as well as weather forecasting. The software is intuitive and interactive, providing users with real-time results; David Ebert, director of Purdue's Rendering and Perceptualization Lab, notes that such animations may take traditional programs hours to complete. "So an artist wouldn't have to deal with scientific details--such as pressure and density, thermal convection, the percentage of dust and ice particles and all of these things that a meteorologist would look at--we have created a control system that an artist can actually manipulate," explains Ebert. He adds that cloud rendering is complicated because such objects are transparent and require full interior detail. Variable environmental conditions can be mimicked through the animation of latent skeletal structures and the independent transformation of octaves of noise (a minimal sketch of the octave technique appears after this item's link). A new goal for the Purdue engineers is to widen the spectrum of simulated lighting conditions in the animations, such as the blue color clouds attain due to the atmospheric scattering of light. The U.S. Department of Energy and the National Science Foundation are underwriters for the Swell project. The researchers will present a paper detailing their findings on July 26 at the Symposium on Computer Animation presented by the Association for Computing Machinery's Special Interest Group on Computer Graphics and Interactive Techniques, and the European Association for Computer Animation.
    http://www.newswise.com/articles/view/?id=500151
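
    The "octaves of noise" mentioned above is standard fractal-noise practice: several copies of a smooth noise function are summed, each octave at double the frequency and half the amplitude of the previous one, and animating each octave independently varies the apparent turbulence. The minimal Python sketch below illustrates only that general technique; the function names, parameters, and per-octave drift rates are assumptions for illustration, not Swell's actual code.

      import math

      def value_noise(x, seed=0):
          """Smooth 1-D value noise: pseudo-random values at integer lattice
          points, blended with a smoothstep curve in between."""
          def lattice(i):
              # Integer hash yielding a repeatable value in [0, 1).
              n = (i * 374761393 + seed * 668265263) & 0xFFFFFFFF
              n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
              return (n & 0xFFFF) / 0x10000
          i = math.floor(x)
          f = x - i
          t = f * f * (3.0 - 2.0 * f)         # smoothstep blend factor
          return lattice(i) * (1.0 - t) + lattice(i + 1) * t

      def animated_fractal_noise(x, t, octaves=4, lacunarity=2.0, gain=0.5):
          """Sum several octaves of noise; each octave doubles in frequency,
          halves in amplitude, and drifts at its own rate over time t, so
          the layers of detail move independently of one another."""
          amplitude, frequency, total, norm = 1.0, 1.0, 0.0, 0.0
          for octave in range(octaves):
              drift = t * 0.3 * (octave + 1)  # independent motion per octave
              total += amplitude * value_noise(x * frequency + drift, seed=octave)
              norm += amplitude
              amplitude *= gain
              frequency *= lacunarity
          return total / norm                 # normalized to roughly [0, 1)

      # Example: sample a slowly evolving density value, as for a cloud texture.
      print(round(animated_fractal_noise(x=3.7, t=1.0), 3))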

  • "Upload a File, Go to Prison"
    Wired News (07/17/03); Dean, Katie

    Legislation introduced by Reps. John Conyers Jr. (D-Mich.) and Howard Berman (D-Calif.) aims to penalize users for peer-to-peer file trading, in the interest of protecting copyrights, and also would make it possible to file federal charges against those who offer inaccurate information in the domain-name registration process. The Author, Consumer, and Computer Owner Protection and Security Act of 2003, or ACCOPS, classifies as a felony the uploading of just one file with material under copyright, stipulating a punishment of as much as five years in prison and a fine of up to $250,000. If passed, the legislation would increase funding to the Justice Department for investigating copyright violation cases as well. Electronic Frontier Foundation attorney Jason Schultz says the bill is "a sign of desperation" on the part of Hollywood and the recording industry, but Conyers says it is an attempt to protect intellectual property, the country's leading export; an aide to Conyers says the bill seeks to clarify the law in this area. ACCOPS also mandates that file-sharing services receive permission from consumers to store or search for files. A bill still pending in Congress would shield copyright holders from liability if they prevented P2P networks from illegally distributing their works. Schultz says existing laws already protect copyright holders, and notes that anyone whose computer holds copyrighted files and is connected to a public network could be liable under ACCOPS.
    http://www.wired.com/news/digiwood/0,1412,59654,00.html

  • "Exploding Universe of Web Addresses"
    New York Times (07/17/03) P. E5; Selingo, Jeffrey

    Some experts predict that the supply of available Internet Protocol addresses--the unique numerical combinations used to represent every device connected to the Net--will be exhausted in two years. Alex Lightman, who chaired a June conference that debated next-generation Internet Protocol version 6 (IPv6), envisions a time when practically all devices will be linked to the Internet, a time when "we're going to need something like 100 IP addresses for each human being." The current IP version 4 standard provides 4 billion IP addresses, while the deployment of IPv6 is expected to raise that number to 35 trillion by some estimates, although Verio's Cody Christman expects the new protocol's address capacity to be even vaster (the raw arithmetic behind these figures is sketched after this item's links). IPv6 is designed to simplify IP address configuration, which can be a sore point when computer users transition to a new service provider. The protocol also promises to offer what Lightman describes as end-to-end security. In fact, IPv6's security advantages have prompted the Defense Department to make IPv6 compatibility a technology procurement requirement by this fall. Perhaps the biggest distinction between IPv6 and its predecessor is that IPv6 can enable virtually all electronic devices to maintain constant communication, giving people the power to run devices from any Internet connection. "It allows us to create billions of new sensors that can instantly communicate, taking human error out of the equation," declares Larry Smarr of the University of California at San Diego.
    http://www.nytimes.com/2003/07/17/technology/circuits/17next.html
    (Access to this site is free; however, first-time visitors must register.)
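
    For context on the figures above, the raw address-space arithmetic is straightforward: IPv4 addresses are 32 bits and IPv6 addresses are 128 bits, so the theoretical pools are 2^32 and 2^128 addresses respectively; deployment estimates such as the 35 trillion cited above presumably reflect allocation practice rather than the theoretical ceiling. A quick Python check:

      # Raw address-space arithmetic for IPv4 (32-bit) vs. IPv6 (128-bit).
      ipv4_addresses = 2 ** 32
      ipv6_addresses = 2 ** 128

      print(f"IPv4: {ipv4_addresses:,}")    # 4,294,967,296 (~4 billion)
      print(f"IPv6: {ipv6_addresses:.3e}")  # ~3.403e+38
      # Lightman's 100 addresses apiece for 10 billion people barely dents it:
      print(ipv6_addresses // (100 * 10 ** 10))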

  • "Feature: In Sensors Smaller May Be Smarter"
    United Press International (07/14/03); Wasowicz, Lidia

    Spec, a low-cost, low-powered sensor technology developed at the University of California, Berkeley's Center for Information Technology Research in the Interest of Society (CITRIS), is a fourth-generation wireless mote crowning six years of research spearheaded by computer science professors Kris Pister and David Culler, who respectively pioneered "smart dust" and the TinyOS operating system. The Spec chip encapsulates a micro-radio, analog-to-digital converter, temperature sensor, and TinyOS into a 5-square-millimeter silicon package, and can transmit 902 MHz radio signals over a distance of 40 feet at 19.2 Kbps, according to tests. Pister plans to commercialize Spec into a device no larger than an aspirin that costs between $5 and $10, while CITRIS director Ruzena Bajcsy believes five years will pass before all the bugs have been ironed out. Potential applications of smart dust technology envisioned by Pister and others include tracking a person's vital signs via devices encased in jewelry, the enhancement of battlefield tactics through distributed sensor networks, monitoring and assessing a building's structural integrity, the detection of biochemical toxins, and meteorology. Researchers acknowledge that such technology's potential impact on privacy and security will undoubtedly raise concerns, but note that they are working on encrypting mote-gathered data to prevent unauthorized access. However, Bajcsy says, "Even if you make [the sensors] cheap and energy-efficient, you still have to ask about the environmental impact and how many of these things you want to have around before the environment gets cluttered."
    http://www.upi.com/view.cfm?StoryID=20030714-112803-3242r

  • "Researchers Delve Into the Human Factor"
    CNet (07/16/03); Fried, Ina

    This year's New Paradigms in User Computing conference at IBM's Almaden Research Center emphasized ways researchers can better understand how humans interact with computers, rather than focusing exclusively on new user interface technologies. Computer scientists, academics, and other experts recently convened at the conference to share their findings on technology consumption, gathered through a variety of techniques. Leading search engine Google asks its employees for their opinions on new ideas and then tests those concepts either on customers or via its live Web site. Meanwhile, Intel has recruited anthropologists such as Genevieve Bell to explore how computer use varies throughout different cultures. Such field research allowed Bell to discover unusual technology usage trends, such as some users' tendency to have their cell phones blessed by monks in certain parts of the world; more useful to Intel were Bell's observations that Chinese computer owners often keep their computers to themselves while leaving the television in plain sight, which could indicate that China may not be the best market for a Media Center PC that marries a computer with interactive TV. Hewlett-Packard researcher Joshua Tyler noted at the conference that studying a person's email inbox reveals a lot: For instance, his team analyzed 900,000 messages from within HP Labs, and discovered that email discussions are closely related to corporate projects and organizational architecture. Companies are also using eye-tracking technologies and other innovations to monitor computer usage. Eye-tracking devices are less intrusive than before and can be integrated with software that tracks mouse clicks, thus furnishing a detailed picture of the material a computer user views and what that user does.
    http://news.com.com/2100-1008-1026321.html

  • "Ralph Etienne-Cummings: Envisioning the Future of Robotics"
    Black Engineer (07/15/03); Phillips, Bruce E.

    "Neuromorphic engineering" is Ralph Etienne-Cummings' domain at The Johns Hopkins University, where he is associate professor of electrical and computer engineering. Etienne-Cummings is studying the ways in which living organisms solve engineering problems, such as how a fly's array of eyes can track and react to danger so effectively. He is working on applications in the field of robotics vision, including a handheld device that identifies nearby objects for its user. Such a device could be used by blind people, who could point the digital "eye" at an object to find out what it is. Such technology is also of interest to the Defense Department, which is currently funding robotics vision research that will allow unmanned aerial vehicles to identify and track objects through a city or countryside. Civilian first responders could use similar robots in extremely hazardous rescue situations, such as when climbing through rubble or passing through intense flames. Etienne-Cummings is also studying lamprey spinal cords at the University of Maryland in conjunction with a colleague in the biology department. He says studying the primitive nervous system of fish will one day lead to more natural robotic movement. Discrete digital values mean robots cannot exactly replicate movement in animals, since neurosystems rely on a continuum of values. Figuring out how to build circuits that mimic biological function could lead to prosthetic limbs that move naturally, or leg braces that enhance running and jumping.
    http://www.blackengineer.com/artman/publish/article_114.shtml

  • "Interview: Torvalds Gets Down to the Kernel"
    InfoWorld (07/16/03); Scannell, Ed; Fonseca, Brian

    Linus Torvalds says Linux version 2.6 will probably be finished even faster than 2.4 was, although more parties are involved and there are challenges related to synchronization. Version 2.6 is especially important to corporate users because it deals with many of the scalability and reliability issues of concern to large organizations, and it handles multiprocessor systems better. Torvalds expects version 2.6 user testing to turn up bugs in device drivers, though that problem is not specific to version 2.6; in fact, he touts the new virtual machine (VM) system and file-system infrastructure as much more robust. He says normal users often ferret out obvious bugs that developers overlook. Regarding the SCO suit against IBM, Torvalds says he is less worried than before now that the claims have been clarified and focus on contract issues instead of intellectual property; in any case, Linux has a better system in place for resolving such issues than any proprietary system, he says. Microsoft's move to tie Windows Server 2003 and Office System 2003 together with SharePoint Portal Server middleware will not elicit any response from Linux, because Torvalds says Linux was conceived as a single unified system from the beginning: it operates essentially the same way whether it is embedded in a device or running on a supercomputer cluster. Torvalds points out that he has used Linux on his desktop for as long as that has been possible, and that Linux's user base is not fragmented the way Microsoft's is. Looking ahead, Torvalds says Linux needs more integration, and an architecture overhaul could perhaps be carried out in four to five years; wireless integration and 3-D acceleration are other possible features.

  • "Back Together Again"
    New York Times (07/17/03) P. E1; Heingartner, Douglas

    Recent corporate scandals have raised the profile of paper-shredding and the technologies that reconstitute destroyed documents. Such reassembling technology is similar to digital encryption and hacking in that each side is continually upping the stakes. More powerful computers and software are helping piece together large amounts of document confetti, even that which has been cross-shredded into strips less than half an inch long. The German government is looking for a faster way to reassemble 16,000 bags of shredded Stasi files left over from the repressive East German secret police. "These documents contain lots of information that might be dangerous to a few politicians who are still active, still in power," notes Werner Vogeli of SER Solutions, who doubts that there is any political incentive for financing the reconstruction of those documents. Researchers from the Fraunhofer Gesellschaft institute in Berlin, whose colleagues helped develop the MP3 format, are combining handwriting analysis, biometric identification, image processing, and office automation technologies to complete the task. They intend to equip the current team, which has only finished about 300 bags since the East German collapse in 1989, with computers, scanners, and their special software. Scanning software will suggest matches between torn pieces, and the experts can either accept or reject the pairing. Many of the Stasi documents were frantically hand-torn because the agency's paper shredders soon failed as officers rushed to destroy evidence. Security experts say there are ways to destroy paper documents more effectively, including feeding the paper into the shredder perpendicular to how the text flows and using large type so that less complete text appears on each shred.
    http://nytimes.com/2003/07/17/technology/circuits/17shre.html
    (Access to this site is free; however, first-time visitors must register.)

  • "Searching for the Kilowatt of Computing"
    Wall Street Journal (07/17/03) P. B1; Tam, Pui-Wing

    Hewlett-Packard, IBM, Sun Microsystems, and other tech giants are seeking a unit of measurement for computing consumption equivalent to the kilowatt used to gauge electricity consumption or the cubic foot used to measure natural-gas usage. This measurement unit, which a research team led by HP theoretical physicist Bernardo Huberman has termed a "computon," is vital to the utility computing push, which needs an easy, standardized method to measure and bill users for computing power if it is to succeed. However, the computon resists precise definition--it must take many variables into account, including processing power and data storage capacity, and be adaptable to changing customer requirements. Huberman has researched pricing policies with HP's internal economists and with the nonprofit Electric Power Research Institute, and his team devotes considerable study to HP's own power and storage consumption. Although Huberman expects to have fully defined the computon in the next year, he admits more years may pass before the unit is ready for wide-scale rollout. IBM and Sun are after the same goal, but use different terminology: IBM debuted a unit of measurement called the "service unit" last year for its Linux Virtual Services utility-computing offering; IBM's Dev Mukherjee says the service unit, which combines processing power and memory, is different for each user. Jay Littlepage of Sun says his company has rolled out the "Sun Power Unit," which provides gigahertz measurements for processing power or gigabyte measurements for storage capacity. He says, "Ultimately, the more simple we can make billing and measurement, the better it is for customers." A purely hypothetical sketch of such a composite unit follows this item.
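
    As a purely hypothetical sketch of what a composite unit like the computon or IBM's service unit might look like--the function name and weights below are invented for illustration and are not any vendor's actual formula--consider a weighted sum of processing, memory, and storage consumption:

      def service_units(cpu_ghz_hours, ram_gb_hours, storage_gb_hours,
                        weights=(1.0, 0.25, 0.01)):
          """Hypothetical composite billing unit: a weighted sum of
          processing, memory, and storage consumption. The weights are
          invented for this example, not IBM's, Sun's, or HP's formula."""
          w_cpu, w_ram, w_disk = weights
          return (w_cpu * cpu_ghz_hours
                  + w_ram * ram_gb_hours
                  + w_disk * storage_gb_hours)

      # A month (720 hours) of a 2 GHz server with 4 GB RAM and 200 GB of disk:
      hours = 24 * 30
      print(service_units(2 * hours, 4 * hours, 200 * hours))  # 3600.0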

  • "Software Is Patently Not to Be Patented"
    Financial Times-IT Review (07/16/03) P. 6; Cane, Alan

    Software can be copyrighted but not patented under European law, but critics contend that a proposed European Union directive allows underhanded software patenting that would place a chokehold on competition and innovation. Software and business-process patenting is common in Japan and the United States, which critics claim are suffering from a stifling of innovation as a consequence. The EU directive could give big software companies license to sue smaller businesses for patent infringement, and defendants could run up millions of dollars in litigation costs. This factor, critics charge, would discourage newcomers from entering the software market. A European Parliament vote on the directive has been pushed back from June 30 to September, and software lobbies such as the Foundation for a Free Information Infrastructure consider this development to be a triumph for their side and an opportunity to reconsider and redraft the proposal. Alan Cane writes that software protection regulations across EU member states should be harmonized while also allowing software developers to carry on without the threat of patent searches and litigation hanging over their heads. He argues that the EU directive's wording is opaque: The proposal lists "computer implemented inventions" as patentable, but this definition fails to establish whether it refers to software algorithms or inventions whose usability is dependent on software. Cane also notes that it is harder to draw parallels between software invention and physical invention, and argues that there are few truly novel software inventions because most software is based upon prior work carried out by other people.
    http://news.ft.com/news/industries/infotechnology
    (Access for paying subscribers only.)

  • "The Organizational Model for Open Source"
    HBS Working Knowledge (07/07/03); Stark, Mallory

    Harvard Business School professor Siobhan O'Mahony has drawn some interesting conclusions about the open-source organizational model by studying nonprofit foundations that have coalesced around a trio of open-source software projects--the non-commercial Linux distribution known as Debian, the GNU Object Model Environment (GNOME), and the Apache open-source Web server. O'Mahony takes note of a contradictory phenomenon: numerous open-source projects coordinated by the hacker community--a group that values independence and self-determination--have nonetheless incorporated, erecting nonprofit foundations organized into committees with assigned functions. The professor identifies three major challenges confronting these nonprofit foundations: The first is limited resources to cover legal fees, conferences, or travel, although this problem is mitigated somewhat by the foundations' primary reliance on electronic communications. The second challenge is balancing the informal work culture and conventions of hacker-style programming with the predictability and formality of software release management. "People are intimately aware of the fact that too much structure will disenfranchise the very people who make the most successful open source projects possible," O'Mahony explains. The third challenge lies in maintaining pluralism in the administration of open-source software projects, which can be complicated if open-source contributors distinguish each other according to individual merit without acknowledging their employers. Although O'Mahony is skeptical that the future of software development will be determined by nonprofit foundations, she thinks that the foundations will still have an important part to play.
    "DARPA Awards Pacts to Juice Computing"
    Federal Computer Week (07/14/03) Vol. 17, No. 23, P. 10; Hardy, Michael

    Cray, IBM, and Sun Microsystems have each been awarded tens of millions of dollars to develop next-generation computing architectures for the Defense Advanced Research Projects Agency (DARPA). With the distribution of over $146 million, the High Productivity Computing Systems project has entered its second phase, the finish of which will produce a preliminary design. Goals for the project go beyond peak output, says DARPA's Jan Walker, and focus on a consistent, reliable architecture that is from 10 to 40 times faster than today's technology. Cray project manager Keith Shields, whose company teamed with New Technology Endeavors for its Cascade project, says current systems are difficult to program and unreliable, with some components, such as processors, outpacing others, such as bandwidth. Sun Microsystems senior scientist John Gustafson says his company's Hero project will look for a fresh approach that leaves behind legacy features dating back to the 1950s. Illuminata senior analyst Gordon Haff says DARPA is not just looking for systems that string together a bunch of processors, but for systems that allow those processors to work together with greater cooperation. He says the projects will have to address designs holistically since computer operations are so interdependent. "This is really the lunatic fringe, specialist edge," he says.

  • "The Apple Is Ripening"
    SD Times (07/01/03) No. 81, P. 28; Correia, Edward J.

    Apple Computer has a wide range of application development options that dovetail with the company's merchandising strategy of targeting niche markets, such as design and CAD, for its products, according to Apple's Richard Kerris. He suggests that if an organization only wishes to build native applications for Mac OS X, then developers should select Cocoa, a native development environment for the Java, Objective-C, C, and C++ programming languages; Kerris explains that Cocoa "allows ideas to go from thought to actual application sometimes in hours." He adds that Objective-C offers greater ease of use and power than C++, and uses a syntax modeled after Smalltalk to provide an object-oriented language that is almost equivalent to C while being more readable and easier to manage. Objective-C lacks portability, but Kerris says that crafty programming facilitates the development of applications that can be migrated with only a small amount of rewriting. Motorola's Greg Hemstreet says the open-source Darwin Mac OS X kernel can run numerous concurrent tasks separate from the operating system and each other through its protected memory and preemptive multitasking, which "enables developers to build applications that will respond to user input regardless of what's happening behind the scenes." Carbon, a suite of C and C++ application programming interfaces (APIs) and migration tools for building native applications for Mac OS 8.1 and beyond, is helpful for companies still using older Mac OS versions. Carbon's Carbon Dater migration tool can analyze legacy code and help ascertain what changes must be enacted so that the code is interoperable with OS X. Finally, Kerris says that AppleScript allows practically any Mac application to be automated--for instance, the "Clicker" AppleScript lets a Bluetooth-enabled cell phone run slideshows operating on a Mac, or cue certain functions if it comes in close proximity to the computer.
    http://www.sdtimes.com/news/081/special1.htm

  • "Are You Ready for MAID Technology?"
    Computer Technology Review (06/03) Vol. 23, No. 6, P. 1; Moore, Fred

    The development of new disk-based backup and archival storage options is proceeding apace, with lower-priced Serial Advanced Technology Attachment (SATA) disks taking a vanguard position between online disk storage and Nearline or automated tape library storage. The need for larger capacity tape technologies is being driven by new federal data retention regulations such as the Sarbanes-Oxley Act and the Health Insurance Portability and Accountability Act, while demand for rich media (video, text, audio, imagery, 3D graphics, HDTV, and movies) applications will also spur demand for improved storage solutions. Backup, replication, or mirroring methods are expected to boost original storage demand by 100 percent or 200 percent over the next several years, while non-digital data applications will also require the expansion of storage capacity. Massive Arrays of Inactive Disks (MAID) technology is moving to occupy the space between disk and tape storage. Many of the disks in a MAID subsystem are kept idle until requested, and increasing the number of dormant disks can cut the number of I/O path links, cache requirements, and overall controller complexity, thus reducing disk controller costs (a toy model of this idle-until-requested policy appears after the link below). MAID seeks to allow current SATA technology to manage extended storage needs. MAID is a cost-effective choice for accommodating backup/recovery, lower activity reference data, and fixed-content data while staying within performance parameters. Far-line storage, in which data is retrieved manually, still accounts for the bulk of global analog data.
    http://www.wwpi.com/lead_stories/070803_3.asp
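
    As a toy model of the MAID policy described above--not any vendor's implementation; the class name, timeout value, and interfaces are invented for illustration--the Python sketch below keeps each drive spun down until a request touches it and powers it back down after an idle timeout:

      class MAIDDisk:
          """Toy model of one drive in a MAID array: idle until needed."""

          def __init__(self, disk_id, idle_timeout=300.0):
              self.disk_id = disk_id
              self.idle_timeout = idle_timeout  # seconds before spin-down
              self.spinning = False
              self.last_access = None

          def read(self, block, now):
              if not self.spinning:
                  self.spinning = True          # spin-up latency paid here
              self.last_access = now
              return (self.disk_id, block)

          def tick(self, now):
              """Controller calls this periodically to idle dormant drives."""
              if self.spinning and now - self.last_access > self.idle_timeout:
                  self.spinning = False

      # A four-drive array in which only the accessed drive ever spins up.
      array = [MAIDDisk(i) for i in range(4)]
      array[2].read(block=17, now=0.0)
      for disk in array:
          disk.tick(now=600.0)                  # drive 2 idles out again
      print([d.spinning for d in array])        # [False, False, False, False]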

  • "Casting the Wireless Sensor Net"
    Technology Review (08/03) Vol. 106, No. 6, P. 50; Huang, Gregory T.

    The development of wireless sensor nets--intelligent, self-organizing networks that act cooperatively to present users with usable information--is expected to progress to the point where the technology will be embedded practically everywhere by 2010, according to UC Berkeley researcher David Culler; however, the technology's proliferation will depend on resolving issues related to cost, battery life, node connectivity, and the development of a "killer app." Decentralized mesh network technology developed by the Jet Propulsion Laboratory's Kevin Delin consists of sensor web pods that communicate by radio with their nearest neighbors in order to save power while relaying and processing data (a back-of-the-envelope model of why short hops save radio power appears after this item). The pods are being used at Huntington Botanical Gardens in California to measure heat, humidity, and soil moisture to paint a picture of how much rain and sunlight the plants are receiving. Similar technologies offered by vendors such as Ember are attractive to heavy industry for their cost-effectiveness, but these solutions consume a lot of power and entail high node implementation costs; Intel research director David Tennenhouse projects that the commercial appeal of sensor nets will widen if node setup costs fall below $20 per node, and this could be achieved through standardization. One promising design paradigm is UC Berkeley's motes, which are smaller and less power-hungry than most commercial wireless sensors. Another disadvantage of current wireless sensor technology is that networks become more vulnerable to crashes as more nodes are deployed, so Deborah Estrin's lab at UCLA is testing a system in which nodes organize into clusters and adjust on an ad hoc basis to save battery power and more efficiently handle data flow. Meanwhile, Paul Davis' UCLA team is distributing seismic sensors across the UCLA campus to develop a simulation of the structural effects of seismic activity; the results of such experiments could help urban planners to better protect buildings from earthquakes. Several other projects demonstrate the ultimate goal of making sensor nets intelligent, autonomous, and self-aware.
    http://www.technologyreview.com/articles/huang0703.asp
    (Access to this site is free; however, first-time visitors must register.)
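
    The power advantage of nearest-neighbor relaying follows from a back-of-the-envelope radio model: transmit energy grows superlinearly with distance (commonly modeled as distance raised to an exponent between 2 and 4), so several short hops can cost far less than one long transmission. The exponent and unit constant in the Python sketch below are assumptions for illustration, not any particular pod's firmware:

      def transmit_energy(distance, alpha=2.0, k=1.0):
          """Simplified radio model: energy grows as distance ** alpha."""
          return k * distance ** alpha

      def relay_energy(total_distance, hops, alpha=2.0):
          """Energy to cover the same distance in equal-length hops."""
          hop = total_distance / hops
          return hops * transmit_energy(hop, alpha)

      # One 40-foot transmission vs. four 10-foot neighbor-to-neighbor hops:
      print(transmit_energy(40.0))         # 1600.0
      print(relay_energy(40.0, hops=4))    # 400.0 -> one quarter the energy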
