Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to
Volume 6, Issue 717:  Wednesday, November 10, 2004

  • "Supercomputer Mastery At Risk"
    Mercury News (11/09/04); Chui, Glennda; Poletti, Therese

    Though the United States secured the top two spots in the Top 500 supercomputer list on Nov. 8, a National Research Council report issued on the same day warns that America could lose its supercomputing superiority unless the government increases funding for next-generation system development by more than 300 percent. Stanford University computer scientist Bill Dally argues that supercomputing is as important a research area to the country as submarine or nuclear weapons development. "There are a lot of problems that we could be solving that we are not, because the [supercomputing] capability is not there," he attests. The last 10 years witnessed a decrease in federal supercomputer research funding as it became more popular and economically sensible to have researchers and private industry build systems out of cheap, commercially available components, especially as computer chip costs declined and power rose. Existing applications for supercomputers include oil exploration, weather forecasting, genomic research, severe storm and earthquake modeling, and automobile crash testing, while Lawrence Berkeley National Laboratory supercomputer center director Horst Simon explains that supercomputers function as test beds for future consumer and corporate computing technologies. When Japan stole the United States' Top 500 supercomputing crown two years ago, IBM, Cray, and Sun Microsystems were spurred to create more advanced supercomputers by 2010 using hefty research grants from the Defense Advanced Research Projects Agency. IBM's Blue Gene/L was ranked No. 1 on the Top 500 list on Monday with its speed of 70.72 teraflops, while the Columbia supercomputer at NASA/Ames Research Center, which is still being installed, can achieve 51.87 teraflops. Blue Gene will be used for classified weapons research as well as novel materials design and molecular interaction research, while Columbia's focus will include space shuttle flight simulation, cosmological experiments, genomics, and meteorological and seismic prediction.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Slowdown Forces Many to Wander for Work"
    Washington Post (11/09/04) P. A1; Schneider, Greg

    The IT job market has been bleeding jobs since hitting a peak in 2000, and now the IT unemployment rate exceeds the national average, according to industry research. Experts say IT still has good prospects for the future and will generate employment, and the average salary for IT workers is $53,728, more than the roughly $35,000 an average American worker earns each year; but experts say that premium is deflating with some of the broadest categories of IT workers losing ground in terms of pay parity--computer programmers only saw an average 1.3 percent wage increase for the last two years, for example. The problem is that the IT industry is undergoing the same gutting of middle-class jobs that other industries have suffered, since companies can no longer afford to keep large in-house IT departments and instead rely on offshore outsourcing and contract workers. This has forced many IT workers to adopt a migrant lifestyle as they move from job to job, something that Joan Ozello of Potomac, Md., has come to enjoy. Not only does she make more as a security and capital planning consultant, but she puts in extra time knowing that she gets paid per hour. The IT employment situation is not the same around the country, with areas such as Dallas and San Francisco losing population due to job scarcity while areas such as Washington, D.C., and New York are seeing an increase in job postings, albeit many are for contract positions. The Internet has made IT job-seeking much easier in some ways, while encouraging the sort of migratory or temporary employment that many people complain about. Corey Frankel decided to stay in Falls Church, Va., so that his family's lifestyle would not be interrupted, but he has had to sell cars and collect unemployment in between his IT job contracts.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Digital Memories, Piling Up, May Prove Fleeting"
    New York Times (11/09/04) P. A1; Hafner, Katie

    The accumulation of digital materials--photos, emails, documents, music, etc.--has spurred a movement to devise a methodology for sustaining such vast archives, including a multimillion-dollar Library of Congress project to create uniform digital preservation standards so that material can be accessed no matter what hardware or software is employed. Ken Thibodeau with the National Archives and Records Administration explains that digital preservation "is a global problem for the biggest governments and the biggest corporations all the way down to individuals." Adding urgency to the problem is the ephemeral nature of digital recording media such as CDs and hard drives, which face format obsolescence as well as relatively short usable lifespans. The problem is especially palpable for consumers, who can rarely access the expertise and resources that professional archivists and librarians routinely use to back up and retrieve files and documents. The "museum approach" of saving obsolete equipment may turn out to be the most practical solution to the digital preservation problem 10 or 20 years from now, according to Global Business Network Chairman Peter Schwartz. "As long as you keep your data files somewhat readable you'll be able to go to the equivalent of Kinko's where they'll have every ancient computer available," he predicts. Until then, one guarantee against the loss of important data is printing it out on paper. In the search for a uniform data format, one option is the Web, says Thibodeau. Meanwhile, experts at the Digital Archives are developing ways to make digital computer files uniform so that specific computers or programs are not needed to access stored data.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Milking Knowledge Anywhere at Any Time"
    IST Results (11/10/04)

    The IST program's Multimedia Interaction for Learning and Knowing (MILK) project is designed to break knowledge workers out of the office-based desktop PC environment with a suite of tools providing anytime/anywhere knowledge management (KM). Such a scheme takes into account the fact that workers spend a considerable amount of their working day away from their desks, traveling, or participating in various formal and/or informal meetings--and knowledge and data generated and shared in such instances frequently slips irretrievably through the cracks. MILK's KM system can be accessed in a variety of formats depending on the device the worker uses: Workers on the go can employ personal digital assistants or mobile phones to access databases and tools, while people in conferences or other social situations can use public wall screens and interactive displays. The MILK suite can trim redundancy from the performance of tasks, and boost innovation and productivity by giving users the means to conduct their work no matter where they are. The system is expected to be especially useful to large or mid-sized companies with 100 to 200 KM workers involved in creative pursuits, according to MILK project coordinator Maurizio Mesenzani of Butera e Partners. MILK project partners developed the KM platform using "seductive design," whereby end users, designers, and system developers each contribute their own unique views in order to "cover every angle." Mesenzani says the final product of this approach is "an attractive and easy to use system that is the result of a negotiating process between many actors with different interests."
    Click Here to View Full Article

  • "Technology Advances About to Re-Arm U.S. Army"
    Investor's Business Daily (11/10/04) P. A4; Deagon, Brian

    The $117 billion Future Combat Systems (FCS) program, whose size and scope far outstrips previous defense initiatives, aims to replace the U.S. Army's heavy fighting vehicles with lighter and in certain cases automated vehicles coordinated by troops, using a communications system of unprecedented sophistication. Eighteen new military systems programs buttressed by 32 critical technology elements, 154 complementary systems, and 34 million lines of software code comprise FCS; the technology will be distributed across all divisions of the Army. The 2004 budget for FCS research and development is about $2 billion, with $3.2 billion requested for next year, and roughly $22 billion budgeted between now and 2009. Though JSA Research defense analyst Paul Nisbett does not expect the program to go into full-scale production for over five years, he predicts that billions of dollars will be channeled into FCS annually once it is up and running. The ultimate goal of FCS is to enable 3,000 soldiers to be deployed anywhere within 96 hours and expanded to five full divisions of 10,000 to 15,000 soldiers within 30 days. Implementing this system will involve the deployment of ground sensors, robotic mini-vehicles for reconnaissance, strength-augmenting uniforms, visored helmets that provide enhanced vision, wearable keypads displaying tactical data, chemical agent-detecting body sensors, lightweight chemical and kinetic energy weapons, unmanned aerial drones, and 1-ton automated ground vehicles that use infrared and lasers to navigate--all of which will be networked by air- and ground-based sensors and computer platforms that are currently nonexistent.

  • "Agencies in Race to Smash Supercomputing Records"
    National Journal's Technology Daily (11/08/04); New, William

    The two-year reign of Japan's Earth Simulator as the world's most powerful supercomputer ended with the announcement of faster systems, first by NASA's Columbia supercomputer, and then by IBM's Blue Gene/L, which overtook both the Earth Simulator and Columbia with its 70.72-teraflop performance. Energy Secretary Spencer Abraham stated on Nov. 4 that the Department of Energy has prioritized investment in supercomputing, given high-performance computing's critical role in U.S. science and technology leadership. When Columbia was unveiled two weeks ago, Silicon Graphics CEO Bob Bishop projected that supercomputing will stimulate improvements to the U.S. economy and security, the creation of new domestic jobs, and enthusiasm for future generations of engineers and researchers. Rep. Sherwood Boehlert (R-N.Y.) expects the imminent passage of a high-end computing bill that would require the Energy Department to set up and maintain a major high-end computing center for advanced scientific and engineering R&D, as well as commit tens of millions of dollars to high-end computing hardware and software development for the next three fiscal years. He said that in early 2005 the House Science Committee he chairs will resume work on supercomputing legislation that would institute a national high-end computing initiative outlining the duties of agencies such as NASA, the National Science Foundation, and the National Institute of Standards and Technology. Office of Science and Technology Policy director John Marburger determined in June that the President's Information Technology Advisory Committee (PITAC) had a responsibility to evaluate federal computational science research. A PITAC progress report delivered by Daniel Reed with the PITAC subcommittee on computational science on Nov. 4 indicated that weather and climate, which affect 40 percent of the U.S. economy, constitute a major supercomputing application; Reed also cited a "disconnect" between commercial practice and government and academia's computing infrastructure requirements.
    Click Here to View Full Article

  • "Purdue, Japanese Researchers to Create More Human-Like Robots"
    Purdue University News (11/08/04)

    Purdue University electrical and computer engineering professor C.S. George Lee says the purpose of a four-year National Science Foundation-funded collaboration between Purdue and Japan's Advanced Institute of Science and Technology is "to give humanoid robots the ability to behave and move more like human beings, to have the skill-learning capabilities of humans." Lee says the engineering accomplishments of humanoid robot technology are undercut by the machines' stiff, mechanical motion, and the collaborative project aims to teach humanoid robots how to rapidly learn new movements so they can offer more assistance to people. The initiative will involve three-dimensional recording of human movements by Purdue researchers, using minuscule wire "receivers" positioned strategically on the subject's body as he or she moves in a low-level magnetic field. The movements will cause the receivers to generate a current that will be tracked by lab computers, allowing Lee and fellow Purdue professor Howard Zelaznik to perceive fundamental movement patterns from which mathematical models could be extracted. These models could then be applied to software that allows robots to carry out sophisticated movements by integrating more "primitive" skills. "We'd like to see whether we can figure out if there is a computationally reasonable way for a robot to take a set of skills and combine them into new skills rather efficiently, flexibly and quickly," says Zelaznik. The $900,000 NSF grant for the project was provided under the foundation's Information Technology Research program.
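    One simple way to picture "combining primitive skills into new skills" is to treat each primitive as a sampled trajectory and form a new movement as a weighted blend. The sketch below is purely illustrative (the trajectories, weights, and the blending rule are hypothetical, not the Purdue/AIST method):

```python
# Illustrative sketch: movement "primitives" as sampled joint-angle
# trajectories, with a new movement formed as a weighted blend of them.

def blend(primitives, weights):
    """Combine primitive trajectories (equal-length lists) by weighted sum."""
    assert len(primitives) == len(weights)
    length = len(primitives[0])
    return [
        sum(w * traj[t] for traj, w in zip(primitives, weights))
        for t in range(length)
    ]

reach = [0.0, 0.5, 1.0, 1.0]   # hypothetical joint angles over time
wave = [0.0, 1.0, 0.0, 1.0]
new_skill = blend([reach, wave], [0.7, 0.3])
print([round(x, 2) for x in new_skill])   # [0.0, 0.65, 0.7, 1.0]
```

    Real primitive composition would operate on full-body joint data and learned models rather than toy lists, but the blending idea is the same.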
    Click Here to View Full Article

  • "European Distributed Supercomputing Infrastructure Is Being Born"
    Innovations Report (11/08/04); Jarvinen, Jari

    Deploying a terascale pan-European distributed supercomputing environment for scientific research and discovery is the chief goal of Distributed Infrastructure for Supercomputing Applications (DEISA), an initiative developed by European high-performance computing (HPC) facilities whose disciplines include material sciences, fusion research, cosmology, environmental sciences, life sciences, and computational fluid dynamics. "The DEISA concept is based on the educated guess that network bandwidth will become, by the end of this decade, a commodity very much like raw computing power became a commodity in the early 90s," explains DEISA project director Victor Alessandrini of IDRIS-CNRS. Strategic requirements of the DEISA supercomputing infrastructure include transparent, non-disruptive operation; the concealment of complex grid technologies from end users; and provision of persistent and portable scientific applications. The DEISA environment comprises an inner layer of computing platforms with consistent architecture and operating systems lashed together into a "distributed virtual supercomputer" spread out among different countries but perceived by end users as a unified system. This virtual cluster, formed from four IBM supercomputers in Germany, Italy, and France, will feature over 4,000 processors and more than 22 teraflops of cumulative computing power, and be able to efficiently share data across a wide area network using IBM's Global Parallel File System. DEISA's second layer will consist of a heterogeneous supercomputing grid born from the integration of the virtual cluster with selected leading computing platforms; among the services the grid will deliver to the scientific community will be UNICORE middleware-based workflow management, high performance global data management in the entire grid, multiplatform grid applications, and portals and Web interfaces that mask complex environments from end users.
    Click Here to View Full Article

  • "OSRM Tracing Linux Patents in EU" (11/08/04); Kerner, Sean Michael

    The Open Source Risk Management (OSRM) group is researching the sources and innovations behind 50 European Union software patents that pose a risk to Linux. Even though the EU formally disallows software patents, they are being granted at a rate of about 50 per year and the EU has recently made moves to sync its patent law with that of the United States. OSRM's "Patents and Prior Innovations Project" will show how software patents are a bad idea and hinder innovation rather than promote it, and volunteers will be able to log their contributions on the Grokline Web site set up by OSRM litigation risk research director Pamela Jones. OSRM Chairman Daniel Egger says the new study will show how irrelevant many existing software patents are in terms of identifying the true sources of innovation, and that study has implications for both policy makers and those considering moves to Linux. OSRM created a scare last summer when it issued its first major Linux patent study that found the open-source operating system possibly infringed on 283 unchallenged patents; Egger says his group's intention was not to frighten people, but rather to expose the threat software patents pose to software innovation. Besides influencing EU officials, OSRM hopes to prompt reform of U.S. software patent law. Egger is wary of statements by IBM and Novell that they will use their patent portfolios to protect Linux, because he says large companies should not be able to wield patents as a tool that affects the availability of certain software. Although large patent portfolios allow continued software development among those who retain them as a sort of insurance policy, they are a dangerous and impenetrable thicket to small software developers.
    Click Here to View Full Article

  • "Ideas Stolen Right From Nature"
    Wired News (11/09/04); Hooper, Rowan

    The natural world often provides ideal solutions to engineering problems, which is why biomimetic research is so popular; but it is only recently that technology has advanced to the point that it can help biomimetics realize its tremendous potential. Nature's engineering superiority is the theme of the World Expo 2005 in Aichi, Japan, which is showcasing such biomimetics projects as British researcher Julian Vincent's smart fabric, which opens and closes in response to rising and falling temperatures in the same way that a pinecone opens and closes its scales. "Nature has been conducting evolutionary experiments for millions of years, so if we're lucky enough to find something close to what we require in nature, then it's very likely to have been highly optimized, and we're unlikely to do much better," notes University of Southampton electronics and computer science professor Greg Parker, who has applied for patents based on his work. Parker and a research student studied the nanostructures and physical mechanisms responsible for producing the vivid blue color of a butterfly's wing, and replicated them in silicon; the generation of photonic crystals with diverse optoelectronic and telecommunication applications could stem from the discovery. Other biomimetic projects that have emerged recently include a design for shape-shifting airplane wings using scales that slide over each other. Meanwhile, Oxford University biologist Andrew Parker's study of a beetle that survives in arid environments using natural substances to condense water droplets on its back could lead to a commercial material that provides the same function on a much larger scale. "Biomimetics is still such a new area of technology transfer that enormous opportunities exist for people to make discoveries and transform these into successful patents," remarks Anja-Karina Pahl of the University of Bath in England.
    Click Here to View Full Article

  • "Tech 2005: What's New and What's Next" (11/03/04); Desmond, Michael

    Advanced technologies on track to emerge over the next several years include PCI Express (PCIe), an interconnect whose x16 graphics slot promises roughly double the throughput of AGP 8X; one PCIe permutation will enable fast peripheral links to a PC over a cable, a breakthrough that could pave the way for modular PC upgrading. Another development on the horizon is dual-core 64-bit processors from AMD and Intel that could be particularly beneficial for processor-intensive tasks such as games and photo and video editing. Meanwhile, work is afoot to transform digital photography by delivering such capabilities as advanced image metadata tagging, the construction of 3D models from 2D pictures, automation of tasks such as red-eye correction, and the organization of digital photo archives by desktop software. A number of companies are looking to improve the quality of flat-screen displays: Toshiba and Canon are scaling back the weight and thickness of flat panels via Surface Conduction Electron-Emitter Display technology, while Sony and Genoa Color are trying to increase flat panels' color range with their respective Qualia 005 LCD and Multi-Primary Color efforts. The battle between advanced optical-disc format standards will heat up in the years ahead as the blue-laser-based HD-DVD and Blu-ray specs compete with each other as well as with the red-laser-based Digital Multilayer Disc standard. Foreseen cell phone breakthroughs include support for megapixel photos, video streaming, and 3D gaming; multiprotocol switching and miniature 1-inch drives; and in-flight connectivity with base stations. Malware is expected to become more insidious, with Johannes Ullrich of the Internet Storm Center predicting that "Everything that uses an IP address will be a target." The drawbacks of home automation products based on the X10 standard could be mitigated with the advent of protocols such as Insteon, ZigBee, and Z-Wave, while other anticipated technologies include smarter appliances, ubiquitous computation, and fast-charging, smaller, and lighter batteries.
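    The arithmetic behind the first-generation PCIe graphics numbers is easy to check (figures assumed from the PCIe 1.0 era: 2.5 Gbps per lane with 8b/10b line coding, versus AGP 8X's roughly 2.1 GBps):

```python
# Back-of-envelope check of first-generation PCIe x16 bandwidth vs. AGP 8X.

lane_gbps = 2.5                 # raw signaling rate per lane, gigabits/sec
encoding_efficiency = 8 / 10    # 8b/10b line coding carries 8 data bits per 10
lane_mbps = lane_gbps * 1000 * encoding_efficiency / 8   # megabytes/sec, one direction

x16_gbps = lane_mbps * 16 / 1000     # a 16-lane graphics slot
agp8x_gbps = 2.1                     # approximate AGP 8X peak

print(lane_mbps)                     # 250.0 MBps per lane per direction
print(x16_gbps)                      # 4.0 GBps
print(round(x16_gbps / agp8x_gbps, 2))   # 1.9 -- roughly double AGP 8X
```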
    Click Here to View Full Article

  • "UC San Diego Scientist Wins Award From IEEE Information Theory Society for Breakthrough in Coding Theory and Practice"
    UCSD News (11/09/04); Ramsey, Doug

    The IEEE Information Theory Society has selected a paper that describes improved error-correcting codes as the best academic article written in the information theory field in the last two years. University of Illinois at Urbana-Champaign professor Ralf Koetter and University of California, San Diego professor Alexander Vardy used a new decoding technique developed at MIT to design a better soft-decision decoding algorithm for Reed-Solomon codes. Reed-Solomon codes are used in approximately 75 percent of error-correction circuits today, including many storage and communication devices. Vardy says their decoding algorithm bridges the gap between the probabilistic information available at the receiver and the algebraic structure of the codes. The new decoding algorithm makes much more effective use of probabilistic information gleaned at the receiver, allowing substantial coding gains that could render existing decoding algorithms obsolete. The new algorithm's capabilities have been most strikingly demonstrated by the ham radio operator community, which has used it to boost receiver sensitivity enough to make regular Earth-moon-Earth contacts. Koetter and Vardy's algorithm allows low-power amplifiers and receivers to decode "moonbounce" messages that operators bounce off the moon, says Joe Taylor, a ham radio operator, Nobel laureate, and Princeton University professor; "The KV algorithm is fully 2 dB better than what I have been using, and the advantage holds up over a wide range of signal-to-noise ratios," he notes. Vardy says the academic and industrial research communities have seized upon the finding and are developing applications and newly opened avenues of research.
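    The soft-versus-hard distinction can be illustrated with a deliberately tiny example that has nothing to do with the Koetter-Vardy algorithm itself: a 3x repetition code, where a hard-decision decoder thresholds each received sample before voting, while a soft-decision decoder keeps the receiver's analog reliability values.

```python
# Toy illustration of why soft-decision decoding can beat hard-decision
# decoding. A 3x repetition code sends bit 0 as [+1, +1, +1]; the channel
# adds noise to each sample.

received = [-0.2, -0.1, +1.4]   # noisy samples of a transmitted 0-bit

# Hard decision: threshold each sample first, then take a majority vote.
# The two marginally negative samples outvote the one confident positive.
hard_bits = [0 if r > 0 else 1 for r in received]
hard_decode = 0 if hard_bits.count(0) >= 2 else 1

# Soft decision: sum the raw samples and threshold once, so the strong
# +1.4 sample outweighs the two weak wrong ones.
soft_decode = 0 if sum(received) > 0 else 1

print(hard_decode, soft_decode)   # 1 0 -- hard decoding errs, soft recovers the bit
```

    The KV algorithm exploits the same kind of receiver reliability information, but for algebraic Reed-Solomon codes rather than a toy repetition code.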
    Click Here to View Full Article

  • "Putting a Face to 'Big Brother'"
    BBC News (11/08/04); Belo, Roberto

    Experts such as Richard Bowden of the University of Surrey's Center for Vision, Speech and Signal Processing believe a more natural mode of human/technology interaction can be facilitated with avatars such as Bowden's Jeremiah, a downloadable virtual face that reacts expressively to visual stimuli. Jeremiah, which has been featured at the Future Face exhibit of London's Science Museum, cracks a smile when people greet him, gets angry if he is ignored, becomes sad when he is alone, and can also register surprise, according to Bowden. This is not an indication of intelligence, however: The avatar is responding to input from a surveillance tracker system in a preset way. Bowden's team is working on a more advanced and interactive avatar that will resemble a fish. Avatars could conceivably supplant the keyboard and mouse interface once they are integrated with speech and voice recognition systems. Bowden also envisions systems that can observe a person's behavior over time to anticipate his actions and execute assistive functions, such as switching on the kettle if he plans to make a cup of tea. Some of the technology's implications sound Orwellian, but Bowden says a human avatar might assuage such fears. He recalls that people in his center were uncomfortable when surveillance cameras were installed, but nobody objected to Jeremiah's presence, "because although it's still watching them, they could see what it was watching."
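    Because Jeremiah's reactions are preset responses to tracker input rather than intelligence, the behavior amounts to a fixed stimulus-to-expression mapping. A minimal sketch (the event and expression names are hypothetical, not Bowden's actual system):

```python
# Hypothetical sketch of a preset stimulus-to-expression mapping: the avatar
# reacts to surveillance-tracker events with fixed rules, not intelligence.

RESPONSES = {
    "person_greets": "smile",
    "person_ignores": "angry",
    "no_one_present": "sad",
    "sudden_motion": "surprised",
}

def react(tracker_event: str) -> str:
    """Map a tracker event to a facial expression; default to neutral."""
    return RESPONSES.get(tracker_event, "neutral")

print(react("person_greets"))    # smile
print(react("unmapped_event"))   # neutral
```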
    Click Here to View Full Article

  • "U.S. and Europe Unprepared for Cyber Attack"
    Reuters (11/04/04); Warner, Bernhard

    U.S. Rep. Tom Davis (R-Va.), a co-chair of the federal Information Technology Working Group, says that future widescale terrorist attacks could be executed by one person with a computer rather than a bomber or hijacker. "If you can control the [power] grids, if you can do it from a computer somewhere, you can do a lot of damage," he says. Davis wants the United States to spend more of its annual IT budget on network security, and wants some of that money to go to European and overseas technology companies. He believes that European technology firms may be well situated for security contracts because they have been dealing with domestic terrorism for a longer time; Davis also thinks more international policing is needed for the Internet. Davis says the Bush administration must improve its joint efforts with Europe to improve cybersecurity, noting a rise in denial-of-service attacks with ties to organized crime. Davis says, "The U.S. is not where we need to be on defending against" cyber-terrorism, and "Europe is not where they need to be on this."
    Click Here to View Full Article

  • "Researchers Look to Save Disappearing Languages Online"
    Investor's Business Daily (11/09/04) P. A6; Riley, Sheila

    Leanne Hinton, head of UC Berkeley's linguistics department, reports that at least 50 percent of the world's 6,700 languages are in danger of fading into obscurity, and such a loss goes beyond mere vocabulary to include the erosion of grammatical, thought, and knowledge systems. Archives of endangered languages have been accumulating at UC Berkeley for about 100 years, and these archives will be digitized and made accessible to anyone through private and federal grants. The university's digitization effort will be aided by the Rosetta Project, a massive online language repository funded by the National Science Foundation. Rosetta Project curator Laura Buszard-Welcher reports that digital archiving is designed to give people ready access to material. The Rosetta repository had written and audio samples, along with grammar and sound descriptions, of 1,791 languages online at last count. Another notable archival project is George Mason University's Sept. 11 Digital Archive, a collection of 150,000 online "objects"--anecdotes, emails, audio, images, etc.--related to Sept. 11, 2001 that were contributed by ordinary people as well as historians. Roy Rosenzweig, director of GMU's Center for History and New Media, explains that digital archives are taking a cue from the technology sector's open-source movement by allowing anyone to add to the repositories. "A model of giving away things, of participation, is a very appealing one," he notes.

  • "Giving Bugs the Boot"
    Computerworld (11/08/04) Vol. 32, No. 45, P. 40; Weiss, Todd R.

    Stanford University and UC Berkeley researchers working on an alternative to system rebooting as a remedy for glitchy software have come up with "recovery-oriented computing," in which software is designed to reboot faster so that users can resume work almost immediately. UC Berkeley computer scientist David Patterson reasons that the tenaciousness of software bugs rules out the possibility of eliminating them altogether, which makes co-existence the only real option. In the concept of micro-rebooting, a sufficient portion of a program's processes are rebooted to stabilize the system without affecting data in the processing pipeline. Patterson and Stanford computer science professor Armando Fox have been exploring algorithms that monitor system processes and use normal operational baselines to detect aberrations and perform micro-reboots without the user even being aware of the problem. The researchers have been employing an open-source Java 2 Enterprise Edition (J2EE) implementation in their experiments because the software is widely used and can be easily modified, and its modular design simplifies stopping and rebooting one process within a part of the application. Fox notes that another microrecovery technique under investigation involves statistical monitoring algorithms, although the method carries security issues that need to be addressed. Key to solving rebooting problems is the determination of J2EE's positive and negative elements, as well as research that tackles the same issues in other popular computing systems. Fox reports that micro-rebooting is "not guaranteed to fix the problem, but it's guaranteed not to make things worse, so there's no reason not to try it." Patterson describes the ideal micro-rebooting system as one that would scan a PC or server and frequently reboot it under the radar so that the user does not suffer any visible failures.
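    The micro-rebooting idea can be sketched in a few lines: restart only the misbehaving component while data already in the processing pipeline is left untouched. The component names and the health-flag mechanism below are hypothetical, not the Stanford/Berkeley code:

```python
# Minimal sketch of micro-rebooting: a monitor detects an aberrant component
# and reinitializes just that component; in-flight pipeline data survives.

class Component:
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def restart(self):
        """Micro-reboot: reinitialize this component only."""
        self.healthy = True

class System:
    def __init__(self, components):
        self.components = {c.name: c for c in components}
        self.pipeline = []          # in-flight data is not discarded

    def monitor(self):
        """Detect unhealthy components and micro-reboot them individually."""
        rebooted = []
        for c in self.components.values():
            if not c.healthy:
                c.restart()
                rebooted.append(c.name)
        return rebooted

app = System([Component("session-store"), Component("request-handler")])
app.pipeline.append("pending request")
app.components["session-store"].healthy = False

print(app.monitor())    # ['session-store'] -- only the failed part restarts
print(app.pipeline)     # ['pending request'] -- pipeline data unaffected
```

    A full system reboot, by contrast, would discard the pipeline and restart every component, which is exactly the downtime micro-rebooting avoids.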
    Click Here to View Full Article

  • "Bluetooth Road Map Ups Data Rates, Lowers Power"
    EE Times (11/08/04) No. 1346, P. 4; Mannion, Patrick

    The Bluetooth Special Interest Group (SIG) will reveal a technology road map this week that outlines Bluetooth improvements in security, power consumption, range, and qualification processes through 2006. Bluetooth has come under pressure as it has taken longer than anticipated to get off the ground, while other wireless specifications such as Wi-Fi and ultrawideband have gained a foothold in the high-end applications, and ZigBee, RFID, and near-field communications have gained supporters on the low end. However, Bluetooth SIG technical director Mike Foley says the technology still has a solid foundation as a personal connectivity scheme, and is well established in areas such as wireless headsets and hands-free applications in automobiles; initial troubles were caused by a diffuse effort to make Bluetooth a solution for too many things at once, he says. Enhancements to the technology through 2006 will slowly expand Bluetooth's promise in the peer-to-peer, gaming, file-sharing, and even sensing fields. Bluetooth's Enhanced Data Rate specification has already been finalized and will increase Bluetooth throughput from 1 Mbps to 3 Mbps; concerns about security and privacy will be addressed as trouble areas are identified, and Foley says the PIN authentication scheme currently used is susceptible to hacking if devices are engaged in a pairing process. Plans are to expand the current four-digit PIN to a longer alphanumeric sequence that would be much more difficult to foil. Bluetooth developers will also examine how to make the data transaction process more efficient to save power and should produce results next year; lower power consumption will enable Bluetooth to take on more tasks, especially in sensor systems. Range is another aspect that will be enhanced through better error-correction and receiver sensitivity. Finally, Bluetooth profiles will be vetted faster through a streamlined, Web-enabled approval process.
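    The keyspace arithmetic behind the PIN change is straightforward. Assuming, for illustration, an eight-character case-sensitive alphanumeric PIN (the article does not specify the new length):

```python
# Brute-force keyspace of a 4-digit numeric PIN vs. a hypothetical
# 8-character case-sensitive alphanumeric PIN.

digits = 10
alnum = 26 + 26 + 10            # upper- and lowercase letters plus digits

pin4 = digits ** 4
alnum8 = alnum ** 8

print(pin4)                     # 10000 -- trivially exhausted by an attacker
print(alnum8)                   # 218340105584896
print(alnum8 // pin4)           # ~22 billion times more combinations
```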

  • "The Magic of RFID"
    Queue (10/04) Vol. 2, No. 7, P. 40; Want, Roy

    Radio frequency identification (RFID) tagging technology, which enables the identification of an object, person, or place without a direct line of sight via an electromagnetic challenge/response interchange, has a "magical" quality because its operations are largely automatic and invisible. An RFID scheme utilizes readers that emit interrogation signals; these signals power tags affixed to objects and cause the tags to beam back a unique digital ID. The technology currently faces a number of unresolved technical issues before it can be widely adopted: Communications become ineffective when tags are oriented perpendicular to the reader antenna, and building less orientation-sensitive antennas is probably the most cost-effective solution. Furthermore, most RFID readers cannot operate in close proximity to another reader scanning for tags, which calls for a standardized protocol to enable bandwidth sharing among these various systems. A standardized data format could permit data to be exchanged among independent organizations as it travels through a supply chain, while extending the operating range of RFID systems should become possible as the silicon components in tags become more power-efficient, and more sensitive and affordable receivers are designed. Certain packaging materials such as ferrous metals can disrupt RFID, and the existence of several RFID frequencies and standards must also be addressed, perhaps by building programmable readers that support multiple standards. The final technical hurdle to RFID adoption is to reduce manufacturing costs so that item-level tagging becomes practical, possibly through fluidic self-assembly or polymer RFID tags.
Key developments in RFID's evolution include the addition of sensors, which allow tags to include experiential data as part of the tagged item's ID; security mechanisms that notify customers if an item has been tampered with; and user-programmable memory, which allows the storage of data associated with tagged products without the need for a common database shared by supply-chain participants.
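The interrogate-and-respond exchange described above can be sketched as a toy model; everything here (the class names, the sample IDs, the single-reader assumption) is illustrative and corresponds to no actual RFID air-interface standard:

```python
# Toy model of a passive-RFID inventory pass -- not a real protocol.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Tag:
    """A passive tag: inert until a reader's RF field powers it."""
    uid: str                 # unique digital ID stored on the tag
    powered: bool = False

    def interrogate(self) -> Optional[str]:
        # A passive tag can only answer once it has been energized.
        return self.uid if self.powered else None

class Reader:
    """A reader that energizes nearby tags and collects their IDs."""
    def __init__(self, tags_in_range: List[Tag]):
        self.tags = tags_in_range

    def scan(self) -> List[str]:
        for tag in self.tags:      # the interrogation signal powers
            tag.powered = True     # every tag within range
        return [t.uid for t in self.tags if t.interrogate() is not None]

pallet = [Tag("E200-3412-0001"), Tag("E200-3412-0002")]
print(Reader(pallet).scan())  # → ['E200-3412-0001', 'E200-3412-0002']
```

The real-world complications the article lists, such as orientation sensitivity, reader-to-reader interference, and incompatible frequencies, are exactly the parts this sketch leaves out.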

  • "Protecting Public Anonymity"
    Issues in Science and Technology (10/04) Vol. 21, No. 1, P. 83; Morgan, M. Granger; Newton, Elaine

    Technologies are emerging that could lead to a society where everyone can be tracked and identified, and thus abusively controlled both socially and politically. Preventing this scenario involves efforts among law and IT professionals, civil libertarians, and the general public to bolster privacy rights and public anonymity. Carnegie Mellon University's M. Granger Morgan and Elaine Newton frown upon the legal authorization of detailed shopping center surveillance data, because its benefits to marketers and law enforcement would be outweighed by the potential for exploitation by criminals, politicians, and others. The authors posit that potentially negative social consequences are more likely to be reduced or avoided if system designers carefully consider the effects of alternative designs before they make their choices, and they offer a preliminary list of design principles that includes: explicit identification of a system's intended functions; collection of only those measures necessary to carry out those functions; use of measures that integrate information over space and time and are appropriate for the task's function and security level; provision of opt-in or opt-out choices to affected parties; and minimization of data sharing. Suggested measures to promote the growth of effective system design standards while avoiding restrictive government regulation include public anonymity-protective performance standards instituted as best practices, demonstrated compliance with such standards via certification, establishment of such certification as a prerequisite for system acquisition by public and private parties, and creation of a legal liability framework for companies whose products violate privacy and data-sharing regulations.
Morgan and Newton think a new high-level commission akin to the one set up by the Department of Health, Education, and Welfare to shape the Privacy Act of 1974 should be established to build a legal framework that defends privacy and public anonymity while balancing them against other legitimate social goals. Concurrent with this would be the development and distribution of a set of best professional practices for privacy- and anonymity-preserving system design by IT professional communities such as the ACM.