ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 663:  Wednesday, June 30, 2004

  • "Report Calls for Fixes in High-Tech Voting"
    New York Times (06/30/04) P. A16; Schwartz, John

    Touch-screen voting systems need a quick solution if they are to be used in the upcoming presidential election, and the New York University School of Law's Brennan Center and the Leadership Conference on Civil Rights have issued a report outlining how such a fix can be implemented. The report recommends that an independent security audit of the machines, their software, and procedures for monitoring election bugs be carried out. "If implemented by those jurisdictions within the obvious constraints of time and resources, these recommendations can markedly improve confidence" that the machines will perform as designed, the report concludes. U.S. Election Assistance Commission Chairman DeForest Soaries praised the report, and said yesterday that he would investigate how collaborative efforts between the commission and local election officials could be augmented with the report's suggestions. Michael Wertheimer, a researcher who established the hacking vulnerability of Diebold e-voting units, calls the study a reconciliation between computer security specialists and election officials. However, the report makes no recommendation that states require the inclusion of voter-verifiable paper trails in the machines, although many experts argue that such a measure is critical to making the systems trustworthy. Technology consultant Rebecca Mercuri takes issue with the report: She says that "adding more technologists, however well-intentioned those technologists may be, will not solve the problem of vanishing votes and vulnerable systems."
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

    For information on ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Court Rejects Law Blocking Internet Porn"
    Los Angeles Times (06/30/04) P. A1; Savage, David G.

    The U.S. Supreme Court blocked enforcement of the Child Online Protection Act of 1998, which would have imposed a $50,000 fine and a six-year prison term on commercial purveyors of Internet pornography who allowed children to access illicit material. The majority opinion said the law was an unnecessary infringement on free speech, and that parents could more effectively limit their children's access to online pornography by installing software filters at home. Enabling the Justice Department to criminally prosecute offending online pornographers would be a less effective measure, especially since an estimated 40 percent of Internet pornography originates overseas. Seven years ago, the Supreme Court rejected the Communications Decency Act, which would have made it illegal to send "indecent" messages over the Internet--a mandate that could conceivably have covered even dirty jokes sent by email. In response, Congress passed the Child Online Protection Act of 1998 with a narrower focus, defining illicit material as "patently offensive" by contemporary community standards, and targeting only commercial entities. But the ACLU filed suit against the Justice Department, leading to the current case, which has now been remanded to a lower court. Importantly, the Supreme Court did not rule the law unconstitutional, but asked the lower court to decide whether the government has any less restrictive means of protecting children from online pornography; the dissenting opinion said Congress had sufficiently narrowed its efforts to crack down on Internet pornography by requiring adult customers to prove their age to commercial providers. To date, neither Congress nor the federal government has been able to enforce legislation regulating Internet pornography.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "When Standards Don't Apply"
    CNet (06/29/04); Becker, David

    Some common computing formats are not formally ratified as standards, yet have achieved de facto standard status because of their popularity, and the software vendors that control these popular formats say standardization would inhibit their ability to keep pace with changing technology. Some de facto standards have proven very successful despite their lack of formal approval, such as the Perl Web programming language, says Sun Microsystems software expert and XML co-inventor Tim Bray. In the case of Microsoft, the company's lock on the .doc and .xls extensions for Word and Excel documents has helped keep the Office software suite dominant, while recent government pressure has led Microsoft to publish XML schemas that enable users to save their Office documents in that open standard. Although the European Union praised Microsoft for the compromise, it suggested the XML schemas also be submitted to a standards body for safekeeping. Adobe is somewhat more open with PDF, freely publishing the specification and thus allowing for hundreds of derivative tools, but PDF still remains firmly under Adobe's control, which the company says is necessary in order to add new capabilities quickly, such as the recent incorporation of barcodes. Macromedia made its Flash animation format popular by freely publishing the technology in the late 1990s, but now makes so little money from the technology that some fear it could begin to lag in its upkeep. Scalable vector graphics (SVG) is a competitor to Flash under the control of the World Wide Web Consortium, and open-source expert Bruce Perens says that all browsers will eventually include SVG plug-ins and make Flash irrelevant. Dave Winer has kept his Really Simple Syndication (RSS) publishing protocol away from standards bodies because he fears they would complicate the technology, which he says must remain sparse; people who disagree with Winer are flocking to Atom, which could replace RSS but also works alongside Winer's format.
    Click Here to View Full Article

  • "In Wild West of Data Mining, a New Sheriff?"
    Associated Press (06/28/04); Bergstein, Brian

    Privacy activists compare the current practice of government data-mining to the lawless frontier of the Wild West, and a panel led by onetime FCC chief Newton Minow recently issued a report concluding that data mining, though an important tool in the war against terrorism, can subvert the Bill of Rights in certain circumstances. The panel, known as the Technology and Privacy Advisory Committee (TAPAC), recommended that data be "anonymized" so investigators could search for suspicious activities without any immediate knowledge of whom they were investigating. Similar suggestions were proposed in reports from the Heritage Foundation, the Markle Foundation, and the Center for Democracy and Technology. TAPAC said the government needs to put more emphasis on the privacy and accuracy of data, and overhaul American data laws described in the report as disconnected and obsolete. Investigators would not be allowed to query personally identifiable data unless they obtained authorization from the Foreign Intelligence Surveillance Court, and the committee declared that only general data mining on American citizens should be subject to these restrictions, leaving them inapplicable to analysis of federal workers and airplane passenger registries, as well as foreign suspects and overseas intelligence. The TAPAC recommendations were hailed as a good beginning by Barry Steinhardt, director of the ACLU's technology and liberty program, while Sen. Ron Wyden (D-Ore.) was confident that the report, coming from an independent and bipartisan committee, would support his view that federal data mining be strictly regulated. Not all TAPAC members agreed with the report's conclusions, and observers believe new data mining rules are unlikely to be instituted by Congress before 2005.
    Click Here to View Full Article
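
    The "anonymized" querying TAPAC recommends can be illustrated with a minimal sketch (the field names and key handling below are hypothetical, not drawn from the report): identifying fields are replaced with keyed pseudonyms, so investigators can still correlate records belonging to the same person without learning who that person is, while re-identification requires the key holder, standing in here for the court's authorization.

```python
import hashlib
import hmac

# Hypothetical re-identification key, held by an authorizing body rather
# than by the analysts running queries.
REID_KEY = b"held-by-the-court"

def pseudonymize(record, pii_fields):
    """Replace identifying fields with keyed pseudonyms (HMAC-SHA256).

    The same identity always maps to the same token, so records can
    still be correlated, but the token reveals nothing by itself.
    """
    out = dict(record)
    for field in pii_fields:
        mac = hmac.new(REID_KEY, str(record[field]).encode(), hashlib.sha256)
        out[field] = mac.hexdigest()[:16]
    return out

records = [
    {"name": "Alice Doe", "flight": "UA100", "payment": "cash"},
    {"name": "Alice Doe", "flight": "UA200", "payment": "cash"},
]
anon = [pseudonymize(r, ["name"]) for r in records]

assert anon[0]["name"] == anon[1]["name"]   # same traveler, same token
assert anon[0]["name"] != "Alice Doe"       # identity itself is hidden
```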

  • "EFF Publishes Patent Hit List"
    Wired News (06/30/04); Terdiman, Daniel

    The Electronic Frontier Foundation (EFF) has narrowed down a list of roughly 200 questionable patents submitted by the public to 10 that are potentially invalid and are being employed to hurt innovation and restrict free expression. Acacia Technologies' digital media transmission patent is targeted by the EFF as the most important patent on the list, and the foundation is concerned that Acacia is using this patent to go after small audio- and video-streaming Web sites. Other patents on the list include Clear Channel's Instant Live patent, which encompasses technology used to produce instant recordings of live concerts; Acceris Communication's voice over IP technology; gaming and real-time ladder rankings patents held by Sheldon Goldberg; a personalized subdomains patent owned by Ideaflood; a NeoMedia Technologies patent that supposedly controls techniques for accessing computers based on ID codes; a test-making technology patent from Test Central; Firepond's automatic message-interpretation and routing systems patent; Nintendo's video-game emulator patent; and a patent held by Seer Systems covering the generation, distribution, storage, and performance of musical work files. EFF staff attorney Jason Schultz contends, "These patent owners...[are] trying to claim ownership over some fundamental part of software or the Internet that people use every day, and they're threatening small companies or individuals that can't afford lawyers." The EFF will formally ask the U.S. Patent and Trademark Office to re-examine these 10 patents. Seattle patent attorney Phil Mann says the PTO's re-examination process "allows members of the public to ask that the patent be examined once again in light of new information, in the hope that the Patent Office will say, 'Oh, we made a mistake.'"
    Click Here to View Full Article

  • "High-Tech Equity"
    Houston Chronicle (06/30/04); Everett-Haynes, La Monica

    Rice University is trying to raise the percentage of female computer science graduates through its Computer Science Computing and Mentoring Partnership (CS-CAMP). CS-CAMP is a two-week program that helps generate an interest in computer science among young women who otherwise would have no opportunity to cultivate it. The National Science Foundation sponsors the camp, which hosts sessions where almost 50 Houston Independent School District students are taught robot assembly, computer repair, and the use of Java-based programs. Duke University computer science professor Carla Ellis, co-chair of the Committee on the Status of Women in Computing Research, says it is critical to introduce computer science concepts to girls in middle and high school, because by the time they are of college age many women lack basic computing knowledge or are intimidated by classes with a dominant male presence. The National Science Center estimates that the percentage of female computer science graduates fell from 37 percent to 28 percent between 1985 and 2001; furthermore, 41 percent of all science and engineering graduates are women, yet just 20 percent graduate with an engineering degree. "The drop-off has been going on, and it's probably going to get a bit more severe," says Ellis, who attributes this decline to a paucity of role models and stereotypical views of science and technology careers as geeky or insular. Michael Sirois of Rice's Center for Excellence and Equity in Education says girls must have a better understanding of the opportunities presented by a science and technology career if they are to be successful.
    Click Here to View Full Article

  • "Software Fuse Shorts Bugs"
    Technology Research News (07/07/04); Patch, Kimberly

    Stanford University researcher George Candea says restraints on inputs and outputs could make software more stable, preventing much of the bug-related trouble that costs the U.S. economy nearly $60 billion each year, according to National Institute of Standards and Technology estimates. Software fails when operations extend beyond the set of conditions for which the software was tested, and Candea proposes constraining reality for software by rejecting unanticipated inputs and outputs through the use of software fuses, which are protections similar to electrical fuses regulating current flowing through a circuit. Developing these fuses requires correctly defining acceptable input and output, as well as measuring predictability so that trade-offs can be made between predictability, performance, and cost. Candea's approach treats the software application itself as a black box, so the software fuse can be deployed with legacy systems and newer software alike. Traditional software reliability researchers may eschew limiting inputs and outputs, but Candea says the method is a pragmatic way of dealing with a very difficult problem, and should coincide with regular software quality improvements. He says, "Instead of fixing the product that fails when given wrong inputs, fix the inputs." Software fuses would guard against inputs of unexpected size, such as the buffer overflow exploit used by the SQL Slammer worm, or inputs of unexpected content, such as the HTML parsing technique used in denial-of-service attacks on the Apache Web server and Squid proxy cache. Other benefits of the software fuse method include the ability of third parties to install the fuses on proprietary software and their relative cost-effectiveness compared to constantly rewriting software, which often introduces new bugs.
    Click Here to View Full Article
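
    Candea's fuse can be sketched in a few lines: a wrapper rejects any input outside the envelope the application was tested for (here, size and character set) before it ever reaches the application, the way an electrical fuse trips before current damages a circuit. The wrapper and its limits below are illustrative assumptions, not Candea's implementation.

```python
import string

class FuseBlown(Exception):
    """Input fell outside the envelope the application was tested for."""

def software_fuse(handler, max_len=256, allowed=frozenset(string.printable)):
    """Wrap a black-box handler so unexpected inputs never reach it."""
    def guarded(data):
        if len(data) > max_len:                    # e.g. buffer-overflow-sized input
            raise FuseBlown("input of %d bytes exceeds %d" % (len(data), max_len))
        if any(ch not in allowed for ch in data):  # e.g. unexpected content
            raise FuseBlown("input contains untested characters")
        return handler(data)
    return guarded

# Stand-in for legacy code that was only ever tested on short ASCII input;
# the fuse wraps it without modifying it, as with a black box.
parse = software_fuse(lambda s: s.upper())

assert parse("hello") == "HELLO"
try:
    parse("x" * 10_000)   # oversized input trips the fuse, not the app
except FuseBlown:
    pass
```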

  • "FTC Mulls Bounty System to Fight Spam"
    MSNBC (06/29/04); Brunker, Mike

    The perceived ineffectiveness of the federal CAN-Spam law has prompted the FTC to consider a bounty system in which a person who identifies a spammer breaking the law will receive a reward of at least 20 percent of the civil penalty the FTC eventually collects--a particularly attractive proposition, considering that the FTC will probably seek multimillion-dollar fines against the most flagrant violators. The bounty concept was given currency by Stanford Law School professor Lawrence Lessig, who concluded, "If the vigilantes who are working so hard to keep lists of offending email servers were to turn their energy to identifying and tracking down spammers, then this passion to rid the world of spam might actually begin to pay off--both for the public and for the bounty hunters." The FTC is accumulating and evaluating expert testimony on the plan and is expected to tell Congress whether it is feasible by September, but critics want the plan rejected. Spamhaus.org founder Steve Linford sees no point to such a system, given that the FTC has already compiled so much data about spammers' identities, while Louis Mastria of the Direct Marketing Association says the plan would only encourage online vigilantism and probably would not lead to any actual arrests. But disappointment in CAN-Spam's performance is palpable and growing stronger, given reports of steadily increasing volumes of spam. Worse, IronPort Systems' Tom Gillis says spammers are increasingly using "zombie" computers as spam launching platforms in order to avoid being traced by authorities. On the other hand, CAN-Spam advocates feel the law is fulfilling its purpose, and was never intended to be an all-in-one solution, but rather "one weapon in the [anti-spam] arsenal," according to Carol Guthrie, a representative of CAN-Spam co-author Sen. Ron Wyden (D-Ore.).
    Click Here to View Full Article

  • "Apple Putting More Focus on Simplifying Searching"
    New York Times (06/29/04) P. C2; Flynn, Laurie J.

    Apple Computer has tapped technology from its iTunes online music service to build a new search feature for Macintosh Operating System X called Spotlight that will permit users to look rapidly for words and concepts stored on a hard drive. Spotlight, which Apple CEO Steven P. Jobs demonstrated at the Apple Worldwide Developer conference yesterday, will be incorporated into Tiger, the next iteration of OS X, slated for rollout in 2005. "It's easier to find a document in a million pages on the Web using Google than it is to find a document on your hard drive," noted Jobs, who claimed that Spotlight will eliminate this problem because it can retrieve data on a hard drive no matter what kind of file it is stored in. Spotlight will be represented on users' screens as an icon in the top right corner, similar to the search buttons common to Web pages. Analyst Tim Bajarin said that Jobs' promotion of Spotlight and other new enhancements to the Macintosh operating system took aim at Microsoft, in particular its much-touted Longhorn operating system. "They're putting in a lot of features that will be in Longhorn, and that's not coming out until 2006 or 2007," he explained. Tiger will include about 150 new features, including Spotlight and a videoconferencing augmentation.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
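
    Content search of the kind Spotlight promises generally relies on indexing file contents ahead of time rather than scanning the disk at query time. A minimal inverted-index sketch (illustrative only, not Apple's implementation; the toy file names and text are invented):

```python
from collections import defaultdict

def build_index(files):
    """Map each word to the set of files containing it (an inverted index)."""
    index = defaultdict(set)
    for name, text in files.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

# Toy corpus standing in for documents of arbitrary file types.
files = {
    "notes.txt": "meeting about the Tiger release",
    "todo.txt":  "demo the new search at the conference",
    "mail.txt":  "Tiger ships in 2005",
}
index = build_index(files)

# A query is a set lookup, not a scan of every file on the drive.
assert index["tiger"] == {"notes.txt", "mail.txt"}
```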

  • "Students Create Global Positioning System Text Messages"
    Gannett News Service (06/28/04); Vinluan, Frank

    A trio of Iowa State University computer science students has devised "spatial cues," a computer program that combines text messaging with the Global Positioning System. Spatial cues builds on a setup familiar from art galleries and museums, where patrons play recorded messages relevant to exhibits by manually entering corresponding codes. But all a spatial cues user has to do to play a message is approach a targeted location. "It knows your location not because you type a number but because of your coordinates," explains Iowa State University computer science professor Simanta Mitra. In a search-and-rescue situation, rescue workers would don a harness with a handheld computer running spatial cues and a global positioning receiver, with a wireless network serving as a transmission medium. The team behind spatial cues believes the technology could become even less distracting with audio capabilities, and could ultimately supplant traditional radio dispatches. One of the students, Shahzaib Younis, says the technology could have applications beyond public safety, such as an urban navigation system for the visually impaired or a messaging service for social groups. Spatial cues was named a top 10 finalist in the IEEE Computer Society's International Design Competition, taking place this week in Washington, D.C., and will compete in the World Finals; 250 teams from 144 schools entered the contest.
    Click Here to View Full Article
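
    At its core, a location-triggered message of the kind spatial cues describes reduces to comparing the receiver's GPS fix against stored message coordinates. A hedged sketch (the students' actual program is not described in detail; the coordinates, radius, and function names below are assumptions):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Messages pinned to coordinates; each plays when the wearer comes
# within its radius, with no code to type in.
cues = [{"lat": 42.0266, "lon": -93.6465, "radius_m": 50,
         "text": "search this building next"}]

def check_cues(lat, lon):
    return [c["text"] for c in cues
            if haversine_m(lat, lon, c["lat"], c["lon"]) <= c["radius_m"]]

assert check_cues(42.0266, -93.6465) == ["search this building next"]  # on site
assert check_cues(42.1000, -93.6000) == []                             # far away
```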

  • "USC Smartens HP Server Memory"
    USC Information Sciences Institute (06/24/04); Mankin, Eric

    The "Godiva" research team at the University of Southern California's Viterbi School of Engineering has created one of the largest processor-in-memory (PIM) chips ever realized in academia, and integrated it into standard Hewlett-Packard Long's Peak server memory modules. The 56-million-transistor Godiva project chip builds on DIVA, an earlier PIM chip design that had 1 million fewer transistors. The Godiva PIM chip supports address translation and eight single-precision floating point units, as well as a memory interface that interoperates with DDR SDRAM memory buses. Godiva team co-project leader Mary Hall, a software specialist, hardware architect Jeff Draper, and colleagues are now testing how well the Godiva-augmented HP server performs such jobs as multimedia, complex scientific simulation, and database access. Hall has spent over four years working to improve PIM chips under the auspices of a project underwritten by the Defense Advanced Research Projects Agency. "Computer scientists have been talking about the potential of PIM chips for most of the past decade and have released devices they call PIM chips, but this is the first smart-memory device designed to support virtual addressing and capable of executing multiple threads of control," Draper asserts. The USC researchers expect the Godiva PIM chip to boost the HP server's performance by as much as one order of magnitude.
    Click Here to View Full Article

  • "APCHI Conference Starts Tomorrow"
    Scoop (NZ) (06/29/04)

    Computer experts from 17 countries are expected to attend the Asia Pacific Computer-Human Interaction (APCHI) 2004 conference, taking place in New Zealand beginning June 29, 2004. Hosted by Waikato University's computer science department, the four-day conference will be the site of 80 paper presentations as well as full-day and half-day tutorials on topics such as designing mobile applications. The theme of the conference will be usability, or making technology a less frustrating and confusing experience for users. "When you go to use your mobile phone or spreadsheet package--or any other system that has a computer in it--you should find the experience satisfying rather than irritating," says Dr. Matt Jones, conference co-chair and computer science senior lecturer. Professor Don Norman, author of "The Design of Everyday Things," which has been lauded for shaping the thinking of a generation of designers, will be the keynote speaker of APCHI 2004. "Don is one of the leading computer experts in the world, and he will point out important new trends, showing why developers need to consider emotional aspects as well as the more traditional notions of efficiency and effectiveness," notes Jones.
    Click Here to View Full Article

  • "Sustainability--A Virtual Reality?"
    IST Results (06/24/04)

    The European Union's VIRTU@LIS project merges virtual reality experiences with environmental awareness programs. Consortium members from the IT, scientific, environmental, learning psychology, and public policy fields collaborated to create the applications, which allow non-technical users to understand the implications of environmental change and recent scientific data. The VIRTU@LIS project is the first such virtual reality project to focus on interactive social learning instead of specific training for specialized tasks, such as air flight simulations. The Personal Barometer application, for example, enables a group of users to measure the impact of their lifestyles on the environment, and the Scenario Generator enables them to understand the balance between sustainability and society and the economy. Users can also experience the difficulties involved in creating environmentally sustainable public policy through the Multi-Agent Games interface, which puts users in the position of different stakeholders, including consumers, agriculturists, industry, scientific researchers, and others. The VIRTU@LIS tools are already being used among project partners, including a Masters-level program at the Universite de Versailles-St. Quentin en Yvelines, and a U.K. Environment Agency project that investigates implementation of European Union water resources guidelines. Scientific coordinator Martin O'Connor says the VIRTU@LIS prototypes make complex scientific information and policy issues understandable for the wider public. A related ALARM project funded by the European Union is also exploring the risks to biodiversity in Europe using virtual reality interfaces.
    Click Here to View Full Article

  • "Speech Technology Starting to Make Its Voice Heard"
    Investor's Business Daily (06/29/04) P. A5; Sleeper, Sarah Z.

    The accuracy levels of speech recognition technology must dramatically improve if it is to successfully penetrate the mainstream desktop market, although it is doing well in niche markets such as dictation software and speech recognition phone systems. Gartner analyst Jackie Fenn predicts that mainstream adoption of speech technology will probably be driven by Microsoft, which has already embedded speech recognition capability into certain Microsoft Office applications. As envisioned, speech technology would enable users to control computerized devices by vocal command and free them from manual interfaces such as mice and keyboards, but Fenn says the technology must make such usage possible without adding more software on top of an existing operating system if it is to gain wide acceptance. The dictation software market remains limited because some people get too frustrated with training the software, although Fenn reports that the technology's accuracy levels are continuously improving while others note that training time is shrinking. James Mastan with Microsoft's Speech Server product group explains that users see two primary uses for speech technology--as a way for them to link to their companies' computer systems through a phone, and as a technology to supplant automated touch-tone phones. Fenn, meanwhile, foresees voice software for mobile devices becoming very popular among consumers: "It's likely to take off in the mobile space because people have a compelling reason to want to use it there," she notes. Issues that still need to be dealt with include enabling computers to extract context and meaning from the limitless forms of human expression, and consumers' low tolerance for errors.

  • "Experts: Synthetic Biology May Spawn Biohackers"
    EE Times (06/28/04) No. 1327, P. 45; Brown, Chappell

    Harvard University genetics professor George Church believes the potential abuse of synthetic biological designs is a danger as great as nuclear weapons, noting that it is much easier to genetically engineer a virus than it is to build a nuclear bomb. Deconstructing a cell's molecular biology into a series of standard components could enable biodesigners to fabricate molecular machines in a manner similar to the creation of silicon systems by system-on-chip designers; moreover, as with circuit designers, biodesigners will be able to assemble complex biochemical machines without having to be intimately familiar with biochemical processes. Church, who chaired a panel on DNA synthesis at MIT's Synthetic Biology 1.0 conference earlier this month, says the human body has not developed a general immunity against synthetic biological agents. Cutting-edge synthetic biology is currently on a par with the earliest stage of the integrated circuit revolution, in which engineers added a few gates to a chip, but the process could be accelerated thanks to the collective experience with electronic design systems over four decades of very large scale integration advances. The synthetic biology movement, if successful, would allow electrical engineers to more easily enter a nanotechnology sector with wide medical and industrial applications. Biological processes can easily support self-replication, which is what makes biosynthetic systems so potentially dangerous. This has prompted some experts to suggest banning self-replication, but such a measure would rob synthetic biology of much of its power. Church thinks that licensing and tracking the oligonucleotides that serve as the basic building material of biosynthesized systems is a viable regulatory solution. 
Tom Knight, director of the BioBrick Wet lab in MIT's Computer Science and Artificial Intelligence Laboratory, says, "There is a lot of power and danger here, but I would like to think that the advantages...outweigh the dangers."
    Click Here to View Full Article

  • "Richard Clarke Talks Cybersecurity and JELL-O"
    IEEE Security & Privacy (06/04) Vol. 2, No. 3, P. 11; Goth, Greg

    In reflecting on the events of the year following his resignation as White House counterterrorism and cybersecurity czar, Richard Clarke notes that the Bush administration has taken some positive steps toward better shielding the U.S. cyber-infrastructure, but continues to demonstrate a serious lack of leadership in more fully addressing the problem. He appreciates the organization of the Cybersecurity Summit and the creation of the Department of Homeland Security's (DHS) National Cybersecurity Division, but says that both were undermined by, respectively, DHS's refusal to lead the conference and the failure to create an assistant-secretary-level position. Clarke comments that it is hard to tell whether the National Strategy to Secure Cyberspace is being properly implemented, given the lack of a mechanism for high-level evaluation of deployment progress; all the same, he thinks the strategy is still relevant and useful. The former cybersecurity czar believes the threat of a catastrophic cyberattack is alive and well, and says the fact that such an incident has not occurred is no reason for people to assume that it never will: "You cannot straight-line the kind of problems we've had in cyberspace and say that's all that's ever going to happen--yet people do that...without thinking that's what they're doing," he posits. Clarke notes that the government has made progress in pressuring vendors to secure software and services, but says there is a definite need for software with improved vulnerability-checking mechanisms, as well as a technique for drafting, writing, and checking the quality of code that can be performed by people and other software. He calls for the establishment of a national software quality assurance institute that can assess existing code, develop ways to teach people secure coding, and underwrite research into new coding methods. 
Clarke reports that the shortage of a simple, universal terminology of vulnerabilities is not as serious a problem as it was four years ago, but no one group is leading the effort to solve the problem. He places this responsibility squarely on the shoulders of the National Cybersecurity Division.
    Click Here to View Full Article

  • "Petaflop Imperative"
    InformationWeek (06/21/04) No. 994, P. 55; Ricadela, Aaron

    A petaflop computer capable of processing up to 1 quadrillion mathematical computations per second could revolutionize many industries, including weather forecasting, medicine, and design and engineering. Council on Competitiveness President Deborah Wince-Smith explains that breaking the petaflop boundary, like all innovation, is essential to the continued prosperity and competitiveness of the United States: "U.S. industry, government, and scientific enterprise have to have access to the next generation of tools to ask questions that may not be on their radar today," she maintains. IBM senior VP Nick Donofrio expects his company to have a petaflop machine ready within the next two years, while most computer scientists are not predicting such a breakthrough until 2010. Reaching the petaflop milestone is also seen as a matter of national pride, since the U.S. lead in scientific innovation has been slipping in recent years. But achieving that level of processing power means little without a sustainable architecture that businesses can invest in: "If we have to develop novel architectures to achieve world leadership in supercomputing, is there enough commonality between national and commercial needs to support a common architecture for both?" asks D.E. Shaw Group Chairman David Shaw. A key factor is making the petaflop machine programming-friendly, manageable, and connected to other technologies employed by the businesses it is supposed to benefit the most. IBM's $100 million Blue Gene/L supercomputer project is an effort to reach petaflop speeds; its targeted applications include protein simulation, seismic imaging, computational chemistry, and business intelligence. 
The Defense Advanced Research Projects Agency, the Energy Department, and other federal agencies are also reaching for the petaflop prize by funding various projects, including one that aims to set up a supercomputer at the Oak Ridge National Lab for use in physics, biology, nanoscience, and global warming research.
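For scale, the arithmetic behind the milestone is straightforward: a petaflop machine performs 10^15 floating-point operations per second, a thousand times the rate of a teraflop machine. The workload size in the sketch below (10^18 operations) is an illustrative assumption, not a figure from the article.

```python
# Turnaround time for a hypothetical simulation requiring 10**18
# floating-point operations, at teraflop vs. petaflop rates.
WORKLOAD_FLOPS = 1e18  # assumed workload size, for illustration only

for name, rate in [("teraflop", 1e12), ("petaflop", 1e15)]:
    seconds = WORKLOAD_FLOPS / rate
    print(f"{name}: {seconds:,.0f} s (~{seconds / 3600:.1f} hours)")
```

The same job that occupies a teraflop machine for roughly eleven and a half days finishes in under 20 minutes at petaflop speed, which is why fields such as weather forecasting and protein simulation are cited as beneficiaries.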
    Click Here to View Full Article

  • "Give It Some Gas"
    New Scientist (06/19/04) Vol. 182, No. 2452, P. 26; Ananthaswamy, Anil

    Liberating mobile devices from today's power sources is the aim of several research teams investigating how batteries in such products can be replaced with fossil-fuel-based power sources. "Ultimately, the goal is to be able to make millions of these cheaply, like you make Bic lighters or disposable batteries," explains Paul Ronney of the University of Southern California in Los Angeles. Ronney's team has constructed a scaled-down version of the "Swiss roll" device designed by Imperial College London's Felix Weinberg: The prototype measures a mere 1 cubic centimeter in volume and features a central combustion chamber with a platinum catalyst; Ronney's team is working to produce electricity by incorporating thermoelectric elements into the Swiss roll's walls. Sossina Haile's team at Caltech has adopted the Swiss roll design and added a single-chambered solid oxide fuel cell into which a propane/air mixture is injected, yielding 300 milliwatts of electricity. Pacific Northwest National Laboratory has built a methanol reformer that powers a fuel cell through the conversion of methanol to hydrogen, with an expected output of 500 milliwatts. MIT and Belgium's Catholic University of Leuven, meanwhile, are focusing on gas-turbine engines. Researchers at the University of California, Berkeley, have based their micro-power generator's design on that of the internal combustion engine: A team led by Carlos Fernandez-Pello has created a miniature Wankel rotary engine that runs on hydrogen and generates up to 10 watts. A power output of 30 to 60 watts would make the device powerful enough to run laptops, laptop battery chargers, power tools, and personal digital assistants, while combustion could be sustained by either stacking engines or recycling the exhaust. Micro-scale engine research has yet to overcome the challenges of viable fuel-delivery mechanisms, friction and heat loss, and emission of harmful substances.
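The appeal of fuel over batteries comes down to energy density: even at modest conversion efficiency, hydrocarbon fuels store far more energy per gram than batteries deliver. The figures below are ballpark assumptions for illustration, not numbers from the article.

```python
# Back-of-the-envelope comparison, using assumed ballpark figures:
# methanol holds roughly 5.5 kWh/kg of chemical energy (lower heating
# value ~19.9 MJ/kg), while a lithium-ion battery pack delivers roughly
# 0.2 kWh/kg of electrical energy.
METHANOL_KWH_PER_KG = 5.5  # assumed chemical energy density
LI_ION_KWH_PER_KG = 0.2    # assumed pack-level specific energy

for efficiency in (0.05, 0.10, 0.20):
    usable = METHANOL_KWH_PER_KG * efficiency
    print(f"{efficiency:.0%} conversion: {usable:.2f} kWh/kg usable, "
          f"{usable / LI_ION_KWH_PER_KG:.1f}x Li-ion")
```

Under these assumptions, a fuel-to-electricity converter need only reach a few percent efficiency to match a battery's specific energy, which is why even lossy micro-combustors and micro-fuel-cells are attractive.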

  • "Indoor Location Technology Opens New Worlds"
    GeoWorld (06/04) Vol. 17, No. 6, P. 38; Kolodziej, Kris

    Jupiter Research estimates that there will be 184 million American wireless subscribers by next year, while the National Emergency Number Association calculates that people use mobile phones to make more than 30 percent of 911 calls in the United States; these and other developments clearly establish a need to address key issues for indoor positioning systems. There is clear consumer interest in location-based services (LBSes): One particularly desired service is the alert-based LBS, which notifies users when critical events happen nearby, such as when they pass a retail item that is on sale. Global Positioning System (GPS) and cellular-network-based positioning are unsuitable for indoor location applications because they can be confused by obstructive objects and signal reflection off surfaces. A handful of positioning techniques can be useful, either by themselves or in combination: triangulation, scene analysis, and proximity. The most streamlined approach for indoor positioning exploits existing communication infrastructure, such as Wi-Fi networks. Absolute positioning maps users' locations to x and y coordinates: The positions of mobile hosts can be calculated from distance measurements to Wi-Fi access points (APs) that relay radio-frequency signals enhanced with physical coordinates. Relative positioning, on the other hand, locates users within an enclosed vicinity by reference to symbolic Wi-Fi APs. Location systems designed for single environments can have a hard time performing in other environments, which makes the case for the development of hybrid positioning systems that employ diverse classes of sensors; a successful hybrid system must be based on open interoperability standards, which necessitates collaboration between different infrastructure providers.
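The absolute-positioning approach can be sketched as trilateration: given range estimates to three APs at known coordinates, subtracting one distance equation from the other two yields a linear 2x2 system in the unknown position. The AP coordinates and ranges below are made-up illustrative values, not data from the article.

```python
import math

def trilaterate(aps, dists):
    """Position of a mobile host from three APs with known (x, y)
    coordinates and measured ranges. Subtracting the first circle
    equation from the other two linearizes the system, which is then
    solved by Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = aps
    d0, d1, d2 = dists
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 + y1**2 - x0**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 + y2**2 - x0**2 - y0**2
    det = a11 * a22 - a12 * a21  # zero if the three APs are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

aps = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]     # hypothetical AP coordinates, meters
dists = [5.0, math.sqrt(65.0), math.sqrt(45.0)]  # ranges consistent with a host at (3, 4)
print(trilaterate(aps, dists))  # → (3.0, 4.0)
```

In practice Wi-Fi range estimates are noisy, which is one reason the article argues for hybrid systems combining triangulation with scene analysis and proximity sensing.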
    Click Here to View Full Article
