
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 652: Friday, June 4, 2004

  • "Is the Dust On Your Computer Toxic?"
    CNet (06/03/04); Hines, Matt

    A joint report from the Computer TakeBack Campaign and Clean Production Action indicates that dust accumulating on computers contains polybrominated diphenyl ethers (PBDEs), fire-retardant compounds known to cause reproductive and neurological disorders in lab animals. The report's conclusions are based on 16 dust samples the authors collected from computer monitors in public areas, including legislative offices, university computer centers, and a children's museum. The study also notes the continuing persistence of PBDEs in the environment and their contamination of food supplies, animals, and people; the researchers contend that North America has the greatest concentration of PBDEs, and they estimate that the U.S. population's PBDE levels are doubling every two to five years. The European Union has mandated the elimination of all PBDEs in consumer electronic products by 2006, while U.S. efforts lag in comparison, according to the report's authors. U.S. computer recycling programs are moving ahead, but the volume of PCs recycled by the world's three biggest PC manufacturers is relatively small: Dell estimates it has recycled only 2 million units out of all the machines it has shipped in the past 12 years. The Computer TakeBack Campaign is pushing for legislation that would make computer and electronics vendors liable for PBDEs and other dangerous substances found in their products. Meanwhile, Maine has prohibited the sale of products that contain deca-BDE, while California banned the production and use of penta-BDE and octa-BDE last year. Washington state has established an executive order to map out guidelines to eliminate all PBDEs, and similar proposals are under consideration in Massachusetts, Wisconsin, New York, and other states.
    Click Here to View Full Article

  • "Linux Is Inching Into College Curriculums"
    NewsForge (06/04/04); Lyman, Jay

    University of California, Berkeley computer science lecturer Brian Harvey believes that major American colleges and universities are generally slow to add Linux and open source software development and administration to their curriculums. "You'd do better...at technical schools, the places that advertise in the subway, and in the job section of the newspaper," he wagers. ITT Technical Institute corporate curriculum manager Wen Liu notes that two courses at his school offer Linux training and education, while Blue Star Learning's Casey Boyles reports that both corporate and federal organizations are increasingly requesting Linux and open source training. He adds that corporate demand will likely spur universities to pursue open source more aggressively as an educational component. Furthermore, Servin President Norman McEntire says that Linux and open source are growing in popularity among high-school students, which is also putting pressure on academia to add them to curriculums. "It seems to me as an educator and a researcher that Linux and software like Linux stimulates research, innovation, and competitiveness," comments Marist College President Dennis Murray, whose school has deeply embraced Linux and open source through efforts such as the Linux Research Development Lab. Dean of the Marist School of Computer Science Roger Norton notes that the college will soon offer a new course in "open source development methodology" as part of an expansion of its Linux curriculum. Murray believes that the leading computer science and information systems schools in the United States are giving students Linux training through Unix programs.
    Click Here to View Full Article

  • "E-Democracy: The Lowdown on E-Voting"
    CIO (06/01/04); Santosus, Megan

    Concern and disagreement over the insecurity of electronic voting will be amplified as this November's presidential election approaches, but efforts are underway to solve the three most pressing e-voting problems: providing reliable audit trails to support accurate recounts and thwart fraud; addressing the security risks inherent in software and electronic data transmission; and educating the roughly 173 million registered voters on e-voting procedures to minimize confusion. The push for adding voter-verified paper ballots to e-voting systems is gaining adherents, but critics such as MIT professor Ted Selker and Information Technology Association of America President Harris Miller argue that such a measure will make election administration more expensive and time-consuming and have a negative impact on the efficiency and reliability of the voting process. Proposed solutions to e-voting's security risks include locking computers and making them tamper-proof, effecting one-time transmission of voting results, and training poll workers on appropriate security protocols. Scientists list buggy source code as the most serious issue with e-voting systems: At the root of the problem is the code's proprietary nature, and critics contend that voting software should be either open source or available for public review; however, Diebold Election Systems' David Bear claims that many security solutions recommended by computer scientists are impractical and unnecessary in the face of actual election administration. Meeting the third challenge requires more thoughtful consideration of ballot and e-voting interface design by election officials and vendors, respectively, according to University of Iowa professor Doug Jones. Instructions need to be clearer and more concise, while the ballot layout must be less confusing. National ballot design guidelines, usability pilot tests, and common usability standards for vendors were suggested by Susan Zevin of the National Institute of Standards and Technology's IT Lab at a February conference of the National Association of Secretaries of State.
    Click Here to View Full Article

  • "Rules Aim to Get Devices Talking"
    Technology Research News (06/09/04); Smalley, Eric

    Flemish researchers are working on communications protocols necessary for "ambient intelligence" between electronic devices in the home. The idea is to imbue each device with the right qualities so that it can cooperate with other devices in meeting users' needs; the approach is the opposite of explicit programming, in which devices are given specific instructions on what to do in each circumstance. Allowing devices to self-organize means electronic devices can work together on complex tasks vendors may not have anticipated when creating the devices. The Free University of Brussels research team is building its protocols using a game-playing approach, where devices learn about other available devices based on back-and-forth commands; by sending messages to one another and gauging responses, the devices can determine commonalities and the level of trust to assign the other device. As devices randomly attempt to perform a task, they also learn which ones perform that task better and work out an optimal division of labor, says researcher Carlos Gershenson. Besides communications protocols, researchers at the Free University of Brussels are examining other ambient intelligence requirements, such as the level of necessary intelligence, communications channels, and interfaces with human users. Devices with ambient intelligence will eventually have to function both on their own and within a complex system, says MIT researcher Larry Rudolph, who is involved with the school's Oxygen ubiquitous computing effort; he notes that MIT is taking a hybrid approach to ubiquitous computing, incorporating both self-organizing principles and explicit programming. Ubiquitous computing applications tend to be easily disrupted and problems must be addressable by normal users, he adds.
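
    A minimal Python sketch of the trust-learning idea described above, assuming made-up device names, a moving-average scoring rule, and a winner-take-all task assignment; it illustrates the concept, not the Brussels team's actual protocol:

      import random

      # Hypothetical devices, each with an unknown skill at some task.
      SKILL = {"lamp": 0.3, "tv": 0.6, "thermostat": 0.9}

      def attempt(device):
          """Simulate one task attempt; True on success."""
          return random.random() < SKILL[device]

      # Trust scores start neutral and are updated from observed responses,
      # echoing the back-and-forth message exchange in the article.
      trust = {d: 0.5 for d in SKILL}
      for _ in range(200):
          d = random.choice(list(SKILL))   # devices try the task at random
          ok = attempt(d)
          trust[d] = 0.9 * trust[d] + 0.1 * (1.0 if ok else 0.0)

      # Division of labor: the task is routed to the most trusted device.
      best = max(trust, key=trust.get)
      print(f"assign task to {best} (trust {trust[best]:.2f})")
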
    Click Here to View Full Article

  • "Networking Research Center Aims to Improve Computer Links, Communications"
    Stanford Report (06/02/04); Koch, Geoff

    The growing diversity of computing and wireless communications devices and protocols is complicating the linkage of these myriad tools, and easing these connections is the goal of the Stanford Networking Research Center (SNRC). Addressing these formidable and complex problems "requires expertise in everything from smart radio architecture to network operations management," says SNRC director Michael Eldredge, who aims to supply such expertise by assembling networks of researchers at the Stanford School of Engineering and managers at private-sector technology firms. SNRC's search for industry partners began four years ago, and among the first to sign up were Cisco, Sony, Bosch, 3Com, and STMicroelectronics, which established SNRC's $10 million endowment and are currently funding the center's yearly research budget. In exchange for their endowments, industry partners are given access to leading ideas, strategies, and specialists so that technologies for which there will be "market pull" can be developed. Eldredge is not surprised by the participation of automotive and auto-related companies, noting that "There are several distinct computer networks running through the newest cars--for control and monitoring of the engine, safety and comfort." An SNRC- and Accel Partners-sponsored symposium entitled "Service-Oriented Flexible Computing: Promises and Challenges of the Next Generation" will be held on June 8, where the center's industry partnerships will be in evidence.
    Click Here to View Full Article

  • "The Changing Face of Email"
    Wired News (06/03/04); Asaravala, Amit

    In his June 2 keynote speech at the Inbox email technology conference, Proofpoint CEO Eric Hahn described email as "broken" and in need of "metaphoric changes." He warned that email is threatened by information overload, and urged software developers to reject the notion of email inboxes as memo dumping grounds and start to view them as central areas where voice mail, instant messaging, email, and other communications are integrated. One example cited by Hahn was the file-folder metaphor: Whereas the tool was designed at a time when people were expected to receive no more than five daily messages, nowadays many people are receiving 10 to 20 times as many messages, which places an unreasonable burden on users to file them all. The Proofpoint CEO also said he wanted email software and IM software to be merged. During a panel on email management, University of Illinois at Urbana-Champaign researcher Ben Gross predicted that email and IM will overlap considerably by 2006, while email software developers will start to embed RSS readers into their products. Some changes to email interfaces are already manifesting themselves: Microsoft's Outlook 2003 and Google's Gmail service permit users to see related email messages sent to and from a single individual. Gmail and Bloomba software from Stata Laboratories boast advanced email search features in which users can place all their email messages in a single folder and carry out fast searches to find individual messages. Meanwhile, Yahoo!'s Miles Libbey says Yahoo! Mail will be upgraded this summer, and among the expected new features are sender authentication indicators that appear next to messages in the inbox.
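
    A minimal sketch of the single-correspondent grouping and single-folder search the article attributes to Outlook 2003, Gmail, and Bloomba; the inbox data and grouping key below are assumptions for illustration:

      from collections import defaultdict

      # Hypothetical inbox: (correspondent, subject, body) per message.
      inbox = [
          ("[email protected]", "Re: budget", "Numbers attached."),
          ("[email protected]", "lunch?", "Noon works."),
          ("[email protected]", "Re: budget", "One more revision."),
      ]

      # Group all mail to/from a single correspondent, Outlook/Gmail style.
      threads = defaultdict(list)
      for sender, subject, body in inbox:
          threads[sender].append(subject)
      for who, subjects in threads.items():
          print(who, "->", subjects)

      # Single-folder search (Gmail/Bloomba style): scan everything, no filing.
      hits = [m for m in inbox if "budget" in m[1].lower()]
      print(len(hits), "messages match 'budget'")
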
    Click Here to View Full Article

  • "Executives See Swell of Net Offerings on Horizon"
    USA Today (06/03/04) P. 6B; Klemik, Martin

    The Internet is just warming up, according to a group of technology executives gathered by USA Today for a roundtable discussion: New services such as the free VoIP Skype service promise to change the way people perceive and use the Internet, as well as how companies exploit its capabilities. Motorola CEO Ed Zander says the Internet will have truly matured once people stop talking about it, but take it for granted as they do electricity; his company is pursuing Web-connected, multimedia handhelds--what he calls "the device formerly known as the cell phone"--that allow people to integrate digital technology into their lives. Draper Fisher Jurvetson venture capitalist Steve Jurvetson says the Internet still needs to address threats such as spam by adopting an autonomic response similar to how human immune systems work. The spam and virus threat will eventually come to mobile devices as well once those systems have enough computing resources and a large enough user base to attract hackers and other criminals. Netscape co-founder and Opsware Chairman Marc Andreessen says the economics of the Internet have continued to quickly improve despite the recent downturn in the technology sector. One of the largest changes for the Internet, coming in perhaps five or 10 years, is the pervasive use of encryption, predicts Andreessen: Everything from commerce to communications to education will be unreadable by governments and companies--"The Internet is going to essentially go dark," Andreessen says. Going forward, Jurvetson says IT will converge with nanotechnology and biotechnology, and surmises that the most difficult computer science problems will be solved by mimicking biological systems in some way. He contrasts the svelte human genome against the bloated code that makes up Microsoft Office, or the efficiency of the human brain against today's super-hot integrated circuits.
    Click Here to View Full Article

  • "Are You Ready to Have a Chat With Your Car?"
    New York Times (05/31/04) P. D8; Patton, Phil

    IBM developers are pursuing advanced telematics technologies that could enable two-way communication between automobiles and motorists, earlier anticipation of vehicular malfunctions, and wireless consultation with databases. T.J. Watson Research Center scientists are striving to make voice-activated automotive controls more accurate by incorporating a video channel capable of reading the driver's lips, and audiovisual speech recognition expert Makis Potamianos says this technique has boosted voice-recognition system accuracy by 80%. Another proposed technology is the artificial passenger or conversational agent, which IBM demonstrated in a virtual trip from Connecticut to New York's Kennedy airport: The agent, dubbed "Christie," accessed weather and traffic forecasts, contacted the airline to check for flight delays, and rerouted the motorist around a traffic jam; when the agent detected the driver nodding off via camera, it began a conversation and offered to play "Name That Tune" to keep the driver awake. IBM plans to keep the obsolescence of in-vehicle electronics to a minimum through off-board computing power, which would theoretically ease software upgrades and allow communication companies to exploit decreasing hardware costs. Another IBM initiative uses algorithms to identify patterns in diagnostic data that signal nascent failures, an approach that could detect problems faster than human managers and prevent product recalls. The diagnostic data would be sent directly to the automaker via a connection between cars and centralized databases. Advanced telematics systems could be a viable replacement for "idiot light" dashboard warnings and allow mechanics to monitor transient problems more closely, as well as help automakers satisfy the federal TREAD Act.
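
    A minimal sketch of the kind of pattern-spotting the diagnostic initiative describes, assuming a simple moving-baseline threshold in place of IBM's algorithms, which the article does not detail:

      from statistics import mean, stdev

      # Hypothetical engine-temperature telemetry sent to the automaker.
      readings = [88, 90, 89, 91, 90, 92, 97, 104, 113]  # degrees C

      # Flag a nascent failure when a reading drifts far from the
      # recent baseline -- a stand-in for IBM's pattern detection.
      WINDOW, SIGMAS = 5, 3.0
      for i in range(WINDOW, len(readings)):
          baseline = readings[i - WINDOW:i]
          mu, sd = mean(baseline), stdev(baseline)
          if abs(readings[i] - mu) > SIGMAS * sd:
              print(f"reading {i}: {readings[i]} C deviates from "
                    f"{mu:.1f}+/-{sd:.1f}; alert the automaker")
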
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "For Mac Security, Communication Is Key"
    CNet (06/03/04); Fried, Ina; Lemos, Robert

    The Macintosh operating system has enjoyed an excellent track record of being a low security risk compared to Windows, but the public disclosure in May of a series of vulnerabilities that could allow hackers to hijack a Mac has prompted some people to argue for more open communication on security issues from Apple Computer, as well as faster patching. Apple's Phil Schiller noted that his company chooses when to release patches based on a process that takes into account Apple's evaluation of the danger posed by the bug, even though industry guidelines recommend that researchers wait at least 30 days after notifying vendors of a vulnerability to publicly disclose it. Critics have also said Apple should provide more detailed information on its Web site, and let researchers report vulnerabilities to a dedicated email address. Schiller insisted that Apple does these things already, but acknowledged that many people are unaware of it. @Stake, meanwhile, contends that Apple has a tendency to discount the magnitude of potential security flaws in its descriptions: For example, eEye found a flaw in Mac OS X's QuickTime movie player that could reportedly enable the execution of malicious code, but Apple downplayed the threat, claiming that the bug could merely cause the player to crash. "They are not characterizing the [security] issue so that people can make a security decision about it," explained @Stake's Chris Wysopal. But despite assurances from Schiller that Apple will consider communicating potential threats in more detail, some Mac users prefer not knowing, as long as Apple prevents vulnerabilities from becoming serious. Mac's appeal as a hacker target is limited because its worldwide market share is less than 5%, although some believe the hacker threat could grow as the company gains prominence in the Unix sector.
    Click Here to View Full Article

  • "Passwords Can Sit on Hard Disks for Years"
    New Scientist (06/02/04); Biever, Celeste

    Computers will sometimes copy information stored in random access memory (RAM) such as passwords and credit card numbers onto hard disks, making sensitive data vulnerable to hackers. In fact, the likelihood that information will be copied onto disk increases the longer it remains in RAM. A team of Stanford University researchers led by Tal Garfinkel developed TaintBochs, a software program that models the operations of a complete computer system and allows sensitive data to be tagged and monitored as it travels throughout the system. The team then simulated computers running popular software where the handling of confidential personal information is a common occurrence, and concluded from their experiments that the programs made practically no effort to limit the length of data retention. Garfinkel believes the best safeguard is keeping data in RAM for the shortest amount of time possible, and this could be done by automatically overwriting RAM data with zeros once it is no longer needed. Another solution could be to encrypt data as it is entered, prior to RAM storage. The chief drawback of these measures is that they divert processing power that could be used to speed up the computer or carry out more appealing operations, but International Computer Science Institute security expert Vern Paxson contends that the increasing computational power of processors is making computers capable of supporting built-in security without sacrificing performance.
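
    A minimal Python sketch of the zero-after-use safeguard Garfinkel suggests; a real implementation would live in the OS or language runtime, and Python cannot prevent internal copies, so the bytearray below is purely illustrative:

      def with_secret(get_secret, use):
          """Fetch a secret into a mutable buffer, use it, then zero it."""
          buf = bytearray(get_secret())      # sensitive bytes live here
          try:
              return use(buf)
          finally:
              for i in range(len(buf)):      # overwrite before release,
                  buf[i] = 0                 # shrinking the data's lifetime

      msg = with_secret(lambda: b"hunter2",
                        lambda pw: f"password is {len(pw)} chars")
      print(msg)
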
    Click Here to View Full Article

  • "Virtual Design Reality for Europe's Construction Industry"
    IST Results (06/03/04)

    Construction projects require collaboration between large numbers of stakeholders, which can give rise to communications difficulties. Solving such problems is the goal of the Information Society Technologies program-funded DIVERCITY project, an initiative to develop a toolkit of six software applications that enable users to visualize and model project aspects in the briefing, design, and scheduling phases. The applications can be employed separately or concurrently, either in an integrated collaborative process or as part of a standalone activity. The modules cover client briefing, acoustics, lighting, thermal and constructability simulations, and site analysis. Rob Aspin of the University of Salford's Future Workspaces Research Center notes that relating design requirements to the design team is frequently difficult for a building's clients and prospective users, and DIVERCITY seeks to address this problem "by developing a virtual design workspace that enables clients, users and the design team to communicate their ideas to each other in a more understandable format." Tools in the workspace permit a graphical building program to be developed wherein the client and designer can define spatial requirements and relationships in a way that maximizes their accessibility in the latter phases of the design process; 3D spatial layouts are set up from the structured building program, and once enough detail is layered into the model, it can be exported to a CAD application. The DIVERCITY toolkit employs a project information board in which data used by all stakeholders has to be entered only once, and much of the project information can be rendered visually rather than textually via the 3D interactive visualization applications. DIVERCITY further supports users in the construction stages with site planning and construction sequence planning and tracking via 3D interactive visual tools that impart the site configuration and building program in a graphical medium.
    Click Here to View Full Article

  • "Center Demonstrates Emerging Approach to HPC--Web Services"
    Newswise (06/01/04)

    Loosely coupled high-performance computing applications can offload computation and data from desktops or mobile devices to remote servers through Web services. The Cornell Theory Center's (CTC) Computational Finance Group exemplifies this strategy in an online demo that uses an XML-based Web services architecture--Microsoft's .NET Framework--to devise and test a solution to the problem of pricing callable bond portfolios. The bond computations are launched on a remote Windows cluster by an Excel front-end; a single processor in the cluster separately prices each individual bond, after which results are sent back to the desktop. CTC senior research associate Peter Mansfield notes that the extensibility and reliability of XML-based Web services frameworks are critical factors, facilitating the seamless incorporation of servers to satisfy the performance needs of the current problem. CTC Director Thomas Coleman explains that this advantage gives developers more room to concentrate on research enablement rather than coding and systems administration, and he believes that data-intensive computing and other areas will be affected even more by XML-based Web services. "Moving data among geographically disperse, heterogeneous platforms requires industry-standard protocols such as XML and SOAP and secure interfaces," Coleman points out. Cornell's National Science Foundation Adaptive Software Project involved a demonstration of cross-platform data exchange via XML.
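
    A minimal sketch of the fan-out pattern described above, with a local process pool standing in for the remote Windows cluster and a toy non-callable pricing formula standing in for CTC's model; none of the names below come from the center's code:

      from concurrent.futures import ProcessPoolExecutor

      def price_bond(bond):
          """Toy stand-in: present value of one bond, priced on one worker."""
          face, coupon, years = bond
          coupons = sum(face * coupon / 1.04 ** t for t in range(1, years + 1))
          return coupons + face / 1.04 ** years

      portfolio = [(1000, 0.05, 10), (1000, 0.06, 5), (500, 0.04, 7)]

      if __name__ == "__main__":
          # The front end dispatches one bond per worker, as the Excel client
          # does with cluster processors; results return to the caller.
          with ProcessPoolExecutor() as pool:
              prices = list(pool.map(price_bond, portfolio))
          print("portfolio value:", round(sum(prices), 2))
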
    Click Here to View Full Article

  • "Top Administration Cybersecurity Officials Face Scrutiny"
    National Journal's Technology Daily (06/02/04); New, William

    Federal computer networks are not being protected fast enough, according to the House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census. Chairman Adam Putnam (R-Fla.) says administration cybersecurity officials should concentrate management efforts on response, prevention, and detection, and that the Federal Information Security Management Act's configuration management provisions need to be advocated and enforced. He says, "Make no mistake. The threat is serious. The vulnerabilities are extensive. And the time for action is now." Federal CIOs have a major challenge in dealing with cyber threats, says Karen Evans, White House Office of Management and Budget administrator for electronic government and information technology. She says the National Institute of Standards and Technology will maintain a Web-based portal and solicit security setting requirements for all systems used by the government. Agencies are being asked for more detailed inventory reporting and information about security vulnerability patching. According to a General Accounting Office report, agencies are not implementing common practices for patch management in a consistent manner. Amit Yoran, who directs the Homeland Security national cyber security division, notes that the Cyber Interagency Incident Management Group involves officials with authority over their agencies' resources and responses to incidents.
    Click Here to View Full Article

  • "Web3D Consortium Announces CAD Distillation Format"
    XMLMania.com (06/02/04)

    The Web3D Consortium has finished the working draft of its CAD Distillation Format (CDF), which will enable companies to share their CAD designs throughout the enterprise without divulging sensitive design information. The specification allows non-CAD experts to repurpose portions of the CAD design for other projects, such as for high-quality graphics in marketing materials, use in PowerPoint demonstrations, or even in virtual reality walkthroughs via the Web. Organizations often fail to realize the full potential of their CAD investments because the design information is locked away in proprietary formats and cannot be shared easily without possibly releasing the design details, says Web3D President Neil Trevett. CDF enables portions of that data to be shared in a way that makes data security easy to manage. CDF is based on Web3D's X3D open standard for real-time 3D data transfer, and will be enhanced by the X3D Amendment 1 due out this summer; both CDF and X3D Amendment 1 will be submitted to the International Organization for Standardization (ISO) later this year. The Web3D CAD Working Group has also created a software toolkit, documentation, open-source code, and a best-practices guide to assist organizations in their deployment of CDF. The CDF specification is a major step forward for Web3D's goal of facilitating 3D data communication across different platforms, applications, and networks, says NASA Ames program manager Paul Keller. The government and defense industry are expected to be the largest beneficiaries of the new format, says X3D group co-chair Don Brutzman.
    Click Here to View Full Article

  • "A Virtual Music Machine"
    New York Times (05/31/04) P. C4; Chartrand, Sabra

    A virtual orchestra machine is generating a tremendous amount of controversy and scientific innovation, as well as another case study in how technology reshapes industries. New York City College of Technology professor David Smith is one of three academics who collaborated to build the machine, and says it differs from other music synthesizers in that it allows users to vary key aspects of the piece in order to match live performers: Instead of recording the entire piece on a single sound file, the Sinfonia virtual orchestra must be programmed with every individual note, sequence, and volume change. Although the device's keyboard resembles that of a musical keyboard, it actually features many computer-like functions that allow technicians to emulate the performance of instruments and sections that are absent from a theater or other performance, and that capability has made the Sinfonia a lightning rod for controversy on Broadway, where producers spar with performer unions over cost-cutting. After Broadway producers threatened to replace entire orchestras with the machine during a strike last year, the musicians' union agreed to a compromise that guaranteed a set number of live performers for each show. Other agreements between musicians and performers at the Opera Company of Brooklyn and off-Broadway theaters have resulted in bans on the machine. The Sinfonia is far from a push-button device, as it requires significant programming by experts, including the music director, and use in rehearsals with stage performers. Because each performance can vary slightly in how songs are sung, Sinfonia operators can adjust aspects of the performance that are programmable in real time, such as the tempo, by taking cues from performers on stage. Smith says the machine has been used in over 9,000 performances worldwide.
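
    A minimal sketch of the real-time tempo adjustment described above; the note-event format, base tempo, and scale factor are illustrative assumptions, not Sinfonia's internals:

      # Each programmed event: (beat position, pitch, volume) -- every note
      # and volume change entered individually, as the article describes.
      score = [(0.0, "C4", 80), (1.0, "E4", 80), (2.0, "G4", 90)]

      def schedule(score, base_bpm, tempo_factor):
          """Convert beats to seconds, stretched by the operator's live cue."""
          sec_per_beat = 60.0 / (base_bpm * tempo_factor)
          return [(beat * sec_per_beat, pitch, vol)
                  for beat, pitch, vol in score]

      # The singer slows down, so the operator dials tempo to 90% of speed.
      for t, pitch, vol in schedule(score, base_bpm=120, tempo_factor=0.9):
          print(f"t={t:.2f}s  play {pitch} at volume {vol}")
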
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "The 64-Bit Question"
    Technology Review (06/02/04); Garfinkel, Simson

    Personal computers are unlikely to migrate wholesale to the 64-bit computing architecture because that memory addressing capacity will not likely be needed by regular users, despite marketing claims from vendors. Past experience shows that greater memory addressing capability and wider registers--signified by the 8-bit, 16-bit, and 32-bit designations--are seldom a roadblock for creative computer designers. The first IBM PC used an Intel 8088 microprocessor that provided insufficient access to main memory from the moment it was shipped in 1981; to address the shortage, the 8088 came with segment registers whose contents were shifted left four bits and added to an offset, allowing effective access to up to 1MB of main memory instead of just 64KB. The 80286 came a few years later but included an emulation, or "real" mode, that allowed users to run older and more popular 8088 software, such as Microsoft DOS. Seldom were 286 chips used in their "protected" mode, where they could access 16MB of memory. The shift to the 32-bit processor came in 1985, and again many of these chips operated in "real" mode in order to make use of popular software; the chips ran faster not because of the extra memory access, but because they had an advanced silicon design. Only with Windows 95 did Microsoft finally take full advantage of the 32-bit PC capability. Today's PCs are not likely to move quickly to 64-bit computing because the gains from that move will not even be as apparent as the shift from 16-bit to 32-bit processing, since each additional address bit doubles the addressable memory. Computers with 32-bit chips can access up to 4GB of main memory for each application, far more than is needed for most desktop tasks. The main reasons 64-bit chips will continue to sell in the marketplace are that they justify higher margins for chipmakers and could serve some future computer applications, such as virtual reality or complex simulation on the desktop.
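
    The 8088 segment trick described above comes down to one line of arithmetic: a 16-bit segment register is shifted left four bits (multiplied by 16) and added to a 16-bit offset, yielding a 20-bit physical address and hence 2^20 = 1MB of reachable memory. A worked example in Python:

      def real_mode_address(segment, offset):
          """8088 real mode: physical = segment * 16 + offset."""
          return (segment << 4) + offset

      print(hex(real_mode_address(0xFFFF, 0x000F)))  # 0xfffff, the 1MB ceiling
      print(2 ** 16)   # 65,536: a flat 16-bit address space (64KB)
      print(2 ** 20)   # 1,048,576: the segmented 8088's reach (1MB)
      print(2 ** 32)   # 4,294,967,296: the 4GB per-application 32-bit limit
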
    Click Here to View Full Article

  • "Fired Up for the Supercomputer Derby"
    Business Week (06/07/04); Port, Otis

    The Defense Advanced Research Projects Agency (DARPA) has selected a trio of proposed petaflop supercomputing architectures for development into prototypes; the winning design is slated for real-world introduction in 2009. One contender, Sun Microsystems' Hero architecture, is designed with chips stacked close together to support wireless communication, which could potentially double the flow of data. The Cascade architecture from Cray will integrate vector chips that process strings of related numbers with "lightweight" chips that process scalar code and feature a processor-in-memory layout. IBM's proposal will boast reconfigurable chips, the brainchild of University of Texas at Austin researchers Stephen W. Keckler and Douglas Burger. The chips can switch between vector and scalar code processing because they can be rewired instantly: "On the fly, the chip could flip from vector to scalar and back to vector, whichever would be best for the code that's about to run," explains Burger. Furthermore, the chips could be manufactured in large volumes because the underlying circuitry is unchanged. DARPA is also supporting the development of new benchmarking and efficiency metrics so that prospective purchasers can evaluate the advantages and disadvantages of different systems. DARPA has teamed with other federal entities such as the Energy Department, NASA, and the National Security Agency to sponsor the contest between the proposed IBM, Cray, and Sun petaflop systems.
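
    A minimal illustration of the scalar-versus-vector distinction these designs exploit, using NumPy array operations as a software stand-in for vector hardware; this is an analogy, not the instruction set of any of the three machines:

      import numpy as np

      a = np.arange(8, dtype=np.float64)
      b = np.arange(8, dtype=np.float64)

      # Scalar style: one element per step, the workload of
      # "lightweight" scalar units.
      scalar_out = np.empty_like(a)
      for i in range(len(a)):
          scalar_out[i] = a[i] * b[i]

      # Vector style: one operation applied across a whole string of
      # related numbers, the workload vector units are built for.
      vector_out = a * b

      print(np.array_equal(scalar_out, vector_out))  # True: same result
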
    Click Here to View Full Article

  • "Portable Power Supports the Digital Battlefield"
    Military & Aerospace Electronics (05/04) Vol. 15, No. 5, P. 1; Ames, Ben

    Today's troops carry an average of five electronic devices into battle, and their options for keeping them operational are limited to either packing extra batteries or recharging the equipment with diesel generators on High Mobility Multipurpose Wheeled Vehicles. Engineers are investigating alternate sources of power in order to lighten soldiers' loads and extend the usefulness of their electronics. Fuel-cell technology is one area of concentration, although current commercial products can only recharge a mobile phone or Palm-type handheld; Mike DiBiase of General Dynamics C4 Systems reports that large-scale fuel cells that can power servers, laptops, and other large computers are three to five years away from mainstream use. A report from Frost & Sullivan concludes that "The dynamic battery market for military equipment is offering vendors significant opportunities as research and development-oriented contracts are given by the government to continuously enhance existing battery chemistries." The market growth is being spurred by safety issues with traditional military batteries, which can detonate under certain circumstances and also hinder soldiers' mobility and stamina in battlefield situations. Rechargeable lithium ion and lithium/manganese dioxide batteries under development by researchers are cheaper and boast higher power density than current models; Oak Ridge MicroEnergy is pursuing batteries with extra power density as well as rechargeable thin-film batteries that are well-suited for wireless sensors, security cards, radio frequency identification tags, and chip memory backup. Other projects focus on portable power generation technology such as photovoltaic fabrics that tap sunlight. Such technologies could reduce troops' weight burden, which Army Natick Soldier Center director Philip Brandler describes as "an ever-increasing problem, as the electronics behind future warrior systems become more sophisticated, complex, and reliant on portable battery power."
    Click Here to View Full Article

  • "In the Eye of the Beholder"
    IEEE Spectrum (05/04) Vol. 41, No. 5, P. 24; Lewis, John R.

    Using lasers or light-emitting diodes (LEDs) to project images directly onto the viewer's retina is not only superior in power efficiency to PC monitors, but could revolutionize gaming, medicine, and other industries. Microvision is developing and marketing cutting-edge products derived from scanned-beam display technology, such as the Nomad Expert Technician System. The product consists of a head-mounted display with a flat window designed to reflect scanned laser light into the user's eye, and the beam inflicts no physical harm on the user because the power of the laser light is restricted to about 1/1000 of a watt. The system is used by automotive service technicians to keep track of repair data on the job, and Microvision's see-through, laser-based display has been tested by medical personnel as a surgical enhancement technology. Scanned-beam displays are composed of electronics, light sources, scanners, and optics, and a modular architecture allows these components to be combined into different products. A key challenge is to point the light beam into the eye, which is constantly moving, and this can be facilitated by focusing the beam onto an exit pupil expander. Collecting and focusing light down into a pinpoint is a tough job for an LED, but edge-emitting LEDs and surface-emitting LEDs have helped increase brightness, and further advances should improve brightness even more. Scanned-beam displays will also perform more efficiently with the anticipated doubling of memory density and processor power every two years.
    Click Here to View Full Article

