ACM TechNews sponsored by Parasoft    Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 736:  Wednesday, December 29, 2004

  • "Computer Glitches Have Airline History"
    Baltimore Sun (12/28/04) P. 6D

    Computer glitches forced Delta-subsidiary Comair to cancel its entire schedule of 1,100 Christmas Day flights, underscoring the industry's reliance on unstable computer systems, which caused smaller-scale outages at other carriers this year. In August, incorrect inputs into a computer system caused a cascading shutdown of American and U.S. Airways systems, resulting in several hours of flight groundings and delays; in July, Northwest was forced to cancel over 120 flights after a power outage caused a computer system failure; and in May, 40 Delta Air Lines flights were cancelled due to computer problems. Computer security expert Bruce Schneier says airlines could do more to secure their computer systems by installing ultra-reliable software such as that used by NASA and medical facilities. Backups and upgraded systems to handle more capacity are also available, but Schneier says all of these options cost too much money to make sense economically. It is easier for companies to absorb the losses caused by such outages, including compensating customers and paying for employee overtime. AMR's Tim Wagner, whose firm owns American Airlines, says his group has implemented backups to guard against future outages such as the one suffered in August. For all their faults, computer systems are necessary to provide the efficiency and level of service customers enjoy today, he notes. Schneier warns that fault-ridden computer systems could be intentionally exploited, putting more than revenue at risk. He says, "If this kind of thing could happen by accident, what would happen if the bad guys did this on purpose?"
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Code Sleuths"
    Boston Globe (12/27/04) P. C1; Weisman, Robert

    Companies are using open source software more often, but are also ramping up audits of their proprietary software to make sure there is no license infringement. The problem is relatively new and was highlighted by SCO Group's $1 billion lawsuit against IBM in 2003 and subsequent legal action against other major Linux users. Industry experts say the code-auditing efforts to ferret out open source code are similar to the Sarbanes-Oxley financial reporting rules that require companies to value their software and assess legal risk for those products. Some firms have responded to the threat by banning open source software altogether, but others are trying to adapt with scanning software and customized search engines that review software before it is released. A number of firms have established new businesses meant to help companies check their products for open source code, especially code under the General Public License (GPL), which requires public release of source code that uses components previously covered under the GPL. Sometimes the GPL code is intrinsically embedded within a software product and cannot be removed without serious redesign; more ominously, many firms are not aware of open source code in their software and could be subject to lawsuits, a threat exacerbated by the fact that offshore programmers often mix open source code components with their own code they are developing for U.S. companies. Black Duck Software in Waltham, Mass., helps firms identify open source code by checking software against a massive repository of open source code, and highlights lines that are identical to open source products in different colors to mark their origin. The threat posed by embedded open source code is serious but manageable, and needs to be accepted as a cost of doing business, says LCS Telegraphics President Robert Dezmelyk.
    Click Here to View Full Article
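    The line-matching approach described above can be sketched as hashing normalized source lines and checking them against an index built from known open source code. This is a simplified illustration, not Black Duck's actual algorithm; the function names and the length threshold below are assumptions for the sketch.

```python
import hashlib

def normalize(line: str) -> str:
    """Collapse whitespace and lowercase so cosmetic edits don't defeat matching."""
    return " ".join(line.split()).lower()

def fingerprint_file(path: str) -> dict:
    """Map a hash of each normalized, non-trivial line to its line numbers."""
    prints = {}
    with open(path) as f:
        for num, line in enumerate(f, start=1):
            norm = normalize(line)
            if len(norm) > 20:  # skip braces, blank lines, and other noise
                digest = hashlib.sha1(norm.encode()).hexdigest()
                prints.setdefault(digest, []).append(num)
    return prints

def flag_matches(candidate: dict, repository: dict) -> list:
    """Return candidate line numbers whose hashes also appear in the repository."""
    flagged = []
    for digest, lines in candidate.items():
        if digest in repository:
            flagged.extend(lines)
    return sorted(flagged)
```

    A real scanner would also match multi-line fragments and tolerate renamed identifiers, but even this per-line form shows why verbatim copying is easy to detect at scale.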

  • "Internet Lib Group Backs Anonymity Project"
    TechNewsWorld (12/27/04); Mello Jr., John P.

    The Electronic Frontier Foundation (EFF) this week officially declared its support for the Tor Project, an effort that has yielded an open-source application for helping users maintain their anonymity while they surf the Web. EFF's Chris Palmer says that supporting Tor will help protect the constitutional right to anonymity. He cites the Google search engine's practice of tracking users' IP addresses and tagging them with unique user IDs in cookies that reside on the users' hard drives. Palmer argues that IP address collection can be synonymous with harvesting personal ID data, which Tor subverts. The Tor Project's Nick Mathewson says that Tor passes a conversation through several servers, or onion routers, encrypting it with several keys at each stage so that only a segment of the path the communication travels is perceived by each router. He says, "The goal is to prevent communicating parties and possible eavesdroppers or attackers from learning who is talking with whom over a network like the Internet." Tor Project leader Roger Dingledine says the goal of the project is to furnish a privacy and anonymity platform for law-abiding users, although some protective measures targeting malicious Netizens have been embedded. "By default, most Tor servers on the Internet will refuse to send email so they can't be abused by spammers," Palmer says.
    Click Here to View Full Article
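    The layered encryption Mathewson describes can be sketched with a toy cipher: the sender wraps the message in one layer per relay, and each relay strips exactly one layer, so no single relay sees both the sender and the plaintext destination. The XOR "cipher" below stands in for real cryptography purely for illustration; actual Tor uses TLS links and per-hop symmetric keys negotiated with each relay.

```python
from itertools import cycle

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key (self-inverse)."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def build_onion(message: bytes, relay_keys: list) -> bytes:
    """Wrap the message in one layer per relay. The first relay's layer is
    applied last, so it is the first one peeled off en route."""
    onion = message
    for key in reversed(relay_keys):
        onion = xor_layer(onion, key)
    return onion

def route(onion: bytes, relay_keys: list) -> bytes:
    """Each relay in turn strips one layer; only the exit sees the plaintext."""
    for key in relay_keys:
        onion = xor_layer(onion, key)
    return onion
```

    After the first relay removes its layer, the payload is still ciphertext under the remaining relays' keys, which is the property that limits each router to "a segment of the path."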

  • "A Scarecrow With Brains"
    Design Engineering (12/16/04)

    The fruit industry could potentially reap millions of dollars in annual savings thanks to software developed by a pair of University of South Australia (UniSA) researchers. Melanie Symons and Chris Clark's software digitally processes bird-calls recorded by microphone and matches them to the call traits of birds in a computer library. Once a "problematic" species (i.e., a bird that threatens crops) is identified, the software triggers a species-specific technique to scare the bird away. "Our research has suggested that scaring techniques need to be species specific and more than one scaring technique is needed to 'maximize scaring efficiency,'" notes Clark. "If you randomly combine auditory and visual scaring techniques the number of unique scaring combinations increases and the birds won't become accustomed to them as easily." The research chiefly concentrated on the Adelaide Rosella, a major pest for cherry farmers, some of whom still lose at least 30 percent of their crop to the species even with existing bird scaring methods. Symons and Clark say their software could eventually be commercialized and incorporated into an economical hardware unit for fruit orchards. The students devised the software as part of their engineering studies at the Mawson Lakes campus of UniSA.
    Click Here to View Full Article
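    The identify-then-respond pipeline can be sketched as nearest-neighbor matching of a recording's acoustic signature against a library, gated by a distance threshold. The signatures (dominant frequency per time slice), threshold, and scare-tactic names below are hypothetical stand-ins; the article does not describe the UniSA classifier's actual features.

```python
import math

# Hypothetical call "signatures": dominant frequency (Hz) per time slice.
CALL_LIBRARY = {
    "adelaide_rosella": [2100, 2400, 2250, 1900],
    "magpie": [900, 1100, 1000, 950],
}

def distance(a, b):
    """Euclidean distance between two equal-length frequency tracks."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(recording, library=CALL_LIBRARY, threshold=500.0):
    """Return the closest library species, or None if nothing is close enough."""
    best_species, best_dist = None, float("inf")
    for species, signature in library.items():
        d = distance(recording, signature)
        if d < best_dist:
            best_species, best_dist = species, d
    return best_species if best_dist <= threshold else None

# Species-specific responses, per Clark's point that one tactic is not enough.
SCARE_TACTICS = {"adelaide_rosella": "predator_call_plus_strobe"}

def respond(recording):
    """Trigger a species-specific scaring technique for a recognized pest."""
    species = identify(recording)
    return SCARE_TACTICS.get(species)
```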

  • "TRN's Top Picks"
    Technology Research News (01/05/05)

    Easier to use, more interactive computer interfaces constitute a notable research focus of the past year, with significant advances including NASA speech recognition technology that reads the user's throat nerve activity rather than acoustics; a user-reconfigurable control knob from Sony; a collaborative venture between England's University of Cambridge and Light Blue Optics to embed holographic projectors in numerous gadgets; and a combination handheld projector/radio frequency identification tag reader from Mitsubishi Electric Research Labs. Quantum cryptography breakthroughs this year include an always-on six-node quantum cryptography network for exchanging secure keys constructed by BBN Technologies, Boston University, and Harvard researchers. Other research in 2004 concentrated on data access and retrieval, and on data processing and presentation. Access/retrieval achievements include software from MIT and Cornell that taps the topic framework of whole articles to produce more accurate automatic summaries, while processing/presentation accomplishments include a robot scientist from U.K. researchers that can theorize, test theories through experimentation, and interpret the results. Breakthrough Internet research includes the University of Michigan's Small-World Instant Messaging system that recognizes expertise and directs queries appropriately; a Cornell and Internet Archive collaboration that measures users' responses to item descriptions by dividing the number of users who download an item by the number of users who read the description; and research from Germany's Max Planck Institute demonstrating that cascade failures caused by attacks on large network nodes could be averted by deactivating peripheral nodes. A significant achievement in the data storage sector is Boston University's development of a tiny mechanical memory cell whose high rate of vibration would allow it to switch on and off at speeds comparable to electronic memory chips, yet preserve data while deactivated as well as with radiation present.
    Click Here to View Full Article

  • "Worming Into Apple"
    Wired News (12/27/04); Kahney, Leander

    Silicon Valley has had a long history of "skunkworks," in which engineers engage in pet projects hoping they will be turned into products, even if the projects were previously terminated; some companies, seeing the potential value of such efforts, willingly turn a blind eye to this practice. Such was the case with programmer Ron Avitzur's Graphing Calculator software, a brainchild that was cancelled when Avitzur lost his job at Apple Computer in the early 1990s. Avitzur decided to continue the project by sneaking into Apple's California headquarters for six months, and paying subcontractors out of his savings to refine the software. He enlisted another unemployed programmer, Greg Robbins, as a collaborator, and both programmers worked in vacant offices and evaded Apple security. Key to the project's success was gaining access to Apple's PowerPC chip, and prohibitions on taking prototype machines out of the office made sneaking in the pair's only real option. Avitzur says he undertook the project out of altruistic concerns: "We really thought this was the most important software to get to schools and the best way to get it into schools was to get it installed on every machine," he notes. Eventually, Avitzur was able to persuade teams of testers and researchers to tweak the software, and Apple managers were impressed by the calculator's potential to highlight the Mac's speed. The company tested and documented Graphing Calculator, and licensed the software for a nominal fee to its inventors; the software, which is still included in every Mac, has shipped with 20 million units, by Avitzur's estimates.
    Click Here to View Full Article

  • "Ultrafast Supercomputer to Simulate Nuke Explosion"
    Reuters (12/26/04); Tanner, Adam

    The Comprehensive Test Ban Treaty of 1996 has made computer simulation the primary means of gauging the safety and reliability of stockpiled nukes, and Lawrence Livermore National Lab researchers will demonstrate how missiles dating back to the Nixon presidency would perform today through a 3D nuclear explosion model generated by IBM's Blue Gene/L supercomputer. Lab officials say the supercomputer's approximately $100 million price tag will make the seven-minute-long simulation seem economical compared to the production and promotion costs of the recent computer-animated movie "The Polar Express." Although Blue Gene/L will be capable of 360 trillion calculations per second once it is fully up and running, officials expect the simulation to take two to four months. Christopher Paine of the Natural Resources Defense Council wonders whether test simulations are so essential to U.S. defense requirements in a post-Cold War environment. He sees less of a need for guaranteeing the simulations' accuracy and more of a need for simply assuring that the weapons detonate. Associate director for defense and nuclear technologies Bruce Goodwin counters that the reliability of any one nuclear warhead is critical, in light of negotiated stockpile cutbacks expected in the coming years. He also notes that ensuring warhead safety is important, given that scientists say there are a lot of unknowns regarding what happens to a nuclear bomb's components over time. The performance of the software for simulating the nuclear blast is another point of concern for critics, who argue that flaws are unavoidable and feel that nuclear scientists need increased oversight.
    Click Here to View Full Article

  • "Apache Avalon Project Closes Down"
    InfoWorld (12/23/04); Krill, Paul

    Former Apache Avalon project committee Chairman Aaron Farr confirmed on Dec. 22 that the Avalon project was terminated and disbanded last month following the emergence of internal parallel development initiatives. "Basically, there had been disagreements about the direction the project should take," he explained. "We eventually decided rather than waste resources on infighting, there was more than enough space for the projects to exist within their own project spaces." The goal of the Avalon project was to create an Inversion of Control (IoC) architecture for container programming, with a container managing the lifecycle of the components it hosts and supplying their dependencies. Farr says the originally Java-oriented effort was updated to include C# as a target platform. He now serves on the management committee of the Apache Excalibur project, an Avalon spinoff that functions as a container that can be inserted within an application. Other offshoot projects include Castle, designed as a C#-based IoC framework and container managed under the auspices of SourceForge; Loom, a microkernel container overseen by the Codehaus open source concern; and Metro, a J2EE platform alternative administered by the Digital Project Meta Library, an organization founded by former Avalon developers.
    Click Here to View Full Article
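    The Inversion of Control idea behind Avalon can be sketched in a few lines: components declare what they depend on, and the container, not the component, decides when and how those dependencies are constructed and wired. The sketch below is in Python rather than Avalon's Java or C#, and the `Logger`/`Service` components are hypothetical illustrations, not Avalon APIs.

```python
class Container:
    """Minimal IoC container: resolves registered components on demand,
    constructs their dependencies first, and caches singleton instances."""

    def __init__(self):
        self._factories = {}
        self._instances = {}

    def register(self, name, factory, depends_on=()):
        self._factories[name] = (factory, tuple(depends_on))

    def resolve(self, name):
        if name in self._instances:
            return self._instances[name]
        factory, deps = self._factories[name]
        instance = factory(*(self.resolve(d) for d in deps))
        self._instances[name] = instance
        return instance

# Hypothetical components: Service never constructs its own Logger.
class Logger:
    def log(self, msg):
        return f"[log] {msg}"

class Service:
    def __init__(self, logger):
        self.logger = logger

    def run(self):
        return self.logger.log("service started")
```

    Usage: register both components, then `resolve("service")`; the container builds the `Logger` first and injects it, which is the control inversion the article refers to.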

  • "Isolation Nation"
    San Diego Union-Tribune (12/27/04); Sidener, Jonathan

    The convenience of computerized automation carries the price of social disconnection, with popular technologies and virtual applications such as the Internet, ATMs, automated checkout machines, email, and online dating services eliminating people's desire for face-to-face interaction. "Your world shrinks if technology helps you associate only with people who think the way you do and work the way you do," remarks retired professor, author, and urban sociologist Ray Oldenburg, who explains that casual social interaction is a vital element of society's makeup. Such interactions take place at informal meeting places, or "third places," whose significance is dwindling in the face of proliferating automation. On the other hand, Carnegie Mellon University professor of human-computer interaction Robert Kraut sees benefits in the automation of some forms of human contact, such as greater efficiency and improved quality of service. At the same time, he acknowledges that automation cannot compete with live interaction in certain scenarios, customer service being a case in point. In a study investigating the link between Internet habits and depression, Kraut determined that depression was less frequent among people with a healthy social circle who use the Net to maintain contact with friends and relatives, and more frequent among people who use the Internet to meet people and forge new friendships. Intel cultural anthropologist Genevieve Bell observes that the automation trend is conspicuously American, and primarily driven by the importance of self-sufficiency in the American cultural mindset. She says Americans in general feel obligated to automate routine, repetitive tasks, whereas people in other parts of the world often feel a social need to hire others rather than automate jobs.
    Click Here to View Full Article

  • "Facing an E-Waste Mountain"
    BBC News (12/28/04); Geoghegan, Tom

    The United Kingdom is facing a mounting e-waste problem, with an estimated 6 million electrical items buried in landfills across the country. Just 15 percent of mobile phones are recycled, partially because few people are aware that recycling programs exist. Though electrical manufacturers will be responsible for recycling returned products once the European Union's WEEE directive goes into effect in August 2005, consumers have no responsibility for returning discarded items. The EU is also pushing to eliminate hazardous materials from all electrical goods by 2006. Biffa Waste Services' Bill Conran warns that because the British government has not translated the WEEE directive into a series of U.K. regulations, the waste industry is not prepared to manage increased e-waste recycling demands. He says, "The difficulty we have is investing in something which at this stage is largely unknown. We don't know the standards, we don't know when and we don't know how." Friends of the Earth waste campaigner Claire Wilton notes that U.K. ministers have received a proposal to set up a national clearinghouse that will manage a system of collection points, with the responsibility for taking away discarded items shared by electrical manufacturers. However, she is unsure that the government can establish such a system before the proposed deadline, while electrical companies are concerned that the directive will cost the industry 30 million euros annually, by one estimate. Another problem is the export of e-waste, which is often sent to developing nations to be broken down by laborers in unsafe conditions; enforcement measures against this practice are complicated by the e-waste's Mideast export route, along which its country of origin is concealed by doctored documentation.
    Click Here to View Full Article

  • "The BEST Solution for IT Implementation"
    IST Results (12/24/04)

    The IST-funded BEST project on Better Enterprise System Implementation spent 28 months researching more than 250 IT system deployment scenarios in order to learn why many implementations fail and formulate an assessment tool and methodology for ensuring success. Project partner Peter Dodds says many of the failures uncovered by BEST were caused by organizations discounting the preparation of the human aspect for the changes wrought by new technology. "Even when companies have undertaken similar projects in the recent past, the experience and knowledge gained seems to get lost somewhere," he notes. BEST focuses on organizations either choosing an IT product or embarking on change management, and employs a diagnostic tool to evaluate the project's health and thus increase the likelihood of a successful deployment, says Dodds. The BEST assessment is the end product of a series of meetings, workshops, and face-to-face interviews at all organizational levels, and the assessment tool pinpoints cause-events-actions-outcome chains in similar scenarios through context questions. Each interconnected chain symbolizes decision areas that entail or eliminate risks. A dozen companies in 10 European nations have tested the BEST tool, and response has been generally positive. "Managers appreciated very much the fact, that thanks to the assessment methodology, difficulties or risks could be clearly defined and a more focused action being more tailored on the problem could be taken," notes project coordinator Dr. Ip-Shing Fan of Cranfield University in England.
    Click Here to View Full Article

  • "China Starts Up World's Biggest Next-Gen Internet Network"
    TechNewsWorld (12/27/04); Lyman, Jay

    The China Education and Research Network (CERNET) has launched the world's largest IPv6 network, linking 25 universities in 20 Chinese cities, increasing bandwidth, and opening up vast Internet address resources. The new CERNET2 network normally transmits data at speeds between 2.5 Gbps and 10 Gbps, but achieved 40 Gbps speeds during a trial run earlier this month. China, South Korea, and Japan are collaborating on IPv6 deployment because those countries are in need of IP addresses, which were distributed unevenly in the early days of the Internet. The University of California has more allocated IP addresses than the entire country of China, which today boasts 80 million Internet users. IPv6's 128-bit addressing scheme will make IP address allocation a non-issue and is the main driver for projects such as CERNET2, says analyst Zeus Kerravala. A number of large Asian firms are actively working on IPv6 migration, including Fujitsu, Hitachi, NEC, and Samsung, and the countries involved have hinted at possibly making IPv6 mandatory in their region. U.S. industry and government are far less enthusiastic about the move because the U.S. holds 74 percent of the IP addresses under the old IPv4 framework. Kerravala predicts it could be up to seven years before the U.S. begins serious migration to IPv6. Some administrators and users have had problems dealing with IPv6, he says; for instance, because many IP address management tools are custom-built, they are not optimized to deal with IPv6 resources.
    Click Here to View Full Article
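    The scale gap between IPv4's 32-bit and IPv6's 128-bit address spaces is easy to make concrete with Python's standard `ipaddress` module; the `2001:db8::/48` prefix below is the reserved documentation range, used here as a stand-in for a typical site allocation.

```python
import ipaddress

# Total address spaces: 32-bit IPv4 vs 128-bit IPv6.
ipv4_total = 2 ** 32   # about 4.3 billion addresses
ipv6_total = 2 ** 128  # about 3.4e38 addresses

def addresses_in(prefix: str) -> int:
    """Count the addresses covered by a CIDR prefix."""
    return ipaddress.ip_network(prefix).num_addresses

# A single IPv6 /48 site allocation (2**80 addresses) dwarfs all of IPv4,
# which is why allocation becomes a non-issue under the new scheme.
site = addresses_in("2001:db8::/48")
```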

  • "At IBM, That Google Thing Is So Yesterday"
    New York Times (12/26/04) P. BU7; Fallows, James

    IBM researchers are developing what they call the third generation in search technology, which will search according to the underlying meaning, and not simply by keywords, page rank, or other peripheral indicators. IBM plans to employ natural language processing for "discovery systems" that can search through files whether they are structured as Web documents, databases, images, video, or audio. Although search technology has advanced significantly in just the past several months, especially with Google committed to new features and products every few weeks, the inherent limitations of Google-style search are apparent when trying to answer nuanced questions: For instance, though most Web searches are proficient in finding Mozart's birthplace, trying to find out how many Web pages are published in each language requires some extra effort on the part of the user. IBM researchers recently showed off their developing technologies at the Thomas J. Watson Research Center, including a system called Piquant that was able to answer questions from an article, even though the answers were not phrased exactly within the article. Piquant analyzed the semantic structure to answer "Who is Canada's prime minister?" even though those exact words were not used in the text. The Semantic Analysis Workbench automatically translates data to reveal useful patterns, and the researchers gave several examples to show how this would be useful. Customer service centers could analyze customer service notes to identify trends or problems, genomic research could benefit from unexpected correlations, and terrorist plots could be uncovered through the analysis of suspect telephone calls. Another technology would work in a manner similar to Google News, but return only headlines from non-English publications and provide a rough English translation upon request.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Analysis: What's New for the PC of 2005?"
    IDG News Service (12/21/04); Williams, Martyn; Krazit, Tom

    Little change is expected between today's PCs and next year's PCs, with the anticipated upgrades being mostly incremental. Stephen Baker with NPD Techworld reports that Intel and Advanced Micro Devices will introduce dual-core chips by the end of next year, but their incorporation in mass-market PCs before 2006 is unlikely. He also expects a new family of Intel chipsets supporting Double Data Rate 2 (DDR2) memory and the PCI Express interface to reach mainstream or approximately $800 systems in 2005; DDR2 will enable memory chips to transfer data at higher clock rates, while PCI Express will ramp up the speed of data transmission between the chipset and peripheral hardware. The debut of Intel's next-generation Alviso chipset in early 2005 will mark the introduction of PCI Express and DDR2 into notebooks, and IDC's Roger Kay predicts that notebook sales will continue to overtake desktop sales next year. More PCs with Microsoft's Windows XP Media Center Edition 2005 operating system will be shipped, and Baker expects to see $800 PCs with roughly 200 GB of storage in 2005, compared to current PC hard drive storage capacities of 80 GB to 120 GB. Incremental updates in DVD read and write speeds are foreseen, while the market debut of PC drives supporting the Blu-ray Disc and HD-DVD formats should be in late 2005. Also gaining momentum is Serial ATA disk technology, whose Native Command Queuing feature delivers a significant performance increase by enabling a drive to handle multiple PC commands in the most efficient order. Major PC upgrades are not expected until the consumer rollouts of Microsoft's Longhorn operating system and 64-bit applications and optical drives based on HD-DVD and Blu-ray, which are slated for 2006.
    Click Here to View Full Article
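    The Native Command Queuing idea of servicing queued commands "in the most efficient order" can be sketched as an elevator (SCAN) scheduler: sweep outward from the current head position, then pick up the requests left behind. This is an illustrative model of the scheduling principle, not the actual firmware algorithm, which also accounts for rotational position.

```python
def service_order(head: int, requests: list) -> list:
    """Toy elevator (SCAN) scheduling: service queued block requests in
    ascending order from the current head position, then sweep back down
    through the remaining requests, minimizing back-and-forth seeks."""
    ahead = sorted(r for r in requests if r >= head)
    behind = sorted((r for r in requests if r < head), reverse=True)
    return ahead + behind
```

    With the head at block 50 and requests at 10, 70, 20, 90, and 60, the drive would service 60, 70, 90, then 20 and 10, instead of thrashing across the platter in arrival order.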

  • "Open Source Concepts Changing Software Industry"
    Oregon State University News (12/06/04); Stauth, David; Davis, Diane

    Oregon State University (OSU) launched the nation's first university-based open source laboratory in February, which has since helped save OSU money, increased educational opportunities for students, and contributed to the launch of the Mozilla Firefox 1.0 browser. The concept of organized academic contribution to open source development is catching on, and many IT hardware vendors already have significant resources dedicated to furthering open source software. OSU is partnering with schools in Hawaii and Indiana to create open source financial software that could be shared among the higher education community. OSU Open Source Laboratory associate director Scott Kveton says universities are not in the business of developing their own code, so they may as well share resources with other groups. More than half of OSU's IT infrastructure runs on open source products, including its email, Web servers, and domain name management. The laboratory is also helping to distribute the increasingly popular Mozilla Firefox browser; all download requests for Firefox are routed to mirror sites by OSU. Roughly 6 million downloads were registered in the first month, and OSU has received donations of bandwidth, equipment, and other resources from the non-profit Mozilla organization and others. Specific funding is provided by university constituencies or outside groups that contract the OSU Open Source Laboratory to develop a certain capability. Open source is more a philosophy for software development than a business enterprise and syncs well with higher education's goals of developing people and communities, says Kveton.
    Click Here to View Full Article

  • "Life Interrupted: Plugged Into It All, We're Stressed to Distraction"
    Pacific Northwest (11/28/04); Seven, Richard

    Scientists are concerned that the Information Age is nurturing "cognitive overload," an umbrella term for the malaise people feel as a result of distraction, stress, multitasking, and data congestion related to increasingly sophisticated technologies. People multitask because it is expected, encouraged, and considered vital, yet cognitive scientist David Meyer reports that truly effective multitasking is beyond people's capabilities. He also notes that constantly switching between tasks reduces productivity, is fatiguing, and can impact long-term health. Meanwhile, some scientists claim technology and information overload are reducing people's attention spans. Former Xerox Palo Alto Research Center director John Seely Brown remarks that an overemphasis on computing firepower has led to products based on "tunnel design" that cause cognitive overload. Professor David Levy of the University of Washington's School of Information is attempting to establish the Center for Information and the Quality of Life as a lab for probing workspaces and redesigning them to address well-being and labor in equal measure. Levy sees no point in blaming technology for the current angst, when the real culprit is imbalance brought about by the way technologies are applied. Levy says, "Part of what's missing from our discussion about technology, even the technology in relation to our lives, is a more positive vision of where we're trying to get to. What are the measurements and criteria of well-being in the workplace? How do we even begin to talk about that?" John de Graaf, national coordinator of the Take Back Your Time organization, says productivity gains facilitated by technology are increasingly encroaching on leisure time: "We are not only working faster but even longer, and filling our limited leisure with busy activities, leading to an increasing sense of time poverty," he asserts.
    Click Here to View Full Article

  • "2004: A Year of High-Tech Changes"
    InfoWorld (12/20/04) Vol. 26, No. 51, P. 12

    Several important, unexpected events occurred over the past year in the technology sector: IBM sold its PC business to Chinese manufacturer Lenovo, marking the entrance of Chinese firms on the global market as well as solidifying IBM's new role as an IT services firm instead of hardware manufacturer. Gartner has predicted further PC-sector consolidation as a result of the deal. Oracle won its bid for PeopleSoft, with Oracle President Charles Phillips promising to integrate the two firms by the end of the year. Open-source technology has become a default option for enterprises when considering new deployments, says Wells Fargo infrastructure architecture team lead Eric Friedman; the Linux 2.6 version kernel offers more enterprise-class capabilities, and that operating system is part of the reason Sun Microsystems and Microsoft formed a landmark agreement to make their technology interoperable. As expected, Microsoft delayed the release of its Longhorn Windows project and said it would not include the WinFS file system, but still promises that Avalon, Indigo, and other critical components will be included. Overall trends for 2004 include strategic customer focus, more business intelligence software to help with government regulations, and more holistic security solutions. Service-oriented architecture (SOA) has emerged as one of the most critical new application deployments, and all of the major middleware vendors are working to establish themselves as leaders in the market. Web services also gained a boost from the de facto standardization of BPEL for Web services and the increased traction of the Web Services-ReliableMessaging specification. Storage technologies are quickly advancing, and new virtualization products are available from IBM and will be soon from EMC.
    Click Here to View Full Article

  • "Engineered Enhancers Closer Than You Think"
    EE Times (12/27/04) No. 1352, P. 1; Murray, Charles

    Some scientists see enhancement applications in developing technologies such as electroactive polymers and nanobots that can be injected into the bloodstream. Future enhancers will not be as dangerous as today's anabolic steroids, but will prompt tremendous moral and legal debate before finally being accepted as inevitable. Although some dismiss visions of bionic eyes or Olympic Games for enhanced athletes as science fiction, technologies currently in use could become important human enhancers. Electroactive polymers flex in response to electrical charges and are seen as a possibility for future artificial muscle implants; the technology is used to wipe the lenses of exploratory robots, and an electroactive polymer-powered mechanical arm is being pitted against a 16-year-old girl in an arm-wrestling contest at the upcoming Smart Materials and Structures Conference. The researchers involved in that project see it being included in the Defense Advanced Research Projects Agency's exoskeleton project, which aims to create robotic enhancements so soldiers can carry heavier loads, run faster, and jump higher. Visual augmentation, such as bionic eye implants that allow "zoom vision," could become common 30 years from now, says Jerome Glenn of the American Council for the United Nations University; electronic devices can be positioned behind the eye and coupled with artificial lenses to allow long-distance sight. Nanobots deployed in people's brains could release chemical stimulants that speed anticipation and response. The military is already reportedly identifying DNA components linked to favorable traits in elite soldiers such as Navy Seals, says Arlington Institute futurist and former National Security Council staff John Petersen.
    Click Here to View Full Article

  • "Portable Projectors"
    Technology Review (12/04) Vol. 107, No. 10, P. 72; Zacks, Rebecca

    Mitsubishi Electric Research Laboratories research scientist Ramesh Raskar believes miniprojectors could be a panacea to the growing problem of shrinking displays on handheld devices. He leads a team that has devised hardware and software for projecting digital images onto any available surface. The prototype device aims the projected image using four red lights taped onto the surface as reference points, while four lasers beam red dots to areas just outside each corner of the image; Raskar notes that a camera on top of the projector monitors all eight lights to determine how the surface and the projector relate to one another, so that the computer can adjust the image on the spur of the moment to maintain stability. Raskar's team has also developed a methodology that allows multiple projectors to shine a combined single image onto a curved surface, using cameras and algorithms to tell the computer each projector's orientation, what section of the image each device should project, and how to correct for surface distortion. Jeroen van Baar on Raskar's team says the technology can also compensate for image overlap: "We can find exactly which two or three or four pixels overlap with each other, and we can adjust accordingly," he says. The miniprojectors also have enormous potential for inventory management when used in conjunction with photosensor-equipped radio-frequency identification (RFID) tags affixed to items. Raskar has created an RFID scanner that projects a series of vertical and horizontal lines of varying thickness; when these patterns are registered by the tags' photosensors, the tags can ascertain their exact location and transmit that data back to the scanner, enabling the scanner to look up online product information for each tagged item and then project the data directly onto the item.
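    The projected stripes of "varying thickness" that let each tag find its own position are a form of structured light. A common encoding for such patterns is binary-reflected Gray code, where each projected frame carries one bit of the column number; the sketch below assumes Gray coding for illustration, since the article does not specify Mitsubishi's exact patterns.

```python
def gray_encode(n: int) -> int:
    """Standard binary-reflected Gray code: adjacent columns differ in one bit."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Invert the Gray code by folding the high bits back down."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def stripe_lit(frame: int, column: int) -> int:
    """Projected frame `frame` lights column `column` iff bit `frame` of the
    column's Gray code is set; one projected frame per bit of position."""
    return (gray_encode(column) >> frame) & 1

def decode_position(readings) -> int:
    """A tag's photosensor reports light/dark per frame (LSB first); the tag
    reassembles the bits and decodes the Gray code to find its column."""
    g = 0
    for k, bit in enumerate(readings):
        g |= bit << k
    return gray_decode(g)
```

    Gray coding is attractive here because neighboring columns differ in only one bit, so a sensor sitting on a stripe boundary misreads its position by at most one column rather than by an arbitrary amount.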
