
ACM TechNews sponsored by Parasoft    Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 731:  Wednesday, December 15, 2004

  • "Google Is Adding Major Libraries to Its Database"
    New York Times (12/14/04) P. A1; Markoff, John; Wyatt, Edward

    Google today is expected to announce a deal to digitize the contents of major U.S. research libraries and Oxford University, and make the material freely searchable over the Internet as part of Google's regular Web service. Involved parties estimate a $10 cost for digitizing each of the approximately 15 million documents that are covered under the arrangement, while librarians anticipate that the project could take at least 10 years to complete. The goal of the project is to expand the scope of the Web and establish a virtual card catalog and searchable archive. Other projects have similar aims: For example, the Library of Congress and a group of international libraries yesterday announced an initiative to build a publicly accessible digital library of 1 million books, 70,000 of which are expected to be online by April 2005. Stanford University head librarian Michael Keller predicts that "Within two decades, most of the world's knowledge will be digitized and available, one hopes for free reading on the Internet, just as there is free reading in libraries today." Internet Archive President Brewster Kahle hopes opening up the great libraries to anyone with a Web browser will help spur the creation of new material, while Daniel Greenstein of the University of California's California Digital Library says digitization would allow libraries to concentrate more on the collection and dissemination of knowledge than on the management of printed collections. Publishers, meanwhile, will have to find a way to sustain libraries' status as major buyers of their wares without letting digital libraries subvert their ability to reap profits through the commission and publication of authors' work. Google's agreement authorizes the full publication of works that have reverted to the public domain, while allowing only excerpts of copyrighted works to be accessible online.
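
    Taken at face value, the per-document estimate and collection size cited above imply a total digitization cost on the order of the following (a back-of-the-envelope figure derived only from the numbers in the article, not one the parties have announced):

        % Implied total, using only the per-document estimate and volume cited above.
        \[
          15{,}000{,}000 \ \text{documents} \times \$10/\text{document} \approx \$150\ \text{million}
        \]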

  • "Security Research Suggests Linux Has Fewer Flaws"
    CNet (12/13/04); Lemos, Robert

    The Linux operating system has far fewer flaws than average commercial software, concludes a Dec. 14 report from Coverity. The report is the end result of a four-year project that uncovered 985 bugs in the 5.7 million lines of code comprising Linux kernel version 2.6. Carnegie Mellon University estimates cited in an April report from the National Cybersecurity Partnership's Working Group on the Software Lifecycle indicate that a similarly sized commercial program typically contains between 5,700 and 40,000 bugs. Because Coverity had no access to the source code of Microsoft's Windows operating system, the study offers no data on bug frequency in that system, but the report will probably fuel arguments among Linux, Windows, and Mac OS X supporters over which system offers the highest level of protection. Coverity CEO Seth Hallem emphasizes that research on the latest Linux kernel iteration demonstrates a link between the open-source development process and the operating system's security. Microsoft evaluates its Windows code using analysis tools similar to those employed in the Coverity study, leading Hallem to surmise that Microsoft has probably lowered the number of flaws in its proprietary operating system. Code-analysis tools typically scan source code and flag potential defects that violate software-design principles. Coverity intends to supply regular bug analysis reports on Linux and allow the Linux developer community to access a summary of the results.
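
    The cited counts translate directly into defect densities; below is a minimal sketch of that arithmetic in Python, using only the figures quoted above (the bugs-per-KLOC framing is a common convention, not part of the Coverity report itself):

        # Illustrative arithmetic only; the bug and line counts come from the
        # figures cited above, and "bugs per KLOC" is a standard convention.
        LINUX_BUGS = 985
        LINUX_LOC = 5_700_000                     # lines of code in kernel 2.6
        COMMERCIAL_BUG_RANGE = (5_700, 40_000)    # CMU estimate for similar size

        def defects_per_kloc(bugs, loc):
            """Defects per thousand lines of code."""
            return bugs / (loc / 1_000)

        print(f"Linux 2.6: {defects_per_kloc(LINUX_BUGS, LINUX_LOC):.2f} bugs/KLOC")
        low, high = (defects_per_kloc(b, LINUX_LOC) for b in COMMERCIAL_BUG_RANGE)
        print(f"Commercial estimate: {low:.2f} to {high:.2f} bugs/KLOC")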

  • "Looking to Wireless for Growth, Tech Giants Seek More Spectrum"
    Wall Street Journal (12/15/04) P. A1; Squeo, Anne Marie

    Technology firms have stepped up lobbying efforts aimed at freeing more wireless spectrum, and the Bush administration and Congress are responding. Wireless technology is seen as a foundation for a new round of technology growth and innovation, but the old way of allocating spectrum has left much of that valuable commodity unused; broadcasters are unwilling to give up their spectrum rights as they move to digital transmissions, which would require less radio space than old analog technology. The Dec. 31, 2006 date for spectrum reassignment is expected to slide further back due to wrangling over this issue. Meanwhile, wireless spectrum is seen as a consolidating force in the growing wireless industry and is the basis for a speculated Sprint-Nextel merger. The Bush administration has called for government to give up some of its underused spectrum, and asked federal agencies to review their current spectrum usage in order to find efficiencies, while the Pentagon is also being asked to review its spectrum holdings, which are used for exclusive systems such as the Global Positioning System. Congress has created a "spectrum reallocation fund" that would reimburse the Pentagon and federal agencies for wireless space they cede to public or industry use. The FCC also plans to auction off more spectrum in January, including that acquired from the defunct NextWave Telecom. With more wireless space, the technology industry would be able to offer new high-speed wireless data services and voice service over the Internet. Michigan State University telecommunications professor Johannes Bauer says, "The potential for this market to expand quickly is here."

  • "A Plea for Support of Innovation"
    Washington Post (12/15/04) P. E3; Schneider, Greg

    A Dec. 15 report from the Council on Competitiveness warns that the United States' global economic leadership will flag unless government and industry take action designed to spur innovation. The council's 15-month National Innovation Initiative study concludes that "the capacity for innovation is going global--and we must pick up the pace." Among the report's recommendations are tax breaks for corporate research and development, a revamping of immigration laws to encourage skilled foreigners to pursue U.S.-based employment, a Defense Department commitment of 20 percent of its science and technology budget to long-term research, and tort reform to give companies an incentive to undertake riskier ventures without worrying about being sued for failed projects. The report also urges the Patent Office to tap its patent database and reemphasize quality to help nurture innovation. The council will map out its recommendations today, coinciding with a two-day economic summit hosted by President Bush, though council President Deborah Wince-Smith says the panel has a wider-ranging agenda: "Everything we looked at is in terms of how this can increase productivity and standard of living growth for all Americans," she reports. Among the problems the council attempts to address is the anxiety among U.S. workers over the growing number of talented foreigners willing to work for less money. Proposed solutions include privately funded scholarships for science and engineering majors, federally funded fellowships for grad students, and health care and pension reforms that allow benefits to be carried from job to job. The council also calls for a renewal of U.S. manufacturing technology through an exchange of best practices between companies and the creation of joint manufacturing centers.

  • "Building Thinking Robotics for the Real World"
    IST Results (12/15/04)

    Researchers at the IST program-sponsored Bayesian Inspired Brain and Artifacts (BIBA) project are uniquely applying Bayesian reasoning to the design of engineered artifacts that can learn to behave rationally with incomplete data. Bayesian reasoning, which infers the probability of present or future events from prior knowledge and observed evidence, is used by Internet search engines and voice-recognition applications to provide hierarchically ranked results based on previous queries. The BIBA project attempts to understand animal behavior using Bayesian principles and then apply the same reasoning to create real-world artifacts: BIBA scientific manager Pierre Bessiere of INRIA's GRAVIR laboratory notes that "Both living organisms and robotic systems face the difficulty of how to use an incomplete model of their environment to perceive, infer, decide and act efficiently." In the absence of information, humans and animals use past experience to anticipate occurrences and implement a course of action based on those predictions. BIBA researchers have so far developed three demo artifacts whose built-in, pre-specified environmental knowledge is restricted, enabling them to use evolutionary methods to discover part of their preliminary knowledge and then assess the results. One artifact is an automated video game avatar that uses probabilistic logic to learn by mimicry and imitate the human player's situational responses; another demonstrates an obstacle avoidance system for the computer-controlled Cycab to enable safe driverless transportation in streets rife with unpredictable variables. The third demo is the BIBA Bot, a machine that deposits tokens in a path that it can follow home, modeled on the pheromone-trail behavior of ants. The ultimate goal of the BIBA Bot experiment is to build a robot that employs probabilistic inference to contend with the complexity and unpredictability of a non-laboratory environment.
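
    For readers unfamiliar with the underlying rule, the following is a minimal, textbook illustration of Bayesian updating under incomplete information; the sensor probabilities are invented for the example and are not drawn from the BIBA artifacts themselves:

        # Textbook Bayesian update, not BIBA project code: fuse an unreliable
        # sensor reading with prior experience to judge whether an obstacle is ahead.
        def bayes_update(prior, likelihood, evidence_prob):
            """P(H | E) = P(E | H) * P(H) / P(E)."""
            return likelihood * prior / evidence_prob

        p_obstacle = 0.10              # prior: obstacles have been rare so far
        p_ping_if_obstacle = 0.90      # sensor usually fires on a real obstacle
        p_ping_if_clear = 0.20         # ...but it also gives false alarms

        # Total probability of getting a ping at all (law of total probability).
        p_ping = (p_ping_if_obstacle * p_obstacle
                  + p_ping_if_clear * (1 - p_obstacle))

        posterior = bayes_update(p_obstacle, p_ping_if_obstacle, p_ping)
        print(f"P(obstacle | ping) = {posterior:.2f}")   # about 0.33: slow down, but not certain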

  • "New Method Ranks Impact of Computer and Information Science Funding Agencies, Institutions and Individuals"
    Penn State Live (12/13/04)

    Researchers at Penn State's School of Information Sciences and Technology (IST) have devised a technique for automatically extracting and ranking the impact of funding agencies, institutions, and individuals from documents available on the Web and in the CiteSeer public digital library and search engine for computer and information science papers. By allowing people to determine which agencies are underwriting prominent computer and information science research, the method could be employed to assess the effectiveness of funding agencies and programs at both the national and international levels, as well as to measure the performance of a funding agency's individual research programs, says CiteSeer developer Dr. C. Lee Giles. Giles and IST doctoral student Isaac Councill detail the method in the latest issue of the Proceedings of the National Academy of Sciences. The research, co-funded by the National Science Foundation (NSF), involved the development of algorithms that extract acknowledgements from 335,000 CiteSeer-indexed documents and that demonstrated 78 percent precision and nearly 90 percent recall in tests against 1,800 manually labeled documents. Giles says the technique can be applied to academic papers in any field, noting that "it also allows us to add new metrics such as accounting for funding amounts and getting some idea of the impact for funds spent." Using the method, the researchers have determined that NSF heads the list of the 15 most often acknowledged funding agencies, followed by the Defense Advanced Research Projects Agency (DARPA)--a ranking that also holds for agencies acknowledged by the 100 most cited documents in CiteSeer. However, a metric of total citations to number of acknowledgements puts DARPA in the lead, followed by the Office of Naval Research, with NSF third.
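
    Precision and recall are standard extraction metrics; the sketch below shows the arithmetic with hypothetical counts chosen only to roughly reproduce the figures reported above, not the study's actual confusion matrix:

        # Hypothetical counts chosen to yield roughly 78% precision and about 90%
        # recall on a 1,800-document test set; not the researchers' raw data.
        true_positives = 780     # acknowledgments correctly extracted
        false_positives = 220    # spurious extractions
        false_negatives = 87     # acknowledgments the extractor missed

        precision = true_positives / (true_positives + false_positives)
        recall = true_positives / (true_positives + false_negatives)

        print(f"precision = {precision:.2f}")   # 0.78
        print(f"recall    = {recall:.2f}")      # 0.90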

  • "Physics Model Predicts Book Sales"
    Technology Research News (12/22/04); Patch, Kimberly

    Complex systems such as consumer markets, financial markets, or a population's health all respond in a similar manner when a significant event occurs, according to UCLA researchers. UCLA physics professor Didier Sornette found that sales of his stock market-analysis book peaked sharply after he gave an interview to a major Internet news outlet, but that the book's popularity fell sharply in the following days; the pattern is similar to the frequency of aftershocks after a major earthquake or the response of financial markets to news events. The Epidemic-Type Aftershock Sequence model is based on how disease spreads through a population and helps scientists model earthquake aftershocks: The model shows how information is passed through exogenous and endogenous means, and predicts the outcome of each information-diffusion path. In terms of book sales, exogenous inputs are like a New York Times article or an advertising billboard that spreads information quickly, while endogenous inputs linger in the network for some time as people spread word about the book. Sornette and colleagues tested the aftershock model on book sales data covering a period of two years, gathered by an automated software program that pulled data from Amazon.com and from another researcher also studying book sales. Out of 138 bestsellers, the model accurately predicted sales performance 84 percent of the time. Publishers and marketing firms could use the model to print the right number of books or to know when to start new advertising campaigns. Eventually, the researchers hope their study of complex non-physical systems will help shed light on complex physical systems such as the human body, which blends the effects of exogenous and endogenous input more subtly.
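
    As a rough illustration of the exogenous-versus-endogenous distinction (and not the authors' actual model), the sketch below assumes sales relax from a peak along a simple power law, with faster decay for an externally driven spike; all constants and exponents are invented for the example:

        # Illustrative power-law relaxation after a sales peak; the exponents and
        # constants are assumptions, not fitted values from the study.
        def sales_after_peak(day, peak, baseline, exponent):
            """Daily sales `day` days after a peak, decaying as a power law."""
            return baseline + (peak - baseline) / (1 + day) ** exponent

        # A sharp externally driven spike (e.g. a prominent interview)...
        exogenous = [sales_after_peak(d, 1000, 20, 1.4) for d in range(8)]
        # ...versus a word-of-mouth peak of the same height, which fades more slowly.
        endogenous = [sales_after_peak(d, 1000, 20, 0.7) for d in range(8)]

        print([round(s) for s in exogenous])
        print([round(s) for s in endogenous])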

  • "Pointsmart Mouse Software Helps Children, Adults With Disabilities 'Point and Click'"
    UB News Services (12/08/04)

    A soon-to-be-released software application developed by Infogrip and the University at Buffalo Rehabilitation Engineering Research Center on Technology Transfer (T2RERC) promises to make using the computer mouse less aggravating for children and adults with limited fine motor skills. PointSmart enables users to tweak the sensitivity of mouse movements in order to stabilize erratic motions. Users can employ PointSmart in a joystick mode that starts the mouse in one direction and allows it to continue by itself until the user decides to choose an object or change direction, and the software also provides adjustable functionality for mouse clicks and buttons. T2RERC project manager Wendy Strobel says PointSmart will allow disabled students to access computers that their non-handicapped classmates use regularly, enable employees with reduced fine motor control to use a mouse without worrying about productivity-affecting factors such as misplaced data or missed targets, and let elderly or infirm users continue to use PCs. In addition, PointSmart can benefit visually impaired users thanks to its ability to display very large, easy-to-read mouse pointers on the screen. T2RERC collaborates with companies to research, assess, transfer, and market assistive devices for disabled people. PointSmart is slated to become commercially available in March 2005; other assistive technologies T2RERC has developed, enhanced, or tested include Automatic Sync Technologies' CaptionSync product, which automatically furnishes captions for any electronic media file and its transcript, and an automatic braking system for manual wheelchairs. The UB center is supported by a nearly $5 million National Institute on Disability and Rehabilitation Research grant spread out over five years.
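
    The article does not describe PointSmart's internals, but one common way to stabilize erratic pointer motion is to low-pass filter the raw movement deltas; the following is a minimal sketch of that general idea only, offered as an assumption about how such adjustment can work rather than as PointSmart's algorithm:

        # Generic pointer-stabilization sketch (an exponential moving average of
        # mouse deltas); not PointSmart's own, undisclosed algorithm.
        class PointerSmoother:
            def __init__(self, gain=0.2):
                # Smaller gain damps tremor more heavily but responds more slowly.
                self.gain = gain
                self.dx = 0.0
                self.dy = 0.0

            def filter(self, raw_dx, raw_dy):
                """Return smoothed deltas for the next pointer update."""
                self.dx += self.gain * (raw_dx - self.dx)
                self.dy += self.gain * (raw_dy - self.dy)
                return self.dx, self.dy

        smoother = PointerSmoother()
        jittery = [(12, 1), (-9, 2), (11, 0), (-8, 1), (10, 2)]   # tremor around a slow drift
        for dx, dy in jittery:
            print(smoother.filter(dx, dy))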

  • "University Jaume I Researchers at Work on EU Project to Improve Video Game Realism"
    Innovations Report (12/13/04)

    The European Commission is backing a 33-month multi-institutional project to improve the realism of video games by designing software that simplifies the creation of credible environments without requiring programmers to develop their own algorithms. "Our idea is to develop the technology that was used in very complex workstations so that it is immediately accessible on PCs or on low cost platforms," says Miguel Chover of Spain's Universitat Jaume I de Castellon, which is participating in the project with the Polytechnic University of Valencia and the universities of Limoges, Girona, Budapest, and Vienna. UJI, Valencia, and Limoges will be tasked with improving the realism of onscreen objects' geometry and movements. Chover explains that his researchers employ multiresolution models, which enable programmers to automatically adjust detail levels: A programmer captures a game character and feeds it into the framework's algorithms, which automatically change the degree of detail. "This is all done while at the same time the transition of geometry is made smooth so that everything is more realistic and jumps do not occur when a character goes from a more distant to a nearer plane," Chover notes. The University of Vienna will be in charge of improving visibility in video games, while Girona and Budapest will focus on the improvement of game lighting. Video game and virtual reality companies are participating in the project, along with the Research Association of the Toy Industry; the companies will share their product requirements with researchers and test whether the solutions the researchers come up with are well suited to industry applications.
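
    A simplified sketch of the level-of-detail idea behind multiresolution models follows; the distance thresholds and triangle counts are invented for illustration, and real systems, including presumably this project's, smooth the transition between levels rather than switching abruptly as this toy version does:

        # Toy level-of-detail selector: a character farther from the camera gets a
        # coarser mesh.  Values are invented; real engines also blend between levels.
        LOD_LEVELS = [
            (10.0, 20_000),        # close to the camera: full-detail mesh
            (30.0, 5_000),         # mid distance: simplified mesh
            (float("inf"), 800),   # far away: very coarse mesh
        ]

        def triangles_for_distance(distance):
            """Pick a triangle budget for a character at the given camera distance."""
            for max_distance, triangles in LOD_LEVELS:
                if distance <= max_distance:
                    return triangles

        for d in (2.0, 15.0, 80.0):
            print(f"distance {d:5.1f} -> {triangles_for_distance(d)} triangles")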

  • "NIST Demonstrates Data 'Repair Kit' for Quantum Computers"
    National Institute of Standards and Technology (12/01/04)

    National Institute of Standards and Technology (NIST) researchers have developed a scalable error-correcting scheme for quantum computers, as described in the Dec. 2 issue of the journal Nature. Quantum computing uses the quantum states of particles as the basis for computation, but environmental noise such as fluctuating magnetic fields can introduce errors. Because quantum properties are lost upon direct observation, scientists have tried to design indirect methods for detecting and correcting errors. The NIST team used three beryllium ions for their experiments and designated one ion as the primary qubit and the others as "helper" qubits that are used to decode erroneous computations; the decoding process is carried out with laser operations on the two helper ions, which had previously been entangled with the primary ion. Information regarding the error is obtained through the two helper ions, and precise laser beams are used to correct the spin of the primary ion without disturbing its quantum state; the helper ions are then reset to detect future errors. The NIST method requires better instrumentation and qubit manipulation processes before it can be used in full-scale quantum computing, but the concept is scalable and there are no serious obstacles to its further development, says researcher Dietrich Leibfried. Previous quantum computing error-correction frameworks have relied on molecules in liquid and could not be reset for reuse in future computations. Quantum computing promises to revolutionize computer functions that require massive computation, such as code-breaking, system optimization, and database searching.
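
    The one-primary-plus-two-helpers arrangement mirrors the textbook three-qubit repetition code; the sketch below illustrates that principle only, since the NIST protocol itself is an ion-trap-specific variant:

        % Three-qubit repetition (bit-flip) code, shown only to illustrate the
        % primary-plus-two-helpers principle; not the NIST ion-trap protocol itself.
        \begin{align*}
        \text{encode:}\quad & |\psi\rangle = \alpha|0\rangle + \beta|1\rangle
            \;\longrightarrow\; \alpha|000\rangle + \beta|111\rangle \\
        \text{error on qubit 1:}\quad & \alpha|100\rangle + \beta|011\rangle \\
        \text{syndrome:}\quad & Z_1 Z_2 = -1,\ Z_2 Z_3 = +1
            \;\Rightarrow\; \text{qubit 1 flipped} \\
        \text{correct:}\quad & X_1\left(\alpha|100\rangle + \beta|011\rangle\right)
            = \alpha|000\rangle + \beta|111\rangle
        \end{align*}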

  • "An Adventurous Thinker"
    DevSource (12/12/04)

    Technology innovator and author Ray Kurzweil says we are on track to fulfill the visions he predicted in his 1999 book, "The Age of Spiritual Machines." In it, he prophesied that by 2009 computers would be incorporated into apparel, most mundane business transactions would be between humans and virtual agents, and translating phones would be routinely used; Kurzweil sees the seeds of these technologies in such current innovations as pocket-sized gadgets enabled for email, Web browsing, telephony, and other functions, and virtual personalities that use natural language to carry out transactions. Kurzweil anticipates a migration to a grid computing architecture in which every device functions as a network node, and argues that software developers should prepare for this transition by finding ways to optimize their applications to run in a massively parallel manner. He notes that though most of his major inventions--speech recognition, music synthesis, speech synthesis, financial pattern analysis, etc.--cover a wide spectrum, they all employ pattern recognition to some degree. Kurzweil says that humans outclass computers in terms of pattern recognition, which he says is the chief advantage of human intelligence. He attributes the failure of most technology projects to bad timing, and has devoted intense study to tech trends to avoid this pitfall; with the help of a 10-person research team, Kurzweil is developing mathematical models of technological evolution to predict the capabilities of tomorrow's computers in his writing. "Often I have a project in mind but realize the enabling factors are not yet in place," he notes. "Once I feel that they will be in place around the time that a project could be completed, and there appears to be a substantial need, then I will consider starting that project."

  • "Ecobot Eats Dead Flies for Fuel"
    Wired News (12/15/04); Sandhana, Lakshmi

    The Ecobot II robot developed by University of the West of England professors Chris Melhuish and John Greenman uses microbial fuel cells (MFCs) to convert organic matter--specifically, dead flies and rotten apples--into electricity. Machines such as Ecobot II are an early step toward the ultimate goal of a self-sustaining, self-powering robot that can function for years without human assistance. The MFCs comprising the Ecobot's "guts" contain microbes normally found in sewage that break down raw food into sugars, and the cells take in oxygen from the air to generate useful biochemical energy that is turned into electricity. Ecobot II manages a peak speed of roughly two to four centimeters every 15 minutes, an act requiring the consumption of eight flies; it currently takes about one to two weeks for the machine to extract approximately 90 percent of the energy contained in three to four flies, although the research team is working to shrink that interval down to a few days. MFC-driven robots are idle most of the time while they power up, and fuel cell performance and longevity are still easily outmatched by standard alkaline batteries. "Until today, the maximum open circuit voltage of a microbial fuel cell is not more than 0.75 volts and that goes down in current production, which is not enough to power most of the electronics, including many handheld devices," notes University of Massachusetts professor Swades K. Chaudhuri. "However, with further discovery of a novel bug that can quickly oxidize organic material or by modifying existing bugs genetically there may be a way to enhance power output." As more powerful fuel cells emerge, the Ecobot could be equipped with sensors that allow the device to detect the presence of food, or use pheromones to attract flies.

  • "The Semantics of Fighting Financial Fraud Online"
    IST Results (12/14/04)

    The IST program-funded FF POIROT project will investigate the use of multilingual semantic Web-based ontology languages for detecting and preventing online financial fraud, according to Dr. Gang Zhao from the Free University of Brussels' STARLab. This work will form the basis of an enabling platform for processing vast amounts of structured and unstructured data and managing complex knowledge. "A major part of the project is devoted to a methodology, which will guide professionals in using the tools for modeling the content of the resources," Zhao notes; achieving this goal involves the construction of testable prototypes by FF POIROT. Implementing the FF POIROT ontology in a portal enables semantic processing in the search engine, yielding more relevant Web search results for regulatory bodies that check Web sites for unlawful solicitation and fraudulent advice. Zhao says another goal of the project is to extract red flags and textual evidence of fraud from Web pages using the ontology, employing a fraud-forensics scheme focused on the lexicon and grammar of expressions of fraud evidence. This lets investigators devote more time to Web sites with high suspicion ratings calculated from the identified red flags. The ontology will also be used to help users check and evaluate VAT registrations and invoices by facilitating knowledge management and the planning of user-machine dialogue. In addition, FF POIROT is devising tools and a platform for ontology engineering, as well as an ontology-based strategy for knowledge engineering; the overarching goal is to apply the ontology commercially as a set of Semantic Web services, ontology server prototypes, an ontology-engineering workbench, and application demonstrators.
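
    As a deliberately simplified illustration of ranking pages by red-flag evidence: FF POIROT relies on a formal multilingual ontology, whereas the phrase list and weights below are invented stand-ins used only to show the idea of a suspicion score.

        # Simplified red-flag scoring sketch; the phrases and weights are invented
        # and are not part of the FF POIROT ontology.
        RED_FLAGS = {
            "guaranteed returns": 3.0,
            "risk-free investment": 3.0,
            "offshore account": 2.0,
            "act now": 1.0,
        }

        def suspicion_score(page_text):
            """Sum the weights of red-flag phrases found in the page text."""
            text = page_text.lower()
            return sum(w for phrase, w in RED_FLAGS.items() if phrase in text)

        pages = {
            "site_a": "Guaranteed returns of 40% per month, act now!",
            "site_b": "Quarterly report and audited financial statements.",
        }
        for name, text in pages.items():
            print(name, suspicion_score(text))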

  • "E-Voting Still Expensive, Fraught With Security Issues"
    Technology in Government (12/04) Vol. 11, No. 9; Sheikh, Fawzia

    Canadian director of Election Systems & Software Jonathon Hollins reports that Canada's enthusiasm for adopting e-voting solutions is not as strong as in the United States; he points out that, for one thing, Canadian elections are less frequent than U.S. elections. Meanwhile, Delvinia Interactive President Adam Froman says the simplicity of federal ballots, compared with those at other levels of government, reduces Elections Canada's desire to phase them out in favor of more expensive e-voting equipment. However, Hollins notes that the cities of Edmonton, Toronto, Oakville, Vaughan, Brantford, and Mississauga have used standalone touch-screen voting systems in municipal elections, while the Ontario towns of Markham and Prescott have pondered Internet voting. Froman says the City of Vancouver and Ontario are considering online voting for their next elections, explaining that "there will always be risks with any form of electronic voting that have to be managed and monitored...but the demand is overwhelming from the voters themselves." He says a Delvinia poll indicates that 100 percent of Markham residents who voted online last year said they would do so again, while advance poll voting surged 300 percent. Froman concludes that security was not a major issue among voters; instead, online voting administrators were concerned that the real voters might not actually be the ones casting ballots. On the other hand, John Swan with Markham's IT department explains that voters still have reservations despite the security of online voting systems, particularly when it comes to accurately performing a recount or updating voter data.

  • "Apps to Die For"
    InformationWeek (12/06/04) No. 1017, P. 46; Kontzer, Tony

    Service-oriented architectures currently under construction are the likely launchpads for future killer apps, according to industry insiders and analysts. Though oft-mentioned killer apps include search, speech recognition, and autonomic security, the value of those functions is already fairly clear and their progress evolutionary. The Web services possible with interconnected service-oriented architectures are not easy to predict; the only certain thing about those applications is that they will change the way companies do business, most likely by enabling organizations to hand off non-core functions such as shipping. Service-oriented architectures will be paired with software components built by businesses themselves or supplied by third-party companies as an industry-specific service, such as a function that allows hotel guests to check in via their mobile phone. On the other hand, Web services-savvy enterprises could actually spawn the next killer app by providing a set of services that can be applied broadly across many companies, says Microsoft .Net business development vice president Dan'l Lewin. Researchers at the MIT Sloan School of Management foresee an era of mega-aggregation where services are built by pulling data from many different providers, such as a shipping application that draws quotes and data from DHL, FedEx, and United Parcel Service; these types of mega-aggregate services will require shared semantics or semantic integration in order to avoid serious misrepresentations. Intuit plans to launch a tax-filing service that pulls relevant data from financial institutions, employers, and other key holders of personal income information, thus allowing users to simply click to approve their electronic tax filing and send the file on to the IRS. IBM software technical strategy director Doug Heintzman points out that componentized code will also lower the barrier to application innovation.
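
    A minimal sketch of the mega-aggregation idea follows; the carrier functions are hypothetical stubs standing in for real shipping web services, not actual DHL, FedEx, or UPS APIs:

        # Sketch of "mega-aggregation": fan one quote request out to several
        # carriers and merge the answers.  The carrier functions are invented stubs.
        def dhl_quote(weight_kg):
            return 12.0 + 4.0 * weight_kg

        def fedex_quote(weight_kg):
            return 10.0 + 4.5 * weight_kg

        def ups_quote(weight_kg):
            return 11.0 + 4.2 * weight_kg

        CARRIERS = {"DHL": dhl_quote, "FedEx": fedex_quote, "UPS": ups_quote}

        def aggregate_quotes(weight_kg):
            """Collect one quote per carrier so a caller can compare or pick one."""
            return {name: quote(weight_kg) for name, quote in CARRIERS.items()}

        quotes = aggregate_quotes(3.0)
        print(quotes, "-> cheapest:", min(quotes, key=quotes.get))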

  • "The WLAN Standards Alphabet Keeps Growing"
    Network World (12/13/04) Vol. 21, No. 50, P. 22; Cox, John

    IEEE technical standards for wireless LANs (WLANs), which set up common WLAN creation, monitoring, and management methodologies to serve as platforms on top of which vendors can put differentiating features and functions, are expanding. The promotion of 802.11 WLANs is handled by the Wi-Fi Alliance, which is playing a bigger role in the development of WLAN product certification tests as new IEEE standards are approved; the group is readying test programs for the multi-country roaming (802.11d) and dynamic frequency selection and transmission power control (802.11h) standards. 802.11d is designed to enable WLAN access points to broadcast the country they reside in as well as the country-specific rules client network interface cards must comply with, while 802.11h is designed to avoid interference with radio and satellite transmissions in the European 5 GHz band by allowing WLAN devices to select another channel and correct power output if necessary. The 802.11n Task Group, meanwhile, is considering proposals in its mission to develop a standard for WLANs boasting a minimum throughput of 100 Mbps. Other completed or in-development WLAN standards include inter-access point protocol (802.11F), an as-yet unsupported specification for letting access points talk among themselves and rapidly transfer connection-associated data between each other; fast roaming (802.11r), which would eliminate users' need to reauthenticate at each access point or have their calls interrupted; and wireless mesh topology (802.11s) for permitting access points to serve as wireless data routers that forward traffic to nearby access points.

  • "New Paradigms in Enterprise Video"
    Business Communications Review (11/04) Vol. 34, No. 11, P. 38; Davis, Andrew W.; Kelley, E. Brent

    Image quality, high operating costs, endpoint management, difficult interfaces, and other issues limiting videoconferencing to niche markets have been largely resolved by vendors in room-based video systems, while the spread of IP-based enterprise networks and increasingly popular presence and instant messaging technology is leading to improvements in desktop video. The next two years will witness the rollout of video-enabled IP telephony, collaborative Web portals, and business data software, so enterprises will want to retool their current conferencing and collaboration environments in order to take advantage of these new offerings. The new desktop paradigm designates video as an IP networking-based feature added on to other applications or workflow tools, which employ presence and context as the launching points from which to call up video. Collaborative communications vendors are following one of three presence-based video-as-feature strategies: The IP telephony strategy makes the IP-PBX the hub of a collaborative communications environment, allowing multimedia calls to be launched from the conventional phone user interface no matter what the user device is. The collaboration portal approach sets up a separate application for rich media collaboration, which employs the presence engine to facilitate real-time collaborative multimedia sessions. And business process collaboration launched within enterprise software applications is the third strategy. All three of these approaches are expected to overlap as the market develops and matures, although some end users' concerns about the intrusiveness of presence and video may be significant obstacles to adoption.

  • "Information Technology Is Key to Air Force 2020"
    Military & Aerospace Electronics (11/04) Vol. 15, No. 11, P. 34; Keller, John; Wilson, J.R.

    Though the emphasis of the Air Force 2020 roadmap has shifted from weaponry to information superiority and network-centric warfare in the wake of 9/11 and other developments, the roadmap's key enabling capabilities--global reach, authoritative air power, and reliable knowledge of allied and adversarial forces--remain consistent. The plan calls for all Air Force assets (hardware, personnel, facilities, etc.) to become networked IP nodes using off-the-shelf technologies whenever possible and custom-designed technologies whenever necessary. Air Force CIO John Gilligan explains that planners want highly automated information search; predefined information to ensure that authoritative sources, data retrieval and refinement algorithms, security, and punctuality are there to fulfill the requirements of a decision supported by a search query; strong, reliable identification of people and network assets; and reliable code to guarantee software correctness and security. The private sector has often fallen short in areas such as information security and data search and retrieval, but more and more military planners are seeking a compromise between the commercial and the custom-design approach, notes James Engle, deputy assistant secretary of the Air Force for science, technology, and engineering. Deputy director of Air Force Strategic Planning Christopher Bowie reports that over the last four decades or so, the commitment to joint enabling forces (weather satellites, mobility assets, aircraft refueling, etc.) has taken a bigger and bigger share of the Air Force budget, a trend expected to continue through 2025. He adds that the use of unmanned aerial vehicles has increased since the Gulf War. Directed-energy communications and weapons are also expected to be chief enabling technologies of Air Force 2020; they could be combined into a system that can strike any target in the world within seconds of detection.

  • "R&D 2004"
    Technology Review (12/04) Vol. 107, No. 10, P. 60; Rotman, David; Gorman, Jessica; Hadenius, Patrick

    The movement of corporate research projects out of the lab and into commercial real-world applications is not always smooth and is heavily influenced by business, legal, and financial variables, as demonstrated by initiatives from Philips Electronics, Texas Instruments, and Nokia. Texas Instruments electronics researcher Christoph Wassuber and collaborators at the Swiss Federal Institute of Technology have designed a single-electron transistor that is not susceptible to the interference that has undercut the effective performance of previous iterations. This breakthrough could lead not only to ultrafast processors but also to ultrasmall integrated circuits, thanks to the elimination of heat-dissipation problems. However, experts require evidence that Wassuber's concept can function effectively in real circuits; manufacturing the molecular-scale features needed to keep such transistors running at room temperature is beyond the semiconductor industry's present capabilities; and Texas Instruments cannot currently justify funding large-scale research and development in this area, given its expectation that the miniaturization of conventional transistors can be sustained through 2015. Nokia researchers Jukka Nurimen and Balasz Bakos hit upon a peer-to-peer scheme that would allow images and other digital files to be exchanged between cell phones and support fast searching without overtaxing the network, thanks to reduced bandwidth requirements. But the concept has encountered a bump on the road to commercialization because of concerns that the technology could be used to swap copyright-protected content. Philips Electronics' plan to roll out cheap, high-quality flat-screen TVs that use organic light-emitting diodes has hit a technical snag because of problems with display efficiency, fading, and polymer lifetimes, but the company has forged ahead by releasing small-scale versions of the technology in electric shavers and cell phones, with more sophisticated rollouts planned.


 