
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 772:  Wednesday, March 30, 2005

  • "Lively Debate as Justices Take on File Sharing"
    New York Times (03/30/05) P. C1; Greenhouse, Linda

    The case of Metro-Goldwyn-Mayer Studios v. Grokster, which is before the Supreme Court, has become a flashpoint between the entertainment industry and technology companies over whether file-sharing services should be liable for digital copyright infringement committed by customers. Lawyers for the defense cited the Supreme Court's landmark 1984 ruling absolving the makers of the Betamax videocassette recorder of any copyright infringement liability with the argument that the product offered substantial noninfringing applications. However, Supreme Court Justice Ruth Bader Ginsburg objected to the defense's contention that this decision constituted a clear rule, arguing, "I don't think you can take one sentence from a rather long opinion and say, 'Ah-hah, we have a clear rule.'" Defense lawyer Richard Taranto characterized file sharing as an "autonomous communication tool" for spreading knowledge inexpensively. This view was echoed by civil libertarians and tech industry pundits who filed friend-of-the-court briefs, but acting solicitor general Paul Clement told the court that it should consider the defendants' actual "business model" rather than "the mere theoretical capability of noninfringing uses." He added that the Betamax ruling was arrived at through solid proof that the product had significant noninfringing use. Justice Antonin Scalia noted that establishing legitimate uses of a new technology can take time, leaving the developer unprotected against copyright infringement suits in the interim. Clement said considerable leeway should, in the government's view, exist at the outset, but argued that the defendants' business model was founded on infringement.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

    For more information on Grokster, visit http://www.acm.org/usacm.

  • "Even Tech Execs Can't Get Kids to Be Engineers"
    Wall Street Journal (03/29/05) P. B1; Grimes, Ann

    Technology executives are stressing the United States' need to step up its efforts to get more young people interested in engineering so that an engineer shortage can be mitigated and the country can maintain its global competitiveness. The decline in interest in engineering careers hits close to home for many executives, as their own children are resistant to the idea because of outsourcing and other factors. A recent A.T. Kearney study of 2,800 Silicon Valley students found that most respondents were not interested in high-tech careers because they viewed such professions as socially isolating, boring, or overwhelming. Silicon Valley venture capitalist Susan Mason recalls that her two stepdaughters spurned a career in computer engineering because "They wanted to have more interactions with people on a 'human' level." Many executives' kids frequently cite outsourcing as an argument against becoming an engineer, explaining that they do not wish to relocate overseas or even to another state. C.L. Max Nikias, dean of the University of Southern California's engineering school, reports that only 50 percent of the approximately 120,000 students who initially express interest in engineering at U.S. universities and colleges earn engineering degrees, and he is trying to improve retention by establishing a new curriculum as well as a career-centric speakers program. The United States' ranking in the number of undergraduate engineers and natural scientists produced worldwide has fallen from No. 3 in 1975 to No. 17 now.

  • "Wordspotter Searches Historical Documents"
    NewsFactor Network (03/29/05); Martin, Mike

    University of Massachusetts Amherst (UMass) computer science professor R. Manmatha has developed a computer interface that can search handwritten documents for information in a manner similar to the Google search engine. Toni Rath, a UMass grad student who helped create a demo of Manmatha's search tool, says the concept is comparable to searching text documents written in one language using queries in another language, since different handwriting styles function, in effect, like different languages. He says, "Our system learns from a parallel body of transcribed scanned images. That is, the word images form a 'visual language.'" The UMass team's work was partially underwritten by the National Endowment for the Humanities and the National Science Foundation, and was detailed at a recent ACM SIGIR conference. Xerox European Research Center director Chris Dance reports that even human readers can have trouble recognizing handwritten documents because of differences in the shapes of characters written by a single writer, while variability in pen-stroke thickness and writers' distinctive quirks are additional factors. Xerox Research Center information retrieval and machine learning expert Eric Gaussier says, "Most approaches in searching handwritten documents have taken the path of first solving the handwriting recognition problem, then solving the search problem with standard indexing and search techniques." The problem is that the enormously difficult challenge of handwriting recognition may be an even tougher proposition than handwritten document retrieval itself.
    Click Here to View Full Article
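
    As a rough illustration of the wordspotting idea (a toy sketch, not the UMass system), each scanned word image can be reduced to a feature vector, and a query image matched by ranking stored vectors by visual similarity; the feature values below are invented:

```python
# Toy wordspotting sketch: word images become feature vectors, and a
# query is answered by ranking the stored images by similarity.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def spot(query_vec, corpus):
    """Rank (name, vector) word images by similarity to the query image."""
    ranked = sorted(corpus,
                    key=lambda item: cosine_similarity(query_vec, item[1]),
                    reverse=True)
    return [name for name, _ in ranked]

# Hypothetical feature vectors (e.g., projection-profile features).
corpus = [("liberty", [0.9, 0.1, 0.4]),
          ("freedom", [0.2, 0.8, 0.5]),
          ("liberty2", [0.85, 0.15, 0.35])]
print(spot([0.9, 0.1, 0.4], corpus))  # instances of "liberty" rank first
```

Real systems extract far richer features from the scanned images, but the ranking step has this shape.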

  • "Why IT Workers Are Lying About Their Age"
    Financial Times (03/30/05) P. 8; Thomas, Kim

    The IT industry is riddled with ageist recruitment policies, according to over-40 IT professionals who have often had to lie about their age to attract interest from prospective employers. Unemployed IT veteran Tony Wells, 49, claims these practices are partly responsible for a shortage of skilled IT personnel. The U.K. Department for Work and Pensions reported in 2002 that professionals aged 35 and younger comprise 56 percent of the IT workforce, compared to 38 percent of the overall workforce; a survey performed by the Employers Forum on Age and Silicon.com indicates that age plays a part in IT recruitment decisions made by 31 percent of those in charge of the hiring process. Association of Technology Staffing Companies CEO Ann Swain says ageism in IT is usually an unconscious rather than a conscious practice. "I think there is a view that someone recruits a person like themselves," she says. "And because of the nature of IT, that has generally been someone 28 to 35, male, a graduate from a decent university." Wells and others think recruitment agencies and employers are over-emphasizing specific skills in recent technologies when evaluating job candidates, and Wells suggests that hiring decisions should be based on candidates' ability and experience instead. New anti-ageism laws will come into effect in October 2006, and although their impact may not be immediately apparent, they are expected to force a re-examination of the IT sector's hiring processes.
    Click Here to View Full Article

  • "Laying the Foundation for the Next-Generation Web"
    IST Results (03/30/05)

    The core element of the next-generation Internet envisioned by Tim Berners-Lee is the Semantic Web, and the IST-funded WonderWeb project has made significant contributions to the Semantic Web's ongoing development by fulfilling and in some cases surpassing its goals. A major achievement is the standardization of the OWL ontology language, which WonderWeb project coordinator and University of Manchester professor Ian Horrocks describes as being "to knowledge management systems what SQL is to database management systems." OWL supports the modeling of domain architecture along with the details of a particular situation; permits the capture of detailed and accurate domain models or ontologies; supports powerful queries at the data and schema level; and facilitates the extraction of implicit information from explicitly stated facts about the schema and the data. Other important WonderWeb breakthroughs include the development of the KAON ontology-engineering environment and the creation of an entirely new inference engine designed to enhance reasoning services for OWL applications. These tools are complemented by the WonderWeb ontology library, a repository of diverse foundational ontologies (DOLCE, BFO, OCHRE) and domain-specific extensions encompassing Web services, plans, descriptions, situations, and other areas. These various innovations are being utilized by many FP6 projects funded by the European Commission. "OWL is already very widely accepted and used, even in commercial offerings, and is the de facto standard in many domains, such as e-Science for example, that are not directly related to the Semantic Web," Horrocks says.
    Click Here to View Full Article
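
    The "extraction of implicit information from explicitly stated facts" can be pictured with a minimal sketch. The class names and the little Python reasoner below are hypothetical; a real OWL reasoner handles much richer axioms, but the core idea of deriving unstated facts from stated ones looks like this:

```python
# Toy sketch of ontology inference: implicit superclass facts are
# derived from explicitly stated subclass axioms by transitive closure.
def entailed_superclasses(cls, subclass_of):
    """All superclasses entailed for `cls` by the explicit axioms."""
    seen = set()
    frontier = [cls]
    while frontier:
        current = frontier.pop()
        for parent in subclass_of.get(current, []):
            if parent not in seen:
                seen.add(parent)
                frontier.append(parent)
    return seen

# Explicit axioms (hypothetical e-Science-flavored classes).
axioms = {"ProteinKinase": ["Enzyme"],
          "Enzyme": ["Protein"],
          "Protein": ["Molecule"]}
print(entailed_superclasses("ProteinKinase", axioms))
# Implicit facts: a ProteinKinase is also a Protein and a Molecule.
```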

  • "The U.N. Thinks About Tomorrow's Cyberspace"
    CNet (03/29/05); McCullagh, Declan

    The International Telecommunication Union (ITU) wants a greater say in Internet governance, including allocating IPv6 address blocks, overseeing top-level root servers, and coordinating spam fighting and law enforcement cooperation, says ITU telecommunications standardization bureau director Houlin Zhao. A power struggle is now underway between those content with Internet governance as it is and those who believe the current situation is tilted in favor of the United States. Internet technologies fall into the realm of the ITU's mandate, which has traditionally included radio and telephone; voice over Internet Protocol (VoIP) should be regulated in part by the ITU, say officials. Zhao says there is widespread discontent with what is considered management of the Internet by the United States: For example, when German top-level domain registrar DENIC wanted to win the contract for .net registration, officials went to Washington, D.C., to lobby Congress and the Bush administration. The 2005 World Summit on the Information Society this fall will feature new debate about the role of the United Nations in Internet governance, and Zhao hopes a new agency will not be created because he thinks the ITU, ICANN, UNESCO (United Nations Educational, Scientific, and Cultural Organization), and WIPO (World Intellectual Property Organization) can work together to adequately solve any problems. Zhao worked for China's Ministry of Posts and Telecommunications before coming to the ITU in 1999. Contrary to the popular belief that the Internet flourished because of the relative absence of government regulation, Zhao says government involvement in Internet governance is necessary.
    Click Here to View Full Article

  • "Secure Flight Faces Uphill Battle"
    Wired News (03/29/05); Zetter, Kim

    The Transportation Security Administration (TSA) has only fulfilled one of 10 requirements set by Congress for the Secure Flight passenger screening system, set to launch in August. The Government Accountability Office (GAO) says the TSA has set up an oversight committee for the Secure Flight program, but has not yet formulated policies to guide that committee. In addition, the TSA has not yet tested the accuracy and efficacy of data nor chosen what commercial data, if any, it plans to use; also lacking are redress procedures for passengers to challenge the system's assessments or change incorrect information. Secure Flight improves on the previous CAPPS II system by placing passenger screening functions in the hands of the TSA instead of the airlines. The TSA will combine airline passenger data, government information including terrorist watch lists, and commercial data to identify possible terrorists. ACLU Technology and Liberty Project director Barry Steinhardt says airlines might have to begin collecting new information from passengers to pass on to the TSA and help verify matches against watch lists, and he doubts Secure Flight will be ready by the August deadline, when the TSA is expected to begin testing Secure Flight with two domestic carriers before rolling it out for all domestic air travel. But TSA's Yolanda Clark says the GAO report should be considered a progress report, not a final evaluation; Secure Flight is a 14-month project and was evaluated by the GAO at the eight-month point, she says. The TSA recently finished testing airline, government, and commercial data, and IT infrastructure and hardware are already in place.
    Click Here to View Full Article

  • "Brazil: Free Software's Biggest and Best Friend"
    New York Times (03/29/05) P. C1; Benson, Todd

    Brazil is becoming free software's largest benefactor with President Luiz Inacio Lula da Silva's mandate that government ministries and state-run businesses transition from expensive proprietary operating systems to free operating systems in an effort to save millions of dollars in royalties and licensing fees. Brazil's government plans to implement its PC Conectado (Connected PC) program to help low-income Brazilians purchase their first computers by the end of next month, and National Institute of Information Technology President Sergio Amadeu wants the computers the program offers to run free software only. "It's the government's responsibility to ensure that there is competition, and that means giving alternative software platforms a chance to prosper," Amadeu says. It is estimated that just 10 percent of Brazil's 183 million people are connected to the Internet, while only 900,000 computers are sold legally every year. The PC Conectado program aims to give computer makers tax incentives in return for dramatically discounting their products, while consumers would be able to take advantage of a plan in which they pay for desktops in 24 monthly installments of 50 to 60 reais (about $18 to $21.80), considered an affordable price for many of the working poor. Some people believe the government should apply its tech programs to schools and other areas where, they argue, the need for computers is more pressing; the government has promised complementary programs to expand computers' presence in schools and to open many computer centers with free software and free Internet access in poor areas by the end of 2005. MIT Media Lab director Walter Bender sent a letter to Brazil's government in which he stated that "free software provides a basis for more widespread access, more powerful uses and a much stronger platform for long-term growth and development."
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Going for the Code"
    Milwaukee Journal Sentinel (03/26/05); Gallagher, Kathleen

    Stepping up to the plate for the University of Wisconsin-Madison at ACM's International Collegiate Programming Contest World Finals this year are the Harmless Fluffy Bunnies, a team of graduate and undergraduate students that will vie with 77 other teams to see who can code the answers to the most problems with the fewest attempts and in the shortest amount of time. The challenges teams will be asked to tackle are presented as one- to two-page problem statements containing complicated tasks, such as determining an airplane's optimal flight plan, decrypting a secret code, or computing the motion for each component of a robotic arm. The competition is sponsored by the Centers for Advanced Studies at IBM. Centers program director Gabby Silberman explains that quick thinking is a prerequisite for competitors: "They need to be able to look at a problem and quickly categorize it in terms of what type of problem it is," he says. "And they have to have a good game plan, in terms of how to partition the work among themselves." Participants will have five hours to answer eight to 10 problems, and the Fluffy Bunnies' head coach, computer science professor Dieter van Melkebeek, thinks they will be a worthy adversary for other North American teams. Even if the Fluffy Bunnies do not win the contest, their participation in the World Finals carries considerable prestige that can attract potential employers.
    Click Here to View Full Article
    (Access to this site is free; however, you will need to enter your email address, street address, year of birth and gender to gain access to the site.)

    For more information about the ACM Programming Contest, visit http://icpc.baylor.edu/icpc/.

  • "Rolling Out Next Generation's Net"
    BBC News (03/26/05)

    The Internet Engineering Task Force (IETF) is a large, globe-spanning community of network designers, operators, researchers, and vendors that collectively oversees the workings, expansion, and evolution of the Internet by developing free and open standards and protocols to ensure interoperability and the smooth transfer of information between parties. The first challenge of Brian Carpenter's tenure as new IETF chair is the deployment of the next-generation IPv6 protocol. IPv6, which is gradually being incorporated into network infrastructure worldwide, will allow for virtually unlimited growth in the number of IP addresses by replacing IPv4, which can support no more than 4 billion addresses. To the average Internet user, the differences between IPv4 and IPv6 will be nearly imperceptible. Carpenter, a distinguished IBM engineer and 20-year CERN veteran, reports that the Internet is experiencing "a very clear phase of consolidation and renewed growth," and the IETF's challenge is sustaining its production of standards to accommodate the expansion. He reasons that the problem of online security will never completely go away, given the intricacies of both human and technological behavior. Carpenter says people must be made more aware of "sensible behavior" with regard to Net security, while the IETF is tasked with ensuring the creation of improved Internet security standards. "It is a never-ending battle in a sense," he concludes.
    Click Here to View Full Article
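
    The scale of the change is easy to state concretely: IPv4 addresses are 32 bits wide, IPv6 addresses 128 bits. This short sketch just compares the two address spaces using Python's standard ipaddress module:

```python
# IPv4 vs. IPv6 in concrete terms: 32-bit vs. 128-bit address spaces.
import ipaddress

ipv4_space = 2 ** 32    # about 4.3 billion addresses
ipv6_space = 2 ** 128   # about 3.4 * 10**38 addresses

print(f"IPv4 addresses: {ipv4_space:,}")
print(f"IPv6 addresses: {ipv6_space:,}")

# The address formats differ, but applications handle both uniformly:
v4 = ipaddress.ip_address("192.0.2.1")
v6 = ipaddress.ip_address("2001:db8::1")
print(v4.version, v6.version)  # 4 6
```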

  • "VeriSign Poised to Retain Operation of '.Net' Domain"
    Associated Press (03/29/05); Jesdanun, Anick

    Telcordia Technologies, an outside firm hired by ICANN, has recommended that VeriSign be given a new six-year contract to continue running the .net registry, the Internet's third-most popular suffix. VeriSign, which already generates $200 million annually operating the .com registry, is expected to generate $20 million annually for managing .net. VeriSign was ranked first overall by Telcordia, followed by Sentan Registry Services, Afilias Limited, DENIC, and CORE++. Telcordia said all applicants were good, and the final rankings were based on "experience, risk, and price." VeriSign focused on its experience, while other applicants stressed the need for competition and international diversity. VeriSign, which now has two weeks to negotiate a price with ICANN, said it would reduce prices if it won. The contract also still needs ICANN board and U.S. government approval. If a contract is not agreed upon, Sentan Registry Services, a partnership between the operators of Japan's .jp and .biz, would get the next chance to reach an agreement. Of the current 248 domain suffixes, .net is the third most popular, with 5.5 million registrations. ICANN yesterday also announced that agreements have been reached for two new domains, .jobs and .travel, which now await board approval, and said a new country-code top-level domain has been added: .tl, for Timor-Leste, which gained its independence in 2002.
    Click Here to View Full Article

  • "Identity Theft Made Easier"
    Wall Street Journal (03/29/05) P. B1; Delaney, Kevin J.

    Identity thieves made headlines with security breaches at ChoicePoint and LexisNexis, but common search engines provide a much easier route to obtaining illicit personal information. Google hacking, the practice of crafting specific search queries using special commands to find sensitive personal data, was demonstrated at an Agora security industry meeting in Seattle, where teams raced to accumulate the most identity information in an hour. The winning team found a directory with the Social Security numbers of more than 70 million deceased persons, while the second-place team uncovered hundreds of scanned passport documents and a Justice Department site listing employees and their work credit-card numbers. The contest rules limited teams to using only Google to turn up data, though real hackers would likely employ other means to burrow further into exposed systems. Google and other public search engines are not responsible for the privacy breaches since they only index publicly available Web data; instead Web site operators and negligent users are to blame for data left open to the public, says Seattle chief information security officer Kirk Bailey, who organized the Agora Google-hacking contest. Data exposed via Google is often left open by people who think the information is hidden. Organizations have a responsibility to perform audits of their own networks to ensure sensitive data is not left exposed, and to enable firewall software that blocks public access to sensitive areas of the network; Google also plays a cat-and-mouse game with hackers as it tries to disable the most effective Google hacks while keeping the service as accessible as possible, say Google-hacking experts. There are a number of books and Web sites that provide information on Google hacks, and non-technical people can make use of them.
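
    The self-audit Bailey describes can be approximated crudely. This sketch (with an invented document set and a deliberately simplistic pattern) shows the flavor of scanning publicly served content for SSN-like strings before a search engine indexes them:

```python
# Crude self-audit sketch: flag anything that looks like a Social
# Security number in publicly served documents. The pattern and the
# sample documents are illustrative, not a complete detection rule.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_exposed_ssns(documents):
    """Return (doc_name, match) pairs for anything resembling an SSN."""
    hits = []
    for name, text in documents.items():
        for match in SSN_PATTERN.findall(text):
            hits.append((name, match))
    return hits

docs = {"staff.html": "Contact Jane Doe, ext. 4012.",
        "backup.txt": "Jane Doe 123-45-6789 payroll record"}
print(find_exposed_ssns(docs))  # [('backup.txt', '123-45-6789')]
```

A real audit would also cover credit-card formats, scanned-document metadata, and directory listings, and would run against everything the Web server actually exposes.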

  • "Report: P-Languages Better for Enterprise"
    InternetNews.com (03/25/05); Singer, Michael

    A Burton Group report finds that P-Languages such as Perl, Python, and PHP have come a long way over the last several years thanks to their ability to complement the use of G-Languages such as Java, C++, and C#, and suggests that P-Languages deserve first-class status for enterprise scripting and other mission-critical functions. "P-Languages...should be viewed as additional, albeit first-class tools that information technology organizations can use to solve enterprise-scripting problems," recommends Burton analyst Richard Monson-Haefel in the report. He notes that PHP can dramatically simplify the dynamic generation of HTML and the processing of HTTP requests, while Perl is often employed in Unix and Linux system administration as well as for batch transformations of text data. Python, meanwhile, finds use in system administration, text processing, and application development, and frequently serves as "glue" code. A single line of code in a P-Language can typically execute the same number of tasks as five lines of G-Language code, thus reducing developers' writing and debugging chores and making it easier for them to learn unfamiliar systems, according to Monson-Haefel. P-Language uptake has been significant on open-source platforms, while Burton says that Perl and Python are no more vulnerable to hacker intrusion than most programming languages. However, products that use PHP as a platform appear to be particularly vulnerable, and Monson-Haefel says the PHP community must make a more conscientious effort to toughen up the language.
    Click Here to View Full Article
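
    The conciseness claim is easy to illustrate (the five-to-one ratio is the report's estimate; this example merely shows the flavor). A word-frequency count is a single expression in Python, where a G-Language version would typically need explicit declarations, a loop, and manual map handling:

```python
# One P-Language expression doing what would take several G-Language
# statements: count word frequencies in a string.
from collections import Counter

text = "to be or not to be"
freq = Counter(text.split())   # the whole job in one expression
print(freq.most_common(1))     # [('to', 2)]
```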

  • "Cars Are Getting Computer-Jacked"
    CNet (03/25/05); Spooner, John G.

    The presence of automotive electronics is expanding both in the dashboard and under the hood, reducing clutter and freeing up designers to experiment aesthetically. "Everything is blending into one unified theme," notes Ford Motors designer Anthony Pozzi, who designed the Meta One concept sports car displayed at the New York International Auto Show; the car boasts a fluent design that features recessed buttons rather than stalks for changing gears, and a trio of LCD screens for displaying speed, navigation data, and other traditional gauges that can be customized to the driver's preferences. Nearly all auto models are expected to offer some type of MP3 player link in the next several years, and demand for in-vehicle iPod connectors has spurred several manufacturers to plan such offerings, although embedded hard drives may eventually outdate such devices. Electronics are also permeating car safety systems, such as networked sensors for measuring the vehicle's wheel speed, steering wheel angle, and yaw, which can be used to support dynamic stability control and other fail-safes. Eventually, car computer systems will be imbued with predictive capabilities so that they can facilitate collision avoidance and other safety-enhancing operations. Such systems are currently offered in deluxe models only, but auto executives at the show predicted that they will be incorporated into cheaper vehicles, either as an option or as standard gear. Computer systems perhaps have the greatest penetration in hybrid cars that run on both gas and electricity. Hybrid vehicles from Toyota use such systems to control the switch between electric and gas, and make the transition imperceptible.
    Click Here to View Full Article

  • "Quantum Computing Scheme Cuts Errors"
    EE Times (03/21/05) No. 1363, P. 46; Brown, Chappell

    In a recent issue of Nature, National Institute of Standards and Technology physicist Emanuel Knill details a proposed quantum computing architecture that could make large-scale quantum computers tolerant of the faults that might crop up in practical quantum circuits. Up to now, all proposed error-correcting architectures would yield quantum circuits where most of the computing resources would be devoted to error correction operations. Knill's proposal calls for a nested hierarchy of qubits that incorporates error correction into the logic gates' basic operation: The first stage of error correction involves a base level of qubits that executes the simplest possible correction on the base array, and then quantum teleportation passes the corrective states on to subsequent levels in the hierarchy. Clusters of neighboring physical qubits would represent virtual qubits with successively reduced rates of error. The quantum state can be transferred directly via teleportation, so there is no need for knowledge of its parameters, and subsequently no need for read/write operations; nearby qubits are not physically affected by the information transfer. Knill notes that a wide chasm exists between theory and experiment in the field of quantum computing, and suggests that his proposal could shrink that divide, "showing that quantum computers may be easier than we thought." An alternative quantum computing scheme suggested by Oxford University's Andrew Steane proposes the use of architectural features to lower errors, embedding error correction within the process of reading the qubits' state and employing a series of quantum-computing codes that are optimized to facilitate teleportation between ancilla blocks. Determining the level of complexity required by these various fault-tolerant schemes is a matter of more precise theory and computer models.
    Click Here to View Full Article
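
    The payoff of nesting encodings can be sketched with a standard concatenation argument (a simplified textbook model, not Knill's specific construction): if one level of encoding turns a physical error rate p into roughly c * p**2, the logical rate falls doubly exponentially once p is below the threshold 1/c.

```python
# Toy model of concatenated error correction: each nesting level maps
# the error rate p to roughly c * p**2, so below the threshold 1/c the
# logical error rate shrinks doubly exponentially with the level count.
def logical_error_rate(p, c, levels):
    """Error rate after `levels` rounds of concatenated encoding."""
    for _ in range(levels):
        p = c * p * p
    return p

p, c = 1e-3, 100.0   # assumed physical error rate and model constant
for level in range(4):
    print(level, logical_error_rate(p, c, level))
```

With these assumed numbers the rate drops from 1e-3 to roughly 1e-4, 1e-6, and 1e-10 over three levels, which is why fault-tolerance proposals care so much about the constant c, i.e., the overhead each level adds.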

  • "Intel Goes to School"
    Computerworld (03/28/05) P. 40; Vijayan, Jaikumar

    Intel Research is funding a quartet of university "lablets" to identify and investigate technologies that merit "acceleration and amplification," according to company representative Kevin Teixeira. He says Intel has no claim on the intellectual property produced by the labs, because it is interested in "helping to grow the technology and seeing where there is a usage for it within Intel." Intel's UC Berkeley lablet is focusing on systems that employ wirelessly networked sensors to collect a wealth of information about the environment, and the TinyOS operating system and TinyDB query-processing technologies have been notable breakthroughs. Researchers are currently devising the Tiny Application Sensor Kit, a suite of tools that lab director Joseph Hellerstein says will simplify the deployment of applications that use sensor networks. A second Intel lablet at the University of Washington is combining radio frequency identification (RFID) technologies and data mining software into the System for Human Activity Recognition and Prediction, which is supposed to predict human behavior by monitoring the objects people touch and how they are used. A key tool of this research is the RFID-enabled iGlove that extracts data from objects with affixed RFID tags. Another lablet based at England's University of Cambridge under the supervision of Derek McAuley is looking into highly distributed applications, examples of which include Xen, a "virtual-channel processing" technology that allows a single system to support multiple operating systems and users more efficiently than software-based virtualization. The Carnegie Mellon University lablet's area of concentration is software for widely distributed storage systems, with emphasis on interactive searching of massive archives of non-indexed data, and the acceleration and enhanced accuracy of searches via embedded processors.
    Click Here to View Full Article
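
    TinyDB's approach treats the sensor network as one virtual table queried with SQL-like statements. This sketch mimics the idea in plain Python; the node readings are fabricated, and the filter function is a stand-in for TinyDB's in-network query processing:

```python
# Sketch of a TinyDB-style query: the sensor network looks like a
# single table of (nodeid, reading) rows that can be filtered.
def run_query(readings, min_temp):
    """Roughly: SELECT nodeid, temp FROM sensors WHERE temp > min_temp."""
    return [(nodeid, temp) for nodeid, temp in readings if temp > min_temp]

readings = [(1, 18.5), (2, 26.0), (3, 31.2)]   # (nodeid, temperature C)
print(run_query(readings, 25.0))  # [(2, 26.0), (3, 31.2)]
```

In the real system the filtering happens on the motes themselves, which saves radio traffic and battery; the point here is only the relational view of the network.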

  • "Internet Governance Issue Heats Up as Next World Summit Nears"
    Information Today (03/05) Vol. 22, No. 3, P. 22; Ashling, Jim

    The second phase of the World Summit on the Information Society (WSIS) is scheduled for Nov. 16-18 in Tunisia, and is a follow-up to the December 2003 gathering in Geneva. The first phase produced the WSIS Declaration of Principles and Plan of Action, and has resulted in the undertaking of nearly 1,200 activities by governments and other stakeholders. For example, the United States has embarked on a Joint Federal Rural Wireless Outreach Initiative to spur the growth of broadband wireless services in rural areas of the country. As for the second phase of WSIS, more effort will be focused on jumpstarting the Plan of Action, which has technological goals such as linking libraries, cultural centers, museums, post offices, and archives with information and communication technologies. The Plan of Action also has an economic goal of targeting affordability issues that would help make scientific information more accessible for developing countries. Meanwhile, Internet governance and financing mechanisms continue to be controversial issues for working groups. ICANN oversees Internet-addressing matters, but developing nations would like the International Telecommunication Union to handle issues involving next-generation networks. The 40-member WSIS Working Group on Internet Governance is charged with finding a solution by the end of the year. The working group met last November and set an agenda that includes determining Internet governance's leading public policy issues and priorities, identifying the main players and describing the current situation, analyzing future technological and regulatory scenarios, and proposing a plan of action.

  • "Agile CMMI: No Oxymoron"
    Software Development (03/05) Vol. 13, No. 3, P. 48; Konrad, Mike; Over, James W.

    Mike Konrad and James W. Over of Carnegie Mellon University's Software Engineering Institute (SEI) write that Capability Maturity Model Integration (CMMI), a combination of Software CMM and Carnegie Mellon's other primary maturity models, should be reconsidered as an agile software development methodology. Organizational-level processes and practice examples are provided by CMMI, while SEI's complementary Team Software Process (TSP) delivers specifics for software developers and their teams, enabling team members to organize their own plans and track their work, with customers participating throughout the process. TSP proposes activities that enable individuals to learn from experience and obtain practical knowledge of best practices, examples of which include working out a quality plan, establishing objectives, and balancing the team workload. In this way, programmers' motivation is enhanced, which can help push the software engineering discipline toward maturation. Konrad and Over note that adoption of both CMMI and TSP can substantially improve organizations' process improvement results. The authors report that TSP streamlines or completely does away with a good portion of the work involved in building a high-maturity process because it deploys most of the CMMI practices; productivity, predictability, and product quality are also improved with TSP. "CMMI provides the organizational learning and infrastructure necessary to successfully adopt TSP," Konrad and Over conclude. "Together, these two technologies promise to continue maturing the software engineering discipline and practice."
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "The Ascent of the Robotic Attack Jet"
    Technology Review (03/05) Vol. 108, No. 3, P. 56; Talbot, David

    The U.S. military already uses unmanned drone aircraft for surveillance, but a new generation of autonomous aircraft capable of flying in coordinated groups and identifying and attacking targets is being developed and tested. The Defense Advanced Research Projects Agency (DARPA) has made a five-year, $4 billion commitment to the development of new airborne communications networks and control systems that will significantly boost the drones' autonomy; DARPA program director Michael Francis says "the soul [of the vehicles]...lies in the command and control, sensor, and weapons systems that enable their operation, individually and collectively." Another priority is to make the systems easy to use so that a single operator can direct fleets of the robot planes. DARPA has contracted Boeing and Northrop Grumman to develop distinct robot jets with common control systems, and Boeing appears to be in the lead with its X-45 model, which successfully demonstrated that ground controllers can modify the plane's flight plan as needed by coordinating the plan with conventional air traffic controllers. In addition, a single ground controller can control several X-45s, and operators can transfer flight control to another ground station almost 1,400 kilometers away while the jet is airborne. The X-45 was also used to deploy an inert bomb in April 2004. Among the challenges engineers are facing in the development of autonomous attack aircraft is the creation of a communications network that can maintain communications links at very high altitudes and speeds. However, broken communication links are an inevitability, and Dave Kenyon with the Air Force's Electronic Systems Center says that such interruptions "will require new or improved network routing protocols." DARPA's Paul Waugh says an autonomous system is also needed so the drone can think for itself during these interruptions.
    Click Here to View Full Article
