
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to [email protected].
Volume 5, Issue 569:  Monday, November 10, 2003

  • "Europe Exceeds U.S. in Refining Grid Computing"
    New York Times (11/10/03) P. C1; Markoff, John; Schenker, Jennifer L.

    European grid computing efforts have surpassed those of the United States, giving Europe a competitive advantage, according to Peter A. Freeman of the National Science Foundation; indeed, European companies such as the pharmaceutical firm Novartis have adapted American technology to build grid networks. Both American and European technologists cite political and cultural disparities that Europe's grid initiatives throw into sharp relief: America's practical employment of grid computing is sometimes impeded by federal antipathy toward formulating industrial policy and by rival computing and telecommunications standards, whereas European governments focus more on implementing universal standards and pursuing economically advantageous technologies. In addition, the European Union has a more clearly outlined five- to 10-year strategy for deploying a computing and networking infrastructure, which grid computing pioneer Larry Smarr calls "a slap in the face and a wake-up call that things have gone global." The EU is readying the 2004 launch of two major grid computing projects: a program led by France's National Center for Scientific Research to build a high-speed optical network linking seven supercomputers, and the Enabling Grids for E-science in Europe initiative, which seeks to build an international grid infrastructure of unprecedented scale that provides grid computing service 24/7. Other European efforts include British government-supported projects such as the Diagnostic Mammography National Database. Europeans have also outpaced Americans in boosting the speed of optical networks, and CERN's Dr. Flavia Donno adds that Europe is ahead of America in building end-user grid applications. Unlike their U.S. counterparts, all of the European grid infrastructure initiatives have clear objectives and private-sector collaboration. Supporters of grid computing make an issue of laggard U.S. planning and financial backing partly because evidence indicates that grids will directly benefit the economy.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Caught in Pull of Globalization"
    SiliconValley.com (11/10/03); Davis, Aaron; Steen, Margaret

    The outsourcing of white-collar technology jobs to cheaper overseas workforces has become a flashpoint for grass-roots activists in the United States, while many academics, economists, and business executives take the contrarian position that the offshoring trend is ultimately a positive one for the U.S. economy. They contend that it will lead to the creation of more sophisticated IT positions that will keep the country at the vanguard of innovation and allow it to compete in a global economy. Former Palm employee Natasha Humphries, who lost her job to an India-based worker whom she had trained, is considered by Rep. Don Manzullo (R-Ill.) to be symbolic of the distress many members of the U.S. tech workforce are feeling. "What Humphries represents--having had to train her own successor--that's like digging your own grave," says Manzullo, who chairs the House Small Business Committee and has called for limits on the offshoring of federal jobs. Adding to tech workers' feelings of job insecurity is advanced communications technology that speeds the export of domestic jobs overseas; Martin Kenney of the University of California-Davis says this gives American IT workers less time to adjust. The National Association of Software and Service Companies, the premier Indian tech trade group, cites studies indicating that both the United States and India benefit from offshoring, with the United States saving money on software development. Meanwhile, offshoring advocates argue that overseas competition is a natural element of the historic tech commoditization cycle. Still, American corporations are trying to avoid the bad connotations of outsourcing by using broader terminology such as "partnerships," "sourcing," and "multishoring" instead of "offshoring."
    Click Here to View Full Article

  • "Survey: Biggest Databases Approach 30 Terabytes"
    eWeek (11/08/03); Hicks, Matt

    Database consulting firm Winter's fifth survey of the world's 10 biggest decision-support and transaction-processing databases indicates that data analysis is of growing importance for enterprises as they seek to better discern trends and patterns, according to company President Richard Winter. For the first time, the largest transaction-processing database was trounced by the largest decision-support database, a France Telecom repository storing 29.2 TB--a 300 percent gain over the top decision-support database in Winter's 2001 survey. Britain's Land Registry ran the No. 1 transaction-processing database with 18.3 TB, almost 100 percent bigger than the 2001 winner in that category. The No. 2 decision-support database was a 26.2 TB repository from AT&T, and Sandy Hall of AT&T Labs reported that growing needs to retrieve and analyze historical data prompted AT&T to store two years' worth of telecom network call data on its Security Call Analysis and Management Platform database rather than six months' worth. Winter predicts that an ever-growing volume of data driven by the growth of the Internet and new devices such as RFID tags will lead to much larger databases in the near future; survey respondents expect the biggest databases to store approximately 60 TB within three years. Says Winter, "What people talk about most is that computers are getting faster and cheaper and that storage is getting faster and cheaper, but there are also hundreds and thousands of devices planted everywhere that are getting faster and cheaper."
    Click Here to View Full Article

  • "The Push for Aspect-Oriented Programming"
    InternetNews.com (11/07/03); Wagner, Jim

    Aspect-oriented programming (AOP) was developed to handle cross-cutting concerns and furnish a level of flexibility not found in today's complex programs by letting programmers encapsulate an aspect in one place, so that single-point rather than cross-hierarchical changes can be made. Although the Java community has made significant strides in developing commercial AOP that the mainstream can employ, it will be some time before AOP becomes common practice, because many programmers think the method is too complicated to understand. Gartner analyst Jim Duggan argues that AOP strategies at the language, metadata, and runtime levels must first be standardized. Northeastern University's Dr. Karl Lieberherr notes that AOP was developed faster than object-oriented programming (OOP), and he credits much of that progress to the deployment of AspectJ, which enables people to play with and refine the technology. The LOOM.NET AOP implementation has greater potential for acceptance because AOP is more user-friendly on the Microsoft platform, according to LOOM.NET developer Wolfgang Schult. "One has to bear in mind that the .NET solutions typically are not restricted to a single programming language, as the Java-based solutions are," he notes. One of the biggest obstacles the AOP movement faces is a dearth of independent software vendors, which is why few IT managers are devoting attention to AOP. For AOP to be more widely adopted, IT managers must become comfortable enough with it to make it a component of the design process. New Aspects of Security consultant Rod Bodkin writes that a killer app that flourishes on the technology improvements AOP facilitates is also required.
    Click Here to View Full Article
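    For readers who want a concrete picture of a cross-cutting concern, the minimal Python sketch below mimics AOP's central move with a decorator: the logging behavior is declared at a single point and woven onto a method, instead of being scattered through every method body. This is an analogy only--a decorator-based stand-in, not AspectJ or any vendor's AOP product.

```python
import functools

def logged(func):
    """Cross-cutting logging concern, defined once and applied
    declaratively, rather than pasted into every method body."""
    @functools.wraps(func)
    def advice(*args, **kwargs):      # runs 'around' the join point
        print(f"entering {func.__name__}")
        result = func(*args, **kwargs)
        print(f"leaving {func.__name__} -> {result}")
        return result
    return advice

class Account:
    def __init__(self, balance):
        self.balance = balance

    @logged                 # the single point where the aspect attaches
    def withdraw(self, amount):
        self.balance -= amount
        return self.balance

Account(100).withdraw(30)   # prints entering/leaving around the call
```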

  • "I Link, Therefore I Am"
    Guardian Unlimited (11/06/03); McClellan, Jim

    MIT Media Lab media arts and sciences head William Mitchell says society will increasingly take on the characteristics of a network, in which each individual is a node linked to other information. In his latest book, "Me++," Mitchell argues that "augmented reality" will be of far more importance in the future than virtual reality, which cordons the real world off from the computer-based one. Augmented reality is made possible by ubiquitous computing and wireless networking technologies, and it is redefining modern cities. Mitchell, who is also dean of MIT's school of architecture and planning, says architects and city planners need to take augmented reality into consideration, allowing services to be brought to people instead of the other way around. In the same way, software developers need to think about how best to take advantage of new mobile computing technologies such as wearable computers and smart clothes. A Media Lab project to build a concept car is based on these ideas and is being completed in conjunction with General Motors and designer Frank Gehry; the car would serve users with information and services much as a knowledgeable human taxi driver would serve a passenger, says Mitchell. Mitchell says Media Lab has taken on more government funding in the wake of the dot-com bust, since corporate expectations are not always in line with the lab's convergence tenets; the government influence does mean Media Lab is concerned with security and surveillance, he says, but in a more nuanced and critical way than the Defense Department's Information Awareness program, for example. Surveillance technology can be used by both enforcers and citizens to powerful effect, Mitchell says, as evidenced by the videotape of Rodney King's beating.
    Click Here to View Full Article

  • "Ethics Center a Small Obstacle as Senate Nears Nano Bill Passage"
    Small Times (11/06/03); Karoub, Jeff

    The U.S. Senate is on the verge of passing the 21st Century Nanotechnology Research and Development Act, which would allocate over $2 billion to a three-year nanotech R&D initiative and establish a center to focus on the technology's societal and ethical implications. House and Senate leaders are currently engaged in private negotiations to refine the bill's language so it is acceptable to the full Congress, while NanoBusiness Alliance executive director Mark Modzelewski reports that the current delay is due to last-minute questions and discussions that are typical of the approval process. National Nanotechnology Initiative director Mike Roco says the passage of the bill would allow nanotech to be "recognized in an official document by the Congress...as a key technology for the U.S. future." He adds that now is the time to prioritize key areas of nanotech R&D, such as materials, chemicals, pharmaceuticals, and electronics. With demand for funding also coming from emerging areas such as medicine and energy conversion, Roco says there is not enough money to support all sectors at the same time. David Berube of the University of South Carolina would serve as the manager of the nanotech ethics center if the government selects his school as the facility's permanent home, and he believes objectivity and substance would be upheld by such a center. "The truth is, if we don't have this nano center...a bunch of public relations firms are going to take up the mantle of this," he warns. The five-year, $25 million contract would be a tremendous boost for the university's nanotech efforts, which include an August grant of over $1 million from the National Science Foundation to study ethical and societal issues related to nanotech.
    Click Here to View Full Article

  • "No Place to Hide From Hunters of the Voice Print"
    Age (AU) (11/04/03); Turner, Adam

    Students at Monash University are building on their research into pervasive computing to deliver computing access on the fly. The researchers have developed a project called SoundHunters, which uses software agents to lock onto the voice of a user and follow the user as he or she moves around a room filled with computers. The small, autonomous applications are designed to lurk in the background and turn up at the computer closest to the user, delivering an important email message, for example. The software agents move from computer to computer, following the sound of a voice, keywords, footsteps, and even laughter. "Because we don't know where the person will go next, we clone the SoundHunters agent to the surrounding computers," explains Monash researcher Arkady Zaslavsky. "We are trying to always be one step ahead of the person." The student team built the software agents on the Java-based Grasshopper 2 platform, using IBM ViaVoice Micro Edition for sound recognition. Stored voice prints authenticate users and automatically log them in. Zaslavsky says the technology could be commercially available in two years.
    Click Here to View Full Article
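    As a rough, purely illustrative model of the clone-and-converge behavior Zaslavsky describes, the Python toy below picks the machine reporting the strongest voice signal and delivers the message there. The node names, signal readings, and Agent class are all invented; the real system's Java-based Grasshopper 2 agent APIs are not used here.

```python
class Node:
    """A computer in the room; voice_level stands in for what a
    (simulated) microphone and speech recognizer would report."""
    def __init__(self, name, neighbors=()):
        self.name, self.neighbors = name, list(neighbors)
        self.voice_level = 0.0

class Agent:
    def deliver(self, node):
        print(f"message delivered at {node.name}")

    def hunt(self, node):
        # Stand-in for cloning to every surrounding computer: inspect
        # the current node and its neighbors, deliver at the loudest.
        candidates = [node] + node.neighbors
        self.deliver(max(candidates, key=lambda n: n.voice_level))

desk, lab, door = Node("desk-pc"), Node("lab-pc"), Node("door-pc")
desk.neighbors = [lab, door]
lab.voice_level, door.voice_level = 0.2, 0.9   # user is near the door
Agent().hunt(desk)                             # -> delivered at door-pc
```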

  • "Web Game Reveals Market Sense"
    Technology Research News (11/12/03); Patch, Kimberley

    Swiss researchers are using the Internet to study how humans make decisions in market situations, such as financial markets. Using a Web-based game, the researchers studied how humans interact with 94 computer-controlled players in a simple game meant to emulate a market: in each round, players choose to join one of two groups, and members of the smaller group win points according to the spread between the groups. The only information the human or computer players have to base their decisions on is previous market history. Each computer player maintains a fixed strategy, and the human player's effect can be considerable, says University of Fribourg researcher Joseph Wakeling. As the game's complexity intensifies, the spread between positions narrows and human players' strategies become less effective, since the game's complexity outstrips their logical capacity to predict market movement. An important finding was that human players compensate in complex market situations by sticking to repetitive strategies, which actually tends to work better than making random decisions, says Wakeling. Frequently changing positions actually puts the player at a disadvantage, since the player's decisions affect the outcome more in a complex situation. The results shed light on humans' logical capacity and response; Wakeling also suggests that humans have an instinctive, not conscious, sense of market trends that gives them an advantage. Wakeling says the program could be tailored for training financial traders in three to four years' time, and his team will continue to study why and how humans switch from active to repetitive positioning strategies. Eventually, the goal of the research is to build a good theoretical understanding of how individual decisions affect and are affected by surrounding behavior.
    Click Here to View Full Article
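    The game described is essentially a minority game, and a short simulation conveys the mechanics. The sketch below makes assumptions the article does not state--a three-round binary history, randomly generated fixed strategies, and 95 seats (94 computer players plus one standing in for the human)--so it illustrates the setup rather than reproducing the researchers' code.

```python
import random

HISTORY_BITS = 3        # rounds of market history visible to players
N_PLAYERS = 95          # 94 computer players plus one human seat

def fixed_strategy():
    # A fixed strategy: one predetermined group choice (0 or 1)
    # for every possible recent history.
    return {h: random.choice((0, 1)) for h in range(2 ** HISTORY_BITS)}

players = [fixed_strategy() for _ in range(N_PLAYERS)]
history = 0
for rnd in range(10):
    choices = [p[history] for p in players]
    zeros = choices.count(0)
    minority = 0 if zeros < N_PLAYERS - zeros else 1
    spread = abs(N_PLAYERS - 2 * zeros)   # the smaller group scores this
    history = ((history << 1) | minority) % (2 ** HISTORY_BITS)
    print(f"round {rnd}: minority group {minority}, spread {spread}")
```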

  • "Spammers Can Run But They Can't Hide"
    New York Times (11/09/03) P. 3-1; Hansell, Saul

    The Spamhaus Project, based in England, is a nexus in the battle against spam: founded by activist Steve Linford in 1998, Spamhaus.org compiles the most reputable nonprofit list of known spammers and is used by many second-tier and smaller U.S. ISPs to identify spammers operating from their networks. Large ISP organizations such as Time Warner and Microsoft's Hotmail use commercial anti-spam services. Linford's team of 15 volunteers has been credited with preventing up to half of all sent spam from reaching its intended targets, but new spam techniques have degraded Spamhaus' ability to track and identify sources of spam. A common technique Linford uses to identify spam is to find the IP address of a Web site cited in a spam email and check whether it has already been added to Spamhaus' block list. Other Spamhaus members conduct deeper investigative work, sometimes lurking in chat rooms or actively engaging spammers in an effort to dissuade them from continuing their work. Spammers have been expanding their capabilities too, joining with crackers--hackers with malicious intent--to propagate spam through Internet viruses and worms; harnessing large numbers of zombie machines, these spam-allied crackers route spam messages and conduct distributed denial-of-service attacks against Spamhaus.org and other anti-spam groups. Linford, who personally finances Spamhaus with funds from his Web design and hosting firm, says he does not intend to give up the fight against spammers and is making headway, as evidenced by newfound respect from Chinese ISPs, which are loath to have their networks placed on Spamhaus' block list. Linford praises a recently passed European Union law that makes spam illegal, but he thinks the U.S. Can Spam Act is too weak and will eventually be replaced.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
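    The block-list check described is, mechanically, a standard DNSBL lookup: reverse an IP address's octets and query the result as a hostname under the list's DNS zone. A minimal sketch follows; the sbl.spamhaus.org zone and the 127.0.0.2 test entry are longstanding DNSBL conventions, while Spamhaus' actual investigative tooling is of course far more involved.

```python
import socket

def listed_on_sbl(ip):
    """Return True if `ip` appears on the Spamhaus SBL.
    A DNS answer means listed; NXDOMAIN means not listed."""
    query = ".".join(reversed(ip.split("."))) + ".sbl.spamhaus.org"
    try:
        socket.gethostbyname(query)
        return True
    except socket.gaierror:
        return False

print(listed_on_sbl("127.0.0.2"))   # the standard test address -> True
```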

  • "War Driving No Game to IT Managers"
    IT Management (11/05/03); Gaudin, Sharon

    Wayne Slavin has given computer users access to software that can determine whether companies' Wi-Fi wireless networks are secure. The NetStumbler wireless LAN discovery tool has been downloaded 5 million times from Slavin's NetStumbler.com Web site, where war drivers can also submit their findings for a national map of wireless networks. "This is about security. It's about letting people know there's this fantastic new technology out there and it will revolutionize networking," Slavin says, though he acknowledges the technology has security issues. Although Slavin says the purpose of NetStumbler is to show unsuspecting companies how vulnerable their wireless networks could be to people looking to break into their systems, IT managers and security administrators warn that corporate spies or high-tech thieves could use the wireless network detection tool maliciously. Slavin, who believes 80 percent to 85 percent of wireless connections are not secure, says companies should want to know whether their Wi-Fi connections are secure. "From a company standpoint, the fear is that anybody could come in through a wireless access point and connect into the corporate LAN," says security consultant Ken VanWyk. "Once they're in, it's just like they've plugged into a network from a conference room or a person's office." NetStumbler is one of many available wireless network detection tools, and Slavin believes that war drivers are helping to advance the use of wireless technology.
    Click Here to View Full Article

  • "Let Reverse-Engineering Go Forward"
    Business Week (11/04/03); Woellert, Lorraine

    Interoperability will likely be one form of permissible reverse-engineering under the 1998 Digital Millennium Copyright Act (DMCA). The U.S. Copyright Office ruled on Oct. 27 that Static Control Components has the right to break down the chip technology on Lexmark toner cartridges because the DMCA, which makes it a crime to circumvent technology used to protect copyrighted material, permits reverse-engineering a product when the purpose is to make another product work with it. "Interoperability necessarily includes concerns for functionality and use, and not only of individual use, but for enabling competitive choices in the marketplace," wrote Register of Copyrights Marybeth Peters. Static Control had asked the Copyright Office whether it could reverse-engineer computer chips in Lexmark printers under the DMCA; the request came after Lexmark sued Static Control for copying a chip that would allow the small printer parts maker to make cartridges compatible with Lexmark's printers. Computer programmers, academics, and civil libertarians have joined many manufacturers in criticizing the DMCA, arguing that its provisions will have a negative impact on innovation and prices.
    Click Here to View Full Article

  • "From the Data Center to the Desktop: Linux Grows Up"
    Serverwatch (11/03/03); Newman, Amy

    Linux's suitability for the enterprise is no longer in question, and companies need to begin seriously considering how best to utilize the technology. Despite its pervasiveness, Linux does not meet every need, but it performs well under certain conditions: Unix still scales up much further than Linux does, offering scalability up to 256 processors, compared with the 32-way capability coming in the Linux 2.6 kernel. The Linux market has also matured, with the field of major players boiling down to Red Hat and Suse Linux. Illuminata President Jonathan Eunice says the Suse distribution is more technically oriented while Red Hat's Linux caters more to business needs. Linux has done particularly well among financial services firms, which like the modularity and security advantages it provides. Hewlett-Packard Linux strategist Mike Balma says Linux will likely take over the Web and high-performance computing spaces and continue to grow in network infrastructure, utility computing, and technical areas as well; however, Unix's scalability and middleware support make it the better choice for robust databases and application servers. Balma says most enterprises make the switch to Linux along with their hardware platform, since one of Linux's main advantages is hardware savings. Organizations that switch from Windows to Linux without switching their hardware do so because of security concerns, he says. As the open-source system matures, standardization efforts are picking up: the Free Standards Group's Linux Standard Base certified 19 products for its 1.3 version in order to bridge the gap between developers and the needs of the industry. Although Balma does not think Linux is ready for all desktop users, he says it is sufficient in certain cases.
    Click Here to View Full Article

  • "NU Program 'Watson' Could Speed Searches"
    Daily Northwestern (11/06/03); Nelson, Andy

    The "Watson" software tool developed by Northwestern University professors and students is designed to narrow the search for information online by continuously scanning the Internet for data related to text documents and other files on the user's PC. Unlike Google and other search engines, Watson supports automatic query generation, and also stores data about to produce results based on users' inclinations. Watson developer and Open Road Technologies CTO Jay Budzik says that if a user wants to find a child's toy, for instance, the program could fine-tune search results to include a toy construction set if the user frequently accesses information about construction. Open Road CEO Christine Mason notes that users are often discouraged by current search engines that generate a lot of unrelated results. Budzik says there is no reason for users to harbor privacy concerns about Watson, since the program can be deactivated and will only store personal information on hard drives, not online. Open Road has acquired the rights to license Watson, which is undergoing focus-group testing; Mason reports that her company will team up with a large corporation to market a consumer version of Watson. In addition, Open Road intends to market Watson to other corporations as a tool to scan their internal databases, and Mason notes that the software could help facilitate keyword monitoring, liability prevention, and regulatory compliance. Budzik says the concept for Watson was inspired by a program he was developing to answer vague questions by consulting the Web.
    Click Here to View Full Article
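    The article does not spell out Watson's algorithm, but automatic query generation of this kind can be pictured with a simple frequency-based sketch. Everything below--the stopword list, the profile boost, the function name--is a hypothetical illustration, not Watson's actual method.

```python
import collections
import re

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "for", "is", "who"}

def generate_query(document_text, profile_terms, n=5):
    """Pick the document's most frequent content words, biased toward
    terms the user looks at often (a crude stand-in for a profile)."""
    words = re.findall(r"[a-z]+", document_text.lower())
    counts = collections.Counter(w for w in words if w not in STOPWORDS)
    for term in profile_terms:
        if term in counts:
            counts[term] *= 3     # weight the user's known interests
    return " ".join(word for word, _ in counts.most_common(n))

doc = "Looking for a toy for a child who likes building construction sets."
print(generate_query(doc, profile_terms={"construction"}))
```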

  • "Pitt Research Delves Into Amazing Shrinking Computer Chips"
    Pittsburgh Post-Gazette (11/06/03); Spice, Byron

    As computer chip transistors shrink in size, their functionality will be governed by quantum physics rather than classical physical laws. Hrvoje Petek of the University of Pittsburgh is conducting research into this phenomenon--his latest project, as reported in the Nov. 6 issue of Nature, is a collaborative effort with Japan's National Institute for Materials Science that has yielded a "quasiparticle" that could be used to improve electronic devices. Petek excited a piece of silicon using 10-femtosecond pulses of ultraviolet laser light, causing the crystal to shift from an electrical insulator to a conductor. In the process, Petek discovered that the movement of electrons and the normal atomic vibrations in the crystal become entangled in a quasiparticle that shares properties of both atoms and electrons. Electrical current flow is currently characterized as the movement of electrons through a piece of silicon, but Petek says chip design will have to be rethought as electrons assume more wave-like behavior while chip dimensions shrink to tens of nanometers, a scale that could be reached within the decade. The control of silicon devices by lasers could result in transistors that are 1,000 times faster than current models, Petek speculates. "The day will come when quantum physics directly influences the functionality of computers and other electronic equipment that we use in everyday life," writes Alfred Leitenstorfer of Germany's University of Konstanz. "The question is: when?"
    Click Here to View Full Article

  • "Search Gets Serious"
    InfoWorld (11/03/03) Vol. 25, No. 43, P. 45; Moore, Cathleen

    A growing number of enterprises are exploiting increasingly sophisticated search technology to manage the rising tide of corporate information, a trend that analysts such as Jupiter Research's Matthew Berk say is extending search's range into the realm of business intelligence. There are two general classes of enterprise search tools: external-facing Web site tools for customers and field workers that focus on natural language, and internally focused tools that aim to improve the scope and effectiveness of search through auto-categorization, taxonomy development, summarization, and personalization. External-facing site search seeks not only to furnish solutions to customers' problems but to inform customers about other services that could prevent recurrences, notes iPhrase's Andre Pino. Content is clustered into related groups via categorization, while taxonomies connect similar terms and concepts. Berk explains that "[Enterprises] need to think about a shared services architecture that can be deployed enterprise-wide, and have different line-of-business applications take advantage of [the architecture]." Vendors are also introducing search applications tailored to help companies deal with compliance and other nascent areas of difficulty. The end result of all these varied search approaches is to let workers uncover information they may previously have been unaware of.
    Click Here to View Full Article
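    Auto-categorization, one of the internal-facing techniques named above, can be pictured with a toy keyword-overlap rule: assign each document to the taxonomy node whose terms it shares most. The taxonomy and scoring below are invented for illustration; commercial tools rely on far richer statistical and linguistic models.

```python
# Hypothetical three-node taxonomy; real taxonomies are hierarchical.
TAXONOMY = {
    "billing": {"invoice", "payment", "refund", "charge"},
    "support": {"error", "crash", "install", "upgrade"},
    "sales":   {"quote", "pricing", "discount", "order"},
}

def categorize(text):
    words = set(text.lower().split())
    # Score each category by keyword overlap and take the best match.
    return max(TAXONOMY, key=lambda cat: len(words & TAXONOMY[cat]))

print(categorize("Customer reports an error after the upgrade install"))
# -> support
```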

  • "Ethernet: Every Little Gigabit Helps"
    InformationWeek (11/03/03) No. 962, P. 58; Fitzgerald, Michael

    The penetration of Gigabit Ethernet into the mainstream and enterprises will be driven by cost trends rather than application requirements. Steven Shalita of Cisco Systems expects bandwidth needs to rise as software trends such as automatic updates and backup progress, while the roughly 10 percent of corporate network environments that Meta Group estimates Gigabit Ethernet accounts for should increase significantly this decade. Another factor pushing the migration to Gigabit Ethernet is companies' growing desire for switches that support services such as wireless security, dedicated storage, and high-level network management--services that will in turn spur backbone-based networking equipment upgrades. The Yankee Group estimates that Fast Ethernet port shipments will surge to 155 million in 2005 as falling costs lead to new uses and adoption by developing nations. By 2005, however, Gigabit Ethernet will become comparable in price to Fast Ethernet, leading to its widespread adoption on the desktop. It took three to four years for Fast Ethernet to supplant 10Base-T Ethernet on the desktop, and the switchover from Fast Ethernet to Gigabit Ethernet will probably take just as long; industry observers such as Yankee Group's Zeus Kerravala believe Gigabit Ethernet will remain an essential component of desktop connectivity for about eight years, partly because of the time it will take to enable 10 Gigabit Ethernet to run over copper. Whether computers will be able to manage the dramatic speed increases is a lingering question, and the PC architecture will need to be fundamentally reworked before 10 Gigabit Ethernet can penetrate the desktop. Meta analyst Chris Kozup predicts that enterprises will employ Gigabit Ethernet to build storage networks, which will ignite the use of Voice over IP and other technologies with low latency tolerance.
    Click Here to View Full Article

  • "No One Understands Me as Well as My PC"
    New Scientist (11/01/03) Vol. 180, No. 2419, P. 28; Brooks, Michael

    The goal of IBM's "superhuman" speech recognition initiative is to develop, by the end of the decade, speech recognition in which computers easily accommodate natural language, although reaching this objective will require an enormous effort. Speech recognition researchers have employed hidden Markov models so that computers can cope with the fact that no single distinguishing frequency and energy exists for each speech sound produced by every person--but this approach is limited, so researchers are attempting to train computers on the myriad nuances of language. Microsoft researcher Li Deng, for instance, is developing a speech recognizer that analyzes the resonant frequencies of speech in order to extrapolate which frequencies correlate with which positions of the tongue, and therefore which speech sound is being produced. If the computer can learn how the tongue, lips, and other biological "articulators" operate and interact, then the machine could become capable of compensating for rapid speech as well as ruling out sound combinations that are physically improbable. IBM computers have been programmed to read lips in order to deal with background noise, but a further challenge lies in creating a single program that could learn to recognize speech in any language; such a breakthrough could transform technology usage in non-English-speaking cultures. "Superhuman" speech recognition does not necessarily have to mean flawless recognition, according to experts such as Philip Woodland of the University of Cambridge. "For one thing, we get bored--there are some applications where machines are as good as people, or you would just never want to get people to do them," he observes.
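    To make the hidden-Markov-model approach concrete, here is a minimal Viterbi decoder over a toy two-phoneme model. The states, probabilities, and "low"/"high" acoustic observations are invented for illustration and are not drawn from IBM's or Microsoft's systems.

```python
STATES = ("ah", "ee")
START = {"ah": 0.5, "ee": 0.5}                      # initial probabilities
TRANS = {"ah": {"ah": 0.7, "ee": 0.3},              # state transitions
         "ee": {"ah": 0.4, "ee": 0.6}}
EMIT = {"ah": {"low": 0.8, "high": 0.2},            # acoustic emissions
        "ee": {"low": 0.1, "high": 0.9}}

def viterbi(observations):
    """Return the most probable phoneme sequence for the observations."""
    probs = {s: START[s] * EMIT[s][observations[0]] for s in STATES}
    paths = {s: [s] for s in STATES}
    for obs in observations[1:]:
        new_probs, new_paths = {}, {}
        for s in STATES:
            prev = max(STATES, key=lambda p: probs[p] * TRANS[p][s])
            new_probs[s] = probs[prev] * TRANS[prev][s] * EMIT[s][obs]
            new_paths[s] = paths[prev] + [s]
        probs, paths = new_probs, new_paths
    best = max(STATES, key=probs.get)
    return paths[best], probs[best]

print(viterbi(["low", "low", "high"]))   # -> (['ah', 'ah', 'ee'], 0.06048)
```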

  • "An Army of Small Robots"
    Scientific American (11/03) Vol. 289, No. 5, P. 62; Grabowski, Robert; Navarro-Serment, Luis E.; Khosla, Pradeep K.

    A challenge set forth by the Defense Advanced Research Projects Agency for roboticists to develop diminutive reconnaissance machines has shifted focus away from bulky multi-sensor platforms and toward small fleets of simple robots that can squeeze into tighter spaces and, like ants, carry out operations collectively, with sensors and capabilities distributed among several machines. Getting robots to function as a team has forced engineers to devise new methods for tasks such as mapping the environment and determining position, as demonstrated by Carnegie Mellon University's millibot project. Millibots follow a three-layer design: the bottom layer contains motors, treads, and other components that give the machine mobility; the middle layer features two microcontrollers and a radio modem to facilitate real-time processing and control; and the top module houses sensors--sonar, near-infrared, mid-infrared, and a video camera--plus another radio modem to communicate with the home base and other millibots. Millibots cannot individually support the capabilities of larger robots because of size and battery power limitations, so the CMU researchers have combined specialization with collaboration: some robots are programmed to map their surroundings, while others provide live feedback for the human operator or carry mission-specific sensors. One collaborative task the millibots perform is localization, whereby a team ascertains its position as individual millibots emit omnidirectional radio and ultrasonic signals that the other robots listen for. The robots determine the distances between one another by alternating their roles as transmitters and receivers, while the team leader collates all the data and uses trilateration to compute each robot's position; localization allows the millibots to navigate without fixed reference points. The CMU researchers have developed updated millibots that scale obstacles by linking together in a chain. Roboticists have also begun to emphasize improving mini-robot control systems, a multidisciplinary effort that will draw on economics, military logistics, and political science.
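    The localization step lends itself to a small worked example. With three teammates at known positions reporting ultrasonic ranges, subtracting one circle equation from the other two linearizes the problem, and the listener's 2-D position drops out of a 2x2 linear system. The positions and ranges below are invented; this is the geometry of trilateration, not CMU's code.

```python
def trilaterate(beacons, dists):
    """Solve for (x, y) from three beacon positions and ranges.
    Subtracting circle 1 from circles 2 and 3 gives two linear
    equations a*x + b*y = c, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1        # beacons must not be collinear
    return (c1 * b2 - b1 * c2) / det, (a1 * c2 - c1 * a2) / det

# A robot at (3, 4): 5 units from (0, 0), sqrt(65) from (10, 0),
# and sqrt(45) from (0, 10).
print(trilaterate([(0, 0), (10, 0), (0, 10)], [5.0, 65**0.5, 45**0.5]))
# -> (3.0, 4.0)
```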

 
 