HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 529: Wednesday, August 6, 2003

  • "Voting Suit Gains Momentum"
    Wired News (08/05/03); Glasner, Joanna

    A lawsuit that questions the constitutionality of computerized touch-screen voting machines is moving to an appeals court, at a time when the reliability of such systems is a topic of heated debate. The 9th U.S. Circuit Court of Appeals in San Francisco will hear oral arguments in the case of Riverside County, Calif., resident Susan Marie Weber, who sued the state in 2001 for allegedly trampling on citizens' constitutional rights by deploying touch-screen voting systems that do not furnish paper ballots. Weber claimed that the terminals are more susceptible to fraud than other voting systems, a charge that voting machine manufacturer Sequoia Voting Systems contested. Some 11 months earlier, the U.S. District Court for the Central District of California ruled that Weber "presented no admissible evidence to support her claim that the use of the AVC Edge System [made by Sequoia] by Riverside County effects differential treatment of voters or vote dilution." Weber says the appeals court originally turned down her request for a hearing, but may have reversed its decision when other plaintiffs and their advocates supplied many documents buttressing her case. Members of VerifiedVoting.org recommend the implementation of computerized voting systems that print out vote records in order to ensure an accurate count. California's current secretary of state organized a task force earlier this year to consider the security ramifications of touch-screen voting machines, but the group failed to agree on whether paper ballots should be a standard feature of such systems, according to a July report. Counties in California and elsewhere are feeling mounting pressure to overhaul their voting systems, and the campaign to recall Gov. Gray Davis only adds to their burden.
    For more information on e-voting, visit http://www.acm.org/usacm/Issues?Evoting.htm

  • "U.S. Backs Florida's New Counterterrorism Database"
    Washington Post (08/06/03) P. A1; O'Harrow Jr., Robert

    Florida's Multistate Anti-Terrorism Information Exchange (Matrix) counterterrorism database is being developed as a resource that national law enforcement agencies can use to scan billions of available records about both lawbreakers and innocent citizens in order to rapidly find patterns of suspicious activity. "The power of this technology--to take seemingly isolated bits of data and tie them together to get a clear picture in seconds--is vital to strengthening our domestic security," insists former commissioner of the Florida Department of Law Enforcement James Moore. The Matrix program is slated to be extended nationally via a $4 million grant from the Justice Department and an $8 million grant from the Homeland Security Department, but civil liberties advocates are worried that the deployment of such a system will give the government even more latitude to infringe on Americans' privacy. The Matrix system is similar in concept to the Pentagon's Terrorism Information Awareness program, which became so contentious an issue that legislators severely cut the project's funding. The Matrix system was developed by Seisint under the authorization of company founder Hank Asher, who refused to accept financial compensation, according to Phil Ramer, special agent in charge of statewide intelligence. Seisint President Paul S. Cameron says concerns about Matrix's potential for privacy infringement are inflated, considering that the police have long had access to the records the system would tap into. Ramer does not deny that Matrix could be misused, but vowed that the system would be used discreetly. More controversy has erupted over Asher's past as a drug smuggler and an informant, and Jennie Khoen of Florida's Department of Law Enforcement calls it "prudent and responsible for us to do a comprehensive review of his background."

  • "Free Software Faces a Rocky Road to Court"
    Financial Times (08/06/03) P. 8; Foremski, Tom

    Hanging over the LinuxWorld trade show this week in San Francisco is the SCO lawsuit against IBM over Unix intellectual property infringement that not only threatens the future of Linux, but potentially a host of commercial software as well. SCO says IBM copied key enterprise computing features from Unix to Linux in order to make it ready for business-critical applications. Proving otherwise could be difficult, given that it is common practice today for software engineers with specialized expertise to work on a succession of similar projects. When those programmers migrate to other companies, they take with them knowledge of intellectual property that could work its way into new products. Linux leader Linus Torvalds said there are strict controls protecting against the illegal inclusion of intellectual property, though he admits they are difficult to enforce with thousands of volunteer programmers worldwide. Microsoft Chairman Bill Gates said open-source software probably infringed on a wide range of intellectual property, especially when the project focuses on emulating commercial programs. Countering that claim, Torvalds said Microsoft software most likely contained some Linux intellectual property as well. Industry analysts see Microsoft and Sun as bankrolling the SCO courtroom crusade, given that both companies recently signed multimillion-dollar Unix licensing agreements with SCO. Sun notes that its unique license arrangement, agreed to by former Unix owner Novell, protects Linux users who are also Sun customers; however, one Linux programmer says the open source community will simply engineer its way around the entire infringement problem once SCO's claims are made clear.

  • "Computer Groupthink Under Fire"
    Wired News (08/05/03); Delio, Michelle

    A July House Science Committee hearing was the focus of a heated debate on the relative merits of supercomputers compared to those of grid and cluster computing configurations, which critics charge are insufficient for certain computing operations. Grids and clusters are less expensive than supercomputers, but supercomputer proponents called for more federal investment in supercomputing, given that the U.S. effort lags behind that of Japan, which currently rules the roost with the NEC Earth Simulator. "When we hear that the U.S. may be losing its lead in supercomputing, that the U.S. may be returning to a time when our top scientists didn't have access to the best machines, that our government may have too fragmented a supercomputing policy--those issues are a red flag that should concern all of us," warned House Science Committee head Rep. Sherwood Boehlert (R-N.Y.). Daniel Reed of the University of Illinois at Urbana-Champaign's National Center for Supercomputing Applications testified that supercomputers, grids, and clusters are all necessary, but no single architecture can handle all computing needs. He advised the government to formulate a solid supercomputer development strategy. The first step involves the implementation of large-scale systems using contemporary designs and technologies to provide enough resources for science, engineering, and national security research and development. The next step requires the deployment of a far-horizon R&D endeavor to develop and build systems for broader applications. True supercomputer construction projects are underway in the United States: IBM won a Department of Energy contract last November to create two supercomputers with a collective peak speed of 467 teraflops, which promises to exceed the combined power of the current 500 fastest machines.

  • "Three Minutes With Marcus Sachs"
    PCWorld.com (08/04/03); Brandt, Andrew

    Marcus Sachs, who collaborates with analysts on the development of the Homeland Security Department's Cyber Program, asserts that the program is "really, really, really prepared" to defend the United States against cyberattacks, but maintains that the program alone cannot fulfill its role without the participation of software companies. He says that these firms are obligated to inform their customers of any vulnerabilities they uncover in a timely manner, while the government has a responsibility to work with the private sector to spread awareness of security flaws. However, Sachs assures that the DHS is not authorized to force companies to comply with best practices for monitoring computer users' behavior. He explains that the DHS does not give open-source or proprietary software special consideration--its only criterion is that the software carry out its designated function. Sachs emphasizes that trustworthy rather than merely trusted software should have priority: In other words, the software should not do anything that the user does not know about. "These kinds of questions are taking on more significance because in the past year we've started to see some of the increased capabilities of software, stuff like adware and spyware," Sachs notes. He also advocates the elimination of Easter eggs in software, which he believes is long overdue. Although no loss of life has been directly linked to a cyberattack, Sachs cautions that the possibility exists, and adds that Internet fraud is an important area of concern to the DHS because some terrorist groups may be funded through online scams.

  • "Country-Coded Computer Worms May Be Ahead"
    New Scientist (08/04/03); Knight, Will

    Jonathan Wignall of the U.K. Data and Network Security Research Council used the DefCon 11 security conference to address the possibility that computer worm creators could more effectively distribute their malicious code by targeting specific countries rather than randomly assaulting Internet-linked computers. He explained that a worm could be designed to download a prepared list of Internet protocol addresses to besiege from an individual server or group of machines, a tactic that could significantly reduce the bottlenecking that stems from duplicate requests being sent to each machine. "It means it could spread very quickly," Wignall noted. "And it would also allow you to pick a country [by IP addresses] and jam up traffic in the country after the worm has spread." However, University of California, Berkeley scientist Nicholas Weaver said the speed effects of the targeted IP address approach would be restricted, while the same technique would raise the risk of detection for hackers. Weaver hypothesized another country-specific attack scenario, in which the worm is programmed to disregard computers running a certain language. He cited the recently released Migmaf trojan, which does not operate on computers that use Russian-language keyboards, as an example. Sophos' Graham Cluley added that a country-specific worm attack has limited appeal among most hackers.
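    The country targeting Wignall describes rests on the fact that IP address blocks are allocated in regional ranges, the same property defenders rely on for geo-filtering. A minimal sketch of the prefix matching involved, using made-up country codes and prefixes (real allocation data comes from the regional Internet registries such as RIPE, ARIN, and APNIC):

    ```python
    import ipaddress

    # Hypothetical country-to-prefix table; the codes and ranges here are
    # illustrative stand-ins, not actual national allocations.
    COUNTRY_PREFIXES = {
        "AA": [ipaddress.ip_network("10.0.0.0/8")],
        "BB": [ipaddress.ip_network("192.168.0.0/16")],
    }

    def country_of(addr):
        """Return the country code whose prefix list contains addr, or None."""
        ip = ipaddress.ip_address(addr)
        for country, prefixes in COUNTRY_PREFIXES.items():
            if any(ip in net for net in prefixes):
                return country
        return None
    ```

    A precomputed list of addresses keyed this way is what would let a worm confine itself to one region, and equally what lets a network operator drop traffic by country of origin.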

  • "A Brilliant Future for the Smart Home"
    TechNewsWorld (08/04/03); Koprowski, Gene J.

    Futuristic home networking technologies are reaching the commercial market, allowing users to unobtrusively integrate computing into their everyday lives. Homebuilders and household appliance vendors such as Sears are promoting smart technologies in the house, including remote control, security monitoring, and advanced control features such as voice-recognition. These components are just pieces of a vision where all the different appliances and systems in a home communicate with one another and can be managed centrally through the Web. Embedded chips would allow a refrigerator to identify what items it is holding, for instance. On their way home, people could check what they need to buy from the grocery store by tapping into their refrigerator, or they could pull up recipes and have them displayed on the kitchen countertop. New homes are increasingly being built with wired communications infrastructure, as evidenced by growing sales at Home Director, a home-networking spinoff of IBM that has funding from Cisco and Motorola. In the Netherlands, networking firm Echelon is working with local utility Continuon Netbeheer on widespread deployment of its system. TDK Systems managing director Nick Hunn says Bluetooth is being adopted widely as a home networking technology, noting that TDK Systems' Blu2i adapters have been in surprisingly high demand. Meanwhile, other firms are working on ways to use wireless applications to control home security systems and lighting; 2Wire says new technology under development will eliminate Wi-Fi cold spots by enabling the technology to transmit through walls. Structured home wiring is also gaining in popularity, led by home builders such as Centex and Pulte; 20 percent of new homes built in 2002 contained structured wiring for computer networking or entertainment uses.

  • "Eyes Off, Screen Off"
    Technology Research News (08/06/03); Patch, Kimberly

    Duke University researchers led by Angela Dalton have developed a prototype energy-efficient device that detects people's presence and can activate or deactivate a computer screen depending on whether anyone is watching or not. The FaceOff system, fashioned from commercially available components, employs a wireless motion sensor that determines presence, a WebCam that captures images of visitors, and a face detection algorithm that deduces whether they are staring at the display. Dalton explains that the system's power requirement while waiting for the motion sensor to activate is negligible, while the camera's peak consumption is between 1 and 1.5 watts. Even more energy could be saved with the incorporation of miniature cameras that use no more than 20 milliwatts. The FaceOff methodology could be used as an energy-saving technique for many mobile devices thanks to the proliferation of low-cost, low-power sensors, Dalton observes. She reports that the researchers' next challenge is to conduct a full user study to ascertain the system's most useful applications, as well as investigate ways to make the system more responsive to motion detection or the lack thereof. The Duke researchers will also outfit the FaceOff prototype with a light sensor, enabling the screen's brightness to be adjusted so even more power can be conserved. Dalton predicts that usable power-saving sensor systems could be available within two to three years.
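    The control flow described above, a near-zero-power wait on the motion sensor, a brief camera wake-up for face detection, then a decision about the screen, can be sketched as a toy state machine. The state names, the stand-in power figures, and the function itself are illustrative assumptions, not the Duke prototype's actual code:

    ```python
    # Toy sketch of the FaceOff control loop: each call advances one cycle
    # given the sensor readings and reports an approximate power draw.
    IDLE, CHECKING, SCREEN_ON = "idle", "checking", "screen_on"

    def step(state, motion_detected, face_in_frame):
        """Advance one control cycle; returns (new_state, approx_power_watts)."""
        if state == IDLE:
            # Waiting on the motion sensor costs almost nothing.
            return (CHECKING, 1.5) if motion_detected else (IDLE, 0.0)
        if state == CHECKING:
            # Camera is awake (roughly 1-1.5 W peak) while face detection runs.
            return (SCREEN_ON, 1.5) if face_in_frame else (IDLE, 0.0)
        # SCREEN_ON: blank the display again once no face is in view.
        return (SCREEN_ON, 1.5) if face_in_frame else (IDLE, 0.0)
    ```

    The energy saving comes from spending most of the time in the idle state, where only the motion sensor draws power.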

  • "Battle of the Blog"
    CNet (08/04/03); Festa, Paul

    The format that controls Web logs (blogs) is at the heart of an acrimonious dispute between Really Simple Syndication (RSS) manager and Harvard University fellow Dave Winer and those who resent his decision to freeze the RSS core and his governance over the format, and are developing an alternative to RSS. IBM developer Sam Ruby is advocating the RSS alternative, which would reportedly define syndication as well as publishing, editing, and archiving, and probably be controlled by the Internet Engineering Task Force. Winer counters that large vendors such as IBM "want to make [RSS] complicated so they can charge hundreds of thousands of dollars to implement it for you." Ruby contends that the alternative technology would retain the core but eliminate the uncertainty and acrimony that has characterized RSS. In fact, Ruby says the syndication format would be simplified even further through the removal of multiple versions and elucidation of the rules for adding extensions. Furthermore, Ruby suggests that the project seeks to build a standard that consolidates rival blog formats. Executive director of Harvard's Berkman Center John Palfrey argues that the transfer of the RSS format from UserLand Software to the Berkman Center is an indication of Winer's desire to remove himself as RSS controller, and adds that the debate has been improperly focused on Winer's working relationships with people and organizations rather than on the technology's merits.

  • "Life Imitates Art"
    Melbourne Age (07/31/03); Williams, Fiona

    Dean Economou of the Commonwealth Scientific and Industrial Research Organization's Center for Networking Technologies for the Information Economy comments that sci-fi movies not only offer a window onto future technologies, but exert an influence on current research. "[The films] mean people have a vocabulary about the future and you find a lot of the young researchers were very inspired by 2001, Star Trek, Blade Runner or The Matrix," he observes. "In a very real way, the technologists are inspired by the sci-fi people and the sci-fi people are similarly inspired by the technologists." Research developments have prompted filmmakers to strive for greater scientific accuracy in sci-fi movies, and scientists are often sought as technical consultants on such films. MIT Media Lab researcher John Underkoffler, among others, was tapped by the makers of the 2002 release Minority Report to extrapolate mid 21st-century technology from today's innovations; future technologies highlighted in the film that are currently under development include a gestural recognition interface and updateable electronic newspapers. Other emergent technologies reflected or envisioned in sci-fi films include artificial intelligence and robotics (as seen in the Terminator and Star Wars movies), cloning (The 6th Day), virtual reality and holography (Star Trek), and brain implants (Total Recall). Philip K. Dick, the sci-fi author whose work inspired Blade Runner, Total Recall, and Minority Report, speculated about retina scanning and fingerprint recognition roughly 50 years ago, according to Chris Winton of Biometrics Australia. Biometric technologies in use today include a network of cameras in London that can identify offenders via facial recognition software, and fingerprint scanners attached to consumer products. However, Winton notes that Hollywood often portrays technologies such as biometrics in a sinister light, which may hinder their adoption and public acceptance.

  • "'People Want Simple Home Networking'"
    Financial Times (08/04/03) P. 6; Cole, George

    A household featuring an array of networked devices that seamlessly share digital content--music, video, pictures, etc.--is the goal of the Digital Home Working Group (DHWG), an industry alliance dedicated to making future networking products compatible. "Our aim is to use existing open industry standards and provide guidelines on how companies can implement them to improve interoperability," explains DHWG board of directors chairman Scott Smyers. Eddy Odijk of leading DHWG member Philips Consumer Electronics argues that the opportunity is ripe thanks to the penetration of digital appliances and broadband in the home, not to mention the rising affordability of embedding wireless technology in devices. But Ian Fogg of Jupiter Research notes that the alliance faces a number of formidable challenges, one being the difficulty in coaxing PC devices to interoperate with consumer electronics products. Meanwhile, the DHWG does not yet have a strategy to authorize a unified digital rights management system, which is used to shield content from unlawful duplication and distribution. The organization could run into more difficulty with content distributors, especially in light of the headaches the entertainment industry already faces with unauthorized file-sharing and downloading. Smyers, who expects the first DHWG-compliant devices to debut in the second half of 2004, believes that everyone will benefit if the alliance succeeds in its mission. He asserts that hardware producers will be able to build compatible products, while "content developers will gain new revenue streams and consumers will be able to use content more flexibly."

  • "Tech Future for Women Starts Young"
    Toronto Globe and Mail (07/31/03); Sayiner, Marcie

    Despite studies indicating that more women than men are going online and assertions from female tech professionals that gender has little to do with their struggle to rise in the industry, IN CONTEXT Managing Partner Marcie Sayiner foresees a shortage of IT women. She cites a Statistics Canada report estimating that the portion of women in the computer and telecommunications industries was about 33 percent in 2002, compared to roughly 38 percent in the early 1990s. Sayiner does, however, see a possible solution to the projected shortage through programs that support next-generation female IT professionals by nurturing an interest in information technology at an early age. Simon Fraser University runs a summer program for girls in sixth and seventh grades that trains them in IT savvy and inventiveness through team-coordinated multimedia projects. Through two summer sessions, participants are "given the opportunity to tell their own digital stories using video, sound and interactive technology," explains SFU Surrey researcher Cindy Poremba. "The focus is on creativity, teamwork, and above all, having fun." Projects are developed with an emphasis on safe Web surfing and technology careers, while more fun is added to the mix with recreational activities. Sayiner is hopeful that programs such as these, coupled with a new generation of women growing up with computers, will narrow the IT gender gap.

  • "Robotics to Play Major Role in Future Warfighting"
    United States Joint Forces Command (07/29/03); Schafer, Ron

    A study being conducted by the U.S. Joint Forces Command's Project Alpha think tank suggests that integrated autonomous machines may become a standard component of battlefield tactics by 2025. Project Alpha organized a Johns Hopkins University workshop where artificial intelligence and robotics experts convened to develop a basic outline for robot battlefield deployment and make the Defense Department more aware of robot technology's possible war applications. The think tank's Unmanned Effects Leader Gordon Johnson explains that one objective of the study is to promote the creation of a DoD office that would oversee the implementation of robotics technology throughout the entire military, rather than have separate divisions risk incompatibility by developing isolated robotics programs. The study is also supposed to articulate the advantages of tactical autonomous combatants, such as their ability to operate in multiple environments, weather tough environmental conditions, and function in areas saturated with radioactivity as well as chemical and biological contaminants. Johnson also points out that robots offer greater mobility, can react faster and cost less than humans, and possess unmatched sensing capabilities. Project Alpha director Russ Richards says that battlefield robots will probably not be humanoid, but will instead be configured into shapes designed to "optimize their use for the roles and missions they will perform." He adds that the biggest challenge to widescale acceptance of robots in the military is likely to be changing cultural attitudes that favor the human element, although he notes that autonomous machines such as "smart" weapons are already replacing humans in certain capacities. Richards warns that the United States should not fall behind other countries' efforts to incorporate robots into their armed forces.

  • "Interview With Brian Kernighan"
    Linux Journal (07/29/03); Dolya, Alexsey

    Former Bell Labs programmer and Princeton University computer science professor Brian Kernighan muses that computing's biggest headache is the difficulty of using and programming computers. He argues that the last half-century of progress has not reduced the problem's magnitude, and he expects it will still be a major disadvantage 50 more years down the line. Kernighan says that programming languages will become more expressive, but this will not be plain to see because people will be conducting even more complex operations than they are today. Mechanization is the area where Kernighan foresees the most improvement, and he predicts that language levels will maintain their upward progress as languages become more declarative and efficiency becomes less of a priority for computations. Kernighan says he cannot make any definite forecasts on whether computers will become easier to use, given that progress in this area has been less than stellar over the last 10 to 15 years. The Princeton professor laments that many of the fun elements of working with computers are gone, and he characterizes current systems as overly complicated and unrewarding for the most part. Kernighan notes that UNIX systems are very reliable, but adds that he favors no one UNIX operating system because he is a casual programmer--there is little to distinguish Linux from BSD and other OSes, as far as he is concerned. Kernighan's feelings toward Microsoft are mixed: Microsoft has helped boost computing's penetration with the deployment of a unified environment, but Kernighan criticizes the operating system's propensity to crash, and the difficulty in using and programming the Windows environment. Kernighan is especially enthusiastic about the C language, which he describes as "perhaps the best balance of expressiveness and efficiency that has ever been seen in programming languages."

  • "Random Numbers Hit and Miss"
    Nature (07/29/03); Ball, Philip

    Stephan Martens of the Abdus Salam International Center for Theoretical Physics and Heiko Bauke of Otto von Guericke University have found the root cause of the difficulty computers have in generating random numbers, which is a key process in scientific computer simulation. Nearly all random-number generators spawn a sequence of seemingly arbitrary numbers by calculating each number based on a subset of previous numbers. As with a coin toss, the 1s and 0s that make up each sequence are supposed to each appear with 50 percent likelihood, and the sequence is expected not to repeat itself during the number of iterations generated in an average simulation. However, more powerful computers will be tasked with performing bigger simulations, and the randomness of number sequences will start to break down. Martens and Bauke prove that in a seemingly random series of 1s and 0s, a typical algorithm inserts a bias because of its tendency to group zeros together. Avoiding zeros entirely is the solution the researchers prescribe, but this requires a more time-consuming algorithm. Martens cautions that "In times of high precision there is no place for bad random-number generators."
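    A toy illustration of the kind of pattern bias at issue (this is a deliberately crude recurrence, not Martens and Bauke's analysis): in a bit stream produced by the linear rule b[i] = b[i-1] XOR b[i-2], two consecutive zeros can never occur from a nonzero seed, so the four possible two-bit patterns are nowhere near equally likely:

    ```python
    from collections import Counter

    def lfsr_bits(n, seed=(1, 0)):
        """Generate n bits from the toy recurrence b[i] = b[i-1] XOR b[i-2]."""
        bits = list(seed)
        while len(bits) < n:
            bits.append(bits[-1] ^ bits[-2])
        return bits[:n]

    def pair_counts(bits):
        """Count overlapping two-bit patterns in the stream."""
        return Counter(f"{a}{b}" for a, b in zip(bits, bits[1:]))

    counts = pair_counts(lfsr_bits(30_000))
    # Truly random bits would show each pattern about 25 percent of the
    # time; this stream never produces "00" at all, because a zero pair
    # would force every subsequent (and preceding) bit to zero.
    ```

    Frequency tests of this sort, applied over far longer sequences, are how such structural defects in real generators show up once simulations grow large enough.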

  • "Critics Tell Congress ICANN Needs Reforms"
    IDG News Service (08/01/03); Gross, Grant

    At the recent Senate subcommittee meeting on ICANN, critics raised concerns about the organization--specifically in relation to the transparency of ICANN's business transactions, ICANN's involvement of normal Internet users in rulings, and its decision to eliminate at-large board members--as well as general accountability issues. ENom CEO Paul Stahura discussed concerns about ICANN's management of the proposed wait-listing service (WLS) and whether it would harm domain registrar competition. ICANN CEO Paul Twomey responded that ICANN should not be involved in blocking development in domain name registering that vendors drive. Alan Davidson of the Center for Democracy and Technology mentioned concerns about ICANN's ruling last year to cut at-large members from its board, citing worries that representation of community and noncommercial entities is inadequate, though he supported continuation of ICANN's memorandum of understanding with the Department of Commerce (DOC) for a year. Sen. Conrad Burns (R-Mont.) requested that the DOC's National Telecommunications and Information Administration provide the subcommittee with recommendations concerning ICANN's future by late August. The DOC has yet to rule on whether or not to extend the memorandum of understanding that allows ICANN to control the U.S. DNS. Burns noted, "I am particularly concerned that the lack of accountability for this quasi-governmental organization poses serious dangers for American national security."

  • "The IT Productivity Gap"
    Optimize (07/03) No. 21, P. 26; Brynjolfsson, Erik

    Statistical research suggests a definite link between a company's general productivity and its IT capital per worker, although company performance varies considerably, according to Erik Brynjolfsson of the MIT Sloan School of Management. IT intensity does not guarantee comparable productivity returns--the IT investment must be complemented with new strategies and business and organizational restructuring. Corporate investment in intangible assets such as human capital and business processes is higher--and more highly valued--than IT investments. Data that Brynjolfsson's team culled from 1,167 large American companies indicates that IT investment consistently corresponds to changes in work practices and in how labor's performance is gauged, directed, and documented; the real productivity difficulties stem from staff's inability to deal with an increase in information flow resulting from computerization. Brynjolfsson writes, "In the information economy, the scarce resource is not information, but the capacity of humans to process that information." Decision-making that especially lends itself to automation includes simple protocols tied to individual transactions or other operational work, while more intricate and intellectually intensive tasks are notoriously hard to computerize. Brynjolfsson notes that several trends are emerging out of the shift to an information economy: Proficient, educated workers, managers, and professionals are valued more, and there is a growing need for these employees to be adept in noncognitive "people skills." Brynjolfsson's team has devised a formula that companies can follow to attain higher productivity levels than their rivals through the combination of six business practices--computerization of routine processes; a highly skilled workforce; more decentralized decision making; improved vertical and lateral information flow; a strong incentive system; and assigning greater importance to training and recruitment.

  • "Future Tech: 20 Hot Technologies to Watch"
    PC Magazine (07/03) Vol. 22, No. 12, P. 77; Metz, Cade; Howard, Bill; Levin, Carol

    Technologies that promise to fundamentally change computing are in various stages of development: Autonomous driving is far from reality, but technologies that could facilitate the development of self-driving cars are converging, such as assistive vision systems, global-positioning receivers, and digitized maps; experts caution that the systems will need to be more accurate and enhanced with multiple redundancy to be viable. The organic light-emitting diode (OLED) market is expected to reach more than $3 billion by 2010, boosted by the technology's energy efficiency, ease of fabrication, and flexibility--this last property makes spray-on electronics workable, clearing the way for practically limitless OLED applications. Silicon photonics promises to accelerate the speed of the Internet backbone once the integrated optical circuit emerges, while small, potentially disposable data delivery devices may be on the horizon thanks to Microsoft's Smart Personal Objects Technology initiative. Mesh networks--decentralized, wirelessly connected sensor networks that can adjust to changing environments and boast greater reliability than other networks--offer tremendous potential for enhanced driving, entertainment, and battlefield operations; grid computing, whereby users can tap into a vast well of unused processing power like any other utility, is being employed by drug companies, financial services, and engineering and chemical research projects; and radio frequency identification tags, which are currently being used to automatically pay tolls and track grocery items, promise to revolutionize supply chain and inventory management in the short term before being embedded in all kinds of objects, assuming privacy implications do not complicate matters. Quantum computers promise to break previously unbreakable code, so the only real solution is quantum cryptography, which uses the quirky behavior of quantum physics to ensure that data is protected.
    Gaming is poised to become even more social and communal thanks to more interactive gaming technology, improved connectivity and speed, and the incorporation of voice chat and voice command. Text-mining software, which seeks to extract meaning from raw text, promises to significantly augment security and intelligence operations, while the growing problem of e-waste is prompting researchers to develop economically feasible e-waste recycling solutions by focusing on optimal reverse production systems. Finally, cognitive computing projects aim to make robots capable of learning and thinking for themselves so they can interact with humans more naturally and navigate unfamiliar surroundings.
