ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 625:  Wednesday, March 31, 2004

  • "Pentagon Drops Plan to Test Internet Voting"
    Washington Post (03/31/04) P. A23; Keating, Dan

    Deep-rooted security issues have pressured the Pentagon to scrap a $22 million pilot initiative to test Internet voting for Americans stationed overseas, according to officials. An audit of the Internet voting system conducted by academics in late January concluded that the integrity of an entire election could be compromised because of inherent weaknesses in the Internet, and a few weeks later Deputy Defense Secretary Paul Wolfowitz elected to not count Internet ballots in the presidential tally, but still proceed with the experiment. The termination of the experiment altogether is a disappointment to academics and Accenture eDemocracy Services, who hoped that the trials would serve as a valuable learning experience. Because the cancellation was ordered while the online system was being tested for certification with the federal voting standards, there was no certification decision made, notes eDemocracy Services head Meg McLaughlin. The experiment would have spanned 50 counties in seven U.S. states. Meanwhile, Internet ballots are being vigorously tested in eight European nations, including England, France, Spain, Italy, and Switzerland, to boost voter participation. "In Europe, they have such a well-developed process for doing experiments in elections, which provides them with so much good information compared to the ad hoc way that we [Americans] do things," explains Thad Hall, co-author of "Point, Click and Vote: The Future of Internet Voting." Such experiments have demonstrated positive results: A test of Internet voting for French citizens living in the United States last year helped curb the declining turnout of expatriate voters in that country.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "A Better Way to Squelch Spam?"
    Technology Review (03/26/04); Johansson, Eric S.; Dawson, Keith

    Recently proposed anti-spam sender-authentication measures such as Yahoo!'s DomainKeys and Microsoft's Caller ID are unlikely to be effective spam deterrents because they are centralized solutions that chiefly benefit major ISPs and advertisers. The Camram (Campaign for real mail) project is an open-source "hybrid sender-pays" system designed to remain decentralized while imposing a computational cost on spammers and leaving legitimate email users unhindered. Camram employs proof-of-work stamps, a form of sender-pays postage consisting of a difficult mathematical puzzle whose solution is easy to confirm. This avoids the difficulties associated with money stamps, electronic micropayments that would require a centralized infrastructure for a worldwide micropayment system. The Camram project uses a "hashcash" puzzle, which feeds a seed value composed of the date, the sender's email address, and a random number into a mathematical function that performs a calculation based on the input; generating the stamps consumes a substantial amount of CPU time, which impedes spammers to the point that their operations lose profitability, but requires no centralized infrastructure. Camram's work stamps are integrated with other anti-spam measures consisting of at least three filters--a stamp filter, a smart white list, and a content filter; this scheme allows strangers with stamped email, as well as familiar correspondents, to bypass the content filter. In addition to decentralization, the Camram system boasts incremental adoptability and wide configuration coverage. So that sender-pays systems have less impact on mailing lists, such messages could be allowed to traverse the content filter or be labeled with a cryptographic stamp. The risk that spammers could generate valid stamps using hijacked "zombie" systems is lessened by the amount of computational power involved, and spammers who do resort to such tactics could be thwarted by increasing the number of bits required in a legitimate stamp.
    Click Here to View Full Article
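    The hashcash scheme Camram builds on can be sketched in a few lines: the sender searches for a counter that gives the stamp's hash a required number of leading zero bits, while the recipient verifies with a single hash. This is a minimal illustrative sketch; the stamp format and difficulty values here are simplified stand-ins, not Camram's actual format.

```python
import hashlib
import os
import time

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a hash digest."""
    n = int.from_bytes(digest, "big")
    return len(digest) * 8 - n.bit_length()

def mint_stamp(recipient: str, bits: int = 12) -> str:
    """Search for a counter that gives the stamp `bits` leading zero
    bits under SHA-1. Expected cost doubles with each extra bit, which
    is how a mail system could raise the price on abusers."""
    date = time.strftime("%y%m%d")
    rand = os.urandom(8).hex()
    counter = 0
    while True:
        stamp = f"1:{bits}:{date}:{recipient}::{rand}:{counter}"
        if leading_zero_bits(hashlib.sha1(stamp.encode()).digest()) >= bits:
            return stamp
        counter += 1

def check_stamp(stamp: str) -> bool:
    """Verification costs a single hash and needs no central server."""
    claimed_bits = int(stamp.split(":")[1])
    digest = hashlib.sha1(stamp.encode()).digest()
    return leading_zero_bits(digest) >= claimed_bits
```

    Minting at 12 bits takes a few thousand hashes, which is negligible for one message but ruinous at spam volumes; checking is one hash either way, so the asymmetry is the whole point.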

  • "U of C Takes First Victory at ACM Tournament"
    ITBusiness.ca (03/30/04); Schick, Shane

    On the eve of the Association for Computing Machinery's 2004 World Finals in Prague, a team of students from the University of Calgary won an international programming competition, the Java Challenge, which pitted them against 67 other participating teams. Teams had to write components for "Code Ruler," a game of Medieval conquest: in a typical tournament, up to six "rulers" compete by ordering their forces to capture land or the kingdoms of their rivals, and the students had to program how their forces would move and interact with those of opposing teams. Game co-designer Tim DeBoer of IBM's WebSphere Studio said teams are scored according to how efficiently each team coordinates its forces and claims land, adding that although many students are unfamiliar with Java, the competition "allows us to give them greater exposure to Eclipse as a way of working with the code." Eclipse is an IBM-led project to furnish an all-purpose software development toolset. The Calgary team earned the highest score of 30,558 points, while a team from the University of Illinois ranked second with 17,201. In a later contest, all 73 participating teams will compete to tackle a set of about eight programming problems in five hours. The Calgary team's Sonny Chan says preparation for the event involves practicing for at least five consecutive hours each week, while fellow team member Kelly Poon says the ACM contest provides a big boost to her education and makes her college classes seem easy.
    Click Here to View Full Article

  • "How E-Voting Threatens Democracy"
    Wired News (03/29/04); Zetter, Kim

    Much of the voting community's drive to address flaws in touch-screen voting systems has been fueled by the crusading efforts of Bev Harris. Harris was intrigued when an online article she read raised the concern that e-voting systems cannot verify the accurate recording of votes, making election fraud all too likely. When Harris began investigating e-voting machine miscounts, she documented 56 cases that were attributed to software flaws. Harris found that follow-up stories explaining the glitches were rare, and the few that did appear dismissed the errors as irrelevant; this reflected a growing worry among critics that election officials too often depend on vendors' word that their machines are reliable. While researching a book about voting companies and their products, Harris found a file transfer protocol site for Diebold Election Systems containing about 40,000 unsecured computer files, including e-voting machine source code, tabulation software program files, and live vote data from dozens of precincts in a 2002 California primary election, among other things. The files raised the specter of untraceable vote-rigging, and Harris brought in academic specialists such as Johns Hopkins' Avi Rubin to analyze the code; the researchers concluded that Diebold's systems were vulnerable to internal and external tampering, and lacked many basic encryption protections. Rubin's report implied that the process and standards by which e-voting systems are federally certified are seriously flawed, yet critics are unsettled by the decisions of state election officials to proceed with touch-screen rollouts despite numerous reports confirming the machines' vulnerabilities. The growing sense of public outrage over the e-voting issue has prompted legislators to take action: California Secretary of State Kevin Shelley mandated that all e-voting systems in the state must print out voter-verified paper trails by July 2006, while Sen. Rush Holt's (D-N.J.) proposal that paper trails be a national requirement has attracted 128 co-sponsors.
    Click Here to View Full Article

    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Purdue Engineers Design 'Shape-Search' for Industry Databases"
    ScienceDaily (03/31/04)

    Purdue University engineers are working on a system that will allow people to mine industry databases for engineered parts by basing searches on the components' three-dimensional shapes. Karthik Ramani of the Purdue Research and Education Center for Information Systems in Engineering (PRECISE) says such a technique could add up to millions of dollars in annual savings "by making it unnecessary to design parts anew and enabling you to mine for other knowledge, such as past decisions regarding costs and design advice about the part." The mechanical engineering professor, who also serves as PRECISE's director, says the method starts with a 3D model of a component that is broken down into small cubes, or voxels, which are converted by software algorithms into a "skeletal graph" based on "feature vectors." Users can select a part in the inventory that resembles the part they want and query the system to locate a "cluster" of similar components; users can also sketch parts from memory, or modify a part in the inventory. Users can tell the system which of the retrieved parts are the closest matches, and this relevance feedback helps the system narrow the search, according to Ramani; algorithms that employ neural network software are critical to processing relevance feedback. Ramani estimates that searching for information on parts eats up about six weeks of engineers' time a year, and he believes up to 80 percent of this time could be shaved off by the shape-search system. Experiments demonstrate that the system is up to 85 percent accurate in helping users find parts, and on average 51 percent more accurate than single-step approaches. The Purdue engineers have enlisted associate professor of psychological sciences Zygmunt Pizlo to help augment the shape-search system with data about human perception so that the system can "bridge the gap between what's in your head, your idea of what the part looks like, and what's in this huge inventory of parts," notes Ramani.
    Click Here to View Full Article
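    The retrieval loop described above reduces to nearest-neighbor search over feature vectors plus a feedback step that re-weights the query. The article credits neural-network software for the feedback processing, so the classic Rocchio-style update below is only a stand-in for illustration, and the part names and vectors are invented.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_parts(query, inventory, top_k=3):
    """Rank inventory parts by similarity of their shape feature vectors."""
    scored = sorted(inventory.items(),
                    key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [name for name, _ in scored[:top_k]]

def refine_query(query, relevant, irrelevant,
                 alpha=1.0, beta=0.75, gamma=0.25):
    """Relevance feedback: pull the query toward vectors the user marked
    as close matches and away from those marked as poor matches."""
    dim = len(query)
    def centroid(vecs):
        if not vecs:
            return [0.0] * dim
        return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
    rc, ic = centroid(relevant), centroid(irrelevant)
    return [alpha * q + beta * r - gamma * i
            for q, r, i in zip(query, rc, ic)]
```

    Each round of user feedback moves the query vector, so successive retrievals converge on the "cluster" of similar components the article describes.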

  • "CMI Launches 'Pervasive Computing' Initiative"
    The Cambridge--MIT Institute (03/24/04)

    The Cambridge-MIT Institute, a cross-Atlantic collaboration meant to boost academic and industry partnership, is launching a Pervasive Computing Community that will find ways to make computers more responsive to human needs. The proliferation and commoditization of computers over the last 40 years has not really changed basic human-computer interactions: Human users still have to learn special languages and use specialized interfaces such as keyboards to work with computers. The new Pervasive Computing Community will partner Cambridge University researchers with colleagues from the MIT Computer Science and Artificial Intelligence Lab, as well as industry partners wishing to join. The community will tackle technical barriers to making computers easier to use and interact with, including power-efficiency technology, computer vision, and novel networking schemes. Cambridge University Computing Laboratory's Simon Moore says, "We can take a 1960s supercomputer, shrink it to the size of a sugar cube and sell it for under 10 pounds, but how do we use it to make your life better? It needs to be sentient, loyal, small, and low maintenance." MIT computer science researcher Victor Zue says computer users in the future will have a nomadic lifestyle and demand instant information no matter where they are, and the Pervasive Computing Community will develop the requisite technologies to make that happen. For example, systems need to be aware of the user's end-goal so that a request for collaboration with a friend is not just handled by a desktop PC, but the task is seamlessly transferred to other computers or handheld devices as the user leaves an office and moves about. Computers also need to be able to understand human speech, gestures, and body language so that users can be freed from using limited input modes such as keyboards, mice, and traditional desktop software.
    Click Here to View Full Article

  • "Black Scholars Tackle Digital Divide Problem"
    Investor's Business Daily (03/30/04) P. A7; Riley, Sheila

    Bridging the gap between those who have computer access and those who do not is one of the goals of the nonprofit Institute for African American E-Culture, which was founded in 2001 to establish a robust online black community. "There's a need in the context of the digital divide to create a vibrant 'e-culture,'" maintains Boston University electrical and computing engineering professor Roscoe Giles, who leads the institute. He adds that providing computer access to underserved minorities is only part of the solution; equally important is giving people the power to create technology and cultivate entrepreneurism. Auburn University's Juan Gilbert points out that educating African-American kids about technology in a culturally relevant context is also essential, and to this end he has crafted software designed to teach algebra to urban black junior high school and high school students in a culturally relevant manner: Tools the software uses in this regard include animated black cartoon-character instructors and word problems about scenarios the students might connect with. "When you have a diverse population creating new technologies, you will get better solutions to problems," Gilbert asserts. Portland State University's Bryant York estimates that blacks earn less than 2 percent of computer science Ph.D.s, while last year 4,000 blacks earned bachelor's degrees in computer science out of 38,000 national recipients. Pew Charitable Trust researcher Tom Spooner notes that the line between digital haves and have-nots is not strictly racial--education and income play an even bigger role. Internet use among blacks offers a glimmer of hope: Such use has increased from 35 percent in 2000 to 50 percent by the end of 2003, while Asian and white Internet use have been stuck at the high 50 percent to low 60 percent level since 2001.

  • "Open Solution to Managing Distributed Software Developments"
    IST Results (03/30/04)

    Information Society Technologies' open source GENESIS software platform was designed to tackle the challenge of coordinating distributed teams of people working on the same project so that they are kept apprised of any developments, according to Pierluigi Ritrovato of the Center for Research in Pure and Applied Mathematics at the University of Salerno. Tools for resource management, process flow control, event notification, artifact management, automatic metric collection, collaboration, and cooperation are integrated in a complementary and minimally intrusive manner with the GENESIS platform, which also supports improvisatory changes to the process flow within the project. "We created a workflow engine and a workflow language able to accept incremental definition and 'on the fly' changes in the process," explains Ritrovato. "So new activities can be defined, different persons can be assigned to a task." Software engineering projects' various stages can be apportioned to geographically separated teams via the GENESIS platform, and both formal and informal communication schemes are supported. Software design modeling, control, and measurements, along with development and maintenance processes and communication between software engineers in different development teams, are also supported. Since its completion last November, GENESIS has been published on SourceForge.net. Ritrovato notes that thus far the open source community has made no overtures to further develop the platform.
    Click Here to View Full Article

  • "Passport Safety, Privacy Face Off"
    Wired News (03/31/04); Singel, Ryan

    The International Civil Aviation Organization (ICAO) is developing new worldwide passport standards that include digitized photos and may also feature radio-frequency identification (RFID) chips for storing and transmitting information about travelers, which a computer could compare to a mug shot database. Though advocates believe such a system will help deter passport counterfeiting and be an effective weapon against terrorism, a coalition of privacy groups including the ACLU and the Swiss Internet User Group fired off an open letter to ICAO requesting that it delay the standards' finalization until privacy issues are fully assessed: "We are concerned that the ICAO is setting a surveillance standard for the rest of the world to follow," the coalition wrote. Privacy International thinks the ICAO should have embraced a specification that would permit computers at border crossings to match travelers to digital passport photos, but not allow governments to retain a central photo database. In addition, the RFID chips would let rogue operators or unprincipled officials secretly read passports. Travel privacy advocate Edward Hasbrouck suggested that an augmented bar code that can only be read with an optical scanner would have been better. The U.S. government is eager for the new standards to be finalized; it passed two laws requiring countries in the Visa Waiver Program to issue machine-readable, biometric passports by Oct. 26 of this year, but Secretary of State Colin Powell and Department of Homeland Security chief Tom Ridge warned of "grave consequences" if the deadline is not pushed back. Lou Fintor of the State Department reported that his department and other federal agencies plan to start issuing biometric passports by next year, while the ACLU's Barry Steinhardt cautioned that such a passport would displace the driver's license as Americans' preferred form of ID.
    Click Here to View Full Article

  • "Future Search Efforts Will Make Google Look Like 8-Tracks"
    USA Today (03/31/04) P. 4B; Maney, Kevin

    Kevin Maney forecasts that within 10 years--if not sooner--the popular Google search engine in its current iteration will be as archaic as eight-track tapes, as the building blocks of far more intuitive searches start to take shape. Most search engines query the visible Web, which consists of billions of pages that are for the most part worthless, but several initiatives seek to mine the invisible Web: Google is among those attempting to implement invisible Web search; others include Technorati, which offers blog search engines, and Amazon's Search Inside the Book feature. Idealab's X1 software can search a user's hard drive, but searching both the Internet and the hard drive simultaneously is beyond the capabilities of search engines, although Microsoft's upcoming Longhorn operating system promises to do this. Localization technology from firms such as Quova adds another layer of intuition to Web searches by allowing Web sites to pinpoint users' locations, although the privacy implications will no doubt be a source of controversy. Google and others are working on personalization technology that allows search engines to tailor searches to individual users based on their needs and preferences. The Eurekster site, for example, allows users to establish an online network of friends and monitors what they click on to build profiles of their interests that can be applied to searches. All of the above technologies will eventually converge, and be coordinated by software that can comprehend what a user is typing or reading and constantly seek out related material. "A search engine of 2010 will know who you are, where you are and what you're doing, and look across every form of information to automatically find what will help you," writes Maney.
    Click Here to View Full Article

  • "Researchers Question I.T. Subcultural Values"
    NewsFactor Network (03/29/04); Martin, Mike

    Over three-quarters of IT projects are doomed to failure not because of complexity, usability, and new technology's unpredictability, as many people assume, but because an IT occupational subculture often clashes with users and managers, according to Jeffrey Stanton of Syracuse University. "Occupational subcultures are groups of individuals who, based upon their occupation, develop their own language, values and behaviors that distinguish them from other groups within an organization," Stanton observes. His conclusion that subcultural conflicts were responsible for failed IT projects is based on 18 months of research with Kathryn Stam encompassing 12 New York organizations that were implementing major tech projects, and over 100 interviews. The technical jargon IT staffers use acts as a barrier that blocks outsiders' access to knowledge, while another source of frustration is IT personnel's tendency to explain things too rapidly. "A tech problem often seems routine to the IT worker, but--as one user said--it doesn't help 'when they come in and go zoom, zoom, zoom, zip, zip, zip with a mouse and they've totally lost me, so I never learn anything,'" notes Stanton. IT workers, on the other hand, often harbor feelings that their contributions are underappreciated by management and end users. Clif Boutelle of the Society for Industrial and Organizational Psychology attests that outsiders respect IT workers who are helpful, responsive to problems, and are "good teachers," although Stam reports that many IT personnel she and Stanton interviewed expressed a reluctance to nursemaid non-IT personnel. Stanton says his research emphasizes the need for organizations to close subcultural gaps, but cautions that assigning blame is more likely to exacerbate the situation. A more effective solution is acknowledging and understanding subcultures within an organization.
    Click Here to View Full Article

  • "Advanced Speech Research Sounding Sweet to IBM"
    eWeek (03/26/04); Pallatto, John

    Increasing the power of speech applications and interactive voice response systems is the goal of advanced speech-technology research efforts at IBM, declared general manager of IBM's Pervasive Computing Group Gary Cohen during his keynote speech at the AVIOS SpeechTEK 2004 conference. The company's speech-technology research concentrates on superhuman speech, conversational biometrics, expressive output, and advanced speech tooling. Cohen said that superhuman speech would strengthen speech-recognition applications and allow them to operate "in all sorts of environments--natural environments, noisy environments--where the connection on the phone is not all that great all of the time." He explained that research into conversational biometrics could make user authentication more effective by integrating voice prints with other kinds of relevant user data, adding that the technology could reduce the likelihood of an authentication failure by a factor of 50. IBM's expressive output research initiative seeks to address the automatic and appropriate expression of emotion, not only in response to human speech but in a conversational manner. Cohen said that research into free-form dialog support would let people communicate with machines more conversationally, noting that IBM plans to incorporate this research into VoiceXML 3.0. IBM is also confident that it can construct the common class libraries and building-block assemblies to support advanced speech applications, given its expertise in building enterprise-class applications.
    Click Here to View Full Article

  • "InfiniBand Ambivalence"
    Byte and Switch (03/30/04); Jander, Mary

    InfiniBand could be seriously undermined by an alternative technology should one emerge fast enough, according to industry experts. Startup Precision I/O does not yet sell products, but claims to have a more cost-effective solution that uses standard Ethernet instead of requiring the new networking fabric InfiniBand does. Precision I/O co-founder Judy Estrin says, "Customers want network unity, not separate networks in the data center." Even an uninvested observer such as Arun Taneja of Taneja Group admits InfiniBand could be scuttled by a fast-rising competitor that offers better performance, though he says the possibility of such an alternative arising fast enough is slim. Standards groups such as the RDMA Consortium and the Internet Engineering Task Force (IETF) are investigating the use of RDMA (Remote Direct Memory Access) over IP to solve network latency problems caused by TCP/IP and operating system overhead, and RDMA-over-IP products are already in development. InfiniBand is already proven and deployed in high-performance computing environments, and the technology offers far greater speed, topping 10 Gbps, than any alternative is likely to deliver in the near future. Theoretical solutions carry little weight unless proven, and InfiniBand is set to stay a generation ahead of Ethernet for a while, says Topspin Communications' Stu Aaron. The low volumes to date may be partially because many enterprises that have deployed InfiniBand are not eager to announce it, notes Allyson Klein, Intel's representative at the InfiniBand Trade Association. Acuitive co-founder Mark Hoover says the industry has already rallied around the InfiniBand standard, though companies are slow to adopt it because of its high cost, especially in terms of managing the new technology once it is used to connect separate machines.
    Click Here to View Full Article

  • "ICANN Chief Meets Annan"
    Associated Press (03/29/04); Hawley, Chris

    ICANN President Paul Twomey met for the first time with United Nations Secretary-General Kofi Annan on March 26. Though Twomey declined to discuss the nature of the talks, the meeting was held at a time when the global Internet community has become increasingly vocal about having a say in regulating the Internet. Globally, ICANN has come under fire for being too closely aligned with the United States, and some observers are calling for the U.N. to assume responsibility for governing the Internet. The U.N. hosted a meeting last week on the topic of Internet governance, with some 200 company representatives, Internet activists, and diplomats in attendance. Many representatives of the computer industry were cool to the idea of the U.N.'s participation in Internet governance, while Deputy U.N. Secretary-General Louise Frechette called for greater cooperation at the international level in the areas of network security, spam, and privacy. Some countries are concerned that ICANN's relationship with the U.S. Department of Commerce would give the United States the power to disrupt Internet traffic to any country it wants. VeriSign's director of government relations, Michael Aisenberg, said that such fears are unwarranted, noting that a move of this sort would prompt a tremendous international outcry. "You would be such a pariah, you would have your role as a custodian ripped away from you," Aisenberg said. Meanwhile, despite ICANN efforts to broaden the international membership of its board and emphasize its independence, critics say more work is needed. Talal Abu-Ghazaleh, vice chairman of the UN Information and Communication Technologies Task Force, says, "ICANN has to be more international and it has to be more transparent."
    Click Here to View Full Article

  • "Federated Identity Standards: Confused?"
    TechNewsWorld (03/25/04); Norlin, Eric; Platt, Darren

    Real-time business relationship management is becoming more important as business itself is decentralizing and becoming more virtual, making access to distributed resources and identity management a top concern. Federated identity technology lets companies share identity information between secure networks, so as to create identity-based applications that can allow more access to cross-company information. This lets local identities and their data remain in place, but links them through higher-level mechanisms, and it can help organize controlled links among distributed user identities to make management and control more efficient. Federated identity standards began with work at security firm Securant and eventually evolved into the Security Assertion Markup Language (SAML), and the Liberty Alliance consortium formed to create an open standard for federated identity built on SAML. Meanwhile, other vendors developed the competing WS-Security specification. Existing standards and dependencies include SAML 1.0, 1.1, and 2.0 for the secure exchange of user information between security domains; Liberty Phase 1, which adds its own profiles to SAML 1.0 for account linking, global logout, and identity-provider introduction; Liberty Phase 2, which adds functionality to Phase 1; Liberty Phase 3, which is still in development; WS-Security, which defines mechanisms for security-token-based integrity and confidentiality on Web Service messages; WS-Security Extensions, which layer authentication, authorization, and policy across multiple security domains; and Oasis WSS, which is still in development. Enterprises wanting to link to preexisting accounts this year should use the Liberty Alliance specifications, but things will change when SAML 2.0 is released; SAML 2.0 should get incorporated into new products beginning in 2005.
    Click Here to View Full Article
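    The account linking that Liberty Phase 1 adds to SAML can be illustrated with pairwise opaque handles: each identity-provider/partner pair shares a pseudonym derived from a secret key, so local accounts stay put while only the link travels in assertions. Everything below, including the class name and the dict standing in for a signed SAML assertion, is a hypothetical sketch rather than any real SAML or Liberty API.

```python
import hashlib
import hmac
import secrets

class IdentityProvider:
    """Toy identity provider that links local accounts at partner sites
    through opaque pairwise handles instead of shared usernames."""

    def __init__(self):
        self._key = secrets.token_bytes(32)   # never leaves the IdP
        self._links = {}                      # (user, partner) -> handle

    def link_account(self, user: str, partner: str) -> str:
        """Derive a per-partner pseudonym; two partners see different
        handles for the same user, so they cannot correlate accounts."""
        handle = hmac.new(self._key, f"{user}|{partner}".encode(),
                          hashlib.sha256).hexdigest()[:16]
        self._links[(user, partner)] = handle
        return handle

    def assert_identity(self, user: str, partner: str) -> dict:
        """Stand-in for an authentication assertion: carries the opaque
        handle, not the local username."""
        return {"issuer": "idp.example.com",
                "subject": self._links[(user, partner)],
                "audience": partner}
```

    The design point this sketch captures is the one in the paragraph above: local identities and their data remain in place, and only controlled, higher-level links cross company boundaries.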

  • "Time to Enlist a 'National Guard' for IT?"
    Network World (03/29/04) Vol. 21, No. 13, P. 8; Greene, Tim

    Military emergency management officials, speaking at the recent Norwich University e-ProtectIT conference, said the United States is not prepared to recover quickly should a major cyberterrorism attack take place. They also say that such an attack might require government mobilization of IT professionals. Retired Army National Guard Maj. Gen. Jack D'Araujo suggested the possibility of a cyber national guard to react to attacks, noting that there is no existing official chain of command for such an organization. D'Araujo says, "We're really plowing some new ground. We flat-out aren't prepared to deal with it." Former National Computer Security Center director Patrick Gallagher said that IT community members do know what to do during a cyberattack, but they lack leadership. Gallagher says that "we have network groups who can and do talk to each other and speak a similar language and have the same training. What we need is the leadership to pull that together." Qovia vice president Pierce Reid pointed out that since no cyberdisaster has yet taken place, it is not known what will be required or how fast damage can be fixed. The Cyber Security Early Warning Task Force recently issued a report urging the creation of an early-warning network and a CERT-run national crisis coordination center to collect attack information and issue warnings. Information-sharing systems already exist, but they do not have official powers, D'Araujo said, and many companies are reluctant to share information. Norwich CIO Phil Sussman, who led a seminar on network security, says even minor attacks "will shake confidence in the network itself with a series of things people expected but are no longer there." Former U.S. Marine Corps Commandant Gen. Alfred Gray says IT professionals must get "street-wise" and examine their systems the way attackers do to look for cracks and seams in their operations.
    Click Here to View Full Article

  • "Computer, Heal Thyself"
    Federal Computer Week (03/29/04) Vol. 18, No. 8, P. 42; Moore, John

    Computers that can self-configure, self-repair, and self-optimize are highly desirable for organizations that implement information grids and other highly distributed computing models, while autonomic computing's promised benefits to others include more reliable and resilient machines that require less hands-on maintenance. "It addresses the out-of-control costs of doing basic monitoring of operations and maintenance of IT systems," says Ric Telford of IBM's Autonomic Computing effort. Autonomic computing has attracted the most interest from scientific and technical government entities such as NASA, the Energy Department, and the Defense Advanced Research Projects Agency (DARPA), which often undertake projects that require distributed data analysis; vendors pursuing the technology besides IBM include Sun Microsystems, Hewlett-Packard, and specialty firms such as Stottler Henke Associates. Some officials believe autonomic computing can provide augmented security, and DARPA has created the Self-Regenerative Systems program for such a purpose--namely, the development of systems capable of automatic response to cyberattacks. Other projects, such as Sun's N1Grid, aim to manage multiple machines as if they were a single computer, notes Dennis Govoni of Sun's government division. Peter Hughes of NASA Goddard Space Flight Center's Information Systems Division reports that autonomic computing could find its way into NASA projects such as the Mission Services Evolution Center, which will supply a unified framework for ground and flight systems. The IRS, meanwhile, plans to use autonomic computing to cut operational costs and bolster customer service in one of the few initiatives outside of the technical computing arena. Industry and government executives think agencies should prepare for the emergence of autonomic computing by refining their IT management practices.
    Click Here to View Full Article
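    The "self-healing" behavior described above is commonly organized as a closed control loop--IBM's autonomic computing work describes a monitor-analyze-plan-execute cycle over a managed resource. The sketch below illustrates that idea in Python; all function names, metrics, and thresholds are illustrative assumptions, not any vendor's actual API:

    ```python
    # Minimal sketch of an autonomic control loop (monitor -> analyze ->
    # plan -> execute). Names and thresholds are illustrative only.

    def monitor(system):
        """Collect metrics from the managed system."""
        return {"cpu": system["cpu"]}

    def analyze(metrics, threshold=0.9):
        """Decide whether the system has drifted out of policy."""
        return metrics["cpu"] > threshold

    def plan(system):
        """Choose a corrective action; here, add capacity."""
        return {"add_workers": 1}

    def execute(system, action):
        """Apply the action; assume added capacity halves the load."""
        system["workers"] += action["add_workers"]
        system["cpu"] /= 2
        return system

    def autonomic_step(system):
        """One pass of the self-management loop."""
        metrics = monitor(system)
        if analyze(metrics):
            system = execute(system, plan(system))
        return system

    overloaded = {"cpu": 0.95, "workers": 2}
    print(autonomic_step(overloaded))
    ```

    A real autonomic manager would run this loop continuously against live telemetry and policy knowledge, but the same monitor/analyze/plan/execute structure applies.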

  • "The Gentle Rise of the Machines"
    Economist Technology Quarterly (03/04) Vol. 370, No. 8366, P. 29

    A recent report from the United Nations Economic Commission for Europe concludes that the domestic robot market is poised to overtake the industrial robot market within the next several years. The projected domestic robot explosion is attributed to the falling cost of computer power, which enables programmers to craft more advanced software that boosts robot intelligence. Other contributing factors include the plunging costs of camera and sensor chips, and accelerating research into robotic vision, control systems, and communications. However, the broad penetration of robot technology into the household has been promised for a long time--as far back as 1939, when a whimsical humanoid machine was showcased at the New York World's Fair. Domestic robots may manifest themselves as toys whose functionality could be expanded when they are plugged into a network: For example, a wirelessly networked plaything could be used to remotely monitor homes or water plants. One of the long-cherished applications for robots is as automated assistants and nurses for the steadily growing elderly population. Joe Engelberger, creator of the first industrial robot, says the success of domestic robots hinges on their reliability and a clear return on investment. Still, robots may be more ubiquitous than most people think, if one moves away from the traditional humanoid robot concepts reinforced by scientists and movies: ATMs, autonomous trains, and mail sorters are just a few of the many devices out there that technically qualify as robots.
    Click Here to View Full Article

  • "Building a Technology Portfolio"
    Military Information Technology (03/15/04) Vol. 8, No. 2; McCarter, Mickey

    A core component of the U.S. Defense Department's push toward network-centric warfare is horizontal fusion technologies that integrate data from a wide array of sources for rapid decision-making among warfighters. John Osterholz of the office of the DoD CIO says the rationale for a horizontal fusion portfolio stemmed from the attack on the Pentagon three years ago, which demonstrated that network communication between DoD offices was inefficient; the horizontal fusion project, now in its second year, was formulated as the best strategy to address the changing requirements of the DoD as emerging technologies boost warfighters' ability to access information. As long as the portfolio remains active, a Quantum Leap test will be held annually to determine which horizontal fusion projects will be fielded, extended, or terminated: U.S. troops in the Middle East were given new information tools approved by Quantum Leap in fiscal 2003, among them basic language translation services (BLTS). BLTS enables soldiers to translate foreign languages almost instantly by scanning written text into a portable computer and receiving an abstract of key words. The DoD would like to focus horizontal fusion technologies into the op-intel arena, where close collaboration between operators and intelligence analysts is critical to the effective prosecution of moving targets. At least two major programs of record have been affected by the horizontal fusion portfolio in its first year: The Navy's cooperative engagement capability (CEC) program, in which radar resources are pooled, and net-centric enterprise services (NCES). In the first instance, the portfolio has expanded CEC capacity to avoid the discarding of critical data, while in the second it serves as a testbed for enterprise services to be deployed in the Global Information Grid. Osterholz contends that horizontal fusion must succeed on three levels if the DoD is to reach its objectives: the strategic investment level, the managerial level, and the basic capability level.
    Click Here to View Full Article


