Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 748:  Monday, January 31, 2005

  • "Teaching Computers to Reason Is Next Big Test"
    Associated Press (01/31/05); Hill, Michael

    Rensselaer Polytechnic Institute professors Selmer Bringsjord and Konstantine Arkoudas have won a year-long contract from the Defense Advanced Research Projects Agency (DARPA) to develop a computer capable of learning by reading. For machines to read successfully, sentences must be rendered as formal logic equations or other computer-comprehensible formats, and the Rensselaer professors want to instill this ability--as well as the ability to reason--through algorithms incorporated into their "Poised-for-Learning" machine. Bringsjord expects artificial intelligence to evolve to the point where computers can read military plans or manuals and adapt on the spur of the moment in mid-battle, a capability today's systems lack. He foresees future AI robots that receive data input in real time, either by reading or by listening to spoken instructions. The approximately $400,000 DARPA grant could be extended to a three-year, $1.2 million grant, and Bringsjord aims to have a Poised-for-Learning system that can read basic texts ready within three years. DARPA's Jan Walker says the grant is part of the agency's overall program to develop more advanced cognitive systems. Ronald Brachman, director of DARPA's Information Processing Technology Office, envisions a "computer-permeated future" in which systems will need to improve themselves over time by recalling previously accumulated knowledge. Meanwhile, Cycorp's Cyc project aims to build a repository of human knowledge capable of making intelligent decisions through reasoning.

  • "Graduate Cryptographers Unlock Code of 'Thiefproof' Car Key"
    New York Times (01/29/05) P. A10; Schwartz, John

    A team of Johns Hopkins University graduate researchers has cracked Texas Instruments' "immobilizer" antitheft system used in millions of cars, a feat that could have serious implications for the security of many other deployments of radio frequency identification technology. A transponder chip in the owner's ignition key receives and encrypts a random number from a tag reader in the vehicle before ignition, and the car will not start unless the transponder sends the new number back. To beat the system, the researchers determined, a potential thief needs only to get within inches of a key and send numbers to it with his own tag reader. The key calculates "answers" using its secret numerical code; the thief's computer then searches a table of all possible secret codes for the one that generates the same answers as the key, allowing the key to be cloned. TI itself is an unwitting accomplice in this decoding scheme: The company sells tag readers, and a general diagram of the system, taken from a technical presentation by the company's German branch, was posted on the Internet. Team leader and computer science professor Aviel Rubin says there are numerous scenarios in which a thief can get close enough to read an ignition key's code--in a crowded elevator, for example. The TI chips are also embedded in Speedpass tokens that allow motorists to pay for gas without using credit cards, and these are just as vulnerable to cloning, according to the researchers. Rubin says wrapping the chips in tinfoil could thwart duplication. The paper detailing the researchers' work does not provide enough information for the attack to be replicated, but UC Berkeley professor David Wagner warns that "The white hats don't have a monopoly on cryptographic expertise."
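    The table-lookup attack described above can be sketched in miniature. The toy Python script below is purely illustrative: `respond` is a stand-in hash, not TI's actual 40-bit cipher, and the key space is shrunk to 16 bits so the exhaustive search finishes instantly. It shows how a handful of recorded challenge-response pairs suffices to recover the secret:

```python
import hashlib

def respond(secret: int, challenge: int) -> int:
    """Toy stand-in for the transponder's cipher (NOT TI's real algorithm)."""
    digest = hashlib.sha256(f"{secret}:{challenge}".encode()).digest()
    return int.from_bytes(digest[:4], "big")

KEY_SPACE = 2 ** 16  # shrunken for illustration; the real attack targeted a 40-bit key

def crack(challenge_response_pairs):
    """Search every possible secret for one consistent with the observed pairs."""
    for guess in range(KEY_SPACE):
        if all(respond(guess, c) == r for c, r in challenge_response_pairs):
            return guess  # secret recovered: the key can now be cloned
    return None

# An attacker near the victim's key sends chosen challenges and records the answers.
victim_secret = 0x2B1D
pairs = [(c, respond(victim_secret, c)) for c in (1234, 5678)]
assert crack(pairs) == victim_secret
```

    With a 40-bit key the same search is far larger but still feasible with precomputed tables, which is exactly what made the immobilizer scheme vulnerable.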

  • "Executives Prescribe More Technology Use to Boost U.S. Health"
    Investor's Business Daily (01/31/05) P. A4; Bonasia, J.

    Health care providers need more technology in order to improve patient care and reduce costs, but building a national health network will require hundreds of millions of dollars and agreement on standards. Attendees of the World Health Care Congress, a three-day gathering in Washington, D.C., that ends Tuesday, cited technology as the most important issue facing their industry, according to a poll taken by Capgemini. Just two weeks before the event, eight of the largest IT vendors jointly proposed a national health network to national health information technology head Dr. David Brailer, whom President Bush appointed last year to push such a plan; members of the Interoperability Consortium include IBM, Microsoft, Oracle, Accenture, Cisco, Hewlett-Packard, and Computer Sciences. Bush has repeatedly highlighted the need for modernization in the health care industry, and the Department of Health and Human Services estimates a national health data program could yield $140 billion in annual savings when completed--10 percent of total U.S. health care spending. Data-driven systems would improve care delivery by suggesting treatments based on analytical data, speeding registration, eliminating clerical errors, and preventing dangerous drug combinations. Some hospital groups have already set up systems that share patient records, such as a collaboration among competing Indiana hospitals to share records on emergency room entrants. Linked patient records systems would eliminate redundant storage in some cases and would let doctors easily pull up test results instead of having to order new ones. Helping the trend toward automation and efficiency, Medicare and some private insurers have begun shifting to pay-for-performance plans in which health care providers are compensated for results, not the amount of care given.

  • "Taking Full Control of Distributed Applications"
    IST Results (01/31/05)

    The IST-funded GeneSys project has developed a middleware framework for generic supervision of distributed systems and applications, eliminating the need to apply costly, proprietary, and excessively complex solutions, according to project coordinator Jean-Eric Bohdanowicz of EADS Space Transportation. GeneSys earned a Simulation Interoperability Standards Organization Euro-SIW 2003 Award for distributed systems simulation, and the second prototype of the architecture was recently submitted to the World Wide Web Consortium in the hope that it could become a standard. Bohdanowicz says GeneSys is capable of supervising all three layers--system, network, and applications--while the second prototype uses a communication core and agents to sustain the architecture's openness and generic character. Bohdanowicz notes that the publicly available core resides on a server at Stuttgart University's High Performance Computing Center. He says, "[The core] can create a catalog of the services provided by its plug-ins and route communications between plug-ins on the network." Both the Stuttgart center and the Hungarian Academy of Sciences' Automation Research Institute are employing GeneSys to coordinate networks and servers, and Bohdanowicz expects organizations that collaborate remotely with teams of experts in different fields to have a great need for the software. The modular GeneSys software is available for download from the SourceForge open-source portal, and starting next year GeneSys will be commercialized for niche markets such as distributed applications by German project partner Navus, says Bohdanowicz.

  • "Encouragement, Not Gender, Key to Success in Science"
    San Francisco Chronicle (01/28/05) P. B9; Holmgren, Janet L.; Basch, Linda

    In response to Harvard President Lawrence Summers' recent assertion that the gap between men's and women's qualifications for success in science and math is innately gender-based, Janet Holmgren and Linda Basch of the National Council for Research on Women (NCRW) counter that significant progress has been made over the last 30 years or so. For instance, the percentage of female Intel Science Talent Search national finalists rose from about 25 percent in the 1970s to 45 percent in 1999, and girls captured top honors in the Science Talent Search from 1999 through 2001. However, the authors observe that there are several transitions at which women are likely to drop out of the sciences: between high school and college, between their freshman and sophomore college years, between undergraduate and graduate school, and between graduate school and their professional lives. Holmgren and Basch cite NCRW studies showing that women's upward mobility in academia is limited by discrimination and traditional academic practices, while the National Academy of Sciences estimates that women today account for less than 10 percent of full professors in the sciences. In addition, the National Science Foundation (NSF) estimates the yearly wage for women in computer and mathematical science fields to be dramatically lower than that of their male counterparts. A study published in the American Economic Review indicates that women are twice as likely as men to abandon science and engineering jobs for work in other fields. The authors' recommendations include a long-term commitment to nurturing more scientifically minded women from kindergarten on, while the NSF advises universities to put female professors on influential committees where they can act as mentors and role models. Holmgren and Basch conclude that "With an economy increasingly based on technology, and our future defined by science, we must maximize the talents of all."

  • "Real-Time HDTV Broadcast From USA to Japan Enabled by Advanced Networks"
    UCSD News (01/20/05); Ramsey, Doug

    Networking researchers at the Japan Gigabit Network 2 (JGN2) Symposium in Osaka recently watched University of California, San Diego (UCSD) professor Larry Smarr deliver his keynote address via a live HDTV transmission. Smarr, who is also director of the California Institute for Telecommunications and Information Technology, spoke from the University of Washington in Seattle about new cyberinfrastructure that uses dedicated optical links to enable new networked applications between clusters of computers and scientific instruments. His appearance on the HDTV display positioned above the podium gave a glimpse of future collaboration opportunities, as scientists and engineers become able to communicate with one another as though in the same room. Smarr noted that UCSD and the University of California, Irvine, are incorporating video-over-fiber technology in new buildings, including digital cinema and HDTV production facilities and large-format displays in public spaces. The Osaka-Seattle link, developed by the University of Washington, consisted of a server in Seattle that transmitted uncompressed high-definition audio and video signals to a client system in Osaka: The signals passed through the Pacific Northwest GigaPoP network in Seattle, traversed the Pacific via a 10 Gbps link to Tokyo, and then traveled to Osaka on the JGN2 network. University of Washington computing and communications vice president Ron Johnson said the Smarr keynote address signified a milestone for international networking research. The goal is to harness multiple wavelengths of light, called lambdas, over single optical fibers so that remote scientific instrumentation, high-definition telepresence, and other applications become persistent and ubiquitous.

  • "Computing's Silent Revolution"
    CNet (01/31/05); Becker, David

    The problem of ever-increasing heat output often forces computer designers to add cooling systems that are stronger but noisier, and an entire industry has sprung up around user demands for quieter PCs. Technical writer Mike Chin, whose Silent PC Review Web site is one of the most frequently referenced sources for people seeking solutions to PC noise, says the problem usually stems from PC designers' lack of consideration for acoustics. He says most people simply want PC noise reduced to the level of ambient sound in the room, although a small number debate the issue obsessively: "It just becomes a game for them--you take care of one thing, and then you can hear the next most annoying thing," Chin notes. Insight64 analyst Nathan Brookwood says major PC companies are not aggressively addressing acoustics because silencing PCs is much more expensive than straightforward cooling techniques. Advanced Micro Devices' Cool'n'Quiet technology, for instance, throttles down chip power when the processor is idle or carrying out simple tasks, which AMD's Jonathan Seckler says lets the processor generate less heat and thus reduces fan speed. Via Technologies, meanwhile, designs low-power motherboards and PC processors that operate without a cooling fan. Chin says undervolting, in which a PC's settings are adjusted so that the processor receives less voltage than specified and thus produces less heat, is sensible when the machine is not being pushed to its limits. "For your average user, an incremental drop in performance just doesn't matter," he argues.

  • "Home PCs Predict Hotter Earth"
    Wired News (01/31/05); Leahy, Stephen

    This week's edition of Nature details a British distributed computing experiment that harnessed the processing power of 90,000 PCs in 150 countries to simulate future climatological trends; the results indicate that average temperatures could rise by 20 degrees Fahrenheit in less than half a century due to global warming. The climateprediction.net project is modeled after the SETI@home public-computing initiative, in which volunteers download software that studies data from outer space for signs of intelligent life while their PCs are not being used. SETI@home project director David Anderson with the University of California at Berkeley's Space Sciences Laboratory co-developed the Berkeley Open Infrastructure for Network Computing so that users could get involved in numerous Internet computing projects and instruct their PCs on how much time to commit to each initiative. Anderson says distributed computing aligns well with climate simulation given the huge number of models that must run concurrently to test all climate factors, such as air movement, incoming and outgoing radiation, precipitation, etc. An idle PC participating in climateprediction.net runs a simulation and uploads the outcome to the project server after two or three months. U.S. climate researchers are forgoing global climate simulation via distributed computing in favor of highly expensive supercomputers, despite Anderson's insistence that "We can do the same thing on a shoestring with public computing." He wagers that simulation times could shrink faster with more people participating rather than with bigger, costlier systems. Whereas the Blue Gene/L supercomputer is expected to have the processing power of 400,000 PCs when it is up and running, Anderson and Oxford University physicist Myles Allen hope the ranks of the climateprediction.net project swell to 400,000 or more participants.
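    The division of labor behind such a project can be sketched simply: each combination of perturbed model parameters becomes an independent work unit dealt out to a volunteer PC. The Python fragment below is a hypothetical illustration only--the parameter names and values are invented, not climateprediction.net's actual model inputs:

```python
from itertools import product

# Invented perturbation values standing in for real climate-model parameters.
SENSITIVITY = [0.5, 1.0, 1.5]
CLOUD_FRACTION = [0.4, 0.6]
PRECIP_SCALE = [0.9, 1.1]

def make_work_units(volunteers):
    """Deal every parameter combination round-robin across participating PCs."""
    units = {v: [] for v in volunteers}
    combos = product(SENSITIVITY, CLOUD_FRACTION, PRECIP_SCALE)
    for i, combo in enumerate(combos):
        units[volunteers[i % len(volunteers)]].append(combo)
    return units

units = make_work_units(["pc1", "pc2", "pc3", "pc4"])
assert sum(len(v) for v in units.values()) == 12  # every model variant assigned once
```

    Because each variant runs independently, adding volunteers shrinks the wall-clock time for the whole sweep--the property Anderson argues makes public computing competitive with supercomputers for this workload.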

  • "Internet2 Announces Key Milestone in Its HOPI Initiative"
    HPC Wire (01/26/05)

    The Internet2 Hybrid Optical and Packet Infrastructure (HOPI) initiative has established guidelines and an outline for a national testbed infrastructure that would be used for experimenting on a next-generation Internet2 network architecture. Researchers and scientists worldwide will use the national network testbed infrastructure to explore dynamically provisioned bandwidth, circuit switched environments, new transport protocols, and other new networking technologies. HOPI nodes will be deployed in sites that allow for considerable international connectivity, such as the Pacific Northwest GigaPoP in Seattle, StarLight in Chicago, and the NYSERNet/MAN LAN location in New York City; connected to existing routers of Internet2's Abilene Network in their corresponding cities; and connected to Internet2's experimental facilities on the National Lambda Rail network. The setup will provide Abilene researchers with a connection to HOPI facilities. "With the testbed design in place, we are now poised and ready to work with our members and our international partners to bring this global experiment to fruition," says Rick Summerhill, director of network research, architecture, and technologies for Internet2 and co-chair of the HOPI Design Team. "By combining the technical expertise and leading-edge resources of our members and partners, we are confident that the HOPI initiative will yield breakthroughs for the future of high performance networking."

  • "Open Source Used for Tsunami Management System"
    Age (AU) (01/27/05); Varghese, Sam

    Sri Lankan programmers have developed an open-source disaster management system to assist in the tsunami relief effort in their country. In use at Sri Lanka's Center for National Operations, Sahana (which means peace or calm in Sinhalese) consists of an organization registry that keeps track of relief participants and contributions; a request management system that coordinates the efforts of organizations to respond to requests from camps; a camp registry that serves as a database with historical information; and a people registry that is a database of the missing, dead, and orphans, and includes pictures, fingerprints, DNA samples, and an enhanced search feature. District office volunteers can access the Web applications from laptop computers, and are able to run Sahana offline on a single PC. Other Web applications in the works include an assistance management system, a damage database, a burial registry, a logistics management system, and a housing reconstruction coordination system.

  • "The Encryption Factor"
    Independent (London) (01/26/05); Arthur, Charles

    The field of quantum computing is drawing considerable attention from governments and bankers throughout the world because of its potential, at least in theory, to effortlessly break modern encryption. Quantum computing promises to address the problem of transistor miniaturization, which is expected to reach its physical limits within the next decade or so. The technology also promises to revolutionize problem-solving by tapping the unusual nature of subatomic particles to calculate extremely fast and process amounts of data that far exceed the capabilities of today's computers. However, a quantum computer's ability to factor huge numbers into their prime components virtually instantaneously makes cracking encrypted data--whose security depends on the difficulty of that factoring--a simple matter. On the other hand, the same quantum phenomena underpin quantum cryptography, which protects encrypted communications from prying eyes using quantum entanglement. Entanglement allows two identical cryptographic keys--one for the sender and one for the receiver--to be generated, and any attempt to intercept a key in transit will collapse the system. Quantum cryptography is no longer merely theoretical: An Austrian bank carried out an actual bank transfer using quantum encryption last April. Bradford University professor Apostol Vourdas, who was recently awarded a grant of 62,000 pounds to coordinate a quantum computing project involving a network of academic and commercial participants, is convinced that quantum computing will eventually be refined for mainstream use. He says, "The computing side is just one aspect of the field. The whole field of quantum technology is growing, taking in communications, computing, and cryptography."

  • "Center for Advanced Studies Releases Policy Appliance Reference Model"
    U.S. Newswire (01/27/05)

    The Policy Appliance Reference Model (PARM) was unveiled Jan. 26 by the Center for Advanced Studies in Science and Technology Policy. PARM is part of the Global Information Society Project at the World Policy Institute, and comprises a high-level analytic framework geared toward systems architects, information managers, policy makers, technology developers and other interested groups. They will use PARM to get a shared understanding of policy development, information process change, and technical systems design in networked data systems. PARM describes an enterprise architecture for information sharing and knowledge management in connection with policy appliances, smart data, and intelligent agents. As a result, information management policies can be reconciled and enforced for security, data quality, and privacy protection across assorted data sharing systems and networks. "Interconnected systems for information sharing--whether for national security, counter-terrorism, law enforcement, routine government activities, or commercial needs--will increasingly require dynamic negotiation and agreement of terms through technical mediation to determine which policies will govern specific information as it flows between systems for particular uses," said Kim Taipale, executive director of the Center. He added that "immutable and non-repudiable user and data logs subject to transparent but confidential oversight will be necessary to meet accountability and compliance needs for both operational and civil liberties policy."

  • "Touching Your Own Future: Haptic Tools"
    InformIT (01/28/05); Rowell, Laurie

    Today's cutting-edge haptics technology is a continuation of the way humans have long used sticks and other tools as extensions of their bodies, according to "Natural Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence," by University of Edinburgh professor Andy Clark. Basic tools such as a cane provide haptic force-feedback, allowing the user to "touch" the ground; in the same way, people rely on their cell phones to extend their physical capabilities. Clark also says gaming consoles are one of the major ways humans are learning to operate their bodies in virtual environments. More conventional haptics technology includes products from Immersion and SensAble Technologies, each of which seems intent on dominating the field of tactile I/O devices. Immersion recently won an $82 million settlement from Sony for violating Immersion intellectual property with the PlayStation DualShock vibrating controller; Immersion now holds more than 240 patents. Immersion's Desktop software, available for download from the Web, works with the Logitech haptic mouse so that users can feel on-screen icons, windows, links, and other features. In the academic realm, French university scientists have created a haptic feedback device that trains medical students in delivering a baby. The University of Washington Human Interface Technology Lab has created technology that attaches virtual objects to physical card markers--users holding a set of Solar System planet cards can see 3D recreations of those planets and set them in orbit when the cards are arranged properly. To enhance the virtual value of cell phones, Finnish firm F-Origin has developed IRIS navigational software that allows users to "look through" the cell phone display to see a larger image behind it; users manipulate the phone much as they would use a handheld mirror to look behind them.

  • "Tech Is on the Way"
    CITRIS Newsletter (02/05) Vol. 4, No. 1; Shreve, Jenn

    U.C. Berkeley electrical engineering and computer sciences professor Eric Brewer is principal investigator for the Information and Communication Technology for Billions (ICT4B) program, whose mission is to study, create, test, and deploy new technology for people in impoverished and/or devastated regions. The recent tsunami catastrophe in Asia is a perfect example of the program's application: In the wake of the tragedy, ICT4B team members are being sent to Indian fishing villages to set up Wi-Fi antennas so that inhabitants can cheaply and easily access weather reports, health news, crop prices, and disaster alerts. Researchers are also testing a computer program that employs artificial intelligence to accurately diagnose diseases typical of the developing world so as to mitigate a shortage of medical personnel, while another initiative focuses on developing cheap and mobile sensors for identifying illness in the absence of hospital-lab facilities. Brewer founded the Technology and Infrastructure for Emerging Regions (TIER) group to address the need for affordable, low-power, and easy-to-use tools in developing areas. Efforts TIER is undertaking toward this vision include improving voice-operated user interfaces so that illiterate people will not be impeded by the digital divide. Brewer cites the importance of the Center for Information Technology Research in the Interest of Society's contribution to the ICT4B program through its effective blending of the technology and social science disciplines. ICT4B is funded by a $3 million grant from the National Science Foundation and receives additional support from project partners such as Intel and Microsoft.

  • "IT Hiring Inches Upward"
    Network World (01/24/05) Vol. 22, No. 3, P. 44; Mears, Jennifer

    U.S. businesses will add IT staff this year as they start new projects and seek desired skills, such as wireless networking expertise and integration. Harvard Medical School and CareGroup Healthcare System CIO John Halamka plans to hire five new networking, wireless, and security experts this year to help wireless-enable his groups' campuses. Halamka will use wireless networking for radio frequency identification, mobile VoIP, and access to patient files, and each of those applications requires different security profiles and attention to configuration. IT staffing firm Robert Half Technology (RHT) conducted a survey of more than 1,400 CIOs last year, finding that 11 percent planned to add employees in the early part of 2005, and IDC estimates an approximately 6 percent increase in IT spending this year. RHT's Katherine Spencer Lee says, "We're continuing to see improvement in the market. It's not yet what it was in the dot-com boom, but it's certainly marked improvement." Because businesses are expanding, people with networking expertise are needed, while Windows and Cisco network administration, security, storage, VoIP, and Web services are also skills of interest. Merger and acquisition activity is increasing the need for integration skills. Many companies are also taking a second look at offshore outsourcing and deciding that in-house resources are worth the expense, says Foote Partners President David Foote. Linux and security skills are still in demand, but less so than last year because there are more people available to fill these positions. RHT's survey found that the specialties generating the most growth were networking (20 percent), information security (13 percent), help desk/end user support (11 percent), and applications development (10 percent).

  • "DVD Copy Protection: Take 2"
    IEEE Spectrum (01/05) Vol. 42, No. 1, P. 38; Perry, Tekla S.

    The Advanced Access Content System (AACS) consortium is confident that its copy-protection technology will block the pirating of DVDs while permitting consumers to legally transfer optical-disc content (movies) over their home networks, play it on multiple devices, and store it on home media servers. The AACS methodology keeps the data encrypted both on the disc and in transit, with decryption performed by the players the data is sent to. AACS will use the National Institute of Standards and Technology's 128-bit Advanced Encryption Standard (AES), which is much harder to crack than the 40-bit key employed in first-generation DVDs. However, even strong ciphers such as AES can be compromised, which is why AACS is designed to maintain protection even after a player key has been cracked: Vulnerable devices are prevented from playing new releases, forcing the cracker to go back to square one. Michael Ripley with the AACS alliance's technical working group cites the relevance of media key block technology to this effort. Critics doubt that DVD infringers will be deterred by such measures, especially if AACS fails to deliver the easy-to-use in-home distribution it promises at the outset. The fact that the consortium mixes electronics and entertainment companies (such as IBM, Microsoft, Warner Bros. Studios, and Disney) is cause for skepticism, as similar efforts such as the Secure Digital Music Initiative have collapsed under conflicting industry interests. Whereas entertainment companies generally frown on copying, electronics manufacturers view copying as a key driver of product sales.
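    The media key block idea Ripley refers to can be illustrated with a toy broadcast-encryption sketch in Python. This is a drastic simplification--XOR stands in for AES-128, and real AACS uses key trees rather than one entry per device--but it shows how excluding a compromised player's key from new releases locks that player out while every other player keeps working:

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    # Toy cipher standing in for AES-128, which AACS actually uses.
    return bytes(x ^ y for x, y in zip(a, b))

def make_media_key_block(title_key, device_keys, revoked):
    """Wrap the title key once per non-revoked device (the media-key-block idea)."""
    return {dev: xor(title_key, key)
            for dev, key in device_keys.items() if dev not in revoked}

device_keys = {dev: secrets.token_bytes(16) for dev in ("player_a", "player_b")}
title_key = secrets.token_bytes(16)

# player_b's key has leaked, so new discs omit it from the media key block.
mkb = make_media_key_block(title_key, device_keys, revoked={"player_b"})

assert xor(mkb["player_a"], device_keys["player_a"]) == title_key  # still plays
assert "player_b" not in mkb  # the compromised player cannot derive the title key
```

    The cracker must then extract a key from some other, unrevoked device before new releases become playable again--the "back to square one" property described above.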

  • "Managing That Millstone"
    Software Development (01/05) Vol. 13, No. 1, P. 36; Feathers, Michael

    In his book "Working with Legacy Code," Object Mentor consultant Michael Feathers writes that changing legacy code while preventing bad changes from escaping and contaminating the rest of the software is possible through testing. "When we cover our code with tests before we change it, we're more likely to catch any mistakes that we make," he notes. The author explains that regression testing around the parts of the code where changes are going to be made serves as a protective harness to ensure that most software behavior is unchanged and that the developer knows he is modifying only what he intends to modify. However, regression testing is usually practiced at the application interface level, and Feathers calls for "finer-grained" testing. The author distinguishes large tests from unit tests, the latter of which supports rapid error localization and execution; Feathers expects to see unit tests that facilitate near-instantaneous feedback within his own lifetime. He calls dependency one of the biggest problems software developers face, and writes that putting in tests without changing code ranges from impractical to impossible. At any rate, breaking dependencies between classes is unavoidable, and Feathers recommends conservative refactoring and less concern with maintaining an aesthetic design sense. The author outlines a five-step legacy code change algorithm: Identification of change points, localization of test points, breakage of dependencies, test writing, and modification followed by refactoring.
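    Feathers' five-step algorithm can be made concrete with a small Python sketch. This is a hypothetical example, not taken from the book: a legacy billing routine whose mailer dependency is broken via constructor injection so a pinning test can surround the change point before any modification is made:

```python
from unittest import mock

class InvoiceProcessor:
    """Legacy billing code; the mailer is injected (step 3: break dependencies)."""
    def __init__(self, mailer):
        self.mailer = mailer

    def process(self, amount, tax_rate):
        total = amount * (1 + tax_rate)  # step 1: the change point we must protect
        self.mailer.send(f"Invoice total: {total:.2f}")
        return total

# Steps 2 and 4: a test point reachable without a real SMTP server, and a
# characterization test that pins the current behavior.
mailer = mock.Mock()
processor = InvoiceProcessor(mailer)
assert abs(processor.process(100.0, 0.08) - 108.0) < 1e-9
mailer.send.assert_called_once_with("Invoice total: 108.00")
# Step 5: with this harness green, modify the calculation and re-run the tests.
```

    The mock object is what makes the test fine-grained and fast, matching Feathers' preference for unit tests over large application-level regression tests.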

  • "The Many Faces of Smartphones"
    Business Communications Review (01/05) Vol. 35, No. 1, P. 50; Wexler, Joanie

    Smart phones are evolving into compact, powerful, and affordable devices with wireless network links, and the productivity and usability gains they promise certain mobile business users must be balanced by robust security and management. Companies are advised to adopt top-down strategies: opting for enterprise-class products or managed carrier services that enable structured smart phone asset tracking, security policy enforcement, and over-the-air configuration and updates; deploying mobile antivirus software on all user devices; and enforcing policies that prohibit users from listing their mobile phone numbers in the impending wireless 411 directory. In addition, analyst Bob Egan says IT departments should start clamoring for multiple-operating-system management tools and develop a knowledgeable support group that, among other things, delivers second-level support through a relationship with network service providers. Companies including Sprint, Intel, and Wavelink are focused on relieving IT of some of the converged mobility management and security headaches. Sprint is readying the industry's first enterprise-class managed mobility service, which the company's Kenny Wyatt believes will simplify top-down deployment of mobile devices and save companies money by distributing volume minutes across mobile business user services. Intel is augmenting mobile devices with a streamlined version of the Common Information Model software, while Wavelink offers Avalanche, a centralized converged mobility management system. On security policy enforcement, In-Stat/MDR analyst Neil Strother suggests that access to smart phones be severely restricted, while encrypted hard drives, biometrics, and password protection should be implemented according to the user's profile.

  • "Beating Common Sense Into Interactive Applications"
    AI Magazine (12/04) Vol. 25, No. 4, P. 63; Lieberman, Henry; Liu, Hugo; Singh, Push

    Imbuing increasingly complex computer applications with commonsense knowledge promises to expand their helpfulness, and though progress in this area has been significant over the last several years, even interface designs that only partially address commonsense issues are worth investigating. Early experiments were hampered by the narrow scope of commonsense knowledge bases and by the unreliability of inference due to ambiguity, exceptions, logical paradoxes, and so on. A group of MIT Media Lab researchers has concluded that intelligent interface agents can apply commonsense knowledge much more effectively than direct question-answering applications because they place less of a burden on the system. An interface agent attaches to a conventional interactive application, observes user interactions, and can operate the interface the same way the user does to provide assistance, suggestions, common-task automation, adaptation, and interface personalization. Because the agent operates continuously, if it cannot provide an immediate response it can gather further evidence and possibly deliver a meaningful interaction later on, as well as ask the user to fill gaps in its knowledge. The researchers have applied this scheme to a variety of early-stage prototype commonsense applications, including affective classification of text, annotation and retrieval of digital photos, video capture and editing, topic spotting in conversation, storytelling, word completion, and mapping user goals to complex actions. Each application uses common sense in a unique way, but none engages in "general-purpose" commonsense reasoning; they also avoid reverting to general logical mechanisms by supplying sufficient context about which inferences are plausible. Despite their limitations, these applications demonstrate the potential to apply commonsense knowledge in ways that at least make sense to the user.