ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 8, Issue 890:  Wednesday, January 18, 2006
Be a TechNews Sponsor

  • "As Europe Tries for United Patents, Italy Moves Alone"
    New York Times (01/17/06) P. C8; Meller, Paul

    In an effort to streamline Europe's labyrinthine patent process and encourage innovation, the European Commission is attempting to reduce the cost of patent registration fees, though Italy, the European Union's fourth-largest member state, has abolished its fees altogether, raising concerns that Italy's patent system will be flooded with applications. The current system requires an applicant to file in each country in which patent protection is sought after garnering initial approval from the European Patent Office in Munich, a costly and inefficient process that has led European leaders to pursue an EU-wide system with a single patent court and litigation process for the whole continent. Italy's move departs from the unification process, which would reduce the cost of patent protection by an estimated 60 percent. "This would still be more expensive than in the United States," said the European Commission's Grazyna Piesiewicz, because patent applications are scrutinized more closely in Europe and the process takes longer, "but it would be a huge improvement on the system at present." Years of debate over the languages patents should be translated into have stymied the commission's efforts to unify the process. Italy's move to drop many of its registration fees, which took effect at the beginning of the year, is intended to encourage innovation by enabling inventors to protect their intellectual property, though it could invite people to patent inventions that they have no intention of using, effectively closing off that area of research to others for the duration of the patent.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Waging War Through the Internet"
    San Francisco Chronicle (01/15/06) P. E1; Arquilla, John

    While the U.S. anti-terrorist effort over the last four years has focused on tracking al Qaeda and other terrorist organizations and thwarting their attempts to stockpile sophisticated weaponry, the threat of Internet-based attacks that could devastate communications, energy, and transportation networks has largely been ignored. Terrorists could remotely transmit a virus that takes down power to a vast swath of the country, or gain remote access to and then crash highly automated controls that run many sensitive operations. Top-level government agencies such as the Department of Defense have already come under attack, as have nuclear and waste treatment plants. One current attack threatening U.S. military and scientific networks, Titan Rain, appears to come from China, where some of the world's most skillful hackers live. The annual financial cost of settling claims resulting from cyber attacks exceeds $40 billion, roughly the same amount as the insured losses stemming from the Sept. 11, 2001, attacks. Though the consequences would not be as dire, a major cyber attack is far more likely than terrorists acquiring and using a nuclear warhead. Al Qaeda is already presumed to have dispatched at least one agent to the United States to study computer science, and the number of hackers around the world is steadily growing. The traditional focus on preventing violent attacks on domestic soil has only exacerbated the cyber threat, as it is now extremely difficult for terrorists to plan and execute a physical attack, leaving cyber terrorism a logical recourse. The United States also suffers from overconfidence about its cyber defenses, placing too much faith in firewalls and other security applications that can easily be circumvented by new or slightly altered viruses. Even in the military, strong Internet encryption is not generally employed. Moreover, the United States should be attacking the terrorists' systems with the same vengeance with which it pursues their operatives in caves and spider holes, writes John Arquilla, professor of defense analysis at the U.S. Naval Postgraduate School in Monterey.
    Click Here to View Full Article

  • "New GPL Free at Last"
    Wired News (01/16/06); Baard, Mark

    Free Software Foundation (FSF) founder Richard Stallman released a draft of GPL version 3 earlier this week, the first update to the GNU General Public License since 1991. The new version contains language prohibiting the application of GPL code to digital rights management, and imposes restrictions on the patent rights that developers can claim for their GPL-licensed programs. Curtailing digital restrictions and patent rights is consistent with the copyleft principle that guided the original version of the GPL, under which developers could release their code without worrying about a commercial entity appropriating it and imposing usage restrictions, creating an atmosphere of intellectual freedom that engendered projects such as the Linux operating system. The GPLv3 draft comes amid renewed fears of corporate encroachment into independent software development, as Stallman cites the increased use of digital rights management (referred to by the FSF as handcuffware), and a growing number of software patents. The FSF's Eben Moglen notes that the draft also contains enforcement provisions and language concerning remote services. The draft will be debated over the next year, though Stallman believes that it could not be more timely, given the patent crush that threatens the open-source software community. Microsoft, for instance, was recently awarded a patent for the FAT file system format, which could end the distribution of Linux if Microsoft opts to collect royalties.
    Click Here to View Full Article

  • "Repetez, en Anglais, S'il Vous Plait"
    Technology Review (01/18/06); Greene, Kate

    Despite its considerable progress over the last 20 years, computerized translation software still produces many embarrassing errors. Software developers have recently been looking beyond electronic documents and static Web pages to more real-time translation applications. While the software continues to improve, it still cannot handle the demands of business or military situations, largely because it is rule-based, meaning that it must sort through thousands of grammatical rules and exceptions, often faltering when rules conflict, according to Kevin Knight of the University of Southern California's Information Sciences Institute. Knight's approach relies on the probability of matching words and phrases across languages, rather than on hand-coded rules. Taking a statistical approach, Knight's technique draws on a vast repository of already translated documents to find probable matches. The statistical translation software is more accurate than conventional techniques, and it is self-improving, as it continually absorbs new translated documents into its base of reference. "A few years ago, for our Chinese and Arabic languages, all we could get was the basic topic of what an article was about," said Knight. "Now, the resolution is at the sentence level." Using the same concept, though drawing on the whole Internet for translated examples, Google edged out the Information Sciences Institute in a DARPA-sponsored translation contest last August. DARPA has also unveiled the Linguistic Data Consortium program to acquire vast numbers of translated documents to be widely distributed to accelerate the progress of machine translation, though Knight acknowledges that proper names and grammatical inconsistencies will continue to confound machine translation programs.
    Click Here to View Full Article
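    The statistical approach Knight describes can be illustrated with a toy sketch (the phrase pairs and the `translate` helper below are invented for illustration, not taken from the Information Sciences Institute's system): instead of applying grammar rules, the translator picks the target phrase most frequently paired with the source phrase in a corpus of already translated text.

```python
# Toy statistical translation: choose the translation with the highest
# relative frequency in a (tiny, hypothetical) parallel corpus.
from collections import Counter, defaultdict

# Hypothetical aligned phrase pairs mined from translated documents.
parallel_corpus = [
    ("maison", "house"), ("maison", "house"), ("maison", "home"),
    ("bleu", "blue"), ("bleu", "blue"),
]

# Count how often each source phrase maps to each target phrase.
counts = defaultdict(Counter)
for src, tgt in parallel_corpus:
    counts[src][tgt] += 1

def translate(phrase):
    """Return the most frequently observed translation of a phrase."""
    candidates = counts.get(phrase)
    if not candidates:
        return phrase  # unseen phrases (e.g. proper names) pass through
    return candidates.most_common(1)[0][0]

print(translate("maison"))  # "house" (2 of 3 observations)
print(translate("Knight"))  # unseen proper name stays untranslated
```

    A production system scores whole sentences with translation and language models rather than looking up isolated phrases, but the relative-frequency intuition above is the core of the statistical approach, and it also shows why proper names and rare constructions remain hard: they are simply absent from the corpus.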

  • "In Reversal, Silicon Valley Added Jobs in '05"
    New York Times (01/16/06) P. C7; Flynn, Laurie J.

    Job growth is the latest sign that Silicon Valley is rebounding, according to Joint Venture Silicon Valley, a nonprofit organization in San Jose, Calif. For the past two years the regional economy has been recovering, but without adding jobs. However, a newly released report from Joint Venture indicates that local companies added 2,000 jobs in 2005, the first time the region has recorded job growth in four years. "All the signs are that this will hold up," says Joint Venture CEO Russell Hancock; the modest 0.2 percent increase in jobs has brought the number of employed people in the region to 1.15 million. Joint Venture acknowledges that the figure does not compare to the dot-com era, when the region gained more than 300,000 new jobs. Hancock says a large number of the new jobs last year were high-end software positions, in both a creative and knowledge-intensive capacity. Stephen Levy, director of the Center for Continuing Study of the California Economy, says tech companies have had to persevere with fewer workers. "Production is up, sales are up, and all that can exist without job growth," says Levy.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Information Technologies Reshaping the Real Estate Landscape in Unexpected Ways"
    Penn State Live (01/13/06)

    Despite the expectation that computers and Internet listings would eventually replace the real estate agent, the number of realtors has actually increased over the last 10 years, as has the number of professionals required to handle the growing body of information involved in the transaction process. Penn State University associate professor of information sciences and technology Steve Sawyer was the lead author of a study detailing the actual impact of technology on the industry as compared to the rhetoric surrounding its projected effect. Sawyer and his team opted to study residential real estate because of the unprecedented increase in the use of technology among realtors in the last 10 years (only 2 percent of realtors used IT in 1995, compared to 97 percent in 2005), and the similar rate of economic growth in the real estate market itself. Working with the National Association of Realtors and the federal government, as well as mining academic work, the researchers conducted surveys and interviews and collected data on the ways that computing has changed the real estate profession. Information and communication technologies have provided consumers with data that had previously been inaccessible, such as listings, neighborhood demographics, and mortgage rates. Better-informed consumers have demanded a higher level of service from their realtors. Virtual tours could replace visits to physical properties, and sellers and buyers are now able to conduct transactions online. With information available nationwide, increased competition has driven down real estate commissions, and buyers can now conduct a wider search for the lender with the most favorable interest rate. Sawyer found that most uses of technology in real estate were unanticipated, and that predicting which new technology will shape the field in the future is still at best a hazy science.
    Click Here to View Full Article

  • "Congress Takes Aim at 'Analog Hole'"
    TechNewsWorld (01/17/06); Mello Jr., John P.

    A bill pending before the House of Representatives designed to close the analog hole to guard against piracy of CDs and DVDs has drawn the ire of many civil libertarians and technologists. Congress is also reviewing the broadcast flag, which guards against the distribution of television content on the Internet. While Rep. James Sensenbrenner (R.-Wisc.) referred to the broadcast flag as the counterpart of the analog hole, the Consumer Electronics Association's Michael Petricone notes that the broadcast flag is a voluntary, evolutionary development, while the analog hole is simply a strong-arm effort by content holders that has no consensus. He also notes that the vague language of the legislation could be applied to any piece of software code that could convert data from analog to digital. Hardware makers object to the precedent that the legislation would set by requiring them to prove that their products will not be used by pirates. The Electronic Frontier Foundation's Fred von Lohmann objects to the provision in the legislation that requires all devices capable of analog-to-digital conversion to be able to tag digital content originating from an analog source. "In order for that to work," he said, "they have to have the federal government require every technology to detect this mark of the beast and obey it." The two marking technologies that the bill refers to, CGMS-A and VEIL, are unlikely candidates for such universal protection, says von Lohmann, as CGMS-A has been around for years, but is largely unused, and, to his knowledge, VEIL has never been used to protect content.
    Click Here to View Full Article

  • "Hands-Free Police in Control"
    Boston Globe (01/15/06); Long, Tom

    The University of New Hampshire's Project 54 continues to develop a system that allows police officers to deliver voice commands to their cruisers. Three faculty members in the electrical and computer engineering department, along with four engineers, a technician, and about 25 students, are now working to extend the voice-activated technology to personal digital assistants, which would enable officers who have PDAs to command a background check and other tasks while they are as far as 300 feet away from their cruiser. The technology retrieves the background information while officers keep their focus on a suspect who may attempt to flee the scene. Project 54 has already developed a voice-activated system that is designed to work with cruisers that have a computer onboard. The technology, which costs about $1,000 to install, transforms the computer into a voice-based system for automatically activating the antenna, radar, and background checks, and switching radio frequency. In fact, an officer can say "pursuit," and the computer will activate the siren and blue lights, flash headlights, and relay his or her position to the dispatch center. "You can keep your hands on the wheel when you're in hot pursuit, when a split-second decision can mean the difference between life and death, particularly for a pedestrian," says Stephan Poulin, an officer with the Exeter, N.H., police department. The technology has been embraced by about 65 departments, including those in Massachusetts, Maryland, and California, and has been deployed in about 450 police cars.
    Click Here to View Full Article

  • "Mass Spying Means Gross Errors"
    Wired News (01/18/06); Granick, Jennifer

    While mass government surveillance is fast becoming the norm in the age of terrorism, there are limitations on the technology in use that could have troubling consequences for citizens, writes Jennifer Granick, executive director of the Stanford Law School Center for Internet and Society. The Communications Assistance for Law Enforcement Act (CALEA) mandated that phone companies augment their networks with mass surveillance capabilities, though the CALEA-directed spying initiative collects data at a far greater pace than the Justice Department can issue warrants, indicating that law enforcement might automate the system with voice recognition technology that calls for human monitoring when it encounters a hit. Law enforcement could also introduce facial recognition technology in airports and other public venues, funneling the information into large government databases. While the FISA law does not authorize mass surveillance without probable cause, some legal and intelligence experts believe that it should. Harvard law professor Charles Fried argues that mass surveillance is an imperative, claiming that despite its dubious legal status, human spying would be minimal in the first stages of the program. Fried also claims that the algorithms and scan techniques used to mine government databases must remain classified, and argues that a mass surveillance program will be doomed to failure if its details are publicly released. However, Granick says the effectiveness of mass spying is tenuous, as terrorists are accustomed to modifying their speech and correspondence to elude surveillance. Regardless of its success rate, mass surveillance will inevitably produce false positives, and without enough agents to follow up on the hits, such a program could misinform law enforcement and disrupt innocent citizens' lives more than it combats terrorists.
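    Granick's false-positive point is an instance of the base-rate problem, which a back-of-the-envelope calculation makes concrete. All of the numbers below are invented assumptions for illustration, not figures from the article: even a very accurate automated filter produces mostly false alarms when real targets are extremely rare.

```python
# Base-rate sketch: accuracy alone doesn't rescue mass screening
# when the thing being screened for is vanishingly rare.
population = 300_000_000        # monitored individuals (assumption)
true_targets = 3_000            # actual targets, 1 in 100,000 (assumption)
sensitivity = 0.99              # fraction of real targets flagged (assumption)
false_positive_rate = 0.001     # fraction of innocents flagged (assumption)

true_hits = true_targets * sensitivity
false_hits = (population - true_targets) * false_positive_rate
precision = true_hits / (true_hits + false_hits)

print(f"flags raised: {true_hits + false_hits:,.0f}")
print(f"fraction of flags that are real targets: {precision:.1%}")
```

    Under these hypothetical numbers, roughly 300,000 flags are raised and only about 1 percent point at real targets, which is exactly the agent-overload scenario Granick describes.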

  • "Where Now for Agent-Based Computing?"
    IST Results (01/18/06)

    The IST-funded AgentLink III program has unveiled a blueprint for the development of agent-based systems designed to inform policymakers and steer the industry toward the most relevant areas of development. Agents, or autonomous software systems, underpin many of the most important IT developments in recent years, including the semantic Web, the Grid, and ambient intelligence. DaimlerChrysler witnessed a tenfold productivity increase when it deployed an agent-based system in one of its factories. Similarly, a shipping company modeled its tankers as agents, and the resulting simulations improved the management of its shipping routes. Michael Luck, coordinator of the roadmap, believes that at this point, agent-based computing must be proven to be compatible with existing software, but that many of the impediments to adoption are cultural. "Very often, the difficulty in implementation is the need for a new way of thinking, a re-look at the way people do things." Luck warns of the diluted value of agent-based technologies as they grow in popularity and become co-opted into larger, proprietary systems. In the same vein, Luck is also concerned that despite its ubiquity, users will no longer recognize agent-based software, a trend that could effectively curb further development.
    Click Here to View Full Article

  • "Clever Car Keeps an Eye on Stray Pedestrians"
    New Scientist (01/12/06); Knight, Will

    Pedestrian-recognition technology is one of the latest developments in automobile autonomy. Volkswagen, DaimlerChrysler, and several other technology companies have teamed up to develop the Save-U system, which makes use of a network of radar sensors and visual and infrared cameras to determine that a pedestrian or cyclist is in the road ahead. With the aid of a connected computer, the system could also warn the driver or even attempt to evade the person on its own, such as by applying the brakes. "The main idea is that the sensors will recognize pedestrians and if a pedestrian has a high probability to collide with the vehicle then automatic braking will be initiated by the system," explains Marc-Michael Meinecke of Volkswagen. Tests of a prototype system in the United Kingdom reveal that the technology has the potential to save lives. Chris Wright, a traffic safety expert with Middlesex University in the United Kingdom, says future developments in automobile autonomy could include systems that automatically steer a vehicle and keep a safe distance from other drivers. Although Wright believes robot vehicles will be safer, he adds that the legal issue of liability in the event of an accident will have to be addressed.
    Click Here to View Full Article

  • "Multitasking: Attention at Half Mast"
    Globe and Mail (CAN) (01/18/06); Immen, Wallace

    Long revered as a time-saver for busy managers, multitasking is a false promise of efficiency: a recent study found that trying to email or text message while performing another task resulted in a five to 15 point temporary drop in IQ, and that attempting to do two things simultaneously reduces aptitude for both. While study leader Glenn Wilson, a psychiatrist at the University of London, admits that the reduced intelligence is only a temporary symptom that goes away when the distraction disappears, he questions the long-term impact of continually splitting workers' attention on their concentration. Workers often forget what they were doing when trying to perform multiple tasks simultaneously, which inevitably results in lost productivity as they retrace their work. Researchers Victor Gonzalez and Gloria Mark in the computer science school at the University of California at Irvine studied 36 corporate technology and finance employees as they worked and found that they were interrupted every 11 minutes on average. They found that following an interruption, it took the average worker 25 minutes to refocus on a task, leading to an estimated loss of 28 percent of a workday, or 2.1 hours, to interruptions. Researchers say the trend is only escalating as workers find themselves balancing more responsibilities and using an increasing number of gadgets. Fully completing one task is the most effective way to budget a workday, and with complex projects, workers should devise a series of action steps that can be completed in sequence, which marks progress and helps employees remember where they left off. Email and voice mail should not be constant interruptions, though they often are; workers should instead set aside one or two hours each day to address them with undivided attention.
    Click Here to View Full Article
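    The study's figures imply a simple consistency check (a worked calculation from the numbers quoted above, not code from the study): if 2.1 lost hours make up 28 percent of a workday, the implied workday is 7.5 hours, during which an interruption every 11 minutes yields roughly 40 interruptions per day.

```python
# Sanity-checking the interruption figures quoted in the article.
lost_hours = 2.1          # hours lost to interruptions per day
lost_fraction = 0.28      # stated share of the workday

workday_hours = lost_hours / lost_fraction
interruptions_per_day = workday_hours * 60 / 11  # one every 11 minutes

print(workday_hours)            # 7.5 — the implied workday length
print(round(interruptions_per_day))
```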

  • "mSpace Mobile: Exploring Support for Mobile Tasks"
    University of Southampton (ECS) (01/16/06); Wilson, Max; Russell, Alistair; Smith, Daniel A.

    In a comparison between the mSpace Mobile and Google Local Web application interfaces, mSpace Mobile performed more effectively in supporting location-based discovery and planning tasks in both stationary and moving mobile devices. Specifically, mSpace performed 30 percent faster than Google Local in the stationary scenario, and nearly 40 percent faster in the in-motion scenario. The authors attribute mSpace's superior performance to the fact that it delivers Web content outside the Web page paradigm, and thus facilitates mobility support for new and more powerful interfaces. Instead of presenting information in a page, mSpace displays information as areas of information or domains that each contain a series of associated dimensions; mSpace combines a spatial, multicolumn display with a zoomable focus+context interface. A user can choose a dimension element and raise associated information about the element, making it easy for people to view the associated contexts of any selection and quickly switch between them for comparison and contrast. Under both stationary and mobile conditions, Google Local's performance is affected by the time needed to load requested external Web pages, while in mSpace information related to a selection is sent in chunks and cached, reducing calls to the network and speeding up interface response. It is postulated from the results of the mSpace/Google Local comparison that mSpace meets three conditions of effective performance of location-based planning tasks with mobile devices: Rapid data transfer, a lowered text entry requirement, and reduced requirement for activities that require a target to be both acquired and held. The authors also show that mobility--specifically, walking--can negatively affect sequential task performance when following the traditional Web-page-as-unit paradigm.
    Click Here to View Full Article
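    The chunk-and-cache strategy credited for mSpace's responsiveness can be sketched in a few lines (the function names and data below are hypothetical, not from the mSpace codebase): once a selection's associated information has been fetched, repeat selections are served locally instead of triggering another network round trip.

```python
# Minimal sketch of caching fetched chunks to cut network calls:
# a cache miss hits the "network" once; later selections reuse the result.
network_calls = 0

def fetch_chunk(selection):
    """Stand-in for a network request returning data for one selection."""
    global network_calls
    network_calls += 1
    return f"details for {selection}"

cache = {}

def select(selection):
    """Return associated info, hitting the network only on a cache miss."""
    if selection not in cache:
        cache[selection] = fetch_chunk(selection)
    return cache[selection]

select("Jazz")        # network call
select("Jazz")        # served from cache
select("Classical")   # network call
print(network_calls)  # 2
```

    Compared with reloading a full Web page per selection, as in the Google Local comparison above, this keeps interface response bounded by local lookup time rather than network latency after the first fetch.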

  • "The Science Scare"
    National Journal (01/14/06) Vol. 38, No. 2, P. 36; Friel, Brian

    Many of America's political, institutional, and industry leaders support the argument that U.S. global economic dominance is slipping and will continue to slip unless the country improves its math and science education, churns out more engineering graduates, and substantially raises its federal research budget. According to leaders gathered at the National Summit on Competitiveness in December, federal R&D dollars have fallen from almost 1 percent of GDP in 1970 to less than 0.5 percent today, while China raised its R&D investment from 0.6 percent of GDP in 1995 to 1.2 percent in 2002; furthermore, American high school seniors' average score in general math and science ranks below the international average, and U.S. students' scores in advanced mathematics are behind those of students in 11 of 15 other industrialized countries. And finally, just 11 percent of U.S. bachelor's degrees are in the sciences and engineering, versus 23 percent in the rest of the world and 50 percent in China. Many of the summit's points reiterated arguments from a report issued by the National Academy of Sciences last October: The study found that the United States has lost its status as a net exporter of high-tech products. In addition, only three of the 10 leading recipients of U.S. patents in 2003 were American companies, more money was spent on tort litigation than on R&D, and 11 engineers in India can be bought for the price of one in the United States. But some observers are questioning the reality of the competitiveness crisis, citing the American economy's long history of resilience, the superior flexibility of the U.S. economy and workforce, and the myth of an actual engineering shortage. Education Sector co-director Andy Rotherham thinks a far more pressing issue is inadequate and underfunded education for minorities.

  • "And They Call It Robot Love"
    New Scientist (01/14/06) Vol. 189, No. 2534, P. 48; Nowak, Rachel

    In a recent interview, Mari Velonaki, an electronics enthusiast and an artist with a Ph.D., described her research on the interaction between humans and robots at the Australian Center for Field Robotics. Velonaki developed the Fish-Bird exhibit, where patrons observe the interactions of a pair of robots disguised as wheelchairs and Velonaki in turn observes the patrons as they project distinctly human qualities onto the moody and smitten robots. Based on a Greek myth of the impossible love between a fish and a bird, Velonaki chose old wheelchairs as her medium to represent physical limitations, embedding computers and processors within the upholstery. The two robots communicate wirelessly through a Bluetooth radio link, and the chairs contain cameras, infrared collision sensors, and scanning laser measurement devices to gauge their proximity to each other. Fish and Bird are programmed with seven moods--one for each day of the week--and three emotional states for how they feel about themselves and each other: not very happy, neutral, and positive. The robots have a memory, so their mood alters in response to recent events, and their interaction with human visitors is also determined by how many people are in the room, how close they are, and how much time each person spends with the robots. The robots send each other love letters and write messages to the patrons that drop out of thermal printers underneath their seats. People have a variety of reactions to the robots, with men typically inspecting them to see how they work, while children are more likely to pet them and try to coax more messages out of them. Velonaki is astounded at how closely people relate to the robots, despite the fact that they are wheelchairs.
    Click Here to View Full Article

  • "Help! I've Lost My Focus"
    Time (01/16/06) Vol. 167, No. 3, P. 72; Wallis, Claudia; Steptoe, Sonja; Cole, Wendy

    Gadgets designed to make people more efficient multitaskers more often drive them to distraction and disrupt their productivity, and efficiency experts, psychologists, and information-technology experts have been studying this problem in order to develop solutions that will restore balance. Psychiatrist Edward Hallowell says the inability to prioritize when overwhelmed with incoming messages and competing tasks leads to irritability, disorganization, distractibility, impulsive and hasty decision-making, and feelings of guilt and inadequacy. Basex estimates that workers spend about two hours a day dealing with interruptions and distractions, which costs the U.S. economy $588 billion a year in lost productivity; meanwhile, a study by Microsoft researcher Mary Czerwinski finds that interruptions at the beginning and end of a task have the most negative effects on worker performance. Research shows that surprisingly few people are bothering to take basic measures to reduce interruptions, such as turning off cell phones or waiting until a task is done to check emails. Such behavior has led experts to conclude that the compulsive use of digital devices triggers a pleasurable and addictive neurochemical sensation. Czerwinski is busy designing a smart office-communication system that determines whether an interrupting message should be sent immediately to the recipient or held up on the basis of numerous factors, such as the worker's past preferences and habits, appointments, and current projects. Another area Czerwinski is focusing on is new software products that let workers maintain tasks longer, even with the presence of onscreen interruptions; in such a system, an incoming message should contain sufficient information for the worker to ascertain whether to read it immediately or save it for later. The single most important strategy for controlling productivity is for workers to stop deluding themselves about their multitasking abilities.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)

  • "IETF Hums Along at 20"
    Network World (01/16/06) Vol. 23, No. 2, P. 1; Marsan, Carolyn Duffy

    Jan. 16, 2006 marks the 20th year of the Internet Engineering Task Force (IETF), the standards-setting entity responsible for developing many of the underlying Internet protocols, such as directory services, fundamental routing, email, and telephony protocols. The IETF is credited with the creation of Border Gateway Protocol, Open Shortest Path First, Session Initiation Protocol, Post Office Protocol, Internet Message Access Protocol, Lightweight Directory Access Protocol, MPLS, the IPSec security protocol, and IPv6, among others; these standards are ranked among the group's greatest achievements because they allow the Internet to remain functional in the face of dramatic growth and the introduction of new services. "Despite all kinds of centrifugal forces, the Internet's technology has stayed reasonably unified and coherent during the tremendous growth of the last 20 years, the enormous changes in underlying transmission technology and the era of telecommunications liberalization," reports IETF Chairman Brian Carpenter. The IETF's openness and democratic approach to standards creation are the group's best known qualities, and combining its openness with the seasoned expertise of individual participants makes for standards of higher quality, according to Cisco fellow and former IETF Chairman Harald Alvestrand. "We describe different functions that get done and principles by which they work, which is a different way to do [Internet] architecture," explains Fred Baker, another former IETF chair. The IETF has had disappointments, including its failure to generate standards quickly enough for the marketplace in areas such as firewalls, spam, and instant messaging; the group's tardiness in recognizing the importance of built-in security is perhaps its biggest mistake. Harvard University consultant Scott Bradner notes that the IETF takes a non-governmental approach to standards development, with the result that the group does not concentrate on defending existing companies or industries.
    Click Here to View Full Article

  • "Tomorrow's Technology Today"
    Software Development (01/06) Vol. 14, No. 1, P. 30; Riley, Mike

    Future breakthrough technologies could conceivably be created with tools available today. One such tool is smart system management software from IBM to enable autonomic or self-healing computer systems; IBM's Web site describes the Autonomic Computing Toolkit as "a collection of Self-Managing Autonomic Technology components, tools, scenarios and documentation that is designed for users wanting to learn, adapt and develop autonomic behavior in their products and systems." OQO's recently updated Windows XP-based model 01+ portable computer could be used to build mobile, next-generation data communications applications for power users on the move, while the book "Car PC Hacks" by Damien Stolarz offers a how-to for migrating computing from static workstations to automobiles. The BlackDog from Realm Systems combines a 256 MB or 512 MB USB thumb drive with a 400 MHz PowerPC processor, a biometric fingerprint reader, and Debian Linux; the idea is that the device could be a self-contained active desktop, and Realm is hosting a contest to find applications for the technology. The BlackDog could complement flexible, low-power displays from E Ink, which can retain an imprint of an image even after it has been deactivated. The field of PC-based robotics is making strides with inventions such as White Box Robotics' 914 PC-BOT, a machine equipped with a Web camera, a Windows-based PC, and base software that allows enthusiasts to generate their own instructions. Text-to-speech conversion toolkits from Acapela Group and AT&T are helping bring more human-like tone and inflection to TTS interfaces. Finally, VMware ushered in the virtual computing revolution with its VMware Workstation, and its release of a free VM player that can run any VMware-created virtual machine can facilitate rapid implementation of preconfigured applications in the running environment, giving developers the means to programmatically deploy virtual machines.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    [ Archives ]  [ Home ]