
ACM TechNews sponsored by Parasoft    Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 734:  Wednesday, December 22, 2004

  • "Researchers Find Way to Make Internet Video More Appealing"
    Ohio State Research News (12/16/04)

    Ohio State University computer science and engineering professor James Davis observes that viewing lectures over the Internet can be difficult for distance learners. Bandwidth limitations mean that the video is often rendered in a fuzzy, low-resolution format that loses subtleties in the lecturer's facial expressions and hand gestures. Davis and a team of students have developed software that sharpens a speaker's face and hands while slightly blurring the rest of the image, making these nuances more pronounced without adding extra bandwidth. The project is rooted in Davis' experience as an MIT Media Laboratory research assistant, where he interacted with scientists who analyzed video of speakers to draw insights into how speech and gestures are employed in communication. At Ohio State, Davis and former undergraduate student Robin Tan wrote algorithms to identify hand gestures and integrated them with other algorithms that track a speaker's face and hands; the algorithms were then added to a publicly accessible MPEG encoder and decoder. Davis and Tan reported successful initial tests in a recent edition of Computer Vision and Image Understanding: five volunteers watched augmented and unaugmented versions of three video sequences side by side and preferred the enhanced version, and because both versions used the same number of kilobytes of data per frame, transmission bandwidth requirements did not vary. Davis intends to add algorithms that can concurrently track the movements of multiple speakers, as well as selectively augment background objects referenced in a lecture. (A simple region-of-interest sketch follows this item.)
    Click Here to View Full Article
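
    The technique described above is essentially region-of-interest processing: perceptual detail is concentrated where viewers look while the rest of the frame is softened at the same bit budget. The sketch below is a minimal, hypothetical illustration in Python with OpenCV; it uses a stock face detector in place of the researchers' face- and hand-tracking algorithms, and it post-processes frames rather than working inside an MPEG encoder as their system does.

      # Hypothetical region-of-interest sketch (not the Ohio State code):
      # blur the whole frame, then restore the original pixels inside detected
      # face regions so detail stays where the viewer is likely to look.
      import cv2

      def emphasize_faces(frame, blur_kernel=(21, 21)):
          cascade = cv2.CascadeClassifier(
              cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
          out = cv2.GaussianBlur(frame, blur_kernel, 0)   # soften the background
          for (x, y, w, h) in faces:                      # keep faces sharp
              out[y:y + h, x:x + w] = frame[y:y + h, x:x + w]
          return out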

  • "Tenets of Academic Rigor Spread to Computer Games"
    Boston Globe (12/21/04); Bray, Hiawatha

    Worcester Polytechnic Institute (WPI) will launch a four-year Interactive Media and Game Development major next year. The new major focuses on computer game design and is touted as the first program of its class in the United States to require humanities as well as computer science coursework. The major will be taught by faculty from the institute's Computer Science and Humanities and Arts departments, which will share administrative duties. Courses will emphasize such subjects as narrative structure, art theory, programming and design tools, and game design's social and ethical facets. Students will also need to operate in game development teams in order to refine collaborative skills that are essential in the real-world gaming industry. Drama and theater instructor Dean O'Donnell, who already employs computer games in his courses, explains that 3D action games, stripped of their more violent elements, can be used to construct a theatrical framework. He says that creating a compelling storyline for games is just as challenging as programming, and notes the lack of narrative sophistication in current industry offerings. "We haven't seen the great works of art in gaming yet...they're more like murder mystery dinner theater," O'Donnell remarks. WPI computer science professor David Finkel says his eyes were opened to the potential of a game design major by the popularity of the school's game development clubs, whose members have custom-modified existing games or built games from scratch for the challenge and the experience; one example is MassBalance, an online game developed in conjunction with State Sen. Richard Moore that lets citizens attempt to balance the state budget.
    Click Here to View Full Article

  • "The U.S. Patent Game: How to Change It"
    Harvard Business School Working Knowledge (12/20/04); Cullen, Ann

    In their book, "Innovation and Its Discontents: How Our Broken Patent System Is Endangering Innovation and Progress," Brandeis University professor Adam Jaffe and Harvard Business School professor Josh Lerner argue that patents can be an incentive for the improvement of existing technology, provided that "the improvement embodies some idea that was not covered by the patent on the underlying technology." But the U.S. patent system employs an overly broad definition of patentability that actually discourages improvements by giving original patent owners the latitude to sue those who modify patented products for infringement, thus breeding monopolization. The authors contend that innovators with solid ideas about improving important products should be able to work out an agreement with original patent owners that permits the deployment of the improvement; one way to do this is to grant the inventor a license to use the original patent, or to let the inventor sell or license the improvement back to the original patent owner. However, original patent owners are often reluctant to enter into such agreements, and this tension is amplified by the fact that different firms may independently apply for patents on what is basically the same concept. The authors say such applicants should be awarded patents only on the strength of the unique and truly new aspects of their inventions, when in reality the patents they receive are ambiguous and cultivate uncertainty about what products they can and cannot market. Jaffe and Lerner say that retooling the patent system is a tough challenge: many lawyers, perhaps unknowingly, sustain the flawed model by resisting radical reforms so they can continue to enjoy substantial benefits from the current system, while the effects of bad patent policy are hard for consumers to see and understand. The reduced difficulty of winning patents also devalues patents for young, innovative firms, which are frequently targeted for litigation by major patent owners.
    Click Here to View Full Article

  • "'Attentive' Cubicles Help Workers Focus in Busy Offices"
    Queen's University (12/20/04)

    Attentive system technologies being developed by Queen's University's Human Media Laboratory (HML) in Canada seek to reduce distractions and refocus user concentration in order to make workplaces more efficient and attention disorders more treatable, among other things. "The computer can empower the brain to work more efficiently by aiding attentional mechanisms," remarks HML director Dr. Roel Vertegaal of Queen's School of Computing. "We're moving toward enhancing brain function by directly tapping into a person's sensory system." Vertegaal's research team is attempting to turn computers into "sociable" systems that can identify and react to the non-verbal cues people employ in group conversations. One of HML's inventions is an "attentive office cubicle" that filters out objects that might distract the user and directs workers' attention mechanisms so that their brains can focus energy where it would be most efficient. The cubicle uses walls of translucent "Privacy Glass" embedded with liquid crystals that switch from opaque to transparent when overhead cameras monitoring co-workers' "social geometry" detect potential communication partners; meanwhile, workers wear headphones that mask the noise of people in neighboring cubicles and automatically shut off their noise-cancellation feature when they detect co-workers staring at the wearer. Vertegaal says the headphones can also pause or fast-forward live conversations so that wearers can listen to two conversations simultaneously, as well as automatically query Google with key phrases from the conversation via speech recognition. The HML team is considering how eye contact sensor technology could be used to treat autistic children; one such application is eyewear that detects when a child is looking at a parent and produces sounds to reinforce that behavior.
    Click Here to View Full Article

  • "Ultrawideband Spec Battle Takes Shape"
    TechNewsWorld (12/21/04); Korzeniowski, Paul

    Vendors are duking it out to establish a de facto ultrawideband (UWB) standard because they could not reach a consensus on an IEEE-ratified standard. On one side of the battle are 170 companies supporting multiband orthogonal frequency division multiplexing (OFDM) under the aegis of the MultiBand OFDM Alliance (MBOA), while on the other side are backers of direct sequence signaling led by Freescale, a Motorola spinoff. Both approaches promise low-cost wireless connectivity at speeds ranging from 400 Mbps to possibly 1 Gbps at a maximum distance of 10 meters, and the UWB standard could eventually be used to exchange data between set-top boxes, flat-panel digital displays, digital cameras and camcorders, stereo components and speakers, wireless home-theater systems, DVD players, digital video recorders, cell phones, and PCs. But the lack of a resolution between the two warring camps could endanger UWB's chance of becoming a mainstream wireless connectivity technology. In-Stat/MDR director Joyce Putscher says the creation of a UWB standard is being impelled by video's growing role in consumer electronics and by consumers' and businesses' need for high-speed networks that deliver the bandwidth to support video; in addition, Parks Associates' Kurt Scherf comments that UWB can spare users the headache of properly deploying cabled networks of devices. The clash that led to the current UWB vendor schism stemmed from differing opinions on the potential for OFDM-enabled UWB to interfere with other radio users, the cost and complexity of OFDM-supporting chips, and the power requirements of compliant devices. Although MBOA lags behind Freescale in moving toward compliant products, its membership has surged, while Freescale has thus far garnered only limited support from third parties.
    Click Here to View Full Article

  • "Landweber on Networks, Open Source, and NSF Budget Cuts"
    Wisconsin Technology Network (12/20/04); Stitt, Jason

    Network infrastructure will one day enable vast data transfers, but it is unclear what normal users will do with this bandwidth, says University of Wisconsin-Madison computer science professor Larry Landweber. T1 connections provide megabit-per-second speeds for moving video, images, and multimedia, but the IT industry has not yet figured out what to do with 100 Mbps connections, for instance; such speeds would require high-speed wireless connections or fiber-optic lines to individual homes to bridge the "last mile" gap, says Landweber, who helped create some of the earliest international Internet links. Fiber-optic networking presents serious obstacles because current technology requires the use of mechanized mirrors or time-consuming signal transfer technology. Researchers are currently working to create router designs that could separate and direct light beams based on their color. Open-source software is a relatively new trend that harnesses developers' desire for recognition, but also is attracting corporate interest for financial reasons. With so much talk about open-source development, however, Landweber says it is worthwhile to ask whether truly new and innovative applications will come from open-source or more entrepreneurial development. Landweber also works as a National Science Foundation (NSF) advisor, and says the recent NSF budget cuts are a cause for worry because science and engineering are vital to the nation's competitiveness; IT and biotech continue to present huge opportunities for the United States, and continued investment is needed to keep a lead in those areas. Landweber observes that electronic devices are becoming more generalized, such as iPods morphing into storage systems and PDAs that have become mobile phones.
    Click Here to View Full Article

  • "Purdue Engineers Define 15 Dimensions of 'E-Work'"
    Purdue University News (12/15/04)

    In an effort to help people and machines collaborate over global networks, Purdue University industrial engineer Shimon Nof has defined 15 e-dimensions of electronic work (e-work) that help assign context to various e-work components (robots, sensors, software, hardware, databases, etc.). "You can see how to design a system because you can identify the separate components, you know what they are capable of and you can determine how they will interact with each other," Nof explains. The e-dimensions deal with such issues as how to smoothly combine the workings of computers, how to help people and computers integrate correctly, how to enable "knowbots" to perform tasks automatically, how to furnish systems that flag design conflicts, and how to guarantee the seamless flow of work in manufacturing operations. Nof notes that effective e-work is crucial for realizing the full potential of nascent and promising e-work activities, and researchers working under him at the Production, Robotics, and Integration Software for Manufacturing & Management (PRISM) Center have identified core tenets for managing e-work systems by taking advantage of the 15 e-dimensions. The e-dimensions are divided into four domains: e-work; distributed decision support; active middleware; and integration, coordination, and collaboration. The e-work domain consists of dimensions such as agents and protocols, whose elements include sensors, robots, and software for autonomous tasks; the distributed decision support dimensions include those designed to help collaborators address problems via mathematical models and simulations; the active middleware domain's dimensions include grid computing and hardware and software that let companies and institutions employ a blend of old and new computers; and the fourth domain comprises techniques for facilitating effective interaction between people, computers, and companies. (A small data-structure sketch of the four domains follows this item.)
    Click Here to View Full Article
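
    The grouping of the 15 e-dimensions into four domains is, at bottom, a taxonomy, and can be captured in a simple data structure. The sketch below is hypothetical and lists only the example elements named in the summary above; the full set of dimension names is not given here.

      # Hypothetical sketch of the four e-work domains, populated only with the
      # example elements mentioned in the summary; remaining dimensions are
      # intentionally left out rather than invented.
      E_WORK_DOMAINS = {
          "e-work": [
              "agents", "protocols",  # sensors, robots, autonomous-task software
          ],
          "distributed decision support": [
              "mathematical models", "simulations",
          ],
          "active middleware": [
              "grid computing", "mixed legacy/new hardware and software",
          ],
          "integration, coordination, and collaboration": [
              "techniques for people-computer-company interaction",
          ],
      }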

  • "Supercomputing Gets Low-End Extension"
    ServerWatch (12/20/04); Robb, Drew

    Supercomputing is no longer solely the domain of scientific laboratories; it is increasingly prevalent among commercial entities that use the machines for creating digital special effects, oil exploration, designing new car models, and running complex financial simulations. Although the hardware itself is becoming less expensive, there are also pay-as-you-go options that enable smaller outfits to rent massive computing resources. Cluster architectures first began to displace massive mainframe machines nearly a decade ago; the shift toward low-end technologies is reflected in the latest Top 500 supercomputing list, which features 296 clustered systems and 320 systems built with Intel processors. IBM Power processors are also gaining ground and are used in 54 of the top 500 systems because of their improved power management and efficiency, says Gartner analyst John Enck. The new No. 1 supercomputer is an IBM BlueGene/L system built for the Department of Energy, clocking in at 70.72 Tflops. Other top listings include the No. 2 SGI Altix system, which runs on Itanium 2 processors and the Linux operating system. Notably, the No. 7 entry is Virginia Tech's X-system, built from Apple Xserve servers that are also based on the IBM Power processor. With these changes in technology have come dramatic increases in supercomputing power: the cumulative performance of all top 500 supercomputers now stands at 1.127 Pflops, compared with 813 Tflops six months earlier.
    Click Here to View Full Article

  • "Open Systems--Mutual Understanding Without Limits"
    Russian Science News (12/17/2004)

    Universal open system technology developed by researchers at the Russian Academy of Sciences' Institute of Radio Engineering and Electronics, the All-Russian Scientific Research Institute of Standards (State Standards of Russia), and the Moscow Institute of Radio Engineering, Electronics, and Automation is expected to ultimately eliminate "mutual incomprehension" between networked computer systems running disparate software. The technology functions as a universal translator interface, employing diverse, original hardware and software to build a unified set of standards that all participating systems can understand. Project manager and professor Alexander Oleinkov notes that the Institute of Radio Engineering and Electronics has been developing the open system technology for about a decade, and the research has already yielded a system for metallurgical integrated operations. Basing an information infrastructure of any scale on open system standards and technologies makes obvious sense, as there is little point in networking systems that cannot share their data. The project is backed by the Russian Foundation for Basic Research and the Foundation for Assistance to Small Innovative Enterprises.
    Click Here to View Full Article

  • "Extras for Premiere Digital Film Production"
    IST Results (12/22/04)

    The MetaVision system, an IST Prize 2005 nominee, enables filmmakers to convert films into various formats without a downgrade in quality, as well as to produce effects beyond the scope of analog technology in post-production. "The goal was to revolutionize the way films and TV programs are captured, produced, stored and distributed," notes MetaVision project coordinator Paul Walland. At each step in the cycle between production and distribution, the system employs consistent metadata describing the footage, including the type of cameras employed, lighting conditions, production crew details, and so on. The inclusion of "intimate" metadata captured at the same time as the original footage facilitates multi-format conversion. Placing stereo cameras on either side of the central electronic film camera and synchronizing all the captured imagery produces a depth map, and this metadata can be applied to the insertion of 3D effects--virtual objects and the like--during post-production. The Material eXchange Format in which the primary content and the depth metadata are stored is envisioned as a "wrapper" into which the system incorporates video, audio, and metadata. The MetaVision project created a breakthrough digital camera that records images at the standard 24 frames per second (fps) while capturing motion within a scene at rates ranging between 72 fps and 150 fps; system overload is avoided by transferring the motion data at a lower resolution per frame, and engineers can recreate any frame rate using the closest captured frames and the motion between them. (A toy frame-interpolation sketch follows this item.) A show reel demonstrating how MetaVision can handle a wide spectrum of picture content from state-of-the-art high-resolution, high frame-rate cameras will be released in February by partners in the IST MetaCamera project.
    Click Here to View Full Article
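
    Recreating an arbitrary frame rate from the nearest captured frames and the motion between them is, in essence, motion-compensated frame interpolation. The sketch below is a deliberately crude, hypothetical illustration of that idea in Python with NumPy, assuming a frame stored as an array and a dense per-pixel motion field; it is not the MetaVision implementation.

      # Crude motion-compensated interpolation sketch (not MetaVision's method):
      # warp a captured frame part of the way along a dense motion field to
      # synthesize an in-between frame.
      import numpy as np

      def interpolate_frame(frame_a, flow, alpha):
          """frame_a: (H, W, 3) image; flow[y, x] = (dy, dx) motion toward the
          next captured frame; alpha in [0, 1] is the fraction of the interval."""
          h, w = frame_a.shape[:2]
          ys, xs = np.mgrid[0:h, 0:w]
          # Backward warp with nearest-neighbour sampling, for simplicity.
          src_y = np.clip(np.rint(ys - alpha * flow[..., 0]).astype(int), 0, h - 1)
          src_x = np.clip(np.rint(xs - alpha * flow[..., 1]).astype(int), 0, w - 1)
          return frame_a[src_y, src_x]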

  • "Software Could Predict Terrorist Growth"
    Fort Wayne News-Sentinel (IN) (12/20/04); LeDuc, Doug

    Indiana University-Purdue University Fort Wayne (IPFW) professor Larry Kuznar made a presentation to the National Science Foundation last week in the hopes of securing $750,000 in funding to develop Nomad simulation software that could predict the emergence of terrorist organizations. The software would build upon a system that analyzes decision-making among nomadic groups. "It might be useful to anticipate what areas of the world and sectors of society are going to be the most volatile," Kuznar explains. "In targeting those areas for economic development or monitoring or whatever people would do to prevent these [terrorist activities], we think we can direct people's attention to where the real problems are." Kuznar draws a parallel between decision-making in traditional herding societies and the decision-making involved in portfolio management, the key area of concentration being the distribution of scant resources in uncertain conditions. Nomad's underlying system inspired the establishment of IPFW's Center for Excellence in Decision Theory, which is currently employing Nomad to study conflicts between herders and villages. Kuznar says the system could be tweaked to forecast how other types of group relationships, such as economic or business partnerships, can break down into enmity. He believes the software could have a useful role in the organization of business, political, and peacekeeping tactics, and local businesses such as the Fort Wayne division of Northrop Grumman have already pledged financial support to the IPFW center.
    Click Here to View Full Article

  • "Digital Fashion Design Ain't as Easy as It Looks"
    Salon.com (12/21/04); Lidgus, Sarah

    For computer animators, designing computer-generated clothing can be just as painstaking as crafting its real-world equivalent, a task made all the more daunting when outrageously proportioned virtual characters perform amazing stunts, as in Pixar's "The Incredibles." CG clothing must fit a character's body the way real clothing fits a three-dimensional person, and pattern-making software is employed to stitch together costumes in virtual space. Clothing animation is handled by a cloth simulator, a program that models how fabric behaves in specific situations. However, creating effective CG attire requires a marriage of programming and real-life tailoring--a combination embodied by Pixar animator Christine Waggoner, whose Cloth Team created more than 140 articles of clothing for "The Incredibles" over approximately three years; an actual tailor was hired to help design the costumes. Waggoner says the movie's characters were built out of multiple layers of muscle, skin, and clothing, noting that "We used this approach in an effort to create a higher level of realism: muscles sliding under skin, and cloth sliding, resting or folding over skin." Despite the efficiency offered by virtual tailoring programs, Waggoner says clothing simulation remains a formidable challenge, given the mathematics and algorithms needed to model even the tiniest details. Before an earlier Pixar release, "Monsters, Inc.," CG clothing appeared unnaturally rigid and unconvincing. (A toy mass-spring sketch of the general simulation technique follows this item.)
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
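
    Cloth simulators of the kind described above are commonly built on mass-spring models: the garment is represented as a grid of particles connected by springs, stepped forward in time and relaxed toward the springs' rest lengths. The toy sketch below (plain NumPy, Verlet integration, structural springs only, no collisions or pinning) shows that general technique; it bears no relation to Pixar's proprietary simulator.

      # Toy mass-spring cloth step: Verlet integration plus a few rounds of
      # constraint relaxation on a rectangular grid of particles.
      import numpy as np

      def step_cloth(pos, prev_pos, rest_len, dt=1.0 / 60.0, iterations=8):
          """pos, prev_pos: (H, W, 3) particle positions; rest_len: grid spacing."""
          gravity = np.array([0.0, -9.8, 0.0])
          new_pos = pos + (pos - prev_pos) + gravity * dt * dt   # Verlet step
          for _ in range(iterations):
              for a, b in ((new_pos[:-1, :], new_pos[1:, :]),    # vertical springs
                           (new_pos[:, :-1], new_pos[:, 1:])):   # horizontal springs
                  delta = b - a
                  dist = np.linalg.norm(delta, axis=-1, keepdims=True) + 1e-9
                  corr = 0.5 * (dist - rest_len) / dist * delta  # split the correction
                  a += corr
                  b -= corr
          return new_pos, pos   # new positions, and positions to reuse as prev_pos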

  • "ITU Chiefs Target ICANN Turf"
    Computer Business Review (12/20/04)

    The United Nations' International Telecommunication Union (ITU) is becoming more explicit in its plans to claim some of the Internet governance power currently wielded solely by ICANN. The 40-member UN Working Group on Internet Governance (WGIG) has been trying to define "Internet governance" ahead of the World Summit on the Information Society to be held next September. ITU Secretary-General Yoshio Utsumi is pushing for the WGIG to define governance narrowly, as those functions covered by ICANN, particularly the management of top-level domain names. However, his suggestion has yet to find support within the working group. Utsumi's position on the matter appears to signal his desire to give the ITU more control over two master Internet databases--the root directories of the domain name system and the master list of Internet protocol addresses--which are currently controlled by ICANN and its Internet Assigned Numbers Authority (IANA). In their efforts to take over both telecommunications and the Internet, ITU officials are proposing several initiatives, such as allowing countries that have problems with ICANN to have the ITU take over IANA functions. Paul Kane, chairman of the Council of European National Top Level Domain Registries, opposes this idea, suggesting that the ITU instead seek a role in ICANN's Governmental Advisory Committee. Other proposals have similarly run into opposition from those who say that dividing power between ICANN and the ITU would complicate the system and add an extra level of bureaucracy.
    Click Here to View Full Article

  • "Online Dangers Likely to Continue Growing in 2005"
    TechNews.com (12/16/04); Krebs, Brian; MacMillan, Robert; McGuire, David

    Although the current year has been a bad one for cybersecurity, security experts predict that cyberspace will grow progressively more dangerous in 2005. Statistics from online security firm Symantec show that new Internet worm variants increased 400 percent from the first half of 2003 to the first half of 2004. Meanwhile, phishing attacks--in which crooks use fraudulent emails to steal consumer data--turned out to be one of the biggest Internet security concerns of 2004: the number of phishing attacks rose from 176 in January 2004 to 1,422 in June 2004, according to the Anti-Phishing Working Group. Security experts predict that phishing con artists will eventually have to start targeting smaller, less-aware businesses as larger businesses get better at protecting against phishing attacks. On the legislative front, 2005 is expected to be a year of renewed emphasis on eliminating spam and spyware. In general, cybercrime legislation will be more likely to come from the states than from the federal government in 2005, says Eric Allman, founder of email security firm Sendmail. Spam is flooding users' inboxes in such volume that just 12 percent of all email on the Internet was considered legitimate in 2004, a figure expected to fall to 8 percent in 2005, according to email filtering firm Postini. Postini's Andrew Lochart says, "The general trend is that the problem is getting worse." Center for Democracy and Technology associate director Ari Schwartz expects Congress to be more aggressive in overseeing cybersecurity, although no new antispam legislation is expected.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Cultivating Future Services"
    eWeek (12/13/04) Vol. 21, No. 50, P. 46; Coffee, Peter

    Jim Spohrer of IBM's Almaden Research Center estimated that more than 70 percent of U.S. workers are employed in the services sector, but IT has failed to deliver in services the productivity gains it has produced in fields such as farming and manufacturing, according to 270 experts gathered at the center to discuss a new academic science for studying services improvement. Software Engineering President Randall Jensen said his group's study of software development between 1960 and 1990 showed slowing productivity growth, starting at 1.8 percent per year and tapering off to just 1.1 percent per year. The drop in software development productivity also reflects the increased demands placed on software applications, such as the need for security, improved user interfaces, and real-time operation in business settings. Northwestern University Kellogg School of Management professor Michael Radnor warned that no company will be successful if it does not address services excellence, because even the best manufactured product can eventually be replicated at lower cost elsewhere; businesses need to become flexible and responsive to the market in order to sustain competitive advantage. Today, developed infrastructure is not an absolute barrier that keeps less-developed nations from assuming top value-chain roles, warned U.S. Department of Commerce senior technology advisor John Sargent. John Deere tractors are a good example of how companies can leverage IT for sustained competitive advantage: by adding specialized GPS technology to the tractors, farmers are able to deploy fertilizers and other chemicals more precisely and increase their profitability. Flexible IT operations are needed to keep innovating, but companies also need to keep an eye on their customers and help them optimize their end of the supply chain or buy into new technology.
    Click Here to View Full Article

  • "User Group to Reveal Model for IS Security Feature"
    Network World (12/20/04) Vol. 21, No. 51, P. 11; Fontana, John

    The end result of more than a year's worth of research by the Network Applications Consortium (NAC) will be published on Jan. 1 as a 121-page document titled "Enterprise Security Architecture: A Framework and Template for Policy-Driven Security." The document will map out a policy-based security scheme that NAC hopes will become an industry standard for protecting corporate information systems. The ESA document defines the policy, technical, and operational models companies should embrace when developing a security framework, and the architecture is founded on a series of policies that employ policy-creation specifications approved by the National Institute of Standards and Technology and the International Organization for Standardization (ISO). The purpose is to connect the definition, execution, and enforcement of security policies to the network's physical security elements; the policies for each component will ultimately be automated throughout the physical network. (An illustrative policy sketch follows this item.) Over the last eight months, NAC has updated the document to include a detailed description of the requirements and interdependencies of security operations such as compliance, asset, vulnerability, event, and incident management, while also adding a template describing how to create automated policies from a set of business needs. But NAC reports that current technology and standards cannot support the integration the process requires; for the moment, the document outlines steps companies can take to migrate toward a policy-driven security scheme. The consortium aims to promote awareness and further development of the ESA blueprint by collaborating with organizations such as the Open Group, the Distributed Management Task Force, Microsoft, and Cisco.
    Click Here to View Full Article
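
    To make "policy-driven" concrete: the idea is that each written policy is expressed in a machine-readable form whose definition, execution, and enforcement can be tied to specific network elements and monitored events. The sketch below is purely hypothetical; the field names are invented for illustration and are not taken from the NAC ESA template.

      # Hypothetical policy record tying a business requirement to technical
      # enforcement and monitoring points; not the NAC ESA schema.
      from dataclasses import dataclass, field

      @dataclass
      class SecurityPolicy:
          name: str
          business_requirement: str                 # why the policy exists
          definition: str                           # the rule, stated precisely
          enforcement_points: list = field(default_factory=list)  # e.g. firewalls, IAM
          monitoring: list = field(default_factory=list)          # events to watch

      remote_access = SecurityPolicy(
          name="remote-access-mfa",
          business_requirement="Protect customer data accessed off-site",
          definition="All remote logins require multi-factor authentication",
          enforcement_points=["vpn-gateway", "identity-provider"],
          monitoring=["failed-mfa-events", "after-hours-logins"],
      )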

  • "Emerging Technologies Update"
    Computerworld (12/20/04) Vol. 32, No. 51, P. 23; Mitchell, Robert L.

    Since their introduction, PCI Express (PCIe), Bluetooth, the 802.11g 54 Mbps wireless LAN specification, and power over Ethernet (PoE) have had varying degrees of success. PCIe, designed primarily to eliminate I/O bottlenecks, has made significant progress, though analyst Nathan Brookwood does not anticipate a major surge in the technology's momentum until 2007. PCIe supplants the PCI bus with a serial architecture that uses as many as 16 lanes to support between 500 Mbps and 16 Gbps of bandwidth; PCIe is also replacing the PC's accelerated graphics port for high-end graphics, although the technology is considered excessive for other I/O requirements. Bluetooth has been embraced more heavily by consumers than by enterprises, partly due to lingering interoperability and security concerns, power management and ease-of-use issues, and competition from the much faster ultrawideband PAN spec. The 802.11g spec delivers backward compatibility for users of the slower 802.11b spec, but many businesses do not require 802.11g's additional bandwidth, notes analyst Ken Dulaney. Still, the technology's maturation has prompted enterprise-class WLAN product vendors to replace b-mode-only access points (APs) with g-mode APs, which has interested some businesses. Graham Melville of Symbol Technologies reports that 802.11a, which boasts 11 channels to 802.11g's three, is beginning to draw interest as bandwidth needs expand, while the nascent 802.11n spec, which promises a 400 percent gain over current WLAN speeds, should yield mature products in 2007. Since the IEEE's ratification of the 802.3af standard last year, PoE has matured into a global standard for delivering low-voltage power to networked devices, but PoE is currently offered only in high-end switches, though IDC expects around 41 percent of all Ethernet port shipments in 2008 to be PoE-enabled.
    Click Here to View Full Article

  • "Is There a Future for Speech in Vehicles?"
    Speech Technology (12/04) Vol. 9, No. 6, P. 27; White, Kenneth; Ruback, Harvey; Sicconi, Roberto

    Speech recognition technology is being embedded in automotive systems to improve passenger and driver safety by minimizing distractions and keeping the motorist focused on the road. Effective automotive speech systems must strike a balance between available system resources and the complexity of the solutions, achieve a high signal-to-noise ratio, accommodate various dialects and speaking styles, and meet high user expectations. Because noise varies so widely in a vehicle, the recognition system and the acoustic model's training process must adapt to a broad spectrum of variables, such as bumps, clicks, horns, wipers, and other transient sounds. One solution is to deploy directional, noise-reducing microphones in close proximity to users, but not every automotive environment can meet such a system's design, manufacturing, and cost requirements. Many technical challenges can be tackled through more sophisticated processing of the input audio before speech recognition, using microphones, advanced algorithms, and enlarged acoustic models, although feasibility can be limited by cost and by general automotive requirements for size and reliability. (A generic noise-reduction sketch follows this item.) A significant breakthrough is represented by the prototype Conversational Interface for Telematics, which uses conversational language to permit hands-free control of various in-vehicle systems, prevent the driver from nodding off, and enable Web-based services. The Audio Visual Speech Recognition project, meanwhile, aims to increase the speech recognition engine's accuracy and do away with repetitive prompts by training lip-reading cameras on the driver's mouth. Other breakthrough technologies on the horizon include real-time traffic navigation systems that update the car's navigation display with traffic data as it occurs, and in-vehicle navigation systems that provide turn-by-turn directions in response to spoken street and city names.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
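
    Front-end noise reduction of the kind alluded to above is often illustrated with spectral subtraction: estimate the noise spectrum during non-speech intervals and subtract it from each incoming audio frame before recognition. The NumPy sketch below is a generic textbook illustration under that assumption, not the processing used in the systems the article describes.

      # Generic spectral-subtraction sketch (illustrative only): subtract an
      # estimated noise magnitude spectrum from each frame, keep the original
      # phase, and resynthesize the time-domain signal.
      import numpy as np

      def spectral_subtraction(frames, noise_frames):
          """frames, noise_frames: 2-D arrays of shape (num_frames, frame_len)."""
          noise_mag = np.abs(np.fft.rfft(noise_frames, axis=-1)).mean(axis=0)
          spectra = np.fft.rfft(frames, axis=-1)
          mag = np.maximum(np.abs(spectra) - noise_mag, 0.0)   # floor at zero
          cleaned = mag * np.exp(1j * np.angle(spectra))       # keep noisy phase
          return np.fft.irfft(cleaned, n=frames.shape[-1], axis=-1)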

  • "Is the Software Industry's Productivity Declining?"
    IEEE Software (12/04) Vol. 21, No. 6, P. 92; Groth, Robert

    Software productivity has been called into question by Business Week's "Industry Outlook 2004," which estimates that productivity has been falling at an annual rate of 0.9 percent for the past five years. Economy.com economist James Glen explains that although real-dollar output per worker fell over the five-year period, employment rose rapidly and then declined more slowly than output during the recession; still, Glen notes that software productivity has not advanced much compared with computer hardware. Eric Schurr of IBM's Rational Software Products Group disputes such conclusions, citing the acceleration of software development thanks to more sophisticated tools and customer reports of "dramatic" productivity increases. Klocwork CTO Djenana Campara says Economy.com's numbers offer no clear way of measuring productivity, and argues that increasingly complex applications are affecting developers' productivity, a problem amplified by development teams' geographic dispersal. In addition, Campara says that comparing productivity between developers has become tougher with the emergence of development challenges that did not exist five years ago. Attempts to shore up software productivity vary: Microsoft is focusing on improving productivity measurement as a member of the Information Work Productivity Council, while IBM's Schurr says Rational is concentrating on automating the development process, employing automated testing solutions, and introducing best practices. Campara says Klocwork is focusing on detecting defects earlier in the development cycle as well as speeding up developer training, increasing code maintainability, and promoting code reuse and refactoring via new visualization techniques. All of the vendors consistently cite the absence of an industry-wide definition of software productivity, the increasing complexity of software applications, and the need for more standardized processes across the industry as ongoing challenges.
    Click Here to View Full Article


 