Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 764: Friday, March 11, 2005

  • "ACM Awards Honor Distinguished Contributions to Computing"
    AScribe Newswire (03/10/05)

The ACM has announced the recipients of the 2004 Distinguished Service Award and the 2004 Outstanding Contribution to ACM Award for their role in advancing the field of computing. The University of Arizona's Richard Snodgrass, ACM Fellow and editor-in-chief of ACM's Transactions on Database Systems, will receive the Outstanding Contribution to ACM Award for spearheading an initiative to get ACM's Special Interest Groups behind the goal of widening the scope of the ACM Digital Library, and turning the computing research archive into a portal encompassing the whole of computing literature. A past chairman of ACM's Publications Board and ACM's Special Interest Group on Management of Data (SIGMOD), Snodgrass is the winner of the 2002 ACM SIGMOD Contributions Award, and counts database design, temporal databases, query optimization and evaluation, storage structures, and query language design among his research interests. The Distinguished Service Award will go to E.G. Coffman of Columbia University, who performed pioneering research into time-sharing systems and computer networks, and also made significant contributions to performance evaluation, scheduling theory, queuing theory, and combinatorial optimization. In addition to being an ACM Fellow, Coffman is co-founder of ACM's Special Interest Group on Metrics (SIGMETRICS), a former editor-in-chief of the Journal of the ACM, and the winner of the 1987 Outstanding Contribution to ACM Award as well as the first ACM SIGMETRICS Award in 2002. Coffman's efforts were vital in obtaining an AT&T Bell Laboratories grant to fund Russian groups researching applied probability and information theory immediately following the collapse of the Soviet Union.
    Click Here to View Full Article

    For more information on this year's ACM awards recipients, visit http://www.acm.org/announcements/homepage.html.

  • "Senate Spotlight Turns to Data Security"
    Wall Street Journal (03/11/05) P. A4; Conkey, Christopher; Nelson, Emily; Fields, Gary

Security breaches at personal data vendors ChoicePoint and LexisNexis have enlivened congressional debate about how to strengthen privacy regulations in a market where the government currently has an unclear role. In testimony before the Senate Banking Committee, FTC Chairwoman Deborah Platt Majoras admitted her agency lacked clear mandates over data brokers, and said hackers and vast collections of personal data in cyberspace amounted to something akin to the Wild West. Fast-growing stores of personal data and rapid transmission of information had caused debate over regulation before the Sept. 11, 2001, terrorist attacks, but afterwards homeland security and other concerns overshadowed privacy. LexisNexis CEO Kurt Sanford emphasized at the hearing that law enforcement and financial institutions rely on personal data brokers for secure, quick investigations so consumers can easily open new checking accounts, for example; he also welcomed data-security standards for his industry. Sen. Jon Corzine (D-N.J.) offered his Identity Theft Protection & Victim Notification Assistance Act, which expands on a bill first proposed in 2003 and includes Sarbanes-Oxley-type executive accountability, FTC oversight, and customer notification requirements. Another proposed bill from Sen. Bill Nelson (D-Fla.) and Rep. Ed Markey (D-Mass.) would also give the FTC clearer responsibility and give consumers more power to correct erroneous information. Meanwhile, the Library, Bookseller, and Personal Records Privacy Act, proposed by Sen. Russell Feingold (D-Wis.), would amend the Patriot Act to restrict access to personal records for counterintelligence purposes. Senate Banking Committee Chairman Sen. Richard Shelby (R-Ala.) and House Energy and Commerce Committee Chairman Rep. Joe Barton (R-Texas) are members of the Privacy Caucus, formed in 2000.

  • "Why Women Leave IT"
    NewsFactor Network (03/09/05); Hill, Kimberly

Women's share of the IT workforce fell from 41% to 35% between 1996 and 2002, and University of Arkansas professor Deb Armstrong says this decline is accelerating. This trend could seriously impair employers facing a scarcity of technical workers, and also hurt the career prospects of women who otherwise would fill those positions. Male IT professionals face the challenge of balancing work and family responsibilities, while women must accommodate these factors as well as their desire to build skills and tackle challenging projects--a situation that causes so much stress that many women voluntarily drop out of the IT field. Male and female IT workers often share the same requirements for challenging work and the same career ambitions, but striking a balance between the two can be more difficult for women because of certain lifestyle choices or proclivities. For instance, women are less inclined than men to attend training courses in the evening, or to leave town for a time to get training. The decision to go on maternity leave after a child is born can also affect a female IT worker, as changes during her absence can be a source of additional stress. However, analyst Sheryl Kingstone sees some positive trends for women in IT: By now the first generation of female IT workers has reached professional maturity, which means they may already have moved up to management and executive positions. Furthermore, there is a broad spectrum of related positions women can enter if IT proves to be too stressful, including business analysis and line-of-business slots.
    Click Here to View Full Article

    For information on ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "Next Big Step for the Web--or a Detour?"
    CNet (03/09/05); Festa, Paul

Enterprise applications of the Semantic Web are the theme of this week's Semantic Technology Conference, where advocates will address doubts that World Wide Web Consortium (W3C) director Tim Berners-Lee's vision of a next-generation Internet is practical, much less achievable. Among the Semantic Web's purported benefits are more precise search engines, automatic Web site reconfiguration based on user requirements, and more automatic exchange of information between applications--all of which would stem from protocols that enable computers to consume and share information about context and meaning. Supporters are counting on data reuse and "recombinant effects" facilitated by the Semantic Web to open up new avenues of merging and exploiting data in unpredictable and unpredictably valuable ways. The Semantic Web's promised ability to make data interchangeable has critics worried about the potential for unintentional disclosure of information. The W3C says it plans to set up a rules system for Semantic Web information sharing, and is calling for position papers by March 18 for its late April workshop on rule languages for compatibility. In a March 8 keynote address at the conference, W3C Semantic Web activity lead Eric Miller argued that the Semantic Web has already started to emerge with the development or rollout of Semantic Web technologies by Nokia, IBM, and others; IBM Internet Technology Group, for instance, is developing Semantic Web applications for the life sciences, while Nokia has released its Wilbur Semantic Web toolkit onto the SourceForge.net open-source development site. "We are not yet really seeing the benefit of application areas being connected together in unexpected ways," admitted Berners-Lee. "But in certain areas, the critical mass has been passed."
    Click Here to View Full Article

  • "World's Most Powerful Computer Is Doubled in Size"
    IDG News Service (03/10/05); McMillan, Robert

    Lawrence Livermore National Laboratory researchers report that the 32,000-processor Blue Gene/L supercomputer--the fastest machine on Earth, according to the Top500 ranking--has increased its size by a factor of two, and its processing power is expected to double as well. Lawrence Livermore Production Linux Group leader Robin Goldstone says the supercomputer consists of around 32,000 two-processor nodes, which adds up to approximately 64,000 processors' worth of power. A 33,000-processor Blue Gene/L prototype led the Top500 list with a peak performance of 70.72 TFLOPS, but Lawrence Livermore's expanded version should be able to raise that performance by about 100 percent. IBM is busy commercializing Blue Gene and has made the 5.7 TFLOPS eServer Blue Gene Solution available to high-performance computing clients. The company has also made arrangements to provide Blue Gene systems to the University of Edinburgh, the San Diego Supercomputing Center, and other research institutions. A 100 TFLOPS Blue Gene system will also be put into operation this month at IBM's Thomas J. Watson Research Center, partly for use in life-sciences research. Once completed in June, Lawrence Livermore's Blue Gene/L will have 130,000 processors and be capable of 360 TFLOPS, yet it will only be about half the size of a tennis court. The system will also be more power-efficient; the supercomputer should consume roughly 1.6 megawatts of power, compared to the 4.8 megawatts the ASCI Purple system is expected to draw when it comes online in June.
    Click Here to View Full Article

  • "Revised Spyware Bill Moves Ahead"
    Wired News (03/10/05); Grebb, Michael

An amended version of the Securely Protect Yourself Against Cyber Trespass Act, or Spy Act, was unanimously passed by the House Commerce Committee on March 9. The act is designed to inhibit the hijacking of home pages or the capture of users' keystrokes by purveyors of spyware and to bar the collection of personal data without users' express consent; the bill also says spyware programs should be clearly marked and removable, and authorizes the FTC to fine violators a maximum of $3 million for each transgression. The Spy Act incorporates changes introduced last month and on Wednesday by Subcommittee on Commerce, Trade, and Consumer Protection Chairman Rep. Clifford Stearns (R-Fla.) in response to concerns by tech companies that the legislation could also target legitimate software. The revisions exclude software cookies and Web beacons from the bill's definition of spyware, and exempt embedded ads on Web pages from the requirement that online ads be easily identifiable and removable. The changes also allow companies to monitor activity on their own Web sites and control advertising based on that monitoring without having to comply with the Spy Act's notice-and-consent provision, and forbid "evil twin" attacks. Ranking Commerce Committee member Rep. John Dingell (D-Mich.) expressed concern that the bill's cookie exemption might be too broad in scope, and said that "we need to make sure that we are not creating dangerous loopholes that are inconsistent with the purposes of this legislation." The Senate lacks a companion bill for the Spy Act, though Sen. Conrad Burns (R-Mont.) intends to resuscitate anti-spyware legislation "in the next one or two months," according to a representative. The Spy Act is sponsored by Rep. Mary Bono (R-Calif.).
    Click Here to View Full Article

  • "Europe Lacks Vision on Innovation"
    Financial Times (03/10/05) P. 10; Sherwood, Bob; Tait, Nikki

Despite this week's breakthrough among European Union industry ministers on limited software patents, intellectual property protection remains a muddled issue in the EU. The European Parliament will review the software patent legislation and either approve, reject, or offer amendments for it. Proponents, mostly large companies, say the uniform law is needed to resolve discrepancies between different member states and increase European competitiveness against U.S. and Japanese industry, both of which are considered to have stronger intellectual property protections. Smaller companies and individual software developers fear that software patents for code that makes "a technical contribution" could lead to de facto general software patents that inhibit innovation. In general, the EU has failed to establish more efficient patent protection, as illustrated by the proposed "community patent" that was to be implemented across the region but has been held up in debate for 30 years. As a result, obtaining patent protection in the EU is not as straightforward a process as in other countries, such as the United States. Some pragmatists have suggested augmenting the existing bundled patent system, administered through the European Patent Office, with specialized courts that would handle litigation for those patents. Currently, groups wishing to challenge patents can shop around for friendly jurisdictions; the European Patent Litigation Agreement details the function of these specialized courts, and the idea has received the backing of the European Patent Office and patent judges.

  • "Humanoids With Attitude"
    Washington Post (03/11/05) P. A1; Faiola, Anthony; Yamamoto, Akiko

    Analysts call Japan a world leader in the application of artificial intelligence to everyday life, a reputation that is being cemented by the many sophisticated robots employed in the country as security guards, receptionists, guides, pets, and hospital workers, among other things. Though the level of AI technology advancement in the United States is perhaps equal to Japan's, American AI research efforts mainly focus on military applications, while the bulk of Japanese efforts are directed at consumer applications. This trend is being driven by a number of things, including concerns about a future shortage of factory workers due to depopulation, and young people becoming less inclined to accept hazardous, dirty, or physically rigorous work. Robotic solutions being considered or deployed in this vein include a line of versatile worker robots with human-like hands envisioned by Toyota, and cyber-security guards from Alsok that use sensors and paint guns to detect and stop intruders. The sensor-equipped Paro robotic baby harp seal, designed as a therapeutic toy for the elderly, can recognize the voice and hand gestures of its owner and respond with soothing sounds and movements. Even more advanced is a robot receptionist at the Tokyo University of Science that can perform 700 verbal responses using voice recognition, and express emotions facially. ATR Intelligent Robotics and Communication Laboratories director Norihiro Hagita says the Japanese are more accepting of humanoid robots than Westerners partly because a precept of the Shinto religion is the presence of gods within all things, whereas most people in Western countries subscribe to monotheism. A January report compiled by Japanese officials forecasts that there will be a robot in each Japanese household by 2015 or earlier.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Search Engines Build a Better Mousetrap"
    New York Times (03/10/05) P. E4; Gnatek, Tim

    Spurred by the success of Google, competing search engines such as MSN Search and Ask Jeeves are rolling out new tools that offer more intuitive search capabilities. Ask Jeeves' "Smart Search" feature directly answers definitive queries in addition to providing Web links, and this spring Ask Jeeves will debut Direct Answers From Search, a new feature for finding answers to natural-language questions by scouring the whole Web rather than just its own database. Amazon's A9 Yellow Pages service can look for and provide directions to local businesses, and can also display pictures of a business in the context of its surroundings. Microsoft's MSN Search has been upgraded to simplify queries by complementing Boolean terms with adjustable slide controls, and another new feature returns results based on proximity to user locations. More accurately predicting what the user is looking for and simplifying the navigation of search results are likely to be the core goals of continuing search engine improvement initiatives. "The top frustrations among searchers are that the results aren't comprehensive enough: The results were difficult to sort through, and, at times, irrelevant," notes Keynote Systems research director Bonny Brown. "Any time a site thinks for customers, people always appreciate that." A recent study by the Pew Internet and American Life Project concludes that fully exploiting search technology may require a significant shift in human behavior; report author Deborah Fallows believes that search will improve dramatically once most users get in the habit of making more discriminating queries that search services expect.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Computing Success"
    Daily Bruin (03/10/05); Fernando, Menaka

UCLA's Department of Education and School of Engineering and Applied Science and the Los Angeles Unified School District (LAUSD) are engaged in a collaborative program to boost the percentage of female and underrepresented minority students taking Advanced Placement (AP) computer science courses by helping high schoolers bone up for AP exams. Since the AP Readiness Program's inception last summer, the number of high schools offering AP computer science courses has doubled to 22 out of 57 schools; the number of African American students has doubled; and the numbers of female and Hispanic students have each tripled. "The challenge now is to sustain this over time," notes UCLA Department of Education researcher and AP Readiness Program co-founder Jane Margolis. Margolis says she and teacher-training institute director Joanna Goode discovered that schools with high minority populations did not offer high-level computer science courses, while teachers lacked familiarity with the subject and were often compelled to concentrate on testing in reading and math to satisfy federal standards. The program's founders say some 25 LAUSD teachers were trained to take an interdisciplinary approach to computer science with UCLA resources, and overtures have been made to the National Science Foundation to expand the teacher-training institute with a $1 million grant. UCLA School of Engineering dean Vijay Dhir attributes the low numbers of college-level female and minority AP computer science students to problems in the K-12 education system, and he intends to set up an Engineering and Science Corps program in which LAUSD elementary, middle, and high school math and science students are mentored by UCLA graduates and undergraduates in the engineering school.
    Click Here to View Full Article

  • "Open-Source Leader Highlights Technologies for Developers to Watch"
    eWeek (03/06/05); Taft, Darryl K.

Open-source advocate and Spring application framework founder Rod Johnson said at TheServerSide Java Symposium in Las Vegas that the J2EE platform has gained strength over the last two years, and he encouraged Java developers to adopt framework-oriented development. Johnson criticized J2EE at TheServerSide meeting in 2003, but this year he said J2EE has become the premier development platform, with more regular and predictable successes, and that Microsoft's .Net is not as threatening as it once was. Johnson advocated several technologies: Inversion of Control and dependency injection design patterns, unit testing and test-driven development (TDD), post-Struts 1.x Web technologies, and rich client technologies. Frameworks and methodologies are important to achieving desired goals, and framework-oriented development facilitates better technology choices than building an in-house framework does. TDD is becoming more popular because smaller teams can use it. Johnson also said standardization is not necessary for every technology; in the case of O/R mapping, bypassing Java standardization could have saved months of wrangling over Java Data Objects and Enterprise JavaBeans. The convergence of IBM's AspectJ and BEA Systems' AspectWerkz is benefiting aspect-oriented programming (AOP), and Johnson's own Spring framework is interoperable with AspectJ, leaving the competing JBoss solution as the only proprietary AOP technology. Johnson said open source software is not about making closed-source software obsolete, however, but is simply a better way to create software products. Open source is about "community, pride, freedom of information, a close relationship to users, and easier debugging," he said.
    Click Here to View Full Article
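The dependency injection pattern Johnson advocates can be shown with a minimal sketch. This is an illustration only, not Spring code: the class and method names below are invented, and the point is simply that a component receives its collaborators rather than constructing them, so alternate implementations can be swapped in for testing.

```python
class SmtpMailer:
    """'Production' collaborator (hypothetical)."""
    def send(self, to, body):
        return f"smtp -> {to}: {body}"

class FakeMailer:
    """Test double: records messages instead of sending them."""
    def __init__(self):
        self.sent = []
    def send(self, to, body):
        self.sent.append((to, body))
        return "recorded"

class SignupService:
    # The service is handed its mailer instead of building one itself --
    # the essence of Inversion of Control / dependency injection.
    def __init__(self, mailer):
        self.mailer = mailer
    def register(self, email):
        return self.mailer.send(email, "welcome")

# Production wiring vs. test wiring: same service class, different collaborator.
prod_service = SignupService(SmtpMailer())
test_service = SignupService(FakeMailer())
test_service.register("a@example.com")
print(test_service.mailer.sent)  # [('a@example.com', 'welcome')]
```

In a framework such as Spring the wiring step is moved out of application code into configuration, but the shape of the classes is the same.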

  • "H-1Bs Now Open to the Less-Educated?"
    CNet (03/10/05); Frauenheim, Ed

The U.S. Citizenship and Immigration Services (USCIS) has come under fire for its interpretation of a new law exempting foreigners with advanced degrees from an H-1B visa cap. The agency says it could broaden the category for 20,000 exemptions by including all qualified foreigners, not just those with master's degrees or higher from a U.S. educational institution. President Bush last year signed a law that allowed up to 20,000 foreigners with advanced degrees to skirt the stated H-1B cap of 65,000 visas. Businesses are divided over the H-1B program, and Compete America, a coalition of universities, trade groups, and companies, criticized the USCIS for ignoring the intent of the legislation and for disrupting business planning; USCIS' Chris Bentley said the agency has not yet finalized the rules. Last year, companies hit the 65,000-visa cap less than halfway through the fiscal year, and industry leaders have said H-1B visas are necessary to slow offshore outsourcing and fill skills gaps. Critics of the H-1B visa program claim it undercuts U.S. salaries and facilitates the movement of skilled work overseas. In 2003, 39 percent of 195,000 allocated H-1B visa slots went to foreigners working in computer-related jobs, with nearly 37 percent of those going to India-born workers; in 2005, all 65,000 non-exempted visas were allotted on Oct. 1, the first day of the government fiscal year. The USCIS has also been criticized for not yet accepting visa applications for the exemption program, despite the new law taking effect this week.
    Click Here to View Full Article

  • "Hackers Target U.S. Power Grid"
    Washington Post (03/11/05) P. E1; Blum, Justin

The electric industry claims it is getting serious about cybersecurity, but government officials, including Federal Energy Regulatory Commission Chairman Patrick Wood, are skeptical of the industry's efforts, claiming that terrorists or hackers could cause serious damage to power plants and cause a widespread blackout. Computer hackers make several hundred attempts per day to breach the computer network of Constellation Energy Group, which operates Baltimore Gas and Electric. Constellation's chief risk officer, John Collins, says the company has no idea who the hackers are, only that they are attempting to breach security on a daily basis. Richard A. Clarke, the former federal counterterrorism chief, says the U.S. power grid is only as strong as its weakest utility company, noting that every time the government holds a simulated cyberattack on the grid, hackers have been able to breach power companies' security. Sophisticated hackers are probably capable of breaching each of the three North American power networks and bringing sections of the power grid down, Clarke warns. James Andrew Lewis, director of technology policy at the Center for Strategic and International Studies, says the real threat is from insiders with knowledge of utilities' computer networks, not outside hackers. Lewis also claims that terrorists are more likely to conduct a physical attack on power plants and electric lines than a cyberattack. Security consulting firms say that their tests have shown that power companies lack basic cybersecurity equipment, and that they have easily been able to penetrate the companies' live networks when running cybersecurity tests.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "New System Enhances Images in Crime Investigation"
    New York Times (03/10/05) P. E8; Ricadela, Aaron

Software originally developed by MacDonald, Dettwiler & Associates (MDA) to help Mars rovers get around obstacles has been refined into a prototype system that can convert a few seconds of video from a handheld stereo camera into a 3D model of a location; Canadian and U.S. police departments are testing the system as a tool for aiding crime investigations. MDA's Instant Scene Modeler software improves on earlier computer vision systems because it can function in natural light, process data faster, and operate without the need for special markers in the environment to indicate where frames should be stitched together, thus eliminating concerns about disturbing the scene. The software builds the scene in 3D using the local invariant features technique, in which the most distinctive portions or features of a photo are extracted, and information about their location, orientation, size, and brightness is encoded in a small file. Descriptions of several hundred features for each photo are stored in a database that a computer can search when it is faced with an unfamiliar image; the software can accurately choose a matching image even if as much as 90 percent of the features do not match. The software splices images together using common features in each frame as indicators of where to join frames, and new video can be spliced into the model after it is constructed. MDA's software employs University of British Columbia professor David Lowe's Scale Invariant Feature Transform algorithm, which Evolution Robotics has also licensed for use in its computer-vision software. Microsoft researchers are developing software that can turn digital photos into a panorama using their own invariant features algorithm, while similar algorithms for identifying objects or sites and assembling 3D models from video and photos are also under development.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
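The feature-matching step described above can be sketched in a few lines. This is a toy, not MDA's software: descriptors here are short lists of floats (real SIFT descriptors are 128-dimensional), and the ratio test is Lowe's standard heuristic for rejecting ambiguous matches, which is how a system can match correctly even when most features have no true counterpart.

```python
import math

def dist(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(query, database, ratio=0.8):
    """Return the index of the best database match for `query`, or None.

    Lowe's ratio test: accept the nearest neighbor only if it is
    clearly closer than the second-nearest candidate.
    """
    ranked = sorted(range(len(database)), key=lambda i: dist(query, database[i]))
    if len(ranked) < 2:
        return ranked[0] if ranked else None
    best, second = ranked[0], ranked[1]
    if dist(query, database[best]) < ratio * dist(query, database[second]):
        return best
    return None

db = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
print(match([0.9, 1.1], db))   # 1 -- unambiguously closest to db[1]
print(match([3.0, 3.0], db))   # None -- equidistant candidates, rejected
```

A real pipeline would use approximate nearest-neighbor search over hundreds of descriptors per image, but the accept/reject logic is the same.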

  • "Novel, Computer-Assisted Method for Colorization"
    PhysOrg.com (03/08/05)

Researchers at the Hebrew University of Jerusalem's Benin School of Computer Science and Engineering have developed an effective tool for adding color to black-and-white images and movies. Dani Lischinski, Yair Weiss, and graduate student Anat Levin have developed new software that lets users scribble a particular color into the interior of a desired region, as viewed on a computer screen; an interactive process then automatically propagates the color through the rest of the image and into other frames of the movie. The computer-assisted method is a step up from the labor-intensive process of manually coloring images, since fully automatic algorithms cannot reliably delineate fuzzy or complex region boundaries. As a result, an artist would have to manually determine the boundaries between the hair and face of a person, for example. Similar difficulties arise when colorizing movies, because a fully automatic and reliable algorithm for tracking regions as they move across frames still does not exist. The software, which was presented at the SIGGRAPH computer graphics conference in Los Angeles, also could be used to enhance digital photographs and for special effects.
    Click Here to View Full Article
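The scribble-propagation idea can be sketched in one dimension. This is a toy, assuming a simple Gaussian similarity weight and Jacobi-style neighbor averaging rather than the authors' actual optimization, but it shows the key behavior: a scribbled color spreads through a region of smooth grayscale intensity and stops at a sharp intensity edge.

```python
import math

gray = [0.1, 0.1, 0.12, 0.9, 0.9, 0.88]   # sharp "edge" between index 2 and 3
color = [None] * len(gray)
color[0] = 1.0    # user scribble: hue 1.0 on the left region
color[5] = 0.0    # user scribble: hue 0.0 on the right region
scribbled = {0, 5}

def weight(i, j):
    # Neighbors with similar gray values couple strongly;
    # across the edge the weight is effectively zero.
    return math.exp(-((gray[i] - gray[j]) ** 2) / 0.01)

vals = [c if c is not None else 0.5 for c in color]
for _ in range(200):                       # iterate toward the fixed point
    new = vals[:]
    for i in range(len(gray)):
        if i in scribbled:                 # scribbles are hard constraints
            continue
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < len(gray)]
        ws = [weight(i, j) for j in nbrs]
        new[i] = sum(w * vals[j] for w, j in zip(ws, nbrs)) / sum(ws)
    vals = new

print([round(v, 2) for v in vals])  # left half near 1.0, right half near 0.0
```

The published method solves the analogous weighted least-squares system over a 2D image (and over time, for video) in one shot instead of iterating.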

  • "TCP/IP Pioneer's Past Is Prologue"
    EE Times (03/07/05) No. 1361, P. 1; Wirbel, Loring

Packet Design and Precision I/O chief scientist Van Jacobson, creator of TCP/IP header compression and a recipient of ACM's SIGCOMM Award in 2001, recalls that his interest in networked control began in the mid 1970s when he faced the challenge of finding a way to minimize the number of wires needed to control large, distributed systems such as particle accelerators during his stint at the Lawrence Berkeley Lab. He worked on such control systems for over 11 years, and the application of the insights gained from this research to more generalized IP networks followed in the mid 1980s. Jacobson says the impetus to improve TCP came out of responding to real-world problems, with scale being the root cause of most of them. "Every time a part of the network got ratcheted up a notch, new problems had to be addressed," he notes. According to Jacobson, simple priority-based quality-of-service problems are consistently frustrating because of the vast amount of documentation on the subject offering excessively complex solutions. Another source of concern is an apparent resurgence of interest in maximizing the number of circuit-like traits IP can support. Jacobson warns that "Anytime you try to apply scheduling to a problem to give latency strict bounds, the advantages are not worth the cost of implementation." Jacobson has outlined a way to improve Border Gateway Protocol (BGP) scaling with his BGP Scalable Transport proposal, and his area of concentration at Precision I/O is the improvement of server performance within the data center.
    Click Here to View Full Article

  • "Eclipse Chief Talks Up Projects, Awaits Sun and Microsoft"
    InfoWorld (03/04/05); Krill, Paul

In an interview with InfoWorld Editor at Large Paul Krill, Eclipse Foundation executive director Mike Milinkovich attributes the enormous popularity of Eclipse open-source software to its excellent architecture, and notes that the enterprise IT space is a major user of Eclipse tools. He says the goal of the nonprofit foundation is to construct "an open universal development platform" on top of which commercial interests can build software products to be sold for commercial gain. Milinkovich details numerous Eclipse projects and their projected impact on enterprise IT: The Business Intelligence and Reporting Tools (BIRT) project aims to enhance Java applications with embedded enterprise reporting capabilities, while the goal of the Eclipse Parallel Tools Platform project is to help parallel tool vendors promote compatibility between their products. The Test & Performance Tools Platform project will deliver a platform upon which tools can be built, tested, profiled, and monitored, and the Eclipse Communication Framework effort is geared toward the enablement of peer-to-peer communications between developers as they work on projects. Milinkovich says the Rich Client Platform (RCP) project's objective is to span the gap between the rich client user interface and the Web-based user interface in order to build very rich end-user applications that can be implemented on Windows, Unix, Mac, and Linux. Eclipse's future goals include greater participation in embedded development, and the provision of a comprehensive kit of embedded tools from within open source projects is a primary target. Milinkovich also expresses confidence that Eclipse is becoming an industry standard IDE thanks to the backing of high-profile member companies such as IBM, BEA Systems, Sybase, and Borland Software.
    Click Here to View Full Article

  • "A Head in the Clouds or Hopes on Solid Ground?"
    Speech Technology (02/05) Vol. 10, No. 1, P. 46; Berners-Lee, Tim

In his keynote address at the SpeechTEK 2004 conference, World Wide Web Consortium (W3C) director Tim Berners-Lee discusses the general state of speech technology, the standards it is based on, and their relationship to the Web. "What's difficult for all the people out there who are wondering about speech technology is that they're stuck between thinking that it's hopeless and expecting it to do absolutely everything," he explains, stressing the need to draw a distinction, given that the technology is still immature. Berners-Lee notes that as new features and new facilities become broadly manageable and deployable, they are standardized by the W3C and made more powerful through Web technology crossbreeding. He expects users to demand multimodal interfaces that deliver graphical user interface and conversational interface interaction simultaneously. Berners-Lee observes that speech technology's proliferation may not be influenced by its reliability or users' comfort levels so much as by the simplicity of developing new applications. He sees several options: The more time-consuming option is to make an application as polished and difficulty-free as possible, while the less arduous option is to make the application prompt users with terms it understands while keeping it flexible enough to load speech grammars off-the-cuff. At any rate, Berners-Lee thinks that speech technology, despite its shortcomings, is a far easier and less stressful communications mode than touchtone. He urges speech technology developers to focus less on monopolization: "You can fight over a pie later, but please fight over a piece of the pie by making better software, not by just trying to steer your proprietary solution into the standard in the hope that everybody will follow," he argues.
    Click Here to View Full Article

  • "Toward Interoperable First Response"
    IT Professional (02/05) Vol. 7, No. 1, P. 13; Miller, H. Gilbert; Granato, Richard P.; Feuerstein, John W.

Interoperable wireless communication among U.S. public-safety organizations is a necessity all too clearly demonstrated by tragedies such as the Sept. 11, 2001, terrorist attacks. Integrating the many federal and non-federal first-responder services in the United States requires a national architecture based on eight core principles. The architecture should be founded on and support the existing framework and procedures for handling operational incidents (the National Incident Management System); a general, objective validation and prioritization process for continually evaluating extreme requirements should be incorporated into the architectural framework; the scheme should account for changes in technology, operational requirements, and other influential factors; the initial phases of deployment should rapidly fulfill critical, immediate needs such as basic connectivity; and multiple solution paths should be provided in order to improve system flexibility and robustness. Another important architectural principle is to re-establish spectrum planning at the national level and set up a standard national channel plan, while still another is to promote contracts on national procurement schedules. The last key principle of developing the national first-response architecture and strategy is to consider centralized governance and coordination. The maturation of the architecture and its development and deployment scheme will follow Safecom's publication of its proposed architectural framework. The federal government will need to become more proactive and broader-minded in order to take advantage of the opportunities this maturation presents.
    Click Here to View Full Article