
ACM TechNews is sponsored by Parasoft.

Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 7, Issue 743:  Wednesday, January 19, 2005

  • "IT Compensation on the Rise in '05"
    Computerworld (01/18/05); Lee, Katherine Spencer

    A gradual resurgence of economic optimism should boost IT compensation this year, writes Robert Half Technology executive director Katherine Spencer Lee, who cites the Robert Half Technology 2005 Salary Guide's estimate that IT workers' average base pay will increase 0.5 percent overall, while high-demand specialties such as information security and quality assurance will experience higher raises. Trends the firm notes include multiple employment offers for people with highly sought-after skills, and a subsequent rise in employers' attempts to retain key performers through workplace improvements, better worker recognition efforts, and employee morale monitoring. IT hiring is being driven by companies' deployment of processes to ensure compliance with Sarbanes-Oxley and other regulations; sustained investment in Web-based applications to augment collaboration, customization, management, and customer service; the need for wireless communication and business intelligence; increased demand for security as a result of spam and virus growth; and application and technology upgrades. The Robert Half Technology IT Hiring Index and Skills Report indicates that the most sought-after IT skills are wireless network management and SQL server management, while the networking, technical support, Internet/intranet development, database management, application development, and project management specialties are also thriving. Above-average demand for certain positions is expected in 2005: Base compensation for systems auditors is projected to rise 5.1 percent to the range of $63,250 to $81,750 yearly; the average starting salary for pre- and postsales consultants should increase 3.9 percent to between $53,500 and $78,250; and starting salaries for programmer/analysts should experience a 3.6 percent gain to the range of $52,500 to $83,250. Lee says many organizations are still cautious when it comes to full-time hiring, and prefer recruiting specialized professionals on a contract basis while waiting for indications of a sustained economic rebound.
    Click Here to View Full Article
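
    The percentage figures above can be checked directly against the quoted ranges. A minimal Python sketch (the 2005 figures come from the summary; the implied 2004 ranges are derived here, not reported by Robert Half):

        # Sanity-check of the 2005 salary projections quoted above.
        # Each entry: (position, projected gain, projected 2005 range).
        projections = [
            ("systems auditor",           0.051, (63_250, 81_750)),
            ("pre-/postsales consultant", 0.039, (53_500, 78_250)),
            ("programmer/analyst",        0.036, (52_500, 83_250)),
        ]

        for title, gain, (low, high) in projections:
            # Back out the implied 2004 range by undoing the projected raise.
            prev_low, prev_high = low / (1 + gain), high / (1 + gain)
            print(f"{title}: +{gain:.1%} -> ${low:,}-${high:,} "
                  f"(implies roughly ${prev_low:,.0f}-${prev_high:,.0f} in 2004)")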

  • "Harvard Chief Defends His Talk on Women"
    New York Times (01/18/05) P. A14; Dillon, Sam

    Harvard University President Lawrence Summers' suggestions concerning gender and how it may relate to the underrepresentation of women in science and math provoked shock and outrage among several female attendees at an academic conference last week. Summers apologized for any misunderstanding in a Jan. 17 interview but stood by his assertions, arguing that "raising questions, discussing multiple factors that may explain a difficult problem, and seeking to understand how they interrelate is vitally important." At the conference, which was entitled "Diversifying the Science and Engineering Workforce: Women, Underrepresented Minorities, and Their S. & E. Careers," Summers theorized that gender inequality on university math and engineering faculties could perhaps be partly explained by an unwillingness among married women with children to commit to grueling work schedules. He also suggested that gender-based biological differences may account for the fact that fewer high school girls than boys tend to score very high and very low on standardized math tests, a remark that so infuriated MIT biology professor Nancy Hopkins that she walked out. In his defense, Summers said, "I was trying to provoke discussion, and I certainly believe that there's been some move in the research away from believing that all these things are shaped by socialization." Georgia State University economics professor Paula Stephan said Summers' suggestions did not offend her, noting that research conference participants should expect the presentation of theories that contradict their own ideas and then debate them using research findings as a foundation. Harvard economics professor and conference organizer Richard Freeman also defended Summers, arguing that he intended to stimulate debate in an effort to encourage diversity in science and engineering workforces.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Multilingual Speech-Based Technology to Talk About!"
    IST Results (01/19/05)

    IST's DUMAS project has devised a multilingual speech-based system for innovative communication in its mission to investigate and mitigate three key shortcomings of current technology, according to DUMAS coordinator Dr. Bjorn Gamback: limited processing and comprehension of structured, multilingual text; a narrow scope of conversational contexts dictated by design; and the systems' inability to adapt to specific user needs because they fail to recall specific user behavior. DUMAS researchers based their systems on fundamental speech technology such as speech synthesis and recognition, and emphasized dialogue-level difficulties to create products that process spoken and text inputs in multiple languages as well as give users the proper verbal responses. The project has yielded 27 outputs in various stages of development or commercialization, among them the prototype AthosMail email application. A key component of AthosMail is the Language Identifier, a text classification program that currently supports English, Swedish, and Finnish, and enables automatic selection of the linguistic analysis tool for the appropriate language. Meanwhile, the prototype AthosNews telephone system can read English and Finnish newspapers aloud for the visually impaired or for people whose eyes are otherwise occupied, and Gamback says adding keypad navigation and document reading control is a relatively uncomplicated matter. Available DUMAS project outputs include Timehouse SpeechServer, which provides voice functionality for desktop apps and offers developers a simple and powerful interface to Microsoft's Speech API; the Searcher text document indexing and retrieval tool, which matches queries and documents using the established Vector Space Model; and AthosCal, a multimodal calendar app for PDAs, desktops, and other platforms.
    Click Here to View Full Article
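
    The Language Identifier's job, classifying short texts as English, Swedish, or Finnish, is commonly done with character n-gram profiles; how DUMAS implements it is not described here, so the sketch below, including its tiny training snippets, is an illustrative assumption only:

        # Toy character-trigram language identifier in the spirit of the
        # AthosMail component described above; the training snippets are
        # invented and far too small for real use.
        from collections import Counter

        def trigram_profile(text):
            text = " " + text.lower() + " "
            return Counter(text[i:i + 3] for i in range(len(text) - 2))

        TRAINING = {
            "English": "the quick brown fox jumps over the lazy dog and runs away",
            "Swedish": "den snabba bruna raven hoppar over den lata hunden och springer",
            "Finnish": "nopea ruskea kettu hyppaa laiskan koiran yli ja juoksee pois",
        }
        PROFILES = {lang: trigram_profile(t) for lang, t in TRAINING.items()}

        def identify(text):
            probe = trigram_profile(text)
            # Score each language by trigram overlap with the probe text.
            score = lambda prof: sum(min(n, prof[g]) for g, n in probe.items())
            return max(PROFILES, key=lambda lang: score(PROFILES[lang]))

        print(identify("the dog runs"))  # -> English (on this toy data)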

  • "FBI Tosses Carnivore to the Dogs"
    Associated Press (01/18/05)

    Bureau oversight reports acquired by the Electronic Privacy Information Center under the U.S. Freedom of Information Act indicate that the FBI has scrapped its proprietary Carnivore Internet surveillance technology in favor of commercial wiretap software. The FBI's Paul Bresson said the commercial software is not only cheaper, but much better at copying a targeted online account's communications without disturbing other subscribers. Documents submitted to Senate and House oversight committees said that Carnivore was not used once in the 13 Internet wiretaps the FBI carried out in fiscal 2002 and fiscal 2003; the bureau, which once touted Carnivore as vastly superior to off-the-shelf software, previously reported that the technology was used about 25 times between 1998 and 2000. Chicago-Kent College of Law professor Henry Perritt Jr., who directed an oversight study of Carnivore four years ago for the Justice Department, said the FBI developed the surveillance system in-house because commercially available tools were found wanting. He also said he did not know of any commercial wiretap software with audit features capable enough to persuade a federal judge that emails from innocent parties were not improperly collected. Experts drew parallels between Carnivore's approximately four-year life span and the life expectancy of state-of-the-art products developed by the private sector. Outside analysts estimated that Carnivore's development probably cost the government between $6 million and $15 million.
    Click Here to View Full Article

  • "Touch Screens More Likely to Be Flawed, Analysis Finds"
    South Florida Sun-Sentinel (01/16/05); Milarsky, Jeremy

    A South Florida Sun-Sentinel analysis of Florida voting machines used in the Nov. 2 presidential election found that touch-screen systems were 50 percent more likely than older paper-based systems to record a flawed or unregistered presidential ballot. Some 18,555 flawed or unregistered presidential ballots were identified among the 5 million votes studied in the November election: 11,824 ballots out of 2.7 million touch-screen votes were determined to be unregistered, while 6,731 ballots out of 2.3 million optical-scan votes were flawed or unrecorded. Absentee and early-voting ballots were not included in the analysis. The study estimated that both touch-screen and optical-scan machines had error rates below 0.5 percent, while a University of Florida analysis of the 2000 election found a 1.5 percent undervote rate for counties that used punch cards and a 0.3 percent rate for counties that used optical-scan machines. Florida state spokeswoman Jenny Nash wrote in an email to the Sun-Sentinel that the differences between the paper-based and paperless voting systems were insignificant. U.S. Rep. Robert Wexler (D-Fla.), who lost a federal lawsuit in 2004 calling for touch-screen machines to provide a paper trail, was not surprised by the results of the study. Nor was Leon County election supervisor Ion Sancho, who worries that touch-screen machines make it impossible for voters to independently confirm their choices after casting their ballots, and offer no assurance that undervotes are not being recorded because of a technical malfunction or voter difficulties. The November findings echo an analysis of the March 2004 primary estimating that voters using touch-screen devices were six times more likely than optical-scan voters to cast flawed ballots.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
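
    The "50 percent more likely" finding follows directly from the reported counts, as a quick Python check using the numbers in the summary shows:

        # Error rates implied by the Sun-Sentinel's reported counts.
        touch_bad, touch_total = 11_824, 2_700_000  # touch-screen ballots
        scan_bad, scan_total = 6_731, 2_300_000     # optical-scan ballots

        touch_rate = touch_bad / touch_total  # about 0.44 percent
        scan_rate = scan_bad / scan_total     # about 0.29 percent
        print(f"touch-screen error rate: {touch_rate:.2%}")
        print(f"optical-scan error rate: {scan_rate:.2%}")
        print(f"relative likelihood: {touch_rate / scan_rate:.2f}x")  # ~1.50x

    Both rates are indeed below the 0.5 percent ceiling the study cites.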

  • "Peer-to-Peer 'Seeders' Could Be Targeted"
    New Scientist (01/14/05); Knight, Will

    BayTSP's FirstSource software could help copyright enforcers track down the file-traders who first propagate copyrighted content through peer-to-peer (P2P) networks, narrowing the focus of litigation to individual users rather than networks or users en masse. FirstSource ascertains which user first uploaded a specific file by transmitting multiple file requests very rapidly, unmasking the initial perpetrators on the assumption that, early on, only they will have the entire file available for download. "Once the file is confirmed to be the content in question, we can then monitor individuals who download from the initial source," explains BayTSP's Jim Graham. Recent evidence suggests that most files originate from a relatively small number of sources; movies and music albums that are leaked or obtained without authorization are strong examples. Some P2P network experts call BayTSP's system highly flawed: "If you are spidering over millions of nodes that are rapidly appearing and disappearing how can you say for sure that one node with 100 percent of a file's contents didn't pull it from another node that just disappeared a minute before?" asks Rojo Networks programmer Brad Neuberg. Meanwhile, Freenet programmer Ian Clarke notes that network developers could probably subvert FirstSource monitoring with "trivial" modifications to their software. Graham acknowledges that FirstSource has its faults, but he still thinks the software could be a good tool in copyright enforcers' arsenal.
    Click Here to View Full Article
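
    The inference attributed to FirstSource can be expressed as a toy monitoring loop; BayTSP has not published its implementation, so the node data and threshold below are hypothetical, and Neuberg's objection (a complete copy may have come from a node that has since vanished) applies to this sketch just as much:

        # Hypothetical sketch of the seeding inference described above: flag
        # nodes that hold 100% of a file's pieces very early in its lifetime.
        from dataclasses import dataclass

        @dataclass
        class Node:
            address: str
            pieces_held: int   # pieces of the file this node can serve
            first_seen: float  # hours since the file appeared on the network

        TOTAL_PIECES = 1_000
        EARLY_WINDOW = 1.0  # assumed: a full copy this early implies seeding

        observed = [
            Node("10.0.0.1", 1_000, 0.2),  # complete and early: suspect seeder
            Node("10.0.0.2", 412, 0.5),    # partial copy: ordinary downloader
            Node("10.0.0.3", 1_000, 9.0),  # complete but late: downloader
        ]

        for n in observed:
            if n.pieces_held == TOTAL_PIECES and n.first_seen <= EARLY_WINDOW:
                print(f"possible initial seeder: {n.address}")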

  • "U.S. Tech Jobs Sneaking Back, But Not Like During the Boom"
    Investor's Business Daily (01/18/05) P. A8; Howell, Donna

    Tech industry observers report an increase in hiring, although automation, outsourcing, and offshoring appear to be undermining demand for U.S. tech employees. "What in the past would be considered growth is being picked up by independent contractors," reports AMR Research analyst Erik Keller, who chiefly blames offshoring for his forecast of flat or declining U.S. tech employment this year. He says the people most likely to land secure jobs are those with a combination of IT and business skills. Meta Group analyst Maria Schafer notes that many companies are making inquiries about hiring, with particular emphasis on flexible contract labor rather than permanent, full-time employees; fellow Meta analyst Howard Rubin anticipates more long-term domestic job erosion from offshoring, with smaller companies spearheading U.S. tech job growth. Other analysts see evidence of a tech hiring rebound: Monster.com founder Jeff Taylor calls the increasing presence of online want ads a key indication of job growth, with IT ranked the fourth-hottest field. Robert Half International executive director Katherine Spencer Lee says a tech "refresh" is underway, with increased demand for U.S. business analysts, project managers, and programmers being driven by rising IT budgets, more computer system upgrades, and additional hires by clients. Meanwhile, Barrington Research economist Alexander Paris expects increased hiring in the second quarter as chip orders stabilize. The Labor Department estimates a 2.6 percent rise in 2004 across the four main tech employment categories of computer and electronics manufacturing, Internet publishing and broadcasting, computer systems design, and related services; however, the department also reports significant December losses in tech manufacturing and data processing jobs.

  • "Road Map to a Digital System of Health Records"
    New York Times (01/19/05) P. C1; Lohr, Steve

    The federal government should guide the creation of a national digital health records framework much as it oversaw the creation of the Internet, according to a study presented to the Bush administration's national health IT coordinator by a group of 13 health and IT organizations. Such a system would save tremendous amounts of money as well as improve patient safety. Besides providing some initial funding, the federal government should establish a new third-party standards body to coordinate industry efforts and policy, said the report from the Liberty Alliance Project, the American Health Information Management Association, the Healthcare Information Management Systems Society, and other groups. The 54-page study described a decentralized, email-like open-standards system that would allow health care organizations, laboratories, insurers, and others to transfer patient records electronically; the report weighed in against a centralized national database and health ID cards, and said people should have control over their information so that, for example, permission must be granted before it is used in a study. A separate article published in the journal Health Affairs says a standards-based digital health records system could save the United States $78 billion per year, but that partially open standards would significantly reduce those savings: unless information-sharing is made as open as email, participating groups would have to make costly software tweaks in order to send and receive data. The cost of building a national digital health care information infrastructure is pegged at $276 billion over the next decade, according to the Health Affairs article. Experts note that Congress allotted only $50 million to the newly appointed national health IT coordinator, David Brailer, who requested the industry recommendations.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
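
    Taken at face value, the two figures above imply a rough payback horizon; the calculation below simply combines them and optimistically assumes the full annual savings materialize:

        # Back-of-the-envelope payback from the figures cited above.
        build_cost = 276e9   # dollars over the next decade (Health Affairs)
        annual_save = 78e9   # dollars per year with fully open standards
        print(f"break-even after about {build_cost / annual_save:.1f} years")
        # -> roughly 3.5 years of full savings to recoup the build-out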

  • "Sun's Gustafson on Envisioning HPC Roadmaps for the Future"
    HPC Wire (01/14/05) Vol. 14, No. 2; Curns, Tim

    Sun Microsystems senior scientist and HPCS principal investigator John Gustafson puts major HPC developments in perspective and outlines future trends. The technology currently generating the most excitement for Gustafson is proximity communication, which he says can yield nearly three orders of magnitude more bandwidth inside and outside of chips and allow bandwidth upgrades to keep pace with Moore's Law. Future developments Gustafson anticipates include more practical optical interconnects for short-range communications; the application of asynchronous design principles to chips for speed optimization and energy efficiency; a comeback for liquid cooling as air cooling reaches its threshold at large HPC sites; the deployment of multiple-thread, multiple-processor cores on chips; and significantly better floating-point math techniques. Gustafson sees the chief goal of HPC as addressing the disparity between the cost-effectiveness of computing clusters and the supercomputing applications they support. He argues that grid-based supercomputing should only apply to closely knit clusters or SMPs, because geographically dispersing the processors used for parallel processing consumes more electricity and money. One of Gustafson's pet projects focuses on the generation of "immortal code," his term for computer programs that can weather at least half a century of technological change; he explains that algorithms are often invalidated by technology that demolishes the assumptions the programs are built upon, and attributes this problem to people exceeding their bounds and doing other people's jobs poorly. The solution, he says, is for all involved parties to remain in their respective roles and clearly communicate what happens at their interfaces. Gustafson also sees value in preserving or rediscovering knowledge of, and sometimes physical specimens of, historical computers, which could spare engineers the burden of reinventing old concepts.
    Click Here to View Full Article

  • "Self-Learning Machines for Document Interpretation and Analysis"
    IST Results (01/17/05)

    Researchers in IST's KERMIT project have devised a system that learns, by being fed paired electronic documents in different languages, to capture and easily categorize the central content of subsequent documents; the system employs a technique that improves significantly on the keyword and phrase recognition strategy most existing information categorization methods use. "The idea was to make the information retrieval language-independent," explains KERMIT project coordinator John Shawe-Taylor of Britain's Southampton University. "So that you could enter a query in English, and the system would pull in relevant information from, say, the French and German documents you have stored, as well as those in English." Shawe-Taylor says the language used has no bearing on the system's ability to categorize documents on its own once it has been trained on the paired documents. In a test that gauged how well the system retrieved French documents from an English query, the software raised document retrieval accuracy 18 percentage points, to 91 percent. Some KERMIT project partners are participants in the PASCAL Network of Excellence, an initiative with a heavy emphasis on general user-interface applications for machine-learning technologies that focuses on the analysis and interpretation of communication in multiple modes. PASCAL is also generating interest in its collaborative filtering potential; the partners have developed a more sophisticated approach to building "recommender" applications, such as Web portals that suggest similar documents or products once the user has chosen several items.
    Click Here to View Full Article
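
    The paired-document training idea can be illustrated with cross-language latent semantic indexing, a simpler relative of the kernel-based methods KERMIT actually used; the three-word vocabularies and toy documents below are invented for illustration:

        # Illustrative cross-language retrieval learned from paired documents.
        import numpy as np

        # Rows 0-2: toy English terms (cat, dog, house);
        # rows 3-5: their French counterparts (chat, chien, maison).
        # Each column is ONE training document expressed in both languages.
        pairs = np.array([
            [2, 0, 0], [0, 1, 0], [0, 0, 3],   # English term counts
            [2, 0, 0], [0, 1, 0], [0, 0, 3],   # French term counts
        ], dtype=float)

        U, s, Vt = np.linalg.svd(pairs, full_matrices=False)
        to_latent = U.T  # 3-dimensional shared space for 6 vocabulary terms

        def embed(en, fr):
            return to_latent @ np.concatenate([en, fr])

        # Index French-only documents (their English half is all zeros).
        index = {
            "doc_chat": embed(np.zeros(3), np.array([1.0, 0, 0])),
            "doc_chien": embed(np.zeros(3), np.array([0, 1.0, 0])),
        }

        # An English-only query for "cat" retrieves the French 'chat' document.
        q = embed(np.array([1.0, 0, 0]), np.zeros(3))
        cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        print(max(index, key=lambda name: cos(q, index[name])))  # -> doc_chat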

  • "DNA Scheme Builds Computers"
    Technology Research News (01/19/05); Patch, Kimberly

    Computers fabricated from self-assembling DNA promise dramatically better power efficiency and processing speed in a much smaller and cheaper package, and researchers at Duke University, the University of North Carolina at Chapel Hill, and Rambus have taken a significant step toward this vision by constructing two computer architectures from DNA. The architectures employ single-strand molecules of artificial DNA with silicon rods on their ends that organize into circuit patterns, with the junctions between molecules metal-plated to form the circuitry. This is a departure from DNA computing research, in which the engineered strands themselves represent computations. One architecture boasts a decoupled array-multiplexer design in which processors communicate solely via a central control unit, while the other uses an oracle design, in which a computer synthesized for a specific problem uses DNA to match question-answer pairs. Both architectures are simple parallel processing schemes that could be tapped for large optimization problems such as the classic "traveling salesman" problem. Duke professor Chris Dwyer notes that redundancy is designed into the nanoelectronic circuitry to handle expected errors in the self-assembly process. The researchers' work was sponsored by the National Science Foundation and was featured in the Oct. 28, 2004 edition of Nanotechnology. Dwyer estimates that the development of a proof-of-concept DNA computer will take five to 10 years, while a practical DNA computer is more than a decade away.
    Click Here to View Full Article
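
    The "traveling salesman" problem mentioned above is the textbook member of this optimization class, and brute force shows why massively parallel (or oracle-style, answer-matching) hardware is attractive: the number of tours grows factorially with the number of cities. The distances below are invented for illustration:

        # Brute-force traveling salesman: with n cities there are (n-1)!/2
        # distinct tours, which is why parallel hardware appeals.
        from itertools import permutations

        D = [  # toy symmetric distance matrix for 5 cities
            [0, 2, 9, 10, 7],
            [2, 0, 6, 4, 3],
            [9, 6, 0, 8, 5],
            [10, 4, 8, 0, 6],
            [7, 3, 5, 6, 0],
        ]

        def tour_length(tour):
            legs = zip(tour, tour[1:] + tour[:1])  # close the loop
            return sum(D[a][b] for a, b in legs)

        # Fix city 0 as the start so each cyclic tour is counted once.
        best = min(((0,) + p for p in permutations(range(1, len(D)))),
                   key=tour_length)
        print(best, tour_length(best))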

  • "Google Joins Effort to Put Millions of Books Online"
    NorthJersey.com (01/18/05); Kladko, Brian

    Initiatives to digitize books and make them accessible online for public consumption have generally kept a low profile, but Google has significantly boosted awareness with the recent announcement of its Google Print project, which aims to convert millions of printed works to electronic form. However, lesser-hyped nonprofit efforts such as Carnegie Mellon University's Universal Library and Project Gutenberg are designed to keep the digitized material unrestricted. "Our objective is to ultimately take the works of man...digitize it and make it free to everybody," declares Carnegie Mellon computer science professor Michael Shamos. Whereas Google Print will only display a small excerpt of works published since 1923, the Universal Library shows the whole text of some books that are still under copyright. Furthermore, works no longer subject to copyright are only displayed one page at a time, without a printing option, on Google Print, but Project Gutenberg, which deals exclusively with books in the public domain, does not restrict viewing, copying, or downloading of material. The Universal Library has thus far reached agreements with over 60 publishers to digitize some 51,000 out-of-print copyrighted books, but Carnegie Mellon librarian Denise Troll Covey says the project will not make overtures to commercial publishers until it has enough money to support features such as a "Buy It" option in which copies of out-of-print copyrighted books are purchased and laser-printed on request, with the publisher and author sharing the money. The Universal Library also plans to pursue a collaboration with Google. Project Gutenberg founder Michael Hart suggests that such cooperation would, for one thing, spare Google the headache of re-digitizing the Universal Library's archive of 100,000 books.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Int. Gathering to Focus on Partnering Industry"
    Scoop (NZ) (01/17/05)

    The University of Canterbury in Christchurch will host the Human Interface Technology Laboratory New Zealand's (HIT Lab NZ) two-day international Virtual Worlds Consortium in February, where about 200 Kiwi and international human-computer interface experts will focus on building partnerships among creativity, research, and industry, the vital ingredients of successful collaboration. "Partnerships between research institutions and industry are essential to promoting a culture of innovation and to taking technology forward," comments HIT Lab director Mark Billinghurst. "They draw together those who produce new knowledge with those who know how to use it productively." Among the event's keynote speakers will be MIT associate professor of media arts and sciences Hiroshi Ishii and University of Saskatchewan associate professor of computer science Dr. Carl Gutwin. Ishii, who is co-director of MIT Media Lab's Things That Think consortium and director of the lab's Tangible Media group, frequently participates in collaborative efforts that meld various artistic, design, and scientific fields. Gutwin's expertise covers topics such as information visualization, the practicality of distortion-oriented visualizations, and groupware architectures and performance, with a particular emphasis on how groupware systems can better complement the fluid, natural interaction of face-to-face collaboration. The first day of the event will include demonstrations of technologies from participating partner universities, companies, and the HIT Lab, while the second day will be filled with workshops covering such subjects as next-generation collaboration, entertainment computing, and the intersection of technology and art.
    Click Here to View Full Article

  • "Director of Science Agency Foresees More Budget Cuts"
    National Journal's Technology Daily (01/12/05); New, William

    National Science Foundation (NSF) director Arden Bement says the science communities will have to adapt to the Bush administration's move to place a higher priority on reducing the budget deficit. President Bush cut the NSF's budget by 3 percent in fiscal 2005, and Bement expects the difficult times to continue when the president introduces his fiscal 2006 budget in February. In a recent interview at NSF headquarters, Bement said the foundation's budget is not likely to double from fiscal 2003 through fiscal 2005 along the lines laid out in the Investing in America's Future Act of 2002. Nonetheless, he intends to point out to budgeters "the importance of investing in the future and the strong linkage between science investment, economic development, and job creation if we're going to maintain our own." The science communities must avoid parochial stances and speak with one voice, which Bement says is the key to improving funding for the NSF. He adds: "Anything we can do to link our university research programs to the challenges facing the nation will enhance our chances for budget success." The NSF's focus on homeland security R&D, nanotechnology, networking and information technology R&D, and extreme events such as tsunamis reflects the priorities of the White House's Office of Science and Technology Policy and Office of Management and Budget.
    Click Here to View Full Article

  • "Machine Wars"
    InformationWeek (01/17/05) No. 1022, P. 54; Claburn, Thomas

    The battle between hackers and system administrators is increasingly reliant on automated tools: bot software is so pervasive that newly connected PCs are subjected to attack within 15 seconds, while the CERT Coordination Center has stopped counting the number of annual hacking incidents. CERT reported 21,756 incidents in 2000 and 137,529 incidents in 2003, its last tally. The increasing importance of the Internet in commercial activity and improvements in bandwidth and processing speed make cybercrime more appealing to criminal organizations, which sometimes hire programmers to create automated tools. One of the most common and effective tools is the Trojan horse downloaded via email or Web site, which allows the hacker to issue commands to the PC or quietly monitor transmissions for sensitive information such as Social Security numbers or online banking passwords; hackers rent out control over these zombie computers at rates between 3 cents and 8 cents per unit per week, says Symantec Security Response senior director Vincent Weafer. Spammers are also circumventing the Captcha protection used by e-commerce firms and free email providers by writing scripts that get human users to solve the puzzles for them; even if software used against Captcha problems is just 10 percent effective, that is enough for some groups, such as ticket scalpers who buy newly issued tickets in bulk. Porn spammers seeking to open free Web email accounts have built systems in which Captcha problems are presented to visitors of their own Web sites as a requirement for entry. System administrators say automated tools, including the automated patching and anti-virus capabilities offered by Microsoft and other vendors, are helping them fight off growing onslaughts. More security vendors are creating solutions that do not rely on signatures or rules at all, and instead adapt to changing network threats.
    Click Here to View Full Article
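
    The rental rates Weafer quotes add up quickly; the one-line calculation below uses those rates, with the 10,000-machine botnet size being an assumed example rather than a figure from the article:

        # Economics of the quoted zombie-rental rates.
        bots = 10_000                # assumed botnet size, for illustration
        low, high = 0.03, 0.08       # dollars per machine per week (quoted)
        print(f"weekly rental income: ${bots * low:,.0f} to ${bots * high:,.0f}")
        # -> $300 to $800 per week for a modest botnet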

  • "IBM Plan to Open Software Patents Seeds IP Debate"
    EE Times (01/17/05) No. 1354, P. 1; Wilson, Ron; Merritt, Rick

    IBM's recent pledge of 500 patents to the open source development community has sparked new discussion about how to handle the increasingly onerous issue of software patents. IBM favors a patent-commons approach and has especially put its weight behind Linux, which is the foundation of more and more IBM server hardware. Businesses will eventually see the benefit of working within a safer patent framework, says Public Patent Foundation executive director Daniel Ravicher, who notes, "People should compete by spending money on R&D, not on patent attorneys." By pledging the 500 patents to the open source community, IBM has given the movement more respectability among business users at a time when uncertainty is worrying many companies. Besides the SCO lawsuits against IBM and other Linux companies, industry watchers say Microsoft has shown subtle indications that it might be preparing a legal attack against Linux; Microsoft is a much more potent threat because any legal attack from it would likely focus on patents themselves, unlike the SCO lawsuits, which are increasingly seen as dealing with copyright and contract disputes. Insiders say Microsoft has privately warned Linux companies that the open source operating system violates its intellectual property. Hewlett-Packard Linux vice president Martin Fink says IBM's patent pledge is not as safe a protection as Hewlett-Packard's indemnification of Linux customers against patent infringement lawsuits. Red Hat deputy general counsel Mark Webbink says congressional reform of the Patent Act could prove an even more useful solution, and Rep. Rick Boucher (D-Va.) is expected to resubmit such legislation this year.
    Click Here to View Full Article

  • "The Open Mind Common Sense Project"
    AI Magazine (12/04) Vol. 25, No. 4; Lieberman, Henry; Liu, Hugo; Singh, Push

    Common-sense knowledge repositories are being developed at MIT and by a number of other groups for commercial and academic purposes. The Open Mind Common Sense (OMCS) Web site is one of the most inclusive efforts because it gathers information from the general public via Web forms; contributors need no special training, and use structured templates and free-form entries to input simple English assertions such as "People live in houses." Since the project's launch in September 2000, more than 14,000 people have contributed to a common-sense database of approximately 600,000 pieces of knowledge. Tailored lexico-syntactic pattern-matching rules were used to distill the common-sense facts into the ConceptNet semantic network, which consists of 300,000 concept nodes and 280,000 links. Although specific meanings are difficult to pin down using ConceptNet, the overall effect is useful and comes easily, especially with "fail-soft" applications that do not need exact inferences. Other MIT common-sense knowledge projects include the Open Mind Word Expert, which allows contributors to tag word senses in the OMCS database; Open Mind 1001 Questions, which queries users for more knowledge based on what it already knows; and Open Mind Experiences, which records stories rather than just individual pieces of knowledge. Other common-sense database projects include Cycorp's applications and SensiCal, which provides common-sense checks for calendar applications so that people do not, for instance, schedule a steak dinner with a vegetarian friend. Andrew Gordon created a photo retrieval system for the Library of Congress that uses word relations to identify relevant, annotated pictures.
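
    The extraction step described above can be illustrated with a single toy pattern that turns template-style assertions such as "People live in houses" into edges of a small semantic network; the real OMCS rules and ConceptNet relation names are far richer, so the pattern and the "LocatedIn" label below are simplified stand-ins:

        # Toy lexico-syntactic extraction: one pattern, one relation.
        import re

        PATTERN = re.compile(
            r"^(?P<subj>\w+(?: \w+)*) lives? in (?P<obj>\w+(?: \w+)*)\.?$",
            re.IGNORECASE)

        assertions = [
            "People live in houses.",
            "A fish lives in water.",
            "Squirrels live in trees.",
        ]

        network = []  # edges of the form (concept, relation, concept)
        for sentence in assertions:
            m = PATTERN.match(sentence)
            if m:
                network.append(
                    (m.group("subj").lower(), "LocatedIn", m.group("obj").lower()))

        for edge in network:
            print(edge)  # e.g. ('people', 'LocatedIn', 'houses')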

  • "Software-Defined Radio and JTRS"
    Military & Aerospace Electronics (12/04) Vol. 15, No. 12, P. 22; McHale, John

    The U.S. military's next-generation Joint Tactical Radio System (JTRS) will be founded on software-defined radio (SDR), an interoperability platform that will allow soldiers to connect to a broad array of old and new communications systems. The $1 billion effort involves replacing traditional radio hardware with devices that can mimic any radio's capabilities via software modules, with upgrades delivered through a wireless data network. "Because radio receivers can be reconfigured over the air, there are decreased maintenance requirements," according to Army descriptions. Bundled into a Joint Tactical Radio are waveforms, radio hardware, and an associated hardware environment complying with the open-framework Software Communications Architecture, which is designed to make the waveforms portable across different radio architectures, according to Spectrum Signal Processing officials in British Columbia. A key factor in improving the efficiency of SDR, and therefore of JTRS, is the use of field-programmable gate arrays, which provide the parallel processing needed for most of the signal processing the waveforms carry out, notes Xilinx's Manuel Uhm. He predicts that "reconfigurable computing will play a very significant role in JTRS, especially as bandwidth and data throughput continue to increase." Researchers at Venture Development Corp. (VDC) say, "The military has successfully demonstrated SDR's abilities and will continue to push this technology to its limits. Meanwhile, commercial wireless products have slowly evolved toward SDR architectures. New device and architectural design advances are opening up a wealth of opportunities in existing and new wireless markets." VDC projects that the worldwide market for vendor embedded computer boards in military and commercial communications systems will be roughly $1.4 billion by 2007, with SDR hardware platforms accounting for over 11 percent of that market.
    Click Here to View Full Article
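
    The core idea, one hardware platform taking on different radios' behavior by loading waveform software, can be caricatured in a few lines; the real Software Communications Architecture is a far more involved middleware framework, and every class and method name below is invented for illustration:

        # Toy software-defined radio: swap waveform modules at runtime.
        from abc import ABC, abstractmethod

        class Waveform(ABC):
            @abstractmethod
            def modulate(self, bits):
                ...

        class LegacyVHF(Waveform):
            def modulate(self, bits):
                return f"narrowband FM burst carrying {bits}"

        class WidebandData(Waveform):
            def modulate(self, bits):
                return f"wideband OFDM frame carrying {bits}"

        class SoftwareRadio:
            def __init__(self):
                self.waveform = None

            def load_waveform(self, wf):
                # In JTRS, reconfiguration can even happen over the air.
                self.waveform = wf

            def transmit(self, bits):
                assert self.waveform is not None, "no waveform loaded"
                return self.waveform.modulate(bits)

        radio = SoftwareRadio()
        radio.load_waveform(LegacyVHF())    # interoperate with a legacy radio
        print(radio.transmit("1010"))
        radio.load_waveform(WidebandData()) # then upgrade purely in software
        print(radio.transmit("1010"))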

  • "What's Next for Google"
    Technology Review (01/05) Vol. 108, No. 1, P. 38; Ferguson, Charles H.

    Google is headed for a showdown with Microsoft in the race to control the organization, search, and retrieval of all digital information, and Google's hold on the market remains tenuous despite its popularity, writes Vermeer Technologies co-founder Charles Ferguson. The search industry is characterized by a lack of standards and incompatible architectures, and the spread of new computer platforms and the increased mingling of data among them is bringing Google and Microsoft's rivalry to a head. The status accorded Google on the strength of its PageRank algorithm has started to wane, and most innovative search functions are coming out of startups rather than established players. One of Google's biggest disadvantages in its competition against Microsoft is its lack of control over any standards for the platforms that will fight to be the reigning search architecture, while Microsoft has a monopoly on Windows, Internet Explorer, and Office. Ferguson warns that Google's hesitation to provide core search engine application programming interfaces, which could stem from a desire to leverage infrastructure expertise or to postpone the war with Microsoft as long as possible, could be a calamitous mistake. Still, there is no guarantee that Microsoft will achieve victory in the search architecture war: Google CEO Eric Schmidt may have the technological expertise and experience to prevail against Microsoft chairman Bill Gates; Microsoft's search offerings have been less than stellar so far; and Google's primary services piggyback on a platform--the Web--that is outside Microsoft's control. Regardless of the ultimate winner, there are reservations about the prospect of a monopolized search industry, including the risk of privacy infringement, given the vast amount of personal data the ruling firm would be privy to, and intensified consolidation of media ownership into a worldwide corporate oligarchy.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)


 

 