ACM TechNews sponsored by Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 788: Friday, May 6, 2005

  • "An Endless Frontier Postponed"
    Science (05/06/05) Vol. 308, No. 5723, P. 757; Patterson, David A.; Lazowska, Edward D.

    The United States risks losing leadership in IT development unless significant investments are made to bolster long-term, academic IT research, write ACM President David Patterson and University of Washington computer science and engineering chair Edward Lazowska. The ACM will award computing's highest prize, the A.M. Turing Award, to Vinton Cerf and Robert Kahn next month for their work 30 years ago in creating Transmission Control Protocol (TCP), one of the foundations of the modern Internet. The accomplishment underscores the importance of government-funded IT research, as the TCP project was funded by the Defense Advanced Research Projects Agency (DARPA). The National Science Foundation and DARPA together have driven a number of fundamental IT breakthroughs over the last four decades, including the microprocessor, the Internet, graphical user interfaces, and single-user workstations. In 2003, the National Academies compiled a list of 19 government-backed IT discoveries that helped create billion-dollar industries. However, government funding for basic IT research has dropped off significantly in the last three years as DARPA has shifted from basic IT research to intermediary projects that aim to deploy existing technologies. Other agencies have not filled the vacuum created by DARPA's pullback from basic IT research. This turnabout in government commitment to IT research comes at a time when the field is growing tremendously and international competitors are ramping up their own capabilities.
    (Access to this article is available to paid subscribers only.)

    For more information on the history of NSF funding, see David Patterson's article in the April 2005 issue of Communications of the ACM.

  • "NSF Sets New Strategy to Improve Nation's 'Cyberinfrastructure,' But Details Are Lacking"
    Chronicle of Higher Education (05/04/05); Kiernan, Vincent

    The National Science Foundation (NSF) has formulated a strategy to bolster the U.S.'s cyberinfrastructure, said NSF director Arden Bement Jr., speaking at the Internet2 consortium's spring meeting on Tuesday. The national cyberinfrastructure is as critical as the interstate highway system or electricity grid, and one of the most important investments for the 21st century, he said. The cyberinfrastructure includes network technology, computer storage, and the ability to retrieve and analyze information. The NSF Cyberinfrastructure Interim Working Group presented the strategy to Bement last week, after two months spent developing the plan. Some NSF scientists had complained about a lack of direction in the agency's computing projects, prompting the report. Bement said many researchers' work is held up by inadequate cyberinfrastructure, which prevents them from making use of an ever-increasing supply of data from scientific instruments. The report also advocated new dynamic computer models that would incorporate new information as it came in. Despite the importance of improving cyberinfrastructure, Bement indicated the NSF would not be dramatically increasing spending in the area because of budget constraints. Instead, he mentioned changes to management structures that would promote cyberinfrastructure.

  • "At 10, Java's Wild Success, Missed Chances"
    IDG News Service (05/05/05); McMillan, Robert

    Sun Microsystems' Java programming language celebrates its 10th birthday this month, having grown from a language aimed at Web developers to a much broader collection of software and specifications used on mobile phones and mainframe computers. "It's been a rocket ride that nobody expected would ever get near this far," says Sun President Jonathan Schwartz. Java was originally left-over code from the FirstPerson interactive television venture and was released by Sun in 1995 as a way to animate images on Web sites. Java's main appeal was its "write once, run anywhere" promise that freed developers from having to compile their code for different hardware. The freely available source code was integrated with the Netscape Navigator Web browser and drew 6,000 attendees to the JavaOne conference in 1996. JavaLobby.org founder Rick Ross says the language drew together the large IT vendors in an unprecedented way. Analyst James Governor notes Java dramatically changed IBM's software group by proving IBM could successfully build on something it did not own, a model IBM then followed with Linux. For Sun, however, the success of Java was mixed. While Sun was not as successful as competitors in marketing its Java developer tools and application servers, the company did benefit from somewhat stymieing Microsoft's .Net effort, which Sun CEO Scott McNealy says would have closed the window on Sun's hardware business. Sun also missed an opportunity to get Java on the desktop, spending seven years fighting Microsoft in court over its desktop implementation. Meanwhile, Sun also made it difficult for companies such as Apple and Intel to contribute to Java, Ross says. Looking back, it is important to remember how quickly Java grew. "There were so many opportunities, it was hard to know what to do," says Sun vice president and Java platform fellow Graham Hamilton.

  • "Bluetooth Tech to Interoperate With UWB, Future Radios"
    Extreme Tech (05/04/05); Hachman, Mark

    The Bluetooth Special Interest Group (SIG) says it will include "command and control" functions for different types of wireless radios in the Bluetooth specification, boosting Bluetooth technology into a new role as a universal wireless interface. Both the UWB Forum and WiMedia Alliance, the two main ultrawideband (UWB) standards groups, praised the move for helping to simplify future wireless environments; an inclusive Bluetooth protocol would benefit users by boosting wireless interoperability and would benefit designers by consolidating design requirements, said leaders from both groups. Bluetooth SIG executive director Michael Foley says the goal is to create a flexible architecture that will accommodate future radios beyond UWB. Technically, the new specification will pair the Bluetooth protocol stack with a UWB physical layer (PHY), so that Bluetooth devices can use UWB as the air interface. By merging Bluetooth and UWB technologies, bandwidth available to Bluetooth devices would be increased drastically using UWB as the air interface, says analyst Dan Benjamin; in addition, developers seeking to create widely interoperable devices would be able to tap the combined capabilities. The use of Bluetooth to implement UWB could hinder wireless USB technology, which also uses UWB as its air interface but is relatively immature. Backwards compatibility for existing Bluetooth devices is a given, since nearly 1 billion Bluetooth devices are expected by the time the hybrid chips hit the market. In addition, the Bluetooth Enhanced Data Rate version is not threatened, says Foley.
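    The "flexible architecture" Foley describes amounts to one control layer driving interchangeable physical layers. The Python sketch below is purely illustrative; the class names, negotiation logic, and data rates are assumptions, not the Bluetooth SIG's actual design. It shows how a single command-and-control layer could pick the fastest radio PHY both peers support:

```python
from abc import ABC, abstractmethod


class RadioPhy(ABC):
    """Abstract physical layer (PHY) that a common control layer can drive."""

    @abstractmethod
    def name(self) -> str: ...

    @abstractmethod
    def max_rate_mbps(self) -> float: ...


class ClassicBluetoothPhy(RadioPhy):
    def name(self) -> str:
        return "Bluetooth EDR"

    def max_rate_mbps(self) -> float:
        return 3.0  # Enhanced Data Rate class throughput


class UwbPhy(RadioPhy):
    def name(self) -> str:
        return "UWB"

    def max_rate_mbps(self) -> float:
        return 480.0  # UWB-class throughput


class BluetoothController:
    """Command-and-control layer that selects among pluggable PHYs."""

    def __init__(self, phys):
        self.phys = phys

    def select_phy(self, peer_supported):
        # Negotiate down to the fastest PHY the peer also advertises.
        candidates = [p for p in self.phys if p.name() in peer_supported]
        return max(candidates, key=lambda p: p.max_rate_mbps())


ctrl = BluetoothController([ClassicBluetoothPhy(), UwbPhy()])
print(ctrl.select_phy({"Bluetooth EDR"}).name())         # legacy peer -> Bluetooth EDR
print(ctrl.select_phy({"Bluetooth EDR", "UWB"}).name())  # hybrid peer -> UWB
```

    In a layered design like this, backwards compatibility falls out naturally: a legacy peer simply never advertises the faster PHY, so the controller falls back to classic Bluetooth.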

  • "IEEE-USA Reports Gigabit Networks Should Be U.S. Priority"
    Control Engineering (05/05/05); Montague, Jim

    The United States lags behind other countries in network capability and is endangering its global lead in the knowledge economy, according to a new white paper from the IEEE-USA Committee on Communications and Information Policy. Information is the main driver behind modern economic growth and improved lives, but current network infrastructure is insufficient to distribute information properly, says IEEE life fellow and former National Telecommunications and Information Administration chief scientist John Richardson. Existing DSL and cable broadband offers approximately 2 Mbps asymmetric speeds, but gigabit networks enabled by fiber optics and high-speed wireless technology would be 50 to 5,000 times faster. In addition, such networks would offer symmetric upload and download capabilities. The IEEE-USA group said gigabit networks would spur greater economic growth in digital home entertainment, improve online education, and enable telemedicine. The report also called on Congress, the executive branch, and the private sector to act on fundamental principles to encourage gigabit network creation, including regulatory flexibility and support for user-owned networks. Richardson notes modern telecommunications infrastructure no longer deals with discrete streams of voice, data, and media, but rather a convergence of those digital streams. Compared to network capability in Japan and South Korea, the United States is woefully unprepared to handle those converged information streams, he says.
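    For scale, the quoted multipliers work out as follows; this is a back-of-the-envelope check, with nothing from the report beyond the 2 Mbps baseline and the 50 to 5,000 range:

```python
current_mbps = 2               # approximate asymmetric DSL/cable speed cited in the report
low, high = 50, 5_000          # speed-up factors attributed to gigabit networks

low_mbps = current_mbps * low          # 100 Mbps
high_mbps = current_mbps * high        # 10,000 Mbps, i.e. 10 Gbps
print(f"{low_mbps} Mbps to {high_mbps // 1000} Gbps")  # 100 Mbps to 10 Gbps
```

    The upper end of the range is what puts these proposals squarely in "gigabit network" territory.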

  • "Feds to Release 20,000 H-1B Visas Next Week"
    Computerworld (05/04/05); Thibodeau, Patrick

    The U.S. Citizenship and Immigration Services (USCIS) department announced yesterday that an additional 20,000 H-1B visas would be made available to foreign workers with at least a master's-level degree from a U.S. school, starting May 12. The visas were approved by Congress last year amid complaints from American tech companies and academic groups that the H-1B program's 65,000-worker cap was not sufficient to satisfy demand, while critics contend American citizens and permanent residents are losing jobs due to the arrival of skilled foreign professionals. Supporters of the cap increase were distressed by an earlier statement from USCIS that the visas would also be available to applicants who do not hold an advanced degree from U.S. universities. Such a change would violate the spirit of the H-1B Visa Reform Act of 2004, says Compete America director Sandra Boyd. But she is now confident the agency has made the "right interpretation" of the law. Tech industry groups asked Congress last fall to approve over 20,000 additional visas, and those organizations are likely to call for even more visas if the ones to be issued next week are rapidly picked up, as some immigration attorneys anticipate.

  • "Summit Calls for 'National Software Strategy'"
    PRNewswire (05/05/05)

    A national software strategy is needed to improve software trustworthiness, empower the U.S. software workforce, reinvigorate software research and development, and encourage software industry innovation, according to a report from the 2nd National Software Summit (NSS2). Software is of extreme importance to the nation because of its role in supporting critical infrastructure, including communications, transportation, finance, and the electrical grid. But compared to what can be built using known best practices, today's software products are unacceptably vulnerable to error and malicious disruption, said Center for National Software Studies (CNSS) President Alan Salisbury, who is also the Army official in charge of U.S. Army software. "For far too long we have simply accepted poor quality software as a fact of life. The real facts are that we know how to build much better quality software today, and we need to invest in even better software engineering for the future," he said. There are several critical gaps that compromise software development today: the lack of adequate development tools and technology needed to build error-free software, failure to apply best practices in software development, and a software workforce facing significant threat from overseas competition. The NSS2 report recommended government, industry, and academic representatives create a National Software Strategy that would ensure the competitiveness of the U.S. software industry while enabling the routine development of trustworthy software products and systems. Implementation of the strategy would then be governed by a similarly representative National Software Strategy Steering Group that would meet approximately every three years.

  • "Electronic Medical Records Seen as Worthy Goal"
    Investor's Business Daily (05/05/05) P. A4; Howell, Donna

    The federal government aims to transition most patients to electronic medical records in an effort to lower costs and increase the efficiency and effectiveness of the U.S. health care system. A federal plan President Bush introduced last year sets the goal of electronic medical records within a decade; the plan promises to reduce hospitals' dependence on paper records and speed up service for patients by eliminating the need to fill out health history forms. Proponents say electronic records could benefit insurers by preventing expenditures on unnecessary, improper, ineffective, or inefficient treatment, which amount to roughly $300 billion a year. Booz Allen Hamilton's Susan Penfield says electronic record keeping can reduce costs for transcription and other operations at physicians' offices, though the information must remain usable by other providers, payers, and researchers. Another challenge for health care providers is ensuring that medical records can be exchanged between facilities in a format that is readable by all IT systems. Interoperability problems must be overcome for electronic medical records to become a reality, and Penfield says one strategy the government may opt for is the encouragement of regional initiatives to tackle this challenge. She says such efforts could be funded through pay-for-performance programs in which insurers provide physicians discounts if medical outcomes are improved by electronic records, while national coordinator for health information technology David Brailer says federal funding may need to be allocated to smaller offices. Despite these challenges, Intermountain Health Care's Dr. Brent James is confident that health care IT "is to the point where it will take off whether we provide massive federal funding or not."

  • "Evaluating a Patent System Gone Awry"
    Washington Post (05/05/05) P. E1; Krim, Jonathan

    Traction for major patent system reforms is starting to build on Capitol Hill as overwhelmed patent examiners approve applications for dubious patents and infringement lawsuits snarl the courts. The system in its current form has also encouraged a new kind of entrepreneur, or "troll," who acquires patents and enforces them against companies for profit without actually producing anything. To discourage the troll business model and lessen the chances of their products being pulled off the market if they are found to be infringing, big tech firms want a higher standard for granting patent injunctions--and new criteria for determining damages based on the component covered by the patent under discussion rather than on the entire product. Small inventors and representatives of biotech and other industries counter that removing such measures would give big companies license to stamp out smaller competitors' patents. Absent from Congress' discussions are more sweeping reforms designed to eliminate the granting of software and Internet business method patents. These patents have played a key role in the patent glut, according to many economists and technologists cited in a 2003 report from the FTC. A later study from the National Academies argues for revitalizing the key standards for patent approval--a patent's uniqueness, usefulness, and non-obviousness--to eliminate low-quality patents that frequently spark litigation. All parties involved in the issue concur that the U.S. Patent and Trademark Office currently does not have enough resources to function properly.

  • "The Speedy Way to Capture a City"
    New Scientist (05/05/05); Singer, Emily

    University of California, Berkeley, engineer Avideh Zakhor has devised a way to quickly generate an accurate 3D model of an urban area, with potential applications for the military, urban planning, emergency services, and the tourist trade. Zakhor's "virtualized reality" technique can produce precise models much faster than virtual reality (VR) systems. VR model generation requires programmers to manually integrate 2D images and distance measurements, a process that SET Associates' Bruce Deal says can take months; virtualized reality, which is automated, takes just hours by comparison. The process involves scanning an urban area with lasers and digital cameras on a truck or plane, while another laser measures the truck's movement and checks its location against data collected from the aircraft laser. A computer then combines this information into a photo-realistic 3D simulation. Deal says the speed of the virtualized reality system is advantageous to soldiers engaged in urban warfare, while Zakhor plans to develop a real-time modeling system with a spinoff company funded by the Defense Advanced Research Projects Agency. The U.S. Navy's development of inexpensive drone aircraft for reconnaissance could further accelerate virtual model generation. A 4D modeling system is also being developed that involves 3D models moving through a virtual space.

  • "Mining for Information"
    Federal Computer Week (05/02/05) Vol. 19, No. 13, P. 52; Chourey, Sarita

    Stanford Linear Accelerator Center (SLAC) director of computing services Richard Mount envisions moving data storage another rung up the evolutionary ladder with his Huge Memory System, a memory-based system that would allow researchers to access vast amounts of data at unprecedented speeds, thus making the reservation of computing time unnecessary. SLAC is a component of the Department of Energy's Office of Science's High Energy Physics Division. The proposed national facility would hold 30 TB of SLAC data, though Mount says DOE officials have made no commitment. SLAC's current data storage system is disk-based, whereas 15 years ago data was stored on magnetic tapes. Mount says the center's computer security has increased 300 percent over the last five years, adding that SLAC's operations were not significantly affected by the Sept. 11, 2001, terrorist attacks. "We have never done any classified research," he notes. SLAC skips rule-based, prescriptive security solutions and has instead made security the responsibility of three people. Mount says each security division is "fully intellectually justified, so that scientists and the security team have aligned goals."

  • "The No-Computer Virus"
    Economist (04/30/05) Vol. 375, No. 8424, P. 65

    The sorry state of IT in the health care industry is reflected in the scores of hospitals and offices that lack electronic medical records--and more importantly, the glaring absence of interoperability between computer systems. Health care IT expert David Bates with Brigham and Women's Hospital in Boston says just about 2 percent of the health care industry's revenues are invested in IT, while other information-intensive industries commit 10 percent of their revenues to IT. Hewlett-Packard's Jeff Miller says 25 percent to 40 percent of $3.3 trillion in yearly global health care expenditures could be saved with adequate IT. In addition, well-designed IT could prevent many medical errors that the Institute of Medicine says are responsible for 44,000 to 98,000 deaths each year in America alone. A report in Health Affairs says $77.8 billion in net benefits could be generated each year by a fully interoperable electronic health records network, to say nothing of statistical improvements that would allow disease outbreaks to be identified more rapidly. In the most desirable future health care architecture, nurses, doctors, and perhaps even patients would enter data wirelessly, which would be formatted and stored in accordance with recognized standards, enabling individual computer applications to find and communicate with each other over the Internet; the system must also be secured through authentication and password protection. Although refinement is necessary, Accenture's Robert Suh is confident that "the technology is [generally] very, very ready." Factors impeding the U.S. health care system's modernization include a fragmented industry, which means that financial incentives for funding a new IT infrastructure are mismatched; a dearth of public confidence in privacy laws is another obstacle.

  • "Get BUFF"
    Government Executive (04/15/05) Vol. 37, No. 6, P. 71; Harris, Shane

    The U.S. military's transition from a monolithic entity to a collection of "light" and "agile" units that can be rapidly deployed anywhere in the world is accompanied by a growing hunger for real-time battlefield information. Dealing with the overwhelming amounts of data is the goal of the BrUte Force Fusion (BUFF) Program being carried out at the Army's Battle Command Battle Lab under the leadership of lab deputy director Jason Denno. BUFF's goal is to attain the second level of the data fusion hierarchy: level I is a single sensor detecting the movement of an object in a battle space; level II blends data from multiple sensors; level III anticipates the units' most probable future actions. The complexity of level II fusion is such that the construction of systems for managing the sensor traffic is, by most scientists' reckoning, over two decades away. However, Denno says such a goal can be reached within just a few years with BUFF. He says analysts traditionally examine the freshest set of facts and evaluate what was occurring at that moment, and they make later evaluations by looking at the freshest data acquired since the last set; Denno argues that this method may overlook historic trends, which BUFF would take into account by basing the assessment on the totality of collected data. The system would theoretically tap advancements in data storage and distribution technology in order to manage the vast volumes of accumulated information.
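    BUFF's internal design has not been published, but the contrast Denno draws (the freshest facts versus the totality of collected data) can be illustrated with a toy example. Everything below, including the numbers and function names, is hypothetical:

```python
# Fused position reports (time, position) for one tracked unit.
reports = [(0, 10.0), (1, 10.4), (2, 11.0), (3, 11.2), (4, 12.1), (5, 11.9)]


def latest_assessment(reports):
    """Traditional method: evaluate only the freshest fact."""
    return reports[-1][1]


def trend_assessment(reports):
    """History-based method: fit a least-squares trend to all collected data."""
    n = len(reports)
    mean_t = sum(t for t, _ in reports) / n
    mean_x = sum(x for _, x in reports) / n
    slope = (sum((t - mean_t) * (x - mean_x) for t, x in reports)
             / sum((t - mean_t) ** 2 for t, _ in reports))
    # Trend-smoothed estimate of the current position.
    return mean_x + slope * (reports[-1][0] - mean_t)
```

    Here the freshest report alone (11.9, down from 12.1) suggests the unit has stalled or reversed, while the full history exposes a steady advance: exactly the kind of historic trend Denno says the latest-batch method overlooks.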

  • "Born Again"
    InformationWeek (05/02/05) No. 1037, P. 32; Babcock, Charles

    Silicon Valley's entrepreneurial spirit is being rekindled by new startup vendors committed to the development of better, faster, and less expensive IT infrastructures through greater discipline, global connection, and dedication than their forebears. Many of the Bay Area's leading companies are focusing on the delivery of software as a service, while increasing security challenges stemming from the tighter integration of corporate IT infrastructure and the Internet also present a major opportunity for Valley startups. One of the biggest advantages new vendors have is the wellspring of local brainpower; Stanford University's Graduate School of Business reports that 38 percent of its graduating classes of 2003 and 2004 remained in the Bay Area, while its engineering school says 53 percent of its graduates stayed. Networking opportunities, potential partnerships, and talent are encouraging established businesses to relocate or set up shop in Silicon Valley. Collaborative Economics President Doug Henton says most jobs in Silicon Valley fall into the technology design and software development categories, with the amount of venture capital dedicated to software rising from 23 percent to 27 percent between 2003 and 2004. Foreign nationals have constituted a sizable percentage of Bay Area talent in recent years, and Berkeley School of Information Management and Systems Dean AnnaLee Saxenian says professional organizations such as the Chinese Software Professionals Association and the Indus Group function as social networking hubs. VentureOne estimates that VC spending in 2004 totaled $20.4 billion, 30 percent of which went to Silicon Valley companies.

  • "Blades Have the Edge"
    IEEE Spectrum (04/05); Wright, Jane

    Compact blade servers that are more power- and space-efficient than conventional servers benefit IT by reducing clutter and simplifying installation, management, and repair. A blade server is stripped down to the bare necessities--a processor-equipped motherboard, memory, networking circuitry, and perhaps a hard drive--and it attaches to a special enclosure equipped with a power supply, a cooling system, disk drives, and interfaces for mouse, USB, keyboard, video, and network links; this reduces power, space, and cabling requirements substantially. Blade management software facilitates the automation and simplification of provisioning and other chores that are arduous and monotonous with conventional servers. The proliferation of blades brings administrators closer to the long-cherished vision of computer rooms with machines operating at peak capacity. The chief consumers of blades are large organizations such as ISPs, financial institutions, and pharmaceutical and telecom companies; reasons for blade adoption include the desire to free up space without sacrificing computing power, the lower administrative and service costs of blades thanks to their upgradability and reconfigurability, and their ability to be easily replaced in the event of failure. Wide blade adoption is being hindered by software and hardware incompatibility issues, as well as cooling and electricity requirements. Though each blade's heat output is low, the combined heat can become substantial when many blades are clustered in a single location, and both major and minor vendors are vying to be the first to resolve this problem. Because of these challenges, many companies are electing to gradually transition to blades as their conventional servers become outdated, limiting blades' adoption to specific applications.

  • "Conceptual Data Modeling Patterns: Representation and Validation"
    Journal of Database Management (04/05) Vol. 16, No. 2, P. 84; Batra, Dinesh

    Florida International University professor Dinesh Batra defines design patterns in the context of data modeling as descriptions of objects, relationships, and attributes that are customized to address a generalized conceptual or logical database design problem. He examines a number of sources and from them extracts 11 data modeling patterns that are commonly found in business applications. Those patterns are identified as entity event, entity type, generic transaction, discrete transaction, time-based transaction, data warehouse, subsequent transaction, recursive, strict hierarchy, plan, and generalization. Batra validates these patterns by analyzing how often they crop up in three data model sources; one source is aimed at data modeling students and the other two target practitioners for the most part. The author concludes that most of the structures occur frequently enough in the sources to be considered patterns, although some patterns are employed more frequently than others. Type structure and transaction structures (generalization especially) are underrepresented in the academic source, while the plan structure is completely absent. The hierarchy structure is poorly represented in two of the sources, leading Batra to speculate that the hierarchy pattern may need to be renamed. The three sources alternately designate generalization, type, and transaction as the leading pattern.
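    Several of the patterns are compact enough to sketch directly. For example, the recursive pattern (an entity type related to itself) is commonly modeled as a self-referencing structure; the `OrgUnit` name and fields below are illustrative, not drawn from Batra's paper:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class OrgUnit:
    """Recursive pattern: each instance may reference another instance
    of the same entity type as its parent (units nested within units)."""
    name: str
    parent: Optional["OrgUnit"] = None
    children: List["OrgUnit"] = field(default_factory=list)

    def add(self, child: "OrgUnit") -> "OrgUnit":
        child.parent = self
        self.children.append(child)
        return child

    def depth(self) -> int:
        # Walk the self-reference up to the root.
        return 0 if self.parent is None else 1 + self.parent.depth()


corp = OrgUnit("Corporate")
div = corp.add(OrgUnit("Division"))
dept = div.add(OrgUnit("Department"))
```

    Restricting each instance to a single parent, as this sketch does, also yields Batra's strict hierarchy pattern as a special case of the recursive one.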
