Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 864:  November 7, 2005

  • "Researchers Look to Create a Synthesis of Art and Science for the 21st Century"
    New York Times (11/05/05) P. A17; Markoff, John

    Larry Smarr, director of the California Institute for Telecommunications and Information Technology (Calit2), believes that artists will be central to the future of computing technology. The fusion of aesthetics and science is readily evident in the design of the building that is home to Calit2, where Smarr claims that artists enjoy the same stature as any scientist. While collaboration with the artistic community is eschewed in many research environments, it is central to Calit2, where artists can routinely be seen advising and conferring with physicists. One example of the collaboration between science and art can be seen in the New Media Arts group at Calit2, which helped design a tiled computer that occupies an entire wall, projecting 100-million-pixel images of the brain. Because the technology displays so much more information than standard screens, the researchers have had to create a new control language, a project jointly led by artists and scientists. Another project underway will create a pack of robotic dogs to be unleashed for some socially relevant purpose, such as seeking out pollution. Ruth West, Calit2's first artist in residence focusing on an artistic approach to the biological sciences, has undertaken a project linking genetics and culture, known as Ecce Homology, which examines human evolution through a visual comparison between human genes and those of rice plants. Calit2's focus on art and culture could help distinguish it from the many labs laboring under reduced funding, and has invited comparisons with MIT's Media Laboratory for its broad and innovative vision. Smarr sees in Calit2 a response to the broad decline in government funding for scientific research, and hopes to work closely with the corporate community to promote an atmosphere of ongoing collaboration.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Internet Fathers Get Presidential Medal"
    CNet (11/04/05); Reardon, Marguerite

    Vint Cerf and Robert Kahn will receive the Presidential Medal of Freedom this Wednesday at the White House for their work in developing the TCP/IP protocols that power communication across the Internet. Earlier this year, they were awarded ACM's 2004 A.M. Turing Award, considered by some to be the computing industry's version of the Nobel Prize. Working for DARPA in 1973, Kahn and Cerf encapsulated packets of information in datagrams that could be perfunctorily read by gateway computers and transmitted to host computers for a detailed dissection of their contents. The network created by TCP led to the interconnected group of networks that became known as the Internet. Eventually the protocol was split and renamed TCP/IP, which is now the standard to which all Internet communication adheres. Cerf is currently employed by Google, after spending much of his career at MCI when it was known as WorldCom, and the Corporation for National Research Initiatives (CNRI). Cerf is also developing a new group of communication protocols with NASA's Jet Propulsion Laboratory, and chairs the Internet Corporation for Assigned Names and Numbers. After a 13-year tenure at DARPA, Kahn moved to CNRI when it was founded in 1986, and eventually became its chairman, president, and CEO.
    Click Here to View Full Article
    For more on Cerf and Kahn as recipients of ACM's A.M. Turing Award, visit http://www.acm.org/awards/turing_citations/cerf_kahn.html
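    The summary above describes the core idea behind Kahn and Cerf's design: gateways give each datagram only a perfunctory look at a small header, while the destination host dissects the payload in detail. The toy sketch below illustrates that separation of concerns; the field names and dictionary format are invented for illustration and are not the actual TCP/IP wire format.

```python
# Toy illustration of datagram encapsulation: routing reads only
# the header; only the destination host unpacks the payload.
import json

def encapsulate(src, dst, payload):
    """Wrap a payload in a datagram with a minimal routing header."""
    return {"header": {"src": src, "dst": dst}, "payload": payload}

def gateway_route(datagram, routing_table):
    # A gateway takes a "perfunctory" look: header only, payload untouched.
    return routing_table[datagram["header"]["dst"]]

def host_receive(datagram):
    # The destination host performs the detailed dissection.
    return json.loads(datagram["payload"])

dgram = encapsulate("hostA", "hostB", json.dumps({"msg": "hello"}))
next_hop = gateway_route(dgram, {"hostB": "gateway2"})
print(next_hop)                    # gateway2
print(host_receive(dgram)["msg"])  # hello
```

    The design point the sketch captures is that intermediate nodes need no knowledge of application data, which is what let independently built networks interconnect.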

  • "Fab Labs Unshackle Imaginations"
    Associated Press (11/06/05); Jewell, Mark

    MIT has established seven Fab Labs around the world that bring expensive design tools within reach of the amateur inventor, such as Makeda Stephenson, a 13-year-old who used the resources to design a flight simulator program. Each lab contains toolsets that would cost roughly $25,000, though by offering free access the program's advocates believe they will foster a climate of innovation and empowerment in small or impoverished communities with scant resources. Among the projects undertaken so far have been a tracking system for Norwegian sheep farmers to monitor the movements of their herds, a measurement tool Indian dairy farmers have designed to gauge the fat content and quality of milk, and an initiative in Ghana to convert solar energy into electricity. The devices in the labs run on open source software, and include commercially available tools such as a laser cutter, a milling machine, and a sign cutter. Designers are encouraged to share their ideas with others both in the lab and at the other labs around the world, bringing the open source approach to hardware. So far, Fab Labs have been welcomed by the community of industrial designers, who see them not so much as competition but as a catalyst for further innovation. While individualized designs are almost universally welcomed, some question where the raw materials will come from to support a decentralized model of production. Supporters of the labs, however, argue that their value does not lie in the commercial viability of the designs, as many are drawn to them for the personalized approach they offer.
    Click Here to View Full Article

  • "High-tech Industry in High Spirits"
    The Hill (11/03/05); Snyder, Jim

    Lobbyists predict a strong showing of support for the tech industry when the House and Senate agree on their budget terms for deficit reduction. While many other industries can expect to see funding cuts, the tech sector is likely to enjoy comparatively strong support due to the impending conversion from analog to digital television, the provision expanding the number of H-1B visas that bring skilled foreign workers to the United States, and the grant funding for college students concentrating in math and science. Senate Republicans are calling on the tech industry to step up its lobbying efforts, as the fate of the controversial budget reconciliation hinges on the tech provisions. When the broadcast spectrum in use by analog transmission is auctioned off, the government expects to see $10 billion of revenue, while the tech industry will enjoy increased sales stemming from the demands of the new users of the spectrum. A task force commissioned to promote talking points to tech lobbyists emphasizes the competitive necessity of freeing up analog bandwidth in order for the United States to keep the pace set by countries such as Japan and South Korea. The version of the bill in the Senate provides for 30,000 more visas than the House version. The reconciliation bill calls for $2.25 billion in tuition funding for students with concentrations in math, science, engineering, or foreign language, and would augment the Pell Grant program that offers low-income students $4,050 in tuition funding. Lobbyists for the tech industry are confident that the Senate version will prevail, as the House version increases the cost of H-1B visas, which Republicans are likely to reject as tantamount to a tax increase. The tech industry still sees the immigration reform as an interim step, and will continue to push for a further loosening of the restrictions on skilled foreign workers.
    Click Here to View Full Article

  • "US Business: New Internet Governance Not Needed"
    IDG News Service (11/03/05); Gross, Grant

    The argument over U.S. Internet governance will come to a head on Nov. 16 in Tunis, Tunisia. The World Summit on the Information Society, sponsored by the United Nations, will attempt to restructure Internet control. Several countries want to take away U.S. control over ICANN, which manages domain names, root servers, and IP addresses, and make it more international. This decision has not been met without criticism from the United States. "Countries can lambaste the U.S. because they don't like the way ICANN is structured," said Fred Tipton, director of international organizations and development at Microsoft. "In the long run, we need to respond to what I think is a legitimate responsibility of government officials in every country in the world to understand how the Internet impacts their fundamental cultural, economic and social situation." ITAA President Harris Miller asked participants at a recent forum what the worst possible outcome of the WSIS would be, and several replied that a move toward an international governing body would be detrimental to the Internet. "If the purpose is to slow down the development of the Internet and to inhibit entrepreneurship and innovation, that is the way to do it," said BellSouth's Frank Urbany.
    Click Here to View Full Article

  • "EA Tank Contest Seeks AI Talent"
    BusinessWeek (11/03/05); Campbell, Colin

    Electronic Arts believes artificial intelligence will be a major determining factor in which games stand out from the rest of the pack over the next five to 10 years. Games are no longer differentiated by realistic graphics, which are starting to be produced by specialized hardware. As a result, EA is sponsoring a competition that will help the company identify the best young talent in artificial intelligence. Open to students at select universities across the country, the competition calls on participants to write an artificial intelligence program that pits two military tanks against each other in an arena, and program one tank to emerge from the battle victorious. For the Tank Wars competition, EA will provide the software code for free, and entrants do not relinquish ownership of their intellectual property. Technical skills, originality, and creative merit will determine which entries are chosen as winners by the EA judging panel. "With this competition, we hope to find people with a passion for AI and understanding of the magic that makes a game truly fun to play," says John Buchanan, university research liaison at EA.
    Click Here to View Full Article

  • "The Private Spy Among Us"
    National Journal (11/05/05) Vol. 37, No. 45, P. 3457; Harris, Shane

    Congress has joined privacy advocates in expressing concern about how the federal government is using data on U.S. citizens and foreigners gathered and analyzed by ChoicePoint in Georgia. Although existing laws prohibit the government from keeping tabs on citizens and foreigners, the FBI and the Pentagon are using the data aggregator's private databases, which hold information on nearly everyone in the country. The FBI refuses to discuss its monitoring activities as well as its arrangements with ChoicePoint, which has legally obtained over 19 billion records used to build profiles, including concealed-weapons permits; marriage and death certificates; registrations for boats, aircraft, and automobiles; eviction notices; credit card information; hazardous-materials-handling permits; and employment histories. The government considers people who have overstayed their visas to be security risks, and says relying on private data brokers is a more effective strategy for gathering information than sending investigators to courthouses and clerk's offices across the country. Nonetheless, lawmakers have started to introduce legislation to ensure that the federal government does not misuse the information, but privacy advocates say the bills are too limited in scope and are not as strong as legislation that is being introduced at the state level. Moreover, reining in private data brokers will be difficult because they do not have to answer to Congress, says Steven Aftergood, director of the Project on Government Secrecy at the Federation of American Scientists. "And the secrecy of most intelligence work makes them all but impervious to independent oversight," he says. "If they broke or bent the law, we might never find out."

  • "'Next Big Thing' May Be Process, Not a Product"
    EE Times (10/31/05) No. 1395, P. 6; Brown, Chappell

    The next revolution to transform the global technology landscape will not be a specific product, but rather what IBM calls business process transformational services (BPTSs), says IBM Research director Paul Horn. IBM is betting that the next engine for growth will be streamlined processes that can remove inefficiencies from worldwide GDP. Horn sees in the open source movement a major disruption in the business value of software, as open source software has had a sweeping impact on every sector of the industry, forcing commercial enterprises to revisit their business models. Horn admits that technologies such as new hardware architectures and redesigned semiconductors will shape the future whether or not they create profits for a company. IBM remains committed to its original philosophy of encouraging innovation on every level through a combination of science and business. Horn says that IBM's basic research work has historically led to business success, and thus will remain the company's focus. IBM's Blue Gene system still occupies the top spot on the list of the world's fastest supercomputers, though the company's Tilak Agerwala notes that power limitations are slowing the increase of clock frequencies. In light of those limitations, Agerwala acknowledges that the new breed of performance-driven applications will not be sustained merely by improvements in frequency. The Blue Gene model attempts to provide high-throughput data processing within a unified architecture, positioning it to compete with emerging servers and workstations as well as new freely available software packages.
    Click Here to View Full Article

  • "Carnegie Mellon's Entertainment Technology Center Develops Peacemaker Videogame"
    Carnegie Mellon News (10/27/05); Sloss, Eric

    Students and faculty at Carnegie Mellon University's Master of Entertainment Technology program have created PeaceMaker, a new videogame simulation developed as an alternative to today's increasingly violent video games. It was tested in Pittsburgh with focus groups from two local schools, and the team worked with professors Tina Blaine of the Entertainment Technology Center and Laurie Eisenberg of the History Department. "PeaceMaker is a unique video game in that it teaches students how to achieve peace through negotiation and cooperation, unlike many video games that rely on violence," said PeaceMaker producer Asi Burak. The students on the project bring backgrounds in graphics, game design, math, programming, advertising, and music. The concept for the video game is based on the Israeli-Palestinian conflict. Each player can assume the role of the Palestinian president or the Israeli prime minister and learn to negotiate conflicts, communicate with other politicians, respond to suicide bombers, and take military action as a way to resolve conflict. PeaceMaker will be available for free online to the public in spring 2006.
    Click Here to View Full Article

  • "ACLU Challenges Patriot Act"
    CNet (11/02/05); Reardon, Marguerite; McCullagh, Declan

    The ACLU is asking the 2nd Circuit Court of Appeals to uphold two separate lower court rulings in Connecticut and New York that reduced the scope of the U.S. Patriot Act's provision regarding an FBI mandate to secretly demand public information from libraries and Internet service and other communications providers on the grounds of national security. The mandate also requires the recipient of the request to remain silent about the request. In one of the cases, a district court judge ruled that the "national security letter" (NSL) statute, relating to the letters that are sent requesting information about subscribers--including addresses, calls made, email subject lines, and logs of visited Web sites--was unconstitutional, violating free speech rights as guaranteed under the First Amendment. The Connecticut case, filed by the ACLU on behalf of the American Library Association, ended in a ruling that the gag order also violated free speech rights. The second case was sent back to the 2nd Circuit by U.S. Supreme Court Justice Ruth Bader Ginsburg, who rejected the association's request to lift a gag order. Congress is now working on a bill that could expand the Patriot Act. A measure meant to allay concerns has already been approved by lawmakers in the House; the measure would allow recipients to consult with their attorneys, and a similar proposal is pending in the Senate. If passed, the measures would in effect halt the two lawsuits by making NSL procedures less vulnerable to court scrutiny.
    Click Here to View Full Article

  • "US Youths Use Internet to Create"
    BBC News (11/04/05)

    The popularity of Web tools as instruments for creating and sharing online content among American teenagers should serve as a wake-up call to traditional media companies, which should reconsider their relationship with this increasingly powerful demographic, according to Pew Internet and American Life Project director Lee Rainie. "These teens would say that the companies that want to provide them entertainment and knowledge should think of their relationship with teens as one where they are in a conversational partnership, rather than in a strict producer-consumer, arms-length relationship," he explained. Recent Pew research indicates that 12- to 17-year-olds were more likely than adults to read and possess a blog, and those with blogs were more likely to remix and share music and images. Girls were more likely than boys to share their work with others online, while almost one in five who use the Net said they used other people's content to help create their own. Adolescent bloggers are far more concerned about using blogs to form and maintain relationships with peers, and research estimates that 25 percent of 15- to 17-year-old girls who are online blog--10 percentage points more than boys. The Pew report finds that 51 percent of U.S. teens downloaded music and 31 percent downloaded video with full knowledge that doing so was wrong; 75 percent of that group said the ease of downloading and file-sharing was enough to make expectations that people would not do it unrealistic. "At a time when social norms around digital content don't always appear to conform with the letter of the law, many teens are aware of the restrictions on copyrighted material, but believe it's still permissible to share some content for free," said Pew report co-author Mary Madden.
    Click Here to View Full Article

  • "Redefining Cool"
    Computerworld (10/31/05) P. 28; Mitchell, Robert L.

    Since 1996, Hewlett-Packard has had a team of engineers known as the Cool Team devoted to developing innovative approaches to reducing heat in data centers. As computer densities have increased, heat management has become a more pressing concern, said Cool Team founder Chandrakant Patel, who noted that cooling issues now constitute half of the cost of a data center. Patel said that in a data center with 100 racks of servers, power consumption could easily be 1.3 megawatts, which translates to $1.2 million a year, half of which is devoted to cooling costs. With the goal of cutting those costs by half, Patel's team runs a fluid dynamics analysis of the data center layout, examining the air conditioning and exhaust systems to optimize the racking configuration. Patel's strategy also seeks to reduce the workload of the compressor by adding flexibility to adjust the air flow through advanced refrigeration systems. Creating an adaptable data center requires multiple thermostats that can identify the different cooling requirements of precise locations. In the absence of sensor inlets in every server, Patel has used a robot to walk the aisles of a data center and transmit temperature data to a remote location, where it can be plugged into algorithms that adjust the operating capacity of the air conditioners. Robots are ideally suited for the data centers because they can gather precise measurements in an environment where temperatures can be highly localized. Patel believes the future of data center heat management will be defined by the flexibility to alter power settings and to shift the operating capacity of a given region of the center if an air conditioning unit fails. He also envisions cooling on a global scale, where the workload of a company with data centers throughout the world is concentrated most heavily in the areas where the compressors will have the least work to do based on the location's ambient temperature.
    Click Here to View Full Article
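    The cost figures Patel cites can be checked with back-of-the-envelope arithmetic. The sketch below reproduces them; the $0.105/kWh electricity rate is an assumption chosen to match the article's $1.2 million figure, and actual utility rates vary by region.

```python
# Sanity check of the quoted figures: 1.3 MW for ~100 racks,
# roughly $1.2M/year, with half of the bill going to cooling.
HOURS_PER_YEAR = 24 * 365      # 8,760 hours

power_mw = 1.3                 # draw for ~100 racks (from the article)
rate_per_kwh = 0.105           # assumed utility rate, $/kWh

annual_kwh = power_mw * 1000 * HOURS_PER_YEAR
annual_cost = annual_kwh * rate_per_kwh
cooling_cost = annual_cost / 2  # article: cooling is half the cost

print(f"annual energy:  {annual_kwh:,.0f} kWh")
print(f"annual cost:    ${annual_cost:,.0f}")
print(f"cooling share:  ${cooling_cost:,.0f}")
```

    At the assumed rate, 1.3 MW comes to about 11.4 million kWh and roughly $1.2 million per year, of which about $600,000 would be cooling, consistent with the article's numbers.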

  • "Serious Games, Serious Computing"
    Intelligent Enterprise (11/05) Vol. 8, No. 11, P. 10; Grimes, Seth

    In a nod to the enduring influence gaming has on the development of advanced computers, the 2005 High Performance Computing Users Conference featured a session entitled the "science of games." While they are often enormously complex environments with elaborate rule systems, games offer a human dimension not found in many other applications. Presenting at the conference was Michael Zyda, the creator of America's Army, a massively multiplayer online game (MMOG) with millions of users that also found an unanticipated function as a military training tool. Serious games are those used to instruct, but forfeit their utility if they are not fun to play. Zyda endeavors to introduce structure to serious games by integrating scientists into the development team to address human performance engineering. The goal is to produce an immersive framework that can simulate characters and emotions in a MMOG. Developers are also using artificial intelligence to facilitate widescale interaction among participants. Network and end-user hardware enable the MMOG functions of connectivity and rendering with the goal of supporting clients attempting to connect to a game, download the code required for display, and interact with other players. The relationship between high-performance computing (HPC) and gaming is reciprocal, as advances in gaming depend on HPC, which the $15 billion-a-year gaming industry will in turn support with extensive research funding. Zyda believes that improved player interaction and physics modeling are areas for growth in HPC where the gaming industry could contribute. Zyda envisions best practices for HPC emerging from serious gaming, as the development of realistic scenarios and characters draws on a collaborative team of experts by necessity.
    Click Here to View Full Article

  • "Where's MIMO?"
    Network World (10/31/05) Vol. 22, No. 43, P. 48; Mathias, Craig

    Those who are looking to buy standards-based, Wi-Fi Alliance-approved, enterprise-ready 802.11n Multiple-Input, Multiple-Output (MIMO) gear will have to wait while the competing vendor groups resolve their differences. Although the 802.11n effort has been ongoing within the IEEE for three years, a final standard is not expected to be ready until the end of 2006, with the first products coming to market in 2007. One 802.11n proposal comes from the TGnSync group--which includes Agere, Atheros, Cisco, Intel, Qualcomm, and Symbol Technologies--whose key position is support for wider bandwidth channels (40 MHz vs. the 20 MHz used in 802.11), which could potentially simplify the designs of standards-based products. The other proposal, called WWiSE, comes from a group that includes Airgo Networks, Broadcom, Conexant, HP, Motorola, Siemens, and Texas Instruments. One interesting aspect of WWiSE is a royalty-free contribution of intellectual property, which could potentially lower costs for products based on the standard. Both groups were asked to meet offline and resolve their differences, with a final agreement expected by next month, since neither had been able to win the 75 percent approval necessary to pass a standard. Although a proposal could be ready by next month and approved as a draft standard in January, the completion of a draft standard is not the same as the completion of a standard. Much more work remains after a draft standard is complete, including resolving errors and inconsistencies.
    Click Here to View Full Article

  • "The Open Source Maturity Model"
    Enterprise Open Source Journal (10/05) Vol. 1, No. 2, P. 22; Golden, Bernard

    In his book, "Succeeding With Open Source," Navica CEO Bernard Golden details the Open Source Maturity Model (OSMM), a tool designed to help organizations determine whether open source products under consideration can satisfy their unique support, training, documentation, integration, and services needs. The OSMM measures these elements, gives each product component a maturity score, and produces a numeric ranking representing the product's overall maturity. Early adopters willing to accept incomplete products comprise only about 15 percent of the market, while the lion's share consists of pragmatists who demand mature, efficient, easy-to-use, fully functional products. Yet the economic structure of open source development is such that few providers can deliver products mature enough for pragmatic organizations, and this is where the OSMM comes in. The OSMM offers a three-step evaluation framework: Critical product elements are assessed in the first step through the definition of organizational needs, location of resources for open source products, determination of each element's maturity level, and assignment of a maturity score on a scale of 0 to 10. The second step weights each component's maturity score based on the organization's requirements, and individual element scores are totaled to yield an overall product maturity score between 0 and 100 in the last step. Recommended minimum maturity scores are included with the OSMM to guide organizations as to what level of maturity needs to be present for a product to be suitable for experimentation, pilot/departmental, or production environments.
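    The OSMM arithmetic described above--per-element maturity scores of 0 to 10, organization-specific weights, and an overall score of 0 to 100--can be sketched in a few lines. The element names, scores, and weights below are invented for illustration; only the scales and the weighting step come from the article.

```python
# Minimal sketch of the OSMM scoring arithmetic, assuming weights
# that sum to 10 so a 0-10 element scale yields a 0-100 total.
def osmm_score(elements):
    """elements: mapping of element name -> (maturity 0-10, weight)."""
    total_weight = sum(w for _, w in elements.values())
    if abs(total_weight - 10) > 1e-9:
        raise ValueError("weights must sum to 10 for a 0-100 scale")
    # Weight each element's maturity score, then total them.
    return sum(score * weight for score, weight in elements.values())

# Hypothetical product evaluation (all numbers illustrative).
product = {
    "software":      (8, 4),    # this organization weights the code heavily
    "support":       (6, 2),
    "documentation": (7, 1.5),
    "training":      (5, 1),
    "integration":   (6, 1),
    "services":      (4, 0.5),
}
print(osmm_score(product))  # 67.5
```

    An organization would then compare the total against the OSMM's recommended minimums to decide whether the product is mature enough for experimentation, pilot, or production use.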

  • "Frontiers of Search"
    Computer (10/05) Vol. 38, No. 10; Ramakrishnan, Naren

    As the gatekeepers to information, search engines are of paramount importance to any Web user. Today's search engines must wrangle not only with the question of pairing results of optimal relevance with a user's query, but also of combating the manipulative search engine optimization tools that seek to unfairly improve the ranking of certain sites. Some have argued that the type of search where a user is trying to "re-find" a site they have seen before should be treated distinctly from original "finding" searches. The conventional approach to search solicits a brief query from the user and returns a list of results ranked by an algorithm, but there is an emerging school of thought that views that method as incomplete, and at best as an imperfect representation of the user's intentions. Its proponents argue that search should instead draw out the user's needs by engaging the user in a dialogue. It is also important to view the Web as an organized structure of related, hierarchical information in order to refine the traditional search algorithms. Bharath Kumar Mohan has defined the concept of nurturers, which correlate to sites that have precipitated the association networks that are becoming more important as social networking occupies a more prominent position on the Web. The Semantic Web also offers opportunities for more precise searching, as it introduces logical reasoning into the equation, where searches can factor in a user's preferences in a more nuanced fashion than traditional ranking algorithms.
    Click Here to View Full Article

  • "Managing Semi-Structured Data"
    Queue (10/05) Vol. 3, No. 8; Florescu, Daniela

    Oracle's Daniela Florescu writes that managing information today is complicated by the fact that the semi-structured data most information consists of defies quick and efficient modeling with traditional schema tools, techniques, or software. Most automatic information processing tools will not work unless the information is modeled under some variation of an entity-relationship schema, and the data schema must be developed before the data is created, which is not always the case. Eliminating schemas altogether can only compound problems because they assign meaning to data and thus permit automatic data search, comparison, and processing. "We have to learn to use and exploit schemas as helpers, but not rely on their existence or allow them to be constraining factors," Florescu explains. The author examines various tools--search engines, Semantic Web technologies, and XML--and notes that none can serve as an all-purpose solution to the problem of semi-structured data. In short, the challenge cannot be addressed by any single methodology. Among the tasks that must be achieved to resolve the semi-structured data management problem is the development of improved information-authoring tools that can generate data in an automatically digestible, easy-to-use form; renewed dedication to information extraction techniques; the creation and reuse of standard schemas and vocabularies; the production of automatic or semi-automatic schema-to-schema mapping tools; and automatic processing of semi-structured data.
    Click Here to View Full Article