Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 873: Friday, December 2, 2005

  • "Electronic Voting Under Scrutiny as Federal Compliance Date Looms"
    Associated Press (11/30/05); Bergstein, Brian

As a Jan. 1 e-voting compliance date approaches, election officials are scrambling to ensure that their machines are reliable, immune to viruses and hackers, and capable of producing a paper record, though it remains unlikely that the required improvements will significantly affect next year's elections. North Carolina is expected to release the names of vendors whose machines meet the new standards of compliance, such as the requirement that a machine's code be put into escrow in the event of a malfunction, a clause that could drive Diebold to withdraw its equipment from the state, as the company contends it cannot disclose the code behind the machines' proprietary Microsoft operating system. In California, election officials are considering a measure that would force vendors to prove that their systems can withstand hacker attacks. Several voting disruptions have already occurred due to malfunctions in equipment that has also been shown to contain security vulnerabilities. California is expected to miss the compliance deadline set by the Help America Vote Act of 2002, which called for a gradual transition from punch card ballots to electronic machines. In 2004, 23 percent of voters cast their ballots electronically, up from just 12 percent in 2000.
    http://www.acm.org/usacm

  • "Secure DNS Faces Resistance"
    Computer Business Review (12/01/05)

    Members of the domain name industry are expressing skepticism about the deployment and adoption of a security enhancement to the domain name system (DNS) that would combat various phishing or pharming attacks. The security enhancement, DNSsec, was a topic of discussion at ICANN's annual meeting this week, with industry members suggesting a lack of regulation and a lack of widespread attacks have discouraged DNSsec's adoption. DNSsec improves the security of DNS records through cryptographic signing, meaning cache poisoning attacks that send users to the wrong IP address for data phishing purposes are less likely to succeed. But there is no documentation to suggest that such attacks are widespread, leading industry members to question whether there is an urgent need for the implementation of DNSsec. "We're still somewhat skeptical about DNSsec, but we want to be open-minded, we want to learn more," says Paul Diaz of domain registrar Network Solutions. There are a number of potential drivers for DNSsec, including consumers, regulation, security-conscious online businesses, e-commerce sites, and software developers. Some domain registrars are concerned about getting a return on investment if they deploy DNSsec, but others, such as Afilias CTO Ram Mohan, warn that registrars will be taking a risk by not deploying the security technology. Former U.S. Secret Service agent Keith Schwalm says the financial sector is aware that DNSsec is important to its business but is taking a slow and methodical approach to the technology's adoption.
    Click Here to View Full Article
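    The cache-poisoning attack described above works because plain DNS answers carry no proof of origin; DNSsec counters it by cryptographically signing records so resolvers can detect forgeries. The toy sketch below conveys only that idea: real DNSSEC uses public-key signatures carried in DNSKEY/RRSIG records, not the shared-secret HMAC and invented record format used here.

```python
import hashlib
import hmac

# Toy illustration only: real DNSSEC uses public-key signatures
# (DNSKEY/RRSIG records), not a shared-secret HMAC as shown here.
ZONE_KEY = b"example-zone-secret"

def sign_record(name: str, ip: str) -> str:
    """Return a hex signature binding a hostname to an IP address."""
    msg = f"{name} A {ip}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def verify_record(name: str, ip: str, sig: str) -> bool:
    """A validating resolver rejects records whose signature fails."""
    return hmac.compare_digest(sign_record(name, ip), sig)

# A legitimate, signed answer verifies...
sig = sign_record("www.example.com", "192.0.2.10")
assert verify_record("www.example.com", "192.0.2.10", sig)

# ...while a poisoned answer that swaps in an attacker's address
# fails, because the attacker cannot forge the signature.
assert not verify_record("www.example.com", "198.51.100.99", sig)
```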

  • "Calculating Wow!"
    USC Viterbi School of Engineering (11/28/05)

    Through the application of first principles of probability theory to a technological framework, two Southern California engineers have developed a mathematical theory of surprise. They collected data by recording the eye movements of volunteers as they watched videos. USC's Laurent Itti and UC Irvine's Pierre Baldi developed a subjective method for classifying and quantifying information in their mathematical surprise theory. While previous work on communication theory focused on the observer, Itti and Baldi examined the information within the communication environment. A communicator's senses are assailed by innumerable stimuli in any setting, demanding an instinctive response that is part of a human's survival instinct. By reducing a signal into feature channels, the researchers isolated stimuli that contain unique visual traits. Through a parallel analysis of the infusion of new elements into the environment, the researchers created a model of what they call novelty. Through this process they developed a formula that predicts how viewers will react to new data entering the video stream. Observing the eye movements of participants in the study confirmed the researchers' theory. The researchers say that "At the foundation of our model is a simple theory which describes a principled approach to computing surprise in data streams...Beyond vision, computable surprise could guide the development of data mining, as it can in principle be applied to any type of data, including visual, auditory, or text." The research was sponsored by the National Science Foundation and the National Institutes of Health.
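    In Itti and Baldi's published formulation, surprise is the Kullback-Leibler divergence between an observer's beliefs after and before new data arrives (in units they dub "wows"). A minimal sketch for discrete beliefs, with invented example distributions:

```python
import math

def kl_divergence(posterior, prior):
    """Surprise in bits: KL(posterior || prior)."""
    return sum(p * math.log2(p / q)
               for p, q in zip(posterior, prior) if p > 0)

# Belief over two hypotheses before and after an observation.
prior = [0.5, 0.5]
unsurprising_posterior = [0.5, 0.5]   # data changed nothing
surprising_posterior = [0.99, 0.01]   # data sharply revised belief

assert kl_divergence(unsurprising_posterior, prior) == 0.0  # no surprise
assert kl_divergence(surprising_posterior, prior) > 0.9     # large surprise
```

Data that leaves the observer's beliefs unchanged registers zero surprise no matter how "informative" it looks, which is what distinguishes this measure from classical Shannon information.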

  • "In Silicon Valley, Job Hopping Contributes to Innovation"
    New York Times (12/01/05) P. C4; Postrel, Virginia

    In attempting to explain the extraordinary resilience of the Silicon Valley economy, analysts identify the area's fluidity and flexibility. Computer industry employees change jobs with far greater ease and frequency than in any other market, and the economy is built on a loosely affiliated network rather than a vertically integrated structure. This enables companies to pursue a variety of projects and adapt to the changing demands of the market. The notion that frequent job hopping strengthens an economy contradicts conventional economic wisdom, though Silicon Valley is unique in that its technology is constantly changing, which demands simultaneous work on many different projects. In Silicon Valley, the result is that the company working on the most cutting edge product changes frequently, as does the top development talent. A group of economists using population tracking data has argued that people change jobs more frequently in Silicon Valley than in any other technology center, and the overall mobility rate is 40 percent higher than the national average. This could be due partly to a California statute that prohibits companies from enforcing noncompete agreements. Outside of the computer industry, however, the fluidity falls off, as San Diego and Los Angeles report lower mobility rates than the rest of the nation.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Tech Security Day Teaches Computer Safeguarding Systems"
    Daily Nebraskan (12/01/05); Zelaya, Kevin

    Computer security was the focus of the Nebraska Information Technology Security Day held this week at the University of Nebraska-Lincoln. Local experts in areas such as identity management, secure communications, and biometric security came together to educate the public on computer-related security issues, according to Don Costello, a lecturer of cryptography and security. Lt. Governor Rick Sheehy gave the keynote address and said Nebraska's emergency preparedness plan, which includes an improved radio system, is becoming a model for other states. "A major problem during 9/11 and Hurricane Katrina was the loss of communication in the radio system," said Sheehy, who is also the state's Homeland Security director. Brian Helge, a computer engineering major who is a student of Costello, presented a slideshow on the Kerberos network security system and how it sets restrictions for use. "After the user is validated, he is granted a ticket for a certain amount of time in which they can access another server or print off information," said Helge. Bob Losee, systems coordinator for UNL Information Services, talked about UNL's move to replace student Social Security numbers with randomly assigned ID numbers, which eliminated 50 processes and made information more secure. The UNL Department of Computer Science, along with the UNL ACM student chapter, sponsored the event.
    Click Here to View Full Article
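    The time-limited ticket Helge describes can be sketched as follows; the class and field names are invented for illustration and bear no relation to Kerberos's actual ticket format or protocol exchanges.

```python
import time

class Ticket:
    """Toy stand-in for a Kerberos service ticket: it names a
    validated user and expires after a fixed lifetime."""
    def __init__(self, user: str, lifetime_s: float):
        self.user = user
        self.expires_at = time.time() + lifetime_s

    def is_valid(self) -> bool:
        return time.time() < self.expires_at

def access_server(ticket: Ticket) -> str:
    # A server honors the ticket only while it is unexpired, so even
    # a stolen ticket is useful for a bounded window at most.
    if not ticket.is_valid():
        return "denied: ticket expired"
    return f"granted: welcome {ticket.user}"

ticket = Ticket("alice", lifetime_s=0.05)
assert access_server(ticket).startswith("granted")
time.sleep(0.1)                     # the ticket's lifetime elapses
assert access_server(ticket).startswith("denied")
```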

  • "The End User: Internet for the Small Screen"
    International Herald Tribune (12/01/05); Shannon, Victoria

    Mobile phone company representatives met with researchers, academics, and technology experts affiliated with the World Wide Web Consortium (W3C) in London last month to devise a plan for making the mobile Internet more like using the Web on a personal computer. One focus of their Mobile Web Initiative will be to establish some best practices for phone makers, browser and email software makers, mobile phone carriers, and Web page designers to follow. Common sense would tell the group to get rid of large graphics, background images, and frames or tables, and to use short titles. Best practices from the W3C include having the software that translates a page figure out how many colors a screen can show, whether the display is square or rectangular, and what browser software the device is using. W3C is also considering providing more detailed error messages; using actual devices to test features and usability; and keeping Internet addresses short. The group also addressed the issues of providing a keyboard shortcut to a hyperlink; eliminating pop-ups, pages that automatically refresh themselves or redirect to another site; and limiting content to the request of the user.
    Click Here to View Full Article

  • "Morphome Project Researching Proactive Computing in Homes"
    EurekAlert (11/25/05)

    Academy of Finland Research Director Frans Mayra's Morphome project indicates that fear of technology is a central concern among homeowners considering the prospect of a smart home in which computers play a significant role. Many test subjects reported the concern that computers are unreliable and reported reluctance to entrust them with important household tasks, though they were also distinctly curious about new features and applications, such as an infoscreen in the foyer to warn of electrical appliances that were left on. When integrating new devices into smart homes, such as time-controlled locks, lighting, or climate controls, Mayra's work demonstrated that it is important to be sensitive to people's fears associated with technology. In its study of automated control of light and sound, the Morphome project found that people are more amenable to computerized control over low-risk areas of the home rather than features such as door locks or the home entertainment center. Test subjects gave ground in areas such as a computer turning off a stove, where safety was a concern. The Morphome project tested smart lights that could alter their intensity or color based on noises they detected. Decibel-light has the potential to alter people's activities as they become aware of how noises affect lighting conditions. The study also determined that people react most favorably to unobtrusive technologies or devices cloaked in a familiar form, such as pillows containing RF sensors. Perhaps most importantly, the Morphome project concluded that to create a commercially viable smart home, designers must respect the need humans have to make their own decisions and their reluctance to turn too much of their lives over to technology.
    Click Here to View Full Article

  • "Air Guitarists' Rock Dreams Come True"
    New Scientist (11/28/05); Knight, Will

    Researchers at the Helsinki University of Technology have developed a system that applies music to the motions of playing air guitar. In the Virtual Air Guitar project, a computer monitors a player's movements through a video camera, and instantly matches them with riffs and licks to produce an authentic sound. Computer vision software monitors the motions and gestures of a user wearing brightly colored gloves. The researchers compiled a repository of sounds from the pentatonic minor scale, which is frequently used in rock guitar solos. Each of a guitar player's movements can be simulated in the system, from slapping the strings to holding one down while strumming energetically. Players can switch into different chords with an attached floor pedal. The researchers, who are demonstrating the project at Finland's Heureka Science Center, say that each playing experience is unique. A forthcoming version will work with a standard computer and Webcam.
    Click Here to View Full Article

  • "Modern Tools to Unlock Ancient Texts"
    IST Results (12/01/05)

    The CHLT project has developed tools that can provide access to vast repositories of historical texts buried in museums and libraries. The project generated a host of morphological analyzers, citation databases, and tools for clustering and visualization, and augmented them with existing dictionaries. The collections from several significant libraries were brought together, including the manuscripts of Isaac Newton and post-Renaissance scientific texts. Also included in the project is a translating capability based on computational linguistics, natural language processing, and information retrieval technologies. The CHLT open access library contains digitized versions of hundreds of historical texts written in Ancient Greek, Early-Modern Latin, and Old Norse with high-resolution images of the aged and fragile pages. The translation process is easy for some manuscripts, as hypertext is created automatically when they are scanned, though the Newton texts must be manually transcribed with XML encoded by hand. Sophisticated language analysis tools contain parsers, which identify the grammatical meaning of words in the obscure languages, where the meaning of a word is determined by its grammatical case. Automatic hypertext links enable a user to click on a word to register it and then look up its meaning in the dictionary. The dictionary can also be used as a translation tool, though the word-by-word process would be painstaking. Tools created in the CHLT project take a computational approach to studying writing style, where word frequencies are measured statistically. In a significant step toward integrating Europe's digital collections, the project created new library architectures and protocols, as well as metadata to facilitate sharing between collections.
    Click Here to View Full Article
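    The statistical study of writing style mentioned above typically begins with relative word frequencies; a minimal sketch of the idea, using invented sample texts and a deliberately simple distance measure:

```python
from collections import Counter

def relative_frequencies(text: str) -> dict:
    """Map each word to its share of the text's total word count."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return {w: c / total for w, c in counts.items()}

def frequency_distance(a: str, b: str) -> float:
    """Sum of absolute frequency differences over the combined
    vocabulary; smaller values suggest similar word-usage habits."""
    fa, fb = relative_frequencies(a), relative_frequencies(b)
    vocab = set(fa) | set(fb)
    return sum(abs(fa.get(w, 0) - fb.get(w, 0)) for w in vocab)

near = frequency_distance("the sun rises in the east",
                          "the moon sets in the west")
far = frequency_distance("the sun rises in the east",
                         "verily quoth he unto them")
assert near < far   # shared vocabulary yields a smaller distance
```

Real stylometry refines this with larger samples, function-word profiles, and proper statistical tests, but the underlying measurement is the same: counting.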

  • "Beyond Gender"
    Utah Statesman (11/30/05); Naylor, Natalie

    Female participation in computer science at Utah State University hovers below the national average of 17 percent, as there were only 17 female computer science majors compared with 207 males at the school this spring. Roughly 10 percent of USU's computer science graduates are women, though they claim more than half of the school's degrees in math, dispelling any thought that their marginal participation in computer science is a matter of ability. Instead, it has more to do with the perception that the field is peopled by nerds, said computer science associate professor Vicki Allan, who partially attributes male interest in the field to their love of video games. While many women perceive computer science as an isolated field, Allan argues that women can join ACM-W and collaborate with other majors for a sense of camaraderie. Some female computer science students feel that, far from isolating them, their minority status makes them stand out in their classes. Campus chapters of national organizations provide a sense of community, such as ACM-W and the Society of Women Engineers. "We have activities for girls in computer science so we can get together," said Divya Pillai, a computer science major who serves as vice president for the USU chapter of ACM-W. At one recent event, ACM-W brought in a guest speaker from Hewlett-Packard. Women do not have to be computer science majors to join--they must merely have an interest in the field.
    Click Here to View Full Article
    For information on ACM-W, visit http://www.acm.org/women

  • "No Train No Gain"
    Australian IT (11/29/05); Thorp, Diana

    On the heels of a recent drop in participation among students in technology courses, the Australian Computer Society has called a meeting of deans and professors to deliberate on the future of technology education. One rumor they will seek to dispel is that the field is all about programming, a belief that discounts some of the most important skills that employers look for, such as security and emerging technologies. There is fear that the declining interest in technology could cause a labor shortage as companies place more of a premium on graduate employment. In response to the flagging interest in technology, many institutions have redesigned their programs based on input solicited from industry, though student participation has not returned to its level prior to the dot-com collapse. Many universities have consolidated their programs, sometimes merging two previously independent majors into one course of study. Some feel that universities must place an increased emphasis on the business side of IT to demonstrate its relevance, rather than focusing on the more technical engineering side. If enrollment continues to slip, however, many smaller departments are in danger of being eliminated. Others could shift their focus to emerging areas such as multimedia applications and gaming. "It's difficult to generalize, but I think any university that's going to have an information technology faculty needs to have it very broadly based to allow for multidisciplinary programs," said John Rosenberg, academic vice-chancellor at Deakin University.
    Click Here to View Full Article

  • "The Internet's Third Wave"
    Financial Times (11/28/05); Noam, Eli

    Although the United States was able to maintain control of the Internet through ICANN during the global summit on the information society in Tunis last week, a more pressing issue than which government controls the Internet is whether the commercial or the non-profit segment of the private sector will be in charge in the near future, writes Eli Noam, a Columbia University professor and director of the school's Columbia Institute of Tele-Information. The apolitical "techies" who built the Internet represent the first wave of the Internet, the largely apolitical and libertarian innovators and dot-com entrepreneurs represent the second, and the highly political Internet social activist or civil society element that desires a state role in the Internet represents the third. The Internet social activists, represented by the NGOs in Tunis, have presented a formidable challenge of taking over the "means of production" in an attempt of social reform. Noam says the second wave can meet this challenge by continuing to advance the technology and winning the public imagination, maximizing its credibility by participating in the debate and offering sound values and principles, and embracing the social entrepreneurialism of the civil society. An alliance between the commercial and non-profit segment of the Internet community can beat back a fourth wave attempt at regulatory control over the Internet economy, writes Noam.
    Click Here to View Full Article

  • "Built to Last"
    Computerworld (12/28/05) P. 23; Mitchell, Robert L

    The life cycles of servers, desktop PCs, and laptops are a lot longer than they used to be, which is putting pressure on vendors to build equipment that will stand the test of time. "There's more pressure on us to make the boxes last a longer period of time," says Lenovo Group's Bill Owens. Yankee Group Research reports the average time between refresh for servers is currently four to six years, an increase from two to three years in the 1990s. Industry experts speculate that companies strapped for cash are motivated to keep equipment for longer periods of time, despite the downside of searching for hard-to-find parts for older equipment. Another reason is the aftermath of the dot-com crash, according to Jeff Wood, director of alliances at CompuCom Systems. "The economy forced a lot of companies to hold on to their equipment a little bit longer," Wood says. "They hold on to servers for four to six years in many cases." The average life cycle for a PC is three to four and a half years, four to five years for a laptop, and seven years for a desktop, according to industry experts who advise organizations to plan ahead if they want to keep systems longer.
    Click Here to View Full Article

  • "Computer Science R&D Goes Begging for Funds"
    EE Times (11/28/05) No. 1399, P. 1; Merritt, Rick

    A report issued by the Defense Science Board earlier this year describes computer science as having outgrown the Defense Department's capacity to support and fund the industry it largely created. In the absence of a transition strategy, the possibility of the United States losing its competitive edge in university research is now very real. Over the past four years, DARPA estimates that it has cut funding for university research for computer science by almost half. In congressional testimony, DARPA director Anthony Tether said that some research projects have moved out of universities and into industry, and described DARPA funding as having remained "more or less constant." At the NSF, funding has actually increased, though the portion of the proposals for computer science projects it sponsors has dropped from roughly one-third to 14 percent. "We are looking at a situation where perhaps 40 percent of the good proposals we get, we don't have the money to fund," said the NSF's Peter Freeman, citing the war in Iraq and natural disasters as higher priorities for government funding. The lack of university funding has shifted much of the burden for supporting research to the corporate realm, which typically only supports short-term projects that have an evident business value. Since 1999, MIT's computer science department has seen the portion of its funding supplied by DARPA drop from 62 percent to 24 percent. As research funding becomes a lower priority for a government grappling with an escalating budget crisis, there is widespread concern that other nations, particularly China, could supplant the United States in the next 10 years as the world's leader in technological innovation. Evidence of this trend can be found in the facilities that many companies are establishing in China, India, and other countries, as well as the appearance of large new universities, such as one China recently opened with a capacity for 30,000 students.

  • "Security Expert: More Sophisticated Net Attacks Likely"
    IDG News Service (11/29/05); Gross, Grant

    Hackers are becoming more sophisticated than ever and are teaming up to do more damage, which could have a devastating effect on a nation's economy, says Scott Borg, director of the Department of Homeland Security-supported U.S. Cyber Consequences Unit. Borg predicts attacks on the U.S. electrical grid, the financial industry, and the Internet may cost companies billions of dollars. Borg told attendees at the E-Gov Institute's Security Conferences in Washington to watch out for terrorist groups and criminal organizations who are now combining their talents. Despite Borg's warning, there are some skeptics who insist things are not as bad as they seem. "I don't know what people are talking about when they say urgent action is needed," says Howard Schmidt, former eBay and Microsoft cybersecurity executive and former White House cybersecurity advisor. Schmidt says cybersecurity vendors and Internet service providers are doing a good job of protecting customers from scams and credit card theft and predicts identity theft and phishing won't be a major cybersecurity issue in the future. He does agree with Borg that criminal hackers will be a major force to be reckoned with in upcoming years. Schmidt urges security professionals to improve wireless security before networked devices are everywhere.
    Click Here to View Full Article

  • "A Risky Gamble With Google"
    Chronicle of Higher Education (12/02/05) Vol. 52, No. 15, P. B7; Vaidhyanathan, Siva

    Though Google's effort to digitize and offer free online access to millions of books from major English-language libraries is an exciting prospect, New York University professor Siva Vaidhyanathan contends that the initiative could compromise much of the freedom and integrity that libraries and other reference resources embody for scholars, researchers, and the general public. Google's reputation as an egalitarian information resource can only be sustained through expansion, and its mission statement to index the world's data raises issues of propriety while also implying that libraries, universities, and other traditional disseminators of information can no longer perform such duties adequately, according to Vaidhyanathan. The author warns that the Google Library Project increases the risk of privacy infringement, as Google's privacy policy makes no promise that the company will keep patrons' individual reading records from the FBI or local law enforcement. Privatization of information resource maintenance is another area of concern because of the inherent instability of businesses, and Google is essentially a business. Vaidhyanathan argues that stable public institutions should oversee a project such as Google Library. The project's most dire implication is the burden it adds to an already unbalanced copyright system, which creates even more uncertainty for librarians, students, researchers, and other innocent copyright users. "The presumption that Google's powers of indexing and access come close to working as a library ignores all that libraries mean to the lives of their users," Vaidhyanathan concludes. "All the proprietary algorithms in the world are not going to replace them."

  • "Is There Anybody in There?"
    New Scientist (11/26/05) Vol. 188, No. 2527, P. 30; Chown, Marcus

    Stephen Wolfram's concept of an abstract "computational universe" informs his theory that extraterrestrial life--indeed, all conceivable knowledge--can be found by plumbing this universe. The premise hangs on the assertion that all phenomena, no matter how complex, are produced by simple algorithms, and Wolfram applies this reasoning to the Search for Extraterrestrial Intelligence (SETI) project. He characterizes SETI's method of scanning interstellar radio broadcasts for regular patterns as inefficient, given the tendency for intelligent transmissions to increasingly resemble random noise as communications technologies advance in step with civilization. Wolfram also doubts humans will ever spot an artifact of extraterrestrial origin because he believes our increasing intelligence will ultimately cause artificial structures to become at least as complex as natural structures. In his book, "A New Kind of Science," Wolfram demonstrates that 256 simple rules can generate very complex patterns similar to real-world structures when applied repeatedly. This has inspired a rethinking of SETI's approach in which Wolfram postulates that finding more complicated rules and charting their outcomes could lead to the discovery of extraterrestrial intelligence in the virtual realm. What is more, the computational universe can be explored for ultimate knowledge, according to Wolfram.
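    The 256 rules Wolfram catalogs are the elementary cellular automata: each cell's next value depends only on the three cells above it, and the rule number's eight binary digits encode the outcome for each possible three-cell neighborhood. A minimal sketch using Rule 30, one of Wolfram's signature examples of complexity arising from a simple rule:

```python
def step(cells, rule):
    """One elementary-CA update on a ring of cells; bit i of `rule`
    gives the new value for the neighborhood whose binary value is i."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 30 grown from a single live cell: the pattern quickly
# becomes irregular despite the rule's 8-bit description.
row = [0] * 15
row[7] = 1
for _ in range(5):
    print("".join("#" if c else "." for c in row))
    row = step(row, 30)
```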

  • "Open Source Lights Up"
    CIO (11/15/05) Vol. 19, No. 4, P. 24; Gruman, Galen

    Industrial-grade open-source business intelligence (BI) and customer relationship management (CRM) applications have started to emerge, and Forrester analyst Michael Goulde urges CIOs to "sync up with their development teams to see [where such applications] might have payback to the organization." When properly managed, open source applications can yield significant cost savings, speed up enterprise operations, and reduce vendor lock-in. Widespread component reuse, improved access to underlying code to customize interfaces across applications, and simpler systems to manage are some of open source's benefits. Open-source applications can be particularly sensible for non-strategic applications such as reporting or salesforce automation, and are more likely to be considered by departments with novel technology requirements and smaller, frugally-budgeted enterprises, according to Goulde. He also says these applications make sense in situations where having a mutually supported tool carries universal advantages and does not erode anyone's competitive edge. However, PricewaterhouseCoopers partner Mark Lobel recommends caution, given that open-source licenses could expose strategic assets injudiciously embedded or reflected in such applications to undesirable parties. Furthermore, the success of open source hinges on volunteer developers, but a product with niche appeal can limit the scope of the potential developer community, and thus impact the application's viability. Ventana's Eric Rogge foresees open-source applications penetrating the BI reporting tool segment, while Common Sense Advisory analyst Don DePalma believes the expanding adoption of open-source databases should spur the creation of open-source reporting tools that exploit such products.
    Click Here to View Full Article

  • "The Legal Landscape of MGM v. Grokster"
    Bulletin of the American Society for Information Science & Technology (11/05) Vol. 32, No. 1, P. 6; Lipinski, Tomas A.

    The Supreme Court's decision in the case of MGM v. Grokster carries significant implications for file-sharing services and other defendants accused of secondary liability for copyright infringement. The Court ruled that defendants may assume a secondary liability if they "distribute a device with the object of promoting its use to infringe copyright, as shown by clear expression or other affirmative steps taken to foster infringement, [if supported by the evidence, the defendant] is liable for the resulting acts of infringement by third parties." In other words, the intentional inducement or encouragement of direct infringement constitutes contributory infringement. In its landmark Sony ruling of 1984, the Supreme Court decreed that producers or distributors of products would be exempt from secondary liability if their products are capable of substantial non-infringing use, except in cases where the copyrighted content being copied and distributed is obtained from sources other than those provided for free through mass media. Judge Posner of the Seventh Circuit suggested in a case involving another online file-sharing service that liability should be assigned to a potential contributory infringer only when it would not be efficient for the infringer to halt its contributory activity relative to the amount of infringement it would deter. The Supreme Court's decision in MGM v. Grokster took into account evidence that, in the Court's opinion, demonstrated a "staggering" volume of infringement committed by the defendants' customers, and a small percentage of actual use of non-infringing material. Liability for inducement is actually consistent with the safe harbor qualification of the Sony decision, because it is framed as an exception to the qualification. The solid documentation, rather than mere suggestion, of legitimate, non-infringing uses holds more weight for the Court.
    Click Here to View Full Article

  • "First ASE Fellows Honored at 20th ACM/IEEE Automated Software Engineering Meet"
    (12/02/05) Hall, Robert

    The steering committee of the ACM/IEEE International Conferences on Automated Software Engineering has established the honorary designation of "ASE Fellow" to be bestowed on those who have rendered significant and sustained contributions to the ASE community through their scientific accomplishments and services. The first three ASE Fellows were honored at the recent ASE annual conference in Long Beach, CA. Cordell Green, Director of the Kestrel Institute and 1985 recipient of ACM's Grace Murray Hopper award, was recognized for his foundational work on the original Knowledge-Based Software Assistant (KBSA) report as well as his scientific achievements and service to the community. Michael R. Lowry, of NASA Ames Research Center in Moffett Field, CA, was honored for his research achievements and for his leadership role in revitalizing the conference series. And Douglas R. Smith, also of the Kestrel Institute, was acknowledged for his seminal scientific work in the area of algorithm synthesis as well as his sustained service to ASE conferences.