ACM TechNews sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 876: Friday, December 9, 2005

  • "EFF Moves to Block Certification of e-Voting Systems"
    CNet (12/08/05); Broache, Anne

    The Electronic Frontier Foundation (EFF) is seeking to overturn the recent certifications of e-voting machines in North Carolina, claiming that state officials did not uphold their legal obligations in approving them. The EFF filed the complaint on behalf of a voter advocacy group, urging the judge to nullify certifications granted to Diebold, ES&S, and Sequoia Voting Systems and to impose a moratorium on the licensing of new machines until election officials comply with the state laws that took effect this summer. Those laws created new standards for certifying e-voting machines and called for the de-certification of existing systems, and the complaint charges that officials ignored them on two counts: they failed to give the systems' security features a thorough review, and they did not collect all of the systems' source code. Diebold filed a complaint in a North Carolina court last month alleging that the certification requirements are unreasonable, claiming it would be impossible to turn over all of the proprietary source code that powers its machines, but the judge ruled that Diebold must comply anyway or be penalized. State officials nonetheless certified Diebold's machines, along with those of the other companies, reasoning that no vendor could fully meet the source code requirement and that independent testing had found the machines acceptable, provided each company turns over the source code it has and tells the state where it can obtain third-party code by a given date.
    Click Here to View Full Article
    For information on ACM's e-voting activities, visit http://www.acm.org/usacm

  • "Tech Development a French Resolution"
    Washington Post (12/09/05) P. D1; Goodman, Peter S.

    France is converting its southern coast into a technology hub, a sort of Silicon Valley on the Mediterranean, where hundreds of companies have joined Texas Instruments and Alcatel in setting up sprawling corporate campuses. The initiative aims to vault the nation into a position of global leadership in the high-tech and telecommunications sectors and to address the unemployment crisis that erupted into violence in the suburbs last month. The Côte d'Azur has been named one of six Competitive Poles in the nation, hot spots of research and industry in line for $1.8 billion in tax credits and subsidies in a development process the French government insists must proceed collaboratively, with companies working together and sharing research as they develop new products and technologies. France has never been short on innovation, but it has traditionally faltered when it comes to converting research into an engine of economic growth. Government intervention is intended to solve this problem and to create more than 84,000 jobs in the next three years and approximately 200,000 over the next decade. George Kayanakis, chief executive of ASK, a Cannes maker of computerized tickets that is helping produce the electronic labels used to track inventory worldwide, says, "In France, we have a lot of people who know a lot about high-tech, but they are not really put together. The Frenchman doesn't really like to gather. You need to have somebody who tells you, 'You have to get together.' You have to have a project that is fully approved by everybody, with all the details worked out and no risk for anybody."
    Click Here to View Full Article

  • "U.S. IT Workers Stressed Out"
    EE Times (12/08/05)

    IT workers are more stressed out about their jobs than other workers, according to new research from ISR, an employee research and consulting firm. Job stress is cited as a problem by 51 percent of U.S. IT workers compared with 41 percent of the overall workforce, and about 53 percent of IT workers say their workload is too heavy, compared with 39 percent of all other workers. The share of IT workers who would seriously consider leaving their jobs has risen from 16 percent in 2004 to 25 percent this year, and 57 percent of IT workers fear they may lose their jobs within the next year, compared with 47 percent of all other workers. Moreover, IT workers believe innovation is less likely to be rewarded, and only 33 percent say they are unconcerned about a reorganization at their company. "These responses reflect the growing perception that companies view the IT function primarily as a cost-center instead of as a source of innovation that delivers a competitive advantage," says ISR executive director Gary Berger. "As the trend of outsourcing continues to gain momentum and the tough economic conditions of recent years continue to fade, it has created a perfect storm that could lead to higher turnover among IT workers as the U.S. economy continues to gain strength."
    Click Here to View Full Article

  • "'Data-in, Data-out' Signals Quantum Breakthrough"
    New Scientist (12/07/05); Knight, Will

    Two research groups have independently demonstrated a technique for transferring quantum data between atoms and photons that could lead to impenetrable, lightning-quick computers and international communication networks. Laser pulses extract quantum data from a cloud of rubidium atoms and package it as a single photon, which is then sent across an optical fiber to a second cloud of atoms. The atomic clouds are known as quantum memories, and linking them is critical to constructing networks that can harness the vast potential of quantum phenomena. In the experiments, the photon carried the quantum state of the excited atoms across roughly 100 meters of fiber-optic cable to another cloud of rubidium atoms, where laser pulses transferred the state into the new cloud. To separate the photon from the laser pulses, the researchers filtered the light with crystals based on its reflectivity, polarization, and absorption. Harvard's Matthew Eisaman described the filtering as a crucial step toward long-distance quantum communication networks impervious to eavesdroppers. Existing methods of quantum transmission deteriorate over distances of tens of miles, but a quantum repeater would store information before retransmitting it, preserving its integrity over great distances. Quantum computers could also emerge from this technique, passing information from one area of memory to another and theoretically performing billions of calculations at the same time. Before any commercial application emerges, however, researchers will have to extend the time that quantum data can be stored in the clouds of atoms.
    Click Here to View Full Article

  • "Men Are From Mars, Robots Are From Mitsubishi"
    Financial Times (12/09/05) P. 9; Pincock, Stephen

    As Carnegie Mellon roboticist Daniel Wilson outlines in his book, "How to Survive a Robot Uprising," the field of robotics has taken off in recent years, with researchers around the world developing robotic applications to do everything from vacuuming to exploring space alongside humans. Several Japanese companies are developing robots that can serve as in-home assistants, link up to the Internet to answer questions, and act as a kind of companion. Toyota recently unveiled its Partner Robot, which can play the trumpet thanks to sensitive lips and fingers with human-like dexterity. The convergence of robotics and artificial intelligence has enabled researchers to develop devices such as Sony's Qrio, a small robot that knows to hold out its arms if it is falling and can pick itself up from the ground. The embodied-intelligence approach seeks to equip robots with such cognitive abilities, though many basic components of intelligence still elude roboticists. Robots still cannot understand what gives an object its properties, though researchers are working to give their creations the ability to learn from experience. The international group of researchers in the RobotCub project is trying to create a child-sized robot that can learn from interactions with its environment, just as people do. Many researchers feel that people's perceptions of the role of robots must change if robots are ever to be accepted as legitimate companions. Mitsubishi has begun taking orders for its Wakamaru robot, an in-home personal assistant that wakes you up in the morning, reports the weather and the headlines, and greets you in the evening with any telephone messages. "We have tried to create a robot you can have a relationship with," said Mitsubishi's Ken Onishi.

  • "Power Could Cost More Than Servers, Google Warns"
    CNet (12/09/05); Shankland, Stephen

    The cost of powering servers could eventually exceed the purchase price of the equipment unless performance-per-watt improves, Google's Luiz Barroso warned in research recently detailed in ACM Queue. "The possibility of computer equipment power consumption spiraling out of control could have serious consequences for the overall affordability of computing, not to mention the overall health of the planet," said Barroso. The processor in a low-end server typically accounts for 50 percent to 60 percent of the overall power draw, Barroso reports. The computing infrastructure that powers Google, which consists of thousands of servers, has doubled in performance over its last three iterations, but because performance-per-watt has barely improved, energy consumption has nearly doubled as well. Barroso sees the solution in multicore processors, what he calls "chip multiprocessor technology," which will demand a programming style that splits tasks into pieces that run in parallel; because that requires a rethinking of traditional programming, the migration could be difficult for developers. Barroso says, "The computing industry is ready to embrace chip multiprocessing as the mainstream solution for the desktop and server markets."
    Click Here to View Full Article
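
    As a back-of-envelope illustration of Barroso's warning, the sketch below compares an assumed server purchase price against the electricity cost of running it around the clock. All figures (price, wattage, electricity rate) are illustrative assumptions, not numbers from the article.

```python
# Back-of-envelope comparison of a server's purchase price with its
# lifetime electricity cost. All figures here are assumptions for
# illustration, not numbers reported by Barroso.

def lifetime_power_cost(watts, years, dollars_per_kwh):
    """Electricity cost of running a machine around the clock."""
    kwh = watts * years * 365 * 24 / 1000.0
    return kwh * dollars_per_kwh

server_price = 3000.0   # assumed low-end server price (USD)
draw_watts = 250.0      # assumed steady draw; CPU is ~50-60% of this
rate = 0.10             # assumed electricity price in $/kWh

for years in (3, 4, 5):
    cost = lifetime_power_cost(draw_watts, years, rate)
    print(f"{years} yr: power ${cost:,.0f} vs. hardware ${server_price:,.0f}")

# With these assumptions power stays under the purchase price (~$219/yr),
# but if draw doubles each server generation while performance-per-watt
# stays flat, the power bill overtakes the hardware cost within a few
# generations -- exactly the trend Barroso warns about.
```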

  • "Port Scans May Not Always Signal Attacks, Research Indicates"
    Computerworld (12/07/05); Vijayan, Jaikumar

    A recent two-month study of quantitative attack data by the University of Maryland's A. James Clark School of Engineering shows port scans precede attacks only about 5 percent of the time, with more than half of all attacks not preceded by a scan of any kind, says Michel Cukier, Center for Risk and Reliability professor at the engineering school. The research indicates that previous ideas about network port scans being a prelude to attempted computer hacks are misguided. "There's been a lot of discussion in the security community about whether a port scan portends an attack or not," says Cukier. "The goal of the research is to find a link between port scans and an attack." Port scans were thought to be used by attackers to find open or closed ports and unused network services they could attempt to exploit, and an increase in scans against a specific port has been viewed as a sure-fire warning of an impending attack against it, but the 48-day study of two honeypot computers tells a different story. Just 28 of the 760 IP addresses tied to attacks against the university's computers launched a port scan before attacking, says Cukier, while 381 of the IP addresses launched attacks without any previous port scanning activity. More than 22,000 connections to the two honeypots were analyzed during the study, which found that 38 percent of the attacks were preceded by vulnerability scans, which Cukier says hackers use to look for specific weaknesses on network-attached computers. The study results were first reported in June at the Institute of Electrical and Electronics Engineers' International Conference on Dependable Systems and Networks.
    Click Here to View Full Article
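
    The reported proportions follow directly from the raw counts in the study; the short script below simply redoes the arithmetic.

```python
# Recomputing the Maryland study's headline proportions from the raw
# counts given in the article: 760 attacking IP addresses observed
# against two honeypots over 48 days.

attacking_ips = 760
scanned_port_first = 28    # launched a port scan before attacking
no_prior_scan = 381        # attacked with no scanning activity at all

print(f"port scan preceded attack: {scanned_port_first / attacking_ips:.1%}")
print(f"no prior scan of any kind: {no_prior_scan / attacking_ips:.1%}")
# -> 3.7% (the article's "only about 5 percent") and 50.1%
#    (its "more than half of all attacks").
```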

  • "Cornell's Jon Kleinberg"
    Technology Research News (12/05/05); Smalley, Eric

    Cornell University computer science professor Jon Kleinberg, a member of the IBM Almaden Research Center's Visiting Faculty Program, sees many of the most stimulating areas in science and technology as multidisciplinary in nature rather than falling into "traditional" science and engineering branches. He describes network theory as being based on the principle that many technological, natural, and social phenomena can be presented in a network architecture, which encourages a phenomenological view of networks. Among the basic problems that Kleinberg believes rich network datasets captured across time could help solve is the path an idea, invention, product, technology, or entity follows throughout a network, which could shed light on the mechanisms or causes of its success or failure. The professor explains how various algorithms apply to network analysis: Algorithms that recognize densely linked regions in a network can help uncover important topics; algorithms that split a network into coherent clusters can identify not just topics, but also dichotomies or divisions within topics; and algorithms that find short paths can help address small-world and similar problems characteristic of decentralized networks. Kleinberg says understanding a piece of information on the Web involves understanding its position within the network, a principle that plays a vital role in today's Web search engines and that also applies to social phenomena. "If we look at long-term social trends as they are reflected in the Web, we see a complicated mixture of continuous and discrete effects: Gradual changes over time, punctuated by sudden, discrete, transformative events," Kleinberg notes. He believes the increased use of online tools to manage the massive amounts of information available via the Internet is polarizing civic dialogue, and he expects the growing volume, complexity, and variegation of information to force people to either create tools to more effectively handle the data or devise new management styles.
    Click Here to View Full Article
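
    For a concrete, if toy-sized, picture of the analyses Kleinberg describes, the sketch below uses the open-source networkx library to find a short path and to flag densely linked regions in a small invented graph; real studies run the same primitives on Web-scale data.

```python
# Toy versions of two analyses from the interview: finding short paths
# and spotting densely linked regions. The graph is invented; networkx
# is a standard open-source Python graph library.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a", "b"), ("b", "c"), ("a", "c"),   # dense triangle
                  ("c", "d"), ("d", "e"), ("e", "f")])  # sparse chain

# Short paths: the small-world question of how to get from a to f.
print(nx.shortest_path(G, "a", "f"))   # ['a', 'c', 'd', 'e', 'f']

# Densely linked regions: the clustering coefficient is 1.0 for a and b,
# which sit inside the triangle, and 0.0 along the chain.
print(nx.clustering(G))
```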

  • "Once-Brotherly Image Turns Big Brotherly"
    USA Today (12/08/05) P. 1B; Hopkins, Jim

    Google's expansion abroad and into fields such as classified ads, book publishing, video, Wi-Fi, and telecommunications threatens its once unblemished image and could invite the kind of government regulation that has slowed the growth of past giants. "Google could easily become the poster child for a national public movement to regulate data collection," says Center for Digital Democracy chief Jeff Chester. Google is aware of the risk, and has poured hundreds of thousands of dollars into its lobbying efforts, recently opening an office in Washington. The recent naming of Elliot Schrage as Google communications chief gives the company an executive with a solid grounding in law and in issues such as offshoring. Like Microsoft, Google has spun out a charitable foundation to soften its image. Among Google's detractors is Microsoft, which is competing with Google for ad sales generated through AOL; the two companies are also embroiled in a lawsuit over the defection of computer scientist Kai-Fu Lee, who once headed Microsoft's push into China and now does the same for Google. The Association of American Publishers and the Authors Guild have filed suit against Google over its plans to offer copyrighted books online. Google Base, which works as a free online classified marketing service, could push newspapers, eBay, job site Monster.com, and the National Association of Realtors to respond. Privacy is another concern, given Google's ability to collect data about the online habits of Web users. Groups such as the Electronic Privacy Information Center, by their own admission too weak to take on Google, have expressed concern over the company's storage of every user's searches and Gmail messages, including deleted ones.
    Click Here to View Full Article

  • "Intel Working on Rootkit Detection Techniques"
    IDG News Service (12/07/05); Krazit, Tom

    Intel is developing a notification tool to alert PC users when a rootkit, such as the XCP copy-protection software shipped on millions of Sony CDs, has been inadvertently downloaded. At a recent open house, Intel outlined its vision of computers that are more in touch with their data, so users become more like supervisors of their machines than custodians. One project would place a chip on the motherboard expressly to monitor for changes in a program's behavior that could be caused by malware, though such features are not likely to appear in Intel's products before 2008 or 2009. By their very nature, rootkits are designed to go undetected, and Sony had products on the market containing the XCP software, which it used to enforce copy-protection policies, for months before an independent researcher discovered it. Intel's project, called "OS Independent Run-Time System Integrity Service," seeks to cut off attacks by malware that lies dormant in a system's memory: it monitors for alterations in application code and immediately notifies system administrators, so that memory-resident attacks such as the Slammer and Blaster worms could be prevented. Intel views the project as a second line of defense behind existing antivirus and anti-spyware applications.
    Click Here to View Full Article
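
    The core idea, watching known-good code for unexpected changes, can be sketched in a few lines. The toy below hashes a code region and alerts when the hash changes; it shows only the concept, since Intel's proposal puts the watcher in a dedicated chip on the motherboard rather than in software.

```python
# Minimal software sketch of run-time integrity checking: hash a code
# region that should never change, then re-check and alert on any
# alteration. Intel's design is a hardware agent; this is the concept.
import hashlib

def code_hash(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

# Stand-in bytes for a loaded, known-good code region.
trusted_code = b"\x55\x48\x89\xe5\x90\x90\xc3"
baseline = code_hash(trusted_code)

def check_integrity(current: bytes) -> None:
    if code_hash(current) != baseline:
        print("ALERT: resident code modified -- possible rootkit or worm")
    else:
        print("code region unchanged")

check_integrity(trusted_code)            # passes
check_integrity(trusted_code + b"\x90")  # simulated in-memory tampering
```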

  • "European Thought Leaders Talk Innovation"
    Irish Developer Network (12/07/2005); Kelly, Joanne

    A diverse group of researchers, entrepreneurs, and lawmakers gathered in Brussels for Microsoft's European Research and Innovation Day to demonstrate and discuss the latest research and inventions that will define the future of technology. Microsoft's Rick Rashid spoke on the benefits technology can bring to the European Union's economy, and on why it is essential if Europe is to retain a strong presence in an increasingly competitive international environment. Rashid emphasized Microsoft's global efforts to promote research among university students and faculty to advance the field of computer science. Microsoft's Andrew Herbert noted that the company feels an obligation to advance European research in order to propel the rate of innovation. The day featured more than 30 demonstrations of new technologies in digital imagery, Web services, and social technology. One presentation detailed HomeNote, a digital bulletin board family members can use to exchange notes through digital ink or text messages. Another offering, Educational Software for Multiple Input Devices, lets multiple users each control their own mouse on the same computer. Microsoft's Jonathon Tien presented Image Completion with Structure Propagation, an application that lets users remove an unwanted element from a digital picture and synthetically fill in the background.
    Click Here to View Full Article

  • "Welcome to the New World of Digital Cinema"
    IST Results (12/08/05)

    The WORLDSCREEN consortium is developing data compression techniques to handle the vast quantities of information needed to produce high-quality digital cinema. The WORLDSCREEN researchers have employed layered scheme compression (LSC) algorithms to effectively manage workflows and data without compromising quality. Project coordinator Siegfried Foessel says the researchers are working to produce solutions that meet the demands of each link of the cinematic value chain. The researchers are exploring the application of LSC on the set for digital acquisition with a portable media storage prototype built for high-end film scanners and digital cameras. By cutting out film processing and transfer, a director could immediately watch the previews and dailies. "A very important aspect of our work is that LSC offers the possibility to extract different resolutions and qualities from a single copy of the image," says Foessel, adding that that capability will be a tremendous aid to digital archiving. The researchers are working on a plug-in for a 2k-JPEG2000 encoder designed for editing software and special effects in post-production. LSC enables previews, file editing, and color correction from a compressed file in full resolution. The international community of partners in the WORLDSCREEN project has already defined workflow and metadata requirements, and is working to develop standards. The project has already attracted the attention of Hollywood, notes Foessel.
    Click Here to View Full Article
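
    Foessel's point about extracting different resolutions from a single copy is the defining property of layered, wavelet-style coding such as JPEG2000. The sketch below approximates it with a simple 2x2 averaging pyramid in numpy: one stored asset, several readable resolutions. It illustrates the principle only, not the project's actual codec.

```python
# One stored copy, many readable resolutions: a toy averaging pyramid.
# JPEG2000 achieves this properly with wavelet decomposition; this
# numpy sketch only illustrates the layered principle.
import numpy as np

def build_pyramid(img, levels):
    """Return the image plus successively half-resolution versions."""
    pyramid = [img]
    for _ in range(levels):
        h, w = pyramid[-1].shape
        half = pyramid[-1].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(half)
    return pyramid

frame = np.random.rand(2048, 2048)        # stand-in for a 2K film frame
for level, im in enumerate(build_pyramid(frame, levels=3)):
    print(f"level {level}: {im.shape}")
# level 0: (2048, 2048) ... level 3: (256, 256). An editor can preview
# and color-correct from a low level, then render the final cut from
# level 0, all from the same stored asset.
```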

  • "Bridge Gender Gap to Avert Skills Crisis, Warn Top IT Women"
    Computer Weekly (12/07/05); Shifrin, Tash

    U.K. businesses are missing an opportunity to address the coming IT skills shortage by failing to close the gender gap, according to female professionals participating in a recent industry debate. Sandra Smith, head of IS at Toshiba, said businesses should expect IT departments to become more expensive to run as talent grows scarce. Other participants noted that "less pushy" women may feel unappreciated if they do not receive the pay raises or promotions they believe they deserve. These women may be less inclined to ask for a raise or promotion in male-dominated workplaces, and may eventually move to fields where they will be recognized and can be creative. The work environment also does not foster networking opportunities for women. "From our networking events we have discovered that there are not always enough opportunities for a younger woman coming up through the ranks to talk in a very open situation to someone at the top level on an informal basis," said Maggie Berry, U.K. communications director at networking group Women in Technology. Women accounted for 21 percent of the IT workforce in 2004, down from 27 percent in 1997, and women make up only 17 percent of those pursuing computer science degrees this year, according to the Office of National Statistics.
    Click Here to View Full Article

  • "The Root of the Problem"
    Technology Review (12/07/05); Gartner, John

    SonyBMG Music Entertainment left millions of computers vulnerable to virus writers by including the digital rights management (DRM) application Extended Copy Protection (XCP) on its CDs. The company saw the "rootkit" software as an opportunity to crack down on file-sharing, but the application included a security vulnerability that allowed virus writers to install malicious applications on users' computers. "The problem is that software coming from an established company like Sony will always be trusted by the consumer, even if they had software that popped up a warning that a driver was being installed, most [people] would likely allow it," says Windows expert Mark Russinovich, who posted his discovery of the rootkit software on his blog. What is more, rootkit applications, which are not viruses, are not closely scrutinized by computer security companies. "[There is] a difference between malicious code, as opposed to technology that can be used for malicious purposes," explains Vincent Weafer, senior director of Symantec Security Response. To make matters worse, the Digital Millennium Copyright Act makes it illegal to create new software to remove DRM software, so antivirus companies can only develop a patch to neutralize the rootkit application. Sony has now become the target of a number of lawsuits seeking to hold the company accountable for damaging property and violating personal privacy laws.
    Click Here to View Full Article

  • "W3C Looks at Next-Gen Voice Technologies"
    CNet (12/06/05); Kawamoto, Dawn

    The World Wide Web Consortium (W3C), the Web's standards body, has settled on requirements for VoiceXML 3.0 and now plans to focus on drafting specifications for speaker verification and identification in the next-generation technology. VoiceXML enables users to issue commands by voice, and businesses have used it to automate their processes. However, users have expressed reluctance to make business transactions by phone or by voice on computers because of security concerns. "Speaker verification and identification is not only the best biometric for securing telephone transactions and communications, it can work seamlessly with speech recognition and speech synthesis in VoiceXML deployments," said Ken Rehor, the newly elected chairman of the VoiceXML Forum. W3C plans to have a working draft of the VoiceXML 3.0 specifications before the second quarter of 2006. W3C also said a working group plans to hold a formal meeting in March on developing a document for extending Speech Synthesis Markup Language (SSML) to languages such as Mandarin, Japanese, and Korean.
    Click Here to View Full Article
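
    For readers unfamiliar with the markup itself, the snippet below uses Python's standard library to build a minimal VoiceXML dialog in the 2.x style on which today's deployments rely. The element names are real VoiceXML; the prompt text and URL are invented placeholders, and the 3.0 speaker-verification features were still unwritten at press time.

```python
# Build and print a minimal VoiceXML dialog (2.x-style) with the
# standard library. Element names are real VoiceXML; the prompt text
# and the submit URL are invented placeholders.
import xml.etree.ElementTree as ET

vxml = ET.Element("vxml", version="2.1")
form = ET.SubElement(vxml, "form")
field = ET.SubElement(form, "field", name="account")
ET.SubElement(field, "prompt").text = "Please say your account number."
filled = ET.SubElement(field, "filled")
filled.append(ET.Element("submit", next="http://example.com/lookup"))

print(ET.tostring(vxml, encoding="unicode"))
```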

  • "Getting Real"
    Computerworld (12/05/05) P. 34; Anthes, Gary A.

    DARPA's Real-World Reasoning Project demonstrated that with the steady addition of variables to a problem, the number of possible outcomes increases exponentially, forever outpacing a computer's ability to conduct an exhaustive analysis. But the project did uncover some shortcuts, such as automated reasoning tools that can definitively determine whether a specification is correct without having to test every technically possible scenario. Cornell University computer science professor Bart Selman, one of three DARPA contractors at work on the project, is trying to extend those tools to multiagent reasoning; conventional automated reasoning tools are limited to single-agent settings, where there are no opposing forces. Selman developed a multiagent-enabled chess program that improves with each game it plays and views the board conceptually, rather than spending processing power ruling out thousands of moves that a grandmaster would never consider. Researchers at SRI International are exploring tools with even more variables, such as a chess game with four agents. SRI's Patrick Lincoln developed an algorithm to identify the Nash equilibrium of a game, the point at which no player can improve his outcome by unilaterally changing strategy. At that point the model checker can identify the best moves based on the partnerships that have developed among the players. To introduce uncertainty into automated reasoning, researchers from the University of California, Berkeley, are modeling Kriegspiel chess, in which neither player can see the other's moves, forcing each to rely on inferences drawn from the outcomes of his own moves. Algorithms that deal with problems where only part of the information is knowable could apply to real-life situations such as natural disasters or managing traffic flows.
    Click Here to View Full Article
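
    To make the Nash equilibrium definition concrete, the sketch below brute-forces the pure-strategy equilibria of a two-player game. The payoff matrix is the textbook Prisoner's Dilemma, an invented stand-in for the far richer games Lincoln's algorithm targets.

```python
# Brute-force search for pure-strategy Nash equilibria in a 2x2 game:
# a cell is an equilibrium when neither player gains by unilaterally
# switching. Payoffs are the textbook Prisoner's Dilemma, used here
# only as a stand-in for the richer games in the article.

# payoffs[r][c] = (row player's payoff, column player's payoff)
payoffs = [[(3, 3), (0, 5)],    # row cooperates
           [(5, 0), (1, 1)]]    # row defects

def is_nash(r, c):
    row_cant_improve = all(payoffs[r][c][0] >= payoffs[r2][c][0]
                           for r2 in range(2))
    col_cant_improve = all(payoffs[r][c][1] >= payoffs[r][c2][1]
                           for c2 in range(2))
    return row_cant_improve and col_cant_improve

print([(r, c) for r in range(2) for c in range(2) if is_nash(r, c)])
# -> [(1, 1)]: mutual defection, the game's lone pure equilibrium.
```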

  • "What's Next in Software"
    InfoWorld (12/05/05) Vol. 27, No. 49, P. 37; Binstock, Andrew

    It is a time of transition for many software development tools and technologies, according to this year's InfoWorld Programming Research Report. Results from a sample of approximately 300 developers indicate increased adoption of Web services, service-oriented architecture, and open source tools by the business community, and continued preference for dynamic scripting tools, Microsoft's .Net development platform, and Java. However, adoption rates among developers fall far below evangelists' projections. Pure compiled and traditional development languages--C, C++, Fortran, Ada, etc.--are expected to continue their decline, while Linux is on track to replace all versions of Unix and all mainframe operating systems within several years. Data modeling and other software quality-enhancing features are enjoying broader acceptance, although universal deployment of basic quality assurance tools remains a long way off. Forty percent of respondents said the disconnect between user requirements and developer specifications is a major problem at their work site, while 46 percent listed time pressure as a leading problem. The report suggests that software quality could be improved dramatically through the simple automation of development, code management, and testing, but only 68 percent of developers reported using tools for source control, 50 percent for configuration and deployment management, 47 percent for issue tracking, and 44 percent for logging and monitoring; the availability of free, open-source tools for such activities points to a shortage of discipline rather than a shortage of means. The report also notes the absence at many sites of a basic devotion to software quality, the rapid consolidation of platforms, and the growing momentum of Web services and associated tools, which will probably become a standard component of distributed software in the coming years.
    Click Here to View Full Article

  • "Which Web Services Protocol?"
    IT Architect (11/05) Vol. 20, No. 11, P. 58; Hall, Eric A.; Saint-Andre, Peter

    Network Technology Research Group President Eric A. Hall and Jabber Software Foundation executive director Peter Saint-Andre argue the advantages and disadvantages of XMPP and HTTP in a debate over which transfer protocol is the most sensible option for Web services. Saint-Andre notes that XMPP offers a generalized platform for passing XML between any pair of network endpoints, and can serve as a fully asynchronous transport layer that is free from the restrictions of HTTP's request-response semantics. Hall cites broad, "pervasive support" for HTTP for functions such as session-level redirection as one of the protocol's advantages over XMPP. Saint-Andre claims XMPP boasts greater security than HTTP because it is equipped with strong authentication, channel encryption, and optional end-to-end encryption, while Hall counters that HTTP's security is sufficient for most users, and its simplicity allows secure HTTP sessions to be created and carried out with fewer packets than secure XMPP sessions. Saint-Andre contends that HTTP's inability to pipeline limits its appropriateness for many Web services, a drawback that XMPP avoids by providing fully asynchronous messaging; Hall defends HTTP's simplicity with the argument that pipelining can be avoided in all but a small number of implementation scenarios with very tight and connection-limited architectural constraints. Saint-Andre points to XMPP's presence awareness as a catalyzing agent for content exchange. Hall calls presence an unnecessary feature, and backs his assertion with the argument that "HTTP benefits from providing as little technology as possible (and no less), not from putting more monkeys into the barrel." In conclusion, Saint-Andre argues for XMPP on the strength of its presence, authentication, compression, and fast message exchange, while Hall praises HTTP for its efficiency, speed, lightness, and wide-ranging infrastructural support.
    Click Here to View Full Article
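
    The semantic difference at the heart of the debate can be shown schematically: HTTP pairs every response with a client request, while XMPP lets either endpoint push stanzas whenever it likes. The Python sketch below uses a plain queue as a stand-in for the wire; it illustrates the two interaction styles only and is not real protocol code.

```python
# Request-response (HTTP-style) vs. asynchronous push (XMPP-style),
# with a queue standing in for the network. Illustration only; no
# real protocol code here.
import queue

def http_style(request):
    # The server speaks only when spoken to: one request, one response.
    return f"200 OK (answered: {request})"

def xmpp_style():
    wire = queue.Queue()
    # Either endpoint may push at any time, unprompted by a request.
    wire.put("<message>server-initiated update</message>")
    wire.put("<presence from='[email protected]'/>")
    while not wire.empty():
        print("received:", wire.get())

print(http_style("GET /quote"))
xmpp_style()
```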


 