
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 867:  Wednesday, November 16, 2005

  • "CD's Recalled for Posing Risk to PC's"
    New York Times (11/16/05) P. C1; Zeller Jr., Tom

    Sony has announced that it will recall millions of CDs containing the rootkit vulnerability embedded in its copy-protection software. The recall is expected to cost Sony tens of millions of dollars, and the company has set up an email address and a toll-free number for inquiries. Of the estimated 5 million affected CDs, roughly 2 million have been sold. The software is designed to limit to three the number of copies that can be made of a CD, but it also alters deep levels of a computer system's operations, leaving the machine susceptible to viruses. Security researchers have since discovered flaws in both the uninstaller software and the patch Sony issued to address the vulnerability. The software is considered malicious because it embeds itself deeply in a computer and communicates with Sony's servers once installed; it has only been found to affect Windows users, and the uninstaller can expose Internet Explorer users to malware. Because virus authors can manipulate the software to obtain administrator privileges, it is all but impossible to remove without help. An estimated 36 percent of consumers listen to music on their computers, which translates to 720,000 affected machines if that percentage applies to Sony music buyers. Security researcher Dan Kaminsky conducted a more thorough analysis and found that 586,000 DNS servers had been contacted by one or more computers seeking to communicate with one of Sony's two servers. The Electronic Frontier Foundation believes the recall is not enough, and has issued an open letter to Sony calling for refunds for consumers who do not wish to make an exchange, as well as compensation for time spent dealing with the software and for any damage it inflicts on computers.
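    Kaminsky's census reportedly relied on DNS cache snooping: sending resolvers a query with the recursion-desired (RD) flag clear, so that a cached answer betrays that some client behind the resolver recently looked up one of the hostnames in question. The sketch below builds such a query packet in Python; the hostname is purely illustrative, not one of Sony's actual servers.

    ```python
    import struct

    def build_snoop_query(name: str, qtype: int = 1) -> bytes:
        """Build a DNS query packet with the recursion-desired (RD) bit clear.

        A resolver that answers such a query from its cache reveals that some
        client behind it recently looked the name up -- the basis of the
        cache-snooping census described above.
        """
        header = struct.pack(">HHHHHH",
                             0x1234,      # transaction ID (arbitrary)
                             0x0000,      # flags: QR=0, RD=0 -> do not recurse
                             1, 0, 0, 0)  # one question, no other records
        # Encode the hostname as length-prefixed labels, terminated by a zero byte.
        qname = b"".join(
            bytes([len(label)]) + label.encode("ascii")
            for label in name.rstrip(".").split(".")
        ) + b"\x00"
        return header + qname + struct.pack(">HH", qtype, 1)  # QTYPE=A, QCLASS=IN
    ```

    In practice the packet would be sent over UDP to port 53 of each resolver and the answer count in the response inspected; a real survey must also take care not to prime the very caches it is measuring.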
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Behind 'Shortage' of Engineers: Employers Grow More Choosy"
    Wall Street Journal (11/16/05) P. A1; Begley, Sharon

    While concern over a perceived shortage of engineers has reached a fever pitch, and led to calls for increased government funding for math and science education, some analysts argue that it is a myth, and that in fact there is a surplus of labor in the technology industry. Recruiters report that it is not uncommon for companies to receive hundreds of resumes in response to a single position. Some companies are also to blame for the perception of a shortage, as they insist that applicants meet every requirement on a long list of qualifications. The dispute over the very existence of a worker shortage is at the center of policy debates among lawmakers as they consider issues such as the expansion of the H-1B visas that allow foreign workers with skills considered in short supply to hold jobs in the United States for six years. Despite an 85 percent increase in the number of bachelor's degrees awarded in computer science from 1998 to 2004, both industry leaders and the NSF still complain of a worker shortage. Microsoft, for example, hired just 1,000 of its 100,000 applicants who had graduated from college last year, and currently has openings for 2,000 software development jobs. Online job postings help create the appearance of an applicant glut, as some companies are forced to use software to filter out resumes that are a bad match. Recruiters also report that employers are becoming more exacting in their qualifications, often requiring that an applicant be proficient in several specific software packages so as to minimize the period of acclimation for a new hire. Some companies are also beginning to require other skills in their engineers, such as communication, writing, and the ability to work in groups.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "A Compromise of Sorts on Internet Control"
    New York Times (11/16/05) P. C2; Shannon, Victoria

    Representatives from nations around the world meeting in Tunisia to decide the future of Internet governance reached an agreement that calls for evolutionary change in the control of the network, though it leaves the current situation unchanged in the short term. The central issue of contention was whether the United States would continue to hold the dominant hand in controlling the Internet, a fear that was assuaged by the creation of an international forum for nations to voice their concerns. Though the Internet was created largely through American research, many governments have questioned whether it should remain under the jurisdiction of ICANN, the U.S.-based body that oversees Internet names and addresses, as the majority of users now come from other countries. The agreement calls for the creation of the Internet Governance Forum at the beginning of next year, which will have no power other than its ability to bring together the principals among Internet users, such as governments, consumer groups, and businesses. The draft agreement envisions Internet governance extending beyond the assignment of names and addresses to a more active role in issues such as security and affordability, though it acknowledges that the new organization will "have no oversight function and would not replace existing arrangements, mechanisms, institutions, or organizations."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Joint Industry and Government Initiative to Demonstrate Long Distance InfiniBand at SC05"
    Business Wire (11/15/05)

    Several computer companies and research groups affiliated with the OpenIB Alliance have announced that they will demonstrate computing devices powered by InfiniBand technology at this week's ACM-sponsored SC05 conference. The use of the interconnect standard will showcase its ability to create direct access networking of low latency and high performance. In the demonstration, the groups will connect servers, clusters, and optical service platforms across circuits between two locations in Washington state 50 miles apart. Among the applications of InfiniBand to be presented are the replication of remote data centers, grid computing, and high performance media streaming. InfiniBand is converted transparently at each endpoint, enabling global distribution of its fabric. Bill Boas, vice-chair of OpenIB and a computer scientist at Lawrence Livermore National Laboratory, says, "The goal of this demonstration is to show that InfiniBand and the OpenIB software can support advanced simulation, computing and visualization across wide area networks." InfiniBand is symbolic of an emerging world of increased performance with only moderate energy requirements, says OpenIB. The clustered systems to be demonstrated offer a theoretical maximum performance of 3.2 teraflops.
    Click Here to View Full Article

  • "Democrats Release Technology Agenda"
    SiliconValley.com (11/16/05); Puzzanghera, Jim

    House Minority Leader Rep. Nancy Pelosi (D-Calif.) disclosed the House Democrats' proposed "innovation agenda" for reinvigorating U.S. competitiveness at a Tuesday news conference. Goals set forth in the agenda include a 100 percent increase in federal funding to expand broadband Internet access and provide affordable access to all Americans within five years; a doubling of NSF funding and basic physical science research funding; the generation of 100,000 new researchers, engineers, and mathematicians in the next four years through scholarships and an alliance with states, businesses, and universities; and improving America's energy independence via a new government effort to develop "revolutionary energy technologies." "Democrats challenge Congress and the country to renew our commitment to public-private partnerships that will secure America's continued leadership in innovation and unleash the next generation of discovery, innovation and growth," declared Pelosi. Earlier Democratic and Republican congressional task forces made many of the same recommendations, but Democrats say their proposal stands out because of its level of commitment. "If we were in power, this would have been done," stated Pelosi. "When we are in power, it will be done." Republicans attacked the plan, citing House Democrats' lack of support for recent legislation such as the Central American Free Trade Agreement. High-tech industry groups, however, were enthusiastic: "It is clear this initiative is a major step forward in pushing the debate on competitiveness both within the Congress and the nation," said AeA director Bill Archey.
    Click Here to View Full Article

  • "Can Europe Still Compete in Technology?"
    MSNBC (11/14/05); Rogers, Michael

    CEOs from across Europe recently gathered at the annual European Technology Roundtable (ETRE) to discuss the future of the continent amid the emergence of Asian powers China and India and the continued predominance of the United States. While Europe was once at the forefront of innovation, having been credited with the invention of the World Wide Web, radar, and the compact disc, there is now a widespread fear that Europe has lost its competitive edge, and will soon be a lesser player on the global technology stage. Speaking at the conference was Skype pioneer Niklas Zennstrom, whose recent sale of the Internet telephony company to eBay for $2.6 billion stands in stark contrast to the gloomy predictions of Europe's eroding stature. Rather than offering encouragement, however, Zennstrom used his forum to lambaste European venture capitalists for not supporting Skype in the first place. Failure is not an accepted part of European culture, many observers note, which often impedes the pursuit of risky research ventures. Still entrenched in Europe's business community are worker safeguards such as short work weeks and the standard six weeks of vacation, which put it at an inherent disadvantage when competing with more cutthroat market economies. Despite efforts at integration, the nations of the European Union are still fractured and decentralized, as opposed to monolithic technology markets such as China and the United States. Many ETRE participants identified Eastern Europe as holding the promise for the continent's future, as those "are the countries that want to change, and it's not clear that the rest of Europe feels the same way," said Roel Pieper, chairman of Favonius Ventures. Many Eastern Bloc countries have solid math and engineering curricula in place, and the absence of sophisticated hardware fostered innovative programming to keep their machines current.
    Click Here to View Full Article

  • "Microsoft Supersizes Windows"
    Seattle Times (11/15/05); Dudley, Brier

    Microsoft is making its first foray into the realm of supercomputing with its Windows Compute Cluster Server 2003, to be unveiled at this week's SC05 conference in Seattle. Clusters comprise roughly 10 percent of all server sales, and Microsoft sees a growing market in research-oriented companies, particularly as the prices of clustered systems continue to drop and they become easier to manage. While Microsoft is catering to the enterprise market, supercomputers usually find their home in academic settings, and 80 percent of the systems employ open source software. Historically, supercomputers have been inordinately expensive systems that can fill an entire room, though the emergence of clustered systems is expected to bring prices down to a level where the performance they offer could become a business imperative in industries such as biotechnology, financial services, and energy. There remains doubt as to the true definition of a supercomputer, as Oak Ridge National Laboratory's David Bernholdt noted that "it's a constantly moving target"; IBM's BlueGene/L, the world's fastest supercomputer, doubled its performance over the last year. While supercomputers have historically been defined as cutting edge systems, ACM President David Patterson identified a shift in the field that occurred roughly 10 years ago with the emergence of clustered systems. "That's kind of the new wave; for some people, they can do the problems they want to solve with basically a lot of desktop computers," Patterson said. Clustered systems also simplify software development, noted Sun's Marc Hamilton, who said that many people now write software for supercomputer systems on their laptops or desktops.
    Click Here to View Full Article

  • "Narrowing the Digital Divide"
    Associated Press (11/12/05); Jesdanun, Anick

    The Digital Solidarity Fund, a movement that uses high-speed Internet connections to help treat AIDS patients in Africa, is aligning technology funding with social goals such as curing disease and eliminating poverty and illiteracy; it is most notable for its uniqueness, as few similar programs exist. This week's U.N. conference in Tunis seeks to address the disparity between the 14 percent of the world's population that is online and the 62 percent of Americans who are connected to the Internet. Funding is a central issue, as many countries feel that they already have adequate programs in place, though the nations meeting in Tunis are expected to approve draft language that refers to the fund as "an innovative financial mechanism." The money from the fund will be used to provide high-speed Internet access to Burkina Faso, Burundi, and nine other African nations, as well as videoconferencing capabilities and computers for clinics, and WiMax technology to bring the Internet to remote locations. Many participants are also encouraged by the detailed nature of the language, and view it as a critical step toward bridging the digital divide.
    Click Here to View Full Article

  • "Linus's World"
    Technology Review (11/14/05); Williams, Sam

    When the Australian hacker Andrew Tridgell reverse-engineered BitKeeper, Linus Torvalds responded to the ensuing firestorm by creating his own management tool from scratch, apparently resolving the most serious threat the Linux kernel had ever faced. BitMover's BitKeeper had been controversial since its inception in 2001. BitKeeper is a commercial program that Torvalds had insisted was the most appropriate tool for Linux users, who were offered free access provided they did not alter or copy its code. Torvalds insisted that the efficiency of the program was well worth its opacity, particularly in the Linux setting of small, clustered groups. In March 2004, BitMover reported a 130 percent increase in the number of files added to the kernel in the two years since it had been officially adopted by Linux. BitMover revoked its Linux license once free software appeared that could effectively move developer data through BitKeeper's system, presenting Torvalds with a public relations crisis. Instead of opting for a similar tool with compromised functionality, Torvalds began sketching out his own application that would enable the management and alteration of source code as it was written. Torvalds bristled at security criticisms of Git, as the new system was dubbed, and doggedly championed it for its simplicity. Git has steadily gained acceptance, and Torvalds has retained his position as the leader of the Linux community through a mixture of humility and hostility. "I'll happily be abrasive and opinionated if it helps get issues out in the open and gets people into the conversation. The real magic ingredient is being able to change your mind occasionally so that people know you're an opinionated bastard, but that it might be worthwhile talking to you anyway," he said.
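    At Git's core is content-addressable storage: every object is filed under the hash of its own contents, so identical content is stored only once and any corruption changes the key. The toy sketch below illustrates that idea in Python, using Git's actual blob-header convention ("blob <size>\0" prepended before hashing); Git's real object model adds compression, trees, and commits on top.

    ```python
    import hashlib

    class TinyObjectStore:
        """Toy content-addressable store in the spirit of Git's object model."""

        def __init__(self):
            self._objects = {}

        def put(self, data: bytes) -> str:
            # Git-style key: SHA-1 over a small header plus the content, so
            # identical content always lands at the same address.
            key = hashlib.sha1(b"blob %d\x00" % len(data) + data).hexdigest()
            self._objects[key] = data
            return key

        def get(self, key: str) -> bytes:
            return self._objects[key]

    store = TinyObjectStore()
    k1 = store.put(b"int main() { return 0; }\n")
    k2 = store.put(b"int main() { return 0; }\n")
    assert k1 == k2  # same content, same address -- deduplication for free
    ```

    Because the key is derived from the content, two developers who independently add the same file produce the same object, which is what lets distributed repositories merge histories without a central server.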
    Click Here to View Full Article

  • "High-Speed College Network Closes"
    BBC News (11/15/05)

    The Web site of the i2Hub network now reads "R.I.P. 11.14.2005," and service founder Wayne Chang says it was shut down due to legal pressure from the entertainment industry. The file-swapping service was designed to take advantage of the next-generation Internet2 research network and allow students at 207 connected universities in the United States to share textbooks and research papers. However, the growing use of the i2Hub to swap copyrighted music and movies for free prompted the entertainment industry to target peer-to-peer networks. Over the summer, the industry won a legal victory when a court ruled that such networks could be held liable for inducing or encouraging piracy. In October, entertainment companies launched another round of lawsuits aimed at users of the Internet2 research network. The superfast network enabled users of i2Hub to download large files in seconds. The Motion Picture Association of America has charged that up to 99 terabytes of film was being swapped in a single day; the amount is equivalent to all of the film available in a video rental store.
    Click Here to View Full Article

  • "Supreme Court Poised to Take the Initiative in Patent Law Reform"
    Financial Times (11/11/05) P. 7; Waldmeir, Patti

    As many as four cases will be heard by the U.S. Supreme Court, the first as early as 2006, to decide the procedures and requirements for granting patents to domestic and foreign businesses, which some say are too lenient and lead to expensive lawsuits. The task of rewriting patent law was previously left to Congress, which was lambasted for allowing too many unsatisfactory patents; now the Supreme Court must decide which country's courts can hear patent disputes that cross borders, what can be patented, and how obvious an invention must be to forfeit patent protection. The obviousness question concerns the standard under which patents are granted: companies such as Microsoft argue that the federal patent court allows too many "obvious" technologies, such as the adjustable accelerator pedal, to be patented, making it easier for technology companies to face litigation from holders of dubious patents. The first case the court will hear, Laboratory Corp v Metabolite, involves diagnostic medicine and will determine what subjects can be patented. Some argue the court's decision could be a recipe for disaster. "People are very alarmed at the potential of the court to upset the scope of eligible subject matter in areas well beyond diagnostics," said Stephen Maebius, a patent expert at the Foley & Lardner law firm. Charles Steenburg, of the Wolf, Greenfield & Sacks law firm, agrees: "This case could restrict the ability of inventors to protect software products or business methods."
    Click Here to View Full Article

  • "Consumers Fight Copy Protection"
    International Herald Tribune (11/11/05); Goodman, J. David

    The Brussels-based consumer umbrella organization BEUC this week called on the European Commission to strengthen the rights of consumers who use CDs and DVDs. The call comes at a time when the issue of copyright infringement remains unresolved on the continent. Although the European Union four years ago passed the Copyright Directive, giving companies the go-ahead to pursue digital rights management and content-protection technologies, consumers also gained the opportunity to make private copies of CDs and DVDs. Italy and smaller EU states have incorporated the voluntary "private use" exception into their laws, but the issue continues to attract restrictive interpretations in courts in Germany, Belgium, and France. In April 2005, a consumer won a decision in a Paris appeals court that would enable the individual to transfer a film on DVD to a tape that would allow the movie to be played on a VCR. But Universal Pictures has appealed to France's highest court, and the court is expected to make its decision by the end of the year. "The decision in France could be considered a precedent," says Cornelia Kutterer, European regional counsel for BEUC. "But then again, it is still very uncertain, even within France."
    Click Here to View Full Article

  • "Supporting a Vision for the Future of Networked Audiovisual Systems"
    IST Results (11/14/05)

    The Avista program recommends that Europe focus on standardization and content when producing the next generation of interconnected audiovisual systems. Avista has encouraged collaboration among experts and developed a new platform for Networked and Electronic Media (NEM). The program considered the European Commission's Networked Audiovisual System and Home Platforms (NAVSHP) initiative to determine where the EC could best direct its attention in the future. The key to commercial embrace of the new networks will be content, the study found. Standardization is also critical to address the nagging issue of interoperability. The Avista project concluded at the end of October, though its mission has been taken up by an NEM endeavor that seeks to unite telecom operators, broadcasters, academics, standards groups, and equipment manufacturers to design a common platform for the distribution of disparate media across technologically transparent networks. The NEM project grew out of the New Media Council, a body of experts drawn together by Avista that helped to advise and steer NAVSHP projects.
    Click Here to View Full Article

  • "Not Invented Here"
    New York Times (11/13/05) P. 3-1; O'Brien, Timothy L.

    There is a widespread fear in the scientific community that a declining interest in research and innovation throughout the private sector and the government will cause the United States to fall behind its foreign competitors, who are increasing their research funding at a time when it is being cut domestically. A report produced recently by a coalition of academics and executives called on the government to offer tax incentives to create well-paying jobs in emerging industries, as well as a renewed focus on math and science from primary school through college. The report cites China, India, and other emerging countries as the major threats to U.S. innovation and job growth. The Industrial Research Institute reports that the United States has recently been outpaced in research spending and the number of doctoral degrees awarded in engineering and science by China, Taiwan, and South Korea. American invention experienced a watershed around the time of Thomas Edison's death, as the process of innovation passed from the individual inventor's hands to the province of large corporations. Individual inventors have come to fear that the entangling of innovation with corporate profitability impedes the discovery of breakthrough technologies, as corporations are more likely to back routine modifications and upgrades of existing technologies than gamble on unproven quantities. The resulting friction has led to contentious legal challenges to the patent system. Despite the pervasive fear that American innovation is at an ebb, many believe that even though the United States falls behind other nations in math and science test scores, it still has a culture of creativity that is critical to converting academic knowledge into breakthrough inventions. For innovation to persist in the corporate world, many inventors argue, creativity must be divorced from profit, and accidents must not be equated with failure.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Tech Guru O'Reilly Mashes It Up"
    BBC News (11/08/05)

    In a recent interview, technology expert Tim O'Reilly discussed his thoughts on the open source movement. Proprietary PC software did not emerge until the 1980s with the ascendancy of companies such as Microsoft and Oracle, though Richard Stallman's founding of the Free Software Foundation in 1984 signaled a revolt aimed at making PC software freely available again. Over time, the idea of giving software away to ensure it was used by as many people as possible gave way to open source as it is known today. As languages such as Perl began emerging, O'Reilly convened a meeting of the individual programmers of free software known as the Freeware Summit in 1998, where the term open source was coined. Open source refers not just to free software, but to a culture of freely sharing information throughout a community, notes O'Reilly. He believes early adopters are good indicators of a technology's future, pointing to Wi-Fi, which began as a local area network technology but grew into something much larger, a trajectory foreshadowed by the legions of technophiles who were playing with homemade antennas on their roofs. O'Reilly looks at VoIP similarly, and believes that much of the evolution of the Internet comes from early adopters who quickly realized that giving software away was a surefire way to attract a broad coalition of users and contributors. O'Reilly likens the emergence of open source to the shifting market forces that created Microsoft's fortune and sucked the profitability out of IBM's business model. IBM did not foresee that the commoditization of the PC would take the value out of hardware while creating new value for proprietary software. Now there is value in services built on open source software, such as Google, which runs on Linux; this reflects a shift in the sources of profit in the computing industry rather than a decline in its overall profitability.
    Click Here to View Full Article

  • "Dispiriting Days for EEs"
    EE Times (11/14/05) No. 1397, P. 1; Roman, David

    The recent "Insight 2005" survey conducted by EE Times involving 4,083 online respondents found that engineers are concerned about job outsourcing, reduced pay increases, and a lack of respect. Kerry McClenahan of McClenahan Bruer Communications said insulting labels such as geek and nerd are deterring young people from the field, and the "average age of engineers has been creeping up, and it does not look like we're preparing replacements," he said. Respondents believed that foreign schools and universities have been more successful in educating future engineers compared to those in the United States, and 65 percent said improving U.S. education is a priority. Only 10 percent of participants believed the United States will hold on to its position of global technology leader. Compared to workers in other fields, engineers viewed themselves as smarter and working in more challenging jobs, but also said they get less support from employers. The survey found that having a "fulfilling job" was important to 69 percent of engineers, compared to 84 percent of college-educated males ages 21 to 65 surveyed by the Interuniversity Consortium for Political and Social Research (ICPSR). Insight 2005 also identified four predominant personality types among engineers--"Cowboys," who are self-motivated and more independent; "Pioneers," who are self-assured and have better schooling; "Nervous Nellies," who worry excessively and are typically older; and "Homesteaders," who are steered by a need for stability and are peace-loving.
    Click Here to View Full Article

  • "Java Meets Its Match"
    InformationWeek (11/07/05) No. 1063, P. 59; Ricadela, Aaron

    Scripting languages and techniques such as PHP and Ajax could vie for Java's throne in the Web-services application development arena, as they enable faster development and prototyping of applications. Development communities are springing up around scripting languages, whose value is increasing for IT decision-makers. They also dovetail with the corporate trend of quickly deploying and updating software on the Web, while integration with Java and .Net software using Web services is getting better. The growing popularity of Ajax and PHP partially stems from a push in the software industry to favor fast and frequent releases over scrupulously tested new versions. Ajax has long been out of average developers' reach due to a lack of packaged tools, but that situation may be shifting thanks to the recent release of offerings from Microsoft and IBM. "Heavy-duty computer-science folks tend to get frustrated with scripting languages because they're not precise," notes Yahoo! engineer and PHP language inventor Rasmus Lerdorf. "But for someone who has a lot of work to do and needs to go home on Friday afternoon, it just works." Some technologists are concerned that the breadth of the Web could be endangered by Ajax's penchant for creating incompatibilities across browsers without methodical coding.
    Click Here to View Full Article

  • "Bring Back the Color"
    Engineer (11/08/05)

    Xerox has developed new technology that would allow a user of a black-and-white printer, fax, or copier to recover the colors of an original image. Normally, the colors are converted to shades of gray, making graphics such as pie charts and bar charts difficult to interpret, but Xerox has developed algorithms that change each color into a different texture or pattern in an image. Karen M. Braun, an imaging scientist at Xerox, says, "When you map color to textures in this way, the textures can later be decoded and converted back to color." Braun developed the color coding and decoding algorithms along with Ricardo L. deQueiroz, who is a member of the faculty of the Universidade de Brasilia in Brazil. Xerox envisions adding the decoding part of the algorithms to the scanner of a multifunction system, which would enable the user to print documents in their original colors.
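    The key property is that the color-to-texture mapping is invertible: each color becomes a visually distinct monochrome pattern that a scanner can later recognize and map back. Xerox's actual algorithms are not public, so the sketch below is only a toy round-trip illustration, with made-up color and pattern names.

    ```python
    # Hypothetical palette; the pattern names stand in for the distinct
    # monochrome fill textures a real encoder would render.
    COLOR_TO_PATTERN = {
        "red": "diagonal-lines",
        "blue": "dots",
        "green": "crosshatch",
        "yellow": "horizontal-lines",
    }
    # Invert the table so the scanner side can decode patterns back to colors.
    PATTERN_TO_COLOR = {pattern: color for color, pattern in COLOR_TO_PATTERN.items()}

    def encode_chart(colors):
        """Replace each chart region's color with a distinct fill pattern."""
        return [COLOR_TO_PATTERN[c] for c in colors]

    def decode_chart(patterns):
        """Recover the original colors from the printed patterns."""
        return [PATTERN_TO_COLOR[p] for p in patterns]
    ```

    The mapping only works if every color gets a unique pattern; a real system would also have to recognize patterns robustly after the image has been printed, faxed, and rescanned.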
    Click Here to View Full Article

  • "Pushing the Limit"
    Science News (11/05/05) Vol. 168, No. 19, P. 296; Klarreich, Erica

    A key ingredient in the reliable transmission of data across the gulf of space as well as more terrestrial environs is superefficient error-correcting codes, whose existence was established by scientist Claude Shannon almost 60 years ago. Practical approaches to building such codes, known as turbo codes and low-density parity-check (LDPC) codes, have only recently emerged. Shannon's theorem set an absolute limit on the efficiency of communication over a noisy channel, and proved that at every transmission rate below this limit there exist codes that allow the recipient to recover the original message almost perfectly. In 1993, a pair of French engineers outlined turbo codes, which are capable of either roughly doubling the transmission speed of other codes or equaling them with just half the transmitting power; the codes use cross talk between decoders to determine confidence about the likely elements of the original message. LDPC codes follow a similar methodology, but need vastly more decoders. Decoders can guess the value of certain bits very confidently, which increases their confidence about other bits of the message, according to the Federal Polytechnic School of Lausanne's Amin Shokrollahi. "We're close enough to the Shannon limit that from now on, the improvements will only be incremental," says Flarion Technologies coding theorist Thomas Richardson. Several recent space missions incorporate turbo-code technology, which also allows many mobile phone users to surf the Internet and send audio and video clips with their handsets; meanwhile, LDPC codes have emerged as the new digital-satellite TV standard. These codes could yield dramatic savings for space agencies by enabling them to use lighter data transmitters with smaller antennas and batteries on space missions, while wireless technologies could enjoy lower service costs and suffer fewer dead spots.
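    Turbo and LDPC codes are far more elaborate than anything that fits in a few lines, but the parity-check idea behind them can be seen in the classic Hamming(7,4) code: three parity bits are added to four data bits so that any single flipped bit can be located and corrected. The sketch below is the textbook ancestor, not the codes the article describes.

    ```python
    def hamming74_encode(d):
        """Encode 4 data bits as a 7-bit codeword (parity bits at positions 1, 2, 4)."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4   # covers codeword positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4   # covers codeword positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4   # covers codeword positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_decode(codeword):
        """Correct up to one flipped bit, then return the 4 data bits."""
        c = list(codeword)
        # Recompute each parity check; together the results spell out, in
        # binary, the 1-based position of the flipped bit (0 means no error).
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3
        if syndrome:
            c[syndrome - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]
    ```

    Modern LDPC decoders apply the same recompute-the-checks principle, but with thousands of interlocking checks and iterative, probabilistic message passing between them rather than a single syndrome lookup.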
    Click Here to View Full Article


 