Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 871:  November 28, 2005

  • "Writing the Fastest Code, by Hand, for Fun: a Human Computer Keeps Speeding Up Chips"
    New York Times (11/28/05) P. C1; Markoff, John

    Kazushige Goto, a research associate at the University of Texas, has single-handedly written code that outperforms automatically generated programs produced by large teams of programmers. His code, which incrementally resequences the instructions bound for microprocessors, powered seven of the top 10 supercomputers in 2003. Although a surge of IBM machines took the top three spots this year, four of the top 11 systems were still powered by the Goto Basic Linear Algebra Subroutines (BLAS). Goto has remained dogged in his commitment to maximizing the performance potential of computer chips, and his programs have been used in a broad array of applications that demand the solution of large systems of linear equations, such as simulating the flow of air over a car or a plane. One of Goto's principal rivals, the Atlas project, led by University of Tennessee computer scientist Jack Dongarra, uses an elaborate automated testing system to optimize code for each microprocessor, while Goto simply uses a software debugger to trace the flow of data within the chip, then reworks his subroutines around what he observes to glean more speed from it. Goto stumbled into supercomputing by accident and rose to the apex of his field through experimentation and his own curiosity; he has no formal training in software design. It is not the linear algebra itself that appeals to Goto, but the challenge of working around the shortcomings of the microprocessors that power every type of computational device, from handhelds to supercomputers. Keeping the data being calculated in the memory closest to the calculating engine, such as the on-chip cache, often improves performance. Despite criticism that he has kept his methods close to the vest, Goto says that an open-source version of the Goto BLAS is forthcoming, and that a version is already freely available for non-commercial use.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
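
    The cache-locality idea behind tuned BLAS kernels can be sketched in a few lines: a blocked matrix multiply works on small tiles that stay resident in fast memory before moving on. The sketch below is illustrative Python only, not Goto's hand-tuned assembly; the payoff of blocking appears in compiled code, where each tile fits in cache.

```python
# Minimal sketch of cache blocking, the locality idea behind tuned
# BLAS kernels. Illustrative only; real kernels are hand-tuned for a
# specific processor's cache and register layout.

def matmul_blocked(A, B, n, bs=2):
    """Multiply two n x n matrices (lists of lists) in bs x bs tiles."""
    C = [[0.0] * n for _ in range(n)]
    for i0 in range(0, n, bs):          # tile row of C
        for j0 in range(0, n, bs):      # tile column of C
            for k0 in range(0, n, bs):  # tile of the shared dimension
                # The inner loops touch only three small tiles, which a
                # real kernel would keep resident in cache or registers.
                for i in range(i0, min(i0 + bs, n)):
                    for j in range(j0, min(j0 + bs, n)):
                        s = C[i][j]
                        for k in range(k0, min(k0 + bs, n)):
                            s += A[i][k] * B[k][j]
                        C[i][j] = s
    return C
```

    The blocked version computes exactly the same result as the naive triple loop; only the traversal order, and hence the memory-access pattern, changes.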

  • "Computer R&D Rocks On"
    EE Times (11/21/05) P. 1; Merritt, Rick

    Computer science experts dismiss the assumption that computer research is antiquated and shrinking, arguing that major advances in both computing applications and architectures are on the horizon, even in the face of decreased U.S. government funding. University of Washington computer science professor Edward Lazowska expects to see a quantum computer after 2010, while other areas of expected advancement include natural-language searches, machine learning, speech-to-text, and computer vision. Justin Rattner, who heads Intel's Corporate Technology Group, says new challenges are opening up as technology penetrates deeper and deeper into everyday life, and notes that today's leading computer science problems include designing multicore, multithreaded processors, creating the programming environments to govern them, and managing the massive clustered systems that employ them. Systems- and chip-level parallelism frequently headlines the list of most critical problems in computer science: "We are at the cusp of a transition to multicore, multithreaded architectures, and we still haven't demonstrated the ease of programming the move will require," notes Rattner. He also asks, "Can you build systems out of scalable, commodity parts--without redesigning the application software? That's really the grand challenge in computer science at the systems level." IBM Research director Paul Horn cites solution engineering as the grand challenge in software and services, concurrent with a dramatic streamlining of the software stack's underlying architecture. Prabhakar Raghavan, another IBM Research veteran, has broken down Internet science into the core fields of information retrieval or search, machine learning, data mining, computer architecture/human factors engineering, and microeconomics.
    Click Here to View Full Article
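
    Rattner's point about the difficulty of programming multithreaded architectures can be made concrete with a small sketch (illustrative Python, not from the article): several threads update one shared counter, and only explicit synchronization keeps the result correct. The read-modify-write in increment() can interleave across threads; the lock serializes it.

```python
import threading

# Several threads increment one shared counter. Without the lock, the
# read-modify-write on self.value can interleave and lose updates; the
# lock makes each increment atomic. A toy example of why parallel
# programming is considered hard, not production code.

class Counter:
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self, times):
        for _ in range(times):
            with self._lock:  # removing this can silently lose updates
                self.value += 1

counter = Counter()
threads = [threading.Thread(target=counter.increment, args=(10_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter.value == 40_000  # deterministic only because of the lock
```

    The hard part Rattner alludes to is that the unsynchronized version often appears to work in testing and fails only under load, which is why ease of parallel programming remains an open problem.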

  • "Cut IT Skills Shortage by Closing Gender Gap, Say Female IT Professionals"
    Computer Weekly (11/28/05); Savvas, Antony

    With female participation in the U.K. technology industry having dropped from 27 percent in 1997 to 21 percent in 2004, a panel of prominent female IT professionals cautioned the government and industry that closing the gender gap is essential both to easing the mounting IT worker shortage and to keeping the United Kingdom from losing its competitive advantage. Female enrollment in computer science programs has dropped to 17 percent, and a recent survey found that while 76 percent of girls between the ages of 11 and 18 are interested in computers and technology, 43 percent would not consider a career in IT and 34 percent are uncertain. The panel agreed that women frequently do not ask for promotions or pay raises, and often migrate to other fields where their creative talents are more appreciated. Women also frequently have limited access to networking events, and are often deterred by the concern that it will be difficult to maintain a balance between work and personal life in the IT sector.

  • "Probing Galaxies of Data for Nuggets"
    Washington Post (11/25/05) P. A35; Glasser, Susan B.

    The recently founded CIA office known as the DNI Open Source Center hosts a publicly available site containing blogs on a host of intelligence topics. The office is the latest creation to arise from the formal recommendations of a string of commissions probing the intelligence failures that preceded the terrorist attacks of Sept. 11, 2001; those inquiries centered on the agency's failure to mine the wealth of information available in the age of the Internet, and on the intelligence community's general lack of respect for publicly available information. The Open Source Center owes its origins to the Foreign Broadcast Information Service (FBIS), created in 1941, which fed the CIA translated versions of Soviet newspapers throughout the Cold War. The FBIS appeared outmoded in the round-the-clock news environment of the 1990s, and some called for its complete dismantling. The Sept. 11 attacks breathed new life into the division, which expanded from a translation service into an office dedicated to mining the entire Internet for any relevant intelligence information. The center teaches courses on exploiting the Internet to members of the intelligence community, though outside consultants report that the center itself is far from being a leader in that field. Fueling this perception is the agency's sluggishness in reacting to the emerging presence of Web sites operated by al Qaeda affiliates. While the center currently monitors between 150 and 300 of the most significant jihadist Web sites, keeping pace with a fast-evolving enemy is an uphill battle, and many government officials claim that private sources are more timely providers of cyber intelligence. A reluctance to value publicly available information also remains embedded in the culture of the intelligence community, where information found on the open Internet is still sometimes classified.
    Click Here to View Full Article

  • "Outsourcing to the Heartland"
    Wired News (11/22/05); Johnson, Emma

    The lack of IT jobs for college graduates in rural America prompted Kathy Brittain White to start Rural Sourcing in 2003. The company has developed a strategy for outsourcing jobs to rural America at a time when more companies are showing an interest in tapping labor in non-urban locations across the country. Rural Sourcing's rates of $35 to $50 per hour cannot match India's outsourcing rates, but they are significantly lower than the roughly $100 per hour charged in New York City. What is more, clients are able to take advantage of tech experts who speak with local accents and work in the same time zones, while avoiding the stigma of sending jobs overseas. Outsourcing expert Michael F. Corbett says in-country sourcing can offer low-cost and high-quality service, but he does not expect companies to stop outsourcing abroad. Corbett, head of an outsourcing research firm and author of "The Outsourcing Revolution," nevertheless acknowledges the draw of cultural familiarity. He says, "It's one thing to design the operations of a Web site, and it's another thing when you're selecting images and the exact choice of language that is on the money in terms of what makes sense to the intended audience."
    Click Here to View Full Article

  • "Your Smiling Face"
    Engineer (11/22/05)

    A team of German researchers has created a sophisticated software tool that enables a user to animate a face in an image almost automatically. The tool creates a three-dimensional model of the face based on the original image, alters it, and redraws it in a target image, with pose and illumination computed automatically. The only manual step in the entire process is for the user to click on roughly seven feature points, such as the corners of the subject's eyes or mouth. The developers envision a host of photo-editing uses for their program, such as a commercial application in which users could see how they would look with a different hairstyle. They also report that it will be an easy transition to apply the technique to video frames.
    Click Here to View Full Article

  • "Two UW-Madison Professors Named to National LambdaRail Networking Research Council"
    Wisconsin Technology Network (11/21/05); Kleefeld, Eric

    National LambdaRail (NLR) continues to shore up its policy platform as it acquires and puts its fiber-optic infrastructure to use. NLR has added two UW-Madison computer science professors to its Networking Research Council, which handles policy matters on the research goals of the high-speed fiber-optic network. Paul Barford says he will help make sound decisions on new technology and help make the high-speed infrastructure and sophisticated media capabilities widely available within the next 10 years. Lawrence Landweber, a professor emeritus of computer science, adds that he will serve more as a liaison between the NLR and the Internet2 project, which has similar network research goals. The NLR network can carry up to 40 simultaneous network connections with a bandwidth of 10 gigabits each, and NLR President Tom West believes its research will lead to improvements in systems for academic institutions and businesses. "One of the concerns of the computer science folks and network research folks is that the technology used today for the Internet won't scale to the future," says West. "We want this resource to be available to those folks to be their laboratory."
    Click Here to View Full Article

  • "Renewed Warning of Bandwidth Hoarding"
    Washington Post (11/24/05) P. D1; Krim, Jonathan

    Recent moves by Congress to reform telecommunications laws have riled tech companies that see the revisions as nothing more than permission for network operators to hoard bandwidth and restrict Internet access. Earlier this month, Congress introduced legislation that removed the ban on blocking or impeding online content, applications, or services, supposedly to address the spread of Internet video. Internet founding father Vinton Cerf submitted a letter to Congress in which he warned that "Enshrining a rule that broadly permits network operators to discriminate in favor of certain kinds of services and to potentially interfere with others would place broadband operators in control of online activity." He went on to assert that "Allowing broadband providers to segment their...offerings and reserve huge amounts of bandwidth for their own services will not give consumers the broadband Internet our country and economy need." The phone companies say they simply want the ability to reserve some of their bandwidth for their own services. SBC Communications' Edward Whitacre Jr. said in an interview that he was not pleased with the implications: "Now what they [Google, Yahoo!, MSN] would like to do is use my pipes free, but I ain't going to let them do that because we have spent this capital and we have to have a return on it," he said. "So there's going to have to be some mechanism for these people who use these pipes to pay for the portion they're using." The FCC set network neutrality as a condition of the recent SBC/AT&T and Verizon/MCI mergers, but that condition only has a two-year lifespan and is company-specific.
    Click Here to View Full Article

  • "Opposition Mounts to .Com Price Hike"
    Computer Business Review (11/23/05); Murphy, Kevin

    Reaction has been largely negative to a proposed settlement of suits filed by VeriSign against ICANN that would let VeriSign raise prices for .com domains by 7 percent a year from January 2007 through 2012 under a new .com registry contract. If ICANN's board agrees and VeriSign exercises the increases in full, the price of a domain would rise by roughly half, from its current $6 to about $9. Among the opponents are VeriSign's registrars, who have largely depended on low prices to draw customers and who recoup their money through add-on services. A group of small registrars led by Moniker.com's Monte Cahn has already voiced its worries to ICANN. "In an industry where the economics suggest that fees should be going down when there is competition, it is particularly troublesome and anti-competitive to grant a monopolist or a single-source provider the unilateral right to increase costs without justification," wrote Cahn in a letter to ICANN. Judging from the amount of duplicative text in ICANN's public comment forums, an organized letter-writing campaign against the deal also appears to be under way. VeriSign itself has little incentive to raise prices, since a recent dramatic increase in .com registrations has largely been attributed to low prices and their effect on speculative registrations, which could fall should prices rise.
    Click Here to View Full Article

  • "Applications Are Outward-Bound"
    eWeek (11/21/05) Vol. 22, No. 46, P. D1; Coffee, Peter

    Today's development priorities for mobile applications are informed by future mobility issues, writes Peter Coffee. Applications must satisfy a broad array of internal demands, namely smooth management of discontinuous connectivity and finite bandwidth; external demands, such as the proper delivery of the user experience on various systems, must also be addressed. The value of mobile access will sometimes reside in location-specific data, which calls for transparent data integration complemented by security measures that protect privacy without overwhelming mobile devices' processing power, the bandwidth of wireless links, or the tolerance of multitasking users. Philipp Hoschka of the World Wide Web Consortium said last December that only 37 percent of mobile handsets in North America are Web-capable, compared to 49 percent of handsets worldwide; he projected a global installed base of 1.4 billion such handsets by no later than 2008, and indicated that a lack of device-appropriate content from providers, rather than weak demand for Internet access, is blocking universal market adoption of the technology. Observations such as Hoschka's, and recommendations from the W3C and others, can help define mobile application development standards, writes Coffee. One lesson is that applications should not depend on Web browser conventions to facilitate basic tasks; another is that developers must accept that their applications will not always perform perfectly. The answer is to write applications that devote a sizable amount of logic to limiting user choice, and that attempt to spot errors at any point in the application stack and then give users clear descriptions of the errors along with instructions for resolving them. Mobile applications should also place as little strain on the user as possible, letting a user make several choices locally and then send the requests to the site for processing.
    Click Here to View Full Article
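
    One common way to cope with the discontinuous connectivity Coffee describes is to wrap each network request in a retry loop with exponential backoff, so a transient dead spot delays the request rather than failing it. The sketch below is illustrative only and is not from the article; send_with_backoff and the send callable are hypothetical names.

```python
import random
import time

def send_with_backoff(send, payload, max_attempts=5, base_delay=0.5,
                      sleep=time.sleep):
    """Retry a flaky network call with exponential backoff.

    `send` is any callable that raises ConnectionError when the link is
    down (a hypothetical interface, for illustration). The `sleep`
    parameter is injectable so the behavior can be tested without
    real delays.
    """
    for attempt in range(max_attempts):
        try:
            return send(payload)
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # connectivity never returned; surface the error
            # Wait longer after each failure; random jitter keeps many
            # clients from retrying in lockstep when coverage returns.
            sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

    This pairs naturally with the batching advice in the summary: the payload can be a queue of several user choices accumulated offline and submitted in a single request once a send succeeds.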

  • "A Paper Questions Whether Search Engines Make Popular Sites More So"
    Economist (11/17/05) Vol. 377, No. 8453, P. 86

    Search engines direct Internet surfers to less popular Web sites more than unaided random surfing does, according to new research from Santo Fortunato and his colleagues at Indiana University and Bielefeld University in Germany. The researchers have posted their paper on arXiv, a hub for physics and related papers. The paper is controversial because most computer, social, and political scientists believe search engines make popular Web sites even more popular, since their algorithms flag pages as important when they are linked to by many other pages. According to the model the researchers developed, however, the combination of using search engines and following random links directs people to less popular sites. The model is based on surfing the Web by following random links and on visiting pages returned by search engines. Arizona State University political scientist Matthew Hindman questions the research's data, and says the difference between the model and the real world is not necessarily the result of the role of the search engine.
    Click Here to View Full Article
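
    The comparison at issue can be illustrated with a toy calculation. Suppose page popularity follows a Zipf-like distribution. If surfing sends traffic in direct proportion to popularity, while search-mediated traffic grows sublinearly with popularity, then the less popular pages collectively receive a larger share of search traffic than of surfing traffic. The sublinear exponent of 0.8 below is a hypothetical value chosen purely for illustration, not a figure from the paper.

```python
# Toy illustration of the popularity-bias question. Traffic to each
# page is modeled as popularity ** exponent: exponent 1.0 stands for
# pure link-following surfing, and a hypothetical sublinear exponent
# (0.8, chosen only for illustration) stands for a search-mediated
# regime. Neither value is taken from the paper.

def traffic_share_of_tail(popularity, exponent):
    """Share of total traffic reaching the less popular half of pages
    when each page's traffic is popularity ** exponent."""
    traffic = sorted((p ** exponent for p in popularity), reverse=True)
    tail = traffic[len(traffic) // 2:]  # the less popular half
    return sum(tail) / sum(traffic)

# Zipf-like popularity: page at rank r has popularity 1/r.
popularity = [1.0 / r for r in range(1, 1001)]

surf_share = traffic_share_of_tail(popularity, 1.0)
search_share = traffic_share_of_tail(popularity, 0.8)

# The sublinear regime flattens the distribution, so the tail's share
# of traffic grows relative to pure popularity-proportional surfing.
assert search_share > surf_share
```

    The point of the toy model is only directional: any sublinear relationship between popularity and traffic moves visits toward the tail, which is the effect the paper attributes to search engines.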

  • "Smart Searching"
    Military Information Technology (11/21/05) Vol. 9, No. 9; Gerber, Cheryl

    Extracting meaningful and relevant information from massive volumes of data is a tough job for defense intelligence analysts, who are turning to more guided and dynamic search tools and methods to sift through data more efficiently. The Defense Intelligence Agency (DIA) performs effective research and analysis through a semantic search methodology supported by a diverse array of technology tools. Together, these tools comprise a solution that facilitates multilingual, integrated knowledge sharing via entity extraction, text mining, and text analysis. Components of DIA's solution include a highly scalable search and knowledge discovery platform for structured and unstructured data; software that supports next-generation search with guided navigation featuring data integration, discovery, and analysis; and a linguistics platform that DIA's National Media Exploitation Center employs to process and distill information in the native script of multiple dialects. The linguistics modules enable different government agency transliteration standards to interoperate, while the Intelligence Community Metadata Working Group can also address the issue of rival standards. Other tools DIA uses include an in-house metadata extraction tagging system and a search engine that can identify each sentence's subject, verb, and object and organize the data in such a way as to ease analysis of relationships across documents. The search engine permits Web browsers to be selected on an individual basis, while another program can uncover trends and relationships.
    Click Here to View Full Article

  • "Copyright Office Opens DMCA Anticircumvention Rulemaking Proceedings"
    Today's Engineer (11/05); Hollaar, Lee

    The Copyright Office of the Library of Congress will revisit the restrictions on circumventing access controls to determine whether more exceptions are needed, as directed by the Digital Millennium Copyright Act (DMCA). The DMCA mandates that the exceptions be reconsidered every three years to determine whether the restrictions preclude legitimate, non-infringing use of select classes of materials. The Copyright Office is limiting its concern to restrictions on access to copyrighted works, as the DMCA does not permit it to reevaluate protections against technological circumvention of copyright owners' rights regarding reproduction, adaptation, distribution, display, or performance of a work. Classes of works are defined by their authorship, and fall into categories such as literature, drama, motion pictures, and architecture. The Library of Congress decided in 2003 that certain lists of Internet locations, computer programs and video games that have become obsolete, and select ebooks should be exempt from the anticircumvention ban. The Copyright Office will also not entertain comments on the general nature of copy protection. Submissions to the Copyright Office will need to pinpoint the technological measure that causes the access problem, describe in detail what the access problem in fact is, and explain why the affected use is non-infringing.
    Click Here to View Full Article

  • "Robots of Arabia"
    Wired (11/05) Vol. 13, No. 11, P. 188; Lewis, Jim

    The creation of robot camel jockeys is seen as a significant achievement from both a technical and a social perspective. Qatar's emir, Hamad Bin Khalifa Al-Thani, commissioned the machines to eliminate the practice of using child jockeys, imported from poor nations and trained and housed under less than humanitarian conditions, and thereby win the respect of the developed world. The 35-pound prototypes were developed by Swiss company K-Team under the guidance of project and services manager Alexandre Colot. The remote-controlled devices fit into specially designed saddles and feature two hands: one to pull the reins and one to bear the whip. The robots were also designed with a ruggedized aluminum frame and shock absorbers; a GPS-enabled monitor that tracks the camel's heart rate; and a 400 MHz processor running Linux and communicating at 2.4 GHz. A plastic head adds an anthropomorphic touch that makes the camels more accepting of the robots, but the feature runs up against the traditional Arab cultural taboo on representations of the human form. Qatar's prime minister has mandated that the heads be removed before the racing season begins.
    Click Here to View Full Article

  • "Papers Please"
    Government Technology (11/05) Vol. 18, No. 11, P. 18; Vander Veen, Chad

    The Real ID Act, ostensibly passed as a terrorism deterrent in response to 9/11, has become a lightning rod for controversy amid accusations that the law really establishes a de facto national ID card. The legislation sets up federally mandated standards pertaining to the information on state drivers' licenses, and requires states to comply with specific rules regarding the issuance of such licenses. The Real ID Act specifically requires licenses to incorporate the first name, birth date, gender, permanent address, license number, signature, and a digital photo of the bearer, along with yet-to-be-determined security features and machine-readability. The lack of clarity over such components, as well as the stipulation that all license data will reside in a national database, is of particular concern to privacy advocates, given that the entire country will have new drivers' licenses in just a few years. Critics of the legislation also take aim at the fact that it was passed as an attachment to the Emergency Supplemental Appropriations Act for Defense, the Global War on Terror, and Tsunami Relief 2005, a tactic that Ari Schwartz of the Center for Democracy and Technology calls "undemocratic." Other issues of concern include whether states will be forced to reissue new licenses to people who already have them, and the costs this would entail. In addition, the Real ID Act contains provisions for reforming deportation and admission laws; revising the definition of a refugee; and authorizing the homeland security secretary to suspend all laws necessary to guarantee the rapid building of specific roads and barriers at the U.S. border. Opponents charge that the Real ID Act's provisions could inspire Orwellian privacy infringement and citizen surveillance, as well as present additional obstacles to immigrants or refugees.
    Click Here to View Full Article
    To read about ACM's activities regarding the Real ID Act, visit: http://www.acm.org/usacm

  • "Interface Lift"
    IEEE Spectrum (11/05) Vol. 42, No. 11, P. 32; Wohl, Amy D.

    User interfaces are undergoing a major overhaul so that users can more efficiently manage the tidal wave of information stemming from steadily increasing computer power. Three varieties of interface are emerging: browser interfaces that let users easily move between computers; special-purpose interfaces for managing large information repositories; and interfaces for managing personal data collections. The browser interface is familiar to Internet users and does not require specialized software to run applications residing on a remote server, but it is less capable of running complex applications. Specialized interfaces for navigating large collections of information include Inxight Software's Star Tree, a tool that arranges collections into galaxy-like configurations of subtopics that are easily comprehensible. EverNote is an example of a personal collection management interface: the product stores all computer files, regardless of format or type, in a single, searchable chronological manuscript in which files are displayed in miniature along a timeline. The most radical user interfaces are coming out of research labs, independent inventors, and new ventures, because entrenched companies do not wish to alienate mainstream customers by introducing new concepts too rapidly. Consequently, stalwarts such as Microsoft are tweaking their interfaces incrementally without dramatically changing their look and feel. The more experimental interfaces include 3D and haptic interfaces.
    Click Here to View Full Article

  • "Holding Pattern: The 2005 Salary Survey"
    Software Development (11/05) Vol. 13, No. 11, P. 32; Morales, Alexandra Weber

    The eighth annual Software Development Salary and Job Satisfaction Survey shows a slight increase in salary levels--3 percent or lower, on average, compared to last year--while the mean age of software development professionals has risen from 39 in 2000 to 41 in 2005; additionally, workers' average years of experience have climbed from 13 to 16 in the last five years. Satisfaction levels among the survey's approximately 3,500 respondents remain consistent: less than 4 percent report extreme dissatisfaction, 14 percent report dissatisfaction, 23 percent are neutral, 43 percent report satisfaction, and 16 percent report extreme satisfaction. Sixty-four percent of respondents cite challenge as the most critical aspect of their job, while 55 percent cite flexible scheduling. Thirty-seven percent of respondents are looking for a new job, and the 9 percent who identify offshore outsourcing as a factor in their search represent an increase of six percentage points from last year. Eighteen percent of the survey's respondents were foreign-born, compared to 15 percent in 2004. Foreign-born workers make up 30 percent of software architects and 8 percent of staff database analysts, and non-native developers with a master's degree in computer science outnumber native developers 2 to 1. Fifty-six percent of staffers rate programming and algorithm design skills as very important, while just 48 percent of managers consider them crucial; 11 percent of respondents perceive proficiency with service-oriented architecture as important, compared to about 10 percent last year. Tools ranked in descending order of popularity are version control (75 percent), IDEs (60 percent), requirements gathering (54 percent), testing (51 percent), project management (49 percent), modeling (less than 30 percent), and MDA or computer-aided software engineering (4 percent).
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)