Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 870:  Wednesday, November 23, 2005

  • "Google's Growth Helps Ignite Silicon Valley Hiring Frenzy"
    Wall Street Journal (11/23/05) P. A1; Tam, Pui-Wing; Delaney, Kevin J.

    In the 15 months since Google went public, the meteoric rise of its stock price, driven by its innovative approach to technology, has touched off a hiring binge in Silicon Valley that recalls the climate prior to the dot-com collapse. Google has a current target of hiring 10 employees a day, and has enlisted legions of recruiters to scour the software community and deliver top engineers. Google has also held code-writing contests to discover new talent. In its relentless search for top engineers, whom Google's Alan Eustace believes to be worth at least 300 times as much as an average engineer, the company has drawn such industry luminaries as Internet pioneer Vinton Cerf away from MCI and Kai-Fu Lee away from Microsoft, and its rivals have taken note, accelerating their own recruiting initiatives. The high tide of the tech boom saw aggressive recruiting from companies such as Apple, Cisco, and Netscape, but in the years since the collapse, Google appears to be the singular engine of growth, having seen a sixfold revenue increase from 2002 to 2004. Google is not alone in its hiring efforts, however: The recruiting site Dice.com reports that its job postings have more than doubled since October 2003, now totaling 77,600. Google's aggressive tactics have prompted hostile responses from some companies, such as Microsoft's lawsuit contesting Lee's defection. Google's interviewing process is acknowledged as rigorous beyond precedent, and the company offers generous restricted-share packages to augment salaries that, it admits, are roughly in line with the industry, if slightly lower. Yahoo! and Microsoft have both increased their hiring efforts, often competing directly with Google for individual candidates. As Google's stock price continues to soar, many of its early employees are millionaires several times over, raising the concern that the pool of top talent could be thinned by early retirements.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "House Democrats Stake out High-Tech Ground"
    Federal Computer Week (11/21/05); Sternstein, Aliya

    In an effort to restore the nation's technological competitiveness, House Democrats have introduced a laundry list of measures to be carried out over several years, known as the Innovation Agenda, though there is the fear among industry leaders that the proposal could be scuttled by partisanship. The agenda calls for doubling the budget of the NSF, cultivating a skilled labor force, and expanding the tax credit for research and development. Republicans were quick to question the Democratic commitment to the agenda, which would provide education for 100,000 new scientists, mathematicians, and engineers over the next four years through scholarships for students who pledge to make those vocations their careers upon graduating. A new visa would be created for international doctoral and postdoctoral academics in the fields of math, science, engineering, and technology, to ensure the U.S. talent pool remains full. MIT's David Clark, a pioneer of the Internet, praised the call for greater NSF funding, claiming the current lack of funds is the central impediment to innovation. Clark also blasted the U.S. immigration policy, describing it as "onerous," and blaming it for the declining influx of foreign students. Another Internet visionary, Google's Vinton Cerf, likened the initiative to the effect Sputnik had on him in his youth in its potential to ignite interest in technology careers. "I was one of those affected, and looking back, it could not have been more fortuitous," said Cerf. "I sincerely hope that this kind of initiative will engender bipartisan support."
    Click Here to View Full Article

  • "Video Games Are Their Major, So Don't Call Them Slackers"
    New York Times (11/22/05) P. A21; Schiesel, Seth

    The study of video games is emerging as a mainstream academic pursuit, after long having been a minor niche discipline reserved for obscure vocational schools. The academic study of video games can either come in the form of programming and design, or as an examination of their symbolic significance to contemporary culture. While five years ago there were not even a dozen programs devoted to the study of video games, there are more than 100 now, some with an interdisciplinary focus that brings together seemingly dissimilar fields such as drama and computer science. The nation's leading game maker, Electronic Arts, has contributed millions of dollars to universities to expand their offerings in game programming and design. Bing Gordon, EA's chief creative officer, notes that for two decades students entering the gaming industry had to unlearn much of the computer science they studied in college to acclimate themselves to the gaming environment. Schools such as Parsons The New School for Design, located in Greenwich Village, emphasize a soup-to-nuts approach to game development, where students begin by sketching out crude ideas on paper, and then move on to more sophisticated graphical design and story development. Professors in the video game programs credit them with having brought together students from hard-core disciplines such as computer science with artists, and believe the collaboration is beneficial for both. While skeptics argue that video game design cannot be taught as a discipline, proponents liken that objection to Hollywood's initial reaction to the emergence of film schools in the late 1960s and early 1970s.
    Click Here to View Full Article

  • "Record Rise in Overseas IT Workers"
    ZDNet UK (11/21/05); Gomm, Karen

    The United Kingdom saw an influx of 22,000 foreign IT workers last year, 85 percent of whom were from India and many of whom were highly skilled. For its part, the United Kingdom outsources many lower-level IT jobs to India, but the flow of traffic goes both ways, notes Ann Swain, CEO of the Association of Technology Staffing Companies, the organization that supplied the data. Swain projects that the United Kingdom will only become more dependent on foreign IT workers if the number of graduates in technical fields does not increase. Some industry insiders are fearful that the United Kingdom could become a less attractive destination for foreign workers, which could cripple the technology sector if more graduates do not appear. There are also concerns that even India's vast talent pool could be drained as early as 2007, as some estimates place wage inflation in the technology sector at 36 percent. Software engineers and systems analysts are two of the occupations most in demand in the United Kingdom, according to the Home Office. Education will be central to resolving the issue, notes Swain. "Getting more students to study IT is crucial if we are to reduce our dependence on foreign skills, particularly now that those skills, like our own, are in increasingly short supply."
    Click Here to View Full Article

  • "Animation Event at Montreal ACM SIGGRAPH"
    fps (11/22/05)

    This week's Montreal ACM SIGGRAPH animation event showcases the relevance of computer graphics to traditional forms of animation. Computer graphics as a field can succeed in tandem with conventional animation; indeed, their relationship is not so much competitive as symbiotic. Art is historically predicated on the convergence and fusion of media, and animation should be no different. Tonight, Montreal ACM SIGGRAPH features "Creating Digital Animation: Needs, Challenges, Technology and Tools" at the Society for Arts and Technology. Wednesday's event is sponsored by Toon Boom, which will present its tool suite, highlighting its newest product, the demoscene.
    Click Here to View Full Article

  • "College to Collaborate With Sun Microsystems"
    Dartmouth Online (NH) (11/21/05); Bradley, Astrid

    Dartmouth College's Public Key Infrastructure Laboratory will develop an open-source certificate authority that runs on Sun Microsystems' open-source operating system. The move is part of a larger collaborative effort that will have the PKI Laboratory continue its work on hardware-based security, and have Sun donate equipment for a restructured graduate operating systems course. Dartmouth will bring its advances in secure computing to Sun's OpenSolaris project. "One of the nice things about Open Solaris is that it is a well-engineered, high-performance OS that is available now to students and to the educators to see the insides," says Sean Smith, a computer science professor who is the director of the PKI Laboratory. The donated equipment will also give Dartmouth students more of an opportunity to be hands-on with operating systems. "A lot of the research and directions that Dartmouth were pursuing were very interesting to Sun, and a lot of the things that Dartmouth needed in terms of again providing that really solid foundation on which to build we had already put into Solaris 10," says Glenn Weinberg, a 1978 graduate of Dartmouth who is vice president of operating platforms at Sun.
    Click Here to View Full Article

  • "Ohio Professor Receives National Computational Science Award"
    Ohio Supercomputer Center (11/21/05)

    Capital University's Ignatios Vakalis, a professor of mathematics and computer science, received the Undergraduate Computational Engineering and Sciences (UCES) award last week at the ACM- and IEEE-sponsored SC05 conference. The award credited Vakalis as an innovator and acknowledged the scope of his contributions to computer engineering and scientific education. This is the 12th year the UCES award has been given for achievement in the computational science education of students, faculty, or members of outside groups. Vakalis was cited specifically for his work developing curricula for computational engineering and science courses, such as interdisciplinary Web tools. Vakalis is also involved in an NSF program designed to standardize the curricula of computational science across colleges in Ohio. "My goal for developing the computational science coursework is that students will see the beauty and the practical use of computational science," Vakalis said. "I want students to be aware of the intersection and interplay among mathematics, computing, and science."
    Click Here to View Full Article

  • "Root Servers: The Real Net Power"
    CNet (11/21/05); McCullagh, Declan

    RIPE, based in Amsterdam, manages one of the root servers that guide online traffic to top-level domains, and serves as the regional address registry for Europe, the Middle East, and Russia. Not all of the 13 organizations charged with operating a root server are located in the United States, which undercuts some of the concerns that delegates from developing countries expressed at the just-concluded U.N. summit in Tunis, where the formation of a U.N. Internet Governance Forum was agreed upon as a compromise after the United States declined to cede oversight of Internet governance to an international organization. RIPE managing director Axel Pawlik does not believe the status quo needs changing, noting that although the United States technically holds veto power over ICANN mandates, it does not exercise that right. Happy with ICANN's management, though admitting it could be improved, Pawlik cannot foresee a day when all the countries of the world unite to hand power over Internet governance to a group such as the International Telecommunication Union, which believes itself entitled to that role. Nor can Pawlik envision the U.S. government ordering ICANN to take a domain off the air because the United States happens to be at war with a particular country, though when asked what would happen if ICANN approved the .xxx domain and the Bush administration decided to use its veto power, he said he was not sure how the root server operators would react. While recognizing European opposition to U.S. involvement, Pawlik says most countries are happy with the job ICANN has done thus far.
    Click Here to View Full Article

  • "SDSU Professor Maps the Future"
    Voice of San Diego (11/22/05); Bettiga, Molly

    When the San Diego area was ravaged by wildfires two years ago, San Diego State University professor Ming-Hsiang Tsou created online maps to display the locations and perimeters of the fires. Every six or seven hours Tsou updated the site, which received as many as 8,000 hits each day. Since then, Internet mapping has become a burgeoning industry, and SDSU is eager to stay atop the field. Tsou is currently at work on four mapping projects, having been commissioned by the city of San Diego and NASA, among others. NASA is collaborating with the U.S. Border Patrol in the Reason Project aimed at bringing agents up to speed in technologies such as Web mapping and GPS. Tsou says the project is not limited to a single area, as it creates a large database that can be accessed online. In another project, Web mapping monitors water quality in the San Diego area, relying on samples taken and reported by private companies, so that any resident could go online and learn the quality of their water. Tsou is also working on an NSF project focusing on geographical information science (GIS), aimed at encouraging participation and education in the field. Tsou cites Google Earth and Microsoft's Virtual Earth as examples of popular applications of GIS, standing in contrast to the notion that it is a specialized field with limited commercial appeal.
    Click Here to View Full Article

  • "Supercomputing Now Indispensable"
    MercedSearch.com (11/22/05); Lieberman, Bruce

    The blazingly fast computational ability of today's supercomputers has become indispensable to scores of scientific applications, from climate modeling to simulating the conditions of the universe at the time of its creation. The fastest supercomputer today can perform at speeds roughly 150,000 times faster than a high-end consumer PC. Last fall, Jean-Bernard Minster and a team of scientists at the San Diego Supercomputer Center used the facility's fastest machine, an IBM system called DataStar, to simulate the effects of an earthquake on the area between Los Angeles and San Diego. Over five days, DataStar produced 47 trillion bytes of information in one of the largest earthquake studies ever conducted. While the United States has long been home to the fastest supercomputers, there is concern that advances in China and Japan could threaten its position at the forefront of the field. Supercomputers are vital tools in solving a broad swath of the most daunting problems facing the United States, such as defense intelligence in Afghanistan and Iraq, gene modeling and cancer research, and monitoring the aging stock of nuclear warheads. The president's proposed budget for 2006 does not allocate any new funding to supercomputers, and actually cuts funding for Department of Energy supercomputing research. The San Diego Supercomputer Center receives its annual budget of $80 million from the NSF. The center boasts several systems with a combined computing capacity of 28 teraflops, and advises scientists around the country on how to link desktops together to create small versions of supercomputers. Narrowing the memory gap, or the relative slowness of a computer as it moves data from its memory to its processor, will be central to the further advancement of supercomputers.
    Click Here to View Full Article

  • "Self 2.0: Internet Users Put a Best Face Forward"
    Washington Post (11/22/05) P. A1; Noguchi, Yuki

    Tens of millions of Internet users shop, socialize, and communicate online through virtual personas or avatars that serve as representations of themselves. It is estimated that 90 percent of America Online instant messengers currently employ an avatar of some kind, while 7 million people visit Yahoo!'s avatar creation site each month. Avatars are for the most part humanoid, and some can perform physical and facial gestures, as well as speak; the interaction between avatars can be surprisingly intimate. Though avatars' physical appearance can be idealized or exaggerated, they tend to accurately reflect their user's personality, which helps cultivate trust and rapport. Ralph Schroeder, a research fellow at Oxford University's Oxford Internet Institute, says, "People become attached to their online identity. They care about being consistent so that people can trust them. This issue of trust comes up again and again." As time goes on, avatars may follow their real-world counterparts around on various programs, such as a name tag that appears on instant messages, Web logs, and shopping searches. "It's part of a range of personalization tools we're building at Yahoo!," notes Yahoo! director of community applications David Kopp. Experts predict that avatars will eventually become the primary means of online identification between computer users.
    Click Here to View Full Article

  • "Putting the Napster Genie Back in the Bottle"
    New York Times (11/20/05) P. 3-1; Hansell, Saul

    File-sharing service Grokster plans to introduce peer-to-peer technology by the end of the year that will monitor the trading of copyrighted works, and either force the file-swapper to pay for a song or block the attempt to download the content directly from another computer. The technology that Grokster will use was developed by Snocap, the new company of Shawn Fanning, 25, who developed Napster seven years ago while he was a student at Northeastern University in Boston. "Nobody has ever built a reliable peer-to-peer service, where people can really access all the music they want in one location," says Fanning. He believes Napster would have eventually served as a legitimate and profitable sales channel for music companies and artists had it continued, and he envisions legal file sharing as providing an open system for anyone from major labels to local bands to register music and set the economic terms for trading songs. Sony is backing the startup Mashboxx, which is buying the Grokster name and is the only peer-to-peer service to sign up for Snocap's technology so far, and the three other big labels have agreed to provide music to Snocap as well. However, the jury is still out on how many people will embrace legal file sharing.
    Click Here to View Full Article

  • "His Goal Was to Make It Simple to Use and a Joy to Look at. He succeeded. The Result Was the iPod."
    Telegraph.co.uk (11/19/05); Derbyshire, David

    Apple's Jonathan Ive, the British-born designer credited with inventing the iPod and the iMac, attributes those designs to the collaboration of the manufacturing, software, hardware, and electronics teams. Through his inventions, Ive essentially reversed Apple's slumping fortunes single-handedly; to date, 30 million iPods have been sold. Ive says that he approaches design from the standpoints of simplicity, aesthetics, and quality. Apple's products continue to improve with each successive design, such as the new iPod Nano and the new iMac, which packs a home entertainment center into the casing of a flat-screen television. Ive consistently rejects clutter in his designs, instead preferring unadorned, frequently white products with no unnecessary buttons or features. The remote control for the new iMac has one large button that performs 12 functions. Ive focuses much of his attention on blending functionality with design, such as the hinge that connects the monitor of the iMac to its base, and etching the serial number onto the back of the device rather than placing it on a visually offensive silver sticker. He also put a magnet on the remote so that it can attach to the side of the unit when not in use. Ive laments that more companies are not as concerned with the design of their products, maintaining that it is a testament to their values when they create products that require their users to frequently refer back to the instruction book.
    Click Here to View Full Article

  • "Blue Brain Power"
    Computerworld (11/21/05); Hamblen, Matt

    Researchers from Lausanne, Switzerland, have undertaken a project known as Blue Brain, using IBM's eServer Blue Gene to model the human brain. The project is analyzing the functions of the 10,000 neurons within a rat's neocortical column (NCC), which is quite similar to those in the human brain. Each NCC is 0.5 mm in diameter and between 2 mm and 5 mm in height. First, Blue Brain will attempt to build a template within two to three years that models an NCC, which will help the scientists understand how information is stored in and retrieved from memory, potentially shedding light on memory's capacity and how it can be lost. Exposing vulnerabilities in the neocortex could be particularly instructive, as brain disorders originate in that region. Numerous neurological and psychiatric disorders could be treated through the research, such as depression, autism, and schizophrenia. Performing brain experiments using computer simulations will be much more efficient than conducting research in the lab. Using the IBM simulation will be especially beneficial because it will leave no stone unturned in examining the neurons, providing the researchers with unprecedented comprehension of the brain. Once the researchers complete the modeling of the neocortex, they will move on to other regions, such as the hippocampus, ganglia, and cerebellum. Installed in July, the Blue Gene supercomputer simulated 25,000 simple neurons in 60 seconds, though the modeling of the 10,000 complex neurons in the NCC will be a far lengthier process. The supercomputer contains 8,096 processors, each of which is expected to simulate between one and 10 neurons. As processing speeds and memory advance, it may be possible to model the entire human brain, which contains 100 billion neurons, within a decade.
    Click Here to View Full Article
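    To give a sense of what "simulating a neuron" means computationally, the sketch below implements a leaky integrate-and-fire point neuron, one of the simplest standard neuron models. This is only an illustrative toy under assumed parameter values; Blue Brain's models are vastly more detailed, multi-compartment simulations.

```python
# A minimal leaky integrate-and-fire neuron: the membrane voltage leaks
# toward a resting level, is driven up by input current, and emits a
# spike (then resets) whenever it crosses a threshold. All parameter
# values here are illustrative assumptions, not Blue Brain's.

def simulate_lif(current, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0,
                 tau=10.0, resistance=10.0, dt=1.0):
    """Euler-integrate the membrane voltage; return spike times (steps)."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(current):
        # dv/dt = (-(v - v_rest) + R * I) / tau
        v += dt * (-(v - v_rest) + resistance * i_in) / tau
        if v >= v_thresh:          # threshold crossed: spike and reset
            spikes.append(t)
            v = v_reset
    return spikes

# A constant 2.0 input for 100 steps drives steady, repetitive firing.
spike_times = simulate_lif([2.0] * 100)
print(spike_times)  # regular spikes, one every 14 steps
```

    Even this stripped-down model must be stepped through time for every neuron, which conveys why simulating 10,000 far more complex neurons demands supercomputer-scale hardware.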

  • "ITU Head Foresees Internet Balkanization"
    Computer Business Review (11/21/05)

    International Telecommunication Union secretary-general Yoshio Utsumi hinted that the "nuclear option" mentioned prior to the World Summit on the Information Society in Tunisia could manifest itself over the next few years. "The Internet need not be the so-called one Internet controlled by one center, so regionalization is already getting started and I suspect in few years the scenery of the Internet will be a quite different one," said Utsumi. At the heart of the debate is ICANN's oversight of the domain name and IP address allocation systems, which in effect gives the U.S. Department of Commerce power over the management of other countries' top-level domains. Many U.N. member nations had called for international oversight under the umbrella of a global organization such as the U.N., but the United States strongly opposes a change. The summit resulted in agreement that the status quo should remain regarding Internet governance, but that a new "Internet Governance Forum" should be established to serve in an advisory role to ICANN. The group would not be given any oversight powers. Some suggested that the United States did cede some of its power at the summit, citing Paragraph 68 of the Tunis Agenda, which reads, "We recognize that all governments should have an equal role and responsibility, for international Internet governance and for ensuring the stability, security, and continuity of the Internet." Paragraph 63 reinforced the message, stating, "Countries should not be involved in decisions regarding another country's country-code Top-Level Domain." But ICANN President Paul Twomey said the agenda does not in any way change the way ICANN carries out its duties, though there "may" be some change in how Commerce approves its decisions.
    Click Here to View Full Article

  • "Spies, Lies and Butterflies"
    New Scientist (11/19/05) Vol. 188, No. 2526, P. 32; Brooks, Michael

    Applying chaos theory to the concealment of messages could give users of existing communications networks more peace of mind, according to a new study in Nature. Chaotic communications can be facilitated through a setup involving two synchronized lasers, in which a chaotic light beam emitted by one laser is used to transmit a message securely to the other. The beam is split and fed into a reflector that provides feedback and induces chaos, and the chaotic light signal is routed along a fiber-optic cable to a modulator, which adds the message. Part of the encrypted signal is then fed into an identical laser, producing a chaotic signal that corresponds to the one generated by the transmitting laser; subtracting that signal from the encrypted transmission reveals the message. "If you need security for a number of hours, where it wouldn't be easily detectable, or where there's a low probability of interception, [chaotic encryption is] probably useful," says Alan Shore of the University of Wales at Bangor. Experimentation has demonstrated that telecoms could employ this system to enhance the privacy of calls without drastically changing their networks, or offer chaotic encryption on dedicated networks. The same concept could also be applied to cell phone signal encryption. Impressive though chaotic encryption may be, it is still less resistant to eavesdroppers than quantum cryptography.
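    The hide-then-subtract idea behind the laser scheme can be illustrated numerically. In the toy sketch below, a transmitter buries a low-amplitude message inside a chaotic carrier, and a receiver that can regenerate the identical chaotic sequence subtracts it to recover the message. This is only a conceptual analogy under assumed parameters: the real system synchronizes physical lasers, whereas here "synchronization" is idealized as the two parties sharing the chaotic map's seed.

```python
# Toy chaotic masking: transmitter adds a tiny message signal to a
# chaotic carrier; receiver regenerates the same chaos and subtracts it.
# (Idealized stand-in for the synchronized-laser scheme in the article.)

def logistic_stream(x0, r=3.99, n=100):
    """Generate n samples of a chaotic logistic-map sequence."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

bits = (1, 0, 1, 1, 0, 1, 0, 0)
message = [0.001 * b for b in bits]   # message ~1000x weaker than carrier

# Transmitter: chaotic carrier masks the message.
carrier = logistic_stream(x0=0.123456, n=len(message))
encrypted = [c + m for c, m in zip(carrier, message)]

# Receiver: shares the seed, so it reproduces the carrier exactly,
# then subtracts it and rescales to recover the bits.
local = logistic_stream(x0=0.123456, n=len(message))
recovered = [round((e - l) / 0.001) for e, l in zip(encrypted, local)]

print(recovered)  # the original bit pattern
```

    Without the seed, an eavesdropper sees only the chaotic sum; the weakness, as the article notes, is that chaotic masking is harder to break casually but offers no proof of secrecy comparable to quantum cryptography.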

  • "Buggy Software and Missile Defense"
    The New Atlantis (11/05) P. 47; Halpern, Mark

    Software programmer Mark Halpern finds fault with the long-standing contention that an Anti-Ballistic Missile (ABM) defense system is unrealizable because the software required for such a system would be undependable. He argues that a software system can be highly reliable, provided some real-world experience is obtained in its use, and blames software's so-called undependability on our own lack of discipline. Subjective perceptions of software's complexity and cost add to the confusion, and Halpern notes that software is vastly easier to create, more testable, and more dependable than hardware for the class of functions that we program computers to carry out. "In the end, the only rational basis for measuring the relative complexity of software and hardware is function-by-function--that is, by comparing the costs of realizing well-defined functions using each of the competing technologies," Halpern concludes. He reasons that software's bugginess stems from people's natural tendency to pile on code and then blame the technology for failures, and to prefer buggy software because it is cheaper than bug-free software. Halpern suggests an effective ABM system is an achievable goal once various technical delusions are cleared up: For one thing, the assumption that such a system is doomed to fail falls apart when one removes arbitrary constraints from the equation. In addition, computers are excellent at predicting a missile's flightpath and ascertaining an interception route, while ABM R&D is committed to developing weapons that do not need as much precision to effectively neutralize enemy missiles. Halpern supports the aggressive development of missile defense by observing that there is no reason to expect the technology not to work; that software is perfectly reliable once debugged, which can be done by throwing enough resources at the problem; and that "perfect" testing is not only impossible, but unnecessary.
    Click Here to View Full Article