Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to

Volume 6, Issue 679:  Monday, August 9, 2004

  • "Fewer College Students Choose Computer Majors"
    USA Today (08/09/04) P. 4B; Kessler, Michelle

    The gloomy state of the tech job market is chiefly to blame for declining enrollment in college computer programs, and many educators worry that there will not be enough people to fill out the U.S. tech workforce when the economy bounces back. The Computing Research Association (CRA) estimates that enrollment of computer science and computer engineering majors in North America declined 23 percent between 2002 and 2003, while the number of full-time computer science undergraduates enrolled for the fall semester at San Jose State University was 417, compared to 525 last year. San Jose State computer science chairman David Hayes reports that computer degrees were once thought to guarantee stable jobs with good salaries. "Now, the perception is jobs are going overseas, and people are being laid off," he notes. Carnegie Mellon associate dean Peter Lee sees some good coming out of this trend: Although enrollments are not as high as before, the students who do enroll are usually of higher quality, driven by a love of technology rather than career ambition, he contends. The AEA and the National Science Foundation estimate that only about 6 percent of the world's engineering degrees are awarded by the United States, placing America behind Japan, Russia, and India. There are also concerns that graduate program enrollments will start to decrease if new security regulations discourage foreign students from applying for U.S. visas. CRA reckons that 43 percent of computer science and engineering degree recipients in North America are foreign students.
    Click Here to View Full Article

  • "Next-Generation Search Tools to Refine Results"
    CNet (08/09/04); Kanellos, Michael

    As the digitization of recorded information continues, technology researchers are working on ways to search the rapidly growing store of data more effectively. This month, the New Paradigms for Using Computers Conference highlighted several of these efforts: Inxight showed off software that tracks how and where search items get mentioned on the Web, thereby creating a graphical representation of the hidden connections between groups or individuals. University of California at Berkeley researchers demonstrated their Flamenco search system, which focuses on artwork and antiques stored in museums worldwide. Flamenco relies on the descriptive text museums use when digitizing their works rather than searching the pieces' visual information, and it lets users find works of art by category: Items are classified by content, century, artist, medium, and other identifiers. The new software development is occurring alongside search integration in next-generation operating systems, including Microsoft's Longhorn and Apple's Tiger Mac OS X. Apple Computer developer Bruce Horn founded Ingenuity Software to help desktop users prepare their systems for better file-searching in the future: His company produces tools that enable users to more easily index photos and documents. Microsoft researcher Jim Gemmell says one of the lessons his company has learned from its MyLifeBits project is that ad hoc file categories are not useful in the long term. Internet Archive founder Brewster Kahle spoke at the New Paradigms conference and said his organization is on the way to archiving some 100 million published books, more than 2 million audio recordings, up to 200,000 movies, approximately 50,000 software titles, and a growing amount of TV broadcast and Internet data.
    Click Here to View Full Article
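    The faceted approach behind Flamenco is easy to sketch: items are filtered on descriptive metadata (century, artist, medium) rather than on visual content. The following Python sketch uses invented artworks and field names, not Flamenco's actual schema.

```python
# Faceted search over descriptive metadata (data is invented, not Flamenco's).
artworks = [
    {"title": "Water Lilies", "century": "19th", "artist": "Monet", "medium": "oil"},
    {"title": "The Kiss", "century": "19th", "artist": "Klimt", "medium": "oil"},
    {"title": "Fountain", "century": "20th", "artist": "Duchamp", "medium": "readymade"},
]

def facet_search(items, **facets):
    """Return items matching every requested facet value."""
    return [it for it in items
            if all(it.get(k) == v for k, v in facets.items())]

def facet_counts(items, facet):
    """Tally items under each value of a facet -- the per-category
    counts a faceted browsing interface displays."""
    counts = {}
    for it in items:
        counts[it[facet]] = counts.get(it[facet], 0) + 1
    return counts

print([it["title"] for it in facet_search(artworks, century="19th", medium="oil")])
print(facet_counts(artworks, "century"))  # {'19th': 2, '20th': 1}
```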

  • "Controlling Software Component Quality"
    IST Results (08/09/04)

    Information Society Technologies' QCCS project offers a solution to the problem of under-specification in component-based software development, which project coordinator Anne Marie Sassen says can avoid costly and calamitous errors such as those that led to the destruction of the Ariane 5 rocket. "Designers can create a software component to carry out a specific task but they usually do not know how it will operate in different circumstances or environments," she notes. "They design it with a certain function in mind and know what it is supposed to do, but often they cannot tell what else is needed for it to do that or what effect it will have on other components." QCCS supplies a quality-of-service description for software components using contracts that specify both functional and non-functional component aspects, which enables safe component re-use in new applications. QCCS also uses Aspect-Oriented Programming, which decomposes software problems into separate concerns that are solved individually and then reintegrated, and which allows the knowledge of experienced programmers to be captured and passed on to less experienced ones. Three test cases involving advanced component-based systems were employed to test QCCS' open-source methodology and tools; the systems were used in the fields of mobile healthcare, information sharing, and Internet services. The University of Cyprus, the IRISA French research institute, and the Technical University of Berlin were all involved in the QCCS project.
    Click Here to View Full Article
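    The contract idea at the heart of QCCS can be illustrated in a few lines. This sketch wraps a component in checks for a functional precondition and postcondition plus a non-functional bound (maximum running time); the decorator and component are invented for illustration and are not part of QCCS's actual tooling.

```python
import time

def contract(pre, post, max_seconds=None):
    """Wrap a component with a functional contract (pre/post) and an
    optional non-functional one (a running-time bound)."""
    def wrap(fn):
        def checked(*args):
            assert pre(*args), "precondition violated"
            start = time.perf_counter()
            result = fn(*args)
            elapsed = time.perf_counter() - start
            if max_seconds is not None:
                assert elapsed <= max_seconds, "time bound violated"
            assert post(result, *args), "postcondition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0,                      # caller's obligation
          post=lambda r, x: abs(r * r - x) < 1e-6,   # component's promise
          max_seconds=0.1)                           # non-functional bound
def sqrt_component(x):
    return x ** 0.5

print(sqrt_component(2.0))   # ≈ 1.41421356
```

A component re-used in a new application then fails loudly at the contract boundary, instead of silently misbehaving the way the under-specified Ariane 5 component did.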

  • "Work on Higher-Speed WLAN Standard Begins"
    TechNewsWorld (08/07/04); Korzeniowski, Paul

    Suppliers are investigating how to boost WLAN speeds beyond 100 Mbps, and analysts predict that a new standard approach dubbed 802.11n will soon be introduced and then incorporated into products projected to ship in 2006. The speed of wireless connections can already be doubled through proprietary "Super G" methods that use channel bonding, in which a pair of 54 Mbps channels is merged into a single 108 Mbps connection. However, analyst Allen Nogee reports that Super G products' low level of interoperability has limited their use to niche multimedia applications, while the channel bonding method can actually weaken bandwidth throughput under certain circumstances. The most commonly used WLANs support 11 channels, only three of which are nonoverlapping, while increased transmission range often reduces the effectiveness of high-speed transmissions. Vendors have been looking into a standard way to expand WLAN bandwidth since last September, and Super G is one of the methods under consideration. Several companies have mapped out a technique to raise WLAN speeds to 500 Mbps by employing Multiple Input, Multiple Output (MIMO) technology--deploying two to four sets of antennas on each wireless link rather than one antenna--and doubling the size of each radio channel to 40 MHz; however, analyst Abner Germanow notes that users are hesitant to adopt products that operate in the 5 GHz frequency range because of interoperability issues. IEEE plans to conclude the collection of potential proposals for the 802.11n standard by mid-August, while formal presentations from groups supporting different proposals will be heard in the fall. Items to be included and excluded will then be determined by the 802.11n committee; the evaluation process could take six months to a year or more, given the technical challenge of extracting additional bandwidth from existing frequency ranges.
    Click Here to View Full Article
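    The bandwidth figures above follow from simple arithmetic, sketched below; the assumptions (each bonded channel contributes its full 54 Mbps, each MIMO antenna set adds a spatial stream, and a 40 MHz channel doubles the 20 MHz rate) are idealized signaling rates, not real-world throughput.

```python
BASE_RATE_MBPS = 54  # one standard 802.11a/g channel

def bonded_rate(channels=2, base=BASE_RATE_MBPS):
    """Channel bonding: merge adjacent channels into one wider link."""
    return channels * base

def mimo_rate(antenna_sets, channel_width_mhz=40, base=BASE_RATE_MBPS):
    """Idealized MIMO estimate: each antenna set carries a spatial stream,
    and a 40 MHz channel roughly doubles the 20 MHz base rate."""
    return antenna_sets * (channel_width_mhz / 20) * base

print(bonded_rate())   # 108 -- the "Super G" figure
print(mimo_rate(4))    # 432.0 -- approaching the ~500 Mbps target
```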

  • "Hewlett-Packard Expresses Interest in University's Neural Network"
    Associated Press (08/03/04)

    Researchers in Idaho have developed the first hardware system that would give machines the ability to learn on their own, without programming. The hardware approach is based on the concept of a neural network, which would enable computers or robots to perform many computations simultaneously and think like a human being. Inventor Richard Wells calls his technology a "biomimic artificial neuron," and the neural network formed on a microchip would serve as the building block for the thinking machine. "The low-power technology is miniaturized to a scale approximately the size of a few animal cells per neuron and performs sensing, information processing, routing and actuation, much like the brain or spinal cord," says Gene Merrell of the Idaho Research Foundation. "The long-term applications are for artificial limbs or other prosthetics." Hewlett-Packard has contributed $200,000 to the research effort at the University of Idaho, and a patent on the neural network technology is pending. Until now, researchers have focused on software solutions, in which a microprocessor performs the computations.
    Click Here to View Full Article

  • "Open Supercomputing Hits Big 1-0"
    Wired News (08/06/04); Delio, Michelle

    This week marks the tenth birthday of the original Beowulf open-source supercomputing cluster, and Beowulf project co-founder Donald Becker commented at an anniversary celebration that the general attitude people have toward the supercomputer has changed in the last decade from profound opposition to almost unanimous support. Becker says the inspiration for Beowulf was the convergence of three concepts: PC-type machines overtaking traditional supercomputers in terms of price-performance improvement; the realization that the development of a common, community-driven software system would enable PC-class machines to augment supercomputers; and the emergence of Linux as a dependable, network-capable operating system. The first Beowulf cluster, Wiglaf, was built from 16 off-the-shelf 66 MHz 486 DX4 processors linked by channel-bonded Ethernet. Among the benefits of Beowulf clusters cited by Becker are their ability to deliver optimal performance, their support of software that enables nearly anybody to construct their own cluster, and their potential for increased capability thanks to the wide availability and affordability of commodity computers. "I see now that when we were initially thinking about the benefits Beowulf would provide we missed one of the most important elements--clusters are incrementally scalable," noted Becker. "Unlike custom-designed supercomputer systems that are designed as large machines, you can start with a small cluster and scale it as the demand grows." Although Beowulf clusters do not always run open-source software, some think that the Beowulf appellation is inapplicable if open source is not running on the cluster. Becker said that open source enables software analysis to confirm that the unmodified software will operate properly in a cluster environment.
    Click Here to View Full Article
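    The incremental scalability Becker highlights can be illustrated with a toy scatter/gather computation. A thread pool stands in for the cluster's nodes here; a real Beowulf distributes the chunks to separate commodity machines over message passing (e.g., MPI).

```python
from multiprocessing.pool import ThreadPool

def partial_sum(bounds):
    """One node's share of the work: sum of squares over a half-open range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def cluster_sum(n, workers):
    """Scatter [0, n) across `workers` nodes, then gather the partial results."""
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    with ThreadPool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

# Same answer whether the "cluster" has 2 nodes or 8: scale as demand grows.
print(cluster_sum(100_000, 2) == cluster_sum(100_000, 8))  # True
```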

  • "Can Quantum Dots Compute?"
    IEEE Spectrum (08/04/04); Guizzo, Erico

    Researchers at Harvard University and Duke University have created entangled quantum dots using semiconductor technology, and experts say the vast body of knowledge and available resources in the semiconductor industry could mean more reliable and scalable quantum computers than those built by other means. Other research teams have already created quantum computing prototypes using sophisticated lab equipment and environments shielded from outside interference; those approaches involve molecules in liquid or ions confined by lasers or electric fields. Duke researchers created a thin two-dimensional electron sheet by growing an aluminum gallium arsenide layer atop a gallium arsenide layer; the free electrons gathered at the interface formed the electron sheet, and applied negative voltages caused the electrons on both sides to repel each other, leaving a void except for two reservoirs of electrons. These reservoirs are essentially quantum dots roughly 200 nm across and several hundred nanometers apart. When the negative voltages were decreased, the quantum dots sensed each other more strongly and became linked by quantum forces, triggering the entanglement properties necessary for quantum computing. The Duke team is currently working on ways to observe quantum dot spin. Harvard physics professor Charles Marcus separately achieved the same basic result, except with a distance of 1 mm between the quantum dots, each of them contained in a separate area of the electron sheet; the Harvard experiment relied on spin propagation through the electron sheet to achieve indirect entanglement, which Marcus says could be analogous to circuitry in semiconductors. IBM researcher David DiVincenzo says the two experiments have not met the stringent requirements necessary to define quantum dots because they used too many electrons, and argues that single-electron quantum dots would provide more accurate scientific results.
    Click Here to View Full Article
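    The entanglement the experiments aim to produce can be illustrated numerically: starting from two unentangled qubits, a Hadamard gate followed by a controlled-NOT yields the Bell state (|00> + |11>)/sqrt(2), whose measurement outcomes are perfectly correlated. This pure-Python sketch is a textbook illustration of the concept, not a model of the quantum-dot experiments themselves.

```python
import math

def apply(gate, state):
    """Multiply a gate matrix (list of rows) by a state vector."""
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

s = 1 / math.sqrt(2)
H1 = [[s, 0, s, 0],    # Hadamard on the first qubit, identity on the second
      [0, s, 0, s],
      [s, 0, -s, 0],
      [0, s, 0, -s]]
CNOT = [[1, 0, 0, 0],  # flip the second qubit when the first is |1>
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]   # |00>: two qubits, not yet entangled
bell = apply(CNOT, apply(H1, state))
print(bell)            # ≈ [0.707, 0.0, 0.0, 0.707]: the entangled Bell state
```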

  • "Developing Nations See Linux as a Savior From Microsoft's Grip"
    Los Angeles Times (08/09/04) P. A4; Chu, Henry; Magnier, Mark; Menn, Joseph

    Government officials in Brazil, China, and other developing countries are hoping to end their dependence on Microsoft by opting for free or low-cost software systems such as Linux. This migration away from proprietary software is being driven by economic, security, and ideological factors. Open-source products have no annual license fees, and therefore can help underserved segments of the populace afford computers and Internet access. Supporters also claim that open-source software is less vulnerable to security flaws, since the code is open to inspection and the huge community of open-source developers theoretically makes it easier to detect and deal with malware; meanwhile, government officials in Beijing are concerned that Microsoft may be installing "back doors" in its software so that U.S. agencies can spy on Chinese government users. Some governments are pushing for alternative software because of entrenched distrust of global capitalism. Vienna recently announced that it would encourage government staffers to switch from Windows to Linux or other alternatives on about 50 percent of the municipality's 16,000 workstations, while the city council of Munich decided back in June to migrate its 14,000 PCs to Linux. A major argument for the move from proprietary to open-source software is that it will not only save money, but also allow countries to better serve the interests of consumers as well as competitors. "Monopoly is not good for innovation, fair competition or end users," declared Li Wuqiang of China's Science and Technology Ministry. Microsoft is attempting to improve its standing by giving product discounts, offering to open up its source code for review, and donating hardware and software to schools and other institutions.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Nano, Bio Converge to Provide Key Nanotech Link"
    Small Times (08/06/04); Pescovitz, David

    Academic researchers are investigating new technologies and systems that can be created through the convergence of biology and nanotechnology. The goal of MIT's Synthetic Biology Working Group is to create a repository of compatible genetic components with specific uses that can be assembled into new complex systems; another way of describing synthetic biology is the rewiring of "genetic circuits" to build novel biological devices. MIT materials scientist Angela Belcher is building practical nanodevices out of evolved biological organisms, and has successfully engineered a virus that coats itself with semiconducting material and spans the gap between electrodes. MIT's BioBricks project involves "programming" DNA strands to configure themselves into nanoscale devices, while New York University chemist Nadrian Seeman reported in May that he had constructed a 10-nm-long DNA "robot" that moved along a minuscule track; making the device capable of carrying a metal atom is the next step. Meanwhile, UC Berkeley computer scientist Adam Arkin's BioSPICE effort aims to create a computer-assisted design tool for genetic circuits that fulfills a role similar to that of the Simulation Program with Integrated Circuit Emphasis (SPICE) software in integrated circuit design. "We want to actually program cells as if they're computers so they can do much more complicated tasks," Arkin explains. UC Berkeley researchers, along with counterparts at UC San Diego and elsewhere, are attempting to reverse-engineer diatoms so that their glass shell-making ability can be tapped to build precise nanostructures such as gears for micromachines, nanoscale test tubes, or customized shells. The National Science Foundation forecasts that the annual nanobiotechnology market will be worth $36 billion in two years.
    Click Here to View Full Article

  • "Jack of One Trade, Master of All"
    Ha'aretz (Israel) (08/06/04); Smooha, Shahar

    At the 12th World Computer Chess Championship held last month at Israel's Bar-Ilan University, human beings had little to do apart from feeding their computers the moves of competing machines and moving the pieces on the chessboards per their programs' instructions. Bar-Ilan professor Nathan Netanyahu observes that the game of chess has long been a preferred platform for testing machine intelligence, and notes its appeal to AI researchers: "Through chess one can develop ideas that derive from the game, such as searches through a large tree of possibilities, or in databases and systems of rules," he explains. Shredder, a chess-playing program written by German programmer Stefan Meyer-Kahlen, has won the title of world chess champion three times, but Meyer-Kahlen says that it has become more and more difficult to improve the program with each passing year. He admits, "If I introduce changes into the code and the program just keeps losing, often I don't know what went wrong, because I'm not good enough at chess to understand what's actually happening there." Meyer-Kahlen thinks that soon it will be impossible for even the greatest players to beat the leading chess programs. Chess programs have gotten smarter and their skill levels more comparable to human players' thanks to advances in processor speed and algorithm intelligence, along with databases provided by top chess players. Concurrently, chess programs have become able to run on more standard computers, to the point where a significant number of world championship contestants operate on simple systems. Experts agree that in a few years unbeatable chess programs will be running on garden-variety computers.
    Click Here to View Full Article
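    The "search through a large tree of possibilities" that Netanyahu mentions is, at its core, the minimax algorithm, sketched here on a tiny hand-built game tree; real chess programs add alpha-beta pruning, move ordering, and elaborate evaluation functions on top of this idea.

```python
def minimax(node, maximizing):
    """Leaves are numeric scores; inner nodes are lists of child subtrees."""
    if isinstance(node, (int, float)):
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Depth-2 tree: our move (maximize), then the opponent's reply (minimize).
tree = [[3, 12], [2, 4], [14, 5]]
print(minimax(tree, True))  # 5: the best branch once the opponent minimizes
```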

  • "Carnegie Mellon Develops Robot That Successfully Explored Gas Mains in NY"
    ScienceDaily (08/05/04)

    A wireless prototype robot designed to crawl through and examine subterranean gas mains has been developed by Carnegie Mellon University researchers funded by NASA, the Northeast Gas Association (NGA), and the U.S. Energy Department's National Energy Technology Laboratory (NETL). The untethered, remote-controlled robot is the product of over three years of research, and features a segmented body equipped with lights and fisheye cameras at the front and rear; the machine, dubbed Explorer, can send camera images of the pipe interior and other kinds of information to an operator based at a control van, while its wireless communication range and battery capacity determine how far the robot can travel. The robot can also turn 90 degrees in tees and elbows, an ability that will allow it to inspect live 6- and 8-inch gas mains for much longer distances from one point of entry than traditional push-rod cameras, according to NGA project manager George Vradis. Explorer was successfully deployed in Yonkers, N.Y., where it probed hundreds of feet of live, 114-year-old cast-iron gas mains. Hagen Schempf of Carnegie Mellon's Robotics Institute says the robot is only a small demonstration of the use of wireless inspection systems in places people cannot reach. "The implications to potential cost-savings for preventative maintenance, inspection and emergency response should not be overlooked by any utility that has to manage its underground infrastructure," he notes. Rodney J. Anderson of NETL expects the addition of advanced sensor technology to enable the robot to conduct precise, high-resolution inspections of all natural gas mains. "This kind of technology will be essential in years to come to control costs in utility operating budgets and may even expand to other applications outside of gas distribution," predicts Schempf.
    Click Here to View Full Article

  • "Security Expert Q&A: the Virus Writers Are Winning"
    Network World Fusion (08/04/04); Brown, Bob; Weinberg, Neal

    In an interview, F-Secure computer security expert Mikko Hypponen says the new Mydoom.M worm is inserting a distributed denial-of-service client in infected computers instead of the spam Trojan that Mydoom.A was carrying. He believes those behind this latest worm are also behind the other Mydoom worms, the Bagle worm, and perhaps others. Hypponen says that Mydoom.M does not appear to be making its authors any money, but previous operations created a huge network of "zombie" computers to use as spam proxies or free hosting servers, as well as to steal information. He thinks that offering bounties to catch virus writers is a good idea, as it will put pressure on the writers, but notes that they will always have something of an advantage because they have access to security products. He also argues that "much of the problem is that home computer users are infecting corporate networks by accident." Hypponen believes Internet service providers should educate their customers about the risks of going online, and predicts mobile devices will be seeing more viruses, which will create new security challenges. He says the Cabir Bluetooth proof-of-concept virus, for example, "is the first virus that spreads on proximity--if you are close to other Bluetooth devices you can spread the virus." Someone could take such an infected Bluetooth device on a subway, for example, and quickly transmit the virus to hundreds of other devices; Hypponen also is concerned about the open PocketPC platform and its susceptibility to viruses. He says the virus situation is getting worse, and although all of the 100,000 viruses released over the past 18 years have been cracked, "we might very well see a virus some day that we can't crack."
    Click Here to View Full Article

  • "Feds Seek a Few Good Hackers" (08/03/04); Brandt, Andrew

    The recent Defcon 12 hackers' conference included a recruitment presentation by federal law enforcement agents searching for talented people to work for the government. "The Department of Defense understands how important computers are to defending the United States, and is always on the lookout for good people," said Alvin Wallace, a supervisory special agent for the Air Force's Office of Special Investigations. The presentation was well received, with many in the twenty-something crowd taking business cards and asking questions about pay, security clearances, and college scholarships. Former National Security Agency director of information assurance Mike Jacobs spoke, urging hackers to help protect the United States from spies and terrorists. He said that when he worked at the agency, he would remind his colleagues that "the hacker community is probably our ally, and we need to pay attention to what they're doing out there." Some hackers may have trouble getting security clearances due to past misbehavior. Jim Christy, director of the Defense Department's Cyber Crime Center, says that the fight against terrorism has reduced security agency resources for cybercrime. The presenters noted that recruitment has to continue because employees tend to move into private industry. Wallace says his office provides "one of the best training grounds...Some of the best computer crime investigators in other federal agencies had their start in the Air Force Office of Special Investigations."
    Click Here to View Full Article

  • "XP: Lessons From the Front Lines"
    Software Development Times (08/01/04) No. 107, P. 29; George, Fred

    ThoughtWorks technical architect Fred George bases the value of Extreme Programming (XP) on real-life experiences, and concludes that, contrary to what some argue, the practice is no less productive than traditional programming. George writes that he has seen programmers become more centered as a result of being paired up, and the quality of their software also improves when they have someone to keep them focused on their jobs. The author notes that good programmers keep the delivery of the asked-for application foremost in their minds, and recommends that programmers harvest legacy artifacts--code, developers, etc.--that could prove valuable and help expedite delivery. George outlines the best way for the XP team to conduct daily meetings efficiently: They should be done standing up and concentrate on the delivery of the application; each team member should talk about yesterday's work and what they plan to work on today, while managers should try to rectify problems by fixing troublesome pairs instead of assigning blame. The author notes that when planning a new XP project with a new team, the question "Which practices do I really need?" comes up most frequently, and his answer is for the team to try all practices, as there is not enough knowledge at so early a stage to weed any out. Pair programming is the most discussed practice among managers, teams, and critics, and George is puzzled that managers often harbor doubts about it, given that pair programming supports cross-training, team growth, and the introduction of specialists. George writes that simple design is the most complex XP practice: Programmers' penchant for overdesigning and overimplementing must be reined in, while reworking designs to incorporate new requirements must be stress-free.
    Click Here to View Full Article
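    The test-first discipline that underpins XP's simple-design practice can be shown in miniature: write the smallest failing test, then the simplest code that passes it, resisting the urge to overdesign. The invoice function here is invented for illustration.

```python
import unittest

def invoice_total(line_items):
    """The simplest thing that could possibly work: sum quantity * price."""
    return sum(qty * price for qty, price in line_items)

class InvoiceTest(unittest.TestCase):
    def test_empty_invoice_is_zero(self):
        self.assertEqual(invoice_total([]), 0)

    def test_totals_line_items(self):
        self.assertEqual(invoice_total([(2, 5.0), (1, 3.0)]), 13.0)

if __name__ == "__main__":
    unittest.main(argv=["invoice_tests"], exit=False, verbosity=0)
```

New requirements would be added the same way: a new failing test first, then just enough design to pass it.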

  • "A Bright Spark"
    Defense News (08/02/04) Vol. 19, No. 30, P. 22; Walker, Karen

    The U.S. National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, by the end of 2004 plans to open a joint program office for a revolutionary new intelligence modeling and simulation system that promises to overhaul the way in which intelligence agencies share information. Still in an early prototype stage, the Future Intelligence Requirements Environment (FIRE) system has been fast-tracked to offer initial capability next year, with full capability coming in 2006. Original plans called for FIRE to be introduced starting in 2009, and development is mostly targeted at the intelligence needs and systems of 10 to 15 years hence. FIRE is designed to overcome the individual stovepipe systems of the intelligence community, and will have the ability to store and access data from the systems of the various agencies as well as from accumulated sensor data. Military officials plan to use the capability to perform "what if" scenarios for combat, while other government agencies want to use FIRE as a tool for playing out the possibilities of terrorist attacks. "This is a good example of a tough challenge, not from a technical point of view, but from getting all the agencies together to participate in something that could have so much significance as you look down the future road of the intelligence community," says Keith Hall, vice president of Booz Allen, a partner on the FIRE project. FIRE consists of a database that sources data from the various intelligence agencies, and intelligence analysts will use its suite of analysis and simulation tools when it comes time to make decisions.

  • "Tricky Business"
    Government Technology (07/04) Vol. 17, No. 7, P. 18; McKay, Jim

    Government agencies need better information systems to counter the terrorism threat, but current projects fail to address equally important privacy concerns, according to a new report from the Markle Foundation's Task Force on National Security in the Information Age. Programs such as Total Information Awareness (TIA), the Multistate Anti-Terrorism Information Exchange (MATRIX), and the Computer Assisted Passenger Prescreening System (CAPPS II) have each faced privacy concerns that either canceled or delayed the efforts. Software that sifts through records and flags suspicious behavior is far more worrying than subject-based data mining, where investigators start with a suspect and use the data systems to pull up links and connections, says Center for Democracy and Technology executive director Jim Dempsey, who also worked on the Markle Foundation task force. Some of these programs, such as CAPPS II and MATRIX, have valid watch-list and information-retrieval components, but they also include data mining techniques that would not be acceptable in investigations without computer assistance: The risk score generated by CAPPS II, for example, basically amounts to an anonymous tip, which normally must be accompanied by some evidence before agents can act upon it. Similarly, MATRIX was used in Florida to create a list of 120,000 terrorism suspects based solely on criteria fed into a data mining program. Electronic Frontier Foundation attorney Lee Tien says applying such technology to credit card fraud might make more sense, because authorities are able to create better profiles of that type of activity. Technology has preceded policy discussion in many cases, says former CIA Science and Technology deputy director and Analytic Services CEO Ruth David. She says developers need to be educated on privacy protections and focus on solutions that provide overall process efficiency, not just better security.
    Click Here to View Full Article
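    The distinction Dempsey draws can be made concrete: subject-based analysis starts from a known suspect and expands outward through recorded links, rather than scoring an entire population. This sketch uses a tiny invented contact graph.

```python
from collections import deque

links = {                      # hypothetical call/transaction records
    "suspect": ["alice", "bob"],
    "alice": ["carol"],
    "bob": [],
    "carol": ["dave"],
    "zed": ["alice"],          # zed links in, but is never reached outward
}

def expand(start, hops):
    """Breadth-first expansion: everyone within `hops` links of the start."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth == hops:
            continue
        for contact in links.get(person, []):
            if contact not in seen:
                seen.add(contact)
                frontier.append((contact, depth + 1))
    return seen

print(sorted(expand("suspect", 2)))  # ['alice', 'bob', 'carol', 'suspect']
```

Pattern-based mining, by contrast, would score every name in `links` against a behavioral profile, which is what raises the anonymous-tip concern above.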

  • "Seamless Mobile Computing on Fixed Infrastructure"
    Computer (07/04) Vol. 37, No. 7, P. 65; Kozuch, Michael; Satyanarayanan, Mahadev; Bressoud, Thomas

    The pervasive installation of public-access computers in a wide range of locations is envisioned by Intel Research's Michael Kozuch, Carnegie Mellon University's Mahadev Satyanarayanan, et al. as a major step toward seamless mobile computing: In this scheme, a computer acquires a user's personal customization and state only when he starts to use the device, and this information is lost as soon as the user terminates his session with the machine. The customization and state acquisition process must be accurate and almost instantaneous in order for it to have wide user appeal, while the management and system administration costs of deploying so many machines must be low enough to support the viability of the business model. Addressing these challenges is the purpose of Internet Suspend/Resume (ISR), a pervasive computing technology that uses hardware virtualization and file caching to quickly personalize and depersonalize anonymous hardware for transient use. ISR employs a distributed file system to streamline site management and reduce the skill required to administer ISR devices or set up new ones. This file system is layered with virtual machines (VMs), each of which encapsulates distinct execution and user customization state. The authors chose the Coda distributed file system for ISR because Coda caches can hold the whole VM state, while advance knowledge of resume machines can be leveraged through a clean interface supplied by Coda's support for hoarding; moreover, the system's support for disconnected or weakly connected operation increases resilience against various failures and adverse network conditions, and experimentation is eased by Coda's user-space implementation. Resume latency in ISR can be lowered by propagating dirty state to servers prior to suspend, warming the file cache before arrival at the resume device, and allowing the user to resume before the full state arrives. With these steps, ISR resume latency is expected to be comparable to the average delay experienced when opening a laptop.
    Click Here to View Full Article
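    The latency optimizations described above can be modeled in a few lines: trickling dirty state to the server before suspend and warming the resume machine's cache in advance leave little to transfer when the user actually resumes. Block names and versions here are invented for illustration.

```python
def trickle_dirty(vm_state, server):
    """Push modified blocks to the server while the user is still working."""
    server.update(vm_state)

def warm_cache(server, cache):
    """Pre-fetch the server's copy onto the machine the user will resume on."""
    cache.update(server)

def resume_cost(vm_state, cache):
    """Blocks that must still cross the network when the user hits resume."""
    return [blk for blk, ver in vm_state.items() if cache.get(blk) != ver]

vm_state = {"disk0": 7, "disk1": 3, "memory": 12}  # block -> version
server, cache = {}, {}

print(len(resume_cost(vm_state, cache)))  # 3: a cold resume fetches everything
trickle_dirty(vm_state, server)
warm_cache(server, cache)
print(len(resume_cost(vm_state, cache)))  # 0: warm resume is nearly instant
```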