ACM TechNews is sponsored by Parasoft.    Automated Software Analysis by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 720:  Wednesday, November 17, 2004

  • "The Real Problem With Voting"
    Technology Review (11/17/04); Selker, Ted

    The recent presidential election was plagued by irregularities and potential vulnerability to fraud, but not because of the technology used, writes Ted Selker, co-director of the Caltech/MIT Voting Technology Project (VTP). On Nov. 2, voters in the Boston area used optical-scan ballots that the VTP had recommended as the most reliable and secure method for collecting votes; yet even with what was judged the best available technology, the Boston vote was still compromised by poll workers who had received too little training or sidestepped important procedures. One worker handled ballots without the required monitor present to provide accountability, and pencils and erasers were used instead of pens during the shutdown procedure. Finally, the balloting module, the equipment storing the electronic records, and the ballots themselves were all transported to election headquarters together by a single, unescorted police officer, squandering the optical-scan system's key advantage: multiple independent records to fall back on if anything had gone wrong. Other recent elections have likewise demonstrated that worker training and strict adherence to voting procedures trump technological solutions: Nevada's Sept. 7, 2004 election used Sequoia's direct-record electronic voting system with verifiable paper receipts, but some voters were wrongly given provisional ballots because poll workers mistyped a code. Another worker did not know how to rethread a paper roll in the receipt printer and resorted to cutting out a portion of the paper tape. Election officials and other concerned groups should consider the holistic needs of a secure election, not just the technology, writes Selker.
    Click Here to View Full Article

  • "IPv6 Shooting for the Moon"
    InternetNews.com (11/16/04); Kerner, Sean Michael

    The approximately 20-year-old IPv4 protocol suffers from a dearth of addresses brought on by the growth of the Internet, which is why officials have been clamoring for a replacement. The next-generation IPv6 protocol was designed to fill the void, and researchers say the protocol recently completed the third phase of its testing across the Moonv6 multivendor network. The tests, which began at the University of New Hampshire's InterOperability Laboratory (UNH-IOL) and wrapped up at the U.S. Defense Department's Joint Interoperability Test Command in Arizona, focused on functionality and implementation scenarios that included wired and wireless local area networks, firewalls, voice over IP, the domain name system, and the dynamic host configuration protocol. Participating vendors included AT&T, Hewlett-Packard, Sun Microsystems, Juniper, Nortel, Cisco, and Microsoft. UNH-IOL technical manager Erica Williamsen says the Phase 3 testing demonstrates that the IPv6 infrastructure is sound; it is now up to vendors to refine their implementations and interoperability, with adoption in the hands of service providers. Internet Software Consortium founder Paul Vixie explains that enterprises and end users have little incentive to adopt IPv6 because they have successfully skirted the shortage of IPv4 address space by deploying Network Address Translation and using private (RFC1918) addresses internally. "This was painful, and it was wrong, but it's now common practice, and it's 'good enough' that paying more money for upgrades and training to get IPv6 has no business case in 2004," Vixie points out. He says the communications and infrastructure communities are driving IPv6 adoption, and he expects enterprises and end users to tag along.
    Click Here to View Full Article
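
    The address-space gap Vixie describes can be made concrete with a small, hedged sketch using Python's standard ipaddress module; the sample address below is arbitrary and is not drawn from the Moonv6 test network.

```python
# A minimal sketch of the IPv4/IPv6 contrast discussed above, using Python's
# standard ipaddress module. The sample address is an arbitrary illustration.
import ipaddress

# RFC1918 private IPv4 blocks -- the space that NAT typically hides behind a
# single public address.
rfc1918 = [ipaddress.ip_network(n)
           for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

addr = ipaddress.ip_address("192.168.1.10")
print(any(addr in net for net in rfc1918))   # True: private, not globally routable

# The scale difference that motivates IPv6:
print(f"IPv4 addresses: {2**32:,}")          # about 4.3 billion
print(f"IPv6 addresses: {2**128:,}")         # about 3.4 x 10**38
```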

  • "Bio-Inspired Modules Open New Horizons for Robotics"
    IST Results (11/17/04)

    The IST program-funded HYDRA project has created the world's first modular shape-shifting robot, a significant breakthrough in robotics and artificial intelligence as well as a notable step toward versatile, self-configuring machines. Denmark's Maersk Institute, LEGO, the University of Zurich, and the University of Edinburgh jointly developed the ATRON and HYDRON classes of spherical modules, which are capable of autonomous operation, communications between one another, and self-configuration into practically any shape or function. The HYDRON units were designed for use in a fluid medium, while the ATRON units were developed for use on land. "We based the design on the way biological cells interact, how they move, die and reconstruct themselves, and we emulated that in the modules, which are essentially building blocks for robotic devices that look very much like a string of atoms or cells when connected together," explains Maersk Institute professor Henrik Hautop Lund. A typical ATRON unit consists of a north and south hemisphere that can rotate around the equator, and each hemisphere is outfitted with a set of male and female connectors that allow the modules to link together and function as a whole; built-in infrared sensors make the modules aware of each other's presence, while infrared transmitters and receivers enable orders to be relayed to the modules and effect intermodule communication. Autonomous operation is facilitated by an onboard multiprocessor computer, while additional sensors gauge movement, speed of rotation, and degrees of tilt. Lund sees short-term potential for ATRON-based robots in entertainment and technology, while longer-term possibilities include nanoscale medical applications, space exploration, and hazardous environment inspection.
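
    As a purely illustrative aside, the module anatomy described above (two hemispheres rotating about an equator, connectors, and infrared neighbor sensing) can be sketched as a small data structure; the class and method names below are invented for illustration and do not come from the HYDRA project's software.

```python
# Illustrative sketch only: a toy model of the described ATRON module anatomy.
# Names and behavior are invented here, not taken from the HYDRA project.
from dataclasses import dataclass, field

@dataclass
class ATRONModule:
    module_id: int
    rotation_deg: float = 0.0                    # north hemisphere's angle about the equator
    neighbors: set = field(default_factory=set)  # module IDs sensed over infrared

    def rotate(self, degrees: float) -> None:
        """Rotate the north hemisphere relative to the south hemisphere."""
        self.rotation_deg = (self.rotation_deg + degrees) % 360.0

    def sense_neighbor(self, other: "ATRONModule") -> None:
        """Model infrared proximity sensing between two adjacent modules."""
        self.neighbors.add(other.module_id)
        other.neighbors.add(self.module_id)

a, b = ATRONModule(1), ATRONModule(2)
a.sense_neighbor(b)
a.rotate(90)
print(a)   # module 1 rotated 90 degrees, aware of module 2
```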

  • "Unused PC Power to Run Grid for Unraveling Disease"
    New York Times (11/16/04) P. C6; Lohr, Steve

    The World Community Grid is a collaborative venture between IBM, the United Nations, the World Health Organization, and other entities to harness idle PC power to decipher the genetic puzzles of diseases such as AIDS, cancer, and Alzheimer's. Dr. Eric Jakobsson with the National Institutes of Health's Biomedical Information Science and Technology Initiative calls the program "both a sizable commitment of computing resources and an encouraging sign of progress in moving toward a community model for biomedical computing." The project's success hinges on millions of volunteers consenting to donate their PCs' unused computational capability. Participants will download software from the www.worldcommunitygrid.com Web site that allows a PC to contribute to the grid when the machine is on but not in use. Medicine and biology are regarded as model areas for distributed computing, since more and more computers are being employed in the search for genetic disease markers and the underlying mechanisms of life. Scientists must resolve to keep their research and software tools within the public domain if they are to use the community grid, and the program's 17-member advisory board will determine what problems the grid will address. The grid's first problem will be identifying all the proteins in the human body and their roles through the Institute for Systems Biology's Human Proteome Folding Project initiative, which will involve the computation of how new genes fold into proteins and then matching those shapes against a 3D protein database. "The hope is that the World Community Grid project can expand the impact of this kind of computing to a much broader set of applications," notes Argonne National Laboratory computer scientist Ian Foster.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
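
    The donate-idle-cycles model described above can be sketched conceptually as a fetch/compute/return loop; the functions below are placeholders invented for illustration and are not the actual World Community Grid client or its protocol.

```python
# Conceptual sketch of a volunteer-computing client loop. All functions are
# placeholders for illustration, not the real World Community Grid software.

def machine_is_idle() -> bool:
    # Placeholder: a real client would check CPU load and user activity.
    return True

def fetch_work_unit():
    # Placeholder: a real client downloads a work unit from the project server.
    return {"id": 42, "payload": list(range(10))}

def compute(work_unit):
    # Stand-in for the real science code (e.g., a protein-folding computation).
    return sum(work_unit["payload"])

def upload_result(work_unit, result):
    print(f"work unit {work_unit['id']} -> result {result}")

while machine_is_idle():
    unit = fetch_work_unit()
    upload_result(unit, compute(unit))
    break  # single iteration for the sketch; a real client loops indefinitely
```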

  • "Getting a Handle on Data"
    Financial Times-IT Review (11/17/04) P. 1; Perkin, Julian

    The disconnected links, outdated information, and other inconveniences that most individual users grudgingly put up with to access the content of the Internet can be a serious detriment for governments and businesses: Such annoyances can prevent governments from delivering key services and information to citizens, while businesses can lose customers because of the shifting format and location of online data. One possible solution combines two related technologies, the Handle System and digital object identifiers (DOIs). The Handle System is a scheme that assigns, manages, and resolves permanent identifiers or "handles" for online content, and it also provides the basis for DOIs, a standard technique for identifying published digital material. Unlike traditional Web links, which reference the location of content, DOIs identify the content itself. Because handle-based identifiers are persistent and globally unique, they ensure that references resolve to actual documents and that all users receive the same, definitive versions of those documents. DOIs can be embedded within content such as images and video, so references to an object are retained even if the object's location changes; potential applications of this approach include policing access to copyrighted content or tracking duplicates of it. The ability to identify portions of content rather than complete documents could also open new commercial opportunities for information providers and could support public disclosure of documents while keeping sensitive information safe from prying eyes. Trends fueling the need for governments to revise their data management tactics include anti-terrorism imperatives, insufficient information exchange between law enforcement and social security agencies (which calls for greater cooperation between government departments), and growing international support for and acceptance of freedom of information laws.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)
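
    A minimal sketch of what persistent identification looks like in practice, assuming network access and using the public doi.org proxy of the Handle System: the identifier names the content, and the proxy maps it to the content's current location. The sample DOI 10.1000/182 is the commonly cited example identifier for the DOI Handbook; substitute any DOI of interest.

```python
# Hedged sketch: resolve a DOI through the public doi.org Handle System proxy.
# Requires network access; 10.1000/182 is the commonly used example DOI.
import urllib.request

doi = "10.1000/182"
req = urllib.request.Request(f"https://doi.org/{doi}", method="HEAD")
with urllib.request.urlopen(req) as resp:
    # urllib follows the proxy's redirect, so resp.url is the document's
    # current location even if the publisher has moved it.
    print(f"{doi} currently resolves to {resp.url}")
```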

  • "Yale Teams Test Skills in Computer Programming Competition"
    Yale Daily News (11/16/04); Duboff, Josh

    Among the 67 teams participating in this past weekend's ACM Regional Collegiate Programming Contest for the Greater New York Region were four teams of Yale undergraduates, which placed 13th, 17th, 27th, and 29th. The competitors were required to solve a series of computer programming problems, which remained the same regardless of the team members' ages. Yale computer science professor and team coach Stan Eisenstat said the teams' performance met his expectations, although he noted that "we didn't do super well." Yale student Ruben Grinberg attributed his team's poor showing to a lack of preparation and practice, while freshman Brad Galiette said his team was thrown for a loop upon learning that they would be using one computer in the contest instead of the two they had originally expected. In addition, the last-minute withdrawal of a senior team member left the sophomore team shorthanded. Still, Eisenstat said the freshman and sophomore teams, which finished in 17th and 13th place, respectively, did well and may rank among the top teams for their age groups. Grinberg said the team's performance was not representative of the quality of Yale's computer science program. Eisenstat noted that some rival schools take the contest more seriously, requiring students to prepare through preliminary tests and practice sessions.
    Click Here to View Full Article

  • "High and Dry at High End?"
    EE Times (11/12/04); Merritt, Rick; Mokhoff, Nicolas

    Supercomputing experts celebrated new advances in both custom-built and off-the-shelf supercomputing machines at the Supercomputing 2004 conference in Pittsburgh, but also sounded warnings about the lack of fresh ideas in the field. Cluster supercomputers built from off-the-shelf components now constitute the majority of the Top 500 supercomputing list, including the SGI Columbia machine that recently took second place on that list at 51.87 teraflops. The National Research Council, however, warned that low-cost clusters will eventually be stymied by growing memory and network latencies. "We are extrapolating that by 2020, a computer node can execute a million instructions in the time it takes to communicate with another node," said University of Illinois at Urbana-Champaign professor Marc Snir. The council also said the U.S. federal government needs to work with industry and academia on a supercomputer road map that will help buyers make purchases with future technology in mind, and it advocated more resources for custom-designed supercomputing hardware and architectures. Custom-built machines have scored gains recently: IBM's BlueGene/L supercomputer won the top spot in the rankings with 70.72 teraflops, displacing Japan's custom-built NEC Earth Simulator, which had held first place for five consecutive editions of the Top 500 list. BlueGene/L has so far been outfitted with only 32,000 PowerPC 440 processors, but will eventually have 128,000 of the chips when delivered to Lawrence Livermore National Laboratory. Department of Defense science and technology official Charles Holland said benchmarking also needs improvement, and suggested broader use of the High Performance Linpack test, which measures both the speed and the accuracy of calculations.
    Click Here to View Full Article
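
    Snir's extrapolation is easy to reproduce as back-of-the-envelope arithmetic; the node performance and latency figures below are illustrative assumptions, not numbers taken from the article or the National Research Council report.

```python
# Back-of-the-envelope arithmetic behind the latency warning quoted above.
# Both figures are illustrative assumptions for the sake of the calculation.
instructions_per_second = 1e12   # assume a future node sustaining 1 teraop/s
network_latency_seconds = 1e-6   # assume roughly 1 microsecond inter-node latency

wasted = instructions_per_second * network_latency_seconds
print(f"Instructions a node could retire while waiting on one message: {wasted:,.0f}")
# ~1,000,000 -- the "million instructions per communication" gap Snir describes.
```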

  • "Computer Design on Mind of Author"
    The Lantern (Ohio State University) (11/12/04); Nelson, Ahleeya R.

    In a Nov. 10 lecture at Ohio State University, computer interface design researcher and author Ben Shneiderman said that people need to concentrate more on a computer's lifestyle-enhancing applications than on its technical capabilities, with an emphasis on increasing the machine's reliability, usefulness, and universality. In his book "Leonardo's Laptop: Human Needs and the New Computing Technologies," Shneiderman attempts to outline a model for an optimal computer interface design that works for users regardless of their demographics. He said the Bush administration has given little consideration to the link between regular computer use and income level, and argued that economics and education should be given a greater role in the computer design process. Shneiderman queried approximately 50 students and 50 professional computer users about the most frustrating aspects of using computers, and 46% cited a loss of productivity due to frequent computer crashes and automatic shut-offs. Among his proposals for developing more useful computers was the creation of a computer design template structured like the Periodic Table of Elements. Shneiderman concluded his lecture by recommending that the audience "get to work" on altering people's perception of computers. OSU computer science graduate student Peter Chang remarked that Shneiderman's lecture made him realize that computer scientists devote more time to features and computer speed than to design.
    Click Here to View Full Article

  • "Software Sorts Out Subjectivity"
    Technology Research News (11/24/04); Patch, Kimberly

    Classifying information by studying text is a simple matter for people, who understand the meaning of words; however, the process is a major challenge for computers, which usually focus on identifying words, phrases, and patterns that signify sentiment. Cornell University researchers have developed a technique that improves a computer's accuracy in classifying the sentiment contained in text by labeling sentences as subjective or objective based on context rather than meaning, thus eliminating neutral sentences before the sentiment is judged. Cornell computer science professor Lillian Lee says the method represents text as a network or graph, with sentences represented as nodes. Contextual information between each pair of sentence nodes is modeled by inserting a link whose strength reflects how strongly the two sentences deserve the same subjective or objective label, using cues such as how close the sentences are to each other in the text, whether they are separated by a paragraph boundary, and whether certain words indicate subjectivity or objectivity. The sentences are then clustered into subjective and objective groups according to the strength of these links, and pattern recognition is used to categorize the documents as positive or negative based on the segments labeled as subjective. Lee says the technique boosts sentiment classification performance from 82.8% to 86.4%, and posits that the method could be employed to automate support for review-aggregator Web sites, track changes in attitudes toward specific subjects, glean business intelligence, or filter search results. The Cornell team's research was funded by the National Science Foundation, among others, and Lee says the researchers' longer-term goals include devising techniques that take language variations into consideration and ultimately handle rhetorical devices such as sarcasm and irony.
    Click Here to View Full Article
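
    A minimal sketch of the graph-cut idea behind the Cornell technique, using the open-source networkx library: per-sentence subjectivity scores become source/sink edge capacities, pairwise association strengths become inter-sentence edges, and a minimum cut separates subjective from objective sentences. The scores below are made up for illustration; in the actual system they come from trained classifiers and proximity cues.

```python
# Minimal sketch of subjectivity detection as a minimum cut, using networkx.
# All scores are invented for illustration.
import networkx as nx

# Individual evidence: (subjective-looking, objective-looking) score per sentence.
sentences = {
    "s1": (0.9, 0.1),   # e.g. "I loved every minute of it."
    "s2": (0.2, 0.8),   # e.g. "The film runs 120 minutes."
    "s3": (0.7, 0.3),   # e.g. "The pacing felt sluggish."
}
# Contextual evidence: nearby sentences tend to share a label.
associations = [("s1", "s2", 0.1), ("s2", "s3", 0.1)]

G = nx.DiGraph()
for s, (subj, obj) in sentences.items():
    G.add_edge("SOURCE", s, capacity=subj)   # cost of calling s objective
    G.add_edge(s, "SINK", capacity=obj)      # cost of calling s subjective
for a, b, strength in associations:
    G.add_edge(a, b, capacity=strength)      # cost of separating a and b
    G.add_edge(b, a, capacity=strength)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "SOURCE", "SINK")
print("subjective:", sorted(source_side - {"SOURCE"}))   # expected: ['s1', 's3']
print("objective:", sorted(sink_side - {"SINK"}))        # expected: ['s2']
```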

  • "Specialty Majors Are the Rage on Some Campuses"
    NorthJersey.com (11/15/04); Adler, Jessica

    Offshore outsourcing and a listless U.S. economy are encouraging more students to pursue specialty majors such as video game development, casino studies, homeland security, and sports sales in the hopes that they will lead to lucrative careers. Students are adopting lessons outlined in "The College Majors Handbook," which states that graduates generally command much higher wages in jobs closely related to their major than they do in unrelated jobs. Bloomfield College professor Roger E. Pedersen, who offers a game design major, explains that the skills students are picking up apply not just to games, but also to films, TV advertising, and Web applications that utilize the same programs. The gaming/casino major offered by Morrisville State College in New York includes emphasis on facial-recognition software and habit-tracking software. University of Denver professor Scott Leutenegger has co-launched a video game development major with a traditional computer science component, and he believes such a strategy can make computer science more interesting and challenging to students, which could perhaps help mitigate a shortage of computer scientists projected within the next five years. The cost of education is another factor driving students toward specialty majors, while still another is the high value accorded to college degrees. "College Majors Handbook" co-author Paul Harrington, a professor at Northeastern University, reports that students with bachelor's degrees were earning 66% more money than high school graduates in 2000, up from between 15% and 17% three decades earlier.

  • "Researchers Create Free, Downloadable Software Radio Design Tool"
    Newswise (11/16/04)

    Open Source Software Communication Architecture Implementation: Embedded (OSSIE) software developed by Virginia Tech's Mobile and Portable Radio Research Group (MPRG) is free for download at www.mprg.org/research/ossie, and deputy director of the MPRG Jeffrey Reed reports that many universities and companies around the world have already taken advantage of the software's availability. OSSIE is a software environment written in C++ that is interoperable with general-purpose software-defined radio equipment developed by the U.S. Defense Department's Joint Tactical Radio System. OSSIE was originally devised by MPRG post-doctoral fellow Max Robert and a team of researchers as a software radio research tool underwritten by the Office of the Director of the CIA, but Robert and Reed determined that the software could be employed by other researchers to develop software radios. Furthermore, sharing the software with other researchers would enlarge the aggregated pool of knowledge for the creation of diverse practical software radios. Software radios differ from conventional radios in that their software, not their hardware, defines their signal-processing capability. This approach facilitates greater functionality and sets up communications with a variety of other devices. "Offering OSSIE as an open source tool over the Internet will speed up growth of the technology and make faster innovations possible," notes Robert. "This will benefit all wireless researchers who are working to develop software radios."
    Click Here to View Full Article
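
    The point that software, not hardware, defines a software radio's signal processing can be illustrated with a generic textbook example written in Python/NumPy: a BPSK modulator and demodulator expressed as ordinary, swappable array code. This sketch is not part of OSSIE, which the article describes as a C++ environment.

```python
# Generic illustration of signal processing defined entirely in software:
# a BPSK link whose modulation and demodulation are just code.
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=16)

# Modulate: map bits {0, 1} to symbols {-1, +1}; in a software radio this stage
# is simply a function that could be swapped for QPSK, FM, etc.
symbols = 2 * bits - 1

# Channel: add a little Gaussian noise.
received = symbols + 0.3 * rng.standard_normal(symbols.shape)

# Demodulate: a threshold decision, again just software.
decoded = (received > 0).astype(int)

print("bit errors:", int(np.sum(decoded != bits)))
```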

  • "As Wireless Networks Spread, Video Transmissions Taking Root"
    Investor's Business Daily (11/16/04) P. A7; Korzeniowski, Paul

    The proliferation of wireless networks has left many users wanting video applications, and standards bodies such as the IEEE are working on protocols to make that happen. The latest 802.11g version of Wi-Fi boasts 54Mbps transmissions--enough to pipe video streams to remote users--but quality-of-service features that guarantee consistent bandwidth for video applications are still needed. Regular Internet infrastructure handles traffic for optimal throughput, but not necessarily for the constant stream of data needed to ensure smooth video and audio transmission. A developing IEEE standard prioritizes multimedia traffic so that other bandwidth-hungry applications do not disrupt it. Wireless video applications have great appeal for businesses, which already benefit from allowing their workers to access information and applications remotely and from reducing travel expenses through videoconference technology; video-enabled wireless networks would also mean fewer ISDN lines or WLAN links. Meanwhile, consumers are eager to link their entertainment devices--TVs, game consoles, and stereos--to their computers and the Internet without having to deal with wires. Cable and telecom companies are eyeing opportunities for multimedia downloads, such as video files that users could watch when and where they like. Several roadblocks remain, however: Many corporate firewalls make it difficult for inside users to receive broadcasts from outside the firewall, and high prices for wireless receivers and video encoding devices could prevent mainstream consumer use.
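
    The traffic-prioritization idea can be sketched as a toy strict-priority scheduler; this is an illustration of the general concept only, not the mechanism in the developing IEEE standard mentioned above.

```python
# Toy strict-priority scheduler: queued video frames always go out before
# best-effort data. Conceptual illustration only.
import heapq
import itertools

VIDEO, BEST_EFFORT = 0, 1       # lower number = higher priority
counter = itertools.count()     # tie-breaker preserves arrival order
queue = []

def enqueue(priority, packet):
    heapq.heappush(queue, (priority, next(counter), packet))

enqueue(BEST_EFFORT, "file-transfer chunk 1")
enqueue(VIDEO, "video frame 1")
enqueue(BEST_EFFORT, "file-transfer chunk 2")
enqueue(VIDEO, "video frame 2")

while queue:
    _, _, packet = heapq.heappop(queue)
    print("transmit:", packet)   # both video frames are sent first
```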

  • "NSF Grants for Research of Wireless, Optical Networks"
    University of Texas at Dallas (11/15/04)

    University of Texas at Dallas engineering professors have received $950,000 in funds from the National Science Foundation (NSF) in order to develop cooperative wireless networks and optical networking technology. Associate electrical engineering professor Aria Nosratinia is the principal investigator for cooperative wireless networking technology that would see algorithms and protocols embedded in a wide range of wireless devices, enabling them to securely relay data signals. With third-party devices acting as in-between nodes, wireless networks would become much more robust, faster, and require less power, says Nosratinia. The University of Texas team is trying to resolve fairness issues with such a scheme, making sure privacy is ensured and battery usage is not a concern, for example. The second NSF grant goes toward a joint optical communications project between the University of Texas at Dallas, the University of Kansas, and the University of Guelph in Ontario, Canada. The team is researching an optical equivalent to Ethernet switches, which would essentially enable plug-and-play optical local area networks with much higher bandwidth and transmission speeds than allowed by regular technology. A core challenge is developing a micro-optical spectrum analyzer that would analyze network connections and traffic and send data over the best route; existing analyzers are too large for the switch devices the team wants, so they are working to build one at a smaller size. "The beauty of the node would be its simplicity--you won't have to be an engineer to set up your own fiber-optic network in an office or other environment," said University of Texas associate professor Andrea Fumagalli.
    Click Here to View Full Article
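
    One intuition behind cooperative relaying can be shown with simple arithmetic: because radio path loss grows faster than linearly with distance, two short hops through a relay can cost less transmit energy than one long hop. The distances and path-loss exponent below are illustrative assumptions, not figures from the research.

```python
# Toy illustration of why relaying can save transmit power.
# With a path-loss exponent of 2 or more, two short hops cost less energy
# than one long hop. All numbers are illustrative assumptions.
alpha = 3.0          # assumed path-loss exponent (typically 2-4 indoors)
d_direct = 100.0     # meters, source to destination
d_hop = 50.0         # meters, via a relay halfway in between

direct_cost = d_direct ** alpha
relayed_cost = 2 * (d_hop ** alpha)
print(f"direct: {direct_cost:,.0f}  relayed: {relayed_cost:,.0f}")
print(f"relaying uses ~{relayed_cost / direct_cost:.0%} of the direct-path energy")
```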

  • "EPA Builds a Better Search"
    Federal Computer Week (11/15/04) Vol. 18, No. 39, P. 63; Perera, David

    The EPA has enhanced keyword searching on its Web pages by incorporating metadata standards into its search engine, and now members of the Categorization of Government Information Working Group want similar search capabilities for the entire federal government. The working group, a subcommittee of the Interagency Committee on Government Information created by the E-Government Act of 2002, has recommended that the government adopt a similar strategy based on a standardized metadata scheme using uniform resource names (URNs). Unique identifiers would be assigned to policy documents, Web sites, photos, maps, and other digital material. The group members' larger goal is to make government information permanently available in digital formats, since the future of individual government agencies is not always certain. The search engine that the EPA modified ranks documents according to data stored in metadata fields, prioritizing results in descending order depending on whether the search query term appears in a document's subject, title, description, or body text. Before the change, "The relevancy ranking of our search engine couldn't really say, 'Here's a general thing about water quality that could get you started,'" explains Richard Huffine, program manager for the agency's National Library Network, who wrote part of the draft recommendations. URNs coupled with a standardized metadata scheme provide new ways to analyze data, says James Erwin, the Defense Technical Information Center's director of science and technology and lead author of the working group's URN proposals. However, working group members still must decide which types of information will get URNs.
    Click Here to View Full Article
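
    The field-weighted ranking the EPA describes can be sketched as follows; the field weights and sample records are invented for illustration and are not the agency's actual configuration.

```python
# Sketch of field-weighted ranking: a hit in a higher-priority metadata field
# outranks a hit in the body text. Weights and records are invented examples.
FIELD_WEIGHTS = {"subject": 4, "title": 3, "description": 2, "text": 1}

documents = [
    {"title": "Water quality reports", "subject": "water", "description": "", "text": ""},
    {"title": "Annual budget", "subject": "finance", "description": "",
     "text": "mentions water quality once"},
]

def score(doc, term):
    # Highest-weighted field that contains the query term, else 0.
    return max((w for f, w in FIELD_WEIGHTS.items() if term in doc.get(f, "").lower()),
               default=0)

query = "water"
for doc in sorted(documents, key=lambda d: score(d, query), reverse=True):
    print(score(doc, query), doc["title"])
```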

  • "Virtualization: One Box, Many Servers"
    InfoWorld (11/08/04) Vol. 26, No. 45, P. 40; Yager, Tom

    Server virtualization, in which a standalone computer's behavior and capabilities are imitated by software, yields actual benefits for enterprise data-center administrators that include cost reductions through server hardware consolidation, dynamic allocation of resources on an as-needed basis, testing and debugging in controlled environments, simpler management of heterogeneous resources, acceleration of the new systems provisioning process, and the isolation of overall system health from application or operating system failures. Virtualization involves a host consisting of two lower layers in the software stack: A single instance of a garden-variety OS installed directly onto the server hardware, and a virtualization layer performing the redirection and emulation that comprises the virtual computer. Operating systems and applications running on virtual servers do not directly control hard drives, memory, and other resources; instead, the virtual machine running below the OS and applications intercepts requests for interaction with hardware and accommodates them as it deems proper. Virtualization products allow administrators to construct customized server configurations that align precisely with application requirements. Virtualization spares users the burden of repeatedly installing the OS and software on physical servers by saving a customized server's disk image to a file so that users can employ it as a model for other guest systems. Virtualized servers are susceptible to the same bugs and hang-ups as regular servers, but these problems will not impact the hardware. Virtualized PCs are not invulnerable, but luckily the virtual PC's disk image can be overwritten with a clean image. Furthermore, most virtualized systems incorporate solution-specific management software, enabling an administrator to control all enterprise virtual servers from a central terminal.
    Click Here to View Full Article

  • "Display Technology Leaps to the Next Generation"
    Military & Aerospace Electronics (10/04) Vol. 15, No. 10, P. 28; Ames, Ben

    The coming of the networked battlefield era has necessitated the widescale deployment of liquid-crystal displays (LCDs) that deliver high resolution, readability in sunlight, power efficiency, ruggedness, light weight, and low cost. But though LCDs are expected to sustain their dominance over military and aerospace applications thanks to forthcoming advances, organic light-emitting diode (OLED) displays and other future products promise to give them a run for their money. The shortcomings of commercial displays that the military relies so heavily on include a lack of rugged performance, and sizes and shapes that do not align well with consoles and cockpits; the U.S. Display Consortium aims to overcome these problems by investing in the development of new technology. For instance, the USDC has awarded General Electric Global Research a grant to devise a barrier film to better protect OLED displays from oxygen and moisture, while another grant to General Dynamics Canada and Interface Displays and Controls supports work to reduce the size and shape of active-matrix LCDs to make them more configurable for military and aerospace platforms. Arizona State University professor Greg Raupp says the Army Flexible Display Center was set up to help provide "ubiquitous, conformal, and flexible displays that are lightweight, rugged, low power and low cost" to future soldiers, and adds that the displays will be combined with global positioning, communications, and computation subsystems to facilitate enhanced situational awareness, effectiveness, and survivability among troops. Enhancements to light-emitting diode technology have led to new products such as the lightweight, low-cost Eye HUD head-mounted display from Rockwell Collins Government Systems, which can be attached to standard night-vision goggles donned by helicopter and combat-support aircraft pilots. Another developing technology of interest to the military is Kopin's Multidomain Vertical Alignment microdisplays, which boast a nanopixel technology-enabled color-filtering technique that will be applied to night-vision goggles and head-mounted units that relay battlefield data.
    Click Here to View Full Article

  • "Computing at the Speed of Light"
    Scientific American (11/04) Vol. 291, No. 5, P. 80; Gibbs, W. Wayt

    The ever-widening gap between microprocessor performance and memory access is expected to spur the replacement of copper wiring with photonic connections over the next 10 years. Driving this transition is the development of a diversity of photonic devices that could be mass-produced in the same facilities used to fabricate inexpensive microchips. A great deal of optical computing research has emphasized CMOS-compatible techniques for blending electronics and photonics, which has yielded at least three approaches. Hybrid integration pairs III-V microchips with logic-bearing silicon chips; these disparate components can be built separately and integrated later, but faster microchip speeds mean greater heat output, a factor that may relegate hybrid optoelectronic chips to slower external connections and outlying devices rather than core computer hardware. A second approach, monolithic integration, aims to enable the assembly of whole photonic systems directly onto motherboard chips or into microprocessors with existing fabrication technology by coaxing silicon and other CMOS-friendly elements into emitting, manipulating, and sensing light. The third and perhaps least expensive technique is polylithic integration, in which CMOS processors are attached to the motherboard with a compact array of optical and electronic connections, allowing light to be pumped into the processor from small, cheap III-V chips that are kept from overheating through strategic positioning. Optical connections can operate at high bandwidth over both long and short distances, and such a breakthrough could facilitate a fundamental change in the shape of computers. Some machines could be equipped with holographic disk drives offering hundreds of gigabytes of storage capacity; optical network cards could give users direct links to the international fiber-optic telecommunications grid and Internet access at more than 1Gbps; and computer hardware could break out of its rectangular-box paradigm and be distributed throughout a car, a building, or a city.


 