Read the TechNews Online at: http://technews.acm.org
ACM TechNews
September 29, 2006

MemberNet
The ACM Professional Development Centre
Unsubscribe

Welcome to the September 29, 2006 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Sponsored by
Learn more about Texis, the text-oriented database 
providing high-performance search engine features combined with SQL 
operations and a development toolkit, that powers many diverse 
applications, including Webinator and the Thunderstone Search Appliance.


HEADLINES AT A GLANCE:

 

ACM Security Experts Urge Paper Trails for Electronic Voting
Ascribe Newswire (09/28/06)

Ensuring that the U.S. election process is trustworthy is an important function of voter verified paper trails, stated former ACM President Barbara Simons at a congressional hearing reviewing security for electronic voting systems. Simons, founder of ACM's U.S. Public Policy Committee and co-chair of ACM's study of statewide registered voter databases, testified that all currently available e-voting systems carry risks, such as poor design, lack of thorough testing, limited audit capabilities, and inadequate software engineering. "Technology, if engineered and tested carefully, and if deployed with safeguards against failure, can reduce error rates, provide more accessibility, increase accountability, and strengthen our voting system," she noted, adding that the inclusion of a voter verified paper audit trail (VVPAT) or voter verified paper ballot (VVPB) will improve the security of voting systems and provide for routine audits. Princeton University computer science professor Edward Felten, a member of ACM's U.S. Public Policy Committee, urged that extra care must be taken in securing voting systems throughout the election process, and called for better certification for software updates to e-voting machines and increased employment of independent security experts. Simons and Felten concurred that the election and technical communities must collaborate to develop trustworthy computerized voting and electronic registration systems.
Click Here to View Full Article
to the top


Visas for Skilled Workers Still Frozen
Washington Post (09/28/06) P. A12; Kalita, S. Mitra

The bill that technology companies had been counting on to expand the number of visas issued to foreign workers has stalled in Congress amid more controversial provisions concerning border security and illegal immigration. While a bill that would have doubled the number of visas issued each year to skilled professionals cleared the Senate in the spring, proponents of the legislation have now resigned themselves to hoping that Congress will take up the measure in a lame-duck session late in the year. "It is incredibly difficult to pass major legislative reforms in any areas, and they tried to bite off a lot," said Intel's Jenifer Verdery. "We've made a strong case, and we're hoping to take that to the finish line...if there is any policymaking left to do after the election." Major technology companies and research institutions have long depended on the H-1B visa to bring in foreign workers to fill the gap left by the shortage of qualified U.S. workers. The majority of the 65,000 H-1B visas issued each year are snatched up by workers from India and China, who face the longest waits for obtaining green cards. Under the Senate plan, the number of H-1B visas available each year would have risen to 115,000. Technology companies mounted a major push to raise the cap this year, and although both parties have traditionally supported the program, some groups oppose the expansion, arguing that skilled foreign workers can hold down salaries. Immigration experts, meanwhile, warn that the visa restrictions can cripple certain sectors of the economy.
Click Here to View Full Article
to the top


Time to Achieve Many Big Steps for Women in Science
San Francisco Chronicle (09/28/06) P. B7; Whitney, Telle

While women represent more than half of the U.S. workforce, they account for a scant 18 percent of all technical positions. Indeed, the declining participation of both women and men in computer science and other technology-related courses of study could have a grave impact on U.S. competitiveness in space and innumerable other scientific endeavors, writes Telle Whitney, president and CEO of the Anita Borg Institute for Women in Technology. The number of incoming students majoring in computer science, for instance, dropped 60 percent from 2000 to 2004, according to a study conducted at the University of California, Los Angeles. Image is a major problem for the technical disciplines, as students increasingly view those areas of study as the socially irrelevant province of 'geeks' who have no reservations about spending a career toiling in isolation in front of a computer screen. In fact, technical fields hold tremendous potential to alleviate problems such as poverty, illiteracy, and global warming. This is the message trumpeted by former astronaut Sally Ride, who will give the keynote address at the upcoming Grace Hopper Celebration of Women in Computing, co-sponsored by ACM, where she will call for improved outreach programs for girls, particularly minorities, to encourage participation in technical fields.
Click Here to View Full Article
to the top


IBM and the Association for Computing Machinery Kick Off Worldwide College "Battle of the Brains"
Market Wire (09/26/06)

Students from around the world will begin competing for spots in ACM's annual International Collegiate Programming Contest (ICPC) from September through December. More than 6,000 teams of three students from 84 countries will participate in the international regional competitions, which will have the groups solve real-world computer programming problems in five hours. The World Finals of the 31st ICPC, also known as the "Battle of the Brains," is scheduled for March 2007. The ICPC has grown substantially since IBM signed on as its sponsor in 1997. "The ICPC attracts incredibly bright young men and women who will shape the future of computing," says Dr. Bill Poucher, a professor at Baylor University who is the executive director of the ICPC. "The partnership between ACM, IBM and colleges and universities around the world has grown to be a force in advancing education and innovation in computer science and engineering." A team of students from Saratov State University in Russia emerged as the world champion at last year's World Finals in San Antonio, Texas. For more information about ICPC, visit http://icpc.baylor.edu/icpc/
Click Here to View Full Article
to the top


Innovation and Competitiveness: How'd We Do?
CRA Bulletin (09/27/06)

The lofty goals that President Bush outlined in the American Competitiveness Initiative (ACI) in his January State of the Union address have been echoed by key congressional leaders, seemingly answering the technology industry's long-ignored pleas for government help in competing with foreign rivals. The provisions Bush spelled out include doubling federal research funding over the next 10 years, making the research and development tax credit permanent, and bolstering secondary science and math education. Despite tepid early support from House Republicans, appropriations bills to double the funding of research organizations such as the NSF, NIST, and the Department of Energy's Office of Science over the next 10 years have cleared the full House and Senate. The lone remaining stumbling block is whether Congress will return to the appropriations bills after the November elections if they are not passed beforehand. Should Congress opt simply to pass a continuing resolution through Sept. 30, 2007, spending would remain at the previous fiscal year's level, erasing the increases promised by the president and Congress. While such a move is unlikely, it is certainly not being discounted by the technology community, which is instead hoping for the passage of an omnibus package that rolls all the appropriations bills into one comprehensive piece of legislation. Meanwhile, several congressional proposals authorizing various provisions of the ACI have met opposition from the White House, which claims the ACI programs need no additional authorizations. With the post-election lame-duck session fast approaching, the chances of any of those provisions reaching the president's desk are increasingly slight.
Click Here to View Full Article
to the top


New Demands Register With Patent Laws
Financial Times (09/29/06) P. 8; Tait, Nikki

Australian inventor Neal Macrossan continues a legal battle against the UK Patent Office, which rejected his application to patent an electronic document assembly system that would enable users to create new companies via the Internet. Macrossan has obtained patents for his Incorporator system in Australia, New Zealand, Singapore, and South Africa. The UK Patent Office, however, insists that his invention is excluded under the Patent Act because it neither uses a new mental process nor sets forth a new means of conducting business. Exclusion is a thorny matter in Europe: the European Parliament was lobbied in recent years both by those who wanted all software to be patentable and by those who worried that such a move would hinder research and development, but it ultimately took no side. The Court of Appeal is expected to issue a ruling next week. "This is the first time in a decade that the appeal court has looked at the issues," says attorney Joel Smith of Herbert Smith. "The decision is expected to signal the way ahead for software patenting."
Click Here to View Full Article - Web Link to Publication Homepage
to the top


Analysis Finds Value of Research & Development Higher than Thought
Investor's Business Daily (09/29/06) P. A4; Deagon, Brian

Preliminary findings by the U.S. Bureau of Economic Analysis (BEA) and the National Science Foundation (NSF) indicate that R&D makes a greater contribution to U.S. economic productivity and growth than is currently estimated. The two agencies are making a long-term effort to improve their understanding of, and ability to calculate, R&D's impact on the economy--something that "could be very valuable to the Federal Reserve Board and could persuade them to pursue different monetary policies," said Steve Landefeld, director of the BEA. Matthew Kazmierczak, vice president for research and industry analysis at the AeA--formerly known as the American Electronics Association--says the new analysis will be a boon to the AeA as it lobbies Congress to restore and make permanent the tax credits on R&D investments. According to the BEA, there is no commonly agreed upon output measure for R&D, which is treated as a business expense and intangible asset rather than as an investment. The BEA and NSF, by contrast, are taking the view that R&D is an investment--a stance that would enable economists to quantify its contribution to economic output more clearly. The preliminary findings indicate that R&D's contribution to real growth from 1995 through 2002 was 6.5 percent, compared with 4.5 percent from 1959 through 2002. The difficulty of measuring the effects of R&D investments partially explains former Federal Reserve Chairman Alan Greenspan's difficulty accounting for the economic boom of the 1990s; as part of the answer became clear, the BEA in 1999 began incorporating the flow of investments made in software into its GDP estimates.
Click Here to View Full Article - Web Link to Publication Homepage
to the top


Study Shows Internet to Be Resilient Against Terror Attacks
Ohio State Research News (09/28/06)

Ohio State professor Morton O'Kelly is co-author of a new study concluding that a serious attack on Internet network hubs in the U.S. would be unlikely to collapse the Internet, though it could degrade its functioning. "There are so many interconnections within the network that it would be difficult to find enough targets, and the right targets, to do serious damage to Internet reliability nationwide," says O'Kelly. Detailed results appear in the most recent issue of the journal Environment and Planning B. The study used computer modeling to simulate an attack on major Internet backbone facilities, and assumed not all facilities could be attacked at once. Seattle and Boston have the most diverse sets of hubs supporting Internet traffic among cities, and are therefore the most resilient, the study concludes. The study, conducted with Ohio State graduate student Hyun Kim and professor Changjoo Kim, was a follow-up to a 2003 study by O'Kelly that assumed selected city network nodes would be completely knocked out by accidents or attacks. O'Kelly says that is not a likely scenario, since peering agreements between carriers make it very difficult to shut down an entire network node. "There is a rich web of connections in these Internet nodes," he says, "and a hit on a single city node or even several of them is not likely to wipe out Internet connectivity."
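The kind of simulation the study describes can be illustrated with a toy version of the same idea: model the backbone as a graph, "attack" the busiest hubs, and measure how much of the network stays mutually reachable. The sketch below is hypothetical; the city links and the connectivity metric are invented for illustration and are not taken from the study.

```python
# Toy model of a hub attack: remove high-degree nodes from a backbone graph
# and measure the fraction of surviving nodes still in the largest component.
from collections import defaultdict

def build_graph(edges):
    """Build an undirected adjacency map from a list of (city, city) links."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def largest_component_fraction(adj, removed=frozenset()):
    """Fraction of surviving nodes that remain in the biggest connected piece."""
    alive = set(adj) - set(removed)
    if not alive:
        return 0.0
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        stack, comp = [start], 0   # depth-first walk of one component
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nbr in adj[node]:
                if nbr in alive and nbr not in seen:
                    seen.add(nbr)
                    stack.append(nbr)
        best = max(best, comp)
    return best / len(alive)

# Invented backbone: cross-linked cities (the "rich web of connections"
# O'Kelly describes) plus one spoke city reachable only through a hub.
edges = [("SEA", "BOS"), ("SEA", "CHI"), ("SEA", "LAX"), ("BOS", "NYC"),
         ("BOS", "CHI"), ("CHI", "NYC"), ("LAX", "NYC"), ("LAX", "CHI"),
         ("SEA", "ANC")]
g = build_graph(edges)

intact = largest_component_fraction(g)                        # 1.0: all reachable
attacked = largest_component_fraction(g, removed={"SEA", "CHI"})  # 0.75
```

Because the surviving cities retain cross-links, knocking out the two busiest hubs strands only the spoke city: the network degrades rather than collapses, mirroring the study's conclusion.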
Click Here to View Full Article
to the top


Alliance Aims to Rethink Network Computing and Communications
Rensselaer News (09/27/06)

Researchers at Rensselaer Polytechnic Institute are pursuing research that defense agencies in the United States and the United Kingdom hope will improve wireless sensor networks in urban environments. Computer science professor Boleslaw Szymanski will head a team that will study how complex sensor data infrastructures manage audio, visual, radar, and chemical sensors. The U.S. Army Research Laboratory and the U.K. Ministry of Defense want to apply the findings to secure networks of sensors, with hopes of giving coalition forces more flexibility on the battlefield. "We are going to take what we already know about sensor network protocols and infrastructure and think creatively about the future designs," says Szymanski. "With information coming from these different sources, we need to know how to make them collaborate to provide the best information while minimizing the chance that they will be detected." Szymanski's team has received $1.85 million to develop sensor network algorithms, which could also have some civilian applications. The project is part of a larger $138 million initiative over 10 years to reevaluate network computing and communication, involving a consortium that includes the University of Southampton, CUNY, LogicaCMG, and IBM.
Click Here to View Full Article
to the top


Virtual Face-Ageing May Help Find Missing Persons
New Scientist (09/26/06) Simonite, Tom

Researchers at Kent University in the United Kingdom hope to improve upon existing artificial aging software by creating a system that produces more detailed renderings of how a person's appearance would change over time. Chris Solomon, a forensic imaging specialist at Kent, is heading a team that has developed software that automatically simulates the aging process, based on photos showing how an individual's face has changed in the past, as well as images showing the aging of family members and of the general population. "Most changes in people's faces are shared by the population as a whole," says Solomon. The system analyzes the location and size of each facial feature, converts the face into a set of numbers, and then compares it to others in its face database to determine the needed changes. To test the program's accuracy, the researchers have had older people provide photos from their younger years, then used the software to see how close its artificial rendering comes to the actual face. Results have been positive, says Solomon, "although sometimes it doesn't age a face as much as you would expect." His team will continue to improve the software, which could be used to help find missing people.
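The pipeline Solomon describes, turning a face into a vector of measurements and shifting it by population-wide trends, can be sketched in miniature. Everything below (the measurements, the reference means, the linear shift) is a hypothetical simplification for illustration, not the Kent team's actual model.

```python
# Minimal sketch: a face as a vector of numbers, aged by applying the average
# change observed between a younger and an older reference population.

def age_face(face, young_mean, old_mean, strength=1.0):
    """Shift a face vector by the population-average aging change."""
    return [f + strength * (o - y) for f, y, o in zip(face, young_mean, old_mean)]

# Invented measurements: [eye spacing, nose length, jaw width], arbitrary units.
young_mean = [30.0, 40.0, 90.0]
old_mean   = [30.0, 44.0, 86.0]   # on average: noses lengthen, jaws narrow

subject = [31.0, 41.0, 92.0]
aged = age_face(subject, young_mean, old_mean)
# aged == [31.0, 45.0, 88.0]: individual traits persist, population trend applied
```

The same linear idea explains Solomon's caveat that "sometimes it doesn't age a face as much as you would expect": a population-average shift cannot capture unusually fast individual aging.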
Click Here to View Full Article
to the top


Intel Pushes for 80-Core CPU by 2010
VNUNet (09/27/06) Sanders, Tom

Intel's new wave of research aims to produce tera-scale chips to run the next-generation mega data centers that will power hosted applications. At last week's Intel Developer Forum, CEO Paul Otellini demonstrated a prototype teraflop processor, which features 80 cores running at 3.1 GHz and delivers more than one teraflop of combined performance. The chip should reach commercial production by 2010. "This kind of performance for the first time gives us the capability to imagine things like real-time video search or real-time speech translation from one language to another," Otellini said. Online-content providers such as Google and YouTube will likely require this level of processing power. Intel projects that tera-scale servers will account for one quarter of all server sales by 2010. "We are talking about a fundamental change in the way that the whole computing infrastructure is built," said Intel CTO Justin Rattner. "At the core of that infrastructure will be the future data center, what we refer to as the mega data center." The teraflop chip is comparable to Sun Microsystems' Niagara processor and IBM's Cell processor. Intel is also developing silicon-based laser technology to boost interconnect speeds; earlier this month, the company announced that its researchers had made a breakthrough allowing lasers to be created through conventional manufacturing processes. A cluster of 25 of those lasers could deliver a terabyte of throughput, according to Rattner.
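A back-of-the-envelope check shows how the teraflop figure follows from the stated core count and clock. The four-operations-per-cycle figure below is an assumption made for illustration; the article does not give per-core throughput.

```python
# Peak throughput = cores x clock x floating-point operations per core per cycle.
cores = 80
clock_hz = 3.1e9                 # 3.1 GHz, as stated in the article
flops_per_core_per_cycle = 4     # assumed, not from the article

peak_flops = cores * clock_hz * flops_per_core_per_cycle
# about 9.9e11 operations per second, i.e. roughly one teraflop (1e12),
# consistent with the "more than one teraflop" claim
```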
Click Here to View Full Article
to the top


A Pioneer of the Web Campaigns for Internet 'Neutrality'
New York Times (09/27/06) P. E6; Markoff, John

Earlier this year, Web inventor Tim Berners-Lee added his voice to the chorus advocating Net neutrality--the notion that Internet service providers should not be permitted to give preferential treatment to certain packets of data. He shared his thoughts on the subject in a recent interview, noting that he had in fact broached the subject (albeit not by that name) in a book he wrote several years ago. Berners-Lee describes the debate over Net neutrality as one of social consequence that transcends mere technical concerns. "I think people who talk about dismantling--threatening--Net neutrality don't appreciate how important it has been for us to have an independent market for productivity and for applications on the Internet," he said. Internet TV is one application that Berners-Lee cites as a burgeoning market dependent upon neutral delivery of data packets. By itself, packet inspection is not a threat to Net neutrality, Berners-Lee says, noting that routers today must be capable of greater functionality, so that they can inspect packets for threats such as denial-of-service attacks without skewing transmission speed. Myriad security threats notwithstanding, Berners-Lee claims that Internet users should be able to connect with a certain quality of service, without having to negotiate. Even if the United States were to lose its hegemony in the Internet space, Berners-Lee is confident that the Net neutrality charge would be taken up by other nations.
Click Here to View Full Article
to the top


High Pay, Plenty of Jobs, but Few Students: It Doesn't Compute
Globe and Mail (CAN) (09/26/06) Van Kampen, Karen

Colleges and IT employers in Canada are working together in an effort to attract more students to IT and computer science programs. The joint effort comes at a time when enrollment in IT and computer science programs has fallen 50 percent over the past five years, even as IT companies continue to add jobs and offer attractive salaries. At a meeting at Microsoft Canada's office last year, IT employers said they want technical talent that is also able to communicate on an interpersonal basis, give presentations, and work in teams, and colleges and universities have responded by focusing more on providing students with soft skills. IT companies also encouraged colleges to incorporate IT into other majors, which would enable students from other departments to graduate with technology skills. Meanwhile, colleges suggested that IT companies hire more people after they graduate and help change the negative image of the industry. Many parents and students remember the high-tech slump that began in 2000, and the prevailing perception that the industry is for geeks and men does not help. Women account for just 10 percent to 15 percent of IT students.
Click Here to View Full Article
to the top


Ian Pearson, Futurologist: The ITWales Interview
ITWales.com (09/25/06) Earls, Sali

BT futurologist Ian Pearson says in an interview that he keeps track of nascent technologies, "so as soon as we learn that somebody is doing some research in a particular field, [my team starts] putting that together with all the other bits of research that everyone else is doing, and try to figure out what people might try to use that for once it becomes real technology in a decade or so." He recommends applying common sense and business acumen to filter the good technology from the bad. Pearson's prediction in the BT Technology Timeline that androids will comprise 10 percent of the population in the next 10 to 15 years is based on the observation that several Japanese companies are manufacturing small androids with the expectation that there are major markets for robot companionship and domestic labor, and that these companies undoubtedly researched the marketplace carefully in coming to this conclusion; he also tempers his forecast with the statement that the androids seen in that time will not be as sophisticated as the human-looking robots popularized in science fiction. Pearson says advances in neuroscience are demystifying the brain's workings, and he projects that this, paired with progress in machine software generation, will yield prototypes of computers whose intelligence approximates that of human beings by 2015. The futurologist notes that cybernetic technologies will raise concerns about people's bodies being hacked, as well as worries that cybernetic enhancements will give some people unfair advantages over others. Pearson believes AI advancements, the collective computing power of networked devices, and the proliferation of cyberterrorists and other malefactors with the connectivity and resources to attack us will make absolute Internet security, and security in general, unattainable. An even worse possibility is the use of smart bacteria to break security, and perhaps annihilate humanity.
Pearson expects social technology such as instant voice messaging and phones that can track their owners and owners' friends' whereabouts to be a hot area for future IT.
Click Here to View Full Article
to the top


Software Development Methodology Today
InformIT (09/22/06) Jayaswal, Bijay K.; Patton, Peter C.

Tapping the best practices of the past and the potential of more recent programming technology can yield a Robust Software Development Model (RSDM) that tackles quality problems upstream, where nearly all software bugs originate, write Bijay K. Jayaswal and Peter C. Patton. Testing alone can uncover only a fraction of potential problems, both because software complexity differs fundamentally from hardware complexity and because software has no manufacturing process in which defects could otherwise be caught. The challenges of producing trustworthy software can be met by combining the iterative RSDM, Software Design Optimization Engineering, and Object-Oriented Design Technology, which the authors integrate in their proposed Design for Trustworthy Software (DFTS) approach. The Robust Software Development Process comprises seven elements: a development process that engages users throughout the software life cycle by recognizing both their articulated and unarticulated needs; accommodation for feedback and iteration between development stages as needed; a mechanism for optimizing design for reliability, cost, and cycle time simultaneously in upstream stages; incremental development techniques that allow early return on investment; step-wise development that builds an application as needed and supplies sufficient documentation; support for risk analysis at various stages; and accommodation of object-oriented development. The authors' model blends the cascade and iterative development models and supports feedback at all levels, marrying the best practices and features of numerous methodologies into a customer-oriented, robust approach that fulfills all seven requirements of the Robust Software Development Process.
Click Here to View Full Article
to the top


Computers That Read Your Mind
Economist Technology Quarterly (09/06) Vol. 380, No. 8496, P. 24

With all kinds of technologies vying for people's attention, researchers are developing products designed to help users become more lucid and focused by achieving a state of "augmented cognition" through the use of sensors that can deduce a person's mental state. Such technology could help people cope with information overload, a problem plaguing the U.S. military; it comes as no surprise that the Defense Advanced Research Projects Agency (DARPA) is a major investor in augmented cognition research. One concept being pursued in this vein is a smart cockpit for fighter aircraft, in which the pilot's brain activity is measured by a helmet equipped with EEG sensors. When weighed against contextual information, the system can determine if the pilot's level of concentration is too delicate to be interrupted and filter out non-essential input to reduce cognitive stress. There are also augmented cognition efforts that target the workplace and other non-military venues. Microsoft Research scientist and American Association for Artificial Intelligence President Eric Horvitz says his lab is working on technology that filters data before it reaches the user; the goal is to make people capable of absorbing more information without being overloaded. Rather than analyzing brain activity, the idea is for the system to get clues about the user's mental state by studying other factors, such as keystrokes, the content the user is viewing, the time of day, and the contents of a desktop calendar. Augmented cognition's potential applications also include entertainment: John Laird of the University of Michigan's Artificial Intelligence Laboratory believes such systems could prevent boredom and confusion among video gamers, while Lancaster University's Alan Dix foresees sensor-outfitted game consoles that can infer the player's level of alertness.
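The filtering idea running through these projects, score the user's cognitive load from observable signals and suppress low-priority interruptions when load is high, can be sketched as a simple rule-based filter. The signals, weights, and threshold below are invented for illustration and are not drawn from DARPA's or Microsoft's systems.

```python
# Hypothetical augmented-cognition filter: estimate load from a few observable
# signals, then gate notifications on load and priority.

def cognitive_load(keystrokes_per_min, app_switches_per_min, in_meeting):
    """Combine signals into a 0..1 load score (weights are arbitrary)."""
    load = 0.0
    load += min(keystrokes_per_min / 200.0, 1.0) * 0.5   # typing intensity
    load += min(app_switches_per_min / 10.0, 1.0) * 0.3  # task-switching churn
    load += 0.2 if in_meeting else 0.0                   # calendar context
    return load

def should_deliver(priority, load, threshold=0.6):
    """Deliver everything when load is low; only high-priority items when high."""
    return load < threshold or priority == "high"
```

A busy user (fast typing, frequent app switching, in a meeting) would see only high-priority messages, while an idle user would see everything; real systems would replace these hand-tuned rules with learned models, and in the cockpit scenario the inputs would be EEG readings rather than desktop activity.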
Click Here to View Full Article
to the top


Building Effective Multivendor Autonomic Computing Systems
IEEE Distributed Systems Online (09/06) Vol. 7, No. 9, Rana, Omer F.; Kephart, Jeffrey O.

Panelists at the 3rd IEEE International Conference on Autonomic Computing, discussing the possibility of building effective multivendor autonomic computing systems, were asked about the importance of such systems to themselves, their organizations, and their research; the key technologies needed to realize them; and the political, economic, or social obstacles to using them in real-world scientific and business applications, along with how such hindrances could be addressed. All panelists emphasized the business advantages of autonomic system development, which include increasing the number of servers one system administrator can maintain. IBM Research's Steve White said the strongest motivation for using an autonomic system is addressing the "complexity" of system component management, while managing user expectations is another key to broader uptake of autonomic computing systems. All panelists recognized the importance of autonomic computing standards; Julie McCann of Imperial College, London, suggested standardizing the types of probes needed for monitoring, the kinds of events generated in the system, and the actions the system executes. HP Labs' Kumar Goswani noted a number of reasons administrators may shy away from adopting autonomic systems, including fear of being replaced by automation. In general, the panelists agreed that autonomic systems are already in place to a limited extent, and that wider adoption requires determining what autonomic computing means to customers and researchers, as well as building greater trustworthiness into such systems. They also concluded that autonomic computing systems must be interoperable and support virtualization, and that key application scenarios facilitating wider adoption of multivendor autonomic systems must be identified.
Click Here to View Full Article
to the top


To submit feedback about ACM TechNews, contact: [email protected]

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to [email protected] with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: [email protected]

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: [email protected]

to the top

News Abstracts © 2006 Information, Inc.


© 2006 ACM, Inc. All rights reserved. ACM Privacy Policy.