ACM TechNews
March 7, 2007
Read TechNews Online at: http://technews.acm.org


Welcome to the March 7, 2007 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

 

Feds Test New Data Mining Program
USA Today (03/07/07) P. 3A; Yaukey, John

The Department of Homeland Security is testing a new data searching tool that has many concerned that federal data analysis abilities are outpacing Congress's ability to oversee them. The program, known as Analysis, Dissemination, Visualization, Insight and Semantic Enhancement (ADVISE), can link and cross-match materials from Web sites and blogs with government records and personal data in order to identify patterns that suggest terrorist involvement. Homeland Security has been developing the system since 2003, when the Pentagon's Total Information Awareness program was scrapped due to privacy issues. "Congress is overdue in taking stock of the proliferation of these databases that increasingly are collecting and sifting more and more information about each and every American," said Senate Judiciary Committee chair Patrick Leahy (D-Vt.). In January, Leahy and other senators introduced the Federal Agency Data Mining Report Act of 2007, which would require all federal agencies to inform Congress of all data mining activity. "We have not used any data that was not legally obtained," says Homeland Security's Chris Kelly. However, the lack of public knowledge concerning such programs worries many. "There is not enough information about these [data mining] programs to meaningfully evaluate the benefits," says Electronic Privacy Information Center director Marc Rotenberg. Whether or not ADVISE is deployed will depend on demand for it, says Kelly.


Searching for Michael Jordan? Microsoft Wants a Better Way
New York Times (03/07/07) P. C3; Markoff, John

On the opening day of Microsoft's three-day Techfest, the company focused on its research into improving Web search abilities. By changing the way users and computers locate information on the Internet, Microsoft believes it can surpass Google in the search market, even though it has so far failed to gain ground. A new service called Mix, to be released within six to nine months, allows users to organize and share search results. Another service on display, Web Assistant, aims to increase the relevance of searches by resolving ambiguities in search terms, such as queries that return results about different people with the same name. Web Assistant is "a prototype of a browser that aims to change the way we interact with information," says Microsoft researcher Silviu Cucerzan. By considering searches conducted by other users, and the way they changed search terms when presented with results they did not want, search applications could potentially refine results. Another way of increasing relevance was displayed by Personalized Search, which compares results with Desktop Search, the index built by Windows users from documents on their hard drives. Such a system could predict that a user was searching for Michael Jordan the machine-learning expert, not Michael Jordan the basketball player. Microsoft predicts that the future of Web search will look nothing like today's simple interfaces. "If in 10 years we are still using a rectangular box and a list of results, I should be fired," says company search expert Susan Dumais. Researchers are also exploring several other techniques for enhanced understanding of what users are searching for, including conversational-style interfaces.
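
A toy sketch can make the personalization idea concrete. The snippet below is an assumption for illustration only, not Microsoft's algorithm: it re-ranks two hypothetical "Michael Jordan" results by how strongly their terms overlap with a profile drawn from the user's own documents.

    # Toy sketch of personalized re-ranking (illustrative assumption only,
    # not Microsoft's actual method): score results by overlap with a term
    # profile built from the user's local document index.
    local_profile = {"machine", "learning", "bayesian", "graphical", "models"}

    results = [
        ("Michael Jordan, NBA career highlights",
         {"nba", "bulls", "basketball", "dunk"}),
        ("Michael Jordan, graphical models and Bayesian learning",
         {"graphical", "models", "bayesian", "learning", "berkeley"}),
    ]

    def personal_score(terms):
        """Fraction of a result's terms that also appear in the local profile."""
        return len(terms & local_profile) / len(terms)

    # The machine-learning researcher now outranks the basketball player.
    for title, terms in sorted(results, key=lambda r: personal_score(r[1]), reverse=True):
        print(f"{personal_score(terms):.2f}  {title}")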


Assessment, Incentives Key to Government's Data Center Energy Goals
Computerworld (03/05/07) Dunn, Darrell

More than two dozen technology leaders met with representatives from the Department of Energy in Austin, Texas, last week to discuss how federal energy-efficiency legislation for data centers should be applied. Congress is currently evaluating more than 70 energy-related pieces of legislation, and the President signed a bill at the end of last year requesting an in-depth study into data center efficiency. The discussions in Austin are expected to produce Energy Saving Assessments (ESAs) of the country's largest data centers. A recent study of 200 heavy industrial businesses has shown that more than 50 trillion BTUs of natural gas could be saved, and 250 more ESAs are planned for this year. In order to improve energy efficiency, those in Austin agreed that assessment, incentives, server consolidation, and virtualization should be pursued. The unification of facilities and IT personnel was also discussed, since "there has been an inconsistency of approach, no common vision, and sometimes a lack of trust" between the two realms, according to Hewlett-Packard's Paul Perez. The creation of an Energy Star certification, an awareness effort, and a metric for measuring data center efficiency were also discussed. One participant suggested a consumption tax for businesses exceeding a certain baseline, if a metric were to be established. The President's submitted budget for 2008 identifies $9 billion for energy efficiency efforts.


Senator Introduces U.S. 'Competitiveness' Bill
IDG News Service (03/05/07) Gross, Grant

Tech industry groups are welcoming the introduction of the America Competes Act, which seeks to double the $5.6 billion annual funding for the National Science Foundation (NSF). The legislation answers the call from many tech companies for an increase in funding for math and science programs to help ensure the competitiveness of U.S. workers and industries. "To keep our competitive edge, we need to embrace technology and ensure that our children receive a stronger education in the core subjects of mathematics and science," according to a statement from Sen. John Ensign (R-Nev.), who introduced the bill. The legislation would boost the budget of the Department of Energy's Office of Science from $3.6 billion for fiscal 2006 to $5.2 billion in 2011, require federal agencies that fund science and technology research to spend about 8 percent of their R&D budgets on high-risk projects, create training programs for math and science teachers at the NSF, and provide greater support to math and science programs at the NSF and other agencies. U.S. competitiveness is the subject of Senate Health, Education, Labor, and Pensions Committee and House Small Business Committee hearings on Wednesday.


Silicon Valley's Immigration Problem
Forbes.com (03/05/07) Corcoran, Elizabeth

The U.S. technology industry has benefited tremendously over the past 20 years from the presence of talented immigrants from India, Taiwan, and other Asian countries. Although there is a perception that the best and brightest immigrants have their eyes set on the United States, that may no longer be the case. According to Rosen Sharma, only three of the 40 students in his graduating class from the Indian Institute of Technology (IIT) in Delhi in 1993 decided not to pursue jobs in the United States, but last year the figure reached 35 of the 45 IIT graduates who participated in the same program. Such a trend would have huge implications for U.S. competitiveness. Over the past decade, foreign-born entrepreneurs started 25 percent of U.S. technology startups, which collectively employed 450,000 people and generated $52 billion in sales, according to a study by AnnaLee Saxenian, currently dean of the School of Information at the University of California at Berkeley, and researchers at Duke University. Meanwhile, students born in the United States understand the global nature of the business world today, and see international opportunities as a way to gain valuable experience for their careers. The hope is that they will return to the United States and help grow the domestic industry.


Look, Ma, No Scalpel
Globe and Mail (CAN) (03/06/07) Belford, Terrance

A system conceived by an undergraduate biomedical engineering student at the University of Toronto is allowing people to see what they would look like with different facial features. Three years ago, Alireza Rabi took facial-recognition software developed by his professor and began tweaking it to not only recognize a face but to replace features in a way that "the resulting face looked natural and not like you had stuck someone else's nose on someone's face," Rabi says. The software, called Modiface, is available online as a beta version and has drawn an average of 100,000 hits per day since January. Applications exist that simulate the results of plastic surgery, but "this one is entirely automated," explains University of Toronto Artificial Perception Laboratory head Parham Aarabi. "It is all point-and-click. Anyone with a digital camera and an Internet connection can do it." The process of creating Modiface first involved developing algorithms that could perform facial recognition at sufficient speeds, then developing software that could work in the high-traffic environment of an Internet site. Rabi says the system learns from its mistakes, such as placing a nose or eyes incorrectly on a face, and the researchers adjust it every few weeks to remove problems caused by users submitting images other than faces. "Essentially the software amends and updates its own algorithms automatically as it gains more experience," Aarabi says. "By seeing how it deals with learning, we, in turn, get insights into artificial intelligence."


Gates in DC as New H-1B Battle Shapes Up
Computerworld (03/06/07) Thibodeau, Patrick

Bill Gates is scheduled to appear in front of the Senate Committee on Health, Education, Labor & Pensions during a hearing titled "Strengthening American Competitiveness for the 21st Century," which will include discussion of H-1B visas. Committee chair Ted Kennedy (D-Mass.) is currently working with Sen. John McCain (R-Ariz.) on an immigration reform package that will likely recommend an increase of the current cap on H-1Bs. Applications for the visas will be accepted starting April 1, and the cap of 65,000 visas is expected to be reached within a few months. Many believe that Gates can have more of an impact on increasing the H-1B cap than the President, to whom H-1Bs are only a small part of immigration reform. In a recent op-ed piece in the Washington Post, Gates explained that computing jobs are growing even as the number of students pursuing related degrees is decreasing, heightening the need for an influx of skilled workers. The opposition is led by Sen. Jim Webb (D-Va.), who in his Democratic response to the State of the Union address spoke of a responsibility to protect American workers against the decreasing number of white-collar jobs. The main concerns with the H-1B program include the potential for employers to hire foreign workers at lower-than-average salaries without considering American workers for the positions. Both sides agree that green cards should be easier to obtain so skilled workers can stay in the country without depending on temporary H-1Bs. Rochester Institute of Technology public policy professor Ron Hira says that "More and more members of Congress are becoming aware of the serious flaws in the H-1B program."


New Research Center at UF Expected to Improve Powerful Computers
University of Florida News (03/06/07) Hoover, Aaron

The first U.S. research center for reconfigurable computing has been established at the University of Florida. The NSF Center for High-Performance Reconfigurable Computing will operate as a consortium of universities and more than 20 federal and industry members, with the goal of creating techniques for next-generation computers to adapt internal hardware for optimal performance of any given task. The center is a response to the rising maintenance, electricity, and other costs of the high-performance computers that so much of today's research relies on. Special-purpose logic devices are built to perform a single task with extreme effectiveness, while general-purpose logic devices are built to perform nearly any function, but do so with less effectiveness, so "What we need are technologies that can be both powerful and flexible," says the center's director, Alan George. "Think of an integrated circuit as a big ball of clay. If you were a sculptor, you could model that clay into anything and everything you wanted, limited only by the amount of clay you have," he adds. "That's the basic idea behind a reconfigurable system ... we can combine and morph [digital logic gates] into structures for whatever purpose we need at any given time."


E-Rescue Plans for Coping With Disasters
The Australian (03/06/07) Foreshaw, Jennifer

A four-year project led by National ICT Australia (NICTA) is developing technology to help response efforts in the case of natural disasters or other emergencies. The Smart Applications for Emergencies (SAFE) initiative will include video surveillance with smart cameras, wireless mesh networking, planning, and information management. "We need to improve our game, in terms of how we operate across agencies, how we warn the community, and how we can better provide response and long-term response systems to enable more efficient deployment of resources post-disaster," says Safeguarding Australia project leader Renato Iannella. A NICTA lab is currently building a demonstrator to prove that these technologies can be combined into an effective response system. NICTA's Smart Transport and Roads project is building testbeds on Sydney streets, eight of which have been completed, to trial wireless and sensing technology. These testbeds include advanced video sensing and surveillance techniques, new traffic control systems, and multi-modal interfaces for control-room operations. As these technologies develop, they are expected to be applied to military, logistical, and airline systems. Many aspects of the SAFE project were on display at Techfest 2007, NICTA's annual technology showcase.


Google Helps Terabyte Data Swaps
BBC News (03/07/07) Waters, Darren

Google is helping researchers transfer data that is too immense to be sent over a computer network by providing them with hard drive systems capable of storing 120 terabytes of data that will then be passed along to other researchers. The machines used are about the size of a brick and contain numerous hard drives. The data is kept by Google in an open format and placed into the public domain or covered by a Creative Commons license. The idea came about when the project to reconstruct the Archimedes Palimpsest, a medieval parchment containing treatises by the Greek scientist, had created huge amounts of data. "The networks aren't basically big enough and you don't want to ship the data in this manner, you want to ship it fast," says Google open source program manager Chris DiBona. "You want to ship it sometimes on a hard drive. What if you have these huge data sets--120 terabytes--how do you get them from point A to point B for these scientists?" The program is not currently open to the public; rather, Google approaches researchers who are known to need massive data storage, or researchers contact the company themselves. Google's open source efforts include funding, totaling more than $1.5 million last year, and a program called the Summer of Code, where student developers work with open source teams. "The founders of Google are passionate about open source," explains DiBona. "They see Google as a net beneficiary of open source technology."
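
A back-of-the-envelope calculation shows why shipping drives wins at this scale; the link speeds below are assumptions for illustration, not figures from the article.

    # Rough estimate: days needed to move 120 TB over a network link,
    # versus the day or two it takes to ship a drive array. Link speeds
    # are assumed values, not from the article.
    DATASET_BITS = 120 * 8e12            # 120 terabytes, decimal units

    def transfer_days(link_mbps):
        """Days to push the full dataset through a link of the given speed."""
        return DATASET_BITS / (link_mbps * 1e6) / 86400

    for mbps in (100, 1000):             # an office link and a gigabit link
        print(f"{mbps:>4} Mbit/s: about {transfer_days(mbps):.0f} days")
    # Roughly 111 days at 100 Mbit/s and 11 days at 1 Gbit/s, ignoring
    # protocol overhead, which is why the data travels on hard drives.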


Vint Cerf: Father Knows Best
Dark Reading (03/02/07)

Vinton Cerf, co-creator of the TCP/IP protocols used to build the Internet infrastructure, has been involved with the Internet longer than most, and from his new position with Google he can see the need for increased security efforts. Cerf's work includes promoting Internet access to those around the world who don't have it, Internet policy development, promoting concepts within Google engineering groups, and bringing new employees and partners to Google. At the same time, he is chairman of ICANN and works on the Interplanetary Internet effort at the Jet Propulsion Laboratory. Cerf remembers a time when being called a hacker was something to be proud of and says "purists wish that we could apply some other terms so as to keep 'hacker' what it once was, but I think the language has become too polluted." Currently "much work is needed to increase the security of the Internet and its connected computers," he says. Domain Name System Security Extensions (DNSSEC) technology could be useful in protecting the Internet's DNS servers, he adds. In addition, he believes that the "use of IPSec would foil some higher-level protocol attacks ... [and the] digital signing of IP address assignment records could reduce some routing/spoofing risks." Operating systems need to be more secure, and two-factor authentication should surpass normal passwords as the standard. "Security is a mesh of actions and features and mechanisms," says Cerf. "No one thing makes you secure."


The Digital Building--Security Starts at the Door
Fraunhofer-Gesellschaft (03/07)

German researchers have developed digital building technology that blurs the line between IT and the physical world. The system, known as "facilityboss," enables rooms to be reserved over the Internet, locks to be adjusted automatically and remotely to grant access to certain people on certain dates and to instantly alter accessibility, and all IT systems and electronic devices to be interconnected. "Facilityboss is a kind of operating system for the digital building, making it possible to link and control a wide variety of components in a building," says the Fraunhofer Institute for Secure Information Technology's Thorsten Henkel. Building operations, from heating to computers, can be controlled from a single interface, which gathers information from a network of sensors throughout the building. RFID tags enable the building to know where various equipment is being used at any given time and enable people to access certain areas by identifying themselves. The radio-based locking system consists of cylinder locks with an integrated radio system and a PC that runs administration software. Each cylinder lock connects to the administration software through an access point, allowing changes to be made remotely if keys are lost or a new employee is hired. When a room is reserved for a certain time, those meeting there are able to gain access during the reserved hours. "The [locking] system combines the advantages of electronic locking systems with those of wireless communications," says the Fraunhofer Institute for Communications Systems ESK's Markus Augel.


Our Manycore Future
HPC Wire (03/02/07) Vol. 16, No. 9, Feldman, Michael

HPC Wire editor Michael Feldman believes that "The Landscape of Parallel Computing Research: The View from Berkeley" is one of the most important works on the subject of manycore architecture "not because it claims to have all the answers, but because it manages to ask all the right questions." The report claims that manycore architecture and the software built for it could "reset microprocessor hardware and software roadmaps for the next 30 years." Its authors promote the use of small, simple processing cores, which they argue are best suited for parallel codes. Thirteen computational methods, known as the 13 Dwarfs, will serve as the foundation for parallel apps, according to the report; they consist of Phil Colella's original Seven Dwarfs from scientific computing and six more from other computing domains. Parallelized applications could dominate IT in the coming years, as Internet-based applications such as text searching become more parallelized. If word processing is to evolve further, parallelism will play a key role, enabling voice recognition, improved language translation, and other functions. The main concern for the industry is how to program massive parallelism; the Berkeley authors side with neither the sequential nor the multicore programming camp, arguing instead for parallel software that is independent of the number of processors. Although the paper's authors promote human-centric design, they are aware of the tradeoff between ease of programming and runtime performance. The report also details how the HPC and embedded communities are being brought together by a common need for energy efficiency, low-cost hardware building blocks, reuse of software, and high-bandwidth data.
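
A minimal sketch of that processor-count-independence goal follows, using Python's multiprocessing module purely as a stand-in; the report does not prescribe any particular language or API. The point is that the program discovers the core count at run time instead of baking it into the source.

    # Minimal sketch of core-count-independent parallelism; multiprocessing
    # is an assumed stand-in, not the report's prescribed programming model.
    from multiprocessing import Pool, cpu_count

    def kernel(chunk):
        """Stand-in for a 'dwarf'-style computation, here a simple dense sum."""
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        workers = cpu_count()                       # discovered at run time
        data = range(1_000_000)
        chunks = [list(data[i::workers]) for i in range(workers)]
        # The same source runs unchanged on 2, 8, or 80 cores; only the
        # runtime-discovered worker count changes.
        with Pool(workers) as pool:
            print(sum(pool.map(kernel, chunks)))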


Here's Why Your Web Apps Are Sitting Ducks
Network World (03/01/07) Brown, Bob

Web servers are still at a high risk of being targeted by hackers, according to a new paper from researchers at the Honeynet Project. The Honeynet Project provides real systems for unwitting attackers to interact with so its researchers can study what the attackers are looking for and what tactics they use. Web applications remain vulnerable for a variety of reasons, including poor-quality code, the emergence of search engines as hacking tools, and the ability to use PHP and shell scripts to execute attacks. Hackers can also obtain massive amounts of information from Web servers because they have higher bandwidth connections than most desktops and are often connected to an organization's databases. According to the Honeynet Project's report "Know Your Enemy: Web Application Threats," hackers found vulnerabilities using search, spider, and IP-based scanning and executed attacks with code injection, remote code inclusion, SQL injection, and cross-site scripting. Hackers also attempted to disguise their identities using proxy servers, the Google Translate service, onion routers, and several other systems. The primary objectives of the attacks were defacement, phishing attacks, email spam, blog spam, botnet recruitment, and file hosting.
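
Of the attack classes the report names, SQL injection is the easiest to show in a few lines. The snippet below is an illustrative example, not code from the Honeynet paper; it demonstrates the vulnerable string-building pattern and the standard parameterized-query defense.

    # Illustrative SQL injection example (not from the Honeynet report):
    # a string-built query lets attacker input rewrite the SQL, while a
    # bound parameter treats the same input purely as data.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'a1'), ('bob', 'b2')")

    user_input = "nobody' OR '1'='1"                     # attacker-controlled

    unsafe = f"SELECT * FROM users WHERE name = '{user_input}'"
    print(conn.execute(unsafe).fetchall())               # leaks every row

    safe = "SELECT * FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # returns nothing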


Puppetnets: Misusing Web Browsers as a Distributed Attack Infrastructure
Honeyblog (03/05/07) Lam, V.T.; Antonatos, Spyros; Akritidis, P.

Puppetnets are networks generated by malevolent Web sites for the purpose of indirectly misusing visiting Web browsers as unwitting tools for worm propagation, distributed denial-of-service attacks (DDoS), reconnaissance scans, and other attacks on third parties. Though the threat rating of puppetnets is lower than that of botnets, the regularity of client-side exploits could make puppetnets a serious problem in the future, according to the authors of a study presented at the recent ACM Conference on Computer and Communications Security. Unlike botnets, puppetnets are not critically reliant on the exploitation of specific deployment flaws or on social engineering strategies that fool users into installing malware on their computer; they also support a model where the attacker has only partial control over the actions of the participating nodes, while the dynamic nature of puppetnet participation makes puppetnets harder to track and filter. The authors contend that the use of puppetnets illustrates a flaw in the Web's design, namely that the security model is committed almost exclusively to shielding browsers and their host environment from malicious Web servers and servers from malicious browsers, thus ignoring the possibility of assaults directed against third parties. The power of a puppetnet depends on how popular a malicious Web site is as well as the users' browsing patterns. The authors offer several approaches for countering puppetnet attacks, although they are only partial solutions at best. Disablement of JavaScript will reduce the effectiveness of puppetnet-engineered DDoS attacks, reconnaissance probes, and worm propagation by at least one order of magnitude, while carefully implementing existing defenses can also mitigate the puppetnet threat to a certain degree. Other defenses evaluated include server-side controls and puppetnet tracing, server-directed client-side controls, client-side behavioral controls, and filtering that uses attack signatures, all of which have their pluses and minuses.


Eastern Europe's Silicon Rush
Chronicle of Higher Education (03/09/07) Vol. 53, No. 27, P. A45; Woodard, Colin

Eastern European universities are helping attract technology multinationals to the region through their outflow of first-rate computer-science graduates, but there are concerns among academics that their departments will be emptied and their programs demolished by IT graduates and even faculty jumping ship to the private sector in order to earn a higher salary. Among the factors that have generated so much foreign interest in Eastern Europe is the growing dominance of regional students in international programming contests, while the region enjoys a competitive advantage over other countries through its geographic and cultural closeness to Western Europe. Yet Eastern Europe's industrial growth could come to a screeching halt if excessive numbers of computer-science graduates are drawn away by the promise of more money. To avoid such a scenario, technology companies have partnered with universities in Kosice, Slovakia, to bolster the schools' computer-science departments and establish a center for research and innovation. "Businesspeople have come to the conclusion that they need the universities, not just their graduates," reports Technical University of Kosice computer-science professor Anton Cizmar. "If we're to produce good graduates, the professors also have to have the right working and living conditions." A group of companies, the local government, and the Technical and Pavol Jozef Safarik Universities have joined forces to set up the Kosice IT Valley Association to lay down a sturdy platform for the local IT industry. Association coordinator Tomas Sabol says the aim of the organization is to first build a "critical mass" of computer-science graduates, and then use them to draw more research and development so that Kosice can become a center of excellence that attracts more investment and fortifies the local economy with the creation of high-paying jobs.


Possible Ontologies
Internet Computing (02/07) Vol. 11, No. 1, P. 90; Hepp, Martin

More and better ontologies are necessary for creating the Semantic Web, but Digital Enterprise Research Institute researcher Martin Hepp points out that ontology generation has an inherent social component that suffers from technical, legal, economic, and social bottlenecks. He cites five basic aspects of building and committing to ontologies that are inadequately addressed by existing ontology-engineering practices: the conflict between ontology engineering lag and conceptual dynamics; consumption of resources; communication between creators and users; incentive conflicts and network externalities; and intellectual property rights. Hepp details four major obstacles in an attempt to explain why actual Web ontologies are so few in number. He mentions that there is widespread ignorance of the dynamics among conceptual elements, which holds relevance when constructing ontologies for specific domains. Another major bottleneck is economic incentive, and Hepp notes that even if the ontology's overall benefit throughout its lifetime more than compensates for creation costs, its creation must still be economically feasible for each individual who must contribute. The third big hindrance is ontology perspicuity, with Hepp writing that a lack of up-front understanding of the inferences to be derived from a specific ontology makes their authorization by individuals or organizations difficult. The fourth bottleneck is intellectual property rights, which limit the creation and re-publication of ontologies as derived works. Through analysis of these bottlenecks, Hepp predicts a situation in which "the more detailed and expressive the ontology, the smaller the actual user community will be because it increases the resources necessary for reviewing and understanding the specification and associated documentation, which makes committing to the ontology reasonable only for a smaller number of individuals."


Human-Computer Interaction: The Human and Computer as a Team in Emergency Management Information Systems
Communications of the ACM (03/07) Vol. 50, No. 3, P. 33; Carver, Liz; Turoff, Murray

Incorporating the computer as a part of the emergency management team guarantees that people will continue to excel at jobs requiring their particular skill sets while being supported rather than impelled by the technology. The success of systems proposed as aids to the emergency management team's decision-making process depends on the human-computer interface design's consideration of user requirements, and the interface's role as a facilitator of human-computer interaction. Tools under development along these lines include information prioritization, decision support and modeling tools, and a representation of a common operating picture. Context visibility is offered as a method for mitigating information overload. The approach involves the realization that any external event is a root item that enables the dynamic convergence of all related events, and that the resulting knowledge structure template for an action/decision process must be accessible to all roles focusing on that particular event. By establishing local networks at the sites of catastrophes and creating a framework for sharing digital voice, graphics, and video, the emergency management team can improve the flow of reliable, timely, and relevant data between on-site members and command, control, and coordination staff. Allowing the user to do his job by managing the data stream as the volume of sensor data mounts is a task for automated systems, although automation must always be under human control. The major characteristics of shoddy automation include autonomous behavior, failure to supply adequate feedback about activities and intentions, a tendency to interrupt human activity, and difficulty in reconfiguring the automation in a desired manner; behaviors that may result from such poor design include automation bias, automation complacency, and automation surprises. Such facts make the case for a user-oriented systemic approach in which user requirements are a driving force in technology development.


To submit feedback about ACM TechNews, contact: [email protected]

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to [email protected] with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: [email protected]

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: [email protected]


News Abstracts © 2007 Information, Inc.


© 2007 ACM, Inc. All rights reserved. ACM Privacy Policy.