Read the TechNews Online at: http://technews.acm.org
ACM TechNews
April 19, 2006


Welcome to the April 19, 2006 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Sponsored by Information, Inc.

http://www.infoinc.com/sponsorthenews/contactus.html


HEADLINES AT A GLANCE:

Academia Dissects the Service Sector, But Is It a Science?
IT Employment Reaches Record High in U.S.
Microsoft Launches Competitor to Google Scholar
New RFID Travel Cards Could Pose Privacy Threat
Code for 'Unbreakable' Quantum Encryption Generated at Record Speed Over Fiber
Graphical World Opens for Visually-Impaired People
A Year Later, Dual-Core Software Is Still Lacking
The Corporate Toll on the Internet
Binghamton University Research Links Digital Images and Cameras
Bringing Free Software to the Masses
IUS Aims to Apply Computer Skills to Real-World Problems
Cerf: Governments to Participate In, Not Dominate, Net
DHS Still Gearing Up Response to Cyberthreats
Does Open Source Encourage Rootkits?
Life in Silico: A Different Kind of Intelligent Design
Research Revolution
The (Not So) Hidden Computer

Academia Dissects the Service Sector, But Is It a Science?
New York Times (04/18/06) P. C1; Lohr, Steve

The field of service sciences is emerging as a recourse for U.S. technology professionals who face increased competition for their jobs from workers in rapidly developing nations such as India and China. The field is best described as a fusion of technology, management, engineering, and mathematics with a service sector, such as retail or health care, that is looking to improve its operations by leveraging technical expertise. Various universities, including Stanford, Rensselaer Polytechnic Institute, and the University of California, Berkeley, have begun offering courses or research programs dedicated to services science. More than 75 percent of Americans work in some area of services, but education and research in the field have lagged behind. While workers with a narrow set of technical skills are likely candidates for outsourcing, enhancing those skills with management training and specialized knowledge of a specific industry helps distinguish young job seekers and adds value to their profile. "This is how you address the global challenge," said Jerry Sheehan of the Organization for Economic Cooperation and Development. "You have to move up to do more complex, higher-value work." The NSF has established a program to fund university research in services science, and Berkeley will launch a certificate program for graduate students this fall. Blending fields to create a unified discipline will be challenging, however, particularly given scientists' historic disdain for the heavily anecdotal curricula taught in business schools; it is worth noting that when the term "computer science" first appeared decades ago, it likewise rankled traditionalists who felt it was unworthy of a scientific designation. Among the many potential applications of services science are networked sensors to improve transportation, tiny implants that could monitor the biological processes of living organisms, and new methods of data-tracking in marketing and e-commerce.
Click Here to View Full Article - Web Link May Require Free Registration
to the top


IT Employment Reaches Record High in U.S.
InformationWeek (04/18/06) Chabrow, Eric

U.S. IT employment reached a record high of 3.472 million workers in the period ending March 31, according to an analysis of data drawn from the Bureau of Labor Statistics by InformationWeek. That figure surpassed earlier high-water marks set in the previous quarter and at the close of the third quarter in 2001. The IT labor force, including both employed and unemployed tech professionals, reached 3.56 million, its highest point since the end of 2001. The rate of IT unemployment stood at 2.5 percent last quarter, the lowest mark since dipping to 2.2 percent at the end of 2000, and well off the rate of 3.7 percent posted at the end of the first quarter of 2005. "A lot of SOX projects are completed, and now more business-driven projects--as opposed to finance-driven IT projects--are coming out that are integral to companies," says the Yoh Group's Jim Lanzalotto. IT has also seen an increase in specialization. Meanwhile, the percentage of computer programmers within the IT workforce has fallen from 21 percent at the height of the boom in 2001 to 17 percent today, while the proportion of highly trained software engineers in the overall tech workforce has risen from 21 percent to 24 percent in the same period. The national unemployment rate stood at 4.7 percent in March. The Labor Department considers the following IT-related professions in its analysis: computer and IS managers, computer scientists and systems analysts, computer programmers, software engineers, computer support specialists, database administrators, network and systems administrators, and network systems and data communication analysts.
Click Here to View Full Article
to the top


Microsoft Launches Competitor to Google Scholar
TechWeb (04/12/06) Sullivan, Laurie

Microsoft has launched the beta version of its new Windows Live Academic Search tool, teaming up with more than 10 publishers of research materials, such as ACM, as well as the industry association CrossRef, the nonprofit behind a citation-linking platform that provides legal access to millions of academic articles and other content from scholars and publishers. The tool, which will compete with Google Scholar and SciFinder Scholar, uses the library standard OpenURL to connect researchers to subscription-based content and the Open Archives Initiative (OAI) protocol to index repositories that comply with it. By clicking an OpenURL link, researchers are routed through their library's link resolver to determine whether they have access to the full text. Scholars in the United States, the United Kingdom, Germany, Italy, Spain, Australia, and Japan can access the beta version, which is currently limited to computer science, electrical engineering, and physics content.
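For illustration, here is a minimal Python sketch of the kind of OpenURL link such a service might emit; the resolver address and article metadata are invented, and a real link would point at the reader's own library resolver.

from urllib.parse import urlencode

# Hypothetical article metadata; a real resolver base URL would come from
# the reader's library, not from the search engine itself.
resolver_base = "https://resolver.example-library.edu/openurl"  # hypothetical
metadata = {
    "url_ver": "Z39.88-2004",                       # OpenURL 1.0 version tag
    "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # journal article format
    "rft.genre": "article",
    "rft.atitle": "An Example Paper on Search",
    "rft.jtitle": "Communications of the ACM",
    "rft.volume": "49",
    "rft.spage": "30",
    "rft.date": "2006",
}

# The academic search result would embed a link like this; the library's
# resolver decides whether the reader has full-text access.
link = resolver_base + "?" + urlencode(metadata)
print(link)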
Click Here to View Full Article
to the top


New RFID Travel Cards Could Pose Privacy Threat
CNet (04/18/06) McCullagh, Declan

The embedded computer chips that might be used in government-issued travel cards can be read at a distance of up to 30 feet, according to Jim Williams, director of the Department of Homeland Security's US-VISIT program, posing a potential threat to privacy. The chips, which use RFID technology, could appear in cards used by Americans to enter Canada and Mexico as early as 2008. Privacy advocates have already voiced concerns about RFID technology, and a California politician has introduced legislation restricting its use. The concerns diminish with chips that can only be read from a few inches away; at a distance of 30 feet, however, sensors hidden along a road could theoretically read them, as could a stranger passing on the street. The disputed cards, known as "PASS" (People Access Security Service) cards, are being issued as part of a government initiative requiring anyone traveling across the Canadian or Mexican border to carry alternative travel documentation. A procurement notice issued by the Department of Homeland Security stipulated that the devices be readable from a distance of at least 25 feet, and that the "IDs be read under circumstances that include the device being carried in a pocket, purse, wallet, in traveler's clothes, or elsewhere on the person of the traveler." The State Department appears to prefer a proximity-based card rather than a remotely readable RFID-enabled device. RFID chips will begin appearing in U.S. passports in October, and proposals to implant them in driver's licenses have met with staunch opposition from privacy advocates. Despite similar criticism of the e-passport initiative, the State Department's Frank Moss says the chips used will only be readable from 10 centimeters and will employ a cryptographic technique known as Basic Access Control.
Click Here to View Full Article
to the top


Code for 'Unbreakable' Quantum Encryption Generated at Record Speed Over Fiber
NIST News (04/18/06)

Researchers at the National Institute of Standards and Technology (NIST) have generated raw code for quantum encryption at a record speed of more than 4 Mbps over 1 kilometer of optical fiber, doubling NIST's previous record. Using individual photons of different orientations to create a steady binary code, or encryption key, the NIST quantum key distribution (QKD) method could lead to ultra-secure transmission of video and other data over conventional high-speed networks. The researchers attained the record with an error rate of only 3.6 percent, and the next step is to process the raw key into a secret key at roughly 2 Mbps. "This is all part of our effort to build a prototype high-speed quantum network in our lab," said NIST physicist Xiao Tang. "When it is completed, we will be able to view QKD-secured video signals sent by two cameras at different locations. Such a system becomes a QKD-secured surveillance network." Though tested over a shorter distance, the NIST system operates faster than previously reported systems developed by other groups. Through two fiber channels linking two PCs in a lab, the system transmits photons whose quantum states represent ones and zeros, adjusting for changes in environmental conditions such as temperature and vibration. Lasers produce the series of single photons that carry the raw key material over a one-way channel, and the researchers had to develop a workaround to correct the distortion the photons suffered when passing over curved fiber. Once the raw key has been created and error-corrected, it is hashed in a privacy amplification step that ensures only the sender and intended recipient see the key in its entirety, and the resulting secret key is used to encrypt and decrypt the video signals.
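To make the workflow concrete, here is a toy Python sketch of the sifting, error-estimation, and privacy-amplification steps described above. It is not the NIST system; the photon count, error rate, and hash choice are illustrative only.

import hashlib
import secrets

N = 10_000  # number of photons "sent" in this toy run

# Sender picks random bits and random measurement bases (0 = rectilinear, 1 = diagonal);
# the receiver picks bases independently.
alice_bits  = [secrets.randbelow(2) for _ in range(N)]
alice_bases = [secrets.randbelow(2) for _ in range(N)]
bob_bases   = [secrets.randbelow(2) for _ in range(N)]

# When bases match, the receiver reads the sender's bit (with a small channel error);
# mismatched bases are discarded during "sifting".
ERROR_RATE = 0.036  # roughly the 3.6 percent error rate quoted above
sifted_alice, sifted_bob = [], []
for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        b_bit = a_bit ^ (secrets.randbelow(1000) < ERROR_RATE * 1000)
        sifted_alice.append(a_bit)
        sifted_bob.append(b_bit)

# Estimate the error rate by publicly comparing a sample of the sifted key.
sample = len(sifted_alice) // 10
errors = sum(a != b for a, b in zip(sifted_alice[:sample], sifted_bob[:sample]))
print(f"sifted key: {len(sifted_alice)} bits, estimated error rate: {errors / sample:.1%}")

# Privacy amplification: hash the remaining raw key down to a shorter secret key so an
# eavesdropper's partial knowledge is diluted (real systems use universal hash families;
# SHA-256 here is purely illustrative).
raw = "".join(map(str, sifted_alice[sample:]))
secret_key = hashlib.sha256(raw.encode()).hexdigest()
print("derived secret key:", secret_key[:32], "...")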
Click Here to View Full Article
to the top


Graphical World Opens for Visually-Impaired People
IST Results (04/19/06)

The IST-funded Interactive Tactile Interface (ITACTI) project has developed a device that uses a graphical display to open the world of images to the visually impaired, much as Braille makes text accessible. Today's Braille displays use electro-magnetic or piezo-electric forces to raise and lower the dots that form Braille letters, but they show only one line at a time; the new device instead employs electro-rheological fluids, which change from liquid to semi-liquid when stimulated with a charge. Developing fluids with the right properties was the central challenge the project faced, said Sami Ahmed, managing director of the Smart Technology Group, which produced them. "We use these types of fluids in other applications, but it took quite a lot of work to get the specification we required for this device," Ahmed said. Historically, price has been a barrier to graphical interfaces for the visually impaired, but because the new device produces an entire screen at a time, costs stay low. The device also contains system controls and software that raise and lower the dots so that users are presented with a complete page of text or graphics. Even reading single lines could become easier, since current displays replace text once it is read, making it inconvenient to go back and reread passages. The main advantage of the new device will be graphics, however: the visually impaired will be able to absorb spreadsheets and the meaning of icons, which could open up a host of new job opportunities. The device also handles both input and output, so a user could read a Web page and then follow a hyperlink, and users can trace the screen with their fingers so the interface functions as a mouse.
Click Here to View Full Article
to the top


A Year Later, Dual-Core Software Is Still Lacking
Investor's Business Daily (04/18/06) P. A8; Detar, James; Seitz, Patrick

A year ago Tuesday, the first PCs powered by Intel dual-core chips appeared in stores, but the software for the chips has evolved at a disappointing pace, industry executives say. While hardware often precedes software on the development curve, too great a gap could dissuade consumers and investors from embracing dual-core technology. Developing software for dual-core chips is difficult, says analyst Roger Kay. Even so, Intel plans to incorporate dual-core technology into 70 percent of the notebooks and desktops it powers by the end of the year, and into 85 percent of servers, said the company's Rob Crooke. Until software developers optimize their applications for dual-core processors, however, consumers are unlikely to see much benefit. "Any time you move software code to a new model, it isn't something you just snap your fingers and do," says Margaret Lewis of AMD, which debuted its first dual-core chip three days after Intel's roll-out. "The reality is that people might put into a plan: 'I'm going to create multithreaded code.' But that code could be three months to 18 months from being released." The principal benefit to materialize from the chips so far has been the increased speed at which a PC can run multiple applications simultaneously; for a single application to run faster, its code must be written so that its internal functions can be broken up to run at the same time. Programs such as spreadsheets are not likely to see any benefit from dual-core technology, which will primarily improve video, graphics, and other applications that demand a lot of processing power. For most consumers, the stronger selling point is the ability to run multiple programs more efficiently.
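A minimal Python sketch of that last point, using an invented workload: a task only benefits from a second core if its work can be split into pieces that run at the same time. Real dual-core gains depend on the application and how its code is written.

from multiprocessing import Pool
import time

def partial_sum(bounds):
    # Invented CPU-bound workload: sum of squares over a range.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    N = 10_000_000

    # Single-core version: one process does everything.
    t0 = time.perf_counter()
    serial = partial_sum((0, N))
    t1 = time.perf_counter()

    # Parallel version: split the range in half and let two worker
    # processes run at the same time, one per core.
    with Pool(processes=2) as pool:
        halves = pool.map(partial_sum, [(0, N // 2), (N // 2, N)])
    parallel = sum(halves)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial:   {t1 - t0:.2f}s")
    print(f"parallel: {t2 - t1:.2f}s  (speedup depends on having two or more cores)")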
Click Here to View Full Article - Web Link to Publication Homepage
to the top


The Corporate Toll on the Internet
Salon.com (04/17/06) Manjoo, Farhad

Many critics fear that Internet users' activities will be controlled by phone and cable companies, and they say nowhere is that threat more evident than in AT&T's plan to charge online businesses to send data into subscribers' homes through its high-speed DSL lines. Amazon.com's Paul Misener says this will lead to a subtle form of discrimination in which broadband firms "prioritize content from some content companies over others." Web firms, Internet policy experts, and engineers are calling for federally mandated network neutrality rules that require broadband companies to treat all online data equally. Online companies claim the absence of such rules will reduce competition for online content and applications, and eventually limit consumer choice. On the other hand, AT&T's Jim Cicconi argues that regulating broadband service would stifle network build-outs by making it impossible for broadband companies to recoup the cost of new DSL line deployments. Broadband companies have promised neither to block access to the public Internet nor to downgrade quality of service, but network operators are setting up networks that can differentiate traffic into fast, medium, and slow lanes according to how much the data's senders have paid. Phone companies say the advantages of this approach are greater Internet efficiency and lower broadband costs for residential consumers. Another argument cited by opponents of neutrality regulation is the need for better network management to ensure that advanced Internet applications--particularly audio and video--perform well.
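As a rough sketch of the "lanes" idea, the toy Python scheduler below always serves higher-paying tiers first; the tier names and packets are invented for illustration, not drawn from any operator's design.

import heapq

# Packets are dequeued strictly by the priority tier their sender has paid for.
TIER_PRIORITY = {"fast": 0, "medium": 1, "slow": 2}

class TieredQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # preserves arrival order within a tier

    def enqueue(self, tier, packet):
        heapq.heappush(self._heap, (TIER_PRIORITY[tier], self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = TieredQueue()
q.enqueue("slow", "video chunk from a small site")
q.enqueue("fast", "video chunk from a paying partner")
q.enqueue("medium", "ordinary web page")

for _ in range(3):
    print(q.dequeue())   # the paying partner's traffic leaves the queue first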
Click Here to View Full Article
to the top


Binghamton University Research Links Digital Images and Cameras
EurekAlert (04/18/06)

Researchers at Binghamton University, State University of New York, have developed a technique that links digital images to the camera that took them, as well as detects forged images. The technology works much like the way forensic examiners use tell-tale scratches to connect bullets to the gun that fired them. Jessica Fridrich, associate professor of electrical and computer engineering, says the technique would make it more difficult for child pornographers to avoid prosecution. "The defense in these kinds of cases would often be that the images were not taken by this person's camera or that the images are not of real children," says Fridrich. "Sometimes child pornographers will even cut and paste an image of an adult's head on the image of a child to avoid prosecution." After discovering that original digital pictures contain a weak noise-like pattern of pixel-to-pixel non-uniformity, Fridrich, Jan Lukas, and Miroslav Goljan developed an algorithm to extract and define the unique fingerprint of a camera so that information about the origins and authenticity of individual images can be gathered. In a test involving 2,700 images and nine digital cameras, they achieved 100 percent accuracy in linking pictures to the camera that took them.
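A simplified Python sketch of the underlying idea (not the Binghamton algorithm itself): average the noise residuals of several images from one camera to estimate its fingerprint, then correlate a new image's residual against it. The synthetic data below stands in for real sensor non-uniformity.

import numpy as np

def noise_residual(img):
    """Image minus a crude denoised (3x3 box-blurred) version of itself."""
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(
        padded[1 + dy : 1 + dy + img.shape[0], 1 + dx : 1 + dx + img.shape[1]]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0
    return img - blurred

def fingerprint(images):
    """Average residual over many images from the same camera."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def correlation(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Synthetic demo: a fixed per-pixel pattern stands in for sensor non-uniformity.
rng = np.random.default_rng(0)
pattern = rng.normal(0, 0.05, (128, 128))           # hypothetical camera fingerprint
shots = [rng.random((128, 128)) + pattern for _ in range(50)]
fp = fingerprint(shots)

same_camera  = rng.random((128, 128)) + pattern
other_camera = rng.random((128, 128))
print("same camera: ", round(correlation(noise_residual(same_camera), fp), 3))
print("other camera:", round(correlation(noise_residual(other_camera), fp), 3))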
Click Here to View Full Article
to the top


Bringing Free Software to the Masses
ZDNet UK (04/13/06) Marson, Ingrid

In a recent interview, Peter Brown, the executive director of the Free Software Foundation (FSF), discussed his thoughts on the forthcoming draft of the GPL and the foundation's campaign to end digital rights management (DRM). Brown claims not to have written a program since he was 14, instead bringing a background in management and finance to the FSF when he took a part-time job there in 2001 doing mostly administrative work. Brown eventually took on more responsibility, managing the GPL compliance lab before being elevated to executive director of the foundation in February. One of Brown's main challenges is promoting the foundation's mission in the mainstream press, so that the message of free and open computing reaches beyond the strictly technical community. With the second draft of GPLv3 due in June, Brown says the license will be "the first stake in the ground against DRM," noting that the campaign against digital restrictions is all the more timely given the upcoming Windows Vista and recent revelations that certain CDs will not play on certain players because of DRM. Though the campaign will not be overtly political, Brown notes that there is some poor legislation that should be changed, but that the FSF, with the help of a professional campaigner, will focus more on mobilizing people at a grass-roots level to boycott products or picket certain organizations. Brown also noted that, unlike other free or open-source projects, the FSF is highly legalistic, and believes that securing its assets through copyright agreements with developers is crucial to protecting computing freedom. The FSF generates revenue through memberships, donations, and the sale of merchandise such as t-shirts and books; its founder, Richard Stallman, refuses to take a salary and lives off speakers' fees and award money.
Click Here to View Full Article
to the top


IUS Aims to Apply Computer Skills to Real-World Problems
Courier-Journal (04/18/06) Kaukas, Dick

Indiana University Southeast will begin offering a degree program in informatics this fall, enabling students to apply their knowledge of computer science to other disciplines such as business, psychology, or health information. "I'm confident (that the program will be popular) because anything is usually more fun when you can solve some sort of problem with it," said Joe Hollingsworth, a 12-year veteran computer science professor who is coordinating the program. As an example of informatics at work, Hollingsworth developed a football program that can help a coach analyze games and improve the performance of his team by providing statistics such as the percentage of first downs on which the opposing team ran or passed. Informatics students with a concentration in criminal justice could help police identify areas of unusual criminal activity or work with overbooked courts to optimize the scheduling of complex cases. IUS Chancellor Sandra Patterson-Randles described the program as "an ideal course of study for the student who wants to combine a strong understanding of information technology with how that technology applies to the larger world." Indiana University began offering an informatics program at its Bloomington campus and at Indiana University Purdue University Indianapolis in 2000, before bringing it to the South Bend campus in 2002. Since then other schools have adopted similar programs.
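A toy Python version of the football example, using invented play data, shows the kind of computation involved.

# Each play records the down it occurred on and whether the opponent ran or passed.
plays = [
    {"down": 1, "type": "run"},
    {"down": 1, "type": "pass"},
    {"down": 2, "type": "pass"},
    {"down": 1, "type": "run"},
    {"down": 3, "type": "pass"},
    {"down": 1, "type": "run"},
]

first_downs = [p for p in plays if p["down"] == 1]
runs = sum(1 for p in first_downs if p["type"] == "run")
print(f"first-down runs:   {runs / len(first_downs):.0%}")
print(f"first-down passes: {1 - runs / len(first_downs):.0%}")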
Click Here to View Full Article
to the top


Cerf: Governments to Participate In, Not Dominate, Net
Computerworld New Zealand (04/16/06) Bell, Stephen

ICANN Chairman Vint Cerf finds it natural that governments should seek to have a say in the Internet, given how it has spread to all corners of the Earth, but at the same time he lauds their restraint in assuming the unaccustomed role of stakeholder on equal footing with technical, academic, and private-sector groups, rather than that of decision maker. Along with the internationalization of the Web comes the need for internationalized domain names (IDNs), which ICANN is working on. Two ways of representing non-Roman alphabets in domain addresses, one using DNAME records and the other using NS records, will be tested by year's end. ICANN's delay has prompted some countries to introduce IDN capabilities to their own country-code top-level domains (ccTLDs). Cerf declined to give a timeline for the Web-wide introduction of IDNs. "Once you publicize the availability of such names, people will immediately want to start registering names, and as a registrar, you will need to have amended your procedures," he says. Cerf also notes that, amid the controversy surrounding the proposed .xxx domain, ICANN's Government Advisory Committee (GAC) is developing a standard process for evaluating new generic top-level domains (gTLDs), which should be ready by the end of the year.
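For context, non-ASCII domain labels travel through the DNS in an ASCII-compatible "Punycode" form defined by the IDNA standard, whichever delegation mechanism (DNAME aliases or ordinary NS records) carries the test. A short Python sketch with an invented German label:

label = "bücher"                      # a non-ASCII label as a user would type it
ascii_form = label.encode("idna")     # IDNA/Punycode encoding built into Python
print(ascii_form)                     # b'xn--bcher-kva' -- what the DNS actually carries

# Decoding goes the other way, back to the Unicode label users see.
print(ascii_form.decode("idna"))      # bücher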
Click Here to View Full Article
to the top


DHS Still Gearing Up Response to Cyberthreats
Government Computer News (04/17/06) Jackson, William

Andy Purdy, acting director of the U.S. Department of Homeland Security's (DHS) National Cyber Security Division, says the United States faces serious vulnerabilities to cyber-attack and the department still has a long way to go in addressing them. Purdy, speaking at the 2006 International Conference on Network Security, says the department wants to create a plan for how to organize a quick response to a significant cyber-attack. Its second priority is to develop a way to disseminate cybersecurity and attack-related information among government agencies and companies. DHS has been working with the IT industry to forge a comprehensive protection plan for critical national IT infrastructure, but it has yet to get far. Purdy says such efforts are hindered by a lack of organization. He says, "There are so many players, so many different people doing different things." DHS also recently created a cybersecurity position at the level of DHS assistant secretary, and Purdy says the department is working with the White House to fill it now.
Click Here to View Full Article
to the top


Does Open Source Encourage Rootkits?
Network World (04/17/06) Vol. 23, No. 15, P. 1; Messmer, Ellen

In its recently published report, "Rootkits," McAfee identifies a ninefold increase in the number of rootkits collected as malware samples this quarter compared with the same period last year, attributing the spike to the activities of the open-source community. Nearly all the rootkits McAfee identified are designed to conceal code, such as spyware or bots, or to mask applications operating in Windows systems. "The predominant reason for the growth in use of stealthy code is because of sites like Rootkit.com," said McAfee's Stuart McClure. Rootkit.com has 41,533 members who anonymously post rootkit source code. Site operator Greg Hoglund counters that the site exists as a resource for anti-virus firms and others who want to learn about rootkits, and that anyone with strictly malicious motives would be foolish to post on the site, because the rootkit would be held up to public scrutiny and detection. Hoglund admits, though, that with tens of thousands of users, some are likely to be more interested in exploiting vulnerabilities than in using the site for educational purposes. Because it draws on a massive brain trust, the open-source community is critical to exposing new vulnerabilities and developing better code, says TrendMicro's David Perry, who nevertheless allows that Rootkit.com attracts many would-be hackers who use it to shop for tools. Hoglund says there are probably only 20 to 30 main types of rootkits, though there are numerous variants. Rootkit detection and eradication have become frontiers in software research, and while some rootkits are nearly impossible to eradicate, Rootkit.com has made it easier for people to use the software designed to find and eliminate them, said Komoku CTO James Butler. A major fear in the security industry is that a hacker will soon be able to scan networks with a worm and deliver a piece of malware that could wipe out files or alter data while remaining hidden by a rootkit.
Click Here to View Full Article
to the top


Life in Silico: A Different Kind of Intelligent Design
Science (04/14/06) Vol. 312, No. 5771, P. 189; Krieger, Kim

Just as engineers create shapes and structures in the virtual world with AutoCAD, systems biologists could soon have a tool that enables them to design and build models of living organisms to simulate how they would react under programmable environmental conditions. A team of Harvard University researchers will soon unveil "Little b," what they believe to be the first truly modular assembly program. While the emerging field of computer-simulated biological systems has led to advances in the pharmaceutical industry, Harvard mathematician Jeremy Gunawardena, who is leading the Little b team, believes that it is still in its infancy and in dire need of standardization. While there are numerous standardization efforts underway to enable the sharing of completed biological models, Gunawardena envisions a future in which researchers can share each piece of a model, enabling them to mix and match computer code at the level of individual proteins. If Little b fulfills its developers' expectations, it would enable researchers to build an organism from a repository of modular components. The Little b team is approaching the project from the perspective of a software engineer, with due understanding of the complexities of creating software with built-in modularity. Computational biologists can already share models of biological systems with the Systems Biology Markup Language (SBML), though it is only a file format that ensures completed models are compatible and requires no modularity, while Little b is a full modeling language. "It's the Holy Grail to take all these individual modeling components, plug them together, and get a comprehensive view of what's going on," said Daniel Gallahan of the National Cancer Institute. Other researchers are eager for a program that will take the redundancy out of the perpetual re-creation of computer modeling tools for individual experiments, though some take issue with its basis in the somewhat obscure language Lisp, often used for artificial intelligence applications.
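As a rough illustration of the modular-composition idea (in plain Python rather than the Lisp-based Little b), the sketch below merges two invented reaction "modules" into one model and simulates it with simple mass-action kinetics; the reactions and rate constants are made up.

# A reaction is (consumed species, produced species, rate constant).
kinase_module      = [("A", "A_active", 0.5)]
phosphatase_module = [("A_active", "A", 0.2)]

def compose(*modules):
    """Merge independently written modules into one model."""
    return [rxn for module in modules for rxn in module]

def simulate(reactions, state, dt=0.01, steps=1000):
    # Crude Euler integration of mass-action kinetics.
    state = dict(state)
    for _ in range(steps):
        delta = {s: 0.0 for s in state}
        for src, dst, k in reactions:
            flux = k * state[src] * dt
            delta[src] -= flux
            delta[dst] += flux
        for s in state:
            state[s] += delta[s]
    return state

model = compose(kinase_module, phosphatase_module)
print(simulate(model, {"A": 1.0, "A_active": 0.0}))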
Click Here to View Full Article - Web Link May Require Free Registration
to the top


Research Revolution
InformationWeek (04/10/06) No. 1084, P. 63; Ricadela, Aaron; Claburn, Thomas

Web labs sponsored by Microsoft, Google, and Yahoo! are becoming incubators of technological innovation and helping facilitate a move away from product/service development characterized by disconnection between researchers, engineers, and executives. The benefits of such an approach include faster and cheaper product development, greater room for more purely technical research, and more effective collaboration between product groups. Web labs' stock is rising as many traditional centers of industrial computer science research are in decline, while government agencies are dedicating less and less funding to research. Microsoft's Live Labs focuses on a wide array of projects, such as search algorithms that employ user demographic profiles to deliver results and serve ads; online visualization software that can zoom in and out of high-resolution areas for closer study of digital maps, video clips, and photo albums; and scientific tools for data mining and grid computing. Google's Alan Eustace says his company's strategy is to distribute smart people throughout the firm rather than erect a wall between research and product development. Among the projects Google's research group is working on is a speech-powered search engine and software that can furnish Chinese-English Web page translation. Yahoo!, meanwhile, is designing and pricing its products in consultation with economists, social scientists, and psychologists from top universities. The company has also set up a stable of research groups around the world, and has recruited Defense Advanced Research Projects Agency (DARPA) AI expert Ron Brachman to beef up its labs.
Click Here to View Full Article
to the top


The (Not So) Hidden Computer
Queue (04/06) Vol. 4, No. 3, P. 22; Coatta, Terry

Purpose-built systems are growing more complex, which makes it more challenging to hide computers and address user interface and integration issues, writes independent consultant Terry Coatta. "We have to start taking the lessons learned from building general-purpose computer systems and apply them to the software that is being developed for purpose-built systems," Coatta explains. A bottom-up approach to purpose-built system software development may seem the sensible way to go, but consumer needs are favoring a top-down model to contend with greater complexity. Consumers will no doubt demand a wider array of functionality from devices, but Coatta says designers must be sure to prevent the additional features from downgrading the device's primary function. What is more, the device's purpose must be clearly defined. Coatta reasons that more developers will be designing software that operates on devices other than an average PC. "We must address an abundance of design issues: Human factors, power consumption, security, integration with other devices, whether to strip down a standard operating system to be robust enough or to build something more specific from the ground up, whether or not to use special hardware for certain functions," the author concludes.
Click Here to View Full Article
to the top


To submit feedback about ACM TechNews, contact: [email protected]

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to [email protected] with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: [email protected]

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: [email protected]

to the top

News Abstracts © 2006 Information, Inc.


© 2006 ACM, Inc. All rights reserved. ACM Privacy Policy.