Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 861:  October 31, 2005

  • "White House Urged to Make Cybersecurity a Priority"
    GovExec.com (10/27/05); Belopotosky, Danielle

    Cyber Security Industry Alliance executive director Paul Kurtz, speaking to a House Armed Services subcommittee on Thursday, called for a presidential directive making cybersecurity a top Bush administration objective and encouraging more coordination between the military and the private sector. Kurtz said, "We need a national policy to secure cyberspace." Others testifying before the committee argued that the current approach to cybersecurity is ineffective because it lacks research funding, has a shortage of suitable researchers, relies too heavily on vulnerable commercial software and hardware, and does not encourage coordination across sectors. Purdue University Center for Education and Research in Information Assurance executive director and professor Eugene Spafford lamented the reliance on commercial software and hardware, because most manufacturers of such products correct vulnerabilities with patches, or quick fixes, after release rather than eliminating them beforehand. Spafford believes a holistic view is the only way to prevent and effectively respond to a catastrophic cyber attack, which could affect the electric power grid as well as the telecommunications infrastructure. Spafford says, "These systems are interconnected, and we need to protect all of them." Intel's David Grawrock said more certified security professionals are needed. He said, "The number of professionals in the field seems to be shrinking and not expanding."
    Click Here to View Full Article

  • "Coming Soon to a Kernel Near You: GPL 3"
    eWeek (10/28/05); Galli, Peter

    The Free Software Foundation (FSF) will unveil the roadmap and process to oversee the first revision of the GNU General Public License in a few weeks. FSF's Eben Moglen said the first draft of GPL 3 will appear next year, but that he wanted the community to get acclimated to its governing rules before they see its content. Many in the community support the FSF's efforts to involve as broad a group as possible in the process, particularly as the GPL has come to dominate the realm of open source software. Moglen has said the process for GPL 3 will be implemented approximately one year from its announcement, to ensure that the new version will "hit the ground running." GPL 3 is designed to resolve persistent open source issues such as the licensing of intellectual property, patent issues, and how to treat software employed across a network. The Open Source Development Labs' Diane Peters sees the two central areas on which the revision must focus as substantial modifications to rights and obligations, and a clarification of the definitions of provisions that are not changing. "I anticipate those areas as being the ones in which many of the changes will be focused and debate will ensue; issues such as Web services, trusted computing, source code distribution requirements, and patent termination provisions," said Peters. There will be eight people at work full time on the GPL 3 process, with another 60 people slated to chair committees or have some high level of involvement. Moglen insisted that the process would not be sponsored, though some funds would be raised to help with the travel expenses associated with international conferences. Linux creator Linus Torvalds has expressed his support for the GPL, which, though he admits it is not perfect, "simply is the best license for the kernel."
    Click Here to View Full Article

  • "Ruby on Rails Chases Simplicity in Programming"
    CNet (10/31/05); LaMonica, Martin

    The simplicity and productivity offered by the Web development framework Ruby on Rails (RoR) have made it an increasingly popular choice among developers and executives. Denmark's David Heinemeier Hansson developed the tools, which he is releasing as an open source project. Hansson's strategy in developing the tools was to rethink the fundamental underpinnings of computer science to which programmers adhere, and travel a simpler path that removes the needless complexity that confounds much of today's software. Rather than creating an elaborate blueprint that could revolutionize the structure of Google or Amazon.com, Hansson instead turned his attention to more mundane designs and templates that address issues such as database alterations. Hansson is committed to proving that the applications required for the most complex problems are not always the most appropriate for simpler issues, and RoR has already turned the heads of many influential programmers. David Geary, a member of the technical committee for the most recent Java Web programming model, noted that RoR is five to 10 times faster than comparable Java applications. In creating RoR, Hansson sought to fuse the simplicity of PHP with the structure of Java to create an environment that produces code that can be written quickly and easily maintained over time. RoR has thrived on the growing popularity of scripting languages that could supplant Java. RoR uses AJAX development methods to create interactive Web pages, though the Burton Group's Richard Monson-Haefel notes that its practical application is limited to new, green-field uses or situations where the developer has total control over the database.
    Click Here to View Full Article

  • "IBM Researchers Take Axe to Computer Security"
    IDG News Service (10/28/05); McMillan, Robert

    IBM's Assured Execution Environment (Axe) project was the creation of Amit Singh, who set about looking for ways to simplify security in light of the recognition that the operational effectiveness of a PC was being curtailed by the cluttered multitude of security and management software. The program loads the Axe runtime software onto the kernel each time the PC is turned on, regulating the software that runs on the operating system and ensuring that it is only running authorized code. Unlike antivirus software, Axe blocks any code that was not written in a format compatible with Axe, which IBM's researchers say is almost impossible for authors of spyware and viruses to achieve. Axe is compatible with both the Windows and Mac OS kernels, and can be used to render data unreadable or shield Word or PowerPoint documents from unwanted viewers. Implementing Axe is not an all-or-nothing proposition, as users can configure it to allow unknown code to run if they approve it, or create a virtual environment where unknown code can run without doing harm. The Axe approach of creating a "whitelist" of preapproved programs will likely become more commonplace as the body of conventional antivirus software becomes too cumbersome to be practical, though some are concerned about the time-consuming re-registration process that could be necessary each time a software update is released. Axe will probably enjoy the most widespread use in applications where users do not need all of the functionality of their operating systems, such as point-of-sale or stock-trading machines.
    Click Here to View Full Article
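    The "whitelist" idea at the heart of Axe can be sketched in a few lines. This is a hypothetical illustration of default-deny execution control, not IBM's actual implementation: a loader computes each binary's cryptographic digest and allows it to run only if that digest appears on a list of preapproved programs.

    ```python
    import hashlib

    # Hypothetical whitelist: SHA-256 digests of approved binaries.
    # (This example digest is sha256(b"test"), standing in for a trusted program.)
    APPROVED = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def digest(binary: bytes) -> str:
        """Fingerprint a binary by its SHA-256 hash."""
        return hashlib.sha256(binary).hexdigest()

    def may_execute(binary: bytes) -> bool:
        """Default-deny: run only code whose digest is preapproved.

        The inverse of an antivirus blacklist -- unknown code is
        blocked even if no signature identifies it as malicious."""
        return digest(binary) in APPROVED
    ```

    The contrast with signature-based antivirus is the direction of the check: a blacklist must enumerate the (unbounded) set of bad programs, while a whitelist enumerates the (small, known) set of good ones, at the cost of re-registering binaries whenever they are updated.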

  • "UMass Researchers Fight Fraud With Software"
    Daily Collegian (10/27/05); Marx, Hayden

    Researchers at the University of Massachusetts, in partnership with the National Association of Securities Dealers (NASD), are developing fraud-detection software that promises to improve the ability to predict fraud among brokers. Current software focuses solely on a broker's individual history, but the program developed at UMass' Knowledge Discovery Laboratory (KDL) also takes into account the history of the brokers they come in contact with, a strategy similar to predicting the spread of an infectious disease, says David Jensen, an associate professor of computer science and KDL director. The software uses relational probability trees, which consider the characteristics of related objects, to compile information, and then builds a model of the suspected relationships. Predictions are based on organizational relationships in the securities industry, linking brokers to firms, customer complaints to brokers, and branches to parent firms. The results matched many of the brokers that appear on NASD's Higher-Risk Broker List, and identified new ones. "That it performs as well as live examiners is fascinating," says John Komoroske, vice president of the NASD.
    Click Here to View Full Article
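    The relational idea — scoring a broker partly by the records of the brokers they work alongside, the way an epidemiologist traces contacts — can be illustrated with a toy neighbor aggregation. The data, field names, and weighting below are invented for illustration and are not the KDL software or its actual features:

    ```python
    # Toy relational risk score: a broker's score combines their own
    # complaint count with the average complaints of co-workers at the
    # same branch (the "contacts" in the infectious-disease analogy).
    brokers = {
        "A": {"complaints": 0, "branch": "X"},
        "B": {"complaints": 3, "branch": "X"},
        "C": {"complaints": 0, "branch": "Y"},
    }

    def risk_score(name: str, alpha: float = 0.5) -> float:
        """Own history plus alpha-weighted average of peers' histories."""
        me = brokers[name]
        peers = [b for n, b in brokers.items()
                 if n != name and b["branch"] == me["branch"]]
        peer_avg = (sum(p["complaints"] for p in peers) / len(peers)
                    if peers else 0.0)
        return me["complaints"] + alpha * peer_avg
    ```

    Here broker A has a clean individual record but shares a branch with a heavily complained-about colleague, so A scores higher than the equally clean but isolated broker C — exactly the kind of case a purely individual-history model would miss.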

  • "Spray on Sensors"
    Engineer (10/25/05)

    Researchers in Scotland are building a programmable network composed of autonomous sensors that can perform computations and communicate with one another. The Speckled Computing Consortium plans to develop thousands of the tiny sensors, called "Specks," each with its own renewable energy source, and scatter them to serve in low-power sensing applications. The Specks could serve as lighting and temperature sensors in buildings, as monitors on aircraft wings to detect problems, and as aids in medicine bottles to ensure that medications are taken at the proper time. The idea is to embed the Specks in objects, or scatter or spray them on surfaces or on a person. The consortium consists of physicists, chemists, electrical engineers, and computer scientists from the Universities of Glasgow, Edinburgh, St. Andrews, and Strathclyde; the Ultrafast Systems Group at Glasgow will contribute the ultra-low-power radio and high-efficiency solar cell technologies for the project.
    Click Here to View Full Article

  • "Remote Control Device 'Controls' Humans"
    Associated Press (10/26/05); Kageyama, Yuri

    The Japanese company Nippon Telegraph & Telephone (NTT) is developing video game technology, known as galvanic vestibular stimulation, in which a remote control guides the movements of a person. The subject wears a headset that delivers a low-voltage jolt of electric current to the ears and head; depending on the movement of a joystick, the current travels either to the left or the right, and the subject walks in the corresponding direction. NTT senior research scientist Taro Maeda says, "We call this a virtual dance experience." There is no clear scientific explanation for why humans respond to electric currents in the ear, and though the researchers have demonstrated the ability to make a subject walk in the shape of a pretzel with the device, they have yet to formulate plans for a commercial application. The researchers hope a similar program that controls a subject's movements according to music will catch the attention of Apple's researchers for use with the iPod. Boston University biomedical engineering professor James Collins says gamers likely would "get a kick out of the illusions that can be created to give them a more total immersion experience as part of virtual reality."
    Click Here to View Full Article

  • "ICANN Prez Welcomes New Era of Internet"
    Register (UK) (10/26/05); McCarthy, Kieren

    ICANN CEO Paul Twomey is heralding a new deal reached between ICANN and VeriSign as a "new era" for the Internet. The deal extends VeriSign's contract to run the .com domain name registry until 2012 and extends VeriSign's presumptive right over the domain past that period as long as the company complies with the conditions of the contract. The deal is expected to help end the legal battle between ICANN and VeriSign that has been going on since VeriSign launched the controversial SiteFinder service two years ago. A gagging clause in the new contract requires VeriSign to voice support for ICANN publicly. The agreement also includes new provisions for .com dispute resolutions; specifically, competition disputes will be handled by the relevant country's public authority, while technical issues will be handled by a new international technical body. Twomey says the deal with VeriSign has absolutely nothing to do with the extension of the .net domain contract, but it does show that ICANN has "been able to work through an outcome and get a solution applicable throughout the world." He says the ability of ICANN to relatively quickly reach an agreement with VeriSign is another example of the benefits of the current Internet governance arrangement. Nevertheless, he would not say whether ICANN favors the vision of Internet governance espoused by the U.S. or that proposed by Europe ahead of next month's World Summit on the Information Society. Twomey says ICANN's focus right now is "on completion of the MoU and continuing to internationalize ICANN itself."
    Click Here to View Full Article

  • "Deciphering the World of Crypto"
    Network World (10/24/05) Vol. 22, No. 42, P. 1; Messmer, Ellen

    The Internet Engineering Task Force (IETF) has turned its attention toward standards for cryptographic algorithms such as Triple-DES and AES. IETF does not research and test cryptographic algorithms itself, leaving those tasks to government organizations with the counsel of outside experts, though the group ensures that only secure algorithms find their way into its protocols. IETF is also evaluating standards from other countries, such as Russia, Japan, and South Korea, and has already awarded RFCs to South Korea's SEED and Russia's GOST. SEED has enjoyed use in VPN applications and digital rights management. GOST is Russia's national standard, but was recently modified to enhance its interoperability, though many view the Soviet-era algorithm as archaic, despite the fact that it has yet to be broken. Russia is applying GOST to the public-key infrastructure project at its National Treasury to address document coding and signing. GOST is currently being considered for implementation in OpenPGP. IETF standardization is widely viewed as helping an algorithm gain popularity, as well as improving its interoperability by fleshing out its technical depth.
    Click Here to View Full Article

  • "Bring on the Geekettes"
    Maclean's (10/25/05); Bourette, Susan

    New research continues to support the belief that environment is more likely than chromosomes to determine an individual's success in math and science. Although research presented at the American Sociological Association meeting in Philadelphia in August showed women were well represented in the fields of health and education, experts believe girls would pursue science-related fields more if it were stressed to them that science is another practical way to help people. As a result, over the past decade educators have worked to fine-tune the way they teach and their curriculum to ensure that girls remain interested in science. Universities are setting up multidisciplinary programs that allow students to take hard science courses as well as other subjects. In Ontario, engineering programs have added biology, which is a popular subject among girls, to the list of course options, and they are also promoting the humanistic values of engineering to high school students. Maria Klawe, dean of engineering at Princeton University and a past president of ACM, played a key role in boosting the share of computer science degrees earned by women at the University of British Columbia from 22% five years ago to 42% today for combined biology and computer science degrees. "If you really want to change the world, you've got to consider a career in computer science," says Klawe, who oversees a department whose first-year class is 32% female.
    Click Here to View Full Article

  • "Where Am I, Robot?"
    Defense News (10/24/05) Vol. 20, No. 41, P. 52; Peniston, Brad

    MIT computer science associate professor John Leonard has developed a robot with the watershed ability to create a real-time map of its location based on data collected from its sensors. Simultaneous location and mapping (SLAM) has long been a goal of robotics, as robots can perform their missions more effectively with better maps. Leonard decided that aggregating data to create a large map did not improve a robot's performance, and turned his attention to creating smaller, more detailed maps that would update instantly as the robot moves. The Atlas pattern-matching algorithm helps robots recognize territory they have already visited when they approach it from a different direction. In the testing stage, the robot initially struggled with turning corners and interpreting angles, though it corrected itself once it tapped into its repository of existing maps and recognized a pattern. While the test was in a two-dimensional environment with no obstacles, the robot's ability to navigate without the aid of humans or GPS technology offers vast new potential for robotic applications. Since the first test, Leonard has been experimenting with three-dimensional applications, such as equipping robotic submersibles with SLAM that could hunt for mines.
    Click Here to View Full Article

  • "Power Search"
    Federal Computer Week (10/17/05) Vol. 19, No. 36, P. 58; Sternstein, Aliya

    The federal government has issued a call for new search technologies to replace its current standards for information management. When the General Services Administration (GSA) and the Office of Management and Budget (OMB) consider new search technologies, they will look for those that can perform location, retrieval, and sharing functions at a minimal cost. The technology must be able to span the government to inform the searches of users who may not be familiar with every agency's jurisdiction, and incorporate relevant nongovernmental information that may not be available through mainstream search engines. The request for information from the GSA and OMB comes at a time when the large commercial search engines are abandoning the Government Locator Service, a 10-year-old standard the National Institute of Standards and Technology recently declared obsolete. The request also calls for alternatives to government-mandated standards, as well as reasons why government standards are inefficient or unnecessary. In addition to drawing on private industry, OMB has also received proposals from government entities, such as the Interagency Committee on Government Information's suggestion of open and interoperable standards that would assist agencies in cataloging information in such a way as to make it retrievable through normal search terms. The committee recommended a searchable identifier standard to provide long-term access to digital information that would be flexible enough to adapt to new technologies while still providing authoritative access. The Government Printing Office is also considering new information retrieval and sharing technologies, and expects to have a system in place to offer Web browsing, downloading, printing, and search by July 2007.
    Click Here to View Full Article

  • "Managing Metadata"
    InfoWorld (10/24/05) Vol. 27, No. 43, P. 33; Udell, Jon

    There are various ways to define and maintain metadata, and each comes with its own distinct challenges. Programmers incorporate an informal type of metadata known as comments into their source code to help human readers understand the software's design and operations, but comments can also be employed more formally to flag the characteristics of and relationships between software components. Java and .NET programs contain their own metadata descriptions and can also reveal intrinsic metadata about their objects, along with those objects' attendant types and properties; this allows the programs to dynamically cooperate with other programs. Metadata can also be embedded in Web sites and documents, and one way to manage this metadata is through social tagging. Social tagging can be used to tap non-confidential, free-form resources as hubs for communities of interest, and extract the relationships between things by exploiting the communities' collective ability; but social tagging cannot flourish without critical mass, which is why the data must be freely shared and a large population must interact with the data. Numerous efforts are under way to incorporate metadata into file systems for the purpose of personal information management, such as Microsoft's WinFS and the World Wide Web Consortium's Semantic Web. The W3C's initiative has so far yielded little of any value, which is unsurprising given the scale and complexity of the Web; WinFS, meanwhile, faces the challenge of mating a relational database to a file system and erecting bridges between WinFS types, among other things. The key to closing the gap between these various metadata deployments may lie in the nascent field of service-oriented architecture.
    Click Here to View Full Article
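    The "intrinsic metadata" the article attributes to Java and .NET — a running program's ability to report its own objects' types and properties without any external description — is available in any reflective language. A minimal sketch in Python (the class here is invented for illustration):

    ```python
    import inspect

    class Document:
        """A tiny class whose structure a program can discover at runtime."""
        def __init__(self, title: str):
            self.title = title

        def render(self) -> str:
            return f"<doc>{self.title}</doc>"

    doc = Document("hello")

    # Intrinsic metadata, queried from the object itself:
    type_name = type(doc).__name__                    # "Document"
    attributes = vars(doc)                            # {"title": "hello"}
    signature = inspect.signature(Document.render)    # (self) -> str
    ```

    It is this kind of self-description that lets two programs that have never seen each other's source code negotiate at runtime which types and methods they share — the "dynamic cooperation" the paragraph above refers to.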

  • "Sweating in the Hot Zone"
    Fast Company (10/05) No. 99, P. 61; Kirsner, Scott

    Symantec currently owns the bulk of the consumer antivirus software market thanks to its fast response to new viruses, worms, and other kinds of malware. The nature of malware impels Symantec to constantly update its products and collect around 20,000 virus samples a month. The company's Security Response Center consists of investigators who dissect the collected malware and then update their software to shield computers against the latest threats. A new set of "signatures" that identify the most dangerous security threats and tell computers how to thwart them is generated about 30 times a day, and the emergence of new threats ignites a race between Symantec teams around the world to expediently deconstruct the threat and craft a signature. Symantec and other antivirus software providers frequently share the information they accumulate with law enforcement agencies when malicious code is especially dangerous. Researchers can more effectively take apart new viruses and anticipate future threats by understanding virus writers' motivation, which can vary among individuals, according to Symantec virus writer profiler Sarah Gordon. "Usually, the virus writer is a young person who doesn't recognize the impact of what they're doing," she says; factors Gordon cites as motivation include a desire for revenge, notoriety, or tackling the technical challenge. Symantec instituted substantial changes to its staffing makeup when a series of major threats emerged simultaneously in August 2003.
    Click Here to View Full Article

  • "The Forgotten Era of Brain Chips"
    Scientific American (10/05) Vol. 293, No. 4, P. 66; Horgan, John

    Brain-stimulation research pioneer Jose Delgado's work has faded into obscurity despite the enormous debt modern brain-implant technology owes him. Delgado invented and implanted radio-equipped electrode arrays or "stimoceivers" into animals and humans, and demonstrated that the devices could be used to modify--and even control--the subjects' physical and mental behavior with electrical stimulation. Tests showed that the stimoceivers could change a patient's emotional state by stimulating different areas of the limbic system, while specific physical responses could be triggered via stimulation of the motor cortex. But other researchers cast Delgado's work in an ominous light, raising the issue that brain-stimulation technology could be used to control people's minds and eliminate undesirable behavior. Delgado later focused on less invasive approaches to brain stimulation, and his research in this area led to the invention of headgear that could send electromagnetic pulses to specific regions of the brain. The waning attention paid to Delgado's research, along with lingering ethical questions, discouraged brain-stimulation studies in the U.S., but the past 10 years have seen a resurgence in brain-implant research. Delgado's expectations for the technology are modest compared to those of modern researchers, who envision such mind-bending applications as instantaneous knowledge "downloading," telepathic communication, and thought-controlled machines. He still strongly believes the technology will be most helpful as a tool to control mankind's inherent aggression and help people suffering from psychiatric and neurological disorders.

  • "Our Broadband Fiasco"
    Chief Executive (10/05) No. 212, P. 16; Galuszka, Peter

    U.S. broadband deployment is being held up, and MIT visiting scholar Charles Ferguson warns that its slow rollout could cost the U.S. economy up to $1 trillion over 10 years. Although the country's size relative to other nations is a challenge for widescale broadband implementation, a far greater obstacle is the telephone service system. The Telecommunications Act of 1996 requires phone companies to allow broadband suppliers to piggyback on their networks, but these companies frequently refuse. In addition, there are regulations and taxes that upstart broadband suppliers do not have to comply with, but legacy companies do: One such rule forces incumbent firms to contribute to the federal government's universal service fund for providing phone service to inner city and rural areas; incumbents must also ensure that their lines can adequately support 911 emergency calls and are open to legal monitoring by law enforcement. There is a lack of federal legislation to address the many confusing issues associated with broadband technology, and some analysts believe incumbent telecoms are slowing down the broadband rollout by wielding their economic influence. Ferguson says these companies suffer from a dearth of high-tech and visionary management, while the Bush administration, unlike the previous administration, has undertaken a policy of noninvolvement, leaving broadband and the Internet in the hands of market forces. That policy shifted earlier this year with the appointment of Kevin Martin as head of the FCC. Martin has promised to prioritize broadband and recently led a vote to permit legacy telecoms to charge DSL providers market-based rather than government-set rates.