
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 630:  Monday, April 12, 2004

  • "More Cash Flowing to Robotics Research"
    Associated Press (04/11/04); Sheehan, Charles

    Robotics research is taking off thanks to technological advances and renewed interest from the federal government. While core supporters have consistently pushed robotics forward in the past, the field is generating a lot of publicity with recent military action. The Defense Department is using unmanned aerial vehicles and exploratory robots in current operations, and is funneling millions of dollars into university research programs. The goal is to develop unmanned air, sea, and land vehicles that would keep soldiers out of danger. Many of the casualties in Iraq, for example, have been due to roadside bombs and surface-to-air missiles. Under orders from Congress, the Defense Department intends to make one-third of its ground vehicles unmanned by 2015, including reconnaissance, transport, and fighting vehicles. U.S. troops are already using a 42-pound PackBot to explore caves in Afghanistan, while the Marines have developed another robot half that size. The Defense Advanced Research Projects Agency (DARPA) currently sponsors more than 40 robotics projects, including a Carnegie Mellon effort to build the Spinner, a five-ton autonomous fighting vehicle, and a Caltech neuroprosthetics program that would interface human brains with vehicle controls. Caltech engineering professor Joel Burdick warned, however, against unrealistic expectations on the part of sponsors and the public; in the 1980s, many manufacturers were disappointed by robotic arm performance. "You can't throw robots into a process," Burdick says. "You've got to know how to integrate the technology." The media portrayed the recent DARPA Grand Challenge as a big failure, while the researchers involved saw it as an important step forward, says Virginia Tech professor Charles Reinholtz. Another robotics competition this July is expected to garner more media attention as teams compete to fly an autonomous aircraft 1.8 miles in an urban setting, enter a specified building, and photograph a target inside.
    Click Here to View Full Article

  • "Smart Net Drops PCs, Keyboard, and Mouse"
    Computerworld Australia (04/08/04); Gedda, Rodney

    Australian researchers are working to integrate technology with social and behavioral research, including software that allows groups of people to collaborate digitally without a keyboard, mouse, or traditional computer. Called Project Nightingale, the effort is backed by the Smart Internet Technology joint venture, the University of Sydney, and National ICT Australia. The software is the key element behind the technology, says University of Sydney Professor Aaron Quigley. It includes a data component for managing data exchange between different interfaces, a context component that provides situational information such as bandwidth and network latency, and an application component. Quigley used a large touch-sensitive display from Mitsubishi Electric Research Labs to demonstrate the Project Nightingale technology, showing how multiple users could simultaneously share and edit digital content; he moved and resized a collection of images on the device. A smart Internet coupled with personal LANs would allow people to become content producers, loading information onto a personal server for work and sending it back into the Internet ether when they are finished. Quigley says the software was developed on Linux because the operating system could be stripped down to bare components for quick testing and allowed for a lean personal server; SQLite serves as the data manager (a minimal sketch of such a data component appears below). The Smart Internet Technology group--an Australian joint venture made up of public, private, and academic interests--is also working on new character-recognition interfaces for the Internet, which would allow non-technical users to intuitively access computer applications using an infrared pen, for example. Users could write commands, which would be interpreted by the computer, and receive text-to-speech responses.
    Click Here to View Full Article
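    A minimal sketch of what such a SQLite-backed data component might look like; the schema, table, and function names are illustrative assumptions, not Project Nightingale's actual design:

        import sqlite3

        # Open (or create) the personal server's lightweight content store.
        conn = sqlite3.connect("personal_server.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS content (
                            id INTEGER PRIMARY KEY,
                            owner TEXT,           -- which user placed the item
                            kind TEXT,            -- e.g. 'image' or 'note'
                            payload BLOB,         -- the shared data itself
                            bandwidth_hint TEXT   -- context info such as link quality
                        )""")

        def publish(owner, kind, payload, bandwidth_hint="unknown"):
            """Load an item onto the personal server for later sharing."""
            conn.execute("INSERT INTO content (owner, kind, payload, bandwidth_hint) "
                         "VALUES (?, ?, ?, ?)", (owner, kind, payload, bandwidth_hint))
            conn.commit()

        def items_for(owner):
            """Return everything a given user has published."""
            return conn.execute("SELECT kind, payload FROM content WHERE owner = ?",
                                (owner,)).fetchall()

        publish("quigley", "image", b"...jpeg bytes...", bandwidth_hint="lan")
        print(items_for("quigley"))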

  • "In the Trenches With Antivirus Guru Mikko Hypponen"
    E-Commerce Times (04/07/04); Millard, Elizabeth

    F-Secure director of antivirus research Mikko Hypponen is one of the best virus hunters, a type of researcher that is fairly obscure. Hypponen has been working in computer security for 13 years and says his assembly language skills have come in handy for reverse-engineering viruses. However, assembler skills are not widely taught anymore because there is little demand and they are tough to learn. He says that "very few people need such low-level skills anymore. It's all C and C++ nowadays." Still, he believes that universities will soon start teaching about malicious code and how to analyze it. Right now, he says, university computer science departments focus on some aspects of computer security, such as cryptography, but often do not teach students how to parse and analyze malicious code. Hypponen sees an evolution of computer worms and viruses, from the era of boot viruses to macro viruses to email worms. He predicts that email worms will be replaced by network worms as soon as next year, and that fewer worms will be written by havoc-minded teens and more by those wanting to make money by stealing data or installing spam proxies. Teens still write most viruses, but the biggest outbreaks seem to come from more organized groups. From a technical standpoint, the most challenging viruses and worms Hypponen has fought have been SMEG and Zmist, because they modified themselves spontaneously, though the recent Bagle/Mydoom/Netsky variants have been extremely tiring because so many continue to emerge.
    Click Here to View Full Article

  • "Spamhaus Proposal Aims to Stop Spam"
    InformationWeek (04/07/04); Gardner, W. David

    Anti-spam organization Spamhaus has submitted an application to ICANN for a .mail top-level domain in the belief that its proposed "server-to-server" scheme can prevent spam from reaching email servers. The Spamhaus server-to-server approach is a seamless arrangement that works behind the scenes to keep spam at bay, says Spamhaus' John Reid. As Spamhaus envisions it, .mail users would register with The Anti-Spam Community Registry, which would be staffed by Spamhaus volunteers. The system relies on sending-server operators registering a .mail domain; receiving-server operators would then look up the sender's IP address and other domain information to verify the transmission (a minimal sketch of such a lookup follows below). This process would allow the receiving server to "easily determine if the sending server is spam-free, as well as determine if the email was forged," according to the application Spamhaus submitted to ICANN. The proposal would not only stop spam but also resolve the current problem of filters preventing "good" email from getting through, says Reid. Spamhaus, which intends to get started quickly if ICANN approves its application, has already contacted some of the more prominent email-server software providers and developers about working on the project. An array of noted anti-spam activists would sit on the Anti-Spam Community Registry's board of directors.
    Click Here to View Full Article
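    The following is a minimal, hypothetical sketch of the kind of receiving-server check the proposal describes. The .mail zone layout and record type used here are assumptions for illustration (the registry was never specified or deployed in this form); the lookup uses the dnspython library:

        import dns.resolver  # from the dnspython package

        def sender_looks_registered(sender_domain, connecting_ip):
            """Illustrative check: is the connecting IP among the addresses
            published under the sender's hypothetical .mail registry entry?"""
            try:
                answers = dns.resolver.resolve(sender_domain + ".mail", "A")
            except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
                return False  # sender never registered with the .mail registry
            return connecting_ip in {rr.to_text() for rr in answers}

        # A receiving server might reject or grey-list mail that fails the check.
        print(sender_looks_registered("example.com", "192.0.2.10"))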

  • "The Pure Software Act of 2006"
    Technology Review (04/04); Garfinkel, Simson

    Spyware is perhaps more insidious than other malware such as viruses and worms, since it mixes commerce and deception in a way morally abhorrent to most computer users, writes Simson Garfinkel. While viruses and worms are clearly illegal, spyware that tracks users' online activity and computer use is often authored and distributed by legitimate companies, and with customer consent. But today's click-wrap license agreements fall far short of the labeling regimes in other industries, such as that established by the Pure Food and Drug Act of 1906, which required manufacturers to clearly state ingredients and product weight and to avoid deceptive labeling. Software needs similar labeling to help consumers make more informed decisions about what they are installing on their computers. Almost by definition, spyware hides its true purpose, though other software programs with similar functions go out of their way to make clear what they do. Google's Toolbar for Internet Explorer, for instance, urges users to read the license agreement carefully so they understand that their browsing activity will be fed back to Google in order to retrieve the "page rank" for each site visited. A hypothetical Pure Software Act of 2006 would require the Federal Trade Commission to come up with labeling standards and rules for their use. Software labeling would have to convey important information without glutting consumers with too much data; simple icons could denote potentially unsavory features such as remote control, unremovable programs, computer-use monitoring, pop-up ads, or modifications to the operating system. Importantly, such a labeling regime would have to be mandatory: companies such as Google currently do a good job of voluntarily informing users of software features, but unscrupulous firms do not.
    Click Here to View Full Article

  • "Email Attack Could Kill Servers"
    New Scientist (04/06/04); Knight, Will

    Computer security experts at NGSSoftware have discovered a way to disable email servers by using forged emails with thousands of incorrect addresses in the "copy to" field. The researchers found that sending these emails to large email servers ricochets enormous quantities of unwanted email back at the server listed as the sender, as long as the first machine is configured to return an email and its attachments for each incorrect address. NGSSoftware researcher Gunter Ollman says the email is forged to look as though it comes from the targeted server, and the flood of bounced messages generally makes that server crash. Experts say that 30 percent of Fortune 500 companies' email servers could be used for such an attack, and using an insecure server for the initial messages would make the attack almost impossible to trace. Ollman says that it should be simple to reconfigure mail servers to make them invulnerable to the attack, but he warns that unless large firms adjust their mailing architecture, even a handful of vulnerable companies is enough for the attack to work.
    Click Here to View Full Article

  • "ENUM Still Stalled in US"
    Computer Business Review (04/07/04); Murphy, Kevin

    Despite its technological feasibility and the endorsement of the U.S. government in February 2003, the ENUM standard, which maps telephone numbers used over the Internet into the domain name system (the core mapping is sketched below), has yet to be implemented in the United States due to a number of concerns. Insiders generally agree that ENUM will be a crucial element in the widespread adoption of VoIP, but political boundaries and revenue concerns over the operation of ENUM registries have stalled government efforts to deploy the standard. Most experts agree that ENUM technology is ready; Paul Mockapetris, chairman and chief scientist of Nominum and author of the DNS specification, says, "With regards to ENUM, everyone thinks that is a solved problem." So far, a group called the ENUM Forum has agreed that the government should give a limited-liability corporation the authority to manage ENUM. ENUM's "Tier 0" root, e164.arpa, is already being run by RIPE-NCC, which manages IP address allocation for Europe. RIPE delegates Tier 1 country codes to nations that apply for one, but the +1 dialing code that represents the United States also covers 18 other countries. To deal with this problem, NeuStar, the Virginia-based company that runs the North American Numbering Plan, may set up a "Tier 0.5" for certain local area codes under the Tier 1 registry. NeuStar plans to bid for management of the ENUM standard when it is deployed in the United States, and the company has already expressed concern over Pulver.com's request to launch a top-level domain (.tel) that would use ENUM.
    Click Here to View Full Article
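    ENUM's core mechanism maps an E.164 telephone number into a domain name by reversing the digits, separating them with dots, and appending e164.arpa; a VoIP client then queries NAPTR records at that name. A short sketch of the conversion (the sample phone number is invented for illustration):

        def enum_domain(e164_number):
            """Map an E.164 phone number to its ENUM domain (RFC 3761 style):
            keep the digits, reverse them, dot-separate, and append e164.arpa."""
            digits = [c for c in e164_number if c.isdigit()]
            return ".".join(reversed(digits)) + ".e164.arpa"

        # "+1-202-555-0123" -> "3.2.1.0.5.5.5.2.0.2.1.e164.arpa"
        print(enum_domain("+1-202-555-0123"))
        # A VoIP client would then query NAPTR records at that name to find
        # SIP or other URIs registered for the number.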

  • "Group Suggests 25 Ways to Improve IT Security"
    Government Computer News (04/06/04); Miller, Jason

    The Corporate Information Security Working Group released a report this week saying that new legislation, insurance changes, and public outreach efforts are needed to improve government and private-sector cybersecurity. The group, consisting of academic and industry members, offers 25 recommendations to improve IT security, drawn up at the request of Rep. Adam Putnam (R-Fla.), chairman of the House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census. Last autumn, Putnam drafted legislation that would require publicly traded companies to include an IT security plan status report with their filings to the Securities and Exchange Commission, but he did not sponsor the legislation and instead created the working group. "Since approximately 85 percent of this nation's critical infrastructure is owned or controlled by the private sector," Putnam explains, "I have worked to identify strategies that will produce meaningful improvement in the computer security of corporate America." The working group's subgroups focus on reporting, information sharing and performance metrics, procurement practices, education, incentives and liability, and best practices. The recommendations include enforcing provisions of the Federal Information Security Management Act, which requires agencies to uphold minimum security standards; identifying qualified, certified, or compliant third-party organizations; and providing an antitrust exemption for critical infrastructure industry groups that agree to obligatory security specifications for software and hardware.
    Click Here to View Full Article

  • "Here There Be Data: Mapping the Landscape of Science"
    National Science Foundation (04/06/04)

    Researchers from a number of different fields are working to map the entire scientific landscape using computers. Nearly 20 articles recently published in the Proceedings of the National Academy of Sciences describe different aspects of this effort, which would benefit not only scientific study but also students planning their education, business investment, and research agency funding. A map of science could one day become as common in schools as traditional political maps, depicting different disciplines as continents, with closely related areas on the same continent. The quickening specialization of science means that researchers need tools to help them link scientific results; currently, much of the scientific record remains uncharted territory for the scientific community. The recently published articles describe several ways to create science maps, including programs that read the meaning of articles and methods that link articles by citation (a toy citation-graph sketch follows below). Different types of maps are possible as well, such as maps showing the evolution of scientific networks. Some of the researchers point out that computer-generated maps might not correspond with human-assigned designations. Definitions on the map, such as borders and landmarks, could come from popularity as measured on the Web or from identifying groups of collaborators. Researcher Mark Newman demonstrated that social networking schemes could be used to create maps of the scientific community, and another paper used a collection of 1.8 million computer science articles to show how communities within that field have evolved over the last 10 years. Eventually, a comprehensive map of science could encompass hundreds of dimensions and draw from terabytes of information, including publications, grants, databases, patents, and other sources.
    Click Here to View Full Article
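    A toy sketch of the citation-linking idea: treat articles as nodes, citations as edges, and look for clusters. The data below is invented for illustration; real science-mapping studies operate on millions of articles and use far richer layout and community-detection methods than the simple connected-components pass shown here.

        import networkx as nx

        # Toy citation data: (citing paper, cited paper) pairs.
        citations = [
            ("p1", "p2"), ("p2", "p3"), ("p1", "p3"),   # one tight cluster
            ("p4", "p5"),                               # a separate small cluster
        ]

        g = nx.Graph()
        g.add_edges_from(citations)

        # Connected components are a crude first cut at "continents" on a map of
        # science; real maps add layout, weighting, and community detection.
        for component in nx.connected_components(g):
            print(sorted(component))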

  • "Swiss Diplomat Leads Debate Over Policing the Web"
    NZZ Online (04/06/04)

    At the behest of United Nations Secretary General Kofi Annan, Swiss diplomat Markus Kummer will help shape the international discussions on Internet governance that are expected to be held at the second World Summit on the Information Society in 2005. Kummer says he plans to create a Working Group on Internet Governance that will be charged with giving a definition to Internet governance and presenting a list of recommendations for discussion. The working group, which Kummer expects to be functional by the end of June, will be in charge of setting its own mandate. All stakeholders, including developing countries, will have a voice in the group's work. Kummer raises the possibility that criticism of the current state of Internet governance will lessen once the working group is able to study the issue. Based on the current loose definition of Internet governance, it seems likely that the working group will address issues that go beyond domain names and the Web naming system, Kummer says, noting that privacy, spam, network security, and consumer protection have become important topics. Addressing the status of ICANN, Kummer says that by no means is there an intention to destroy the organization or to cut off communications with U.S. authorities.
    Click Here to View Full Article

  • "Rajan's Spirit"
    Siliconindia (04/04) P. 36; Shankar, Pradeep

    NASA is using MAPGEN (Mixed-Initiative Activity Plan GENerator) to instruct the Mars Exploration Rovers on what maneuvers and scientific tasks to perform each day on Mars. MAPGEN is able to automatically create conflict-free plans and schedules for science and engineering activities; assist in hypothesis testing (what-if analysis for various scenarios); support plan editing; analyze the usage of resources; and perform constraint enforcement and maintenance. The software helps NASA maximize use of the rovers during their expected operational lifetime of about 90 Martian days, or sols, each of which offers roughly six hours of usable sunlight. "Between the time the rover goes to sleep and wakes up the next morning, Martian scientists are working frantically," says Kanna Rajan, a senior scientist at NASA's Ames Research Center in Moffett Field, Calif. Rajan and his team spent three years building the activity-planning tool, which is crucial to the Mars Exploration Rovers mission; he points out that if the software fails to execute for even one day, it would cost NASA $4.5 million. After scientists receive telemetry from the rovers and download data, they use MAPGEN to formulate a viable sequence of activities, in line with mission goals, for the rovers to undertake during the next sol. For example, MAPGEN knows not to have the rovers take PanCam pictures while they are moving, because the pictures would be blurred, and to conduct atmospheric sky campaigns when the sun is rising or setting rather than taking periodic samples of the sky during the night (a toy illustration of this kind of constraint checking follows below). Developed using C++, MAPGEN consists of APGEN, a multi-mission tool used for flight projects; Planner, a planning and scheduling system; and Constraint Editor, a science-intent capture tool.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
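    A toy illustration of the kind of flight-rule checking described above. The rules, names, and data structures are invented for illustration; NASA's actual MAPGEN is built in C++ on APGEN, Planner, and Constraint Editor and enforces far more constraints than this.

        # Invented flight rules, not NASA's actual MAPGEN constraints.
        def flight_rule_violation(activity, state):
            if activity == "pancam_image" and state["rover_moving"]:
                return "PanCam pictures would be blurred while driving"
            if activity == "sky_campaign" and state["sun"] not in ("rising", "setting"):
                return "atmospheric sky campaigns belong near sunrise or sunset"
            return None

        plan = ["drive_to_target", "pancam_image", "sky_campaign"]
        state = {"rover_moving": True, "sun": "noon"}

        for activity in plan:
            problem = flight_rule_violation(activity, state)
            print(activity, "->", problem or "ok")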

  • "Photo Recognition Software Gives Location"
    New Scientist (04/06/04); Randerson, James

    Researchers at the University of Cambridge in the United Kingdom have developed software that can recognize a nearby building from a photograph, determine your location, and then provide directions to your destination. Roberto Cipolla and Duncan Robertson envision cell phone owners using the photo recognition software: users of camera phones would take a picture of the nearest building and press send. The software matches the photograph against a database of three-dimensional models of real streets and determines where someone is standing to within one meter (a simplified feature-matching sketch follows below). Cipolla and Robertson tout the approach over GPS, which is accurate to about 10 meters and can fail when tall buildings block the line of sight to the satellites, and over cell phone base-station positioning, which offers precision of only 50 to 100 meters. In addition to providing directions, the software would also offer other information about the buildings that are photographed. Cipolla and Robertson are now working on a prototype that covers all buildings in Cambridge's city center, but there are some concerns about the commercial viability of the photo recognition software. "The question is: how much are people prepared to pay for it, and how often will they use it?" says Rob Morland of technology consultants Scientific Generics, near Cambridge.
    Click Here to View Full Article
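    A simplified, modern stand-in for the matching step: compare local image features between a query photo and candidate database images and pick the best match. The Cambridge system matched against three-dimensional street models; the OpenCV-based sketch below is only a two-dimensional illustration, and the file names are assumed.

        import cv2

        def best_matching_building(query_path, database_paths):
            """Return the database image whose ORB features best match the query."""
            orb = cv2.ORB_create()
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

            query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
            _, query_desc = orb.detectAndCompute(query, None)

            scores = {}
            for path in database_paths:
                image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
                _, desc = orb.detectAndCompute(image, None)
                if query_desc is None or desc is None:
                    continue
                scores[path] = len(matcher.match(query_desc, desc))
            return max(scores, key=scores.get) if scores else None

        # Hypothetical file names, for illustration only:
        # print(best_matching_building("snapshot.jpg", ["kings_college.jpg", "senate_house.jpg"]))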

  • "International Report: Taiwan"
    Information Today (03/04) Vol. 21, No. 3, P. 29; Poynder, Richard

    China and Taiwan are moving to join the International Commons (iCommons) project of Creative Commons (CC), a nonprofit group that gives content creators an opportunity to place their work in the public domain without losing control over how others use their copyrighted material. The iCommons initiative gives jurisdictions outside the United States the chance to apply the 11 Creative Commons licenses to local copyright laws. France, Japan, and the United Kingdom are among the nine jurisdictions that are localizing Creative Commons licenses, and 50 other jurisdictions have potential affiliate institutions. CNBlog.org, which promotes open collaborative research on the Internet, is heading the effort in China, and the Institute of Information Science Academia Sinica, a government-sponsored academic research institution, is leading the way in Taiwan. "With the various indigenous and Chinese legal traditions in Taiwan, the introduction of the CC licenses will induce a re-examination of the culture of knowledge sharing [and stimulate discussion] on the development of copyright law, international IP protection, and the relationship between humans and their creative activities," says Shunling Chen, co-project leader of the effort in Taiwan. Glenn Otis Brown, executive director of Creative Commons, says the one-size-fits-all approach of copyright law does not work well with the Internet. Approximately 1 million Web pages now use CC licenses. "Our goals are nothing less than to have the double-C (CC) become as familiar to the public as the standard copyright (c)," says Brown.

  • "Ease-of-Use Equals Use"
    Software Development (04/04) Vol. 12, No. 4, P. 38; Marcus, Aaron

    User-centered user-interface development (UCUID) is a process whereby user interfaces and their components are fleshed out through careful planning, research, analysis, design, deployment, evaluation, training, and maintenance according to usability principles, to ensure that products and services are helpful and appealing to the primary stakeholders--the users. The process begins by interviewing users: UI developers should precede interviews with brainstorming sessions to ascertain between five and nine different user types extrapolated from previous knowledge and user contact, after which the developers should set their learning goals and list interview questions. Useful interview questions may revolve around the kinds of problems users wanted to address and the steps they took to solve them; user satisfaction with the outcome; the people and roles involved; wasteful activities or conditions; and the critical success factors on which users are assessed and how they relate to system use. UI developers should summarize key operations and data for the users, and be vigilant for unanticipated and potentially valuable functions and content. Once the interviews are concluded, the next step is to model at least three user types, choosing one primary user type on which to concentrate UI development: a typical profile should include such information as the user's needs, responsibilities, usage patterns, characteristic behavior, and design implications. These profiles form the basis of several realistic use scenarios to be weighed by stakeholders. The next step is to carry out task/need analysis with target users and build worksheets that categorize tasks by group, ease of use, and any descriptive metaphors. The result is a UI designed as a series of layouts with particular labels and data fields that consider tasks, workflow, and user concerns; accurate representations of the user's world can be devised after a few organizational sessions, and the resultant prototype or demo can be mated to or checked against information collected via conventional software development methods.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Into Thin Air"
    Fast Company (04/04) No. 81, P. 76; Reingold, Jennifer

    Supporters claim that offshore outsourcing is crucial to the continued competitiveness of the United States in the global economy, leading to productivity boosts, lower prices, and increased demand for U.S. goods. The accuracy of this projection is of little comfort to American workers--many of them IT professionals--displaced by the migration of their jobs to countries where labor is cheaper: Gartner estimates that 500,000 IT jobs could be exported by year's end, while up to 25 percent of all IT professional jobs could be offshored by 2008. "If you're just laid off, you can tell yourself that the economy swings back and forth, but if it's outsourced offshore, it ain't coming back," laments software engineer Myra Bronstein. The growth of outsourcing and its extension to white-collar professions has many highly skilled people fearing for their jobs' security, although well-educated U.S. workers who get outsourced often elicit less sympathy because they commanded such high salaries at the height of the IT job boom. The backlash against offshoring has prompted lawmakers to propose legislation to impose limits, but most companies call such measures protectionist, and critics argue that these policies will only spark further job erosion. Nevertheless, offshoring will linger as long as it offers solid cost savings, which it does: Firms can shave off 20 percent to 70 percent of their labor costs by transferring jobs to low-wage countries, according to one estimate. Other companies and experts believe the government and the private sector should be more proactive in helping displaced U.S. workers get back on their feet by amending the Trade Adjustment Assistance Reform Act of 2002 to accommodate more white-collar jobs, for example. At any rate, many American companies are keeping a low profile on their offshoring operations, while for many multinationals outsourcing cuts both ways: They risk alienating the American public by offshoring jobs, or earning the enmity of offshore markets if they limit or halt the hiring of foreign labor.
    Click Here to View Full Article

  • "Security Patching: Easy As 1-2-3"
    Network Magazine (03/04) Vol. 19, No. 3, P. 37; Greenfield, David

    Network managers will have a much easier job protecting Web applications if two new security protocols automate vulnerability description, assessment, and patching. However, the XML-based Application Vulnerability Description Language (AVDL) and Web Application Security (WAS) standards are not being developed in coordination with one another and could have interoperability problems. Cooperation between the standards' developers could come some time in the future, says WAS technical committee chair Mark Curphey, who is also consulting director of Foundstone. If the two standards work together, security vendors could publish bulletins that organizations subscribe to via middleware, using the WAS and AVDL standards to define and carry the security information, says Computer Incident Advisory Capability (CIAC) security analyst Jon Diaz. As it is now, network managers spend inordinate amounts of time deciphering application vulnerability scan reports, then reconciling those reports with their particular firewall's rule wizard, and finally rewriting firewall rules to secure the Web application. AVDL is supported by the Organization for the Advancement of Structured Information Standards (OASIS) and five lesser-known security vendors; WAS has the backing of major vendors such as CheckPoint and would likely win out if the two protocols cannot be reconciled. WAS tackles the more difficult challenge of defining tasks, as opposed to simply communicating information about attacks between devices. IETF Transport Layer Security working group co-chair Eric Rescorla, also founder of security consulting firm RTFM, warns that automated vulnerability descriptions could be harnessed by hackers to quickly create new attack tools based on freshly discovered vulnerabilities.
    Click Here to View Full Article

  • "The God Particle and the Grid"
    Wired (04/04) Vol. 12, No. 4, P. 114; Martin, Richard

    Once it is up and running in 2007, CERN's Large Hadron Collider (LHC) in Switzerland could generate as much as 10 petabytes of data annually in its search for the elusive Higgs boson, or God particle. Processing that data is the job of the LHC computing grid, a virtual network of clustered physics centers, large institutions, research facilities, and individual workstations lashed together by ultra-high-speed connections. The grid nodes will agree to contribute a portion of their computing resources in exchange for LHC collision data, and Caltech physicist Harvey Newman believes those agreements will eventually become a "grid economy" whose computation, storage, and network resources are as accessible as any other commodity. The speed and network infrastructure for the LHC grid already exist: Caltech and CERN research teams successfully sent 1 terabyte of data across 4,400 miles in less than half an hour at 5.44 Gbps last October (a quick check of that figure appears below). Furthermore, there is plenty of cheap, unused dark fiber available that would allow the LHC to lease its own network rather than purchase bandwidth, though the 10 Gbps links the LHC grid will occupy still add up to tremendous costs. At the core of the grid's ability to supply supercomputing resources on demand is the open-source Globus Toolkit, which allows a grid to interpret a user request and then locate the appropriate computing resources autonomously, splitting the job into smaller operations, apportioning spare computing power, and working toward a solution. Enterprises are watching the LHC grid's development very closely, hoping that its success will usher in an age of ubiquitous on-demand supercomputing.
    Click Here to View Full Article
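    A quick back-of-the-envelope check of the quoted transfer figure, assuming decimal units: 1 terabyte at 5.44 Gbps works out to roughly 25 minutes, consistent with "less than half an hour."

        # Back-of-the-envelope check, assuming decimal units (1 TB = 8e12 bits).
        terabyte_bits = 8e12
        rate_bps = 5.44e9          # 5.44 Gbps
        seconds = terabyte_bits / rate_bps
        print(f"{seconds:.0f} s  (~{seconds / 60:.1f} minutes)")   # ~1471 s, ~24.5 min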


