ACM TechNews sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 874:  December 5, 2005

  • "Nanotechnology Regulation Needed, Critics Say"
    Washington Post (12/05/05) P. A8; Weiss, Rick

    Emerging environmental concerns over nanotechnology have spurred calls from lawmakers, environmental groups, and industry leaders for increased federal regulation of a growing body of materials that are being used in consumer products. Regulation has been slowed by how much remains unknown about the materials and by questions about who should shoulder regulation's financial burden. The rules of physics and chemistry change at the nanoscale, which can alter the chemical, electrical, and physical properties of commonly used materials; the threats come from the materials in particle form, rather than from the materials as they appear in bulk. The nanomaterial industry, expected to surpass $1 trillion a year within 10 years, branches into a wide variety of consumer products, such as sports equipment, cosmetics, computers, and food wrappings. An EPA proposal has called for manufacturers to notify the agency when they incorporate nanomaterials into their designs and to detail any information they have learned concerning environmental or health risks, though the notification would be voluntary. The proposal has drawn criticism from environmental advocates who believe that disclosure should be mandatory, though the agency argues that scientists' knowledge of nanomaterials is still insufficient to determine the best approach to regulation, and that a voluntary program could be implemented far more quickly than a compulsory notification system. In response to alarming discoveries, such as the finding that carbon nanotubes can increase cell death, some lawmakers have called for raising funding for environmental, safety, and health research on nanomaterials to between 10 percent and 20 percent of the overall nanotechnology budget. The financial impact of increased regulation on smaller manufacturers remains to be determined.
    Click Here to View Full Article

  • "Europe Has the Hot Hand"
    Associated Press (12/03/05)

    A coalition of European researchers has developed a prototype of the first prosthetic hand able to evoke instinctive sensory signals; it could be attached to a human arm within two years. Cyberhand, which attempts to simulate the feel of touching objects, is the product of Europe's burgeoning robotics research community, a field that is still struggling to secure adequate financing as it takes aim at competitors in Asia and the United States. The European approach is steeped in concern for the social and ethical dimensions of robotics, and is generally seen as more cautious than Japanese efforts. Electrodes and biomimetic sensors connect the hand to the nervous system, making it responsive to its environment, and scientists have included multiple layers of synthetic material to create an aesthetically realistic simulation of human features. Europe's team-oriented research approach is credited with addressing the issues of sensory feedback, electrodes, and command control and processing, which have been largely ignored by the more independent development process in the United States. Another European project, known as HYDRA, is creating the world's first robot that can alter its form: independent modules containing processors, batteries, sensors, and actuators can break off from the robot and then reattach themselves.
    Click Here to View Full Article

  • "NSF to Support Grid Software With $13M"
    University of Southern California (11/30/05); Mankin, Eric

    The NSF has granted $13.3 million of funding to the group of researchers who developed the open source grid software Globus Toolkit. The goal of the five-year project is to maintain and improve the software, which is widely used in science projects that involve large amounts of information both in the United States and abroad. Globus Toolkit is designed to amalgamate the operations and data of computers spread across the globe, harnessing their computing power, while also addressing security, data management, and resource discovery. The team of researchers from the University of Chicago and the University of Southern California will coordinate with the scientific community around the world to develop a priority list of enhancements to the software, which underpins such celebrated projects as the U.S. TeraGrid and the LHC grid. "Researchers and educators can now build on this software with confidence, knowing that a dedicated team is available to address problems and to enhance its capabilities as their needs evolve," said the University of Chicago's Ian Foster. Scientists hail Globus for its ability to handle massive quantities of information and its ability to integrate multiple scientific disciplines. Funding for the project comes from the NSF Middleware Initiative, created to support the enabling technology required by scientific projects. In addition to the international community of developers, Globus has drawn support from DARPA, the Department of Energy, IBM, and Microsoft.
    Click Here to View Full Article

  • "GPL Revision Guidelines Made Public"
    TechNewsWorld (12/02/05); LeClaire, Jennifer

    The Free Software Foundation (FSF) has issued a specification for the revision of the GNU General Public License (GPL), with a first discussion draft to be released in January 2006. The GNU GPL, which has not been revised for 15 years, accounts for almost 75 percent of open source software distributed throughout the world. Gartner predicts that by 2010, 75 percent of IT groups will have formal procedures for handling open source software. FSF founder Richard Stallman has said that the forthcoming revision, GPLv3, will ensure the protection of the four freedoms to which software users are entitled: studying, copying, modifying, and redistributing the programs they use. Throughout the revision process, the FSF will encourage public discussion of the issues surrounding open source software, publishing a second discussion draft by next summer and a final draft in the fall. "Throughout this process, all voices will be heard. We will evaluate every opinion and will consider all arguments in light of the GPL's goals. The process is accessible, transparent, and public for all those who want to participate," said Eben Moglen, general counsel to the FSF. The revision process will solicit the participation of open source community projects, Global 2000 companies, and individual developers and users. The GPL revision process aims to achieve the same transparency and communal participation as the open source model itself. The considerable increase in the use of open source software since the last update in 1991 is likely to boost participation in the revision process. Stephen Fronk, an intellectual property lawyer, said he expects the forthcoming revision to address software patents, the role of the GPL in the international arena, and the dissemination of code.
    Click Here to View Full Article

  • "Uncertainty Clouds Future of E-Vote Tests"
    Inside Bay Area (CA) (12/01/05); Hoffman, Ian

    There is broad recognition that e-voting machines are severely flawed, and it remains unclear how long it will be before secure, reliable technology emerges. Carnegie Mellon's Michael Shamos found that one-quarter of the machines submitted to Pennsylvania were not qualified for elections, citing such "glaring failures" as their inability to properly count votes. A recent study found that each e-voting system examined failed to record between 3 percent and 4 percent of votes. Shamos believes that as many as 10 percent of the systems in use could be failing on Election Day. Many of the systems that pass the certification process contain flaws, and often break down once in use. Despite the well-documented reservations about the testing process voiced by numerous experts, there is unlikely to be significant change in the next couple of years, as it will take between 12 and 18 months for the new standards being developed by the U.S. Election Assistance Commission to take effect. The modifications include stricter regulations on wireless communication and paper record systems that are easy to manage. Due to the lengthy delays in approving the purchase of new equipment, the elections in 2006 and 2008 will likely be conducted using the equipment currently in place. California is leading the charge among states seeking to impose tougher testing requirements, which arouses concern among voting system makers, who argue that increased testing will divert money that could be used to improve a system's performance. Vendors say the cost of preparing a system for use in a state election is already prohibitive; Diebold spent at least $250,000 fixing the "sliding finger" bug that caused one out of five systems to crash when a tester slid a finger across the screen.
    Click Here to View Full Article

  • "SNARFing Your Way Through Email"
    CNet (12/02/05); Fried, Ina

    To help manage the proliferation of unwanted and unread email, Microsoft has developed a software program known as SNARF (Social Network and Relationship Finder) that can organize messages based on the recipient's relationship with the sender, relying on the assumption that the most important messages come from people whom the recipient knows well. SNARF, currently in the research stage, is designed to improve on what Microsoft's Steve Smith describes as the current "ADD sort order" technique of organizing an inbox, where the most recent messages are automatically placed at the top, irrespective of how relevant the sender is to the recipient. While computational analysis of a social network may seem counterintuitive, Smith says that "social relationships are countable," and can be measured by such criteria as the frequency of correspondence between individual users. Social quantification techniques are not new, and have been the subject of research at Microsoft and other companies for years. SNARF also factors in whether a message was sent directly to the recipient, whether the recipient was copied, or whether the message had a large distribution list (a rough sketch of such a scoring scheme appears below). The software is currently available as a download for users to experiment with. Smith is also developing ways for users to further customize the software, such as tagging specific emails or senders as important regardless of how often they correspond. He is also exploring a version of the software designed for cell phones.
    Click Here to View Full Article
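
    The article names the signals SNARF weighs (how often the recipient corresponds with the sender, and whether a message was addressed directly, copied, or sent to a large list) without giving the actual formula. The fragment below is only a minimal illustration of that kind of relationship-based scoring; the weights, field names, and addresses are hypothetical, not Microsoft's implementation.

        # Illustrative relationship-based inbox ranking, loosely following the
        # criteria the article attributes to SNARF. All weights are made up.
        from collections import Counter
        from dataclasses import dataclass

        @dataclass
        class Message:
            sender: str
            to_me_directly: bool   # my address is on the To: line
            cc_me: bool            # my address is only on the Cc: line
            recipients: int        # total number of addresses on the message

        def rank_inbox(inbox, sent_mail):
            # "Social relationships are countable": how often I have written to
            # each correspondent serves as a proxy for how well I know them.
            familiarity = Counter(addr for addrs in sent_mail for addr in addrs)

            def score(msg: Message) -> float:
                s = float(familiarity[msg.sender])      # frequent correspondents rank higher
                if msg.to_me_directly:
                    s += 2.0                            # sent to me personally
                elif msg.cc_me:
                    s += 0.5                            # merely copied
                s -= 0.1 * max(msg.recipients - 1, 0)   # penalize large distribution lists
                return s

            return sorted(inbox, key=score, reverse=True)

        # A note from a frequent correspondent outranks a mass mailing.
        sent = [["alice@example.com"], ["alice@example.com"], ["bob@example.com"]]
        inbox = [Message("newsletter@example.com", False, False, 500),
                 Message("alice@example.com", True, False, 1)]
        print([m.sender for m in rank_inbox(inbox, sent)])

    In this toy ordering, the mail from "alice@example.com" is listed first because the recipient writes to her often and was addressed directly, inverting the recency-only "ADD sort order" described above.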

  • "Viral Cure Could Immunize the Internet"
    New Scientist (12/01/05); Kleiner, Kurt

    A network of "honeypot" computers distributed across the Internet could be used to immunize the Web faster than a computer virus can spread, according to Eran Shir and colleagues at Tel-Aviv University in Israel. The mathematical model presented in Shir's research applies network theory to a dedicated, secure network of honeypots, which look like ordinary computers to viruses on the Internet. Once a honeypot attracts a virus, however, all the others immediately know about it: the honeypot automatically analyzes the virus, produces the healing code, and distributes it to the broader network (a toy illustration of this dynamic follows below). The strategy works better on larger networks; the simulation shows that only 5 percent of a network of 50,000 nodes would be infected if 0.4 percent of the computers were honeypots, while in a network of 200 million nodes with the same proportion of honeypots, only 0.001 percent of computers would be infected. IBM computer scientist Jeff Kephart says the research is promising, though further analysis of its benefits and costs is needed. Shir plans to make an example program available soon and hopes a company or group will seriously consider implementing the model across the Internet.
    Click Here to View Full Article
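
    The figures quoted above come from Shir's simulations, not from the toy model below, which only sketches the basic dynamic: a virus hops between neighboring machines once per round, and any honeypot it reaches shares a cure over the honeypots' own secure channel and immunizes outward faster than the virus spreads. The topology, rates, and parameters are illustrative assumptions, not the published model.

        # Toy simulation of the honeypot "immunization" idea. All sizes, rates,
        # and the random contact graph are illustrative assumptions.
        import random

        def simulate(n=20_000, degree=3, honeypot_frac=0.004, rounds=100, seed=1):
            rng = random.Random(seed)
            neighbors = [[] for _ in range(n)]
            for u in range(n):                          # sparse random contact graph
                for v in rng.sample(range(n), degree):
                    if v != u:
                        neighbors[u].append(v)
                        neighbors[v].append(u)

            honeypots = set(rng.sample(range(n), max(1, int(n * honeypot_frac))))
            patient_zero = rng.choice([x for x in range(n) if x not in honeypots])
            infected = {patient_zero}
            ever_infected = set(infected)
            immune = set()
            cure_out = False

            for _ in range(rounds):
                # The virus spreads one hop per round to unprotected neighbors.
                hit = {v for u in infected for v in neighbors[u]} - immune
                if hit & honeypots:                     # a honeypot caught and analyzed it,
                    cure_out = True                     # and every other honeypot now knows
                infected |= hit - honeypots
                ever_infected |= infected
                if cure_out:
                    # The healing code spreads two hops per round from every protected
                    # node, so it outruns the one-hop-per-round virus.
                    front = honeypots | immune
                    for _ in range(2):
                        front |= {v for u in front for v in neighbors[u]}
                    immune |= front
                    infected -= immune
                if not infected:
                    break
            return len(ever_infected) / n

        print(f"fraction of nodes ever infected: {simulate():.3%}")

    In this toy model, holding honeypot_frac fixed while increasing n should reproduce the qualitative trend reported above: the cure overtakes the virus after roughly the same absolute number of infections, so the infected fraction shrinks as the network grows.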

  • "Finding a Balance Between Digital Copyright and Consumers' Rights"
    IST Results (11/29/05)

    Balancing consumer rights with protection against piracy is a tough task for copyright holders to manage, but new digital rights management (DRM) technology may make it easier. DRM technologies are designed to protect music, movies, video games, and broadcast television from illegal copying. "DRM technologies have to be interoperable if they are to safeguard consumer choice and avoid consumer lock-in, i.e. tying users to one system operating one type of DRM," says Miguel Dias of the ADETTI research institute. All of the DRM technologies have advantages as well as disadvantages, such as restricting consumers' fair-use rights, failing to prevent piracy, and invading users' privacy. Europe, the biggest audiovisual content market in the world, incurs more than 4.5 billion euros in piracy costs per year. Dias, who is also the editor of the DRM Requirements Report, says more effort needs to be put into DRM before it can be used effectively. Some insiders are optimistic about the emerging MPEG-21 standard, which is designed to communicate machine-readable license data in a "ubiquitous, unambiguous, and secure manner." Alexandre Cotarmanac'h, a researcher for France Telecom and coordinator of the DANAE project, is currently working on an advanced MPEG-21 chain. "With a CD you not only get the music but also the jacket that contains relevant information about the product," he says. "MPEG-21 aggregates that into a digital packet, which can also include copyright protection mechanisms." MPEG-21 is still being developed, and the VISNET project is working on fixing the issues that still plague the standard, such as its interoperability. Dias and Cotarmanac'h agree that more work needs to be done on standardization, but are pleased with the progress of MPEG-21.
    Click Here to View Full Article

  • "Hands-Free Car Takes Rage Off Road"
    Australian IT (11/29/05); Foreshew, Jennifer

    Australian researcher Ljubo Vlacic expects to see driverless vehicles on the road sooner or later. Vlacic, a professor in the School of Microelectronic Engineering at Griffith University and founder of the Intelligent Control Systems Laboratory, says researchers at the lab have already developed a prototype sensor that can detect moving obstacles on a road. The researchers are also developing electronics and control and decision-making algorithms that would allow vehicles and robots to operate on their own, and have tested algorithms for driving maneuvers both indoors and outdoors. The cyber-car system would use a series of sensors, a microprocessor, and software to handle signals, make decisions, and activate controls (see the sketch below). "Our technology relates to control and decision-making algorithms for a variety of driving maneuvers such as crossing an intersection, overtaking, and stop-and-go traffic situations," says Vlacic. Drivers would determine when to operate their vehicles in driverless mode. European Union officials believe driverless vehicles could be available in Europe as early as 2012.
    Click Here to View Full Article
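
    The article names the maneuvers the Griffith algorithms handle without describing them, so the fragment below is only a generic illustration of the sense-decide-act loop a cyber-car implies, applied to the stop-and-go (car-following) case; the gains, limits, and sensor fields are assumptions, not the lab's algorithms.

        # Minimal sense-decide-act loop for a stop-and-go driving situation.
        # Gains, limits, and the sensor model are illustrative assumptions only.
        from dataclasses import dataclass

        @dataclass
        class Sensing:
            gap_m: float        # distance to the moving obstacle ahead, metres
            own_speed: float    # m/s
            lead_speed: float   # m/s

        def decide(s: Sensing, headway_s=1.5, k_gap=0.3, k_speed=0.8) -> float:
            """Return a commanded acceleration (m/s^2) that holds a safe time gap."""
            desired_gap = 2.0 + headway_s * s.own_speed       # standstill margin + time headway
            accel = k_gap * (s.gap_m - desired_gap) + k_speed * (s.lead_speed - s.own_speed)
            return max(-3.0, min(1.5, accel))                 # comfort and braking limits

        def act(accel: float, own_speed: float, dt: float = 0.1) -> float:
            """Integrate the command over one control step; never reverse."""
            return max(0.0, own_speed + accel * dt)

        # One pass through the loop: too close and closing fast, so the car brakes.
        reading = Sensing(gap_m=6.0, own_speed=10.0, lead_speed=4.0)
        cmd = decide(reading)
        print(f"commanded acceleration: {cmd:.2f} m/s^2, "
              f"next speed: {act(cmd, reading.own_speed):.2f} m/s")

    A real system would close this loop continuously against live sensor data and add the intersection-crossing and overtaking logic the article mentions.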

  • "Bioinformatics Project Goes to Next Level"
    University at Buffalo Reporter (12/01/05) Vol. 37, No. 12; Fryling, Kevin

    The University at Buffalo's Center for Computational Research (CCR) is working to raise awareness of bioinformatics among young students in western New York through its Bioinformatics Workshop for High School Teachers. For the past several years, CCR has provided coursework in bioinformatics for students attending Orchard Park High School and the all-girls Mt. St. Mary Academy in Kenmore through its "Next Generation Scientists: Training Students and Teachers" project, but it plans to use the Nov. 30 workshop to get other high school teachers in Erie and Niagara counties to introduce bioinformatics to their students. The workshop is designed to show teachers how to incorporate bioinformatics into their biology and computer programming courses. Used to decode the human genome, bioinformatics is a field in which scientists use mathematical, computing, and statistical techniques to solve problems in molecular biology, according to E. Bruce Pitman, a professor of mathematics and associate dean for research and sponsored programs in the College of Arts and Sciences. Researchers also use bioinformatics to search for treatments for genetic diseases. Pitman views the effort as a way to grow the field in the region. The "Next Generation Scientists" program is part of the training and education initiative of UB's New York State Center of Excellence in Bioinformatics and Life Sciences.
    Click Here to View Full Article

  • "Gone Spear-Phishin'"
    New York Times (12/04/05) P. 3-1; O'Brien, Timothy L.

    Phishing is the practice of disseminating Trojan horse programs in the guise of messages from trusted sources to computer users, who unwittingly install the Trojans, which then record sensitive information and send it elsewhere. An even more insidious form of phishing, dubbed spear-phishing, has emerged: Spear-phishers differ from widespread phishers in that their messages are intended for specific prey, and security experts say spear-phishing is harder to spot. Analysts believe spear-phishing is likely orchestrated by profit-motivated organizations, and the phenomenon has received little publicity because victims are reluctant to report such exploitation. These phishing attacks at the very least demonstrate the vulnerability of sensitive data stored on computer networks, hurt consumers' confidence in Web-based transactions, and make email less trustworthy, according to analysts. One of the most scandalous instances of spear-phishing was recently uncovered in Israel, when authorities learned that members of three of the country's biggest private investigation firms conspired to commit corporate espionage on a grand scale using spear-phishing methods. In addition, some of Israel's most illustrious corporations are being investigated for possible information theft in connection with the scandal. Authorities make several suggestions for countering phishers: One is for consumers to be aware that banks or financial institutions would never alert them of any problems with their credit cards or accounts via email, which should tip them off that such messages are suspect. But Johannes Ullrich with the SANS Institute's Internet Storm Center warns that phishing strategies will only get worse as spear-phishing is combined with widespread phishing "so that company logos can be snatched from Web sites to build customized databases of corporate logos."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Computer, Heal Thyself"
    BusinessWeek (12/05/05) No. 3962, P. 74; Hamm, Steve

    In the fall of 2001, IBM's Paul Horn wrote the "Autonomic Computing Manifesto," what he describes as "a call to action" for engineers to develop a new breed of systems capable of repairing themselves. His inspiration came from the spiraling complexity of new technologies, such as open source software and wireless devices, that confounds the average user. To Horn, computers needed to be smarter, capable of monitoring themselves for viruses and other problems, and able to fix problems as they arise without human intervention (see the sketch below). With IBM's release of three self-monitoring software products that restart systems after a power outage and efficiently manage the traffic flow of processing tasks, Horn's vision is closer to reality. Cisco and Microsoft have similar projects in the pipeline, signifying the emergence of self-healing computing as a major market force. The upcoming version of Windows will be able to detect the impending failure of a disk drive and alert users to save their work. Other self-healing technologies are less visible, such as the ability several companies have developed to let users tap into their wireless networks without fear of infection. The first wave of self-healing technology will be directed at the commercial environment, where systems have become so complex that for every dollar an average company spends on new technologies, it spends $9 on oversight and maintenance. Self-healing computing has become a dominant thrust of IBM's research activities, as Horn has dedicated more than 100 research scientists to the initiative. One feature IBM has developed detects problems in a computer's chip and powers down corrupted transistors. Scientists are unsure how much intelligence they will be able to infuse computers with, and some believe that large networks will be particularly resistant to self-healing technology, though they admit the field is still in its early stages.
    Click Here to View Full Article
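
    The article describes IBM's products only in terms of what they do (restart after an outage, reroute work, power down bad transistors), so the fragment below is merely a generic sketch of the monitor-and-recover loop that "self-healing" refers to; the worker command, polling interval, and restart budget are hypothetical placeholders, not IBM's software.

        # Generic monitor-and-recover loop in the spirit of self-healing systems:
        # watch a worker process and restart it without human intervention when it
        # dies. The command and thresholds are hypothetical placeholders.
        import subprocess
        import time

        CMD = ["python", "worker.py"]      # hypothetical service to keep alive
        POLL_SECONDS = 5

        def supervise(max_restarts=10):
            restarts = 0
            proc = subprocess.Popen(CMD)
            while restarts <= max_restarts:
                time.sleep(POLL_SECONDS)
                if proc.poll() is not None:              # the process has exited
                    print(f"worker exited with {proc.returncode}; restarting")
                    proc = subprocess.Popen(CMD)         # heal without operator action
                    restarts += 1
            print("restart budget exhausted; escalating to a human")

        if __name__ == "__main__":
            supervise()

    Production autonomic systems replace the crude "restart it" action with richer diagnosis and repair steps, but the shape of the loop is the same: monitor the system, detect a fault, and act on it automatically.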

  • "New Path of Attacker"
    InformationWeek (11/28/05) No. 1066, P. 28; Claburn, Thomas

    Network devices' operating systems and applications are now the main targets for hackers, according to a study of the 20 most critical Internet security vulnerabilities released by the SANS Institute in partnership with U.S. and UK government officials. The SANS Institute has been compiling the top 20 report since 2000, and this year's edition also found that data-backup software is susceptible to attack, which has many business managers worried about the safety of critical corporate data. The SANS report aims to put pressure on software companies to beef up their security. Howard Schmidt, former chief security officer for Microsoft and eBay, suggests companies fix the problem through secure coding and better QA of development processes, penetration testing on compiled code, and vulnerability testing of integrated, deployed applications via Web front ends. The Department of Energy's Computer Incident Advisory Capability recently issued a warning to organizations regarding attackers who target specific vulnerabilities at specific companies. Mark Richmond, network systems engineer for the U.S. District Court, is one of many industry professionals who have noticed the increase in the sophistication of attacks. "The coordination of attacks over the last few years seems to be increasing," Richmond says. "There are cooperative arrangements between various groups, formal or informal, that seem to be facilitating the use of networks and computers for criminal activities." The threat of attack has also affected consumers: a Pew Internet & American Life Project study finds that more than 90 percent of Internet users say they are changing their online behavior because they fear cybercrime. Hackers are getting better at what they do, but companies are also becoming better prepared to handle security problems by investing more time and effort in solutions.
    Click Here to View Full Article

  • "Security Gets Framed"
    IST Results (11/29/05); Cotton, Andre

    Many IT companies have not yet figured out how to eliminate holes in digital security systems, but the Security Experts Initiative (SEINIT) aims to address the problem with a framework that negotiates security between users and the services they are attempting to access. The SEINIT framework is designed to apply the appropriate level of security while the user controls privacy. "We set ourselves the almost paradoxical goal of reconciling security and freedom," says Andre Cotton, SEINIT project coordinator and head of Thales Communications' Advanced Information Technologies Laboratory in France. "Users decide at the beginning what level of information they want to reveal to the service. The framework then negotiates with the service and applies the appropriate security component or protocol," says Cotton. Work on a user interface will begin in January under a separate project, called DISCREET. SEINIT makes security independent of any particular hardware, software, or access technology, such as cellular phones, Bluetooth, Wi-Fi, Ethernet, or broadband connections; security is regulated by the framework rather than by a single encryption scheme or program. Cotton hopes the SEINIT method will be adopted as a formal security standard in the future. The project team is currently putting the final touches on a demonstrator that runs the framework on Windows and Linux systems, across LANs, the Internet, and wireless networks.
    Click Here to View Full Article

  • "Report on the 5th International Web Archiving Workshop (IWAW)"
    D-Lib Magazine (11/05) Vol. 11, No. 11; Aschenbrenner, Andreas; Brandt, Olaf; Strodl, Stephan

    The European Conference on Digital Libraries' fifth International Web Archiving Workshop (IWAW) drew some 60 international participants, and more than 50 percent of presentations at the event were from current Web archiving efforts. The conference's key sessions focused on the initiatives of the International Internet Preservation Consortium (IIPC), audio and video Web archiving, time dimension, digital preservation, and current projects. The IIPC session included the presentation of the consortium's open-source Heritrix Web crawler tool, the BAT archive format manipulation tool, the WERA access tool, and the NutchWAX search engine tool; elucidation on the WARC Web archiving format initiative and the IIPC Web Archiving Metadata Set, including its eventual integration with METS; and discussion of technical and legal issues. The audio and video Web archiving session comprised a single presentation by the French audiovisual archive center's (INA) Thomas Drugeon on his institution's Web crawling and archival storage strategy. The INA will be responsible for overseeing French Web sites as they relate to media institutions and audiovisual resources, and the facility intends to automatically amass specific domains. The digital preservation session featured presentations from the likes of Old Dominion University's Frank McCown, Kongelige Bibliotek's Niels Christensen, and Dr. Shigeo Sugimoto on such topics as an emulation strategy-based "long-term access" solution, the Grace tool for dynamic file format transformation, a simulation strategy for estimating the mean time to failure, and an enclose-and-deposit digital preservation technique. IWAW 2005's final session covered current projects such as Germany's Kopal, an effort to collaboratively develop and operate a DIAS system-based long-term preservation system; and cooperative syndication of Web archives in a range of initiatives via the European Archive.
    Click Here to View Full Article

  • "Tomorrowland: When New Technologies Get Newer"
    Educause Review (12/05) Vol. 40, No. 6, P. 14; Neas, Bonita M.; Bojonny, John S.; DiLorenzo, Emilio

    The EDUCAUSE Evolving Technologies Committee has focused on how several new technologies will become even more innovative and critical to education in the years ahead. The evolution of wireless technology is moving at a rapid clip, and its significance to people both on and off campus is growing; now is the time for colleges and universities to plan to exploit emerging standards as wireless and handset technologies converge. Wireless products expected to play an important role in the future include ultrawideband, free space optics, virtual fiber, WiMax, 3G, and EV-DO, while a mesh networking standard should be ratified in 2008. Portals are important communication, user authentication, and data/application resource tools for campus communities, and their continued evolution will focus on application delivery, streamlined application integration, and the incorporation of diverse standards. Expected trends in the portal space include vendor consolidation, open-source portals' penetration of the mainstream, and the emergence of augmented communication and collaboration tools, flexible customization, integration with mobility devices, and additional channels and applications that all strongly emphasize privacy and security. IT outsourcing is gaining credibility in the education sector as budgets shrink and demand for distance courses, remote services, and access to online information increases. Institutions need to recognize the growing appeal of student collaboration tools such as wikis and blogs, and actively formulate a strategy to manage students' expectations and expedite their collaboration requirements. Finally, video games are on track to be incorporated into many educational areas thanks to their interactivity and sociability potential, and it would pay to investigate how gaming could change higher education as well as how its increased use would impel such changes.
    Click Here to View Full Article

  • "A Conversation With Ray Ozzie"
    Queue (11/05) Vol. 3, No. 9, P. 18; Kellogg, Wendy

    Microsoft chief technical officer Ray Ozzie's interest is the enhancement of collaboration between people via technology, and such collaboration is becoming increasingly important as organizations are forced to incorporate outside business partners into their core practices. Email serves as the default technology for collaboration, but Ozzie says its age and the stress it has been subjected to over the decades are making alternatives such as blogs, wikis, or Groove more appealing to customers. "Frankly, the path that we're on leads one to believe that a lot of the benefits of these innovations are accruing to small businesses and individuals much more readily than enterprises," states Ozzie. "The reason: Enterprises are really different from the public Internet in that they have fairly substantial compliance issues." Ozzie says technologists in businesses should realize that collaboration technology can aid changes in business process or culture, but it is not a cure-all. A distinction must also be made between the roles of IT, the line of business, and the individual; the line of business' job is to understand the key collaborative processes. Challenging areas Ozzie perceives in the collaboration technology effort include user authentication in the Internet environment, the use of products capable of offline support, and incorporating the ability to work both online and offline into a collaborative solution. Ozzie would like all citizens to receive broadband and have access to wireless infrastructure, because those advantages can change people's interaction with other people, devices, and software.


 