
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 733:  Monday, December 20, 2004

  • "On the Open Internet, a Web of Dark Alleys"
    New York Times (12/20/04) P. C1; Zeller Jr., Tom

    Former Director of Central Intelligence George Tenet urged a corralling of cyberspace at a Dec. 1 technology security conference, warning that otherwise terrorists will continue to "work anonymously and remotely to inflict enormous damage at little cost or risk to themselves." Though recently introduced legislation and other measures may help fortify government networks against intrusion and cyberattack, reining in anonymous online communications between terrorists and other malevolent parties may ultimately prove futile, given the many options available to them. Terrorists do not need superior technical skills to communicate over the Internet under law enforcement's radar: Interviews with several terror suspects conducted by the Al Jazeera TV network a few years ago hinted that the Sept. 11 hijackers corresponded using prearranged code words, a decidedly low-tech strategy. In fact, every computer terminal linked to the Internet has the potential to be used as a "dark alley" where sinister elements can virtually meet with little danger of detection by authorities. There is also proof that terrorists have employed encryption technologies, and the open nature of the Internet makes finding anonymization tools relatively simple. Computer forensics expert Michael Caloyannides notes that hundreds of encrypted messages appear daily on public Usenet newsgroups, many of which originate from bogus email accounts and whose intended recipients are unknown. For instance, a covert correspondent looking for a clandestine message on a specific newsgroup can download a series of missives and apply an encryption key to the one with a prearranged subject line. In a 2003 Parameters article, Lt. Col. Timothy Thomas of the U.S. Army's Foreign Military Studies Office wrote that cyberplanning may constitute a greater threat than cyberterrorist attacks.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "File-Sharing Tool Gaining Users"
    Los Angeles Times (12/20/04) P. C2; Veiga, Alex

    BitTorrent online file-sharing software, which lets users share large files by breaking them into fragments, operates differently from other file-sharing software in that data flows faster as the number of people sharing a file increases. BitTorrent is less vulnerable to spoofing than programs such as Kazaa and Morpheus: Those programs let users connect their PCs to networks and run searches for the files or titles they are after, whereas BitTorrent uses "seed" Web sites that furnish lists of links to people offering files through the BitTorrent protocol, and the software then assembles the chunks acquired from all users sharing a file into a complete copy (a sketch of this chunk-verification scheme follows this item). CacheLogic CTO Andrew Parker attributes up to 50 percent of all online file-swapping to BitTorrent users, and John Malcolm of the Motion Picture Association of America describes the software as "more of a threat [to the entertainment industry] because it is probably the latest and best technological tool for transferring large files like movies." MediaDefender CEO Randy Saaf notes that BitTorrent does not anonymize users' online identities, which means entertainment companies can sue BitTorrent users like any other digital pirates. Some BitTorrent sites that host seed files have been shuttered under industry pressure, but many others avoid investigation because they host not copyrighted content but small marker files. BitTorrent's creator, Bram Cohen, says he invented the software to ease the distribution of content among users over the Internet, and does not consider himself responsible for those who employ BitTorrent to commit digital piracy. Nicholas Reville of the Downhill Battle independent music group says BitTorrent is exciting because of its potential to allow users "to blog video and blog their own home movies [and] independent films and have a way to distribute them without having to have a big budget for Web hosting."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
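
    The chunked distribution described in the item above rests on per-piece hashing: the publisher lists a cryptographic hash for every fixed-size piece of the file, and a downloader verifies each piece fetched from a peer before stitching the pieces together. The sketch below is illustrative Python, not code from any actual BitTorrent client; the 256 KiB piece size is an assumed, typical value.

    import hashlib

    PIECE_SIZE = 256 * 1024  # assumed piece size of 256 KiB, for illustration

    def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list[bytes]:
        """Hash every fixed-size piece of the payload, as a torrent's metainfo does."""
        return [hashlib.sha1(data[i:i + piece_size]).digest()
                for i in range(0, len(data), piece_size)]

    def reassemble(pieces: dict[int, bytes], hashes: list[bytes]) -> bytes:
        """Rebuild the file from pieces fetched from different peers, rejecting
        any piece whose hash does not match the published list."""
        out = bytearray()
        for index, expected in enumerate(hashes):
            piece = pieces[index]
            if hashlib.sha1(piece).digest() != expected:
                raise ValueError(f"piece {index} failed verification; fetch it from another peer")
            out += piece
        return bytes(out)

    # Because each piece is checked independently against the published hashes,
    # a downloader can safely accept piece 7 from one stranger and piece 8 from another.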

  • "Network Robot Project Gets Boost"
    Korea Times (12/19/04); Kim, Tae-gyu

    South Korea's network-based robot project is a good idea that will dramatically speed tasks, says Carnegie Mellon University computer science professor Raj Reddy. Starting next year, the South Korean Ministry of Information and Communications plans to test five different types of robots that will automate tasks in the home and at post offices. The robots' intelligence, including sensing and processing functions, will reside on the network, while the robots themselves will simply be an interface between the network and the real world. Although the network-based robot project is headed in the right direction, Reddy warns that checks and balances need to be implemented to prevent dangerous situations; eventually, artificially intelligent robots will assume more and more human tasks until humans are responsible for just 1 percent of the shared workload, which Reddy expects to occur in roughly 100 years. The famous 80/20 rule, applied to robotics, holds that robots should assume 80 percent of tasks, with the automated portion continually growing. Human productivity, meanwhile, will be significantly increased by desktop computers with terascale and petascale computing capabilities. Reddy says South Korea provides a model for other Asian countries in its IT efforts, and that the nation's communications infrastructure and other achievements are significant given its relatively small population of just 48 million people. Reddy won the Bronze Tower award from the South Korean government in 2002 for his efforts in linking South Korean IT development with research groups at Carnegie Mellon.
    Click Here to View Full Article

  • "Anthropologists to Beat Gadget Rage"
    New Scientist (12/20/04); Knight, Will

    Companies are enlisting anthropologists to make high-tech products and services easier to use. For example, Lancaster University anthropologist Lucy Suchman reasoned in the 1980s that users of Xerox photocopiers would be less frustrated if there were a large green Copy button, which went on to become a standard component. North Carolina State University anthropologist Travis Breaux expects people in his profession to play a larger role in gadget design, noting that "Ethnographic methods are being applied to friend-finding networks such as Friendster, multi-player online role-playing games such as Everquest and online dating systems." These technologies are likewise proving to be valuable tools for social science research. "Future technologies will in turn be affected by our studies of the way people behave on these networks," Breaux predicts. Anthropologist Richard Harper drew on Bronislaw Malinowski's study of the Trobriand islanders' "kula" ritual to help launch Vodafone's Postcard service. Much as the islanders exchange ornamental seashell jewelry to solidify social bonds between island groups, the Postcard service lets users send an MMS picture-and-text message to Vodafone, which turns it into a postcard and mails it to any recipient, in the hope that the recipient will be encouraged to send a postcard of his own and thus expand the network's subscriber base. Meanwhile, sociologist Abigail Sellen was hired by Microsoft to observe users' at-home PC use in an effort to give software engineers a better idea of what kinds of software will appeal to ordinary users.
    Click Here to View Full Article

  • "Scientists Hope to End Book-Breaking Work"
    NewsFactor Network (12/17/04); Martin, Mike

    Copying material from bound books involves a frustrating trade-off: flattening a book against the copier's scanning surface risks damaging its spine, while leaving it loose produces warped, poorly illuminated text that can inhibit clear reading. To counter this problem, Xerox researchers Beilei Xu and Robert Loce have inexpensively refined the software of common scanners by embedding a mathematical formula that compensates for variations in a bound page's distance from the platen. This removes the distortion of words running toward the page's center and eliminates the darker regions where the page curves into the binding. Xu says the updated copier software calculates the distance of the book from the scanning surface for every page pixel using the same light the scanners already beam and analyze; a toy illustration of this kind of correction follows this item. Xu and Loce detailed their new scanning technology at last month's 5th International Conference on Imaging Science and Hard Copy. City College of New York computer science professor George Wolberg notes that Google intends to make major library collections available over the Internet, an effort that will involve the digitization of enormous numbers of books. "Any book-scanning technology must address digital correction of warped pages of text," he says.
    Click Here to View Full Article
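
    The item above describes using the scanner's own illumination to estimate how far each part of the page sits from the platen, then correcting both brightness and geometric compression near the spine. The code below is a purely illustrative toy model of that idea, not Xu and Loce's formula: it treats the column-wise brightness profile as a crude depth cue, rescales brightness accordingly, and stretches the foreshortened columns along the page's estimated arc length.

    import numpy as np

    def flatten_book_page(img: np.ndarray) -> np.ndarray:
        """Toy correction for a grayscale page (values in 0..1) curving away from
        the platen near the spine.  Illustrative only -- not the Xerox algorithm."""
        # 1. Use average column brightness as a rough depth cue: darker columns
        #    are assumed to lie farther from the platen.
        profile = np.clip(img.mean(axis=0) / img.mean(axis=0).max(), 0.2, 1.0)
        depth = 1.0 / np.sqrt(profile) - 1.0

        # 2. Photometric fix: brighten each column in proportion to its estimated falloff.
        corrected = np.clip(img / profile[np.newaxis, :], 0.0, 1.0)

        # 3. Geometric fix: columns on the curved part of the page are foreshortened,
        #    so resample each row uniformly along the page's estimated arc length.
        slope = np.gradient(depth)
        arc = np.cumsum(np.sqrt(1.0 + slope ** 2))
        arc = (arc - arc[0]) / (arc[-1] - arc[0]) * (img.shape[1] - 1)
        cols = np.arange(img.shape[1])
        return np.stack([np.interp(cols, arc, row) for row in corrected])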

  • "Rice University Computer Scientists Find a Flaw in Google's New Desktop Search Program"
    New York Times (12/20/04) P. C3; Markoff, John

    Rice University computer scientist Dan Wallach and two graduate students uncovered a potentially serious flaw in the way Google's recently released desktop PC search program composes its results. The program indexes content on a user's local hard disk and then blends Web search results with local user information such as email and text documents; the flaw could expose only small excerpts of those local files, not the files themselves. The software is designed so that user queries, rather than locally stored information, are sent over the Internet, and by reading the queries sent to Google's search service, Google can place AdWords text advertisements next to the results displayed in the user's browser window. The Rice researchers reported that the program watches for traffic that appears to be going to Google.com and then embeds results from the user's hard disk for a given query, and they found that the program can be fooled into inserting those results into other Web pages, where an attacker could read them. Wallach and his students built a Java program that opened connections back to the computer from which it was downloaded and made the traffic appear to be a search request to Google.com, which tricked the Google desktop program into disclosing the user's local search results (a toy model of this weakness follows this item). The researchers notified Google of the vulnerability in late November, and Google issued a statement over the weekend indicating that it began distributing a patched version of the software on Dec. 10. The program allows Google to automatically install new versions of the software on users' PCs without users knowing it or being able to intervene. Microsoft and Yahoo! are competing with Google using similar search tools, and the Rice researchers said the Microsoft offering apparently does not blend Web and local search results in the way the Google program does.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
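
    To see why a flaw like the one above matters, consider a toy model of the result-merging step (hypothetical code, not Google's): if the decision to splice private desktop hits into a page rests only on where the request claims to be going, then any applet that can open a socket back to its own origin and dress the request up as a Google query will receive the private results along with the page.

    # Hypothetical sketch of a local search proxy that merges desktop results into
    # web pages.  The naive trust check below illustrates the kind of weakness
    # described in the article; it is not Google's actual code.

    LOCAL_INDEX = {"tax return": ["~/documents/taxes_2004.xls"]}

    def merge_results(claimed_host: str, query: str, upstream_html: str) -> str:
        # Weak check: the claimed destination is attacker-influenced, so a request
        # that never touches google.com can still pass and receive private hits.
        if claimed_host.endswith("google.com"):
            hits = ", ".join(LOCAL_INDEX.get(query, []))
            return upstream_html + f"<!-- desktop hits: {hits} -->"
        return upstream_html

    # A malicious applet talks only to the server it was downloaded from, labels
    # the traffic as a google.com search, and reads the injected file names back out.
    print(merge_results("www.google.com", "tax return", "<html>attacker page</html>"))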

  • "BT's Buzzing Hive of Ideas Has Designs on the Future"
    Computing (12/17/04); Samuels, Mark

    British Telecom's BT Exact research division is the center of a growing innovation hub located near Ipswich in the United Kingdom. BT Exact is joined at the Adastral Park site by a number of spinoff firms and academic institutions such as University College London and the University of Essex. U.K. startups are generating more interest from venture capitalists than just a few years ago, according to Evolved Networks chief operations officer Chris Sharpe, whose automated access network management firm spun off from BT in 2003. BT Pulse is focused on health care applications such as radio-frequency identification (RFID) in hospitals; the U.K. National Health Service is expected to adopt RFID within a few years, and applications are being tested at Adastral Park, including automated patient check-in, medication tracking and alerts, and automatic supply re-ordering. Although no single RFID application is radical, the cumulative effect is very significant, says BT pharmaceutical marketing manager Gary Hawksworth. Vidus offers automated field-service management technology first created at BT Exact and has counted a number of European blue-chip firms among its clientele since spinning off from BT in 2003. Vidus systems allow companies to improve the efficiency of their field-service dispatch operations through real-time data feeds, including information about the weather, traffic, and the priority given to certain customers. Evolved Networks, another recent spinoff, offers software to automate access network management, usually the highly manual task of connecting individual subscribers to the telecom network.
    Click Here to View Full Article

  • "Open 3D Operating System Ripe for Commercial Development"
    GraphicsIQ (12/16/04); Weinberger, K.E.

    The Croquet Project has tapped its first third-party developer to create commercial software for Croquet, an open-source computer software and network architecture. 3Dsolve, a developer of collaborative simulation-learning solutions with interactive 3D graphics, will help researchers at the University of Wisconsin-Madison develop Croquet and incorporate the open 3D operating system into key applications in the years to come. "At the University of Wisconsin, a software development focus is in integrating peer-to-peer systems, especially Croquet, with existing authentication and identity management software solutions," says Dr. Julian Lombardi, assistant director of the Division of Information Technology. Developers will be able to use Croquet to build sophisticated three-dimensional user interfaces that let many people share, organize, customize, and access data, and Croquet may one day support communication via voice, video, and text, as well as individual or collaborative design of shareable spaces by end users. The Jasmine version was released in October, and the Croquet beta is scheduled for next fall. Croquet has many potential applications, including medical imaging, training, K-12 education, and government and military uses.
    Click Here to View Full Article

  • "Synthetic Vision Is No Fake"
    Washington Technology (12/13/04) Vol. 19, No. 18, P. 20; Beizer, Doug

    The Synthetic Vision system developed by NASA and its industry collaborators aims to dramatically reduce airplane crashes attributed to poor visibility by integrating a high-resolution display, terrain databases, and Global Positioning System technology. "The current baseline set of [aircraft] instrumentation requires a mental integration of information to form the picture of where the airplane is in respect to the terrain, airport and obstacles," explains NASA project manager Dan Baize. "What Synthetic Vision does is remove the workload of the mental integration and provide the flight crew with a visual picture that gives them all the data they need to remain safe." The video game-like display shows the terrain, obstacles, the approach path, and runways; photorealistic displays with landmarks and other fine details can be furnished by databases, but Baize says pilots can more effectively absorb information with generic texturing, which will probably be employed in first-generation Synthetic Vision projects. Furthermore, Synthetic Vision makes sure that the data it is displaying correlates with the aircraft's actual location through the use of an integrity-monitoring system. Baize reports that Synthetic Vision is more intuitive and detailed than the Highway in the Sky system, which typically displays a tunnel simulation to guide pilots. Other potential Synthetic Vision applications could include air traffic control, commercial trucking, and military operations, Baize notes. NASA's motivation for developing the system stems from the FAA's mandate to cut fatal airplane accidents by 80 percent.
    Click Here to View Full Article

  • "Net Domain Costs on the Rise?"
    CNet (12/16/04); McCullagh, Declan

    ICANN is drawing ire over a new requirement, set to take effect next year, that imposes a 75-cent annual fee on every .net domain name. ICANN is able to implement the fee because VeriSign's contract as operator of the .net top-level domain expires next June, and the 75-cent fee will be built into the bidding process for the registry's next operator. Some worry that ICANN will begin charging similar fees for names registered under other TLDs, such as .com and .biz, when their operating contracts expire. Critics of the new fee say the policy will allow ICANN to expand its annual budget by as much as $4 million without holding the organization accountable for how that money is spent, and if ICANN extends the fee to other popular TLDs in the future, its annual budget could increase by more than $34 million (a rough check of these figures follows this item). ICANN says it will use the funds from the new .net levy to aid stakeholders from developing countries, improve the security and stability of the DNS, and cover other unspecified ICANN costs. Kurt Pritz, vice president of ICANN's business operations, adds: "The intent of the fee is to ease the burden of [registrars] paying a higher percentage of the fees in the long run." VeriSign VP Tom Galvin says the company supports the 75-cent fee because it could help ICANN "better the Internet either by improving the stability and security of the Internet or helping with best practices or helping to move Internet development into other regions." The new fee is likely to get the backing of the Bush administration, which had earlier asked ICANN to broaden its funding base; ICANN tried to impose a $1 annual fee on all domain names in 1999, but the Clinton administration did not approve it.
    Click Here to View Full Article
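
    The budget figures quoted above follow directly from the per-name fee once registration counts are plugged in. The arithmetic below uses rough late-2004 registration estimates; those counts are assumptions for illustration, not figures from the article.

    # Back-of-the-envelope check on the quoted budget impact of a $0.75 annual fee.
    FEE = 0.75                 # dollars per domain name per year
    NET_NAMES = 5.3e6          # assumed .net registrations, late 2004
    OTHER_NAMES = 46e6         # assumed registrations across the other generic TLDs

    print(f".net alone:       ${FEE * NET_NAMES / 1e6:.1f} million/year")    # ~ $4 million
    print(f"plus other gTLDs: ${FEE * OTHER_NAMES / 1e6:.1f} million/year")  # > $34 million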

  • "Federal Anti-Spam Law Gets Mixed Results"
    Network World (12/13/04) Vol. 21, No. 50, P. 9; Garretson, Cara

    The Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act, which took effect last January, has not stanched the flow of spam email, which has swelled in volume thanks to spammers' increasingly sophisticated tactics and their tendency to operate outside U.S. jurisdiction. However, advocates say CAN-SPAM has demonstrated its value by establishing a unified framework for prosecuting spammers that overrides a patchwork of state regulations. Unspam CEO Matthew Prince says this frees states to devise tougher spamming restrictions that piggyback on CAN-SPAM and could in turn be folded into new federal legislation. FTC staff attorney Katie Harrington-McBride says CAN-SPAM also provides guidelines that legitimate emailers can follow to differentiate themselves from spammers, such as accompanying each email with an opt-out option and a mailing address. Pillsbury Winthrop attorney Cathie Meyer says this approach would enable legitimate businesses to target consumers who are genuinely interested in their offerings. The FTC says the opt-out provision gives recipients control over the email they receive, which was the purpose of CAN-SPAM from the beginning. "The act is really not drafted in a way to diminish the amount of email that individual consumers receive, but to empower them to limit the flow on a sender-by-sender basis," maintains Harrington-McBride. Still, critics such as Internet Mail Consortium director Paul Hoffman lament that spam has increased steadily despite CAN-SPAM's passage, with the Radicati Group forecasting that worldwide spam volume will have skyrocketed from 15 billion to 35 billion messages over the past year.
    Click Here to View Full Article

  • "Security Is a Moving Target"
    eWeek (12/13/04) Vol. 21, No. 50, P. D1; Coffee, Peter

    The dangers enterprise development professionals must contend with are amplified by the growing preponderance of Web-enabled applications, which expose finished products to many more intruders, and by the accelerated development cycles of customer-facing and supply-chain-partnering software. The fact that many intruders come from within the enterprise points to the need for bottom-up security deployment within applications. The Web-facing enterprise must adopt an application-centric security perspective that is nurtured and enforced by development managers and teams, and developers must implement the applications' myriad functions and associated business processes with hackers rather than routine users in mind. Attention must also be paid to the swelling ranks of stakeholders as IT becomes a political tool as well as a contentious public relations issue. With proper authentication, a user interacting with an application that has appropriately limited access should be accorded the same privileges no matter what kind of connection is being used; otherwise, the user may postpone data entry until returning to the office, sacrificing the accuracy and timeliness that wireless access is supposed to provide. Configuration management for Web-based applications is poor for the most part, given that developers tend to retain older, less secure application iterations. Coding remnants can be another point of vulnerability, so discarding old "startup code," though it may seem unwise, is sometimes required to give clients and customers a truly secure basis for applications.
    Click Here to View Full Article

  • "Cybersecurity Slips as a Homeland Security Priority"
    InformationWeek (12/14/04); Greenemeier, Larry

    Computer security experts fear cybersecurity is losing momentum at a time when the Homeland Security Department needs to focus more attention on the issue. The Intelligence Reform Act that Congress passed last week does not create a position of assistant secretary for cybersecurity within the Homeland Security Department. And while the Bush administration continues to look for a new Homeland Security secretary, the agency has been without a permanent director for its National Cyber Security Division since October. In February 2003, the White House introduced a strategy to secure cyberspace that called for the creation of a national cyberspace response system, a cyberspace security threat and vulnerability reduction program, and a cyberspace security awareness and training program, but the administration has made little progress on those initiatives. Moreover, leaders from academia, business, and government, working as part of the National Cyber Security Partnership, formed a corporate governance task force last December that recommended actions to make the cybersecurity initiatives a reality, but the White House has been slow to act on its suggestions as well. Task force co-chair and RSA Security President Art Coviello pointed to the growing interdependence of the private and public sectors as a result of the Internet. "An increasing reliance on the public Internet and wireless access has accelerated the need for improved security technology," said Coviello.
    Click Here to View Full Article

  • "Will Open Source Software Unlock the Potential of eLearning?"
    Campus Technology (12/04); Coppola, Christopher D.

    One possible reason why e-learning has fallen short of its potential is the perception of education as a market, rather than an environment, for learning, argues rSmart Group President Christopher D. Coppola. Markets and their commercially oriented foundations favor homogeneity, whereas environments support the heterogeneity without which e-learning cannot fully flourish. Coppola argues that communally developed and shared open-source technology has strong cultural ties to higher education and its goal of disseminating knowledge to the public at large, which spurs adoption and yields new collaborative practices for creating innovative enterprise software. He says leading institutions are increasingly interested in and capable of contributing to Sakai, the Open Source Portfolio Initiative (OSPI), uPortal, and other projects dedicated to the development and distribution of enterprise software that both rivals and stimulates the advancement of proprietary software. Coppola says open-source e-learning software can support more effective use of technology for teaching and learning by lowering the software's cost to an institution while giving the institution more control over its destiny; this is possible because open source can eliminate or reduce license and maintenance fees, keep commercial services regulated by market forces, and be customized more effectively. Open-source e-learning applications also spur learning-technology innovation by making new e-learning systems widely available. Coppola writes that these systems support innovation by adding functionality that addresses a local problem and channeling it back into the community for the greater good; they also support experimentation, so that new projects can be sustained in a widely adopted enterprise environment and creatively combined into a system whose whole is greater than the sum of its parts.
    Click Here to View Full Article

  • "Trading Privacy for Health"
    Discover (12/04) Vol. 25, No. 12, P. 21; Johnson, Steven

    Doctors and computer scientists are building new information-based medicine frameworks that allow quick analysis of broad sets of patient data--but though digitizing patient records and adding new genetic test information will undoubtedly prove beneficial to health care, it also poses threats to privacy that could have serious implications if patient data is abused by insurers or prospective employers. In 1988, the FDA took several months to detect a trend among users of a new Japanese pharmaceutical, L-tryptophan, that eventually killed more than 30 people and left over 1,500 permanently disabled; current computerized systems would make this type of analysis faster by months, but nowhere near as fast as a system being constructed by IBM and the Mayo Clinic. The two organizations are building out the Mayo Clinical Life Sciences System that would allow fast, Google-like searches of combined patient records databases. With this desktop capability, researchers would be able to follow up on hunches without having to worry about time and cost penalties. "This will generate new knowledge, as opposed to being used to verify things that we're already looking for," says Mayo Clinic IT committee chair Nina Schwenk. In addition to allowing fast horizontal searches of patient records, IBM also plans to help incorporate genetic data that is the result of new genomic studies and tests; this means massive information stores and computational problems that will be addressed in part by IBM's new Blue Gene supercomputer, says IBM on demand business vice president John Lutz. Not only would such a system bolster disease-tracking in populations, but also provide better individual care based on a person's genetic and medical profile. Instead of using trial-and-error to determine the best treatment, doctors would be able to prescribe safer and more effective medications.
    Click Here to View Full Article

  • "Enterprise Search: The Next Frontier"
    Software Development (12/04) Vol. 12, No. 12, P. 36; Wang, Roland

    Enterprise search engines are increasingly desirable as the volume of unstructured enterprise data swells. Unlike Web search engines, they can search files regardless of format or repository, run on numerous operating systems, and let users deploy classification, taxonomies, personalization, profiling, agent alerts, community/social networking, collaborative filtering, and real-time analysis, among other things. Standard functionality of an enterprise-class search engine includes support for multiple repositories (local hard disks, network files, groupware, databases, and so on) and for nearly all of the roughly 300 file formats typical of the current enterprise warehouse; filters and connectors that may be vendor-owned, licensed from other vendors, or interfaces for third-party or customer-built modules; the ability to add processors or servers to scale with growing user counts and information volume; compatibility with diverse platforms; metadata search; international support (language support, cross-lingual search, interfaces for user-added languages); fault tolerance; concurrent load-balancing brokers that apportion data evenly among multiple servers and disk paths; linguistics tools (thesauri, stemming, tokenization, word and phrase analysis, noun-phrase extraction); crawling to keep indexes current; security management for document access control and communication-protocol encryption via authentication, single sign-on, and other features; and software development kits that let users build search-enabled applications without reengineering. The differentiating factor among enterprise search engines is how well these features are implemented and how relevant the results they return are; every search vendor faces the problem of surfacing truly relevant documents and boosting the signal-to-noise ratio. (A minimal sketch of the inverted index underlying any such engine follows this item.)
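
    Underneath all of the features listed above sits the same core data structure: an inverted index mapping each term to the documents that contain it, with a relevance score computed at query time. The sketch below is a generic illustration, not any vendor's product, using TF-IDF weighting as the simplest useful relevance measure.

    import math
    import re
    from collections import defaultdict

    def tokenize(text: str) -> list[str]:
        return re.findall(r"[a-z0-9]+", text.lower())

    class TinyIndex:
        """Minimal inverted index with TF-IDF ranking (illustrative only)."""

        def __init__(self):
            self.postings = defaultdict(dict)   # term -> {doc_id: term frequency}
            self.doc_count = 0

        def add(self, doc_id: str, text: str) -> None:
            self.doc_count += 1
            for term in tokenize(text):
                self.postings[term][doc_id] = self.postings[term].get(doc_id, 0) + 1

        def search(self, query: str, k: int = 5) -> list[tuple[str, float]]:
            scores: dict[str, float] = defaultdict(float)
            for term in tokenize(query):
                docs = self.postings.get(term, {})
                if not docs:
                    continue
                idf = math.log(self.doc_count / len(docs))   # rarer terms count for more
                for doc_id, tf in docs.items():
                    scores[doc_id] += tf * idf
            return sorted(scores.items(), key=lambda kv: -kv[1])[:k]

    # Usage: index a few "repository" documents, then rank them against a query.
    idx = TinyIndex()
    idx.add("memo.doc", "quarterly sales forecast for the east region")
    idx.add("wiki/search.html", "how the enterprise search crawler schedules index updates")
    print(idx.search("sales forecast"))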

  • "Sensor Webs"
    GeoWorld (12/04) Vol. 17, No. 12, P. 36; Percivall, George

    The Open Geospatial Consortium (OGC) is developing geospatial standards in order to realize the dream of an "open sensor web" envisioned by its industry, government, and academic constituents. The consortium's Sensor Web Enablement (SWE) group is focused on developing a flexible technical infrastructure for self-describing in-situ sensors and enabling devices, remote-sensing devices, stored data and live sensor feeds, and simulation models that employ sensor data. The goal is to let unrefined sensor input be processed into value-added information with semantic delineations, while also enabling the connection of sensors to the network and to network-centric services. SWE is a key element of the consortium's OGC Web Services 3 (OWS-3) effort to establish Web services interoperability by promoting a vendor-neutral, evolutionary compatibility architecture for Web-based discovery, access, integration, examination, exploitation, and visualization of multiple online geospatial content sources, sensor-derived data, location services, and geoprocessing capabilities. OGC members have described such a framework, built on open Web services, as a foundation for making geospatial data ubiquitous in enterprise schemas. SWE will cooperate with other, independently developed OWS-3 elements and facilitate wider-ranging enterprise capabilities such as decision support employing service chaining. The OGC will promote the open standards critical to fulfilling the potential of Web-based sensor networks through its OWS-3 Interoperability Testbed and Interoperability Experiments. OWS-3 is also playing a role in programs that mesh sensor webs into enterprise information systems: one such project, SensorNet, is a public-private initiative to enable a nationwide system for detecting, identifying, and evaluating nuclear, chemical, biological, radiological, and explosive threats in real time, and the OGC's SWE standards will be a major component of it.
    Click Here to View Full Article


 