ACM TechNews sponsored by Thunderstone -- Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 845: Friday, September 23, 2005

  • "Paper Trail Urged as E-Voting Fix"
    Wired News (09/23/05); Zetter, Kim

    The Commission on Federal Election Reform concluded five months of deliberation this week by recommending a congressional mandate for the inclusion of voter-verifiable paper audit trails (VVPATs) in electronic voting machines by 2008. At the same time, the commission recognized that paper trails might not be the best solution to e-voting security issues, and advised researchers to devise new technologies that could more effectively alleviate these concerns. VVPATs are not tamper-proof: Election fraudsters could program e-voting machines to record one vote on paper and a different vote within the machine. The commission recommended that state election officials hold post-election audits to compare a minimum of 1 percent of paper votes against electronic votes, but refused to ask Congress to make such audits mandatory. Critics of paper trails claim few voters will actually study the paper record, while poll workers will have even more work to do to clear paper jams and fix other technical malfunctions. They also contend that recounts would be complicated and time-consuming in elections involving many races and candidates because the resulting paper trail would be so cumbersome. A major issue is the question of whether the electronic or paper record should be counted when an inconsistency comes up, and though the commission would leave this decision to individual states, associate director Daniel Calingaert admits that not all states follow the same strategy. VerifiedVoting.org's Pamela Smith reports that only about half of the 26 states with paper-trail laws require a manual audit of the paper and electronic records, and only about one-third make the VVPAT the official record in the event of a discrepancy.
    Click Here to View Full Article
    For information regarding ACM's e-voting activities, visit http://www.acm.org/usacm
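    The post-election audit the commission describes amounts to randomly sampling ballots and checking each sampled paper record against its electronic counterpart. The sketch below illustrates only that sampling-and-comparison idea, using hypothetical ballot data; it is not any state's actual audit procedure, and the 1 percent figure is simply the commission's suggested minimum.

```python
import random

def audit_sample(paper_votes, electronic_votes, fraction=0.01, seed=None):
    """Compare a random sample of paper records against electronic records.

    paper_votes / electronic_votes: dicts mapping ballot ID -> recorded choice.
    fraction: share of ballots to audit (the commission suggests at least 1%).
    Returns the ballot IDs whose paper and electronic records disagree.
    """
    rng = random.Random(seed)
    ballot_ids = sorted(paper_votes)
    sample_size = max(1, int(len(ballot_ids) * fraction))
    sample = rng.sample(ballot_ids, sample_size)
    return [bid for bid in sample if paper_votes[bid] != electronic_votes.get(bid)]

# Hypothetical precinct of 300 ballots with one altered electronic record.
paper = {i: "Candidate A" for i in range(300)}
electronic = dict(paper)
electronic[42] = "Candidate B"
# A small sample may or may not include the one discrepant ballot, which is
# exactly why audit size and follow-up rules matter.
print("Discrepancies found:", audit_sample(paper, electronic, fraction=0.05, seed=1))
```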

  • "Demonstrating Ambient Intelligence Tools and Techniques"
    IST Results (09/22/05)

    The IST-funded OZONE project explored, developed, and deployed a generic architecture for consumer-oriented ambient intelligence applications that was tested in demonstrations designed to provide seamless content access, in-home content distribution, and an extended home environment. The first demonstration concentrated on supplying users with personalized video content delivered directly from a server via wireless access points; the second involved a location- and activity-sensitive consumer electronics platform that accommodated both wired and wireless transmission; and the third sought to provide data to users on the move through a simple, intuitive, and secure technology. The OZONE framework features a top layer that facilitates service enablement based on context awareness or sensitivity, a middle layer supporting a software environment with a priority on seamless task migration, and a bottom layer providing a computer platform that combines high performance with power efficiency. Advances supported by the OZONE demonstrators include adaptation of content to user preferences through a simple, multimodal, intelligent, and interactive interface; accessibility of data from new portable, lightweight, friendly, and ubiquitous devices; presentation of content that can be adjusted according to the device, the time of day, and the user's location; transparent manageability of networks, content, services, and devices; and the expansion of the "one content to one device" model to "one content, many devices." OZONE technical coordinator Harmke de Groot says privacy and security in networked households remain a source of concern, but the trials demonstrated "that users very much like the 'follow me' concept, which is having data available when they want it."
    Click Here to View Full Article

  • "Battling Google, Microsoft Changes How It Builds Software"
    Wall Street Journal (09/23/05) P. A1; Guth, Robert A.

    To protect its market share from emerging rivals, Microsoft is revisiting its formula for developing software. In the new market environment, software is introduced rapidly over the Internet while the provider watches it in action, observes its shortcomings, and modifies it to improve its effectiveness. Because Microsoft's key product is the massive operating system that oversees all of a computer's functions, that agility is harder to attain, though the company has taken a cue from Google and developed a core program to which features can be added relatively easily. Microsoft recently announced a wholesale restructuring of the company, creating three main business units and identifying more agile programming as one of its key goals. Microsoft's traditional method of delivering user-friendly features quickly, while addressing bugs only with post-release patches, left little flexibility for supplementing the software with new features. As the Longhorn project developed, it quickly became clear to the 4,000 programmers involved that writing code independently, then testing and debugging its integrated functionality, was not a viable process, particularly with the pressing task of including the desktop search program WinFS that Bill Gates had elevated to a top priority. The companies that were fast becoming Microsoft's rivals built cohesive programs from integrated components, each designed with a single function in mind, tested them, and dispatched them over the Internet. Microsoft enacted a fundamental shift in the Longhorn code development, subordinating features to core, bug-free code, which over time netted significant improvements in quality and turnaround time; when the beta version was sent out for testing, far fewer problems came back than anticipated.

  • "Name That Worm--Plan Looks to Cut Through Chaos"
    CNet (09/22/05); Evers, Joris

    Last month, a worm with various names wreaked havoc on Windows 2000 operating systems, abetted by the chaotic and fractured attempts to identify it. To address that issue, the Common Malware Enumeration (CME) naming system has emerged, which tags a given piece of malware with a unique identifier. The United States Computer Emergency Readiness Team (US-CERT) says the scheme will provide a common identifier to help users determine which threat is attacking their system and whether they are protected against it. CME promises to fulfill the security industry's longstanding goal of a unified system for naming viruses and worms; industry participation in CME is voluntary, and will be a key factor in the initiative's success. When multiple security companies create different names for the same outbreak, there is often widespread confusion as to whether there is one threat or several related threats, and organizations that use security products from multiple vendors are often confounded by repeated alerts for the same virus or worm under different names. At first, CME will only issue numbers to major threats, though US-CERT plans eventually to cover all attacks. Regardless of the names security vendors produce, CME will assign each attack a random number within hours of its discovery and tag it with its associated characteristics; security companies are then urged to include the CME tag with whatever descriptions they produce, creating a commonality that helps users understand the actual scope of the threat, as sketched below.
    Click Here to View Full Article
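    As a rough illustration of the tagging scheme described above, the sketch below keeps a small registry that ties one CME-style identifier to the different names vendors give the same piece of malware. The class, the identifier format, and the vendor labels are assumptions for illustration only; CME's actual numbering and process are run by US-CERT and the participating vendors.

```python
import random
from collections import defaultdict

class MalwareRegistry:
    """Toy registry mapping a single CME-style ID to many vendor names."""

    def __init__(self):
        self.aliases = defaultdict(set)      # CME ID -> set of "vendor: name"

    def assign_id(self):
        # CME assigns an essentially arbitrary number soon after discovery.
        return f"CME-{random.randint(1, 999)}"

    def add_alias(self, cme_id, vendor, name):
        self.aliases[cme_id].add(f"{vendor}: {name}")

    def names_for(self, cme_id):
        return sorted(self.aliases[cme_id])

# Hypothetical case: three vendors report the same worm under different names.
registry = MalwareRegistry()
worm_id = registry.assign_id()
registry.add_alias(worm_id, "Vendor A", "Zotob.E")
registry.add_alias(worm_id, "Vendor B", "Bozori.A")
registry.add_alias(worm_id, "Vendor C", "IRCbot.worm")
print(worm_id, "->", registry.names_for(worm_id))
```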

  • "Cell Phones Could Send Real-Time Safety Data"
    Daily Californian (09/21/05); Sharp, Sonja

    A team of UC Berkeley researchers led by computer science graduate student R.J. Honicky is designing passive sensors that can turn cell phones into transmitters of environmental and geospatial data that could be critical to first responders and others. The information collected by the sensors could be used by researchers to map air pollution or radiation levels, or to supply real-time safety data that first responders could consult on their phones should a terror attack or natural catastrophe occur. Honicky says current wireless networks that gather water quality or air pollution data rely on expensive and short-lived sensor network units, while cell phones have the advantage of already containing most of the technology needed to set up a network. He says this "great economy of scale" would let scientists bring the technology to developing nations where environmental monitoring efforts receive little funding and larger networks are unaffordable. According to the researchers, cell phones could be outfitted with sensors for an installation cost of less than a dollar per unit. Scientists also envision exterior sensors that could be affixed to a cell phone and used to measure pollution levels in water; a sketch of the kind of report such a handset might send follows this item.
    Click Here to View Full Article
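    The report such a sensor-equipped handset might transmit can be pictured as a small, location- and time-stamped record. The field names, units, and upload stub below are assumptions for illustration; the article does not describe the researchers' actual data format.

```python
import json
import time
from dataclasses import asdict, dataclass

@dataclass
class SensorReport:
    """One environmental reading tagged with where and when it was taken."""
    phone_id: str
    latitude: float
    longitude: float
    timestamp: float
    carbon_monoxide_ppm: float    # hypothetical air-quality measurement
    radiation_usv_per_h: float    # hypothetical radiation measurement

def upload(report, endpoint="https://example.org/readings"):
    """Stand-in for sending a reading over the phone's data connection."""
    print(f"POST {endpoint}: {json.dumps(asdict(report))}")

reading = SensorReport(phone_id="phone-0017", latitude=37.8716, longitude=-122.2727,
                       timestamp=time.time(), carbon_monoxide_ppm=2.3,
                       radiation_usv_per_h=0.11)
upload(reading)
```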

  • "Wearable Technology to Aid Disaster Relief"
    Computerworld Australia (09/20/05); McConnachie, Dahna

    For more than seven years the University of South Australia has been developing a wearable augmented reality (AR) system whose potential applications include disaster relief operations, according to professor Bruce Thomas of the university's wearable computer laboratory. The system is comprised of a computer that can be carried in a backpack, virtual reality goggles, and a video camera that can send information to a control room through wireless LAN or 3G networks. Thomas says the system would enable on-site disaster relief workers to collect digital images, videos, and voice information that are then geospatially mapped to data sources in the control room, enabling more effective remote collaboration with experts and supervisors. In addition, the control center can generate 3D maps and images that field operatives can see through their goggles. Thomas believes the technology could also find use in defense and viticulture applications. The professor says his team has been developing and exploring control room technologies for a number of years, but notes that the wearable technology project differs from other initiatives in several respects, "such as visualization of real-time information from one or more people in [the] field, directing people in the field, communicating with people in the field with AR information, and the presentation of data in a temporal, coherent fashion." The project and its applications will be presented by Thomas at the South East Asian Regional Computer Confederation 2005 Conference next week.
    Click Here to View Full Article

  • "By George, You've Won!"
    Guardian Unlimited (UK) (09/21/05); Burkeman, Oliver

    Computer scientist Rollo Carpenter won the 2005 Loebner prize for George, a software program that was deemed the year's most convincing conversationalist. George differs from most previous programs in that its responses are not based on a few preprogrammed language rules; rather, it has "learned" to make sensible conversation by participating in over 5.7 million exchanges with thousands of people who visited Carpenter's Jabberwacky.com Web site. Carpenter says some visitors talk with George for as long as seven hours. George is all the more fascinating in that it is given to bouts of distemper and is generally curmudgeonly, which may encourage those who converse with the program to identify it as human, at least on a semi-conscious level. Carpenter says George thinks, from a certain perspective. "My program would know precisely nothing about language, had it not learned," he explains. "So, to a reasonable degree, you could say that it's building a non-human form of understanding." The methodology used to determine the Loebner prize winner is the Turing test, a measure for machine intelligence based on the assumption that a machine can converse so convincingly as to be mistaken for a human by another human.
    Click Here to View Full Article
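    George's replies come from what it has absorbed over millions of prior exchanges rather than from hand-written rules. The toy sketch below captures only that retrieval idea at the smallest possible scale: it stores prompt/reply pairs from past conversations and answers a new prompt with the stored reply whose prompt overlaps it most. Jabberwacky's actual learning mechanism is far more sophisticated; this is an assumed simplification.

```python
import re

def words(text):
    return set(re.findall(r"[a-z']+", text.lower()))

class LearnedChatter:
    """Tiny retrieval-based conversationalist that learns from logged exchanges."""

    def __init__(self):
        self.memory = []                        # list of (prompt, reply) pairs

    def learn(self, prompt, reply):
        self.memory.append((prompt, reply))

    def respond(self, prompt):
        if not self.memory:
            return "Say something first."
        # Reuse the reply whose original prompt shares the most words.
        best_prompt, best_reply = max(
            self.memory, key=lambda pair: len(words(pair[0]) & words(prompt)))
        return best_reply

bot = LearnedChatter()
bot.learn("How are you today?", "Grumpy, as usual.")
bot.learn("What is your name?", "I'm George. Who wants to know?")
print(bot.respond("Tell me your name"))   # -> "I'm George. Who wants to know?"
```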

  • "Firefox Faces Challenges as It Matures"
    IDG News Service (09/21/05); Perez, Juan Carlos

    In its first year, Mozilla's Firefox has captured a more significant share of the browser market than any of Internet Explorer's other competitors, though its future holds challenges of its own, most notably security, expanding its user base, and surviving the forthcoming upgrade to Internet Explorer. Recent vulnerabilities have dispelled the notion that Firefox is a completely secure browser, a revelation that comes as Microsoft is testing Internet Explorer 7, which promises to address many of the highly publicized security gaps that had essentially been ignored in earlier versions. As Mozilla scrambles to keep Firefox ahead of Internet Explorer in features and security, a 1.5 version is in beta testing, though there is some doubt as to whether the changes will be sweeping enough to encourage greater acceptance among everyday users and IT departments. Mozilla says the new version's enhanced capacity for caching and pre-rendering content will improve Web navigation, and an automated update feature alerts users to the most recent patches and improvements to keep the browser current. As Firefox has proved that it is not immune to security vulnerabilities, much of the initial fervor around it has faded; analysts say the key to its sustained viability will be reacting quickly to security issues and making upgrades readily and promptly available to users. The growing pains inherent in a maturing technology, compounded by a reinvigorated Internet Explorer, pose a challenge to Firefox, which will likely retain its niche status for the foreseeable future.
    Click Here to View Full Article

  • "New Technology Aims to Improve Internet Access for the Impaired"
    Wall Street Journal (09/22/05) P. B6; Reiter, Chris

    The government, standards bodies, and companies such as Microsoft and IBM are working to improve the accessibility of computer programs and the Internet to disabled users through new technologies. IBM has contributed software programming that will allow assistive features to be incorporated into Mozilla's upcoming Firefox 1.5 Web browser. Such features will include the ability to tab over to pull-down menus in the browser window through the keyboard rather than the mouse, and Mozilla also plans to expand screen reader support. Microsoft's new Windows Vista operating system promises to streamline navigation and standardize common elements in Windows-enabled programs through its User Interface Automation Design, which the company will offer to the industry without any royalty scheme attached. Dissimilar accessibility standards among various industry and federal standards groups have long thwarted improved access to Web pages. Such access will enable businesses to tap the enormous market of visually impaired consumers, says Knowbility executive director Sharron Rush. Microsoft estimates that the number of U.S. citizens using assistive technology will rise from 57 million in 2003 to 70 million by the end of the decade.

  • "Perpetual Stumbling Block: Battery Life, Storage Capacity"
    USA Today (09/22/05) P. 1B; Kessler, Michelle

    Battery life and storage capacity lag behind the increasing capabilities and sophistication of consumer electronics, a problem compounded by the sheer diversity of batteries and storage products available. The basic structure of battery technology has remained unchanged for over 60 years. Many electronics manufacturers use proprietary batteries, despite the performance problems they can create, because the capabilities of standard batteries are constrained by their rigid shape and size. By contrast, storage technologies have improved substantially as researchers become increasingly adept at partitioning disks into smaller and smaller data storage segments. Some products are designed to tweak their performance to compensate for lingering storage and battery problems: Nokia cell phones darken their displays after several minutes of idleness, and many IBM laptops automatically dim their displays when they switch from outlet to battery power. Consumers can also take action to conserve battery power or storage space; storage maximization techniques include recording content at a lower quality or resolution and compressing files.
    Click Here to View Full Article
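    The display-dimming behavior mentioned above boils down to a small policy: lower brightness when the device goes to battery power or sits idle. The sketch below states such a policy directly; the thresholds and brightness levels are invented values, not any vendor's actual settings.

```python
def display_brightness(on_battery, idle_seconds,
                       full=100, dimmed=40, idle_timeout=180):
    """Return a target brightness (percent) under a simple power-saving policy."""
    if idle_seconds >= idle_timeout:
        return 0          # blank the display after a few idle minutes
    if on_battery:
        return dimmed     # dim automatically when unplugged
    return full

print(display_brightness(on_battery=False, idle_seconds=10))    # plugged in: 100
print(display_brightness(on_battery=True, idle_seconds=10))     # on battery: 40
print(display_brightness(on_battery=True, idle_seconds=600))    # long idle: 0
```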

  • "The Next 50 Years of Computer Security: An Interview With Alan Cox"
    O'Reilly Network (09/12/05); Dumbill, Edd

    EuroOSCON keynote speaker and Linux kernel developer Alan Cox describes computer security as "basic" and "reactive," but starting to show signs of improvement. He says the interval between the discovery of bugs and the launch of exploits has shrunk, and exploits will improve in tandem with software tools; because Linux offers greater security than many competitors, it is less vulnerable to exploits, but Cox says no system--Linux included--provides enough protection. Promising developments Cox points to include a significant uptake in code verification and analysis tools, which help prevent errors from reaching production code, and a movement toward defense in depth through the use of SELinux, no-execute flags in processors and software emulation, and randomization of where objects are located in memory. He notes that SELinux can also be employed to make users more security-conscious by turning behavioral advisories into policy. Cox believes the incorporation of security into software development tools can be done without hindering developers' productivity because many of the improvements automate tedious chores. Cox says the cost of cleaning up after system breaches is the current driver of secure software implementation, while the bad publicity such breaches entail, along with statutory data-protection duties, provides a further incentive. He reasons that lawsuits from the government or from users harmed by poorly run systems might also encourage security deployments. "In theory as we get better at security the expected standard rises and those who fail to keep up would become more and more exposed to negligence claims," Cox says.
    Click Here to View Full Article
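    One of the defense-in-depth measures Cox mentions, randomizing where objects sit in memory, works because an exploit that hard-codes an address will usually miss once the layout changes from run to run. The sketch below illustrates only that probabilistic idea with made-up numbers; it is not how a kernel actually implements address-space layout randomization.

```python
import random

PAGE = 0x1000
SLOTS = 256                       # possible load positions in this toy model

def load_address(randomize):
    """Pretend loader: a fixed base, or the base shifted by a random page count."""
    base = 0x40000000
    return base if not randomize else base + PAGE * random.randrange(SLOTS)

def exploit_hits(hard_coded_guess, trials, randomize):
    """Count how often an attack aimed at one fixed address finds its target."""
    return sum(load_address(randomize) == hard_coded_guess for _ in range(trials))

guess = 0x40000000                # attacker assumes the unrandomized address
print("Without randomization:", exploit_hits(guess, 1000, False), "/ 1000 hits")
print("With randomization:   ", exploit_hits(guess, 1000, True), "/ 1000 hits")
```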

  • "New MPEG Standard Starts to Take Shape"
    TechNewsWorld (09/20/05); Korzeniowski, Paul

    With the number of HDTVs worldwide expected to increase from about 10 million to 52 million by 2009, a new video codec standard has emerged: MPEG H.264 is used to deliver broadband programming, and could eventually govern the video transmissions of cell phones, desktop PCs, and portable computers. Providing HDTV demands considerable bandwidth, with each channel typically requiring 25 million bytes. As HDTV entered the popular sphere, vendors set to work on new video compression techniques and multiple standards appeared; MPEG H.264 was ratified in 2003 and offers roughly a three-to-one compression ratio, which reduces bandwidth requirements for service providers. Microsoft began promoting its own standard, Windows Media Video, but has since scaled back its support, leaving the market open for MPEG H.264. Comcast, Echostar, and DirecTV have all adopted the standard, and many set-top boxes and DVRs now support it. The standard is still relatively new and has some interoperability issues, while the heavy investment many providers have made in MPEG-2 has made them slow to upgrade. As MPEG H.264's popularity grows, it has ventured into other media, such as Sony's PlayStation video game console, with more applications in a diverse set of devices certain to follow.
    Click Here to View Full Article
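    The appeal of the new codec is largely arithmetic: at roughly a three-to-one improvement, whatever bandwidth a channel needed before needs only about a third of it afterward. The lines below restate that calculation, treating the article's 25-unit per-channel figure as the pre-H.264 requirement, which is an assumption; the ten-channel link is likewise hypothetical.

```python
channel_requirement = 25.0    # per-channel figure cited in the article (units as given)
compression_gain = 3.0        # approximate three-to-one improvement of H.264

per_channel_h264 = channel_requirement / compression_gain
print(f"Per-channel requirement with H.264: about {per_channel_h264:.1f}")

# Hypothetical link sized for 10 channels of the older encoding:
link_capacity = 10 * channel_requirement
print(f"Channels that fit with H.264: about {link_capacity / per_channel_h264:.0f}")
```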

  • "Talking in the Dark"
    New York Times Magazine (09/18/05) P. 24; Thompson, Clive

    The recent experience of Hurricane Katrina was an excruciating lesson in our utter dependence on communications systems: The panic and chaos that followed the storm were exacerbated by the failure of communications networks, as the only devices that still worked in the week after Katrina hit were satellite phones and two-way radios. Wi-Fi mesh offers a self-correcting communication system capable of surviving a disaster of Katrina's magnitude. Conventional phone systems are centrally operated, meaning that the disruption of a small cache of switches cuts off service for a large portion of users; they are also frequently overwhelmed in times of disaster, because they are designed to let only about 10 percent of customers talk at once. Wi-Fi mesh systems are inexpensive and decentralized, and can readily support a phone system resilient to disaster. Meshed Wi-Fi can be thought of as a wide-scale bucket brigade, with each node relaying data to the next, located only a few hundred feet away; Wi-Fi also supports VoIP, and the whole mesh can reach the Internet as long as just one node has an upstream connection. Mesh networks are well suited to disaster situations because the removal of a given node does little to disrupt a widely deployed network, a property sketched after this item. They are also remarkably efficient and inexpensive, as each node consumes only about 10 watts and costs around $350 to deploy, a figure that rises to $650 with the addition of an emergency battery. Though Wi-Fi nodes do require a clear line of sight to communicate with each other, their marginal cost makes widespread deployment in densely clustered urban areas eminently viable.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
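    The "bucket brigade" behavior described above is ordinary multi-hop routing: each node only has to reach a neighbor a few hundred feet away, and traffic finds another path when a node drops out. The breadth-first search below sketches that resilience on an invented four-node neighbor graph; real mesh routing protocols are considerably more involved.

```python
from collections import deque

def find_route(neighbors, source, dest):
    """Breadth-first search for a relay path through a mesh of nodes."""
    queue, seen = deque([[source]]), {source}
    while queue:
        path = queue.popleft()
        if path[-1] == dest:
            return path
        for nxt in neighbors.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy mesh: A can reach D by relaying through B or through C.
mesh = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print("Route:", find_route(mesh, "A", "D"))

# Knock out node B (storm damage, dead battery): traffic reroutes through C.
damaged = {n: [m for m in links if m != "B"] for n, links in mesh.items() if n != "B"}
print("Route without B:", find_route(damaged, "A", "D"))
```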

  • "New Technology Aims to Making Academic File Sharing Easier"
    Associated Press (09/21/05); Armas, Genaro C.

    A new peer-to-peer network that would allow academic researchers to share photos, research, class materials, and other types of information more easily is expected to be available for general use on Sept. 30. Mike Halm, director of the LionShare project at Penn State University, describes the technology as much more than an academic version of Napster because it combines file sharing with repository searching to make a single search "like Google-searching the Internet." Faculty, researchers, and students would be able to use the private network to share and search for large files among colleagues at their own or other institutions. Electronic Frontier Foundation intellectual property attorney Fred von Lohmann sees the benefit of being able to share files without running your own Web server, but still expresses concern about the potential for copyright infringement on the closed networks. "It all comes down to how people share content and what restrictions they put on the content that they share," explains Halm. Researchers at Penn State have led the development of LionShare, with participation from the Internet2 consortium and Simon Fraser University in Canada.
    Click Here to View Full Article

  • "NSA Granted Net Location-Tracking Patent"
    CNet (09/21/05); McCullagh, Declan

    The National Security Agency has received a patent on a technique for determining the geographic location of an Internet user. Although the NSA does not offer specifics about potential uses of the geo-location technology, the method could enhance the agency's signals intelligence mission, which involves spying on the communications of foreigners. The patented technique measures how long it takes computers at many known locations on the Internet to exchange data, and uses those latencies to build a topology map of known Internet addresses. The idea is that an unknown Internet address can then be placed on the map by measuring how long known computers take to connect to it, as sketched below. "It wouldn't give precision, but it would give them a clue that they could use to narrow down the location with other intelligence methods," says geo-location technology expert Mike Liebhold, a senior researcher at the Institute for the Future. Still, the technique can only trace dial-up users to their Internet service providers, and has no answer for proxy services such as Anonymizer.
    Click Here to View Full Article
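    The core idea, as the article describes it, is that round-trip times from computers at known locations hint at where an unknown host sits. The sketch below makes the crudest possible estimate, guessing that the host is near whichever landmark reaches it fastest; the landmarks and timings are invented, and a real system would fold many such measurements into a topology map rather than taking a single minimum.

```python
def estimate_location(latency_ms):
    """Guess a host's rough location from round-trip times to known landmarks.

    latency_ms maps (city, latitude, longitude) of a landmark host to the
    measured round-trip time to the unknown host, in milliseconds.
    """
    return min(latency_ms, key=latency_ms.get)

# Hypothetical measurements from three landmark hosts to one unknown address.
measurements = {
    ("Washington, DC", 38.9, -77.0): 12.0,
    ("Denver", 39.7, -104.9): 48.0,
    ("Tokyo", 35.7, 139.7): 160.0,
}
city, lat, lon = estimate_location(measurements)
print(f"Unknown host is probably nearest {city} ({lat}, {lon})")
```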

  • "A Sci-Fi Future Awaits the Court"
    Wired News (09/22/05); Schneier, Bruce

    The possibility of John Roberts being confirmed as chief justice of the Supreme Court concerns Counterpane Internet Security CTO Bruce Schneier, because technology-driven privacy challenges currently in the realm of science fiction will likely come before the court in the next several decades, and Roberts' opinions on privacy remain vague and dismissive. Schneier is confident that genetic mapping could one day be carried out cheaply and easily without a person's knowledge, and he wonders what protections, if any, people have against government or private actions based on the data such technology provides. The Counterpane CTO also expects technological innovation to make citizen surveillance even more personal and intrusive through advances in automatic face recognition and data mining. One of the sinister consequences could be the suspicion and detention of people for crimes or conspiracies based on an indeterminate series of coincidences. Schneier also posits a scenario in which corporations use the results of data mining to deny people services, jobs, or mortgages. The U.S. Constitution's failure to explicitly enumerate the right to privacy is a further source of concern, and certain legal scholars are calling for an overhaul of privacy law. Schneier concludes that "we as a country are likely to face enormous challenges to personal privacy in the decades ahead. And the Supreme Court will increasingly have to rule on questions so far only discussed in science fiction books."
    Click Here to View Full Article

  • "Clarkson University Wins First TuxMasters Invitational"
    NewsForge (09/15/05); Lyman, Jay

    Clarkson University in Potsdam, N.Y., recently took first and second place in the first TuxMasters Invitational coding contest, a competition sponsored by Unisys and the Open Source Development Labs' (OSDL) Data Center Linux Initiative and aimed at promoting Linux and the broader open source community. The first prize was awarded to graduate students Patricia Jablonski and Todd DeShane for a system that manages and searches large sets of frequently accessed data. The system employs collaborative data mining to address the central problem of understanding large data sets, which argues for its broad relevance to the enterprise environment. Regarding the project's utility to the business community, Jablonski and DeShane wrote that their "project accomplishes this by focusing on human-computer interaction, usability, presentation, and documentation." The system coalesces multiple databases and caches the results in a digestible format, and it lets users build on each other's search queries. Open source projects hold the added advantage for students that their work can have an impact on real-world computing. Clarkson professor Jeanna Matthews has advised students on seven winning entries in open source competitions since 2001, a streak that has solidified Clarkson's niche in the open source academic community and earned the school substantial funding through prize money. The deadline for the second TuxMasters Invitational comes at the end of the month, and OSDL's Derek Rodner expects a greater turnout; he stresses to all entrants the importance of business-ready Linux, noting that in this competition Clarkson was well ahead of the pack on that front.
    Click Here to View Full Article

  • "IT Pros Aid in Search for Katrina Victims"
    Computerworld (09/19/05) P. 8; Vijayan, Jaikumar

    Individual IT volunteers and nonprofit organizations have pitched in to help locate missing victims of Hurricane Katrina with Web sites, hotlines, and computing resources. Technology For All, based in Houston, set up a computer center at the Astrodome to help evacuees register as survivors and search for missing friends and relatives. Technology For All President Will Reed said the effort involved deploying some 140 PCs at the stadium and two other relocation facilities, 150 access devices, and a wireless network that allowed volunteers to walk around the Astrodome and enter data about evacuees through handhelds. Meanwhile, the National Center for Missing & Exploited Children (NCMEC) was enlisted by the government to locate storm victims, and IT director Steven Gelfound said the resulting uptick in network traffic was so massive that NCMEC brought several old Web servers out of mothballs to handle it. In addition, the center has established a call center equipped with voice-over-IP and wireless networking technologies, and has sent personnel to survivor relocation centers to help transmit digital photos and other information back to headquarters. Former Unix administrator Dan Chaney contributed to the effort with a missing-persons Web site that was first hosted by his own home-based Linux server. When the traffic threatened to overwhelm Chaney's T1 line, Yahoo! hosted the site on its servers for free.
    Click Here to View Full Article

  • "Bringing Network Effects to Pervasive Spaces"
    IEEE Pervasive Computing (09/05) Vol. 4, No. 3, P. 15; Edwards, W. Keith; Newman, Mark W.; Smith, Trevor F.

    A team of researchers from the Georgia Tech College of Computing and the Palo Alto Research Center (PARC) has developed the Obje Interoperability Framework, middleware technology that facilitates interaction between networked applications and services, even in instances where they know next to nothing about each other. The researchers say the technology represents part of an effort "to move away from a model of combinatorial complexity toward one of constant complexity--where integrating each new device is just as easy as integrating the one before it." Obje employs a handful of simple and consistent abstractions that are assumed to be universally understood by all network peers: According to the infrastructure, a device can connect to other devices, supply metadata about itself, provide references to other devices, and be controlled. Devices can therefore be compatible with new device types added to the network, provided they use these abstractions to represent their functionality. The end result is devices that can interact using whatever protocols are applicable for the data being exchanged, while also allowing the recipient to render the received data. Though Obje-enabled devices might interoperate with a new device, they may not know what operations that device performs or whether it is appropriate to communicate with the device--and if so, when. The researchers envision an environment where "Users will be able to enter a pervasive space, quickly assess device capabilities, and then assemble the devices into a desired configuration."
    Click Here to View Full Article
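    The handful of universal abstractions the authors describe -- connect to other devices, supply metadata, provide references, accept control -- can be pictured as one minimal interface that every peer implements regardless of what it actually does. The Python interface below is an assumed rendering of that shape, not the real Obje API, and the speaker/phone pairing is an invented example.

```python
from abc import ABC, abstractmethod

class ObjeStyleDevice(ABC):
    """A peer that exposes only four generic abstractions to the network."""

    @abstractmethod
    def connect(self, other):
        """Open a data session with another device."""

    @abstractmethod
    def metadata(self):
        """Describe this device: its name, the data types it handles, and so on."""

    @abstractmethod
    def references(self):
        """Point to other devices this one knows about."""

    @abstractmethod
    def control(self, command):
        """Accept a generic control operation (start, stop, adjust, ...)."""

class SpeakerDevice(ObjeStyleDevice):
    def __init__(self, name):
        self.name, self.peers = name, []
    def connect(self, other):
        self.peers.append(other)
        print(f"{self.name} now streaming from {other.metadata()['name']}")
    def metadata(self):
        return {"name": self.name, "accepts": ["audio/pcm"]}
    def references(self):
        return list(self.peers)
    def control(self, command):
        print(f"{self.name} handling control command: {command}")

class PhoneDevice(ObjeStyleDevice):
    def __init__(self, name):
        self.name = name
    def connect(self, other):
        pass
    def metadata(self):
        return {"name": self.name, "provides": ["audio/pcm"]}
    def references(self):
        return []
    def control(self, command):
        pass

# A speaker that has never seen a "phone" type before can still integrate with
# one, because both sides speak only the generic abstractions.
speaker, phone = SpeakerDevice("living-room speaker"), PhoneDevice("visitor's phone")
speaker.connect(phone)
speaker.control("volume up")
```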


 