
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 726: Friday, December 3, 2004

  • "Voting Errors Tallied Nationwide"
    Boston Globe (12/01/04); Mooney, Brian C.

    At least 12 U.S. states have reported voting glitches in the month following the presidential election, citing incidents of overvoting, undervoting, misallocation, and voter disenfranchisement owing to technical or human error. "We are convinced that while the election went relatively smoothly compared to what many had expected, that does not eliminate the need to study the results and collect data to document machine malfunctions and other administrative matters," notes U.S. Election Assistance Commission Chairman DeForest Soaries. Indeed, the problems were serious and numerous enough to warrant nationwide probes by the commission and the Government Accountability Office, while minor-party presidential candidates in several states have requested recounts. Most concerns center on electronic voting or vote tabulation systems, including some that do not provide a paper trail for recounts and audits. Meanwhile, the Black Box Voting organization wants to audit election results around the country by obtaining election records, with particular emphasis on Florida; Black Box founder Bev Harris believes there is a strong likelihood that electoral fraud occurred in certain polling places on Nov. 2. Florida is in activists' cross-hairs partly on the strength of several widely publicized studies indicating that vote tabulation technologies inflated Bush's state totals, although their conclusions are disputable. One study conducted by a University of California-Berkeley team using statistical analysis estimates that Bush may have received 130,000 excess votes because of e-voting machine "irregularities." Electionline.org director Doug Chapin recommends that more machines be set up in polling places, and that the nationwide voting modernization program mandated by the Help America Vote Act of 2002 be completed.
    Click Here to View Full Article
    For information on ACM's e-voting activities, visit http://www.acm.org/usacm

  • "Founders Face the Future as the Web Turns 10"
    eWeek (12/01/04); Vaas, Lisa

    The founders of the Web examined its impact and looked ahead toward future developments at the 10th anniversary celebration of the World Wide Web Consortium on Dec. 1. Pew Internet & American Life Project director Lee Rainie put most of the Web's effects in perspective, noting that its maturation over the last decade has been accompanied by a closer alignment with world demographics, as demonstrated by more mainstream usage by women, minorities, and lower-income households. "People's use of the Internet took on the cast that's familiar with anybody who studies media usage: Whites and blacks, religious and secular, male and female, they all use the Internet differently," he explained. Rainie observed that, contrary to expectations, Internet users have actually become more sociable than nonusers. He also said more experienced Web users employ the Internet more seriously, graduating from email correspondence to online health and money management, for instance, while improved Web connections lead to more intense Web activity. One of the most consistent factors in the Internet's evolution is email's lingering status as the killer application (58 million Americans were using email on Dec. 30 alone). In addition, the Web has enhanced civic-minded activities such as political campaigning, and turned the doctor-patient relationship on its head. Anticipated future breakthroughs include Web security standardization; business processes that can prevent data overload in the retail industry as a result of radio frequency identification technologies; and a Semantic Web that promises to deliver universal Web accessibility, mobility, and multimodal interaction.
    Click Here to View Full Article

  • "The Tangled Internet: Is It Time for a New One?"
    Christian Science Monitor (12/02/04) P. 13; Lamb, Gregory M.

    A battle is being waged for the soul of the Internet, with online anarchists and criminals fighting against corporate interests that want to exploit commercial opportunities. Many of the fundamental principles on which the Internet was founded are no longer valid, such as the assumption that participants could be trusted not to exploit an open environment, says Harvard Law School Berkman Center for Internet & Society cofounder Jonathan Zittrain. He says Internet pioneers were more concerned with keeping the system away from centralized control than with protecting it from hackers, but today that openness is being threatened by companies that are clamping down on file-sharing services, for example. UCLA professor Leonard Kleinrock says the Internet still has many surprises yet to be revealed, and he foresees "intelligent spaces" in the future where the entire environment has some smart, interactive network capability. The Internet2 and Internet Protocol version 6 (IPv6) provide more secure and technically sophisticated alternatives to the current system, but some people have already begun to use the Internet2 for illegal purposes: University students have set up a file-sharing network called i2hub.com that runs over the proprietary Internet2 network, which offers bandwidth 1,000 times that of normal broadband. Content pirates' adversaries have moved to the Internet2 as well; the Motion Picture Association of America is exploring both the threats and the opportunities posed by that system. IPv6 promises "a higher bar for security," says North American IPv6 Task Force Chairman Jim Bound, but its adoption will be gradual despite its ability to support pervasive Internet access. Bound says, "This technology is not just for the elite. This technology is for all. And that's not going to happen with IPv4."
    Click Here to View Full Article

  • "Exploring the Future of the Human-Computer Interface"
    Gizmo.com.au (12/02/04)

    National ICT Australia's (NICTA) Visual Information Access Room (VIAR) is a testbed for cutting-edge human-computer interface technology. At the facility's official launch, presenters such as professor Peter Eades of NICTA's Interfaces Machines and Graphic Environments program noted that innovative human-digital interaction methodologies are key to extracting meaningful knowledge from increasingly vast volumes of data. Eades said the mouse, keyboard, and screen may be the primary interfaces right now, but future interaction will be supported by everyday objects. "The walls of our homes will become our screens, the arms of our armchairs will be input devices, and we will operate equipment with winks and nods," he predicted. Eades noted that current applications use either gesture or speech recognition, but the VIAR is able to integrate the two techniques with signal processing and more sophisticated algorithms, allowing the computer to ascertain the user's intentions more accurately. Technology employed at the VIAR includes information displays that permit either single- or multiple-user participation, or use a smart wallpaper format in which data is presented on multiple surfaces via computer projection. Among the projects currently residing at the facility are the spherical VI Ball, whose rotatable surface supports creative data visualization for exploring more natural human-display interactions; Geometry for Maximum Insight, a tool for visualizing and analyzing large and complex networks in order to uncover insightful patterns and trends; and the Phantom Desktop haptic display, which facilitates more effective interaction with and understanding of data. The NICTA researchers expect such technology will lead to applications where faster response times, greater ease of use, and more precise determination of human intention are critical.
    Click Here to View Full Article

  • "Urban Renewal, the Wireless Way"
    Salon.com (11/29/04); Baker, Linda

    Forecasters believe Wi-Fi, global positioning system (GPS) locators, pervasive networking, and other new technologies will redefine public spaces by adding a digital dimension to life on the streets. The increasing linkage between computer science and urban design is being driven by several trends, including the public sphere's status as one of the few domains still unconquered by the tech industry; the emergence of technology small enough and applications big enough to enable computing in public spaces; and virtual reality's failure to rival real-time, real-place interactions in terms of immediacy and sentience. Notable deployments of digital technologies that enhance public spaces include "smart mobs" in Tokyo that use GPS chips embedded in mobile devices to navigate the city via custom maps and buddy-finder tools. Meanwhile, Hewlett-Packard's Urban Tapestries project in Britain employs Wi-Fi-enabled networks to let users digitally tag actual locations with text and images, which others can download by pointing their handhelds at a tagged site. Intel People and Practices Lab researcher Michelle Chang has designed a real/virtual street game as a research tool for documenting urban activity on New York streets: players are assigned random blends of objects, practices, and locations and record the resulting stunts, a throwback to old-fashioned, increasingly obscure games such as hopscotch and stickball. These various projects reflect the distinction the urban planning community makes between practical and playful digital tech deployments. Philadelphia urban planner Scott Page says people in his profession lack a far-reaching strategy for implementing mobile and wireless technologies, although he recently devised such a strategy for a poor neighborhood: it involves setting up a community technology center where kids can learn GIS skills to build a neighborhood database, along with publicly accessible digital bulletin boards that also function as art.

  • "Gamers Eye Open Virtual Worlds"
    Wired News (12/02/04); Terdiman, Daniel

    For multiplayer online gamers such as University of Michigan professor Peter Ludlow, the idea of an open-source virtual environment constructed by independent contributors and with no corporate policing is very appealing. Ludlow, who was expelled from "The Sims Online" gaming community by Electronic Arts, says the architecture of an open-source game world "wouldn't be dictated, but would emerge from numerous people trying to extend the game space." He envisions a single metaverse woven together from content created by the members of such open-source groups as Multi-User Programming Pedagogy for Enhancing Traditional Study (MUPPETS) and the Open Source Metaverse Project (OSMP). The OSMP was developed by Hugh Perkins as a platform on which to build an unrestricted 3D environment with infinite extensibility; Perkins says keeping the environment open source allows the project to "grow organically with time as more people come in to use it [and] bring their own ideas [and] ways of working." MUPPETS is designed by Rochester Institute of Technology professor Andy Phelps as an educational tool for new students, even before they become competent in programming. Phelps explains that MUPPETS participants basically receive a plot of land on which they can create anything--as long as it does not violate some fundamental academic codes of conduct--that can interact with objects built by other participants. No restrictive terms of service or end-user license agreements apply in the MUPPETS metaverse, and participating students have created player vs. player games as well as 3D structures that could function as game settings. Ludlow believes multiplayer online games could dramatically improve by meshing such projects together into a single shared metaverse.
    Click Here to View Full Article

  • "The Battle Against Cyberterror"
    IDG News Service (12/01/04); Blau, John

    Although Eric Byres of the British Columbia Institute of Technology's Internet Engineering Laboratory thinks terrorist organizations currently lack the technical knowledge needed to breach utility networks, evidence suggests that they have started to build such skills. Consultant Justin Lowe notes that sensitive documents about supervisory control and data acquisition (SCADA) systems were uncovered in al-Qaeda hideouts, while Mi2g Chairman DK Matai warns that many skilled hackers are willing to offer their services for personal profit or to serve a political agenda. Consultant Joe Weiss says utilities and factories are now using the Internet to convey SCADA messages from a growing population of Web-enabled, remote-control systems, while many of their so-called "private networks" employ fiber-optic links and transmission services from telecom companies that are frequently targeted by cyberattacks. Lowe also says that terrorists who cannot launch external attacks against utility networks can always try an internal attack by enlisting disgruntled employees or other susceptible personnel. MCI's Vint Cerf says the Internet's chief strength and weakness is the fact that "everything is connected," and he recommends that every host in every internal network install a firewall and deploy a strong authentication scheme. However, attacking the Internet itself is a difficult proposition because of its distributed architecture, says ICANN's Steve Crocker. Byres says that although the level of protection for critical infrastructure varies throughout the world, there is no direct correlation between a country's protection and its economy.
    Click Here to View Full Article

  • "University of Ulster Research Pushing Back the Frontiers of Space"
    ScienceDaily (12/02/04)

    University of Ulster researcher Roy Sterritt sees autonomic computing as the long-term solution to keeping computing networks running in the years to come. Sterritt, who works in Ulster's Computer Science Research Institute, estimates that by 2010, 220 million people--more than the current number of workers in the United States--will be needed to provide IT support for computing networks. His team is working with experts from BT to give telecommunications and computing networks, which are already creating management problems, the capability to manage and heal themselves. Sterritt presented to NASA scientists in Washington his research on developing computer systems that would operate similarly to the autonomic nervous system of the human body, which manages biological systems; such computer systems would need far less human intervention. NASA plans to launch the Autonomous Nano-Technology Swarm (ANTS) mission between 2020 and 2030. "The mission is viewed as the prototype for how many future unmanned missions will be developed and how future space exploration will exploit autonomous and autonomic behavior," says Mike Hinchey, director of NASA's Software Engineering Laboratory.
    Click Here to View Full Article
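    The self-managing behavior Sterritt describes is often summarized as a monitor-analyze-plan-execute control loop. The Python sketch below is a minimal, hypothetical illustration of that loop, not code from the Ulster or BT work; the node names, health metric, and threshold are all invented.

```python
import random
import time

def monitor(node):
    """Sample a health metric for a network node (simulated here)."""
    return {"node": node, "error_rate": random.random()}

def analyze(reading, threshold=0.8):
    """Decide whether the reading indicates a fault."""
    return reading["error_rate"] > threshold

def plan(reading):
    """Choose a corrective action; a real system would consult policies."""
    return ("restart", reading["node"])

def execute(action):
    """Carry out the chosen action (logged rather than performed here)."""
    verb, node = action
    print(f"self-healing: {verb} {node}")

def autonomic_loop(nodes, cycles=3):
    """Run the monitor-analyze-plan-execute cycle without human intervention."""
    for _ in range(cycles):
        for node in nodes:
            reading = monitor(node)
            if analyze(reading):
                execute(plan(reading))
        time.sleep(0.1)

autonomic_loop(["router-a", "router-b", "switch-c"])
```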

  • "NASA Gives Researchers Millions"
    Daily Trojan (12/02/04); Hawkins, Stephanie

    A $58 million NASA grant will be divided among three professors at the University of Southern California's Information Sciences Institute (ISI) for the development and analysis of systems that are construction-based, computer-based, and logistical, respectively. ISI senior research scientist Wei Min Shen received $28 million for his "Modular, Multifunctional Reconfigurable Superbot" study, which seeks to build and analyze shape-shifting robot modules that are not only self-reconfiguring, but capable of autonomous connection to each other. Veteran NASA researcher John Damoulakis was allocated $18.1 million for his four-year "Fault-Aware, Modular, Reconfigurable Space Processor" project, whose goal is to produce reliable space-computing systems capable of fault comprehension and self-repair; 3D imaging and mapping will be employed to demonstrate the project's conclusions. A project led by ISI professor Robert Neches, "Coordinated Multisource Maintenance-on-Demand," will receive approximately $15 million to investigate ways to help the space agency weigh expected equipment requirements against the probability of malfunction through improved cost-benefit analysis. The overall target of Neches' project is to lower the support costs of scientific and exploration activities while maximizing their efficiency and resiliency. "The people at ISI are extremely pumped about [the award] because only 16 of the 70 grants that NASA did went to universities and we were the only one to get three," reports ISI public information director Eric Mankin. ISI executive director Herbert Schorr explains that USC made itself appealing as a grantee by focusing on areas of research that are of great interest to the agency, and by organizing teams that NASA would appreciate.
    Click Here to View Full Article

  • "Street Smarts: A Device to Help the Blind Find Crosswalks"
    New York Times (12/02/04) P. E6; Austen, Ian

    Researchers at Japan's Kyoto Institute of Technology have developed a software system designed to help visually impaired pedestrians cross streets safely by identifying crosswalks through camera image analysis. The concept calls for the pedestrian to wear a pair of eyeglasses outfitted with a digital camera and processor, and a speaker to relay synthesized vocal warnings when a crosswalk is detected; however, the software can only identify zebra crossing-style crosswalks at the moment, which are common in Japan but not North America. The software basically draws a virtual line out onto the street, where the borders of the painted white lines of the zebra crossing manifest themselves as a predictable series of points. The pattern is disrupted by passing vehicles, so the system can scan several images before determining a crosswalk's presence. The Kyoto Institute researchers report in the November issue of Measurement Science and Technology that the software correctly recognized 194 of 196 crosswalks in tests, and they intend to integrate the software with an older system designed to detect the color of traffic lights. AccessWorld editor in chief Jay Leventhal notes that blind pedestrians face the greatest risk when crossing wide roads that do not have crosswalks or traffic signals, while the software's inability to detect cars making right turns into crosswalks against red lights is another danger. He says it is advisable to incorporate crosswalk systems into currently available Global Positioning System technology designed to help visually impaired people navigate, although an even better solution is to design roads, traffic signals, crossings, and sidewalks to account for blind people. Other companies and researchers working on assistive navigational devices for the blind have relied on ultrasonics, which can neither specifically recognize crosswalks nor determine traffic signals.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
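    The detection scheme described above--sampling a virtual line across the street and looking for the regular bright/dark transitions of zebra stripes, confirmed over several frames--can be sketched compactly. This is an illustrative simplification, not the Kyoto Institute of Technology's algorithm; the thresholds and the synthetic scanline are invented.

```python
def stripe_edges(scanline, threshold=128):
    """Indices where the scanline crosses between dark asphalt and white paint."""
    bright = [value > threshold for value in scanline]
    return [i for i in range(1, len(bright)) if bright[i] != bright[i - 1]]

def looks_like_crosswalk(scanline, min_stripes=4, tolerance=0.25):
    """True if the edge spacing is regular enough to suggest zebra stripes."""
    edges = stripe_edges(scanline)
    if len(edges) < 2 * min_stripes:
        return False
    gaps = [b - a for a, b in zip(edges, edges[1:])]
    mean = sum(gaps) / len(gaps)
    return all(abs(gap - mean) <= tolerance * mean for gap in gaps)

def detect_over_frames(frames, required=3):
    """Require agreement across several frames, since vehicles disrupt the pattern."""
    return sum(looks_like_crosswalk(frame) for frame in frames) >= required

# Synthetic scanline: alternating dark road (~40) and white paint (~220) pixels.
stripe = [40] * 10 + [220] * 10
frame = stripe * 6
print(detect_over_frames([frame, frame, frame]))  # True
```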

  • "Haptics: Can You Feel the Buzz?"
    TheFeature (11/29/04); Frauenfelder, Mark

    Haptics technology is making a splash in the gaming, automotive, medical, and graphics sectors, and developers are preparing new applications for the mobile communications sector. Current haptic products, which incorporate tactile sensations of temperature, texture, pressure, and vibration into digital systems, include dummies outfitted with force-feedback haptic simulators for training doctors and nurses; iDrive knobs in BMWs that impart unique sensations for specific in-vehicle functions; and 3D armature devices that enable animators to "feel" virtual clay as it is manipulated on a monitor. Upcoming mobile haptics technologies include Samsung mobile phones that use Immersion's VibeTonz development platform, which augments various functions--ringtones, games, navigation, chat, and so on--via the device's vibrate motor. VibeTonz, for example, could be used in phones that emit buzzes and pulses in tune with the ringtone, and chat emoticons that convey a kissing, slapping, or purring sensation. Meanwhile, Nokia Research scientists equipped a phone with an acceleration sensor to support a game similar to Pong that is controlled by the user tapping the device either vertically or horizontally, which triggers distinctive vibrations that generate feedback. The effect is "a kind of kinesthetic illusion of a soft ball being tapped and bouncing inside the device," according to a paper presented at the NordiCHI human-computer interaction conference. Researchers at the University of British Columbia have created a prototype Wireless Haptic Texture Sensor (WHaT), a cordless stylus that simulates the tactile sensation of any textured surface. Potential mobile applications include phones that feel different depending on who is calling, and displays that function as virtual fabric swatches.
    Click Here to View Full Article
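    VibeTonz is a proprietary platform and the article does not describe its API, but the per-caller haptic effects it enables can be imagined as timed pulse sequences fed to a vibrate motor. The sketch below is purely hypothetical; the pattern names, timings, and motor interface are invented for illustration.

```python
# Hypothetical pulse sequences: (duration_ms, intensity 0-255) pairs.
PATTERNS = {
    "kiss": [(40, 80), (40, 0), (40, 80)],
    "slap": [(120, 255)],
    "purr": [(300, 60), (100, 0), (300, 60)],
}

# Phones that "feel different depending on who is calling."
CALLER_PATTERNS = {"+1-555-0100": "purr", "+1-555-0101": "slap"}

def play(pattern, motor):
    """Drive a vibrate motor with a timed pulse sequence."""
    for duration_ms, intensity in pattern:
        motor(intensity, duration_ms)

def on_incoming_call(number, motor):
    """Pick the caller's haptic pattern, falling back to a default."""
    play(PATTERNS[CALLER_PATTERNS.get(number, "kiss")], motor)

# A stand-in "motor" that logs what a real driver would do.
on_incoming_call("+1-555-0100",
                 lambda level, ms: print(f"vibrate {level}/255 for {ms}ms"))
```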

  • "On the Tech Radar"
    ZDNet (11/29/04); Farber, Dan

    Gartner's "Technology Radar Screen 2005-2014" maps the emergence of pervasive and intelligent computing, enabled by advances in sensors, networking, and user interfaces. Many technology predictions have proven to be too optimistic, while others have been understated, such as the impact of the Internet over the last 10 years; likewise, the social changes caused by Gartner's predicted technologies could make those developments even more disruptive than currently understood. Mesh networking and increasingly sophisticated wireless technologies will provide people with seamless connectivity even in rural areas, and wireless sensor networks will improve security and support decision-making and automated business operations. Companies will need to employ grid computing, predictive modeling, and semantic metadata standards in order to deal with this sea of incoming information. Privacy will continue to be a big issue, but the debate will not be over whether to collect data but rather how to control access. Gartner predicts that by 2008, intermediary services will exist to manage personal data in commercial transactions, while in 10 years, the development of wearable computing and increasingly optimized user interfaces will provide people with augmented reality and whatever data they need for the task at hand. Low-cost flexible displays and improved power management will allow portable devices to carry richer media. Gartner also says 70 percent of the developed world's populace will interact with others in the digital realm 10 times longer than they do in the real world by 2010, requiring more sophisticated collaboration technologies such as Wikis and other tools supporting distributed decision-making.
    Click Here to View Full Article

  • "Cyber Detective Links Up Crimes"
    New Scientist (12/01/04); Graham-Rowe, Duncan

    DePaul University computer scientists Tom Muscarello and Kamal Dahbur have developed a system that uses artificial intelligence to compare new crime case records with all the files on past criminal offenses. The Classification System for Serial Criminal Patterns (CSSCP) combs through all available case records, assigns distinct numerical values to specifics such as the type of crime, the perpetrator's gender, and the weapon used, and organizes them into a crime profile; crimes with similar profiles are then sought by a Kohonen neural network, which is especially adept at uncovering patterns in input data without human assistance. If a possible connection between two offenses is established, the system compares time and location to determine whether the same perpetrators could have traveled between the two crime scenes within the time limit. Muscarello explains that CSSCP was modeled after conventional crime-solving methodology to a certain degree: Just as several detectives may be assigned to handle individual aspects of a single case, so can the system tap different neural networks to analyze specific angles in an investigation. Muscarello reports that in laboratory tests using three years' worth of armed robbery data, CSSCP identified 10 times as many patterns as a team of detectives. However, the DePaul computer scientist insists that the system is not designed to replace human investigators, but instead to serve as a jumping-off point for investigations by finding potentially linked offenses. Muscarello hopes the Chicago police department will agree to run trials of CSSCP.
    Click Here to View Full Article
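    CSSCP itself relies on Kohonen neural networks, but the two-stage logic the article describes--first match numeric crime profiles, then check whether one crew could have traveled between the scenes in the available time--can be conveyed with a far simpler sketch. Everything below (the field encodings, travel speed, and records) is invented for illustration and is not the DePaul system.

```python
from math import dist

# Each record encodes categorical specifics as numbers, as the article describes:
# profile = (crime_type, perpetrator_gender, weapon), plus hour and scene location.
crimes = [
    {"id": 1, "profile": (2, 1, 3), "hour": 22.0, "xy": (0.0, 0.0)},
    {"id": 2, "profile": (2, 1, 3), "hour": 23.5, "xy": (10.0, 5.0)},
    {"id": 3, "profile": (1, 2, 1), "hour": 9.0, "xy": (2.0, 1.0)},
]

def similar(a, b):
    """Stage 1: do the numeric crime profiles match?"""
    return a["profile"] == b["profile"]

def feasible(a, b, speed=30.0):
    """Stage 2: could one crew travel between the scenes in the time gap?"""
    return dist(a["xy"], b["xy"]) <= speed * abs(a["hour"] - b["hour"])

def candidate_links(records):
    """Pairs worth a detective's attention: similar profiles AND feasible travel."""
    return [(a["id"], b["id"])
            for i, a in enumerate(records)
            for b in records[i + 1:]
            if similar(a, b) and feasible(a, b)]

print(candidate_links(crimes))  # [(1, 2)]
```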

  • "An Added Dimension for Virtual Museums"
    IST Results (12/02/04)

    Museums can now set up 3D manipulable virtual exhibitions using technology developed by the IST ARCO project. Digitized museum pieces are rendered in either X3D or Virtual Reality Modeling Language (VRML), and 3D models are produced via object modeling software; other tools make the 3D models interactive, and then the models and their associated metadata--age, shape, color, etc.--are represented in XML for archival in an object-relational database. ARCO project coordinator Martin White says the database and a content management application enable a museum to store all the digital data in a catalog and link the data to templates, so that the museum can build and update virtual exhibitions. The templates can be employed to visualize objects on the Internet or a museum kiosk, while interacting with the objects in virtual reality requires users to install the Cortona VRML plug-in. User trials of the ARCO system involved three European museums, which are working on licensing models to allow any museum to adopt the technology. White says the project partners were excited about the technology as a tool for enhancing the visitor experience as well as improving access to content for the disabled. He says, "ARIF allows the user to send a virtual museum artifact from a Web page to the augmented reality application. The Web page will disappear and be replaced by a video stream from a Web camera pointed at a marker card, used as a spatial reference," and a special mouse can be used to give users a tactile sensation for virtual objects.
    Click Here to View Full Article
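    The article says each digitized object's model and metadata (age, shape, color, and so on) are represented in XML for storage in an object-relational database. The sketch below shows what such a record might look like; the element names are invented for illustration, not the actual ARCO schema.

```python
import xml.etree.ElementTree as ET

def artifact_record(name, model_uri, **metadata):
    """Wrap a digitized object's model reference and metadata in one XML record."""
    root = ET.Element("artifact", name=name)
    ET.SubElement(root, "model", href=model_uri, format="VRML")
    meta = ET.SubElement(root, "metadata")
    for key, value in metadata.items():
        ET.SubElement(meta, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(artifact_record("Amphora", "models/amphora.wrl",
                      age="c. 500 BC", shape="ovoid", color="terracotta"))
```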

  • "Binary XML Proponents Stir the Waters"
    SearchWebServices.com (11/22/04); Mimoso, Michael S.

    The W3C ought to specify a single binary XML format in order to head off multiple individual efforts from the wireless and mobile computing industries, says W3C XML Binary Characterization Working Group member Michael Leventhal in an interview. At the recent XML Conference & Exhibition 2004, Leventhal presented the case for a binary XML standard and met with a more positive response than anticipated. Software vendors worry that a binary version of XML could devalue their current products. Leventhal says Microsoft is the largest force opposing binary XML because a standard would threaten so many of its applications; the company has refused to join the current working group but has also kept an open mind. XML 1.0 is a human-readable format that is simple and straightforward, but a binary version would enable companies in the wireless and mobile computing industries to move beyond current bandwidth and processing roadblocks. When XML was created, it was mostly used for publishing, but it is increasingly used for real-time transactions and is consuming more network bandwidth. Basically, text XML is too cumbersome for bandwidth-constrained applications; the military and radio and television broadcasters have already adopted binary XML for this reason. For low-power mobile devices such as PDAs, binary XML is critical for processing Web XML data more efficiently and thus saving battery power. Leventhal says a binary XML standard that guarantees interoperability with the existing XML 1.0 software stack and convertibility between binary and text versions would be generally accepted. Leventhal says, "This could potentially start making the computing landscape real different...Changes are coming. What form it takes or whether it reaches the objective of making the Web interoperable is what's in the balance."
    Click Here to View Full Article
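    The bandwidth argument is easy to make concrete: the same small real-time reading costs far fewer bytes in a fixed binary layout than as text XML. The sketch below uses a naive struct packing as a stand-in; it is not any format actually proposed to the W3C.

```python
import struct
import zlib

# A small "real-time transaction" as text XML: 50 bytes.
text_xml = b'<reading sensor="7" value="23.5" ts="1101988800"/>'

# The same fields in a naive fixed binary layout (id, value, timestamp): 10 bytes.
binary = struct.pack("!Hfi", 7, 23.5, 1101988800)

print(len(text_xml), "bytes as text XML")
print(len(binary), "bytes as naive binary")
print(len(zlib.compress(text_xml)), "bytes as zlib-compressed text")
```

    Compression narrows the gap, but a device still pays the cost of parsing text, which is why the summary singles out battery savings on low-power PDAs.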

  • "Mesh Moves Into the Wireless Office"
    Computerworld (11/29/04) Vol. 32, No. 48, P. 27; Wexler, Joanie

    Wireless mesh networks are spreading among organizations that need 802.11 infrastructure that is easy to install and configure; standards for 802.11 mesh networking are not due out for some time, but a number of vendors already offer mesh networking products that allow users to easily move and add new access points. The main benefit of mesh networks is the ability to get rid of the wires connecting access points to the wired infrastructure. Ironically, cabling is one of the most costly and complex aspects of traditional wireless deployments today; WLAN meshes automatically self-configure so that IT staff only have to find a power source for a new access point--the nodes communicate with each other to determine the most efficient transmission path. Interference is something of a problem for WLAN mesh technology and can hamper throughput, while limited speeds make permanent installations or bandwidth-hungry applications such as voice better suited to faster Ethernet infrastructure. Urology Clinics of North Texas IT manager Kyle Nash often had to physically move access points to optimize WLAN performance, but new mesh technology simplifies that task, which means the network stays up longer and can expand more easily. Shared P.E.T. Imaging sells mobile diagnostic imaging services to hospitals and previously had to spend between four and eight weeks setting up Ethernet cabling for its customers; now the company spends about an hour, says IT director Marc Simms. Although extremely beneficial in some instances, even mesh networking vendors say the technology's application is limited by the number of wireless hops required, node density, and interference. But Strix Systems' Jose Villarreal expects wireless speeds to increase quickly, and points out that some engineers never expected DSL speeds to be delivered over twisted-pair lines.
    Click Here to View Full Article
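    The path selection the article mentions--nodes communicating to find the most efficient transmission path back to the wired infrastructure--is commonly modeled as shortest-path routing over link costs. A minimal Dijkstra sketch, with invented node names and costs:

```python
import heapq

# Link costs between mesh access points (higher = worse airtime/interference).
links = {
    "ap1": {"ap2": 1, "ap3": 4},
    "ap2": {"ap1": 1, "ap3": 1, "gateway": 5},
    "ap3": {"ap1": 4, "ap2": 1, "gateway": 1},
    "gateway": {"ap2": 5, "ap3": 1},
}

def best_path(source, target):
    """Dijkstra's algorithm: the cheapest multi-hop route through the mesh."""
    queue = [(0, source, [source])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in links[node].items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None

print(best_path("ap1", "gateway"))  # (3, ['ap1', 'ap2', 'ap3', 'gateway'])
```

    Real mesh products weigh link quality dynamically rather than using static costs, so routes adapt as interference changes.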

  • "Web-Based Development: A Giant Step Forward, Small Steps Back"
    SD Times (12/01/04) No. 115, P. 17; Correia, Edward J.; deJong, Jennifer; Lee, Yvonne L.

    Former Novell CEO Bob Frankenberg recalls that few developers could initially see the possibilities of the World Wide Web beyond an online information resource, but in the 10 years following its introduction the Web has evolved into a core integration platform for applications as well as data. Credited with this evolution is a mix of both old and new technologies and tech vendors' endorsement of a unified set of standards, such as HTML, HTTP, XML, SOAP, and Web services. Forrester Research analyst Randy Heffner attests that programmers' transition from fat clients to the Web browser represented two steps back as far as usability was concerned, but three steps forward in terms of accessibility. Web-based applications development spurred the adoption of technologies that had long been the subject of intense debate: For example, Ovum analyst Gary Barnett notes that three-tier development finally emerged thanks to the use of middleware driven by refined Web-based applications, although the widespread automation of e-business initiatives has only materialized in the last several years. Microsoft's Scott Guthrie says that integrating Web apps and businesses automatically stands out from previous enterprise developments, while Frankenberg points out that a lingering debate about software reuse has been rekindled by Web development and its current focus on Web services and service-oriented architecture. "The ability to create reusable components, and place the application logic above them, has evolved [the Web] to way more than just an integration platform," he says. IBM's Bob Sutor is looking ahead to Web development's next phase, which entails the establishment of technologies and standards that will guarantee the reliability and security of Web apps in such a way that business can be performed without human involvement.
    Click Here to View Full Article

  • "Every Step You Take...Every Move You Make...My GPS Unit Will Be Watching You"
    Popular Science (11/04) Vol. 265, No. 5, P. 89; Rosenwald, Michael

    Technology such as the Global Positioning System (GPS), which is used for socially beneficial pursuits such as keeping tabs on Alzheimer's patients, rental cars, children, and valuable items, is also being exploited for stalking and other forms of criminal behavior. The technology itself, as well as sensitive information about a stalker's targets, is easily available over the Internet. The profit potential of surveillance and monitoring devices and services sold over the Internet is undeniable, one example being an online outlet controlled by entrepreneur Brad Holmes that offers products such as spy cameras, GPS trackers, and listening devices--as well as associated countermeasures. These retailers refuse to assume responsibility for how customers use the products: "People can sell beer, and depending what the customer does with it, that could be harmful too, right?" argues Track & Spy owner Greg Shields, who sells clock radios equipped with hidden video cameras, among other things. The shrinking size of GPS units that can be embedded into products has raised fears of pervasive, invisible surveillance; other technologies stalkers have used or are likely to use include keystroke-recording software, spy programs sent to victims as email attachments, and a service that sidesteps Caller ID. High-tech gadgetry allows stalkers to go about their business virtually, which makes it harder to catch them red-handed. The legal system is lagging behind technology: Nearly every U.S. state has passed some form of cyberstalking legislation, but they often exclude surveillance technologies such as GPS units, and still validate an activity as stalking only if the victim is aware and feels threatened. Tracy Bahm of the National Center for Victims of Crime's Stalking Resource Center is helping compose prototype anti-stalking legislation that can be implemented nationally and that covers technologies that have yet to be realized.
    Click Here to View Full Article

  • "From Seeing to Understanding"
    Science & Technology Review (11/04) P. 12; Heller, Arnie

    Massive amounts of data generated by supercomputers are spurring scientists at Lawrence Livermore National Laboratory to develop radical visualization techniques to better pinpoint areas and images of interest in 3D simulations. Livermore's supercomputers are committed to providing physics and engineering simulations that play a key role in ensuring the safety and security of the U.S. nuclear stockpile as part of the Advanced Simulation and Computing (ASC) program, of which the Visual Interactive Environment for Weapons Simulation (VIEWS) Program is a major ingredient. The visualization engines and associated systems software employed by ASC supercomputers used to be provided and integrated by commercial vendors, but the processors and graphics cards' cost and proprietary nature limited the machines' expansion capability; recently, the visualization engines made the transition from a proprietary shared-memory architecture to "clusters" of inexpensive commercial PC microprocessors and graphical processing unit (GPU)-equipped graphics cards appearing in PCs and gaming boxes. When coupled with software tools based on open-source code and a high-speed network running on a Linux operating system, the new visualization engine offers superior performance and simple expandability. "The GPUs give us 10 times the performance for one-fifth the cost of cards found in previous ASC visualization engines," notes VIEWS visualization project leader Sean Ahern, who estimates that the Linux clusters permit users to run larger, higher-resolution simulations without a noticeable increase in time. A diverse array of open-source cluster software has been developed for the new visualization engine: Examples include Distributed Multihead X, which can stitch displays from multiple machines together into a single screen; VisIt, a tool for viewing very large data sets; MIDAS, which allows large simulations to be run on office desktops; and Blockbuster, which can play high-resolution movies at 30 frames per second on large displays.
    Click Here to View Full Article


 