
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 656:  Monday, June 14, 2004

  • "Pioneer Who Kept the Web Free Honored With a Technology Prize"
    New York Times (06/14/04) P. C4; Shannon, Victoria

    The Finnish Technology Award Foundation will honor World Wide Web Consortium director Tim Berners-Lee with the $1.2 million Millennium Technology Prize--the biggest tech award in the world--on June 15 for conceiving of the World Wide Web and insisting that its underlying technology remain license-free. Berners-Lee muses that if his employer at the time, the European Particle Physics Laboratory, had demanded royalties, then the Internet would today be saddled with myriad incompatible "Webs." He reflects that his choice not to patent the Web's underlying software was fortunate, given the mad dash to patent software nowadays. "The problem now is that someone can write something out of their own creativity, and a lawyer can look over their shoulder later and say, 'Actually, I'm sorry, but lines 35 to 42 we own, even though you wrote it,'" Berners-Lee explains. The scientist notes that a preliminary finding by the U.S. Patent and Trademark Office that would reject a key software patent claim is an important step forward. In Europe, however, proposals to amend patent legislation in order to limit software patents may be withdrawn. Berners-Lee comments that such patents endanger the spirit of software development and threaten to choke creativity and innovation. "It's stifling to the academic side of doing research and thinking up new ideas, it's stifling to the new industry and the new enterprises that come out of that," he observes.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Undergrad Helps Make Engineering More Congenial for Women"
    Currents--UC Santa Cruz (06/14/04); Stephens, Tim

    University of California, Santa Cruz computer engineering major Angela Schmid attributes her success to her involvement in student organizations, networking with other women, and finding supportive faculty members. The 2004 Dean's Award honoree was elected co-president of the UCSC chapter of the Society of Women Engineers in 2002, and she has also served as an officer in the campus branches of the ACM and IEEE. Schmid participated last summer in UCSC's Summer Undergraduate Research Fellowship in Information Technology (SURF-IT), a National Science Foundation-funded program, and is working with computer engineering professor and SURF-IT principal investigator Richard Hughey on the Kestrel parallel processor initiative. Hughey says that this summer's SURF-IT program will bring in 12 students, 10 of them female. He says the availability of female role models and mentors is critical to recruiting and retaining women in science and engineering, which is why he is attempting to have more freshman- and sophomore-level courses in his department taught by female faculty members. Other initiatives to make engineering studies more amenable to women include the addition of a unit on gender issues to an engineering ethics course that all computer and electrical engineering majors are required to take, and firmly established policies for dealing with sexual harassment and discrimination. Schmid is considering the establishment of an honors society for leading computer and electrical engineering students that would coordinate outreach activities to local high schools and community colleges.
    Click Here to View Full Article

  • "Power to the People"
    Prague Post (06/10/04); Fernandes, Mark

    Czech scientists are building up their country's grid computing capabilities and contributing to larger European efforts, such as CERN's European Data Grid. The potential of grid computing is becoming apparent with the recent data-transfer record set between CERN and the California Institute of Technology and the upcoming opening of the Large Hadron Collider (LHC) in Geneva; those projects show grid computing to be the successor to the traditional Internet, which depends on servers and other dedicated infrastructure as sources of information. Computing grids, in contrast, pool resources and make them available in the same way electricity is drawn from a power grid. The Czech Republic has its own grid network, the Czech Education and Science Network (CESNET), which is supported by the Czech government, the European Union, and revenue earned by leasing its infrastructure. CESNET contributes resources to the larger European Data Grid and several other European projects, and Czech Academy of Sciences physics researcher Dr. Jiri Chudoba says the next few years will prove the importance of grid computing as CERN opens the LHC and uses grid technology to unravel the scientific mysteries inside the data. Researchers anticipate discoveries in the fields of physics, nanotechnology, chemistry, and biotechnology. Chudoba says major IT vendors such as IBM, Intel, and Hewlett-Packard are contributing to the project and are closely tracking the commercial potential of grid computing. In the future, businesses will be able to purchase computing resources from international infrastructure according to their particular needs, but the expansion of powerful grid technology into the commercial sector creates the opportunity for hackers to harness those resources to crack passwords; to date, grids have largely remained in the hands of researchers, Chudoba says.
    Click Here to View Full Article
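
    As an illustration of the utility-computing idea described above, the
    following Python sketch reduces a grid to a single-machine analogy: a
    process pool stands in for remote worker nodes that a job "plugs into"
    the way an appliance draws on the power grid. It is a toy, not grid
    middleware; the job function and all names are invented.

        from multiprocessing import Pool

        def analyze_event(event_id):
            """Stand-in for a physics job, e.g., reconstructing one
            collider event from raw detector data."""
            checksum = sum((event_id * k) % 7919 for k in range(1, 1000))
            return event_id, checksum / 1000.0

        if __name__ == "__main__":
            with Pool(processes=4) as pool:   # "plug into" the pooled resource
                results = pool.map(analyze_event, range(20))
            for event_id, score in results[:5]:
                print("event %d: score=%.2f" % (event_id, score))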

  • "Robotic Rock-Climber Takes Its First Steps"
    New Scientist (06/09/04); Knight, Will

    Engineers at Stanford University and NASA's Jet Propulsion Laboratory have developed a spidery prototype robot called Lemur that is designed to scale the sides of irregular surfaces. Other climbing robots employ suction cups or magnets to get a foothold, and are designed for flat surfaces; Lemur boasts a quartet of clawed, triple-jointed limbs that hook into a foothold, giving it a gait similar to that of a human climber. Once the robot maneuvers one of its limbs to a new foothold, it must maintain balance by shifting its weight via the rearrangement of its three remaining limbs, a task that is achieved through ad-hoc calculations from an onboard computer. Lemur selects the most efficient limb configuration for the next step it must take by quickly analyzing various positions through route-planning software. The robot must receive a model of the surface it has to scale--complete with foothold coordinates--before it begins its climb, but scientists ultimately want Lemur to be able to scan a surface and work out the most efficient path to take, using data collected from tactile sensors and video cameras to update its route as it ascends. Lead project engineer Tim Bretl with Stanford's robotics laboratory believes the technology behind Lemur could advance planetary exploration by making it possible to analyze geological samples from the sides of cliffs, as well as enhance search and rescue missions back on Earth. "A lot of people are becoming interested in using robots for disaster scenarios, like earthquakes," notes Gurvinder Virk of Britain's University of Leeds.
    Click Here to View Full Article
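
    The balance-maintenance step described above is, at heart, a search
    over candidate limb configurations. The following Python sketch is a
    toy version of that idea (it is not the JPL/Stanford planner): given a
    pre-supplied map of footholds, it enumerates stances for the three
    supporting limbs and keeps the one whose support centroid stays
    closest to an assumed center of mass. The cost model is invented.

        from itertools import product

        FOOTHOLDS = [(0, 0), (1, 2), (2, 1), (3, 3), (1, 4)]  # surface model

        def balance_cost(stance):
            """Lower is better: penalize stances whose centroid drifts
            from the robot's assumed center of mass at (1.5, 2.0)."""
            cx = sum(x for x, _ in stance) / len(stance)
            cy = sum(y for _, y in stance) / len(stance)
            return (cx - 1.5) ** 2 + (cy - 2.0) ** 2

        def best_stance(free_limbs=3):
            candidates = product(FOOTHOLDS, repeat=free_limbs)
            # Keep only stances with distinct footholds, pick the most stable.
            valid = (s for s in candidates if len(set(s)) == free_limbs)
            return min(valid, key=balance_cost)

        print(best_stance())   # the configuration to shift weight onto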

  • "To Judge Recent Attack on Linux's Origins, Consider the Source"
    Wall Street Journal (06/14/04) P. B1; Gomes, Lee

    Lee Gomes punches holes in a recent report from the Alexis de Tocqueville Institution insinuating that the open-source Linux software commits copyright infringement, on the basis that its creator, Linus Torvalds, has displayed nothing but contempt for intellectual-property laws from the start. The draft study, Gomes writes, is baseless innuendo that institute president Kenneth Brown supports despite findings to the contrary from the student he recruited to see if there were any similarities between the early version of Linux and Minix, the popular college operating system. The report argues that Torvalds must have stolen Minix code back in 1991, given the unlikelihood that the programmer, then 21 and in college himself, could have invented an operating system of such complexity in just a few months. Gomes counters that Linux's first incarnation was very small and barely viable, whereas its current version is the result of years of work by many volunteer programmers. In addition, the open nature of Linux would make stolen code obvious to anyone, and Torvalds has never denied that he used Minix as the starting point for Linux in the sense of examining it and setting out to build an improved version. Furthermore, Minix creator Andrew Tanenbaum recently insisted that Torvalds did not commit any impropriety. Additional evidence against Brown's allegations was published online by University of Maryland student Alexey Toptygin, who used software to check Minix and the early Linux for matches at Brown's direction--and found none. Gomes points out that one of the institute's underwriters, Microsoft (Linux's biggest adversary), has disassociated itself from Brown's study, denouncing it as "an unhelpful distraction from what matters most--providing the best technology for our customers."

  • "Beyond Proprietary Databases: Helen Borrie on the Future of Firebird"
    Linux Insider (06/11/04); Warrene, Blane

    Version 2.0 of the open-source database server Firebird is almost ready, as is an official road map that project team leader Helen Borrie says will make the software more appealing to large enterprises. Firebird aims to replace Oracle and SQL Server in commercial applications, and Borrie notes that the InterBase commercial database, from which Firebird was derived, has been used by large companies such as Lockheed, Boeing, and Motorola. Firebird support lists give a good indication of how the software is penetrating the commercial market: The main list now generates about 60 messages per day, with more on weekdays, when people are at work. Borrie says SourceForge records about 2,000 downloads of Firebird each week, and the software is also gaining significant popularity outside the United States, especially in Brazil and Russia--in fact, many of the more active contributors to Firebird development hail from Russia. A broad range of language drivers is available for Firebird, as well as a good number of Linux certifications. Borrie says Firebird is also technically appealing because of its small footprint; it needs less than 1.5 MB on the server and about 350 KB on the client side. Firebird started off as a fork of the InterBase code, which did not provide working build scripts; the developers who started Firebird rebuilt the code and fixed hundreds of bugs along the way. Version 1.5 converted the code to C++, while Version 2 aims to introduce major new features that will make Firebird suitable for much larger databases. The main development effort focuses on these core SQL features and performance enhancements, while another development branch is working to reintegrate the Yaffil code that was contributed to Firebird last November. Finally, Borrie notes that the author of the original InterBase code, Jim Starkey, is working on an architectural restructuring of Firebird for a client; that work will be donated to the larger Firebird code base.
    Click Here to View Full Article
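
    For readers who want to try Firebird from a script, the snippet below
    is a hypothetical usage sketch using kinterbasdb, the Python DB-API
    2.0 driver commonly paired with Firebird at the time (the article does
    not discuss drivers; the DSN, credentials, and table are invented).

        import kinterbasdb

        con = kinterbasdb.connect(
            dsn="localhost:/data/inventory.fdb",  # hypothetical database file
            user="SYSDBA",
            password="masterkey",
        )
        cur = con.cursor()
        # DB-API qmark-style parameters keep the query injection-safe.
        cur.execute("SELECT part_no, qty FROM stock WHERE qty < ?", (10,))
        for part_no, qty in cur.fetchall():
            print(part_no, qty)
        con.close()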

  • "Digital Pen Takes on Mouse"
    BBC News (06/10/04); Macdonald, Nico

    A team led by Dr. Jun Rekimoto at the Sony Interaction Laboratory has devised a "pick and drop" method for transferring notes and files between computers by selecting the information with a special pen and dropping it onto the display of another machine simply by touching the screen. The pen is assigned a unique ID that the computer reads when the instrument is in close proximity to its screen. Tapping an icon with the pen connects the computer to a "pen manager" server, which affixes the object to the pen. A shadow of the object appears on the display of another device when the pen comes close, and tapping the pen tip tells the pen manager server to copy the file onto that device. Such a technique is more convenient and less cumbersome than traditional file-exchange tools such as email, discs, or shared file servers. Sony's labs have also developed a "pick and beam" technique in which data projectors beam displays onto flat surfaces into which documents can be dragged from desktop computers using special pens. Philips Digital Systems Laboratory design consultant Ian McClelland said the development of such technologies was just as significant as the invention of the computer mouse, and Rekimoto and colleagues recently demonstrated their research at the CHI2004 conference. One attendee who was not bowled over was Dr. Russell Beale of the University of Birmingham's School of Computer Science, who dismissed the technology as "toys for boys" and said the Sony lab does not conduct assessment or theoretical development meticulously enough.
    Click Here to View Full Article
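
    The "pen manager" protocol described above amounts to a small piece of
    bookkeeping: the pen's unique ID is the key under which a picked-up
    object is parked until the pen taps another screen. The Python sketch
    below illustrates the idea; all class and method names are invented,
    not Sony's implementation.

        class PenManager:
            def __init__(self):
                self._held = {}               # pen_id -> object payload

            def pick(self, pen_id, obj):
                """Pen taps an icon: attach the object to the pen."""
                self._held[pen_id] = obj

            def peek(self, pen_id):
                """Target screen shows a 'shadow' while the pen hovers."""
                return self._held.get(pen_id)

            def drop(self, pen_id):
                """Pen tip taps the target display: deliver the object."""
                return self._held.pop(pen_id, None)

        mgr = PenManager()
        mgr.pick("pen-42", b"meeting-notes.txt contents")
        print(mgr.peek("pen-42") is not None)  # shadow appears -> True
        print(mgr.drop("pen-42"))              # object lands on second machine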

  • "Pay or Go Away: What Would Spammers Do?"
    EurekAlert (06/08/04)

    Researchers at the University of Michigan believe that charging spammers for every message they send would solve the spam problem within two to three years. Marshall Van Alstyne, an assistant professor in the School of Information, computer science doctoral students Thede Loder and Rick Wash, and Mark Benerofe, a technology industry and media executive in Atlanta, Ga., were in Washington, D.C., this week to present a proposal to the FTC's Bureau of Economics. The Attention Bond Mechanism (ABM) would have recipients and senders negotiate the terms of communication without any assistance from a third party. "The sender who believes his or her message is not spam is willing to put up that money--to risk it--to prove that if the recipient reads the email, they will agree that it is not spam," says Van Alstyne. The researchers say the technology needed to make the ABM system a reality is already available, although the email infrastructure will need some changes to support it. The anti-spam technology would boost the "quality of information exchange and reduce the email volume that clogs networks and increases costs for consumers and business," adds Wash.
    Click Here to View Full Article
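
    The mechanism is easy to model. In the toy Python sketch below (not
    the Michigan prototype; amounts and names are invented), a sender
    escrows a small bond with each message, and the recipient either
    refunds it after reading or keeps it--which is what makes bulk
    spamming uneconomical.

        class AttentionBond:
            def __init__(self, bond_cents=5):
                self.bond_cents = bond_cents
                self.escrow = {}             # message_id -> (sender, cents)

            def send(self, message_id, sender):
                """Sender risks the bond to vouch the mail is not spam."""
                self.escrow[message_id] = (sender, self.bond_cents)

            def judge(self, message_id, is_spam):
                """Recipient decides: seize the bond or refund it."""
                sender, cents = self.escrow.pop(message_id)
                if is_spam:
                    return "recipient keeps %d cents from %s" % (cents, sender)
                return "%d cents refunded to %s" % (cents, sender)

        abm = AttentionBond()
        abm.send("msg-1", "bulk-mailer@example.com")
        print(abm.judge("msg-1", is_spam=True))   # spamming now has a price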

  • "A Computer That Has an Eye for Van Gogh"
    New York Times (06/13/04) P. 2-1; Heingartner, Douglas

    Researchers at the University of Maastricht in the Netherlands have developed Authentic, a computer system designed to enhance the authentication of paintings by distinguishing patterns of consistency in works of art attributed to specific artists. The system is currently analyzing all paintings attributed to Vincent van Gogh with the cooperation of the Van Gogh Museum; museum curator Dr. Louis van Tilborgh says using technology such as Authentic makes sense, especially to art history scholars. Authentic analyzes digital renderings of van Gogh's body of work for traits such as brush strokes, canvas organization, and color contrast, and accentuates patterns common to the oeuvre: Works of greatest consistency clump together while deviations are labeled as suspicious. Authentic project leader Dr. Eric Postma draws parallels between this method and chess, noting that skilled chess players and painting experts can immediately identify patterns; a computer, however, can accomplish the same task faster. In its first test project, Authentic was able to tell the difference between paintings by van Gogh, Gauguin, and Cezanne with 95 percent accuracy, while in its second test the system successfully found the one van Gogh out of six that was painted on a different type of material through high-resolution x-ray analysis. The current Authentic experiment involves the study of hundreds of low-resolution paintings by van Gogh to find complementary color patterns that would take much longer to detect through manual research. But auction houses and museums are hesitant to embrace new technology, especially in high-stakes transactions where traditional attribution techniques are still considered the best measure of authenticity. "The ultimate decision about whether something is by artist X or not is a subjective one, and has to be based on what you see with your eyes," observes Gregory Rubinstein of Sotheby's.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
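
    The consistency analysis described above can be caricatured in a few
    lines: represent each painting by measured traits, then flag any work
    that sits far from the cluster formed by the rest. The Python sketch
    below only illustrates that idea (it is not the Maastricht system),
    and the feature values are fabricated.

        from statistics import mean, stdev

        paintings = {          # work -> (avg. stroke width, color contrast)
            "Wheatfield": (2.1, 0.80), "Sunflowers": (2.0, 0.78),
            "Irises": (2.2, 0.82), "Suspect": (3.4, 0.55),
        }

        def is_suspicious(name, threshold=2.0):
            others = [v for k, v in paintings.items() if k != name]
            for i, x in enumerate(paintings[name]):
                col = [o[i] for o in others]
                if abs(x - mean(col)) > threshold * stdev(col):
                    return True   # deviates from the rest of the oeuvre
            return False

        for name in paintings:
            verdict = "suspicious" if is_suspicious(name) else "consistent"
            print(name, verdict)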

  • "UN Favors Change in Internet Governance"
    Computerworld Singapore (06/09/04) Vol. 10, No. 22; Bell, Stephen

    A new working group being formed by the United Nations (UN) is in the early stages of developing plans for changing and improving worldwide Internet governance, according to the group's secretary, Markus Kummer. In a presentation delivered to attendees of the International Telecommunications Union (ITU) Telecom Africa 2004 conference in Cairo, Kummer said the UN working group is at "the beginning of a reflection process on how best to coordinate Internet governance" and expects to present some of its conclusions by next year's World Summit on the Information Society (WSIS) in Tunis. According to sentiments expressed at the first installment of the WSIS in Geneva last year, nations appear to be divided as to whether the Internet should continue to be governed by the mostly private ICANN, or whether power should be shifted to a governmental group such as the ITU. Some nations, dissatisfied with the current system, believe they should be in charge of managing their own country-specific domains. However, InternetNZ International Affairs Committee Chairman Peter Dengate Thrush says most nations agree that governance should be shared among both private and public interests. Another major issue currently facing Internet governance is the refusal of some European countries, including the United Kingdom, to join the Country-Code Names Supporting Organization (ccNSO) out of frustration with ICANN's spending policies and its control over the Internet Assigned Numbers Authority.
    Click Here to View Full Article

  • "Green Light for Bluetooth: Faster Speed Debuts"
    EDN Magazine (06/08/04); Miller, Matthew

    Bluetooth has a new Enhanced Data Rate (EDR) that advocates say will enable more Bluetooth applications, such as wireless speaker connections and cell phones that operate as cordless landline phones. The Bluetooth Special Interest Group (SIG) says the new 2.1 Mbps data rate is three times faster than Bluetooth 1.2 and maintains backward compatibility; higher Bluetooth speeds are needed to support the growing number of PC peripherals using the communication link. In addition, users depend on Bluetooth to send increasingly large digital images from their mobile devices to their PCs. EDR uses PSK (phase-shift keying) instead of GFSK (Gaussian frequency-shift keying) for modulation, allowing larger packet payloads while maintaining the same packet timing and structure; because data is sent faster, Bluetooth devices spend less time in power-hungry radio activity. Cambridge Silicon Radio (CSR) has already produced working silicon that uses EDR, and is allowing customers to sample a BlueCore4 version with external flash memory, and a BlueCore4-ROM version with embedded memory and a smaller footprint for mobile devices. CSR also says it has developed a new phone together with Bluetooth provider IVT that serves primarily as a GSM mobile phone, but automatically connects to a landline station via Bluetooth. The landline connection should help users save wireless fees, and could allow for simultaneous data and voice transmission if EDR is used. Separately, Royal Philips Electronics has created Bluetooth and 802.11b system-in-package (SiP) devices that enable the two communication links to operate in the same unit. Because both standards use the same frequency band, they have previously had trouble operating in close proximity inside a device.
    Click Here to View Full Article
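
    The power argument above follows from simple arithmetic: the same
    payload occupies the radio for roughly a third as long at EDR rates.
    A back-of-envelope check in Python (the 500 KB image size is just an
    example; 721 kbps is Bluetooth 1.x's maximum asymmetric rate):

        photo_bits = 500 * 1024 * 8        # a 500 KB camera-phone image

        for name, rate_bps in [("Bluetooth 1.2", 721_000),
                               ("Bluetooth EDR", 2_100_000)]:
            seconds = photo_bits / rate_bps
            print("%s: %.1f s of radio-on time" % (name, seconds))
        # Bluetooth 1.2: ~5.7 s; Bluetooth EDR: ~2.0 s -- about 3x less
        # airtime, hence less battery drain for the same transfer.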

  • "Tech on the Back Burner"
    Federal Computer Week (06/07/04) Vol. 18, No. 18, P. 74; Chourey, Sarita

    Congress has prioritized the war in Iraq and homeland security over technology initiatives for 2004, and most experts contend that these issues are being emphasized out of congressional members' desire to elevate their public image in an election season. "It's going to be a short legislative year, and then the lame duck will mostly be tied to the appropriations bills [Congress] didn't finish," notes the Information Technology Association of America's Olga Grkavac. David Martin, communications and policy director for Rep. Tom Davis (R-Va.), says the Acquisition System Improvement Act (ASIA) and global sourcing limitations will be the biggest issues for the House Government Reform Committee; the House version of the Defense authorization bill has already incorporated two ASIA provisions, while Rep. Todd Platts' (R-Pa.) Program Assessment and Results Act would require agencies to carry out program performance reviews at five-year intervals. Another proposal receiving heavy congressional consideration is the Strengthening Homeland Innovation by Emphasizing Liberty, Democracy, and Privacy Act, which is intended to ensure the protection of civil liberties while supporting the war on terrorism and the deployment of screening and security initiatives. Other issues carrying weight in Congress include an amendment to the Public Health Service Act to fight bioterrorism, and the first authorization bill for the Homeland Security Department, which will be reviewed by the House Homeland Security Committee. Meanwhile, Rep. Judy Biggert's High-Performance Computing Revitalization Act is awaiting either a full committee vote or subcommittee action: The bill focuses on technical standards development; the development of software, algorithms, and applications; and education and training. Biggert says the interagency planning office must be required to coordinate an interagency planning process as well as devise and manage guidelines for the research, development, and implementation of high-performance computing resources.
    Click Here to View Full Article

  • "Experts Warn of VOIP Security Flaws"
    Techworld (06/09/04); Broersma, Matthew

    Telecommunications experts speaking at the recent VON Europe Voice over IP (VoIP) conference warned that Internet Protocol-based (IP) voice networks will need a new approach to security. VoIP provides cost and value-added advantages over traditional phone networks, but also comes with new threats such as spam, denial-of-service attacks, and worms, according to industry executives. "As with any IP network, security is a permanent issue...you're not secure or insecure, you have a security process," noted Alcatel security research director Francois Cosquer. Telecommunications and networking companies believe they can make IP networks as reliable as traditional infrastructure, but companies thinking about IP-based voice services still need to pay attention to security. Third-party testing can find software flaws that leave networks open, but this requires time and money, and Codenomicon CEO Ari Takanen suggested staying away from open-source applications. He said the issue was not so much how to protect against outside threats, but how to eliminate the software vulnerabilities that facilitate those threats. Cisco Systems' Greg Moore pointed out that VoIP networks stayed up during the Sept. 11, 2001, attacks in New York City while traditional PBXs failed. Cosquer also pointed out that the most advanced IP-based voice networks do not use the general Internet and so are not as vulnerable to hackers. Other experts noted that IP technology enabled smaller innovative companies to enter the market, such as Skype, which uses a peer-to-peer application.
    Click Here to View Full Article

  • "Hack Out the Useless Extras"
    New Scientist (06/05/04) Vol. 182, No. 2450, P. 26; Negroponte, Nicholas

    MIT Media Lab founding chairman Nicholas Negroponte says the speed and performance of software worsens with each succeeding release because of "featuritis," the tendency to bloat new releases with features and options that monopolize the hardware's improved speed and memory. As he puts it, "What you actually get is 10 different ways to do the same thing, with fewer and fewer of them intuitively obvious." Negroponte contends that the price of laptops has remained basically the same for 15 years thanks to featuritis; he also notes that mobile phones are undergoing feature bloat, a trend that will eventually grind to a halt because cell phone batteries will be unable to accommodate features past a certain point. The MIT professor argues that users should reject the attitude that they are stupid and that machines must be built the way they are. Negroponte says the Media Lab is focusing on simplicity in personal technology, and argues that two approaches to simplicity--one short-term and one long-term--need to be pursued simultaneously. The short-term approach melds good design with a streamlining of features and options: Simplicity should be a rule of thumb for mainstream products, while dedicated, special-purpose devices could be created for consumers of "advanced" features by adapting software to make the hardware act in different ways. The long-term strategy is to imbue computers with common sense. Negroponte asserts that "Simpler machines can be much less expensive, and while consumers, I believe, want this badly, the manufacturers have little interest in making this happen because the high end of any market is more profitable."

  • "E-Voting Debate: Paper or No Paper"
    Government Computer News (06/07/04) Vol. 23, No. 13; Miller, Jason; Emery, Gail Repsher

    Whether electronic voting machines are reliable enough to be used without voter-verified paper ballots is a subject of intense debate in Congress and state legislatures as the November general election approaches. Legislative support for paper trails is gaining ground: In 2003, Rep. Rush Holt (D-N.J.) sponsored a bill that would make paper ballots a requirement of all voting machines by 2004. Meanwhile, California's secretary of state recently decertified all touch-screen machines until they are equipped with paper ballots, and a bill was recently approved by Maine's legislature mandating that all e-voting machines furnish a paper receipt of voters' ballots. Stanford University professor David Dill favors the use of optical-scan machines because they produce both a printed ballot and an electronic vote count, and he calls the growing emphasis on voter-verified audit trails part of "a wave of common sense that is crossing the country." On the other side of the fence are e-voting proponents such as Maryland elections administrator Linda Lamone, who believes that direct-recording electronic machines work fine and that human error is chiefly responsible for problems. She boasts that e-voting machines have been used in 30 elections in her state without a single glitch, and she attributes this track record to thorough anti-tampering safeguards and evaluation procedures such as third-party software testing, mock elections, and encryption of voting data after elections. But Dill contends that these measures cannot remove every hacking opportunity--for instance, the software running on the machines could already be tainted by malicious code activated by factors such as the number of votes cast, date, or time. This fall will see the use of optical-scan technology in 45 percent of all U.S. counties and e-voting systems in 21 percent, according to Election Data Services.
    Click Here to View Full Article

    For information on ACM's e-voting activities, visit http://www.acm.org/usacm.

  • "IT on the Campaign Trail"
    CIO (06/01/04) Vol. 17, No. 16, P. 72; Varon, Elana

    How well the Democrats and Republicans take advantage of IT could determine the winner of the 2004 presidential election. IT has become a critical marketing tool for campaigners and a major force in the mobilization of voters and supporters. The transition from grass-roots to mass-media campaigning has had a negative effect on voter turnout, as voters grew disenchanted with the slick, homogeneous messages disseminated by radio and television; but grass-roots campaigning is enjoying a comeback of sorts through relational database technologies that aggregate detailed information about voters, which can be used to target messages at specific demographics or individuals. This strategy could prove integral to winning over voters in swing states, considered the deciding factor in the presidential election. Republicans started work on an integrated voter database six years before the Democrats, spurred by the high costs of door-to-door campaigning. The result was the Republican National Committee's (RNC) Voter Vault, a repository of voter information provided by state parties in various formats, which the RNC compiles and makes available through a common interface that field workers can use to produce target voter lists on the spur of the moment. Steve Ellis of the RNC is adding wireless connectivity to the system so that field workers with handhelds can conduct more effective door-to-door campaigns by acquiring voter lists via Voter Vault. The Democratic National Committee's DataMart database was set up as the core component of a new IT infrastructure, and it is used to connect voter registration and turnout records to any data the party can collate. Laurie Moskowitz, a consultant who ran field operations for the Democrats in 2000, says DataMart has enabled national and state parties to work with the same data for the first time. She says the presidential race will hinge on effective niche marketing, a process that involves the "mobilization of lots of small universes."
    Click Here to View Full Article
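
    The kind of on-demand target list described above is, mechanically, a
    database query. The sketch below uses SQLite in Python purely as an
    illustration; the schema and records are invented, not the RNC's
    Voter Vault or the DNC's DataMart.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE voters (
            name TEXT, precinct TEXT, party TEXT, voted_2000 INTEGER)""")
        con.executemany("INSERT INTO voters VALUES (?, ?, ?, ?)", [
            ("A. Smith", "OH-12", "IND", 1),
            ("B. Jones", "OH-12", "IND", 0),
            ("C. Lee",   "OH-07", "REP", 1),
        ])
        # Walk list: independents in one precinct who turned out last time.
        walk_list = con.execute(
            "SELECT name FROM voters "
            "WHERE precinct = 'OH-12' AND party = 'IND' AND voted_2000 = 1"
        ).fetchall()
        print(walk_list)       # what a field worker's handheld would pull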

  • "The Future of Interoperability"
    GeoWorld (05/04) Vol. 17, No. 5, P. 38; Lake, Ron

    Current geographic information systems (GIS) technology does not support data sharing and integration very well because it cannot accommodate multiple, proprietary data formats, nor was it designed for the Internet or distributed information architectures; networked data integration thus brings rapidly escalating data acquisition, integration, and maintenance costs, along with lost opportunity costs. However, geospatial interoperability frameworks based on the Open GIS Consortium (OGC) OpenGIS Geography Markup Language (GML) and Web Feature Service (WFS) specifications are viewed as solutions. WFS and GML fulfill the same functions as an HTTP server and HTML, respectively. GML can encompass features with multiple relationships and geometric characteristics, which supports interoperability and user adaptability to new models and systems. Extensible components are critical to interoperability, a maxim that applies to GML: Not only can users build required features through GML, but communities can develop new geometries, topology elements, coordinate reference systems, and other fundamental components. The transition of GML and other OGC specifications into International Organization for Standardization (ISO) standards is an important development for people whose work, leisure, and daily lives are heavily influenced by geospatial data and information. Standardization will permit GIS experts to avail themselves of far more data, boost their productivity, and take advantage of opportunities to add real value. Geospatial interoperability frameworks are positioned to be a major driver in the IT world's move toward higher levels of the "protocol stack."
    Click Here to View Full Article
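
    Concretely, GML expresses a geographic feature as plain XML that any
    WFS client can parse without proprietary format support. The Python
    sketch below hand-rolls a GML-flavored point; real GML adds XML
    namespaces, schemas, and richer geometry, and the feature itself is
    invented.

        import xml.etree.ElementTree as ET

        feature = ET.Element("Feature", name="hydrant-17")
        point = ET.SubElement(feature, "Point", srsName="EPSG:4326")
        ET.SubElement(point, "pos").text = "50.08 14.43"   # lat lon

        # Prints the feature as a single line of XML any client can parse:
        # <Feature name="hydrant-17"><Point srsName="EPSG:4326">...</Point></Feature>
        print(ET.tostring(feature, encoding="unicode"))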

  • "Robots and Sensors May Help Make Seniors Mobile"
    Futurist (06/04) Vol. 38, No. 3, P. 12; Coles, Clifton

    A variety of assistive technologies are being developed to give elderly people more independence and mobility via unique communication interfaces between users and their environment, according to Rodolphe Gelin of the Robotics and Interactive Systems Department at France's Atomic Energy Authority. Alter Eco Sante's Automax unit has a swiveling grab bar to help patients stand up, while the Monimad is an intelligent walker that raises patients into the walking position. The Nemo+ system, which can be attached to a wheelchair, controls up to 200 voice-activated functions--turning on lights, opening doors, operating phones and TVs, etc.--through the use of stored infrared and other codes. Another robot features a voice-controlled manipulator arm that enables patients with impaired upper limbs to perform many normal chores, such as pouring a drink or picking up a book. Academic researchers, engineers, doctors, and the Breton firm Aphycare have developed a monitoring device built into a wristwatch that keeps track of the wearer's vital signs and detects falls; the device wirelessly alerts relatives or emergency services during a crisis, and has a 230-foot transmission range. A continuous monitoring system enables remote interaction between elderly patients in the home and relatives or medical staff via a computer-supported video connection. The system also allows patients to remotely control electrical equipment in the house.
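
    A fall detector of the kind described above can be reduced to a
    threshold test on accelerometer samples. The Python sketch below is
    purely illustrative (Aphycare's device is proprietary; thresholds and
    readings are invented): it alerts when an impact spike is followed by
    stillness.

        def detect_fall(samples_g):
            """samples_g: recent accelerometer magnitudes in g."""
            spike = any(a > 2.5 for a in samples_g)                  # impact
            still = all(abs(a - 1.0) < 0.1 for a in samples_g[-5:])  # at rest
            return spike and still

        readings = [1.0, 1.1, 3.2, 0.2, 1.0, 1.0, 1.02, 0.98, 1.01, 1.0]
        if detect_fall(readings):
            print("ALERT: possible fall -- notify relatives or emergency services")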


 