
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 742: Friday, January 14, 2005

  • "New FBI Software May Be Unusable"
    Los Angeles Times (01/13/05); Schmitt, Richard B.

    A central pillar of the FBI's computer system overhaul, which has already cost nearly half a billion dollars and missed its original deadline, may be unusable, according to reports from bureau officials. The prototype Virtual Case File software, developed by Science Applications International at a cost of about $170 million, has been characterized by officials as unsatisfactory and already out of date; sources indicate that scrapping the software would entail a roughly $100 million write-off, and Sen. Judd Gregg (R-N.H.), of the Senate appropriations subcommittee in charge of FBI funding, says the software's failure would constitute a tremendous setback. A three-month trial of the prototype software in Washington and New Orleans will gauge users' reactions, and any significant information culled from the experiment will be applied to a future case management system, reports a source well-acquainted with the FBI's plans. Virtual Case File was designed to help federal agents deter terrorist attacks by sharing information online, and its chief application lets users prepare and forward documents in a serviceable format. The computer system overhaul, which has cost $581 million thus far, was tagged as a priority by members of Congress as well as the independent 9/11 commission, although it was proposed before the attacks. Officials say the FBI is so convinced of Virtual Case File's inadequacy that it has taken preliminary steps toward soliciting proposals from outside contractors for new software designs. The FBI has stopped making projections on when the overhaul might be completed. In addition to helping federal agents exchange terrorist-related data, the bureau wants the software to support document management and user collaboration on documents.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Putting XML in the Fast Lane"
    CNet (01/13/05); LaMonica, Martin

    As Extensible Markup Language (XML) increases in popularity, the growing traffic burden is prompting calls for binary XML and other solutions. Compressing XML into a binary format would dramatically streamline the abundance of XML communications and especially benefit performance-sensitive applications or XML applications on mobile devices, where processing power is limited. Sun Microsystems has launched an open-source binary XML effort, the Fast Infoset Project, which focuses on a compression standard already used in the telecommunications market; Fast Infoset pilots have found that XML applications run two to three times faster with the binary standard. However, Sun software executive and XML co-inventor Tim Bray worries that any move to binary XML could result in incompatible XML implementations. XML compatibility has been a great success so far, and its text-based nature allows anyone to inspect a message easily in Notepad, says Bray. "If I were world dictator, I'd put a kibosh on binary XML, and I'm quite confident that the people who are pushing for it would find another solution," he says; however, binary XML that goes through standards bodies and is open source would be acceptable to him. Some of the biggest proponents of binary XML are consumer electronics firms such as Canon and Nokia, which argue that binary XML would ease the transfer of large files to mobile devices. But such industry-specific advocacy could foreshadow vertical-market binary XML applications. IBM information management general manager Janet Perna says increased XML loads are a problem, but one that can be solved through processing and networking advances; she points out that people thought e-commerce would overload the Internet, but those fears proved unfounded.
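    The bandwidth argument is easy to see with a rough back-of-the-envelope test. The sketch below (Python, standard library only) compares the size of a plain-text XML message with a generic gzip compression of it; this is only an illustration of why a more compact encoding appeals to constrained devices, not the Fast Infoset format itself, and the sample document is invented for the example.

        # Rough illustration, not the Fast Infoset encoding: compare a plain-text
        # XML message with a generic binary compression of it to see why compact
        # encodings appeal to bandwidth- and memory-constrained XML applications.
        import gzip

        records = "".join(
            "<item><id>%d</id><name>widget-%d</name><price>9.99</price></item>" % (i, i)
            for i in range(1000))
        xml_text = "<catalog>%s</catalog>" % records

        raw = xml_text.encode("utf-8")
        packed = gzip.compress(raw)   # stand-in for a binary encoding

        print("plain XML bytes: ", len(raw))
        print("compressed bytes:", len(packed))
        print("reduction:        %.1fx" % (len(raw) / len(packed)))

    Note that the packed form, like any binary XML, gives up the inspect-it-in-Notepad property Bray cites.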
    Click Here to View Full Article

  • "Ohio Pulls Plug on Electronic Voting"
    Cleveland Plain Dealer (01/13/05); Smyth, Julie Carr; Naymik, Mark

    Touch-screen electronic voting machines, long criticized for inadequate security and reliability, have been rejected by Ohio Secretary of State Ken Blackwell in favor of a statewide deployment of optical-scan machines that process manually filled-out paper ballots tallied by a precinct-based computerized counter. The systems not only comply with Ohio's requirement to supply a paper trail, but they offer greater flexibility than electronic machines at less cost, says Blackwell spokesman Carlo LoParo. He reckons that equipping touch-screen systems with paper trails could have entailed a 20% increase in spending on the machines. Blackwell's directive has met with the approval of Ohio's County Commissioners Association, whose executive director, Larry Long, calls it "the only way Ohio can comply with federal law without counties being required to pay for part of the cost for installing new voting devices." On the other hand, some Ohio elections officials are unhappy with Blackwell's decision. Lake County elections director Jan Claire says the mandate will mean scrapping a $3 million touch-screen deployment that has been operating since 1999, while giving taxpayers the added burden of paying for the printing and processing of paper ballots. Meanwhile, Cuyahoga County elections board director Michael Vu says the board had already opted to phase out the county's punch-card system in favor of an electronic system, and argues that a decision such as Blackwell's should be studied in depth by election officials statewide before going forward.
    Click Here to View Full Article
    (Access to this site is free; however, you will need to enter your zip code, year of birth and gender to gain access to the site.)

  • "Haptics: The Technology of Simulating Human Touch"
    InformIT (01/14/05); Rowell, Laurie

    Researchers continue to improve haptics technology that will one day enable more useful robots, fully functional prosthetic limbs, and true computer virtual reality. Just as computer vision technology requires an understanding of optics and building an audio receiver requires an understanding of acoustics, haptics research requires scientists to understand skin biomechanics. The MIT Touch Lab is working to measure pressure thresholds and other metrics that will help create touch capabilities for computer haptics. Researchers at the laboratory are focusing on the primate fingertip and have created a new Ultrasound Backscatter Microscope (UBM) that depicts the ridges and layers of skin on a fingertip in more detail than is available with magnetic resonance imaging equipment, and they are mapping the friction and compressibility measurements onto 2D and 3D fingertip models. Another difficulty with haptics is the dual role required of interfaces: any haptic equipment has to communicate touch as well as sense it. Haptics researchers Peter Berkelman and Ralph Hollis created an ingenious device several years ago that utilized Lorentz force magnetic levitation in a force-feedback device: Paired with a computer display showing basic shapes, users were able to use the haptics device to feel virtual reality surfaces and even consistently maneuver a peg into a hole. Haptics is also playing an important role in computer accessibility through products such as the Cyberglove, which reads American Sign Language through a neural-net algorithm and is also used for virtual reality and animation. Swedish researchers have created a haptics user interface that lets visually impaired computer users feel computer graphics. Haptic mouse technologies give literal meaning to "drag and drop," applying pressure and simulating the sudden loss of mass, as well as allowing users to feel grooves at the edges of windows and sense checkboxes.
    Click Here to View Full Article

  • "Grids Unleash the Power of Many"
    Technology Review (01/14/05); Gartner, John

    West Virginia, North Carolina, and Colorado are separately creating state computing grids that will open up new opportunities for academic researchers, businesses, and government agencies. The MCNC Research and Development Institute in Research Triangle Park, N.C., is coordinating a grid deployment that includes seven North Carolina universities and has a goal to enlist 180 in-state businesses; to garner participation among small companies, the Start Up Grid Initiative allows those firms to tap into the grid for free for nine months and enjoy discounted rates afterward. Immediate access to supercomputing capabilities is a tremendous boon to small firms that often spend most of their time raising capital and acquiring technology, says MCNC grid computing director Wolfgang Gentzsch. Economic Strategy Institute research fellow Robert Cohen conducted a study in 2003 that said 24,000 jobs could be created through the grid project by 2010. Version 4.0 of the Globus Toolkit will help boost grid deployments by making it easier to share and manage resources. Enabling access to computing cycles while securing sensitive information is one of the most difficult parts of collaborative grid computing and requires encryption for all data files; still, the payoff is vastly increased infrastructure utilization, with efficiency rates increasing from 25% to 75%, according to Gentzsch. The West Virginia Cluster Computing Grid is linking rural academic institutions in that state, allowing those schools to investigate more interesting scientific problems and locally important fields such as oil and gas exploration. Colorado CoGrid project director Bob Marcus says figuring out a business model is one of the greatest challenges: Though state funding helps get the grid off the ground, eventually commercial applications must be developed.
    Click Here to View Full Article

  • "Conversations Control Computers"
    Technology Research News (01/19/05); Smalley, Eric

    Workplaces are typically filled with short conversations about scheduling and assignments that require employees to take notes and mark their calendars. A trio of prototype handheld computer applications developed by Georgia Institute of Technology researchers makes it possible to hold such a conversation while operating a handheld computer, using speech recognition software that captures the relevant conversational excerpts; the researchers detailed their work at the User Interface Software and Technology 2004 conference last October. The user activates the applications by holding down a button on the handheld, telling the system to record and transcribe the user's words. The Calendar Navigator Agent application monitors the user's conversation for scheduling-related keywords such as dates and times, and employs them to navigate and mark a graphical scheduling program. DialogTab captures conversational segments that can be used as short-term memory aids, displaying them onscreen as a vertical stack of tabs whose contents are mouse-accessible. Speech Courier accumulates keywords so as to relay important conversational segments to an absent third party via email. The researchers developed the applications with funding from the National Science Foundation and the U.S. Department of Education's National Institute on Disability and Rehabilitation Research. Georgia Tech researcher Kent Lyons says the system monitors and records speech only from the user's side of the conversation in order to uphold privacy, and says the next phase of the project is to identify other situations where the technique is applicable and to gauge the method's practicality.
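    Below is a minimal sketch (Python) of the kind of keyword spotting described for the Calendar Navigator Agent; the date and time patterns and the make_entry helper are illustrative assumptions, not the Georgia Tech implementation.

        # Scan a transcribed utterance for date/time phrases and turn it into a
        # tentative calendar entry. Patterns and helper are illustrative only.
        import re

        DATE_RE = re.compile(
            r"\b(monday|tuesday|wednesday|thursday|friday|saturday|sunday|tomorrow)\b",
            re.IGNORECASE)
        TIME_RE = re.compile(r"\b(\d{1,2})(?::(\d{2}))?\s*(am|pm)\b", re.IGNORECASE)

        def make_entry(transcript):
            """Return (day, time, text) if the utterance looks like scheduling."""
            day = DATE_RE.search(transcript)
            when = TIME_RE.search(transcript)
            if day and when:
                return day.group(0).lower(), when.group(0).lower(), transcript
            return None

        print(make_entry("let's go over the budget Thursday at 3pm in my office"))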
    Click Here to View Full Article

  • "Torvalds Criticizes Security Approaches"
    InternetNews.com (01/13/05); Kerner, Sean Michael

    During a recent mailing list discussion among developers about creating a security contact point that people can use when potential Linux kernel security problems crop up, Linux creator Linus Torvalds criticized parts of the process by which such problems are divulged to fellow kernel users. Developer Chris Wright said the current procedure is to discuss security issues in multiple locations such as the Linux kernel mailing list, the vendor-sec mailing list (whose access is limited by agreement between existing members), and kernel maintainers. Though Torvalds was generally comfortable with a central contact point, he took a strong stance against the embargo of security advisories on the vendor-sec list, which is done supposedly to give vendors time to prepare patches before final disclosure. "It should be very clear that no entity (neither the reporter nor any particular vendor/developer) can require silence, or ask for anything more than 'let's find the right solution,'" he wrote, arguing that users should receive updates prior to disclosure. He said the greatest possible number of people should be made aware of potential security problems early on, and declared that the only matter of concern to him is that kernel developers serve the people who rely on them by supplying secure and bug-free source code. Torvalds said he found the concept of an unrestricted, embargo-free list perfectly acceptable, but acknowledged the need for a compromise between a completely open list and the vendor-sec list. He also recommended that kernel users employ components such as exec-shield and compilers to set up their own defenses. Torvalds said he does not mind the idea of security through obscurity, but is opposed to the notion that the kernel can be effectively secured via secrecy.
    Click Here to View Full Article

  • "Brave New Era for Privacy Fight"
    Wired News (01/13/05); Zetter, Kim

    Privacy proponents are concerned that the erosion of civil liberties will continue in President Bush's second term through a combination of surveillance legislation and commercial technologies and services. In addition to a renewal of Patriot Act provisions urged by Bush, civil libertarians are expecting the administration to promote new Patriot Act II provisions; one of the more hopeful signs for privacy advocates is the introduction of the Security and Freedom Ensured Act of 2003, which revises some of the Patriot Act's more intrusive surveillance stipulations. The issue that privacy advocates believe bears the most watching this year is data-mining collaborations between the private sector and government agencies. There are no legal barriers to the government's purchase of information on individuals from commercial data aggregators, nor is there any legislation restricting how such data may be used by federal agencies; conversely, few laws are in place to regulate private companies' handling of data provided by government agencies. Another concern for privacy advocates is Congress' approval of a requirement that all U.S. driver's licenses include machine-readable, encoded data by the end of next year. The provision gives the secretary of transportation, in consultation with the Homeland Security secretary, carte blanche to determine within 18 months what kind of data to include, leading critics to fret that such data could be linked to a national database of citizen profiles accumulated from other sources. Other issues raising civil libertarians' hackles include the deployment of radio-frequency ID tags to track consumers beyond the purchase of retail items, and a possible national adoption of California's DNA Fingerprint, Unsolved Crime and Innocence Protection Act, which authorizes the collection of DNA samples from anyone arrested for any crime, even if the person has not been charged or convicted.
    Click Here to View Full Article

  • "Information Fusion Research Simulates Disasters to Manage Emergency Response"
    UB News Services (01/11/05); Goldbaum, Ellen

    Researchers at the University at Buffalo's Center for Multisource Information Fusion are focused on improving emergency response management in the wake of early reports of natural and manmade catastrophes, using $2.5 million in funding from the Air Force Office of Scientific Research. "Our goal is to take the typically chaotic flow of reports of variable quality and heterogeneous origin received from the field in the period immediately after the disaster and transform it into useful information for decision makers and emergency responders to act upon," notes principal investigator and UB School of Engineering and Applied Sciences professor Peter Scott. The project combines theoretical information fusion research with the design of a large-scale computer model of California's 1994 Northridge earthquake based on data aggregated by the Federal Emergency Management Agency. Scott says the software generates realistic models of quakes in the San Fernando Valley with variable characteristics (epicenter location, proximity to populated areas, and quake depth), and estimates casualties based on those variables. The program also simulates and "fuses" reports from police, civilians, and other witnesses whose information may be inconsistent or superfluous. Scott says the project attempts to ease the discovery and management of sudden, unanticipated secondary events that may stem from the primary incident. The information fusion process starts connecting reports and reasoning about secondary causes upon receiving the first two reports of damage or casualties, and the software is designed to suggest likely scenarios and confidence measures within seconds or minutes of receiving the reports.
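    As a toy illustration of the fusion step described above, the sketch below (Python) combines inconsistent casualty reports from sources of differing reliability into a single estimate plus a crude confidence score; the weighting scheme and the sample figures are invented for the example and are not the UB model.

        # Each report is an (estimate, reliability) pair; the output is a
        # reliability-weighted estimate and a confidence score that drops as the
        # reports disagree. Formula and numbers are illustrative assumptions.
        def fuse_reports(reports):
            total_weight = sum(w for _, w in reports)
            estimate = sum(value * w for value, w in reports) / total_weight
            spread = max(v for v, _ in reports) - min(v for v, _ in reports)
            confidence = 1.0 / (1.0 + spread / max(estimate, 1.0))
            return round(estimate, 1), round(confidence, 2)

        # Two police reports and one civilian report of injuries in the same area:
        print(fuse_reports([(40, 0.9), (55, 0.8), (120, 0.3)]))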
    Click Here to View Full Article

  • "Supercomputing Goes Global"
    Computerworld (01/10/05) P. 31; Willoughby, Mark

    The race for the fastest, most powerful supercomputer has reached global proportions, and MessagingGroup managing consultant Mark Willoughby observes that "There's considerable national pride invested in the quest to build a faster machine to discover that next subatomic particle lurking just beyond the bandwidth of today's champ." New supercomputing speed records are becoming increasingly short-lived as competition intensifies, and the current record holder is IBM BlueGene/L with its peak speed of 70.72 TFLOPS. Willoughby compares supercomputers' evolution to that of factory power in the Industrial era: As the transition from steam to electricity moved factory power architecture from a centralized to a decentralized model, so has supercomputer architecture evolved from specialized and bulky processors to distributed, massively parallel hardware. IBM BlueGene/L, for example, utilizes over 16,000 dual-core processors organized into 16 clusters, with each processor connected to one of five internal communications buses. Willoughby predicts the reigning supercomputer of the future will be arranged in a grid configuration of loosely linked systems that are logically tapped across a global network to tackle a single problem. A lot of research is gathering around grid architectures, which have a greater dependence on specialized software than speedy hardware. Willoughby envisions the emergence of a global supercomputing grid that will enable users to reap profits by selling their unused computing cycles. He theorizes that the fastest supercomputer by the end of the current decade could employ over 1 million processors.
    Click Here to View Full Article

  • "At the Heart of the Open-Source Revolution"
    CNet (01/13/05); Festa, Paul

    In an interview with CNet, Mozilla Foundation and Open Source Applications Foundation (OSAF) Chairman Mitch Kapor says the two foundations are committed to breaking Microsoft's dominance of the Web-browsing and email software sectors through the open-source development model. Kapor insists that open source is not a panacea, but says that open-source software's low cost and higher quality will help give software users more control. Kapor says the need for an open-source model emerged from the "stagnation" of software development driven by the proprietary software model, which had settled in by the late 1990s; the result was a lack of innovation for everyday software applications used by average PC users. Kapor thinks the Firefox Web browser's remarkable traction is a validation that a well-designed and well-executed open-source product can have global appeal. In addition to the OSAF, Kapor has established the Mitchell Kapor Foundation and the Level Playing Field Institute, which focus on social, educational, and environmental issues. Kapor says the tech industry has an uneven record when it comes to those issues, though he admits that generalizing is both tough and hazardous. He calls many tech corporations "progressive" and socially responsible, but also points out that a "Silicon Valley attitude" to shirk corporate responsibility still lingers. He says, "If you're running a business, you have employees, and that comes with very basic responsibilities to be a good citizen. That's not a mainstream attitude in the technology industry." Kapor says the OSAF is working to establish the Chandler application as an open-source personal information manager that supports email, calendars, and contact, address, and task management.
    Click Here to View Full Article

  • "Flattening the Learning Curve for New Technologies"
    IST Results (01/13/05)

    IST's ACTIPRET project is designed to ease the process of learning to use new technologies with a system that video-records the hand and arm actions of an expert performing a task and then deconstructs the actions into individual behavior elements that are stored in an easy-to-replay sequence. Project coordinator Markus Vincze says the ACTIPRET system is unique in that it records and interprets human actions as well as how those actions are interrelated. The system also translates those actions and interactions from a video format to a text format, using natural-language descriptions to store the component parts of a series of actions. Vincze envisions how ACTIPRET could be applied to a copier jammed with paper, for example: "Instead of a series of icons on the cover showing what to do, you could have a set of AR [Augmented Reality] glasses that would play a video," he explains. "Such a concept could be applied to any new piece of equipment you acquire, making learning that much easier." Managing the ACTIPRET system's recognition and processing chores requires about half a dozen PCs, even for simple actions. Fields that stand to benefit from the deployment of such a system include plant maintenance, machinery repair, and surgery.
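    To make the text-based storage idea concrete, here is a small sketch (Python) of an action sequence stored as replayable natural-language steps; the step fields and the paper-jam walkthrough are assumptions for illustration, not the ACTIPRET data format.

        # Store recognized behavior elements as text and replay them in order.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ActionStep:
            actor: str    # which hand or arm performed the step
            verb: str     # recognized action, e.g. "open", "pull"
            target: str   # object the action applies to

            def describe(self) -> str:
                return f"{self.actor} {self.verb}s the {self.target}"

        def replay(steps: List[ActionStep]) -> None:
            for i, step in enumerate(steps, 1):
                print(f"step {i}: {step.describe()}")

        # A paper-jam walkthrough of the kind mentioned above, stored as text:
        replay([
            ActionStep("right hand", "open", "front cover"),
            ActionStep("right hand", "turn", "green release lever"),
            ActionStep("left hand", "pull", "jammed sheet"),
            ActionStep("right hand", "close", "front cover"),
        ])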
    Click Here to View Full Article

  • "Developers Voice Mixed Reactions to IBM Patent Policy"
    IDG News Service (01/12/05); McMillan, Robert

    IBM's Jan. 11 announcement that open-source software developers would have unrestricted access to 500 patents engendered a mix of optimism and caution in the developer community, with critics targeting the fact that the bulk of IBM's patent portfolio is still inaccessible. "One of the biggest questions is, despite this overture, is software patenting really good for the industry in general, and is it going to still be a very big problem for open source?" noted open-source proponent and Open Source Initiative founder Bruce Perens. IBM technical software strategy director Douglas Heintzman estimated that software patents accounted for about 50% of the approximately 3,200 patents IBM registered in 2004; the company owns roughly 40,000 patents altogether. Linux creator Linus Torvalds said IBM is showing the business community how to adopt the free-exchange principles of open-source developers, and praised the patent-sharing initiative as a first step toward addressing the software patent controversy. "The more people that think about this, and the more businesses that get comfortable with the notion of sharing, whether it be their copyrighted software or their patents, the better off we'll all be," he declared. Still, Torvalds wished that IBM had taken a position against software patents. Perens and other open-source developers prefer defending software innovations with copyright law rather than patent law. Samba project lead developer Jeremy Allison complained that software patents are often too easily approved and very hard for developers to comprehend.
    Click Here to View Full Article

  • "Learning Machines"
    O'Reilly Network (01/12/05); Oram, Andy

    Computer Professionals for Social Responsibility member Andy Oram believes robots could play a major role in boosting productivity for a rapidly aging population. But making robots capable of such a feat is a formidable challenge, as Carnegie Mellon University (CMU) research scientist Geoffrey Gordon indicated recently. The university devotes a lot of research to helping robots perceive their surroundings via distributed machine learning, a process that can be very complex because of the power requirements for robot-to-robot data transmission, as well as networked sensor nodes' fragility and susceptibility to interference in real-life environments; determining the most durable network scheme with the lowest transmission costs requires knowing the quality of links between nodes. Mesh networks--relatively looser systems with multiple redundant links--are widely hyped, but researcher Carlos Guestrin prefers a hierarchical tree architecture that delivers better scalability and gives each node a fairly accurate projection of environmental activity, though rapid self-reconfiguration is a must. Oram reports that Gordon appreciates commoditized technology for robots, and notes that CMU robots regularly use off-the-shelf Pentium chips running the Linux operating system. Gordon told Oram that predicting the emergence of major breakthroughs in robotics and artificial intelligence is difficult, because research often turns out to be a tougher prospect than originally assumed; "Progress tends to affect some deep aspect of the problem," Oram observes. The author details robotics research geared toward aiding disabled people, such as a sensor network that could perhaps detect a person in trouble and stage an intervention. Meanwhile, some research focuses on making robots capable of comprehending and responding to people's psychological states.
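    One way to make the network trade-off concrete: given measured link qualities between sensor nodes, a hierarchical tree can be grown greedily so that nodes are reached over the most reliable links. The sketch below (Python, a Prim-style greedy tree) illustrates that idea with invented link-quality numbers; it is not CMU's algorithm.

        # Grow a spanning tree that prefers the most reliable links.
        import heapq

        def best_tree(links, root):
            """links: dict {(a, b): quality in 0..1}; returns chosen tree edges."""
            graph = {}
            for (a, b), q in links.items():
                graph.setdefault(a, []).append((q, b))
                graph.setdefault(b, []).append((q, a))
            visited, tree = {root}, []
            heap = [(-q, root, nbr) for q, nbr in graph[root]]
            heapq.heapify(heap)
            while heap:
                neg_q, src, dst = heapq.heappop(heap)
                if dst in visited:
                    continue
                visited.add(dst)
                tree.append((src, dst, -neg_q))
                for q, nbr in graph[dst]:
                    if nbr not in visited:
                        heapq.heappush(heap, (-q, dst, nbr))
            return tree

        links = {("base", "n1"): 0.9, ("base", "n2"): 0.4, ("n1", "n2"): 0.8,
                 ("n1", "n3"): 0.7, ("n2", "n3"): 0.6}
        print(best_tree(links, "base"))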
    Click Here to View Full Article

  • "Tech Trends Will Topple Tradition"
    EE Times (01/10/05) No. 1353, P. 31; Wilson, Ron

    The most significant technology breakthroughs could come in areas that currently present the most difficult challenges, such as increasing computing performance, manufacturing complex systems, and interfacing electronic and biological systems. Grid computing has already proven itself for certain applications, which Microsoft Bay Area Research Center scientist Gordon Bell calls "embarrassingly parallel." Beowulf cluster standards and the MPI messaging interface have helped the grid field, but issues with bandwidth, latency, and component task size limit the application of grid computing; Bell sees potential in operating peer networks as pipelined systems instead of parallel ones. Another serious technology barrier is the construction of nanoscale devices or complex macro systems, both of which require self-assembly technology. Biological systems provide a model for self-assembly, so it is not surprising that synthetic biology research is helping to drive advances in nanoscale electronics, for example. Synthetic biology involves inserting task-specific DNA into cells to produce custom proteins, and researchers are using similar concepts to self-assemble electronic circuits according to patterns laid by biological molecules. On a larger scale, self-assembly technology is being used to create robotic systems composed of as many as 100 components, and Harvard University computer science assistant professor Radhika Nagpal says such systems are actually quite near to realization. The electronics-biology front still presents serious hurdles, but researchers are now making progress on neural interfaces, for example. New integrated circuits with neuron-sensing and neuron-stimulating capabilities will be presented at the upcoming International Solid-State Circuits Conference by IMEC and the Max Planck Institute for Biochemistry in Europe.
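    A hedged example of what "embarrassingly parallel" means in practice: each work item below is independent, so it can be farmed out to any number of workers with no inter-task communication. Python's multiprocessing pool stands in for a cluster or grid scheduler here; the prime-counting task is invented for illustration, and the code is not MPI- or Beowulf-specific.

        # Split an independent workload across worker processes and sum the results.
        from multiprocessing import Pool

        def count_primes(bounds):
            lo, hi = bounds
            def is_prime(n):
                if n < 2:
                    return False
                return all(n % d for d in range(2, int(n ** 0.5) + 1))
            return sum(1 for n in range(lo, hi) if is_prime(n))

        if __name__ == "__main__":
            chunks = [(i, i + 25000) for i in range(0, 100000, 25000)]
            with Pool(processes=4) as pool:
                print("primes below 100,000:", sum(pool.map(count_primes, chunks)))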
    Click Here to View Full Article

  • "Fast-Moving Development"
    Computerworld (01/10/05) P. 26; Havenstein, Heather

    Architected rapid application development (ARAD) allows businesses to automatically generate up to 80% of native code, reducing development time and cost and improving quality. ARAD software utilizes patterns to build generic components such as user interfaces or business logic, and architects and senior designers can create their own templates. Major software development vendors Compuware, IBM, and Computer Associates are offering ARAD tools. Locus Systems general manager Richard Blais says Compuware's OptimalJ ARAD tool helps the company compete with offshore outsourcing competitors; "It's like having Bangalore in a box," he says. Locus has used ARAD to generate between 65% and 70% of the code for three applications so far and has also seen a significant reduction in the number of bugs. ARAD frees developers to focus on architecture and design issues instead of technical coding details, says Bryan Schwiening of the Advanced Development Center of Austin; in addition, ARAD eases the repurposing of applications by making existing Web-enablement code available, for example. Gartner Research says the cumulative benefits of ARAD are such that it will become mandatory for mainstream companies looking to create service-oriented applications. ARAD development improves return on investment 15 times over traditional development tools, according to the research firm. In order to experience the full benefits of ARAD, developers need to become more disciplined about using the tools and processes consistently and adhering to the right patterns and architecture, says Upside Research President David Kelly.
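    Below is a minimal sketch (Python) of the pattern/template idea behind ARAD tools: an architect-supplied template is expanded into repetitive data-access code so developers can concentrate on design and business logic. The template, class names, and connection API are illustrative assumptions; products such as OptimalJ work from far richer models and patterns than this.

        # Expand a simple architect-supplied template into data-access boilerplate.
        from string import Template

        DAO_TEMPLATE = Template('''\
        class ${entity}Repository:
            """Generated data-access boilerplate for ${entity}."""

            def __init__(self, connection):
                # connection is assumed to expose query() and upsert(); illustrative only
                self.connection = connection

            def find_by_id(self, key):
                return self.connection.query("SELECT * FROM ${table} WHERE id = ?", key)

            def save(self, record):
                return self.connection.upsert("${table}", record)
        ''')

        for entity, table in [("Customer", "customers"), ("Invoice", "invoices")]:
            print(DAO_TEMPLATE.substitute(entity=entity, table=table))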
    Click Here to View Full Article

  • "Europe Fights Tide of Absurd Patents"
    New Scientist (01/08/05) Vol. 185, No. 2481, P. 22; Fox, Barry

    The European Parliament is likely to soon pass a European Commission directive that bans software and business method patents in order to avoid a patent inundation that could hamper technological innovation by encouraging infringement lawsuits. There is a long-standing consensus between European politicians and programmers that copyright already provides adequate protection for software, making broadly applicable software patents unwanted. But patent applicants have been exploiting a loophole that allows software to be patented as a computer-implemented invention--a loophole that the draft directive seeks to remove. The directive has traveled a bumpy road to approval by the parliament: The original draft did not clearly distinguish between software-controlled inventions and the software that controls such inventions, so a provision that banned pure software patents was inserted. However, the European Industry Association for Information Systems, Communication Technologies, and Consumer Electronics (EICTA) complained that the legislation's wording permitted a patented technique with a "significant purpose" to be used without infringement, so the commission and the parliament are debating a new draft that is expected to make allowances for computer-implemented inventions that include a "technical contribution." The Foundation for a Free Information Infrastructure's Rufus Pollock warns that such a term is vulnerable to overly broad interpretation that could permit software patenting.

  • "Information Liberation"
    Information Highways (12/04) Vol. 12, No. 1, P. 16; Lima, Paul

    Critics complain that the tightly-controlled dissemination of information by media conglomerates is threatening democracy, which cannot flourish without the open distribution of knowledge. Many proponents believe the solution to this problem is the establishment of information commons that are collectively coordinated by like-minded individuals and are accessible to everyone. The Free Expression Policy Project (FEPP), a component of the Democracy Program at NYU School of Law's Brennan Center for Justice, supports the "fair use" of information without opposing the generation of for-profit content or copyright. The project's information commons report argues that such commons act as a counterweight against media consolidation and restrictive information access and distribution laws and provide new avenues for creating and sharing information, creative content, and democratic discourse. FEPP senior research fellow and former American Library Association President Nancy Kranich notes that libraries are a primary source of information for consumers, yet tight budgets and narrow information access policies are choking public access to knowledge. "Free expression in relation to democracy is not effective unless there is some level of equality and opportunity to disseminate and access information," says FEPP founder Marjorie Heins, who believes that the free market should have a greater sway on information distribution. In her view, information commons will help counter the "copyright mentality" that has thrown the balance between media giants' fair return for effort and controlling copyrighted information out of whack. However, In the Know Research and Information Consulting principal Phyllis Smith is concerned that information commons will add up to data overload for information professionals.
    Click Here to View Full Article

  • "Technology Trends and Opportunities for the Future"
    GeoWorld (12/04); Croswell, Peter L.

    Five major technological trends may have significant effects on information technology (IT) products and implications for geographical information system (GIS) users over the next decade. The pervasive, high-performance computing trend could accelerate the development and rollout of multifunction devices, field-based computers, mobile and wearable technology, RFID devices, and electronic media convergence; for GIS users, this could translate into technology that better meets their demands for server-centric systems, storage capacity, location-based services (LBSes), more flexible and remote data collection, and remote personal location and commodity tracking. The digital connectivity trend is expected to advance both wired and wireless local-area communications as well as wireless wide-area communications, resulting in additional GIS-based high-speed communications options, more flexible GIS server and network access, direct GIS access for emergency response, and more field-based and mobile GIS applications. The Internet will continue to evolve and Web services will emerge to support omni-environmental Web-based GIS deployments and help LBS applications and usage thrive. The geographic data management and visualization trend could impact technologies that include graphic displays and 3D data management/analysis/visualization, leading to broader map and data viewing and less need for hardcopy, more effective visualization and analysis, and an increased role for virtual reality environments in the GIS industry. The movement toward open systems and standards (in the form of open-source software, geospatial metadata and catalog standards) promises to foster more effective, out-of-the-box options for inputting, updating, and accessing metadata; increased acceptance of server-based Linux systems for Web-based GIS applications; and a migration toward standards compliance and interoperability in commercial product development, among other things. The geographic capture and compilation trend could move forward via advancements in areas that include robotic devices and data monitoring and collection.
    Click Here to View Full Article


 