
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 846:  September 26, 2005

  • "Who Best to Avert Disaster: Government or Business?"
    SearchSecurity.com (09/23/05); Brenner, Bill

    Paladin Capital Group principal Ken Minihan has had first-hand experience in the role both government and the private sector play in protecting the nation's digital infrastructure: Being a former director of the National Security Agency (NSA) made him realize the government's responsibility in averting disaster, while falling victim to identity theft taught him that business must also shoulder some of the burden. Minihan agrees with a 2005 Government Accountability Office (GAO) assessment that the Homeland Security Department must make a greater outreach effort to the business community, but comments that the private sector must likewise be more open with the DHS. "Businesses need to be clearer on what they need from the government, so they can play a greater role in homeland security," he says. Minihan stresses that the first line of defense consists of the individual IT workers responsible for keeping enterprise networks safe from cyber-criminals on a daily basis. He notes that most organizations are focused on enhancing physical security while cybersecurity gets short shrift, the irony being that physical safeguards will not matter if the network is running exploitable programs. Minihan says enterprises need to improve their online authentication methodology through new technologies and policies, as well as patch up vulnerabilities in both hardware and software. DHS cybersecurity director Andy Purdy expects U.S. cyber-infrastructure to be bolstered with a reorganization of his department. This restructuring aims to give DHS greater influence, and its changes include an assistant secretary of cyber and telecommunications who, in the words of the department, is "responsible for identifying and assessing the vulnerability of critical telecommunications infrastructure and assets."
    Click Here to View Full Article

  • "A Wireless Web for Ocean Waters"
    InternetNews.com (09/23/05); Kuchinskas, Susan

    The health of the Gulf of Farallones National Marine Sanctuary is continuously monitored by a network of floating sensors set up by San Francisco State University (SFSU), but uploading the sensors' data to the main computer was a time-consuming affair in which researchers traveled by boat to each sensor and manually transferred the data to a laptop, and then returned to the mainland to perform the actual uploading. This process has been greatly accelerated through the Networked Bay Environmental Assessment Monitoring Systems (NetBEAMS) project, a joint effort between SFSU, Agilent Technologies, and Sun Microsystems that enables sensor data to be accessed in real time by cell phones. The NetBEAMS application surveys the sensor network, which supplies information on salinity, temperature, water depth, and algae growth, and broadcasts the data through cell phones to a database that uploads the data to the Web. Sun, Agilent, and SFSU coders created sensor node software that boasted self-descriptors for the sensor, actuator, and data, along with a networking protocol. Both cell phones and sensors usually employ a serial datalink, so the researchers simply loaded the sensor software into the phones, and then connected the phones to the buoy-mounted sensor arrays. SFSU Computer Science Department Chairman Dragutin Petkovic says projects such as NetBEAMS are a vital training experience for students, who are now expected to have more than purely technical expertise upon graduation. Sun's Jim Wright says NetBEAMS is an opportunity to test the feasibility of building Java networks that observe and remotely control equipment, machines, and the environment.
    Click Here to View Full Article
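
    A minimal Python sketch of the relay path described above, assuming a hypothetical record layout (the project's actual sensor software was written in Java): a buoy emits a comma-separated reading over the serial datalink, and the phone-side code parses it into the payload a web database would ingest.

```python
# Sketch of a NetBEAMS-style relay step. Field names and the record layout
# are hypothetical; a real deployment would read the line from a serial port
# and POST the payload over the cellular link instead of printing it.
import json

def parse_reading(line: str) -> dict:
    """Parse a comma-separated sensor record into named fields."""
    salinity, temperature, depth, chlorophyll = (float(x) for x in line.split(","))
    return {
        "salinity_psu": salinity,
        "temperature_c": temperature,
        "depth_m": depth,
        "chlorophyll_ugL": chlorophyll,   # proxy for algae growth
    }

def to_upload_payload(station_id: str, reading: dict) -> str:
    """Wrap a reading in the JSON document the web database would ingest."""
    return json.dumps({"station": station_id, "reading": reading})

if __name__ == "__main__":
    raw = "33.5,14.2,12.0,2.7"            # simulated line from the serial datalink
    payload = to_upload_payload("farallones-buoy-1", parse_reading(raw))
    print(payload)                         # a real client would HTTP POST this
```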

  • "Professor Wants to Put Your Toaster on the Internet"
    University of Arizona (09/19/05); Stiles, Ed

    University of Arizona systems and industrial engineering professor Fei-Yue Wang envisions a system in which household appliances and other devices are remotely controlled by computers through software agents; such a system would reduce the cost and boost the efficiency of such devices. His concept involves linking cheap, reconfigurable, "dumb" appliances to an operation center that supplies autonomous software algorithms. Each appliance boasts sufficient memory and processing power to support one or two control agents for specific operations. Wang pictures a pair of central controllers--one based in the home and one based at the appliance company headquarters: The headquarters' operation center will use data mining to familiarize itself with how appliances are used in each home, thus providing a customized control system that improves both performance and cost efficiency. Wang says appliance manufacturers' participation would be easy to obtain, as the project involves the use of inexpensive computers that produce valuable data on individual users. This data could be employed to develop more narrowly focused marketing strategies. The professor splits control agents into three varieties: A device-level default agent with basic functions that controls the appliance during network downtime; a local network agent; and a custom-designed agent from the manufacturer, which is periodically downloaded to become the latest local network agent. Wang also believes his concept could be applied to automobiles and traffic control systems.
    Click Here to View Full Article
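
    A rough Python sketch of the three agent tiers Wang describes, with hypothetical names and setpoints; it illustrates only the fallback logic, not any actual appliance software.

```python
# Illustrative only: a built-in default agent keeps the appliance working when
# the network is down, a local network agent runs in the home operation center,
# and a manufacturer-supplied agent, when downloaded, takes precedence.
from typing import Optional

class ControlAgent:
    name = "base"
    def decide(self, sensor_value: float) -> str:
        raise NotImplementedError

class DeviceDefaultAgent(ControlAgent):
    """Lives in the appliance's small memory; basic behavior only."""
    name = "device-default"
    def decide(self, sensor_value: float) -> str:
        return "heat" if sensor_value < 60.0 else "idle"

class LocalNetworkAgent(ControlAgent):
    """Runs on the home operation center; tuned to this household."""
    name = "local-network"
    def decide(self, sensor_value: float) -> str:
        return "heat" if sensor_value < 55.0 else "idle"   # learned setpoint

def pick_agent(network_up: bool, downloaded: Optional[ControlAgent],
               local: ControlAgent, default: ControlAgent) -> ControlAgent:
    """Prefer the freshest manufacturer agent, fall back to the device default."""
    if not network_up:
        return default
    return downloaded or local

if __name__ == "__main__":
    agent = pick_agent(network_up=True, downloaded=None,
                       local=LocalNetworkAgent(), default=DeviceDefaultAgent())
    print(agent.name, agent.decide(52.0))
```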

  • "Lawmakers Will Seek a Federal Study of Colleges' Success at Stopping File Swapping"
    Chronicle of Higher Education (09/23/05); Read, Brock

    At a congressional hearing on Thursday concerning the file-sharing habits of college students, two prominent representatives announced their intention to ask the Government Accountability Office to conduct a study exploring the relative success of antipiracy measures among colleges and universities. University of Florida housing director Norbert Dunkel was lauded for the development of Icarus, an in-house program Florida uses to kick students off the Internet if they are found connecting to a peer-to-peer network, an initiative that has essentially halted copyright-infringing activity. Icarus has raised the ire of privacy advocates, who criticize its sweeping usage restrictions for curtailing students' access to legitimate information on the Web. A report prepared by university officials and representatives from the entertainment industry criticized campus administrators for their complacency, and lamented the introduction of myTunes and ourTunes, two programs that convert Apple's iTunes into a file-swapping service; the group identified several legitimate file-sharing services that offer college students legal options for downloading music. Some schools have opted for imposing bandwidth restrictions, while other systems, such as Icarus, monitor networks for specific illicit activity. Discussions about file-sharing have become integrated into campus discourse, as electronic usage agreements are often a part of orientations, and the issue has been taken up inside the classroom. The group also identified the increasing use of Internet2 as a hub of illegal sharing, and renewed its call for schools to police network activity. It called on universities to remain vigilant and consistent in their treatment of violators, to ensure that all of the education and filtering programs in place are sufficiently reinforced.

  • "From Shared to Distributed Memory Systems for Applications"
    IST Results (09/23/05)

    The goal of the IST-funded POP project was to produce an environment where shared-memory applications designed using the OpenMP application program interface can run on distributed-memory systems. OpenMP was crafted by a consortium of computer vendors to simplify the creation of portable, high-level shared-memory applications in C, C++, and Fortran. OpenMP applications do not perform efficiently on distributed-memory machines because they are designed for shared-memory systems, where all processors access a single common pool of memory. "What we have done is adapt OpenMP by extending and modifying protocols and runtime mechanisms to make it more flexible so programs can run on shared or distributed-memory systems without having to be retuned in each individual case," says POP project coordinator Jesus Labarta. "The end goal is to allow OpenMP applications to run anywhere, reducing the time and costs of reprogramming." NASA and IBM are testing POP's OpenMP environment on their distributed computing systems. NASA is investigating whether the environment can address interoperability problems in parallelizing its machines, while IBM is evaluating the POP method on its MareNostrum supercomputer.
    Click Here to View Full Article
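
    The shared- versus distributed-memory gap that POP bridges can be illustrated by analogy. The Python sketch below is neither OpenMP nor the POP runtime, but it shows why the two models differ: threads can read one list directly because memory is shared, while worker processes must be handed explicit copies of their chunks.

```python
# Analogy only (Python, not OpenMP/POP): the same sum written in a
# shared-memory style, where threads index one common list, and a
# distributed-memory style, where each worker process only sees the chunk
# explicitly handed to it.
import threading
from multiprocessing import Pool

DATA = list(range(100_000))
N_WORKERS = 4

def shared_worker(idx: int, out: list) -> None:
    # Threads share the process's memory, so DATA is visible directly.
    out[idx] = sum(DATA[idx::N_WORKERS])

def chunk_sum(chunk: list) -> int:
    # Worker processes have separate memory; they see only the chunk sent to them.
    return sum(chunk)

if __name__ == "__main__":
    # Shared-memory style.
    partial = [0] * N_WORKERS
    threads = [threading.Thread(target=shared_worker, args=(i, partial))
               for i in range(N_WORKERS)]
    for t in threads: t.start()
    for t in threads: t.join()
    print("shared-memory sum:", sum(partial))

    # Distributed-memory style: data must be partitioned and shipped explicitly.
    chunks = [DATA[i::N_WORKERS] for i in range(N_WORKERS)]
    with Pool(N_WORKERS) as pool:
        print("distributed-memory sum:", sum(pool.map(chunk_sum, chunks)))
```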

  • "Professor David Haussler to Receive Carnegie Mellon's Dickson Prize"
    Currents--UC Santa Cruz (09/26/05); Stephens, Tim

    University of California-Santa Cruz biomolecular engineering professor David Haussler has won Carnegie Mellon University's Dickson Prize in Science for his contributions to bioinformatics and computational learning theory. His introduction of hidden Markov models and related techniques to RNA, DNA, and protein sequence analysis in the 1990s led to his participation in the International Human Genome Project, which involved his team supplying a computational solution that enabled the first working draft of the human genome to be completed. The human genome sequence was published by Haussler's group on the World Wide Web, and the team's next achievement was the development and management of the UCSC Genome Browser, an interactive "microscope" that lets scientists observe analyzed and annotated genome sequences at any scale. Haussler is performing continuing research in comparative and evolutionary genomics using the tools his team developed. His group has spearheaded a computer-based, probabilistic method for studying the genome of one mammalian species and comparing it to the genomes of one or more other species, and has also devised algorithms that have identified "ultra-conserved" genomic segments. Haussler is director of UCSC's Center for Biomolecular Science and Engineering, scientific co-director of the California Institute for Quantitative Biomedical Research, and a fellow of the American Association for the Advancement of Science and the American Association for Artificial Intelligence. He was named 2001's "Scientist of the Year" by R&D Magazine, and earned the 2003 ACM/AAAI Allen Newell Award.
    Click Here to View Full Article
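
    For a flavor of the hidden Markov model techniques mentioned above, the toy sketch below decodes a short DNA string into GC-rich and AT-rich regions with the standard Viterbi algorithm; the parameters are invented and bear no relation to the UCSC models.

```python
# Toy illustration of an HMM over DNA (parameters are made up): two hidden
# states, a GC-rich region and an AT-rich region, decoded with Viterbi.
import math

STATES = ("GC_rich", "AT_rich")
START = {"GC_rich": 0.5, "AT_rich": 0.5}
TRANS = {"GC_rich": {"GC_rich": 0.9, "AT_rich": 0.1},
         "AT_rich": {"GC_rich": 0.1, "AT_rich": 0.9}}
EMIT = {"GC_rich": {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1},
        "AT_rich": {"A": 0.4, "C": 0.1, "G": 0.1, "T": 0.4}}

def viterbi(seq: str) -> list:
    """Return the most likely hidden-state path for a DNA sequence."""
    # log-probability of the best path ending in each state, plus backpointers
    best = [{s: math.log(START[s]) + math.log(EMIT[s][seq[0]]) for s in STATES}]
    back = []
    for base in seq[1:]:
        scores, pointers = {}, {}
        for s in STATES:
            prev, score = max(
                ((p, best[-1][p] + math.log(TRANS[p][s])) for p in STATES),
                key=lambda item: item[1])
            scores[s] = score + math.log(EMIT[s][base])
            pointers[s] = prev
        best.append(scores)
        back.append(pointers)
    # trace back from the best final state
    state = max(STATES, key=lambda s: best[-1][s])
    path = [state]
    for pointers in reversed(back):
        state = pointers[state]
        path.append(state)
    return list(reversed(path))

if __name__ == "__main__":
    sequence = "GCGCGCATATATAT"
    for base, label in zip(sequence, viterbi(sequence)):
        print(base, label)
```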

  • "Data Encryption About to Make Quantum Leap"
    Globe and Mail (CAN) (09/22/05); Buckler, Grant

    Quantum cryptography promises to deliver unbreakable encryption because of its reliance on subatomic particles whose state can be detectably altered by the act of observing them. A quantum encryption key is relayed over optical fiber, or potentially through the air as a light beam; eavesdroppers cannot intercept the key without tipping off the sender and receiver, since their observation will change the properties of the photons comprising the key. ID Quantique CEO Gregoire Ribordy acknowledges that data protected by a quantum-distributed key could still be cracked by a sufficiently powerful computer given enough encrypted traffic, but quantum cryptography can thwart such attacks by changing the key several times a second and transmitting each new key over the network. However, fiber-optic transmission of quantum-encrypted data is currently limited to distances of about 100 kilometers. Potential solutions to this problem include the amplification of the signal at intervals by quantum repeaters, or the creation of networks that transmit data from point to point to point, with quantum keys for every link. Meanwhile, University of Toronto researchers have worked out an eavesdropping detection method using quantum decoys that boosts the performance of quantum cryptography, enabling a stronger signal that can travel farther while remaining secure. Accenture Technology Labs research director Martin Illsley says the government, the military, and the financial industry currently comprise the primary markets for the shorter-range technology.
    Click Here to View Full Article
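
    The eavesdropper-detection principle can be seen in a toy classical simulation of BB84-style key exchange (the article names no specific protocol; BB84 is the textbook example): measuring a photon in the wrong basis randomizes it, so an interceptor shows up as an elevated error rate when the two ends compare a sample of the sifted key.

```python
# Toy classical simulation of BB84-style quantum key distribution, purely to
# show why an eavesdropper is detectable: measuring in the wrong basis
# randomizes the bit, so interception raises the error rate on the sifted key.
# (Illustration only; not the article's described system.)
import random

def measure(bit: int, prep_basis: int, meas_basis: int) -> int:
    """Return the measured bit: faithful if bases match, random otherwise."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def exchange(n_photons: int = 2000, eavesdrop: bool = False) -> float:
    """Run one exchange and return the error rate on the sifted key."""
    errors = sifted = 0
    for _ in range(n_photons):
        bit, a_basis = random.randint(0, 1), random.randint(0, 1)
        if eavesdrop:
            e_basis = random.randint(0, 1)
            bit_in_flight = measure(bit, a_basis, e_basis)   # Eve measures...
            prep_basis = e_basis                              # ...and resends
        else:
            bit_in_flight, prep_basis = bit, a_basis
        b_basis = random.randint(0, 1)
        received = measure(bit_in_flight, prep_basis, b_basis)
        if a_basis == b_basis:                # keep only matching-basis photons
            sifted += 1
            errors += (received != bit)
    return errors / sifted

if __name__ == "__main__":
    print(f"error rate, no eavesdropper:   {exchange(eavesdrop=False):.2%}")  # ~0%
    print(f"error rate, with eavesdropper: {exchange(eavesdrop=True):.2%}")   # ~25%
```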

  • "Making the Internet Wicked Fast"
    Philadelphia Inquirer (09/22/05) P. D1; Parker, Akweli

    The Internet2 consortium is developing advanced, high-speed uses for the next-generation Internet. One such application is remote high-definition videoconferencing, which was demonstrated this week at the consortium's fall meeting. Pure HD video consumes too much bandwidth to be viable for most broadband links, but the MPEG-4 video-compression scheme enabled a concert by a student cellist to be broadcast at the conference in real time over a 1 Mbps connection. The videoconferencing system's image quality problems should be resolved with a compact HD camera and telephone set-up as well as with faster transmission speeds. Internet2 officials expect the next-generation Internet's higher speeds and pervasive use of video to be broadly adopted by the general public. Internet2 members are currently employing diverse techniques to hold videoconferences in regular video resolution and to transfer massive libraries of research data. The consortium members use a network that seeks to supply connected desktops with 100 Mbps transmission speeds; the network is being used by researchers to connect supercomputers with radar stations to boost the precision of weather forecasts, for instance.

  • "Towards the Narrative Annotation of Personal Information and Gaming Environments"
    University of Southampton (ECS) (09/15/05); Tuffield, Mischa M.; Millard, David E.; Shadbolt, Nigel R.

    University of Southampton researchers propose a technique for enabling the dynamic generation of narratives from available knowledge bases using the Semantic Web (SW) publishing paradigm. The authors refer to Bal's three-layer view of narrative consisting of, in ascending order, the fabula (raw chronological events), the story, and the narrative. By analogy, the collection of "knowledge nuggets" produced by SW enabling technologies through annotation of multimedia items constitutes the fabula; story grammars, most frequently deployed as templates, represent the architecture used to depict the desired story; and the resulting output forms the narrative. Story grammars are the key drawback to this approach, because they must be defined by developers prior to system implementation and thus limit a system's ability to spot new relationships to portray in a narrative. The researchers discuss how their method for recognizing related narrative components and combining them into stories can apply to the Memories for Life (M4L) project and massively multiplayer online role-playing games (MMORPGs) through the use of SW technologies. The M4L application seeks to develop narratives to contextualize digital photo collections, based on the photos and any pertinent data taken from available sources. The MMORPG application is designed to help guide players' actions and make them feel more involved in the game world by taking a snapshot of the game environment each time a user logs in. The proposed narrative has three layers: A high-level narrative describing the most powerful characters' actions, domains, collected items, and so on; a second narrative that describes the major events that have transpired in a player's vicinity; and a third narrative describing characters opposed to the player that must be disposed of, as well as possible collaborators.
    Click Here to View Full Article
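
    A hypothetical Python sketch of Bal's three layers as used above: the fabula is a bag of raw timestamped events, a story grammar is a template that selects and orders them, and the narrative is the rendered text. The structures below are illustrative only and do not mirror the Southampton implementation.

```python
# Illustration of the fabula / story / narrative layering; all names and the
# "holiday" story grammar are invented for the example.
from dataclasses import dataclass

@dataclass
class Event:                       # fabula: raw chronological events
    time: int
    actor: str
    action: str
    place: str

def holiday_story_grammar(fabula: list) -> list:
    """Story layer: keep events at one place, in chronological order."""
    place = fabula[0].place if fabula else ""
    return sorted((e for e in fabula if e.place == place), key=lambda e: e.time)

def render_narrative(story: list) -> str:
    """Narrative layer: turn the ordered story into text."""
    return " Then ".join(f"{e.actor} {e.action} at {e.place}." for e in story)

if __name__ == "__main__":
    fabula = [
        Event(2, "Alice", "photographed the harbor", "Lisbon"),
        Event(1, "Alice", "arrived", "Lisbon"),
        Event(3, "Bob", "logged in", "game world"),
    ]
    print(render_narrative(holiday_story_grammar(fabula)))
```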

  • "SPARQL: Web 2.0 Meet the Semantic Web"
    O'Reilly Network (09/16/05); Clark, Kendall

    To provide the Semantic Web and Web 2.0 with a standard query language, the Data Access Working Group developed SPARQL, an RDF query language and protocol. The REST style that underlies most Web 2.0 applications offers a standard set of operations, but lacks any form of standard data manipulation language. The addition of a functional query language will greatly augment Web 2.0's ability to bring more meaning to its data, building on RDF as a data representation formalism. SPARQL brings a measure of standardization that neither REST nor HTTP could provide, giving the Semantic Web and Web 2.0 a common means of data manipulation. SPARQL offers Web 2.0 a single query language and client that can access enormous repositories of data. SPARQL makes a great stride toward bringing together the Web 2.0 and Semantic Web communities, though there still needs to be an adequate SPARQL client implementation written in JavaScript. It will also be important to develop Ajax-friendly models for constructing and moving queries. Finally, the technology needs to mature and spread, though implementations have already taken root in Java, Python, and other common languages.
    Click Here to View Full Article
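
    For a concrete taste of the language, the sketch below runs a small SPARQL query over an in-memory RDF graph using Python's rdflib; the library and the tiny FOAF dataset are our choices for illustration, not ones named in the article.

```python
# Minimal SPARQL example: load a few FOAF triples and ask who knows whom.
# Requires rdflib (pip install rdflib); the data is a made-up two-person graph.
from rdflib import Graph

turtle_data = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<http://example.org/alice> foaf:name "Alice" ; foaf:knows <http://example.org/bob> .
<http://example.org/bob>   foaf:name "Bob" .
"""

g = Graph()
g.parse(data=turtle_data, format="turtle")

query = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?name ?friendName WHERE {
    ?person foaf:name ?name ;
            foaf:knows ?friend .
    ?friend foaf:name ?friendName .
}
"""

for name, friend_name in g.query(query):
    print(f"{name} knows {friend_name}")
```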

  • "Implementing Accessibility Standards"
    SD Times (09/15/05) No. 134, P. 5; Handy, Alex

    Writing software that accommodates disabled users has become much easier thanks to more ubiquitous tools and standards, but those benefits materialize only if universal access standards are built into the application from the start. Implementation can vary across different companies, but experts concur that providing accessibility makes good business sense. Sun Microsystems accessibility architect Peter Korn says accessibility standards have entered an "access by contract" phase whereby the accessibility interface consists of a collection of methods contained in each onscreen object. "These methods provide the information that users of accessibility technology need," he explains. Paul Snayd of IBM's accessibility technical team says application programming interfaces can benefit developers as well as users. He notes that developers need to decide what kind of users their application is designed for and what functions it must perform. Snayd adds that certain platforms--Swing, Java, Windows, etc.--offer more to help ease accessibility programming. "There are [also] lots of helpful checklists and things on Web sites, setting out what the objectives are," he says.
    Click Here to View Full Article
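
    Korn's "access by contract" idea, in which every onscreen object exposes the same small set of accessibility methods, can be sketched generically; the interface below is hypothetical and is not Sun's or IBM's actual API.

```python
# Hypothetical illustration of "access by contract": every widget exposes the
# same small accessibility contract, so assistive technology can interrogate
# any of them the same way. Not a real accessibility API, just its shape.
class Accessible:
    def accessible_name(self) -> str: ...
    def accessible_role(self) -> str: ...
    def accessible_value(self) -> str: ...

class SubmitButton(Accessible):
    def accessible_name(self) -> str:  return "Submit order"
    def accessible_role(self) -> str:  return "button"
    def accessible_value(self) -> str: return "enabled"

class VolumeSlider(Accessible):
    def __init__(self, level: int):    self.level = level
    def accessible_name(self) -> str:  return "Volume"
    def accessible_role(self) -> str:  return "slider"
    def accessible_value(self) -> str: return f"{self.level} percent"

def screen_reader(widgets: list) -> None:
    """Assistive technology only needs the contract, not the widget internals."""
    for w in widgets:
        print(f"{w.accessible_role()}: {w.accessible_name()} ({w.accessible_value()})")

if __name__ == "__main__":
    screen_reader([SubmitButton(), VolumeSlider(35)])
```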

  • "Better by Design"
    Economist Technology Quarterly (09/05) Vol. 376, No. 8444, P. 27

    More and more major firms are embracing product life-cycle management (PLM) software that helps companies design, build, and manage products, thus playing a critical role in their operation, expansion, and profitability. PLM's many benefits include minimization of new parts and suppliers, seamless integration of old and new components, reduced prototyping costs, shorter time-to-market for new products, swifter turnaround of marketing materials, less waste, and better product quality. The problems and complexities that come with globalization can be eased via a single, secure environment for international communication and collaboration provided by PLM. Other driving factors behind PLM's adoption include the growing complexity of the products themselves, and the need to satisfy numerous regulations and requirements. General Motors successfully shrank its average development time for new vehicles from 48 months in 1997 to 12-18 months today by implementing a PLM system. The system has helped instill reusability within vehicle components and subsystems; it also facilitates collaboration with suppliers, and virtualizes the crash-testing process. Walter Donaldson of IBM says the next phase in PLM's development is to increase its adoption among midsize companies. PLM currently constitutes the fastest-growing segment of the business-software market, but Forrester's Navi Radjou thinks PLM's functions will eventually be split up and distributed among other applications.
    Click Here to View Full Article

  • "NARA: New Archive System Could Change Records Management"
    Government Computer News (09/19/05) Vol. 24, No. 28; Thormeyer, Rob; Miller, Jason

    The National Archives and Records Administration (NARA) expects its Electronic Records Archive (ERA) program to fundamentally alter the way federal records are preserved and managed for the foreseeable future. The ERA program, recently awarded to Lockheed Martin, is scheduled to become initially operational by 2007 and yield significant improvements in efficiency, such as the online submission process agencies will use when submitting records. NARA requires agencies to submit a request when they want to destroy information they deem historically insignificant, and ERA promises to make that process faster and more transparent. ERA will store the billions of government documents that NARA manages in such a way as to ensure that they will be accessible in future formats that have yet to be invented, preserving the authenticity of the records after their original format becomes obsolete. Lockheed recognizes that the system will have to be built on an open architecture to incorporate new formats. Of the few details Lockheed has disclosed, it has said that digital adapters will be used to transform content into timeless formats. In the development of ERA, Lockheed will not depend on a specific vendor or product, but rather test multiple applications to determine which is most appropriate for a given function. "The important part of the system is the data," said Lockheed's Tom Kelley. "It is not tied to any given hardware or software." The initial phase of ERA will search for information at risk of being lost, such as records concerning Iraq and Afghanistan, as well as offer public access to a large repository of electronic records.
    Click Here to View Full Article
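
    Lockheed's "digital adapters" suggest a familiar software pattern: a registry of per-format converters that normalize records into a durable, open representation. The Python sketch below illustrates that pattern hypothetically and is not the ERA design.

```python
# Hypothetical sketch of the "digital adapter" idea: converters registered
# per file extension, each normalizing a record into a simple open form so it
# stays readable after the original format dies. Illustration of the pattern
# only, not the actual ERA architecture.
from typing import Callable, Dict

ADAPTERS: Dict[str, Callable[[bytes], str]] = {}

def adapter(extension: str):
    """Register a converter for one legacy format."""
    def register(func: Callable[[bytes], str]):
        ADAPTERS[extension] = func
        return func
    return register

@adapter(".txt")
def plain_text(raw: bytes) -> str:
    return raw.decode("utf-8", errors="replace")

@adapter(".csv")
def comma_separated(raw: bytes) -> str:
    rows = raw.decode("utf-8", errors="replace").splitlines()
    return "\n".join(" | ".join(row.split(",")) for row in rows)

def preserve(filename: str, raw: bytes) -> str:
    """Route a record through the adapter for its format."""
    ext = filename[filename.rfind("."):]
    if ext not in ADAPTERS:
        raise ValueError(f"no adapter registered for {ext}")
    return ADAPTERS[ext](raw)

if __name__ == "__main__":
    print(preserve("memo.csv", b"date,subject\n2005-09-26,ERA kickoff"))
```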

  • "It's a Whole New Web"
    Business Week (09/26/05) No. 3952, P. 76; Hof, Robert D.; Baker, Stephen; Ihlwan, Moon

    An entirely new Web is emerging, one that re-casts users as active participants and customized content creators. Media organizations, online retailers, and tech companies will need to adapt their business models to the increasingly user-centric, do-it-yourself Web. Audience participation is becoming more commonplace in new Web sites and online services: Examples include the PostSecret group blog, in which participants put down their thoughts on mock postcards that are selected for public viewing, while the Pandora music site features channels custom-built by users. Tagging technology that lets users track Web activities by labeling articles, Web sites, and other online material is attracting interest from advertisers and other companies as a tool for mapping trends and planning marketing strategies. Meanwhile, DonorsChoose.org is an online charity where educators can post requests for needed resources, which visiting donors can choose to fund. South Korea's Cyworld Internet service exemplifies the eroding boundary between the physical and virtual worlds: The service allows users to create customized 3D home pages that can support limitless numbers of photos, blogs, documents, and accessories, and establish connections with other members. Cyworld has become a very popular socialization and leisure tool for nearly one-third of South Korea's population. The explosion of user-centric Web sites is being partly driven by a set of programming technologies collectively known as Ajax, which ramp up the Web experience and bring Web applications closer to desktop programs in terms of speed.
    Click Here to View Full Article

  • "Basic Training for Anti-Hackers"
    Chronicle of Higher Education (09/23/05) Vol. 52, No. 5, P. A41; Carnevale, Dan

    The threat of terrorists penetrating computer networks and wreaking havoc prompted the creation of the Cyber Security Boot Camp, an intense 10-week summer program hosted by the U.S. Air Force and Syracuse University in which participating college students study and practice hacking so that they may learn how to defend against cyberattacks. Air Force Research Laboratory computer engineer Kamal Jabbour says the goal of the program goes far beyond making these cyber-defenders technically proficient: He wants them to become sensitive to the urgency of the threat in order to be decisive in action. Participants take cybersecurity courses that cover cryptography, steganography, network security, wireless security, and digital forensics. Students are required to analyze a security problem and present a solution in a detailed report each week, all the while conforming to a strict writing style. Participants also serve as interns with local companies and organizations in order to be exposed to real-world cybersecurity applications. The boot camp's high-pressure course load is complemented by adherence to stringent rules concerning housing, appearance, and physical fitness, which are laid out in a military regimen. The program climaxes with a hacking contest in which student teams penetrate their opponents' computers to capture virtual flags. Each team is divided into two groups--one dedicated to attack rivals' systems and the other committed to defending their own system.

  • "The Virtues of Virtualization"
    CIO (09/15/05) Vol. 18, No. 23, P. 86; Hapgood, Fred

    The enormous potential of virtualization software lies in its ability to fill in for physical computing components, removing the drawbacks and risks such elements present. Projects that have opted for virtualization have enjoyed substantial gains in implementation, uptime, and, perhaps most critically, utilization efficiencies. Virtualization also simplifies disaster recovery planning by instilling remote manageability. In addition, virtualized hardware can allow developers to wield much greater influence over the development process, as well as facilitate quality assurance without capital expenditures. The anticipated migration of virtualization from storage, servers, and development to networks and data centers in the near future will enable CIOs to build software versions of physical infrastructure on an as-needed basis. With this will come major automation opportunities, while virtualization's ability to discriminate among classes of service that once had to be accommodated collectively will bring policy issues to light. Possible implications for CIOs include a shift in responsibility to the maintenance of the execution environment, which could come to encompass whole systems.
    Click Here to View Full Article

  • "On the Data Road"
    GeoWorld (09/05) Vol. 18, No. 9, P. 38; Herbst, Margaret

    The U.S. Department of Transportation's Vehicle Infrastructure Integration (VII) project seeks to create a wireless "network within the road network" that communicates safety, commercial, and mobility data among vehicles and the highway infrastructure by connecting cars, public-transit vehicles, and trucks to each other as well as data processing centers. Such a system could make drivers more careful and emergency services more responsive to accidents. For VII to work, probe vehicles must be equipped with digital road maps so vehicles are aware of their position on the road as well as where they are in relation to other vehicles, while infrastructure managers will need digital maps to ascribe all vehicles to specific locations. The digital map database can expedite speed alerts, curve warnings, automated collision notification, and other advanced safety and mobility applications by guaranteeing that the vehicle always knows its exact location, its future route, and information about local services. The first major demonstration of VII will be at the World Congress on Intelligent Transport Systems, where companies will spotlight enabling technologies. VII's success hinges on the resolution of key issues such as interoperability standards, ownership of the data in the probe vehicles, and data management. Governmental and industry leaders aim to finalize VII implementation plans by 2008 and start system operations as early as 2010.
    Click Here to View Full Article

  • "The Changing Role of Software as Hardware"
    Embedded Systems Programming (09/05) Vol. 18, No. 9, P. 55; Banta, Gary

    Hardware and software have historically been kept separate, with hardware traditionally providing integration, differentiation, defined system economics, and product practicality. But software will increasingly take on the gruntwork that used to be the exclusive domain of hardware design, and this should allow all standards to be implemented on a single processor. The migration of innovation and product differentiation into the software domain is being impelled by the exponentially rising costs of hardware development as well as software's ability to yield sustainable efficiencies that hardware cannot match. Deploying functionality in software allows easily scalable designs that can lower development cost and time-to-market, while the generality of a programmable/configurable processor can reduce the overall risk of a design. Software can also streamline the process of managing the optimal balance of performance, size, power, and cost of an IP block. Trends indicate that the move to software as hardware is proceeding apace, as evidenced by the emergence of software-configurable processors. Concurrent with the shift toward software is the elevation of the abstraction level, which will enable software engineers to design hardware without needing comprehensive knowledge of the architecture's hardware design.
    Click Here to View Full Article


 