
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 775:  Wednesday, April 6, 2005

  • "Call of the Wild for BIOS"
    CNet (04/05/05); Spooner, John G.

    A free BIOS movement is forming to break open the information monopoly held by PC makers such as Dell and by specialty BIOS-writing firms. Free Software Foundation (FSF) President Richard Stallman wants to provide an alternative to closed BIOS, but says PC makers and motherboard manufacturers need to open up the necessary information, such as how the BIOS loads and how hardware devices initialize; currently, that information is released to BIOS companies under non-disclosure agreements. As long as users do not control the BIOS, they do not control their computers, and Stallman worries that digital-rights management and trusted computing security technologies could significantly limit computer freedom unless individual developers have the ability to write around those features or shut them off in the BIOS. BIOS technology is changing with the transition to the new Extensible Firmware Interface (EFI) framework, and BIOS-writing firms say the current closed system provides stability, reliability, and security. A hacker with information on how a BIOS was written could create a virus that constantly refreshes a PC, for instance, warns Phoenix Technologies' Mike Goldgof; "What people take for granted...is the reliability of the firmware today," he says. Intel is promoting an alternative BIOS framework called Tiano that uses EFI drivers to turn on PC elements such as the processor. Tiano is licensed under the BSD open source license, which allows users to change and commercialize the technology without having to release their modifications to the public, thereby preserving intellectual property. But critics say Tiano does not provide the initialization code that would make it easy for developers to cobble together their own BIOS.
    Click Here to View Full Article

  • "Your Are What You Listen To: Users of Digital Music Sharing System Judge Others by Their Playlists"
    Georgia Institute of Technology (04/01/05); Sanders, Jane

    Music sharing technologies show tremendous potential in fostering community feelings, according to a study by human-computer interface researchers at the Georgia Institute of Technology and the Palo Alto Research Center. The study focused on co-workers at a midsized U.S. company who shared music libraries through Apple Computer's iTunes software, which allows users to share entire collections or playlists of songs. The software does not allow copying of music, but songs are streamed from the host machine to other people on the network while the host is online. The researchers found that users consciously crafted their shared playlists to project a desired image, registered the presence of others according to whether music was being shared, and made efforts to find out the identity of people who shared music. The iTunes software was designed primarily for home users, and the study indicates music sharing technologies could foster online communities regardless of whether those involved share the same music taste. Of the 13 people interviewed, several admitted to hiding certain music or adding files they thought would present a more balanced or pleasing image of themselves; when a company manager joined the network, participants began to speculate about whether the available music had changed as a result. The researchers also noted that people were less inclined to peruse anonymous collections and even went out of their way to match people to music. The study has been nominated for best paper at the Computer-Human Interaction conference (CHI 2005) set for April 5 in Portland, Ore.
    Click Here to View Full Article

  • "Dialogue Systems Juggles Topics"
    Technology Research News (04/13/05); Smalley, Eric

    A joint venture between Stanford University and Edinburgh University has yielded the Conversational Interface Architecture, a dialogue management system designed to improve verbal human-computer communications by giving computers the ability to recognize conversational context and anticipate what kind of phrase a person is likely to say next. Slot-filling dialogue systems prompt users to supply topic-specific data containing keywords the systems read to determine their response, which confines the exchange to a single, predetermined topic; Edinburgh University research fellow Oliver Lemon says the Conversational Interface Architecture overcomes this limitation by following multithreaded conversations. Contextual tracking of conversations narrows the spectrum of words the system must try to recognize, streamlining the task of speech recognition. In addition, the system can identify corrective fragments, or phrases that correct something the user has just verbalized, and it enables users to start, broaden, and correct dialogue threads whenever they wish. This is possible thanks to the system's ability to track various kinds of utterances, such as yes and no answers, who-what-where answers, and corrections; focusing on utterance type rather than specific topics to limit speech recognition makes the system compatible with any application. Lemon predicts that practical multithreaded dialogue management could emerge within two years.
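    The type-based thread tracking described above can be pictured with a minimal sketch. This is not the Conversational Interface Architecture's actual code: the thread structure, utterance categories, and toy classification rules below are hypothetical, chosen only to show how dispatching each utterance by its type (yes/no answer, wh-answer, correction, new topic) lets a manager juggle several conversation threads and constrain what it expects to hear next.

```python
from dataclasses import dataclass, field

@dataclass
class Thread:
    topic: str
    slots: dict = field(default_factory=dict)   # data gathered for this thread

class DialogueManager:
    """Toy multithreaded dialogue manager (illustrative only)."""

    def __init__(self):
        self.threads = []    # all open conversation threads
        self.active = None   # the thread currently in focus

    def classify(self, utterance: str) -> str:
        # Toy rules; a real system would constrain the speech recognizer
        # with the utterance types expected in the current context.
        text = utterance.lower()
        if text in ("yes", "no"):
            return "yes_no"
        if text.startswith(("actually", "i meant", "no,")):
            return "correction"
        if any(w in text for w in ("book", "find", "play")):
            return "new_topic"
        return "wh_answer"

    def handle(self, utterance: str) -> str:
        kind = self.classify(utterance)
        if kind == "new_topic":
            self.active = Thread(topic=utterance)
            self.threads.append(self.active)
            return f"opened thread: {utterance!r}"
        if self.active is None:
            return "no active thread; please start a request"
        if kind == "correction":
            self.active.slots["correction"] = utterance
            return f"revised thread {self.active.topic!r}"
        self.active.slots.setdefault("answers", []).append(utterance)
        return f"added {kind} to thread {self.active.topic!r}"

if __name__ == "__main__":
    dm = DialogueManager()
    for u in ("book a flight to Boston", "yes", "actually, I meant Austin"):
        print(dm.handle(u))
```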
    Click Here to View Full Article
    The Stanford-Edinburgh breakthrough was detailed in the Sept. 2004 issue of ACM's Transactions on Computer-Human Interaction; http://www.acm.org/pubs/tochi/

  • "IT Employment on Upswing"
    InformationWeek (04/04/05); Chabrow, Eric

    Unemployment among IT professionals has fallen at a faster rate than unemployment in general, according to an analysis of recently released Bureau of Labor Statistics data by InformationWeek. The annualized rate for out-of-work IT workers was 3.7 percent for the four quarters that ended March 31, compared with 5.5 percent a year before; overall unemployment stood at a seasonally adjusted 5.2 percent last month, down from 6 percent a year earlier. The number of unemployed Americans seeking IT jobs declined to 131,000 in the most recent quarter from 149,000 in the previous quarter. Some 3.38 million Americans were employed in IT in the first quarter, an increase of roughly 39,000 workers from the end of the previous quarter and approximately 57,000 from a year before. Database administrators and computer systems administrators experienced the largest year-to-year employment growth, with increases at annualized rates of 28 percent and 19 percent, respectively; network-systems/data-communications analysts and computer programmers saw the biggest declines, at annualized rates of 7 percent and 4 percent, respectively. The size of the total IT workforce, including job seekers, increased 1.2 percent over the previous quarter to 3.51 million. Many of the fall-offs the IT workforce has experienced stem from the dot-com meltdown and a subsequent decrease in the number of students enrolling in IT and computer-science programs, as well as the migration of business technologists to other fields to cope with unemployment. Raymond James & Associates chief economist Scott Brown expects IT employment to maintain a short-term lead over other professions as businesses recruit IT workers to develop and implement new applications stemming from technology acquisitions.
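    As a rough consistency check on the figures above (assuming, though the article does not say so, that the quoted IT unemployment rate is simply the number of job seekers divided by the total IT workforce), the first-quarter numbers do line up:

```latex
\frac{131{,}000\ \text{seeking work}}{3.38\,\text{M employed} + 131{,}000\ \text{seeking work}}
  \approx \frac{131{,}000}{3{,}510{,}000} \approx 3.7\%
```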
    Click Here to View Full Article

  • "HPCC Brings 'Supercomputing to the Masses'"
    HPC Wire (04/01/05) Vol. 14, No. 13; Curns, Tim

    Intel senior fellow Steve Pawlowski will be the keynote speaker at this month's High Performance Computing and Communications Conference (HPCC), where he will discuss supercomputing's future direction, trends in processor performance, how software will exploit increasing numbers of cores, the future path of technology as Moore's Law progresses, and how his company will profit. Pawlowski says he was selected as keynote speaker because Intel aims to stimulate the high-performance computing (HPC) market via commodity processing, which will essentially bring supercomputing "to the masses." He believes that if current processing power trends continue, a blue-collar petaflop machine could emerge by 2009 and a 10-petaflop machine by 2012; "Having that level of performance capability, using common building blocks to configure the machines, may change the equation in terms of how we build high performance systems," Pawlowski reasons. Another HPCC participant is Dr. Bob Lucas, director of the University of Southern California Information Sciences Institute's Computational Science Division, who believes that the modeling and simulation of military forces (FMS) is a field that greatly needs to adapt to large scalable clusters of commodity processors. Lucas is also one of the developers of Project Urban Resolve, a Joint Forces Command initiative that will be demonstrated at the conference; the project uses various FMS software packages to simulate cities for strategic military planning, and HPC is critical for realistically modeling civilian populations. Lucas believes the government must play a key role in the adoption of HPC projects. "I think it's in our national interest to develop a new generation of machines which allow U.S. scientists and engineers to maintain preeminence for those problems which do not perform well on clusters," he says.
    Click Here to View Full Article

  • "Net Aids Access to Sensitive ID Data"
    Washington Post (04/04/05) P. A1; Krim, Jonathan

    Despite talk in Congress about regulating large data brokers, a simple Web search will turn up a dozen smaller operations where identity thieves could easily obtain Social Security numbers. Sites such as www.secret-info.com and www.Iinfosearch.com offer Social Security numbers and other sensitive personal information for about $45, after only cursory verification. A private investigator contracted by the Washington Post was able to find a reporter's Social Security number by claiming to need the information for tax-filing purposes. A federal law passed in 2001 makes it illegal to sell or transfer non-public financial records without offering an opt-out function, but loopholes exist for tax filing, employment checks, and financial transactions. ChoicePoint and LexisNexis are clamping down on sales of Social Security numbers by giving law firms, media, and private investigators only restricted access to individual profiles. Some members of Congress want to introduce new legislation that would ban outright the sale of Social Security numbers without consent. Meanwhile, the financial industry says it is moving away from Social Security numbers as a means of verification, though those numbers offer a convenience to customers. George Washington University privacy law professor Daniel Solove says the current system of using Social Security numbers as de facto identifiers is bad because identity thieves can easily obtain those numbers, and is made worse because it is so difficult to get new numbers issued.
    Click Here to View Full Article

  • "OSI Tackles License 'Explosion'"
    DevX News (04/04/05); Wagner, Jim

    Open Source Initiative (OSI) President Emeritus Eric S. Raymond says the OSI board of directors is planning an April 6 meeting with stakeholders at the Open Source Business Conference to restrain the surge of open source licenses, many of them tailored for specific corporations. The OSI has approved a total of 58 licenses so far, and Open Source Development Lab (OSDL) general counsel Diane Peters says OSDL officials would like that number to be reduced to only a handful, including the upcoming GPL 3.0. Raymond promises that the OSI board will propose a number of reforms designed to counter open source license proliferation at the meeting, without specifying what those reforms would be. Their adoption will depend on the consensus among stakeholders, and Raymond recommends that attendees who do not like the measures suggest improvements. "I think it will eventually work out because everybody involved with the problem realizes a combinatorial explosion of different licenses isn't in anyone's interests," he notes. Companies that wish to deploy open source software in their businesses are especially concerned about the growing use of customized licenses. Raymond says every license accrues a base of developers but is rejected by other developers searching for a less restrictive license for developing software tools, while corporations view license proliferation as a burden for their legal counsel, who must review each software license prior to enterprise adoption.
    Click Here to View Full Article

  • "Degrees of Change"
    CIO Australia (04/05/05); Bushell, Sue

    Schools in Australia have long eschewed business skills in favor of technology training for students in IT programs, but that has started to change as CIOs demand more IT staffers with such skills, concurrent with declines in IT program enrollments spurred by fewer job prospects for graduates. Old Dominion University information technology professor Joan Mann believes "hybrid" positions that meld strong technical skills with business acumen would allow CIOs to maintain a good relationship between users and IT, and she suggests in the Journal of Online Education that universities substantially revise their curricula and teaching techniques to support the need for hybrid employees. Her views are shared by Australian National University (ANU) master of software engineering Clive Boughton, who notes that schools must often create a multidisciplinary organization to support the demand for hybrid IT workers. IT Skills Hub CEO Brian Donovan reports an increasing demand among employers for IT personnel with business knowledge and "soft skills" ranging from teamwork to consulting to client communication, and the Australian Information Industry Association's Michel Hedley comments that universities are at least re-evaluating their curricula more frequently; meanwhile, University of Technology Sydney visiting professor of e-business Steve Burden says most faculties are making a solid effort to improve students' business skills. ANU has instituted a program to nurture both business and software engineering skills by assigning third- and fourth-year students to teams that work closely with industry on industry-sponsored initiatives. However, many graduates still lack skills in business analysis, and IT shops are trying to fill this gap through mentoring services as well as specialist IT trainers. Perpetual Trustees Australia's Richard Boyer concludes that the best CIOs can ultimately hope for from universities are graduates possessing the analytical knowledge for both business and IT technologies.
    Click Here to View Full Article

  • "Demonstrating Secure Wireless Personal Access Networks"
    IST Results (04/05/05)

    The IST-funded PACWOMAN project sought to address scalability, mobility, reconfigurability, security, and quality of service challenges that limited the viability of wireless personal area networks (WPANs). PACWOMAN's strategy was to deliver cheap and secure wireless access to users with 100 Kbps to 10 Mbps bit rates, thus becoming a core component of subsequent fourth-generation networks as a personal access network. IMEC's Fillip Louagie says three distinct network concepts were developed at the beginning of the project: "The Personal Area Network or PAN for the individual's personal access space, the Community Area Network or CAN to act as a kind of outer local or linking network, and the existing Wide Area Network [or WAN]." Before the project wrapped in February, the researchers had devised a working demo of the WPAN infrastructure that showed how the PAN connected to a WAN, and consequently to the global network infrastructures. They demonstrated how a PDA with a Bluetooth link could register through the user's WPAN with a central server on the CAN and connect with another user's personal device, as well as how service registration and authentication could be executed over the multi-network infrastructure. Louagie says the PACWOMAN researchers projected two primary functions for the PAN: Low data rate services supporting applications that usually operate at between 10 Kbps and 100 Kbps (such as location sensors), and medium and high data rate services supporting applications operating at between 100 Kbps and 10 Mbps (video streaming, for instance).
    Click Here to View Full Article

  • "Top Court Mulls P2P 'Pushers'"
    CNet (04/06/05); Borland, John

    Several Supreme Court justices debating whether peer-to-peer (P2P) file-swapping networks should be legally liable for copyright infringement committed by users appear interested in reaching a compromise that concentrates on networks that induce such infringement. Patent law defines inducement as what happens when a company develops a product and then provides instructions on how to employ it in a way that infringes on another company's patent, or promotes it in a manner that encourages infringement. "It would allow the legal test to focus on your own behavior, over which you have complete control, rather than on the action of third parties," says San Francisco copyright lawyer Annette Hurst. A similar principle underpinned Sen. Orrin Hatch's (R-Utah) Induce Act, which tech industry opponents criticized for being too broadly worded; an inducement-based concept from the Supreme Court could fall into a similar trap, resulting in lawsuits stemming from varying interpretations of "inducement." "That's my big worry, that the court wants to recognize some inducement claim, then we face another 15 years of litigation to figure out what that means," notes Electronic Frontier Foundation attorney Fred von Lohmann. Entertainment firms say the court demonstrated little warmth for file-swapping software companies in last week's hearings, where the justices seemed opposed to new P2P companies using illegal usurpation of property to acquire startup funding; however, computer industry proponents are heartened by what they saw as sympathy on the part of justices cognizant of their fears that lawsuits could impede the rollout of products with substantial non-infringing uses. The inducement-based middle ground contemplated by the justices has been endorsed by organizations such as IEEE.
    Click Here to View Full Article

  • "Grid Computing Can Allow Security Threats"
    eWeek (03/30/05); Naraine, Ryan

    At Ziff Davis Media's Enterprise Solutions Virtual Tradeshow on March 30, a panel of security experts discussed the security risks inherent in large-scale grid computing implementations. Panelists cited best practices as critical for keeping information transmitted over corporate grid systems confidential. Advanced Systems Group chief technical officer Mark Teter warned that a hacker could steal sensitive corporate data by exploiting a grid's automated resource allocation processes, and said businesses should employ encryption technologies to address this danger. Triad Information Security Services security consultant Lenny Mansell advised businesses deploying grid systems to identify important assets as well as the threats those assets face, and argued that proper classification and handling of confidential data is vital to the data's integrity, availability, and confidentiality. "Access to confidential data needs to be restricted to those with a need to know and you have to set up audit trails," he remarked. Mansell also noted that many principles of a well-coordinated information security practice apply equally to grid computing, and that an administrator's job becomes more difficult as dependence on authentication and authorization for users and groups grows. "Policies and processes must be created to address this expanded reliance on these extended models," he said, making a case for clear definitions of the limits of administrative control.
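    As a concrete illustration of the encryption and audit-trail practices the panelists recommend, the sketch below encrypts a payload before it is handed to the grid scheduler and logs who submitted which data set. The choice of Python's third-party cryptography package and the logging layout are illustrative assumptions, not tools named by the panel.

```python
import logging
from cryptography.fernet import Fernet   # third-party: pip install cryptography

# Audit trail: record who submitted which data set, per Mansell's advice.
logging.basicConfig(filename="grid_audit.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

def submit_to_grid(payload: bytes, user: str, dataset: str, key: bytes) -> bytes:
    """Encrypt data before it leaves the submitting host; log the access."""
    token = Fernet(key).encrypt(payload)
    logging.info("user=%s dataset=%s bytes=%d", user, dataset, len(payload))
    return token

def receive_on_node(token: bytes, key: bytes) -> bytes:
    """Decrypt on whichever worker node the scheduler allocates."""
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    key = Fernet.generate_key()   # in practice, distributed via a key-management service
    ciphertext = submit_to_grid(b"confidential records", "alice", "payroll", key)
    print(receive_on_node(ciphertext, key))
```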
    Click Here to View Full Article

  • "Reading Is Key to IT Innovation"
    Computerworld (04/04/05); Kolawa, Adam

    Parasoft CEO Adam Kolawa warns that many U.S. researchers and scientists are operating in ignorance of their peers' latest work, and this practice is endangering America's global competitiveness in science and technology. Reading up on colleagues' research can prevent the unnecessary reinvention of already available processes and methods, and the subsequent waste of time and resources. Kolawa thinks the root of the problem could be the transition to electronic publishing, which is driving a reduction in the publication of printed journals and magazines and causing a disruption of scientists' routine of reading the latest research on a regular basis. The problem is typical of software engineering, where early adoption of online publishing prevented the establishment of such a routine; but Kolawa is alarmed by the manifestation of this trend in other disciplines and industries. Unless this trend is halted, he writes, the United States research community will inevitably suffer a decline in its ability to innovate. "If we begin to lose the initiative in innovation and research in the material sciences, then we risk losing our position as global intellectual leader in a number of important areas, including medical research, aerospace, astrophysics, geophysics and engineering," Kolawa warns. He notes that online publishing can help solve this problem because it makes the knowledge that scientists are ignoring readily available. But the disciplines and communities that need this knowledge must train themselves to retrieve it frequently.
    Click Here to View Full Article

  • "Carnegie Mellon's Collaborative Research Is Driving Force Behind Revolutionary New Tool for Writing Software Codes"
    Carnegie Mellon News (04/01/05)

    Professor Jose M.F. Moura of Carnegie Mellon University's Department of Electrical and Computer Engineering reports that his team has developed "SPIRAL" software that automatically produces code for signal-processing applications, which could boost the speed of computer operations while reducing their cost. SPIRAL can generate code for new and old applications in 10 minutes or less, sparing end users a great deal of time, money, and frustration, Moura states. SPIRAL was employed by IBM researchers collaborating with SPIRAL team member Franz Franchetti of the Technical University of Vienna to produce a highly optimized fast Fourier transform (FFT) library for the Blue Gene/L supercomputer. "An ultra-fast supercomputer like Blue Gene/L is useless if we are not able to harness that power to run a range of applications," explains IBM Research Blue Gene systems architect Jose Moreira. "Collaborating with Carnegie Mellon and Vienna using SPIRAL has played a key role in enabling IBM researchers to optimize applications to run on the system, including life sciences applications involving molecular dynamics." National Instruments' Shawn McCaslin says SPIRAL offers a wide assortment of solutions for determining the optimal signal-processing and math functions for complicated computer deployments. Other members of the SPIRAL team include Carnegie Mellon professor Markus Pueschel, Maria Manuela Veloso of Carnegie Mellon's School of Computer Science, University of Illinois at Urbana-Champaign computer science professor David Padua, and Drexel University computer science professor Jeremy Johnson.
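    SPIRAL's output is highly tuned, platform-specific code, but a short sketch of the kind of kernel it automates may help readers unfamiliar with the domain. The plain-Python radix-2 fast Fourier transform below is a textbook formulation, shown only to indicate what an FFT library computes; it is not code that SPIRAL generates.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    # Twiddle factors combine the two half-size transforms.
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])

if __name__ == "__main__":
    print(fft([1, 0, 0, 0]))   # an impulse transforms to a flat spectrum of ones
```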
    Click Here to View Full Article

  • "Fast NASA Action Begets World's Largest Linux Supercomputer"
    Linux Insider (03/31/05); Korzeniowski, Paul

    The U.S. government's Return to Flight initiative to correct the mistakes that caused the destruction of the space shuttle Columbia over two years ago prompted the rapid installation of a 10,240-processor supercomputer, named after the ill-fated spacecraft. The project also dovetailed with NASA's agenda to install a supercomputer with 10 teraflops to 15 teraflops of data processing horsepower. The Columbia system, which was built by NASA and SGI, employs industry-standard 64-bit microprocessors running Linux, and each node has the scalability to support 256 processors with 3 TB of memory. SGI's traditional project management methods were jettisoned to accommodate the accelerated installation schedule, "because the established [techniques] would not be able to scale up and support the enormity and complexity of the tasks involved with delivering the new system," says SGI's Dick Harkness. SGI had to expand its construction process to handle six supercomputer processor nodes at once instead of the usual two, and focus on previously low-priority items such as cabling and LED assembly to get a better idea of whether its suppliers would be able to deliver needed components on time. Both SGI and NASA took steps to improve heat management for the system: SGI designed new water-cooled doors, while NASA overhauled the plumbing for its supercomputer water cooling system. Columbia can execute 42.7 trillion calculations per second on 16 to 20 systems, for an efficiency rating of 88 percent, according to the Linpack benchmark. The supercomputer can expedite modeling of the hydrogen gas flow chamber in the shuttle's propulsion system, and run new applications in earth simulation, aerospace vehicle design, and space science.
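    Assuming the 88 percent figure is the standard Linpack efficiency metric (the article does not define it), it is the ratio of the measured Linpack rate to the theoretical peak rate of the hardware benchmarked, which would put that peak at roughly 48.5 teraflops:

```latex
\text{efficiency} = \frac{R_{\max}}{R_{\text{peak}}}
  \quad\Longrightarrow\quad
  R_{\text{peak}} \approx \frac{42.7\ \text{Tflop/s}}{0.88} \approx 48.5\ \text{Tflop/s}
```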
    Click Here to View Full Article

  • "The Evolving mSpace Platform: Leveraging the Semantic Web on the Trail of the Memex"
    University of Southampton (ECS) (04/03/05); Schraefel, MC; Smith, Daniel; Owens, Alisdair

    University of Southampton researchers posit that the Semantic Web could bring the Web closer to Vannevar Bush's vision of the memex, a machine that stores all digital information and supports knowledge construction by letting a person assemble associative connections between documents. The mSpace platform facilitates such knowledge building with Semantic Web technologies, and the authors suggest that the platform can support the interaction mechanisms upon which the memex is founded. An mSpace offers a method for managing high-dimensional spaces within a two-dimensional representation, which supports a number of manipulations--sorting, swapping, addition, and subtraction. "These manipulations mean that the person can construct a representation of a space, and Bush-like pull in associations on demand, which support their interests," explain the researchers. They say the mSpace model's most innovative quality is its ability to fold spatial representations of multiple types of associated information into one context that can be manipulated to effect information-space explorations determined by users. The mSpace software framework consists of three core elements--a client, a model, and a data storage layer: It is the client's function to query the domain space, represent the results in the interface, and supply the proper manipulations for the domain; the model's job is to define the domain's available dimensions and their relationships; and the storage layer facilitates quick returns on complex queries over the data space. Deployment of a lightweight mSpace application has yielded insights on how mSpace can be implemented as a more generic spaces browser, with the ultimate goal being the development of mSpace as a generic Semantic Web browser. "The contribution of our framework approach is to provide a practical platform for hypertext exploration that takes advantage of Semantic Web protocols which let us support in the wild of the Web, Bush's sense of the way the human mind works, through human-made association," the authors conclude.
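    The column manipulations the authors list (sorting, swapping, addition, and subtraction) can be pictured with a small sketch. The class and the sample dimensions below are hypothetical, not mSpace's actual API; they only illustrate how a high-dimensional information space can be projected onto an ordered set of columns that the user rearranges on demand.

```python
class SliceBrowser:
    """Toy model of an mSpace-style column arrangement over a domain."""

    def __init__(self, dimensions):
        self.dimensions = list(dimensions)   # left-to-right column order

    def swap(self, a, b):
        i, j = self.dimensions.index(a), self.dimensions.index(b)
        self.dimensions[i], self.dimensions[j] = self.dimensions[j], self.dimensions[i]

    def add(self, dim):
        self.dimensions.append(dim)          # pull a new association into the view

    def subtract(self, dim):
        self.dimensions.remove(dim)          # drop a column no longer of interest

    def sort(self):
        self.dimensions.sort()

# Hypothetical dimensions for a music-like domain:
browser = SliceBrowser(["Era", "Composer", "Work"])
browser.swap("Era", "Composer")   # explore by composer first
browser.add("Performer")          # widen the space
browser.subtract("Era")           # narrow it again
print(browser.dimensions)         # ['Composer', 'Work', 'Performer']
```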
    Click Here to View Full Article

  • "Sounding the Alarm as Big Brother Goes Digital"
    EE Times (04/04/05) No. 1365, P. 1; Yoshida, Junko

    Barry Steinhardt of the ACLU's Technology and Liberty Program warns that the deployment of digital and radio-frequency (RF) technologies may actually increase rather than decrease society's vulnerability. "We're developing the infrastructure for the surveillance society without, at the same time, creating the chain to hold the monster," he argues. Steinhardt is alarmed at the prospect of passports equipped with RFID chips containing the bearer's personal information, which is unencrypted and thus exploitable by terrorists or identity thieves; he says the ACLU prefers more robust security measures, such as requiring passports to make physical contact with a reader. Steinhardt criticizes the lack of meaningful regulation governing the collection, usage, and storage of data in the United States, and says the Bush administration is completely uninterested in data protection. He illustrates his point by noting that the White House went to the International Civil Aviation Organization with its concept for a global e-passport standard instead of to Congress, where such a proposal would be a hard sell. The organization adopted the standard, and Bush announced the need to comply with it at the G8 summit meeting. Steinhardt argues that an ethics code for the engineering community developing the surveillance society's infrastructure, though a noble idea, would ultimately be ineffective, and concludes that legislation is the best option for protecting citizens' privacy. He also laments the country's policy of eroding constitutionally mandated fair-use rights for consumers, a trend he says is choking U.S. innovation and threatening to undermine the American economy.
    Click Here to View Full Article

  • "Moving IT Forward"
    eWeek (03/28/05) Vol. 22, No. 13, P. 47; Coffee, Peter

    Members of eWeek's Corporate Partner Advisory Board agree that enterprise IT departments are beginning to shake off the malaise of downsizing and damage control and adopt a more positive IT investment outlook. AT&T Global Networking Technology Services' Glenn Evans says his company seeks to consolidate redundant applications now that SBC Communications is poised to acquire it. IT departments that have spent the last few years investing in standards-based technologies will profit when integration and auditability are demanded of their systems. Wal-Mart and the Defense Department are pressuring their partners to embrace radio-frequency identification (RFID) tags so that workflows can be more easily tracked, and eWeek Labs recommends that mobile applications and infrastructure support real-time telepresence for users as a complementary technology for RFID. Sales and marketing, travel and leisure, and other industries are advised to investigate mobile technologies as well. Several members of the board report that highly hyped technologies characterized by sluggish traction are finally ready for prime time, among them voice over IP, open-source and thin-client technologies, and utility computing. Duke Energy desktop hardware product line manager Kevin Wilson notes that his company will be less concerned with spyware problems than previously anticipated, an attitude that illustrates a refocusing away from IT's drawbacks and toward its advantages.
    Click Here to View Full Article

  • "Grand Challenge, Take 2: DARPA Doubles Prize Money in Mojave Desert Robot Race"
    Defense News (03/28/05) Vol. 20, No. 13, P. 16; Walker, Karen

    The Defense Department has mandated that one-third of the U.S. Army's ground vehicles be autonomous and unmanned by 2015 so that battlefield casualties can be reduced. Stimulating the development of robot vehicle technologies to fulfill this mandate is the goal of the Defense Advanced Research Projects Agency's (DARPA) Grand Challenge. The first Grand Challenge contest offered $1 million to the team whose vehicle successfully traversed an off-road course in a certain amount of time with no human assistance, but none of the entries completed the race. DARPA has raised the ante by another $1 million for this year's competition, in which participating vehicles will be required to negotiate an unforgiving 175-mile Mojave Desert trail in less than 10 hours. "It's very difficult to build a robotic vehicle that can avoid obstacles and travel the Grand Challenge course at the speeds necessary to win the prize," notes DARPA Grand Challenge program manager Ron Kujanowicz. About 200 teams have entered the Grand Challenge 2005, including all of last year's finalists. Returning teams include Team TerraMax, a joint venture between Oshkosh Truck, Italy's University of Parma, and Rockwell Collins. Last year's Team TerraMax entry, which fell out of the race due to a software malfunction, was a modified Marine Corps Medium Tactical Vehicle Replacement truck. "The hardest thing about this challenge is the technology to negotiate unplanned bumps or obstacles," remarks Rockwell Collins senior director Mike Myers; Oshkosh's John Schwartz reports that the 2005 entry will incorporate Rockwell Collins autopilot technology.

  • "The Challenge of the Decade"
    Campus Technology (03/05); Calhoun, Terry

    As university libraries move toward digitizing their entire collections, digital rights management (DRM) is a key issue that requires planning input from library officials, IT specialists, faculty, students, and even members of the general public, writes Society for College and University Planning communications director Terry Calhoun. Just a few years ago, university librarians and other officials worried over student and faculty preference for the Internet over print resources; now, instead of fighting the trend toward Internet search, leading schools are seeking out ways to open up their print holdings. For example, the University of Michigan has signed a deal with Google to digitize its entire collection of more than 7 million volumes in order to make those print resources more available and easier to use; the project will take six years, and the first digitized resources will not come online for approximately 18 months. University of Michigan associate university librarian John Wilkin says an entirely new DRM approach is needed to deal with intellectual property issues when the entire collection is made available over the Internet. Though Wilkin is uncertain about what that DRM schema will eventually look like, he says the near- and mid-term solution is a "digital rights matrix" that clarifies access and use privileges for different users; for its part, Google plans to provide complete access to texts out of copyright while providing short quotes and citations for materials still under copyright. Other universities are taking a more conservative approach to digitization, such as Harvard University's project to digitize only 18th-century works. An effective DRM solution for university libraries will require input from various departments and disciplines, and must be free of excessive technological baggage. Calhoun suggests that rather than "management of rights," DRM for digitized university collections should mean "digital management of rights."
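    Wilkin's "digital rights matrix" can be imagined as a lookup from a work's copyright status and a reader's category to an access level. The categories and levels in the sketch below are hypothetical, loosely patterned on the Google arrangement described above (full text for out-of-copyright works, short quotes and citations otherwise); they do not represent an actual library policy.

```python
# Hypothetical rights matrix: (copyright status, reader class) -> access level.
RIGHTS_MATRIX = {
    ("public_domain", "general_public"): "full_text",
    ("public_domain", "affiliated"):     "full_text",
    ("in_copyright",  "general_public"): "quotes_and_citations",
    ("in_copyright",  "affiliated"):     "quotes_and_citations",
}

def access_level(copyright_status: str, reader_class: str) -> str:
    # A real matrix would cover more statuses (orphan works, licensed
    # databases) and more reader classes; unknown pairs get the minimum.
    return RIGHTS_MATRIX.get((copyright_status, reader_class), "citation_only")

print(access_level("public_domain", "general_public"))   # full_text
print(access_level("in_copyright", "general_public"))    # quotes_and_citations
```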
    Click Here to View Full Article


 