Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 643:  Wednesday, May 12, 2004

  • "Drowning in Spam: New Techniques Focus on Identifying Senders"
    Pittsburgh Post-Gazette (05/10/04); Spice, Byron

    Internet technologists and businesses are working on authentication schemes that would dramatically reduce spam email, which makes up more than half of all messages sent today and imposes an enormous burden on companies and users. A survey from the Pew Internet & American Life Project showed that nearly 30 percent of email users use the medium less often because of spam. So far, authentication seems to be the only practical solution to the spam problem: e-postage concepts are largely unworkable, filters are not powerful enough, and legislation is likely to do little but push spam operations overseas. Three major contenders for authenticating email are now being studied by the Internet Engineering Task Force: Sender Policy Framework (SPF), developed by pobox.com co-founder Meng Weng Wong; Microsoft's Caller-ID for email; and Yahoo!'s DomainKeys encryption method. SPF is the simplest and is set to begin operation in two or three months, according to Wong; email domain holders would register with SPF and identify their sending machines so that ISPs could check whether the domain used is authentic and matches the email's return address. Microsoft's Caller-ID for email focuses on the return address instead of the domain, but requires more in-depth inspection of the email message. Yahoo!'s DomainKeys, currently used internally, involves a more complicated encryption scheme that adds a digital signature to each email. All three systems could work together, though DomainKeys would be the best solution, even if it carries the most overhead. Though the methods are likely to be employed on a voluntary basis, experts say the basic anonymity of email remains important because it protects whistle-blowers and others who legitimately want to hide their identity.
    Click Here to View Full Article
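    A back-of-the-envelope sketch of the SPF idea described above: a domain publishes the machines allowed to send its mail, and a receiving server checks the connecting host against that list. The record table and lookup below are simplified stand-ins for the real DNS-based protocol, not its actual record format.

```python
# Toy model of SPF-style sender authentication.
# Real SPF publishes its policies in DNS TXT records; here a dict
# stands in for the DNS lookup, mapping a domain to the IP addresses
# it has declared as legitimate senders.
SPF_RECORDS = {
    "example.com": {"192.0.2.10", "192.0.2.11"},
    "pobox.com": {"198.51.100.5"},
}

def check_sender(return_path_domain: str, connecting_ip: str) -> str:
    """Return 'pass', 'fail', or 'none' for a claimed sender domain."""
    allowed = SPF_RECORDS.get(return_path_domain)
    if allowed is None:
        return "none"   # domain publishes no policy; nothing to verify
    return "pass" if connecting_ip in allowed else "fail"

print(check_sender("example.com", "192.0.2.10"))   # listed host -> pass
print(check_sender("example.com", "203.0.113.9"))  # forged sender -> fail
```

    A 'fail' result lets the receiving ISP reject or flag the message before any content filtering runs, which is what makes sender authentication cheaper than filtering alone.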

  • "Code That Kills, For Real"
    Salon.com (05/11/04); Rosenberg, Scott

    Software quality is critical to virtually every aspect of modern life, but code quality takes on life-or-death importance in military applications. The Pentagon's Systems and Software Technology Conference examined software development problems unique to the U.S. military, noting that military applications often take up to a decade to complete and are massive in scale. Speakers at the conference fell into two camps: those who adhere to past practices of rigorous documentation and testing, and those who embrace private-sector solutions. With the Defense Department changing its overall operational strategy to become more agile and responsive, and the increased burden on software developers to write code, many Pentagon leaders feel it is critical to make use of standard technologies such as Web services and XML; Navy CIO David Wennergren addressed Defense Department IT contractors and vendors directly, telling them that IT projects going forward would focus on interoperability and Web services. Suppliers that did not adjust to the new paradigm would be dropped, no matter how much the government had spent on their products. In the context of a war situation, however, the software quality problems that plague business applications pose an even greater threat: critics of the COTS (commercial, off-the-shelf systems) approach warned that there are hidden costs and dangers to a COTS-only mentality. Even commercial products must be thoroughly understood, and the military will have to build interfaces and do its own testing, said Randall Jensen of Hill Air Force Base's Software Technology Support Center. Also, the megaprojects the Defense Department undertakes are so large that their parameters and context inevitably change over their course; ensuring software quality in this situation requires perseverance and flexibility even as hardware, software, and personnel change.
    Click Here to View Full Article

  • "Princeton's Image-Smashing Math Whiz"
    Business Week (05/12/04); Holahan, Catherine

    Maria Klawe, dean of Princeton University's School of Engineering & Applied Science and president of the Association for Computing Machinery (ACM), challenges stereotypical perceptions of mathematicians, not least by being a woman. This places her in a unique position to encourage women and minorities to pursue careers in computer science, engineering, and mathematics. In addition to faculty stints at Oakland University, the University of Toronto, and the University of British Columbia, Klawe set up and managed the Discrete Mathematics Group at IBM's Almaden Research Center, and co-founded the Committee on the Status of Women in Computing Research, which holds mentoring workshops, conferences, and networking events to boost the status and population of the female tech workforce. Klawe's other notable achievements include founding the Electronic Games for Education in Math and Science project, which develops video games designed to help girls overcome their aversion to math. She also launched Silicon Chalk, a company that sells software educators can use to transmit classroom lectures and supplemental data to students' PCs. Klawe is the fourth woman to serve as ACM president, and ACM executive director John White says, "Her biggest concern is the lack of women in the field of computing technology." Her efforts have borne fruit: the National Science Board estimates that women currently earn around 18 percent of all undergraduate engineering degrees, compared with 9 percent in 1979. Klawe plans to have Princeton's engineering program offer more classes to students outside the scientific fields, as well as devise more seminars that inspire students to address real-world rather than theoretical problems.
    Click Here to View Full Article

  • "More Cash Flowing to Robotics Research"
    Associated Press (05/11/04)

    The U.S.'s leading universities in robotics research are experiencing a big boost in federal funding, much of it from the Defense Department. The funding boost coincides with falling prices for many of robotics' key technologies, including charge-coupled devices, sensors, software, and microprocessors. The Pentagon is significantly raising its spending on robotics research and development in an effort to reduce human battlefield casualties through the deployment of autonomous air, land, and sea vehicles. The Defense Department is required by Congress to have one-third of all military ground vehicles unmanned by 2015, while its R&D budget for unmanned aerial vehicles is expected to exceed $10 billion through 2010. Chuck Thorpe, director of Carnegie Mellon University's Robotics Institute, reports that federal funding has skyrocketed 117 percent in the last decade, and lists the Defense Department as a primary source of last year's $24.8 million in research funding. The California, Georgia, and Virginia institutes of technology report that federal robotics research sponsorship has leaped by 50 percent or more in recent years. The Defense Advanced Research Projects Agency's (DARPA) Jan Walker says her organization is sponsoring over 40 robotics projects, among them a Caltech neuroprosthetic effort to enable people to control machinery by thought, and a Carnegie Mellon initiative to develop an all-terrain combat vehicle that does not require a human operator. Experts caution that federal sponsors may place unrealistic expectations on robotics; one example is the recent 150-mile desert race between autonomous vehicles hosted by DARPA, in which the farthest distance traveled was approximately seven miles.
    Click Here to View Full Article

  • "MIT Aims for the Bottom Line"
    Wired News (05/11/04); Baard, Mark

    MIT's Media Lab announced its consumer electronics lab (CELab) program on May 10; Media Lab founder and chairman Nicholas Negroponte said researchers are returning to their roots by focusing on easy-to-use, fun consumer goods. CELab is not an actual facility, but an umbrella term for numerous research projects at the Media Lab and Media Lab Europe. Negroponte said engineers have lost their way by designing overcomplicated consumer products, and CELab will team up with industrial designers and architects to simplify such items and make them more engaging for users. Among the projects CELab will concentrate on is a collaboration between architect Frank O. Gehry and William Mitchell's Smart Cities team to design a smart car that can warn drivers of obstacles up ahead. CELab's corporate sponsors will pay up to $200,000 each in annual membership fees, and in return receive the right to license the lab's intellectual property, as well as be permitted to join CELab's steering committee and work with Media Lab researchers. At the meeting where the CELab announcement was made, Atari and Chuck E. Cheese founder Nolan Bushnell recommended that engineers keep the bottom line in sight, and design applications that fulfill consumer wants instead of needs. Mitchel Resnick, director of the Media Lab's Lifelong Kindergarten group, was not concerned that CELab will divert funds away from more altruistic projects: "As we develop things for the masses, there will be offshoots for the fringe groups," he insisted. It is a particularly turbulent time for the Media Lab, which has suffered layoffs and a funding shortage of late.
    Click Here to View Full Article

  • "AU Computer Program Lures Blacks, Women"
    Birmingham News (05/10/04); Spencer, Thomas

    Alabama's Auburn University supports the highest concentration of black computer science graduate students and faculty in the United States, and boasts a large number of women enrolled in its computer science graduate program; figures such as these attracted the interest of the National Science Foundation (NSF), which will study the school's successful recruitment and retention of women and minorities in an effort to apply the model to other universities. Since the late 1990s, more than 50 percent of computer science and engineering graduate students in the United States have been foreign-born, but enrollments have been declining for a number of reasons, including increased visa restrictions after 9/11, better-established overseas universities, and increased offshore outsourcing. Human-Centered Computing Lab director Juan Gilbert, a black AU faculty member, believes that America stands to lose its technological leadership unless more women and minorities are brought into the computer science and engineering field. "Diverse backgrounds yield diverse minds, which yield diverse solutions," he attests. The NSF estimates that roughly 9,000 computer science graduates, approximately 100 of them black, were produced by U.S. universities between 1991 and 2000, while only about 150 blacks are taking computer science doctoral courses today. There are currently just eight black computer science doctoral students enrolled at Auburn; almost 9 percent of all black computer science doctorates in the past five years graduated from Auburn. Forty-three percent of Auburn's computer science Ph.D. candidates are female, while nationally women account for fewer than 20 percent of computer science graduates and 14 percent of faculty. Auburn's success in recruiting and holding onto minority computer science and engineering students is primarily attributed to its concentration on people rather than just technology.
    Click Here to View Full Article
    (Access to this article is free, but first-time visitors will need to enter their zip code, year of birth and gender.)

  • "Good Thinking: Boffins Wow With 'Psychic' PC"
    Macworld UK (05/10/04); Sze, Tan Ee

    Singapore's Institute for Infocomm Research (I2R) is conducting research at its NeuroInformatics Lab into improving brain-machine interface (BMI) technology--specifically, thought-controlled computers--through its NeuroComm platform. NeuroComm is built from scratch on a Windows platform, and brings together expertise in areas such as pattern recognition and signal processing to support data acquisition, brain signal analysis, classification, and graphic display. I2R executive director Lawrence Wong says BMI could help severely disabled people lead fuller lives, and "adds a new dimension to existing multi-modal human-computer interface, incorporating machines such as computers, robots, and other devices into 'neural space' as extensions of our muscles or senses." Unlike earlier BMI research, the institute is focusing on a non-invasive approach to acquiring and analyzing neural signals, which will be integrated with electroencephalography to make future BMI systems easier to use and more reliable. NeuroInformatics Lab director Guan Cuntai says that NeuroComm will enable I2R to undertake new cognitive research, confirm new user paradigms, construct demos, and create applications. BMI research faces several hurdles, some of which I2R is working to overcome with the NeuroComm platform, built using C/C++ and thus portable to Linux and Unix. Among them: being able to consistently acquire and accurately process brain signals despite a constantly fluid brain state during cognition.
    Click Here to View Full Article

  • "The Quest to Find a Better Search Engine"
    Financial Times (05/12/04) P. 9; Waters, Richard

    The tremendous success of Google has raised the bar for what people expect from search engines, and companies worldwide are struggling to deliver. The search engine's growing importance reflects the proliferation of "unstructured" data, where critical corporate information increasingly resides. The Google scheme, in which Web pages are ranked by popularity based on how many links they have, does not generally work well for companies, whose data access requirements are very different. Furthermore, successful corporate search engines must be able to extract data from documents rendered in hundreds of distinct formats while maintaining a high degree of accuracy. Specialist search engine companies are focusing on the development of new algorithms that can better evaluate relevance, and Verity CTO Prabhakar Raghavan explains that various techniques for assessing relevance must be combined for searches to generate better results. Innovative technological approaches being applied to the search engine problem include the creation of taxonomies that use automated classification to impose a basic structure on unstructured data. Another area of focus is personalization engines that anticipate the kind of results a user might want, based on their role within a company. FAST's Ali Riaz observes that most search engine users fail to specify their search requests: "Human beings are very lazy about giving information, but they're very greedy about taking it," he notes.
    Click Here to View Full Article
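    The link-popularity ranking that the article contrasts with corporate search needs can be illustrated with a textbook PageRank iteration; this is a minimal sketch of the published idea, not Google's production system.

```python
# Minimal PageRank power iteration over a tiny link graph.
# `links` maps each page to the pages it links to; a page's rank is
# the stationary share of visits by a random surfer who follows a
# link with probability `damping` and jumps anywhere otherwise.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:                    # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:                           # share rank across outlinks
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "c": it collects links from both a and b
```

    Corporate documents rarely link to one another, which is exactly why this popularity signal transfers poorly to enterprise search and why vendors combine other relevance cues instead.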

  • "E-Serenity, Now!"
    Christian Science Monitor (05/10/04) P. 11; Paton, Dean

    Information environmentalists, while not yet coalesced into a movement, are developing pointed arguments for limiting the amount of information people are exposed to and take in: A recent conference in Seattle focused on the issue, where University of Washington professor David Levy, a former researcher at Xerox PARC, said much of the information we are exposed to today is meaningless and numbs our ability to perceive truly important messages. Former Xerox PARC director John Seely Brown said the context of information is critical, and that trusted, concrete sources of information offer the best context--regularly read newspapers, face-to-face conversations, and good books. "Our current forms of media are creating mushy minds," he argued at the conference, titled "Information, Silence, and Sanctuary," and underwritten by the MacArthur Foundation and the National Science Foundation. "We have to question what the meaning of progress is. Not all forms of technology lead to progress." In Brown's view, new information sources such as Web search engines, cell phones, and even online newspapers are too focused on efficiency and not focused enough on effectiveness; media needs to be designed for effectiveness, he said. Other conference participants said they use triage to sort out the bulk of daily information, including non-substantive media such as television and instant messaging, and still others try to identify the purpose behind messages and reject those that imply a person is insufficient and needs to buy something. Levy said modern technologies should be seen as part of a continuous evolution of human communication, explaining that there are important historical, cultural, and spiritual values that need to be included in modern media. Levy keeps the Jewish Sabbath with his wife, which means no computer use for one day.
    Click Here to View Full Article

  • "Computer Chip Noise May Betray Code"
    New Scientist (05/11/04); Knight, Will

    Researchers Adi Shamir and Eran Tromer of Israel's Weizmann Institute have determined that eavesdroppers could theoretically break encrypted computer messages by monitoring the noise generated by computer chips. The researchers sampled the high-frequency audio emitted by computer central processing units (CPUs) in a recording studio, and found that the frequency of the sound correlated with specific cryptographic keys. Furthermore, measuring the duration of certain noises allowed the researchers to extract the length of a string of characters. On a Web page detailing their research, Shamir and Tromer indicate that "preliminary analysis of acoustic emanations from personal computers shows them to be a surprisingly rich source of information on CPU activity." Cambridge University researcher Markus Kuhn is impressed with these findings, as is Peter Honeyman at the University of Michigan, who notes that other sounds produced by the computer do not appear to affect the technique. Kuhn points out that others have analyzed how data about cryptographic activity can be remotely extrapolated by examining chips' power-supply fluctuations.
    Click Here to View Full Article

  • "Eye-Tracking Devices Full of Possibilities"
    Charleston Post and Courier (05/10/04); Conover, Daniel

    Eye-tracking technology has been used by researchers to study how people process visual information since the 1970s, but scientists such as Andrew Duchowski, an assistant professor of computer science at Clemson University, envision even more practical applications. Among the possibilities being explored: Images and advertisements that use eye-tracking and biometrics to recognize and address the observer; sophisticated computer interfaces featuring a significantly faster mouse that tracks the user's eye movements to anticipate his needs; the real-time linkage of brain-scan and eye-track data, which could lead to more effective ways of educating learning-disabled students; and monitoring systems that keep track of how much work employees are actually doing. With funding from the National Science Foundation, Clemson is collaborating with Greenville Technical College on a project to employ eye-tracking technology in tandem with virtual-reality simulators used to train aircraft mechanics. Another Clemson project focuses on enhancing the experience of Collaborative Virtual Environments so participants can direct their attention around the environment as they would in a real-life conference room. Perhaps the biggest challenge eye-tracking developers face is enabling the technology to rapidly, inconspicuously, and reliably lock on to and track the human pupil. Duchowski projects that such calibration will most likely become automatic, but only after the cost of eye-tracking systems has fallen considerably. He forecasts that practical eye-tracking applications are at least five years off.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "The Unfolding Saga of the Web"
    Wired News (05/12/04); Delio, Michelle

    Carnegie Mellon consulting IT professor Stuart Feldman directs a team of young IBM technologists whose task it is to forecast the evolution of the Internet and its effects on people's lives 10 and 20 years hence. The professor, who heads IBM's Internet Technology division and will co-chair the upcoming WWW2004 conference, likens the Internet's current developmental state to that of young adulthood: More specifically, he calls the Web "a bratty adolescent" and expresses the need for a more mature, "civilized" Web. Feldman wants the Web to become more serious while not losing its fun aspects, which will be necessary as society comes to rely on the Web more and more. He predicts that the Web will change dramatically in the next decade with the advent of services, but there will also be a return of sorts to the original Arpanet scheme as the sharing of applications comes to the fore. Feldman admits that people will surrender a certain degree of control over their personal data for the sake of convenience, but adds that privacy compromises will be made on an individual basis. The IBM researcher maintains that experts should be chiefly concerned with security, and projects that more security services will emerge as connectivity proliferates and more and more critical data migrates to the Web. Feldman postulates that the future Web will be divided into safe and not-so-safe zones so Web surfers will have a better understanding of where they are roaming and the risks they may be taking. In his opinion, the Web's growth up to now has been important and generally positive, but he stresses that "it's not time for the confetti and streamers to drop out of the ceiling yet."
    Click Here to View Full Article

  • "Open Source 'Blending' Into Animation"
    InternetNews.com (05/07/04); Kerner, Sean Michael

    The latest version of the fully integrated Blender 3D graphics creation suite could help open source software penetrate the animation segment, especially now that the suite includes features and enhancements that were previously unavailable because they were developed for Blender's earlier proprietary, closed-source incarnation. Blender3D version 2.33 enables modeling, animation, rendering, post-production, game creation and playback, and real-time interactive 3D, and the program is compatible across multiple platforms. Original Blender programmer and Blender Foundation founder Ton Roosendaal says the software is in a very youthful stage, and is going through the classic cycle of open source projects. "I see there is interest from companies now in getting paid support, so for the Blender Foundation the aim is to build a good support network for this," he comments. MovieEditor.com founder Robin Rowe, who also leads the open source CinePaint project, says open source software has made significant inroads into the film industry, even though proprietary software such as Maya has held a dominant position in animation. He says Linux is now the leading operating system used by major studios' animation and special effects units. However, Jupiter Research analyst Michael Gartenberg sees Blender support as limited beyond a certain point. "At the end of the day for many companies, especially when they are building applications on top of the environments that Blender is providing, those folks are going to want and need a full level of tech support that just isn't available from an open source product," he asserts.
    Click Here to View Full Article

  • "ASU Advance Could Provide Insight Into Human's Ability to Recognize Patterns"
    ScienceDaily (05/11/04)

    An Arizona State University (ASU) research team led by math and electrical engineering professor Ying-Cheng Lai has devised a mathematical and computational model for oscillatory associative memory networks, a breakthrough that could shed new light on how humans process and recognize patterns and lead to advances in artificial intelligence. Oscillatory memory networks are the mechanism behind the human brain's ability to register, record, and recall patterns such as faces. Lai says these systems, whose individual elements can freely flip between states, give the brain the edge over computer memories' binary number system. The ASU team's model for oscillatory networks could be deployed through the use of electronic circuits serving as phase-locked loops. The research's immediate robotic applications could include the creation of artificial memory devices that employ strong, secure oscillators. This could give robots the ability to identify patterns and perform on-the-fly reasoning as they carry out their duties, widening the scope of unanticipated scenarios they could respond to and leading to smarter robots.
    Click Here to View Full Article
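    The ASU model itself is not detailed in this digest, but the general idea of an associative memory, stored patterns acting as attractors that noisy inputs settle back into, can be shown with a classical Hopfield network. Note this is a standard textbook analogue for illustration, not the oscillatory phase-locked-loop scheme the ASU team proposes.

```python
# Classical Hopfield associative memory over binary (+1/-1) units.
# Training stores patterns via the Hebbian outer-product rule;
# recall repeatedly updates every unit to the sign of its input field,
# so a noisy pattern settles into the nearest stored pattern.
def train(patterns):
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, steps=10):
    n = len(state)
    for _ in range(steps):      # synchronous updates
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

stored = [[1, 1, 1, -1, -1, -1], [1, -1, 1, -1, 1, -1]]
w = train(stored)
noisy = [1, 1, -1, -1, -1, -1]   # first stored pattern with one bit flipped
print(recall(w, noisy))          # settles back to [1, 1, 1, -1, -1, -1]
```

    Where Hopfield units flip between two discrete states, the oscillatory networks described above let each element vary continuously in phase, which is the property Lai's group argues gives the brain its edge over binary memories.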

  • "Wireless PDAs and Smartphones: A Hacker's Heaven"
    TechNewsWorld (05/07/04); Germain, Jack M.

    Wireless PDAs, Wi-Fi devices, and smart phones are posing security issues, even as they allow users to receive email and text messages while on the go. The devices are open to viruses and password theft, as well as spam, and security experts say they lack the antivirus protection, encryption programs, and message-filtering software of their desktop counterparts. The Protect Data Group says that roughly half of the data on corporate handhelds has no security protection, and 25 percent of enterprise users recently polled said their PDAs had been stolen or lost. A Pointsec Mobile Technologies survey of business executives showed that 81 percent considered the data on their PDAs valuable, but most were not worried about security issues; however, almost 75 percent were interested in security systems for the devices. The devices are becoming more popular for data storage and telecommunications, but AirPrism founder and CTO Kap Shin says that security issues differ for enterprise and consumer usage, and that software encryption solutions are available for enterprise use but not yet for consumers. Solutionary's Jeffrey Guilfoyle believes that security should be a core component of infrastructure and not left up to individual vendors, as was the case with the PC industry earlier. He says security needs to become an issue of shared infrastructure and protocols for telecommunications providers. He says, "These are all real-world scenarios. Sending emails from phones and PDAs can become infected. This is the same code used on computer infections."
    Click Here to View Full Article

  • "Computational Origami"
    Computerworld (05/10/04) Vol. 32, No. 19, P. 26; Brewin, Bob

    A handful of mathematicians who are also origami experts could drive the next industrial design innovation or help explain how proteins fold. Industrial consultant Robert Lang, who also considers himself a full-time origami artist, creates software algorithms that automate the folding design process; such programs can be used to figure out how to fold equipment or devices with large surface areas. Lang recently helped EASi Engineering in Germany develop a better airbag design, modeling the airbag first as an origami problem. His TreeMaker software allows people to do the same work, though he says users must first understand paper-based origami in order to fully utilize TreeMaker's capabilities. Only a handful of people actually use TreeMaker because of the mathematical, computational, and engineering skills required, Lang says. MIT professor Erik Demaine, another expert applying mathematics and origami to complex problems, is studying protein folding, especially why proteins misfold and cause disease; in the future, a comprehensive understanding of protein folding could allow scientists to create their own disease-fighting proteins. IBM Research computational center manager Ajay Royyuru says supercomputers will need at least one petaflop of computing power to comprehensively model protein folding. IBM plans a 360-teraflop machine by 2005, but Royyuru does not know when a 1-petaflop computer will be available--at that point, computational origami may help scientists unfold current mysteries.

  • "From Grid to Growth"
    InformationWeek (05/10/04) No. 988, P. 51; Ricadela, Aaron

    North Carolina plans to recover from massive layoffs in its chief industries by investing in grid computing and other technologies with the goal of accelerating research and development, cutting costs, and creating new jobs. Former Sun Microsystems grid computing executive and supercomputer researcher Wolfgang Gentzsch joined high-tech incubator MCNC to direct the development of an $8 million "North Carolina state grid" with the aim of relocating underused supercomputers to university campuses so that academics and industrial researchers can access their computing power. MCNC is also funding the development of three more supercomputer clusters scheduled to go online this month at the University of North Carolina at Chapel Hill and North Carolina State; an additional three are planned for late summer. "They're trying to market supercomputing as a resource, rather than a sleepy little computing center stuck over there on Cornwallis Road," notes Research Triangle Park Foundation VP Gary Shope. Meanwhile, former director of the University of Illinois' National Center for Supercomputing Applications Dan Reed accepted a $3 million endowed professorship at UNC and launched the Renaissance Computing Institute to foster the development of more innovative products by providing cutting-edge computing technologies for industry and the humanities in the state. The Research Triangle Park Foundation announced back in March that it would invest $5 million over the next five years to create 100,000 high-tech, nanotechnology, food science, and drug research jobs. UNC professor of technology management Al Segars cautions that intellectual property disputes between academics, venture capitalists, and entrepreneurs could hinder the rollout of innovative products developed with the use of more-accessible supercomputers.
    Click Here to View Full Article

  • "Clearance Needed"
    eWeek (05/03/04) Vol. 21, No. 18, P. 29; Carlson, Caron

    The need for IT workers with security clearances has grown over the past few years, but demand exceeds supply. The federal government wants to break down the information "stovepipes" among agencies and integrate their networks, but this requires outside expertise with security clearance, and the clearance process is currently backlogged. Previously, no security clearance was needed for mid-level federal IT jobs, but today most entry-level jobs at many government agencies require clearance; as a result, hundreds of IT jobs remain unfilled. The technology industry wants the clearance process simplified and has suggested the use of private-sector adjudicators and a system that lets agencies and contractors share cleared workers. Because the clearance process can take as long as two years, many federal contractors are using recruitment companies to find cleared personnel. One such company, IT consultancy Orizon, uses Comsys Services division Secure IT, which specializes in finding cleared IT professionals and usually recruits military personnel who have already been vetted. IT workers who already have basic clearances usually garner 5 percent more in salary, while top-secret clearances add as much as 20 percent to salaries, or 30 percent for those willing to take a polygraph test, says Secure IT president Bob Merkl. Merkl says even people who have led "choirboy lives" can take a long time to get cleared.
    Click Here to View Full Article

  • "Decoding Information-Worker Productivity"
    Optimize (04/04) No. 30, P. 66; Davenport, Thomas; Conway, Susan; Vey, Meredith

    The Information Work Productivity Council (IWPC) is attempting to measure the effects of IT on information-worker productivity by studying three business processes that are relatively unstructured and carried out by autonomous knowledge staff, but are still critical to business success and growth: Two of the processes, new-product development and product management, are specific, while the third, management of personal information and knowledge at work, is generic. The council performed a phone survey of 21 information and knowledge managers in large companies and a Web survey of over 500 U.S.-based information and technology users, and concluded from the data that information workers spend over three hours every workday performing information-intensive tasks. Moreover, there are few information workers with the training or knowledge to carry out these tasks efficiently and effectively. Information managers were distributed in roughly equal proportions among three categories of personal information management competence: Those in leading-edge companies were already assigning value to individual information such as email and instant messaging; the second group was confronted with similar productivity challenges but still lacked a holistic response strategy; and the third group had yet to consider individual productivity as a corporate issue, and support to individual users was split across various technologies. Most Web survey respondents were attempting to restrict or control personal information by employing such measures as frequent email checks and the deletion of generic messages, while many were actively pursuing information management to avoid information overload. 
Research conducted by IWPC last year discovered three key things about how IT affects the productivity of virtual teams: IT improves collaboration between virtual product- and service-management teams; virtual teams face interpersonal, logistical, and infrastructure and support challenges when it comes to global collaboration; and global collaboration needs improved IT support, not just better practices, to maximize success.
    Click Here to View Full Article