ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 655: Friday, June 11, 2004

  • "League of Women Voters Is Split on Paperless Computer Voting Systems"
    New York Times (06/11/04) P. A22; Konrad, Rachel

    Hundreds of members of the League of Women Voters are infuriated with their national leadership's refusal to advocate their demands for the inclusion of a paper-based audit trail in electronic voting systems, and some local chapters are threatening to oppose the group's national position at the league's Washington, D.C., convention this weekend. In 2003, league leaders declared that paperless voting terminals were reliable and officially stated that vote thievery was beyond the systems' capabilities, but this did not pacify many members who believe that paperless systems make legitimate recounts impossible. League President Kay Maxwell says paperless terminals are beneficial to the blind, the illiterate, and non-English-speaking voters, and adds that calling for a paper trail so close to the presidential election would force hundreds of U.S. counties to scramble to upgrade their systems at a cost of millions of dollars. She argues that the possibility of hackers exploiting paperless machines is a less serious threat than low voter turnout and problems with voter registration. League member Kim Alexander calls the league's official support for paperless systems a major barrier to voting reform. Some local chapters warn that they will express their opposition to the league's national posture by nominating new board members and a new candidate for president who endorses paper trails. One such contender is former ACM President Barbara Simons, who claims the league's stance shows that the organization is not in sync with younger voters who are experienced with computers and are familiar with their inherent risks.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

    Barbara Simons is currently co-chair of ACM's U.S. Public Policy Committee (USACM). To learn more about USACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "A Hunt for the Haphazard"
    Financial Times (06/11/04) P. 9; Whitfield, John

    The generation of random numbers is an essential ingredient to a wide range of computing applications, including data encryption and complex system simulation, but such a task is hindered by the fact that computer operations depend on rules that obviate randomness; "The problem is how to get something that's designed to behave precisely to do something unexpected," notes computer security consultant Landon Curt Noll. The cryptography industry is the most demanding consumer of random numbers, and University of Geneva physicist Gregoire Ribordy explains that a random number makes an ideal encryption/decryption key. Most computers currently rely on mathematical operations that generate strings of numbers that only have the semblance of randomness. One solid technique for producing random numbers is to feed data that comes from external physical processes into the computer. Noll and colleagues have invented random number generators that operate on this principle: One generator draws on the unpredictable bubbling of a lava lamp, which is captured on camera, while another produces numbers based on the electronic noise inside a Webcam with a lens cap on. Computer scientist Mads Haahr at Dublin's Trinity College has created a device that generates random numbers from the white noise of a radio tuned to an empty wavelength, and visitors to Haahr's Web site can freely download these numbers. Noll predicts that random number generators will eventually become built-in components of PCs and cell phones--as a matter of fact, included in the next generation of Pentium chips is a device that creates random numbers from noise produced by its own diodes.
    Click Here to View Full Article
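
    The physical-source approach Noll describes has a classic software counterpart: even a biased physical bit stream can be whitened before use. The sketch below (Python, with a simulated noise source standing in for the lava-lamp camera or capped Webcam) applies John von Neumann's debiasing trick; the 70/30 bias is an arbitrary illustration, not a figure from the article.

```python
import random

def physical_noise_bits(n):
    """Stand-in for a physical entropy source (camera or diode noise).
    Simulated here with a biased coin to show why debiasing matters."""
    return [1 if random.random() < 0.7 else 0 for _ in range(n)]

def von_neumann_debias(bits):
    """Turn a biased-but-independent bit stream into unbiased bits:
    read bits in pairs, emit the first bit of each unequal pair (0,1)->0,
    (1,0)->1, and discard the equal pairs (0,0) and (1,1)."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = physical_noise_bits(100_000)
clean = von_neumann_debias(raw)
# The debiased stream is much closer to 50/50 than the raw stream,
# at the cost of discarding most of the input bits.
```

    The trade-off is throughput: a heavily biased source yields few output bits, which is one reason hardware generators like the Pentium diode-noise device are attractive.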

  • "User Exchanges: It's Good to Share"
    CNet (06/10/04); Becker, David

    Only a few software vendors allow user exchanges, despite the added value those forums bring to their product. User exchanges for applications such as Macromedia's Dreamweaver and Adobe Photoshop provide users with free tools to customize their applications and share those improvements with the rest of the community, if they wish. Proponents say the freedom to create and share enhances the product's value, as well as provides rich feedback to the vendor; but because of unfounded legal concerns and unclear financial payback from supporting those forums, many software makers have restricted user exchanges. Microsoft, for example, mostly restricts add-ons for its popular Office and Windows products to only licensed professional developers, which Softletter editor Jeffrey Tarter says represents a tremendous lost opportunity for both the company and its large user base. Users cite the sense of community and altruism as their reason for sharing code, which can range from custom Web page buttons to Photoshop filters to scripts that automate certain tasks. IBM Lotus lead developer Craig Lordan says the Sandbox user exchange, where Lotus users share experiences and customized add-ons, started off as a promotional tool to show how useful the original product was: "It's a neat way to show how Lotus customers are doing things," he says. In addition, Lordan says the sharing of small code snippets does not seem to affect demand for professionally developed third-party add-ons, which are offered for sale alongside free code. Macromedia community manager Scott Fegette says the situation is the same at Macromedia Exchange because personal contributions usually focus on small handy additions not worthwhile for commercial developers. Tarter says many companies look down on code created by users and feel that only their own developers can add value.
    Click Here to View Full Article

  • "Southampton Researchers Build Communication Tools for Future Missions to Mars"
    Innovations Report (06/09/04); Lewis, Joyce

    Researchers at the University of Southampton's School of Electronics and Computer Science (ECS) are working with NASA's Work Systems Design and Evaluation Group to find a robust means of communication between astronauts on Mars and Earth-based Remote Science Teams (RSTs). Traditional forms of remote collaboration such as real-time conversation and computer-screen sharing would not apply in such a scenario because of communication delays, while the fact that a typical RST spans different time zones is another complicating factor. The Southampton/NASA researchers have developed meeting replay software that integrates video of the astronauts' analysis and planning conference with other materials and results that can be reviewed by the RST via an uncomplicated interface. The RST can use the data available most efficiently by rapidly navigating to important points in the meeting record. The meeting replay software was developed under the aegis of the e-Science CoAKTinG project, whose members include Open University researchers who contributed software that the Mars crew employs to organize scientific data and information in the course of meetings. A team of geologists, social scientists, engineers, and programmers has been testing NASA technologies in a simulated Mars habitat located in the Utah desert, and the meeting replay software is one of the technologies being put through its paces. Professor Nigel Shadbolt, director of the AKT consortium, says, "This is a prime example of a mission that depends on detailed planning using future technologies to manage and orchestrate complex behavior and information."
    Click Here to View Full Article

  • "IT Employees Well-Paid, In Demand"
    Forbes (06/08/04); DiCarlo, Lisa

    Meta Group's annual information technology staffing and compensation guide shows that IT workers earn salaries that are, on average, 20 percent higher than those of other workers, and that 45 percent of companies are willing to pay top dollar for skills in Wi-Fi, security, data management, application development, and other critical areas. The survey of 650 companies reveals that only 19 percent of companies used offshore labor, and the research firm believes fewer companies are using the skills of IT workers overseas because distance, language, and culture remain difficult barriers to address. Moreover, Meta found that demand for certain IT skills remains strong, but the retirement of baby boomers in the next five to eight years could create a crisis situation if there is not a commitment to replenish the IT labor pool. The United States would do well to adopt a thorough national policy for training workers in new technologies and to attract more young people to careers in computer science and engineering. Nonetheless, 72 percent of companies say worker morale is low, which they believe is a result of perceptions about the current market for IT jobs. "Long term, being in IT is an excellent [career] choice but now it's hard for the average person to see that," says Meta program director Maria Schafer, author of the study.
    Click Here to View Full Article

  • "Toward Intelligent Assistants"
    EurekAlert (06/08/04)

    The German Research Foundation, Deutsche Forschungsgemeinschaft (DFG), wrapped up six years of research into its "Design and Design Methodology of Embedded Systems" Priority Program with a report detailing methods created for the development of current and future embedded systems in 14 DFG-funded projects. The effort involved collaboration with 34 firms and a dozen university and research institutes outside the Priority Program, and yielded over 100 publications and more than 20 dissertations between 1997 and 2003. The goal of the program was to embed as many intelligent functions as possible within a single system, ensuring that the various components interact to the best of their ability while keeping electricity consumption to a minimum and reducing size. One initiative at the University of Tubingen's Wilhelm Schickard Institute focused on the pre-commissioning test for automotive systems, with the end result being the Spyder emulation environment to simulate real conditions reliably. A project at Munich's Institute for Computer Science involved the development of a mathematical methodology for characterizing systems at the convergence point between discrete and continuous processes, using a train braking system as a model. The Technical University of Ilmenau undertook a project focusing on the modeling of multiple coordinate drives so that the verification of timing conditions can be enhanced, while researchers at Karlsruhe's Institute for Information Processing Technology hashed out a homogenous design methodology for embedded systems that was tested on an automatic sampler for a chemical analysis system.
    Click Here to View Full Article

  • "A Jet-Powered PDA for Astronauts"
    Wired News (06/10/04); Shachtman, Noah

    Astronauts are already overburdened by the myriad tasks and chores they must perform on missions in addition to the complex experiments and procedures they must conduct and monitor, so NASA Ames Research Center engineers have spent six years developing a robot that can lighten the load for both space travelers and mission controllers. The result is the Personal Satellite Assistant (PSA), a spherical, sensor-laden prototype drone that can maneuver on forced air jets. The PSA is capable of checking oxygen, temperature, and air pressure through its sensors, while its onboard camera can help facilitate teleconferences between astronauts and the ground crew. The robot could be used to relay instructions from mission controllers to the astronauts through a built-in speaker or liquid crystal display screen, as well as a laser pointer to indicate what controls should be activated. In addition, the drone could monitor some experiments without human assistance. PSA program manager Gregory Dorais explains that making the sphere space-ready will require a sizeable investment in time and money--three more years of development, and millions of dollars more in addition to the few million NASA has already provided. However, NASA Watch editor Keith Cowing writes, "The question I have is whether this is shoving technology where it is not needed--or whether the astronauts have actually said, 'You know, I could really use something (like) that.'" The PSA features a Pentium II processor and the GNU Linux operating system, and was initially inspired by the tricorder device from the TV show, "Star Trek."
    Click Here to View Full Article

  • "Invasion of the Spambots"
    Salon.com (06/08/04); Williams, Sam

    Spambots are mutating into numerous varieties that relentlessly penetrate new areas, such as instant messaging, blogs, chat rooms, and cell phones, and these mutations are being driven by two antithetical online publishing trends: Growing homogeneity in the use of Google and other basic software tools, and increasingly specialized content. These new, indirect techniques are designed for the purpose of enhancing visibility rather than solicitation or receipt confirmation, in the hopes that popular search engines such as Google will highly rank links to marketers' sites in search results. Innovative spambots lend themselves particularly well to adult entertainment companies such as Edge Productions, whose VP Domenic Merenda has split the programs into three varieties--address-harvesting bots, URL-proliferator bots, and lead-generation bots, the most advanced and expensive option. The lead-generation bots analyze R- and X-rated chat-room logs, where they scan transcripts to determine the names and addresses of the most active participants, who are then targeted by adult-oriented ads produced by third-party vendors. However, this strategy can backfire due to large numbers of bots disguised as people who turn out to be the most active forum participants. Carnegie Mellon University researchers have developed automated CAPTCHA programs to discourage spammers' use of lead-generation bots in chat rooms, although the safeguard is not foolproof. CAPTCHAs are set up so that users must identify a randomly generated word to prove they are human, the catch being that the word is distorted and often displayed against a patterned background that even the most advanced optical character recognition systems cannot decipher.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
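
    The challenge/response bookkeeping around a CAPTCHA can be sketched without the image-distortion step (which is the part that actually defeats optical character recognition). Everything below--the word list, the HMAC token scheme--is an illustrative assumption for this sketch, not a description of the Carnegie Mellon implementation:

```python
import hmac, hashlib, secrets

WORDS = ["orange", "castle", "ripple", "sonnet", "gadget"]
SERVER_KEY = secrets.token_bytes(32)  # hypothetical per-deployment secret

def issue_challenge():
    """Pick a random word and return (word_to_render, token). In a real
    CAPTCHA the word would be rendered as a distorted image against a
    patterned background; here we only model the bookkeeping around it."""
    word = secrets.choice(WORDS)
    token = hmac.new(SERVER_KEY, word.encode(), hashlib.sha256).hexdigest()
    return word, token

def verify(response, token):
    """A response passes if its keyed hash matches the issued token,
    so the server never needs to store the word in plain view."""
    expected = hmac.new(SERVER_KEY, response.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)

word, token = issue_challenge()
assert verify(word, token)       # a human who read the image passes
assert not verify("bot", token)  # a bot guessing blindly fails
```

    A bot that intercepts the token cannot recover the word from it without the server's key, which is why the hard part is the distortion, not the protocol.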

  • "Breakthrough Invention in Computer Gaming 'Human Pacman' Where the City Becomes a Mixed Reality Pacman World"
    Press Release Network (06/08/04)

    A breakthrough gaming system developed by researchers at the National University of Singapore's Mixed Reality Lab moves the world of Pacman out of the digital arena and into the real world. The "Human Pacman" system enables a person to assume the role of either Pacman or his Ghost adversary and walk actual streets collecting virtual "cookies" visible through goggles. Players pick up the cookies by walking through them, and can also collect "power pills" represented by real objects equipped with Bluetooth computers. A person playing a ghost can catch the players representing Pacman by physically touching them. In addition, people all over the world can watch the players' progress--and influence the game's outcome--via the Internet. Human Pacman is symbolic of a new mode of future computer gaming in which people use the entertainment system to interact, socialize, and have fun. A demonstration of Human Pacman drew sizable interest and praise from attendees at the CHI 2004 conference in Vienna, while an academic paper on the gaming system was published in Personal and Ubiquitous Computing this month. The research team that developed Human Pacman was led by Dr. Adrian Cheok of the Mixed Reality Lab, which conducts research on human-computer interaction, wearable computers and smart spaces, mixed reality, multi-modal recognition, fuzzy systems, and embedded systems.
    Click Here to View Full Article

  • "Greedy Hackers Can Hog Wi-Fi Bandwidth"
    New Scientist (06/08/04); Biever, Celeste

    All hackers have to do to hoard most of the bandwidth at Wi-Fi hotspots is alter just one line of code in the Linux operating system, according to Swiss Federal Institute of Technology researcher Imad Aad at the MobiSys 2004 conference on June 7. The hack for changing the Medium Access Control (MAC) protocol on a Linux computer was made possible by the market rollout of new Wi-Fi access cards last year. Each hotspot user is randomly assigned a data transfer rate by one line of the MAC protocol, and these rates are continuously re-designated so that users receive data at a more or less consistent rate. However, a hacker can set a high-value rate for himself and monopolize bandwidth through the MAC protocol line alteration, which is easy to do on Linux computers, where all the code is freely available. Aad demonstrated DOMINO, a software tool that wireless Internet service providers (WISPs) could employ to nail greedy users by rapidly examining data exchange rates between users and the access point. But MobiSys delegate Adam Wolisz of the Technical University of Berlin called DOMINO limited because it only flags bandwidth hogging in the MAC layer. "There are so many places that you can misbehave to gain advantage," he argued, listing the TCP protocol as an example. Aad countered that TCP-layer cheating is less efficient than MAC-layer cheating.
    Click Here to View Full Article
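
    A DOMINO-style check can be approximated in a few lines: under a fair MAC, stations sharing an access point should see roughly similar long-run throughput, so a station far above the group median is suspect. The station names, rates, and threshold below are invented for illustration; the actual tool is considerably more sophisticated:

```python
def flag_bandwidth_hogs(observed_kbps, tolerance=1.5):
    """Flag stations whose measured throughput is well above the
    group median--the signature of a rigged MAC rate assignment."""
    rates = sorted(observed_kbps.values())
    median = rates[len(rates) // 2]
    return [user for user, r in observed_kbps.items() if r > tolerance * median]

# Simulated per-station throughput measured at the access point (kbps):
stations = {"alice": 480, "bob": 510, "carol": 495, "mallory": 2400}
print(flag_bandwidth_hogs(stations))  # -> ['mallory']
```

    Wolisz's objection maps directly onto this sketch: a cheater operating at the TCP layer would not show up in a MAC-layer throughput comparison like this one.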

  • "Elections Chief Calls for More Secure Vote"
    Reuters (06/09/04); Sullivan, Andy

    Without endorsing a specific measure, federal Elections Assistance Commission Chairman DeForest Soaries today recommended that electronic voting systems be augmented with improved protection against hackers and bugs. Possible solutions he mentioned included printers that provide an audit trail, cryptography, voice identification, or even randomly reviewing the e-voting machines on election day. Soaries insisted that e-voting systems manufacturers must disclose their source code to election officials in order to prevent exploitable security flaws. He also noted that officials could find out if software on a particular system has been changed following certification via a federally administered reference library. Soaries warned that one potential solution, printers, could only exacerbate voting problems if they are hastily deployed without being thoroughly tested. About one-third of the American voting population is expected to vote electronically in the November election, and Soaries said he would present his proposals to the commission in a couple of days; however, the commission cannot require states to adopt specific standards. Johns Hopkins computer-science professor Avi Rubin commented that Soaries' influence is limited by finite resources and his wish to avoid estranging local election officials.
    Click Here to View Full Article

    See http://www.acm.org/usacm for more e-voting information.

  • "CIO Council Finds IT Workers Lack Necessary Skills"
    Government Computer News (06/07/04) Vol. 23, No. 13; Miller, Jason

    The newly released 2003 Clinger-Cohen Assessment Survey from the CIO Council indicates that very few federal IT workers have comprehensive experience in a variety of IT areas. The survey, which focused on over 19,000 federal IT employees, found that just above 6 percent said they have extensive knowledge of enterprise architecture. Just under 5 percent reported having wide-ranging knowledge about e-government, and a little less than 15 percent said they have extensive knowledge of cybersecurity. The self-assessments "could reflect that the workforce, in general, is equipped to handle complex jobs without the need to understand how a particular technology works," according to the report. This is based on the observation that competency proficiencies were ranked higher than skill proficiencies. The report suggested that such discrepancies might be caused by insufficient training and certification in new technologies or that such functions are being handled by contractors. Most respondents said they were proficient in such areas as hardware, configuration management, operating systems, technical documentation, data management, problem solving, and customer service. In its report, the CIO Council urged that its Committee on IT Workforce and Human Capital team up with its Federal Architecture and Infrastructure Committee to help integrate human capital within the Federal Enterprise Architecture Business Reference Model, so that agency planning and management could be enhanced.
    Click Here to View Full Article

  • "NASA, Xerox Collaborate"
    Federal Computer Week (06/09/04); Chourey, Sarita

    Xerox and NASA are partnering on several information technology projects, which have so far yielded tools used to investigate the crash of the space shuttle Columbia and the NX Knowledge Network, which melds software developed at NASA's Ames Research Center and Xerox's global research centers. "Xerox gets some technology to develop into a [potentially] commercial product, and NASA gets a knowledge management system that we needed, a collaborative tool for collecting, archiving and retrieving a variety of data sets," explains Ames technology partnership manager David Lackner. NX, which is being employed by researchers at Ames and 12 universities, will be used in a pilot application that could help NASA Astrobiology Institute scientists ascertain the presence of life on Mars, and will also serve as a risk management, anomaly analysis, and accident investigation tool. Lackner estimates that developing NX without any outside help would have probably cost NASA roughly $2.5 million. Xerox business developer Randy Nickel observes that NASA technologies share close similarities with technologies required by major companies, and says that NX components will be embedded in Xerox products. He also contends that NASA research often results in real-world applications that are useful to Xerox customers. "Working with high-tech companies allows NASA to pursue its mission of space discovery in a more collaborative spirit, while taking advantage of the best technology the commercial sector has to offer," says NASA's Craig Steidle.
    Click Here to View Full Article

  • "The Storage Story: More, More"
    Business Week (06/08/04); Salkever, Alex

    PC storage will move beyond the traditional spinning-disk hard drive in the future, but in the meantime is seeing tremendous advances in conventional hard drive technology. Hewlett-Packard unveiled a Personal Media Drive at the Windows Hardware Engineering Conference that can store about 500 GB worth of data in a shoe-box-size unit; the drive basically functions as a portable external hard drive and can be switched out without opening the PC case. The Personal Media Drive shows that, while not grabbing continuous headlines itself, hard drive technology's rapid advance is essential to today's digital lifestyle. The National Storage Industry Consortium wants to put 1 TB of data on a square inch of media by 2008, enabling a CD-size disk to store a monochrome image of every person on the planet, says Carnegie Mellon University storage systems researcher Edward Schlesinger; that type of capacity means that information could be stored locally instead of pulled off the Internet--music labels could store albums on PCs for release upon payment, for example. An immediate problem for increasing storage density, however, is heat that reduces magnetic charges and degrades data: The phenomenon, called superparamagnetic limit, is being addressed in the short term by standing magnetic charges on end--perpendicular to the disk surface, as opposed to the traditional longitudinal format. That tack eases the thermal effect and is being adopted by every storage maker, says Maxtor's Ken Johnson. Further out, in about five to 10 years, hard drive makers may switch to heat-assisted magnetic recording, which would use special films to better hold magnetic charges; lasers would be necessary to heat cells enough so that they release the charge and allow the data to be rewritten. IBM is researching nanomechanical storage that does away with the spinning disk altogether, instead using super-accurate probes to manipulate and read individual atoms, similar to a scanning probe microscope.
    Click Here to View Full Article
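
    The CD-capacity claim can be sanity-checked with rough arithmetic. The disk radius, the reading of "1 TB per square inch" at face value, and the 2004 population figure are all assumptions made for this back-of-envelope check, not numbers from the article:

```python
import math

# Assumptions (not from the article): a CD has roughly a 6 cm usable
# radius, and the consortium target is taken literally as 1 TB/in^2.
TB_PER_SQ_INCH = 1
cd_radius_cm = 6.0
cm_per_inch = 2.54

area_sq_inch = math.pi * (cd_radius_cm / cm_per_inch) ** 2  # ~17.5 in^2
capacity_tb = area_sq_inch * TB_PER_SQ_INCH                 # ~17.5 TB

world_population = 6.4e9  # rough 2004 figure
bytes_per_person = capacity_tb * 1e12 / world_population    # a few KB each
```

    A few kilobytes per person is enough for a small compressed monochrome portrait, which makes Schlesinger's claim plausible at that density.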

  • "New Data Storage Standard iSCSI Could Reshape Huge Tech Market"
    Investor's Business Daily (06/09/04) P. A4; Deagon, Brian

    The data storage industry could undergo a significant sea change with the emergence of the Internet Small Computer Systems Interface (iSCSI) standard, which promises to boost the efficiency of data storage while reducing its cost to businesses. ISCSI would complement "new wave" technologies such as grid computing, open-source software, and blade servers, and facilitate a directional shift toward the commoditization of data storage. The software supports storage networks for smaller businesses as well as departments or work groups in large corporations, which Precursor analyst Bart Kaplan says constitutes a market conservatively estimated to be worth almost $15 billion. Since Internet Protocol serves as the root element of iSCSI, a lot of tech support and associated costs can theoretically be eliminated. About a decade ago, large firms started transferring their most business-critical information to large data centers with high deployment and maintenance costs; today, most major corporations keep that business-critical data on high-end storage systems linked by large storage area networks. The next wave of networked storage--storage over IP via iSCSI--rides on commodity Ethernet, and iSCSI, grids, and blades will comprise a new computing architecture that could commoditize the high-end market now served by fibre-channel network products. However, McData's Tom Clark argues that "it's naive to think companies running massive data applications will step down to Ethernet and storage over IP."

  • "New Frontier for Wireless: Sensor Networks"
    Network World (06/07/04) Vol. 21, No. 23, P. 10; Cox, John

    Traditional sensor networks used for monitoring heavy equipment and building infrastructure are going wireless in a number of first-time production deployments. Wireless technologies such as Wi-Fi, Bluetooth, RFID, and low-power radio transmissions such as the newly approved IEEE 802.15.4 standard all promise to make sensor networks easier and cheaper to deploy, as well as to open up some new applications; but early deployments show that, while the technology is sound, the configuration and network management issues are complex. Building-automation firm Andover Controls is using wireless sensors equipped with actuators, and possibly motion-detection equipment, to automatically control air conditioning units in hotels. By switching air conditioning off when no one is in a room, hotels could save significant amounts of money. Philips Lighting Electronics and Tyco Thermal Controls are testing similar networks to monitor and control fluorescent lighting and pipe-warming equipment, respectively; Tyco Thermal Controls general manager Ken McCoy notes that wiring in traditional sensor networks can cost $10 per foot, but says wireless technology introduces new challenges. The technology from Ember has worked well, but Tyco had to approach network management differently--instead of the control panel pulling information, sensor nodes push data out to receptive control panels. Accenture Technology Labs set up a wireless sensor network at Pickberry Vineyard in California early this year and had to adjust the rate of transmission so as to conserve sensor battery life. For efficiency, gateway devices apply rules to gathered data before passing it on to the Accenture enterprise network. Accenture researcher William Westerman says sensor networks should also be configured to aggregate relevant data--such as the temperature across a particular geographic zone--instead of sending each reading discretely.
    Click Here to View Full Article
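
    Westerman's point about aggregation can be illustrated with a toy gateway rule: summarize readings per zone rather than forwarding each one, so the radio transmits a handful of summaries instead of every sample. The zone names and temperatures below are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

def aggregate_by_zone(readings):
    """Instead of forwarding every reading individually (costly for
    battery-powered radios), emit one summary value per zone."""
    by_zone = defaultdict(list)
    for zone, temp_c in readings:
        by_zone[zone].append(temp_c)
    return {zone: round(mean(temps), 1) for zone, temps in by_zone.items()}

# (zone, temperature) pairs as they arrive from individual sensor nodes:
raw = [("north", 21.0), ("north", 21.4), ("south", 24.8),
       ("south", 25.2), ("north", 20.9)]
print(aggregate_by_zone(raw))  # -> {'north': 21.1, 'south': 25.0}
```

    Five transmissions collapse to two, which is exactly the battery-life economy the Pickberry deployment was chasing.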

  • "Lights, Camera, Interaction"
    InfoWorld (06/07/04) Vol. 26, No. 23, P. 55; Udell, Jon

    Hardware and software developers are known for their inability to adopt the mental perspective of novice users, but gaining insight into the beginner's mindset through observation--the only effective strategy for usability testing--is difficult, given the high cost and inconvenience of formal study in fixed labs. A new generation of tools is being developed to remove such barriers: Alucid Networks, UserWorks, and Ovo Studios are offering portable labs--toolkits of equipment to record and edit video footage of both onscreen operations and the users as they carry them out--as a less expensive and more adaptable solution. UsersFirst's VisualMark observation and analysis tools run directly from a CD-ROM on the target machine and channel screen and user video feeds to a Macintosh-based portable lab for live observation, eliminating the need to install software on the target machine. The Windows-based Morae toolkit from TechSmith features a remote viewer that permits observers to watch screen activity, but not live user video, on the target machine; both streams are fed to the user's drive for later study, while Morae's "rich recording" technology tallies up the mouse clicks and Web-page views needed to conduct operations, allowing usability analysts to "focus more on the qualitative side as they're observing tests," according to Morae's Shane Lovellette. Morae and VisualMark can extract and annotate important points of interaction and edit them into a single reel for usability analysis. Bridging the gap between users' and developers' mental models will theoretically become even simpler if current software development trends hold. For example, biofeedback could be employed to determine a user's stress level, while a services-oriented architecture should also be beneficial.
    Click Here to View Full Article

  • "When, Where, and WiMax"
    EDN Magazine (05/27/04) Vol. 49, No. 11, P. 26; Miller, Matthew

    Advocates of the WiMax wireless communications standard are hoping that demand for WiMax-certified products will surge in the same way that the 802.11 standard blossomed once the Wi-Fi Alliance began certifying interoperability. However, the hype surrounding WiMax may have set up some unrealistic expectations. The current plan calls for the release by Intel and Fujitsu of WiMax-capable media-access controller and physical layer chips this year, and the WiMax Forum will begin to host interoperability "plugfests" in late 2004. The first WiMax-certified products are expected to make their market debut in the first six months of 2005: In that time period, Intel anticipates the rollout of outdoor antennas furnishing high-speed service to businesses and "premium residential" clients, while self-sustainable customer-premises equipment (CPE) should hit the market in the latter half of 2005, with portable WiMax following in 2006 and 2007. This prediction is debatable, as it will be a formidable challenge to develop user-friendly CPE at affordable prices, while consultant Monica Paolini notes that outdoor antennas are more receptive than indoor, and may therefore be more appealing to consumers. Other stumbling blocks include power budget issues, thermal and electrical envelope considerations, authentication, and cell-to-cell and network-to-network handoff; furthermore, Aperto Networks' Alan Mendez notes that smaller cells are needed to support indoor antennas and portable-system users, which leads to increases in base station installations and capital expenditures. In addition, WiMax will be vying for market share with cable and DSL providers, 3G cellular technology vendors, and other competitors. Paolini predicts that WiMax's residential market will not start to heat up until 2007, while ABI Research forecasts that spending on WiMax gear will not overtake proprietary fixed-wireless equipment until 2009.
    Click Here to View Full Article

  • "Q&A: Managing Information Overload"
    Optimize (05/04) No. 31, P. 58; Klein, Paula

    MIT Sloan School of Management professor and Center for eBusiness director Erik Brynjolfsson cites a two-year study of executive recruiters that finds that worker productivity is most directly affected by employees' communication networks. Among the conclusions the report makes is that recruiters who engage in proportionately more customer- and client-focused communication are more productive, better paid, and boast higher project-completion rates than recruiters who devote more time to internal communication; there is no correlation between the number of email messages sent and received and work performance; and workers who made extensive use of technology produced more revenue and completed more work annually despite taking slightly longer to complete projects because they were skilled at multitasking. Brynjolfsson comments that technology is not being used to its full potential because of information overload, a topic that will be discussed at MIT's eBusiness Annual Conference. The professor offers a four-pronged strategy that companies are employing to deal with information overload: Disregarding some of the data, either by skipping collation altogether or focusing only on the most important data; the construction of intelligent, machine-based filters and automated decision makers; the introduction of more distributed decision makers; and augmentation of workers' information-processing capability through improved training, education, and recruitment. "Successful companies in the future will have design principles that let them exploit low-cost information without being paralyzed by information overload," Brynjolfsson predicts. He disputes the argument that the growing ratio of white-collar to blue-collar employees is indicative of a fall-off in white-collar productivity, and believes that CIOs are wrong to define productivity as cutting costs or strictly associate it with the IT function.
Future trends Brynjolfsson anticipates include the emergence of more new-business models and a transition from supply chains to value networks.
    Click Here to View Full Article
