
ACM TechNews sponsored by AutoChoice Advisor -- Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 595: Friday, January 16, 2004

  • "Seeds of Destruction"
    CNet (01/15/04); Lemos, Robert

    Some computer security specialists believe effective deterrents against viruses and worms could be developed by studying outbreaks of agricultural epidemics, which have significant parallels. The spread of worms such as MSBlast shares similarities with Dutch elm disease, caused by a foreign species introduced into an environment that had no defense against it; "People have brought over species that we didn't expect here, just like people have created viruses that Microsoft didn't expect to deal with," notes Jeff Dukes of the University of Massachusetts. The wide vulnerability of computers and networks to malware is attributed to a technological monoculture, in much the same way that ecological monocultures such as the Irish poor's dependence on a single variety of potato in the 19th century led to devastating famine. An October report by major security experts warns that overreliance on Microsoft technology has created a computing and Internet monoculture; one of the report's authors, InternetPerils President John Quarterman, wrote that nearly all of the most recent cyberattacks targeted monoculture applications. Just as farmers are advised to diversify their plantings to avoid famine, computer researchers urge developers to diversify software so that it is less susceptible to viruses. The October report suggests that the software environment would be much more secure if products from companies other than Microsoft were more widely deployed. An even more critical monoculture than Microsoft technology could be the Internet routing infrastructure's heavy dependence on the Simple Network Management Protocol. Securing technology through diversification is all the more essential given the U.S. economy's growing reliance on computers.
    Click Here to View Full Article

  • "Silicon, Not Just Software, Key to Pervasive Media"
    EE Times (01/14/04); Wallace, Richard

    The tech industry is focusing more on media than on computing, and silicon technology is enabling the transition: Evidence of silicon's importance in pervasive media was everywhere at the 2004 Consumer Electronics Show, even though software claimed most of the limelight. MIT Media Laboratory director Andrew Lippman said faster and cheaper electronics were breaking down the barriers to interfacing and connectivity, and he called the integration of wireless technology into audio and video components a "threshold event" that heralds big changes soon. MIT also announced a new initiative that would make it easier for small firms to access the lab's new pervasive-media technologies, including material and design methods; wireless, parasitic, and self-generating power technologies; sensors; actuators and displays; network ecosystems; and cooperative wireless communications. An interesting example of silicon's role in pervasive media was a display by Royal Philips Electronics and Visa International that showcased Philips' near-field communications (NFC) technology embedded in a credit card for wireless authorization and transactions. Another NFC demonstration involved an RFID-enabled PDA that could purchase tickets wirelessly when brought near a poster with an embedded NFC device. Philips Semiconductors CEO Scott McGregor said the reader costs less than $5, while the embedded chip costs just tens of cents. Though pervasive-media software would not go far without new hardware, chip makers are also becoming more proficient software coders: Intel's XScale architecture and Analog Devices' Blackfin DSP architecture are used in Microsoft's new Portable Media Center and Media Center PC, for example. In the larger market view, hardware vendors such as Gateway, Hewlett-Packard, and Intel are positioning themselves in the consumer electronics market.
    Click Here to View Full Article

  • "'Robot Scientist' Said to Equal Humans at Some Tasks"
    National Geographic News (01/14/04); Roach, John

    A group of British scientists has built a "robot scientist" capable of composing hypotheses, designing experiments, and interpreting results as well as the best human scientists. Ross King of the University of Wales and Stephen Oliver of the University of Manchester report that the robot scientist consists of a liquid-handling apparatus connected to computers and enhanced with highly advanced robotics and artificial-intelligence software. The robot demonstrated its abilities by correctly figuring out the specific functions of genes in baker's yeast using "knockout" yeast strains that are each missing one gene. In a report published in the Jan. 15 issue of Nature, King, Oliver, and colleagues at the Universities of Wales and Manchester, Aberdeen's Robert Gordon University, and London's Imperial College note that the robot sheds critical light on the scientific process, which they argue will inevitably become automated. "It is inevitable because it will be required to deal with the challenges of science in the 21st century," the researchers explain. "It is also desirable because it frees scientists to make the high-level creative leaps at which they excel." However, Boston University biomedical engineer James Collins insists that there will always be a need for human imagination and creativity in the scientific process, and is doubtful that the robot scientist will completely supplant human scientists; he notes that the device would be of significant help in sifting through vast amounts of data in gene research and other areas of study. King and Oliver say that they are now chiefly concerned with demonstrating that their robot scientist is capable of yielding new scientific knowledge.
    Click Here to View Full Article

  • "Lines Blurring Between Handhelds and Wearables"
    TechNewsWorld (01/14/04); Saunders, John

    Though there is a clear boundary between portable and wearable computers--wearables allow for entirely hands-free operation, for instance--experts such as Gartner's Jackie Fenn expect that boundary to erode as wearable technology moves out of niche markets such as the military and into the mainstream. Fenn foresees the emergence of a new breed of portable/wearable device that boasts more flexibility and networked capabilities than current tools, and she predicts that cell phone headsets and advanced eye-wear will become very popular. Fenn acknowledges that wearables' mainstream penetration may be accompanied by social or cultural change, but is confident that society will acclimate itself. Tek Gear managing director Tony Havelka anticipates business changes as the spectrum of wearable technology configurations and capabilities expands. He believes wearable gadgets will quickly move from military and industrial markets to more consumer-based applications, while the flexibility, portability, and power of wearables will increase thanks to miniaturization and wireless technologies. Havelka predicts market changes as well, in which buyers pay more attention to the bottom line than to the technology, a trend that could spell trouble for more traditional manufacturers. A recent research project in Canada focused on how portable computing devices can be used to provide what University of Toronto psychiatrist Dr. David Kreindler calls "wireless psychiatric telemetry." In an experiment to determine whether moods can be better quantified, 40 participants regularly filled out questionnaires on Qualcomm pdQ devices, with the results wirelessly sent to researchers looking for signs of bipolar disorder.
    Click Here to View Full Article

  • "'Signs of Life' Expected in 2004 IT Salaries"
    IT Management (01/15/04); Gaudin, Sharon

    A pair of studies from Janco Associates and Foote Partners indicates that after several years of stagnation, IT compensation is finally gaining momentum, even with increasing offshore outsourcing of IT positions. Industry analysts estimate that upper-level IT managers will receive a 3 percent to 4 percent salary increase this year along with bonuses, but salaries for lower-level IT employees such as programmers, business processors, and help desk workers are more likely to stay flat because of the offshoring explosion. Janco's 2004 IT Salary Survey reckons that mean total compensation for CIOs experienced a 1.32 percent increase in large organizations and a 2.66 percent increase in mid-sized companies; meanwhile, mean total compensation for all mid-size enterprise positions rose from $75,759 in the last quarter of 2002 to $76,003 in early 2004. The Janco study also finds that some enterprises that had shed training, planning, and infrastructure positions are starting to hire in those areas again, albeit on a limited level. In addition, the Janco poll notes that many enterprises are elevating voice/wireless communications and security positions, while data warehousing, object programming, and e-commerce skills are in high demand. Foote Partners predicts that salaries for many IT workers will continue to decline, noting that the availability of offshore IT labor has eroded the value of domestic staff for the third consecutive quarter. However, company president David Foote forecasts that "Any job that requires in-depth knowledge of the company, like prototyping, data modeling, enterprise project managers, security experts and network administration, will be safe."
    Click Here to View Full Article

  • "World Wide Web Consortium Publishes CC/PP 1.0 as a W3C Recommendation"
    Business Wire (01/15/04)

    The World Wide Web Consortium (W3C) announced on Jan. 15 the publication of the CC/PP (Composite Capability/Preference Profiles) Structure and Vocabularies 1.0 Recommendation, a profiling language designed to help deliver Web content across a wide spectrum of devices by describing device capabilities and user preferences. CC/PP serves as a framework that allows devices to communicate their delivery context to a Web server, so that content can be effectively displayed on the user's device. "CC/PP plays a vital role in supporting the ability of people to access the Web from an increasingly diverse range of devices," notes W3C Device Independence Working Group (DIWG) Chairman Rhys Lewis. "By providing a stable framework for devices and Web servers to optimize content delivery, CC/PP provides a foundation for a device independent Web, and actual device empowerment." CC/PP is the first W3C-recommended Resource Description Framework (RDF) application, and its RDF component allows the standard to support extensible vocabularies that do not need a central registry, as well as easy integration of information from disparate sources. The design of CC/PP was a collaborative effort between W3C and the Open Mobile Alliance, which developed the User Agent Profile specification, a mobile phone vocabulary; and the Java Community Process, which devised a Java API that permits a Java Web server to access and employ CC/PP data supplied by a client device. DIWG plans to embed the final version of RDF datatyping in a revised 1.0 specification, and is developing Protocol and Processing Rules that will standardize how CC/PP data is sent to servers using different protocols, as well as how proxies can modify CC/PP data to reflect their own characteristics.
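
    For readers who have not seen one, a CC/PP profile is an RDF document in which a profile holds components and each component holds attribute-value pairs; the Python sketch below shows how a server might read one attribute and pick a content variant (the ex: vocabulary and the profile's values are illustrative, not part of the Recommendation):

      import xml.etree.ElementTree as ET

      # A two-level CC/PP profile: the profile holds a component, and the
      # component holds attribute-value pairs. The ex: vocabulary is invented.
      PROFILE = """<?xml version="1.0"?>
      <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
               xmlns:ccpp="http://www.w3.org/2002/11/08-ccpp-schema#"
               xmlns:ex="http://example.com/schema#">
        <rdf:Description rdf:about="http://example.com/profile#Handset">
          <ccpp:component>
            <rdf:Description rdf:about="http://example.com/profile#Hardware">
              <ex:displayWidth>176</ex:displayWidth>
              <ex:displayHeight>208</ex:displayHeight>
            </rdf:Description>
          </ccpp:component>
        </rdf:Description>
      </rdf:RDF>"""

      def attribute(profile_xml, name, ns="{http://example.com/schema#}"):
          """Walk the profile's RDF tree and return one attribute's value."""
          for node in ET.fromstring(profile_xml).iter(ns + name):
              return node.text
          return None

      # Pick a content variant based on the advertised screen width.
      width = int(attribute(PROFILE, "displayWidth") or 0)
      print("serve full page" if width >= 600 else "serve handheld page")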
    Click Here to View Full Article

  • "Java Guru Speaks Out on Interoperable Tools Efforts"
    CRN (01/14/04); Montalbano, Elizabeth

    Java pioneer James Gosling, who recently returned to Sun Microsystems as CTO for the Java development platform, says competition among Java development platforms is healthy and that several options will likely remain available in the future, just as they are now. Sun's NetBeans standard framework competes against IBM's Eclipse and Borland's OpenTools API efforts; standard APIs, however, are needed to allow simplified Java development across platforms and different third-party tools. An attempt to standardize plug-ins between IBM's and Sun's platforms stalled in November, but Gosling says communication is still ongoing and both sides are open to a possible compromise. Notably, IBM has not joined the Java Tools Community (JTC), recently formed by Sun, BEA Systems, Oracle, and other interested parties, which is working on standard APIs within the framework of the Java Community Process (JCP). Gosling admits the community process is slow in some cases, but says a reasonable estimate is for the JCP to approve standard APIs within a year. Despite its inefficiencies, Gosling is pleased with the overall progress Java's community-led development has made in the nine years since the language was conceived; he says the scrutiny of millions of Java developers ensures that new developments are truly innovative, reliable, and secure compared with rival platforms from proprietary vendors such as Microsoft. NetBeans engineering director Steve Wilson says the NetBeans 3.6 release due out in March will include many solid steps forward, including a GUI that appears native on Windows and a redesigned workflow between the debugger and the GUI designer. NetBeans 3.6 also adds code folding, which lets developers collapse some parts of their code to focus on others. NetBeans 4.0 is due out sometime in the third calendar quarter of 2004 and will include automated refactoring and support for version 1.5 of the J2SE Java client standard.
    Click Here to View Full Article

  • "Can We Conquer Nanotech Fear?"
    Financial Times (01/15/04) P. 8; Harvey, Fiona

    The biggest alleged threat of nanotechnology is the "grey goo" scenario, in which billions of microscopic, self-replicating machines or "nano-robots" run amok, deconstructing every atomic structure into a shapeless mush. Experts such as Ken Donaldson at the University of Edinburgh dismiss such apocalyptic visions as science fiction, arguing that the creation of such machines involves practical challenges that are probably insurmountable. On the other hand, the University of Leeds' Rik Brydson and others warn that nanotech could be dangerous in other ways: For one thing, no one knows the environmental effects of heavy concentrations of nanomaterials, nor how carbon nanotubes and other nano-substances could affect the human body. Problems facing nanotechnologists include the fact that materials may exhibit different behavior at the nanoscale, and the potential for delayed negative reactions from exposure to such substances. Tackling such challenges will require new techniques and a cross-disciplinary strategy. Organizations such as Canada's Action Group on Erosion, Technology and Concentration are clamoring for nanotech testing and the development of safety standards, while recent conferences and studies in the United Kingdom focus on addressing public concerns about nanotech and understanding the latest nanotech research. Nanotechnologists have come to realize the need for public accountability after seeing the detrimental effects of health scares and miscommunication on genetically modified food research and other scientific efforts. Donaldson argues that scientists must participate in public debate so that a potentially crippling moratorium on nanotech research can be avoided. "We need to be seen as responsible, not as uncaring," he insists.

  • "Swiss Expert Leads Fights Against Computer Viruses"
    Swissinfo (01/11/04); Strebel, Etienne

    Swiss Internet security expert Urs Gattiker, a professor of management and information science at the International School of New Media at Germany's University of Lübeck, is involved with the Cyberworld Awareness and Security Enhancement Structure (CASES), a publicly funded, pan-European initiative whose goal is to raise public awareness of information security issues. He notes, "Everybody should be able to benefit from information technology, but most people don't really want to know too much about the ins and outs of security issues--they just want the technology to work." Gattiker says specific legislation is unnecessary in the fight against viruses and hackers, pointing out that Switzerland's legislative solution is simply to criminalize the unauthorized use of information technology. He urges people to recognize that the spread of recorded information such as medical, fiscal, and financial data is making data protection increasingly difficult. "We have to be aware of this, but I am convinced that we can only change the situation if people fight for their rights and make sure their data is handled correctly," Gattiker insists. The security expert also suggests that Switzerland look to Denmark as a model for regulating the communication technology market, noting, for instance, that fierce competition among Danish telecoms has benefited consumers.
    Click Here to View Full Article

  • "Lab Starts on Next-Generation Supercomputers"
    Associated Press (01/16/04)

    Los Alamos National Laboratory's computer and computational sciences division has been awarded a three-year, $4.2 million grant from the Defense Department's research division to help develop the next generation of supercomputers by 2008. Los Alamos researchers will work on performance modelling and analysis tools, evaluate networks, and develop software tools. The researchers, led by Adolfy Hoisie, won the Best Paper Award at the Supercomputing 2003 conference for their work on performance modelling. Hoisie says, "We're excited about the enormous potential of performance modelling and the opportunity to apply a variety of methodologies that we developed to help in the design of these future supercomputers." Los Alamos researchers previously used these techniques to detect and correct a serious performance bottleneck in the lab's Q machine, the world's second most powerful computer, and showed how to apply the methodologies to similar computers.
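
    Performance modelling in this vein typically means an analytic formula for runtime in terms of machine parameters; the toy model below, with invented coefficients rather than Los Alamos figures, shows how such a formula exposes the point where communication overhead stops a machine from scaling:

      # Toy analytic performance model: runtime = compute time spread over
      # the nodes, plus a communication term that grows with machine size.
      # All coefficients are illustrative, not measurements of any real system.
      def predicted_runtime(nodes, work=1e12, flops_per_node=1e9,
                            latency=5e-6, msgs_per_step=100, steps=1000):
          compute = work / (flops_per_node * nodes)              # shrinks with nodes
          communicate = steps * msgs_per_step * latency * nodes ** 0.5  # grows
          return compute + communicate

      for n in (64, 256, 1024, 4096):
          print(n, "nodes:", round(predicted_runtime(n), 3), "s")
      # Past a few hundred nodes the communication term dominates and runtime
      # climbs again--the kind of bottleneck such models are built to expose.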
    Click Here to View Full Article

  • "We Can Trap More Crooks With a Net Full of Honey"
    Washington Post (01/11/04) P. B1; Schrage, Michael

    Michael Schrage, co-director of the MIT Media Lab's eMarkets Initiative, writes that public institutions and private enterprises will shield their networks, data, and other valued assets with tools that purposefully mislead and entrap those who hope to exploit, steal, or damage those assets. Operation Pin, an international effort to deter and rein in pedophiles and collectors of child pornography by baiting them with bogus Web sites or "honeypots," is one example that Schrage cites. Installing honeypots is cheap and simple, and the lures can be augmented with spyware and other mechanisms for tracking down those who access them. Their ultimate purpose is to help companies and institutions take action against wrongdoers before they can commit their crimes. Schrage believes Fortune 500 CIOs and federal agencies that fail to deploy honeypots to protect themselves against hackers, saboteurs, and other malicious parties are acting irresponsibly, and argues that the government should make it a priority to promote the wide-scale implementation of honeypots at the state, federal, and industry level. Schrage does not deny that honeypots and similar tools "will undeniably spawn legal and policy battles over 'appropriate' entrapment and 'reasonable' deception that will make today's complaints about viruses, spam and privacy seem like pleasant conversations." But he argues that concerns about potential abuse and litigation should not dissuade people from considering such tools. Schrage concludes that "The use of honeypots affirms a sad but true global realpolitik: In the war on terrorism--as well as other kinds of crime--digital dishonesty may not be the best policy. But it's a pretty good one."
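
    A low-interaction honeypot really can be this cheap and simple; the Python sketch below (port, bait banner, and log format are arbitrary choices) listens on an otherwise unused port, logs every connection as inherently suspect, and dangles a little bait to keep a scanner talking:

      import socket, datetime

      def run_honeypot(port=2323, logfile="honeypot.log"):
          """Serve nothing real; just record whoever comes knocking."""
          srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
          srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
          srv.bind(("0.0.0.0", port))
          srv.listen(5)
          while True:                      # runs as a daemon until killed
              conn, (addr, _) = srv.accept()
              # Any connection is suspect: no legitimate service lives here.
              with open(logfile, "a") as log:
                  log.write(f"{datetime.datetime.now().isoformat()} "
                            f"probe from {addr}\n")
              conn.sendall(b"login: ")     # bait to keep the visitor engaged
              conn.close()

      if __name__ == "__main__":
          run_honeypot()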
    Click Here to View Full Article

  • "Latest in Gizmos, Cars Share Stage"
    Baltimore Sun (01/15/04) P. D1; Mirabella, Lorraine

    Auto makers are trying to introduce new technologies at a pace and in a way that consumers can accept, as evidenced at the Motor Trend International Auto Show in Baltimore. Center for Automotive Research Chairman David Cole says no company knows exactly what technology consumers consider a value for their money. General Motors is pushing technology in its standard line of cars like the mid-size Chevrolet Malibu, which sells for about $20,000; the new Malibu has a remote starter and an automatic crash-notification system that contacts an operator when car sensors detect a crash, while the car's OnStar navigation system includes a cell phone, allowing operators to ask drivers whether they are all right after an accident. To prove that technological sophistication does not equal marketplace success, Consumer Reports auto-testing director David Champion points to the BMW 7 series' iDrive device, saying that "It's like trying to learn Windows from DOS at 80 miles an hour." The iDrive integrates so many functions into one screen that it is difficult to accomplish simple tasks such as switching radio stations, Champion says. Although the iDrive may appeal to technophiles, it can be too complicated for many people, admits Motor Trend feature editor John Matthius; part of the problem is that the iDrive has a single large screen with just one joystick controller to operate it. Other new technologies at the Motor Trend show include dashboard displays in the new Ford Mustang that can be customized in 120 colors, and rear-mounted cameras in the Acura MDX and Infiniti Q45 that let drivers see what is behind them without turning around. Other auto technologies are barely remembered or seen nowadays, including "heads-up displays" that project speedometer readings onto the windshield and the rain-sensing windshield wipers developed in the late 1950s.
    Click Here to View Full Article

  • "Transforming Thoughts Into Deeds"
    Wired News (01/14/04); Philipkoski, Kristen

    Companies such as Cyberkinetics in Foxboro, Mass., are working on brain-computer interfaces that allow people to control devices by thought. Cyberkinetics' effort, BrainGate, involves a computer chip outfitted with electrodes surgically implanted among neurons in the user's motor cortex; neuronal signals are transmitted via a fiber-optic cable to a digitizer that feeds into a computer system, which in turn translates the signals into commands. BrainGate's development has been funded by $16 million in venture capital and grants from the National Institutes of Health, and Cyberkinetics CEO Tim Surgenor says about $6 million has been spent thus far. BrainGate has been tested on monkeys, and Surgenor expects human trials with quadriplegic patients to begin by year's end, while a commercial product could hit the market within three years. He adds that the final version of BrainGate will be unnoticeable and wireless, and its activation and deactivation would be thought-controlled. Surgenor believes BrainGate users could control assistive robots as well as computers with the device, or even move their own limbs using muscular electrical implants. Other brain-computer interfaces under development include a virtual keyboard designed by Laura Laitinen at the Helsinki University of Technology, and a device that has already been successfully tested on human patients by Neural Signals researchers. Another project focuses on assisting mobility-challenged people by noninvasively attaching electrodes to the scalp, through which brain waves can be read.
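
    The article leaves the translation step abstract; one standard technique in the brain-computer-interface literature is a linear decoder that maps a vector of firing rates to a 2-D cursor velocity, sketched below on simulated data (this illustrates the general idea, not Cyberkinetics' actual algorithm):

      import numpy as np

      rng = np.random.default_rng(0)
      n_neurons, n_samples = 96, 500        # e.g. a 96-electrode array

      # Simulated calibration session: firing rates are a noisy linear
      # function of the intended 2-D cursor velocity.
      tuning = rng.normal(size=(n_neurons, 2))
      velocity = rng.normal(size=(2, n_samples))          # intended (vx, vy)
      rates = tuning @ velocity + 0.1 * rng.normal(size=(n_neurons, n_samples))

      # Fit a decoder D by least squares so that rates.T @ D ~ velocity.T,
      # then decode the first sample as if it were a new observation.
      D, *_ = np.linalg.lstsq(rates.T, velocity.T, rcond=None)
      print("decoded:", rates[:, 0] @ D, "intended:", velocity[:, 0])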
    Click Here to View Full Article

  • "Offshoring--A Hot 2004 Campaign Issue?"
    CNet (01/13/04); McCullagh, Declan

    Offshore IT outsourcing could be a major campaign issue in the months leading up to November, if IT labor activists can effectively publicize the woes of jobless white-collar workers. Democratic presidential candidates have campaigned against job losses in the manufacturing industry due to overseas competition, with Dennis Kucinich even advocating withdrawing the United States from the North American Free Trade Agreement (NAFTA) and the World Trade Organization (WTO). But offshore outsourcing is becoming a bigger issue in the IT world, especially with companies such as IBM announcing that thousands of software programming jobs will be shifted to India and China. Forrester Research says the trend will only grow, with 3.3 million U.S. service industry jobs displaced by foreign competition over the next 15 years. Washington Alliance of Technology Workers organizer Marcus Courtney says the major Democratic presidential candidates have not yet grasped the serious threat posed to service-sector jobs, including high-paid engineering positions; his group plans to work with labor and other left-leaning groups to promote the issue of IT outsourcing after the candidate field narrows in six to seven weeks. At that point, IT outsourcing opponents will be able to target their message: "We'll have nine months or eight months to gain the momentum necessary to make it a significant issue," Courtney says. Although the Democratic contenders largely share a stance against unfettered free trade, President Bush remains committed to his party's free-trade position that creating jobs is more important than protecting them. That view fits many corporate leaders', as encapsulated in a Computer Systems Policy Project report in which CEOs from Intel, IBM, Hewlett-Packard, Dell, and others argued that protectionist policy would seriously harm U.S. competitiveness on the global scene.
    Click Here to View Full Article

  • "Evolution Inspires IT Development"
    CIO (01/01/04) Vol. 17, No. 6, P. 20; Fitzgerald, Michael

    The keys to managing IT system complexity may lie in the evolutionary process. "Evolution itself is a fantastic search engine--it goes through millions and millions and millions of things, and comes up with extremely creative designs," points out Icosystem Chairman Eric Bonabeau, who adds that current computer systems exhibit a high level of complexity, with often invisible relationships among one another that could be untangled by studying templates from the natural world. There are two fundamental evolutionary approaches: One uses evolution as a model, as exemplified by genetic algorithms and the like, while the other analyzes biological systems to uncover clues as to how solutions to specific problems evolve in nature. Examples of biology inspiring IT include artificial intelligence and expert systems designed to mimic the human thought process; neural networks based on the brain; self-healing autonomic computers; and systems-security initiatives based on the human immune system. Janine Benyus' book, "Biomimicry: Innovation Inspired by Nature," which illustrates how people arrive at solutions by imitating natural systems, is a partial foundation for applying evolution to IT.
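
    A genetic algorithm, the canonical example of the first approach, keeps a population of candidate solutions and repeatedly selects the fittest, recombines them, and mutates the offspring; the toy below evolves bit strings toward an all-ones target (population size, rates, and fitness function are all illustrative):

      import random

      TARGET = [1] * 20                               # toy optimum: all ones

      def fitness(genome):
          return sum(g == t for g, t in zip(genome, TARGET))

      def mutate(genome, rate=0.05):
          return [1 - g if random.random() < rate else g for g in genome]

      def crossover(a, b):
          cut = random.randrange(1, len(a))
          return a[:cut] + b[cut:]

      population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
      for generation in range(100):
          population.sort(key=fitness, reverse=True)
          if fitness(population[0]) == len(TARGET):
              break                                   # found the optimum
          parents = population[:10]                   # selection pressure
          population = parents + [mutate(crossover(random.choice(parents),
                                                   random.choice(parents)))
                                  for _ in range(20)]
      print("generation", generation, "best fitness", fitness(population[0]))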
    Click Here to View Full Article

  • "System, Cure Thyself"
    Computerworld (01/12/04) Vol. 32, No. 2, P. 30; Hamblen, Matt

    Analysts and researchers believe the next two to five years will witness the development of systems and networks that can repair themselves across blended applications, storage, and computing resources. Definitions of self-healing vary, ranging from IBM's Alan Ganek, who describes it as an infrastructure's ability to cope with problems, to Forrester Research analyst Jean-Pierre Garbani, who frames it as technology that monitors itself and diagnoses its own problems in order to work out solutions; self-healing can therefore encompass both autonomic computing and utility computing. Analyst Richard Ptak comments that a truly self-healing system must be capable of self-monitoring, self-analysis, planning, and execution, and adds that current systems deploy these functions "with varying degrees of sophistication." He explains that deploying all four functions across devices and circuit elements is the most formidable challenge, and outlines a road map for meeting it: He predicts the rollout of self-healing chips in two years, chips capable of supporting self-repairing devices in four years, and organic circuits perhaps a year after that. Analyst Jasmine Noel believes manufacturers will probably team up in the next few months to integrate the four functions of self-healing. The Defense Advanced Research Projects Agency (DARPA) is currently assessing research and testing proposals for its Self-Regenerative Systems initiative; in its solicitation for bids, DARPA notes that "Network-centric warfare demands robust systems that can respond automatically and dynamically to both accidental and deliberate faults." Meanwhile, Georgia Institute of Technology researchers are working on corporate self-healing systems, and Karsten Schwan of the Center for Experimental Research in Computer Systems says that self-repair across network layers via network-aware middleware is one area of emphasis.
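
    Ptak's four functions correspond to the monitor/analyze/plan/execute loop familiar from autonomic-computing work; a deliberately tiny Python sketch of such a loop follows (the failing service and its single repair action are invented for illustration):

      import random

      class Service:
          """A stand-in managed resource whose only fault is a crashed flag."""
          def __init__(self):
              self.healthy = True
          def crash_maybe(self):
              if random.random() < 0.3:
                  self.healthy = False
          def restart(self):
              self.healthy = True

      def self_healing_loop(service, cycles=10):
          for tick in range(cycles):
              service.crash_maybe()
              symptoms = not service.healthy              # 1. monitor
              fault = symptoms                            # 2. analyze (trivial here)
              plan = service.restart if fault else None   # 3. plan
              if plan:
                  print(f"tick {tick}: fault detected, restarting")
                  plan()                                  # 4. execute

      self_healing_loop(Service())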
    Click Here to View Full Article

  • "Wireless for the Disabled"
    Technology Review (01/04) Vol. 106, No. 10, P. 64; Lok, Corie

    A team led by Georgia Institute of Technology computer scientist John Peifer wants to enhance the lives of disabled people by designing wireless assistive technologies from affordable off-the-shelf components. The prototype system devised by Georgia Tech researcher Jack Wood is a wearable captioning device for the hearing-disabled that would allow users to follow dialogue in movies, conferences, or lectures in real time through the wireless transmission of captions to a personal digital assistant (PDA). The user can read the text off the PDA or off a commercially available mini monitor that can be attached to eyeglasses. Giving vision-impaired people more navigability and independence is the purpose of a product designed by Jeff Wilson consisting of a pair of headphones and a wearable computer controlled by a handheld device; a shoulder strap on the bag housing the computer is outfitted with a GPS sensor, which works in conjunction with a head-tracking sensor on the headphones to track the wearer's location and direction. The computer plays beeps over the headphones, and the user moves toward the apparent source of the sound, thus following a preprogrammed route. Tracy Westeyn's "gesture panel," designed for people whose fine motor skills are limited, lets users control household devices by waving a hand in front of the panel, which is equipped with 72 infrared light-emitting diodes arranged in a grid configuration. A camera directed at the panel registers interruptions in the infrared beams concurrent with the gesture, and feeds the data to a laptop, which then triggers a specific device control command. Georgia Tech's Rehabilitation Engineering Research Center on Mobile Wireless Technologies for Persons with Disabilities, where Peifer and his team work, plans to help wireless-device manufacturers realize the benefits of refining their existing products for the handicapped.
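
    As a rough picture of what the laptop's software must do, the sketch below models the 72-LED grid as a boolean occlusion matrix per video frame, tracks the centroid of blocked beams, and maps the hand's net motion to a command (the grid layout, frame format, and command names are all assumptions for illustration):

      import numpy as np

      GRID = (8, 9)   # 72 infrared beams; the 8x9 arrangement is an assumption

      def centroid(frame):
          """Center of the blocked (True) beams in one frame, or None."""
          ys, xs = np.nonzero(frame)
          return (xs.mean(), ys.mean()) if xs.size else None

      def classify(frames):
          """Map the hand's net motion across frames to a device command."""
          points = [c for c in map(centroid, frames) if c is not None]
          if len(points) < 2:
              return "none"
          dx = points[-1][0] - points[0][0]
          dy = points[-1][1] - points[0][1]
          if abs(dx) > abs(dy):                       # mostly horizontal sweep
              return "volume_up" if dx > 0 else "volume_down"
          return "lights_on" if dy > 0 else "lights_off"

      # A hand sweeping left to right across the beam grid:
      frames = []
      for col in range(GRID[1]):
          frame = np.zeros(GRID, dtype=bool)
          frame[3:5, col] = True                      # beams the hand blocks
          frames.append(frame)
      print(classify(frames))                         # -> volume_up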

  • "Distributed Development: Lessons Learned"
    Queue (01/04) Vol. 1, No. 9, P. 26; Turnlund, Michael

    Cisco Systems director of engineering Michael Turnlund writes that the most successful technology projects stem from close team interaction, and that physically distributing teams adds complexities to project development that must be resolved through more coordinated management of workgroup containment, componentization, the development environment, and verification. Communication overhead is most effectively reduced by containing workgroups to discrete tasks along geographic lines, but Turnlund explains that team members who need guidance ought to be collocated with at least their lead and one or two senior technical members to avoid degraded performance, and adds that a shared white board is essential to keeping projects comprehensible. "Whether the teams are collocated or far-flung, up-front effort on key integration points and testing standards will prevent turf wars down the road," he comments. Successful code componentization in a distributed environment rests on the axiom that more moving parts needing discrete accommodation, and less formally defined intrasystem interfaces, mean more complexity and risk in system development. A single-location development environment supports reducing all system components to fine-grained, self-contained units, but this model fails combinatorially at a distance--therefore, component control must be grouped according to a logical scheme. Overall, a development environment becomes less homogeneous as team members become more sophisticated; Turnlund notes that variance can be contained through open source tools or vendor-based environments, although each option comes with caveats: a single-vendor environment eliminates much of the variance in code development across teams, but leaves teams permanently beholden to those tools and vendors. Development verification carries its own complexities, Turnlund notes. The "wall-of-shame" code verification strategy used by smaller, younger companies does not travel well across a diverse organizational culture, in which case more formal verification techniques are a better option.

  • "Covering the Intangibles in a KM Initiative"
    IT Professional (12/03) P. 17; Poage, James L.

    The success of an organization's knowledge management (KM) initiative is measured by how widely a knowledge-sharing system or process is accepted and used, and maximizing its use hinges on people and processes--"intangible" issues that KM designers often overlook in favor of the technological element. Intangibles can be addressed by adapting common engineering and computer science methods. Modularization, in which software and hardware are split into more manageable parts to better deal with complex technology, can also be applied to KM intangibles, which comprise defining factors such as the system's compatibility with organizational processes and employees' work styles; the complexity of the technology; how much effort employees must make to enter, update, and retrieve content; employees' roles in the process; and users' feeling of control over their work. These factors span a wide range of categories, including people roles and human interfaces, implementation and supportability, technology, organizational culture, processes, and strategy. Other engineering techniques that may be useful for KM strategy design include the use of flow diagrams to capture the perspectives of the multiple participating users and components; drilling down into specific intangible issues until their potential impact and relevance to the KM initiative at hand are determined, so that solutions can be identified; simulation of the KM process through role-playing; and the rollout of prototype components of the KM process as they are built, with collection of user feedback. The efforts of the KM strategy design team must be complemented by the participation, support, and proactive measures of the organization's top management.
    Click Here to View Full Article


