Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 613:  Wednesday, March 3, 2004

  • "Scattered Glitches as E-Voting Gets Biggest Test"
    Associated Press (03/02/04); Konrad, Rachel; Klug, Foster; Morgante, Michelle

    Ten U.S. states used electronic voting systems in the March 2 primary, representing the biggest test of the technology in the country to date. Although machines suffered from technical malfunctions in California, Maryland, and Georgia, most of these glitches were attributed to human error. Safeguards California election officials have instituted to protect e-voting systems from interference include random testing of touch-screen machines by independent computer specialists and an advisory that poll workers forbid voters from bringing wireless devices such as cellular phones into booths. This did not prevent a power fluctuation on Super Tuesday that caused poll workers in San Diego County to see a Windows operating-system screen instead of the voting program when they plugged in the e-voting machines; the problem took over two hours to remedy, and county representative Mike Workman said between 10 percent and 15 percent of the county's 1,611 precincts were affected. A Maryland precinct accidentally got a neighbor's encoders, while Georgia election officials reported that one county in their state neglected to program its encoders. Voters in both states were able to use paper ballots while the encoders were repaired, though Carol Darick, chief elections judge at the Maryland precinct where the glitch took place, said some voters remained reluctant to cast ballots electronically even after the problem was rectified. Georgia Tech student Peter Sahlstrom took pictures of 10 Diebold voting machines that were left insecure and unattended in the lobby of the school's student center on March 1.
    Click Here to View Full Article

    For information on ACM's actions regarding e-voting, visit http://www.acm.org/usacm.

  • "'Smart' Cars Are Learning to Avoid Collisions"
    Wall Street Journal (03/02/04) P. D8; Karr, Al

    Smart crash-avoidance systems are being developed by university researchers, car manufacturers, and transportation authorities under the aegis of the Transportation Department's Intelligent Vehicle Initiative (IVI), which aims to correct driving habits that could lead to potentially dangerous situations. A total of $28 million for fiscal 2003 was granted to the IVI, which is focusing on the most common types of car crashes--rear-end, road departure, intersection, merger, and lane-change collisions. A General Motors research group has carried out a field test of a crash-avoidance system that integrates forward-collision warnings and control with adaptive cruise control. Drivers are warned of impending collisions with slow or stopped vehicles ahead of them via audio and heads-up displays on the windshield, and the system automatically reduces accelerator pressure and applies the brakes if motorists fail to take remedial action. Vehicle technology developed at MIT's Age Lab is geared toward older drivers, though it can also be used by younger drivers; one system the lab is working on involves windshield displays that alert the motorist to the vehicle's speed, whether that speed is over safety limits, and suggestions to brake. Meanwhile, the Minnesota Department of Transportation is planning a spring demonstration of a crash warning system at a rural intersection. The system will utilize radar to gauge traffic gaps and then flash roadside notices to drivers on how safe it is to get on the highway. The effectiveness of these various systems is still undetermined, considering that much development and field testing remains to be done, to say nothing of how motorists will respond to such systems.

  • "IETF Conference Debates Antispam Proposals"
    TechNewsWorld (03/02/04); Mello Jr., John P.

    Various proposals to bring spam under control--which are gaining momentum as spam proliferates to epidemic levels--are being discussed at this week's Internet Engineering Task Force conference in Seoul. "The spam issue has created enough urgency and even desperation, so rather than following traditional standard-setting practices where different proposals are hashed out at lengthy and infrequent meetings with standards bodies, instead there's been a rush to market to get solutions into place and experiment with them and let their strengths and weaknesses come out through real-world trials," observes Constant Contacts CEO Gail Goodman. Among the proposals under scrutiny is Sender Policy Framework (SPF) technology, a countermeasure against email address spoofing and joe-jobbing. Another is the Microsoft-supported Caller-ID, which aims to defeat phishing, the practice whereby spammers trick readers into giving away sensitive data by making them think the messages come from a trusted source. Pobox.com chief technology officer Meng Weng Wong characterizes SPF and Caller-ID as complementary technologies. However, experts such as Eric Johansson of the TriArche Research Group warn that the rush to deploy antispam solutions could undermine the freedom of the Internet. Most of the online community is focused on email authentication measures that aim to ID senders and block them if they go astray: "That's a threat to free speech because if you can shut off a spammer, you can shut off anybody," argues Johansson, who is developing a decentralized email authentication solution. PostX's Sean Eldridge strongly doubts that email will be wiped out as a result of spam, but its use as a communications medium will definitely suffer if the problem worsens.
    Click Here to View Full Article

  • "PARC Eases Communications Between Devices"
    CNet (03/02/04); Shim, Richard

    Researchers at Xerox's Palo Alto Research Center (PARC) have developed software that reportedly enables all consumer electronics devices to talk to each other, boosting the ease-of-use of networked home devices that is so critical for content and device providers seeking to make the playback of digital media universal. The architecture, known as Obje, basically trains devices to teach each other how to establish communications between themselves: Code is transmitted to the devices via either a network or a direct connection. PARC's Hermann Calabria says language setup is almost immediate but is dependent on what is being transmitted. Obje needs a virtual machine to work, even though it is designed to operate irrespective of devices, operating systems, and networks. The software interoperates with other networking technologies and can work without having to be loaded onto all network devices. Calabria notes that the software is not yet ready to be commercialized, although the concepts have reached maturity. PARC researchers have also announced a wireless networking technology designed to ease the setup of secure links by having client devices request a key that grants the client network access privileges. "User studies show that one of the biggest problems with security is that people misconfigure settings or they don't bother with it at all because it's too hard," points out PARC scientist Dirk Balfanz. "This is a way to enable it easily."
    Click Here to View Full Article

  • "ICANN Fleshes Out Global Ambitions"
    InternetNews.com (03/02/04); Wagner, Jim

    The Internet Corporation for Assigned Names and Numbers (ICANN) is set to further expand its international operations at its thrice-yearly meeting in Rome this week. ICANN is easing out of its role as subcontractor to the U.S. Department of Commerce and into a new role as a non-profit entity beholden to the global body of Internet users and their governments. Among the first steps ICANN has taken in its new role is the establishment of the country-code top-level domain Names Supporting Organization, which will serve as an international consensus body managing not only general top-level domains such as .com, .net, and .org, but also country-specific codes such as .fr for France and .uk for the United Kingdom. ICANN's Kieran Baker says the new role for ICANN is without legal precedent, and the organization will be advised by governments, including the United States, through the government advisory committee. The ICANN board also recently voted to establish the first of several expected regional offices that would wield the same power as the U.S. offices: The new office in Brussels will allow for ongoing discussion of regional concerns, as well as easier direct access to ICANN. Other contentious issues on ICANN's agenda include the waiting-list service (WLS) and SiteFinder service launched by VeriSign but shelved by ICANN. VeriSign's WLS would allow people to buy a spot on a waiting list for open domain names, a practice competitors call unnecessary and an abuse of monopoly. The Domain Justice Coalition, a group of registrars united against VeriSign, last week filed a lawsuit against VeriSign and ICANN over WLS; ICANN, meanwhile, forced VeriSign to withdraw its SiteFinder service last year because it interfered with technical operations, though the commercial practice is also debated.
    Click Here to View Full Article

  • "Discarded Cell Phones, Printers, Keyboards May Be Hazardous Waste"
    Newswise (03/01/04)

    An EPA-funded draft report prepared by University of Florida environmental engineers concluded that a wide variety of electronic devices fulfill EPA criteria for constituting hazardous waste; UF environmental engineering associate professor Tim Townsend says that such findings could spur federal or state governments to revise e-waste disposal rules. The EPA commissioned the report from Townsend based on an earlier study he conducted which found that the lead content in cathode ray tubes is enough to classify them as hazardous waste. Townsend and his research team inspected cell phones, keyboards, printers, flat-panel monitors, remote controls, VCRs, computer mice, laptops, and central processing units, and applied the Toxicity Characteristic Leaching Procedure to many of these devices. Each type of device produced lead leachate that surpassed EPA standard waste levels in at least some cases. Townsend cautioned that the tests may not necessarily mirror actual landfill conditions, so he is conducting an experiment whereby two columns of mixed municipal solid waste and e-waste are buried in a Florida landfill to determine how much hazardous material is leached out. EPA environmental specialist Marilyn Goode noted that her agency must conduct further research before instituting new rules for e-waste disposal. She added that the EPA is in the midst of completing a regulation designed to prod businesses to recycle television and computer monitors rather than dump them. Townsend noted that experts say over 60 million PCs will become obsolete in 2005.
    Click Here to View Full Article

  • "Did Your Vote Count? New Coded Ballots May Prove It Did"
    New York Times (03/02/04) P. D2; Robinson, Sara

    A truly trustworthy voting system must furnish a voter-verifiable audit trail while maintaining ballot secrecy, and various systems have been proposed. The "frog" voting system, suggested in a 2001 working paper from the Caltech/MIT Voting Technology Project and since modified for an all-electronic approach, would employ two distinct e-voting machines and a "frog" memory card. Voters would first obtain a frog featuring all ballot options and use the first e-voting machine to make their choices; on election day, they would plug the frog, with its stored choices, into a "vote caster" machine that displays their selections on a screen. Once satisfied, voters would cast the vote, putting the frog into a "frozen" mode that renders its data alteration-proof, and place the memory card into the ballot box. The vote caster, unlike the first voting machine, would need to be heavily secured, but researchers believe a simple design would ensure security. To guarantee that votes are accurately tallied, VoteHere chief scientist Dr. C. Andrew Neff and cryptographer Dr. David Chaum have devised two separate mathematical voting systems. Both systems require the counting process to be conducted publicly over the Internet, with election integrity upheld by voters and third-party observers and ballot confidentiality supported by election officials. Once voting is complete, each voter would be given an encrypted receipt that can only be decrypted through a collaborative effort among all election trustees; all receipts would be published online after the polls close, and voters could look up an image of their individual receipts by serial number and verify their accuracy. Decoded ballots would be posted online without serial numbers so that they could not be linked to individual voters. Some researchers think the complexity of the cryptographic systems could undermine their acceptance by the voting public.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
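
    The publish-and-verify step described above can be illustrated with a toy sketch in Python. This is a deliberate simplification, not Neff's or Chaum's actual cryptography: a real scheme posts a trustee-encrypted ciphertext as the receipt, whereas this toy uses a plain hash, and all names here are invented for illustration.

```python
import hashlib
import secrets

bulletin_board = {}   # public: serial number -> posted receipt digest
anonymous_tally = []  # public: decoded ballots, published without serials

def cast_vote(choice: str) -> tuple[str, str]:
    """Record a ballot; the voter keeps (serial, receipt) to check later."""
    serial = secrets.token_hex(8)
    # Toy receipt: a hash binding the serial to the ballot. A real system
    # would publish a ciphertext decryptable only by all trustees jointly,
    # so the posted receipt alone reveals nothing about the choice.
    receipt = hashlib.sha256((serial + choice).encode()).hexdigest()
    bulletin_board[serial] = receipt
    anonymous_tally.append(choice)  # published separately, unlinkable by serial
    return serial, receipt

def verify_receipt(serial: str, receipt: str) -> bool:
    """After the polls close, the voter looks up their serial online."""
    return bulletin_board.get(serial) == receipt

serial, receipt = cast_vote("candidate A")
assert verify_receipt(serial, receipt)
assert not verify_receipt(serial, "tampered")
```

    The point of the design is that any voter can confirm their own receipt appears unaltered, while the tally is published with the linking serial numbers stripped.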

  • "Spontaneous Networks Will Speed Net Access"
    New Scientist (02/14/04) Vol. 181, No. 2434, P. 22; Ananthaswamy, Anil

    Hewlett-Packard software engineers believe laptop, smart phone, and PDA users who employ cell-phone networks to link these devices to the Internet could accelerate access speeds tremendously by engaging in "collaborative mobile networking." Each device would be equipped with self-organizing software that detects other nearby devices, establishing a high-speed network where each device can use any of the other devices' cellular connections to increase its access speeds. The network would use a community server provided by the cell-phone company or service provider that compiles the data streams coming in through various cellular links; the server would speed up download times by parsing large files and transmitting them over the multiple cellular connections. The new system, which was devised by HP researcher Puneet Sharma and colleagues, banks on the fact that most people only use the full capacity of their Internet link for the short time it takes to download a file, leaving the bandwidth free while they are reading the file. The researchers conducted a lab test in which each of four mobile users downloaded a 1 MB file while connected to the Internet through a simulated GPRS link with a peak speed of 45 Kbps. Via collaborative mobile networking, each user boosted their peak available bandwidth to more than 200 Kbps. Sharma admits that a collaborative mobile community could be swamped by a flood of data routed through the community server by a determined hacker, while people with phones connected to a community server will have to get used to the fact that their bandwidth will sometimes be tapped by others downloading files. Their phones' batteries could also be exhausted faster as a result, but Sharma believes people will be willing to accept such a trade-off, considering how successful peer-to-peer networks have been.
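
    The bandwidth arithmetic behind the lab test works out as follows. This is a minimal sketch that assumes ideal striping with no protocol overhead, not HP's actual software:

```python
def download_time(file_bytes: int, link_kbps: list[float]) -> float:
    """Seconds to fetch a file striped proportionally across parallel links."""
    total_kbps = sum(link_kbps)          # pooled capacity of all lent links
    return (file_bytes * 8 / 1000) / total_kbps

one_mb = 1_000_000
solo = download_time(one_mb, [45.0])       # a single 45 Kbps GPRS link
shared = download_time(one_mb, [45.0] * 4) # four users pooling their links
print(round(solo, 1), round(shared, 1))    # ≈ 177.8 s alone vs ≈ 44.4 s pooled
```

    Pooling four 45 Kbps links yields a 180 Kbps ceiling in this idealized model; the 200-plus Kbps figure reported in the test implies the simulated links briefly burst above their nominal peak.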

  • "Passing Packets Under Ever More Scrutiny"
    CommsDesign (03/01/04); Wilson, Ron; Wirbel, Loring

    Internet protocol was conceived as a payload-independent method of routing, so that intermediary infrastructure did not care about a packet's content; today, floods of spam, viruses, and worms have to be stopped from entering a network, while trade secrets and other information that violates company policy must be stopped from exiting. New challenges are emerging as well, such as European cellular service providers having to block pornographic images from being spammed to graphics-enabled handsets. Tarari vice president John Bromhead says filtering those images means scanning packets for flesh-toned graphics--but this type of intrusive packet inspection, even of compressed and encrypted packets, is a tremendous drain on performance if systems are not optimized. Bromhead says some 29 percent of traffic from top Web sites was compressed data, which requires 20 times the processing time because it must first be decompressed. In the future, it may be necessary to scan secure packets as well, which means security checkpoints will have to exchange security keys and decrypt files. There are, however, combined hardware and software solutions adaptable and fast enough to handle these challenges: Tarari touts a Content Processing Engine multiprocessing chip optimized both for performance and for easy security management, while a programmable system-level solution from iPolicy combines a number of security layers on one platform, including a deep-inspection firewall, VPN, anti-virus controller, content-filtering engine, and intrusion detection. Neither spammers nor corporate executives have yet fully considered the effects of these new types of packet inspection tools. Company policies are likely to become more complex while the hardware continues to improve; National Broadband already plans a national optical network that would operate security and policy engines in the optical switches.
    Click Here to View Full Article
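
    The decompress-then-scan step that makes compressed traffic so expensive can be sketched in miniature. This is a hypothetical policy filter, not Tarari's or iPolicy's products; the pattern and function names are invented for illustration:

```python
import gzip
import re

# Hypothetical outbound-policy pattern: flag documents marked confidential.
POLICY = re.compile(rb"trade[-\s]?secret|confidential", re.IGNORECASE)

def inspect(payload: bytes, gzipped: bool = False) -> bool:
    """Return True if the (possibly compressed) payload violates policy.

    The decompression step is where the extra processing cost comes from:
    compressed traffic must be inflated before any content scan can run.
    """
    data = gzip.decompress(payload) if gzipped else payload
    return POLICY.search(data) is not None

msg = b"Attached: CONFIDENTIAL product roadmap"
assert inspect(msg)                                # plain payload caught
assert inspect(gzip.compress(msg), gzipped=True)   # compressed payload caught
assert not inspect(b"lunch menu for Friday")       # benign traffic passes
```

    Hardware engines like those described above do the same inflate-and-match work, but in silicon at line rate rather than in a general-purpose CPU.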

  • "Artificial Emotion"
    Boston Globe (02/29/04); Allis, Sam

    Sherry Turkle, founder of the MIT Initiative on Technology and Self, will host a March 5 confab on "Evocative Objects" that has a direct bearing on her views that human beings' growing attraction to advanced machines is based on emotional attachment rather than machine intelligence. She explains, "[Machines] seduce us by asking for human nurturance, not intelligence. We're suckers not for realism but for relationships."
    Sony's AIBO dog, for instance, builds an emotional rapport with users because it recalls its owners, obeys commands, and has an adjustable personality. In 2003, Turkle observed that the discontinued Hasbro toy My Real Baby helped allay the anxiety of some of the more manic residents at a nursing home, while an initiative to enhance elderly health care and companionship with robots is gaining momentum in Japan. Robotics' role in health care is an issue that merits as much debate as cloning, according to Turkle, who notes that the technology's appeal to the assisted living sector reflects society's tendency to understaff its nursing facilities. Terry McCatherin, activities director of the nursing home where the My Real Baby experiment was held, is unsettled by the idea of people becoming so emotionally invested in machines that they prefer them over flesh and blood. "We are very vulnerable to technology that knows how to push our buttons in a human way," says Turkle, adding that this is a facet of human nature people must understand and accept.
    Click Here to View Full Article

  • "Tools of Success"
    Sacramento Bee (03/01/04); Kalb, Loretta

    The IT sector in California will lead in job growth by the end of the decade, according to the state Employment Development Department (EDD), but technology jobs at that time will hardly resemble those of the 1990s. Instead of operating separately from business, IT is integrating into business operations, so IT workers need to understand their industry, their company, and how to work with people; these skills could give a much-needed boost to current job seekers, says California State University, Sacramento, career counselor Cici Mattiuzzi. Companies are already scouting overseas for cheap labor or considering overseas outsourcing: Basic programmers and researchers are most vulnerable to this type of job competition, while technology workers who can identify a company's business needs and have soft skills will be insulated. VantageMed CIO Rajiv Donde holds a master's degree in business administration and always considers technology in light of the company's business goals; he leads VantageMed's new development efforts and tries to keep a balance of skills on his 35-member project management team. To current technology students, Donde recommends getting a business degree. CPS Human Resource Services CIO Brian Gegan agrees, and says IT workers should make an effort to understand how their company operates. Mattiuzzi says that while just three employers signed up for interviews at the College of Engineering this year--the lowest number ever--demand is picking up for graduate students with some experience. The EDD reports that eight of the 10 fastest-growing jobs through the end of the decade will be computer-related; the positions set to grow the most are computer support specialists (107 percent growth), computer software engineers (105 percent), and network and system administrators (100 percent).
    Click Here to View Full Article

  • "New Web Tools Aim to Customize Searches"
    SiliconValley.com (03/01/04); Bazeley, Michael

    Engineers are developing next-generation Web search technology that tailors searches to an individual's surfing habits and demographic data: In this way, search engines would be able to infer whether a user who inquires about, say, a jaguar is looking for the cat rather than the automobile. Moreover Technologies CEO Jim Pitkow comments that current engines rank search results according to their popularity, which can be frustrating for users with minority preferences. Personalized search technology keeps track of the keywords users employ and what Web pages they visit so that search engines can build a profile of user likes and dislikes that can be applied to future search results. Yahoo!, America Online, Google, Microsoft, and many smaller outfits have made sizable investments in search personalization technology, with the goal of increasing their sites' user appeal. These services could apply information about users' hobbies, personal interests, age, gender, or residence to further filter search results, and experts believe data provided by the user is much more useful than tracking search habits. Privacy is a major issue surrounding search personalization, as many Web surfers may object to the idea of companies archiving their search histories. "People don't want to feel they're being watched and monitored and that people know where they've been searching," notes Jason Jerome of Relevant Media. Deploying search personalization on a large scale will also be difficult, since a huge computing power investment is needed to calculate personal search results for each user.
    Click Here to View Full Article
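
    The re-ranking idea at the heart of search personalization can be sketched as follows. This is a deliberately simplified model: real engines blend profile signals with link-based popularity in far subtler ways, and every name here is invented for illustration:

```python
def rerank(results: list[dict], profile: set[str]) -> list[dict]:
    """Reorder search results so profile matches outrank raw popularity."""
    def score(r: dict) -> tuple[int, float]:
        overlap = len(set(r["terms"]) & profile)  # topical match with the user
        return (overlap, r["popularity"])         # match first, popularity second
    return sorted(results, key=score, reverse=True)

# The document's own example: does "jaguar" mean the cat or the car?
results = [
    {"title": "Jaguar XK8 review", "terms": ["car", "auto"], "popularity": 0.9},
    {"title": "Jaguars in the wild", "terms": ["cat", "wildlife"], "popularity": 0.4},
]
wildlife_fan = {"cat", "wildlife", "nature"}  # built from past queries and clicks
assert rerank(results, wildlife_fan)[0]["title"] == "Jaguars in the wild"
```

    With an empty profile the sort falls back to popularity alone, which is exactly the majority-preference behavior Pitkow describes; the profile is what lets a minority interest win.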

  • "Oulu-Based Game-Professor Forecasts Boom for Network Games"
    Pressi.com (03/01/04)

    Tony Manninen, a games professor at the University of Oulu in Finland, says his city is a nucleus for future networked game development, especially on the mobile platform: In his recent doctoral dissertation, Manninen argued that games are a tremendously untapped field and that much could be added in terms of social interaction. Networking through mobile devices allows players to compete against other humans, which greatly enhances a game's variability as well as the sense of winning and losing; however, networked games need to vastly improve in terms of interactivity in order to fully exploit the social aspect of networked gaming, Manninen says. Adding speechless communication through facial expressions and appearance would give players greater depth of social interaction. Manninen touts a new model for networked game development that takes into account new advances in mobile technology as well as new applications, such as feature recognition. Despite the fact that many people now carry a mobile game device with them nearly all the time, commercial vendors and game developers have been slow to create new games for the mobile platform; mobile phone positioning, for example, is a capability available to mobile games but not to console and computer games. Oulu has long been recognized for its mobile technology research, and there is now a core of local companies focused on developing innovative mobile games. Manninen says games have not changed drastically in the last 20 to 30 years, and only about 1 percent of the design space has been utilized. Following the completion of his doctoral thesis, Manninen intends to hone his gaming expertise.
    Click Here to View Full Article

  • "Simple Optics Make Quantum Relay"
    Technology Research News (03/03/04); Smalley, Eric

    Prototype quantum cryptography systems could transmit data across long distances using a quantum repeater built out of available optical gear by researchers at the NASA-Caltech Jet Propulsion Laboratory (JPL); the devices, which are designed to sustain quantum entanglement, could likewise enable quantum networks. Transporting entangled particles is tricky because the act of copying such particles destroys their quantum data, which makes ordinary repeaters ill-suited for quantum communications. The linear optical quantum repeater created by the JPL researchers consists of a network of beam splitters and photodetectors that direct photons based on whether particular photodetectors detect other photons, which is sufficient to facilitate the purification and swapping of entangled particles. In typical quantum communications schemes, multiple entangled pairs are condensed to a single pair of fully entangled photons, and the linear optical quantum repeater performs purification prior to entanglement swapping. JPL principal scientist Jonathan Dowling says the project's ultimate goal is to build a full-scale quantum computer using the linear optical scheme. Former JPL researcher Pieter Kok reports that providing fully entangled particles in a reproducible way is a key goal, while quantum memory is another major required element. Dowling notes that the system must also be shrunk down so it can be embedded on a quantum optoelectronic chip. Other researchers have developed repeaters designed to harness the interactions between photons and gas atoms, as well as nonlinear optical repeaters that induce photon interaction by using complex equipment.
    Click Here to View Full Article

  • "Two Desktops, Twice the Health Risk?"
    ZDNet Australia (02/25/04); Donoghue, Andrew

    Peter Buckle at the Robens Institute for Health Ergonomics, whose department is a component of the European Institute of Health and Medical Sciences, reports that IT professionals' susceptibility to Repetitive Strain Injury (RSI) is determined not only by the hardware they use, but by the tasks they perform and the various obstacles they encounter. Buckle says RSI problems are often the result of people working for prolonged periods in fixed positions, usually with part of the body away from a neutral position; stress is another factor, and this can be amplified by the job itself as well as the input device used. His advice for IT workers is to try various input devices and select the one best aligned to their work style, and to refrain from keeping their hand on the computer mouse when it is not in use. Buckle also thinks short keyboards that lack the numeric keypad section allow users to move the mouse closer to the mid-line of their bodies and lower the static load on their shoulders; however, short keyboards are relatively expensive and are not standard equipment, whereas conventional mice and keyboards ship with most desktop systems. The professor reports that his department has furnished a checklist of criteria people should consider when buying a computer mouse, such as keeping the mouse design simple and making the most of the device through training. Buckle points out that RSI-prone IT professionals often work with multiple screens or desktop systems, while other people using their terminals can exacerbate the problem. "In that kind of situation you need a lot more flexibility; you can't just set your system up for yourself and leave it, because someone else is going to come in and change it all round," he comments. Buckle thinks risk assessments based on Display Screen Equipment regulations should be carried out whenever any aspect of the workplace--hardware, software, tasks, etc.--changes.
    Click Here to View Full Article

  • "Intel's Crystal Ball"
    InformationWeek (02/23/04) No. 977, P. 36; Greenemeier, Larry

    Intel chiefly remains a research and development firm seeking to provide the building blocks for new silicon-based technologies that benefit both businesses and consumers. CEO Craig Barrett thinks his company's business may migrate from the Internet to the health sciences, as its transistors approach microscopic size and their appeal to doctors and researchers grows. Intel has also set aside $200 million for investment in emerging companies working on home-networking products and systems, which will employ Internet processors and integrate with Intel's wireless effort. The company strongly believes that computing and communications infrastructures will continue to blend into each other, as evidenced by its push for the Centrino wireless laptop processor, which has been highly successful. Intel is still concentrating on the enterprise market but is nearing critical mass on the low end, while corporate adoption of its high-end Itanium processor has been slow. In the same way that it has made PCs capable of managing more complex and sophisticated applications by steadily raising processor speeds, Intel is collaborating with other vendors to boost Wi-Fi technology's ability to accommodate faster data speeds. Intel's expertise in designing integrated components has also put the company ahead of the competition: "With so many companies competing on price, [component makers] will be looking for integrated solutions where all sorts of applications are included on one board," notes Parks Associates VP and Principal Kurt Scherf. Intel expects 2004's R&D investment to total $4.8 billion, a $400 million increase over 2003.
    Click Here to View Full Article

  • "Robot, Make Thyself"
    New Scientist (02/14/04) Vol. 181, No. 2434, P. 26; Biever, Celeste

    Engineers are researching self-assembly as a tool for achieving the viable mass production of micromachines, whose potential applications include "smart dust" surveillance devices and nanoscale drug delivery systems. Some research has focused on adapting origami to chipmaking, in which 2D objects manufactured by traditional photolithography are folded into 3D structures such as microchips, micromachines, and nanobots. Because a silicon sheet can only be folded a limited number of times, Harvard's Mila Boncheva is combining individual folded objects into more complex configurations using handcrafted hollow copper units linked together by solder. Boncheva has successfully fabricated a ring oscillator and a shift register via self-assembly in a warm solution. Her latest experiments have yielded jewelry-like structures whose shape can be adjusted. "The beauty of this method is that we not only control geometric structure, we also control the electrical functionality of the devices," Boncheva notes. She comments that the structures are designed to fold up in the same way that amino acid chains fold into 3D proteins. The next step is to integrate this technique with self-folding processes to create a two-step fabrication methodology in which individual units self-assemble through folding and then combine into more intricate 3D structures. The starting point is a 2D component; Boncheva has called on MIT programmer Erik Demaine to develop software for creating 2D structures that self-assemble into useful 3D structures. Researcher Elliot Hui of the University of California, San Diego, says, "There's a lot you could do with a 3D microscopic structure. There is just a high probability that in the future we are going to need techniques like these."

  • "Wicked (Good) Wikis"
    Darwin (02/04); Boyd, Stowe

    Stowe Boyd, managing director of A Working Model, is very impressed with Wikis as a collaborative tool that enables socialization and produces social capital. Wikis are composed of a series of hyperlinked documents that can be edited en masse with a browser; they allow all authors to edit each other's work; and in their traditional form they manage pages both as an HTML presentation and as a markup form for users with editing and authoring entitlements. The open nature of Wikis has raised doubts that ambitious projects such as Wikipedia will succeed, but Boyd cites Dan Gillmor, who writes, "The Wikipedia articles tend to be neutral in tone, and when the topic is controversial, they will explain the varying viewpoints in addition to offering the basic facts." When everyone has editing privileges, this fairness becomes a critical element. Boyd comments that, like Blogs, Wikis can foster the growth of social networks, but their potential for socialization outstrips that of Blogs because Wiki social interactions are accelerated and more collaborative, while Wikis' adjustable architecture can support multiple collaborative mechanisms, often at the same time. The author points out that "Wikis are based on emergent intelligence and knowledge: the belief that the best results come from allowing decisions to emerge bottom-up, in a relatively free-form interchange between the participants of a group, with only a light-handed editorial or managerial top-down control being applied." Boyd observes that members of a project group must police themselves when participating in a Wiki collaboration, adding that this situation is precisely what is needed to cultivate social networks.
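    The Wiki model the article describes, pages that exist both as editable markup and as rendered HTML, with every author free to revise every page, can be sketched in a few lines. The `MiniWiki` class and the `[[PageName]]` link syntax below are hypothetical illustrations of that model, not the API or markup of any real Wiki engine:

    ```python
    import re

    class MiniWiki:
        """A toy wiki: named pages, open editing, [[PageName]] hyperlinks."""

        def __init__(self):
            self.pages = {}    # page name -> markup text
            self.history = []  # (page, author) edit log

        def edit(self, name, text, author):
            # Any author may create or overwrite any page: the open
            # editing model described in the article.
            self.pages[name] = text
            self.history.append((name, author))

        def render(self, name):
            # Convert [[PageName]] markup into HTML anchors, mirroring
            # the dual markup/presentation nature of traditional Wikis.
            text = self.pages.get(name, "")
            return re.sub(r"\[\[(\w+)\]\]",
                          r'<a href="/wiki/\1">\1</a>', text)

    wiki = MiniWiki()
    wiki.edit("HomePage", "See [[GridComputing]] for more.", "alice")
    wiki.edit("HomePage", "See [[GridComputing]] and [[Wikipedia]].", "bob")
    html = wiki.render("HomePage")
    ```

    The edit log is what lets a self-policing group audit who changed what, the social mechanism Boyd argues substitutes for top-down control.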
    Click Here to View Full Article

  • "Grid Computing's Promises and Perils"
    Network Magazine (02/04) Vol. 19, No. 2, P. 29; Conry-Murray, Andrew

    Grid computing does indeed carry major benefits, provided the organization meets certain criteria: businesses best suited to grid computing are those already using high-performance computing, such as pharmaceutical companies and financial services firms. Gartner VP Carl Claunch maintains that there is as yet no business value in employing grid computing for operations such as tracking inventory, partly because such applications cannot take advantage of compute parallelism. There are two general grid categories: research-oriented grids, used primarily for academic collaboration, which involve complex, expensive machines and rely on open-source software, government funding, and in some cases volunteers who contribute CPU cycles to specific projects; and enterprise grids, which can be deployed via open-source resources or proprietary products and make additionally generated computing capacity available only within the enterprise. Proponents say that grid computing's greatest business benefits are savings and speed: IBM's Dan Powers notes that a grid solution allows organizations to tap mostly idle computing resources, lowering the need for new hardware while raising the return on investment of existing equipment; meanwhile, grid computing lets applications run faster and deliver results more rapidly, which gives customers much-needed agility. Among the factors that can impede grid deployment is accounting--"As people share resources, they want to account for them effectively," notes Sara Murphy of Hewlett-Packard--while Claunch reports that software licensing models have yet to adapt to the fluctuating hardware and usage of the grid environment. Bullet-proofing grid computing against unauthorized intrusion and denial-of-service attacks while also keeping intellectual property secure is a major hurdle, as is overcoming people's reluctance to share computing resources, Murphy points out. Deployment of grid standards is being driven by the Globus Alliance and the Global Grid Forum, whose notable achievements include the Open Grid Services Architecture and the Open Grid Services Infrastructure. Grid supporters believe that Web services mechanisms will emerge as the standard grid computing interface.
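    The savings Powers describes rest on one property: grid-friendly workloads decompose into independent units that can run on whatever capacity happens to be idle. The sketch below illustrates that embarrassingly parallel core with Python's standard `multiprocessing` pool; the `score_molecule` kernel is a made-up stand-in for a pharmaceutical-style computation, and real grid middleware such as Globus layers the scheduling, security, and accounting concerns the article raises on top of this basic pattern:

    ```python
    # Sketch of the "tap idle CPUs" idea: farm independent work units
    # out to a pool of worker processes. Each unit needs no data from
    # any other unit, which is why such workloads parallelize well
    # while inventory-style transactional workloads do not.
    from multiprocessing import Pool

    def score_molecule(seed):
        # Hypothetical compute kernel: deterministic, self-contained,
        # so it can run on any machine donating cycles.
        total = 0
        for i in range(1, 1000):
            total += (seed * i) % 97
        return seed, total

    if __name__ == "__main__":
        with Pool(4) as pool:  # 4 local workers stand in for grid nodes
            results = dict(pool.map(score_molecule, range(8)))
        print(len(results))    # 8 independent work units completed
    ```

    Because the kernel is pure and stateless, a scheduler can rerun a unit on another node if a volunteer machine disappears mid-job, which is how volunteer-cycle research grids tolerate unreliable donors.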
    Click Here to View Full Article