ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 591:  Wednesday, January 7, 2004

  • "Mac Founders Push for New Ideas"
    Wired News (01/06/04); Terdiman, Daniel

    Some of the creators and designers of the original Macintosh PC, which celebrates its 20th anniversary this month, are disappointed that today's computers are not significantly more advanced than the early Macs, and blame this lack of innovation on a resistance to radical ideas, at least at Apple. Former Apple programmer Jef Raskin, considered by many to be the "father of the Macintosh," argues that graphical user interfaces need to change rapidly, and complains that Apple staff suffer from a paucity of interface knowledge and understanding. His current project is a new computing environment known as The Humane Interface, which is supposed to operate at a far higher level of efficiency than Windows or the Mac OS. Meanwhile, onetime original Mac team member Andy Hertzfeld believes that Apple's rate of innovation far exceeds that of the mainstream industry, though at the same time he is puzzled at the inertia exhibited by the PC industry in the 20 years since the original Mac's debut. "Personal computers are still way too frustrating and hard to use," he admits. Another member of the original Mac development team, Bill Atkinson, is more enthusiastic about the Mac's progress since 1984, noting that its ability to help ordinary users carry out sophisticated tasks remains one of its most appealing features. He also points out that the transition of once-expensive, customized capabilities such as video-editing tools into standard Mac features is proof of the continued evolution of the PC.

  • "Spam Is Still Flowing Into E-Mail Boxes"
    Washington Post (01/06/04) P. E1; Krim, Jonathan

    A federal antispam law that went into effect on Jan. 1 has done little to abate the tide of junk email clogging users' inboxes, according to estimates from various ISPs and spam-filtering companies. "We're not seeing the hard-core spammers cleaning up their act in any way," reports Andrew Lochart of Postini, whose latest estimates peg spam at almost 85 percent of the approximately 1 billion emails the company handles every week; the percentage of spam in the email Brightmail handles has remained steady at roughly 60 percent, while America Online and EarthLink say spam patterns have changed little in the past several weeks. AOL's Nicholas J. Graham adds that there has been a noticeable shift overseas in the origins of the spam the company blocks, which AOL's antispam group thinks is indicative of spammers hijacking vulnerable offshore machines and networks. The federal antispam law, which supersedes more restrictive state legislation, forbids spammers from masking their true Internet addresses and using misleading subject lines; it also requires email marketers to give consumers a valid way to opt out of receiving future messages. Despite the passage and enactment of the law, the most infamous spam still lacks unsubscribe links, while Lochart says the inclusion of such links is probably a ruse by spammers to confirm that they have hit a legitimate email address. Garbled characters designed to thwart spam filters are also appearing in subject lines with greater frequency. Critics believe the law's apparent ineffectiveness so far indicates that the most notorious spammers will ignore it, while other marketers will increase the amount of email they send as long as it remains within the law. However, ISPs caution that rating the law's effectiveness now is premature.
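    The garbled-character trick mentioned above works by defeating naive keyword matching, and filters counter it by normalizing subject lines before testing them. A minimal illustrative sketch (the normalization rule and keyword list are hypothetical, not drawn from any real filter, which would also handle homoglyphs, HTML entities, and deliberate misspellings):

```python
import re

def normalize_subject(subject):
    """Strip the junk characters spammers insert to evade keyword filters,
    e.g. turning "F*R*E*E o.f.f.e.r" back into "free offer"."""
    cleaned = re.sub(r"[^A-Za-z0-9\s]", "", subject)  # drop punctuation padding
    return cleaned.lower()

def looks_like_spam(subject, keywords=("free", "offer")):
    """Keyword test applied to the normalized subject rather than the raw one."""
    normalized = normalize_subject(subject)
    return any(word in normalized for word in keywords)
```

    Run against an obfuscated subject, `looks_like_spam("F*R*E*E o.f.f.e.r")` matches where a filter comparing against the raw string would not.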

  • "The Ultimate Global Network"
    Independent (London) (01/07/04); Sarson, Richard

    British computer scientists Sir Tony Hoare and Robin Milner believe that within two decades the power and ranks of computers will increase a hundredfold, and lead to the emergence of a Global Ubiquitous Computer; the researchers have thrown down the gauntlet to fellow U.K. computer scientists by launching seven "Grand Challenges" with the ultimate goal of making the ubiquitous computer as secure and reliable as possible. One challenge investigates how a Global Ubiquitous Computer can be built from a design, technology, and trade-off standpoint, its aim being "to see a set of engineering rules of thumb maturing into design principles, which can then be applied to other systems," according to Jon Crowcroft of Cambridge. The discovery of newer, more suitable devices than current computers to operate such artificial organisms is the goal of the "journeys in non-classical computation" challenge. The "memories for life" challenge focuses on solving problems related to the storage, privacy, and searchability of the personal information entered into computers. Meanwhile, the "dependability" challenge addresses the security and reliability of programs running at home, the office, in vehicles, and elsewhere, in the hopes that simple design errors that cause system failures can be avoided. The "In vivo-in silico" challenge will study the nematode worm with the goal of developing resilient, self-healing software, while "the architecture of brain and mind" challenge will concentrate on understanding the biological underpinnings of human cognition so that intelligent robots can be created. Hoare says the object of these challenges is not to provide software tools, but rather to supply the theoretical principles such tools would be based on.

  • "Head-Up Displays Get Second Chance"
    EE Times (01/05/04); Murray, Charles J.

    Carmakers and suppliers expect head-up display (HUD) technology to get a new lease on life with the advent of multicolor light-emitting diodes (LEDs), smaller liquid-crystal displays, and windshield optics innovations. Steven Stringfellow of General Motors' Electrical Center reports that second-generation HUDs will boast better light sources than the parabolic overhead-projector-type bulbs of the first generation, which produced too much heat and raised toxicity worries because of the presence of mercury. Siemens engineers have rectified this problem by phasing out fluorescent displays and incandescent bulbs in favor of micro-LED arrays printed on silicon; Siemens VDO's Mark Brainard notes that the display appears to the driver not to be projected onto the windshield, but to float in front of it. Advantages of these HUDs over monochrome technologies include improved brightness and a range of up to 64,000 hues. The system also incorporates light sensors that work with a software algorithm to adjust the HUD's luminosity in accordance with changes in ambient light. GM engineers have devised their own LED-based HUD, one that features improved contrast over earlier models, while a new understanding of the optical properties of windshields has helped engineers solve distortion and ghosting problems. iSuppli/Stanford Resources' Kimberly Allen says the display's appearance on the windshield is most critical to motorists: "If it doesn't look right, they're not going to buy it," she explains. Next-generation HUDs are expected to serve as much better navigational aids via their integration with vehicular global-positioning systems, while vendors and automakers believe the market penetration of adaptive cruise control, collision avoidance, and other technologies will boost HUDs' value even further.
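    The ambient-light adjustment described above amounts to mapping a light-sensor reading onto the display's luminance range. A hypothetical sketch of such an algorithm (the lux and nit figures, and the logarithmic mapping that roughly matches how the eye adapts, are illustrative assumptions, not Siemens' or GM's actual parameters):

```python
import math

def hud_brightness(ambient_lux, min_nits=50.0, max_nits=10000.0,
                   dark_lux=1.0, bright_lux=50000.0):
    """Map an ambient-light sensor reading to HUD luminance.

    Brightness scales with ambient light on a log scale and is clamped
    to the display's physical range so the HUD stays readable at noon
    without blinding the driver at night.
    """
    ambient_lux = max(dark_lux, min(ambient_lux, bright_lux))
    # Fraction of the way from darkness to full daylight, log-scaled
    t = (math.log10(ambient_lux) - math.log10(dark_lux)) / (
        math.log10(bright_lux) - math.log10(dark_lux))
    return min_nits + t * (max_nits - min_nits)
```

    At the darkest reading the display sits at its floor luminance; at full daylight it runs at maximum; everything in between follows the log curve.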

  • "Grid Software, Now With the Mac Touch"
    InternetNews.com (01/07/04); Singer, Michael

    Scientific research will be the primary focus of Apple Computer's Xgrid grid computing software, of which a free beta version was officially introduced on Jan. 6. Developed by Apple's Advanced Computation Group, Xgrid is essentially Apple's user-friendly interface adapted for distributed computing applications carried out on multiple Macs. Xgrid exploits Apple's Rendezvous networking technology as a key component of the software to help detect and manage multiple network nodes, and enables Apple to draw from the high performance computing (HPC) clustering marketplace, where software running on IBM, Hewlett-Packard, Dell, and Sun Microsystems servers currently resides. "The bright spot in the server market is the HPC Linux server area and what it seems like Apple is saying [is] that it has [a] UNIX RISC server that can operate as a Linux cluster with a minimum amount of change," notes International Data's Jean Bozman. Upon installation, the Xgrid software sets up a "virtual" IT environment that can run batch and workload processing off of idle computing capacity on all networked resources. However, Apple does not tout Xgrid as a solution for all clustering difficulties--the software is not a replacement for clustering hardware or software, nor does it "grid-enable" existing applications on a computer. Its function is to establish a remote execution environment and file staging capabilities that oversee the operation of tasks on distributed computing resources. This gives each networked machine the ability to access all files needed to carry out tasks.
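    The controller-and-workers pattern the article describes can be modeled in miniature: a coordinator farms independent batch tasks out to whatever nodes are available and collects the results. This toy sketch stands in local threads for networked Macs; Xgrid's actual Rendezvous-based node discovery and file staging are not modeled:

```python
from concurrent.futures import ThreadPoolExecutor

def run_batch(tasks, nodes=4):
    """Distribute independent (function, argument) tasks across `nodes`
    workers and return the results in submission order, the way a grid
    controller collects results per job."""
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        return list(pool.map(lambda job: job[0](job[1]), tasks))

# Usage: square a batch of numbers across four simulated nodes
jobs = [(lambda x: x * x, n) for n in range(6)]
results = run_batch(jobs)
```

    The essential property illustrated is that the tasks must be independent of one another; that is what lets the controller schedule them on idle capacity in any order.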

  • "Mobile Robots Take Baby Steps"
    Wired News (01/07/04); Shachtman, Noah

    Biologically inspired robots designed for military operations are a major area of focus for the Defense Advanced Research Projects Agency and the Army's Tank-automotive and Armaments Command (TACOM), which are funding research and development into devices that resemble animals such as lobsters, canines, bees, and snakes--in both form and function. Such machines are highly desirable for their ability to perform reconnaissance and operate in environments not suitable for human soldiers. A pair of robotics firms will share $2.25 million under separate TACOM contracts to develop four-legged prototype drones that can carry supplies, ammunition, and food into battle, though achieving this goal is a formidable challenge. Such a drone would not only have to traverse uneven terrain, but also carry a much more compact and less detectable power supply, according to Ben Krupp of Yobotics, one of the two TACOM grant recipients. Marc Raibert of Boston Dynamics, the other TACOM grant recipient, explains that cracking the challenge involves harmonizing the body, mind, and computer brain of the drone. He says the robot dog's vision system will be key to its success, and has recruited Larry Matthies of NASA's Jet Propulsion Laboratory to apply his breakthrough work in 3D robot vision to Boston Dynamics' "Big Dog" project. TACOM research scientist Paul Meunch says the goal of the canine drone project is to demonstrate the feasibility of the technology to the military. Other biologically inspired robotics projects attracting Pentagon interest include a wheeled mechanical version of an elephant's trunk from Carnegie Mellon University's Howie Choset, which could be employed to inspect ships' engines, and a mechanical lobster for the detection of coastal mines.

  • "Digital Warfare Adapted for Iraq"
    Associated Press (01/02/04); Keyser, Jason

    The apprehension of former Iraqi dictator Saddam Hussein and other war criminals was partially attributed to Army Battle Command Systems, a series of technologies that have dramatically expedited the process of planning and coordinating raids and troop mobilization. Tracking street fighters and hunting down rebel leaders is the purpose of the technologies, which were originally designed for battlefield operations involving helicopters and tanks. Army Battle Command Systems allow American soldiers to check digital maps, watch enemy movements, download instructions from commanders, and focus on specific buildings via satellite imagery on in-vehicle computer screens. Commanders can also view the progress of raids in real time using imagery from unmanned aerial drones and maps that track the movements of ground and air forces by satellite. The touch-screen monitors allow soldiers to add icons to maps that appear on all networked screens. Commanders especially prize the computer system for enabling them to keep raids on track even when conditions change suddenly. The U.S. Army's 4th Infantry Division is the only unit currently using the system, and it is not without its drawbacks: Army Battle Command Systems' component technologies are vulnerable to wear and tear in a physically grueling environment, and getting damaged parts replaced is a slow process.

  • "Bots, Humans Play Together"
    Technology Research News (01/07/04); Smalley, Eric; Patch, Kimberly

    Carnegie Mellon University researchers are planning to explore new areas of human-machine interaction by holding soccer matches of mixed human-robot teams to probe such issues as the optimal time for people and robots to communicate and split up tasks, according to CMU computer science professor Manuela Veloso. She says the results of the experiment should have a bearing on any application for real-time collaboration between humans and groups of robots, such as autonomous robot vehicles, automated construction crews, space exploration, and search and rescue operations. The robots involved in the experiment are a bigger version of soccer-playing robots devised by CMU, using Segway scooters to scale them up to more human proportions. Their human counterparts will also travel on Segways, and both human and robot players will use the same ball manipulation device and be close to equal in terms of top speed, acceleration, and turning abilities. Veloso says the autonomous soccer robots are equipped with vision cameras and computer algorithms so they can track the ball under variable illumination. Each robot also features a laptop to process camera images and a second laptop to run algorithms so it can choose a specific action. Researchers will be able to rapidly develop and assess robot control algorithms via a computer infrastructure consisting of a graphical computer interface, teleoperation programs, and logging programs. Veloso notes that the robots are capable of speech, while researchers are attempting to transfer the soccer-playing software designed for smaller machines to the Segway, as well as improve the robot's obstacle avoidance system, and add devices that will enable people to command or communicate with the robots.
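    Color-based ball tracking of the kind the soccer robots rely on can be reduced to its core idea: classify pixels by color and take the centroid of the matches. This sketch is only an illustration of that idea (the hue values and the dictionary-based "frame" are hypothetical; the CMU robots' actual vision pipeline, with its illumination compensation, is far more sophisticated):

```python
def track_ball(pixels, is_ball_color):
    """Locate a ball as the centroid of color-matching pixels.

    Thresholding on hue rather than raw RGB is one common way to stay
    robust to illumination changes. `pixels` maps (x, y) -> hue.
    Returns None when no pixel matches (ball not in view).
    """
    matches = [(x, y) for (x, y), hue in pixels.items() if is_ball_color(hue)]
    if not matches:
        return None
    cx = sum(x for x, _ in matches) / len(matches)
    cy = sum(y for _, y in matches) / len(matches)
    return (cx, cy)

# Usage: an orange ball (hue roughly 20-40) in a tiny synthetic frame
frame = {(0, 0): 120, (5, 5): 30, (7, 5): 30, (6, 7): 25}
```

    Here `track_ball(frame, lambda h: 20 <= h <= 40)` averages the three orange pixels and ignores the green one.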

  • "High-Growth Era Seen for Embedded Systems"
    Electronic Engineering Times--Asia (01/01/04); Sharma, Hema

    Embedded systems imbue devices with intelligence and allow them to interact with users and surrounding objects. The number of embedded systems is expected to increase as they become more technically sophisticated. Systems research is increasingly focusing on application-specific rather than general-purpose design, and on reliability, security, scalability, cost, manageability, and flexibility rather than on performance alone. Reliable design methodologies and tools for embedded systems are critical to meet expected demand in terms of volume and complexity. In the absence of end-to-end electronic design automation (EDA) solutions for embedded systems, designers can rely on a set of proven point tools used inside a sound methodology. Concurrent design of hardware and software is a major consideration for embedded systems designers because it helps to ensure integration and synchronization of the system. Unified development processes are under development in research centers around the world, focusing on system-function reuse and design-space exploration. Reliability is key for embedded systems, many of which control mission-critical systems, and every embedded system should be able to reset itself in case of hardware error. Java and Linux are growing quickly in the embedded-system space as well: Java offers the benefits of dynamic application loading and portable application development, but also taxes already-constrained system resources such as memory, execution time, and energy consumption. The size and volume of embedded systems also require designers to carefully consider low-power and cost requirements, while the market for embedded memory is expected to grow fastest during the next three years, with embedded software, processors, and boards following in that order.
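    The self-reset requirement mentioned above is usually met with a watchdog timer: the main loop must "kick" the watchdog within a deadline, or the hardware reboots the system. A software model of the pattern (a sketch only; on a real MCU the timeout and reset are implemented in silicon, not in application code, and certainly not in Python):

```python
import time

class Watchdog:
    """Software model of a hardware watchdog timer."""
    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.kick()

    def kick(self):
        """Restart the countdown; the main loop calls this to prove liveness."""
        self.last_kick = time.monotonic()

    def expired(self):
        return time.monotonic() - self.last_kick > self.timeout_s

def main_loop(watchdog, steps):
    for step in steps:
        step()                      # one unit of work (may stall)
        if watchdog.expired():      # stalled past the deadline?
            return "RESET"          # a real watchdog would reboot here
        watchdog.kick()
    return "OK"
```

    A loop of fast steps runs to completion; a step that stalls past the timeout triggers the reset path instead of hanging the system indefinitely.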

  • "BitTorrent, 'Gi-Fi,' and Other Trends in 2004"
    EarthWeb (12/31/03); Naraine, Ryan

    Regardless of the economic situation, innovation will continue unabated in 2004. New peer-to-peer technologies such as BitTorrent promise to make file transfer much easier by slicing files into small distributed pieces that can be assembled on the user's computer; unlike some other systems, BitTorrent networks increase performance as they scale in size and could provide enterprises with a powerful file-transfer tool. Serial ATA is also expected to make a big splash in 2004, simplifying hardware and wiring and paving the way for new hard drives, controllers, and connectors. DRM technologies have been vetted by consumers at sites such as Apple's iTunes, and the popularity of protected content will spur new device offerings from manufacturers and concessions from media owners. Google is rumored to be on the lookout for a social networking connection for its popular Blogger software, signaling a larger trend to integrate the social networking, blogging, and wiki (collaborative editing) movements; Google is also launching a Google Print effort to index print editions of selected offline works, pitting the search giant against e-commerce titan Amazon.com, which is pursuing e-commerce search technologies through its A9 research group. NewLans, founded by serial entrepreneur Dev Gupta, is advocating a new 2 Gbps wireless LAN standard called "Gi-Fi" using the 56 GHz band recently opened by the FCC. Network-connected computers should also become more secure with Microsoft's Windows XP Service Pack 2, due out in the second quarter; that update will bolster the embedded Internet Connection Firewall (ICF)--which will also be turned on by default in new computers with Windows XP--and deal a significant blow to Symantec and McAfee, which sell firewall subscription services.
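    The piece-slicing idea behind BitTorrent can be sketched directly: a file is split into fixed-size pieces, each piece's hash is published up front, and a downloader can fetch pieces from different peers in any order, verify each one, and reassemble the original. A minimal sketch (the 4-byte piece size is illustrative; real BitTorrent pieces run from tens of kilobytes to megabytes, and the protocol's peer negotiation is omitted entirely):

```python
import hashlib

PIECE_SIZE = 4  # bytes; illustrative only

def make_pieces(data, piece_size=PIECE_SIZE):
    """Split data into pieces and record each piece's SHA-1,
    as a .torrent file's piece list does."""
    pieces = [data[i:i + piece_size] for i in range(0, len(data), piece_size)]
    digests = [hashlib.sha1(p).hexdigest() for p in pieces]
    return pieces, digests

def reassemble(indexed_pieces, digests):
    """Rebuild the file from pieces that arrived in any order,
    verifying each against its published digest."""
    ordered = [None] * len(digests)
    for index, piece in indexed_pieces:
        if hashlib.sha1(piece).hexdigest() != digests[index]:
            raise ValueError(f"piece {index} failed verification")
        ordered[index] = piece
    return b"".join(ordered)
```

    Because each piece is independently verifiable, a downloader can accept pieces from untrusted peers and reject corrupt ones, which is what lets the swarm's performance grow as more peers join.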

  • "Open-Source Community Defends GPL Against SCO Attack"
    eWeek (01/05/04); Galli, Peter

    SCO Group CEO Darl McBride's assertion in December that the GNU General Public License (GPL) is unconstitutional and contravenes patent and copyright laws has prompted a backlash from open-source advocates. Rolland Roseland of The Schwan Food Company calls McBride's allegations ridiculous. "Owners of IP [intellectual property] have the right to copyright their IP or to provide it free to society," he declares. "It is their choice, and we should be supporting that choice." Furthermore, Roseland contends, SCO itself distributed Linux under the GPL when it operated as Caldera International, which by McBride's own reasoning would make the company guilty of copyright infringement. Kevin Murphy of The Children's Hospital of Philadelphia adds that the personal beliefs of the open-source community and the GPL's authors have no bearing on the license's legitimacy, and he is confident that the GPL will survive SCO's attempts to invalidate it, especially with the legal counsel appointed by IBM mounting its defense. Software engineering manager Tim Dion explains that SCO's argument is that IBM's supposed violation of the terms of the Unix System V contract entitles SCO to ownership of the code, despite the fact that SCO has acknowledged IBM and Silicon Graphics as the code's copyright owners. He also notes that SCO has failed to present any evidence of infringement despite all its posturing.

  • "Getting Unplugged: How Wi-Fi Technology Is Changing the Wireless Future"
    Christian Science Monitor (01/05/04) P. 13; Lamb, Gregory M.

    Wireless fidelity (Wi-Fi) technology is changing not only how people use computers to communicate, but also how they pay to access the network: instead of having users sign exclusive contracts with different providers, new wireless research is working on ways to get devices to communicate directly with each other and bypass the centralized infrastructure. The MIT Media Lab, for instance, is developing "viral communications" that would free Wi-Fi-equipped devices from infrastructure reliance, instead allowing them to send and receive data on the backs of other network-connected devices. Such a scenario would be more resilient than traditional networks because data can travel numerous paths to reach its destination; in addition, the unpredictable routing of data would make it difficult for hackers to latch onto a signal, and encryption would keep intermediary devices from spying on the data passing through them. The Media Lab's Dr. David Reed says that, in the future, unplugged devices may even be able to draw operating power from connected, wired devices. He sees a wireless future as an inevitable conclusion driven by market forces. Unlike the paperless society, which never materialized because of the usefulness of physical documents, wireless will succeed because nobody expresses any affection for wires, Reed notes. Pyramid Research wireless analyst John Yunker sees market forces already at work in the hotel industry, which is offering free Wi-Fi access to customers as a competitive amenity. Meanwhile, Reed says cell phone companies that are not ready to adapt to the new market demands will find themselves obsolete; some cellular firms are beginning to sign roaming agreements that allow users to seamlessly switch networks and be charged on only one account.

  • "Fiber Optics Takes the Long Way Home"
    Technology Review (12/03); Hecht, Jeff

    U.S. telecommunications giants are opposing grass-roots efforts to bring fiber-optic connections into the home. Growing numbers of rural municipalities and utility companies are building end-to-end fiber-optic links for their constituents, but they are opposed in court and in legislative bodies by telecommunications companies; telecom firms argue that replacing copper-wire last-mile connections is too expensive and that they need to recover from over-investment in long-haul optical networks in the late 1990s. The real reason for their resistance, however, is an unwillingness to compete and let go of their old network infrastructure, which continues to be profitable and largely unchallenged. Rather than wait for basic broadband service, many rural communities are taking matters into their own hands: the nonprofit Fiber-to-the-Home Council counts 94 communities in 26 states that already offer direct fiber-optic connections to some of their constituents. Grant County Public Utility District in Washington State, for example, runs the Zipp Network for residential and local business customers who would otherwise be left without any wired broadband connection. But because of telecom-backed legislation in Washington, Utah, and other states, municipalities cannot directly sell Internet service to constituents and are limited to wholesale operations. The telecom industry and the FCC are fighting local fiber initiatives in the Supreme Court as well: SBC Communications and the FCC are appealing a lower-court ruling that allows Missouri municipalities to offer telecommunications services directly. Other fiber-optic installations are occurring as new housing subdivisions are built, especially in affluent areas where high-end Internet connections are an attractive amenity; installation during new construction is relatively cheap compared to retrofitting existing homes.

  • "10 Tech Trends for 2004"
    Mercury News (01/01/04); Fortt, Jon

    Personal technology will continue to morph in 2004 with cheaper laptops and TVs, more capable mobile phones, combination DVD recorders and set-top boxes, and video in iPod-type devices and on blogs. Intel's forcefulness in growing the laptop market means those systems will likely continue to outsell desktop PCs in the United States--a trend that started last summer. In addition, more LCD manufacturing capacity in Asia will translate into lower prices for laptops. Bluetooth-enabled keyboards and mice will likely be shipped with new PCs in 2004, giving a big boost to the technology. Camera phones continue to prove popular and will be more so when they provide one-megapixel resolutions. However, better imaging technology in small form factors is leading to an outbreak of digital voyeurism, with a number of cases cropping up across the U.S. in the last year. Video will also reach the iPod market, though an Apple vice president recently said the company was not interested in such devices now, partly due to Hollywood's constraints on content distribution. Archos and RCA, meanwhile, are marketing portable video devices that could take off if an easy-to-use system is devised to convert DVDs and TV shows into a readable format; that format conversion will also likely affect the set-top box market in 2004 as a number of new entrants encroach on TiVo's space. TiVo could escape competitive pressure by integrating a DVD recorder into its system so that users can move their captured TV content--conveniently stripped of commercials--onto DVDs, and then possibly onto their PC hard drives.

  • "Security: From Bad to Worse?"
    InformationWeek (12/29/03); Keizer, Gregg

    A TruSecure study issued Dec. 29 indicates that spyware and peer-to-peer file-sharing software will make 2004 just as bad as 2003, if not worse, for businesses beleaguered by cybersecurity woes. Bruce Hughes of TruSecure's ICSA Labs reports that "perimeter killer" worms that attack networks directly through software flaws and unprotected Internet ports increased 200 percent in 2003, and such worms will constitute the biggest danger to businesses in 2004; he predicts that such worms will cause at least $1 billion in damages in the coming year. Hughes also projects a rise in "zero day" attacks, in which exploits appear prior to the disclosure of a software vulnerability. "Some hacker is going to release exploit code ahead of the patch and create significant damage to those unprepared," he warns. Hughes notes that spyware may be relatively less malign than viruses, but the two have begun to overlap, so companies should be vigilant for more malevolent spyware iterations. He foresees peer-to-peer (P2P) software as an especially frustrating headache for businesses, having learned through analysis of hundreds of files shared on Kazaa that almost half contain worms, viruses, and Trojan horse programs. Hughes urges companies to limit P2P usage on their networks, audit the enforcement of such rules, and familiarize workers with the risks of P2P. He sees the collaboration between government and the private sector in catching and prosecuting virus authors as a hopeful sign.

  • "Electronics Now Boarding"
    EDN Magazine (12/25/03) Vol. 48, No. 28, P. 31; Wright, Maury

    Electronics has started to penetrate the primarily mechanical domain of the aerospace industry with new technologies that promise to make flying safer, more efficient, and less costly. Commercial and private aircraft rely on land-based navigation systems using technology dating back to the Second World War, but satellite-based systems such as the Global Positioning System (GPS) have begun to make inroads. The FAA recently certified some GPS-based systems for mission-critical applications, and the technology's benefits include more efficient use of airspace, better fuel economy, and safer landings in low-visibility conditions. Both land-based technologies and GPS are employed in the GPS Landing System, which costs less than the conventional instrument-landing system and is more runway-efficient. The Electronic Flight Bag stores maps, approach plates, logs, manuals, flight plans, and other traditionally paper documents electronically and delivers them to the flight deck, eliminating the need for pilots to burden themselves with paperwork that could hinder in-flight operations. Many aviation accidents are partly attributed to pilots' difficulty in understanding data relayed by mostly mechanical flight-deck instrumentation, so more innovative electronic avionics systems are being developed to usher in the age of the "glass cockpit." Examples include multifunction flat-panel displays that boast wide viewing angles and are readable in bright sunlight. Meanwhile, synthetic-vision systems promise to accurately depict terrain and runways outside the cockpit even if inclement weather hinders visibility.

  • "Research Net Set to Fly"
    Network World (01/05/04) Vol. 21, No. 1, P. 1; Marsan, Carolyn Duffy

    U.S. university researchers are launching a new network testbed that, proponents say, will help the United States regain international leadership in network research. The United States has not had a real-world environment where network research is the foremost priority since Arpanet in the late 1960s. The new National LambdaRail (NLR) network is privately funded by universities in a dozen U.S. cities and will use network gear and fiber from Cisco and Level 3; developments coming out of NLR should begin to affect commercial offerings in as little as 18 months, according to organizers. Since Arpanet, U.S. researchers have used the National Science Foundation's NSFnet, but competed for bandwidth with high-end scientific modeling research. Likewise, the Internet2 educational network is often congested by students' recreational file-transfer activity. NLR will reserve half its bandwidth specifically for network research such as traffic analysis and protocol testing; the network will use dense wavelength division multiplexing (DWDM) optical technology, which provides for up to 40 wavelengths, each operating at 10 Gbps. By comparison, Internet2 runs one wavelength at 10 Gbps, says NLR CEO Tom West. On top of the DWDM network, NLR will run a routed IP network and a switched Ethernet network that some say is the first wide-area use of 10 Gigabit Ethernet. That aspect promises new applications and insights for enterprise Ethernet users, and Cisco is tuning its products to support this research. Although the specific research projects for NLR have not yet been announced, they will likely span a wide range of network research, including development of new optical technology and better security applications and techniques. Chicago and Pittsburgh have already established links on NLR, and other cities are expected to come online in May.
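    The DWDM figures above imply a large aggregate capacity: wavelengths are carried in parallel on the same fiber, so total throughput is simply wavelengths times per-wavelength rate. A quick worked calculation using the article's numbers:

```python
def dwdm_capacity_gbps(wavelengths, gbps_per_wavelength=10):
    """Aggregate DWDM link capacity: parallel wavelengths on one fiber."""
    return wavelengths * gbps_per_wavelength

nlr_total = dwdm_capacity_gbps(40)   # 40 x 10 Gbps = 400 Gbps
research_share = nlr_total // 2      # half reserved for network research
internet2 = dwdm_capacity_gbps(1)    # a single 10-Gbps wavelength
```

    On those figures NLR offers 400 Gbps in aggregate, with 200 Gbps set aside for network research alone, versus Internet2's single 10-Gbps wavelength.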

  • "Can We Talk?"
    Network Magazine (12/03) Vol. 18, No. 12, P. 33; Greenfield, David

    Unified communications (UC) aims to duplicate the experience of working in a common office regardless of where one is: it promises to eliminate phone tag, cut message overload, quicken customer response time, and improve remote collaboration by integrating corporate wireless and wireline communication with unified messaging, presence intelligence, and user-supplied routing instructions. Though UC technology is at an early stage, planning a UC transition now is wise; the best way to know when such a transition is appropriate is to define the technologies involved and then map out the business case, with special attention paid to the two main Session Initiation Protocol (SIP) server setups: the SIP proxy server and the Back-to-Back User Agent (B2BUA). Companies are likely to benefit from UC in three fundamental ways: they can slash costs through the deployment of Voice over IP, increase productivity, and raise the level of customer satisfaction. BearingPoint's Manuel Barbero notes that straight return-on-investment analysis is inapplicable to UC, and recommends that people consider the functionality such technologies enable several years after the original investment. The B2BUA architecture is popular among early SIP entrants because it can embed external processing into SIP, while SIP proxies could form the foundation of converged infrastructure within a company; SIP proxies manage less information than B2BUAs, and offer more security and reliability as well. B2BUAs fit well along the edge of an enterprise's SIP network because they interrupt the media stream and enhance the SIP signaling, and are particularly well-suited to application servers for deploying Class-5 functions.
The distribution of presence information follows one of two approaches, depending on the vendor: Early entrants may deploy client-based presence without the server, while mature products depend on a centralized server to lower the network traffic stemming from presence updates.
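The proxy/B2BUA distinction above comes down to how each handles signaling: a proxy forwards a request largely untouched, merely recording itself in the routing path, while a B2BUA terminates the incoming call leg and originates a fresh one, giving it full control over (and visibility into) the dialog. A toy sketch of that difference, assuming dict-based messages (the field layout and function names are ours, not from any real SIP stack):

```python
def proxy_forward(request: dict, proxy_addr: str) -> dict:
    """SIP proxy: pass the request through, prepending itself to the Via path.

    The dialog identifier (Call-ID) is untouched -- both sides see one dialog.
    """
    forwarded = dict(request)
    forwarded["via"] = [proxy_addr] + request.get("via", [])
    return forwarded

def b2bua_relay(request: dict, b2bua_addr: str) -> dict:
    """B2BUA: answer the incoming leg and originate an independent outgoing one.

    The new leg has its own Call-ID and hides the upstream signaling history,
    which is what lets a B2BUA embed external processing into the call.
    """
    return {
        "method": request["method"],
        "to": request["to"],
        "from": b2bua_addr,        # the B2BUA itself appears as the caller
        "call_id": "new-leg-001",  # independent dialog identifier
        "via": [b2bua_addr],       # upstream Via history is not exposed
    }

invite = {"method": "INVITE", "to": "sip:bob@example.com",
          "from": "sip:alice@example.com", "call_id": "abc123", "via": []}

# Proxy: same dialog survives end to end; B2BUA: two separate dialogs.
assert proxy_forward(invite, "sip:proxy")["call_id"] == "abc123"
assert b2bua_relay(invite, "sip:b2bua")["call_id"] != "abc123"
```

The trade-off mirrors the article's point: the proxy keeps less state (good for security and reliability at the core), while the B2BUA's full control suits the network edge and application servers.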
    Click Here to View Full Article

  • "5 Commandments"
    IEEE Spectrum (12/03); Ross, Philip E.

    Many so-called technology "laws" have sprouted in the first 50 or so years of the solid-state age, when in fact they are rules of thumb. Moore's Law, which posits that the number of transistors on a chip doubles annually, is not written in stone: VLSI Research's G. Dan Hutcheson says the average doubling period wavered from 17 months to 22 months to 32 months between 1975 and 1995, while in recent years chip capacity has doubled every 22 to 24 months; Hutcheson maintains that the limit of Moore's Law will be set by economics, not technology. Rock's Law, attributed by Moore's Law author Gordon Moore to Intel investor Arthur Rock, claims that the cost of semiconductor tools doubles every four years; by that reckoning, chip fabrication plants should cost $10 billion each by now, yet VLSI estimates that individual fabs currently cost just $2 billion, which shifts the focus to product value rather than fab cost. Former PC Magazine columnist Bill Machrone is credited with Machrone's Law, which states that the PC you want to buy will always cost $5,000, a figure that by Machrone's own admission is now off by 66 percent to 80 percent. Metcalfe's Law holds that a network's value grows as the square of the number of its users, but economists Andrew McAfee and Francois-Xavier Oliveau identify a handful of factors that can actually decrease network value: contaminants (spammers and other Internet bottom-feeders), saturation (when a network already possesses most of the valuable content new members could add), cacophony (overwhelming complexity of communication among network members), clustering (fragmentation of members into groups that use only part of the network), and search costs that rise until most of the network's wealth is practically inaccessible. Finally, Wirth's Law asserts that software is slowing down faster than hardware is speeding up: users suffer from feature bloat because the rising speed of calculation is accompanied by an increase in the number of calculations required for the job. ETH Zurich's Niklaus Wirth, who credits the law to former IBM Research scientist Martin Reiser, attests that software companies' pressure to roll out new products, not user tolerance, is chiefly responsible for feature bloat.
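The rules of thumb above are easy to put side by side numerically. A toy sketch of the two growth laws (the inputs are illustrative, not figures from the article):

```python
def moore(transistors: float, months: float, doubling_months: float) -> float:
    """Transistor count after `months`, given a doubling period in months."""
    return transistors * 2 ** (months / doubling_months)

def metcalfe(users: int) -> int:
    """Metcalfe's Law: a network's value grows as the square of its users."""
    return users ** 2

# Sanity check: one doubling period exactly doubles the count.
assert moore(1000, 22, 22) == 2000

# Why the doubling period matters: over a decade (120 months), a 22-month
# period yields more than 3x the capacity of a 32-month period.
assert moore(1000, 120, 22) / moore(1000, 120, 32) > 3

# Metcalfe: doubling the user base quadruples the nominal network value --
# before the discounts (contaminants, saturation, etc.) noted above.
assert metcalfe(200) == 4 * metcalfe(100)
```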
    Click Here to View Full Article
