ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 739: Friday, January 7, 2005

  • "R&D Funding and IT Policy to Play Major Role in Bush's Second Term"
    InformationWeek (01/04/05); Greenemeier, Larry

    President Bush's second term will seek to address international challenges to U.S. technological dominance by funding research and development that spurs new tech innovations and ensures that new technology is available to most people. Commerce Department undersecretary for technology Phillip Bond says federal R&D funding has jumped 44 percent in the last four years, while the administration commits $2 billion a year to networking and IT R&D at the National Science Foundation, the National Institute of Standards and Technology, the Defense Department, and other agencies. In addition, the White House has earmarked $1 billion for nanotechnology development and is also interested in quantum encryption. Other key elements of the Bush administration's public-participation strategy include broadband and wireless communications, cybersecurity, and e-government. "We expect [the] WiMAX [wireless interoperability standard] to burst onto the scene during the [president's] second term," says Bond. Center for Digital Government chief strategy officer Paul Taylor says the president's second term should include the development of a strategy that fortifies the nation's enduring science and technology leadership through the establishment of tax incentives, funding, and an immigration policy designed to boost the number of Ph.D.s. He warns that a lack of interest in math and computer science among the current crop of American students has resulted in an "unpaid bill that's coming due this decade."
    Click Here to View Full Article

  • "Tech Firms Aim to Change Copyright Act"
    Washington Post (01/06/05) P. E1; Krim, Jonathan

    Members of the Business Software Alliance (BSA) want Congress to force ISPs to actively help copyright owners stamp out digital piracy by revising the 1998 Digital Millennium Copyright Act (DMCA) as part of a bigger initiative to resolve a diverse array of copyright and patent issues. A report expected today details the alliance's concerns without suggesting any specific amendments to the 1998 law, but Adobe CEO Bruce Chizen and BSA executive director Robert Holleyman argue that ISPs should no longer be immune from piracy liability. Uncovering the identity of file swappers can be difficult without ISPs' cooperation, and current law only requires communications firms to furnish the names of users who match numeric computer addresses if they are subpoenaed as part of a lawsuit filed against a user. "If they [online providers] don't have to, they don't want to do the work," says Chizen, who along with Holleyman insists that the alliance wants to collaborate with ISPs on the DMCA revisions. However, this has done nothing to assuage the fears of ISP officials and privacy proponents such as Verizon associate general counsel Sarah Deutsch, who claims that the alliance "wants its own shortcut, at the expense of consumer privacy and the ISPs." America Online's Nicholas Graham says such a move threatens the much-needed principle of a "safe harbor" from liability for communications providers. The BSA has also called for changes to U.S. patent law, arguing that tech innovation is being threatened by escalating litigation costs over patent rights. The alliance recommends that administrative mechanisms be set up to enable third parties to challenge patents after they are granted and restrict damage awards for intentional patent infringement.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "New Congress, Old Tech Issues"
    IT Management (01/04/05); Mark, Roy

    The regulation of IP-based networked services, the allocation of additional spectrum for commercial wireless broadband providers, and calls for anti-spyware legislation and a prohibition on peer-to-peer (P2P) technology are just some of the technology-related issues held over from the 108th Congress that the 109th Congress will face. The congressional tech agenda will once again be dominated by Republicans who favor deregulation, but the 108th Congress was marked by the majority party's failure to follow its own tech leaders' legislation. Furthermore, key tech legislation was impeded in the last session by a core group of GOP lawmakers who still believe that state, county, and local treasuries will suffer in the mad dash to deregulate. On the other hand, the tech agenda could be relegated to the sidelines by the president's commitment to Social Security reform, debates over Supreme Court nominees, and the congressional administration of the war in Iraq. Issues that the FCC and the courts are currently focused on could also complicate the agendas of IT trade groups and tech-minded legislators: Progress & Freedom Foundation fellow Jim DeLong expects that "The Grokster case will put everyone in the deep freezer [on intellectual property issues]." An ongoing FCC proceeding on IP-enabled services has thus far resulted in the commission ruling that Voice over IP (VoIP) is not subject to state regulations because it is an interstate service, while also requiring VoIP providers to abide by federal wiretap regulations. Roger Cochetti of CompTIA says his organization is primarily concentrating on tax incentives for IT workforce retraining and "protecting the network economy from needless regulations."
    Click Here to View Full Article

  • "Search Looks at the Big Picture"
    Wired News (01/06/05); Gartner, John

    Computer scientists are working on visualization technologies that would allow image searches to be performed without relying solely on text descriptions, as Google, Yahoo!, and other search engines currently do. Image search engines that can identify the components of an image will be less vulnerable to manipulation by pornographers or deceitful advertisers who tag pictures as "Britney Spears" or some other popular content. Xerox Research Center Europe and a group of European universities are jointly working on software that recognizes "key patches" in pictures, such as an ocean and beach, or a car and tires; since development began in 2002, the software has learned hundreds of objects, and would be useful in distinguishing between dissimilar pictures that share a common keyword. Improved image search would also open up new advertising opportunities, such as a function that allows online shoppers to search for a similar-looking red sweater at a lower price than one already found. Such capability would open up a vast new market as companies work to improve their image search rankings for popular products, people, or places. IBM's Pervasive Media Management group is also developing visualization technology, but for video; its Marvel system likewise relies on learned images and categorizes them into concepts, such as travel or sports. IBM is currently working with CNN and ABC to classify news coverage concepts so Marvel technology can be applied to news archives. Search companies have not yet endorsed visualization technology, but Yahoo! is trying to bolster its video search with its Media RSS format and by working with Hollywood studios on metadata tagging.
    Click Here to View Full Article
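    The "key patch" approach described above can be sketched as a nearest-prototype vote: represent an image as a set of local patch descriptors, assign each patch to the closest learned concept prototype, and label the image with every concept that collects enough votes. The minimal Python illustration below is a toy, not Xerox's actual system; the concept names, the tiny 3-D descriptors, and the voting threshold are all invented for the example.

```python
import math

# Hypothetical learned "key patch" prototypes: concept -> toy 3-D descriptor.
# Real systems learn such prototypes from thousands of labeled training patches.
PROTOTYPES = {
    "ocean": (0.1, 0.8, 0.9),
    "beach": (0.9, 0.8, 0.3),
    "car":   (0.4, 0.2, 0.2),
    "tires": (0.1, 0.1, 0.1),
}

def nearest_concept(patch):
    """Assign one patch descriptor to the closest learned prototype."""
    return min(PROTOTYPES, key=lambda c: math.dist(patch, PROTOTYPES[c]))

def label_image(patches, min_votes=2):
    """Label an image with every concept that at least `min_votes` patches match."""
    votes = {}
    for p in patches:
        c = nearest_concept(p)
        votes[c] = votes.get(c, 0) + 1
    return sorted(c for c, n in votes.items() if n >= min_votes)

# A toy "seaside" image: several ocean-like and beach-like patches.
image = [(0.1, 0.75, 0.85), (0.15, 0.8, 0.95), (0.85, 0.8, 0.35), (0.9, 0.75, 0.3)]
print(label_image(image))  # -> ['beach', 'ocean']
```

    Because labeling relies on the patch descriptors rather than on author-supplied keywords, a picture tagged "Britney Spears" that actually depicts something else would simply fail to match the expected prototypes.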

  • "As for Managing Knowledge, MUMMY Knows Best"
    IST Results (01/07/05)

    The IST-funded MUMMY project has developed a prototype system designed to facilitate mobile, personalized knowledge management through the use of always-on high-speed connections provided by wireless networks and camera-equipped handhelds. MUMMY project manager Dirk Balfanz says the system's advantages will include reduced costs, time savings, and general improvements in work quality. Successful development of the MUMMY platform hinged on defining a single vision supported by all the project partners. "The project started with a technology push from the research partners towards the end users, yet what we needed was an integrated vision that contained the research innovation on one side but which also matched the end-users' needs on the other side," Balfanz recounts. "The main obstacle to overcome has been bridging the gap between science and user needs." He notes that some of the MUMMY package's individual components could potentially be used independently. Such components include a mobile collaborative Scalable Vector Graphics tool that allows plan and context-related information to be shared, annotated, and co-developed; a hypervideo tool that enables a video stream to be annotated with other media; and a context manager that monitors a logged user's current working context and permits all captured data objects to be enhanced with metadata. Balfanz says a pair of dedicated applications have been delineated, in keeping with end users' frequent need for data tailored to their specific workflow: These applications, which will be field tested this year, focus on site inspection for hazardous material inventory and technical services.
    Click Here to View Full Article

  • "Is Your Wireless Network Secure?"
    Waynesville Daily Guide (01/05/05)

    The University of Missouri-Rolla chapter of the ACM's Special Interest Group on Security, Audit, and Control (SIGSAC) carried out an audit of the Rolla community's wireless networks, and determined that 56 percent of the 589 recorded networks were totally insecure, according to Nixa, Mo., graduate student Jason Trent. SIG Security advises that wireless network users keep wireless components away from windows and toward the center of their house in order to reduce the strength of the radio signal outside their intended coverage area, while also keeping them clear of electrical equipment that can interfere with the signal. Another recommendation is to enable Wired Equivalent Privacy (WEP) at the highest level of offered complexity; anyone that wishes to access the wireless network is authenticated through WEP, which also encrypts or conceals any traffic the network generates. Disabling the broadcast of the network's Service Set Identifier (SSID) means that a client will be required to know their access point's SSID before linking to it, notes Manhattan, Kan., computer science sophomore Laura Woodward. "With access to your wireless network, anyone with a laptop computer and a wireless card can use your Internet connection to send spam email, or even break into another computer through your Internet connection," she warns. "And if caught, you would get the blame because it would be traced back to your network and computer, not the actual culprits." SIG Security conjectures that most wireless users are unaware of their security vulnerabilities and have no intention of sharing their services, although the audit revealed several locations where sharing wireless access points with the public was intentional.
    Click Here to View Full Article

    For more information on SIGSAC, visit http://www.acm.org/sigs/sigsac/.

  • "Incompatible Tech Confuses Consumers"
    USA Today (01/07/05) P. 1B; Kessler, Michelle; Snider, Mike; Baig, Edward C.

    A lack of consensus among consumer electronics manufacturers has led to an absence of uniform standards, resulting in incompatible products. Fallout from this incompatibility includes wasted resources for manufacturers, wasted money and confusion for consumers, and the stalling of promising tech markets. The Diffusion Group's Michael Greeson reports that over 50 percent of those who purchase a digital electronic product bring it back because they cannot get it to operate, while Creative Strategies analyst Tim Bajarin says electronics makers are far less experienced than PC companies when it comes to establishing common standards. Bajarin notes that product incompatibility is reinforced by competition between manufacturers, arguing, "In this age of everything going digital, the [electronics companies] can no longer think of themselves as digital islands." Competing technologies include Ultra Wideband, with Intel- and Motorola-led groups locked in a standards stalemate for over a year; next-generation DVD, which is split between the HD-DVD and Blu-ray camps; digital rights management, whose proponents support a patchwork of frequently incompatible software; and next-generation digital video formats, with rival standards backed by Microsoft, RealNetworks, and the Internet Streaming Media Alliance. Organizations set up to ensure interoperability include the Digital Living Network Alliance, which is dedicated to making all devices on a network compatible. Meanwhile, the Coral Consortium is working to make entertainment content interoperable. "Simplicity is one of the reasons why we're so focused on standards...[and] teaming up with other players," notes Hewlett-Packard CEO Carly Fiorina.
    Click Here to View Full Article

  • "Bell Labs Grapples With VoIP, Open-Source"
    IDG News Service (01/05/05); Ribeiro, John

    Jeffrey Jaffe, Bell Labs' president of research and advanced technologies, says delivering carrier-grade voice over IP (VoIP), which in essence means bundling telephony network benefits such as operational support, network management, monitoring, performance understanding, and fault detection into the IP network, is one of Bell Labs' goals. Another Bell Labs R&D project of note is the development of methodologies for incorporating carrier-grade capability into open-source software in order to significantly lower telecommunications platform development costs. Jaffe reports that testing is a major challenge related to the use of open-source technology, explaining that "coming up with new processes and technologies, so that we can bring up open-source to be able to be at a level of something that you had planned and done yourself, is a nontrivial problem in software methodology." Jaffe details Bell Labs' nanotechnology focus, which includes nanobatteries that can extend cell phone life cycles, and silicon antennas that can deliver more wireless communications capability at reduced cost and with less intrusion. He also notes that microelectromechanical systems (MEMS) microphones allow multiple microphones to be installed in cell phones, so that voice quality and directionality can be improved. Jaffe cites the use of nanotech for optical routing via the construction of an 8-in. wafer that boasts 1 million individually movable MEMS mirrors, while "nanograss" can improve the heat management and packaging of telecommunications systems. Jaffe sees a number of trends and technologies--nanotech, convergence, and so on--as key enablers for driving down costs and boosting teledensity. He states that sustaining the multidisciplinary environment in which Bell Labs R&D has thrived is a key factor in its expansion, claiming that "We only want to come to countries where we have a sufficient commitment to the country that will build a sufficient critical mass to have...interdisciplinary research."
    Click Here to View Full Article

  • "Put Science at Center of Decision-Making on Third World Development, Experts Tell UN"
    EurekAlert (01/06/05)

    A task force for the U.N. Millennium Project says science and technology are key to bolstering third-world development, and must have prominent roles in development policy, according to a recent report to the United Nations. The Millennium Project aims to achieve the Millennium Development Goals by 2015, but the Task Force on Science, Technology, and Innovation reported that economists have dominated development policy for too long. In today's global economy, science and technology are also fundamental to overall prosperity. The report suggested scientific advisory bodies be appointed at the national leadership level, and that poor countries establish research institutions; although critics of this approach point to immediate health care needs, the task force said there were many examples of success, such as in Malaysia, where the Academy of Sciences Malaysia advises the government and has helped transform that country from a raw materials producer to a diversified economy since the 1980s. Nigeria and South Africa are emulating Malaysia's example and have established similar organizations, while engineers in Uruguay developed low-cost, portable water treatment systems that have helped reduce waterborne disease during peacekeeping missions in Africa and in disaster areas. The African Virtual University utilizes distance learning technologies to train more than 23,000 people in professional careers, over 40 percent of them women; similar programs could be established on a national level, possibly helping local universities with Internet-based learning capabilities. Even with the establishment of new research and learning institutions, links to industry are needed to quickly benefit the local economy. In Taiwan, the government has created the Industrial Technology Research Institute and more than 30 industry consortia to quickly transfer technologies to the local electronics industry.
    Click Here to View Full Article

  • "Voicemail Software Recognizes Callers' Emotions"
    New Scientist (01/08/05) No. 2481, P. 21; Biever, Celeste

    MIT Media Lab researchers Zeynep Inanoglu and Ron Caneel have designed Emotive Alert software that can determine callers' emotional states from their tone of voice and mark the urgency of the call for recipients using text messages and emoticons. The software extracts the distribution of volume, pitch, and speech rate, and matches them against eight stored "acoustical fingerprints" approximating urgent, not urgent, formal, informal, happy, sad, excited, and calm emotional states. The software sends the recipient the emoticon corresponding to the fingerprint the voice message most closely matches, as well as a text message indicating the two closest-matching emotional labels. Emotive Alert learned to distinguish between these fingerprints by being fed hundreds of excerpts from old voice mail messages that researchers had tagged with emotions. Tests with real-life messages showed a high degree of accuracy in distinguishing excited from calm and happy from sad, but less discrimination between formal and informal, and between urgent and non-urgent; Inanoglu attributes this discrepancy to Emotive Alert's inability to analyze the words themselves or personal subtleties in volume and speech speed. Integrating Emotive Alert with a speech-recognition system that links patterns of words to specific emotions would cost the software its ability to label messages in any language, so Inanoglu instead plans to create a system that can construct a personalized acoustical fingerprint for the most frequent callers. Andrew Monk of Britain's University of York is concerned that Emotive Alert could be exploited by spammers.
    Click Here to View Full Article
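    The matching step described above can be sketched as a nearest-neighbor lookup against stored feature vectors. The Python sketch below is a toy illustration, not the MIT code: the (volume, pitch, speech-rate) triples, the emoticon strings, and the five-label subset are all invented for the example.

```python
import math

# Hypothetical stored "acoustical fingerprints": (mean volume, mean pitch,
# speech rate) in [0, 1], each paired with an emoticon. The real fingerprints
# were learned from hundreds of hand-labeled voice mail excerpts.
FINGERPRINTS = {
    "urgent":  ((0.9, 0.8, 0.9), ":-!"),
    "calm":    ((0.3, 0.3, 0.3), ":-|"),
    "happy":   ((0.7, 0.7, 0.6), ":-)"),
    "sad":     ((0.2, 0.2, 0.2), ":-("),
    "excited": ((0.8, 0.9, 0.8), ":-D"),
}

def classify_message(features):
    """Return the best-matching emoticon plus the two closest emotional labels."""
    ranked = sorted(FINGERPRINTS,
                    key=lambda lbl: math.dist(features, FINGERPRINTS[lbl][0]))
    best, runner_up = ranked[0], ranked[1]
    return FINGERPRINTS[best][1], (best, runner_up)

# A loud, fast, fairly high-pitched message.
emoticon, labels = classify_message((0.88, 0.82, 0.88))
print(emoticon, labels)  # -> :-! ('urgent', 'excited')
```

    A quiet, slow message would instead land nearest the "sad" or "calm" fingerprints, which mirrors the article's observation that the system separates excited from calm reliably while word-dependent distinctions remain out of reach.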

  • "The TCO of Open Source"
    CIO Today (01/05/05); Shaw, Russell

    Yankee Group analyst Dana Gardner contends that open-source enterprise deployments yield little to no savings over proprietary deployments when one factors in operational costs over time. This problem stems from open-source applications' fragmentary design approach, whereas proprietary platforms offer an integrated software solution. IBM, Sun Microsystems, Microsoft, and other companies balance out piecemeal applications design with products developed with systems and software in mind. "Therefore, they can claim their total operational cost for their products could actually be less [than open source]," says Gardner. He expects current open-source applications development to ultimately transition from a fractional strategy to a more integrated approach in order to make software that can accommodate mission-critical enterprise applications. IDC's Dan Kusnetzky maintains that other factors besides low or no cost may be of greater consideration when debating an open-source enterprise implementation, such as little or no license or asset management requirements, freely available source code, and lack of ownership by any one company. "Those familiar with [cost of ownership] studies...know that hardware and software, when taken together, typically make up less than 30 percent of a five-year cost structure," he reports, noting that staff-related costs often account for 50 percent to 70 percent of the cost structure. Gartner research director Nikos Drakos says corporations' successful planning and deployment of an open-source platform hinges on their possession of proficient and committed internal staff, flexibility, and an interoperable IT architecture arrangement.
    Click Here to View Full Article

  • "Ready, Aim, ID Check: In Wrong Hands, Gun Won't Fire"
    New York Times (01/06/05) P. E6; Eisenberg, Anne

    New Jersey Institute of Technology researchers are developing a gun that will not fire if its built-in circuitry and software fail to recognize the shooter's grip. "We can build a brain inside the gun," says electrical engineering professor Timothy Chang, who designed the grip-recognition system's hardware. The system employs sensors in the weapon's handle to measure the amount of pressure exerted by the hand as it squeezes the trigger, and then software designed by professor Michael Recce checks the grip against a series of stored patterns of authorized shooters to find a match. Recce estimates that pulling a trigger takes one-tenth of a second, which is enough time for a computer to match the patterns and process the authorization. The first tests were done under simulation, but more recent tests involved live ammunition and actual semiautomatic handguns; the circuitry and pattern-recognition software currently reside in a laptop that the pressure sensor-outfitted weapon is wired to, but Chang plans to embed the digital signal processing elements into the gun's magazine over the next several months. The institute's Donald H. Sebastian says the system boasts a recognition rate of 90 percent, which should increase with the installation of additional sensors. The gun could one day be equipped with Global Positioning System receivers, accelerometers, and other components that could record the time and direction of gunfire and be used to re-create events in crime investigations. The technology is especially promising as a deterrent against accidental deaths resulting from unsecured guns falling into the hands of children or other unauthorized people.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
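    The grip check described above amounts to comparing a fresh vector of pressure-sensor readings against stored patterns for authorized shooters and accepting only a sufficiently close match. Here is a minimal Python sketch under stated assumptions: the sensor values, the owner IDs, and the distance threshold are all made up, and the NJIT system's actual pattern-recognition method is not detailed in the article.

```python
import math

# Hypothetical stored grip patterns for authorized shooters: readings from
# four pressure sensors in the handle, normalized to [0, 1]. In the NJIT
# prototype the pattern is captured during the ~0.1 s trigger pull.
AUTHORIZED_GRIPS = {
    "officer_a": (0.82, 0.64, 0.91, 0.40),
    "officer_b": (0.55, 0.77, 0.48, 0.83),
}

THRESHOLD = 0.25  # maximum distance still accepted as a match (tunable)

def authorize(grip):
    """Return the matching owner's ID, or None if no stored pattern is close enough."""
    best = min(AUTHORIZED_GRIPS,
               key=lambda who: math.dist(grip, AUTHORIZED_GRIPS[who]))
    if math.dist(grip, AUTHORIZED_GRIPS[best]) <= THRESHOLD:
        return best
    return None

print(authorize((0.80, 0.66, 0.90, 0.42)))  # close to officer_a's grip
print(authorize((0.10, 0.10, 0.10, 0.10)))  # unrecognized grip -> None
```

    Adding sensors lengthens the feature vector, which tends to separate different hands more cleanly and is consistent with Sebastian's expectation that more sensors will push the recognition rate above 90 percent.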

  • "Preparing for a Doomsday Attack"
    CNet (01/03/05); Cooper, Charles

    In an interview with CNet, VeriSign CEO Stratton Sclavos says the major issues related to Internet registration this year will be its legal dispute with ICANN over the extent of its governing reach and the .net rebid, which VeriSign believes it will win. He says he is uncertain whether the U.S. government should become more involved in the oversight of ICANN, but he does believe government officials will need to monitor the situation, considering that a substantial portion of the economy runs on the Internet, that .net addresses handle a great deal of traffic, and that officials would want to keep any registry risk from spreading to the broader economy. In this particular case, the government should make sure that ICANN embarks on a fair and transparent process, and chooses a provider able to handle the scale of the infrastructure and intellectual property involved in processing over 14 billion queries per day. "I think having the [government] have some oversight over that process and some dialogue with ICANN to make sure the process is followed is probably about the most they are willing to do," says Sclavos. He adds that cybersecurity will take much more than appointing a government cybersecurity czar, and he sees adoption of IPv6 technology as having key implications for securing networks. "I think IPv6 gives you a footprint for figuring out how to track every point on the network and thereby develop the tools to be much more secure," says Sclavos.
    Click Here to View Full Article

  • "PC, Net Marriage to Drive Standards Push"
    Electronic Engineering Times--Asia (01/03/05); Kou, Karen

    The Asian consumer electronics market is focusing heavily on the integration of networking and computing technologies, with demand for such integrated capabilities driving new standards initiatives as well as greater bandwidth and processing performance. A good example is the Chinese PDA market, where leading manufacturers have already built wireless and multimedia functions into their new models. Local Chinese electronics vendors are joining together to create domestically developed intellectual property and standards independent of larger international competitors; the Intelligent Group and Resourcing Sharing effort is developing interoperability protocols among Chinese IT integrators, device manufacturers, and telecom firms. On the hardware side, multicore architectures are providing mobile, telecom, and desktop devices with the necessary performance boost. Some of the most radical change is occurring in the Asian telecom market, with traditional voice and lease line service providers under serious threat from new services. "We're seeing the term 'triple play' becoming more significant--carrying data, voice, and video over one media," says Agilent Technologies' Frank Cappellari; IP networks are being used for some of these triple-play deployments, but also offer maximum network uptime and reduced costs. IDT's Phil Bourekas says new technologies are spurring the sustained development of infrastructure, deployment of new applications, and growth in the user base. To meet the needs of the fast-changing marketplace, chip firms are partnering more closely with customers or other silicon providers with a focus on providing greater processing power for value-added services.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Single Government ID Moves Closer to Reality"
    Washington Post (12/30/04) P. A25; Lee, Christopher

    The Personal Identity Verification Project, managed by the National Institute of Standards and Technology, plans to provide every federal employee and contractor with an identification card as a means of preventing terrorists, criminals, and other unauthorized people from entering government buildings and computer systems. The project stems from a presidential directive requiring secure identification among federal workers and contractors. The identification cards will use advanced smart card technology, more resistant to counterfeiting and tampering than current cards, to store biometric data, allow encryption of computer data, and enable users to communicate securely with different systems. Nearly 2 million people will be issued the cards after a background check, with sensitive offices and systems requiring an even more rigorous check; employees will start using the cards as early as fall 2005. The project is expected to increase convenience for federal workers, because a single card will allow access to several different buildings where multiple cards are currently required. However, concerns about the cards include the potential for privacy invasion, since employees' rank and pay grade will be printed on the cards, and the ability of federal agencies to monitor the movements of particular employees through federal buildings. A public meeting concerning the identification cards is scheduled for January 19; registration to attend the meeting is required. NIST has already spent over $1 million on the project; new standards are expected to be finished next month. Experts say private businesses could follow the government's lead in requiring more secure ID card standards.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "When Technology Became Cool Again"
    Salon.com (12/27/04); Leonard, Andrew; Manjoo, Farhad; Mieszkowski, Katharine

    Technology enjoyed a resurgence of sorts in 2004 with the emergence of political blogs, wireless networking, and open-source software, while outsourcing, peer-to-peer (P2P) file trading, and other trends served as a reminder of tech's dark side. As a powerful tool for fundraising and mobilizing support, political blogging proved to be a double-edged sword as major practitioners such as Howard Dean and John Kerry failed to achieve victory. Meanwhile, e-voting's myriad security problems and lack of accountability spurred widespread doubts about the outcome of the most recent elections, doubts that threaten to undermine confidence in democracy; ideally, both Democrats and Republicans will pressure federal, state, and local governments to implement paper trails and other verification measures. Experts dismissed music industry lawsuits against digital pirates as ultimately ineffective, while P2P swappers began downloading bulkier files; evidence that the legal market for digital content grew significantly did nothing to dissuade the industry from pushing for even more restrictive anti-infringement legislation in the form of the Induce Act. Open-source software began to make headway against proprietary software with the 2004 release of Mozilla's Firefox 1.0 browser, which put a dent in Microsoft's browser market share. Wireless networking grew to critical mass thanks to its ability to ease Internet access. On the other hand, offshore outsourcing is chipping away at the U.S. IT workforce, with no solution in sight for supporting American workers who lose their jobs to overseas professionals. Google's aggressive rollout of services such as Gmail, desktop search, and digital libraries has cemented its position as a major force on the Internet, although the massive amount of user data Google is privy to could have sinister connotations if less trustworthy people were in charge.
    Click Here to View Full Article

  • "Beating the Cyber Threat"
    Chief Executive (12/04) No. 204, P. 27; Khosla, Pradeep K.

    Even as the computing and communications infrastructure becomes more important in people's lives, it faces an increasing threat from malicious software and hackers that have already cost industry billions of dollars and could damage critical infrastructure such as the power grid or air traffic control systems, writes Carnegie Mellon University engineering dean Pradeep Khosla. Whereas computing was probably less secure in the 1980s in terms of software quality, the threat then was limited by low connectivity, technology-savvy users, and the skill required to compromise systems. Today, an outdated security approach requires users to secure their own systems through patches, demanding inordinate effort and leaving many opportunities for attackers; moreover, infected systems cannot continue operating through attacks. A combined government, industry, and academic research agenda needs to be defined in order to remedy the computing and communications infrastructure, argues Khosla, who is founder of the Carnegie Mellon CyLab. Drastic changes are required, including self-healing systems and network awareness of what is being sent, so that data packets can still find their way to receiving routers even when a system is knocked out. Businesses need sophisticated risk analysis to understand the payback on proactive, upfront security compared with reactive security measures. Law enforcement should have improved tools that allow officials to identify the perpetrators of attacks through biometric identification required at log-on, and software measurement technology is needed to reduce the number of vulnerabilities. Finally, public cyberawareness must be increased through cooperation from industry, government, community organizations, and the K-12 education system, Khosla explains.
    Click Here to View Full Article

  • "Tech 2005: What's New and What's Next"
    PC World (12/04) Vol. 22, No. 12, P. 74; Desmond, Michael

    Technology forecasts for the next two years include a restructuring of the PC model with the introduction of dual-core AMD and Intel processors in most new PCs, which Insight 64 analyst Nathan Brookwood says will be paired with multiprocessor-aware OSes to yield significant performance gains; first-generation dual-core processors should run at lower clock rates than single-core processors in order to reduce heat, cost, and wear. Brookwood expects 64-bit processors to be featured in two-thirds of all PCs in 2006, while memory on high-end PCs should reach at least 4 GB. The replacement of the PCI and AGP graphics interfaces with PCI Express, which Brookwood thinks will eventually reach a 1 Gbps data transfer rate, could ultimately rework the PC into a modular configuration. Digital photography is expected to be revolutionized by a wealth of new products, including GPS-equipped cameras, smarter desktop software that indexes images by identifying objects and applying metadata tags, and automated image-enhancement features such as red-eye removal. Predicted trends for cell phones include extended battery life, greater power, improved screens, additional features, the incorporation of minidrives, support for both digital cellular networks and local-area Wi-Fi, and reliable in-flight connectivity. Cybersecurity threats will likely expand, according to Johannes Ullrich of the SANS Institute, who sees the recent emergence of Pocket PC and cell phone viruses as a taste of things to come; though chipmakers and OS writers are adding protections against the heavily exploited buffer overrun, Ullrich is concerned that such fixes will be unavailable for network routers and mobile devices for some time. Meanwhile, enthusiasm for home automation technologies is expected to be rekindled by the emergence of such products and services as the Insteon protocol and Zensys' Z-Wave.
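    The buffer overrun Ullrich cites is simple to illustrate. A minimal C sketch of the bug class alongside a bounds-checked alternative (function names are illustrative, not drawn from any cited product):

```c
#include <string.h>

/* Classic overrun: strcpy keeps writing past dst's end when src is longer
 * than the destination buffer, corrupting adjacent memory -- the flaw class
 * attackers exploit to hijack a program's control flow. */
void unsafe_copy(char *dst, const char *src) {
    strcpy(dst, src);                  /* no bounds check at all */
}

/* Bounds-checked alternative: writes at most dst_size bytes and always
 * leaves dst NUL-terminated, truncating over-long input instead of
 * overflowing. */
void safe_copy(char *dst, size_t dst_size, const char *src) {
    if (dst_size == 0) return;
    strncpy(dst, src, dst_size - 1);
    dst[dst_size - 1] = '\0';
}
```

    With an 8-byte buffer, safe_copy truncates a longer input to its first seven characters plus a terminator, where unsafe_copy would silently overwrite neighboring memory; hardware mitigations from chipmakers attack the same flaw class from the processor side.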
    Click Here to View Full Article

  • "Wrestling XML Down to Size: Reducing the Burden on Networks and Servers"
    Business Communications Review (12/04) Vol. 34, No. 12, P. 35; Kobielus, James

    XML's appetite for bandwidth and processing resources throughout the Web services environment will be remedied by more compact XML encoding and accelerated XML processing, which also promise to lower transmission latencies and help networks support a growing volume of Web services traffic while keeping quality of service at satisfactory levels. Replacing bandwidth-hungry ASCII text with more compact binary encoding and serialization can shrink XML payloads, and the World Wide Web Consortium has developed a pair of Candidate Recommendations for binary XML encoding within Simple Object Access Protocol (SOAP) 1.2 payloads. SOAP Message Transmission Optimization Mechanism (MTOM) and XML-binary Optimized Packaging (XOP) together outline the production of optimized binary encodings of XML content in SOAP 1.2 payloads while retaining XML documents' logically transparent data structure. Despite the current absence of commercial MTOM/XOP implementations, deployments by numerous vendors are expected over the next few years, so that products will support binary encodings throughout their XML processing elements. Though MTOM/XOP-based encodings will speed up XML wire transmission, they will consume processing cycles on application servers, validating the need for wire-speed XML processing. The past several years have witnessed the introduction of commercially available hardware/software appliances, or "accelerators," that quicken XML overhead processing by offloading some of the CPU-heavy chores normally handled by application servers, managing the compression and transformation of XML content, and routing the content to the proper applications. The next several years are expected to see vendors roll out blades and appliances to fulfill wire-speed SOAP processing needs, while SOAP hardware accelerators become de rigueur across network, middleware, and application platforms.
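    The bandwidth penalty MTOM/XOP removes is easy to quantify: base64, the usual way to embed binary data in textual XML, maps every 3 octets to 4 ASCII characters. A small C sketch of that arithmetic (the 3 MB attachment size below is a hypothetical example, not a figure from the article):

```c
#include <stddef.h>

/* Base64 encodes each 3-octet group as 4 ASCII characters (padding the final
 * group), so embedding binary content as text in an XML payload inflates it
 * by roughly one third.  MTOM/XOP instead ships the raw octets in a separate
 * MIME part and leaves only a small xop:Include reference in the XML infoset. */
size_t base64_encoded_len(size_t octets) {
    return 4 * ((octets + 2) / 3);   /* round up to a whole 4-character group */
}
```

    For a hypothetical 3 MB attachment this comes to 4,194,304 bytes of base64 text versus 3,145,728 raw octets in a MIME part: a saving of roughly a third before any further gains from binary encoding of the XML markup itself.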
    Click Here to View Full Article
