ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 691:  Wednesday, September 8, 2004

  • "Right to Vote vs. Right to Secrecy"
    Kansas City Star (09/07/04); Goldstein, David

    Experts such as former ACM President and National Committee for Voting Integrity member Barbara Simons warn that a new email voting system the state of Missouri plans to implement for soldiers stationed overseas is vulnerable to tampering because the ballots are not encrypted. She says, "We could have a virus that changes ballots and then erases itself. It's certainly feasible technically." Political experts and voting-rights proponents add that soldiers would be deprived of their right to a secret ballot because the system requires ballot confirmation via the release of voters' names. "It violates everything that we have regarded as essential to the functioning of democracy--the secrecy of the ballot," laments the ACLU's Laughlin McDonald. Lt. Col. Ellen Krenke of the Pentagon's Federal Voting Assistance Program says the system should be regarded as a last resort after absentee voting via regular mail and fax. Missouri Secretary of State Matt Blunt announced the new email voting system in response to complaints from soldiers who were disenfranchised in last month's primary because of delays and other problems with mail-based absentee voting. Fanning the flames of controversy is the fact that Missouri elections are often close; the state is also playing a critical role in this year's race for the White House, while Blunt, a Republican, is competing for the Missouri governorship against Democrat Claire McCaskill. Blunt representative Spence Jackson insists that "enough safeguards" would be deployed to make the email balloting system a practical alternative for soldiers who never receive absentee ballots or who lack access to fax machines. Simons, however, says the system is an even more troublesome scheme than the Secure Electronic Registration and Voting Experiment for overseas voting that the Pentagon terminated because of security criticisms.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    For more on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "XML: Too Much of a Good Thing?"
    CNet (09/07/04); Becker, David

    In the six years since the main XML specification was created, hundreds of derivative schemas and dialects have emerged to serve interests ranging from poultry farming to cave exploration. Although some worry that the proliferation of XML formats could lead to compatibility problems, XML co-inventor Tim Bray says the phenomenon is evidence of XML's success, explaining that the original idea was to create a language from which developers could easily craft other languages. XML book author John Simpson, who has created his own schema for cataloging "B" movies, says XML is more accurately described as a grammar than as a language, and that any XML-based system can be easily tweaked to understand other XML dialects; "The fact there are different standards is immaterial...it's almost trivial to get it from one dialect into another," he says. XML has become the foundation for a growing number of major IT projects, including the Indigo communications subsystem Microsoft plans to build into the Longhorn operating system and the services-oriented architecture frameworks touted by systems integrators. XML's ability to describe complex data easily over the Internet is a boon to numerous industries: farmers, food processors, and grocers, for example, use the Meat and Poultry XML format to share information about meat quality, cut, and expiration. Industry coordinator Blake Ashby says speedier data sharing means less time spent manually checking information and stock-keeping, and fresher product. Still, many groups struggle with competing XML standards, which analyst Ron Schmelzer warns could confuse and burden small businesses in particular. Eventually, these competing standards will be consolidated by market forces, at which point the basic XML framework will make integration easy.
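    Simpson's point about dialect-to-dialect conversion can be sketched in a few lines of Python using the standard library's XML module. The two film-catalog dialects below are invented for illustration; the point is only that mapping one set of tags onto another is mechanical once both schemas are known:

```python
import xml.etree.ElementTree as ET

# A toy record in one hypothetical film-catalog dialect.
SOURCE = "<bmovie><name>Plan 9 From Outer Space</name><year>1959</year></bmovie>"

def to_other_dialect(doc: str) -> str:
    """Map one hypothetical dialect's tags onto another's."""
    src = ET.fromstring(doc)
    # The target dialect carries the year as an attribute and renames <name>.
    dst = ET.Element("film", attrib={"released": src.findtext("year")})
    title = ET.SubElement(dst, "title")
    title.text = src.findtext("name")
    return ET.tostring(dst, encoding="unicode")

print(to_other_dialect(SOURCE))
# <film released="1959"><title>Plan 9 From Outer Space</title></film>
```

    Real-world schema translation is usually done declaratively with XSLT stylesheets rather than hand-written code, but the underlying task is the same tag-for-tag mapping shown here.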
    Click Here to View Full Article

  • "Online Support to Advance Design for All"
    IST Results (09/07/04)

    IST's three-year Design for All (DfA) project promotes universally accessible products and services to ensure that Europe's disabled and elderly population is not left out of the information society. DfA was organized to aid the European Design for All e-Accessibility Network (EDeAN), and the project's D4ALLnet online platform enables EDeAN's various special interest groups to work over the Internet to produce benchmarking, standardization, assessment, and legislation standards for regulators, designers, and engineers. "DfA is increasingly motivated by the globalization of the market, the aging of the population, and the progressively increasing number of people experiencing permanent or temporary loss of functional ability," notes D4ALLnet coordinator Constantine Stephanidis. "DfA advocates a proactive approach towards products and environments that can be accessed and used by the broadest possible end-user population, without the need for additional adaptations or specialized re-design." D4ALLnet is comprised of the HERMES virtual networking infrastructure and the ARIADNE online resource center, which together form a European hub for the distribution of DfA knowledge and best practices, Stephanidis says. He points out that the deployment of DfA practices in Europe is experiencing "significant progress," but additional research and development, along with efforts to boost awareness and build an EU-level legislative infrastructure, are necessary.
    Click Here to View Full Article

  • "Automatic Icons Organize Files"
    Technology Research News (09/15/04); Patch, Kimberly

    A joint project between MIT, the University of Southern California, and ESC Entertainment has yielded VisualID, a system that automatically produces icons for specific computer files and tweaks them to generate icons representing related files. The degree of modification is dictated by how similar file names are. VisualID is designed to make file organization and navigation in cyberspace more efficient and less mentally taxing because psychological research demonstrates that people can more easily register and recall pictures than text, according to USC researcher J.P. Lewis. He says the principle is similar to people's tendency to search for books based on their appearance rather than their titles: "As...memory fades, the appearance of a book often stays with us longer," Lewis notes. Sticker versions of the VisualID icons can also be affixed to physical objects to distinguish between tools, for example. Experiments by the project participants show that people are more likely to recognize files visually when presented with detailed icons, while Lewis points out that making the icons as distinctive as possible was the most formidable technical challenge the researchers faced. The next challenge is devising a method to visually assign both unique file types and identities to given titles, and Lewis says the researchers are also probing how VisualID could be applied to cognitively impaired users. VisualID was detailed by the researchers at last month's Association for Computing Machinery SIGGRAPH 2004 conference, and Lewis expects the system to take approximately five years to fully develop.
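    The core idea of deriving an icon deterministically from a file name, and making similarly named files yield similar icons, can be illustrated with a small sketch. This is an invented stand-in for the published VisualID algorithm (which uses far richer procedural graphics), not a description of it:

```python
import hashlib
import random

def icon(filename: str, size: int = 8) -> list[str]:
    """Derive a small black-and-white glyph from a file name.

    A base pattern is seeded from the first half of the name, then a few
    cells are flipped based on the full name, so files with similar names
    get visibly similar icons. Illustration only, not the real VisualID.
    """
    stem = filename[: len(filename) // 2]
    base = random.Random(hashlib.md5(stem.encode()).hexdigest())
    grid = [[base.random() < 0.5 for _ in range(size)] for _ in range(size)]
    mut = random.Random(hashlib.md5(filename.encode()).hexdigest())
    for _ in range(3):  # a handful of single-cell mutations per file
        r, c = mut.randrange(size), mut.randrange(size)
        grid[r][c] = not grid[r][c]
    return ["".join("#" if cell else "." for cell in row) for row in grid]

# Two similarly named files share a base pattern and differ in a few cells.
for row in icon("report_2004_final.txt"):
    print(row)
```

    Because the base pattern depends only on the name's shared prefix, "report_2004_final.txt" and "report_2004_draft.txt" produce icons that differ in at most a few cells, mirroring the paper's principle that the amount of visual mutation tracks file-name similarity.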
    Click Here to View Full Article

  • "EU Boost to Open Source Software"
    Techworld (09/03/04); Broersma, Matthew

    Sept. 10 marks the official launch of Coordination Action for Libre Software (CALIBRE), a project funded by the European Union that aims to improve the deployment of open-source software development projects and allow open source to penetrate the mainstream more deeply, perhaps enabling Europe to overtake the U.S.-led software industry. University of Limerick professor Brian Fitzgerald says there is little formal organization or understanding of the open-source software development model, and it is CALIBRE's belief that painstaking analysis, software engineering and economic studies, and massive data accumulation by researchers is necessary if open source is to become an industry priority. "As CALIBRE represents the leading authorities on open source in Europe, or indeed worldwide, we are in a unique position to transfer these lessons to European industry," Fitzgerald says. Other open-source challenges CALIBRE aims to meet include developing techniques and strategies to ensure code quality, providing developer talent, devising business approaches to open source, and researching socio-cultural obstacles to open-source initiatives. Fitzgerald and others see software patents constituting a potentially serious threat to open-source development, and CALIBRE intends to address this issue as well. The project will also study the related trends of distributed software development, which includes outsourcing, globalization, and similar factors, and unconventional, "agile" development. Ireland's University of Limerick and University College Cork are the CALIBRE project's leaders, with the additional participation of the National Microelectronics Application Center and a dozen mostly European academic and industrial research teams.
    Click Here to View Full Article

  • "Purdue, Olympus Corp. Creating Technology for Sensor Networks"
    Purdue University News (09/07/04); Venere, Emil

    Enhancing security and helping elderly and impaired people is the goal underlying a three-year joint venture between Purdue University and Olympus to develop technologies for environmental monitoring via wirelessly networked sensors and ubiquitous cameras. The sensors will be enabled for independent operation as well as communication with each other using software developed by Purdue researchers. "The sensors will talk to one another through wireless networks, requesting information that they need to make sense of the local scene and sending information to other sensors as they request it," explains Avinash Kak, director of Purdue's Robot Vision Laboratory. One possible scenario is the mapping of an environment in three dimensions using stereoscopic and perhaps laser sensors, which could be augmented with information from additional sensors worn by people relating their visual point of view, location, direction, body temperature, etc. Such a scheme could be used to prevent elderly people with poor eyesight from colliding with objects or each other, determine medical emergencies, or monitor and protect children in day care or classrooms. Both Olympus and the Purdue lab will set up identical facilities in their home countries to facilitate overseas communication and collaboration between researchers. Kak describes the Purdue-Olympus project as unique because of its unprecedented integration of imaging and vision sensors, and he notes that Olympus is an ideal partner because its technology in both these product categories is outstanding. He says the ubiquity of the sensors will no doubt raise the specter of invasion of privacy, but insists that ultimately each individual will control how much personal data is broadcast to a sensor network.
    Click Here to View Full Article

  • "E.U.-Backed Group Researches Digital Home"
    IDG News Service (09/03/04); Blau, John

    The Amigo research project sponsored by the European Union plans to make devices from multiple vendors interoperable for home networking through the development of open-source middleware. "We aim to use as many of the existing standards and specifications as possible," noted Koninklijke Philips Electronics researcher Harmke de Groot at last week's e/home conference. The middleware, along with the architectural rules and documentation, would be openly distributed, thus ensuring compatibility among consumer electronics, computers, mobile communications, home automation, and security devices. The Amigo project involves the participation of 15 companies, including France Telecom, Telefonica, the University of Stuttgart's Institute for Natural Language Processing, and Microsoft's German subsidiary. Another goal of Amigo is to create "open and intelligent services that go beyond what many manufacturers envision today," explained De Groot, who cited ambient intelligence as an example. De Groot's Amigo Web page describes ambient intelligence as encompassing four core elements: ubiquity (an environment composed of invisible, multiple interconnected embedded systems); awareness (the system's ability to pinpoint and identify objects and people along with their objectives); intelligence (a digital environment's ability to analyze context, adjust itself to its denizens, and learn from their behavior); and natural interaction (the enablement of more human-like communication in a digital environment through sophisticated technologies).
    Click Here to View Full Article

  • "Computer Scientists at UH Developing 'Nurturing' Computers"
    University of Houston News (09/07/04); Merkl, Lisa

    The National Science Foundation's Division of Information and Intelligent Systems has awarded University of Houston computer science professor Ioannis Pavlidis a $640,169 research grant based on his breakthrough work with the Automatic Thermal Monitoring System, which non-tactilely monitors the physiological state of its user with a thermal imaging camera. Thermal imaging of the user's face enables the system to take bioheat readings from which almost all vital signs can be obtained, so that the user's health can be observed continuously. "An increased anxiety level, for instance, is indicated when we detect periorbital warming through thermal imaging," notes Pavlidis. A user's health could be monitored by a computer from up to several feet away with this method. The NSF grant will fund a three-year initiative to embed physiologic monitoring in human-computer interaction by tracking a person's health during computer use, with Pavlidis conducting human trials in collaboration with Columbia University's Medical University Lab and the Rochester, Minn., Mayo Clinic's Physiology Lab. The obliviousness of today's systems to their users' actual health is prompting researchers to propose techniques computers can use to determine and respond to users' physical and emotional conditions, thus facilitating a two-way interaction in which both the user and the computer are aware of each other and can respond accordingly. A goal of Pavlidis' project is to augment the user's experience and construct a new preventative medicine archetype by combining home- and office-based computing resources with new sensing, algorithmic, and interface techniques.
    Click Here to View Full Article

  • "So Your Roomba Vacuums...Does It Also Take Pictures?"
    Wall Street Journal (09/07/04) P. A1; Forelle, Charles

    A growing community of electronics and computer enthusiasts are earning the moniker "hardware hackers" because they like to tinker with existing commercial technologies, expanding or completely changing their functionality. Advocates claim their resurging interest in modifying gadgetry is being driven by a deluge of inexpensive, advanced consumer electronics. San Diego hardware designer Joe Grand remarks that tinkering has largely been the province of software hackers, but tweaking hardware is now gaining "coolness." Hardware hackers often practice their hobby out of a desire to come up with improvised solutions to problems: During his tenure at @stake, Grand devised a way to deter colleagues from raiding the office fridge by installing the fridge into the cabinet of an unused minicomputer and equipping the door with a customized circuit board that read employees' ID badges and only unlocked for authorized eaters. Meanwhile, New Zealand electronics teacher Stan Swan built a Wi-Fi connection range booster using a standard Wi-Fi receiver, computer cable, a garden hose coupler, and an inexpensive, parabolic utensil for fishing things out of woks. The inventor says the instrument can typically detect Wi-Fi hotspots three miles away with an unimpeded line of sight, whereas the usual maximum range of consumer Wi-Fi hardware is a mere few hundred yards. Tinkerers are sometimes attracted to the practice as an opportunity to turn a profit: For instance, the market rollout of Creative Labs' MuVo2 portable music player earlier this year was followed by the debut on eBay of the MuVo2's extracted 4 GB disk drives priced at about $250. Hardware hackers sometimes risk the ire of consumer hardware manufacturers, who may perceive such activities as a threat to profitability.

  • "UW Computer Scientists Tout Achievements and Explain Industry Shortcomings"
    Wisconsin Technology Network (09/06/04); Stitt, Jason

    University of Wisconsin-Madison (UW) computer science and electrical engineering professors said the IT field is driven by academic research into computing techniques, network architecture, and computer security. The UW computer science department was one of the first, having been established in 1963, and now ranks as one of the top 10 computer science departments in the country. Department Chair Guri Sohi said the modern microprocessor "out-of-order execution" technique was developed at UW, enabling a performance breakthrough. In order to keep up with Moore's Law, Sohi said chip makers would have to utilize multiple processor cores without exception. Distributed computing is another innovation that is making its way out of the university laboratory and into the business sector: UW's Condor distributed computing architecture, for instance, has allowed the university to share supercomputing resources for greater flexibility and efficiency, and is continually gaining new adherents outside of the school, says UW computer science professor Miron Livny. UW researchers are also working to understand emerging security threats by studying the "background radiation of the Internet," or random network traffic that is addressed to non-existent destinations by viruses and other malware. UW professor Somesh Jha says hackers are becoming much more sophisticated and powerful in their methods, and that National Security Agency testers were recently able to compromise the central data server of a Swiss bank in just eight minutes, for example. Existing security technology focuses on specific patterns and signatures, but fails to detect malware by its function. The current paradigm means anti-virus software is unable to protect against viruses unless users install updated signature files from vendors.
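    The limitation the researchers describe, matching byte patterns rather than behavior, can be sketched in a few lines. The signature bytes and worm name below are invented; real scanners use large signature databases and far more elaborate matching, but the blind spot is the same:

```python
# Hypothetical byte signatures, as a vendor update file might supply them.
SIGNATURES = {"EvilWorm.A": b"\xde\xad\xbe\xef"}

def scan(payload: bytes) -> list[str]:
    """Flag payloads containing any known signature byte string."""
    return [name for name, sig in SIGNATURES.items() if sig in payload]

original = b"header" + b"\xde\xad\xbe\xef" + b"rest"
variant  = b"header" + b"\xde\xad\xbe\xee" + b"rest"  # same behavior, one byte changed

print(scan(original))  # ['EvilWorm.A']
print(scan(variant))   # [] -- undetected until a new signature ships
```

    A single-byte change in a functionally identical variant defeats the match, which is why signature-based tools lag behind each new strain until vendors push an updated signature file.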
    Click Here to View Full Article

  • "Some in Tech Industry Critical of Bush Administration's Cybersecurity Efforts"
    National Journal's Technology Daily (09/03/04); New, William

    The Bush administration has not made cybersecurity as critical an issue as many in the technology industry have advocated, and the issue is unlikely to receive significant focus again until after the November election, according to sources. Among the stalled measures is a provision to create a Homeland Security cybersecurity director position, which remains stuck in a House authorization bill. "It's likely that we'll have to wait until after the election for there to be any real changes," says VeriSign vice president Tom Galvin. A Homeland Security Department spokeswoman says a debate over the status of cybersecurity issues will not take place prior to the election, although HSD is considering holding another summit on cybersecurity in December; the December 2003 meeting provided a needed push to cybersecurity work. A number of House Democrats sent a letter to agency Secretary Tom Ridge in August listing concerns about a vulnerability assessment on critical infrastructure and key assets. The White House Office of Management and Budget recently issued a memorandum ordering federal agency heads to set minimum information security standards as required by law. Meanwhile, the Energy Department and the Homeland Security Department have opened a test center for the U.S. Computer Emergency Readiness Team (US-CERT), which experts say is one key to securing the U.S.'s critical infrastructure.
    Click Here to View Full Article

  • "Every Move You Make Could Be Stored on a PLR"
    USA Today (09/08/04) P. 5B; Maney, Kevin

    A personal life recorder (PLR) is a portable device that records everything a person sees and hears via a camera and microphone, while using miniaturized storage technology that requires virtually no power and boasts massive capacity. Stuart Parkin with IBM's Almaden facility believes magnetic random access memory (MRAM), which he co-invented, is the key enabling technology for a practical PLR. IBM is just one of several companies working on the development and commercialization of MRAM, which is expected to become capable of storing 400 times as much data in the same space as today's smallest and densest hard drives within five to seven years. MRAM would offer instant access and consume dramatically less power than any hard drive, while the storage of entire programs in local memory would eliminate boot-ups and loading delays. Freescale is slated to introduce a 4-Mbit MRAM chip only one-tenth the size of the world's smallest hard drive by year's end, while IBM is planning the 2005 commercial rollout of a prototype MRAM chip developed in collaboration with Infineon. The Defense Advanced Research Projects Agency is underwriting some of the research into magnetic memory, and Parkin remarks that the military is looking into MRAM as a radiation-shielded memory solution for missiles. With its increased memory capacity, MRAM could conceivably fuel the development of richer software applications and tools such as the PLR, which Parkin believes could become a practical technology within 10 years.
    Click Here to View Full Article

  • "Big Tech on Campus"
    CNet (09/06/04); Reardon, Marguerite

    More and more universities are offering students and faculty campus-wide wireless Internet connectivity, distance learning, and other high-tech products and services, but educators note that such tools should be implemented mainly for educational purposes, and caution that some school-supported programs promote digital lifestyle technologies with flimsy classroom value that yield more benefits to providers than students. For example, Duke University's pilot distribution of over 1,600 new Apple iPods to students is touted as a way to augment class lectures, audio books, and language lessons, but critics say the program is little more than a marketing ploy to make Duke more attractive to prospective students. Some campuses are subsidizing the cost of these initiatives through partnerships with tech vendors or service providers: MIT, for instance, built its iLab remote laboratory, which students can access anytime on a standard Web browser, with the help of a $25 million endowment from Microsoft. The Campus Computing Project reports that over 90 percent of U.S. campuses boast some form of wireless networking, while more than 45 percent of campuses reported strategic plans for deploying or updating wireless networks in the fall of last year. Some schools are starting to offer both faculty and students anytime/anywhere Wi-Fi broadband connections, which benefits purveyors of digital lifestyle products by exposing young consumers to the perks of seamless mobile computing. However, Wi-Fi, peer-to-peer applications, and other services come with significant security shortcomings. Unsurprisingly, schools are also major customers of antivirus, antispam, intrusion detection and prevention software, and other security products.
    Click Here to View Full Article

  • "The Human Factor Trumps IT in the War on Terror"
    Government Computer News (09/01/04); Jackson, William

    Information technology can be used as an intelligence gathering and analysis tool in the war on terrorism, but the organization of the intelligence community will need to change to make the data as effective as possible, according to industry experts. The place of IT in the war on terrorism was the topic of a panel of computer scientists at the University of Maryland. "While there is a lot of good information out there, it isn't getting to the right people at the right time," explained William J. Lahneman, coordinator of the Center for International and Security Studies in the School of Public Policy. The culture of "knowledge is power" in the intelligence community prevents more effective sharing of information, and James Hendler of the university's Institute for Advanced Computer Studies agreed that changing the culture of intelligence agencies would be a huge challenge. The scientists also stressed the need for a change in IT architectures, including the Web. And although terrorists have not used the Internet to carry out any significant attacks, they are using the Web more effectively to galvanize supporters than the U.S. government, according to the researchers.
    Click Here to View Full Article

  • "Behind in Broadband"
    Business Week (09/06/04) No. 3898, P. 88; Yang, Catherine; Ihlwan, Moon; Tashiro, Hiroko

    The number of broadband connections in the United States continues to rise, but the nation is falling behind in the broadband race in terms of share of population with broadband and speed of connections. Among nations in the Organization for Economic Cooperation & Development, the United States was 10th in broadband penetration last year, after ranking third in 2000. What is at stake is whether foreign companies will become leaders in technologies such as full-motion video, Web-based medical care, more sophisticated Internet telephoning, and online gaming, which depend on broadband. In online gaming, for example, Korea's NCsoft appears to have gained an edge, while Electronic Arts still does not have a multiplayer hit. The United States, which still does not have a comprehensive broadband plan, needs stronger leadership on broadband, and President George W. Bush and Democratic challenger John F. Kerry do not intend to address the issue until after the election. The commitment to competition, rather than huge government subsidies, helped Korea and Japan emerge as key broadband markets, in that allowing startups to use the networks of phone companies has resulted in lower prices and higher speeds. The United States has allowed the Bells to maintain their grip on their networks, and consumers are now paying $35 or more for a 1.5 megabit-per-second connection, compared with Japan's Yahoo! BB price of $25 for 26 megabits. Wireless technology remains an option for producing another rival for phone and cable companies, but Bush or Kerry will need to take a tougher stance when dealing with powerful broadcasters, who currently hold the rights to the best spectrum for wireless broadband--that which is used for analog TV signals. Legislation that provides tax breaks for building networks at least as fast as 20 Mbps would provide another boost to broadband.
    Click Here to View Full Article

  • "Agents of Change"
    Computerworld (09/06/04) Vol. 32, No. 36, P. 24; Thibodeau, Patrick

    Computer experts predict autonomous software agents will revolutionize the operation of systems such as financial markets, supply chains, and computer networks. NASA's Earth Observing-1 satellite has been testing autonomous agents for the agency over the past year, using the software to keep watch over fuel consumption, for example; the task requires monitoring a number of factors, such as ongoing experiments, staying on course, and dealing with unexpected demands. Steve Chien of the NASA Jet Propulsion Laboratory says the project is somewhat motivated by cost savings, since autonomous agents mean human intervention is not required, but that significant technical advances are being made as well. NASA's software agents are coded in the ubiquitous Java language, but draw from disciplines such as game theory, economics, and psychology. Autonomous software agent researchers recently convened at Columbia University for the Third International Joint Conference on Autonomous Agents & Multi-Agent Systems to test their programs and discuss applications. The Trading Agent Competition saw entries compete against one another to simulate PC factory operations, where the software had to negotiate with multiple other systems, including customers and suppliers, while monitoring changing market conditions; similar software is already being used to automate market trading, but the business-to-business aspect makes the challenge much more difficult. University of Michigan artificial intelligence laboratory director Michael Wellman says the Trading Agent Competition is a futuristic scenario, but one that is likely to come to pass. IBM is using autonomous software to help build more robust computer systems, and other researchers, including those at the U.S. Defense Department, are investigating autonomous software's ability to maintain communications links in the event of partial system failure.
    Click Here to View Full Article

  • "Robots Creep Into Biomed Landscape"
    EE Times (09/06/04) No. 1337, P. 1; Merritt, Rick

    The interface of robotics and bioengineering will spur major advances in the field of nanomedicine, according to experts at a recent biomedical robotics workshop sponsored by the Institute of Electrical and Electronics Engineers (IEEE). Potential technologies discussed at the workshop include tagged biochemicals or microelectromechanical systems (MEMS) that employ wireless communications, medical imaging, and electronics to track or treat illnesses; micro-level implants that could help regulate motor control functions in people with neurological damage by bonding themselves to individual neurons; the enhancement of microsurgery systems with force feedback to better facilitate robot-assisted surgery; and robotic placement of needles and microprobes to minimize hand tremor for surgeons and speed up the recovery process for patients. Requests for proposals for nanomedicine research centers have been issued by the likes of the European Union, the National Science Foundation, and the National Institute for Standards and Technology over the past year. "There's a convergence coming among robotics, microfabrication and bioengineering," explained University of Washington in Seattle professor Howard Chizeck. But the envisioned technologies cannot be viable without significant technical breakthroughs. Disease-tracking and -destroying nanoparticles, for instance, would have to use new tagging and communications methods, deal with potentially major communications latencies, be able to process data locally within the human body, constitute no threat to the system, and be safely removed from the patient. Workshop organizers announced BioRob 2006, the first joint IEEE conference to convene the group's separate biomedical and robotics communities.
    Click Here to View Full Article

  • "The Democratization of Supercomputing"
    Scientist (08/30/04) Vol. 18, No. 16, P. 30; Daviss, Bennett

    Accessing the supercomputing power needed to meet major scientific challenges such as protein folding is easier than ever thanks to advances in supercomputer speed and capacity concurrent with falling prices. Developments fueling this trend over the last several years include the growth of cluster computing, the emergence of low-cost 64-bit processors such as AMD's Opteron chip, the pervasive availability of commodity components, and increased adoption of the open-source Linux operating system by vendors. Distributed computing is cheap but unwieldy, while at the other end of the spectrum are supercomputers whose power is offset by bureaucracy and software compliance; occupying the area between these extremes are clusters. Simultaneous with researchers' and vendors' adoption of open-source operating systems is the expanding use of off-the-shelf hardware, whose speed and capacity make such products a viable alternative to more costly, specially tailored application-specific integrated circuits and small computer systems interface drives. One of the drawbacks of more powerful, low-cost supercomputers is the difficulty in maintaining the software, a problem that will likely be exacerbated by the migration of software programs from old systems to new systems. University of Tennessee computer science professor Jack Dongarra thinks the professional science culture, with its deeply entrenched tradition of proprietary programming, is partly responsible for this situation. Researchers note that the software shortage issue must be resolved in parallel with the problems with swapping data between multiple processors and in and out of storage. "We've realized that two areas needing attention are the speed of the interconnect switch as well as memory bandwidth," explains Thomas Zacharia with Oak Ridge National Laboratory's computer and mathematical sciences division.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "My Oh MIMO"
    Network World (08/30/04) Vol. 21, No. 35, P. 35; Mathias, Craig

    New multiple input, multiple output (MIMO) wireless technology promises to revolutionize the enterprise by making WLANs and other wireless implementations much more powerful and reliable. MIMO systems benefit from the spatial dimension in wireless transmission, such as slightly delayed and reflected signals, while single-input or single-output systems suffer in performance under the same conditions. MIMO will likely be adopted for a wide range of wireless networks, but will have the most impact in the WLAN arena, specifically the upcoming 802.11n standard expected out as early as the end of 2005. Because the signals are essentially 3D pictures of information, they can carry much richer data more efficiently than previous Wi-Fi implementations that had to use channel-bonding to increase data rates. Eventually, ratcheting up bandwidth in such a manner or relying on complex modulation will not be able to scale in line with data loads. MIMO is likely to see its commercial debut in pre-802.11n products that vendors release before the final standard is ratified, and these will most likely target the residential market, where Wi-Fi demand has seen the bulk of its growth. Ultrawideband (UWB) will likely come alongside 802.11n as a component connector within rooms, but because of its limited range and networking capabilities, it will never serve as a replacement for a real WLAN the way the Wi-Fi specifications can. Another booster for MIMO-enabled 802.11n is VoIP over Wi-Fi, which will be able to take advantage of the additional headroom MIMO provides; and because MIMO is better suited for cluttered physical environments, range will be improved for these applications as well. Still, the increasing reliance on wireless in the enterprise, the introduction of mesh-networking techniques, and lower prices will foster denser WLAN hotspot deployments and eventually an entirely wireless enterprise.
    Click Here to View Full Article
