
ACM TechNews sponsored by AutoChoice Advisor -- Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 700:  Wednesday, September 29, 2004

  • "Paperless E-Voting Gets Thumbs Down From ACM"
    IDG News Service (09/28/04); Legard, David

    The early results of a survey conducted by the Association for Computing Machinery (ACM) indicate that 95 percent of ACM members favor the provision of physical records in voting systems so that voters can confirm that their choices were correctly registered. The ACM stated on Sept. 27 that electronic voting systems must respect voters' right to verify their ballots on paper. "Voting systems should enable each voter to inspect a physical (e.g. paper) record to verify that his or her vote has been accurately cast, and to serve as an independent check on the result produced and stored by the system," ACM reported on its Web site. The association noted that independent computer scientists have documented e-voting systems' vulnerability to malicious tampering, hardware malfunctions, and programming errors in four published reports, yet suppliers of e-voting systems and related trade organizations have downplayed the results, claiming that the systems boast sufficient protection. The state of Nevada has mandated the statewide deployment of e-voting machines, but those machines will provide paper records to be used should a recount become necessary.
    Click Here to View Full Article

    For more information on the member opinion poll and survey results, visit https://campus.acm.org/polls/.

  • "Senate Weighs H-1B Visa Changes"
    CNet (09/27/04); Frauenheim, Ed

    The Senate is mulling a revision to the H-1B visa program that would add an exemption for foreign-born U.S. graduates with master's or doctorate degrees, a proposal that has earned the ire of labor groups and the approval of the business community. WashTech President Marcus Courtney calls the exemption a senseless measure, citing a recent study indicating that unemployment among tech workers has significantly increased, which refutes assertions that skilled U.S. workers are in short supply. The proposal is also opposed by the U.S. branch of the IEEE, whose representative, Vin O'Neill, argues that guest worker visas are sparking an increase in offshoring. On the other hand, Jeff Lande with the Information Technology Association of America contends that the United States channels a sizeable amount of resources into foreign grad students, noting that "If they can't stay, we're wasting that investment, and we're losing access to some of the best and brightest around." An anonymous aide to a Democratic senator says the package Congress is considering would also institute additional worker protections, including authorization for the Labor Department to randomly audit companies that may be in violation of the H-1B program. The bill's chief sponsor, Sen. Saxby Chambliss (R-Ga.), says he also wants to halt the practice of U.S. companies bringing in professional workers on L-1 visas to displace employees at third-party companies, and has called for a requirement that companies employing workers overseas keep them on staff for at least a year before transferring them to a U.S. office on an L-1 visa. The aide says the package could be promoted as standalone legislation by the Senate Judiciary Committee or be folded into a larger bill.

  • "More Federal Lawmakers Want Paper Records of Electronic Ballots"
    Associated Press (09/28/04); Konrad, Rachel; Hallifax, Jackie

    Federal lawmakers are clamoring for the addition of paper trails to electronic voting systems in the face of growing doubts about paperless machines. County registrars argue that making such changes before the November election would only sow confusion among poll workers and lead to breakdowns, although computer scientists insist that counties could install paper-based systems if so ordered by courts. Among the legislators favoring paper trails are Rep. Robert Wexler (D-Fla.), who filed a lawsuit demanding that all Florida-based touchscreen voting systems provide paper ballots; Rep. Rush Holt (D-N.J.), who warns that users of paperless systems have no way of knowing that their votes were properly registered; and Sen. Dianne Feinstein (D-Calif.), who is co-sponsoring legislation to make paper ballots a required component of all voting machines by 2006. Wexler has targeted the lack of recount accuracy in paperless systems, while other critics call attention to the machines' vulnerability to tampering, malfunctions, and software bugs. Wexler's criticism in particular could prompt other major politicians to oppose paperless e-voting machines more vocally. Technology companies and certain constituencies are at odds with paper-trail advocates: Advocates for the disabled argue that blind voters would be disenfranchised without touchscreen machines, while others stress the importance of touchscreens' multilingual options for non-English-speaking voters. Roughly 50 million Americans will be eligible to vote electronically on Nov. 2, and over 100,000 e-voting machines have been deployed throughout the United States.
    Click Here to View Full Article

    For information regarding ACM's e-voting activities, see http://www.acm.org/usacm.

  • "The Future Voice of Speech-Driven Interfaces"
    IST Results (09/28/04)

    The IST program-funded SpeeCon project is helping European organizations create speech recognition databases to enable consumer multilingual speech-driven interfaces (SDIs) across the European Union. Successful development of the SDI market hinges on addressing two key technical challenges: translating interface command words into multiple languages, and ensuring satisfactory performance of consumer devices under varying acoustic conditions. "The collection of speech data is very costly--there are around 600 speakers required to create each language database," notes Herbert Tropf with SpeeCon project coordinator Siemens. "They need to be recruited, recorded, and the resulting data then needs to be transcribed and validated." The broad range of expertise among SpeeCon partners allowed 24 distinct language databases to be collated, and Tropf says each database usually features between four and six dialects and a spectrum of age groups; each speaker had to repeat several hundred terms blending application-specific data, general phrases, and other words with phonetic resonance. The project studied six segments of the SDI market--mobile phones, audio/video devices, information kiosks, personal digital assistants, toys, and automotive devices--and identified the greatest growth in the mobile phone and automotive sectors, as well as a need for speech application technology that can accommodate various environments, genders, dialects, and age groups. Adapting interface commands to differing acoustic environments was a major challenge, and the SpeeCon team used the collected data to develop algorithms that couple raw speech data with environmental data, enabling speech recognition to function under changing acoustic conditions.
    Click Here to View Full Article

  • "Doh! New Format Could Store All of Homer's Life on One Optical Disk"
    Imperial College London (09/27/04)

    Imperial College London researchers are working on Multiplexed Optical Data Storage (MODS), an optical disk technology reportedly capable of storing as much as 1 terabyte (TB) of data on a CD- or DVD-sized disk. Dr. Peter Torok with Imperial College's Department of Physics believes commercial versions of the technology could debut between 2010 and 2015 if his team can further develop it with additional funding. The research team thinks MODS disks could cost about as much to fabricate as garden-variety DVDs, while MODS disk players would remain backwards compatible with existing optical formats. A single-layer, 250 GB MODS disk could store 118 hours (7,080 minutes) of broadcast-quality film, while a double-sided, 1 TB disk could store 472 hours (28,320 minutes). Conventional CDs and DVDs encode one bit of information in each pit and land region, but the MODS researchers have devised a method to encode and recover up to 10 times as much information by using asymmetric pits, each featuring a "step" sunk within it at one of more than 300 different angles; the angle of the step can be measured precisely from the light the pit reflects back. "We are using a mixture of numerical and analytical techniques that allow us to treat the scattering of light from the disk surface rigorously rather than just having to approximate it," notes Dr. Torok. Imperial College is developing the MODS disk technology with assistance from the Institute of Microtechnology at Switzerland's University of Neuchatel and the Department of Electrical and Computer Engineering at Greece's Aristotle University.
    Click Here to View Full Article
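
    As a rough check of that capacity claim (our arithmetic, not the researchers'), a pit whose step can sit at any of more than 300 distinguishable angles carries log2(300) bits instead of one:

        import math

        # Bits encodable per pit if ~300 step angles can be told apart.
        # The 300-angle figure is from the article; the rest is illustrative.
        N_ANGLES = 300
        bits_per_pit = math.log2(N_ANGLES)
        print(f"bits per MODS pit: {bits_per_pit:.1f} (vs. 1 bit on CD/DVD)")   # ~8.2

    At just over eight bits per pit this approaches the article's "up to 10 times" figure; 1,024 distinguishable angles would yield exactly 10 bits.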

  • "File-Swap Software Gets a Speedy Update"
    SiliconValley.com (09/28/04); Chmielewski, Dawn C.

    The open-source BitTorrent file-swapping software can transfer digital files such as movies, music, and video games at a dramatically faster pace than other software. While other file-sharing services such as Kazaa usually take 12 hours to transfer a feature-length film, BitTorrent can accomplish the same task in roughly one-sixth the time; moreover, larger numbers of people sharing a specific file speed up the downloading process. Rather than a permanent file-swapping network in itself, BitTorrent is a software tool that generates ad hoc networks of users who are after the same digital file. The software's speed stems from its ability to chop a file into manageable fragments and let users downloading that file concurrently exchange the portions they have already downloaded from others. Upon a download's completion, the impromptu BitTorrent network is disbanded, which thwarts tracing. The software's mainstream penetration is sparking fears of copyright infringement in the entertainment industry. "We will continue to be technologically creative in both making our films and in zealously pursuing those who steal them," says the Motion Picture Association of America's John Malcolm. Bram Cohen, BitTorrent's inventor, is unworried that legal action may be taken against him if his software is used to pirate copyrighted content, arguing that he created BitTorrent as a simple distribution tool. Malcolm, however, is adamant that creators of software employed in acts of digital piracy share culpability, and "should take little comfort from their temporary celebrity status."
    Click Here to View Full Article
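
    The swarming mechanism can be sketched in a few lines (a toy model of ours, not Cohen's code; the real protocol adds trackers, piece hashing, choking, and rarest-first selection). The key point: once a downloader holds any piece, it becomes a source of that piece for everyone else:

        NUM_PIECES = 8                         # the file is chopped into pieces
        peers = {
            "seed":  set(range(NUM_PIECES)),   # one peer starts with the whole file
            "alice": set(),                    # downloaders start with nothing
            "bob":   set(),
        }

        rounds = 0
        while any(len(have) < NUM_PIECES for have in peers.values()):
            rounds += 1
            for name, have in peers.items():
                # Fetch one missing piece per round from any peer that holds
                # it -- other downloaders included, not just the original seed.
                for piece in set(range(NUM_PIECES)) - have:
                    if any(piece in h for p, h in peers.items() if p != name):
                        have.add(piece)
                        break

        print(f"all peers complete after {rounds} rounds")

    Because downloaders trade pieces with one another, adding peers adds upload capacity instead of draining it, which is why popular files download faster.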

  • "Internet Fails to Shine for 'Silver Surfers'"
    CNet (09/28/04); Hines, Matt

    The growing population of senior citizens represents a lucrative opportunity for the technology industry, which could tap into a new market while improving the quality of life for elderly people by offering products designed to better meet their needs. The American Association of Retired Persons (AARP) estimates that "silver surfers"--Internet users older than 60--currently account for 15 percent of the online U.S. population, but AARP director of policy and strategy John Rother remarks that this demographic remains largely underserved by the IT and Internet industries, despite consumers' increasing willingness to turn to technology as a solution to the problems of aging. He says the most beneficial technological innovation for the elderly will be the transition to an electronic health care system, and he envisions Web tools and other products that could help people taking multiple medications or seeing multiple doctors better manage their care and thus reduce accidental fatalities. Potentially profitable health care technologies spotlighted at a recent Aging by Design conference at Massachusetts' Bentley College include computer-driven medication dispensing systems and camera-equipped terminals that facilitate remote senior-caregiver interaction. Rother supports the implementation of national health care IT infrastructure standards as a vehicle for upholding seniors' privacy. Projects exploring new avenues of senior caregiving include MIT's PlaceLab experiment, in which volunteers reside in a sensor-equipped living space in an effort to understand how technology could improve the quality of life. Ideo CEO Phil Terry says IT companies must directly calibrate business models and goals with the needs of elderly customers if they are to better serve this market.
    Click Here to View Full Article

  • "IPv6 Expert Sees Adoption Growing...Slowly"
    Network World (09/27/04); Marsan, Carolyn Duffy

    Internet Protocol version 6 (IPv6) should really start to take off in North America in 2006, says IPv6 Forum chief technology officer and North American IPv6 Task Force chair Jim Bound. Infrastructure is needed before the multitude of Internet-enabled household appliances can come online, but Bound expects to see such devices widespread in five to eight years. The biggest IPv6 project in the United States right now is the Moonv6 Internet backbone, which will stretch across several U.S. sites and then connect overseas to partners in Europe, South Korea, and China; after a short grace period in which tunneling is allowed, Moonv6 peering sites such as Internet2's Abilene node in Chicago will move to native IPv6. Bound says Moonv6 is being proposed to U.S. ISPs and that NTT Verio, AT&T, Sprint, MCI, and Verizon are currently in discussions concerning the technology. Applications are critical for the commercial deployment of IPv6, and vendors such as Oracle, PeopleSoft, and SAP are aware of the need. In terms of IPv6's integration with RFID, Bound says there is no technical issue, since RFID is an identifier rather than a network technology; IPv6 addresses could be used, however, to put RFID-tagged items on the network. Another big step forward was ICANN's announcement that DNS infrastructure would support native IPv6, demonstrating IPv6's reliability and relative maturity. IETF work on IPv6 recently produced the Mobile IPv6 specification and should soon move on to multi-homing standards, which would allow companies to utilize multiple ISPs. The upcoming Consumer Electronics Show will be an important indicator of what appliances might take advantage of IPv6, and Bound expects software vendors to begin testing IPv6-enabled products as well.
    Click Here to View Full Article
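
    Bound's RFID remark lends itself to a small illustration: with 2^64 interface identifiers available under every /64 prefix, a site could in principle give each tagged item its own address. The mapping below is purely our sketch, not any standard:

        import ipaddress

        def rfid_to_ipv6(prefix: str, tag_id: int) -> ipaddress.IPv6Address:
            """Embed a 64-bit tag ID in the low half of a /64 (illustrative)."""
            net = ipaddress.IPv6Network(prefix)
            assert net.prefixlen == 64 and 0 <= tag_id < 2**64
            return net[tag_id]                 # prefix bits + tag-ID bits

        # 2001:db8::/32 is the documentation prefix; the tag ID is made up.
        print(rfid_to_ipv6("2001:db8:cafe:1::/64", 0xDEADBEEF12345678))
        # -> 2001:db8:cafe:1:dead:beef:1234:5678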

  • "The Office Is Future-Proof"
    Financial Times Survey (09/27/04) P. 2; Harvey, Fiona

    The office environment will dramatically change in 50 years' time, with desktop computers disappearing, robots handling more manual tasks, and global connectivity enabling more intercontinental collaboration. Data centers located outside the city will run powerful database and processing applications, serving up computing power as a utility; many more people will work remotely, using handheld devices to stay connected wherever they go, although those devices will be much more sophisticated and easier to use than current handhelds. People will still rely on the Internet, but will not have to connect via a Web browser or type in Web addresses to get the information or applications they want. The connectivity cloud will enable easy information-sharing between people, but will also serve as a mechanism where computers can talk to one another and automatically complete many basic tasks, such as controlling lighting and heating or tracking physical security. The physical clustering of employees will not be as important in the future, and the global network will mean more office jobs will be threatened by outsourcing; on the other hand, increasing factory automation will outperform cheap human labor in some cases. More applications will be voice-activated and computer displays will be built into nearly every surface, while haptics will be used instead of a mouse, so that computers will be able to read hand movements or other gestures. Gartner Research vice president Mark Raskino says one artifact of both the past and present office will remain--the qwerty keyboard. "Keyboards are just a very convenient way of inputting large volumes of textual data, and I don't think we'll better them in the next few decades," he predicts.

  • "Inside Kerry and Bush's Technology Agendas"
    PC Magazine (09/22/04); Miller, Michael J.

    Michael J. Miller reports that the technology agendas of presidential rivals John Kerry and George W. Bush differ widely on some issues and align more closely on others; the Kerry campaign responded directly to Miller's inquiries, while the Bush campaign referred him to the Georgewbush.com Web site. Kerry's strategy regarding the offshoring of technical jobs is to limit such practices with a multi-pronged approach that includes various tax incentives, while the Bush campaign's outsourcing strategy, according to the Web site, calls for education and government reforms. Believing the expansion of broadband is essential to maintaining America's economy, quality of life, and job market, Kerry proposes to provide tax incentives for companies that invest in next-generation broadband and deploy broadband in underserved markets; broaden the spectrum available for unlicensed and licensed wireless services; and spark demand for broadband networks through greater R&D investments. Bush's tactic is to keep broadband access tax-free, remove outdated regulations that dissuade investment, and give consumers more options for affordable broadband access by promoting wireless access and broadband over power lines, as well as passage of the Commercial Spectrum Enhancement Act. Miller found that the Bush Web site offers no specific comments on issues such as unauthorized online file-sharing and the threat of spam, viruses, and worms. Kerry, however, was vocal on all of these matters: His file-sharing strategy is the continued enforcement of copyright laws and anti-piracy measures both inside and outside the United States, while his anti-malware agenda includes implementing global standards and best practices to bolster weak links, establishing a strong private-public alliance, and supporting a cybersecurity intelligence system that can spot cyber-threats. Bush and Kerry are more closely aligned on setting up tax credits for companies investing in R&D.
    Click Here to View Full Article

  • "Linux Could Become a Big Force in the Weather-Forecasting Field"
    Investor's Business Daily (09/28/04) P. A4; Tsuruoka, Doug

    Some meteorology experts believe supercomputers running Linux open-source software could improve weather forecasting by tapping into a global programmer community, a development that has only recently become feasible. "You've got to have computers that stay up and running for a significant period of time so you don't miss forecasts," notes Linux Networx VP Eric Pitcher. "But now you've got plenty of applications, and Linux's open nature and architecture makes it easy to add computing power when you need it." Linux Networx has designed Linux-running supercomputers for the European Center for Medium Range Weather Forecasts and Iceland's Energy Authority. The first system is arranged in a cluster or grid configuration and is used to produce 10-day forecasts that 25 European countries employ when routing oil tankers and cargo ships, while the second system models meteorological events in three dimensions. Pitcher says the cluster or grid scheme enables operators to easily split tasks into managed loads at very fast speeds, an essential quality in the prediction of major storm events such as hurricanes. He says the systems' biggest advantages over competing supercomputers are lower cost and greater efficiency, which add up to more accurate weather simulations for significantly less money thanks to the absence of license fees. The continuous improvement of free software such as Linux by a global pool of open-source programming talent means that everyone benefits through reduced costs and greater flexibility.

  • "Information System to Help Scientists Analyze Mechanisms of Social Behavior"
    University of Illinois at Urbana-Champaign (09/16/04); Barlow, Jim

    On Sept. 16, the National Science Foundation announced that the University of Illinois at Urbana-Champaign's BeeSpace project is the recipient of a $5 million, five-year research grant under the aegis of the Frontiers of Integrative Biological Research program. BeeSpace is a software environment designed to aid researchers in the analysis of all sources of information applicable to social behavior mechanisms, using the society of the Western honeybee as its template. Library and information science professor Bruce Schatz says the purpose of the project is to better comprehend the relationship between genes and behavior, beyond the simple nature-nurture dichotomy. BeeSpace will concentrate on biology and informatics research using new technologies developed by animal sciences professor Sandra Rodriguez-Zas, whose work will support gene expression analysis, and natural-language processing expert ChengXiang Zhai, whose statistical evaluation of information sources will enable semantic indexing of data for multiple-source, multiple-viewpoint interactive navigation. Entomology professor Gene Robinson says his BeeSpace team will build a molecular signature of all critical honeybee functions by profiling gene expression in the brains of individual bees captured while performing routine behavior. The scientists will employ broad categories of social roles that could also be applied to higher organisms. "By testing the BeeSpace environment with users at different levels, we hope to demonstrate the utility of concept navigation across community knowledge," says Schatz. "Similar information technology can then serve as a model of the Interspace, the generation of the Net beyond the Internet, where all the world's knowledge can be easily analyzed across many sources."
    Click Here to View Full Article

  • "Public 'Left Out' as Governments Plot Internet Regulation"
    ZDNet UK (09/24/04); Wearden, Graeme

    The United Nations' centralized approach to Internet regulation is disenfranchising regular Internet users and attracting the wrong kinds of people to the governance issue, said Web expert Esther Dyson, a founding chairman of ICANN, speaking at a debate organized by the Internet Society UK and the Oxford Internet Institute. Concentrating power in either ICANN, which Dyson described as having "low-rent, measly" authority, or in a United Nations body would create a focus of power that attracts the wrong type of groups. "People who are self-appointed to represent other people are there, governments are there, the private sector is there, but the world at large isn't," she said. But U.N. Working Group on Internet Governance (WGIG) leader Marcus Kummer said the current approach, which includes one year of worldwide public consultation by the WGIG, allows for diverse and representative views. WGIG is currently working on a report, due next July, that will take stock of recent U.N. agreements; those agreements have stated that the Internet is a global "facility" that must be governed through "multilateral, transparent, and democratic" means. But the current framework disenfranchises those who cannot attend Geneva meetings or speak official U.N. languages fluently, said Adam Peake of the Japan-based Center for Global Communications. Dyson also said the U.N. is using too broad a brush to address Internet issues. "The question of how we stop spam is much different from the question of how we allocate domain names," she said, advocating discrete, limited efforts.
    Click Here to View Full Article

  • "Danger of Image-Borne Viruses Looms"
    Washington Post (09/23/04); Krebs, Brian

    Computer security experts say that hackers are close to being able to spread viruses simply by getting users to visit an infected Web site or open an email message, exploiting a new security flaw in Microsoft's Windows XP and Server 2003 operating systems. Last week, three demonstration programs were published online, and the tools being developed from them could allow hackers to take control of computers remotely. Microsoft is offering a free patch for the flaw, which lies in the code applications use to display JPEG image files; experts say hackers could embed viruses in digital photos. "This year we've seen a lot of changes to the fundamental ways we thought we were secure," says SANS Internet Storm Center director Marcus Sachs, and TruSecure chief scientist Russ Cooper predicts that a JPEG exploit will appear very soon. Most companies do not treat digital images as virus threats, so most images pass through corporate firewalls without hindrance. F-Secure antivirus research director Mikko Hypponen, however, is relatively unconcerned about the new threat: two other image formats were found to be insecure earlier this year, yet neither of those vulnerabilities has resulted in a worm or virus. Hypponen says, "I don't really see any reason why this vulnerability would be any different. There is certainly a lot of discussion and hype about it, but I wouldn't be surprised if we didn't see any worms or viruses using this flaw at all."
    Click Here to View Full Article
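
    The flaw class reported here can be sketched from the defender's side (a simplified scanner of our own devising, not Microsoft's code). Each JPEG metadata segment declares a 2-byte length that counts the length field itself, so a declared length below 2 is malformed; in the vulnerable parser, such a value reportedly underflowed into an enormous copy size:

        import struct

        def has_malformed_segment(data: bytes) -> bool:
            """Flag JPEG segments whose declared length could underflow."""
            i = 2                               # skip the SOI marker (FF D8)
            while i + 4 <= len(data):
                if data[i] != 0xFF:
                    return False                # lost sync; stop scanning
                marker = data[i + 1]
                if marker == 0xD9:              # EOI: end of image
                    return False
                if marker in (0x01, 0xD8) or 0xD0 <= marker <= 0xD7:
                    i += 2                      # standalone markers carry no length
                    continue
                (length,) = struct.unpack(">H", data[i + 2:i + 4])
                if length < 2:                  # length includes its own two bytes
                    return True                 # underflow bait: reject the file
                if marker == 0xDA:              # start of scan; entropy data follows
                    return False
                i += 2 + length
            return False

    A mail gateway or Web proxy could run such a check on inbound images before any vulnerable decoder touches them.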

  • "The Jumble Cruncher"
    New Scientist (09/25/04) Vol. 183, No. 2466, P. 36; Casti, John L.; Calude, Cristian

    Quantum-generated random numbers could enable today's computers to surpass the limitations of the Turing machine concept. In 1936, British mathematician Alan Turing devised an abstract model of computation that could in principle solve any computable problem; the difficulty is that a Turing machine cannot reach a conclusion without exhausting all possibilities. Answers to some simply stated problems, such as the Goldbach conjecture--which asks whether every even number greater than two is the sum of two primes--have yet to be verified completely because of this limitation, while other, more important questions, such as the solution to Newton's equations of gravity, likewise remain unverified. Truly random numbers--as opposed to pseudorandom numbers generated by fixed rules--add an element that would allow computers to bypass the limitations of the Turing machine concept, which must follow rigid rules; because of those rules, Turing machines cannot generate truly random numbers themselves. Fortunately, devices already exist that use quantum mechanics to generate strings of random bits fast enough for use in computers: the Quantis device from id Quantique of Geneva, Switzerland, for example, fires photons at a half-silvered mirror to generate random bits for use in cryptography. Quantum mechanics is not yet entirely understood, so it cannot be known for certain whether the generated bits are truly random or obey some yet-to-be-discovered set of rules, but they are beyond the capabilities of Turing machines and so promise to break the barrier posed by problems like the Goldbach conjecture.
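
    The Goldbach example makes the limitation concrete: a program can verify the conjecture up to any finite bound, but no finite run settles it for every even number. A minimal sketch:

        def is_prime(n: int) -> bool:
            """Trial division; adequate for a small demonstration."""
            if n < 2:
                return False
            d = 2
            while d * d <= n:
                if n % d == 0:
                    return False
                d += 1
            return True

        def goldbach_witness(n: int):
            """Return primes (p, q) with p + q == n, or None if none exist."""
            for p in range(2, n // 2 + 1):
                if is_prime(p) and is_prime(n - p):
                    return p, n - p
            return None

        # A Turing-style computation can only confirm the conjecture bound
        # by bound; the loop below proves nothing about even numbers > 100.
        for n in range(4, 101, 2):
            assert goldbach_witness(n) is not None
        print("Goldbach verified for all even n in [4, 100]")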

  • "Whether Linux or Windows, No Software Is Secure"
    Chronicle of Higher Education (09/24/04) Vol. 51, No. 5, P. B21; Spafford, Eugene H.; Wilson, David L.

    The debate about whether the Windows or Linux operating system is more secure obscures the more important issue of generally shoddy software development, argue Purdue University researchers Eugene Spafford and David Wilson: Although different arguments and figures can be used to support either open-source or proprietary software, the fact is that software of both types can be either secure or insecure. The OpenBSD operating system is a good example of a secure open-source product, while Gemini's Gemsos proprietary operating system has achieved the highest level of security certification from the Defense Department. Software is made secure not by virtue of being open-source, but through rigorous security testing, trained developers, high-quality tools, and focused design. Despite the claim that Linux code is examined by many developers, actual experience shows that serious flaws are found only after years of use; the Linux Security Auditing Project and the Sardonix Security Portal, two efforts to systematically inspect open-source software security, closed without accomplishing anything significant. Windows is more often a target for worms and viruses, and receives an inordinate amount of media attention due to its market penetration, but Linux is threatened by attack software, such as "root kits," easily found on the Internet. In networked environments, systems are often run by novice users who lack the special skills and deep knowledge needed to maintain many proprietary systems and operate them securely. Besides creating security weaknesses, poor development practices also create bugs that cause systems to crash or otherwise misbehave. Systematic effort is needed to create secure and reliable software, whether open-source or proprietary.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "The Grand Challenges of IT"
    Computerworld (09/27/04) Vol. 32, No. 39, P. 32; Hoffman, Thomas

    Researchers are working to address the biggest challenges in three key IT areas--artificial intelligence, processor performance, and chip miniaturization--by going outside traditional research methodology. Despite notable progress in the field of AI, computer systems remain poorly equipped for common-sense reasoning and understanding context. By next summer, the Defense Advanced Research Projects Agency plans to have completed the first series of tests of a computer-based "executive assistant" that can perform administrative chores such as prioritizing email requests for a business executive or a military commander; IBM, meanwhile, has been developing a software architecture that uses optimization-based approaches to support context-sensitive information as well as dynamically generate context-customized responses. Transistor power dissipation and other processor performance metrics are about to hit a wall, and IBM fellow Bijan Davari notes that cooling chips economically is becoming an increasingly difficult proposition. Researchers are therefore taking new approaches to processor performance as well as chip architecture and design: IBM is assessing new on-chip materials to boost performance, reduce power density, or both, and is also investigating the potential of cooling gels and water-cooled microprocessors; MIT's Raw Architecture Workstation project aims to improve bandwidth within chips; Los Alamos National Laboratory scientists are devising parallel processing systems that could deliver as many as 1,000 TFLOPS as early as 2008; and Sematech researchers are using low-k materials to increase circuit density while reducing the likelihood of electrical signal leakage. Miniaturizing chips is difficult because heat output increases with greater circuit density, so MIT is working to reduce the power supply voltage. Anantha Chandrakasan with MIT's Microsystems Technology Laboratories notes that power supplies cannot be scaled down without a corresponding scaling down of device thresholds. MIT researcher David Clark thinks the most urgent IT research challenge is the provision of universal communications to everyone worldwide.
    Click Here to View Full Article
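
    Chandrakasan's coupling of supply and threshold voltages follows from the textbook CMOS power relations (standard formulas, not figures from the article):

        P_{\mathrm{dynamic}} = \alpha \, C \, V_{dd}^{2} \, f,
        \qquad
        P_{\mathrm{leakage}} \propto e^{-V_{th}/(n\,kT/q)}

    Lowering the supply voltage V_dd cuts dynamic power quadratically, but maintaining switching speed then pushes the threshold V_th down as well, and subthreshold leakage grows exponentially as V_th shrinks--hence the two cannot be scaled independently.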

  • "Home Is Where the Future Is"
    Economist (09/16/04) Vol. 372, No. 8393, P. S6

    The dream of the automated home, a concept that dates as far back as the late 1890s, has shown remarkable staying power, and the idea may be edging closer to reality thanks to the development of automated household technologies in many academic and corporate labs. The emergence and spread of sophisticated consumer networking technologies such as broadband and mobile phones over the last several years has also been a major factor in the drive toward the smart home. Wireless networks can establish communications among household devices as well as the Internet, while always-on broadband connections permit appliances to send and receive information whenever they wish. A home could be turned into a distributed computing system with the incorporation of wireless chips into every household device. Mobile phones could also become a key enabling technology for smart homes, with companies such as Nokia planning to use them as devices that can remotely control all household appliances. Before this can happen, the devices must be designed to support ease-of-use and simple interconnection, a challenge that will not be met until companies make learning the responsibility of the technology rather than the consumers, according to Robert Pait of Seagate Technologies. Incompatibility between devices from different manufacturers is another obstacle, one that must be overcome with the institution of a wireless-networking standard. Standardization efforts include Carnegie Mellon University's Pebbles project, Europe's Digital Living Network Alliance, and the Internet Home Alliance.
    Click Here to View Full Article

  • "Machine-to-Machine Technology Gears Up for Growth"
    Computer (09/04) Vol. 37, No. 9, P. 12; Lawton, George

    Harbor Research expects at least 1.5 billion devices to be Internet-linked worldwide by the end of this decade; machine-to-machine (M2M) technology aims to exploit this connectivity to make machines capable of direct communication with one another. M2M would enable machines not only to collate data about other machines, but to use that information to formulate and implement a course of action. M2M technology is poised to grow exponentially thanks to a number of trends, including the wider use and falling costs of information-gathering sensors, along with the expansion of M2M into wireless technology. Most current applications of M2M involve monitoring activities and environments and controlling devices and systems, while potential applications include traffic management, telemedicine, robotics, offsite equipment diagnostics, and vehicle fleet management. A typical M2M scheme consists of field nodes (sensors) that detect real-world conditions and events, communications equipment that carries the node data to other nodes and to centralized management applications, and software for input analysis and decision-making; the sensors gather and compile the data, converting analog measurements into digital signals that are routed across the network to the centralized management software, which in turn directs commands to the controllers or actuators that trigger machine action. Wireless M2M deployments add mobility and lower costs, and the practicality of wireless M2M has increased thanks to advances and cost reductions in wireless communications technologies. To succeed, M2M will need to overcome numerous hurdles: Companies will need to upgrade employee training so that workers can effectively use and integrate M2M components; the M2M technology and development tools will need to mature; and the costs of hardware, software, and networking will have to fall significantly for M2M to be cost-effective.
    Click Here to View Full Article
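
    The sense-decide-act loop of a typical M2M scheme can be sketched briefly (the node names, threshold, and classes here are invented for illustration, not drawn from any product):

        from dataclasses import dataclass

        @dataclass
        class Reading:
            node_id: str           # field node (sensor) that produced the value
            temperature_c: float   # digitized real-world measurement

        class Actuator:
            def command(self, action: str) -> None:
                print(f"actuator: {action}")   # stand-in for a real controller

        class Manager:
            """Centralized management software: analyze input, then act."""
            def __init__(self, actuator: Actuator, limit_c: float = 30.0):
                self.actuator = actuator
                self.limit_c = limit_c

            def ingest(self, reading: Reading) -> None:
                if reading.temperature_c > self.limit_c:
                    self.actuator.command(f"cooling ON (node {reading.node_id})")
                else:
                    self.actuator.command("cooling OFF")

        manager = Manager(Actuator())
        for r in [Reading("field-7", 24.5), Reading("field-7", 31.2)]:
            manager.ingest(r)      # sensor -> network -> manager -> actuator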


 