
ACM TechNews sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 640:  Wednesday, May 5, 2004

  • "E-Vote Problems Overwhelm Feds"
    Associated Press (05/03/04); Yen, Hope

    The upcoming presidential election's integrity could be called into question because the U.S. Election Assistance Commission (EAC) claims in its first annual report that it lacks the funding to effectively address concerns about the security of electronic voting machines, and cites a dearth of authority to enforce e-voting standards. Computer scientists doubt the trustworthiness of paperless touch-screen e-voting systems because they cannot support proper recounts, and at least 20 states are debating whether to make paper ballots a requirement. The commission, led by onetime New Jersey Secretary of State DeForest B. Soaries Jr., will hear testimony from voting-equipment company executives, election officials, and academics on May 5, and issue a set of recommendations, such as the retention of paper ballots by poll workers as a backup measure in case e-voting machines start to malfunction. "If you look at the evolution of voting in America, only in [the] last four months has there been a federal agency whose exclusive focus is to deal with voting," the EAC chairman comments. "It's the foundation of our democratic structure on one hand, but on the other we've really left it to the states to manage completely." Most states turn to the National Association of State Election Directors (NASED) for guidance, and NASED relies on Wyle Laboratories, SysTest Labs, and Ciber to evaluate and certify e-voting machines and software for the entire country using outdated standards. NASED intends to transfer its certification powers to the National Institute of Standards and Technology, but the EAC report says a lack of sufficient funding is delaying the project.
    Click Here to View Full Article

    For more information on e-voting, visit http://www.acm.org/usacm.

  • "Reforms, Not Rhetoric, Needed to Keep Jobs on U.S. Soil"
    CNet (05/04/04); Frauenheim, Ed; Yamamoto, Mike

    Assigning blame in the offshoring of U.S. high-tech jobs and the erosion of science and engineering graduates, as politicians are wont to do, will not solve the problem: What is needed are reforms in U.S. education, more focused professional retraining, and heavier research investment; without them, the United States could lose its global technology leadership. Both President Bush and Democratic presidential candidate John Kerry are stumping for improvement in math and science education, but their various solutions could be undone by a profound shortage of qualified educators. An independent commission chaired by former IBM chief Louis Gerstner suggests that an additional $30 billion should be injected into funds for public-school teacher salaries, while the money educators earn should be determined by their performance. Meanwhile, Gartner analyst Linda Cohen says universities must stop concentrating purely on computer science and place more emphasis on business and management skills. The generally poor performance of federal retraining programs points to a need for change: Industry organizations are clamoring for more funds to benefit tech workers, while the IEEE wants the federal Trade Adjustment Assistance program to be extended to currently ineligible tech professionals. The success of such programs hinges on organizations focusing more on economic development than social services, according to Bay Area Technology Education Collaborative CEO Mike Wilson. U.S. research and development efforts are caught in a downward trend, as evidenced by a sharp decline in the number of U.S. publications in research journals and a generally flat rate of investment in physical sciences over the last decade; moreover, federal R&D spending increases outlined in the president's 2005 budget will primarily be channeled into homeland security and weapons development. Both Bush and Kerry support making a current federal R&D tax credit permanent, a move that industry groups say will encourage more investment.
    Click Here to View Full Article

  • "U.S. Is Losing Its Dominance in the Sciences"
    New York Times (05/03/04) P. A1; Broad, William J.

    The United States is quickly losing ground in international scientific research standings, and has already fallen behind Europe and Asia in terms of doctoral degrees awarded, for example. The change in scientific dominance has other evidence as well, including the number of papers published in scientific journals, Nobel Prizes awarded, and patent citations. Globalization and increasing standards of living in other countries are the fundamental reasons for the shift, as foreign countries produce more scientists and foreign science students in the United States choose to return home rather than staying. The American Physical Society recently reported that U.S. papers published in its Physical Review journal have become the minority behind papers from Western Europe and other countries; Physical Review editor Martin Blume says China's ascendancy is especially remarkable, having submitted more than 1,000 papers in one year. China is also drawing away American industrial research dollars as major U.S. firms, such as General Electric, set up research facilities in that country. The impetus is not just low costs, but also the abundance of scientists, says Industrial Research Institute President Ross Armbrecht; even more worrying is the loss of intellectual property as foreign workers in the United States leave to start companies in their own countries. A number of organizations have raised alarms about the situation, including the Council on Competitiveness, as have politicians seeking to attack the Bush administration. Senate Democratic leader Tom Daschle recently addressed the American Association for the Advancement of Science, saying Bush has not made science a top priority--but White House science advisor John H. Marburger III refuted the claim, pointing out that U.S. federal science funding is at an all-time high, though the proportion of that money going toward military research is even greater than it was during the Cold War, suggesting a lopsided emphasis.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Some Counties Might Fight E-Voting Ban"
    Los Angeles Times (05/04/04) P. B1; Pfeifer, Stuart

    California Secretary of State Kevin Shelley's ban on Diebold e-voting systems in Kern, San Diego, San Joaquin, and Solano counties, along with new guidelines that election officials in 10 other counties must comply with in order to use e-voting machines in the November election, may spur some officials to fight these edicts in court. Officials in certain affected counties doubt that Shelley has the authority to approve or prohibit voting systems on a county-by-county basis, and protest that deploying a second, paper-based voting system as an option, in keeping with the new guidelines, would negatively impact poll worker training. Shelley recently calculated that making paper ballots available would cost the state $1 million, but registrars and voting machine manufacturers claim the expense would be much higher. "The secretary's proposals increase our cost and our risk with no evidence of improving the integrity of the election," argues San Bernardino County Registrar of Voters Scott Konopasek, who nevertheless admits that paper ballots are the best measure given their cheapness, ease of use, and lack of ambiguity. Shelley promised that any costs counties incur to satisfy the new guidelines would be covered by e-voting machine vendors, a decision that Sequoia Voting Systems' Alfie Charles calls unwarranted. Riverside County Registrar of Voters Mischelle Townsend thinks Shelley's actions defy common sense, noting that e-voting systems have been welcomed by poll workers and voters. "[Shelley] has chosen not to listen to the people who are conducting the elections and instead to people putting forth what-if scenarios that have never occurred," she contends.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    For information about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "How to Save Energy: Just Guess"
    Wired News (05/04/04); Delio, Michelle

    A Georgia Institute of Technology research team led by Center for Research on Embedded Systems and Technology director Krishna Palem has developed Probabilistic Bits (PBits) chips whose components are intentionally designed to be unreliable in order to squeeze out more capacity and energy efficiency. "In the chips being built today, the hardware obeys the software instructions absolutely, even though the application software does not require such precision," Palem explains. "So the simple idea that we have is to make the following connection: If probabilistic algorithms do not need the hardware to be reliable, then why invest a lot of money and time in making hardware reliable?" Moreover, carrying out precise calculations makes computer circuits consume a great deal of power, depleting batteries faster. Palem notes that computer applications such as voice recognition and encryption already employ probability to boost computing speed, and posits that processors such as the PBits chip will not only increase mobile devices' ability to run more complex applications, but could theoretically extend Moore's Law into the bargain. The Georgia Tech researcher reports that his team tested PBits by running simulations using a spoken-alphabet voice-recognition application common on numerous cell phones, and the results demonstrated a reduction in power usage by a factor of 1,000, which far exceeded expectations. Palem's project is underwritten by the Defense Advanced Research Projects Agency. The PBits chip will be built into a proof-of-concept device that will be tested this summer using applications with apparently high precision requirements, such as financial analysis and risk assessment applications.
    Click Here to View Full Article
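
    Why imprecise hardware can be tolerable is easiest to see with a probabilistic algorithm. Below is a minimal Python sketch, which assumes nothing about the actual PBits design: a Monte Carlo estimate of pi barely moves when low-order bits of every sample are randomly flipped, mimicking a circuit that is deliberately allowed to be unreliable (the 10 percent flip rate and the choice of bits are illustrative).

    import random
    import struct

    def flip_low_bit(x, p=0.1):
        # With probability p, flip one of the eight lowest mantissa bits of
        # float x -- a stand-in for a deliberately unreliable, low-power circuit.
        if random.random() < p:
            bits = struct.unpack("<Q", struct.pack("<d", x))[0]
            bits ^= 1 << random.randrange(8)
            x = struct.unpack("<d", struct.pack("<Q", bits))[0]
        return x

    def estimate_pi(n, noisy=False):
        # Classic probabilistic algorithm: count random points that land
        # inside the quarter circle inscribed in the unit square.
        hits = 0
        for _ in range(n):
            x, y = random.random(), random.random()
            if noisy:
                x, y = flip_low_bit(x), flip_low_bit(y)
            if x * x + y * y <= 1.0:
                hits += 1
        return 4.0 * hits / n

    random.seed(1)
    print(estimate_pi(100_000))              # exact arithmetic
    print(estimate_pi(100_000, noisy=True))  # noisy low bits: nearly identical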

  • "Free Software Project Undaunted Despite Apple Threats"
    IDG News Service (05/03/04); Ribeiro, John

    The India-based Sarovar free software development community site announced that it would halt its hosting of the PlayFair free software project upon receiving a notice of alleged copyright infringement from Apple Computer, but free software advocate Anand Babu, who now serves as PlayFair's maintainer, promises that the project will soon be back online. "We are working on a better version, and we are hosting it outside the U.S.," he explains. PlayFair was designed to allow users to play music from the Apple iTunes Music Store in less restricted formats. Apple has had PlayFair in its crosshairs ever since its anonymous author first hosted the project at the Open Source Development Network's SourceForge.net Web site, and the company pressured SourceForge.net to remove the project in April by invoking the Digital Millennium Copyright Act. "What is really happening is that a corporation is using legal means to shut down a free software project in India for the first time, and the small project is left defenseless even though they believe that they are right," argues Sarovar maintainer Rajkumar Sukumaran. He insists that PlayFair, which is licensed under the GNU General Public License, is merely used to enable fair use for music legitimately bought from the iTunes Music Store by removing the digital rights management mechanism from a song, as long as the authorized key is available. Sarovar notes that PlayFair users are not granted any special facilities outside those provided by Apple, while Sukumaran says the program is not used for music distribution, nor can it be employed to play music, copy music to CD, disseminate it on a peer-to-peer network, or edit content.
    Click Here to View Full Article

  • "Virtual Reality at Work, Literally"
    IST Results (05/03/04)

    The Information Society Technologies program-funded VIEW OF THE FUTURE project was created to determine how virtual reality (VR) technology could be practically deployed in the workplace and reach its full potential. The project, coordinated by John Wilson, professor of human factors at the University of Nottingham, established five chief obstacles to the widespread adoption of VR and virtual environments (VE): Usability problems, insufficient technical development, difficulties with integrating VR technology into current workplace technology, potential health and safety issues, and a shortage of practical examples and proof of VR's benefits. VIEW OF THE FUTURE developers used real industrial end-user input to improve VR technology and invent new tools that meet user needs and widen VR's usability. The tools give users an easy, accurate way to control the 3D menu, add relevant data, or map out the routes they and others have followed in the VE, as well as the activities they have performed. Wilson praises the project's PI-casso mobile VR system as a tool that "gives a high quality VR experience that isn't anchored to one place and is accessible to a wider range of users." Meanwhile, the Basic Application Framework allows VR systems to be tailored to users, and supports new interaction devices, 3D user interfaces, and navigation metaphors. The participation of industrial collaborators in the project allowed VR/VE applications to be developed using real-world and organizational requirements as a platform, continuously assessed against utility and usability needs, and tested to highlight VR's workplace benefits. The final VIEW OF THE FUTURE report concludes that the project demonstrated "good adventurous science and also well-founded, industrially relevant application."
    Click Here to View Full Article

  • "Telling Lies"
    ScienCentral (04/29/04); Lurie, Karen

    Cornell University experimental psychologist Jeff Hancock set out to determine whether people are more likely to lie on the phone, in email, in instant messages, or face-to-face by having a team of 30 students keep a record of all their social interactions and any falsehoods that cropped up; the final tally registered 1,198 interactions and 310 lies. Students engaged in an average of 6.11 social communications and 1.6 lies each day, which translated into lies being involved in about 25 percent of their social communications. Hancock concluded from the research that email communications exhibited the most honesty while phone conversations tended to be the least honest, with IM and face-to-face interactions falling in between. The results surprised Hancock and other social psychologists, because some believed that humans would be more likely to lie using media that reduced the discomfort one associates with lying. By that logic, email would appear to be the medium most likely to encourage lying. "A lot of times we'll think of the Internet as a very deceptive place where you can say and do whatever you want, because you're anonymous and things like that; whereas this research suggests that sometimes on the Internet we may in fact be more honest," Hancock explains. He notes that the phone supports lying more readily because most lies appear spontaneously in conversation, but also observes that more experienced email users are likely to lie in email more frequently. Hancock says different modes of communication are more or less supportive of lying according to three features: Whether messages are exchanged in real time, whether the conversation is being recorded, and whether or not the speaker and the listener share the same physical space. Hancock's research was presented at the April 29, 2004, "Computer-Human Interaction Meeting" in Vienna, Austria, sponsored by ACM's Special Interest Group on Computer-Human Interaction. The paper will be published in the Proceedings of the April 2004 Conference on Computer-Human Interaction.
    Click Here to View Full Article
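
    The reported averages can be cross-checked with a few lines of arithmetic; the participant-day count below is inferred from the figures given above rather than reported by the study.

    interactions, lies = 1198, 310
    days_logged = interactions / 6.11      # implied participant-days of diary-keeping
    print(round(lies / interactions, 2))   # 0.26: lies in about a quarter of interactions
    print(round(lies / days_logged, 2))    # 1.58: consistent with the reported 1.6 per day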

  • "Hampshire College Student Uses J.K. Rowling's Quidditch as Basis for Artificial Intelligence Experiment"
    AScribe Newswire (05/04/04)

    A Hampshire College computer science student developed a virtual evolutionary environment for computerized teams playing the game Quidditch, which young witches and warlocks play in J.K. Rowling's Harry Potter stories. The project explores the evolution of artificially intelligent teamwork, program co-evolution, and genetic representation. Quidditch teams are created randomly by one software program, giving each team unique capabilities in different Quidditch positions, such as chasers, beaters, and seekers; another program evaluates team performance, and winning teams survive another generation and propagate their successful traits. Hampshire student Raphael Crawford-Marks developed the simulator based on the Breve simulation environment created by a fellow student, and also used a Hampshire-developed computer language called Push to create the Quidditch-playing programs. He based the Quidditch concept on questions posed by his professor and on the RoboCup soccer tournament, which pits real-life robot teams against one another in regulated tournaments. That event is considered a benchmark in artificial intelligence and allows for evolutionary development of robotics hardware and software. Crawford-Marks says his computer simulation works much faster than RoboCup soccer because it is virtual, and notes that he has seen tremendous advancement in his virtual Quidditch teams. When the program started, the teams played as six-year-olds might, but after 50 generations they became nearly impossible to beat. Supervised by Hampshire computer science professor Lee Spector, the project runs on a Beowulf-style cluster computer and involved the integration of computer science skills, design, and data analysis.
    Click Here to View Full Article
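
    The survive-and-propagate loop described above is a genetic algorithm. The Python sketch below shows that loop in miniature; the real project evolves Push programs inside the Breve simulator, whereas here a "team" is just four position weights scored against a hypothetical target strategy.

    import random

    TARGET = [0.8, 0.5, 0.9, 0.3]   # hypothetical ideal chaser/beater/keeper/seeker mix

    def fitness(team):
        # Higher is better: negative squared distance from the target strategy.
        return -sum((t - g) ** 2 for t, g in zip(team, TARGET))

    def mutate(team, rate=0.1):
        return [t + random.gauss(0, rate) for t in team]

    random.seed(0)
    population = [[random.random() for _ in range(4)] for _ in range(50)]
    for generation in range(50):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]                       # winning teams survive...
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(40)]     # ...and propagate their traits
    print(max(fitness(t) for t in population))            # approaches 0 as teams improve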

  • "3D Search a Thing of the Future"
    NewsFactor Network (05/03/04); Lopez, Jason

    Ubiquitous image-based search engines are still a long way off, but progress is being made on systems that could serve as niche tools in the near future. A 3D model search engine developed by Thomas Funkhouser of Princeton University's Shape Retrieval and Analysis Group has been on the Web since 2001; the engine mines an index of over 30,000 models to find the shapes that most closely correspond to configurations the user draws with a mouse. Funkhouser explains that shape indexing and retrieval is accomplished with statistical techniques, and notes that his team is working on a practical query interface and on shape models that generate solid results. The tool could prove useful in the short term in such fields as inventory, engineering, design, and mapmaking. Apostolos Gerasoulis, whose Teoma search engine is used by Ask Jeeves, comments that image-based search is an additional information retrieval technology that could prove very useful. He admits that the engine is too inaccurate to analyze a piece of artwork, for example; "But it might be able to look at a street and tell you where you are," he adds. Computers lag far behind the human mind when it comes to rapidly assigning context to data, particularly in the processing of images. But even if a ubiquitous image-based search engine remains a pipe dream, Funkhouser and other researchers are helping establish a foundation for search engines capable of carrying out specific functions.
    Click Here to View Full Article
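
    One statistical technique Funkhouser's group has published is the "D2" shape distribution: summarize a model as a histogram of distances between random surface-point pairs, then compare histograms. The Python sketch below applies the idea to point clouds standing in for real 3D models; the sample and bin counts are arbitrary.

    import math
    import random

    def random_sphere_point():
        # Uniform point on the unit sphere via normalized Gaussians.
        while True:
            x, y, z = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
            r = math.sqrt(x * x + y * y + z * z)
            if r > 1e-9:
                return (x / r, y / r, z / r)

    def d2_signature(points, samples=2000, bins=32):
        # Histogram of pairwise distances, normalized to be scale-independent.
        dists = [math.dist(*random.sample(points, 2)) for _ in range(samples)]
        top = max(dists)
        hist = [0] * bins
        for d in dists:
            hist[min(int(d / top * bins), bins - 1)] += 1
        return [h / samples for h in hist]

    def dissimilarity(a, b):
        # L1 distance between signatures: small means similar shapes.
        return sum(abs(x - y) for x, y in zip(a, b))

    random.seed(0)
    sphere = [random_sphere_point() for _ in range(400)]
    cube = [(random.random(), random.random(), random.random()) for _ in range(400)]
    print(dissimilarity(d2_signature(sphere), d2_signature(cube)))    # large: different
    print(dissimilarity(d2_signature(sphere), d2_signature(sphere)))  # small: same shape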

  • "Facing Facts in Computer Recognition"
    Pittsburgh Post-Gazette (05/03/04); Spice, Byron

    Face recognition is a major challenge in the development of computer vision systems, but researchers such as Henry Schneiderman of Carnegie Mellon University's Robotics Institute and former Robotics Institute director Takeo Kanade are developing software with surprising accuracy. Schneiderman has developed Face Detector, a computer program whose precision in finding faces in still images is unmatched. Researchers cannot precisely impart to a computer what a face is supposed to look like, so the computer is shown samples of faces as well as non-facial objects so that the program can develop statistical rules for identifying faces. The pixels in an image are represented by numbers in a computer, and Schneiderman's program employs low-resolution monochrome images with a 768-pixel area, which means that the computer must ascertain facial patterns by examining 768 numbers. Statistical study indicates that facial symmetry plays a key role in face recognition, with eye spacing of particular importance; Schneiderman adds that the forehead could be a valuable element as well because of its unusual brightness level and symmetrical surface. Face Detector can pick out 93 percent of the faces in a set of images while generating four false positives, and raising the percentage of detected faces also raises the number of incorrect identifications. Schneiderman's program is being touted as a security measure whose applications could include detecting people in secure locations or identifying specific faces in throngs. Schneiderman also says the program could be used to search for and organize digital camera images; it has already been used in one-hour photo processing shops for the purpose of locating eyes in photos to reduce red eye.
    Click Here to View Full Article
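
    A heavily simplified Python sketch of the statistical approach described above: learn per-pixel intensity statistics from labeled face and non-face examples, then score a new 768-number window by which class explains it better. The naive-Bayes model and synthetic training data are illustrative assumptions; Schneiderman's detector is far more sophisticated.

    import math
    import random

    PIXELS = 768  # one low-resolution monochrome window, as described above

    def make_examples(profile, n):
        # Noisy samples around a per-pixel mean intensity profile.
        return [[random.gauss(m, 20) for m in profile] for _ in range(n)]

    def train(examples):
        # Per-pixel mean and variance across the training set.
        n = len(examples)
        means = [sum(ex[i] for ex in examples) / n for i in range(PIXELS)]
        variances = [max(sum((ex[i] - means[i]) ** 2 for ex in examples) / n, 1.0)
                     for i in range(PIXELS)]
        return means, variances

    def log_likelihood(window, model):
        # Gaussian log-likelihood of the window under one class model.
        means, variances = model
        return -sum(math.log(v) / 2 + (p - m) ** 2 / (2 * v)
                    for p, m, v in zip(window, means, variances))

    random.seed(0)
    # Hypothetical class profiles: faces bright in the "forehead" region, say.
    face_profile = [180 if i < 256 else 100 for i in range(PIXELS)]
    clutter_profile = [random.randrange(256) for _ in range(PIXELS)]
    face_model = train(make_examples(face_profile, 200))
    clutter_model = train(make_examples(clutter_profile, 200))

    window = make_examples(face_profile, 1)[0]
    print(log_likelihood(window, face_model) > log_likelihood(window, clutter_model))
    # True: the window is better explained by the face statistics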

  • "Will Your Next Display Be Flexible?"
    IDG News Service (05/03/04); Shah, Agam

    The mainstream adoption of thin, flexible display screens could happen within 10 years, postulate researchers attending the recent Flexible Displays & Electronics Conference. Patricia Kinzer of conference host Intertech points to the rapid development of the flexible display industry, noting that the technology is migrating into the wearable products space as well as the handheld applications sector. Peter Slikkerveer of Philips Research reports that the mass production of flexible displays will not emerge until scientific, manufacturing, and cost-related issues are addressed. Kimberly Allen of iSuppli/Stanford Resources says that flexible display manufacturing is still too costly, and does not think that mass consumer flexible display products will be available until at least 2010. She points out that two currently available flexible display products, an electrophoretic sign from Gyricon and a curved 3D liquid crystal display (LCD) used by Nike, only have niche appeal because of their limited usability. Among the technologies discussed at the conference were organic light-emitting displays that use flexible polymer substrates, though Allen cautions that such screens would require a barrier layer to keep out water. LCD-based flexible display technology, meanwhile, faces scientific obstacles such as the image quality's vulnerability to deterioration when both the backlight and the substrate are bent. John Burgos with the U.S. Army's display technology team notes that the military is investigating potential applications for flexible displays, which could lighten the load for soldiers in the field and significantly augment their awareness of environment and battlefield conditions.
    Click Here to View Full Article

  • "NIST Quantum Keys System Sets Speed Record for 'Unbreakable' Encryption"
    EurekAlert (05/03/04)

    Scientists at the National Institute of Standards and Technology (NIST) recently demonstrated a quantum key distribution (QKD) system that successfully transmitted a stream of photons to produce a genuinely secret key at a rate that surpassed the speed of previously reported QKD systems by a factor of approximately 100. The test involved the generation of an encryption key that could be relayed back and forth between two NIST buildings separated by 730 meters; the photons are produced by an infrared laser, while keys are transmitted and received by telescopes equipped with 8-inch mirrors. A computer can generate ready-made quantum keys by using special printed circuit boards to process the data in real time. QKD is sensitive to the slightest measurements made by an eavesdropper, resulting in detectable changes at the receiver. The NIST system diverges from previously reported QKD systems in the way it recognizes a photon from the sender out of a large group of photons from other sources: The QKD photons are time-stamped by the scientists, who then search for them only when they are expected. NIST physicist Joshua Bienfang says this scheme can only be effective by keeping the observation time very brief, adding that his team has modified some high-speed communications methods to boost the photon-scanning rate. The system's 1 million bits-per-second data transfer rate makes QKD practical for streaming encrypted video and other new applications, and researchers are using the system as a testbed to devise data-handling methods associated with quantum encryption. The project was partially funded by the Defense Advanced Research Projects Agency.
    Click Here to View Full Article
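
    The sketch below simulates the sifting step of BB84, the protocol family that QKD systems of this kind typically build on: only photon positions where sender and receiver happen to choose the same measurement basis contribute key bits. It is a classical toy, with no eavesdropper, channel loss, or error correction.

    import random

    random.seed(7)
    n = 32
    alice_bits  = [random.randrange(2) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]   # rectilinear or diagonal
    bob_bases   = [random.choice("+x") for _ in range(n)]

    # A wrong-basis measurement yields a random outcome, per quantum mechanics.
    bob_bits = [bit if ab == bb else random.randrange(2)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Publicly compare bases (never bits); keep matching positions: the sifted key.
    alice_key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
                 if ab == bb]
    bob_key = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
               if ab == bb]
    assert alice_key == bob_key            # identical at both ends, about n/2 bits
    print("".join(map(str, alice_key)))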

  • "Fine-Tuning BMI"
    Computerworld Singapore (05/11/04) Vol. 10, No. 19; Sze, Tan Ee

    The Institute for Infocomm Research's (I2R) NeuroComm platform is a real-time brain signal acquisition and analysis system designed to refine brain machine interface (BMI) methodologies. The platform is based on Windows and incorporates signal processing, pattern recognition, and other areas of I2R specialization; its core component is built in standard C/C++, making NeuroComm easy to port into Linux, WinCE, Unix, and other platforms. I2R plans to use NeuroComm to tackle some of BMI research's major problems, such as the fluid nature of a person's brain state during cognitive tasks and brain signal variations between different people even when they are performing the same mental task. NeuroInformatics Lab director Dr. Guan Cuntai reports that I2R will be able to conduct new cognitive studies, confirm new user paradigms, construct demos, and build applications with the platform. "Since BMI is a highly adaptive close-loop system, and the brain signal is constantly changing and complex, so one of the challenges here is to build a novel mathematical model for the brain neuro-system and for the brain signal itself," he explains. A recent BMI breakthrough involved the implantation of chips in the brains of paralysis victims, but I2R's approach is noninvasive, and employs electroencephalography to make future BMI systems easier to use and able to accomplish more. I2R executive director Lawrence Wong notes that BMI promises much more than just giving neurologically- and movement-impaired people a new mode of communication: "Brain-machine-interface adds a new dimension to existing multi-modal human-computer interface, incorporating machines such as computers, robots, and other devices into 'neural space' as extensions of our muscles or senses," he boasts.
    Click Here to View Full Article
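
    NeuroComm's internals are not detailed above, so the Python sketch below shows only the generic shape of a noninvasive BMI loop: acquire an EEG window, extract a band-power feature, and map it to a command. The sampling rate, the 10 Hz mu-rhythm feature, and the threshold are assumptions, and synthetic signals stand in for real EEG.

    import math
    import random

    FS = 256  # assumed sampling rate in Hz

    def band_power(signal, freq):
        # Power at one frequency from a single DFT bin: a crude band-power feature.
        n = len(signal)
        re = sum(s * math.cos(2 * math.pi * freq * i / FS) for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * freq * i / FS) for i, s in enumerate(signal))
        return (re * re + im * im) / n

    def synthetic_eeg(mu_amplitude):
        # One second of 10 Hz "mu rhythm" plus noise; motor imagery suppresses it.
        return [mu_amplitude * math.sin(2 * math.pi * 10 * i / FS) + random.gauss(0, 1)
                for i in range(FS)]

    def classify(window, threshold=5.0):
        return "REST" if band_power(window, 10) > threshold else "MOVE"

    random.seed(0)
    print(classify(synthetic_eeg(1.0)))   # strong mu rhythm -> REST
    print(classify(synthetic_eeg(0.1)))   # suppressed mu rhythm -> MOVE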

  • "Crackers Redux"
    eWeek (04/26/04) Vol. 21, No. 17, P. 29; Fisher, Dennis

    Cliff Stoll chronicled the attack on Unix machines at the Lawrence Berkeley National Laboratory in Berkeley, Calif., and at university and military facilities nearly 15 years ago in his book, "The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage." The story Stoll tells, from his vantage point as a volunteer system administrator at the Berkeley lab at the time, involves methods and tactics similar to those used this spring to hack into Linux machines at Stanford University, the National Supercomputing Center for Energy and the Environment in Las Vegas, the San Diego Supercomputer Center, and some locations of the TeraGrid, the distributed network of supercomputing centers. Although security technology and techniques have improved over the years, Stoll's account offers a lesson that would have helped the security experts at Stanford and the supercomputer centers. The attackers appear to have targeted unsuspecting users to compromise their passwords, as well as the poor security practices of the supercomputer centers, in a strategy that was neither innovative nor original. It is deja vu, according to Mark Rasch, chief security counsel at Solutionary and a former U.S. attorney who prosecuted the attackers in 1986. "They start with a password compromise, which leads to a password attack, then root, then a root kit and so on," says Rasch, adding that changing guessable passwords afterwards comes a bit late. "If this guy is smart, he was creating accounts that aren't root, that they haven't found yet."
    Click Here to View Full Article

  • "Radio Freedom"
    New Scientist (04/24/04) Vol. 182, No. 2444, P. 28; O'Brien, Danny

    The rollout of practical commercial applications for a technology currently known as ultra-wideband (UWB) has been held up by bureaucratic red tape since it was first proposed by engineer Gerald Ross in 1978, and the FCC's long-in-coming authorization for such applications has ignited a storm of protest from various industries. A UWB signal, in which short radio pulses are smeared across the radio spectrum, could theoretically carry gigabits of data per second and support such things as high-speed personal area networks, intrusion detection, and personal radar. The FCC did not approve the technology for licensing until 1998, after UWB had built up a sizable stable of advocates and military experiments demonstrated no interference with other radio systems. Following the FCC's approval was a tsunami of arguments against UWB by mobile telephone companies, GPS satellite navigational systems manufacturers, the airline industry, and others, whose worries ranged from the obsolescence of expensive spectrum allocations to cumulative UWB signal interference. More intense testing followed to determine UWB's interference potential, none of which was conclusive, and in February 2002 the FCC authorized a bowdlerized version of UWB restricted to the frequency bands between 3.1 GHz and 10.6 GHz. Dewayne Hendricks of the FCC's Technological Advisory Council says this makes the technology useful only for personal area networks. "UWB is essentially crippled," he attests. Furthermore, the implementation of high-speed wireless networks, UWB's killer app, is being hindered by competing, incompatible standards, and this situation makes Hendricks and others think that the technology should be developed outside the United States to prove its non-interference once and for all.
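
    The "smeared across the spectrum" claim is basic Fourier analysis: the shorter the pulse, the wider the band it occupies. A naive DFT in Python makes the point; the Gaussian pulse shape, widths, and sampling rate are illustrative, not a model of any real UWB transmitter.

    import math

    def half_power_bandwidth(pulse_width_ns, fs_ghz=40.0, n=512):
        # Sample a Gaussian pulse, take a naive DFT, and report the frequency
        # at which spectral power first falls below half its DC value.
        dt = 1.0 / fs_ghz                  # nanoseconds per sample
        pulse = [math.exp(-(((i - n / 2) * dt) / pulse_width_ns) ** 2)
                 for i in range(n)]
        dc = sum(pulse) ** 2
        for k in range(1, n // 2):
            re = sum(p * math.cos(2 * math.pi * k * i / n) for i, p in enumerate(pulse))
            im = sum(p * math.sin(2 * math.pi * k * i / n) for i, p in enumerate(pulse))
            if re * re + im * im < dc / 2:
                return k * fs_ghz / n      # half-power point in GHz

    print(half_power_bandwidth(1.0))    # ~0.2 GHz: a leisurely 1 ns pulse stays narrow
    print(half_power_bandwidth(0.05))   # ~4 GHz: a 50 ps pulse smears across the band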

  • "Mesh Networks Winning Converts"
    Network World (05/03/04) Vol. 21, No. 18, P. 12

    Advantages of wireless mesh network technology include greater range and flexibility: "The big strength of wireless mesh is the ability to put it up and tear it down very quickly," says International Data Corp. analyst Abner Germanow. Mesh nets employ complicated algorithms that facilitate automatic discovery, routing, and rapid handoffs, and lack a single point of failure. Software links nodes together in a peer-to-peer architecture, enabling each individual wireless client to be aware of neighboring nodes and share information and images among them. In the event of node failure or congestion, mesh net nodes can reroute signals to circumvent the problem. The net's range varies according to the client's 802.11 radio, radio power levels, and the design of the antenna, while the need to deploy Ethernet cable and electrical wiring is kept to a minimum; network security also differs among vendors. Among the problems currently facing mesh network technology is a lack of commercial products and industry standards, although the Institute of Electrical and Electronics Engineers' 802.11 committee has set up a task group to address the latter challenge. Germanow points out that the networks' auto-discovery mechanism creates a lot of traffic, which is also a problem: "When you get into a large number of nodes, then the routing traffic volumes can take over with all the nodes saying 'Here I am,'" he explains. The solution to this problem, according to PacketHop's David Thompson, is to minimize chatter algorithmically, and he observes that mesh networks' strength and reliability increase as more nodes are added. Mesh network technology is proving popular as a mechanism for outdoor-based municipal and public safety wireless networks.
    Click Here to View Full Article
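
    A Python sketch of the self-healing behavior described above, using a hypothetical five-node topology: route by breadth-first search over whatever links discovery currently reports, and recompute when a node goes down. Production mesh routing protocols are considerably more involved.

    from collections import deque

    links = {  # hypothetical mesh, as adjacency lists from auto-discovery
        "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
        "D": ["B", "C", "E"], "E": ["D"],
    }

    def route(src, dst, down=frozenset()):
        # Breadth-first search that ignores failed nodes.
        frontier, paths = deque([src]), {src: [src]}
        while frontier:
            node = frontier.popleft()
            if node == dst:
                return paths[node]
            for nbr in links[node]:
                if nbr not in paths and nbr not in down:
                    paths[nbr] = paths[node] + [nbr]
                    frontier.append(nbr)
        return None  # network is partitioned: no route survives

    print(route("A", "E"))               # ['A', 'B', 'D', 'E']
    print(route("A", "E", down={"B"}))   # reroutes around the failure: ['A', 'C', 'D', 'E']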

  • "The Pursuit of Productivity"
    CIO Insight (04/04) No. 38, P. 23; Parkinson, John

    As companies seek increased productivity, they will be forced to look to offshore solutions, writes Cap Gemini Ernst & Young Americas' chief technologist John Parkinson. Given that companies continually refine their process design and focus, they will be able to achieve steadily improving productivity rates, both in terms of application development and lifetime ownership costs. The unit cost of an hour of effort is about $85 in the United States, combining labor and other support costs, and companies get more output for their money when productivity increases. Offshore solutions, however, offer a big one-time savings opportunity not available in the United States; moving operations to places such as India and China allows for dramatically lower labor costs, which are the largest portion of today's application development expenditures. Resources cost less in offshore locations: Companies can get the same one hour of effort for $65 in Canada or Ireland, for $45 in Eastern Europe or more developed areas of India, and for just $12 in "emerging" areas of India and in China. The cost equation, however, is not so simple, because companies have to factor other issues not present in purely domestic operations, such as the threat of geopolitical instability, cultural impediments, and legal and financial differences; these risks and barriers increase the unit cost of one hour of effort, but offshore options still yield savings. Not all work can be outsourced abroad yet, but the barriers to doing so are increasingly fewer and less daunting. As the forces of globalization, technological advancement, and improved education take effect in India and China, the cost equation of sending IT operations abroad will probably stabilize at about 20 percent savings over U.S. costs by 2010. The digital world is more susceptible to the effects of offshoring than is the analog world, and U.S. political and private sector leaders should adjust to the new reality, concludes Parkinson.
    Click Here to View Full Article
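
    Parkinson's unit costs invite a quick worked example in Python. The risk overheads below are hypothetical placeholders for the geopolitical, cultural, and legal frictions he says must be added back to offshore rates; they are not figures from the column.

    hourly_cost = {"U.S.": 85, "Canada/Ireland": 65,
                   "Eastern Europe/developed India": 45, "emerging India/China": 12}
    risk_overhead = {"U.S.": 0.00, "Canada/Ireland": 0.05,
                     "Eastern Europe/developed India": 0.15, "emerging India/China": 0.40}

    for region, rate in hourly_cost.items():
        loaded = rate * (1 + risk_overhead[region])   # effective cost per hour of effort
        savings = 1 - loaded / hourly_cost["U.S."]
        print(f"{region}: ${loaded:.2f}/hour, {savings:.0%} savings vs. U.S.")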

  • "Voted Down"
    Government Executive (04/04) Vol. 36, No. 5, P. 34; Harris, Shane

    The Secure Electronic Registration and Voting Experiment (SERVE) was organized under the auspices of the Federal Voting Assistance Program (FVAP) to test whether Americans living overseas could vote over the Internet and thus avoid falling victim to the vagaries of paper-based absentee voting. However, four members of a 10-person panel commissioned by FVAP to assess SERVE reported that the system was dangerously insecure, and their findings were enough to convince Deputy Defense Secretary Paul Wolfowitz to abruptly terminate the project. Among the dissenting panelists were two outspoken critics of electronic voting, former Compaq computer scientist David Jefferson and Johns Hopkins associate professor Avi Rubin, who became synonymous with e-voting criticism after disclosing flaws in Diebold voting terminals. Rubin, who describes SERVE as "the worst idea I've ever heard," explains that its weaknesses directly stem from the inherent instability and vulnerability of the Internet. Jefferson notes that an airtight online voting system would require a redesign of the entire Internet infrastructure, the construction of safer computers and operating systems, and the replacement of obsolete machines by consumers--a 10-year effort in the best-case scenario. Jefferson stands by his findings about SERVE, insisting that until computers offer airtight security, "I feel a responsibility to kill it...And to keep killing it." On the other hand, Carnegie Mellon University scientist and panelist Mike Shamos takes these findings to task, arguing that the conclusions were based on guesswork and were not arrived at scientifically. Internet voting supporters angrily contend that the four panelists who called SERVE unsafe are in error because they do not understand the electoral process. FVAP director Polli Brunelli says that eventually "brilliant people" will develop a safe online voting system, making it easier for the 6 million Americans overseas, and perhaps eventually everyone else, to vote.
    Click Here to View Full Article


 