
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 712: Friday, October 29, 2004

  • "Voting Machines Remain Unsecured, Expert Warns"
    EE Times (10/28/04); Brown, Chappell

    Speaking at an American Institute of Physics forum on the future of information technology at IBM's T.J. Watson Research Center this week, Stanford University computer security expert and former ACM President Barbara Simons warned that electronic voting systems are not yet secure enough to be entrusted with elections. "In my view, voting is a national security issue, and I have to say that I fear that what we are going to see with this upcoming election is a huge amount of chaos and a lot of questioning of results," she declared, noting that e-voting systems' security deficiencies extend beyond vulnerability to hacking. Simons said the only surefire way to record votes accurately is to use paper ballots. She was involved in a government study of voting machine security that was terminated because the panel's findings were unequivocally downbeat, and she said the system the government ultimately embraced was even more problematic. Data security and privacy constitute the two key areas of current election integrity research, and both dovetail with legal and social facets of society over which technologists have little influence. Experts caution that the abrupt introduction of computer technology into such domains may have unforeseen consequences. For instance, election officials have no direct control over the software that records and tallies votes, because it is proprietary to the machine vendors. Although voting machine makers recently agreed to give pieces of their code to NIST's National Software Reference Library, their failure to furnish source code or 11th-hour patches means election security's requirements remain largely unfulfilled, computer scientists noted.
    Click Here to View Full Article

    Barbara Simons is co-chair of ACM's U.S. Public Policy Committee (http://www.acm.org/usacm).

  • "Stamp of Reproval"
    CNet (10/28/04); Lemos, Robert

    Election officials and outside critics of electronic voting machines blame outdated federal voting standards for the imperfect systems some 30 percent of American voters will use in the presidential election. The Maryland State Board of Elections and many other organizations purchased thousands of electronic voting machines confident that those systems were certifiably in line with federal election standards, but those standards were written in 1990 and did not reflect the current technology landscape. The Federal Election Commission issued updated standards in April 2002 that experts say are an improvement, but Election Assistance Commission commissioner Paul DeGregorio says today's direct recording electronic (DRE) systems have several iterations to work through before they are fully reliable. "I think it is going to be 10 years before these issues work their way out," he says. Much has been made of inadvertently disclosed source code and other critical evaluations of DRE systems, but observers agree the most critical element in securing e-voting is tighter testing and certification procedures. Until now, states have mostly relied on federal voting standards as guidelines and have been free to implement voting as they see fit; many states that purchased new machines rushed to re-test them in the last year after realizing the inadequacy of their initial processes. The League of Women Voters expressed confidence "in the certification and standards process" in February 2004 before reversing itself six months later and calling for verifiable systems. The main problem with testing and certification processes has been the lack of accountability and rigorous standards enforcement, says election technology consultant Roy Saltman.
    Click Here to View Full Article

    For information about ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "Technology Adds New Element to Getting Out Vote"
    Los Angeles Times (10/28/04) P. A14; Menn, Joseph

    Technology could be the deciding factor in the presidential election: campaigns, activists, and political parties are applying technology to the courting of prospective voters at unprecedented levels of cost and scope. Christopher Arterton of George Washington University singles out personal digital assistants (PDAs) as perhaps the most important strategic tool in the election, because they give activists "the accessibility and portability of a database that used to be available only when you went to the headquarters." GOP volunteers have been using PDAs to note Republican residents in their neighborhoods; on election day, another set of volunteers will check off the recorded voters' names as they show up at the polls, while Republicans whose names are not checked off will be phoned and urged to vote. "What it allows us to do is reach out to more voters on election day than we could have without the technology," explains Christine Iverson with the Republican National Committee. Meanwhile, America Coming Together (ACT), a pro-Kerry organization, has been canvassing for voters in Nevada, Oregon, and other key states using Tungsten T2 PDAs. Canvassers can access a tremendous amount of data--addresses, political inclinations, gender, personal concerns, and so on--with the handhelds, so they can tailor their pitch to potential voters; the T2 can also rapidly transmit any important new information the canvassers glean to the central ACT database. The PDAs can display video clips to voters, offering a new spin on targeted advertising, while ACT technology staffers predict that additional digital neighborhood maps and added wireless communications will enhance the canvassing system even further. What is more, the information compiled through such efforts will be used to orchestrate future campaigns.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "NASA Steals the Supercomputing Crown"
    New Scientist (10/27/04); Biever, Celeste

    NASA, Intel, and Silicon Graphics researchers announced on Oct. 26 that the Columbia supercomputer achieved a sustained computing speed of 42.7 teraflops (trillions of floating-point operations per second) in a standard benchmark test, making it the fastest supercomputer in the world. Top500 supercomputing list supervisor Jack Dongarra also notes that Columbia has reached a speed of 52.7 teraflops in an unofficial run. Columbia, which is used by NASA to simulate shuttle missions and weather systems, has stolen the title from IBM's 36-teraflop Blue Gene/L, which recently trounced the reigning Top500 champion, the NEC Earth Simulator. The 10,240 Intel Itanium 2 processors the NASA system employs are split into 20 nodes of 512 processors each, which accelerate data exchange by pooling their memory. Columbia project manager William Thigpen reports that this design allows processors within a node to exchange data far faster than they could across separate nodes, and eases debugging and programming as well. The Columbia benchmark is the third announcement of supercomputing superiority since Sept. 29, when IBM declared Blue Gene/L's assumption of the crown. NEC reported on Oct. 21 that its SX-8 system had overtaken Blue Gene, although this claim has yet to be confirmed. University of Georgia computer scientist and Journal of Supercomputing editor Hamid Arabnia advises that a supercomputer's real value should be rated by its cost effectiveness, not its speed.
    Click Here to View Full Article
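
    A rough back-of-envelope sketch of the figures above, in Python (the node and processor counts and the two teraflop figures come from the article; treating every processor as participating in each run is an assumption, since early runs may have used only part of the machine):

      # Sanity-check the Columbia configuration and the implied per-processor rate.
      nodes = 20
      procs_per_node = 512
      total_procs = nodes * procs_per_node
      print(f"total processors: {total_procs}")   # 10240, matching the article

      for label, tflops in (("official benchmark", 42.7), ("unofficial run", 52.7)):
          gflops_per_proc = tflops * 1e12 / total_procs / 1e9
          print(f"{label}: ~{gflops_per_proc:.1f} gigaflops per Itanium 2 processor")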

  • "Tech Major Loses Its Luster"
    News Observer (NC) (10/28/04); Cox, Jonathan B.

    Experts warn that U.S. companies may have little choice but to offshore technology jobs as the number of domestic university computer science graduates continues to decline. The Computing Research Association estimates that the number of new U.S. undergraduate computer science majors has dropped 28 percent over the last four years, and those in the field attribute this trend to waning enthusiasm for the subject in the wake of the dot-com bust, which dashed many students' hopes of pursuing a tech career as a path to quick riches. Kevin Jeffay of the University of North Carolina's computer science department says a computer science major entails a lot of hard work and dedication that students may wish to avoid, especially if they perceive the end result of their labors to be a less-than-stellar career: "If you're going to work your butt off and have this Dilbert-like life, you don't want it," he explains. Jeffay also notes that reports of offshore outsourcing's growth are causing concerned parents to discourage their children from taking up computer science. Some experts believe undergraduate computer science students who pursue degrees despite the industry recession will have better chances of employment, since companies will be drawn to the higher level of quality such dedication signifies. Colleges have also started to supplement business and other majors with computer instruction, which may make a computer science degree redundant. Furthermore, the ranks of qualified workers are swelling as experienced employees laid off over the last several years further their education and acquire graduate degrees.
    Click Here to View Full Article

  • "A New Rule of Cursor Control: Just Follow Your Nose"
    New York Times (10/28/04) P. E7; Austen, Ian

    Notable examples of cutting-edge computer navigation technology designed as alternatives to the mouse include the nouse, a nose-tracking tool developed by computer vision scientist Dr. Dmitry Gorodnichy of the National Research Council of Canada. The nouse consists of a Web camera that uses software to track the movements of the user's nose and translate them into cursor movements. Gorodnichy says the nose is a more convenient target for computer vision systems than the eyes: The nose is easy to locate, and the position of its tip remains more or less consistent; the nouse is also programmed to turn itself on or off when the user blinks twice. Gorodnichy believes the nouse would initially be used by paralysis victims in hospitals, who could summon help through eyeblinks. He says the system was developed to enhance the computer's capabilities rather than to replace the mouse. Another unique navigation technology is being implemented by a company that makes motorized wheelchair control systems as a replacement for breath-based controllers: The tool, Think-A-Move, tracks the movements of the user's tongue inside the mouth by measuring related sound changes in the ear canal. The users wear a gadget with a microphone that picks up the ear signals and relays them to a computer or some other electronic device via a Bluetooth wireless link. Think-A-Move CEO Jim West notes that users must train the software to recognize the auditory changes signaling tongue movements through repetition, as well as assign specific functions to these movements.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
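
    For a feel of how a camera-based pointer like the nouse can be built, here is a minimal Python/OpenCV sketch that tracks a patch of the face by template matching and reports how it moves. It is only an illustration of the general idea, not Dr. Gorodnichy's actual algorithm, and it assumes the user starts with the nose near the center of the frame:

      import cv2

      cap = cv2.VideoCapture(0)                 # default webcam
      ok, frame = cap.read()
      if not ok:
          raise RuntimeError("no camera frame available")
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      h, w = gray.shape
      cy, cx, half = h // 2, w // 2, 20         # assume the nose starts near the center
      template = gray[cy - half:cy + half, cx - half:cx + half]
      prev = (cx, cy)

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
          _, _, _, best = cv2.minMaxLoc(scores)          # location of the best match
          nose = (best[0] + half, best[1] + half)
          dx, dy = nose[0] - prev[0], nose[1] - prev[1]  # would become cursor movement
          prev = nose
          cv2.circle(frame, nose, 5, (0, 255, 0), -1)
          cv2.imshow("nouse sketch", frame)
          if cv2.waitKey(1) & 0xFF == 27:                # Esc quits
              break
      cap.release()
      cv2.destroyAllWindows()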

  • "In Healthy Indian Job Market, Developers Want It All"
    IDG News Service (10/28/04); Ribeiro, John

    Software developers in India are once again commanding steep salary increases and other compensation as outsourcing firms there are constrained by worker shortages. Bangalore-based Wipro could take on more contracts if it had more workers, says talent engagement and development vice president Bijay Sahoo; in the third quarter alone, Wipro hired 3,300 new software developers. Indian IT companies sometimes receive nearly 100 applications for each position, but many come with falsified resumes or are from unqualified people, says Kenexa Technologies managing director Raghuveer Sakuru. Firms prefer to hire from their competitors, since those people become productive faster than recent graduates. India is producing only about 80,000 university-trained engineers per year, compared with the approximately 110,000 new graduates needed to sustain 30 percent annual growth in the software and services sector, according to TVA Infotech CEO Gautam Sinha. Recent hiring sprees by multinationals such as IBM and Accenture, combined with the drop in university IT enrollments during the dot-com bust, have left the industry short of trained workers; this deficit means qualified personnel expect salary raises of at least 10 percent at each six-month review, and salary offers roughly 70 percent above their former pay when switching jobs, says Sakuru. Companies are also paying attention to other aspects of retention, including training, international assignments, and exposure to new technology. Sakuru says India still offers cost benefits of 60 percent to 70 percent even with the rapid rise in salaries, and that future increases will narrow the margin but still leave enough cost advantage to keep India competitive.
    Click Here to View Full Article
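
    The compensation numbers above compound quickly; a small back-of-envelope sketch in Python (the 10 percent-per-review figure is from the article, while the starting index and the 65 percent cost advantage are illustrative assumptions):

      def salary_after(start, reviews, raise_per_review=0.10):
          """Salary after a number of six-month reviews with a fixed raise each time."""
          return start * (1 + raise_per_review) ** reviews

      offshore_start = 100.0              # index an Indian developer's salary at 100
      onshore = offshore_start / 0.35     # implied onshore index if India is 65% cheaper
      for years in (1, 2, 3):
          s = salary_after(offshore_start, reviews=2 * years)
          print(f"after {years} year(s): salary index {s:.0f}, "
                f"offshore cost is {s / onshore:.0%} of onshore")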

  • "Data-Mining"
    California Computer News (10/27/04)

    A multi-university collaboration between English literature and computer science researchers seeks to apply data-mining technology to digital library collections. The effort, spearheaded by the University of Illinois at Urbana-Champaign's Graduate School of Library and Information Science (GSLIS), has won almost $600,000 in funding from the Andrew W. Mellon Foundation to develop a data-mining scheme that can find interesting new patterns in volumes of British and American literature. GSLIS dean John Unsworth says statistical analysis has often been used in literary study, but this is one of the earliest applications of data mining. The project builds on data-mining tools already developed by the University of Illinois' National Center for Supercomputing Applications, and will be applied to terabytes of humanities resources made available online. Unsworth says complex analysis of broad sets of works should yield interesting new discoveries and lead to better-targeted literary research. The project team has already tested its technology on the full set of Shakespeare plays, finding that "Othello" shares many characteristics with the comedies, pointing the way to interesting academic study. Unsworth says the technology will prove even more useful when applied to larger bodies of work. Collaborators on the Web-based Text Mining and Visualization for Humanities Digital Libraries project include English literature academics from the University of Georgia and the University of Maryland, plus computer science colleagues from the University of Virginia and other institutions.
    Click Here to View Full Article
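
    A minimal sketch of the kind of comparison such a system can automate: representing each text as a word-frequency profile and measuring cosine similarity between profiles. This is a generic illustration of statistical text mining, not the project's NCSA-based toolset, and the one-line "plays" below are stand-ins for full texts:

      import math
      import re
      from collections import Counter

      def profile(text):
          """Bag-of-words vector: lowercased word counts."""
          return Counter(re.findall(r"[a-z']+", text.lower()))

      def cosine(a, b):
          """Cosine similarity between two word-count vectors."""
          dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
          norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
          return dot / norm if norm else 0.0

      plays = {
          "tragedy": "o woe the grave the poison the blood and death and death",
          "comedy_one": "love and marriage and jest and wit and a merry wedding feast",
          "comedy_two": "a merry jest of love and wit ends in a wedding and a feast",
      }
      for name in ("tragedy", "comedy_one"):
          sim = cosine(profile(plays[name]), profile(plays["comedy_two"]))
          print(f"{name} vs comedy_two: similarity {sim:.2f}")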

  • "Hacking--Do the Pros Now Rule?"
    CNet (10/28/04); Kiat, Ong Boon

    Internet Security Systems chief scientist Robert Graham warns that professional hackers have gained ascendancy in 2004, a development with sinister ramifications for corporate cybersecurity. He says hackers are becoming more coordinated, and the motivation for their exploits is shifting from thrill-seeking to financial profit; furthermore, hackers have started to author their own malware rather than simply downloading widely available attack programs or following well-known intrusion strategies. Another alarming trend Graham identifies is virus writers racing to see how fast they can alter their malware to thwart the latest antivirus signatures, which is leading to increasing numbers of virus variants. The security guru maintains that firewall technology cannot improve any further because of commoditization, and the more complex firewall applications require greater skill from users to remain effective. Graham doubts that application firewalls will make a big impact in the future, arguing that "With applications that people are running on the Web, no amount of additive things can cure fundamental problems that are already there in the first place." He predicts that the next major security issues will revolve around voice over IP and general packet radio service, since the former lacks protocol-level encryption and authentication, while the latter involves widely open systems shared by mobile operators. Graham points out that hackers' motives and techniques may be changing, but viruses have remained at a more or less flat level of sophistication over the last several years; he says malware is following a gradual, rather than dramatic, evolutionary arc.
    Click Here to View Full Article

  • "Haptics: Cybertouch and How We Feel About It"
    InformIT (10/22/04); Rowell, Laurie

    Robotics researchers still encounter significant roadblocks in haptics--the field concerned with feeding touch and other physical sensory information into computer systems and back to users' hands, skin, or mouths. The sheer volume of information generated by physical interaction with objects is one of the main barriers, and the most sensitive touch sensors currently on the market cannot detect surface variations as fine as those fingertips can perceive. And while Moore's Law has dramatically lessened the cost of computing technology, it has not applied to mechanical components such as motors or magnets. Georgia Tech researchers are working on magnetorheological (MR) fluids that could help take some of the risk out of imprecise haptic sensing, since systems using the fluid can only resist or redirect energy applied by the user, decreasing the chance of injury. Artificial-hand research has progressed in the last few years, but users still have to rely mainly on their eyesight to guide hand motions for tasks such as tying shoelaces. In another Georgia Tech project, at the Intelligent Machine Dynamics Laboratory, people can remotely control a forklift through a haptic, Internet-connected interface; the forklift has sensors that let users feel walls or other obstacles they might damage with it. Physical therapy is one field where haptics holds tremendous promise right now, with researchers at Rutgers University creating haptic devices that monitor muscle development in injured hands and ankles. The glove device lets patients feel a virtual ball or other feedback, then records resistance and other medical data to help provide optimal physical therapy.
    Click Here to View Full Article
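
    As a flavor of how force feedback is commonly rendered in software, here is a Python sketch of the classic "virtual wall": the device pushes back in proportion to how far the user's tool tip has penetrated the wall (a simple spring, or penalty, model). It illustrates the general technique only, not the Rutgers glove or the Georgia Tech forklift interface, and the stiffness value is an assumption:

      WALL_X = 0.05       # wall position along one axis, in meters
      STIFFNESS = 800.0   # virtual spring constant, newtons per meter (assumed)

      def wall_force(tip_x):
          """Resistive force for a one-dimensional virtual wall at WALL_X."""
          penetration = tip_x - WALL_X
          if penetration <= 0.0:
              return 0.0                    # free space: no force
          return -STIFFNESS * penetration   # push the tip back out of the wall

      for tip_x in (0.00, 0.03, 0.05, 0.06, 0.08):
          print(f"tip at {tip_x:.2f} m -> force {wall_force(tip_x):+.1f} N")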

  • "Adding Reliability and Trust to Smartcards"
    IST Results (10/27/04)

    VerifiCard project coordinator Bart Jacobs notes that the European smart card industry has been unable to acquire certification at the topmost levels of the Common Criteria computer security standard due to a "lack of infrastructure--in the form of formal descriptions of smart card platforms and security policies, plus methods and tools to verify their security." IST's VerifiCard, which involves a consortium of leading European universities, research centers, and smart card manufacturers, has devised technology to confirm that the latest wave of Java Card smart cards complies with the standard: it lets users specify security requirements and developers specify product security attributes, while giving evaluators the means to verify such claims. Java Card adheres to all smart card specifications, eliminating the need to change existing smart card-aware applications. Deployed in a Java Card's read-only memory is a Java Virtual Machine that functions as the card's operating system, controls access to all resources inside the card, and supports signature, log-in, loyalty, and similar applications. Flexibility and upgrades can be added after the card has been issued to the customer by downloading Java code, and Jacobs cites several initiatives that use the Java Card standard, such as a biometric ID card project undertaken by the Dutch government. Using VerifiCard technology such as the Jakarta toolset, developers were able to specify the operations needed to verify the correctness of the cards and their applications. VerifiCard thus establishes higher levels of trust and confidence in smart cards, delivering organizational and societal benefits as the number of dependable smart card applications increases.
    Click Here to View Full Article

  • "Mixing Biology and Electronics to Create Robotic Vision"
    University of Arizona (10/22/04); Stiles, Ed

    Charles Higgins, assistant professor of electrical and computer engineering at the University of Arizona, describes current robotic vision systems as "pitiful," and is working to significantly upgrade the technology by developing electronic vision circuits based on insect vision-processing systems. Such a goal is typical of the field of neuromorphic engineering, which Higgins believes could lay the foundation for robots with true sight and sight-based response capabilities--which could dramatically advance human-robot interaction--within one or two decades. His overall objective is for robotic vision to evolve the way robotic speech processing has over the last 30 years. A team of students led by Higgins is working on an airborne visual navigation system, using analog integrated circuits that mimic insect visual behaviors such as self-motion estimation, obstacle avoidance, and target tracking in a pair of miniature blimps. The circuits rely on parallel processing rather than standard processors because parallelism better imitates biological problem-solving. Higgins aims to create a microchip-based vision system that can track a moving target while ignoring objects of similar shape or color, or a processor that can recognize different objects. The professor says it is not his intention for such a system to completely replicate an insect's visual capabilities: "I'm looking at...a very specific vision subsystem that accomplishes a specific task," he explains. Higgins imagines "vision chips" that cost only $20 each, allowing people to purchase them in mass quantities.
    Click Here to View Full Article
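
    The standard textbook model of the insect motion sensing that such circuits emulate is the Hassenstein-Reichardt "elementary motion detector," which correlates the delayed output of one photoreceptor with the undelayed output of its neighbor. The Python sketch below implements that textbook model, not Prof. Higgins' actual circuit design:

      import numpy as np

      def reichardt(left, right, delay=2):
          """Directional motion signal from two neighboring photoreceptor traces.

          Positive output indicates motion from the 'left' receptor toward the
          'right' one; negative output indicates the opposite direction.
          """
          d_left = np.roll(left, delay)     # delayed copy of the left receptor
          d_right = np.roll(right, delay)   # delayed copy of the right receptor
          return float(np.mean(d_left * right - d_right * left))

      t = np.arange(200)
      stimulus = np.sin(2 * np.pi * t / 40.0)   # periodic pattern seen by both receptors
      rightward = np.roll(stimulus, 2)          # right receptor sees the pattern later
      leftward = np.roll(stimulus, -2)          # right receptor sees the pattern earlier
      print(f"rightward motion: {reichardt(stimulus, rightward):+.3f}")   # positive
      print(f"leftward motion:  {reichardt(stimulus, leftward):+.3f}")    # negative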

  • "Light-Based Net May Mean Blazing Connections"
    NewsFactor Network (10/27/04); Martin, Mike

    A team of Canadian researchers reports in Nano Letters that experiments with a new class of materials demonstrate a key enabling technique for laser-guided fiber-optic communications systems theoretically capable of boosting the Internet's transmission speed 100-fold. Using a hybrid buckyball-polymer film in which photons can effectively sense one another's patterns, University of Toronto researchers Ted Sargent and Qiying Chen have shown that the material can process data transmitted at telecommunications wavelengths. The film, which was developed by Carleton University researchers Wayne Wang and Connie Kuang, bridges the "Kuzyk quantum gap" that has limited molecular materials' ability to switch light signals with light. "With this work, the ultimate capacity to process information-bearing signals using light is within our practical grasp," boasts Sargent. The breakthrough could lead to the development of photonic network systems that send signals around the world with picosecond switching times. Sargent explains that his team's work spotlights nanotechnology's ability to design and fabricate custom materials at the molecular level. Washington State University physicist Mark Kuzyk, whose earlier work identified the quantum gap that bears his name, is thrilled by the discovery. "This intelligent nanoscale approach to engineering nonlinear optical materials, which is guided by principles of quantum physics, is the birth of a new and significant materials development paradigm in synthetic research," he declares.
    Click Here to View Full Article

  • "Is P2P Dying or Just Hiding?"
    CAIDA.org (10/04); Karagiannis, Thomas; Broido, Andre; Brownlee, Nevil

    UC Riverside's Thomas Karagiannis, the Cooperative Association for Internet Data Analysis' (CAIDA) Andre Broido, et al. dispute popular media reports that peer-to-peer (P2P) file-sharing has declined precipitously in the last year, and contend that the reverse is actually the case. The authors attempted to measure P2P traffic at the link level more accurately by gauging traffic of all known popular P2P protocols, reverse engineering the protocols, and labeling distinctive payload strings. The results support the conclusion that 2004 P2P traffic is at least comparable to 2003 levels, and that rigid adherence to conventional P2P traffic measurement techniques leads to underestimates. The percentage of P2P traffic was found to have increased by about 5 percent relative to total traffic volume. Furthermore, comparisons between older and current P2P clients revealed that the use of arbitrary port numbers was optional in older clients, while current clients randomize the port number upon installation without the need for user action. Meanwhile, P2P population studies found that the ranks of participating IPs grew by about 60,000 in the last year, and the number of ASes participating in P2P flows expanded by roughly 70 percent. These findings outline several trends, including evolving tension between P2P users and the entertainment sector; increasing demand for home broadband links; plans to steer P2P applications into profitable traffic configurations; and a significant transformation in supply and demand in edge and access networks, provided that P2P traffic maintains its growth and legal entanglements are resolved.
    Click Here to View Full Article
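
    The heart of the measurement method described above is matching flow payloads against protocol-specific strings instead of trusting well-known port numbers. A minimal Python sketch (the two signatures shown are well-known handshake markers; CAIDA's actual signature set is far larger):

      SIGNATURES = {
          b"\x13BitTorrent protocol": "BitTorrent",   # handshake: length byte 19 + protocol name
          b"GNUTELLA CONNECT": "Gnutella",            # Gnutella connection request
      }

      def classify_payload(payload: bytes) -> str:
          """Label a flow's first payload bytes as a known P2P protocol, or 'unknown'."""
          for marker, proto in SIGNATURES.items():
              if marker in payload[:64]:              # signatures appear near the start of a flow
                  return proto
          return "unknown"

      # A BitTorrent handshake is identified even on an arbitrary, non-standard port.
      print(classify_payload(b"\x13BitTorrent protocol" + bytes(48)))      # -> BitTorrent
      print(classify_payload(b"GET / HTTP/1.1\r\nHost: example.org\r\n"))  # -> unknown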

  • "Getting Intelligent About the Brain"
    CNet (10/27/04); Shim, Richard; Fried, Ina

    In his book, "On Intelligence," Palm Pilot creator Jeff Hawkins theorizes that the brain does not generate an output for every input, as most scientists think, but instead stores experiences and sequences as memories, and taps them to make predictions. He further reasons that researchers can apply this theory to the creation of truly intelligent machines. Hawkins stresses that smart machines should be based on the cortex, the upper part of the mammalian brain where intelligence is thought to reside, rather than the "old brain" that comprises the middle region of the neural organ. He attributes the shortcomings of current computer technology and artificial intelligence to their failure to truly understand the job at hand, because computers merely match a pattern against a template. "Our brains work on a completely different principle than computers," Hawkins posits. "It doesn't mean you can't emulate a brain on a computer, but you have to understand what the brain is doing, first." The researcher believes he has established the fundamental principles needed to understand the common algorithm that enables the brain to process diverse inputs in a similar fashion, and predicts that commercial startups will be working on the issue within a year. Hawkins also thinks his brain research and his work at PalmOne complement one another, and collectively embody computing technology's future. He says, "In PalmOne, it's the future of personal computing where devices that fit in your pocket are going to have superfast wireless connections. The work here is like the future of computing in general."
    Click Here to View Full Article
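
    A deliberately tiny Python illustration of the memory-and-prediction idea (store observed sequences, then predict the most familiar continuation); it is a toy stand-in for the intuition, not Hawkins' cortical model:

      from collections import Counter, defaultdict

      class SequenceMemory:
          def __init__(self, order=2):
              self.order = order                     # how many prior items form the context
              self.memory = defaultdict(Counter)     # context tuple -> counts of what followed

          def learn(self, sequence):
              """Store every (context, next item) pair observed in the sequence."""
              for i in range(len(sequence) - self.order):
                  context = tuple(sequence[i:i + self.order])
                  self.memory[context][sequence[i + self.order]] += 1

          def predict(self, context):
              """Return the most frequently remembered continuation, or None."""
              counts = self.memory.get(tuple(context))
              return counts.most_common(1)[0][0] if counts else None

      m = SequenceMemory(order=2)
      m.learn("the cat sat on the mat".split())
      m.learn("the cat sat on the sofa".split())
      print(m.predict(["the", "cat"]))   # -> 'sat' (seen twice)
      print(m.predict(["on", "the"]))    # -> 'mat' or 'sofa' (seen once each)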

  • "Japan Battles to Reclaim Supercomputer Crown"
    Nikkei Weekly (10/11/04) Vol. 42, No. 2, P. 16; Matsuda, Shogo

    Japan no longer has the world's fastest supercomputer in its Earth Simulator, which has reached a top computing speed of 35.86 teraflops. IBM's new Blue Gene/L supercomputer has reached 36.01 teraflops, and Lawrence Livermore National Laboratory in Livermore, Calif., will receive a final version of the machine next year for research in astrophysics, cosmology, and genetic science. IBM's new supercomputer is a sign that the race for the fastest supercomputer is heating up once again, and China appears to be a serious challenger for the supercomputer crown, with the 11th-fastest machine now installed at its Shanghai Supercomputer Center. Meanwhile, Japan's government-affiliated Institute of Physical and Chemical Research (Riken) plans to develop a chip for the specific purpose of discovering substances that could become new drug ingredients, and to incorporate 3,000 of the new chips into a specialized supercomputer by 2006; that specialized architecture should allow the $9 million machine to reach 1,000 teraflops on its particular task. Separately, the Japanese government is teaming up with NEC, Hitachi, and Fujitsu to develop, by about 2010, a supercomputer that would reach 1,000 teraflops.

  • "Think Like a Customer, Use Your Stopwatch"
    Software Development Times (10/15/04) No. 112, P. 26; Koch, Geoff

    Many veteran software developers rely on simple approaches and practicality when it comes to tuning and optimizing software performance. When carrying out performance testing, Adobe Systems software scientist Rich Gerber explains, "I just use the application the way a user would while using the simplest of all tools--a stopwatch." He notes that the simple action of activating compiler flags can facilitate tuning, while even more performance can be squeezed out by revising the algorithm or optimizing for some of the newer processor instructions, and adds that he and his development team eschew modifications that become too hard to maintain in the future. In his book, "Software Optimization Cookbook: High-Performance Recipes for the Intel Architecture," Gerber suggests that a developer determine the best time to cease tuning by figuring how close the code is to its theoretical maximum performance, as long as the compiler is behaving as expected; if the performance is too slow, Gerber recommends changing the algorithm. Randy Camp with Musicmatch says that his developers' approach to performance tuning and optimization is to deal with the most used components of their product and concentrate on the slowest, rather than the most easily optimizable, parts. Musicmatch's strategy when encountering a bottleneck is to localize it, evaluate the existing performance of the target subsystem to build a reference so developers can measure improvements as they make revisions, and employ a profiling tool to assess where the time is being spent in the subsystem. Info Touch Technologies CTO Roy Goncalves emphasizes algorithmic optimization, which focuses on completely new approaches to carrying out tasks rather than tweaking existing code in increments. Because algorithmic optimizations have a better chance of resulting in exponential performance upgrades, cost savings, or new products, Goncalves is more likely to let algorithmic optimization efforts continue longer than code-level tuning efforts.
    Click Here to View Full Article
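
    The two measurement habits described above (time the whole operation the way a user would experience it, then profile to see where the time actually goes) look roughly like this in Python; the "slow_feature" function is just a hypothetical stand-in for the code being tuned:

      import cProfile
      import time

      def slow_feature(n=200_000):
          """Stand-in for the product feature being tuned."""
          squares = sum(i * i for i in range(n))
          biggest = sorted(range(n), reverse=True)[0]
          return squares + biggest

      # 1. Stopwatch: the end-to-end time a user would feel.
      start = time.perf_counter()
      slow_feature()
      print(f"wall-clock time: {time.perf_counter() - start:.3f} s")

      # 2. Profiler: localize the bottleneck before rewriting anything.
      cProfile.run("slow_feature()", sort="cumulative")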

  • "Carriers and Users Prepare to Midwife ENUM"
    Internet Computing (10/04) Vol. 8, No. 5, P. 7; Goth, Greg

    ENUM, the facilitation of IP-enabled applications such as end-to-end voice-over-IP (VoIP) via the mapping of globally recognized phone numbers to the Domain Name System, has long been touted as a scheme in which end users control the provisioning, attributes, and accessibility of their assigned numbers; but recent industry activity and brainstorming by standards bodies support a carrier/infrastructure ENUM framework whereby carriers control information for their own connectivity designs. This development has spurred the Internet Engineering Task Force's ENUM working group to further explore the technology in order to establish a firmer foundation for mainstream ENUM. VoIP connectivity cannot be guaranteed in the current environment because of too many incompatibilities between Session Initiation Protocol (SIP) and the International Telecommunication Union's H.323 protocol, and also too many incompatibilities within SIP implementations. Furthermore, ENUM boosters are wondering about the possibility of deploying carrier ENUM based on its assigned e164.arpa domain in the face of end users' and carriers' disparate requirements. Neustar's Tom McGarry thinks this debate might play a chief role in the postponement of the creation of a nationwide ENUM Tier-1 administrative body in the United States. The ENUM working group is at the center of a debate stemming from the particularities of implementing carrier ENUM in the original e164.arpa framework: A pair of AT&T engineers contend that Tier 1 should incorporate two naming authority pointer records with distinct service guidelines for a given number--one would point to the Tier 2 records of the end user, and the other to the Tier 2 records of the carrier. This proposal is facing opposition from group members who think that technical problems could arise from such a scheme, and who note that government worries about housing carriers' records in the U.S.-controlled .arpa domain could mirror similar worries about public ENUM, hindering such deployment.
    Click Here to View Full Article
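
    The ENUM mapping itself is simple and specified in RFC 3761: reverse the digits of an E.164 number, separate them with dots, and append the e164.arpa zone; a DNS NAPTR query on the resulting name then returns URIs (for example, a SIP address) for reaching that number. A short Python sketch of the name construction:

      def enum_domain(e164_number: str, zone: str = "e164.arpa") -> str:
          """Convert an E.164 phone number into its ENUM domain name (RFC 3761)."""
          digits = [c for c in e164_number if c.isdigit()]
          return ".".join(reversed(digits)) + "." + zone

      print(enum_domain("+1-202-555-0123"))
      # -> 3.2.1.0.5.5.5.2.0.2.1.e164.arpa
      # In public ENUM the end user controls the NAPTR records published at this
      # name; under the carrier/infrastructure proposals debated above, the
      # carrier would hold a parallel set of records.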

  • "Technology Review 100: Computing"
    Technology Review (10/04) Vol. 107, No. 8, P. 50; Savage, Neil

    Many of this year's TR100 honorees in the field of computing are recognized for their efforts to increase connectivity in some way, whether by effecting chip-to-chip or system-to-system communication, person-to-person interaction, or enhanced understanding of the world by computers. Notable recipients include Aref Chowdhury with Lucent Technologies' Bell Labs, whose optical phase conjugator can boost the amount of data that can be compressed into fiber by reversing the phase of an optical pulse; Millennial Net CTO Sokwoo Rhee, inventor of a technique for binding wireless sensors into a self-configuring network with potential applications that include environmental control in buildings and remote surveillance of people and objects; and Microsoft's Nuria Oliver, who is integrating microphones and cameras with statistically based machine learning so that computers can more intuitively deduce users' goals and emotions by reading physical cues such as facial expressions or tone of voice. TurnTide co-founder Dave Brussin has applied the rules governing the routing of email to an anti-spam solution in which a router analyzes the content and origin of messages to determine whether a computer is sending spam, and can limit the number of messages that computer can issue based on its assessment. Honorees whose work involves building communities and social networks include Scott Heiferman, whose Web-based Meetup.com tool has helped bring together like-minded people for the purposes of organizing political rallies, among other things. Meanwhile, DuPont's photonics R&D program leader Maria Petrucci-Samija is cited for her commitment to combining more fiber-optic network elements on a single chip using molecularly-tailored polymers. And IBM researcher Daniel Gruhl earns his place on the list with WebFountain, his supercomputer-based system for analyzing millions of Web pages and extracting their context through natural-language processing, pattern recognition, and statistics.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)


 