
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 801:  Wednesday, June 8, 2005

  • "With Intel Inside Apple, Macs May be Faster, Smaller"
    Wall Street Journal (06/07/05) P. B1; Wingfield, Nick; Clark, Don; Forelle, Charles

    Apple plans to introduce Macintosh computers running on Intel chips by next June, creating an opportunity for the machines to become more powerful, faster, and smaller, since Intel chips generate less heat. The success of this venture hinges on Apple convincing its thousands of software developers to adapt their application programs to the Intel-based systems. Apple CEO Steve Jobs told attendees at the company's annual software developer conference yesterday that the future Macs will ship with the Rosetta utility program, which translates existing applications so they can run on the new chips. He cited Apple's disappointment with IBM's G5 chips, which the Intel chips are supposed to replace: Jobs said the G5 could not deliver the 3GHz clock speed that he had promised Macs would be capable of, nor has IBM produced a G5 that operates in notebook computers. Without explaining how, the Apple CEO said the operating system for Intel-based Macs will not run on other Intel-based computers, and he said Apple will not offer technical support to users who want to run Windows on future Macs, even though the machines could run it as efficiently as other PCs do. Even without that selling point, some analysts believe the capability upgrades could be enough to convince companies to adopt the new machines. Jobs outlined several initiatives to help developers convert their products to run on either Intel or PowerPC microprocessors. He said developers of fewer than 20 percent of the programs would need to start using Xcode before the conversion, and that for many developers the transition would take only a few weeks.

  • "Technology, Modernity May Change Future Elections"
    Associated Press (06/07/05); Tanner, Robert

    A June 7 report from a task force of officials and former officials from 15 U.S. states, organized by the Election Center in Houston, TX, calls for a restructuring of elections to support the wide deployment of "universal vote centers." Such facilities, which were set up in Colorado two years ago on a trial basis, would be accessible to every resident and thus eliminate confusion over where to vote. There would be less waiting in line because these centers would accommodate more equipment and personnel, while their concentration of well-trained election officials and day workers would help reduce mistakes; Larimer County, CO, clerk Scott Doyle adds that vote centers would also save counties a lot of money by lowering the number of handicapped-accessible voting machines that must be purchased. The task force study supports a movement to allow voters to cast their ballots over a period of days or weeks, instead of over the course of a single day. The group also thinks states should share voter registration data with one another to prevent duplication and fraud. The report calls for state lawmakers to consider an "independently verifiable" record of each ballot from touchscreen voting machines, a recommendation that stops short of the growing demand that such machines provide paper trails. Johns Hopkins University computer science professor Avi Rubin warns that election officials would be making a serious mistake by ignoring the machines' security vulnerabilities, which critics claim could be used to perpetrate election fraud.
    Click Here to View Full Article

    For information on ACM's stand on e-voting machines and technology, visit http://www.acm.org/usacm.

  • "Computer Science Majors Dwindling at U.S. Colleges"
    Iowa City Press-Citizen (06/06/05); Schorsch, Kristen

    Interest in computer science among college students is flagging, despite indications that well-paying job opportunities for computer-science majors abound. The Computing Research Association estimates the number of newly declared U.S. computer science majors fell 39%, from 16,000 in fall 2000 to approximately 10,000 in fall 2004. Collegerecruiter.com President Steve Rothberg admits that significantly fewer IT jobs are available compared to five years ago, but says there are signs of a recovery. James Cremer, chairman of the University of Iowa's computer science department, says globalization and offshore outsourcing of tech jobs have contributed to the decline in computer science majors, but adds that competition permeates every field, so better skills are a must for students in any discipline. He has high hopes that computer science will remain a strong major throughout the 21st century, and says UI is overhauling its introductory computer science courses so they do not concentrate exclusively on programming. Cremer believes students taking such courses will become fluent in more wide-ranging subjects such as security, cryptography, and genomic research. UI senior Keith Turner says many students take computer science as part of a double major, using it as a foundation for the other subject they are studying. The National Association of Colleges and Employers reports that software design and development ranks among the leading jobs for 2004-2005 college graduates, with an average salary offer of $53,729, while Collegerecruiter.com lists information technology as its top job category for employers and job seekers.
    Click Here to View Full Article

  • "Redefining the Power of the Gamer"
    New York Times (06/07/05) P. B1; Schiesel, Seth

    More than 100 game makers and academic computer experts attended the first Artificial Intelligence and Interactive Digital Entertainment conference last week to check out new games, many of which are character-driven, free-form, and controlled by AI methods. One example was "Facade," a domestic drama in which the player interacts with virtual characters who exhibit a wide range of emotions in response to the conversational English the player types in. "As we try to create more immersive experiences, these artificial intelligence techniques are helping drive games forward and this is one of the areas that could really explode," said Electronic Arts chief creative officer Bing Gordon. Mad Doc Software CEO Ian Lane Davis said graphics are losing their status as a game differentiator as graphics technology advances, prompting developers to find other ways to distinguish games, such as through the realism of artificial worlds and virtual characters' behavior. Also encouraging developers' new focus on AI is players' demand for more free-form games with broader game-play options and a more variable gaming experience. Monolith Productions AI programmer Jeff Orkin noted at the conference that traditional scripting is becoming a less and less workable method for directing the behaviors of virtual characters and environments, given the complexity of the game worlds and the diverse potential courses of action that can be taken. Monolith developers are now creating games in which the artificial agents are given goals, but must figure out how to reach them by absorbing available data and extrapolating their own strategies (a simplified sketch of that idea follows this item). Another game, "Spore," is unique in that it uses AI to adapt the gaming experience to the player, rather than the other way around.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
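
    The goal-driven approach Orkin describes can be illustrated with a toy planner: each action declares preconditions and effects, and the agent searches for any sequence of actions that satisfies its goal rather than following a fixed script. The Python sketch below is a minimal illustration of that general idea; the action names and state model are assumptions made for the example, not Monolith's actual code.

      from collections import deque

      # Toy goal-driven agent: actions declare preconditions and effects, and
      # the agent searches for an action sequence that satisfies its goal.
      # Illustrative only -- names and structure are assumptions, not Monolith's code.
      ACTIONS = {
          "draw_weapon":  {"pre": set(),                 "add": {"armed"}},
          "find_cover":   {"pre": set(),                 "add": {"in_cover"}},
          "attack_enemy": {"pre": {"armed", "in_cover"}, "add": {"enemy_suppressed"}},
      }

      def plan(start, goal):
          """Breadth-first search over world states for a sequence of actions
          whose combined effects satisfy the goal."""
          frontier = deque([(frozenset(start), [])])
          seen = {frozenset(start)}
          while frontier:
              state, path = frontier.popleft()
              if goal <= state:
                  return path
              for name, act in ACTIONS.items():
                  if act["pre"] <= state:
                      nxt = frozenset(state | act["add"])
                      if nxt not in seen:
                          seen.add(nxt)
                          frontier.append((nxt, path + [name]))
          return None

      print(plan(set(), {"enemy_suppressed"}))
      # e.g. ['draw_weapon', 'find_cover', 'attack_enemy']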

  • "Pacman Comes to Life Virtually"
    BBC News (06/06/05); Sandhana, Lakshmi

    Researchers at the National University of Singapore's Mixed Reality Lab have developed an augmented reality (AR) version of the Pacman arcade game in which human beings stand in for Pacman and his Ghost opponents, using a combination of virtual reality, global positioning system (GPS) technology, Bluetooth, infrared, sensors, and Wi-Fi. Players wear headsets that superimpose virtual cookies and other objects on their view of the real world, and can power up to neutralize Ghosts by acquiring Bluetooth-embedded sugar jars; online gamers can also keep tabs on players' progress through a virtual 3D Pac-world, providing tips via text messaging. Blair MacIntyre, director of Georgia Tech's Augmented Environments Lab, says most AR games rely on precise models of the physical world, which is impractical because the world is highly complex and constantly changing. AR Pacman developer Adrian David Cheok says this problem is circumvented by using a "rough gauge" of the environment (a simplified sketch of that idea follows this item). AR games that use GPS to keep track of players are complicated by GPS receivers' 10-meter to 30-meter accuracy, when the desirable margin of error is in the millimeter range; in addition, high-rise buildings can disrupt GPS signals. The Singapore team addresses these issues through the use of advanced Long Range Kinematic GPS technology and the decision to use a wide open space for the game area. AR technology is expected to dramatically transform the gaming experience by adding mobility and facilitating live interaction and socialization. Challenges in realizing this vision include bringing the cost of AR systems down to the point where consumers can afford the technology.
    Click Here to View Full Article
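
    One plausible reading of the "rough gauge" idea is that the game reasons about player positions at a resolution no finer than the hardware can deliver. The Python sketch below snaps noisy position fixes to a coarse game grid before making gameplay decisions; the cell size and the grid approach itself are assumptions for illustration, not the Mixed Reality Lab's implementation.

      # Toy illustration of coping with coarse positioning: rather than trusting
      # raw GPS fixes (10-30 m error), snap each fix to a coarse game-grid cell so
      # gameplay decisions ("did the player reach this cookie?") are made at a
      # resolution the hardware can support. This is an assumption about the
      # general approach, not the Mixed Reality Lab's code.
      CELL_SIZE_M = 15.0  # assumed grid resolution, comparable to GPS error

      def snap_to_cell(x_m: float, y_m: float) -> tuple[int, int]:
          """Map a raw position (metres east/north of the play-area origin) to a grid cell."""
          return (int(x_m // CELL_SIZE_M), int(y_m // CELL_SIZE_M))

      def reached(player_fix, cookie_pos) -> bool:
          """A cookie counts as collected when player and cookie share a cell."""
          return snap_to_cell(*player_fix) == snap_to_cell(*cookie_pos)

      print(reached((31.0, 44.0), (38.5, 41.2)))  # True: same 15 m cell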

  • "Big Tech Outfits Unite to Try to Hook Phish"
    Investor's Business Daily (06/06/05) P. A4; Howell, Donna

    A consortium of major tech corporations has organized to create an effective email authentication standard by combining Cisco's Identified Internet Mail technique with Yahoo!'s DomainKeys strategy; the hybrid method, DomainKeys Identified Mail (DKIM), promises backward compatibility and a simple upgrade path from DomainKeys. Outgoing email messages would be signed with DKIM, which establishes a message's legitimacy through encryption and other measures, and receiving ISPs would then check whether incoming messages bear a valid DKIM signature. In the long run, the signatures could help ISPs more effectively separate authentic email from spam and phishing messages. Email authentication schemes currently in use besides DomainKeys include Microsoft's Sender ID and SPF, both of which rely on an email message's Internet Protocol (IP) address to distinguish between genuine and bogus messages. Sender ID is faster than the DomainKeys method, but it lacks the latter's encryption; conversely, encrypted approaches can strain systems. The expectation is that various authentication schemes will eventually coexist and complement each other. "IP-based solutions I think will stay around for a long, long time and offer us a really good bridge to encrypted solutions," says Email Service Provider Coalition executive director Trevor Hughes. The Internet Engineering Task Force has yet to ratify an official email authentication standard, but Sendmail CEO Dave Anderson says the chances of such a standard being passed have improved, since industry players are coming into agreement even before they submit a proposal.
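
    The signing-and-verification flow described above can be sketched in a few lines. Real DKIM attaches a public-key (RSA) signature over canonicalized headers and publishes the verification key in DNS; the Python sketch below substitutes a shared-secret HMAC purely to keep the example self-contained, so it shows the shape of the scheme rather than the actual protocol.

      import hmac, hashlib

      # Minimal sketch of signature-based email authentication: the sending
      # domain signs the message, the receiver recomputes and compares. Real
      # DKIM uses RSA signatures over canonicalized headers with the public key
      # in DNS; the shared-secret HMAC here is only to keep the sketch runnable.
      DOMAIN_KEY = b"example.com-signing-key"  # stand-in for the domain's private key

      def sign(headers: str, body: str) -> str:
          digest = hmac.new(DOMAIN_KEY, (headers + "\r\n" + body).encode(), hashlib.sha256)
          return digest.hexdigest()

      def verify(headers: str, body: str, signature: str) -> bool:
          return hmac.compare_digest(sign(headers, body), signature)

      msg_headers = "From: [email protected]\r\nTo: [email protected]"
      msg_body = "Your statement is ready."
      sig = sign(msg_headers, msg_body)          # added by the sending mail server
      print(verify(msg_headers, msg_body, sig))  # receiving ISP checks the signature -> True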

  • "Two Indian Americans Bag Microsoft Awards"
    rediff.com (05/26/05); Mozumder, Suman Guha

    Microsoft Research recently announced that Harvard University computer science professor Radhika Nagpal and Georgia Institute of Technology College of Computing professor Subhash Khot--both Indian Americans--are among the first group of recipients of this year's New Faculty Fellowship Awards. The fellowship is a new program that recognizes early-career university professors who exhibit outstanding talent for unique research and thought leadership in their field. Each winner will receive $200,000 in cash to carry out cutting-edge computer science research in collaboration with leading Microsoft Research scientists in their discipline. "This is extremely important for the field of computing because computing has been trying to cope with major reductions in the funding of fundamental research by government and industry," noted Princeton University dean of engineering and applied science Maria Klawe, a past president of ACM. Nagpal, who works in Harvard's division of engineering and applied sciences, focuses primarily on engineering biologically inspired systems capable of self-configuration and self-repair. Khot's chief area of research is answering basic questions about which problems a computer can and cannot rapidly solve. "The intellectual curiosity, creative drive, and thought leadership the [recipients] demonstrate is exactly the sort of initiative we seek to encourage in developing programs like the New Faculty Fellowship Awards," said Microsoft Research's Rick Rashid.
    Click Here to View Full Article

  • "Security Breaches Challenge Academia's 'Open Society'"
    Computerworld (06/07/05); Cline, Jay

    Carlson Companies data privacy manager Jay Cline reports that U.S. universities are advising students, alumni, and employees to keep an eye on their personal accounts in the wake of security breaches that are only now coming to light because of the California Security Breach Notification Act, which has been in effect since January. He does not foresee a rapid decrease in the number of university intrusion advisories, given that the free and open exchange of knowledge is a long-cherished cornerstone of the university system. Another factor that has raised the risk of academic network intrusions is the broad migration of university business processes (new-student application and selection, financial-aid application, tuition billing and payment, grade distribution) to the digital arena, making information that was formerly offline more susceptible to remote exploitation by hackers. Cline also points to less stringent enforcement of IT security standards by universities than by industry, a consequence of student demand for an IT environment that can support the latest gadgets. He notes that recent academic victims of intrusions are taking several measures to fortify security, such as phasing out Social Security numbers as the chief authenticators for campus transactions in favor of university ID numbers and personal identification numbers; educating the university community on good security practices for information and personal devices; and promoting a central email address and phone number the community can use to report suspected intrusions. Cline also says universities should consider making a bigger effort to fix known security flaws, and should at the very least assess the possibility of deploying stored-data encryption.
    Click Here to View Full Article

  • "UCSD Computer Scientists Develop Ubiquitous Video Application for 3D Environments"
    UCSD News (06/07/05)

    Researchers from the University of California at San Diego have developed a new form of video monitoring that recreates 3D environments using still images and live-feed video streams. The technology allows remote observers to explore the virtual environment instead of watching separate feeds on a bank of monitors, for instance. UCSD computer science and engineering professor Bill Griswold and Ph.D. candidate Neil McCurdy presented the technology at the MobiSys 2005 conference in Seattle. The technology is part of the Wireless Internet Information System for Medical Response in Disasters (WIISARD) project and has been tested by a hazmat team during emergency response simulations. Field participants are equipped with helmet-mounted video cameras, tilt sensors with magnetic compasses, and GPS devices to track their location; data gathered from those devices is melded together to create a virtual environment that enables observers to do fly-through inspections of the space. Apart from emergency response and homeland security applications, the researchers say the technology could be used for virtual tourism, where people could see what the streets of a foreign city are like, for example. The most difficult technical challenge was making up for incomplete image coverage, which the researchers accomplished by using dated still images when live feed was not available; that dated material could be sepia colored to indicate its age, which could be important information in security applications (a simplified sketch of that fallback follows this item). The researchers also accounted for natural brain processes that correct for blind spots in vision. Humans in the field also play important roles in calibrating location-tracking systems, which are used inside buildings or other places where GPS is unavailable.
    Click Here to View Full Article
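
    The fallback behavior described above, using dated stills when no live feed covers part of the scene and flagging their age, might look something like the Python sketch below. The function and field names are illustrative assumptions, not the WIISARD code.

      import time

      # Sketch of the fallback idea: prefer live video for a region of the
      # reconstructed environment, fall back to the most recent still image when
      # no live feed covers it, and flag older imagery so viewers know its age.
      STALE_AFTER_S = 60  # assumed threshold for marking imagery as dated

      def pick_texture(region, live_feeds, still_archive, now=None):
          now = now or time.time()
          live = live_feeds.get(region)
          if live is not None:
              return {"image": live, "dated": False}
          still = still_archive.get(region)
          if still is None:
              return None  # no coverage at all for this part of the scene
          age = now - still["timestamp"]
          # A dated still could be tinted (e.g. sepia) in proportion to its age.
          return {"image": still["image"], "dated": age > STALE_AFTER_S, "age_s": age}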

  • "Blue Brain: Illuminating the Mind"
    Business Week (06/06/05); Port, Otis

    The latest installation of IBM's Blue Gene/L supercomputer at the Ecole Polytechnique Federale de Lausanne's (EPFL) Brain Mind Institute will be used to model the neocortex of the human brain, an effort that institute founder Henry Markram expects to take several years. The EPFL machine, known as the Blue Brain computer, boasts a peak speed of about 22.8 teraflops; its successor, the Blue Gene/P, could ultimately be capable of petaflops speeds. The approximately 8,000-processor supercomputer will be structured to mimic the function of 10,000 interconnected neurons as the basis for studying neocortical columns, which Markram calls the seat of adaptability and intelligence. The Blue Brain Project will focus on unique research into human cognition and memory, and Markram says the precise modeling of brain processes facilitated by the supercomputer will help scientists explore the origins of psychiatric disorders such as schizophrenia, autism, and depression. The neuroscientist says Blue Brain will be able to simulate the cellular degeneration that causes Parkinson's, making the machine an important tool for pharmaceutical researchers looking for cures or treatments. Markram thinks Blue Brain might corroborate the "liquid-computing" theory of memory preservation, and this could lead to silicon circuits capable of new and more sophisticated functions that IBM could use to build a super-smart computer. Blue Brain's stunning processing speed should help provide "a tremendous opportunity to do some science that up to this point just hasn't been possible," according to IBM researcher Charles Peck. The machine is scheduled to come online on July 1.
    Click Here to View Full Article
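
    To give a rough sense of what simulating interconnected neurons involves, the Python sketch below runs a toy network of leaky integrate-and-fire units. This is only an illustration of the general idea; the Blue Brain Project models neurons in far greater biophysical detail, and nothing here reflects its actual methods or parameters.

      import random

      # Toy leaky integrate-and-fire network: each unit accumulates input,
      # leaks charge over time, and "spikes" when it crosses a threshold,
      # passing a small weight to the units it connects to. Parameters are
      # arbitrary illustrative assumptions.
      N, STEPS = 50, 200
      THRESHOLD, LEAK, WEIGHT = 1.0, 0.95, 0.12

      random.seed(1)
      potential = [0.0] * N
      synapses = [[j for j in range(N) if j != i and random.random() < 0.1] for i in range(N)]

      for t in range(STEPS):
          drive = [random.uniform(0.0, 0.08) for _ in range(N)]   # background input
          fired = []
          for i in range(N):
              potential[i] = potential[i] * LEAK + drive[i]
              if potential[i] >= THRESHOLD:
                  fired.append(i)
                  potential[i] = 0.0                               # reset after a spike
          for i in fired:                                          # propagate spikes
              for j in synapses[i]:
                  potential[j] += WEIGHT
          if fired:
              print(f"t={t:3d}  spikes: {fired}")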

  • "DVD Standard Battle Rages On"
    TechNewsWorld (06/07/05); Korzeniowski, Paul

    NEC and Sony continue to wage war over which of their next-generation DVD technologies--NEC's high-definition DVD (HD-DVD) or Sony's Blu-ray--will become the industry standard, although both products have advantages and shortcomings. HD-DVD and Blu-ray both rely on blue-laser technology, which increases their data storage capacity significantly, but that capacity is not the same for both formats. Single-layer ROM discs in HD-DVD format can hold 15GB, dual-layer discs can store 30GB, and single-layer rewritable discs can manage 20GB; meanwhile, single-layer and dual-layer Blu-ray discs can hold 25GB and 50GB, respectively. NEC's format uses Advanced Optical Disc technology to sustain backward compatibility with current DVD discs, allowing Hollywood studios and replicators to upgrade from the older format without substantial production line overhauls. Blu-ray boosts disc capacity via a newly designed optical-disc format and optical pickup facility, and employs an anti-copying safeguard in which content providers physically place a ROM mark onto a prerecorded disc during the mastering process. Although the Sony format may have the advantage in terms of industry support--100 firms have pledged to adopt Blu-ray, compared to 60 for HD-DVD--In-Stat analyst Gerry Kaufold says HD-DVD's backward compatibility may allow HD-DVD products to come to market sooner. Whether either format will be broadly adopted by consumers is a difficult question to answer, since first-generation HD recorders and players will be expensive. Analyst Jon Peddie notes that the MPEG-4 format permits the storage of HD content on DVDs without the cost or disruption headaches of either the NEC or Sony formats.
    Click Here to View Full Article
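
    The capacity figures above translate into playing time only at an assumed video bitrate. The Python sketch below works through that arithmetic at 20 Mbit/s, a bitrate chosen purely for illustration rather than taken from the article.

      # Rough capacity comparison for the disc formats above, expressed as hours
      # of high-definition video at an assumed average bitrate of 20 Mbit/s.
      CAPACITIES_GB = {
          "HD-DVD single-layer ROM": 15,
          "HD-DVD dual-layer ROM":   30,
          "HD-DVD single-layer RW":  20,
          "Blu-ray single-layer":    25,
          "Blu-ray dual-layer":      50,
      }

      BITRATE_MBPS = 20  # assumed average HD video bitrate

      for name, gb in CAPACITIES_GB.items():
          hours = gb * 8_000 / BITRATE_MBPS / 3600   # GB -> megabits -> seconds -> hours
          print(f"{name:26s} {gb:3d} GB  ~{hours:4.1f} h")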

  • "Q&A: Ex-eBay Security Chief Sees a Safer Internet in the Future"
    InformationWeek (06/03/05); Claburn, Thomas

    Howard Schmidt, the former eBay security chief who once chaired the President's Critical Infrastructure Protection Board, is generally optimistic that the Internet's security will improve. He attributes this to a shrinking gap between the identification of security issues and industry's response to them, as well as to increased recognition of the security measures that the private sector and the user community must take. He thinks end users' awareness of the need to practice safe computing and fortify themselves against cyberattack is on the rise, partly because of industry's increasingly aggressive promotion of security, and partly because of the town-hall meetings the White House held throughout the U.S. while putting together the National Strategy to Secure Cyberspace. Schmidt says vendors are doing enough to make secure PC use simple for end users, while the federal government has done a good job of making the private sector aware of the issue's ramifications for public safety, the economy, and national security through bodies such as the National Infrastructure Assurance Council, the National Security Telecommunications Advisory Council, the Information Security and Privacy Board for the Commerce Department, and the Office of Management and Budget. He says the government must continue its efforts to untangle and smooth out its own internal processes so that it can fulfill its promise to be a model for cybersecurity. Schmidt also believes encryption software is a valid and necessary tool for computer security, although he acknowledges that the use of encryption can be construed as a criminal act if it is used by wrongdoers to conceal evidence of their guilt. Looking ahead to the next few years, Schmidt expects online identity management to improve, resulting in better privacy protection.
    Click Here to View Full Article

  • "Play It Again, Vladimir"
    New York Times (06/05/05) P. AR1; Midgette, Anne

    Zenph Studios President Dr. John Q. Walker is developing technology in which a computer deconstructs and digitizes a musical recording and then plays it back on a Yamaha Disklavier Pro. Walker says the technique can be used to produce crisp renditions of recordings by long-deceased artists that accurately recreate the musicians' unique performances. He says the first challenge is translating an audio recording into a model the computer can understand, while further ahead lies the challenge of determining whether the player piano can precisely replicate human performance (a simplified sketch of such a note-level model follows this item). The problem with the mechanical reproduction of music is that the assessment of such music is highly subjective. The Austrian Research Institute for Artificial Intelligence's Machine Learning, Data Mining, and Intelligent Music Processing Group, led by Gerhard Widmer, is attempting to ascertain how the human ear can be fooled so that a mechanical performance is mistaken for a live one. The group authored a paper in 2003 detailing how a computer was fed 13 recordings of Mozart piano sonatas that Roland Batik had performed on a computer-monitored Bosendorfer piano, to see if rules describing the musician's interpretive choices could be extracted. The machine applied its knowledge of Batik's style to its own performance of a sonata it had not heard Batik play, and this performance earned second prize in the International Computer Piano Performance Rendering Contest in 2002. Widmer team member Dr. Werner Goebl is supportive of Walker's work, but is skeptical that such computerized replication constitutes a "real" performance. Walker hopes his technology will help make recordings more interactive.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
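
    Whatever Zenph's internal format looks like, a system of this kind has to reduce audio to discrete note events that a reproducing piano can replay. The Python sketch below shows one plausible representation, with pitch, timing, and key velocity; the fields and values are illustrative assumptions, not Zenph's data model.

      from dataclasses import dataclass

      # Sketch of the kind of intermediate representation such a system needs:
      # audio reduced to discrete note events a reproducing piano can replay.
      # Fields and values are illustrative assumptions.
      @dataclass
      class NoteEvent:
          pitch: int        # MIDI note number (60 = middle C)
          onset_s: float    # when the note starts, in seconds
          duration_s: float # how long the key is held
          velocity: int     # how hard the key is struck, 1-127

      # A tiny extracted phrase: middle C, E, G played as a gentle arpeggio.
      performance = [
          NoteEvent(60, 0.00, 0.45, 52),
          NoteEvent(64, 0.42, 0.45, 58),
          NoteEvent(67, 0.84, 0.90, 63),
      ]

      for note in performance:
          print(f"t={note.onset_s:4.2f}s  pitch={note.pitch}  vel={note.velocity}")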

  • "A Battle for the Soul of the Internet"
    ZDNet (06/06/05); Noss, Elliot

    The United Nations and the International Telecommunication Union (ITU) are both seeking to take control of the Internet from ICANN, a battle that is taking place with little fanfare but affects the very foundation of the World Wide Web, writes Tucows CEO Elliot Noss. Noss says both the UN, through its World Summit on the Information Society (WSIS), and the ITU, through its Working Group on Internet Governance (WGIG), are seeking to gain control over the management of the DNS, domain names, and IP addresses. Critics of ICANN complain that it places too much control over the Internet in the hands of the U.S. However, ICANN is becoming an increasingly global organization, with two characteristics that set it apart from the United Nations and the ITU as the best candidate for Internet governance. First, ICANN's role encompasses policy, technical, business, and user interests, with formal roles in policy and governance allotted to each group of stakeholders. In addition, the global nature of ICANN means it does not divide stakeholders according to national governments and instead works by "rough consensus." This structure avoids the kinds of political problems, such as censorship and taxation, that could occur under a government body like the United Nations. In fact, Noss argues that ICANN's limited political acumen makes it vulnerable to power plays on the part of the ITU and the UN, which are both experienced political organizations. Noss calls on all businesses that benefit from a free and open Internet to become more active in the ICANN process, and for individuals to participate as well.
    Click Here to View Full Article

    For information regarding ACM's Internet governance work related to ICANN, visit http://www.acm.org/serving/IG.html.

  • "The Biotech Men"
    Sydney Morning Herald (Australia) (06/07/05); Cauchi, Stephen; O'Neill, Rob

    Smarter and more distributed computing is playing an increasingly vital role in Australia's biotechnology industry. The Bionic Ear Institute in East Melbourne uses distributed computing and other state-of-the-art measures to aid research into more effective technologies for the hearing impaired. The institute is connected to the Eye and Ear Hospital and Melbourne University's Department of Otolaryngology via a dedicated optical-fiber network, and IT manager Martin Wojak says data analysis and computation are carried out with downloadable software. The network supports 350 PCs undergirded by a Linux server and a half-dozen Windows 2003 servers. CSIRO software engineer Gavin Kennedy says the volume of data produced by genomic research is growing faster than Moore's Law, and closing the gap requires not only more processing power but also less computationally intensive problem-solving strategies. CSIRO uses the Internet to connect its 12 biotech divisions to its bioinformatics center through a Canberra cluster of 66 blade servers that communicate with each other at 1Gbps, while the data is stored on a 2TB file server. Of all the software packages CSIRO uses, the most important is BLAST, which facilitates the interpretation of DNA and protein sequence data by breaking searches down and distributing them over several computers (a simplified sketch of that pattern follows this item); the genes are compared against the U.S.-hosted GenBank database. University of Melbourne lecturer Dr. Rajkumar Buyya has devised a virtual lab that permits researchers around the world to share computing resources for large-scale molecular simulation, using the SETI@Home grid as a template. This dramatically reduces the time it takes to screen a single molecule from a chemical database against a protein target, which is critical for drug companies.
    Click Here to View Full Article
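
    The distribution pattern described above, splitting a large batch of sequence queries into chunks and merging the hits, is sketched below in Python. The "search" here is a trivial prefix match standing in for real sequence alignment, and the structure is an assumption about the general approach rather than CSIRO's actual pipeline.

      from concurrent.futures import ProcessPoolExecutor

      # Sketch of the distribution pattern: split query sequences into chunks,
      # search them in parallel, merge the hits. The scoring is a toy stand-in
      # for a real BLAST search; names are illustrative, not CSIRO's pipeline.
      DATABASE = ["ATGGCGT", "TTGACGA", "ATGGTTT", "CCCGGGA"]  # toy reference sequences

      def search_chunk(queries):
          """Return (query, subject) pairs that share a 4-letter prefix -- a toy
          stand-in for real sequence alignment."""
          hits = []
          for q in queries:
              for s in DATABASE:
                  if q[:4] == s[:4]:
                      hits.append((q, s))
          return hits

      def chunked(items, n):
          return [items[i:i + n] for i in range(0, len(items), n)]

      if __name__ == "__main__":
          queries = ["ATGGAAA", "CCCGTTT", "TTGAAAA", "GGGGGGG"]
          with ProcessPoolExecutor() as pool:
              results = pool.map(search_chunk, chunked(queries, 2))
          merged = [hit for chunk_hits in results for hit in chunk_hits]
          print(merged)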

  • "NIST Focuses on Measurements for New IT Technologies"
    Federal Computer Week (05/30/05) Vol. 19, No. 17, P. 54; Olsen, Florence

    The National Institute of Standards and Technology (NIST) plans to identify and define information science measurements that are critical to U.S. competitiveness, according to officials. Equipped with accurate measurements, engineers will be able to predict the behavior of systems before they are built, says NIST manufacturing systems integration chief Steven Ray. The NIST effort has important implications for software development, where developers currently have few ways to determine behavior before testing. The Systems and Software Productivity Consortium's Gregory Friedmann said the NIST effort would contribute to stronger software standards. The new NIST program, dubbed "Roadmapping America's Measurement Needs for a Strong Innovation Infrastructure," is seen as a necessary countermeasure to successful national measurement efforts in countries such as Japan. Innovations are not caused by measurements, but measurements do provide an underlying framework for faster innovation, says NIST standards services chief Mary Saunders. As part of its effort, NIST will conduct a number of workshops this year and host a summit next January to gather input from standards developers, accrediting organizations, national laboratories, science agencies, and trade organizations. The workshops and conference will help identify what measurements are needed, how existing measurements can be improved, and where international incompatibilities among measurements exist. Topics for the workshops include measurements for magnetic data storage, broadband telecommunications, homeland security technologies, and health care technology.
    Click Here to View Full Article

  • "Low Cost Wireless Technologies Enable New Sensor Networks"
    High Frequency (05/05) Vol. 4, No. 5, P. 32

    Wireless technologies enable new sensor network deployments that will dramatically increase scientific knowledge and allow, for example, meteorologists to more accurately predict the weather and doctors to more effectively treat physical ailments. Sensors are used to gather a wide variety of information, such as temperature, humidity, motion, chemical vapors, biometrics, and mechanical stress; low-cost and low-power wireless technologies promise to increase the utility and efficiency of sensor networks by enabling more sensors to be deployed and managed, and by increasing their life expectancy. Wireless technology research for sensors focuses on event-triggered communications, in which sensor devices operate on a limited basis to conserve resources (a simplified sketch of that idea follows this item), and on ad-hoc networking, which allows sensor nodes to relay data to base stations. A number of innovative wireless sensor programs are under way that could have significant scientific and practical impact. The Global Environmental Micro Sensors (GEMS) project sponsored by ENSCO aims to gather atmospheric data from lightweight sensors that float at different levels of the atmosphere; data gathered this way would be more accurate than that gathered on the ground, by balloon, or by satellite. The researchers are investigating an ad hoc networking system to link the sensors. The CORIE project on the Columbia River is currently being used for a number of environmental science projects, such as the characterization of water circulation in the river, estuary, and near-ocean areas. Researchers at the 2005 IEEE Wireless and Microwave Conference also put forward an embedded wireless sensor scheme for monitoring body functions. Special care has to be taken not only to limit RF radiation and comply with FCC rules for medical implants, but also to compensate for the sensitive sensor environment.
    Click Here to View Full Article
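
    Event-triggered communication can be illustrated with a few lines of Python: a node samples continuously but transmits only when the reading drifts past a threshold, so the radio stays off most of the time. The threshold and readings below are illustrative assumptions.

      # Sketch of event-triggered reporting: sample continuously, transmit only
      # when the reading changes enough to matter, conserving radio power.
      REPORT_THRESHOLD_C = 0.5   # transmit only on a change of at least 0.5 deg C

      def filter_events(samples, threshold=REPORT_THRESHOLD_C):
          """Yield (index, value) for samples that differ enough from the last report."""
          last_reported = None
          for i, value in enumerate(samples):
              if last_reported is None or abs(value - last_reported) >= threshold:
                  last_reported = value
                  yield i, value   # in a real node this is where the radio wakes up

      readings = [21.0, 21.1, 21.2, 21.9, 22.0, 22.0, 21.2]
      print(list(filter_events(readings)))
      # [(0, 21.0), (3, 21.9), (6, 21.2)] -- only 3 of 7 samples are transmitted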

  • "Down to the Wire"
    Foreign Affairs (06/05); Bleha, Thomas

    Japan and other Asian countries have overtaken the U.S. in terms of high-speed broadband and mobile-phone technology deployment, dislodging U.S. leadership in Internet innovation and positioning Asia as the primary beneficiary of higher productivity, economic growth, quality-of-life improvements, and other advantages. Broadband implementation and proliferation is a low priority in the Bush administration, which has chiefly emphasized tax cuts, defense, and the war on terror; the governments of Japan and its Asian neighbors, on the other hand, have prioritized high-speed broadband penetration. The Japanese government established a competitive broadband infrastructure by lowering many regulatory barriers and convincing regional telephone exchange firms to make their residential phone lines available to outside rivals in return for a moderate toll. The U.S. government's effort to facilitate wireless, mobile-phone-based Internet access is paltry in comparison to Japan's, with America failing to transition to a services-based rather than price/coverage-based competitive landscape. If the U.S. is to catch up to Japan in these areas, the Bush administration must clearly identify the benefits of broadband to generate interest from providers and potential users, as well as pressure the President's Information Technology Advisory Committee (PITAC) to become a leading proponent of broadband deployment. Strategies PITAC should follow include establishing long-term U.S. broadband implementation goals; devising a way to fulfill Bush's pledge to make basic broadband affordable and available to all Americans by 2007; making affordable high-speed broadband access available to two-thirds of U.S. households by 2010; promoting affordable ultra-high-speed fiber access for one-third of U.S. households by 2010; and offering ways to set up a comprehensive, national, third-generation cellular infrastructure.
    Click Here to View Full Article


 