HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 500: Wednesday, May 28, 2003

  • "American Spam Is Flooding Europe"
    Los Angeles Times (05/27/03) P. C3; Moore, Matt

    Experts claim that most of the unsolicited commercial email swamping European ISPs originates in the United States, and partially blame disparate enforcement policies and lenient penalties for spam's proliferation. Anti-spam enforcement is the responsibility of individual nations, and the European Union does not threaten spammers with jail time, only fines. Furthermore, many EU member countries cannot afford to take legal action against overseas spammers, while countries such as Italy have justice systems that are notoriously slow to convict mass emailers. Other countries, the Netherlands being one example, have not devised anti-spam strategies because of a lack of complaints, although experts argue that the amount of spam is increasing. Over 95 percent of spam hails from the United States, according to Jens Storm Monroe of Microsoft's Denmark branch, who observes that the United States buys and sells email addresses much more extensively than Europe. Spammers may also be more inclined to base their operations in America because of the country's size, compared to European countries whose relative smallness makes anonymity difficult. Another reason the United States is so attractive to spammers is that Americans appear to place less value on their personal privacy than Europeans. Harald Summa of Germany's Electronic Commerce Forum says that "aggressive" American marketing is chiefly responsible for the flood of junk email into Europe.

  • "College Plans Virus-Writing Course"
    CNet (05/27/03); Fried, Ian

    Canada's University of Calgary is courting controversy by offering fourth-year students a course in which they will write and test computer viruses, starting in Fall 2003. Calgary's Ken Barker says such a class is a valuable tool for gaining insight about what drives creators of malware, but David Perry of antivirus software provider Trend Micro doubts that such programs could improve the line of defense against viruses. Perry adds that there is little point in examining virus writing, because most malware relies on very simple programming. "If you are a very good programmer, somebody hires you to write programs," he says. Barker counters that this financial incentive will help class participants concentrate on developing virus safeguards, rather than simply launching the viruses they create. Fred Cohen of the University of New Haven also sees value in allowing students to interact and write malware, as long as there are fail-safes to prevent viruses from being released into the wild. In fact, he has made it his mission to ensure that the University of Calgary and other schools have adequate anti-proliferation measures in place. However, Cohen does not agree that such courses will help students better understand virus writers' motivations, as Barker believes. Calgary officials claim safeguards have already been set up: students will not be allowed to remove disks from the virus-infected labs, which will have round-the-clock security, and the school will use a closed network.

  • "Legal Threat Rocks Linux"
    EE Times (05/27/03); Murray, Charles J.

    SCO Group sent a letter to Fortune 1000 companies and 500 other global enterprises on May 12, warning them that they could be subject to legal action for using the Linux open-source operating system, which allegedly incorporates copyrighted software developed for Unix. An SCO spokesman said that three independent teams of programmers and mathematicians were commissioned to scan Linux for signs of copyright infringement, and each group apparently found evidence. Linux advocates such as Linux International's Jon Hall argued that the real point of SCO's maneuver is "to create fear, uncertainty and doubt about whether or not Linux, or any free software, should be used at all," and expressed suspicion over the organization's refusal to disclose exactly what intellectual property is being infringed. SCO declared it would reveal the infringed content to industry analysts over the next several weeks, on the condition that they sign nondisclosure agreements. SCO recommended in the letter that enterprises seek the advice of legal counsel before using Linux. The threat of legal action against Linux users is causing ripples in the industry, and if SCO's copyright infringement claims were to be upheld in court, it could destroy the open software business model; executives indicated that some critics have been so incensed by the announcement that they are advocating violence. Nine days before the letter was sent, SCO computers were hit by a denial-of-service attack allegedly launched by Linux proponents.

  • "From PlayStation to Supercomputer for $50,000"
    New York Times (05/26/03) P. C3; Markoff, John

    A supercomputer that may be able to perform a half-trillion operations per second has been built out of 70 Sony PlayStation 2 consoles for around $50,000 by researchers at the University of Illinois at Urbana-Champaign's National Center for Supercomputing Applications. The consoles were arranged in a rack and interconnected with a high-speed Hewlett-Packard network switch; the supercomputer runs the open-source Linux operating system. Another 30 PlayStations are being held in reserve, and may be used to drive a high-resolution display. The supercomputer leverages the Emotion Engine, a silicon co-processor that comes standard on the game console and can perform as many as 6.5 billion mathematical operations every second in order to generate realistic graphics. The machine's creators built the supercomputer to test the viability of inexpensive consumer hardware as the basis for high-performance computing. The console's 32 MB memory limit and bandwidth restrictions could limit the usefulness of the system, say several supercomputer experts. However, the University of Illinois researchers point out that the PlayStation supercomputer is already performing quantum chromodynamics (QCD) calculations, which could help reduce QCD simulation costs. Microsoft's Gordon Bell thinks the machine may be especially useful as a computer for the Department of Defense's large digital display walls.
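
    The "half-trillion operations per second" figure is simply the consoles' aggregate peak; a quick back-of-the-envelope check (in Python) multiplies the per-console Emotion Engine peak by the number of units in the rack:

```python
# Aggregate peak throughput of the PlayStation 2 cluster described above.
CONSOLES = 70                  # units wired into the rack
PEAK_OPS_PER_CONSOLE = 6.5e9   # Emotion Engine peak, operations per second

aggregate = CONSOLES * PEAK_OPS_PER_CONSOLE
print(f"Aggregate peak: {aggregate:.2e} ops/s")  # ~4.55e11, roughly half a trillion
```

    Peak figures like this assume every co-processor is fully busy; the memory and bandwidth limits the experts cite mean real workloads fall well short of it.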

  • "U.S. Launches Drive to Regain Top Spot in Supercomputing"
    Financial Times (05/27/03) P. 14; London, Simon

    Spurred by Japan's Earth Simulator stealing the fastest supercomputer crown from the United States last year, President Bush has established the High-End Computing Revitalization Taskforce to put the country back on top. A lagging American supercomputing effort could seriously impair the functions of the country's defense and intelligence agencies, and cause the United States to lose its lead in the climate modeling and geophysics arenas, experts warn. The National Security Agency issued an April report telling Congress that "The mix of research, development and engineering programs lack balance and coordination and is far below the critical mass required to sustain a robust technology/industrial base in high-end supercomputing." Starting in June, the taskforce will host workshops where academic, federal, and industry experts will brainstorm a five-year supercomputing research initiative, to be presented to the Bush administration by the end of August. Taskforce chairman David Nelson said last month that supercomputers assembled from commoditized components take a long time to construct and are hard to program and operate, limiting their usability for operations such as cryptanalysis. Meanwhile, Dan Reed of the University of Illinois' National Center for Supercomputing Applications contended that Japan's government/industry partnership to develop the Earth Simulator may be a model that the United States should adopt. IBM expects the upcoming ASCI Purple the company is building for the U.S. Department of Energy to outperform the Earth Simulator. Far more important than the United States regaining its supercomputing lead is "how we leverage our leadership to serve the needs of federal government and also deliver cost-effective supercomputing that can be applied to a broad commercial base," stated Tilak Agerwala of IBM.

  • "Big Changes for Search Engines"
    Wired News (05/27/03); Delio, Michelle

    Near-future search engines will be intelligent, quick, and tailored to the requirements of individual users, if papers presented at the 12th International World Wide Web Conference in Budapest are any indication. This transformation will be spurred by new methods and interfaces that computer scientists are developing. Ben Shneiderman of the University of Maryland in College Park illustrated the value of graphical displays of search result data such as TimeSearcher, which enables users to classify search terminology on a graph. A key step toward personalized online searching is unique search tools that operate on one computer or across a private network. Search engines will one day be capable of reviewing cookies and referring to past searches to conclude that a user is looking for specific types of data. Google must increase its search speed by a factor of 10 before search customization becomes feasible, and Stanford researchers presented a paper at the conference indicating that Google could boost its speed through methods they have devised. Stanford scientist Sepandar Kamvar contended that "Google will not run five times faster if our research is implemented, but we do expect a 30 percent speed-up." NEC researchers delivered a paper detailing Review Seer, a tool that automatically ranks product reviews based on user feedback gleaned from postings on newsgroups and Web sites.
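
    The Stanford speedup Kamvar describes concerned the PageRank calculation, the link-analysis step behind Google's rankings. As an illustration of what that calculation involves (a minimal power-iteration sketch on an invented three-page graph, not the accelerated method from the paper):

```python
def pagerank(links, damping=0.85, tol=1e-9, max_iter=200):
    """Minimal power-iteration PageRank. links[i] lists the pages page i links to."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(max_iter):
        nxt = [(1 - damping) / n] * n
        for i, outs in enumerate(links):
            if outs:                        # spread page i's rank over its out-links
                share = damping * rank[i] / len(outs)
                for j in outs:
                    nxt[j] += share
            else:                           # dangling page: spread its rank everywhere
                share = damping * rank[i] / n
                for j in range(n):
                    nxt[j] += share
        if sum(abs(a - b) for a, b in zip(nxt, rank)) < tol:
            return nxt
        rank = nxt
    return rank

# Tiny illustrative graph: page 2 is linked to by both other pages.
ranks = pagerank([[2], [0, 2], [1]])
print(ranks)   # the ranks sum to 1; page 2 scores highest
```

    Each iteration is a pass over every link, which is why faster-converging variants of this loop translate directly into a speedup on a Web-scale graph.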

  • "Antispam Law Likely"
    IDG News Service (05/27/03); Gross, Grant; Roberts, Paul

    A slew of antispam bills are currently before Congress, and some form of antispam legislation is expected to pass this year, but eight antispam and consumer groups have sent a letter to four congressional committees arguing that the bills under consideration would lead to an increase in spam rather than a reduction. The letter, which was signed by the leaders of the Coalition Against Unsolicited Commercial Email (CAUCE), Junkbusters, the National Consumers League, and others, claims that the proposals "repeat many of the legislative mistakes that have exacerbated the unsolicited commercial email problem, permitting it to grow to the epidemic proportions it has reached today." A bill sponsored by Rep. Richard Burr (R-N.C.) would outlaw deceptive spam headers and the collection of email addresses from Web sites, and require that marketers offer an opt-out policy for recipients as well as include a legitimate street address. However, CAUCE's Ray Everett-Church says that Burr's bill--and any other antispam bill that only recommends opt-out policies--legalizes spamming to a certain degree, while Wellborn & Butler's Pete Wellborn adds that ISPs would be divested of their right to block spam. Furthermore, the majority of antispam proposals do not allow individuals to pursue legal action against spammers. The Network Advertising Initiative's J. Trevor Hughes asserts that private lawsuits, in addition to being beyond the resources of individuals, would be stymied by spammers' propensity to falsify their identities. He also says spammers' deceptive practices will nullify opt-in policies. Mailshell VP Eytan Urbas notes that the situation is complicated by the lack of a clear, universal definition of spam.

  • "Group Moves to Boost Women in IT"
    Globe and Mail (05/26/03); Kerr, Ann

    The Canadian Information Processing Society (CIPS) is attempting to interest high-school girls in IT and reform the field's stereotypical geeky image through a new "ambassador" program that strives to provide young women with role models they can look up to and draw inspiration from. Such role models can refute the view many girls subscribe to--that of IT being a solitary profession with harsh working conditions, as portrayed in the media and in IT marketing--by demonstrating that interaction, flexible scheduling, and travel can also come with the job, according to CIPS director Karen Lopez. Many young women do not think they can qualify for a technology career without a heavy emphasis in math and science, but Jeanne Douglas of Vancouver-based Telus argues that other skills, such as communication, can be leveraged into technology-related professions, business analysis and quality assurance being two examples. Ryerson University's Denise Shortt argues that changing women's perception of IT extends beyond exposing them to positive role models, and calls for more proactive strategies to lure women into the technology sector. Ryerson's solution is to integrate technology with business skills into a hybrid IT management course in the hopes that it will reverse a fall-off in female enrollment, while a pilot program launched last summer does not feature a heavy math skills requirement. Lynda Leonard of the Information Technology Association of Canada believes that more primary and secondary schools should implement separate, female-oriented computer classes with the goal of making students feel more comfortable with the technology. The nonprofit ACTUA offers community programs for young people, including an all-girls program that aims to boost female participation in science and technology clubs.

    For information about ACM's Committee on Women in Computing visit http://www.acm.org/women.

  • "Fretting Over U.S. Data Collection"
    National Law Journal (05/26/03); Coyle, Marcia

    The Pentagon issued a report last week detailing its Terrorist Information Awareness (TIA) program, in which numerous databases about citizens' personal transactions would be mined to find indications of terrorist activity. The point of the report was to establish whether the initiative strikes a balance between national security and privacy laws, but both critics and supporters of TIA agree that the report should spur debate among legislators and the public over how current privacy laws should be revamped. Center for Democracy and Technology attorney Lara Flint says the report is a "first step in what could be a long process of discussing the implications of a program like this, what laws apply--which we think are few--and what the new rules need to be." TIA and other programs highlight three fundamental flaws with privacy statutes, according to Peter Swire of Ohio State University. First of all, such programs authorize searches through private databases that are not protected by the Privacy Act of 1974, which covers "systems of records;" second, the Fourth Amendment, which requires federal authorities to secure a warrant before conducting searches, does not apply to private organizations' databases of individuals' personal information, which Supreme Court precedent allows the government to access without a warrant; and third, the government can leverage exceptions to existing privacy laws to get hold of data. "The whole point of TIA is to send searches through the largest possible array of private-sector databases, where...there is no Privacy Act protection and no Fourth Amendment protection," Swire declares. Paul Rosenzweig of the Heritage Foundation thinks Congress and the Pentagon can limit the application of TIA, while Flint notes that the Pentagon's TIA report does not settle a basic issue--whether the program will actually work.

    To read more about TIA and ACM's response to this program, visit http://www.acm.org/usacm/Issues/TIA.htm.

  • "Testing With the Mess of Reality"
    Raleigh News & Observer (05/28/03); Dyrness, Christina

    Duke University assistant professor of computer science Amin M. Vahdat is nearly ready to release the source code for an Internet software testing framework called ModelNet. The system allows users to simulate realistic Internet conditions and see if their software holds up. Vahdat will gain tenure this July at only 31, and says the system has been in development for about two years with the help of fellow faculty member Jeff Chase and several graduate students. He says, "What we wanted to do is develop an infrastructure where people can take...programs that they've done some local testing on...and subject it to Internet-like conditions." ModelNet currently can simulate a network of up to 30,000 routers and 10,000 to 20,000 end hosts. ModelNet allows users to set the parameters of the test so that they can simulate 10 percent of Internet links taken offline or a sudden traffic spike. CNN.com, for instance, saw traffic double every seven minutes following the Sept. 11 attacks. Vahdat says ModelNet could help firms prepare for events like that. Other specific applications are testing peer-to-peer software and analyzing the spread of malware--a general term for viruses, worms, and other malicious software. In the end, Vahdat's goal is to see ModelNet widely relied on as an Internet software testing tool so that developers do not need to repeatedly figure out how to test their ideas.
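
    ModelNet's source code was not yet released at the time of writing, so the sketch below is only a toy illustration of the kind of experiment the article describes: build a synthetic router graph, knock a random 10 percent of links offline, and measure how much of the network stays reachable. The topology and numbers are invented for illustration:

```python
import random

def reachable(adj, start):
    """Return the set of nodes reachable from start by graph traversal."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for nbr in adj.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return seen

def degrade(edges, fraction, rng):
    """Drop a random fraction of links, as a failure-injection test might."""
    keep = [e for e in edges if rng.random() > fraction]
    adj = {}
    for a, b in keep:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

rng = random.Random(42)
# Toy ring-with-chords topology standing in for a simulated router graph.
n = 100
edges = [(i, (i + 1) % n) for i in range(n)] + [(i, (i + 13) % n) for i in range(n)]
adj = degrade(edges, 0.10, rng)   # take 10% of links offline
print(len(reachable(adj, 0)))     # hosts still reachable from node 0
```

    ModelNet does far more than this (it emulates latency, bandwidth, and loss for real running programs), but the parameter-sweep idea is the same: rerun the workload across many degraded topologies and watch where the software breaks.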

  • "Companies Pare Down UWB Proposals"
    ISP-Planet (05/20/03); Lipset, Vikki

    Twenty-three proposals for a new ultrawideband (UWB) standard were submitted at a March meeting of the IEEE, the majority of which favored a multiband-based model in which UWB's 7.5 GHz spectrum apportionment would be split into smaller bands of 500 MHz to 700 MHz. The UWB Multi-Band Coalition is now attempting to combine the proposals into one, and recently convened in Dallas to debate the assimilation and boost the group's membership. Thus far, the coalition has whittled down the number of proposals to 13. The final merged proposal will be submitted by UWB startup Time Domain in July, when the 802.15.3a Task Group will meet to vote on it and any other remaining proposals. Texas Instruments has devised a multiband proposal that would use orthogonal frequency division multiplexing (OFDM) to split the UWB spectrum into fourteen 528-MHz bands that can each be segmented into 4-MHz "tones," which TI's Anuj Batra claims will facilitate global scalability. Motorola and Xtreme Spectrum offered a joint proposal for a UWB standard based on dual-band, direct sequence code division multiple access (CDMA), because the collaborating companies doubt that a multiband approach is sufficient to handle "killer app" consumer electronics functions such as high-definition video transfer. "There is a fundamental physics benefit that comes from wideband radio transmission vs. narrowband," argued Xtreme's Chris Fisher, who added that the standard must enable a home network to support numerous users while simultaneously offering them high data rates. Jason Ellis of General Atomics does not think the standard will be complete by year's end, but is confident that the path it is heading down will become clearer.
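
    The band arithmetic in TI's proposal is easy to verify: fourteen 528-MHz bands nearly fill the 7.5 GHz allocation. A quick check:

```python
# Sanity check on the multiband numbers quoted above.
allocation_ghz = 7.5   # FCC's UWB spectrum allocation
band_mhz = 528         # per-band width in TI's OFDM proposal
bands = 14

used_ghz = bands * band_mhz / 1000
print(f"{bands} x {band_mhz} MHz = {used_ghz:.3f} GHz "
      f"of the {allocation_ghz} GHz allocation")
```

    The roughly 108 MHz left over is guard room at the edges of the allocation, which is one reason 528 MHz (rather than a round 500 MHz) was chosen as the band width.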

  • "Guess Who's Smarter."
    Boston Globe (05/26/03) P. D1; Denison, D.C.

    Artificial intelligence is producing tangible benefits in specific applications, but the technology is still far away from computers that understand the world as well as a human three-year-old, according to MIT graduate student Push Singh. He is working on the Open Mind database and has collected over half a million common-sense postulates that people take for granted, such as "the sky is blue during the day;" Singh estimates 100 million common-sense propositions may be needed to endow computers with the general knowledge shared by society. Singh is excited that the Open Mind database is already being integrated with a number of other MIT projects, such as an email program that automatically selects digital photos complementary to a particular email. Henry Lieberman, who is developing the email program, says his application of AI is "fail soft," meaning that inaccuracy has little impact. Singh says the use of the Open Mind database in other projects signifies that the program is really becoming useful. MIT Artificial Intelligence Laboratory director Rodney Brooks says researchers are attacking the AI problem from many different directions, and that "none of us are going to get there any time soon, and none of us are going to get there alone." AI guru Ray Kurzweil says making computer intelligence on par with human intelligence requires a better understanding of the human brain, while the task of decoding human intelligence is now at the point where the human genome project was 10 years ago. He expects that by 2029 AI will have advanced enough that a computer will be able to pass the "Turing Test."

  • "Q&A With Dan Reed"
    InformationWeek (05/13/03); Ricadela, Aaron

    Dan Reed of the University of Illinois at Urbana-Champaign's National Center for Supercomputing Applications (NCSA) says the center's mission to devise enabling technologies for the science and engineering sectors is being extended to the arts arena. For instance, NCSA is working on data storage and query interfaces for the National Archives and Records Administration, while methods of mining for data taken from sensors could be used to search out pertinent newspaper articles. Reed anticipates that grid computing will be more applicable to back-end rather than front-end applications, and predicts that high-end computing's value will become evident as the amount of data collected for biological, astronomical, and physics research exceeds the terabyte, and later the petabyte, threshold. Reed calls the development of Japan's Earth Simulator "an impressive technical feat," and sees the value of supercomputers for such projects as the development of artificial life, nanotechnology, and smart materials, as well as modeling the "universe in a box," and biological studies. Reed observes that the movement of data within the central processing unit is becoming an important consideration for the business as well as the scientific aspects of the computer industry. "One reason vendors are interested in high-performance computing is it's an early test bed for technology that will transfer to the commercial side," he says.

  • "MIT Gives Peek at Future Tech"
    IDG News Service (05/23/03); Krazit, Tom

    MIT alumni, faculty, and students celebrated the centennial of the institute's Electrical Engineering and Computer Science department on May 16, where speakers addressed the need to keep pace with technological changes and cited the academy's progress in the fields of robotics, computer learning, miniaturization, and human-computer communication. MIT computer science lab director Victor Zue noted that the integration of speech recognition and face identification technologies is a key ingredient of successful human-computer communications and computer-generated speech and facial movements. Zue acknowledged that such a breakthrough could impact privacy and security--for instance, the technology could conceivably be used to create a realistic depiction of a major figure who could disseminate misinformation to people through television transmissions. Another session covered MIT's robotics efforts, such as a robotic leg prosthesis that adjusts the knee joint according to the texture of the ground and the wearer's speed. MIT professor Paul Penfield Jr. declared that faculty and graduates must always keep in mind that the chief goal of scientific progress is the betterment of society. May 15 saw the unveiling of MIT's federally funded Institute of Soldier Nanotechnologies, whose mission will be to boost the protection and simplify the operations of U.S. troops.

  • "How to Unclog the Information Artery"
    New York Times (05/25/03) P. 3-1; Hansell, Saul

    The growing headache of spam has prompted proposals to rigorously control it, some of which aim to be effective without repressing the libertarian spirit of the Internet. Hans Peter Brondmo of Digital Impact advocates Project Lumos, which requires mass emailers to include an encrypted digital certificate in the header of each message, and submit to performance scoring based on recipient complaints and how much email is bounced back. The nonprofit Spamhaus Project directed by Steve Linford has organized the Spamhaus Block List, which blocks addresses confirmed as spammers through spam traps and the Registry of Known Spam Organizations; Linford thinks the Can Spam Act currently before Congress will only cause spam to surge, since it would forbid ISPs from blocking spammers as long as those spammers send junk email with legitimate addresses. Orson Swindle of the FTC has urged ISPs to provide users with a simple system to block spam, but thinks the providers are reluctant to do so for fear of alienating customers who advertise online. He adds that harsher penalties are useless because the majority of spam cannot be traced, and also doubts the viability of a do-not-spam list. Release 1.0 editor Esther Dyson favors industry self-regulation, and maintains that federal legislation is more likely to spur spammers to set up shop overseas; she also believes TRUSTe's proposal for certifying good emailers is "a marketing scheme," and calls spam blacklists censorious, indiscriminate, and too broad. Crosstown Traders' Michael P. Sherman says legitimate email is endangered by a "huge overreaction" to spam, and suggests that an effective deterrent could be developed via industry/government collaboration to enforce existing statutes, and legislation that authorizes honest emailing practices.
    EarthLink's Spam Blocker, which CEO Garry Betty says will debut next week, only allows users to receive email from people in their address book, and requires new senders to ask permission to send messages to recipients; Betty claims this system is 100 percent effective, and his company is also pursuing litigation against spammers.
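
    The address-book scheme Betty describes is a whitelist with a one-time challenge for unknown senders. A toy sketch of that logic, with invented function and field names (not EarthLink's actual implementation):

```python
# Toy whitelist/challenge-response filter in the spirit of the blocker
# described above. All names here are illustrative assumptions.

def filter_message(sender, address_book, pending):
    """Deliver mail from known senders; challenge unknown ones once."""
    if sender in address_book:
        return "deliver"
    if sender not in pending:
        pending.add(sender)        # send a one-time "ask permission" challenge
        return "challenge sent"
    return "held"                  # already challenged, awaiting confirmation

book = {"alice@example.org"}
pending = set()
print(filter_message("alice@example.org", book, pending))  # deliver
print(filter_message("bob@example.org", book, pending))    # challenge sent
print(filter_message("bob@example.org", book, pending))    # held
```

    The approach blocks bulk mail almost completely because spammers rarely answer challenges, but it also holds legitimate first-time correspondents, which is the usual trade-off cited against challenge-response systems.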

  • "Predictions for Software Development and Web Services"
    Computerworld (05/19/03); Betts, Mitch

    Experts are making diverse forecasts about what trends in Web services and software development will emerge over the next five to 10 years. John Radko of Global Exchange Services predicts that corporate budgets for internal and external IT will equalize within five years, while half of the current crop of enterprise application integrators will fold as Web services become the prevalent technology in the same period. Today's application servers will transition to legacy transaction platforms due to the emergence of a grid computing-based application development and runtime platform in seven years, according to Cyclone Commerce CTO Dave Bennett; Fiorano Software's Atul Saini believes that by 2010, "the process of building complex, networked and distributed enterprise software applications will finally become as easy as building towers with Lego blocks." Bonny K. Eappen of Tata Consultancy Services expects business application development via programming languages to be phased out by 2010 in favor of natural-language-like applications and components that can be reused and configured. SPI Dynamics CEO Brian Cohen anticipates the meshing of software development and security into a single process, and M7's Monsour Safai believes it could take about five years for organizations to complete the switch to a service-oriented architecture. Integic's Robert E. LaRose predicts private-sector IT advances will be driven by the government as more and more federal agencies take advantage of Web services, while IBM's Grady Booch envisions genuine on-demand adaptive computing thanks to applications that self-regenerate through user interaction. Meanwhile, American Management Systems' Paul Turner believes Web services will transform how citizens interface with all types of government agencies within five to 10 years, spurring IT vendors to offer products with built-in Web services.

  • "Going For the Wireless Gold"
    Electronic Business (05/01/03) Vol. 29, No. 6, P. 44; Poe, Robert

    Chip manufacturers plan to capture a piece of the projected market for cellular handsets that communicate over high-speed wireless networks through the development of powerful yet energy-efficient processors. Such chips are expected to support devices that can run multimedia programs with graphics, music, and video, and transmit and receive that data over the wireless networks. One way to fulfill these requirements involves running the applications on an applications processor while a separate communications processor manages coding, decoding, compression, and decompression, a task that requires the latter chip to be significantly enhanced. The cellular handset market may be less affected by operating systems (OSes), Bluetooth, and Wi-Fi than previously believed: most handset OSes target the same underlying ARM architecture, allowing them to run on chips from many vendors rather than being tied to one or two brands, while Texas Instruments' Richard Kerslake claims that Bluetooth's inclusion as a standard component in new cell phones is reducing its competitive advantage. Meanwhile, International Data's Alex Slawsby contends that cellular batteries cannot accommodate the power requirements of Wi-Fi. Advancements in hardware, software, services, and network infrastructure must proceed rapidly and concurrently if wireless communications is to evolve, while spurring users to purchase handsets and services will rely on the development of killer apps. Indeed, the biggest hurdle chip suppliers face in conquering the wireless communications market is a scarcity of interested users. Business services, gaming, and location-aware services could open up some huge markets, provided the right products are developed.

  • "The Wi-Fi Revolution"
    Wired (05/03) Vol. 11, No. 5, P. 6; Anderson, Chris; Graves, Lucas; Frauenfelder, Mark

    At the vanguard of the open spectrum movement is wireless fidelity (Wi-Fi), a cheap, powerful, and viable way to access broadband Internet that has emerged as one of the most rapidly expanding electronics technologies of all time. Wi-Fi, which is not subject to telecom regulation, is being driven by a grassroots movement of home-based users; meanwhile, public networks of Wi-Fi access points are sprouting throughout the country. Getting Wi-Fi access currently involves seeking it out, but companies are working to ease this process via wearable sensors keyed to Wi-Fi signals, at least until commercial hotspots have sufficiently proliferated. Wi-Fi could also become the standard technology to enable a central entertainment server within households, and to bridge the "last mile" between telecom networks and homes relatively cheaply. After the technology becomes ubiquitous, a standard device for accessing any Wi-Fi network anywhere will likely be created. Important developments in the open spectrum movement include neighborhood wireless accessibility being inexpensively deployed by Chicago real estate developer Don Samuelson and NYCwireless founders Terry Schmidt and Anthony Townsend; and the phasing out of government airwave regulation through ultrawideband, cognitive radio, and other technologies supported by Dewayne Hendricks of the FCC's Technological Advisory Council. Sens. George Allen (R-Va.) and Barbara Boxer (D-Calif.) favor the penetration of broadband into homes and enterprises through legislation such as the Jumpstart Broadband Act of 2003, while Vivato CEO Ken Biba has extended Wi-Fi's range with new switching technology. Another key figure is Sky Dayton, whose Boingo Wireless reportedly connects roughly 1,200 Wi-Fi hot spots in 46 states, making his company, in his words, "the first...to make it easy for anyone to find and connect to a huge network of hot spots."

  • "Building a Standard"
    Government Technology (05/03) Vol. 16, No. 6, P. 34; Patterson, Darby

    Section 508 mandates that federal agencies purchase enablement technologies for the disabled, and since its passage almost two years ago, over a dozen U.S. states have adapted Section 508 to their policies. Industry leaders and accessibility proponents such as Hewlett-Packard's Michael Takamura favor compatibility standards based on Section 508; the alternative is to contend with inconsistent standards for each state that impede the adoption of and adherence to accessibility guidelines. Another major goal is the development and adoption of international accessibility standards, which IBM's Shon Saliga says is "beneficial for us and the states and the global market." W. Edward Price at the Georgia Institute of Technology's Information Technology Technical Assistance and Training Center (ITTATC) notes that the market for wireless devices is burgeoning, yet many of these products favor "style over function" and are inaccessible to disabled users. ITTATC builds accessibility resources and makes them available to state and local governments, and also offers training programs, design resources, deployment aid, and support for government procurement officials. Meanwhile, Microsoft argues the business case for accessibility and lists training and information resources in the book "Accessible Technology in Today's Business: Case Studies for Success." Microsoft and Freedom Scientific jointly introduced a consumer version of PAC Mate, a PDA that serves visually impaired users, in late 2002, and announced that such enablement technologies will open up job opportunities to the disabled, a critical consideration as a massive number of government employees near retirement. At IBM, where accessibility is a top priority, Jim Sinocchi has set himself the goal of making accessibility a design component.
