ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 641: Friday, May 7, 2004

  • "E-Voting Commission Gets Earful"
    Wired News (05/06/04); Grebb, Michael

    The Election Assistance Commission's (EAC) first public hearing on the state of elections and voting systems, held May 6, was characterized by opposing testimony from supporters and critics of electronic voting machines. Johns Hopkins University computer scientist Avi Rubin warned that paperless e-voting systems raise the risk of election fraud, explaining that "very well-funded and bad-intentioned adversaries" would have a lot to gain by rigging elections. Also testifying was California Secretary of State Kevin Shelley, who raised concerns about hackers penetrating the systems and altering votes, and recommended better training for poll workers so they can handle technical difficulties on Election Day. Other panelists included Rep. Rush Holt (D-N.J.), who was stumping for his Voter Confidence and Increased Accessibility Act, a proposal to make paper records a required component of all voting machines. Jim Dickson of the American Association of People With Disabilities countered that such a measure would severely limit disabled voters and overturn protections set up by the Help America Vote Act of 2002. Representatives of voting machine manufacturers claimed that electronic systems actually increase election security, because far fewer people know how to hack into e-voting systems than know how to rig and compromise elections by stealing ballots, stuffing ballot boxes, or punching holes in voting cards. Kennesaw State University computer science professor Brit Williams said that a single national software standard for e-voting systems was needed to ensure security and ease training burdens. EAC Chairman DeForest B. Soaries Jr. doubted that the commission's forthcoming preliminary recommendations will include national standards requiring paper ballots, insisting that "We will not decide on what machines people will buy."
    Click Here to View Full Article

    For more on e-voting, visit http://www.acm.org/usacm.

  • "Faces of Globalization: Jobs for Tech Grads"
    United Press International (05/06/04); Magers, Phil

    Technology job opportunities for graduates--at least those who fail to specialize in creative areas--will grow scarcer because of layoffs and offshore outsourcing, say people such as University of Texas at Arlington senior Brad Pitman, who has interviewed with defense department contractors because of his interest in autonomous robot vehicles. A survey conducted by the Computing Research Association records a 19 percent decline in enrollment in computer science bachelor degree programs in 2003, and a 23 percent dip in the number of new undergraduates majoring in computer science. Both the high-tech industry slowdown and offshoring were cited as major reasons for the slackening interest. "This was not lost to prospective students and their parents who were previously looking into the IT sector as an easy entry into the job market," comments Ohio State University Computer and Information Science Chairman Stu Zweben, who coordinated the survey. However, he notes that findings from the Bureau of Labor Statistics indicate that demand for computer science graduates may once again overtake supply in the coming years. The bureau projects that software engineering jobs will be among the 10 fastest growing jobs through 2012, and that nine out of 10 of those jobs will be computer or health related. "The more creative things, to solve problems for new customers, problems where you really have to interface with the customer to understand their needs, these are the kinds of things our graduates are also being trained to do and these are the kinds of jobs that are not going to be off-shored," observes Zweben.
    Click Here to View Full Article

  • "Plan for Spectrum Is Making Waves"
    Los Angeles Times (05/07/04) P. C1; Shiver Jr., Jube

    Lobbyists from the technology industry are making their presence known in Washington as the FCC and the White House put pressure on television broadcasters to make better use of airwaves: Technology firms want greater use of spectrum to expand their wireless Internet offerings while more and more end users are going wireless. Television broadcasters were granted $70 billion worth of spectrum space in 1997 in order to prepare for the switch to digital television in 2006, but have made slow progress toward that goal. The airwaves they vacate would be given to technology firms to use; television frequencies are highly prized because they can easily pass through walls and other obstacles. The FCC recently said it would include cable and satellite television subscribers in the quota needed to force the switch, putting pressure on broadcasters to end their analog services. Once 85 percent of television viewers can receive digital transmission in a local market, broadcasters will have to relinquish their spectrum frequencies. The Bush administration and the FCC are eager to roll out more wireless services, which technology firms say provide cheaper and easier-to-manage infrastructure than wired lines. Tropos Networks in San Mateo, Calif., can serve customers with wireless Internet connections for just $3 compared to five to 10 times that cost required for wired cable and DSL connections, according to Tropos CEO Ron Sege. Intel lobbyist Peter K. Pitsch thinks all the enthusiasm in Washington is not just due to technology, but also to the technology industry's increased political influence: He says, "If you look at the information technology industry, two or three years ago we were almost nowhere when it came to spectrum debates. Now, we are right in the middle of things in a big way."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "How Much Does Information Technology Matter?"
    New York Times (05/06/04) P. C2; Varian, Hal R.

    Nicholas G. Carr has written a follow-up book to his much debated Harvard Business Review article published in May 2003, "IT Doesn't Matter;" the new book, titled "Does IT Matter?," expands on his original hypothesis, giving detailed examples and analysis. The basic idea is that companies no longer derive significant competitive advantage from being able to deploy and manage IT because skills and management tools are so abundant--businesses do not think anything of setting up a Web server or inventory management system, for example. Henry Ford was so successful in the 1920s because his company knew how to set up and operate an assembly line better than rivals, but that competitive advantage disappeared once companies such as General Motors adopted similar techniques and even added their own process refinements. Carr argues that IT today is a utility like electricity or water, which is vital to business but not a measure of competitive differentiation in itself. Only in some industries are the business operations so complex that execution of those operations offers competitive advantage: Chip design and manufacture is one such industry, and Intel is able to keep its lead because of its manufacturing excellence, which was the basis for the company's "Copy EXACTLY" program in the late 1990s. However, IT is not a stagnant sector, and each wave of standardization and commoditization allows for more innovation and competitive advantage, even if only temporarily. In the 19th century, the standardization of wheels, gears, screws, and other basic parts allowed for tremendous innovation in farm equipment, locomotives, and so on; in the same way, today's PC, database, and scripting language technologies lay the groundwork for ongoing IT innovation. Companies need to stay on top of IT in order to keep their operations more efficient and effective than the competition.
    Click Here to View Full Article
    (Articles on this site can be accessed free within 7 days of the published date. Registration is required.)

  • "Helping Exterminate Bugs in Spreadsheets, Web Applications"
    Newswise (05/05/04)

    The National Science Foundation has awarded a five-year, $2.6 million Information Technology Research grant to the End Users Shaping Effective Software (EUSES) project, a six-campus initiative to help eliminate glitches in spreadsheets and Web applications developed by "end-user programmers." Experts reckon that there will be 55 million such programmers by next year, and believe that almost 50 percent of the programs they create will be infested by bugs. Oregon State University computer science professor and EUSES director Margaret Burnett says the project lives by the philosophy of helping end users improve their programming habits as unobtrusively as possible. Burnett and other Oregon State colleagues presented a paper at ACM's recent CHI 2004 conference in which they compared different techniques of notifying spreadsheet programmers that they may have created buggy code; their conclusion was that "negotiated" interruptions (similar to the automatic underlining of misspellings by a word processor) were better than immediate interruptions (such as pop-up error windows). "We learned: Stay out of [programmers'] way, give them hints to explore and they'll get more done," Burnett notes. Carnegie Mellon EUSES researchers Andrew Ko and Brad Myers presented a separate report describing a unique debugging interface that asks programmers questions about "why did" or "why didn't" something happen; users were able to find errors eight times faster and make 40 percent more programming progress. Myers comments that current debugging tools, which date back to the 1940s, are overdue for a serious upgrade. Meanwhile, EUSES researchers at Drexel University, Penn State, Oregon State, and Cambridge University are trying to gain insight into end-user programmers' mind set through observation, while another Oregon State-based EUSES effort is focusing on the development of summer science and technology workshops for middle- and high-school teachers and students.
    Click Here to View Full Article
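The difference between the two notification styles the Oregon State paper compared can be sketched in a few lines. The following is a hypothetical illustration, not EUSES code; the formula check and the function names are invented for the example. An "immediate" interruption stops the user as soon as a suspect formula appears, while a "negotiated" one quietly collects flags for the user to explore later, like a word processor underlining misspellings:

```python
# Hypothetical sketch contrasting interruption styles for end-user
# programming tools. The "suspect formula" heuristic here (unbalanced
# parentheses) is invented for illustration.

def check_cell_immediate(formula):
    """Immediate interruption: raise the moment a problem is seen."""
    if formula.count("(") != formula.count(")"):
        raise ValueError(f"unbalanced parentheses in {formula!r}")
    return formula

def check_cells_negotiated(cells):
    """Negotiated interruption: collect flags; the user decides when
    (and whether) to look at them."""
    flags = {}
    for ref, formula in cells.items():
        if formula.count("(") != formula.count(")"):
            flags[ref] = "unbalanced parentheses"
    return flags

cells = {"A1": "=SUM(B1:B9)", "A2": "=SUM(B1:B9"}
print(check_cells_negotiated(cells))
# → {'A2': 'unbalanced parentheses'}
```

The negotiated version never interrupts the user's flow, which matches Burnett's "stay out of their way" finding.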

  • "NASA Reassessing a Role for Robots"
    TechNewsWorld (05/06/04); Dunn, Marcia

    Robots could be employed to extend the life of the Hubble Space Telescope, assist in the maintenance of the International Space Station, set up lunar habitats, and perform other missions deemed too dangerous or costly for humans. Saving the Hubble after the scrapping of a shuttle mission in the wake of the Columbia disaster became a rallying cry that NASA responded to with a call for robot concepts. Among the machines under development that could be used to maintain the Hubble is the University of Maryland's Ranger robot; the humanoid Robonaut from NASA; and the two-armed Dextre from the Canadian Space Agency. David Akin, who led the team that devised Ranger, says the technology already exists to perform 90 percent to 95 percent of the Hubble, space station, or lunar operations NASA wants to do. He warns that "If NASA waits until robots become servants in your house, they're way far behind the power curve." The Columbia tragedy and the Hubble's shrinking lifespan may be the jolt NASA needs to start funding the realization of robotic space applications, Akin notes. Ed Weiler, NASA's associate administrator for space science, was impressed by the success of the recent Mars mission in which two robots landed on the red planet and one of them had to be repaired by Earth-based technicians. Both he and Akin believe robots could complement astronauts on spacewalks and boost overall efficiency. Looking further ahead, Akin posits that extensive stays on the moon and Mars will require dexterous robots that can construct bases and transports.
    Click Here to View Full Article

  • "Nanotech: Beyond the Hype--and Fear"
    Business Week (05/06/04); Aston, Adam

    The potential benefits of nanotechnology could be undercut or limited by unrealistic public expectations fostered by hype, or because of panic spread by alarmist musings on the technology's darker aspects. Kristen Kulinowski of Rice University's Center for Biological and Environmental Nanotechnology explains that one of her jobs is to make sure that science policymakers in Washington are well-informed as to where nanotech development and studies into its associated risks currently stand. She explains that nanotech risk assessment is concentrating on cellular and environmental effects of nanoparticles and nanomaterials: The biological study is focusing on how the toxicity of nanoparticles in cells can be controlled, while the goal of the environmental study is to find a way to refine the manufacture of nanomaterials as well as the materials themselves in order to avoid environmental contamination. Kulinowski believes significant insights on both these issues will be reached within three to five years. She notes that the human genome project--in which ethical, cultural, and legal concerns were anticipated early on and federal funding diverted to study the issues, keep the public informed, and provide transparency--serves as a template for nanotech development. Kulinowski believes the earliest nanotech applications will probably be for the field of biomedicine, examples of which include nanoparticles that attack cancer cells; she also points to Rice scientist Richard Smalley's work with single-walled carbon nanotubes, which could be employed as semiconductors. Two other Rice nanotech researchers, Mark Wiesner and Michael Wong, are respectively focusing on water-filtration membranes and nanomaterials that tap into solar energy to help disintegrate pollutants. Kulinowski thinks that up to 15 years may pass before any "paradigm-shifting" nanotechnologies emerge.
    Click Here to View Full Article

  • "How the Word Gets Around"
    Wired News (05/07/04); Terdiman, Daniel

    The purpose of Brandeis University senior Sam Arbesman's Memespread Project was to track the route of a meme throughout the blogosphere in real time in an attempt to study the mechanisms behind the spread of ideas on the Web. Arbesman set up a Web site and submitted the meme to the popular Boing Boing, Slashdot, and kottke.org blogs, of which only kottke.org was willing to post a link to it. The site consisted of a single page that logged the date and time that it was viewed, the viewer's IP address, and the referring URL whenever possible; it also requested the viewer to help spread the meme. The largest spike took place 10 hours after the experiment began, after the meme spread to the MetaFilter blog. Arbesman's initial analysis reports that at the height of its proliferation, the Memespread Project was the third most contagious piece of information listed on Blogdex and the second most contagious word burst registered on Daypop. "It was neat to see how impressionable people were...and how helpful people were in helping me spread this," Arbesman notes. Hewlett-Packard Labs researcher Eytan Adar does not completely agree with Arbesman's theory that the spread of memes is directly related to the popularity of certain blogs, and Arbesman's own data may support this argument: Indications show that the proliferation of the meme experienced a gradual decay once the MetaFilter-related spike ended. Arbesman acknowledges that the spread of the meme may have been affected by people's awareness that they were being observed. Jason Kottke of kottke.org thinks that the project was a partial success, in that "it did demonstrate the idea that memes are a lot like viruses...things that are contagious."
    Click Here to View Full Article

  • "Smalltalk With Object-Oriented Programming Pioneer Kay"
    SearchWebServices.com (05/05/04); Brunelli, Mark

    Dr. Alan Kay recently won the Association for Computing Machinery's (ACM) Turing Award for helping develop the first full-fledged object-oriented programming language, Smalltalk. Now a Hewlett-Packard senior fellow and president of Viewpoints Research Institute, Kay says enterprise developers need to look at the early thinking surrounding computer development. He says ideas such as those espoused by Douglas Engelbart, who invented the mouse, have been ignored because business developers are in too much of a hurry to implement technology rather than think about more sophisticated design. Kay says object-oriented programming was first conceived as a way to make computer programming scale with complexity: As he worked on Advanced Research Projects Agency (ARPA) research as a graduate student, Kay came upon the idea of modeling hardware and software together so that the distinction between data and procedures became irrelevant. At that time, ARPA researchers were working on how to turn ARPAnet into what was called an intergalactic network that would be as pervasive as today's electrical network. Kay investigated the mathematical properties needed to model the recursive biological cells that make up people's bodies. Noting there are vastly more cells in the human body than nodes on the Internet, he says the biological ability to self-repair is applicable to computer programming, even for applications as large as the Internet. After some years of work, Smalltalk was made a practical product at the Xerox PARC laboratories and helped spawn the overlapping-windows interface, which in turn led to other developments such as desktop publishing. Kay says many of those early computer innovations were just steps toward researchers' larger aims, including changing the way humans think and exchange ideas in the same way the printing press did in the 17th century.
    Click Here to View Full Article

    For more information on the ACM A.M. Turing Award, visit http://www.acm.org/awards/taward.html
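Kay's conception of objects as self-contained cells that interact only through messages can be illustrated with a short sketch. The Python below is a loose analogy, not Smalltalk, and the class and selector names are invented for the example; the point is that all interaction flows through a single message-sending operation, with unrecognized messages handled gracefully (Smalltalk's doesNotUnderstand):

```python
# Illustrative sketch (invented for this note): Smalltalk-style
# message passing in Python. Each object is a self-contained "cell"
# that reacts only to messages sent to it.

class MessageReceiver:
    """All interaction goes through send(), never direct calls."""
    def send(self, selector, *args):
        handler = getattr(self, "msg_" + selector, None)
        if handler is None:
            return self.msg_doesNotUnderstand(selector)
        return handler(*args)

    def msg_doesNotUnderstand(self, selector):
        return f"{type(self).__name__} does not understand '{selector}'"

class Counter(MessageReceiver):
    def __init__(self):
        self._count = 0   # internal state, hidden behind messages

    def msg_increment(self):
        self._count += 1
        return self._count

    def msg_value(self):
        return self._count

c = Counter()
c.send("increment")
c.send("increment")
print(c.send("value"))   # → 2
print(c.send("reset"))   # → Counter does not understand 'reset'
```

Because the object decides how to respond to each message, its internals can change (or self-repair, in Kay's biological analogy) without breaking its callers.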

  • "ACM Announces Award Winners and Fellows"
    ACM (05/07/04)

    As part of ACM's mission to bring broad recognition to outstanding technical and professional achievements within the computing and information technology community, ACM has announced the winners for 12 awards that span a variety of professional and technological expertise. Included are the Turing Award, known as the Nobel Prize of Computing, and a new class of ACM Fellows, our distinguished colleagues to whom ACM and its members look for guidance and leadership as the world of information technology evolves.

  • "Computer 'Mobile Agents' and Robot Tested by NASA"
    Spaceflight Now (05/03/04)

    NASA researchers are putting "mobile agent" software through its paces in Utah's southeastern desert in a test involving exploratory research conducted by human scientists and a prototype robotic assistant dubbed Boudreaux. The test is designed to simulate communications between planetary explorers, robots, and Earth-based mission support. The researchers command Boudreaux using laptops equipped with voice-responsive mobile agent software; the user talks through a microphone to his personal agent software, which in turn relays the instructions to the robot's personal agent software. Bill Clancey at NASA Ames Research Center explains that future planetary exploration will involve the transmission of data among science team members--both on the planet being explored and on Earth--via personal agent software. Information collected in the field will be stored in a database in the explorers' planetary habitat, and then emailed to Earth by the personal agent software. Clancey adds that the researchers playing the part of astronauts in the Utah experiment are also communicating with science teams at universities to test planning and communications software as well as procedures. "By sharing data as soon as possible, and sending a video of the crew's planning session for the next day's work, we hope to learn how the Mars crew and scientists on Earth can best work together," he notes. The astronauts will carry a computer with global positioning system technology, and the collected data will be time- and location-stamped by the agents.
    Click Here to View Full Article
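The time- and location-stamping step described above might look something like the following hypothetical sketch; the function, field names, and coordinates are invented for illustration, not taken from NASA's software:

```python
import datetime

# Hypothetical sketch of the stamping step: a personal-agent-style
# function attaches UTC time and GPS position to a field observation
# before it is queued for relay to the habitat database.

def stamp_observation(note, lat, lon, when=None):
    when = when or datetime.datetime.now(datetime.timezone.utc)
    return {
        "note": note,
        "position": (lat, lon),          # GPS fix from the carried unit
        "timestamp": when.isoformat(),   # ISO 8601, unambiguous across teams
    }

obs = stamp_observation(
    "layered sandstone sample", 38.4064, -110.7919,
    datetime.datetime(2004, 5, 3, 14, 30, tzinfo=datetime.timezone.utc))
print(obs["timestamp"])   # → 2004-05-03T14:30:00+00:00
```

Stamping at collection time means the habitat database and the Earth-bound copies all agree on when and where each datum originated, regardless of transmission delays.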

  • "Presenting the Case for Industrial Qualitative Modelling"
    IST Results (05/04/04)

    The goal of the Information Society Technologies-funded MONET2 project is to attain new insights into and industrial applications for Model-based Systems and Qualitative Reasoning (MBS&QR) technologies. The project focused on the automotive, education and training, medical, and applied diagnostics sectors in order to obtain optimal results; MONET2 has determined the most probable industrial development of MBS&QR technology, outlined the technological requirements necessary to bring the development about, and is now applying its research to industry. MONET2 project coordinator Iain Russell at the University of Wales, Aberystwyth's Computer Science department, says that in order for MBS&QR technologies to reach their full potential, "we are currently building an online 'help' portal for educational QR Modeling and working towards introducing model-based QR technologies in modeling/eco-modeling into the curriculum of Welsh secondary schools." MONET also hosted a summer school in Crete for industrial entrepreneurs and doctoral students seeking to become more knowledgeable about MBS&QR technologies. Among the European projects MONET2 has spawned are REDIME, an investigation into the development of QR models for stream ecologists, and Protocure II, an IST software engineering initiative researching the improvement of medical protocols and guidelines via formal methods. MONET2's precursor, MONET1, was carried out at a time when an MBS&QR community was virtually nonexistent. Since then, the community has expanded to include specialists from numerous fields and leading companies, with 88 member institutions established globally.
    Click Here to View Full Article

  • "The Internet's Wilder Side"
    New York Times (05/06/04) P. E1; Schiesel, Seth

    Internet Relay Chat (IRC) is older than the World Wide Web, but still untamed: Virus writers research and launch code, hackers direct denial-of-service attacks, and copyright pirates offer music, video, and software for free on the IRC. Experts estimate no more than 500,000 IRC users are online at any one time, yet the network has a disproportionate effect on the wider Internet and even home computer use. Many bootleg copies of Windows software or antivirus utilities originated on the IRC, and popular file-trading networks often get their material first from the IRC. What began as an efficient way for serious computer enthusiasts to communicate and trade code in the 1980s has evolved, or perhaps devolved, into a hotbed of illegal computer activity: "Elite" pirates use the IRC, feeding content into the wider Internet, says Business Software Alliance enforcement director John R. Wolfe. The FBI and 10 foreign law enforcement agencies launched an anti-piracy operation called Fastlink aimed at shutting down caches of illegal software online that can distribute hundreds or even thousands of copies each week. IRC infrastructure is easy to operate because the protocol is text-based and servers do not act as intermediaries for large file transfers; instead, servers are used only for communication, while transfer links are established directly between the end users themselves. Client programs needed to connect to the IRC are readily available, and hackers can install small, custom-built IRC clients on unprotected machines, enlisting them in a drone army for launching denial-of-service attacks. IRC server software developer William A. Bierman, whose online handle is billy-jon, says he sometimes cooperates with the FBI in tracking down the culprits of major attacks, but has refused government entreaties to install backdoors that would let FBI agents monitor IRC traffic more easily. Another IRC administrator, Rob Mosher, a.k.a. nyt, says it is useless to try to regulate illegal activity on the IRC because of its open nature.
    Click Here to View Full Article
    (Articles on this site can be accessed free within 7 days of the published date. Registration is required.)
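The article's point that IRC is a simple, text-based protocol is easy to see in code. The sketch below parses one raw IRC line in the framing defined by RFC 1459: an optional ":prefix", a command, space-separated parameters, and an optional ":trailing" text that may contain spaces. The helper name is invented for this example:

```python
# Minimal parser for a raw IRC protocol line (RFC 1459 framing).
# Function name is a hypothetical helper, not from any IRC software
# mentioned in the article.

def parse_irc_line(line):
    prefix = None
    if line.startswith(":"):               # optional source prefix
        prefix, line = line[1:].split(" ", 1)
    if " :" in line:                       # trailing param may hold spaces
        line, trailing = line.split(" :", 1)
        params = line.split() + [trailing]
    else:
        params = line.split()
    command, params = params[0], params[1:]
    return prefix, command, params

print(parse_irc_line(":nick!user@host PRIVMSG #chan :hello there"))
# → ('nick!user@host', 'PRIVMSG', ['#chan', 'hello there'])
```

Because every message is one line of plain text like this, servers only have to shuttle short strings between clients; bulk file transfers (DCC) happen over direct client-to-client connections, which is why the servers themselves stay cheap to run.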

  • "How to Build a Better Hand"
    Toronto Star (05/03/04); Carey, Elaine

    Tom Chau of the Bloorview MacMillan Children's Center has developed an artificial hand that responds to sounds produced by muscles in the arm, and employs a sensor to filter out background noises. The prosthesis is outfitted with a small computer chip that is trained to interpret the user's muscle sounds and move in the desired way. "The training is mostly in the machine," explains Chau, who recently won a five-year, $500,000 Canada Research Chair in pediatric rehabilitation engineering at the University of Toronto. Conventional hand prostheses are driven by electrical sensors mounted in a hard plastic socket that must be firmly positioned over the muscle, but these devices are of limited use to children such as five-year-old Megan Strysio, who was born without a hand and lower forearm; the socket must be mounted over her elbow, which hinders forearm rotation and allows her to control only the movement of the hand's thumb and first two fingers. Chau's device is equipped with a soft socket that rolls on, and it enables elbow rotation and full movement of all five digits because the sensors can be positioned away from the muscle. Jorge Silva, a colleague of Chau's, programs the computer chip to correctly interpret the signals, and he notes, "I'm not training Megan how to use it; I'm teaching the hand how to interpret Megan." The program will receive updates so that the signals will not change as Megan gets older and the hand is replaced. Chau thinks the technology can be applied to the monitoring of vital signs such as heartbeat and respiration, and he is developing a communication system for severely ill, speech-impaired children that can decode head turns, eye blinks, and other understated movements.
    Click Here to View Full Article

  • "Internet Addresses, Phone Numbers Could Soon Be Interchangeable"
    Ottawa Citizen (05/05/04) P. D2; Wilson, Peter

    Widespread adoption of the ENUM domain naming system could vastly change telecommunication by making Internet addresses and telephone numbers interchangeable. ENUM has already been endorsed in the United States by President Bush's administration, but Canada has not yet taken an official position on the matter. Ottawa technology consultant and Canadian Internet Registration Authority board member Timothy Denton predicts that the "domain name system is going to blob over and take over the phone addressing system." He suggests that because of its widespread application across different lines of communication, ENUM will have to be proven secure before it is accepted by the public. The benefits of the system, according to Denton, will be the improvement of voice over Internet Protocol and the elimination of geographically specific phone numbers. ICANN CEO Paul Twomey says that as these changes in numbering are taking place, the domain name structure of the Internet is also likely to undergo a major shift from the Euro-centric names that dominate now, such as .com, .edu, and .gov, to names with Chinese, Korean, Japanese, and even Arabic and Cyrillic characters. While the English-dominated top-level domains are likely to remain in use internationally, Twomey says the impending changes could result in the splintering of the Internet into smaller, localized domains based on language.
    Click Here to View Full Article
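ENUM's mapping from phone numbers to domain names follows a simple convention, specified in RFC 3761: strip the E.164 number down to its digits, reverse them, separate them with dots, and append the e164.arpa suffix; a DNS lookup on the resulting name then yields records (email, VoIP addresses, and so on) for that number. A minimal sketch of just the name transformation, with the function name invented:

```python
# Sketch of the RFC 3761 number-to-domain transformation that
# underlies ENUM. Only the name mapping is shown; the subsequent
# DNS NAPTR lookup is out of scope here.

def enum_domain(e164_number):
    """Map an E.164 phone number string to its ENUM domain:
    keep digits only, reverse them, dot-separate, add the suffix."""
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

print(enum_domain("+1-202-555-0199"))
# → 9.9.1.0.5.5.5.2.0.2.1.e164.arpa
```

Reversing the digits puts the country code at the rightmost (most significant) end, mirroring how DNS delegates authority from the top-level domain downward.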

  • "Making Up for Lost Time on IPv6"
    America's Network (05/01/04) Vol. 108, No. 7, P. 14; Poe, Robert

    U.S. companies may not be as interested in Internet Protocol version 6 (IPv6) as their foreign counterparts, but the recent involvement of the U.S. government has given at least some boost to national IPv6 development. The Moonv6 project started last June, at the same time the Department of Defense announced that all of its new global information grid assets would be IPv6-capable beginning Oct. 1, 2003. The goal is to get all Defense networks to IPv6 capability by 2008 while retaining backward compatibility with IPv4. IPv6 has never been much of a concern in the U.S. because the early involvement of many U.S. companies ensured an ample supply of IPv4 Internet Protocol addresses, which were awarded on a first-come, first-served basis; other countries contributed to IPv6 development more earnestly beginning in the 1990s, though the urgency has lessened somewhat with the use of DHCP (dynamic host configuration protocol) and NAT (network address translation), which make IPv4 more flexible. IPv6 still does the same tasks more simply and easily, and offers added security and configurability benefits as well. Global Crossing's Anthony Christie says IPv6 will allow event-specific networking, where networks can be easily set up and dismantled for temporary events such as concerts, conventions, or disaster recovery sites. The recent Moonv6 project involved government and industry in some of the first major IPv6 collaboration in the U.S., and established a permanent IPv6 backbone for testing and peering purposes. AT&T's Rose Klimovich says the Defense Department's involvement has spurred increased interest from vendors, along with the desire to be first in a new market.
    Click Here to View Full Article
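The address-scarcity argument behind NAT and DHCP is a matter of arithmetic: IPv4 addresses are 32 bits, IPv6 addresses are 128 bits. A quick illustration using Python's standard ipaddress module (the example address is from the 2001:db8::/32 documentation range):

```python
import ipaddress

# IPv4's 32-bit space vs IPv6's 128-bit space -- the scarcity that
# made NAT and DHCP necessary stopgaps, and that IPv6 removes.
print(2 ** 32)    # → 4294967296 (about 4.3 billion addresses)
print(2 ** 128)   # roughly 3.4 x 10**38 addresses

# IPv6 also brings a compressed notation; expanding it shows the
# full 128 bits as eight 16-bit groups.
addr = ipaddress.IPv6Address("2001:db8::1")
print(addr.exploded)
# → 2001:0db8:0000:0000:0000:0000:0000:0001
```

With 2^128 addresses there is no need to share one public address among many hosts, which is exactly the sharing trick NAT performs for IPv4.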

  • "Linux Weighs In"
    Federal Computer Week (05/03/04) Vol. 18, No. 13, P. 16; Hardy, Michael

    Linux is taking over the high-end computing sector because of its flexibility, increasing maturity, and cost-effectiveness, according to scientists at government laboratories: New clustered supercomputers are almost all built to run Linux, which is much cheaper to implement across hundreds of processors than proprietary software such as that from Microsoft, says analyst Jonathan Eunice. A new Linux kernel is also setting the operating system up for even greater supercomputer adoption, as that update is expected to add many critical system and application management functions, allowing researchers to cordon off certain jobs and manage how memory is used. Linux is widely known for its horizontal scalability across many nodes, but the operating system is also increasingly scalable in the vertical dimension, or the number of processors able to work together in one node, says Silicon Graphics' (SGI) Andy Fenselau, whose company adopted Linux in 1999. The Pacific Northwest National Laboratory now houses about 2,000 processors, fewer than 100 of which are not running Linux, estimates advanced computing associate director Scott Studham; he adds that many government laboratories need the flexibility of Linux to develop cutting-edge supercomputing systems, but that improved open-source software tools such as the Lustre file system have also contributed to greater adoption. NASA Ames Research Center is currently testing the Linux 2.6 kernel, which terascale systems leader Bob Ciotti says will add much-needed capabilities. The Linux 2.4 kernel is not as mature as SGI's Irix, a proprietary version of Unix that the center used previously. Still, Studham says supercomputers by the end of the decade will harness hundreds of thousands of processors--a challenge that proprietary Unix versions are probably more prepared for than Linux. "If I were a bank, I'd probably be using Unix," Studham says.
    Click Here to View Full Article

  • "The Interactive Nightmare"
    CSO Magazine (04/04); Datz, Todd

    A group of concerned scientists warned President Bush in 2002 that the United States' critical infrastructure is very vulnerable to cyberattack, and that massive disruptions of electrical power, finance, transportation, and other systems could take place by accident or through deliberate action. Cybersecurity experts believe such an attack could damage both the U.S. economy and citizens' confidence in government. The Homeland Security Department has the responsibility to protect critical infrastructure, led by the National Cyber Security Division, but doing so is not easy--up to 90 percent of critical infrastructure is privately controlled, and the agency has no regulatory authority, so it must rely on persuasion instead. Other agencies have divisions dedicated to protecting critical infrastructure as well, but the newer Homeland Security Department is where cybersecurity efforts are coordinated, and it is still pulling itself together. The agency is to implement the National Strategy to Secure Cyberspace, and priorities include developing a national cybersecurity response system, a threat and vulnerability reduction program, and a security awareness and training program, as well as securing government cyberspace at all levels and assuring national and international cybersecurity cooperation. CERT manager Tom Longstaff says more cyberattacks are aimed at hijacking Internet control structures, such as domain names, rather than at outright damage to Internet infrastructure. Other attacks are aimed at the interfaces between the physical and cyber worlds, such as control systems for power grids, manufacturing, and gas lines. Part of the problem is trying to understand the interdependencies among networks. The stakes are high: although the chances are small, says Alan Paller, director of research at the SANS Institute, "It's absolutely feasible for a massive attack to take out huge segments of the Internet."
    Click Here to View Full Article

  • "Spinning a Smarter Web"
    Information Highways (04/04) Vol. 11, No. 3, P. 16; Bowness, Sue

    The Semantic Web is a project under investigation by the World Wide Web Consortium, veteran research labs such as Stanford University's Knowledge Systems Lab, and other facilities such as Ireland's Digital Enterprise Research Institute to develop a shared system that will help computers comprehend how databases and documents relate to one another, as well as enable them to reason and communicate. Whereas the majority of the present Web's content is designed for human consumption, the Semantic Web would employ "knowledge representation" or "description logic" languages to make that content more understandable to computers. Extensible Markup Language is an early manifestation of such languages, but Resource Description Framework, along with the DARPA Agent Markup Language and Ontology Inference Layer--which were merged into the Web Ontology Language (OWL)--are more geared toward the Semantic Web, and share the goal of generating a universal framework that illustrates the relationship between a system's various entities. Artificial intelligence researchers such as Sheila McIlraith of the University of Toronto are working on "agent" applications designed to interact with each other and follow users' research instructions. McIlraith notes that the Semantic Web currently lacks a business case. Even if one were to emerge, there is no guarantee that the Semantic Web will become universal. Other challenges facing the Semantic Web include issues over privacy, and how agents can determine the trustworthiness of information when confronted by conflicting data. Academic and government research labs are driving Semantic Web advances, including Canada's National Research Council and faculty at the University of New Brunswick and the University of Alberta. McIlraith says, "Europe has a huge funded initiative in Semantic Web" and "has taken the Semantic Web very seriously."
    Click Here to View Full Article
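    [Editor's note: the knowledge-representation languages above all build on the same idea of describing entities as subject-predicate-object "triples" that software can match patterns against. The following stdlib-only Python sketch illustrates that model in miniature; the vocabulary (ex:paper1, ex:authoredBy, and so on) and the helper function are invented for illustration and are not drawn from the article or from any real RDF toolkit.]

    ```python
    # A toy triple store in the spirit of RDF: every fact is a
    # (subject, predicate, object) triple, and "reasoning" agents
    # query it by matching patterns with wildcards.
    # All names prefixed "ex:" are hypothetical examples.

    triples = {
        ("ex:paper1", "ex:authoredBy", "ex:mcilraith"),
        ("ex:mcilraith", "ex:affiliatedWith", "ex:utoronto"),
        ("ex:utoronto", "ex:locatedIn", "ex:canada"),
    }

    def match(pattern, store):
        """Return all triples matching a pattern; None is a wildcard."""
        return [t for t in store
                if all(p is None or p == v for p, v in zip(pattern, t))]

    # Ask: what relationships does ex:mcilraith have, and to whom?
    for s, p, o in match(("ex:mcilraith", None, None), triples):
        print(p, o)   # prints: ex:affiliatedWith ex:utoronto
    ```

    Real Semantic Web data would express the same triples in a standard serialization such as RDF/XML, with shared ontologies supplying the predicates so that independent agents agree on what "affiliatedWith" means.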
