Volume 4, Issue 426: Friday, November 22, 2002
- "Sept. 11 Showed Work Needed on Internet"
United Press International (11/20/02); Burnell, Scott R.
Although the Sept. 11 terrorist attacks had only a slight physical impact on the Internet, they revealed the need for operators to review Internet redundancy plans, says a new study by the National Research Council. The study, requested by the Association for Computing Machinery's Special Interest Group on Data Communication (SIGCOMM), aimed to gather data on how the Internet coped with losing major communication nodes in Manhattan, says Craig Partridge, BBN Technologies chief scientist, ACM SIGCOMM chair, and chair of the report committee. Although the report found the Internet resilient enough overall to withstand attacks at specific sites, it warned that repeated attacks at multiple sites would be problematic. Partridge also notes that when the terrorist attacks damaged some Internet links and services, the effects were often surprising. For example, service providers in Europe lost connectivity because they reached the Internet through a New York City link, and South Africa lost connectivity when it could not reach the Internet's address database. Meanwhile, the study found that certain data links thought to be separate were actually routed over the same fiber-optic cable, leaving both vulnerable. Internet users also showed a lack of contingency plans, the report said. For example, a New York City hospital was briefly unable to deliver medical data to its physicians' PDAs because the attack interrupted its link to an external service provider.
- "Russia Joins Tech-Worker Game"
SiliconValley.com (11/21/02); Ackerman, Elise
Information technology currently accounts for roughly 1 percent of Russia's gross domestic product, and although Russian IT workers number 70,000, about 1.3 million Russians have a computer science or engineering degree. In order to find employment for these people, Russian outsourcing companies are hoping to net lucrative overseas contracts at the second U.S.-Russia Technology Roundtable. Intel Russia President Steve Chase notes that the country is bustling with creative and highly disciplined tech talent that is underused, and that such people are willing to work cheaply. "Russia is where India was 10 years ago," Chase comments. Aberdeen Group research director Stephen Lane singles out Russian developers for their ability to tackle core engineering projects. "When it comes to solving mathematical algorithms, they are basically unbeatable," says Chase. But with well-entrenched firms securing larger contracts, Russian IT professionals need to build skills in other areas in order to compete, Lane explains.
- "Protecting U.S. Could Boost Tech Sector"
Los Angeles Times (11/21/02) P. C1; Shiver Jr., Jube
The technology sector could receive a much-needed shot in the arm from the Senate's passage of the Homeland Security bill, which calls for the consolidation of 22 federal agencies into a Department of Homeland Security. The effort will require the installation of computers and other electronic equipment, which could add up to hundreds of millions of dollars in sales. Furthermore, the government could spend even more money to develop and deploy technology to aid its anti-terrorism initiatives. Still, Wall Street analysts do not foresee industry gains, at least in the short term. "Investors are not in a 'show me the money' mode," says Stephens analyst Timothy Quillin. Nevertheless, companies are landing lucrative federal security contracts to implement new systems and services: Time-Domain, for example, is developing ultrawideband, a technology that promises to spawn wireless services for advanced radar surveillance, collision-avoidance systems, and locating people buried in earthquake rubble. Time-Domain's Jeff Ross says the government is focusing heavily on commercially available solutions. However, Heritage Foundation research fellow James Gattuso cautions, "When the government starts putting its fingers in commercial technologies, you are going to end up wasting a whole lot of taxpayer money and end up with things that only bureaucrats love." VeriSign's Barry Leffew says the homeland security bill "could generate a very substantial" amount of business; experts say the bill could help overall security spending jump from $16 billion to $40 billion over the next five to seven years.
- "Homeland Security's Tech Effects"
CNet (11/20/02); McCullagh, Declan
The legislation that passed Congress to create the Homeland Security Department also includes a number of provisions concerning law enforcement's ability to secure the cyber-front. A number of new offices will be established to consolidate information on threats to critical infrastructure, including the Internet and communications, and research on homeland security-related technologies. The new Homeland Security Advanced Research Projects Agency, for example, will operate similarly to the Defense Advanced Research Projects Agency (DARPA) and will receive $500 million each year to fund innovative security research. Bundled with the legislation was the Cyber Security Enhancement Act, which provides stiffer penalties for computer crime, allows ISPs to reveal more to law enforcement, and loosens restrictions on Internet eavesdropping by the government. The Center for Democracy and Technology warns that the powers given to the new department constitute grave threats to the privacy and civil liberties of Americans because it aggregates intelligence-gathering and analysis. Still, the Homeland Security Department will appoint a civil liberties officer and privacy advocate to mitigate any invasive practices. The legislation creating the agency also included a provision that barred "the development of a national identification system or card." The bill consolidates five agencies now responsible for critical infrastructure protection: the Defense Department's National Communications System, the Commerce Department's Critical Infrastructure Assurance Office, the Federal Computer Incident Response Center, an Energy Department analysis center, and the National Infrastructure Protection Center at the FBI. The bill also permits the department to enlist volunteers to help local communities recover from attacks on communications networks and information systems, and orders the development of a comprehensive national plan for securing the critical infrastructure of the United States.
- "Dyson Seeks to Amplify the Public's Voice in Internet Policy"
TechNews.com (11/20/02); McGuire, David
Former ICANN Chairwoman Esther Dyson says that Internet users around the world need to become better educated about Internet issues in order to make ICANN's global democracy work, which is one reason Dyson is not unhappy about the end of elected ICANN board members. Dyson does support having elected board members, but she also says that Internet users now need to work effectively within ICANN "in the bowels rather than on the board." ICANN's reform plan includes an Internet-user supporting organization, the At-Large Advisory Council (ALAC), that will advise ICANN while also choosing some members of the nominating committee that selects ICANN board members. Internet users must help ALAC by creating regional sub-groups as forums for Internet users in five distinct geographical areas: North America, South America, Europe, Asia/Pacific, and Africa. Dyson, who is working to coordinate the groups, believes that if enough users participate, ALAC will possess real clout, an opinion shared by Center for Democracy and Technology analyst Rob Courtney. Still, Courtney says that an effective ALAC is a long shot considering the many obstacles, including a lack of funding. Common Cause President Don Simon calls ALAC a "pale shadow" of ICANN's original mandate for incorporating public participation within ICANN.
To read more about ICANN, visit http://www.acm.org/usacm.
- "Tele-Immersion System Is First 'Network Computer,' With Input, Processing, and Output in Different Locations"
Scientists from the University of Pennsylvania and the University of North Carolina at Chapel Hill collaborated to create the first tele-immersion system to rely on computing power delivered over a network. The system, displayed at the Supercomputing 2002 meeting in Baltimore, involves two adjacent rooms, each with a bank of digital cameras set up in a half-circle. Based on the visual data captured, participants in each room experience something like a 3D videoconferencing session. Users wear special headgear that the cameras can track, as well as polarized glasses that deliver subtly different images to each eye. The innovative aspect of the project is that the data processing is not performed on site, but piped over the Internet2 backbone to the Pittsburgh Supercomputing Center, 250 miles away, and back. Kostas Daniilidis, assistant professor of computer and information science at the University of Pennsylvania, says supercomputing power and the high-speed network make the tele-immersion experience more realistic by cutting latency to just 200 milliseconds. He says such powerful infrastructure allows even more potent capabilities than completing the computing on site would. Previous tele-immersion experiments had reaction times of 15 seconds and scanned only limited portions of a room.
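The 200-millisecond figure is easier to appreciate with a rough latency budget: at fiber-optic signal speeds, the 250-mile round trip to Pittsburgh accounts for only a few milliseconds, so nearly all of the delay is 3D reconstruction and rendering. A back-of-envelope sketch (the straight-line fiber path and the signal speed in glass are assumptions, not figures from the article):

```python
# Rough latency budget for the tele-immersion round trip.
# Assumptions: a straight 250-mile fiber path each way, and light
# traveling in fiber at roughly two-thirds of c (~200,000 km/s).

MILE_KM = 1.609
C_FIBER_KM_S = 2.0e5          # assumed signal speed in glass fiber

distance_km = 250 * MILE_KM   # one-way distance to Pittsburgh
round_trip_s = 2 * distance_km / C_FIBER_KM_S
propagation_ms = round_trip_s * 1000

total_ms = 200                            # end-to-end latency from the article
processing_ms = total_ms - propagation_ms # everything that is not the wire

print(f"propagation: {propagation_ms:.1f} ms")
print(f"processing:  {processing_ms:.1f} ms")
```

Even doubling the assumed path length to allow for real fiber routes leaves propagation under 10 ms, which is why offloading the heavy computation to a supercomputer 250 miles away remains practical.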
- "Energy Needs May Limit Size, Ability of Quantum Computers"
The more accurate a quantum computer is, the greater its energy needs, according to a report by University of Arkansas physics professor Julio Gea-Banacloche. Precise quantum calculations require pulsed electromagnetic fields, and a way to generate the necessary energy has yet to be found, the researcher explains. "Also, with solid-state controllers, such as capacitors, there is a minimum size because they have to be able to hold a certain amount of energy," Gea-Banacloche notes. He observes that the various proposed quantum computing control systems all need the same amount of energy, as determined by Heisenberg's uncertainty principle. Quantum computers with shorter decoherence times--how long the system can operate before its quantum state falls apart--require more energy, because decoherence forces the computer to constantly scan and correct every quantum bit for errors. Gea-Banacloche estimates that a system with a 10-microsecond decoherence time needs at least 10 megawatts. Most solid-state quantum systems have shorter decoherence times, while atomic-trap quantum systems exhibit longer ones. Gea-Banacloche's findings were published in a recent issue of Physical Review Letters.
- "InfiniBand Reborn for Supercomputing"
CNet (11/21/02); Shankland, Stephen
The InfiniBand high-speed networking standard, which never fulfilled its promise to replace PCI technology and revolutionize business computing, may get a second chance from the supercomputing sector, which has pledged support to build supercomputers and related products using the technology. Many InfiniBand companies are using this week's SC2002 conference to unveil products: Paceline Systems, which counts Sandia National Laboratories and the University of Washington among its clients, is collaborating with MPI Software Technology on an implementation of the Message Passing Interface (MPI), open-source software that controls data sharing between disparate systems. Meanwhile, Dell Computer says InfiniBand clusters are currently undergoing laboratory testing to see if they could be a viable choice for high-performance computing. In addition, InfiniBand was used to interconnect 128 units into a supercomputer at Los Alamos National Laboratory. "The idea of having a high-performance, low-overhead interconnect that everyone can agree on is pretty appealing," notes Illuminata analyst Gordon Haff. InfiniBand's advantages include higher data transfer speeds than Ethernet or Fibre Channel; InfiniBand 4x offers data transfer speeds of 10 gigabits per second (Gbps), while Fibre Channel is standardized at 2 Gbps and mainstream Ethernet at 1 Gbps. Although InfiniBand is not cheap, it could be used in Beowulf clusters, which interconnect Linux machines and can scale to provide high-performance computing. Such clusters are harder to program than traditional supercomputers but run on inexpensive software such as Linux and GNU tools.
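The quoted line rates translate directly into transfer times. A small sketch comparing the three interconnects (the 100 GB dataset size is an arbitrary example, and protocol overhead is ignored):

```python
# Time to move a dataset over each interconnect at its quoted line rate,
# assuming a fully saturated link and no protocol overhead.

line_rates_gbps = {            # gigabits per second, from the article
    "InfiniBand 4x": 10,
    "Fibre Channel": 2,
    "Gigabit Ethernet": 1,
}

dataset_gb = 100               # example dataset size in gigabytes (assumed)
dataset_gbits = dataset_gb * 8

transfer_s = {name: dataset_gbits / rate
              for name, rate in line_rates_gbps.items()}

for name, seconds in transfer_s.items():
    print(f"{name:>16}: {seconds:5.0f} s")
```

For cluster workloads that repeatedly exchange large intermediate results, that 5-to-10x gap in wire time is the appeal Haff describes.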
- "Pushing Patents"
San Francisco Chronicle (11/18/02) P. E1; Pimentel, Benjamin
Silicon Valley giants such as Hewlett-Packard and Cadence Design Systems may be losing money, but their research and development budgets and patent filings have continued to climb, underscoring the importance of holding a portfolio of intellectual property, especially in a downturn. "We cannot sustain our innovation without investing in R&D and protecting that R&D with patents," declares Silicon Graphics CEO Bob Bishop. Even when companies let staff go in order to reduce costs, they make an effort to at least maintain their R&D initiatives. This is balanced by an even more aggressive attitude toward defending their portfolios, according to Fernandez & Associates managing partner Dennis Fernandez: Nowhere is this more evident than in the numerous patent infringement lawsuits that have sprung up in recent months. However, patent quantity does not guarantee competitive superiority; no Silicon Valley firm has yet vaulted onto the U.S. Patent Office's list of the top 10 American patent recipients since the office began collecting data by organization nine years ago. HP hopes to make the list through its recent merger with Compaq, and has accelerated its patent filings over the last several years. IBM is currently the industry leader in terms of patents awarded and revenue generated from royalties. The company has received more patents than any other U.S. company annually for the past nine years, while its intellectual property portfolio swelled its coffers with over $1.5 billion in licensing royalties last year.
- "A Long, Hard Look at the Hackers"
Financial Times (11/22/02) P. 10; Hunt, Ben
Governments are taking the perceived threat of cyberterrorism very seriously, especially after the Sept. 11 attacks. The U.S.'s national strategy to secure cyberspace postulates a scenario in which terrorists use the Internet to remotely cripple electrical grids, disrupt telecommunications networks and air traffic control systems, and hold national e-commerce and credit card services hostage through the theft of hundreds of thousands of electronic IDs. However, critics such as Gartner's Richard Hunter say that the growing awareness of the "ubiquitousness of technology" has caused the cyberterrorism threat to be exaggerated. He argues that cyberattacks, although "worrisome," would not have a major economic or social impact. "There is some potential there but most people for the foreseeable future have more to fear from a harsh winter storm than a cyber-terrorism attack," Hunter declares. Still, he and digital risk specialist mi2g acknowledge that cyberattacks, when used in conjunction with physical attacks, could amplify damage. Further fueling worries of cyberterrorism is the presence of hackers, who number in the thousands: mi2g calculates that there are 6,000 hacker groups around the world, and while most hack purely for recreation, a growing number of individuals are motivated by politics or employed by criminal syndicates. At the U.S. Naval War College in July, analysts and industry representatives participated in a Digital Pearl Harbor exercise, and 79 percent said they expected a cyberattack to occur within the next two years.
- "Agencies Fail Cyber Test"
Washington Post (11/20/02) P. A23; Lee, Christopher
The House Government Reform subcommittee gave the federal government an overall failing grade for its computer security efforts, which were assessed in a study by the General Accounting Office (GAO). The GAO's study flunked 14 of the 24 largest departments and agencies it examined, while seven agencies earned a "D" and two earned a "C." The highest grade, a "B-", went to the Social Security Administration. "The overall government grade is an F, the same as last year," said Rep. Stephen Horn (R-Calif.), who chairs the panel. The GAO found significant shortcomings at all 24 agencies, including weak access controls that leave sensitive information vulnerable to former employees, thieves, or terrorists. As a result of these deficiencies, federal payments could be stolen or lost, Social Security and other records could be released illegally, and thieves could access personal information from tax records to set up false credit histories. Kenneth Mead, the inspector general at the Department of Transportation (DOT), said the DOT needs better control over its systems since hundreds of thousands of people are authorized to use its computer networks. At the Social Security Administration, by contrast, employees must notify officials of suspected computer viruses and routinely discuss security. Overall, the GAO found that "poor information security is a widespread federal problem with potentially devastating consequences." Still, the GAO's Robert F. Dacey says that as vulnerabilities are better understood, the government moves closer to addressing the problems.
- "New Dogs or New Tricks Await Semiconductor Industry"
Nanotech Planet (11/18/02); Pastore, Michael
Experts at the Nanoelectronics Planet Conference & Expo this week discussed the future of silicon as a platform for computing. Many attendees said that nanoelectronics would only be able to make halting progress into the semiconductor realm because of the huge investment the industry has already made in silicon-based technology. However, Micro and Nanotechnology Commercialization Educational Foundation President Dr. Steve Walsh said nanotechnology would eventually revolutionize the tools used to inspect semiconductors, the materials they are based on, semiconductor components, and entire semiconductor systems. The time frame for this change, Walsh said, is anywhere from 10 to 100 years, depending on whom you ask. Because of this uncertainty over the commercialization of nanotechnology in semiconductors, Walsh said that venture capitalists are wary of funding nanotechnology startups, leaving most of the work to large firms. Startup companies, however, would provide important niche applications for semiconductors using nanotechnology. Silicon is running up against serious problems, such as overheating and signal interference, but University of Rochester electrical and computer engineering chair Dr. Philippe Fauchet expects further innovation to extend the life of silicon as a semiconductor platform. He said porous silicon and optical interconnects would help mitigate these problems and keep silicon "the dominant material in microelectronics" in "our lifetime."
- "Cyber Dudes to the Rescue"
Los Angeles Times Magazine (11/17/02) P. 20; Reitman, Janet
The nation's computer infrastructure could be protected by the Cyber Corps, a federally funded task force of young computer experts who will be offered cybersecurity degrees in return for two years of government service. Some 150 students are enrolled in 11 Cyber Corps "scholarship for service" programs throughout the United States; leading the campuses participating in the initiative is the University of Tulsa.
White House Cyber Czar Richard Clarke notes that cybersecurity plays a key role in homeland security, but there is a profound lack of IT talent with the necessary training. Students participating in the Cyber Corps program are required to steer clear of behavior and situations that could endanger their chances of being granted a security clearance, and will be subject to polygraph tests and background checks if the agencies that hire them so desire. Some applicants are attracted to the program out of a sense of patriotism in the wake of Sept. 11, while others are motivated by the sense of power and the "coolness" factor. Students study viruses and other computer assault technologies, channeling their hacker skills into projects that benefit the nation's infrastructure and forestall cyberterrorist attacks. This course of study is balanced with repeated warnings that much of what students are being taught should not be applied outside their government jobs. Former National Security Agency information assurance director Michael Jacobs insists that the screening process for potential Cyber Corps members is "very robust."
- "Learning From the 'Thumb Tribes'"
Business Week Online (11/20/02); Hof, Robert D.
The mobile Internet stemming from the convergence of the Web and cell phones will give rise to what "The Virtual Community" author Howard Rheingold calls "smart mobs"--social groups that are able to carry out collective activities with astonishing speed. He says that he has observed emergent smart mobs in Tokyo and Helsinki, where people thumb messages on their cell phones; he also notes that cell phones were used to coordinate coups in Venezuela and the Philippines, while al Qaeda's Sept. 11 attacks also owed a good deal of their planning and execution to the mobile Internet. Rheingold argues that American industry is a laggard in its adoption of the mobile Internet. He says the growing sophistication of technology will make it easier for groups of people to mobilize by lowering transaction costs. In fact, the next technological breakthrough may not be new hardware, but the way in which people use existing technology. Rheingold notes that with Napster, 70 million people shared the use of their hard disks; he asks, "What happens when those are mobile devices [and vast amounts of computing power]?" However, Rheingold cautions that telecommunications companies, manufacturers, or cable companies could limit mobile Internet applications by clamping down on innovation and research. "The lockdown of broadband providers and the move to build digital-rights-management chips that limit what changes you can make on the edges of the network may mean that the Gateses or the Wozniaks or the Berners-Lees of the future won't be in the game," he says. "And that means the game slows down."
- "Cyberinfrastructure Will Fuel Scientific Discovery, NSF Chief Says"
InternetNews.com (11/19/02); Shread, Paul
Grid computing will play a key role in future scientific discoveries, said National Science Foundation Director Rita Colwell this month at the ACM's Supercomputing 2002 conference in Baltimore. She said in her keynote address that a number of current projects could form the basis for a larger future grid. The TeraGrid is, in Colwell's estimation, the most advanced computing platform available to U.S. scientists today; through it, researchers have access not only to vast computing power, but also to remote storage facilities and research tools. The Grid Physics Network (GriPhyN) and the Network for Earthquake Engineering Simulation (NEESGrid) are other examples of the emerging cyberinfrastructure. Colwell added that supercomputing grids would help the life sciences understand life on this planet. The Human Genome Project, she noted, would not have been accomplished without advances in computing. The next step beyond cataloging the genome is protein folding, which Colwell said used to take up to 20 months to model but now takes just a day. As the nation's cyberinfrastructure continues to develop, Colwell foresees it being used to solve problems such as cataloging all living species, managing global fresh water resources, tracking climate change, and preventing the spread of infectious disease.
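Colwell's protein-folding comparison implies a speedup of roughly three orders of magnitude; a quick sanity check, treating a month as 30 days:

```python
# Speedup implied by cutting a protein-folding model from ~20 months to ~1 day.
months = 20
days_per_month = 30           # rounded; exact calendar months vary
old_days = months * days_per_month
new_days = 1
speedup = old_days / new_days
print(f"~{speedup:.0f}x speedup")
```

A roughly 600-fold improvement in a few years reflects the combination of faster hardware, better algorithms, and the pooled capacity that grids like the TeraGrid provide.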
- "Future of the Notebook"
Computerworld (11/18/02) Vol. 36, No. 47, P. 34; Anthes, Gary H.; Brewin, Bob
In addition to speed upgrades and size and weight reductions, notebook PCs will undergo other, more fundamental changes over the next five years. Processors such as Intel's Banias chip, which features improved heat and power management, will be incorporated into most mobile devices. Meanwhile, the advent of the organic light-emitting diode (OLED) will allow notebook screens to be thinner, more power-efficient, and flexible. Lithium ion batteries are likely to remain the power source of choice for manufacturers, although lithium polymer batteries have made significant progress. It is estimated that as much as 90 percent of all notebooks will support at least one version of the Wi-Fi standard within two to three years, while security operations will be strengthened by their migration from software to hardware. Dell Computer's Michael Zucker says that even more security will come from smart cards used in conjunction with personal identification numbers, and future portables will also feature biometric devices. Portable computers are likely to diverge into two general product categories: office-centric notebooks with big screens, powerful CPUs, docking stations, and limited battery life; and smaller "road warrior" models for travelers, outfitted with batteries that last all day. Tablet PCs, which are used with a pen-like stylus for writing and navigation, are expected to serve a niche market. And despite recent storage technology advances, Hewlett-Packard's Matthew Wagner says it will likely be 2005 before computers are equipped with new memory technology such as always-on nonvolatile RAM (NV-RAM). Still, he says, "The disk drive trend is one you can always count on--areal density doubles every year."
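Wagner's closing remark compounds quickly: doubling areal density every year means a 32-fold capacity increase over the article's five-year horizon. A minimal sketch (the 40 GB starting capacity is an assumed 2002-era figure, not one from the article):

```python
# Projected notebook drive capacity if areal density doubles every year.
base_gb = 40                  # assumed 2002-era notebook drive capacity, GB
capacities = [base_gb * 2**year for year in range(6)]
for year, gb in enumerate(capacities):
    print(f"year {year}: {gb:5d} GB")
# After 5 doublings the drive holds 32x the original data.
```

The same exponential logic explains why Wagner treats disk capacity, unlike NV-RAM, as the one storage trend manufacturers can plan around.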
- "Dead Air"
Forbes (11/25/02) Vol. 170, No. 11, P. 138; Woolley, Scott
Despite the open nature of the nation's airwaves, most of the radio spectrum has been apportioned to private industry and special interests, severely restricting opportunities for cellular carriers, wireless providers, and others to give quality services to mainstream consumers. As a result, 20 million Wi-Fi users are forced to share a narrow band of spectrum while a wider band set aside to distribute videos to schools remains mostly unused, and most TV airwaves sit idle while cellular congestion gives rise to poor call service and sound quality--and such problems are only the tip of the iceberg. Local governments currently own 72 MHz of prime spectrum for the purpose of dispatching police and firefighters, but they rely on outdated analog radios that usually consume much more spectrum than cell phones, and more over-the-air TV stations are clogging the airwaves thanks to archaic federal laws, even as Congress and the FCC allocate more free spectrum for digital TV. Meanwhile, the U.S. military owns roughly 500 MHz, thus controlling the largest slice of prime spectrum, but critics argue that its hoarding of this band in the interests of security is ridiculous, especially since most overseas communications will be beyond its control. The situation is particularly hobbling for suppliers of wireless technologies such as ultrawideband and cognitive radio, which can only fulfill their enormous potential if the airwaves are widely accessible. Wireless proponents, with the support of important figures in Silicon Valley and the Bush administration, are calling for changes designed to open up the radio spectrum: FCC Chairman Michael Powell believes that the airwaves should be regulated by market forces rather than the government, and an advisory panel will soon release a recommendation to revamp the regulatory process. 
Proposed solutions to the spectrum shortage include piggybacking on idle spectrum, an FCC recommendation along similar lines that would allow largely unused airwaves to be traded to other users, and making idle spectrum publicly available.
- "Innovation Interrupted"
Optimize (11/02) P. 81; Phillips, Charles
The rapid proliferation of breakthrough technologies appears to be giving way to more modest enhancements to existing systems as companies struggle to save money, become more efficient, and bolster their bottom lines amidst the economic downturn. In addition, a depressed economy can be especially rough for startups and small companies, where a significant portion of innovation takes place--falling prices mean reduced revenues for large suppliers as well as smaller enterprises. Furthermore, the need to optimize budgets while at the same time keeping abreast of technological advancements to maintain competitive edge has put IT executives in a bind. Although major hardware platform shifts significantly contributed to the growth of tech companies' market capitalization, they also amplified integration difficulties and caused tech spending to grow as a percentage of total capital spending; subsequent changes to software architecture complicated the situation even further. However, promising innovations are still taking place: One emerging technology integrates broadband and wireless, a development that could help reengineer supply-chain architectures. The inexpensive Linux operating system could reduce costs and facilitate interoperability between information systems, while portals are likely to spread. Business-performance management (BPM) could help companies align business strategy and execution more closely, and Web services could be particularly useful for setting up applications with standard interfaces that ease communications inside or outside the firewall. Molecular computing has almost limitless potential, ranging from smart clothing and self-configuring materials to smaller computers.