Volume 4, Issue 396: Monday, September 9, 2002
- "Year After 9/11, Cyberspace Door Is Still Ajar"
New York Times (09/09/02) P. C1; Schwartz, John
Despite hopes from cybersecurity specialists that American companies and governments would implement better network protection in response to Sept. 11, there is little indication that progress has been made. Giga Information Group VP Steve Hunt reports that security spending has remained flat for the last year, despite numerous surveys concluding that businesses and government are inadequately protected against cyberattacks: For example, a Business Software Alliance poll of corporate network security overseers finds that 60 percent believe U.S. industries will fall prey to a major online terrorist assault within the next year. Spending on IT system disaster planning has also been low--just one in 10 respondents to an AT&T survey says that such strategies received high priority following the attacks. TruSecure CTO Peter S. Tippett attributes the slow pace of cybersecurity deployment to businesses' frustration with boosting security, given the vast number of software patches and the size and complexity of the systems they must be applied to. Another possible factor he cites is industry experts' argument that the cyberterrorism threat has been exaggerated. White House cyberspace security adviser Richard A. Clarke says the real impetus for increased emphasis on cybersecurity in industry and government was Sept. 18, when the Nimda virus caused billions of dollars in damage; one result of that attack was that software companies began taking security, and the patching of vulnerabilities, much more seriously. The Bush administration, for instance, has proposed 56 percent more spending on federal network security next fiscal year than in fiscal 2002. Meanwhile, various groups are working to ensure that federal cybersecurity legislation serves everyone's interests, not just the government's.
(Access to this site is free; however, first-time visitors must register.)
- "Bush Mulls Internet Security Fund"
Associated Press (09/06/02); Bridis, Ted
Internal documents from the National Strategy to Secure Cyberspace suggest, among other things, the creation of a technology fund "to address those discrete technology areas that fall outside the purview of both industry and government and yet are critical to the future secure functioning of the Internet." Such a fund could be financed through a joint federal-industry venture, with the money channeled into projects such as super-secure versions of computer operating system software. The White House is making home users, corporations, and universities responsible for ensuring Internet security, and the documents indicate that the Bush administration is debating an advertising campaign to promote secure computing among schools and other users. Other recommendations from the working papers include getting Internet providers to adopt a code of good conduct concerning cooperation; increased spending to beef up the security of computer systems for major utilities; better wireless security, along with restrictions on its use by federal workers; the creation of an industry testing facility designed to avoid software problems caused by software updates; analysis of a proposed federal network that would provide communications and computing services in the event of Internet blackouts; and studies of cyberattack response strategies for cases in which the origin of an intrusion is difficult to ascertain. One document identifies the purpose of the national cyberspace protection plan as being to "empower all Americans to secure their portions of cyberspace." The plan will be released on Sept. 18, but not all the recommendations listed in the internal documents will necessarily appear in the final draft.
- "Archaic Computer Systems Hamper War on Terror"
SiliconValley.com (09/08/02); Puzzanghera, Jim
The U.S. government is unable to capitalize on its IT budget, largely because of a lack of coordination, complex purchasing requirements, and standalone technology. Analysts say the pace of change in the federal government is glacially slow--but that change is coming. The Transportation Department, for example, has an "Activation Information Management" system that uses the Web to link the department's 12 agencies. The Office of Management and Budget is also keen to avoid investing in costly systems that do not work together, especially with regard to key homeland security functions. Because of this, computer projects at the Coast Guard, Immigration and Naturalization Service, and Customs Service costing over $500 million have been frozen until the Homeland Security Department is launched, probably at the beginning of next year. FBI Director Robert Mueller, who heads one of the most technologically backward agencies, says that until just a couple of years ago each FBI division installed its own disparate computer systems. Gartner analyst Rishi Sood says the federal government faces several difficulties not encountered in the private sector, such as its sheer size, bureaucratic procurement policies, and lengthy budget approval processes. Attempts are being made to remedy some of these problems: the Homeland Security Office is reviewing proposals from smaller firms and has committed to using off-the-shelf solutions when possible--both safeguards meant to ensure the most effective and interoperable technology.
- "HP to Unveil Nanotech Breakthrough"
CNet (09/06/02); Kanellos, Michael
On Monday in Europe, Hewlett-Packard scientists will announce a breakthrough that brings HP one step closer to its goal of making smaller, speedier, and less expensive chips using "molecular grids," in which crisscrossing molecular strands are arrayed in layers to form intelligent circuits. HP and UCLA have paved much of the way for molecular grid technology: The partners developed molecules that could be switched between on and off states in 1999 and 2000, and HP manufactured such molecules in strand configurations. Last year, HP and UCLA received a patent for connecting molecular wires into grids, and this year they were awarded a patent for managing molecular grid traffic. HP's breakthrough is only a small part of the nanotechnology drive--another promising technology is self-assembling carbon nanotubes, which could appear in microprocessors as a replacement for copper wires in about 10 years. But computer chips that could be technically classified as nanotech will probably emerge next year; their claim on the nanotech classification will be the size of their internal circuits, which will measure an average of 90 nm. Meanwhile, IBM has devised a memory device the size of a postage stamp that has enough capacity to contain about 25 million textbook pages, and such technology could be commercialized in three years. Last year, the U.S. government earmarked $422 million for nanotech research, while Japan invested $410 million. The rest of the world has committed $425 million in research funding.
- "Some Environmentalists Worry About Nanotechnology Risk"
Associated Press (09/08/02); Krane, Jim
Environmental organizations such as the ETC Group want governments to declare a moratorium on nanotechnology development until its health and environmental risks are more thoroughly assessed. Despite experts' assurances that nanotech's benefits greatly outweigh the risks, ETC's Kathy Jo Wetter contends that those risks have not been adequately studied, and adds that such evaluation is critical with the impending commercialization of nanotech. She is concerned that, among other things, nanoparticles could build up in organs and cause damage; such particles could supposedly be ingested in nanotech-enabled foods, while ETC argues that particles being developed as bloodstream-borne drug delivery systems could also be used to carry deadly toxins. Adding to the danger is the fact that nanoparticles are invisible to the naked eye and small enough to pass undetected through most filters, while no technology exists to detect them outside of the laboratory. The implications of nanotech will be the focus of several studies that the EPA expects to launch this year, and a November workshop coordinated by the FDA will look at how nanotech could affect food and agriculture.
- "Almost Organic"
ABCNews.com (09/06/02); Onion, Amanda
Drawing insights into robot evolution and human-robot interaction is the purpose of the Public Anemone, a robot that resembles a sea anemone and exhibits unusual abilities. Such interactive robots could lead to the incorporation of robotics into people's daily lives, says Cynthia Breazeal of MIT's Media Lab, which developed the Public Anemone. The robot reacts instinctively to the presence of people as well as to environmental factors: It "sleeps" during its terrarium's six-minute night cycle, while by day it behaves in a variety of ways, hissing or recoiling when humans approach, seeking out people's faces and hands, or ignoring them completely and bathing itself in a waterfall. The Public Anemone's vision system consists of a pair of stereo digital cameras located in the terrarium; a program interprets the video and feeds that data into an intricate behavior model, while the robot is programmed to react to the chroma of human skin. The robot's spine and tentacles move according to data supplied by human animators, while Media Lab researchers such as undergraduate Dan Stiehl are trying to embed a sense of touch into robotic skin. A number of robots that could use such an advanced sensory system are under development at the lab. Although Breazeal and her team design robots that take certain cues from the natural world, the robots are not programmed to fully mimic known life forms, but rather display their own unique behaviors. MIT's work on projects such as the Public Anemone could pave the way for highly interactive machines that "are aware of human presence and of social conventions...exhibit initiative and show personality and emotions," says Terry Fong of Carnegie Mellon's Robotics Institute.
- "10 Choices That Were Critical to the Net's Success"
SiliconValley.com (09/08/02); Gillmor, Dan
At a Massachusetts telecom conference last week, Harvard University senior technical consultant and Internet standards development guru Scott Bradner listed 10 major decisions that led to the Internet's rise to prominence. Internet protocols were placed on top of multiple existing networks, while splitting messages into packets and deploying a routing function were partly responsible for the Net's resiliency and reliability. Another key decision was the division between the Transmission Control Protocol (TCP) and the Internet Protocol (IP), which increased the flexibility of network services. Bradner also cited the National Science Foundation's (NSF) decision to fund the University of California at Berkeley's effort to incorporate TCP/IP into the Unix operating system, whose low cost prompted many startups to adopt it. TCP/IP's availability increased because the NSF required NSFNET users to use TCP/IP exclusively, while the organization's prohibition of commercial NSFNET use fueled the creation of commercial network providers. The international telecoms' rejection of TCP/IP and their subsequent development of the OSI standard helped pave the way for many of the freedoms we take for granted, according to Bradner; one such freedom is the ability to place new protocols atop IP without requiring permission from carriers. A connection between the CSNET academic network and the government's Internet predecessor, ARPANET, fostered better understanding of online networking among students, faculty, and staff while encouraging more network-centered research. A lack of government regulation in the Internet's development has also had a positive effect, Bradner noted.
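The TCP/IP split Bradner credits is still visible today in the ordinary sockets API: an application picks its transport, and every transport rides on the same IP layer underneath, with no carrier's permission required. A minimal Python sketch of that layering:

```python
import socket

# The application chooses a transport (TCP stream or UDP datagram);
# both are carried over the same IP layer underneath. This separation
# is exactly the TCP/IP division described above.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # TCP atop IP
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)   # UDP atop IP

# Same address family (IP) in both cases; only the transport differs.
assert tcp.family == udp.family == socket.AF_INET

tcp.close()
udp.close()
```

Any new protocol can be layered atop IP the same way, which is the freedom Bradner contrasts with the carrier-controlled OSI approach.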
- "The High-Tech Rebels"
Financial Times (09/06/02) P. 7; London, Simon
Sun Microsystems co-founder and chief scientist Bill Joy says IT is too server-centric and that more robust software will help set it free. Meanwhile, Xerox chief scientist John Seely Brown adds that Web services promise to set entire markets free, and that changing ideologies of how businesses function internally will also help boost productivity. Joy, who oversaw the creation of the Java programming language, is working on Sun's Jxta peer-to-peer technology, which will allow a deluge of networked devices to communicate with one another directly. Jxta uses XML for portability, and is far better structured than comparable efforts, according to Joy. Coupled with inexpensive hardware technologies, he says, there are few physical constraints on what he has in mind. For example, radio-frequency-enabled chips could be stamped out of plastic instead of more costly silicon, allowing every single piece of inventory to be tagged with information specific to that item. Joy theorizes that pervasive peer-to-peer computing will bring new dynamism to markets, acting as a catalyst that allows buyers and sellers to connect more easily--similar in effect to what Brown envisions will result from Web services, which enable computer systems to communicate via modular software code. Brown says CEOs should focus on how things actually get done in a business and promote specialized communities of practice, assets that will become more valuable as Web services allow companies to market and trade services more freely.
- "Lack of Cybersecurity Specialists Sparks Concern"
National Journal's Technology Daily (09/04/02); Peterson, Molly M.
The United States faces a disturbing lack of skilled workers to protect critical infrastructures from electronic attack, said experts at a recent cybersecurity conference in Washington, D.C. As a result, the demand for people with IT skills will rise, predicted Harris Miller, president of the Information Technology Association of America. Mark Holman of the Office of Homeland Security said the president's new strategy for national cybersecurity, due out Sept. 18, will stress the need for more workers to protect computer networks. The strategy will evolve as technology changes, he said, and it separates critical infrastructure issues by industry, such as electricity, telecoms, or water filtration. Ronald Dick of the FBI's National Infrastructure Protection Center urged the formation of Information Sharing and Analysis Centers (ISACs) to help the government and industry share knowledge about vulnerabilities. He noted that 90 percent of the nation's infrastructure is owned and run by private companies. Rep. Curt Weldon (R-Pa.) said he will propose a college scholarship program requiring participants to work in the military for several years as "cyber warriors."
- "Breakthrough Gives Diamond Electronics Sparkle"
New Scientist Online (09/02/02); Kleiner, Kurt
An international team has synthesized a thin film of diamond better suited for high-performance electronics than natural diamond and other artificial forms of diamond, because it is composed of a single crystal and has few impurities. The material could be ideal for electronic elements in radars, satellite communications, spacecraft, and cathodes for flat-panel displays. Diamond is also highly robust and more heat-resistant than silicon, which would make it very useful in the development of smaller, faster microprocessors. For now, however, such components will probably be too costly to substitute for conventional silicon semiconductors. The researchers first created a diamond substrate by exposing graphite to high temperature and pressure, then deposited carbon atoms from methane gas onto the wafer via microwave plasma chemical vapor deposition, which resulted in a diamond crystal. A small quantity of boron was then introduced into the crystal, creating gaps (holes) that accommodate the flow of charge and make the diamond semiconducting. Testing demonstrated that the substance's carrier mobility was superior to that of gallium nitride and silicon carbide. The researchers who made the breakthrough are from Britain's De Beers Industrial Diamonds and Sweden's Innovative Materials Group.
- "New York State Wins Top Semiconductor R&D Lab"
IEEE Spectrum Online (09/01/02); Savage, Neil
The New York branch of International Sematech will reside in a $403 million research center located at the State University of New York (SUNY) at Albany. Sematech was drawn to the area by SUNY's plans to construct several research facilities dedicated to the manufacture of 300-mm wafers, as well as a state commitment to pay $4 for every $1 from the consortium, according to a spokesman. Sematech officials and New York State Gov. George Pataki announced a five-year initiative to build the center on July 18; the center's chief focus will be the development of defect-free mask blanks for extreme ultraviolet (EUV) lithography. Some scientists are skeptical that such blanks can be fabricated; in the meantime, the Extreme Ultraviolet consortium is collaborating with several national laboratories to find more powerful sources of EUV that can be used to etch chip features. Other collaborative ventures between academia and private industry are also taking place in the Hudson River area: IBM has committed $33 million to Rensselaer Polytechnic Institute (RPI) in Troy, N.Y., for the construction of a broadband research center. SUNY Albany is an RPI partner in the Interconnect Focus Center, and will become involved in the broadband facility as well. Such initiatives are "a coup for New York," according to VLSI Research CEO G. Dan Hutcheson, who says, "the thing that brings people to New York is all the universities."
- "Businesses Gird for Grid Computing Breakthroughs"
Globe and Mail Online (09/05/02); Medicoff, Zack
Industry experts say that widespread commercial grid computing could be made available in about five years, allowing manufacturers to design products, drug companies to develop new medicines, and businesses to share complex data sets and software faster. Grid computing links together computing resources over the Internet, allowing participants to tap supercomputer-type power from a PC, for example. Companies linked to computing grids make better use of their IT resources because they do not have to maintain vast infrastructures that sit idle most of the time. Although standards and protocols are not yet adopted widely enough to support universal grid computing systems, corporate partners and organizational affiliates have begun sharing their computing power and storage. IBM, together with the University of Pennsylvania and the Sunnybrook and Women's College Health Sciences Centre in Toronto, is building a grid computing network meant to help doctors identify and treat breast cancer more effectively. Using the massive computing power available via the grid, doctors can pull up a patient's digital files and quickly compare hundreds of X-ray images stored on the network, for example. Pharmaceutical companies looking to study DNA for developing drugs, oil companies studying geological data for drilling sites, and brokerages creating "what if" scenarios for client portfolios are some of the potential uses for grid computing. Platform Computing CEO Robert Gordon says grid computing could also help the chip industry design new chips faster. Still, for grid computing to achieve its promise, industry standards and protocols must be developed; one such effort is the Globus Toolkit, a joint effort of Argonne National Laboratory near Chicago and the University of Southern California. The free toolkit features software libraries and security protocols for grid computing applications.
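The idea behind grid computing can be sketched in miniature: split a large job into independent pieces, farm them out to whatever compute resources are available, and gather the results. This toy Python illustration (not Globus Toolkit code; the function names are invented, and a local thread pool stands in for a real grid) shows the pattern:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze(chunk: range) -> int:
    # Stand-in for real work, e.g., processing one batch of X-ray images.
    return sum(x * x for x in chunk)

def run_on_grid(n: int, workers: int = 4) -> int:
    # Split the job into interleaved, independent chunks so every
    # "grid node" receives a comparable share, then gather results.
    chunks = [range(i, n, workers) for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(analyze, chunks))
```

The key property is that the chunks are independent, so the same code scheme works whether the workers are local threads or machines scattered across the Internet; the hard part a real grid toolkit solves is scheduling, security, and failure handling across organizations.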
- "Lining Up for Jobs"
eWeek (09/02/02) Vol. 19, No. 35, P. 39; Vaas, Lisa
Short-term IT job prospects for the latest crop of computer and engineering graduates are slim, as many companies are cutting entry-level hiring and scaling back their internships. An April survey of employers conducted by the National Association of Colleges and Employers finds that hiring by computer and business equipment makers will be down 58.7 percent this year, while hiring at consultancies has fallen 89.7 percent. Experts say that IT professionals in training should adopt strategies that maximize their value and uniqueness to employers. Such strategies include knowing which industries are most likely to be seeking manpower (government, defense contractors, etc.) and what skills they are after; focusing on regions heavily populated by large employers; attaining non-technical abilities such as teamwork skills; and aggressively pursuing real-world experience such as internships. A few companies, such as Microsoft, are hiring more entry-level applicants, while BAE Systems is one employer that is widening the scope of its internship program. Hiring managers are more likely to consider entry-level IT candidates who exhibit product-specific skills, basic technical knowledge, and an overwhelming passion for technology. However, universities are notoriously slow to align their curricula with shifting market demands so that graduates have the necessary talents, although some are making an effort to do so. Web-driven application development is the most highly sought-after skill.
- "Who Should Own What?"
Darwin (08/02); Datz, Todd
In an interview with Todd Datz of Darwin magazine, Stanford Law School professor and author Lawrence Lessig explains that he understands the impulse to "patent everything under the sun" so that one can remain competitive against both legitimate and illegitimate rivals. However, he advocates that the patent process should be amended so that people can patent what they like while being prohibited from using their patents to stifle productivity. Lessig agrees with Amazon.com CEO Jeff Bezos' argument that the terms of specific patents--software and business methods, for instance--should be reduced, and believes that injunctions should be removed as a patent enforcement tool. He claims that the patent office receives inadequate support for its operations, and the lack of a legal requirement for inventors to come forward with prior art only adds to its burden. Lessig laments that "The reality now is that every new innovation has got to not only fund a development cycle and fund a marketing cycle, it's got to fund a legal cycle during which you go into court and demonstrate that your new technology should be allowed in the innovative system." Issues that he says need to be resolved urgently include the clampdown on broadband deployment, which is likely to get worse as the telecom infrastructure is deregulated; and unreasonable patents that are concealed until someone tries to adopt them. Lessig thinks that the freedom of the "Internet commons" can be better protected if the government is more proactive in spurring broadband deployment and making the Internet neutral ground, as well as setting a maximum IP term of 75 years, with copyright renewal required every five years. He also thinks Congress should try to encourage the creation of IP conservancies such as the Creative Commons.
- "Tech Frontiers"
PC Magazine (09/03/02) Vol. 21, No. 15, P. 131; Rhey, Erik; Turley, Jim; Lohr, Steve
Four sectors are poised to drive technological advancement in the next five to 10 years: chip fabrication, software programming, security, and entertainment. Chip production is, by its nature, paradoxical--materials costs are virtually nil, but tooling costs are tremendous. As chips shrink and probability and quantum mechanics assert themselves, predicting chips' behavior becomes more and more difficult; chipmaking will therefore change--future chips will be composed mostly of air and ultra-thin copper and silicon wires, while fabrication will feature automated knowledge systems to ensure that the mask template is flawless. Future security solutions will employ both heuristics analysis (the detection of abnormal behavior) and signature detection (the identification of preexisting virus signatures); they will react to threats more quickly; and they will use self-healing autonomous agents to handle blended threats. Entercept, SecurityFocus, and Internet Security Systems are all working on such solutions. Advances in aspect-oriented programming and intentional programming could lead to genetic programming, in which a programmer describes a goal and the system generates scores of candidate programs that evolve toward a solution; the approach could be especially beneficial in the area of design. Entertainment could be radically transformed by the introduction of MPEG-4 technology, whose profiles will allow it to serve a wide spectrum of devices, including PDAs, HDTV, digital camcorders, and mobile phones. Applications will include computer games and TV programs that can be viewed on multiple devices and enhanced with 2D and VRML-based 3D graphics, Java support, and other features.
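The two detection styles contrasted above can be sketched in a few lines of Python. This is an illustrative toy, not code from any of the vendors named; the only real detail is the byte pattern, which is the opening of the standard EICAR antivirus test string:

```python
# Signature detection: match known, preexisting byte patterns.
KNOWN_SIGNATURES = {b"X5O!P%@AP"}  # prefix of the EICAR test string

def signature_scan(payload: bytes) -> bool:
    """Flag payloads containing any known virus signature."""
    return any(sig in payload for sig in KNOWN_SIGNATURES)

def heuristic_scan(events: list[str]) -> bool:
    """Heuristics analysis: flag abnormal behavior even when no
    signature matches (the event names here are invented)."""
    suspicious = {"write:/system", "disable:antivirus", "mass-mail"}
    return any(event in suspicious for event in events)

def blended_verdict(payload: bytes, events: list[str]) -> bool:
    # Blended threats are why both checks run together: a novel worm
    # evades the signature list but still betrays itself behaviorally.
    return signature_scan(payload) or heuristic_scan(events)
```

The complementarity is the point: signatures catch known threats cheaply and precisely, while heuristics cover the gap before a signature exists, at the cost of possible false positives.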
- "In Pursuit of the 'Everywhere' Computer"
HP World (08/02) Vol. 5, No. 8, P. 16; Shor, Susan B.
Former Hewlett-Packard Labs director Joel Birnbaum is a staunch advocate of pervasive computing, the establishment of an invisible, all-encompassing information system that can be harnessed for virtually any function by ubiquitous sensors and actuators. The applications for such a network are practically limitless, and run the gamut from spacing vehicles on a highway to monitoring elderly people's vital signs to producing more accurate weather forecasts. At the Usability Professionals Association (UPA) conference in July, he explained that the Web has spurred the development of standards that promote system interoperability, but building the infrastructure is the primary challenge, one that he thinks can be accomplished in the next few decades. At a Washington, D.C., workshop conducted in late June, Birnbaum coordinated a team effort to conceive a disaster response plan that relies on pervasive computing. He says the concept involves "the combination of hundreds of thousands or millions of sensors and actuators, some embedded in the environment, some deployed at the time of the disaster, but all of them connected in an astonishingly complex network that has to be able to work under the most adverse conditions." Birnbaum stresses that the devices must be intuitive, self-adaptive, and self-configuring. He says his group will propose its outline to Congress, the National Science Foundation, and other agencies in the hope that it will spur further research into pervasive computing deployment. Birnbaum acknowledges that resolving privacy concerns will be a formidable challenge.
- "Joining the Third Generation"
R&D (08/02) Vol. 44, No. 8, P. 28; Poliski, Iris
Cellular networks could be significantly enhanced with third-generation wireless technology, but differing levels of acceptance around the world and limited spectrum availability remain formidable obstacles. Its potential benefits include always-on service and boosted voice and data capacity, while services that stand to gain from its deployment include online gaming, messaging and email, and Internet access. Europe and Asia offer considerably more wireless spectrum than the United States, which is still waiting for the National Telecommunications and Information Administration (NTIA) to complete its evaluation of available spectrum, according to Ericsson's Tom Lindstrom. Once the evaluation is released, "It will at least add a degree of certainty to the industry," says Ericsson's Barbara Boffer. "It will permit economies of scale on a global basis." Motorola's John Touvannas notes that ALLTEL, Sprint, and Verizon have devised a strategy to circumvent the wireless bandwidth shortage via Code Division Multiple Access (CDMA). Using CDMA 1X technology, telecoms can boost current network data transmission speeds to as much as 153.6 kb/sec, thus facilitating Net and portable-device downloads, increasing voice service capacity by almost 100 percent, and improving spectral efficiency for voice service. Furthermore, Touvannas says that the 1X products his company will offer can be upgraded.
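To put the 153.6 kb/sec figure in perspective, a quick back-of-the-envelope calculation shows what that peak rate means for an ordinary download (real-world throughput would be lower than the theoretical peak):

```python
# How long would a 1 MB download take at the CDMA 1X peak rate quoted
# above? Note the bits-vs-bytes conversion: file sizes are in bytes,
# link rates in bits per second.
PEAK_RATE_BPS = 153.6 * 1000    # 153.6 kb/sec in bits per second
FILE_SIZE_BITS = 1_000_000 * 8  # 1 MB (10^6 bytes) expressed in bits

seconds = FILE_SIZE_BITS / PEAK_RATE_BPS
print(f"{seconds:.1f} s")  # 52.1 s at the theoretical peak
```

At roughly 52 seconds per megabyte even at peak, the appeal of the quoted doubling of voice capacity and improved spectral efficiency is as much about serving more users as about raw speed.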
- "Research That Reinvents the Corporation"
Harvard Business Review (08/02) Vol. 80, No. 8, P. 105; Brown, John Seely
In the August issue of Harvard Business Review, the journal revisits John Seely Brown's 1991 article "Research That Reinvents the Corporation." In the paper, Brown argues that to stay competitive, corporations must do more than just create new products; organizations' research and development departments must also have the proper mechanisms in place to ensure that the company stays innovative on an ongoing basis. Brown offers four suggestions he gleaned from his time as director of Xerox's Palo Alto Research Center. The first principle places as much value on research into new work practices as on research into new products. Brown says most organizations view corporate research as the driver behind new technologies and products, but he argues that research should also help shape new organizational practices. The second principle addresses innovation and the proper way companies should apply it. Many times, organizations do not know exactly what to take from new technologies, let alone how to leverage them to make the company more efficient. Brown's third principle challenges the corporate community's idea of the role research plays in driving innovation. Organizations err when they assume that research alone produces innovation; research must co-produce new technologies and work practices, which is done by developing a shared understanding of why the innovations are valuable. The fourth principle examines the role customers play in R&D. Brown says customers should be encouraged to discover their underlying and dormant needs.