Volume 5, Issue 460: Friday, February 21, 2003
- "FCC Delivers Mixed Vote on Competition"
Washington Post (02/21/03) P. A1; Krim, Jonathan
Former Bell telephone companies scored a victory yesterday when the FCC voted 3 to 2 against requiring them to lease new or upgraded broadband networks to rivals at regulated rates. As a result, consumers who wish to get Internet service delivered over their phone lines will likely be stuck with the carriers that own the wires. "Today we may be choking off competition...Consumers and the Internet itself may well suffer," warned Democratic commissioner Michael J. Copps, who criticized the deregulation decision as a move that will allow the phone companies to fortify their monopolistic hold on broadband access. However, a coalition of technology companies supported the ruling, believing it gives the Bells the incentive they need to upgrade their fiber-optic lines, increase technology spending, and compete with cable companies. Cable's 10 million Internet broadband customers greatly outnumber the Bells' 5.5 million DSL users, and the Bell companies argued that they had no incentive to build out broadband, particularly into rural areas, if they were forced to lease their new lines to rivals. It was not a total victory for the Bells, however. The FCC voted to sustain leasing requirements for local telephone service, meaning that state regulators can still dictate leasing rates for competitors. James C. Smith of SBC Communications said the benefits of deregulated broadband service will be canceled out by this decision, and added that the phone companies will continue to lobby against such regulation. FCC Chairman Michael K. Powell derided the decision as "chaotic," and declared that "the impulse to leave much more telecom policy to state commissions may run against the winds of technological change."
- "Summit: DMCA Blocks Tech Progress"
Wired News (02/20/03); Dean, Katie
Speakers at the Digital Rights Summit on Wednesday warned that the Digital Millennium Copyright Act (DMCA) is hanging over Silicon Valley like a cloud, endangering tech innovation and, by extension, business. "The DMCA's blatant restriction on circumvention threatens a few of the core foundations of Silicon Valley: interoperability, innovation without prior permission, and Silicon Valley's [belief in] empowering the consumer," declared Digitalconsumer.org co-founder Joe Kraus. This is especially detrimental to technology startups, which are hard-pressed to find funding because investors do not want to risk DMCA lawsuits, according to venture capitalist Hank Barry. Among those who spoke at the event was Sen. Ron Wyden (D-Ore.), who wants vendors to notify consumers about the technology restrictions that may apply to their products. Another speaker, Rep. Howard Berman (D-Calif.), supports a bill that would allow copyright owners to obstruct the unauthorized use of digital content on peer-to-peer networks. Gartner/G2 VP Van Baker commented that companies must give greater priority to the opportunities technology represents, rather than the threats, and added that business models will shift regardless of media companies' reluctance to adapt. "We really have a chance to move things forward, but that's only if Silicon Valley engages in the process," noted Michael Petricone of the Consumer Electronics Association. Digitalconsumer.org and Intel co-hosted the Digital Rights Summit.
- "Study Lauds Open-Source Code Quality"
CNet (02/19/03); Shankland, Stephen
The networking component in Linux is written with better code than that in five proprietary operating systems, according to a study by Reasoning, a consulting group that sells automated software inspection services. The TCP/IP code in Linux version 2.4.19 contained just 0.1 defects for every 1,000 lines of code, compared to between 0.6 and 0.7 defects per 1,000 lines in the other systems, which included two versions of Unix. Reasoning examined the source code of general-purpose systems and two special-purpose networking operating systems, but declined to name exactly which ones it used. Linux even fared slightly better than one of the embedded networking systems. Reasoning CEO Scott Trappe said that, according to conventional wisdom, the larger pool of participants in open-source development makes flaws more likely to be discovered, while time constraints hamper the development of commercial products. Recently, Microsoft has stepped away from its main argument against the General Public License (GPL) attached to Linux and other open-source software. The company had contended that such "viral" licenses would spread to more and more software as pieces were packaged together. Instead, Microsoft has begun limited sharing of its source code with countries. One problem Reasoning specifically looked for was memory allocation flaws that can lead to buffer overflows, an oft-used method of attack for hackers.
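The defect-density metric Reasoning used is straightforward arithmetic: defects found divided by thousands of lines of code inspected. A minimal sketch in Python (the raw defect and line counts below are hypothetical, chosen only to reproduce the densities cited above):

```python
def defects_per_kloc(defects, lines_of_code):
    """Defect density: defects found per 1,000 lines of code."""
    return defects / (lines_of_code / 1000)

# Hypothetical counts sized to match the reported densities:
# ~0.1 defects/KLOC for the Linux 2.4.19 TCP/IP code,
# ~0.65 defects/KLOC for the proprietary implementations.
print(defects_per_kloc(8, 80_000))   # 0.1
print(defects_per_kloc(52, 80_000))  # 0.65
```

The metric normalizes across code bases of different sizes, which is what lets a 2.4.19 kernel component be compared against five differently sized commercial stacks.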
- "Trial Near in Patent Case on Key Internet Technology"
New York Times (02/20/03) P. C5; Markoff, John
The case of a retired electronics engineer suing VeriSign, RSA Security, and four other companies for allegedly infringing on his patented techniques for online customer authentication and secure communications will have its day in court starting next week. Leon Stambler, 74, filed suit against the companies in 2001, claiming that seven patents he owns cover the Secure Sockets Layer (SSL) Internet Web security standard, which is the cornerstone of practically all e-commerce. Internet security experts counter that Stambler's patents copy cryptographic research originally carried out in the 1970s and 1980s at MIT and Stanford University. Russo & Hale intellectual property lawyer Jack Russo admits that prior inventions that may render new patent claims unviable sometimes fall through the cracks because of the overwhelming workload the U.S. Patent Office has to deal with. Counterpane Internet Security CTO Bruce Schneier calls Stambler's claim a "submarine patent," in which an inventor continuously broadens the sphere of initial patent claims to include technologies that others have developed and marketed. Among the companies cited in Stambler's 2001 lawsuit, three of them--First Data, Openwave Systems, and Certicom--have settled and forged licensing deals, according to SEC records and someone familiar with the case. A fourth, OmniSky, has filed for bankruptcy protection and was recently bought by EarthLink.
(Access to this site is free; however, first-time visitors must register.)
- "Russia Is Ripe for a Tech Boom, Officials Say"
SiliconValley.com (02/21/03); Davis, Aaron
Speakers at the second U.S.-Russia Information and Communication Technology Roundtable declared yesterday that the former Soviet Union is poised to become the next major technology hot spot, thanks to several years of economic growth, elevated consumer spending, and a more than 20 percent average annual growth rate in its technology market. Alex Freedland of Mirantis said that Russia's tech growth could be attributed to the country's longtime reliance on self-sufficiency. Cisco Systems CEO John Chambers said that his company's plans to extend its Russian operations were due to efforts by the Russian government to nurture a friendly business climate. Leonid Reiman, the Russian Federation's minister of communications and informatization, noted that such efforts include increasing Internet access for the public, reorganizing the nation's regulatory architecture into a business-friendly, anti-piracy model, moving government operations to the Web, boosting high-tech training and education, and establishing Internet connectivity for rural areas outside of Moscow. He noted that Russia's high-tech growth has bested that of Europe by a ratio of 2 to 1 for the last three years. Ernst & Young estimates that 14 percent of Russia's populace, which numbers approximately 150 million, own personal digital assistants and cell phones, over 50 percent of which were bought in 2002; 8 percent of Russians own PCs. One of the key differentiators of Russian IT workers, speakers said, is their ability to solve problems creatively.
- "Lawyers: Hackers Sentenced Too Harshly"
CNet (02/20/03); Lemos, Robert
The National Association of Criminal Defense Lawyers (NACDL), along with the Electronic Frontier Foundation and the nonprofit Sentencing Project, has signed a paper that finds fault with current sentencing guidelines, arguing that felons convicted of computer crimes usually receive harsher penalties than those who commit comparable non-computer-related crimes. The paper's author, Jennifer Granick of Stanford University's Center for Internet and Society, alleges that most computer-related offenses are committed by angry employees trying to hurt their employer or gain money, but are being treated as if they were life-threatening. The paper cites a Department of Justice review of 55 cybercrime cases that found that only 15 involved threats to the public, while just one constituted a threat to safety. "The [current sentencing] guidelines punish people more for using a skill that members of the general public don't have," says Granick. "If we can't do your crime, then we punish you more." The report also observes gradual increases in cybercrime prosecutions and convictions: The Justice Department prosecuted 135 cybercrime cases in 2001, up from 57 in 1997. Such a rise could stifle security research and hinder innovation, the paper contends. Granick's paper was published in response to a request for public comments by the United States Sentencing Commission, part of a sentencing review mandated by the 2002 Homeland Security Act.
- "Innovations That Reinvent The Wheel"
Washington Post (02/20/03) P. E1; Walker, Leslie
This week's Demo high-tech forum revolved around new ways of making old technologies work, according to speakers such as Intel Capital President Leslie Vadasz, whose argument was echoed by many of the prototype products on display. Much of the interest at the conference centered on techniques to keep systems from being inundated by email, and to facilitate Web-based wireless communication between portable devices and electronic sensors. In the former category, anti-spam software from MailFrontier successfully blocked 85 percent of junk email, according to a Network World consultant; in contrast, a similar product from Cloudmark blocked just 43 percent of spam, but erroneously blocked fewer legitimate messages than MailFrontier's software. Meanwhile, Open Field Software exhibited Ella, a tool that offers suggestions on how users can more efficiently manage email by analyzing what they read, delete, and transfer into folders. Other notable products on hand at Demo included a software program from Delean Vision that can identify people from an image of their skin, a wireless Internet music player from TerraDigital Systems equipped with a touch-screen menu, and a wireless mesh network from Digital Sun that enables sensors distributed throughout a lawn to communicate and coordinate watering. ActiveWord Systems showcased a method to accelerate computing and process automation by allowing users to carry out operations using text shortcuts, while Liquid Machines unveiled software developed by Harvard researcher Michael Smith that enables corporations to dynamically control digital content accessed by their employees.
- "Consortium Releases Common Embedded Linux Spec"
EE Times (02/20/03); Murray, Charles J.
The Embedded Linux Consortium (ELC) has released its long-awaited common software platform, which will provide a consistent standard for embedded application developers. Large device and component makers such as IBM, Intel, Motorola, Panasonic, Samsung, Sharp, Sony, and Siemens are part of the ELC, along with over 100 smaller firms. Their plan is to create a stronger case for Linux as an embedded operating system in the face of a proprietary market dominated by Wind River Systems and Microsoft. A common platform would enable software developers to write applications for an embedded Linux operating system knowing that they would be compatible with a number of devices, which in the future would carry an ELC label similar to Intel's "Intel Inside." Mark Brown of IBM's Linux Technology Center says development and testing cycles would likely be shorter as well, and that a common standard prevents fragmentation of embedded Linux systems. Linux companies still face trouble finding a sustainable business model, and some observers doubt that making Linux more marketable is a viable strategy, since the main appeal of Linux is its royalty-free character. Venture Development's Steve Balacco predicts that embedded Linux will nevertheless be included in products worth a total of $346 million by 2006, up from $59 million in 2001. He notes that embedded Linux fared relatively well in the past year, given the difficult economic environment. Moreover, he says the market for embedded Linux is large, as more than half of embedded application developers are uncommitted to either the proprietary or the open-source camp, but will have to choose as technical complexity pushes them toward off-the-shelf solutions.
- "Congress Preps Tech Agenda"
Fox News Network (02/17/03); Porteus, Liza
The technology agenda of the 108th Congress includes strengthening cybersecurity, limiting spam, expanding broadband, and shielding citizens from identity theft, while industry groups are lobbying against many mandates Congress is currently debating, including modifications to the Digital Millennium Copyright Act (DMCA). Robert Cresanti of the Business Software Alliance says that Congress should concentrate more on enforcing the existing version of the DMCA rather than altering it. Meanwhile, the U.S. entertainment sector is pushing for legislation that would broaden the scope of anti-copying technologies on CDs and movies, a move that the tech industry opposes because of the enormous cost involved and the potential impact on consumers. Working its way through Congress is Rep. Rick Boucher's (D-Va.) Digital Media Consumers Rights Act, which would require copy-protected CDs to be clearly labeled as such and extend consumers' fair-use rights for digital products; meanwhile, entertainment industry proponent Sen. Ernest Hollings (D-S.C.) may revive a proposal that would ban the manufacture and distribution of "digital media devices" unless they comply with federal anti-copying tech regulations. There is also a good chance that Hollings, Sen. Conrad Burns (R-Mont.), and Rep. Cliff Stearns (R-Fla.) will introduce legislation to protect Web surfers' privacy. The Electronics Industries Alliance (EIA) supports a broadband tax incentive, which EIA President Dave McCurdy says is essential to the widespread deployment of broadband across the nation. Sen. Ron Wyden (D-Ore.) and Rep. Christopher Cox (R-Calif.) have proposed legislation calling for a permanent moratorium on Internet access taxes and on taxes that discriminate between online and in-store sales, as well as a prohibition on multiple state taxes.
- "Has Your Computer Talked Back to You Lately?"
Israeli professor Dov Dori at the Technion-Israel Institute of Technology has created a software translation program that allows users to make programming changes through speech and graphic diagrams. Though Dori, who is also a research affiliate at MIT, wants to configure his OPCAT program for all types of computer interfaces, he says the current focus is on its industrial application. He likens OPCAT to CAD applications that eliminated the need for draftsmen, or word processors that made typists obsolete. By speaking to the computer, users can pull up a graphical representation of programming options, which they can then implement without having to know the back-end code. Conversely, users can manipulate graphic diagrams and listen to the computer's audio response. Dori says this versatility allows people to interact with their programs comfortably, no matter what their learning style. Dori pioneered a concept called Object Process Methodology, which holds that everything is either an object or a process that changes an object. Based on this premise, OPCAT works from a model to generate computer code automatically, based on users' instructions. Mark Richer, a principal engineering and applications architect at Pratt & Whitney Canada, says he used a beta version of OPCAT to analyze aerospace concepts. He says the software automatically generated thousands of diagrams and statements that would have been impossible to derive using traditional methods.
- "Butterflies Offer Lessons for Robots"
Technology Research News (02/19/03); Patch, Kimberly
Although the fluttering of butterflies may appear to be random, researchers at the University of Oxford have determined through experimentation that the insects actually utilize a dizzying array of aerodynamic mechanisms, and such analysis could prove very useful in the development of robots designed to mimic insect flight. Free-flying butterflies "use all of the known mechanisms to enhance lift--wake capture, leading-edge vortex, clap and fling, and active and inactive upstrokes--as well as two mechanisms that had not been postulated, the leading-edge vortex during the upstrokes and the double leading-edge vortex," explains University of Oxford research associate Robert Srygley. The scientists reached this conclusion by studying butterflies trained to fly between synthetic flowers in a wind tunnel, using smoke and cameras to capture the flow of air around their wings. Ron Fearing of the University of California at Berkeley says the research adds considerable knowledge to the field of small-scale aerodynamics, and could be applied to the development of robotic fliers that weigh between 100 milligrams and 10 grams. However, Georgia Tech Research Institute's Robert Michelson reports that controlling insect wings not only consumes a lot of power and is physically complicated, but is hard to replicate in miniature. Srygley forecasts that flapping robots the size of hawk moths or butterflies could appear within five to 10 years. He adds that such machines could have a variety of functions, such as exploring volcanic vents, evaluating stresses on architectural structures, or probing the surfaces of other worlds.
- "Old School"
CNet (02/18/03); Kanellos, Michael
A diversity of computer applications and technologies, including artificial intelligence, robotics, and data searching, are using the probability theory outlined by 18th-century clergyman Thomas Bayes. Bayesian theory holds that the probability of future events can be estimated by calculating their frequency in the past, and its credibility has been given a major boost in the last 10 years thanks to advances in mathematical models and computer speed, as well as laboratory experiments. Predictions are reinforced using real-world data, and the results are revised accordingly when the data changes. As computers have become more powerful, improved Bayesian models have reduced the number of calculations needed to predict phenomena. Microsoft's upcoming Notification Platform will enable computers and cell phones to automatically filter messages, schedule meetings without human intervention, and organize strategies for contacting other people using probability modeling. For instance, the platform's Coordinate application collates data from personal calendars, cameras, and other sources to build a profile of a person's lifestyle, which can be applied to the delivery of information to application users. Meanwhile, researchers at the University of Rochester employ Bayesian models to detect anomalies in a person's walk using data from cameras fed into a PC. Eric Horvitz of Microsoft Research's Adaptive Systems and Interaction Group explains that Bayesian theory was given little credence in the computing world until it became clear that logical systems could not account for unforeseen variables.
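The updating described above (prior beliefs revised as new data arrives) is Bayes' rule applied repeatedly. A minimal sketch, with made-up probabilities standing in for the cues a system like the Notification Platform might observe:

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: revise belief in hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Hypothetical numbers: prior belief that a message is urgent is 0.2;
# urgent messages contain a particular cue 70% of the time, others 10%.
p = posterior(0.2, 0.7, 0.1)
# A second, independent cue (80% vs. 20%) updates the belief again.
p2 = posterior(p, 0.8, 0.2)
print(round(p, 3), round(p2, 3))  # 0.636 0.875
```

Each observation feeds the previous posterior back in as the new prior, which is how such systems keep revising their predictions as the data changes.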
- "Digital Demise"
Red Herring (02/12/03); Pfeiffer, Eric W.
The high-tech industry has been devoting a great deal of time and effort to the advancement of digital chips, but analog chips are expected to move to the forefront over the next decade. Computers' interaction with the physical world--the encoding of sensory data--cannot be accomplished without analog chips, and experts such as Carnegie Mellon University's Sebastian Thrun say that artificial intelligence will never be realized without analog advancements. "Robots are really good at finding out small things with accuracy and repeatability, but they are ages away from understanding the physical analog information that humans can," he explains. Meanwhile, Matrix Semiconductor co-founder and Stanford University professor Thomas Lee notes that the operation of digital chips with 90-nm features can be disrupted by the tiniest external elements, such as random cosmic rays. He suggests that researchers should develop 3D structures that can gather and process data in ways similar to systems found in nature. Lee notes that biological systems would be a very effective model, and uses the ear as an example; the organ is capable of registering sounds of just a few decibels as well as withstanding incredibly loud noises. "We can't build a widget that has that kind of range," he observes.
- "Radio Tags, Nanotubes and a Video Cyclops"
Financial Times (02/20/03) P. 8; Harvey, Fiona
The high-tech sector does not want for innovative ideas; the financial turbulence it has weathered recently has not stifled tech visionaries' search for the "next big thing." One developing technology of interest is radio frequency identification (RFID) tags that can transmit their location as well as relevant data. They could be embedded into everyday goods thanks to their cheap production cost (approximately 5 cents per tag), and help usher in automated grocery shopping in which consumers do not need to interact with staff to process a purchase. Former ICANN President Esther Dyson lauds RFID tags as "probably the technology of the most immediate sizzle," and expects the technology to become an essential component of keeping track of people, luggage, and cargo. Research is also underway to make videoconferencing more immersive, and eliminate such limiting factors as a lack of eye contact: "We're using artificial intelligence to recreate what is in effect a virtual eye situated in the middle of the user's forehead, like a Cyclops--and that will ensure eye contact can be maintained," explains Microsoft Research's Andrew Blake. Meanwhile, Teleportec has devised a technique that combines cameras and projectors to deliver 3D holographic videos of people in real time. Nanotechnology is touted as a revolutionary field thanks to breakthroughs such as carbon nanotubes, although its applications are more likely to include improved manufacturing and more powerful processors than the minuscule robots popularized in the media. It will be decades, however, before computers can be controlled by thought, although research toward this goal involving animals has recently been documented.
- "Key Technology Predictions: 2003 to 2012"
Gartner believes that IT will be transformed over the next decade thanks to five disruptive technologies, including wireless networks, which will cause platforms to evolve; the incorporation of networked chips into everyday objects due to falling costs for tagging technologies; the commercialization of alternate power sources for portables; microelectromechanical systems (MEMS) that will enhance chips with perception and control capabilities; and the digitalization of pen and paper through the advent of digital ink, embedded chips, and cheap wireless communications. Human-computer interaction will be affected by several trends outlined by Gartner, among them: The growing ubiquity of computer screens, environment-sensitive input interfaces that can also ID users and personalize interactions, and improved information flow thanks to advanced interface metaphors. Gartner also notes several factors that will be key to the evolution of application integration, including architectural agility being spurred by business activity monitoring and enabled by rules; a predominance of service-oriented architecture and Web services by 2007; the development of personalized enterprise applications stemming from architectural agility; and a gap between semantic and physical integration. Gartner expects the market for electronic workplace solutions to consolidate, thus winnowing out the less effective products, while organizational structures, business processes, and IT infrastructures must be in a continuous state of flux once the office moves out of a fixed location. The emergence of more automated customer service technologies must lead to a growth in the skill sets of human personnel, Gartner reports. 
Over the next 10 years, in the area of supply chain management, Gartner forecasts that information "float" will be reduced via business activity monitoring, more data will be captured thanks to better visibility and tracking, "cradle to grave" applications will be supported with tagging and tracking technologies, and supply chain operations will undergo a major shift as a result of the availability of cheap RFID technology.
- "Tech Tools Alter Battlefield Tactics"
SiliconValley.com (02/19/03); Puzzanghera, Jim
Since the Persian Gulf War ended, the U.S. military has adopted information technology to improve its targeting efficiency and reduce the risk of "friendly fire" incidents. The new battlefield will be largely digital, thanks to advances in computing, software, and wireless communications. Such innovations--many of which came from Silicon Valley--have significantly boosted the effectiveness of weapons such as the B-2 stealth bombers, "smart" bombs, and remote-controlled planes, according to analysts and military officials. Telling friendly and enemy forces apart will be easier thanks to computers that combine data relayed from ground forces, surveillance planes, and satellites. At the vanguard of any expected military incursion into Iraq will be the Army's first two "digitized" Army divisions--the 4th Infantry and the 1st Cavalry--using the Force XXI Battle Command Brigade and Below (FBCB2) system, while a pair of Navy ships stationed in the Gulf are equipped with the Area Air Defense Commander Capability System, which features SGI workstations that can map out the battle theater in three dimensions. Meanwhile, military commanders will use a new common operating environment on their computers designed to blend data collated by the four branches of the military. The use of off-the-shelf technology in military IT systems was a key recommendation of the "information superiority" agenda the Joint Chiefs of Staff outlined in their "Joint Vision 2010" report in 1996. However, IT military systems have their share of drawbacks: Their computers are not immune to crashing; their effectiveness could be reduced in an urban environment, because they are designed for more open warfare; and the Global Positioning System (GPS) transmissions used to relay troop and armament positions can be jammed.
- "Net Talk Gets Real"
CIO (02/01/03) Vol. 16, No. 8, P. 86; Hapgood, Fred
Analysts as well as hardware and software providers are banking on the breakout of voice-over-IP (VoIP), although conventional telephony still boasts high call quality and reliability standards, while VoIP can strain the underlying network and entail considerable expense and effort in its deployment, management, and performance maintenance. Most companies are shifting to VoIP in the wake of key corporate events, such as the acquisition or divestiture of assets, or relocation. A few businesses are employing VoIP transition strategies in which the enterprise adopts the technology incrementally, thus reducing up-front costs and risks while making the most of user training. Jeff Amerine of Communications Network Services followed such a plan when he started converting FedEx Freight's phone calls to VoIP, and learned that sufficient network quality of service was mainly achieved through many hardware and software upgrades. He recommends that, given the sensitivity of these upgrades to local features, IT departments should develop the needed VoIP networking talent within the company. "If you don't find a way to get comfortable with the technology first, you're going to end up with a tin cup," warns Overlake Hospital CIO Bruce Elkington, who also chose an incremental strategy to install wireless VoIP, and gained familiarity in an initial deployment along an existing wireless data network. His experience proved that installation partners and contractors often have an inflated opinion of their knowledge, but the reliability of such a system, once operational, is very high.
- "Evolving Inventions"
Scientific American (02/03) Vol. 288, No. 2, P. 52; Koza, John R.; Keane, Martin A.; Streeter, Matthew J.
Computer programs that use Darwinian evolution through a method known as genetic programming are designing new inventions that can be patented. Genetic programming starts with thousands of randomly generated test "organisms" and a high-level description of their function; successive generations are produced by the selection and cross-breeding of the best individual organisms, while mutations are introduced into a small percentage of the descendants. Stronger, more able individuals capable of fulfilling the targeted function emerge over the course of dozens of generations, and the best one is harvested at the end of the genetic programming run. Genetic programming has been used to recreate patented electronics, some of which are cutting-edge; other inventions that have evolved via genetic programming--some of which should be patentable--include antennas, general-purpose controllers, and computer algorithms that recognize proteins. The development of genetic programming is a great leap forward in the field of artificial intelligence, because it delivers human-competitive machine intelligence while keeping human involvement for each new problem minimal, and does not rely on logical deduction or a human-knowledge databank. Although the technique does not fulfill the criteria of mathematician Alan Turing's famous test for machine intelligence, it does pass the U.S. patent office's intelligence test for demonstrating creativity and ingenuity. Furthermore, it achieves many of the goals of another Turing theory of machine intelligence, using "the genetical or evolutionary search by which a combination of genes is looked for, the criterion being the survival value." Genetic programming is expected to be applied practically to the design sector first, while increased computing power should bring genetic programming applications to the desktop workstation by 2010.
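The generate-select-cross-breed-mutate loop described above can be illustrated with a toy genetic algorithm over bit strings (true genetic programming evolves program trees rather than bit strings, but the loop is the same; the all-ones target here is a hypothetical stand-in for a real fitness measure):

```python
import random

random.seed(0)
TARGET = [1] * 20  # toy goal: evolve a bit string of all ones

def fitness(individual):
    """Count of bits matching the target; higher is fitter."""
    return sum(1 for a, b in zip(individual, TARGET) if a == b)

def crossover(a, b):
    """Single-point cross-breeding of two parents."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(individual, rate=0.02):
    """Flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in individual]

def evolve(pop_size=60, generations=40):
    # Start from randomly generated "organisms."
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]  # selection: keep the fittest quarter
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)  # harvest the best individual at the end

best = evolve()
print(fitness(best))  # approaches the maximum of 20 within a few dozen generations
```

As in the article's description, fitter individuals emerge over successive generations without logical deduction or a human-knowledge databank; only the fitness function encodes the high-level description of the desired behavior.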