Volume 5, Issue 534: Monday, August 18, 2003
- "The Bits Are Willing, But the Batteries Are Weak"
New York Times (08/18/03) P. C1; Harmon, Amy
The recent blackout proved that the sea of digital data created with cell phones, digital cameras, email, the Web, and connected computers is not entirely independent of real-world constraints. Despite the robust performance of the Internet itself, e-commerce sites such as Amazon and eBay showed significant fallout from the disruption. Only two days earlier, the Blaster worm had affected millions of computer users, emphasizing the vulnerability of a system many take for granted. For many Internet-addicted people, a day's respite from email and other digital communications meant having to rediscover old-fashioned methods, such as meeting neighbors on the apartment stoop. The power outage also showed the extent to which some of the digitally dependent will go in order to keep their tenuous grip on the electronic world. New York City blogger Grant Barrett uploaded digital pictures and his blog via dial-up using his laptop computer and a flashlight. After hypothesizing grandiosely about his motivations, Barrett wrote about his urge to post, "It's just some moron typing in the dark." Michigan-based law professor Jessica Litman realized the importance of an always-on Internet connection for finding information when she was forced to tune in her car radio for news; instead of searching out the information she wanted online, Litman said, "A car radio tells me what it wants to tell me."
(Access to this site is free; however, first-time visitors must register.)
- "Cybersecurity Chairman: Infosec Mandates May Be Needed"
IDG News Service (08/15/03); Gross, Grant
House Subcommittee on Cybersecurity, Science, Research and Development Chairman Rep. William Thornberry (R-Texas) says he is considering legislation that would require private industry to improve cybersecurity. His influence over federal cybersecurity policy will depend on his clout with several other groups, such as the House Subcommittee on Technology, Information Policy, Intergovernmental Relations, and the Census. Thornberry's preferred approach is aligned with the White House's desire to spur cybersecurity best practices free of regulation and to create private-sector incentives for deploying Internet safeguards, such as tax breaks for companies with a heavy cybersecurity emphasis. Ari Schwartz of the Center for Democracy and Technology doubts that Thornberry's subcommittee will break any new ground, while AIG eBusiness Risk Solutions COO Ty R. Sagalow thinks the congressman could encourage private industry to boost information security from his committee's "bully pulpit." Thornberry says one of his overarching missions is to closely monitor the Homeland Security Department's cybersecurity readiness, since other agencies look to the department as a template for best practices. "You don't want to be too quick on the draw with new mandates," the congressman acknowledges. "But you can't be too hesitant to pull the trigger when there are concerns."
- "IT Leads Recovery After Regional Power Failure"
Computerworld Online (08/15/03); Mearian, Lucas; Brewin, Bob; Rosencrance, Linda
The massive power outage that blacked out Manhattan and other northeast regions on Aug. 14 was not a crippling blow against Wall Street and other area businesses thanks to backups and data-recovery systems, many of which had been installed as a result of the terrorist incidents on Sept. 11, 2001. The loss of power was immediately followed by the activation of diesel generators at brokerage, bank, and clearinghouse data facilities around New York City and New Jersey, and the New York Stock Exchange reported that it retained all Aug. 14 trading data despite the blackout. Harris Beach technology attorney Alan Winchester noted that his law firm's financial records were preserved thanks to their real-time replication to an office in Rochester, N.Y., which has its own generator. In Cleveland, Case Western Reserve University scrambled to recover core systems (email, enterprise systems, course management systems, etc.). Of major concern was the potential loss of data on returning students' tuition payments and course information, but Case Western CIO Lev Gonick said that a post-Sept. 11 storage-area network automatically kicked in, taking snapshots of data sets. He added that as a result of this recovery system, only a "fraction of a second" worth of data was lost. Not all regional businesses were able to continue operations with minimal delay: FedEx noted that package information processing was interrupted because drivers were unable to download data from bar-code scanners into the corporate network at hubs and stations affected by the blackout.
- "Linux Hits Landmarks in Los Alamos Supercomputer Deals"
TechNewsWorld (08/15/03); Lyman, Jay
Linux Networx has forged a $10 million deal with Los Alamos National Laboratory to build a Linux cluster supercomputer known as Lightning, which will encompass nearly 3,000 AMD Opteron processors and theoretically boast a peak performance of 11.26 teraflops. Lightning will be one of the 10 fastest computers in the world, and the first 64-bit supercomputer to be used for nuclear weapons research as part of Los Alamos' Advanced Simulation and Computing program. Lab Computing Communications and Networking leader John Morrison stated that the Lightning contract will enable the Los Alamos facility to process intricate calculations and arrive at results within a few days, adding that the effort could lay the foundation for more open-source computing in high-end milieus. Brad Rutledge of Linux Networx reported that his company will set a development record by delivering Lightning in September. Linux Networx has another deal with Los Alamos to build a 256-node cluster system called Orange, which will be used for unclassified biological, chemical, and engineering experiments. The Orange system will employ 512 Opteron processors, open-source InfiniBand interconnects by Mellanox Technologies, and Linux Networx's cluster-management software; the supercomputer will be the biggest system to use InfiniBand, and will allow users to rapidly flip between 32- and 64-bit modes while expediting operating system customization. "The change in philosophy in supercomputing has been away from highly complex, powerful, hardware-oriented systems to taking advantage of cheap components to achieve the same performance," observes Yankee Group analyst Dana Gardner. "Linux aligns well with this new general philosophy that has taken shape over the last 10 years."
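The 11.26-teraflop figure is a theoretical peak, conventionally computed as processor count times clock rate times floating-point operations per cycle. A minimal sketch of that arithmetic follows; the 2,816-processor count, 2.0 GHz clock, and 2 flops/cycle are illustrative assumptions chosen to match the quoted peak, not figures from the article.

```python
# Back-of-envelope peak-performance arithmetic for a cluster like Lightning.
# All three inputs below are assumptions for illustration only.
processors = 2816         # assumed node count ("nearly 3,000 Opterons")
clock_hz = 2.0e9          # assumed per-processor clock rate
flops_per_cycle = 2       # assumed double-precision flops per cycle

# Theoretical peak = processors x clock x flops-per-cycle
peak_flops = processors * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops / 1e12:.2f} teraflops")
```

Under these assumptions the product works out to 11.26 teraflops; sustained performance on real workloads is always a fraction of this number.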
- "A Spray-on Computer Is Way to Do IT"
Edinburgh Evening News (08/14/03); MacGregor, Fiona
Edinburgh University researchers have received a grant of 1.3 million pounds from the Scottish Higher Education Funding Council to develop ubiquitous computing technology in which mote-sized computers can be sprayed on objects and communicate their readings to each other wirelessly. "At the moment if you want to interface you have to use a keyboard or a mouse, which is very unwieldy," notes project leader Professor DK Arvind. "With this you could take a pen and spray it and it becomes an interface in its own right." A working version of the nano-computer technology is expected to be ready by 2007, while Arvind believes hospitals, schools, and shops will be availing themselves of the technology within a decade. Edinburgh scientists envision a ubiquitous computing project in which heart patients can be monitored unobtrusively at home by spraying the tiny computers onto their chests, where they can take heart readings and relay the data to a hospital computer. The success of the project would give Scotland and Edinburgh a prominent role in the next IT wave. Bill Furness of the Edinburgh Chamber of Commerce adds that the program could help the city become a world-class research and innovation hub. Other Scottish institutions involved in the ubiquitous computing program include the universities of Napier, Glasgow, St. Andrews, and Strathclyde.
- "Indy Know-How to Be Feature of CMU Entry in Robot Racing"
Pittsburgh Post-Gazette Online (08/14/03); Spice, Byron
The Defense Advanced Research Projects Agency (DARPA) has organized the Grand Challenge, in which robot vehicles will race from Barstow, Calif., to Las Vegas for a $1 million prize on March 13, 2004. The chief purpose of the Grand Challenge, from DARPA's point of view, is to fire up innovation that could be added to new military fleets. One of the entrants is William Whittaker of Carnegie Mellon University's Robotics Institute, whose Red Team will include Indy/CART/NASCAR team owner Chip Ganassi and Rod Millen, a champion rally and off-road racer who will serve as the team's race director. Competing robot vehicles will have to traverse 250 miles of rough desert terrain without human assistance, with the winner being the first vehicle to cross the finish line within 10 hours. Whittaker says likely candidates for Red Team's vehicle include a Humvee H1 from General Motors and a vehicle developed by Millen's specialty vehicle business in Huntington Beach, Calif.; stereo video cameras and laser rangefinders will be employed to scan the area ahead of the vehicle, while radar will be used for peripheral navigation and to overcome visibility difficulties posed by rain and dust. The Red Team vehicle will also need navigational software that keeps tabs on its position as well as the positions of competing vehicles, and that figures out the best response when confronted with dust clouds. Ganassi has hooked the Red Team up to racers such as Millen to help answer such challenges. Red Team corporate sponsors include Intel, Boeing, Seagate Technologies, Caterpillar, and Science Applications International.
- "DNS Inventor Says Cure to Net Identity Problems Is Right Under Our Nose"
Business Standard (India) (08/13/03); Berlind, David
Paul Mockapetris, chairman of Nominum and author of the DNS protocol, is focused on addressing the problem of identity theft on the Internet. Considering possible solutions to identity issues on the Internet, Mockapetris declares, "We can use something that's already in place, that's lightweight, and that every computer on the Internet already knows how to use--the DNS." Mockapetris says an extension to DNS, DNSSEC, could resolve the majority of these concerns, though it has some flaws and has been under development for a decade. DNSSEC is comparable to a seal on Web, email, and other Internet applications, as it introduces data origin authentication to the information retrieval procedure, making it far more difficult to impersonate a source of information on the Internet. DNSSEC could improve the odds that a given email actually originated from the domain it claims and that information viewed online actually came from the Web site it appears to be from. "After getting nowhere in 10 years, it's time to give it a try," says Mockapetris of DNSSEC. Still, others say that DNSSEC, by requiring more data to pass between DNS servers and client systems, could increase traffic online, and point out that implementation would be complex. VeriSign principal scientist Phillip Hallam-Baker says deployment would be complicated and would hinge on an industry-wide agreement to make the transition. Another question raised by DNSSEC's implementation is which entity would have the most control.
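The "data origin authentication" DNSSEC adds can be pictured as signing each DNS answer so a resolver can detect tampering. The sketch below is a toy illustration only: real DNSSEC signs resource-record sets with public-key cryptography (DNSKEY and RRSIG records), whereas this stand-in uses a shared-secret HMAC so the example stays self-contained; the zone key and record values are made up.

```python
import hashlib
import hmac

ZONE_KEY = b"example-zone-secret"  # hypothetical key; real zones use key pairs

def sign_record(name: str, rdata: str) -> str:
    """Produce a signature over a DNS record, loosely analogous to an RRSIG."""
    msg = f"{name}|{rdata}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def verify_record(name: str, rdata: str, sig: str) -> bool:
    """A resolver checks the signature before trusting the answer."""
    return hmac.compare_digest(sign_record(name, rdata), sig)

sig = sign_record("www.example.com", "192.0.2.1")
print(verify_record("www.example.com", "192.0.2.1", sig))     # genuine answer
print(verify_record("www.example.com", "198.51.100.9", sig))  # spoofed answer
```

The spoofed answer fails verification, which is precisely the impersonation DNSSEC is meant to make difficult; the extra signature data traveling with each answer is also the source of the traffic-overhead concern the critics raise.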
- "Smart Chips Making Daily Life Easier"
BBC News (08/13/03)
European researchers with the Smart-Its Project continue to make progress on "ubiquitous computing." During the recent computer graphics Siggraph exhibition in the United States, Smart-Its Project researcher Martin Strohbach explained that his colleagues at Lancaster University and other institutions in Zurich, Germany, Sweden, and Finland envision embedding all kinds of everyday household items with programmable microchip sensors, which would give them smarts. "For example, we have used a table as a mouse pointing interface so you can control the TV or computer," says Strohbach. Bookshelves that warn people when they are overloaded and water bottles that tell users when their contents need to be cooled are additional fun ideas for such technology, but ubiquitous computing could have more serious applications, and may even help save lives. Sensors placed in floors would be able to determine that an elderly person has fallen and is unable to stand up. And a medicine cabinet could be transformed into a unit that tracks its contents and guides people through taking medicine. DIY flatpack chips have been developed that sense movement and use a voice to warn people when they are making a mistake in assembling products.
- "Keeping the Net Neutral"
Salon.com (08/12/03); Manjoo, Farhad
The so-called "Net neutrality" proposal the Coalition of Broadband Users and Innovators submitted to the FCC in July wants the federal government to regulate the broadband Internet in order to ensure that cable companies do not discriminate between the various content that customers receive. The measure's backers note that cable companies have the technical capability to control the sites and services users can access, and argue that they also have a financial motivation for doing so: They stand to make a lot of money by charging for premium content, and such a maneuver would impede broadband investment and deployment. Cable firms counter that these speculations have little merit, because such a move could potentially alienate so many customers. They also allege that coalition members stand to reap enormous rewards if the Net neutrality rule is enacted, not the least of which is a blunting of their competitors' power. National Cable & Telecommunications Association attorney Howard Symons contends that the coalition's submitted proposal is deceptively simple, and uses key terms like "discriminatory" and "unfettered" that are open to broad interpretation. He further notes that any regulation that restricts cable companies' operations is objectionable, given that there is no indication that such companies are harmful. This attitude is typical of many cable company defenders, who subscribe to the libertarian view that all regulatory measures are inherently bad. Still, the technology exists for cable firms to block specific services and sites, and the current lack of competitive broadband services in many areas gives cable firms much leverage. However, the most troubling aspect of this issue is the difficulty in determining which side has consumers' best interests in mind, writes Farhad Manjoo.
- "Researcher Invents New Graphing Method"
Johns Hopkins School of Public Health (08/12/03)
Dr. Alvaro Munoz of the Johns Hopkins Bloomberg School of Public Health has identified three major flaws in the 3D bar graph method used to present financial, medical, and other information in many computer programs, newspapers, and scientific journals. First, the bar graph cannot equally represent the two variables that equally contribute to the outcome; second, the difficulties of representing a 3D image on a 2D page make determining the bars' true values problematic, perhaps even impossible; and third, overlapping data cannot be shown on a 3D graph. "The inaccuracies of the traditional 3D bar graph may seem trivial, but they can be significant when you're dealing with important information like predicting your risk for a heart attack or plotting the performance of your company investments," remarks Dr. Munoz. To overcome these problems, Dr. Munoz has conceived of a new Diamond Graph approach that seeks to equally represent all variables two-dimensionally. The Diamond Graph basically offers an overhead rather than side view of a bar graph, using expanding polygons arrayed in a diamond-shaped grid instead of parallel bars. Dr. Munoz learned through experimentation that a six-sided polygon is the only configuration that can represent all outcomes proportionately as the grid expands. The Diamond Graph is the first technique that can evenly embody the relationships between an outcome and each of the two variables that the outcome relies on.
- "CMU Professor Wins Award for Program That Aids Decision-Making Process"
Pittsburgh Post-Gazette (08/11/03); Spice, Byron
Artificial intelligence can be used to find the best overall solution in a competitive decision-making environment, according to work done by Carnegie Mellon University computer scientist Tuomas Sandholm; those decisions include real-life political ones, such as where to build a bridge or whether or not to award a gambling license. But Sandholm's startup company, CombineNet, has so far focused only on business solutions, saving companies such as Bayer, H.J. Heinz, and Procter & Gamble a total of $300 million through electronic auctions worth between $3 billion and $4 billion. For his so-called "combinatorial optimization technology," Sandholm will be given the Computers and Thought Award at the International Joint Conference on Artificial Intelligence in Mexico this month. Carnegie Mellon Center for Automated Learning and Discovery director Tom Mitchell says mechanisms such as combinatorial optimization technology are important for artificial intelligence because they harness group dynamics, similar to how the brain's nerve cells work together to make decisions. "Even though you feel like you're one person, you're actually 10 billion neurons," Mitchell says. Sandholm's work can be applied to any problem where a number of competing interests demand representation in a decision. By weighing each interest's needs against the others', the decision-making rules find the best solution for all parties involved. Sandholm says, "Clearing the market--deciding who should win--is a very hard optimization problem." Sandholm has won a patent for his method, which he says is the world's fastest for these NP-complete problems--a class of problem best illustrated by the Traveling Salesman problem, in which the shortest route through a large set of cities is sought.
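The "clearing the market" problem Sandholm describes can be made concrete with a toy combinatorial auction, where bidders bid on bundles of items and the auctioneer must pick the revenue-maximizing set of non-conflicting bids. The sketch below uses made-up bidders, items, and prices, and brute-forces every bid combination; real instances are NP-hard, and CombineNet's actual algorithms are far more sophisticated than this exhaustive search.

```python
from itertools import combinations

# Hypothetical bids: (bidder, bundle of items wanted, price offered)
bids = [
    ("A", {"steel"}, 6),
    ("B", {"freight", "storage"}, 9),
    ("C", {"steel", "freight"}, 11),
    ("D", {"storage"}, 3),
]

def best_allocation(bids):
    """Pick the revenue-maximizing set of bids that share no items."""
    best, best_value = [], 0
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            bundles = [b[1] for b in combo]
            # Feasible only if no item is sold twice:
            # total bundle sizes must equal the size of their union.
            if sum(len(s) for s in bundles) == len(set().union(*bundles)):
                value = sum(b[2] for b in combo)
                if value > best_value:
                    best, best_value = list(combo), value
    return best, best_value

winners, revenue = best_allocation(bids)
print([w[0] for w in winners], revenue)  # bidders A and B win, revenue 15
```

Note that the greedy choice (C's 11 for steel and freight) loses to accepting A and B together for 15, which is why the bids must be optimized jointly rather than one at a time.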
- "Data Search Stirs Concern"
eWeek (08/11/03) Vol. 20, No. 32, P. 22; Carlson, Caron
The war on terrorism and the homeland security push have made citizen cross-checking and profiling a government priority, which is in turn fostering the creation of companies that wish to sell private and public data to federal agencies. There is, however, no system currently in place to ensure the accuracy and quality of the data being collected, or the incorruptibility of the information merchants; nor does the government appear concerned about this issue. Chris Hoofnagle of the Electronic Privacy Information Center characterizes these info traders as "opportunists" who feel entitled to use the information they collect in any way they see fit. "They're taking advantage of a lapse in public policy to enrich themselves to the detriment of society," he declares. Critics observe that the government takes a greater interest in the citizens it is investigating than in the backgrounds of the brokers it does business with; furthermore, data merchants do not have to comply with quality standards or be certified. Historically, federal law enforcement agencies have bought data from these dealers rather than satisfy privacy regulations designed to protect public-sector databanks. Erroneous data in the public sector alone can inconvenience, embarrass, or possibly endanger people, and these threats are amplified if such information is given to the government. Bates Information Services principal Mary Ellen Bates says most public records were never designed with cross-indexing in mind, while Washington privacy consultant Robert Gellman remarks that unregulated records are becoming more and more prevalent in the government.
- "Postal Service Pursues 'Intelligent Mail' Despite Privacy Concerns"
Computerworld (08/11/03) Vol. 31, No. 38, P. 12; Verton, Dan
The President's Commission on the U.S. Postal Service has thrown its support behind the idea of developing sender identification technology for U.S. mail. The presidential commission has recommended that the U.S. Postal Service (USPS), which has crafted a corporate plan for implementing intelligent mail, collaborate with the Department of Homeland Security to develop "personalized stamps" and other sender identification technologies. The commission focuses on making the agency more efficient, and believes having digital identification information embedded in mail offers advantages in tracking and delivering mail. "Intelligent Mail could allow the Postal Service to permit mail-tracking and other in-demand services via a robust Web site that ultimately becomes the equivalent of an always open, full service post office," states the commission's report. Sender identification technology would also provide greater security for the mail system, the commission acknowledges, but civil liberties groups and technology groups such as the Electronic Frontier Foundation have expressed concerns about the technology's potential impact on the privacy of ordinary citizens. Congress most likely would have to approve the use of intelligent mail.
- "A Glimpse of IBM's Future"
PC Magazine Online (08/11/03); Metz, Cade
New computing applications designed and developed by interns at IBM's Industry Solutions Lab were recently spotlighted at the Extreme Blue Technology Showcase. One application, Blue Fusion, is a supply-chain simplification tool that utilizes Web services technology; business intern Jordan Friedman explains that Blue Fusion can smoothly meld collaboration tools, accumulate data across the entire value chain, and automate business transactions. The Hippocratic Database Project, designed by interns from UCLA and the University of Wisconsin, Madison, is a privacy management system connected to existing healthcare applications that allows users to keep their personal information private while authorizing that only the appropriate parties can view relevant data. A third application, developed by an Extreme Blue team led by Duke University's William Streit, is eLumination, a tool that enables businesses to track organizational health. Streit describes eLumination as "a framework for capturing real-time transaction data and transforming it into visual business information." The tool ties into IBM's WebSphere Business Integration product suite and currently keeps tabs on energy and utility usage. The Extreme Blue application with perhaps the widest allure is Total Recall, a desktop application developed by interns from MIT, the University of Waterloo, and the University of North Carolina. With Total Recall, users can retrieve documents they have viewed before by entering a word or phrase. Extreme Blue projects manager Paul T. Baffes says that the internship program is important to sustaining IBM's innovation, because it brings in "fresh talent."
- "Quantum Computer Keeps It Simple"
Technology Research News (08/20/03); Smalley, Eric
Quantum computer researchers at the University of Oxford and University College London have proposed a radically different quantum computer design that broadens the horizon of possible implementations. Instead of relying on metal electrodes to control the interaction between two qubits sharing superposition, the new design uses a third qubit that is attuned to a different energy frequency. The middle electron responds to energy frequencies differently than do its neighbors on either side, and the device would be able to perform all logic functions necessary for quantum computing. Quantum programs would make use of six specific energy frequency changes in timed sequences. University of Calgary quantum information science professor Barry Sanders says the most important aspect of the new work is that it expands the range of possible quantum computer designs and allows for much easier control. With other designs, paired qubits must switch their interactions on and off, but with the new design the qubits are in an always-on state. Getting qubits to start and stop interactions is difficult because of the precision required. University of Oxford senior research fellow Simon Benjamin guesses that the first quantum computers will be used in scientific settings just 10 years from now, helping biological researchers examine natural quantum occurrences such as those in fundamental biological processes; still, Benjamin does admit that there is a real possibility that quantum computing, while valid in theory, will be too complex to practically apply in everyday computing tasks, given the available technology.
- "Saturday Light Fever"
New Scientist (08/09/03) Vol. 179, No. 2407, P. 34; Schechter, Bruce
Scientific research of microscale objects could advance significantly thanks to a combination of optical tweezers and holograms developed by University of Chicago physicist David Grier. Optical tweezers invented by Bell Laboratories researchers 17 years ago can trap and move tiny particles in defiance of gravity through the manipulation of a laser beam's optical intensity, but only one or two tweezers can be created at a time within a given microenvironment. Grier sought a solution to this dilemma, which was holding up his research into colloidal behavior; then in 1997 his student Eric Dufresne discovered a commercially available hologram that refracted a single laser into 16 smaller beams, which could theoretically snare 16 particles at once by focusing them through the lens of a microscope. Grier and Dufresne employed such a hologram, even though it was designed to work with red light, whereas the optical tweezers were generated with green light to prevent heat damage to the cell samples. A key element of the methodology the researchers arrived at is a spatial light modulator (SLM), which responds to an electric field by thickening its pixels rather than changing their color; laser light undergoes a subtle phase shift as it passes through the thickened pixel, and this effect enabled Grier and Dufresne to create the interference required to produce multiple tweezers, which are easy to manipulate because the holograms are computer-generated. The computer algorithms used to create the holograms were the product of several years' work, and the resulting technique, which combines a computer, an SLM, and some optics, can yield as many as 200 tweezers at once. A microscope-compatible device containing all the optics is made by Arryx, a company Grier started. Holographic optical tweezers are being looked into as a control mechanism for micromachines, and Grier believes his method will help expedite the sculpting of micromachines as well.
- "The Myth of Generation N"
Technology Review (08/08/03); Garfinkel, Simson
"Database Nation" author Simson Garfinkel challenges the popular assumption that all young people possess a natural aptitude for technology, which puts a crimp in social scientists and technologists' projected emergence of a Net generation, despite statistical reports of wide Internet usage among American youth. Garfinkel warns that the myth of Generation N, as he terms it, could be a sociological and professional barrier to people who are slower to gain technological competence. The author notes that just as he has seen high school and college students with little inclination toward or experience with information technology, he has also seen people between the ages of 40 and 50 who are computer-savvy. The major distinction Garfinkel sees between these two generations is that computer usage is now a requirement among the younger set. Human-computer interaction experts say teenagers are more willing to experiment with computers and have more tolerance for defective programs because the technology is such a basic component of their lives. But Garfinkel points out that experience, not age, is the defining element of computer literacy, and adds that increased education and federal grants alone will not bridge the competence gap he foresees. "As a society, we need to come to terms with the fact that a substantial number of people, young and old alike, will never go online," Garfinkel writes. "We need to figure out how we will avoid making life unbearable for them."
(Access to this site is free; however, first-time visitors must register.)
- "The Great Debate Over Software Patents"
CIO (08/01/03) Vol. 16, No. 20, P. 66; Rosen, Lawrence; Radcliffe, Mark
Gray Cary Ware & Freidenrich partner Mark Radcliffe and Open Source Initiative general counsel Lawrence Rosen offer differing views of how the U.S. patent system is being affected by the growing viability of software patents. Radcliffe posits that the software patent furor is typical of early technological development, and insists that such problems are eventually worked out, with inventors receiving appropriate remuneration and protection for their work. He reasons that the current difficulties have less to do with the aptness of software patents than with the U.S. Patent and Trademark Office's patent application review process. Patents and licensing fees are a valid form of compensation for vendors who spend huge sums on product development, and a defensive tool to resolve conflicts with another company's patents and avoid expensive court actions. Radcliffe says it is false to believe that patents chiefly benefit large corporations and handicap smaller firms, because startups cannot establish a market presence and discourage bigger rivals from offering similar products and services without patents; he also argues that, contrary to popular assumptions, patents do not cover whole software programs, while software is not intangible and therefore not distinct from other technologies. Rosen contends that "Big software companies create portfolios of thousands of patents designed to legally monopolize innovation," and it is the customers who ultimately pay. Worse, people are being sued for using technology featuring patented software that they were previously unaware of. Rosen claims that big companies are using software patents and the threat of litigation to blunt the ambitions of smaller competitors.
Anyone who wants to protect themselves from patent infringement claims faces an impossible task--assessing the flood of patents that are issued each year; Rosen sees wisdom in instituting limits or bans on patents for essential technologies, congressional legislation that forces patent holders to license such technologies to any user, and a reduction of the current 20-year patent term.