Volume 5, Issue 585: Friday, December 19, 2003
- "Sun Researchers: Computers Do Bad Math"
IDG News Service (12/18/03); McMillan, Robert
Unexplained computer crashes--which have at times made the difference between life and death--are sometimes attributable to bad math rather than poor operating system design, according to Sun Microsystems CTO Greg Papadopoulos. Sun has been granted $50 million from the Defense Advanced Research Projects Agency (DARPA) to, among other things, prevent such errors from occurring. "Floating-point arithmetic is wrong," declares Sun researcher John Gustafson; the binary scheme computers use has difficulty representing certain numbers--many common fractions, for instance--accurately. Fractions such as one-tenth have non-terminating binary expansions and can only be approximated in a fixed number of bits. Insight64 analyst Nathan Brookwood notes that software programmers can employ an array of methods to work around this error, but such workarounds can lead to performance slowdowns, especially in supercomputers that perform billions of calculations each second. Sun researchers are investigating interval arithmetic as a solution: The technique brackets an inexact result between two bounds that are guaranteed to contain the true answer. "If you can prove mathematically that the right answer is between this answer and that answer, you can restore mathematical rigor to computing," explains Gustafson. The interval arithmetic research Sun is conducting will be incorporated into a prototype supercomputer to be constructed with the DARPA grant; Gustafson estimates that the machine would be about 50 times faster than the fastest existing supercomputer. Whether DARPA decides to take the supercomputer beyond the prototype stage will depend on how successful Sun is with "proximity interconnect," a new chip-to-chip communication method in which data is transferred wirelessly between chips.
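The rounding problem Gustafson describes, and the interval idea proposed as a remedy, can be sketched in a few lines of Python (3.9 or later, for math.nextafter). This is only an illustration of the general technique, not Sun's implementation; the Interval class and the one-ulp outward widening are simplified stand-ins for the directed rounding a production interval library would use.

    import math

    # 1/10 has no finite binary expansion, so the stored double is only an
    # approximation; ten of them do not sum to exactly 1.0.
    print(sum([0.1] * 10) == 1.0)   # False

    class Interval:
        """Toy interval: a pair of floats guaranteed to bracket the true value."""
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi

        def __add__(self, other):
            # Widen each bound outward by one ulp after adding -- a crude
            # stand-in for the directed rounding a real interval library uses.
            return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                            math.nextafter(self.hi + other.hi, math.inf))

        def __repr__(self):
            return f"[{self.lo!r}, {self.hi!r}]"

    tenth = Interval(math.nextafter(0.1, 0.0), 0.1)   # bounds that bracket 1/10
    total = tenth
    for _ in range(9):
        total = total + tenth
    print(total)   # a narrow interval that provably contains the exact answer, 1.0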
Click Here to View Full Article
- "Web Tools Don't Always Mesh With How People Work"
Newswise (12/17/03)
There are numerous techniques Web users employ to recall the Web pages they visit (sending emails to themselves or writing sticky notes, for example), but most people do not avail themselves of such methods when they decide to revisit pages, say the University of Washington's William Jones and Harry Bruce, and Microsoft Research's Susan Dumais. Bruce says, "People should have fast, easy access to the right information, at the right time, in the right place, in the right quantity to complete the task at hand." The researchers have studied this phenomenon, with funding from the National Science Foundation, in the hope that more useful tools for keeping track of information can be developed. Their work implies that "keeping" methods stem from the different ways people plan to use the data, but bookmarks, the chief "keeping" instrument of most Web browsers, lack many of the advantages users desire. Furthermore, Dumais, Jones, and Bruce have learned that regardless of which "keeping" technique they prefer, most users attempt to return to a Web site through one of three other methods: Directly entering a URL in the Web browser, conducting search engine queries, or using another Web site or portal to access the page. Jones and Bruce, along with their students, enhanced a Web browser with an "Add to Favorites" dialog that allowed people to add comments about a link, send it through email, or save the page to their hard drive from a single place; however, most testers did not adopt this option, since they had fallen out of the habit of using bookmarks. Now the researchers are devising a conceptual architecture for how people decide to retain information for later use, which Bruce terms PAIN (personal anticipation of information need). The team is also seeking a patent for tools and techniques to address the problem of "information fragmentation" by meshing all the data scattered across different documents and media into a single "My Life" taxonomy.
Click Here to View Full Article
- "Linux Gets Heart Transplant With 2.6.0"
CNet (12/17/03); Shankland, Stephen
A market dominated by Unix servers could be disrupted by the introduction of Linux version 2.6.0, an updated kernel of the open-source operating system that was released Dec. 17. In a note to the kernel mailing list officially announcing the release, Linux founder Linus Torvalds declared that most of the problems were fixed prior to the delivery of the final update, while the remaining bugs were esoteric and difficult to locate. Whereas the previous 2.4 kernel scales well only to servers with four to eight processors, version 2.6.0 can run on 32-processor systems, according to Andrew Morton, the programmer in charge of the kernel. IDC analyst Jean Bozman says the growth rate of Linux servers has accelerated significantly over the last three quarters, with third-quarter sales totaling $743 million. Brian Stevens of Red Hat and SuSE Linux's Kurt Garloff expressed enthusiasm for 2.6.0's purported storage improvements, among them: Better handling of input-output requests and less vulnerability to "thrashing," in which heavy hard-disk traffic load slows down the computer and sometimes leads to crashes; an improved volume manager; the ability to use file systems with more than 2 TB of storage space; and the removal of "device space" restraints. Garloff said 2.6.0 will be incorporated into SuSE Linux Enterprise Server 9, slated for release next summer. Red Hat plans to release Fedora Core 2, a hobbyist-oriented distribution built on the new kernel, in April, to be followed by a 2.6-based version of Red Hat Enterprise Linux 4 in 2005.
Click Here to View Full Article
- "With an Urban Scooter, a Humanoid Robot Hits Its Stride"
New York Times (12/18/03) P. E7; Flaherty, Julie
Segway, the company that develops the Segway Human Transporter, modified the two-wheeled, gyroscopically balanced vehicle for university projects supported by the Defense Advanced Research Projects Agency (DARPA) under its Mobile Autonomous Robot Software initiative, whose goal is to create robots that can aid military operations without a heavy reliance on human supervisors. An MIT project led by Dr. Una-May O'Reilly incorporated the Segway Robotic Mobility Platform into Cardea, a humanoid robot that can explore on its own, pass through doorways, and traverse hills and bumps. The Segway's gyroscopic balance system is Cardea's most important component, because it enables the machine to stand with perfect balance without requiring a cumbersome wide base, a capability that could allow the robot to interact with humans at eye level. Researchers plan to upgrade Cardea to follow commands from people it encounters, as well as extrapolate information about its environment through the manipulation of objects. Meanwhile, Carnegie Mellon University engineers have incorporated the Segway into smaller robots trained to play soccer, and intend to have these machines square off against humans riding Segways in a demo soccer match this summer. Sebastian Thrun of Stanford University, another participant in DARPA's Mobile Autonomous Robot Software project, says the Segway, in addition to being fairly inexpensive, "allows us to build relatively complex robots with relatively small footprints that can still see very nicely." The modified Segway platform moves according to directions from a computer program, and comes with a shelf where any equipment the robot needs--sensors, cameras, robotic limbs, etc.--can be positioned. Research teams have designed bumpers, kickstands, and even airbags so that there is less chance of damaging the valuable equipment the robots carry if the Segway platform loses balance.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
- "IBM Research Working Toward a Better In-Box"
Network World (12/17/03); Fontana, John
IBM Research's Collaborative User Experience Team is trying to improve email systems through its ReMail (Reinventing Email) project, which recently revealed a prototype client boasting features that promise better email organization and management. "Part of the overload that people have is not just that everything comes in one stream but that it is scattered," notes ReMail team leader Dan Gruen, who adds that the goal of ReMail is to consolidate this scattered information into a manageable environment that aids collaboration. The three-pane ReMail interface can be adapted to display a calendar, a buddy list, and message threads, as well as a standard email message list. The list of messages in the in-box is indexed by collapsible/expandable "separators" that group the messages under headings for each day, while messages can be annotated through color-coding. Messages can also be organized creatively in ReMail through highlighting, while connections between messages can be visualized via the Thread Map, which displays a blue circle for the selected message, hollow circles for sent messages, and black circles for unread messages; arcs represent links between messages. Messages can be indexed by sender through the Correspondent Map feature, while senders can be organized according to domain, and multiple senders within a domain can be grouped according to the number of messages they send. Icons representing messages in the in-box are provided through ReMail's Message Map, and users can search by message author, status, or attachments. The Collections function enables in-box folders to be kept in sight or out of sight and organized in any arrangement, including alphabetically. The prototype ReMail client can also store communication from different sources such as blogs and discussion lists.
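A rough sketch of the kind of grouping described here--day separators over the in-box and a domain-organized Correspondent Map--might look like the following Python. The message records, field names, and grouping logic are invented for illustration and are not taken from IBM's ReMail code.

    from collections import defaultdict
    from datetime import date

    # Invented, minimal message records -- not ReMail's actual data model.
    messages = [
        {"sender": "alice@example.com", "subject": "Budget", "received": date(2003, 12, 17)},
        {"sender": "bob@example.org", "subject": "Re: Budget", "received": date(2003, 12, 17)},
        {"sender": "carol@example.com", "subject": "Holiday party", "received": date(2003, 12, 18)},
    ]

    # Day separators: one collapsible heading per day in the in-box.
    by_day = defaultdict(list)
    for msg in messages:
        by_day[msg["received"]].append(msg)

    # Correspondent Map-style grouping: senders indexed by domain,
    # with a per-sender message count.
    by_domain = defaultdict(lambda: defaultdict(int))
    for msg in messages:
        domain = msg["sender"].split("@")[1]
        by_domain[domain][msg["sender"]] += 1

    for day in sorted(by_day):
        print(day, [m["subject"] for m in by_day[day]])
    for domain, senders in by_domain.items():
        print(domain, dict(senders))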
Click Here to View Full Article
- "From Browser to Platform: Mozilla Rises"
Linux Insider (12/16/03); McFarlane, Nigel
Besides providing a steadily improving Web browser, the Mozilla technology is quietly becoming an OS-independent applications platform, writes Nigel McFarlane, who says Mozilla products continue to improve incrementally and have won several awards and outstanding reviews this year. Mozilla is already a platform, and the popular browser component is encapsulated in a single zip file sitting in the "chrome" directory. Mozilla adheres to basic Web development technologies as well, including XML, CSS, and JavaScript, so that Web developers have an easy way to try their hand at applications development. Another benefit of using the Web standard XML is the way it can power GUIs: More powerful hardware now runs interpreted code fast enough for GUI operation. By creating a standard application GUI window, Mozilla does away with the need for a document to sit inside an HTML page inside a browser window; this straightforward programming method is a breakthrough and has already influenced other GUI development vendors to develop their own GUI-specific XML variants. Because they utilize standard Web technologies, Mozilla applications easily port from Windows to Linux to almost any other operating system, including the less popular OpenVMS, BeOS, and OS/2 systems. Mozilla goes to great lengths to ensure portability and undermines the efforts of proprietary vendors to maintain control. Instead of using Microsoft's COM, the component technology underlying ActiveX, Mozilla re-implements a completely independent COM of its own (XPCOM) so the two run side by side; this allows Visual Basic programmers to try out other operating systems, since the Mozilla COM extends to those systems as well.
Click Here to View Full Article
- "Cyber Threats Risk Net's Future"
BBC News (12/16/03); Boyd, Clark
A major issue stressed at the recent U.N. summit in Geneva was how to leverage information technology so that developing countries can benefit--and key to this is establishing trustworthy cybersecurity within these nations. But this goal is complicated by the fact that such countries are frequent targets of cyberattacks, and many governments lack the resources--knowledge especially--to set up effective countermeasures. Jet Infosystems director Eugeny Shabligin commented that technical know-how is only part of the solution: Trust between government and the people cannot be ensured without proper cybersecurity legislation and enforcement. That trust, however, is in danger as cybercrime rises, and security experts warn that this trend is a serious threat to the future of the Internet. "If current trends persist, the Internet will inevitably evolve into a chaotic formation, swamped with tons of different spam messages, myriads of viruses," declared Garry Kondakov of Kaspersky Labs. "The home users as well as the various business organizations will refuse to use this global network, as it will become the main source of infections." Rheinhold Scholl of the International Telecommunication Union pointed out that the utilization of networks and computers requires security as a prerequisite. As the leading source and target of cyberattacks, the United States has the most to lose if cybertrust cannot be established globally. The U.N. summit's action plan recommends inter-governmental and private-sector cooperation to manage worldwide Internet security.
Click Here to View Full Article
- "Profound Debate Roils Nanotechnology Field"
Investor's Business Daily (12/19/03) P. A4; Bonasia, J.
President Bush's approval of the 21st Century Nanotechnology Research and Development Act on Dec. 3--which allocates $3.7 billion for National Nanotechnology Initiative projects--has split the nanotech community into two camps: Those supporting Foresight Institute founder Eric Drexler, who believes there should be more funds devoted to the investigation of molecular manufacturing, and researchers such as Rice University scientist Richard Smalley, who dismiss molecular manufacturing as patently impossible. Drexler believes the creation of "molecular assemblers" that could be used, for instance, to cure cancer or deliver clean energy is an attainable goal, although much research remains to be carried out. He also warns that such nanoscale machines could be used as terrorist weapons. Smalley and his compatriots counter that molecular assemblers are a dead end because the mechanical systems they need to function cannot work at nanoscale proportions. Drexler alleges that U.S. nanotech policy, which excludes molecular assembly on the grounds of infeasibility, is based on political rather than scientific reasoning. Stuck between both sides of this debate are venture capitalists such as Josh Wolfe, whose firm, Lux Capital, refuses to invest in molecular assembly research whose practical benefits are a long way off--and a lack of government support only makes such investments an even dicier prospect. Wolfe believes the government must be responsible for laying a solid foundation for molecular manufacturing, and criticizes Drexler's camp for attracting supporters who subscribe to some of nanotech's more outlandish promises, such as immortality.
- "The Data Center of the Future"
EarthWeb (12/16/03); Robb, Drew
Companies are looking to several emerging technologies to help them cope with rising data center costs and low utilization rates: META Group predicts data center budgets will grow by 70 percent over the next decade, with software spending expected to more than double and storage and server costs falling. Linux will figure prominently in the future data center and will experience a tremendous boost in 2004, according to Forrester analyst Ted Schadler, while Gartner says the Linux market will surpass $9 billion in 2007. Although lower cost plays some role in growing Linux adoption, companies are actually willing to pay for supported enterprise Linux versions, especially since operating system prices are usually a small percentage of overall software costs. A major benefit of Linux is the flexibility and scalability it offers, as it can be deployed in devices as small as cell phones and as large as IBM's z-Series mainframes; this range of applications allows companies to slim down their required IT skill set. And though mainframes have experienced a renaissance lately, smaller blade servers are becoming popular as well because of their ease of deployment and flexible configuration. Linux adoption dovetails with that of blade servers since the lightweight operating system is ideal for the smaller hardware. Virtualization is one of the most important technical data center advances, and will help boost utilization by pooling resources and avoiding over-provisioning. New data center standards are emerging as well to make data center administration more manageable; standards such as the XML-based Data Center Markup Language (DCML), supported by major management software vendors, and Microsoft's System Definition Model (SDM) will allow managers to operate data centers in a plug-and-play style.
Click Here to View Full Article
- "Users Worry About 'Zero-Day' Attacks, Try to Secure Systems"
Computerworld (12/15/03); Vijayan, Jaikumar
Companies are becoming increasingly concerned about the possibility that their systems will be attacked before software patches are made available. At the InfoSec 2003 Conference in New York last week, information technology managers indicated that so-called zero-day attacks have the potential to cause enormous damage to data security. Joseph Inhoff, LAN administrator for lighting equipment maker Lutron Electronics, said company management is concerned about a zero-day attack, and that he attended the conference to learn whether automated patching software would be an effective way to secure systems. Industry experts say the amount of time it takes a hacker to exploit a software flaw has been reduced substantially. For example, the SQL Slammer worm appeared last January, eight months after the vulnerability in the SQL Server database was discovered; but last summer's Blaster worm came nearly one month after Microsoft issued a Windows patch. Industry observers believe hackers will eventually attack systems before flaws are disclosed and vendors release patches. A zero-day attack might force an unprepared company to shut down its systems and restart them.
Click Here to View Full Article
- "Taming the Supercomputer"
CNet (12/17/03); Kanellos, Michael
Supercomputing is the central driver behind scientific advances, plays an important role in the economy, and is proving useful in national security, says Tilak Agerwala, vice president of systems at IBM's T.J. Watson Research Center. IBM collaborates with national and academic laboratories on a number of supercomputing projects: The PERCS (Productive, Easy-to-use, Reliable Computing System) program, TRIPS (Tera-op Reliable Intelligently Adaptive Processing System), and the Blue Gene/L system are part of IBM's push toward novel leaps in supercomputing design. At the same time, Agerwala says IBM is pursuing more incremental advances in mainframe computers, supported by continuing improvements in chip speed and software standards. ASCI Purple is IBM's current big-iron system, and the company is also working on low-cost, high-performance clusters that use standard processors and interconnect technologies. PERCS is a long-term blueprint for where mainstream systems will be in about 2010, and is focused on productivity in terms of easily developing new applications. PERCS systems will have adaptable components, including processors and cache structure, plus a set of complex compilers and middleware that will automate application development tasks; the result will be a system that is much cheaper and easier to use, and can be applied to a broad range of problems. TRIPS is focused more at the chip level and is working toward a "supercomputer on a chip" that can perform 1 trillion calculations per second. Agerwala says much of IBM's system work is currently in the investigation phase and still grappling with significant issues; supercomputing has important implications, he says, for life sciences, weather forecasting, and simulated testing of nuclear stockpiles to avoid physical testing.
Click Here to View Full Article
- "Making Something Out of Nothing"
New York Times (12/18/03) P. E1; Bernstein, David
A startup firm founded by a 29-year-old MIT graduate student has developed a display that combines a cloud of converted air particles with a laser-tracking system to form an interactive touch screen; the two-dimensional computer images appear to float in free space and can be manipulated with a finger or similar physical object. Venezuelan-born Chad Dyner first built a five-inch display prototype and founded IO2 Technology in July 2002. He says potential applications for the Heliodisplay product include museum and similar display settings, as well as collaborative meetings. Display technologies consultant Chuck McLaughlin says new display technologies such as the Heliodisplay are seldom adopted in the mainstream for lack of practical applications; Dyner, however, says the technology behind the Heliodisplay could lead to dramatic new applications in the future and compares his invention to the Wright brothers' first flight. The Heliodisplay is a 15-inch flat surface with a diametrical slit. Dyner explains that ambient air is drawn into the device and subjected to an unspecified thermodynamic and electric process, then jetted out through the slit, where it serves as the medium on which the computer images appear. A digital projector at the rear of the device illuminates the streaming particle cloud, while a laser-tracking system, also at the rear, registers physical user interaction. Finnish company FogScreen markets a similar type of device that projects images on a cloud of water vapor induced by ultrasonic waves, but that device lacks the interactive component the Heliodisplay boasts.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
- "AI Think, Therefore I Am"
APC (12/03); Braue, David
Virtual agents--autonomous, self-directing computer programs that are social and reactive--are being developed for numerous tasks ranging from the simple to the highly sophisticated, but making them effective requires a delicate balance between psychology and technology. A virtual agent is ineffective if it cannot build trust with the user it is interacting with, and one of the biggest hindrances to establishing trust is an unrealistic appearance, such as jerky motions, unnatural expressions, and poor speech-to-lip synchronization. A U.S. study found that smiling, blinking, and other facial cues make a dramatic difference in users' perceptions of agents; in addition, even visually appealing agents can be less effective if they are too attentive or inaccurate. The results of the study indicate that the best virtual agents are bodiless, and such agents are being employed to collect new data rather than retrieve old data. The behavior of disembodied agents is directed not by their personalities but by set parameters, such as how far and how deep their search for data should extend, and many researchers believe these programs will be well-suited as personal assistants tasked with categorizing, indexing, and presenting information meaningfully. Cutting-edge virtual agents can be found in a joint venture between the U.S. Army and the University of Southern California's Information Sciences Institute, in which peacekeepers are trained to deal with angry or hostile people in war-torn regions by interacting with simulated characters--each imbued with its own personality and emotional expressions--in a virtual setting. Robotic vehicles driven by tireless agents that use cooperation and negotiation tactics to interact with one another are being used at Australian mining sites, notes Hugh Durrant-Whyte of Sydney University's ARC Center of Excellence in Autonomous Systems.
Click Here to View Full Article
- "The End of the Experiment"
CircleID (12/16/03); Palfrey, John
A recent paper by John Palfrey, executive director of the Berkman Center at Harvard, critiques ICANN's effort to cultivate worldwide Internet democracy. Palfrey explains how, since its establishment five years ago, ICANN has experimented with different types of governance of the domain name system that draw on recommendations from the Internet community. "ICANN's experimentation in running a representative and open corporate decision-making process has largely failed," Palfrey states. He notes that this failure has shown itself most clearly in ICANN's retreat from its attempt to allow the direct election of a subset of its board members, and less obviously in the extent to which other attempts to involve the Internet user community in the decision-making process have proven ineffective. Palfrey states that although ICANN does not point the way toward a viable new model for governance of the technical structure of the Internet, the failure of its representation experiment does offer numerous insights into the worldwide challenge of governing the Internet's technical structure. First, Palfrey writes, ICANN's governance structure must be revamped in a manner that moves ICANN away from its partially democratic past and toward a model better suited to the body's limited technical management objective, together with a commitment not to expand that mandate. Second, Palfrey says, ICANN should clarify how users can involve themselves in the decision-making process for overseeing the domain name system, possibly through the Supporting Organization process. Lastly, Palfrey suggests, new ways to govern the Internet's technical architecture need to be discovered.
Click Here to View Full Article
- "Feds Look at the Big Computer Picture"
Government Computer News (12/15/03) Vol. 22, No. 34, P. 33; Daukantas, Patricia
Federal policy makers are grappling with future supercomputer development after receiving a wake-up call in the form of Japan's massive Earth Simulator, built by NEC. That system can process complex calculations nearly three times faster than the second-fastest supercomputer, the ASCI Q machine at Los Alamos National Laboratory. The U.S. federal government still funds or owns eight of the top 10 supercomputers on the semi-annual Top 500 list. Energy Secretary Spencer Abraham says his agency's 20-year research and development blueprint puts a strategic emphasis on supercomputing, which is critical to modern science and technology. The Energy Department ranks its UltraScale Scientific Computing Capability program at No. 2 on its list of 28 long-term strategic programs, and plans to partner with U.S. IT vendors to create new supercomputers focused on complex scientific work. Many of the current machines on the Top 500 list are reconfigured transaction-processing computers and not designed specifically for scientific calculation. Developing supercomputing research programs requires careful consideration because of the diverse demands on policy makers, says Georgia Institute of Technology associate professor of public policy Juan D. Rogers; he says efforts should be focused on real-world scientific applications rather than on substitute performance benchmarks. Abraham says the Energy Department's renewed supercomputing push will also expand the computing capability it makes available for unclassified research by a factor of 100, and other IT projects the department is pursuing include the high-speed Energy Sciences Network and the National Energy Research Scientific Computing Center in Berkeley, Calif.
Click Here to View Full Article
- "Instant Manufacturing"
Technology Review (11/03) Vol. 106, No. 9, P. 56; Amato, Ivan
Direct manufacturing, in which products are custom-made from digital files, can accelerate production schedules, reduce or eliminate excess inventory, and save hours of human labor while delivering products of superior quality and precision. Such systems essentially print out digital product designs in three dimensions, simultaneously fabricating materials from raw ingredients and configuring them into the desired shape. The groundwork for direct manufacturing was laid by rapid prototyping technology created to eliminate the labor involved in manually sculpting prototypes, and to reduce the time spent polishing designs and correcting problems; the earliest rapid prototyping systems employed stereolithography, in which a laser hardened liquid polymer in successive layers, while later advances included laser-powered fusion of powders and printheads that doused powders with binding liquids. Products made through direct manufacturing are typically customized and high-cost: Examples include hearing aids, jet parts, medical implants and prosthetics, and small gears. The advantages of the technique are illustrated in the fabrication of bone substitute. In addition to having the porosity to support real bone-producing cells, artificial bone produced by direct manufacturing can use images of other bones in the patient's body as templates; for example, fragments needed to reconstruct a damaged arm bone can be extracted from an image of the person's other arm bone. Meanwhile, the U.S. Army is adding mobility to direct manufacturing by devising movable units that can build replacement parts for weapons and vehicles deployed on the battlefield. In addition, researchers at the University of California, Berkeley, are looking into the direct manufacturing of robotics and electronics that feature moving parts. David A. Bourne of Carnegie Mellon University's Robotics Institute believes the advantages of direct manufacturing will eventually make the method omnipresent.
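As a toy illustration of the layer-by-layer principle these machines share, the Python sketch below slices a simple solid (a sphere) into the horizontal cross-sections a deposition head would build up one at a time; the shape, units, and layer thickness are arbitrary assumptions rather than any vendor's actual process.

    import math

    # Toy layered-fabrication example: the circular cross-section to deposit
    # at each height while building a sphere from the bottom up.
    RADIUS = 10.0        # mm, arbitrary
    LAYER_HEIGHT = 1.0   # mm per layer, arbitrary

    z = -RADIUS
    while z <= RADIUS:
        r = math.sqrt(max(RADIUS ** 2 - z ** 2, 0.0))   # cross-section radius at height z
        print(f"layer at z = {z:5.1f} mm: deposit disc of radius {r:5.2f} mm")
        z += LAYER_HEIGHT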
Click Here to View Full Article
(Access to full article available to paid subscribers only.)
- "Programming Matter: A Possible Future"
Futurist (12/03) Vol. 37, No. 6, P. 17; Cristol, Hope
Science writer Wil McCarthy describes the early stages of the development of programmable matter in his book "Hacking Matter." Solid-state physicists at IBM, Sun, MIT, and the Defense Department are exploring the technology, which involves manipulating the electrons of an object to completely transform it into something else within seconds. Applying electrical signals to microscopic devices called quantum dots theoretically allows researchers to make the dots mimic the properties of different atoms, which would transform the entire object. McCarthy says wellstone is a form of programmable matter composed of thin silicon crisscrossing a translucent lattice of quantum dots. "It's feather-light, wholly rust-proof, and changeable at the flick of a bit into zinc, rubidium, or even otherwise-impossible substances like impervium, the toughest super-reflector known," McCarthy says of wellstone iron. The technology would do wonders for the rock, paper, scissors game, or could be used to change the exterior color of a home according to the temperature, which would help conserve energy. McCarthy envisions programmable matter becoming a common technology by the twenty-second century.