Volume 5, Issue 530: Friday, August 8, 2003
- "Mapping Technology Speeds Help to Fire-Scarred Land"
New York Times (08/07/03) P. E8; Eisenberg, Anne
Wildfires not only threaten forests, but can lead to soil erosion that results in flooding and water contamination. Evaluating the risk of erosion in order to protect property and water supplies is an arduous procedure that assessment teams have only a limited amount of time to complete; however, software written by Chris S. Renschler of the University at Buffalo can expedite the process. The GeoWepp software interface connects people to a massive database of free geographic information collated by the government: Users can access data relevant to their specific region and plug it into a program that anticipates watershed damage. Models are generated quickly because GeoWepp can calculate slopes and other factors automatically. GeoWepp can also extract and map networks of erosion channels that could result from burning, a feature that makes the interface fast and attractive to users. Renschler says a major challenge was connecting the geographic database to the watershed model, because the model was incompatible with geographic information systems (GIS) technology. His solution was to retrofit the model to handle GIS, a national effort that involved about 30 researchers. Renschler boasts that, thanks to GeoWepp, the only fieldwork assessors need to do now is study the vegetation cover to determine burn severity.
(Access to this site is free; however, first-time visitors must register.)
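To illustrate what "calculating slopes automatically" from government elevation data can look like, here is a minimal, hypothetical sketch of deriving slope from a digital elevation model (DEM) with central differences. The grid values and the 30-meter cell size are invented sample data, and this is a generic GIS technique, not GeoWepp's actual code.

```python
# Illustrative only: a central-difference slope calculation of the kind a
# GIS tool might run over a digital elevation model (DEM).

def slope_percent(dem, cell_size):
    """Return a grid of slope values (rise/run * 100) for interior cells."""
    rows, cols = len(dem), len(dem[0])
    slopes = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            # Central differences in x and y (elevation change per meter).
            dz_dx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cell_size)
            dz_dy = (dem[r + 1][c] - dem[r - 1][c]) / (2 * cell_size)
            slopes[r][c] = (dz_dx ** 2 + dz_dy ** 2) ** 0.5 * 100
    return slopes

dem = [  # made-up elevations in meters
    [100, 101, 103],
    [101, 102, 104],
    [102, 103, 105],
]
grid = slope_percent(dem, cell_size=30)
print(round(grid[1][1], 2))
```

In a real GIS package the same arithmetic runs over millions of cells pulled from elevation databases like the free government data the article describes.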
- "Warning Lights Flash for Transport Planners"
Financial Times-IT Review (08/06/03) P. 1; Griffiths, John
The technology to build fully automated cars exists, but lawmakers have yet to set the safety standards and legislative foundation that will make such a vision a reality. Meanwhile, transport planners are wrestling with the impact self-driving cars will have on public transportation--especially once telematics systems allow passengers to travel in greater comfort and safety than on a bus or train. Bernd Bohr at German systems supplier Robert Bosch does not foresee fully automated driving within the next two decades, but he is confident that bureaucracy will not stand in the way of individual technologies. The only possible legislative difficulty he sees at this point is establishing who is responsible for accidents involving self-driving cars. The development of intelligent cars represents a tremendous opportunity for vehicle manufacturers--UBS Warburg estimates that the value of IT-related in-vehicle systems will top $30 billion by 2008. However, the job of developing hardware for telematics systems is outsourced to companies like Robert Bosch, and car makers cannot afford to let telematics suppliers reap the bulk of the value added, which places stringent pricing pressure on those suppliers. A key point of frustration for both car manufacturers and their IT collaborators is determining which systems and services consumers desire most. An Accenture study calculates that 71 percent of consumers would be very interested in owning a car outfitted with telematics, yet consumers have not exactly been scrambling to take advantage of offerings such as General Motors' OnStar service.
(Access for paid subscribers only.)
- "Advocates Form Open-Source Trade Group"
CNet (08/06/03); Bowman, Lisa M.
Although the Open Source And Industry Alliance (OSAIA) has not yet been officially inaugurated, its members are drumming up support for its mission, which is to promote the advantages of nonproprietary software and battle proprietary software makers' attempts to suppress Linux and other open-source products. The OSAIA already has open-source guru Bruce Perens, former Slashdot editor Chris DiBona, and Center for Open Source and Government director Tony Stanco in its corner, and the group recently invited LinuxWorld attendees to participate. Other OSAIA supporters include Sleepycat Software, MySQL, and Damage Studios. The alliance is being launched with the help of the Computer and Communications Industry Association (CCIA); CCIA CEO Ed Black insists that the new group "is not designed to be anti-Microsoft," although his association has sharply criticized Microsoft in the past. Key to the OSAIA's success will be soliciting the advice of experts familiar with Washington politics, an asset that past attempts to start an open-source lobbying group lacked. Black says the OSAIA will track the actions of SCO Group, which filed a lawsuit against IBM for allegedly including copyrighted Unix code in Linux. He adds that the alliance will keep a close watch on intellectual property laws and international treaties, and study the procurement codes of various organizations and governments to ensure that they do not lock out open-source software. The OSAIA hopes to draw its members from four segments--the Linux operating system's original developers, pure Linux companies like Red Hat or SuSE, open-source software buyers, and large companies that value open source as a strategic advantage.
- "Object Technology Conference Takes on Open Source Software"
AScribe Newswire (08/06/03)
The ACM Special Interest Group on Programming Languages (ACM SIGPLAN) is addressing open-source software at its annual object technology conference on October 28 and 29 in Anaheim, CA. Stanford University law professor Lawrence Lessig will define intellectual property law boundaries with regard to open-source software in a keynote address. Open-source activist Tim O'Reilly is the other keynote speaker and will talk about open-source software in the context of network computing. ACM SIGPLAN hosts the Object-Oriented Programming, Languages, and Applications (OOPSLA) conference every year, providing a forum for researchers, field practitioners, and students to share insights on object technology. OOPSLA will again host the ACM SIGPLAN Student Research Competition, where students have a chance to interact with experts. Besides a schedule of workshops, technical presentations, and tutorials, special talks will be given by Dr. Erich Gamma of IBM's Eclipse project, Motorola's Gerald Labedz on Augmented Reality, and Sun Microsystems' Dr. David Ungar on Programming Language Design.
For more information, or to register for OOPSLA, visit http://oopsla.acm.org/oopsla2003.
- "XML: Extremely Critical or Exhaustingly Complex?"
ZDNet UK (08/05/03); Donoghue, Andrew
Despite its tremendous popularity, XML deployment has grown more controversial as the technology proliferates, since there is no central authority governing XML standards. Gartner research director Charles Abrams says the XML meta-language is as important as the World Wide Web or client/server computing were. However, the spread of XML standards--with no one keeping track of exactly how many, Abrams says there are hundreds--limits the extent to which companies should commit business-critical operations to the emergent software technology. A 2002 Gartner report warned against rapid, extensive XML deployment, stating that incongruous standards could lead to wasted effort or even compromised business-critical transactions. The World Wide Web Consortium (W3C), which vetted XML, does not govern how companies use it, leaving even single companies to craft their own sets of XML tags. Significantly, however, several large XML efforts have involved major standards bodies, including the ebXML standard developed by the Organization for the Advancement of Structured Information Standards (OASIS) and a U.N.-related agency. McKinsey & Company has been similarly circumspect in its support of XML, advising businesses to extend existing EDI infrastructures with XML rather than replace them wholesale. By wrapping messages in the Simple Object Access Protocol (SOAP) data exchange standard, Abrams says, companies can turn hybrid XML and EDI exchanges into basic Web services components. Abrams notes that XML development will mirror previous important technology rollouts, in which firms initially overestimate a technology's usefulness and later underestimate it.
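The EDI-plus-SOAP hybrid Abrams describes can be sketched as follows: a legacy EDI message is carried inside a SOAP envelope so that existing Web services tooling can route it. Only the SOAP 1.1 envelope namespace below is standard; the `EDIPayload` wrapper element and the sample EDI segments are invented for illustration.

```python
# A hedged sketch of wrapping a legacy EDI message in a SOAP 1.1 envelope.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
payload = ET.SubElement(body, "EDIPayload")              # hypothetical wrapper element
payload.text = "ST*850*0001~BEG*00*NE*4520*20030808~"    # sample EDI purchase-order segments

message = ET.tostring(envelope, encoding="unicode")
print(message)
```

The receiving system can unwrap the body and hand the EDI text to the existing EDI infrastructure untouched, which is the "extend rather than replace" approach McKinsey recommends.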
- "Reasonable Computers"
ABCNews.com (08/05/03); Eng, Paul
The Defense Advanced Research Projects Agency's (DARPA) Perceptive Assistant that Learns (PAL) program is an initiative to develop cognitive computer systems that can automatically perform many of the routine tasks that decision-makers are currently burdened with--chores such as answering email, scheduling meetings, and furnishing reports. Decision-makers' efficiency would be enhanced through the deployment of such digital assistants, which would be programmed to adapt to their users' needs via software that mimics the way people think and learn. DARPA has invested $22 million in SRI International's Cognitive Agent that Learns and Observes (CALO) program, a PAL-related project that seeks to mesh various expert software and technology developed by the military into a cognitive system that manages the many tasks and data typical of military decision-making. DARPA has thus far earmarked $7 million for Carnegie Mellon University's Reflective Agents with Distributed Adaptive Reasoning (RADAR) project. The idea behind RADAR, like CALO, is the consolidation of expert systems into a whole whose components can interact with each other. CMU researcher Scott Fahlman illustrates this concept by projecting that RADAR's email element would be trained to notify the scheduling component when it recognizes key phrases and associated times; the scheduler may then communicate with the email sender's agent to set up the most convenient meeting time. Both SRI and CMU researchers believe that smarter, more adaptive systems will be the inevitable result of increasing computing power.
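The email-to-scheduler hand-off described above can be sketched crudely: a component spots a meeting request and its associated time in an email, then passes that time along. RADAR's actual components learn such patterns rather than hard-coding them; the regular expression and sample phrasing here are invented for illustration.

```python
# Illustrative only: a toy version of an email agent spotting a meeting
# request so it can notify a scheduling component.
import re

# Matches "meet"/"meeting" followed by "at <time>", e.g. "at 3pm" or "at 10:30am".
MEETING_PATTERN = re.compile(
    r"\bmeet(?:ing)?\b.*?\bat\s+(\d{1,2}(?::\d{2})?\s*[ap]m)",
    re.IGNORECASE,
)

def extract_meeting_time(email_body):
    """Return the proposed meeting time string, or None if no request is found."""
    match = MEETING_PATTERN.search(email_body)
    return match.group(1) if match else None

print(extract_meeting_time("Can we meet on Thursday at 3pm to go over the report?"))
```

A real system would then negotiate with the sender's own agent over candidate slots, as the article projects.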
- "Combing Through the Tech-Job Haystack"
NewsFactor Network (08/05/03); Hill, Kimberly
Challenger, Gray & Christmas lists technology as one of the few job sectors starting to show signs of recovery, but a surfeit of job candidates and a shift in employer attitudes require job-seekers to revise their strategy. Herb Rozoff of Challenger says that successful candidates must emphasize their work for past employers as a demonstration of their value, and his firm advises that an employee's personal objectives should no longer be included in resumes. Deloitte & Touche's Maria Grant recommends that candidates clearly communicate their skills and past accomplishments in both resumes and interviews, while job-hopping is no longer a valid tactic for updating one's skills or certifications. Grant adds that employers place greater value on competency than on industry experience, which makes a highly detailed resume even more advantageous. Challenger reports a decline in job search times and a drop in industry-switching by job-seekers, while tech industries are eliminating fewer positions in the wake of the damage caused by mass layoffs; the overall result of these trends, according to Rozoff, is that technical employees have less difficulty securing a job in the industry of their choice. Grant explains that her clients depend less on newspapers and more on word-of-mouth to attract prospective employees, and use an array of Web sites to compile resumes. She also advises technical job-seekers to participate more in professional organizations, and notes that temporary employment gives candidates the opportunity to network as well as test the waters with prospective employers. IBM's Ray Schreyer thinks job-seekers would do well to visit corporate Web sites, which often list up-to-date, highly comprehensive vacancy announcements.
- "How Robots Will Steal Your Job"
Wired News (08/05/03); Glasner, Joanna
Over 50 percent of Americans could be replaced by robot labor by the middle of the 21st century, according to futurist Marshall Brain in his essay, "Robotic Nation." Brain projects that machines will be handling about 5 million retail jobs by 2015, while humanoid robots will be widely available and employed as housecleaners, fast-food servers, and sales staff by 2030. In another essay, Brain focuses on how electronic food-ordering kiosks at fast-food restaurants could pave the way for automated cooks. The essay is one in a series designed to complement a novel Brain will publish online, detailing a future society where robots comprise the bulk of the workforce. Many technical experts responded to Brain's predictions with skepticism, as demonstrated by postings on Slashdot. One poster argued that the wide deployment of humanoid robots is predicated on people buying them, which would be impossible if unemployment were widespread. Another poster noted that a certain degree of worry often accompanies the introduction of new technologies, but added that people are able to find new forms of employment to replace those lost to automation. Brain disputes this assertion, using robot repair as an example: He foresees the fixing of broken robots--which some expect to be a human task--being completely managed by machine labor as well.
- "Smart Rooms"
Computerworld (08/04/03) Vol. 31, No. 37, P. 29; Anthes, Gary H.
Carnegie Mellon University's "Barn" is a prototype conference room capable of recording everything that happens during a meeting through an array of microphones, cameras, projectors, and other equipment. Faculty advisor Asim Smailagic says the Barn was designed for meetings that aim to flesh out designs. "It's for brainstorming, idea generation, knowledge generation and knowledge transfer," he notes. Conference participants register their presence by donning radio-frequency identification tags, while wearable sensors allow the Barn to confirm their identity and constantly track their location; "social geometry" is used to adjust lighting and microphones according to attendees' physical position. A key component of the meeting area is a digital whiteboard outfitted with an intelligent interactive display, or "Thinking Surface," where concepts can be projected and updated via PC connections. Major decisions or brainstorms are flagged in meeting logs when someone pushes a "that was important" (TWI) button on his computer. TWI markers are useful for people who miss meetings and need to be brought up to speed quickly. Director of CMU's Human-Computer Interaction Institute Dan Siewiorek says future Barn research will focus on avoiding contradictory decisions among semi-independent subgroups within large project teams--and the headache of resolving those problems later on. By deploying keyword recognition systems throughout the conference room, liaisons in each subgroup could quickly audit the other groups' meetings remotely. Siewiorek boasts that one of the standout characteristics of CMU researchers is their dedication to building technology around human issues, rather than vice versa.
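The keyword-recognition auditing idea can be sketched in miniature: scan a meeting record for watch-list terms and surface only the matching moments to a liaison from another subgroup. The transcript lines and keywords below are invented, and CMU's system works on live speech rather than text; this is just the filtering logic.

```python
# A toy sketch of keyword-based meeting auditing: flag utterances that
# mention terms another subgroup cares about.

WATCHLIST = {"interface", "deadline", "schema"}  # hypothetical watch-list terms

def flag_utterances(transcript):
    """Return (timestamp, text) pairs whose text mentions a watch-list term."""
    flagged = []
    for timestamp, text in transcript:
        words = {w.strip(".,?!").lower() for w in text.split()}
        if words & WATCHLIST:  # any overlap with the watch-list
            flagged.append((timestamp, text))
    return flagged

meeting = [
    ("10:02", "Let's review the sensor placement."),
    ("10:07", "We decided to change the logging schema."),
    ("10:15", "Lunch orders are due."),
]
for ts, text in flag_utterances(meeting):
    print(ts, text)
```

A liaison reviewing this log would see only the 10:07 decision, which is the kind of rapid remote audit Siewiorek describes.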
- "CC Product Evaluation Picks Up Steam"
Network World (08/04/03) Vol. 20, No. 31, P. 8; Messmer, Ellen
The Common Criteria accreditation for software and hardware is gaining support in the U.S. government and from vendors, who are beginning to place stress on the system by submitting more products for evaluation. The United States, along with 14 other nations, backs Common Criteria, which was first introduced as a National Security Agency (NSA) mandate for classified government systems. The Defense Department has since applied Common Criteria to all its systems while lowering the bar--vendors now need only commit to putting their products through evaluation, since the process can take up to a year and usually costs hundreds of thousands of dollars. This year, Oracle and Red Hat are aiming to get Linux an Evaluation Assurance Level 4 (EAL4) rating, which is considered sufficient for sensitive systems. Security is the main purpose of Common Criteria, according to IBM technical strategy director Ken King, whose company is working to get SuSE Linux through EAL2 testing this year. Common Criteria in the United States is run by the National Information Assurance Partnership (NIAP), a joint effort of the NSA and the National Institute of Standards and Technology; the NIAP has worked to accredit about half of the 93 vetted products at seven U.S. labs. There are more than two dozen other labs operating globally, including private laboratories contracted to do Common Criteria testing. All participant countries agree to accept Common Criteria evaluations done under each other's jurisdictions. The banking industry is also interested in the Common Criteria process, according to BITS senior director Laura Lundin, who works under the aegis of the Financial Services Roundtable.
- "Israeli High Tech Targets U.S. Security Market"
Reuters (08/04/03); Ackerman, Gwen
Israel expects to parlay its defense expertise into more business opportunities in the United States: With the U.S. government starting to focus more on guarding against terrorism, Israel's technology sector views the market as a potential source of economic growth after the collapse of the global technology market in 2001. Israel has its own "Silicon Valley" of startup software and telecommunications companies that conduct business worldwide, and officials see the security market as a chance to return the country's technology industry to prominence. One startup near Tel Aviv, DBS Watchdog Alarm Systems, has produced software that relies on biosensors and digital signal processing analysis to "translate" the bark of a dog, and the Nahal Sorek nuclear research center south of Tel Aviv has developed a laser that can pinpoint explosives; another startup, Actimize, monitors the clustering of cell phones in various cities around the world to detect suspicious activity. Homeland Security Research in California projects that at least $130 billion will be spent by the United States on homeland security by 2010, but Israeli companies will have to compete with U.S. companies for government contracts. The Israeli government has encouraged companies to focus more on security by launching a grant program.
- "Educators Turn to Games for Help"
Wired News (08/02/03); King, Brad
Academics are hoping to employ the software that powers popular video games to enhance the learning experience and allow students to apply classroom lessons to simulated environments. The Digital Media Collaboratory at the University of Texas at Austin has teamed up with the public and private sectors to bring educational video games to schools, governments, and businesses. "We have seen the power of using the right kind of tools--in this case, games--in the right types of situations, to create real change," notes Alex Cavalli of the IC2 Institute, where the Digital Media Collaboratory resides. Using a $1 million grant from Microsoft, MIT has launched the Games-to-Teach Project, an initiative that has yielded playable prototypes for four simulation programs; one game, Revolution, was built using design tools that the makers of the popular commercial game Neverwinter Nights freely distributed to the public. Revolution participants can opt to fight on different sides of the Revolutionary War and then contend with the results. MIT and Carnegie Mellon University's Entertainment Technology Center co-developed Biohazard, a training simulation for emergency workers in which teams of players must set up new lines of communication in the midst of a toxic disaster. Leaving the development of educational games to the games industry has not panned out--the Entertainment Software Association reports that educational products only account for 7 percent of the software market for console games. Furthermore, MIT professor Henry Jenkins, who helped kick off the Games-to-Teach Project, notes that "Games teach systematic things much better than they teach facts." University of Wisconsin at Madison professor James Paul Gee believes that educational video games will be most useful for the post-high school environment.
- "Light on the Horizon"
Economist (08/02/03) Vol. 368, No. 8335, P. 66
Commercial, three-dimensional holographic data storage has long been an elusive dream of academic and industrial researchers, but a holographic memory is expected to hit the market in 2004. In theory, holographic methods could significantly expand data storage capacity and accelerate data retrieval speeds far beyond those of current storage technologies, and dramatically boost the accuracy of data-mining and other search functions. The development of holographic memory has been hampered by difficulties in finding the best recording medium, which must be a cheap, stable photosensitive material. Dr. Hans Coufal of IBM says his company has made strides with the development of several lithium niobate-based holographic "test platforms" capable of storing as much as 390 bits per square micron, while encoding and signal-processing algorithms can eliminate the problem of cross-talk noise. Photosensitive polymers are another material under development, although the chemical shifts that accompany data recording cause volume changes within and between the molecules, distorting the holograms. Aprilis has reportedly circumvented this problem with an epoxy-modified silicone, and the company plans to roll out a 200 GB drive with a 75 Mbps transfer rate within two years. Meanwhile, next year may see the debut of holonide-coated 500 GB disks from Polight Technologies, and In-Phase Technologies is working on a polymer-based 100 GB Tapestry disk that avoids distortion by using a substance whose photosensitivity and cohesion depend on separate chemical processes. Further hurdles to holographic storage include precisely integrating lasers, detectors, and spatial light modulators into an inexpensive system, but mass production of holographic memories is within reach thanks to the availability of cheap opto-electronic components.
- "Sharing the Code"
Chronicle of Higher Education (08/01/03) Vol. 49, No. 47, P. 47; Olsen, Florence
The success of the established open-source projects Linux and Apache and their frequent use in higher education has given rise to other open-source projects intended for use by colleges and universities. One example is uPortal, which can be used to build campus-wide Web portals; another is Shibboleth, which enables colleges to control outside access to Web-posted information. Administrators still tend to be wary of adopting such open-source projects, preferring to deal with companies that they can hold responsible for poor performance; the downside is that commercial software companies can raise prices without warning or stop supporting products entirely. Open-source projects, by contrast, appeal both for their low cost and because the source code is available for programmers to study and adapt. Goshen College director of information technology Michael Sherer is one example of a college administrator who has embraced open-source projects ranging from Linux to the network-directory application OpenLDAP and the file and print sharing software Samba. Many university officials are finding that collaboration between colleges is surprisingly plentiful, as is the free technical help that open-source uPortal users provide to one another. However, Mark Olson of the National Association of College and University Business Officers argues that open-source projects oriented toward higher education should get software companies involved, an approach that has succeeded for uPortal, which is available in commercial versions via Unicon Inc. and SCT. Companies such as IBM have come to see the value of the open-source collaborative model in developing and licensing software. "Our judgment is that many of the open-source technologies are extremely good, and they're cost-effective," says Alfred Spector, IBM's vice president of services and software research.
- "Future Results Not Guaranteed"
CIO (07/15/03) Vol. 16, No. 19, P. 46; Worthen, Ben
To rely exclusively on computerized demand forecasting is a serious error, despite the continued endorsement of demand forecasting software by vendors and academics: The best strategy is to couple demand forecasts with human intelligence and judgment. Forecasting software that uses historical data cannot predict major changes in the marketplace, and the intricacy of today's supply chains translates into less accurate information. Companies can get more accurate forecasts by leveraging up-to-date sales data and point-of-sale (POS) information that comes directly from retailers, while skilled staff and sound procedures are needed to explain anomalies and reconcile projections with actual market observations. Ninety percent of existing demand planning software conducts classic regression analysis, but as such systems' computing power grows, so do the number of variables and corresponding data points they can consider--and most of these data points are inaccurate. Most companies do not have the option of getting POS data directly from customers because few firms collect such data, and many consider it critical to their competitive advantage and therefore secret. Power converter manufacturer Vicor has successfully married computerized demand forecasting and the human element: Separate demand forecasts are made by the sales department and the computer system, which check and balance each other. Another mistake companies make is not involving the executive level in the forecasting process. Executives must be there to see how the computer-generated forecast lines up with the human-generated forecast in order to make a mutually agreeable final decision.
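The "classic regression analysis" most demand-planning packages run can be sketched in its simplest form: fit a least-squares linear trend to historical sales and extrapolate one period ahead. The monthly sales figures below are invented sample data; real systems add seasonality, promotions, and many more variables, which is exactly why their projections still need human review.

```python
# A minimal sketch of trend-based demand forecasting via least squares.

def linear_forecast(history):
    """Fit a least-squares line to the series and return the next-period forecast."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # extrapolate one step past the data

sales = [120, 132, 128, 141, 150, 149]  # six months of hypothetical unit sales
print(round(linear_forecast(sales), 1))
```

Note what the sketch cannot do: a market shift in month seven is invisible to it, since the fit only ever sees the past trend.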
Government Technology (07/03) Vol. 16, No. 9, P. 50; Brown, Justine
Recycling lead-bearing cathode ray tubes (CRTs) is a difficult process whose most common solution causes air contamination, but researchers at the New Jersey Institute of Technology (NJIT) have developed a fast, easy CRT disassembly method that eliminates additional pollution. The method employs high-powered, dual-nozzle water jets that cleanly separate the faceplates from the funnels as the CRTs move down a conveyor belt. The process can be completed in about 30 seconds, compared to the average time of five minutes typical of traditional water saw or hot wire techniques. The New Jersey Commission on Science and Technology was the primary backer of the NJIT project, and associate director of Science Development Jennifer Posda explains that the initiative "gives us a chance to help develop technology that can be used immediately and put money back into the economy." She adds that the technology is being set up in China so that the country's stockpile of obsolete computers can be more safely disposed of. Reggie Caudill of NJIT's Multi-Lifecycle Engineering Research Center Program notes that wide-scale deployment of the institute's solution could mitigate the fight between state governments and technology companies over who must pick up the tab for electronics recycling. "If we can provide a better method of cheap, mass recycling, it could help solve all of those problems in a fairly simple way," he claims. A recent report from the Silicon Valley Toxics Coalition estimates that 500 million computers containing an aggregate 1.58 billion pounds of lead will become obsolete between 1997 and 2007. Many state and local governments have responded to this report by approving laws that limit the number of old computers that can be dumped in landfills.
- "Is the Pen Mightier?"
CIO Insight (07/03) Vol. 1, No. 28, P. 67; Bolles, Gary A.
Tablet PCs promise to boost worker productivity and support more flexible collaboration by capturing data and graphic information more efficiently, but there have been few major white-collar tablet PC implementations thus far, and many companies lack an efficient strategy for managing data once it is captured and stored. Tablet PCs are optimal for enterprises that already employ mobile computers that follow a similar design paradigm, such as pharmaceutical companies, retail stock management services, and warehouse inventory management; tablets may also be favored by companies that find personal digital assistants inadequate in terms of screen size or data storage capacity. In the end, however, "it really comes down to what the workflow is," observes John Keane of ArcStream Solutions. Experts such as Gartner VP Ken Dulaney consider the current crop of tablet PC products to be first-generation, which leaves plenty of room for improvement. The latest models owe a lot to progress in flat-panel display, microprocessor, disk storage, and wireless networking technologies. There are no solid forecasts as to where further tablet PC growth will occur: Acer America's Sumit Agnihotry says his company expects tablet PCs to further penetrate the mainstream in the last six months of 2003, but he acknowledges that most current tablet buyers are existing tablet customers. Additional barriers to adoption include the high cost of tablet PCs and their potential support costs, if off-the-shelf products do not fulfill corporate needs. The best approach is to test commercial tablet PC hardware and software by selecting the IT staff most likely to benefit from the technology as a test group; the incorporation of Wi-Fi into the networking infrastructure is also a plus.
- "Flash of Brilliance or Flash In the Pan?"
IEEE Spectrum (07/03) Vol. 40, No. 7, P. 18; Cass, Stephen
Researchers at the California Institute of Technology (Pasadena) and the University of California at Santa Barbara have a made a discovery that could potentially increase hard disk capacity by more than a thousandfold. According to the scientists, thin films of gallium manganese arsenide exhibit what they term the giant planar Hall effect: When a magnetic field is applied in the plane of a current passing through such a highly sensitive ferromagnetic semiconductor, a voltage is generated transverse to the current. The classic Hall effect, in contrast, involves a transverse voltage generated when the field is applied perpendicular to the current. Although the cause of the effect has not been nailed down, UC Santa Barbara researcher David Awschalom thinks it is related to the fundamental magnetic characteristics of the manganese atoms in the film. Michael Roukes of Caltech maintains that this breakthrough could support the development of a new category of powerful sensors, but he notes that "we don't yet have good semiconductor ferromagnets working at room temperature." The effect only appears in gallium manganese arsenide films when the temperature drops below 40 K, though Roukes adds that all semiconductor ferromagnets could theoretically exhibit the effect. Even if the giant planar Hall effect does not dramatically advance hard disk technology, the researchers are confident that the discovery is a significant step in the development of spintronic devices.