HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 546: Wednesday, September 17, 2003

  • "Software Quality Is Still a Work in Progress, Offshore and in the U.S."
    Computerworld (09/15/03); Willoughby, Mark

    Overseas software developers--in India especially--are adopting software quality certification standards such as the Software Engineering Institute's Capability Maturity Model (CMM), which SEI fellow Watts Humphrey says can reduce faults and boost security. He observes that "To do good security, engineers need to be using good design methods, and engineers historically have not done a good job of design." Oracle chief security officer Mary Ann Davidson says that as software development globalizes, uniform global development processes must follow. "No matter where you build the software, you must have a culture of security with good internal processes," she declares. Humphrey notes that the topmost CMM certification level, CMM 5, does not enable code modules to be traced to individual programmers, but the SEI plans to add that traceability through the Team Software Process (TSP), which is being intensively deployed by Indian software firms and a few American developers. Humphrey says that CMM 1-rated organizations typically average 7.5 software defects per 1,000 lines of code; CMM 5 organizations usually have only one defect per 1,000 lines; and TSP-rated organizations average just 0.06 defects per 1,000 lines. Steve Lipner, Microsoft's security engineering strategy director, says that buffer overruns remain a persistent security problem: new overruns keep being discovered even though Microsoft has built automated tools to detect and block them. Microsoft is also applying Common Criteria testing to its Windows operating system, though Shawn Hernan of the CERT Coordination Center cautions that such evaluation does not by itself solve software quality and security problems.
    Click Here to View Full Article
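    The defect-rate figures above invite a quick back-of-the-envelope check. A minimal sketch, assuming a hypothetical 500,000-line codebase (the codebase size is an invented example, not from the article):

```python
# Illustrative arithmetic only: the defect rates cited above, applied to a
# hypothetical 500,000-line codebase (rates are per 1,000 lines of code).
DEFECTS_PER_KLOC = {"CMM 1": 7.5, "CMM 5": 1.0, "TSP": 0.06}

def expected_defects(lines_of_code: int, rate_per_kloc: float) -> float:
    """Defects expected at a given rate per 1,000 lines of code."""
    return lines_of_code / 1000 * rate_per_kloc

for level, rate in DEFECTS_PER_KLOC.items():
    print(f"{level}: {expected_defects(500_000, rate):g} expected defects")
```

    At this scale, the gap between maturity levels is stark: roughly 3,750 expected defects at CMM 1 versus about 30 under TSP.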

  • "Microsoft's Setback in Patent Case Ripples Through World Wide Web"
    New York Times (09/17/03) P. C1; Lohr, Steve

    A federal court's August decision that Microsoft should pay $521 million in damages to former University of California researcher Michael Doyle for allegedly infringing on patented technology he developed is resonating throughout the software industry. The suit centers on the ActiveX technology that allows a browser to automatically call up programs over the Internet, a key ingredient both in Microsoft's Internet Explorer browsing software and in the hypertext markup language standards approved by the World Wide Web Consortium. Microsoft has notified the consortium and a small group of software companies that it will likely alter its Internet browser, which 90 percent of computer users employ to access the Web, in order to comply with the court's ruling--a potentially expensive and unsettling development for other Internet software firms and major commercial Web sites. Microsoft has suggested several ways to change the browsing software, such as having PC users check a "click to proceed" box to run multimedia programs in the browser. Executives say companies would need to acclimate themselves to such changes, but the effect on average PC users should be minimal. Daniel Weitzner of the World Wide Web Consortium says the ruling and its potential fallout illustrate the need to keep fundamental Web software royalty-free. Indeed, both Microsoft and IBM will demonstrate their commitment to such a policy at a Sept. 17 event co-hosted by Microsoft chairman Bill Gates and IBM senior VP Steven Mills. Microsoft has announced its intention to appeal the ruling, but a final decision may be a year and a half away.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "The Risks and Threats Start to Stack Up Fast"
    Financial Times-IT Review (09/17/03) P. 7; McKenna, Brian

    Some security experts are concerned that enterprise applications will be the focus of new cyberattacks, especially as corporations implement Web services. "A lot of the new vulnerabilities will open up at the business process level, and most of the Internet community is not even aware of what is going on in that space," warns Hewlett-Packard security research director Martin Sadler, who attributes this trend to network security specialists' lack of focus on the top or application layer of computer networks. He adds, "When new vulnerabilities appear at the top, there isn't an Internet community that can get those things fixed." A security director at a City of London institution also sees combined attacks as a source of growing concern, and cautions that a lack of patching could make corporate networks highly vulnerable to more advanced worms with increasingly malevolent payloads. Mark Stevens of the WatchGuard network security firm believes security vulnerabilities stemming from Web services will hit faster than people expect. He notes that vendors are doing little in terms of trying to detect inconsistencies in the Simple Object Access Protocol used to transfer Web services. Besides Web services, new threats are likely to emerge as network interdependency increases and the time between the disclosure of software flaws and their exploitation by hackers continues to shrink.

  • "In PC Design, Harbingers of Shrink"
    CNet (09/16/03); Frauenheim, Ed; Kanellos, Michael

    Industry insiders predict a 2004 market debut for products built on several technologies designed to shrink notebooks and desktops and to enable more flexible configurations of standard PCs. Such technologies include PCI Express, a new technique for linking PC components and peripherals that could shrink machines' inner mechanics; ExpressCard, a PC expansion card standard that supporters claim will supplant the PCMCIA cards used to add memory or establish network connections; and the high-speed Serial ATA disk drive interface standard, which can reduce the cabling inside PC boxes. All three technologies transfer data serially rather than relying on the parallel data exchange typical of most current computers. PCMCIA Chairman Brad Saunders reports that the PC expansion card's overhaul is being driven by the emergence of PCI Express. His organization says it is simpler for PC manufacturers to adopt ExpressCard than the current CardBus card, because the former requires just 26 pins to connect to a PC while the latter needs 68. The ExpressCard standard, developed by Hewlett-Packard, IBM, Dell, Microsoft, Lexar Media, and others, comes in two sizes: The larger accommodates applications such as compact flash memory and 1.8-inch hard drives, while the smaller is expected to have long-term value for smaller PCs. Saunders thinks the ExpressCard standard will let computer makers try out new designs, and the PCMCIA says desktops could be equipped with the new cards as well; this would enable users to expand their machines without opening them up. PCI-SIG Chairman Tony Pierce says PCI Express could spawn new desktop architectures, one example being separate CPUs and graphics chips so heat can be more easily dissipated around the components.
    Click Here to View Full Article

  • "Semantic Web: Out of the Theory Realm"
    Internetnews.com (09/12/03); Singer, Michael

    Experts working on the Semantic Web say the pieces of the puzzle are coming together, with standards to guide them. World Wide Web Consortium (W3C) Semantic Web activity lead Eric Miller, who heads the project for that organization, says some Semantic Web applications are already in use among bloggers, even though the standards needed for ubiquitous adoption are not due until at least next year. Miller cites blogging tools such as TrackBack, syndication, and author metadata, which he says create computer-generated links between like concepts and people. "It works very much like six degrees of separation," he says. The Semantic Web is about giving Web operations more intelligence, equipping content and applications with metadata so that computers can automatically create recommendations or reuse data. About 20 more standards are needed to tie together the Semantic Web, which is fundamentally based on the Resource Description Framework. Recently, the W3C recommended the Web Ontology Language for the Semantic Web; Web Ontology working group co-chair Jim Hendler says the Semantic Web will run in the background like HTTP, but that the effects of self-networking will be tremendous. He says the Semantic Web will make copyrights easier to apply in the digital realm, since creators can put that information in the metatag. Several major IT vendors have already created Semantic Web tools, such as HP Haystack from Hewlett-Packard, the Global Knowledge Engineering framework from Sun Microsystems, and Semio Tagger from IBM. Miller notes that early adopters of the Semantic Web include the life sciences and bioinformatics communities, as well as companies using automated phone systems.
    Click Here to View Full Article
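    The core idea of machine-chainable metadata can be sketched with the triple model underlying the Resource Description Framework. The vocabulary and data below are invented for illustration; real RDF tooling would use URIs and a standard serialization rather than plain tuples:

```python
# A minimal sketch of the Semantic Web's data model: content described as
# subject-predicate-object triples that software can query and chain.
triples = [
    ("alice_blog", "linksTo", "bob_blog"),
    ("bob_blog", "linksTo", "carol_blog"),
    ("alice_blog", "hasAuthor", "Alice"),
]

def objects(subject: str, predicate: str) -> list:
    """Return every object o such that (subject, predicate, o) is asserted."""
    return [o for s, p, o in triples if s == subject and p == predicate]

def reachable(start: str) -> set:
    """Follow 'linksTo' edges transitively -- the 'six degrees' chaining."""
    seen, frontier = set(), [start]
    while frontier:
        node = frontier.pop()
        for nxt in objects(node, "linksTo"):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

print(reachable("alice_blog"))  # machine-generated links between blogs
```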

  • "Big Plans for Smart Tags Also Bring Concerns"
    Associated Press (09/15/03); Pope, Justin

    This week's Electronic Product Code symposium is meant to mark the official commencement of wide commercial adoption of radio frequency identification (RFID) tags, which would be affixed to crates and used to track products along the supply chain in order to better manage inventory and cut costs. A later development may extend RFID tracking as far as consumer sales, but a new AMR report indicates that some in the industry think the technology is being rushed. Also of concern is the dearth of unified RFID standards, which means a new system must be built for each customer; the emergence of standards is itself generating worry that retailers will have no choice but to share information with rivals in order to tap into the systems of common customers. Privacy proponents fear the technology could be used to track people and their personal data, and Katherine Albrecht of the privacy organization CASPIAN says the group plans to protest the exclusion of citizen and consumer advocates from the symposium. Kevin Ashton of MIT's Auto-ID Center, where the RFID technology was developed, says privacy backers should be included in the debate, and insists that customers would be given the option of deactivating the tags and would control the usage of the information embedded within them. The rush to adopt RFID was instigated this summer by Wal-Mart, which gave its 100 leading suppliers a Jan. 1, 2005 deadline to tag their products.
    Click Here to View Full Article

  • "Australia's Ecommerce Patent Solution Nears"
    ZDNet Australia (09/16/03); Colley, Andrew

    Australia's business and technology community is split over the proliferation of business method patents, especially software-implemented patents, after late-1990s U.S. court rulings percolated to that country. Australia's Advisory Council on Intellectual Property (ACIP) is expected to deliver its recommendations this October, on the heels of a controversial business method patent granted to Canadian firm D.E. Technologies that appears to cover international electronic commercial transactions. As in the United States and Europe, opponents of business method patent proliferation say the practice creates business uncertainty and limits innovation. The Australian Consumers Association and the Australian Computer Society (ACS) both oppose business method patent expansion, while proponents claim intellectual property protection is vital to the software industry and that business method patents should be treated no differently than other patents. As in the United States, which saw a 30-fold increase in business method patents after a U.S. court loosened patent law in 1995, the ACIP reports that business method patent filings in Australia have grown by 1,000 percent. Questions prompting the ACIP review include patent office readiness in terms of both manpower and expertise, the impact on businesses, and the relaxation of validity requirements since new laws passed in April 2002. IT industry lawyer Brendon Scott says the patent office has tarnished its credibility by easing rules on business method patents, and that examiners lack the resources needed to adequately research patent claims. ACS legal committee chair and industry lawyer Philip Argy explains that e-commerce never developed in as orderly a fashion as offline business, and notes that patent examinations need public comment periods, since the Australian and U.S. filing systems are not in sync, which can lead to Australian companies unknowingly infringing on newly filed U.S. patents.
    Click Here to View Full Article

  • "Feds Set Up Cyberfighting Group"
    CNet (09/15/03); Borland, John

    An organization co-founded by the Department of Homeland Security (DHS) and Carnegie Mellon University's CERT Coordination Center aims to spur information sharing among federal agencies, companies, network operators, security researchers, and other parties affected by digital security vulnerabilities, according to a Sept. 15 announcement from the DHS. "The recent cyberattacks--such as the Blaster worm and the Sobig virus--highlight the urgent need for an enhanced computer emergency response program that coordinates national efforts to respond to cyberincidents and attacks," stated DHS Secretary Tom Ridge. The new US-CERT group is designed to monitor, respond to, and forestall such cyberattacks by providing a forum where affected entities can directly discuss and exchange data about cybersecurity. CERT center manager Jeffrey Carpenter notes that most interchange between organizations is informal and ad hoc, but he adds that groups are starting to appreciate the importance of collaboration. "We're much more powerful together than individually," he explains. The announcement of the partnership is meant to notify the security community that the new organization will be approaching critical network monitors and vulnerability trackers--ISPs, government agencies, and the like--to participate. Carpenter says that over the next several months, US-CERT will sign up those national and international collaborators that can offer the best picture of the current state of network security.
    Click Here to View Full Article

  • "Needed: A Security Blanket for the Net"
    Business Week (09/16/03); Salkever, Alex

    This summer's rash of vulnerabilities, viruses, spam, and worms has forced even the most fervent Internet boosters to acknowledge the need for security reforms and to consider dramatic solutions. Though network traffic has grown steadily and network performance remains strong, "things...have gotten somewhat flaky," reports Carnegie Mellon University's David Farber. There is a welter of proposed measures for identifying and authenticating Net users, including combining the IPv6 protocol's ability to tag data that crosses the Internet with DNSSEC, which employs digital certificates stored on Internet-linked machines; however, this solution requires broad adoption of new DNS and complex routing software, and software and training costs could escalate in order to handle the digital certificates and encryption keys. Other cybersecurity proponents want to instill more accountability among software providers by allowing companies and individuals to sue them for damages suffered as a result of faulty software, while certain security experts advocate stricter network-security rules backed by government muscle. There is plenty of blame for poor security to go around, and ISPs and network administrators are not exempt: Many have consistently failed to deploy software patches regularly or to follow fundamental cybersecurity procedures. There is little doubt that consumers as well as businesses will need to be involved in cybersecurity improvement efforts. Some security advocates maintain that broadband providers should take more responsibility for consumer-related risks, even going so far as to refuse customers whose PCs lack security software. Meanwhile, RSA Security's Scott Schell is especially concerned about the security (or lack thereof) of huge databases containing personal data on millions of Americans.
    Click Here to View Full Article

  • "Darpa's Ditziness Dents Budget"
    Wired News (09/16/03); Shachtman, Noah

    A Senate bill cuts $103 million from the requested $169 million 2004 budget of the Information Awareness Office, part of the Defense Advanced Research Projects Agency (Darpa), in response to the questionable value of projects such as the LifeLog and Terrorism Information Awareness (TIA) database programs. But that is just one of the cuts to Darpa's often far-reaching research programs. "Darpa got too much of the wrong kind of publicity, the kind that invites mockery and ridicule, and now the agency is paying the price," notes Steven Aftergood of the Federation of American Scientists, who admits that the budget cuts will probably cause some valuable research to be postponed or abandoned. The reduced allocations would be most injurious for Darpa's Information Awareness Office, while TIA critic Sen. Ron Wyden (D-Ore.) declared in August that he wants to ax the program entirely. One Darpa-funded project in danger of being shelved by the Senate's decision is a program at Duke University's Center for Neuroengineering that focuses on the development of mind-controlled robotic limbs. Darpa granted the lab $5 million annually under its Brain Machine Interfaces initiative. Another project at Columbia University seeks to research the effects of sleep deprivation on people's thoughts and actions in the hopes of one day enabling soldiers and others to function with little or no sleep, but Tim Bouley of the Senate Appropriations Committee says his group cannot see the defense value of such efforts. Darpa's Jan Walker warned in an email, "The proposed congressional cuts to Darpa's biology programs will cripple Darpa's strategic thrust in the life sciences and seriously impede Darpa's mission to prevent technological surprise." She added that this will boost the likelihood that America's military will lag behind that of its enemies in terms of capability, strength, and safety.
    Click Here to View Full Article

  • "Internet Worms: Worst Is Yet to Come?"
    NewsFactor Network (09/16/03); Ryan, Vincent

    Computer security researchers say Internet worms are becoming more complex and could cause much more harm than previous versions. Internet Security Systems' X-Force research engineer Neel Mehta notes that the Blaster worm, for example, had to connect through a port that is usually blocked, which hindered its ability to spread as fast as possible. Nimda, by contrast, was a more complex worm in that it targeted internal networks and the pieces of local networks that were easier to break into and live in. Zone Labs' Fred Felman agrees that worms could do more to penetrate systems by exploiting multiple vulnerabilities or finding other ways to propagate once inside a system. Gartner's Richard Stiennon warns against "low and slow" worm attacks that go unnoticed by administrators and security systems until they unleash a catastrophic attack, while Mehta says encryption advances could allow worms to carry more potent executables without being identified as such. Many of the current security problems are the shared responsibility of users and software vendors, who create connected products that are easy to use but also more vulnerable to Internet attack. Enterprises can do more to protect themselves by adopting advanced firewalls that inspect packets and intrusion-detection systems that decode traffic into protocols rather than simply matching patterns, and companies should also assess their security risks and write rules for when and how applications can connect to the Internet. Stiennon adds that organizations should not rely on a monolithic IT architecture, but should vary their components in order to better contain possible break-ins.
    Click Here to View Full Article

  • "Inside the Gadget Labs"
    Herald Sun (AU) (09/17/03); Thom, Greg

    Understanding how consumers use technology is becoming more important as competition tightens and new products can mean either marketplace victory or disaster. Philips Electronics' Home Lab in the Netherlands differs from other consumer electronics research efforts in the amount of psychological analysis involved. Volunteers are invited to make themselves at home and try to ignore being observed by hidden cameras and microphones. Researchers try to find out how people will use cutting-edge applications such as a children's toothbrush that triggers animated cartoon displays on the bathroom mirror, an interactive TV that lets people see distant friends who are watching the same show, or a music management system that finds songs when people hum just part of the tune. Researchers from the Australian Graduate School of Management and the University of Technology, Sydney, are pioneering a new methodology and software to help predict how users will behave with new products. Called Information Acceleration, the system can predict not only how people will use a new technology or product, but also what price they will pay. Meanwhile, Philips Research scientific program director Emile Aarts says the continued miniaturization of computer chips will lead to a world of "Ambient Intelligence," in which technology will be embedded in things all around us--the walls, clothing, and furniture--and automatically respond to a person's visual or verbal cues. For example, a tired look on your face when walking through the front door could prompt the bathtub to fill with water. Anticipatory Behavior systems would automatically tape favorite TV shows, while electronic wallpaper could add surround sound, vibrations, and special lighting to movie viewing. Aarts says, "It's not about the technology itself. It's about the way people handle the technology to be of benefit in the end."
    Click Here to View Full Article

  • "Net Struggles With Data Overload"
    BBC News (09/16/03); Ward, Mark

    The amount of data generated by certain scientific experiments is so vast that gigabit-per-second data transfer rates are not enough, which is prompting researchers to look for speedier transfer methods and technologies. Europe's Very Long Baseline Interferometry (VLBI) project connects a distributed network of 16 telescopes, each of which produces 1 Gbps of astronomical data over a 25-day observation session; Dai Davies of high-speed network provider Dante says the Internet's Transmission Control Protocol--the standard responsible for making sure that data reaches its intended destination--can hold up transfers while it checks whether packets have arrived where they were sent. He notes that until now the VLBI project's only option was to record the data on magnetic tapes and drive them to an analysis center. Davies says his company is considering amending its basic data transfer protocol, as well as investigating network reorganization to supply capacity on an as-needed basis. Meanwhile, a Welsh e-science center being developed at Cardiff University involves the deployment of a gigabit network that center manager Tom Wiersma says will enable researchers to transfer data more easily. "Scientists need to be able to connect to the Grid, take their data and find the computing resource they need to crunch it without having to worry about preparing the data, splitting it up and so on," he explains. One experiment that will utilize the network is a geological activity monitoring project that draws on data collected by many sensors positioned on the sea floor.
    Click Here to View Full Article
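    One way to see why TCP's acknowledgement checks can throttle such transfers: a sender may keep at most one window of unacknowledged data in flight, so a single flow's throughput is bounded by window size divided by round-trip time. A rough sketch; the 64 KB window and 20 ms delay are illustrative assumptions, not figures from the article:

```python
# Why TCP acknowledgements can throttle a long-haul gigabit flow: one
# flow's throughput is capped at window_size / round_trip_time, because
# the sender stalls until earlier packets are acknowledged.
def tcp_throughput_cap_bps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on a single TCP flow's throughput, in bits per second."""
    return window_bytes * 8 / rtt_seconds

# A classic 64 KB window over a 20 ms continental path:
cap = tcp_throughput_cap_bps(64 * 1024, 0.020)
print(f"{cap / 1e6:.1f} Mbps")  # roughly 26 Mbps -- far short of 1 Gbps
```

    Raising the cap means larger windows, lower latency, or a protocol that does not wait for per-window acknowledgements, which is why bulk scientific transfers often look beyond standard TCP.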

  • "New Search Algorithm Hears 'People's Voice'"
    NewsFactor Network (09/16/03); Martin, Mike

    The Vox Populi Internet search algorithm developed by German researchers functions by assigning relative weights to search terms. For instance, Google sends users who type in "free MP3 downloads" to all MP3 download Web sites, while search engines that employ Vox Populi could link users who type in the same phrase to free download sites first, if the word "free" has a larger relative weight than "downloads." "The main idea of this algorithm is to extend the existing algorithms by a component which reflects the interests of the users more than existing methods," write Andreas Schaale, Sonke Lieberam-Schmidt, and Carsten Wulf-Mathies in a paper detailing Vox Populi. Assigning relative weights to search words cannot be achieved without a massive statistical database that tracks the frequency of words that appear in queries, and the researchers used the publicly available keyword-datenbank.de site as their source. Though an algorithm such as Vox Populi may seem simple in concept, execution is a more complicated proposition, and there is currently little data available to demonstrate the method's value. However, such a technique could be a valuable tool to winnow out spam from search results, which Schaale says should be the objective of Internet queries. Vox Populi can perform this service by scanning for "natural clusters" that contain links from a variegated series of sites.
    Click Here to View Full Article
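    The weighting idea described above can be sketched in a few lines. The weights and documents below are invented for illustration; the actual algorithm derives its weights from large-scale query-frequency statistics:

```python
# A toy sketch of relative term weighting: query terms carry popularity
# weights, and results matching the heavier terms rank first.
TERM_WEIGHT = {"free": 0.9, "mp3": 0.5, "downloads": 0.4}  # hypothetical

def score(doc_terms: set, query: list) -> float:
    """Sum the weights of the query terms a document actually matches."""
    return sum(TERM_WEIGHT.get(t, 0.1) for t in query if t in doc_terms)

docs = {
    "paid-mp3-shop": {"mp3", "downloads"},
    "free-mp3-site": {"free", "mp3", "downloads"},
}
query = ["free", "mp3", "downloads"]
ranked = sorted(docs, key=lambda d: score(docs[d], query), reverse=True)
print(ranked)  # the site matching the heavily weighted "free" comes first
```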

  • "Paving the Way for 'Systers'"
    San Francisco Chronicle (09/15/03) P. E1; Jolitz, Lynne Greer

    ExecProducer CTO Lynne Greer Jolitz notes a profound lack of high-tech female professionals of her 1980s generation, and she attributes this deficit to the dot-com downturn. She observes that many of these women have moved away from technology, been laid off, or are refocusing on technology marketing and sales, which leaves a critical absence of mentors for the younger generation of female tech workers. "Unlike the aspirations of the older systers for academic recognition, my generation's challenge was to build products and companies from...new technologies," she writes. "Perhaps these more applied pursuits left women more vulnerable when the crash hit than if they'd stayed in academia writing papers." Jolitz recounts a discussion she had with CrossWorlds Software founder Katrina Garnett at a Springboard Enterprises conference last December, which served to remind her that Garnett's success was the result of her keeping a team of engineers with her to make her vision a reality and getting investment and advice through her venture capitalist husband. The author writes that Garnett fulfilled her function as tech visionary without abandoning her technical heritage or her relationships--in fact, she harnessed them to develop her firm. Jolitz also attended a eulogy for Anita Borg, a staunch advocate of women in technology and founder of Stanford University's Institute for Women and Technology, where the majority of the audience was younger women. The author felt a strong need for these women to have access to the capital and status that would allow them to flesh out their high-tech visions.
    Click Here to View Full Article

    For more information on the Institute for Women and Technology, visit http://www.iwt.org/home.html.

  • "Thinking Outside the Box"
    Economist (09/11/03) Vol. 368, No. 8341, P. 74

    The construction of data-storage centers and perhaps even supercomputers could be greatly simplified if the Collective Intelligent Bricks project at IBM's Almaden Research Center proves successful. The building block of such machines is a cube containing 12 disk drives of 80 GB each, a processor to run the disks, and a chip that transmits signals to the connectors linking the cube to its neighbors; the physical connections are metallic pads that form capacitors when brought together, enabling cableless brick-to-brick communications. This architecture allows easy rerouting if one or more bricks fail, while data can be preserved because multiple bricks can store backup copies. However, the system is difficult to cool because the bricks in the middle of the array are not exposed to air. The current solution is to dissipate heat with water-carrying pipes, but the failure of the pumps could lead to an irreparable system-wide failure. Project leader Dr. Jai Menon is confident that the design will support not only data storage but also general-purpose computing, once processor banks as well as hard drives are incorporated into the bricks. Adding more bricks provides scalability, while compact stacking lowers signal latency and boosts performance. Menon expects to have a working 3x3x3 arrangement of bricks with 25 TB of capacity before 2004.
    Click Here to View Full Article
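    The quoted 25 TB figure can be checked directly against the brick specification above:

```python
# Checking the capacity cited above: a 3x3x3 stack of bricks, each
# holding 12 drives of 80 GB, comes to roughly the quoted 25 TB.
def stack_capacity_gb(bricks_per_edge: int, drives_per_brick: int = 12,
                      gb_per_drive: int = 80) -> int:
    """Raw capacity in GB of a cubic stack of storage bricks."""
    return bricks_per_edge ** 3 * drives_per_brick * gb_per_drive

print(stack_capacity_gb(3))  # 25920 GB, i.e. about 25 TB
```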

  • "IT's Global Itinerary: Offshore Outsourcing Is Inevitable"
    Computerworld (09/15/03) Vol. 31, No. 43, P. 26; King, Julia

    The tremendous cost savings posed by outsourcing in today's economy will prompt 40 percent of companies to shift at least part of their IT operations overseas by 2004, according to Gartner; offshoring also enables companies to boost the flexibility of their IT staff as well as tap into a growing pool of world-class IT talent. The overall outsourcing trend is characterized by a number of sub-trends, such as the use of multiple outsourcers in different countries, a practice known as multisourcing. Accenture CEO Joe Forehand predicts that multisourcing will ramp up because "There's no one provider who is best-of-class across all services." With the overseas migration of application development and maintenance, domestic IT staffs are applying their skills to more managerial and business process-related duties, such as business analysis, accounting, negotiation, and compliance monitoring. New outsourcing services are emerging, and outsourcers are broadening their scope to include human resources, call center, finance, and accounting operations; infrastructure services such as desktop support, help desk, and network monitoring functions are also experiencing dramatic growth. Though most companies want to outsource technical work to save money, labor costs in mature offshore centers such as India will inevitably rise as demand increases, and experts caution users to watch out for lowest-cost deals. In an effort to lower their own fixed IT costs, outsourcers are "daisy-chaining." One daisy-chaining strategy is for service providers to sell a piece of their fixed IT assets back to hardware and software providers, while another is for mature outsourcers to subcontract IT work to younger and cheaper outsourcers.
    Click Here to View Full Article

  • "In-House Innovation"
    InformationWeek (09/15/03) No. 955, P. 34; Foley, John

    Despite technology commoditization and IT budget cuts, proprietary software has not been driven into obsolescence, and it remains key to many companies' competitive advantage. Borland Software's Ted Shelton expects custom software-development projects to increasingly hybridize the two primary programming environments, J2EE and .Net, and he notes that corporate development teams are using tools such as Borland's JBuilder and StarTeam to concentrate less on craft and more on "practice"; this means paying more attention to how software-development groups shape application functionality by working closely with business managers and users, and designing, developing, and testing applications more methodically. Experts note that business units that use the new applications must have a hand in determining usability, project timing, and business-process harmonization. Though technological innovations cannot replace brainpower, they do offer advantages: London Stock Exchange technology director Ian Homan reports that C#, Java, and Visual Basic .Net offer more safety, ease of use, and speed than earlier languages, while firms can choose application-development tools independently of operating system and hardware platform, with the resulting cost savings factored into total cost of ownership. In-house programmers not only build custom applications, but also refine their companies' commercial enterprise resource planning applications, integrate them, and add Web services to software environments. As a result, development productivity at many American companies is at an all-time high. As companies increasingly turn to offshoring software development to save money, U.S. software engineers will have to focus more on management than development, according to Shelton.
Sun Microsystems VP and Java creator James Gosling adds that developers will have to make a greater effort to understand user needs, convert business needs into software solutions, and acclimate themselves to corporate culture.
    Click Here to View Full Article

  • "Next-Generation GPS"
    Scientific American (09/03) Vol. 289, No. 3, P. 34; Ashley, Steven

    Third-generation Global Positioning System (GPS III) technology currently in the planning stage seeks to boost accuracy and reliability while addressing increasingly sophisticated applications, alternative geolocation services, and more effective signal disruption methods. Stanford GPS Laboratory director Per Enge observes three "megatrends" driving the short-term advancement of GPS technology. The first, frequency diversity, involves the periodic replacement of old GPS satellites; once the replacement process is complete, the new satellite network will provide civilian users with three new positioning signals, while the U.S. military will have two more signals that resist jamming. The second megatrend aims to tackle radio-frequency interference (RFI) through RFI hardening, which enables GPS receivers to monitor terrestrial broadcast transmissions so they can double-check the calculations they make to locate themselves accurately. The third megatrend stems from the deployment of "integrity machines" that verify that positioning errors do not exceed their stated bounds. One such system is the FAA's Wide Area Augmentation System, which Enge's lab co-developed; it uses differential GPS methods to deliver up-to-date error-correction data via geosynchronous communications satellites. GPS technology, currently in its second generation, is used by some 20 million people on a regular basis.
    Click Here to View Full Article
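    The differential-GPS principle behind augmentation systems like the one above can be sketched in one dimension: a reference station at a precisely surveyed spot measures the current GPS error and broadcasts a correction that nearby receivers subtract. All numbers below are invented for illustration; the real system corrects multiple error sources across a wide area:

```python
# Schematic one-dimensional sketch of differential GPS correction.
TRUE_STATION_POS = 1000.0      # surveyed position of the reference station
raw_fix_at_station = 1003.2    # what GPS reports there at this moment

# The station knows its true position, so it can measure the shared error.
correction = raw_fix_at_station - TRUE_STATION_POS

def corrected(raw_fix: float) -> float:
    """Apply the broadcast correction to a nearby receiver's raw fix."""
    return raw_fix - correction

print(round(corrected(2503.2), 6))  # the shared error cancels out
```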
