
ACM TechNews sponsored by AutoChoice Advisor -- Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 611: Friday, February 27, 2004

  • "Intel Backs a More-Efficient PC Power Supply"
    Wall Street Journal (02/26/04) P. B4; Carlton, Jim

    Intel is promoting a new power supply specification that promises to cut power plant emissions, reduce U.S. energy bills by $1 billion, and trim the annual electricity bill for an average business PC by $17. Power supplies built to the spec could lower PC power consumption by a third and free developers to pursue more innovative PC designs, according to Intel officials. Of the electricity that flows into the PC from the wall socket, current power supplies typically deliver less than half to the computer's components, giving off the remainder as heat. Intel's new spec was developed in conjunction with the Natural Resources Defense Council (NRDC), and the chip company says the new design will be used by four power-supply manufacturers. "This is definitely something we would be interested in, because we know we could pass the cost benefits on to customers, as well as the environmental benefits," notes Dell's Bryant Hilton. Intel's Dave Perchlik reports that his company began examining more efficient power supplies as a way to make smaller computers. Intel and the NRDC are collaborating with the EPA to embed the new design into the EPA's Energy Star label requirements, which Intel officials think could be a significant driver for adoption, because the federal government cannot purchase PCs without Energy Star certification. The new power supply is expected to add $10 to computer costs, but its backers say this extra expense will be offset by the electricity savings.
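
    A quick back-of-envelope check of those figures (an illustrative calculation, not one from the article):

        # Payback estimate using the dollar figures quoted above; the
        # calculation itself is illustrative.
        extra_cost = 10.00       # added cost per PC for the efficient supply ($)
        annual_saving = 17.00    # electricity saved per business PC per year ($)

        payback_years = extra_cost / annual_saving
        print(f"Payback: {payback_years:.2f} years")   # ~0.59 years, about 7 months

        years = 5
        net_saving = annual_saving * years - extra_cost
        print(f"Net saving over {years} years: ${net_saving:.2f}")   # $75.00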

  • "Patents Raise Stakes in Search Wars"
    CNet (02/25/04); Olsen, Stefanie

    IT vendors and Internet firms are arming themselves with Web search patents in preparation for future battles over how computer users find information. Last week, Yahoo! fired an opening salvo by dropping Google as its Web search provider, though attorney David Jacobs says that single overt action belies the intense legal wrangling going on behind the scenes, noting that legal issues are what drive the Web search industry forward--or hinder it, depending on one's point of view. With the realization that Web search can critically influence e-commerce--Yahoo! grew earnings 84 percent last year, helped significantly by search advertising--Internet players have begun scrambling to develop and patent fundamental search technologies; IBM and Microsoft are also active in Web search patents. Yahoo!'s acquisitions of Overture and Inktomi have boosted that firm's position. Amazon.com is another major player, and last year filed a patent application covering auctions conducted on embedded Web ads; Amazon.com subsidiary Alexa Internet also holds a recently awarded patent for tracing Web surfing history. Microsoft and Google have teams of academics writing papers that act as evidence of prior art and strengthen their patent claims. Overture has been most active in the legal arena so far, filing suits against FindWhat.com and Google to protect its method of charging advertising customers based on search-engine results and the number of hits coming from searches; this so-called pay-for-performance model allows advertisers to bid for result positioning while paying only when a user clicks on the link. These suits are still in the courts, but some experts say patents are not as useful on offense as they are on defense: Lycos and CMGI, for instance, previously claimed "spidering" technology patents fundamental to Web search but reaped little reward from them.
    Click Here to View Full Article

  • "Can Lessons From the Common Cold Help Us Defeat Computer Viruses?"
    TechNewsWorld (02/26/04); Korzeniowski, Paul

    Researchers are studying the similarities between computer viruses and biological viruses in the hope that computer systems can be fortified against malware in much the same way the immune system guards the human body against disease invaders. Like their biological counterparts, computer viruses are designed to self-replicate and flourish in environments whose functions rely on the interactions of large and sophisticated collections of systems. Some computer viruses can also destroy a host system's data in much the same way disease can overwhelm and kill an organism. In addition, the 20-year arms race between virus authors and antivirus vendors reflects biological viruses' ability to mutate into stronger strains to combat the defenses of the invaded host's immune system. Mike Reiter of Carnegie Mellon University and Stephanie Forrest of the University of New Mexico have received a hefty grant from the National Science Foundation to study living organisms for insights that could be applied to computer security. The researchers have discovered that biological viruses have a much greater chance of damaging--or even killing off--an entire species when that species exists as a monoculture in which all members share the same characteristics; to mimic the species diversity that boosts resistance to disease, some computer scientists think software developers should diversify programs, particularly the nonfunctional software segments that are often targeted by malware. Still, some researchers note significant differences between computer and biological viruses: Trend Micro's David Perry explains that unlike germs, malware cannot self-mutate--it must be changed by a hacker. Malware can also be removed from a computer system when that system is deactivated, taken off the network, and restarted with antivirus software, whereas biological viruses cannot be so easily eliminated from organisms.
    Click Here to View Full Article

  • "Survival of the Catchiest"
    Washington Post (02/26/04) P. E1; Musgrove, Mike

    Competition among antivirus firms to be the first to alert the public to new computer viruses or worms can lead to a bewildering profusion of names for the same bug, a situation that has become a growing irritation. The general industry consensus is that the first person or company to find and publish information about a new piece of malware gets to name the bug, but this rule is not always applied. Nor is there any guarantee that the first name given a new worm will be the one that catches on. Antivirus companies must weigh a number of factors when naming viruses, not least the kinds of names to avoid, such as those connected to a company or a person, or even the name the virus writer intended. Additional confusion is sown when different antivirus firms number subsequent permutations of the same virus differently, which is what happened in the case of the "klez" virus. "It can be bad news for the customers," comments MessageLabs antivirus researcher Alex Shipp. "They have no hope of sorting out that mess." MessageLabs assumed correctly that MyDoom would be the official name for the fastest-spreading computer worm on record, but the U.S. Computer Emergency Readiness Team (US-CERT) called the worm "Novarg" when it posted a public advisory. The prevalence of MyDoom prompted the group to use the more popular name in subsequent postings.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "The Curse of the Biometric Future"
    Salon.com (02/26/04); Williams, Sam

    Biometric security technology emerged roughly 10 years ago, and has since promised security solutions but delivered mainly controversy. Rockefeller University professor Joseph Atick, while studying human brain processes and sensory signals, found that facial dimensions form a unique template that can be matched to captured images of a person; he says he felt obligated to bring his discovery into the public sphere because he understood it had valuable societal uses. But despite interest from government agencies and law enforcement, the field has not grown as quickly as other technology sectors, says International Biometric Group consultant Jeff Watkins. Atick says the 9/11 attacks brought tremendous attention to biometric security technologies, not all of it beneficial: Not only did the field draw fire from civil liberties groups incensed by efforts such as the Patriot Act, but it also was seen as a panacea by some government officials. In reality, biometric security is decades away from the type of instant recognition portrayed in movies such as "Minority Report," says Imagis Technologies co-founder Andy Amanovich. Despite facial biometric scanning's ineffectiveness at catching al-Qaeda terrorists at airports, Amanovich says it is useful for applications such as matching police suspects to mug-shot databases; his company sells an Internet-based facial recognition system that lets police officers take a digital picture of an arrested person and check it against their department's database. Atick, who now runs biometric security firm Identix, says the technology is constrained both by what society is willing to accept and by what the technology will practically allow. The Tampa Police Department's FaceIt public surveillance program, for instance, failed to positively identify a single culprit in the two years before it was dismantled, although Atick theorizes that a more accurate version of the technology might sway some opponents.
    Click Here to View Full Article
    (Access to full article available to paid subscribers only.)
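
    Template matching of the kind Atick describes can be sketched in miniature (a toy illustration; real systems use far richer features, and every measurement below is invented):

        import math

        # Toy template matching: represent each face as a vector of measured
        # dimensions and match by nearest neighbor. Real facial-recognition
        # systems use far richer features; these numbers are invented.
        database = {
            "suspect_a": [62.0, 41.5, 33.0],   # e.g. eye spacing, nose length, jaw width (mm)
            "suspect_b": [58.5, 44.0, 30.5],
        }

        def match(template, threshold=3.0):
            """Return the closest stored identity, or None if nothing is close enough."""
            best_name, best_dist = None, float("inf")
            for name, stored in database.items():
                dist = math.dist(template, stored)
                if dist < best_dist:
                    best_name, best_dist = name, dist
            return best_name if best_dist <= threshold else None

        print(match([61.7, 41.9, 32.8]))   # close to suspect_a -> "suspect_a"
        print(match([70.0, 50.0, 40.0]))   # too far from everyone -> None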

  • "Inspiration From Nature's Grand Design"
    Financial Times (02/27/04) P. 9; Harvey, Fiona

    Biomimetics, the science of using nature as an inspiration for new technology, is valued not only as a way to find offbeat solutions to problems, but as a much less expensive approach than working from scratch. "Evolution is a massive trial and error experiment and anything you look at has moved through more iterations than you possibly can, and evolved solutions that may be the results of many unexpected combinations," explains Qinetiq researcher Chris Lawrence, whose company has devised a water-harvesting technique drawn from the water-condensing ability of the stenocara beetle. Among the more dramatic examples of biomimetics is evolutionary computation, in which candidate programs are generated by recombining fundamental code segments and pitted against one another in a Darwinian competition--the approach behind so-called "genetic algorithms" (see the sketch below). Another notable effort is IBM's autonomic computing initiative, which aims to make computers more resilient against viruses and other dangers by modeling them after the human immune system. However, experts recommend that biomimetic researchers proceed with caution: SRI Consulting Business Intelligence consultant Carl Telford notes that nature does not always offer the best solutions to certain problems, while Garth Johnson of Newcastle University points out that the mimicry of biological systems may be constrained by the sophisticated infrastructures needed to maintain them. SRI consultant Kermit Patton adds that interfering with natural systems--inserting foreign species into ecosystems, for example--can have unpredictable, detrimental effects. Still, the worst to expect from most biomimetic applications would most likely be a failure to function properly, or a financial investment so high as to cancel out the applications' viability.
    Click Here to View Full Article
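
    The flavor of such a Darwinian competition can be conveyed with a minimal genetic-algorithm sketch (the toy fitness problem and all parameters below are invented for illustration):

        import random

        # Minimal genetic algorithm: evolve bit strings toward a target.
        # The problem (maximize the number of 1 bits) is invented for illustration.
        GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

        def fitness(genome):
            return sum(genome)   # count of 1 bits

        def crossover(a, b):
            cut = random.randrange(1, GENOME_LEN)   # recombine two parents at a random point
            return a[:cut] + b[cut:]

        def mutate(genome):
            return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

        population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
        for gen in range(GENERATIONS):
            # Darwinian competition: the fitter half survives and reproduces.
            population.sort(key=fitness, reverse=True)
            survivors = population[:POP_SIZE // 2]
            offspring = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                         for _ in range(POP_SIZE - len(survivors))]
            population = survivors + offspring

        print("Best fitness:", fitness(max(population, key=fitness)))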

  • "New Spam Filters Cut the Noise"
    Wired News (02/26/04); Asaravala, Amit

    Several open-source spam filter developers recently announced filtering techniques that are close to 100 percent reliable in blocking incoming spam on a network, whereas many expensive commercial filters are only 99 percent reliable, at best. William Yerazunis claimed that his CRM114 spam filter generated only one error while filtering 6,000 emails in a recent experiment; his filter uses a method known as Markovian discrimination, which automatically attributes weights to combinations of words that are more likely to show up in spam. Another open-source developer, Jonathan Zdziarski, touted the advantages of his Dspam antispam software, which employs a technique called Dobly to divest incoming emails of the "noise" that spammers frequently include to thwart filters. He reported that the latest version of Dspam missed only one out of 3,000 spam messages and incorrectly identified just one out of 3,250 legitimate messages as spam during a recent test on "live" data. "Cost is measured in two ways when you're dealing with spam filtering--cost of implementation, and the cost of inaccuracy," Zdziarski explained. "Most commercial filters generally suffer in both areas--they're too expensive and not nearly as accurate." Still, many businesses continue to use commercial spam filters because of the technical support and training vendors provide.
    Click Here to View Full Article
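
    The article does not detail how Markovian discrimination works internally, but its core idea--weighting word combinations that appear disproportionately in spam--might be sketched as follows (the token counts and scoring rule here are invented for illustration):

        # Toy weighted-token spam scorer in the spirit of the filters described
        # above. Real filters (CRM114, Dspam) are far more sophisticated; the
        # training counts and weighting rule are invented.
        spam_counts = {("free", "offer"): 40, ("click", "here"): 35, ("meeting",): 1}
        ham_counts  = {("free", "offer"): 1,  ("click", "here"): 2,  ("meeting",): 30}

        def token_chunks(words):
            """Yield single words and adjacent word pairs from a message."""
            for n in (1, 2):
                for i in range(len(words) - n + 1):
                    yield tuple(words[i:i + n])

        def spam_score(text):
            score = 0.0
            for chunk in token_chunks(text.lower().split()):
                s, h = spam_counts.get(chunk, 0), ham_counts.get(chunk, 0)
                if s + h:
                    weight = len(chunk) ** 2   # weight longer combinations more heavily
                    score += weight * (s - h) / (s + h)
            return score

        print(spam_score("Click here for a free offer"))   # positive -> spam-like
        print(spam_score("Notes from the meeting"))        # negative -> ham-like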

  • "Conference Tutorials Provide Ways to Connect, Interact and Shape Future at CHI2004, 24-29 April, Vienna, Austria"
    Market Wire (02/23/04)

    Approximately 2,000 IT professionals from more than 35 countries are expected to attend the CHI2004 gathering April 24-29 in Vienna, Austria. Sponsored by the Association for Computing Machinery's Special Interest Group on Computer-Human Interaction, CHI2004 will give attendees an opportunity to find out what interacting with computers will be like in the next five to 10 years. The six-day event will include presentations, tutorials, vendor exhibits, and networking opportunities for IT professionals. Jun Rekimoto of Sony and Tim Brown of IDEO will give keynote presentations. Tutorials include Product Usability: Survival Techniques by Jared Spool, and Cognitive Factors in Design: Basic Phenomena in Human Memory and Problem Solving by Dr. Tom Hewett. "We selected 31 tutorials, ranging from Mobile User-Interface Design to Human-Robot Interaction, because they represent leading-edge issues to our community of researchers and designers," say CHI2004 co-chairs Elizabeth Dykstra-Erickson (Kinoma) and Manfred Tscheligi (CURE-AT). "Our conference history shows that one of the most important reasons people attend CHI is to learn something new, and CHI2004 will exceed attendees' expectations for the important perspectives to consider when designing future technologies."
    Click Here to View Full Article

    For more information visit http://sigchi.org/chi2004/.

  • "A New Step in Spintronics"
    Newswise (02/23/04)

    University of Utah physicists report in Nature that they have moved one step closer toward a new generation of miniaturized electronic devices with the development of electrical switches fashioned from organic semiconductors. These "organic spin valves" marry elements of organic semiconductor electronics and spintronics, which taps the spin of electrons rather than their electrical charge to store and transmit data. Associate professor Jing Shi calls the prototype device developed by the research team a valuable proof-of-concept demonstration. Computers equipped with spintronic memory should be capable of storing more information, processing data faster, and conserving more power than purely electronic devices; instant-on computers could also become a reality through spintronics. Shi says the next phase of spintronics' evolution is to augment spin-based devices with semiconductor properties, and his team elected to build a spin valve with an organic semiconductor because conventional semiconductors do not integrate well with spintronics since their manufacture requires high temperatures. Organic semiconductors, in contrast, are cheaper and can be fabricated at lower temperatures using simpler manufacturing methods, while their electronic properties and physical configurations are adjustable. The organic spin valve consisted of a layer of 8-hydroxyquinoline aluminum sandwiched between a layer of cobalt and a layer of lanthanum strontium manganese oxide, which functioned as electrodes; the physicists were able to induce a 40 percent change in the electrical current flowing through the valve via the application of a weak magnetic field. The breakthrough could help pave the way for new-generation "computer chips, light-emitting devices for displays, and sensors to detect radiation, air pollutants, light and magnetic fields," notes physicist Z. Valy Vardeny.
    Click Here to View Full Article

  • "Relieving Peer-to-Peer Pressure"
    Technology Review (02/25/04); Hadenius, Patric

    Kazaa co-founder Niklas Zennstrom is launching a new caching product that promises to relieve the onerous peer-to-peer traffic problem at ISPs. Because peer-to-peer protocols such as Kazaa's FastTrack and Morpheus' Gnutella do little to minimize traffic, ISPs are seeing at least one-fifth of their traffic loads coming from these systems, according to University of California researcher Thomas Karagiannis, who measures peer-to-peer traffic at the Cooperative Association for Internet Data Analysis. Zennstrom's Joltid caching service reads Internet traffic and makes local copies of likely peer-to-peer files; that way, instead of two users simultaneously requesting and downloading two copies of the same movie from a faraway node, the movie is sent only once and cached on the local ISP server for nearby distribution (see the sketch below). The system works much like other caching programs, except that Zennstrom uses his team's peer-to-peer expertise to more accurately identify which packets are peer-to-peer traffic. Peer-to-peer systems often try to hide users' identities and network activity because of possible copyright violations and in order to get around firewalls. Joltid would be of particular use to local ISPs that operate the access links connecting users to big Internet backbones. Many of these cable and DSL companies provide users less upload bandwidth than download bandwidth, which was not a problem for ordinary Web browsing but becomes one with the more symmetrical traffic patterns of peer-to-peer activity. Sprint researcher Supratik Bhattacharyya says that problem does not apply to international Internet backbones, where symmetrical traffic is actually preferred to the lopsided Web traffic coming out of large data centers; he says Sprint's backbone links are operating at about half capacity.
    Click Here to View Full Article
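
    In outline, such a cache might behave like the sketch below (a simplified illustration; Joltid's actual detection and storage mechanisms are proprietary, and all identifiers here are invented):

        import hashlib

        # Simplified sketch of an ISP-side peer-to-peer cache in the spirit of
        # Joltid's service; the real system's internals are proprietary.
        class EdgeCache:
            def __init__(self):
                self.store = {}   # content hash -> file bytes

            def fetch(self, content_hash, download_from_remote_peer):
                """Serve a file locally if cached; otherwise fetch once and cache it."""
                if content_hash in self.store:
                    return self.store[content_hash]   # local copy: no backbone traffic
                data = download_from_remote_peer(content_hash)   # first request pays full cost
                self.store[content_hash] = data
                return data

        cache = EdgeCache()
        remote = lambda h: b"...movie bytes..."   # stand-in for a faraway peer
        movie_id = hashlib.sha1(b"some-shared-file").hexdigest()

        cache.fetch(movie_id, remote)   # first user: fetched over the backbone
        cache.fetch(movie_id, remote)   # second user: served from the local ISP cache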

  • "RFID Blocker Tags May Soothe Privacy Fears"
    New Scientist (02/26/04); Biever, Celeste

    Anxiety that radio frequency identification (RFID) technology might infringe on consumers' personal privacy could be assuaged with a chip developed by RSA Laboratories that can block the tracking of RFID-tagged goods or people by scanners. Critics assert that ubiquitous RFID tagging would enable advertisers to profile consumers' shopping habits and use this data to target individuals, which constitutes privacy invasion. Once an item has been purchased, RSA's silicon "blocker tag" cancels the RFID reader's ability to obtain the unique electronic codes of RFID-tagged objects by flooding the reader with query responses, in effect overwhelming the device via a denial of service attack. "Most people think either you get privacy or convenience but you can't have both," explains RSA director Burt Kaliski. "We believe you can have both with a blocker tag." However, Cambridge University computer scientist Ross Anderson does not think the consumer should bear the cost of RFID-blocking technology. Machines that prevent chips from broadcasting radio signals already exist, but Kaliski notes that this would cancel RFID's consumer-oriented advantages. Proposed technologies include tags that transmit cooking directions to a microwave or washing directions to a washing machine, while medicine cabinets and refrigerators could be enhanced with RFID reader technology to remind users when items need to be restocked.
    Click Here to View Full Article
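
    One way to picture the blocker's denial-of-service trick is against the "tree-walking" protocol readers use to single out tags (a toy simulation; real tags use 96-bit codes, and the protocol is heavily simplified here):

        # Toy simulation of how a blocker tag stalls a tree-walking RFID reader.
        # Real readers and the RSA blocker tag are more involved; this sketch
        # only illustrates the flooding principle described above.
        ID_BITS = 8   # real EPC tags use 96 bits; 8 keeps the toy run small

        def read_tags(tag_ids, blocker_present, prefix=""):
            """Recursively singulate tags; a blocker answers every query on both branches."""
            if len(prefix) == ID_BITS:
                return [prefix]
            found = []
            for bit in "01":
                branch = prefix + bit
                tag_answers = any(t.startswith(branch) for t in tag_ids)
                if tag_answers or blocker_present:   # blocker claims every branch exists
                    found += read_tags(tag_ids, blocker_present, branch)
            return found

        tags = ["00101101", "11100010"]
        print(len(read_tags(tags, blocker_present=False)))   # 2 -> real tag IDs recovered
        print(len(read_tags(tags, blocker_present=True)))    # 256 -> reader drowns in fake IDs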

  • "Piercing the Fog With a Tiny Chip"
    New York Times (02/26/04) P. E5; Eisenberg, Anne

    Electrical engineers at the California Institute of Technology have integrated the basic elements of a radar system into a silicon chip smaller than a penny that can be mass-produced with cheap lithographic techniques. Associate professor of electrical engineering Ali Hajimiri estimates that the device should cost only a few dollars per unit, and says the chip, which features eight miniature phased-array antennas designed to focus and direct a highly concentrated microwave beam, can reach bit rates as high as 1 Gbps. Hajimiri notes that the pulses generated by the radar chips are not as powerful as those emitted by chips currently used in aviation systems, but their power could be amplified if the chips are configured into arrays. The radar chip could initially be used in inexpensive automotive radar systems that map out a vehicle's surroundings to detect automobiles, pedestrians, and other nearby objects that may be hidden by fog, for example. The chip operates at 24 GHz, which is within the parameters prescribed by the FCC for vehicular radar systems. The chip's antennas, whose bearings are adjusted electrically, can be made resistant to undesired signals, limiting interference. The radar chip's usefulness may extend to wireless communications, since the device boasts a broad bandwidth and its bit rate is more than adequate for rapid downloads of movies and other digital content. The device could also let a fleet of army tanks communicate in the field: "Using these extremely high frequencies, you can first capture location, sending out pulses and scanning the area like a bat," explains Irvine Sensors' chief scientist Volkan Ozguz. "Then, using the same chipset, you can start communicating at high frequency."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
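
    The article does not describe the chip's internals, but the basic phased-array relationship--delaying each element's signal so the wavefronts add up in one direction--is textbook material (the element spacing and steering angle below are assumptions, not figures from the article):

        import math

        # Textbook phased-array steering: each element's signal is phase-shifted
        # so the wavefronts reinforce in the desired direction. Half-wavelength
        # spacing and the 30-degree steering angle are assumed for illustration.
        c = 3.0e8                    # speed of light, m/s
        f = 24e9                     # operating frequency from the article, Hz
        wavelength = c / f           # ~12.5 mm
        d = wavelength / 2           # assumed element spacing
        theta = math.radians(30)     # desired beam direction off broadside

        for n in range(8):           # the chip integrates eight antennas
            phase = 2 * math.pi * n * d * math.sin(theta) / wavelength
            print(f"element {n}: phase shift {math.degrees(phase) % 360:6.1f} deg")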

  • "Barrier Free Access to the Information Society"
    European Union (02/25/04); Liikanen, Erkki

    The European Commission (EC) is directing the development and adoption of technology that is accessible to everyone, said EC member Erkki Liikanen in a speech to a conference on the Information Society for deaf, hard-of-hearing, and speech-impaired people. Last year saw a program specially devoted to improving technology access for disabled persons, and the resulting initiatives are starting to take effect, especially in the areas of digital television, fourth-generation cellular technology, Web site access, and general technology standardization. Citing examples of regulatory successes in the United States, Liikanen said it is important for government bodies to guide industry in formulating accessibility standards and to reward compliant vendors through procurement rules. New technology promises not only to give disabled persons access to technology, but also to greatly improve its usability; miniaturization improves device mobility, and ambient sensing technologies allow device personalization, for example. Liikanen said publicly available terminals should be able to recognize user needs and apply appropriate assistive technologies, such as enlarged fonts, speech recognition, or a tactile screen. Many of these basic technologies and applications are under development in EU-funded research programs. Disabled persons themselves need to be involved in these research projects, Liikanen said, citing the slogan of last year's focus: "Nothing about people with disabilities without people with disabilities." Governments bear the most responsibility for setting technology standards, an area where the EU can contribute to global standardization efforts, including standardizing the radio frequencies used in hearing aids, Web site accessibility requirements, and computer "safety" requirements under which basic control functions are operated in a standard way.
    Click Here to View Full Article

  • "Xerox PARC Veterans Picked for Prestigious Draper Prize"
    TechNewsWorld (02/24/04); Mello Jr., John P.

    The National Academy of Engineering (NAE) has selected ACM fellow Robert W. Taylor, Alan C. Kay, Butler W. Lampson, and Charles P. Thacker to receive the $500,000 Charles Stark Draper Prize for pioneering contributions to PC technology during their tenure at Xerox's Palo Alto Research Center (PARC). "These four prize recipients were the indispensable core of an amazing group of engineering minds that redefined the nature and purpose of computing," said NAE President William A. Wulf. Taylor, who is now retired, organized the others into a team that built the first practical networked PC at PARC 30 years ago, and many of that machine's technologies, such as the graphical user interface (GUI) and bitmapped displays, are still in use in modern PCs. Dynamic object-oriented programming and GUIs that use overlapping windows were invented by Kay, who is now a Hewlett-Packard senior fellow and a computer science professor at UCLA. Lampson, now a Microsoft engineer as well as an MIT adjunct professor, worked on numerous PARC projects, including the SDS 940 time-sharing system, the Xerox 9700 laser printer, the Alto distributed computer system, two-phase commit protocols, and the Autonet local area network. One of Lampson's collaborators at PARC, Roy Levin, recounts that the engineer's credo was to keep PCs as simple and usable as possible. Thacker led the Alto system project and co-invented Ethernet before moving on to become a Microsoft engineer, like Lampson.
    Click Here to View Full Article

  • "Congress to Review Tech Agenda"
    eWeek (02/23/04) Vol. 21, No. 8, P. 24; Carlson, Caron

    Vying for Congress' attention this year are a number of technology-related issues--cybersecurity, Internet taxation, and spyware foremost among them. Bob Dix, majority staff director for the House subcommittee on technology and information policy, reports that industry representatives are readying a recommendation on enhancing private network security for Rep. Adam Putnam (R-Fla.), who wishes to consider options for market and government enticements designed to encourage the adoption of voluntary best practices, such as tax credits or liability limits for adopters. "We're trying to move the ball up the field with a set of action steps that can...address some of the vulnerabilities," Dix explains. Many in the IT industry are worried that Rep. Mary Bono's (R-Calif.) Safeguard Against Privacy Invasions Act, which was proposed to limit the use of spyware, could inadvertently ensnare automatic software updates. A hearing on the issue is expected within the next month, while the bill undergoes amendment. Meanwhile, the Senate is expected to vote in March on whether the Internet access tax moratorium should be made permanent, a move that states have fought to delay for many months. Concerned that senior managers and corporate boards lack accountability for information security in corporate networks, Putnam suggested last year that publicly traded companies be required to submit security audit reports, but in November he gave industry representatives the opportunity to furnish alternate solutions. Association for Competitive Technology VP Steve DelBianco says his organization, along with VeriSign and other companies, will present a security stack scheme to Congress in March.
    Click Here to View Full Article

  • "Copper Tops 10 Gigabits"
    Computerworld (02/23/04) Vol. 32, No. 8, P. 26; Hamblen, Matt

    The cost of 10 Gigabit Ethernet technology, which currently runs only on fiber-optic cabling, may soon cease to be a barrier to entry with the emergence of two standards that promise to bring 10 Gigabit Ethernet speeds to copper cabling, although both standards come with caveats. The 10GBase-CX4 specification developed by the Institute of Electrical and Electronics Engineers' (IEEE) 802.3ak Task Force will bring 10 Gigabit Ethernet to CX4 cable, although the range will be limited to 15 meters; the 10GBase-T specification will enable twisted-pair cabling to support 10 Gigabit Ethernet speeds over a maximum range of 100 meters, but that standard's ratification is at least two years away, and the possibility that 10GBase-T will run only on Category 6e twisted-pair cable instead of the more common Category 5e cabling could also affect product rollouts. The range trade-off of 10GBase-CX4, which is expected to be finalized by the IEEE in February, should not be an obstacle for customers who plan to use the standard to link switches or servers within a data center. IEEE 802.3ak Task Force Chairman Dan Dove estimates that overall costs for copper cabling in the data center should range between 5 percent and 20 percent of the cost of fiber, and he says CX4 could provide 10 times the bandwidth of Gigabit Ethernet for two to three times the expense (see the calculation below). Still, many organizations prefer using more expensive fiber as an Ethernet backbone even for data center connections, and some vendors are undecided as to whether they will supply both CX4- and 10GBase-T-enabled products because of uncertainty over customer demand. Gartner analyst Mark Fabbi cautions that "demand for either copper standard is relatively small."
    Click Here to View Full Article
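
    Taken together, Dove's figures imply a sizable improvement in bandwidth per dollar (a quick illustrative calculation; the normalized Gigabit Ethernet baseline is an assumption):

        # Bandwidth-per-dollar comparison implied by the figures quoted above;
        # the Gigabit Ethernet baseline is normalized to 1 for illustration.
        gige_bandwidth, gige_cost = 1.0, 1.0      # normalized baseline
        cx4_bandwidth = 10 * gige_bandwidth       # "10 times the bandwidth"
        for multiple in (2, 3):                   # "two to three times the expense"
            cx4_cost = multiple * gige_cost
            ratio = (cx4_bandwidth / cx4_cost) / (gige_bandwidth / gige_cost)
            print(f"At {multiple}x the cost, CX4 delivers {ratio:.1f}x the bandwidth per dollar")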

  • "Lean, Mean Green Machines"
    Midrange Server (02/23/04) Vol. 13, No. 8; Morgan, Timothy Prickett

    Most computers in use today are characterized by low energy efficiency, high power consumption, and high heat output, to say nothing of the cost, noise, and power draw of the cooling systems they require. According to rough estimates, PCs worldwide collectively guzzle 1.25 trillion kilowatt-hours of electricity each year and servers consume another 36 billion kilowatt-hours--80 percent to 90 percent of which is wasted; by that reckoning, PCs and servers together eat up $250 billion a year, with about $213 billion of that total wasted (see the calculation below). The cost of computers' inefficiency extends beyond money: Most of the electricity computers use is generated by burning coal, which pollutes the environment, while the strain computers place on the grid leads to outages that disrupt people's lives and result in lost business and productivity. The solution is to build greener machines that consume power only on an as-needed basis while maintaining sufficient performance; greener machines such as thin clients are optimized for a particular set of functions, which reduces energy consumption and waste. Lowering size and power consumption while retaining peak processing performance is only part of the solution--grid computing and virtualization are also needed to maximize PC and server efficiency. An open-source deployment may be the best option for setting up a standard virtualization environment for servers and possibly PCs. Reducing the energy profile and cost of PCs and servers will also make computing resources and the Internet more accessible to poor, isolated people in developing countries, a significant factor in global economic growth and prosperity. The IT world has typically regarded the development of greener machines with indifference, but this attitude could change as the electric grid grows more fragile and power outages increase in frequency.
    Click Here to View Full Article
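
    The article's numbers hang together, as a quick check shows (the implied electricity price is an inference, not a figure from the article):

        # Consistency check of the figures quoted above. The implied price per
        # kilowatt-hour is inferred, not stated in the article.
        pc_kwh     = 1.25e12     # PCs: 1.25 trillion kWh/year
        server_kwh = 36e9        # servers: 36 billion kWh/year
        total_cost = 250e9       # stated combined bill, $/year
        waste_frac = 0.85        # midpoint of the 80-90 percent waste estimate

        implied_price = total_cost / (pc_kwh + server_kwh)
        print(f"Implied price: ${implied_price:.3f}/kWh")          # ~ $0.194/kWh
        print(f"Wasted: ${waste_frac * total_cost / 1e9:.1f}B")    # 212.5, i.e. the article's ~$213 billion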

  • "The Web Within the Web"
    IEEE Spectrum (02/04) Vol. 41, No. 2, P. 42; Castro-Leon, Enrique

    The success of second-generation e-commerce depends on making disparate databases--both old and new--accessible across the Web, and Web service technologies are helping make this vision a reality. So that Web services can be used to construct networks of collaborating databases and services, a number of standards have been developed: Extensible Markup Language (XML) is a universal data format, widely positioned as a successor to HTML, and the metadata supplied by emerging Web service standards would allow different databases with kindred fields to be compared by a software program without human assistance. The Simple Object Access Protocol (SOAP) standard was created so that XML messages can be carried over HTTP, thus providing an effective transport mechanism for Web services; the combination of XML and SOAP gives unprecedented interoperability to Web service applications (see the example below). The Universal Description, Discovery, and Integration (UDDI) standard adds search-engine-like functionality to Web services--in essence, UDDI is a directory through which sites announce the services and data they offer. So that machines can determine by themselves how to invoke those services, the Web Services Description Language (WSDL) was devised to describe each service's interface. Web services boast loose coupling and delayed binding, which will enable enterprises to replace older software and interfaces gradually, and with minimal disruption. Web services not only raise convenience levels for users, but can drive down costs for companies. ZapThink analyst Ron Schmelzer reports that Web services have shaved 90 percent off software project costs, with the biggest savings derived from software reuse across projects.
    Click Here to View Full Article
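
    To make the layering concrete, here is what a minimal SOAP request over HTTP looks like (the endpoint, namespace, and operation below are hypothetical; a real service publishes these details in its WSDL):

        import urllib.request

        # Minimal SOAP 1.1 request over HTTP. The endpoint URL, namespace, and
        # operation name are hypothetical; a real service defines them in its WSDL.
        envelope = """<?xml version="1.0" encoding="utf-8"?>
        <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
          <soap:Body>
            <GetPrice xmlns="http://example.com/catalog">
              <ItemId>12345</ItemId>
            </GetPrice>
          </soap:Body>
        </soap:Envelope>"""

        request = urllib.request.Request(
            "http://example.com/services/catalog",   # hypothetical endpoint
            data=envelope.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8",
                     "SOAPAction": "http://example.com/catalog/GetPrice"},
        )
        with urllib.request.urlopen(request) as response:   # the reply is itself XML
            print(response.read().decode("utf-8"))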

  • "The New Face of the Silicon Age"
    Wired (02/04) Vol. 12, No. 2, P. 96; Pink, Daniel H.

    American programmers' frustration at having their jobs offshored to foreign workers willing to work for dramatically less money--and at being forced to train their foreign-born replacements in some cases--has sparked a backlash against outsourcing and raised fears that America's economic leadership will be undermined by countries such as India. Fueling these anxieties are estimates from Gartner that 10 percent of U.S. technology jobs will be exported by year's end, and Forrester Research's forecast that over 3 million U.S. white-collar jobs--most of them in the IT industry--will migrate overseas in the next 15 years. Already, U.S. workers' complaints have moved lawmakers to action; New Jersey Sen. Shirley Turner (D), for example, has introduced a ban on outsourcing in her state, a protectionist policy reflecting her belief that U.S. companies should hire American labor first and her opinion that outsourcing takes income tax revenues away from governments, hurting programs for displaced workers and eroding the middle class. However, Indian programmers employed by Hexaware Technologies scoff at such attitudes, arguing that outsourcing can actually benefit the United States by allowing Americans to concentrate on other economy-boosting activities, such as invention and innovation. Recurring disruptions of America's industrial status quo are a historical fact, and the U.S. economy has always emerged from these cycles stronger than before. Compared with previous disruptions, the IT upheaval is proceeding at a faster pace and in many cases is derailing people in the middle of their careers. The only effective long-term solution is for affected professionals to readjust their career goals and acquire new skills. Brute-force IT jobs such as product fabrication, testing, maintenance, and upgrading are going overseas, but domestic talent can flourish in imagining, creating, and marketing new products.
    Click Here to View Full Article


