


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to [email protected].
Volume 5, Issue 576:  Wednesday, November 26, 2003

  • "Blackouts Highlight Network Vulnerabilities"
    CNet (11/25/03); Lemos, Robert; Loney, Matt

    A report from the Renesys data analysis firm recently concluded that the summertime power outages that plagued North America and Europe had a much greater impact on the Internet than previously thought, and cast infrastructure vulnerabilities into sharp relief. "While the very largest provider networks--the Internet backbones--were apparently unaffected by the blackout (in North America), many thousands of significant networks and millions of individual Internet users were offline for hours or days," including manufacturers, banks, hospitals, schools, ISPs, and federal and state government units, the study found. Renesys estimated that the Aug. 14 North American blackout struck over 9,700 customer networks of more than 3,500 organizations in the affected region, with one-third of these networks suffering from "abnormal connectivity outages." Of that portion, over 2,000 networks were severely disconnected for more than four hours, while over 1,400 networks were disconnected for more than 12 hours. Approximately 1,700 organizations owned networks plagued by abnormal connectivity outages, and almost 50 percent of those involved in global Internet routing suffered partial or total network disconnection in the blackout region. The Renesys report found no indication that cascading failures affected global Internet stability, and noted that the specific effects of the blackout on Internet availability were well-localized geographically. A report from the U.S.-Canadian Security Working Group uncovered no evidence to suggest that the Aug. 14 blackout was the work of malevolent parties, nor was it attributed to widely distributed worms and viruses. However, the task force did not rule out the possibility of critical infrastructure being affected by a cyberattack.
    Click Here to View Full Article

  • "Novel Processor Stirs Petascale Controversy"
    EE Times (11/24/03); Merritt, Rick

    Stanford University computer science professor William Dally stirred controversy at the SC2003 conference with his idea of a custom supercomputing chip that would enable a 2 Pflops system for just $20 million: The Merrimac CPU would utilize rewritten applications optimized for parallelism, allowing large 64-bit chips to do more processing and less waiting for off-chip memory (a toy sketch of this streaming style appears below). Dally said bandwidth is the critical issue in supercomputing design, not operations per second; the Merrimac chip clusters 64 64-bit units orchestrated by separate on-chip controllers and fed from a register hierarchy. At the 90 nm process, each of these 10 mm by 11 mm chips would cost just $200 and produce about 31 watts of heat, yet deliver an astounding 128 Gflops. Dally's team has not found a major supercomputer vendor willing to take up the project; it is working on a compiler and register files and has already hand-coded three streaming applications. IDC analyst Earl C. Joseph says supercomputer makers are constrained by the small size of the technical-computing market, which did not grow in 2001 or 2002 due to the downturn and the rise of clustered commodity systems. The government's High Productivity Computing Systems (HPCS) project promises to bolster the supercomputing field as Sun Microsystems, Cray, and IBM are all working on novel architectures. Cray, which has retained Dally as an advisor on its HPCS project, is pursuing specialty systems that will incorporate streaming, though not in the way Dally laid out in his presentation; IBM HPCS director Mootaz Elnozahy says his company is aiming for a system compatible with the PowerPC processor, but will do away with the concept of a conventional core; meanwhile, Intel is perceived as countering the specialized nature of the HPCS program with its own Advanced Computing Program meant to utilize off-the-shelf technologies for greater supercomputer performance. Elnozahy says he is worried that commodity technology may outshine his aggressive research: "What I am scared of is that someday we are going to be able to configure a supercomputer at the Dell Web site," he quipped.
    Click Here to View Full Article
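
    The core claim above is that performance comes from restructuring applications as streams, so arithmetic rarely waits on off-chip memory. The following Python sketch illustrates only that programming style; it is not Merrimac's toolchain, and the kernel, data, and chunk size are invented for the example.

```python
# Illustrative sketch of the "stream programming" style the article describes,
# not Merrimac's actual toolchain. The kernel and data are made up.

def saxpy_kernel(chunk_x, chunk_y, a):
    """A compute kernel that touches only the chunk it is handed.

    On a stream processor this chunk would sit in on-chip registers,
    so the arithmetic never stalls on off-chip memory."""
    return [a * x + y for x, y in zip(chunk_x, chunk_y)]

def run_stream(x, y, a, chunk_size=64):
    """Stream the inputs through the kernel one chunk at a time.

    Each element crosses the (simulated) off-chip memory boundary exactly
    once, which is the bandwidth-centric behavior Dally argues for."""
    out = []
    for i in range(0, len(x), chunk_size):
        out.extend(saxpy_kernel(x[i:i + chunk_size], y[i:i + chunk_size], a))
    return out

if __name__ == "__main__":
    xs = list(range(1024))
    ys = [2.0] * 1024
    print(run_stream(xs, ys, a=3.0)[:4])   # [2.0, 5.0, 8.0, 11.0]
```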

  • "Taking Cues From Mother Nature to Foil Cyberattacks"
    Newswise (11/25/03)

    A National Science Foundation-supported cyberdefense project operates on the premise that many computer systems are vulnerable to viruses, worms, and other forms of malware because they use identical software that has the same vulnerabilities, in much the same way that genetically similar individuals are susceptible to the same diseases or disorders. The project, which enlists collaborators from Carnegie Mellon University and the University of New Mexico through a $750,000 NSF grant, is investigating how "cyber-diversity," like biodiversity, can bolster systems' resistance to dangerous agents. "Our project seeks to reduce computer vulnerability by automatically changing certain aspects of a computer's software," explains Carnegie Mellon researcher Dawn Song. "Adapting this idea in biology to computers may not make an individual computer more resilient to attack, but it aims to make the whole population of computers more resilient in aggregate." Earlier attempts to diversify software had independent teams develop different versions of the same software in the hopes that different sets of vulnerabilities would evolve from each version, but researchers call such an approach time-consuming and economically costly. University of New Mexico computer science professor Stephanie Forrest says they are exploring ways to automate the diversity process, which could be more effective and less economically taxing. NSF program director Carl Landwehr says the Carnegie Mellon-New Mexico collaboration represents the kind of innovative research his organization expects to encourage through its CyberTrust program.
    Click Here to View Full Article
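
    One concrete form of automated diversity discussed in the security literature is instruction-set randomization: each machine scrambles its code with a private key, so attack code injected without the key decodes to garbage. The toy Python sketch below illustrates only that general idea, with byte strings standing in for machine code; it is not the Carnegie Mellon/New Mexico implementation, and all names and data are invented.

```python
# Toy illustration of one automated-diversity idea (instruction-set
# randomization). Byte strings stand in for machine code; this is not the
# NSF project's actual technique.
import os

class DiversifiedHost:
    """Each host XOR-scrambles its code with a private, randomly chosen key."""

    def __init__(self):
        self.key = os.urandom(1)[0]        # per-machine secret, chosen at install time

    def load(self, code: bytes) -> bytes:  # legitimate code gets scrambled with the key
        return bytes(b ^ self.key for b in code)

    def execute(self, scrambled: bytes) -> bytes:
        # The "CPU" unscrambles before running. Code injected without the key
        # unscrambles to garbage, so one exploit no longer fits every machine.
        return bytes(b ^ self.key for b in scrambled)

host_a, host_b = DiversifiedHost(), DiversifiedHost()
payload = b"attack"                         # attacker's injected bytes, no key
print(host_a.execute(host_a.load(b"legit")))                # b'legit': proper code still runs
print(host_a.execute(payload) == host_b.execute(payload))   # usually False: hosts now differ
```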

  • "Will December Make or Break the Internet?"
    Register (UK) (11/24/03); McCarthy, Kieren

    The future of the Internet could depend on the outcome of the upcoming World Summit on the Information Society in December, where government leaders from over 60 nations are expected to convene. The chief topic of debate will likely be who exactly governs the Internet, which has spawned two competing viewpoints: The United States, Europe, and other English-speaking countries want private industry, in the form of U.S.-based ICANN, to continue its management of the Internet's infrastructure, while developing countries favor the International Telecommunications Union (ITU) as Internet governor. ICANN and ITU's conflict over Internet governance has been percolating since the Internet was created, and the summit will be where the issue finally comes to a head. ICANN has acquired a bad reputation for being out of touch with the general public, so the body was reorganized to give government a greater influence on policy--while the ITU, having rolled out practically all forms of modern communication, would seem the logical choice for Internet overseer. However, U.S. technology allowed networks to be set up by individuals on the proviso that people made their networks freely available to all, and the ITU's attempts to push its model of having incumbent telecom providers supply computer-to-computer communications aroused suspicion among the Internet community. Though ICANN is slated to become independent in several years, developing nations do not like the idea of an Internet disproportionately controlled by primarily Western and English-speaking nations, while ICANN supporters argue that placing the ITU in charge would threaten the Internet's democratic nature by allowing governments to censor citizens. It is likely that ICANN will prevail, given that its advocates control all but one of the Internet's root servers and possess the most Internet expertise; but muffling the voices of influential people who want the ITU to play a greater role in Internet governance would unbalance the Internet. However, neither side in this debate is likely to reach an agreement or compromise, which will only prolong the battle and ruin the chances of implementing a coherent international solution to the problems of spam and viruses.
    Click Here to View Full Article

  • "The Rise of the Machines"
    Japan Times (11/25/03); McNicol, Tony

    Japan's interest in robotics extends across many commercial products and research efforts, ranging from Sony's Aibo robot dog to humanoid machines such as Qrio and Asimo to Sakura Sanae, a wholly computer-generated character currently employed as a diplomatic envoy to ASEAN nations. Artificial intelligence and humanoid robotics research has been a heavy focus of government capital in recent years, and scientists are trying to develop machines that can take up the slack for a Japanese workforce that threatens to be significantly depleted by age and retirement in the coming decades. Professor Hirohisa Hirukawa's government-funded Humanoid Robotics Project, which concluded this year, set the goal of developing robots capable of driving vehicles, acting as security guards, and caring for the elderly. By being able to function in the same environment and operate the same equipment as humans, humanoid robots obviate the need to overhaul factories and other places where people work, Hirukawa argues. However, funding requirements and budgetary considerations are pressuring researchers to deliver a practical humanoid machine by 2010, even though conservative estimates suggest such a machine is at least two decades away. "We must find some real applications within five years, because we need big investment of the order of several million dollars a year to continue the development," notes Hirukawa. The Advanced Telecommunications Research Institute International's Atom Boy Project, which aims to create a humanoid robot that can move and think like a five-year-old, will require an annual budget of half a billion dollars for 30 years; project creator Mitsuo Kawato says the project is similar to the Apollo moon shot and its success would be an enormous boost to Japanese pride: "We need to have some dream from technology," he argues.
    Click Here to View Full Article

  • "Java Toolmakers Work for Peace"
    InternetNews.com (11/25/03); Singer, Michael

    A new group of software firms that use Java heavily is calling for a unified application framework that would allow Java tool extensions to be written once for all standards-based Java integrated development environments (IDEs). Third-party and independent software developers would be able to easily ensure interoperability with the major vendors' development tools, speeding innovation in the application development industry, says Gartner analyst Mark Driver. The effort, tentatively called the Java Tools Community, includes SAP, BEA Systems, Compuware, and Sybase, and is headed by Sun and Oracle. The group is in strategic talks with IBM and Borland. Currently, open source and independent developers are forced to write tool extensions for each vendor IDE, such as Oracle's JDeveloper, the Eclipse/WebSphere Studio Application Developer, SunOne Studio/NetBeans, and Borland's JBuilder. The effort is an outgrowth of Java Specification Request (JSR) 198, which established a common application interface for Java IDE extensions. The Java Tools Community needs to establish a scope of responsibility and work out a formal structure for collaboration, according to one Sun official. The program is similar to the IBM-initiated Eclipse project, which is meant to create a general-purpose IDE. Eclipse recently won further support from Oracle, which joined the Eclipse board to ensure interoperability with its own development platform; in doing so, Oracle acknowledged the growing influence of Eclipse and the necessity of providing easy links to its own Oracle9i Application Server and Oracle9i database.
    Click Here to View Full Article

  • "Senate Approves Antispam Bill"
    CNet (11/25/03); McCullagh, Declan

    After more than six years of congressional sparring, a national antispam bill is poised to be approved by Capitol Hill and later the White House once the legislation is submitted to President Bush next month. The current version of the Controlling the Assault of Non-Solicited Pornography and Marketing Act (CAN-SPAM), which received Senate approval on Nov. 25, is a compromise bill that will make major spammers targets for criminal prosecution, FTC prosecution, and hefty lawsuits, according to Sen. Ron Wyden (D-Ore.). The Senate-approved CAN-SPAM bill features several revisions to an earlier version approved by the House. The new legislation requires email senders to follow specific regulations, such as offering a simple way for recipients to unsubscribe from future mailings, even after obtaining their "affirmative consent"; state attorneys general would have less difficulty in stopping spammers who forge headers or bounce mail through networks "accessed without authorization," while the FCC and FTC would be able to more easily acquire cease-and-desist orders to use against such spammers; federal judges would not be granted more discretion in giving "reasonable attorney fees" to state attorneys general; and the definition of spam sent to mobile devices would be broadened. Tech trade associations, the NetChoice coalition, and the Direct Marketing Association see CAN-SPAM as a promising development, but some antispam proponents argue that the bill is riddled with loopholes: It favors an opt-out policy that permits bulk, unsolicited email advertisements, and the FTC is allowed, but not required, to set up a no-spam registry. Additionally, the bill would invalidate tougher statutes in U.S. states, including one that requires spam to be clearly labeled as advertising. EmailLabs' Loren McDonald argues that CAN-SPAM essentially legalizes junk email, and warns that "it could increase the volume of what we consider to be [spam] emails."
    Click Here to View Full Article

  • "Q&A: Improved Security Requires IT Diversity"
    Computerworld (11/24/03); Vijayan, Jaikumar

    Security guru and author Bruce Schneier contends that physical security is not a function of technology but a function of people: Technology by itself cannot make people safer because that is not its purpose; safety comes from how people implement and use technology. Schneier argues that his report "CyberInsecurity: The Cost of Monopoly" is not a condemnation of Microsoft's operating system per se--it is not the operating system that lies at the core of IT security problems, but rather the prevailing monoculture, which carries greater risks than benefits. Schneier explains that bad patching and the lack of secure software are attributable to economic rather than technical problems, and claims that the solution is to essentially hack the business climate. He suggests that software manufacturers should be made liable for the damages users suffer as a result of insecure software, which will give them a direct economic incentive to fix those vulnerabilities. Full public disclosure of security holes is also forcing software companies to take security seriously, while virus and worm outbreaks--and the publicity they generate--are an additional source of pressure for CEOs. Schneier says that patching is a useless gesture, given that there is an overabundance of patches marked by generally poor performance--and what is more, companies cannot catch up with the rate of vulnerability disclosure. He argues for shifting the focus from threat avoidance to risk management (a toy cost comparison appears below), which requires placing the CFO in charge of security, since security people have too narrow a view to make such decisions. Schneier admits that measuring effective security is difficult, because "there is no standard benchmark against which to measure your own security."
    Click Here to View Full Article
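
    The interview gives no figures, but the risk-management framing above usually comes down to weighing expected losses against the cost of controls. The Python sketch below shows a generic annualized-loss-expectancy comparison with made-up numbers; it is offered only to make the trade-off concrete and is not a method prescribed in the article.

```python
# Made-up numbers illustrating the risk-management framing: compare the expected
# annual loss from an exposure against the cost of a countermeasure. This is a
# generic ALE calculation, not anything prescribed in the interview.
incidents_per_year = 0.4          # estimated frequency of a successful attack
loss_per_incident = 250_000       # estimated dollar impact of one incident
countermeasure_cost = 60_000      # annual cost of patching, monitoring, etc.

annualized_loss_expectancy = incidents_per_year * loss_per_incident
print(f"expected annual loss: ${annualized_loss_expectancy:,.0f}")   # $100,000
print("countermeasure pays for itself"
      if countermeasure_cost < annualized_loss_expectancy
      else "cheaper to accept the risk")
```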

  • "ORNL to Design High-Speed Experimental Network Called UltraNet"
    EurekAlert (11/24/03)

    Oak Ridge National Laboratory (ORNL) has been awarded a $4.5 million grant from the Department of Energy's Office of Science to develop the Science UltraNet, a prototype dedicated high-speed network that will facilitate the design of high-performance computing networks for large-scale scientific research projects at universities and the DOE. ORNL's Nageswara Rao, who will manage the three-year UltraNet project with colleagues Bill Wing and Tom Dunigan, says the system could mark a revolutionary advancement in large-scale data transfer, while also providing remote computational steering, distributed collaborative visualization, and remote instrument control. ORNL will construct the prototype network infrastructure around existing optical networking technologies in order to enable development and testing of the scheduling and signaling equipment needed to process user requests and expedite system optimization. The UltraNet will run at 10 Gbps to 40 Gbps, which represents a 200,000- to 800,000-fold increase over the fastest dial-up connection. The project calls for the establishment of a testbed network that extends from ORNL to Atlanta, Chicago, and Sunnyvale, Calif. Rao notes that "the data transmittal requirement plus the control requirements will demand quantum leaps in the functionality of current network infrastructure as well as networking technologies." Nanotechnology, climate modeling, genomics, and astrophysics are just a few disciplines that stand to benefit from the UltraNet. "We're developing a high-speed network that uses routers and switches somewhat akin to phone companies to provide dedicated connections to accelerate scientific discoveries," explains Rao.
    Click Here to View Full Article

  • "OSDL to Detail Development Process of Linux Kernel"
    eWeek (11/25/03); Galli, Peter

    Through a new initiative to be announced on Nov. 26, the Open Source Development Lab (OSDL) plans to detail the Linux kernel's development process to Linux customers and to anyone considering Linux. The goal of the effort is to boost the confidence levels of Linux stakeholders, especially in light of the forthcoming Linux 2.6 kernel release in December. An OSDL representative declared Tuesday that Linux creator and OSDL Fellow Linus Torvalds will release a test11 version of the kernel to the open-source community, prior to the public debut of the 2.6 production version. The representative says the lab considers Torvalds' leadership of kernel development a critical ingredient in the community's creation of powerful software over more than a decade. OSDL CEO Stuart Cohen attempted to defuse the SCO Group's attacks on Linux and the open-source community by arguing that such criticism demonstrates "a lack of understanding as to the rigor imposed by...[Torvalds] himself and the development community at large." He says the development process involves thousands of developers self-organized into particular subsystems shaped by each developer's interest and technical skill; a subsystem maintainer is appointed to coordinate the work of other subsystem developers, and these maintainers assess the code sent to them and subject it to broader peer review. Once the code is accepted by a maintainer, it is submitted to Torvalds, who oversees the Linux development kernel, or Andrew Morton, who oversees the production kernel; final arbitration over what code is incorporated into the kernel is Torvalds' responsibility. Cohen further points out that all Linux code is published online so it may be examined by the public.
    Click Here to View Full Article

  • "802.16a Links the Last Mile"
    Wireless Newsfactor (11/24/03)

    The emergence of the IEEE 802.16a standard for wireless metro-area networks could give broadband adoption a much-needed boost, especially among the millions of users who lack access to cable- or DSL-based Internet. The standard, also known as WiMAX, furnishes wireless, last-mile broadband connectivity over frequency bands below 11 GHz; 802.16a offers marked improvements in non-line-of-sight performance, and is well suited for locations where trees, buildings, and other obstacles are present; stations do not have to be installed on high-altitude structures such as mountains or towers--they can instead be mounted on homes or buildings. Reflections from obstacles may distort the radio frequency signal, but 802.16a correctly interprets the transmission at the base station. Wireless 802.16a point-to-point connections, OC-x, or DS3 lines can then supply backhaul to the Internet. WiMAX deployment will also be driven by the inexpensive manufacture of standards-based products and their interoperability. WiMAX can give companies business-quality broadband service with the technology's maximum throughput of 75 Mbps (a back-of-the-envelope illustration appears below), while secure transmissions and authentication can be supported through 802.16a's privacy and Triple-DES encryption. Wireless 802.16a's delivery of robust performance is derived from support for licensed and license-exempt band operation below 11 GHz; forward error correction, which boosts transmission reliability; high spectral efficiency; space/time coding; support for sophisticated antenna methods; and adaptive modulation. In addition, WiMAX offers low latency for voice over IP and other delay-sensitive services, which is critical for companies that desire voice as well as data services from their broadband service supplier.
    Click Here to View Full Article
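
    The quoted 75 Mbps peak can be sanity-checked with simple arithmetic: peak rate is roughly channel width times spectral efficiency, and adaptive modulation trades that rate against link quality. The Python sketch below assumes a 20 MHz channel and illustrative per-modulation efficiency values; neither figure comes from the article.

```python
# Illustrative arithmetic only: how channel width and modulation efficiency
# combine into the peak rates quoted for 802.16a. The 20 MHz channel and the
# efficiency values below are assumptions for this example, not article figures.

CHANNEL_HZ = 20e6   # assumed channel width

# Adaptive modulation: better link quality -> denser modulation -> higher rate.
EFFICIENCY_BITS_PER_HZ = {
    "robust (sparse modulation)": 0.5,
    "moderate link":              1.5,
    "good link":                  3.0,
    "peak (dense modulation)":    3.75,
}

for condition, bits_per_hz in EFFICIENCY_BITS_PER_HZ.items():
    mbps = CHANNEL_HZ * bits_per_hz / 1e6
    print(f"{condition:>27}: ~{mbps:.0f} Mbps")
# The densest entry lands near the 75 Mbps peak cited in the article; the
# sparser ones show why throughput falls as the link degrades and the radio adapts.
```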

  • "IETF Ponders Internationalized E-Mail"
    Network World Newsletter (11/24/03); Marsan, Carolyn Duffy

    Several proposals for establishing internationalized email standards have been submitted to the Internet Engineering Task Force (IETF), which met earlier this month to discuss the issue. "We need to decide if we want a multi-lingual Internet with English as one language or an English language Internet with some capability for handling other scripts," notes IETF leader John Klensin, who rejects proposals in which non-English speaking users are forced to communicate with one another in English. "My belief is that if we don't deal with the localization problem well, systems will be deployed that are localized and incompatible with each other and globally," he warns. The Internationalizing Mail Addresses in Applications (IMAA) proposal is regarded as a "quick fix" that uses encoding and translation methods to convert foreign-language characters into ASCII, keeping revisions to email applications or other Internet applications that employ email addresses to a minimum. The trade-off means that nonsensical characters would show up in the "from" line if users fail to update their email clients to support the protocol. These characters might cause users to misinterpret the email as spam, notes IMAA draft document co-author Paul Hoffman. On the other hand, the IMAA proposal utilizes the same encoding and normalization protocols the IETF adopted in 2002 for internationalized domain names, although many IETF engineers regard the standard as "a hack." The other proposal offers a much cleaner solution that involves updating all email standards and applications to replace ASCII characters with UTF-8 characters; systems that are not upgraded accordingly would reject such incoming mail.
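
    The "nonsensical characters" mentioned above are a consequence of the ASCII-compatible encoding standardized for internationalized domain names, which the quick-fix approach would reuse. The minimal Python sketch below shows that existing IDN mechanism only, not the IMAA draft itself, and the domain is made up.

```python
# Illustrative only: the ASCII-compatible ("punycode") encoding standardized for
# internationalized domain names, which the quick-fix approach would lean on.
# The domain is made up for the example.

domain = "bücher.example"            # hypothetical non-ASCII domain
ascii_form = domain.encode("idna")   # the form legacy software can carry
print(ascii_form)                    # b'xn--bcher-kva.example' -- the "nonsensical
                                     # characters" an un-upgraded mail client would show
print(ascii_form.decode("idna"))     # bücher.example -- round-trips for upgraded clients
```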

  • "When Free Isn't Really Free"
    New York Times (11/23/03); Schwartz, John

    The share-and-share-alike attitude that characterized the early days of the Internet and subsequent dot-com frenzy has been replaced with an environment where free movies, music, and software often come with many strings attached. The Center for Democracy and Technology and some lawmakers are pulling for legislation to combat a rising tide of adware and spyware that makes its way into people's computers by way of downloaded free content and programs; the file-sharing software Kazaa, for example, comes with a software bundle from Brilliant Digital Entertainment that, among other things, enlists the user's computer in a grid network used to power music downloads. In addition, TruSecure malicious-code research director Bruce Hughes warns that as much as 45 percent of software available on the Kazaa network is infected with viruses. InterMute CEO Ed English, whose company creates ad-blocking software, says free software and content online actually carries costs in terms of privacy and productivity, and surmises that many regular computer users simply toss out their PCs when they become overburdened with superfluous software. Privacy expert Richard M. Smith was among the first to uncover many of these rogue programs, and recently encountered new tactics when cleansing his daughter's computer: When googling for the Spybot anti-spyware program, Smith found that one of the invasive programs generated a spoofed Web page that masked the link to the download. New York University communication studies director Siva Vaidhyanathan says nothing online is ever free after all, since someone has to pay for the PC and Internet connection. "What's really at stake here is who's going to get the money and in what-sized pieces," he says. The way out of the morass, say experts, may be more legitimate and innovative efforts such as Apple's iTunes music store.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Why Tech Is Still the Future"
    Fortune (11/24/03) Vol. 148, No. 11, P. 119; Arthur, W. Brian

    Information technology and its transformative effects on the economy will spur the creation of new industries and productivity growth, writes Santa Fe Institute Citibank Professor W. Brian Arthur. He argues that the digital revolution cycle is the same as previous technology cycles--there is a boom period followed by a crash, which is then followed by a vast buildout that facilitates major economic changes and leads to many prosperous years. The economic transformation Arthur cites falls into three key areas: Increasing connectivity stemming from the advent of wired and wireless networks; the growth of smarter businesses as the result of digital technology revolutionizing supply-chain processes and management; and totally new subindustries being spawned from digitalization. The author writes that this last transformation comes out of the fact that industries encounter rather than adopt IT, with the subsequent changes to the industries giving birth to new industries as traditional industrial operations crossbreed with digitization operations. Arthur points to statistics showing a 2.5 percent yearly rise in output per hour since 1995, with computer-intensive industries recording the greatest productivity increases. He adds that such growth is likely to continue, since the digital revolution will stretch on for decades. Arthur reports that tech vendors will be obvious beneficiaries of the digital economic transformation, but the chief recipients will be "the Wal-Marts, Fords, and FedExes of business that use the technology most effectively and efficiently." Although the author notes that the steady offshore IT migration appears to threaten America's global competitive edge, he contends that this could actually be a positive development: For one thing, other prosperous nations can bolster trading and political stability, while U.S. leadership in scientific advances should allow America to maintain its prevailing momentum for many years.
    Click Here to View Full Article

  • "Baffling the Bots"
    Scientific American (11/03) Vol. 289, No. 5, P. 36; Bruno, Lee

    A great many spam emails are attributed to bots, rogue software programs that masquerade as people in order to set up email accounts and automatically launch junk messages. To thwart such programs, researchers such as Carnegie Mellon University's Manuel Blum and Henry Baird of Xerox's Palo Alto Research Center have designed CAPTCHAs (completely automated public Turing tests to tell computers and humans apart). A CAPTCHA works by degrading an image--a word, for instance--and requiring a user to identify it correctly in order to proceed with an account registration. In principle, CAPTCHAs can be easily solved by a person, yet are impossible for a bot to decipher. One CAPTCHA, EZ-Gimpy, chooses one out of 850 words and distorts the letters, and the puzzle is employed by Yahoo! and other Internet mail services. However, CAPTCHAs such as EZ-Gimpy may not prove as successful against later generations of bots, so researchers are building more sophisticated CAPTCHAs to combat them. Baird and Monica Chew of the University of California at Berkeley have devised BaffleText, a CAPTCHA that randomly creates several distorted words whenever someone logs onto a Web site; the person must type the words into a blank space in order to proceed. Auditory and visual CAPTCHAs are also being researched. Baird says, "This is our arms race. There's no question that bots are going to become more and more sophisticated."
    Click Here to View Full Article
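
    The mechanism described above--render a word, degrade the image, and ask the user to read it back--can be sketched in a few lines. The toy Python example below (using the Pillow imaging library) is illustrative only; it is far cruder than EZ-Gimpy or BaffleText, and the word list and distortions are invented.

```python
# Toy CAPTCHA in the spirit described: render a word, then degrade the image.
# Illustrative only; real CAPTCHAs such as EZ-Gimpy use much stronger distortions.
# Requires the Pillow imaging library.
import random
from PIL import Image, ImageDraw, ImageFont

WORDS = ["orange", "castle", "render", "puzzle"]      # stand-in word list

def make_captcha(word, size=(160, 60)):
    img = Image.new("L", size, color=255)             # blank grayscale canvas
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()
    x = 10
    for ch in word:                                    # jitter each letter's position
        draw.text((x, 20 + random.randint(-8, 8)), ch, fill=0, font=font)
        x += 18
    for _ in range(400):                               # sprinkle noise pixels
        img.putpixel((random.randrange(size[0]), random.randrange(size[1])),
                     random.randint(0, 255))
    return img

challenge = random.choice(WORDS)
make_captcha(challenge).save("captcha.png")            # image shown to the user
# A human types the word back; the server simply compares the answer to `challenge`.
```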

  • "What's Wrong Here?"
    Computerworld (11/24/03) Vol. 31, No. 53, P. 35; Hoffman, Thomas

    Fifty-six percent of 936 professionals polled in Computerworld's 2003 Job Satisfaction Survey report being less satisfied with their companies this year than they were last year, while 55 percent say their advancement opportunities are unsatisfactory; 69 percent of respondents think their jobs do not allow them to reach their full potential, and 59 percent claim job stress levels have risen since last year. A recent survey of 5,000 American households conducted by The Conference Board estimates that fewer than 50 percent of Americans find satisfaction with their jobs. CIOs and IT career specialists attribute some of the worker dissatisfaction to fallout from the dot-com boom, when bonuses and other benefits were frequent. "With the economy being tighter and IT [opportunities] not nearly as wide open as a few years ago, people in IT might feel more stressed and more trapped," notes Domino's Pizza CIO Tim Monteith. An economic recovery could spur dissatisfied IT workers to defect to other employers. Meanwhile, increased offshoring is sowing doubt among IT workers that their profession has any future. Cooper Tire & Rubber business analyst Aaron Carr adds that employee firings or voluntary exits are often followed by a redistribution of work among remaining personnel, which can leave employees overloaded; Carr's pursuit of an MBA springs from his desire to increase his involvement in the business side of his company's IT projects. Budgetary issues have led to training cutbacks at some companies, while in other cases IT workers say they lack the time to attend training sessions. Still, 77 percent of the Computerworld survey respondents are happy they followed an IT career path.
    Click Here to View Full Article

  • "The Technology of the Year: Social Network Applications"
    Business 2.0 (11/03) Vol. 4, No. 10, P. 11; Pescovitz, David

    It started with the Oracle of Bacon, a Web site set up by a group of university students showing how actor Kevin Bacon had links throughout the Hollywood universe. Since then, the concept of social networking and the Internet have joined to become a potentially powerful force in business, society, and even homeland security. So-called social network applications allow people to find the business contacts or knowledgeable experts they need quickly, unearthing social capital otherwise left undiscovered. Cap Gemini Ernst & Young vice president Michael Cleland wanted a contact inside a large semiconductor firm and used software from Spoke Software to discover a colleague was a good friend of a manager inside the target firm. The Spoke system builds a map of the company's human connections, indexing emails, calendars, address books, and buddy lists. Combined with Web searches, the software can compile profiles of potential customers. Other social networking applications have similar aims, such as Visible Path, which claims to increase the size of contract wins and shorten sales cycles by 27 percent. That software can estimate the strength of human relationships by analyzing email patterns, such as how often correspondence occurs and what average response times are. Besides business, social networking software is used to track viral infections and terrorist organizations. In-Q-Tel, the Central Intelligence Agency's venture capital arm, has invested in Systems Research & Development, which monitors customers and employees for links to known criminals and con artists. The technology also has more prosaic uses, as demonstrated by the rapidly growing Friendster network where people create personal profiles and can get to know friends of friends. However, with the proliferation of social networking technology, a privacy backlash can be expected.
    Click Here to View Full Article
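
    The core technique described--mining correspondence into a graph of who knows whom, scoring tie strength, and finding chains of introductions--can be illustrated compactly. The Python sketch below is not Spoke's or Visible Path's method; the names, email log, and frequency-based strength score are invented for the example.

```python
# Illustrative sketch: turn correspondence into a contact graph, then find the
# shortest chain of introductions between two people. Names and data are made
# up; this is not Spoke's or Visible Path's actual algorithm.
from collections import defaultdict, deque

emails = [                        # (sender, recipient) pairs, e.g. from mail logs
    ("alice", "bob"), ("alice", "bob"), ("bob", "carol"),
    ("carol", "dave"), ("alice", "erin"), ("erin", "dave"),
]

strength = defaultdict(int)       # more correspondence -> stronger assumed tie
graph = defaultdict(set)
for sender, recipient in emails:
    strength[frozenset((sender, recipient))] += 1
    graph[sender].add(recipient)
    graph[recipient].add(sender)

def introduction_chain(start, target):
    """Breadth-first search: the shortest path is the shortest chain of referrals."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for contact in graph[path[-1]] - seen:
            seen.add(contact)
            queue.append(path + [contact])
    return None

print(introduction_chain("alice", "dave"))     # ['alice', 'erin', 'dave']
print(strength[frozenset(("alice", "bob"))])   # 2 -- alice and bob correspond most often
```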

  • "Salary Survey 2003: What Are U.S. Developers Worth?"
    Software Development (11/03) Vol. 11, No. 11, P. 32; Morales, Alexandra Weber

    On the supply side, IT talent is healthy but is about to undergo significant changes, according to the 2003 Software Development salary survey, which estimates that 4 percent of this year's nearly 6,000 respondents lost their jobs in the last 12 months. Nine percent of respondents cited offshore outsourcing as the reason they are seeking employment, while 29 percent cited downsizing; 10.5 percent of staff and 65 percent of managers said they are affected by outsourcing. The average salary of professionals expecting to be outsourced was $87,000, while the average salary of those who were not was $83,000. The median salary for staff experienced a steady increase from $70,000 to $77,000 between 2000 and 2002, but did not experience any increases this year; meanwhile, the lowest-paid staff witnessed a salary decline from $65,000 to $64,000. Managers had a much better year, with median salary growth from $92,000 to $96,000 and median pay growth from $95,000 to $100,000. There was a steady fall-off in almost all benefits--stock options, certification reimbursement, education and training, etc.--over the past four years, while respondents indicated significant rises in median years of experience and average number of years they expect to work for their current companies; salary satisfaction levels were more or less the same as last year, with 3.5 percent reporting extreme dissatisfaction, 14 percent claiming dissatisfaction, 25 percent neutral, 42 percent reporting satisfaction, and 17 percent reporting a high degree of satisfaction. Rates of master's and doctoral degrees in computer science held steady at 15 percent and 1 percent, respectively, while bachelor's degrees experienced a decline from 32 percent to 29 percent between 2000 and 2003. There was a dramatic drop in the percentage of respondents who were contacted by a recruiter, from 69 percent in 2000 to 36 percent in 2003; 13 percent of survey respondents were women, who earned median salaries of $81,000, compared to $88,000 for men.


