Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to technews@hq.acm.org.

Volume 5, Issue 451: Wednesday, January 29, 2003

  • "Internet Attack Shows Vulnerability of System"
    SiliconValley.com (01/28/03); Heim, Kristi

    The Slammer worm, whose attack over the weekend represented the most serious online assault in 18 months, demonstrates that the Internet remains highly vulnerable. The worm infected defenseless machines, reproduced itself, and sent out large volumes of data traffic that disrupted many systems, including Bank of America ATMs, high-tech manufacturing, mortgage and credit card companies' Web sites, and police dispatch operations. More than 200,000 North American computers and between 400,000 and 700,000 computers worldwide were affected by the worm, according to the Information Technology Information Sharing and Analysis Center. Asian businesses suffered some of the worst disruption, and both American and South Korean authorities are seeking the worm's author, a task complicated by the hacker's unknown location. Slammer exploited a known security hole in Microsoft's SQL Server database software for which a patch had been available since last year; the havoc the worm caused proved that many users had never installed it. Ensuring Internet security and keeping abreast of security patches and vulnerabilities is an almost impossible challenge, says Computer Security Institute director Patrice Rapalus. "If individuals or organizations are determined to exploit whatever kinds of flaws there are in the millions of lines of code for different applications, you have no real defense," she explains. The SQL flaw was one of several recent security embarrassments for Microsoft, including other vulnerabilities that were exploited by the Code Red and Nimda worms in 2001.
    http://www.siliconvalley.com/mld/siliconvalley/5048021.htm
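Slammer's speed follows from its design: each infected host fires copies at addresses chosen at random, so the infection grows along the classic logistic curve until the vulnerable population is exhausted. A minimal sketch of that dynamic in Python, with illustrative round numbers rather than measured Slammer parameters:

```python
# Toy epidemic model of a random-scanning worm. All parameters are
# illustrative round numbers, not measured Slammer values.
ADDRESS_SPACE = 2 ** 32  # IPv4 addresses a random scan can land on

def simulate(vulnerable=75_000, scans_per_sec=4_000, seconds=600):
    """Return infected-host counts, one entry per simulated second."""
    infected = 1.0
    history = [infected]
    for _ in range(seconds):
        susceptible = vulnerable - infected
        # Chance that any single random probe hits a still-vulnerable host:
        hit_rate = susceptible / ADDRESS_SPACE
        infected = min(vulnerable, infected + infected * scans_per_sec * hit_rate)
        history.append(infected)
    return history

h = simulate()
```

The curve is logistic: roughly exponential doubling at first (here about every ten seconds), then saturation once most vulnerable hosts are already infected. In this model the spread rate scales directly with how many probes per second each host can send, which is why flooding the network was the worm's main effect even on uninfected systems.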

  • "Users Uneasy On SBC Claim To Patent On Web Tool"
    New York Times (01/28/03) P. C4; Harmon, Amy

    SBC Communications sent out letters last week claiming it holds the patent on a widely used Web navigation method; it asserts that any Web site that has a menu that stays on the screen while a user looks through other pages may have to pay royalties--a claim that "would affect hundreds of thousands of sites," according to WebCreators CEO David VanderVeer. SBC's Jason Hellery says his company has outlined licensing terms of $527 to $16.6 million annually, with the cost dictated by licensees' yearly revenue. In response, Web developers have provided examples indicating that the Web navigation technology may have been in use prior to SBC's 1996 patent. Some have said that the method was introduced with the premiere of the Netscape 2.0 browser one year earlier. In recent years, companies such as SBC have tried to enforce patents on highly popular software operations, to the point where computer industry veterans contend that the patents do not adhere to the Patent and Trademark Office's rules that only "nonobvious" inventions can receive patents. "The thing that's disturbing to me is that all of a sudden so many groups seem to be going out and doing this," notes Marilynne Eichinger, founder of the Museum Tours site, which received one of SBC's letters. Certain economists are concerned that innovation could be stifled because of legal disputes over licensing. Others, however, claim that inventors are encouraged to innovate precisely because of the enormous profit potential of inventions that are widely accepted.
    http://www.nytimes.com/2003/01/28/business/28LEFT.html
    (Access to this site is free; however, first-time visitors must register.)

  • "Internet Worm Unearths New Holes"
    Washington Post (01/29/03) P. A1; O'Harrow Jr., Robert; Cha, Ariana Eunjung

    This past weekend's outbreak of the Sapphire worm demonstrates that the increasing linkage of computer systems to the Internet is creating unexpected vulnerabilities. The worm, which proliferated with remarkable efficiency using a well-known flaw in Microsoft server software, inundated computers with data and clogged network connections, causing, among other things, the shutdown of approximately 13,000 Bank of America ATMs, the blackout of a Seattle-based emergency call center, and the severing of Internet access for most South Korean computer users. This was in spite of the fact that a patch for the Microsoft security hole was issued last year. Even Microsoft suffered slowdowns thanks to the worm, because it failed to patch some of its own computers. "[The attack is] showing us the cutting edge of where people or organizations are becoming prematurely reliant on the Internet," says SecurityFocus' Kevin Poulsen. "It's showing us interdependencies we didn't know existed." The upcoming National Strategy to Secure Cyberspace is supposed to cover the study of such interdependencies, according to a person who has read the report. The Sapphire incident also highlights companies' failure to deploy security patches, although security specialists such as Roman Danyliw of the Computer Emergency Response Team (CERT) note that keeping up to date with new security flaws and network connections, as well as maintaining those connections, can be expensive and time-consuming for many companies.
    http://www.washingtonpost.com/wp-dyn/articles/A57550-2003Jan28.html

  • "H-1B Visa Awards Drop in '02"
    Computerworld Online (01/28/03); Thibodeau, Patrick

    The total number of approved H-1B visas in 2002 was 79,100, compared to 163,000 the year before. Bob Cohen of the Information Technology Association of America says this drop-off clearly proves that the market "is self-regulating--that companies are using the program in the way that it was intended to be used." The number of extensions the Immigration and Naturalization Service (INS) granted to H-1B holders, together with visas for people who work for organizations exempt from the 195,000 visa cap, also dipped: last year they numbered 215,000, versus 342,000 in 2001. The current visa cap is expected to drop to 65,000 in the 2004 fiscal year, even though technology groups will likely lobby Congress for an increase. In the meantime, a grass-roots movement of unemployed IT workers opposed to a cap increase is growing. The U.S. Bureau of Labor Statistics estimates that 94,000 computer scientists are currently jobless, and George F. McClure of the IEEE's Career and Workforce Policy Committee pegs the unemployment rate in that field at 5.1 percent. Foreign workers with H-1B visas "are all competing for the same small pot of jobs, and we don't think that is a good thing," he explains.
    Click Here to View Full Article

  • "Companies Test Prototype Wireless-Sensor Nets"
    EE Times (01/28/03); Johnson, R. Colin

    Four years after it was proposed by the Defense Advanced Research Projects Agency (DARPA), the wireless-sensor network concept has reached the prototype phase and is being tested by over 100 groups worldwide, according to David Culler of the University of California at Berkeley. The university and Intel collaborated on the development of smart sensors, or "Motes," that can self-organize into networks via TinyOS (operating system) and TinyDB (database). The impetus behind the original DARPA proposal was to create a "smart-dust" network in which thousands of wireless sensors are distributed over a battlefield, where they could collate and filter raw data, then transmit the most relevant information to central command; such sensors would enable soldiers to "see around corners" and detect chemical and biological weaponry long before they encounter it. The decision to release the Mote hardware and software as open source allows researchers around the world to build their own smart-sensor networks for military and civilian uses. Examples of the latter include environmental monitoring and tracking elderly people's health without infringing on their freedom. Culler explains that a reliance on local rules is key to forming self-organizing networks, and the Motes can run these rules simultaneously, creating a hierarchical network. The Motes feature a modular configuration, allowing the separation and specialization of the sensor, the energy source, and the physical packaging; TinyDB, which is modular as well, handles data aggregation.
    http://www.eetimes.com/at/news/OEG20030128S0028
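The "local rules" Culler describes can be pictured with a toy aggregation query: each node combines its own reading with its children's partial results and forwards only the summary one hop up the hierarchy, so the root computes a network-wide average without ever seeing most raw readings. The tree and readings below are invented for illustration; this is plain Python, not actual TinyOS/TinyDB code:

```python
# Toy in-network aggregation over a sensor tree: each node forwards only
# a (sum, count) summary upward. Tree shape and readings are invented.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1"],
        "a1": [], "a2": [], "b1": []}
reading = {"root": 20.0, "a": 21.5, "b": 19.0,
           "a1": 22.0, "a2": 23.5, "b1": 18.0}

def aggregate(node):
    """Local rule: merge own reading with children's partial aggregates."""
    total, count = reading[node], 1
    for child in tree[node]:
        t, c = aggregate(child)
        total += t
        count += c
    return total, count  # only this summary travels up one hop

total, count = aggregate("root")
print(round(total / count, 2))  # prints 20.67, the network-wide average
```

Forwarding a two-number summary instead of every raw reading is what makes the scheme attractive for battery-powered Motes, since radio transmission dominates their energy budget.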

  • "FBI's Computer Upgrade Develops Its Own Glitches"
    Los Angeles Times (01/28/03) P. A1; Schmitt, Richard B.

    The Trilogy project, an attempt to upgrade the FBI's antiquated computer systems, has run into trouble since it was launched with the blessing of Congress. A source close to the matter says the project's original projected budget of $458 million will increase by 30 percent, which bureau officials claim is necessary in order to improve records management and information sharing, as well as guarantee security. Critics such as Sen. Judd Gregg (R-N.H.) see Trilogy as "a large disaster" typical of FBI technology projects plagued by delays and cost overruns. Despite skepticism, many FBI offices have been equipped with desktop computers, while FBI CIO Darwin John recently sent a letter to the inspector general in which he asserted that the bureau is on track to fix many of the management problems cited in a recent report. The report indicates that the project has been hampered by poor planning and an underestimation of how deficient existing structures really are--for example, the delivery of many desktop computers was delayed because fiber-optic cable had yet to be installed in certain field offices. Additional points of criticism from the inspector general include the FBI's failure to complete a system-wide installation of hardware and other equipment by its promised deadline of July 2002; the new target date is March 31, while the deployment of user-application software has been pushed back to June 2004. The inspector general's report finds that "The Trilogy project provides an example of how the nonimplementation of fundamental [information technology] investment management practices can put a project at risk of not delivering what was promised, within cost and schedule requirements." A bill passed by the Senate last week to cut the FBI's funding for high-tech projects by about $100 million could be another blow to the beleaguered Trilogy project.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "More Trouble Ahead for Moore's Law?"
    NewsFactor Network (01/27/03); Martin, Mike

    Technical complications arising from smaller chip sizes may short-circuit Moore's Law, according to electrical engineering professor Laszlo Kish, who teaches at Texas A&M University. He says thermal noise and a lower noise-tolerance threshold work against Moore's Law, which Intel co-founder Gordon Moore proposed in 1965. Since then, semiconductor devices have steadily increased in density while shrinking in size, to the point that chips are now regularly manufactured at the 100-nanometer scale. Kish explains that as feature sizes decrease, lower capacitance--the ability to store an electrical charge--leads to increasing thermal noise. That in turn puts pressure on designers, who need to keep bit-error rates at a minimum. They can do so by increasing power, but that increases heat dissipation. On top of all this, Nancy Hantman of the IEEE says smaller wires do not carry signals as well as larger ones. Kish says the combination of these factors will limit chip components to the 40-nanometer scale, which Moore's Law predicts will be reached in six to 10 years. Semiconductor companies are not oblivious to these problems and are devising work-arounds. Hantman notes that Dow Chemical, for example, is developing a "spin-on polymer deposition technique" to help signal transmission on small chips. IEEE Spectrum associate editor Harry Goldstein also points out that engineers have continually found ways to create ever-smaller chips, and that similar ingenuity can deal with the resulting complications.
    http://www.newsfactor.com/perl/story/20571.html
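Kish's capacitance argument can be made concrete with the standard kT/C noise formula: the RMS thermal-noise voltage on a node of capacitance C is sqrt(kT/C), so the noise floor rises as shrinking geometries shrink C. A rough room-temperature sketch (the capacitance values are illustrative, not tied to any real process):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

def ktc_noise_volts(capacitance_farads):
    """RMS thermal (kT/C) noise voltage across a capacitance."""
    return math.sqrt(k_B * T / capacitance_farads)

# Illustrative node capacitances, not measurements of any real process:
for c in (1e-13, 1e-15, 1e-18):  # 100 fF -> 1 fF -> 1 aF
    print(f"C = {c:.0e} F  ->  noise ~ {ktc_noise_volts(c) * 1e3:.2f} mV")
```

A thousandfold drop in capacitance raises the noise voltage by sqrt(1000), about 32x; against logic swings that are themselves falling toward a volt, tens of millivolts of noise leaves far less margin, which is the bit-error-rate pressure Kish describes.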

  • "PSINet Europe Study Reveals Massive Vulnerabilities"
    Web Host Industry Review (01/28/03); Eisner, Adam

    Company networks and servers risk random attack by hackers from the very day their Internet connections are established, yet many companies have not deployed the appropriate safeguards to shield their IT assets, according to a recent study from PSINet Europe. The study's findings were based on the attacks suffered over the holiday season by an exposed, defenseless "dummy test" server set up by PSINet Europe. In all, the server was assaulted 626 times over the three-week testing period, and 460 intrusions took place within 24 hours of its installation. The study lists Western Europe and the United States as the most common points of origin for the attacks, which goes against the traditional belief that Eastern Bloc countries are the usual "hacker hotspots," and suggests that companies should redirect their security initiatives accordingly. The study also finds that broadband and cable ISPs were the source of a significant number of attacks, indicating that rising capacity is widening the scope of hacker targets, techniques, and tools. The most common European points of origin were Germany, Italy, the Netherlands, and Britain, while Russia, Bulgaria, and Romania barely registered. "The issue here is not just that firms aren't doing enough to protect themselves, but also that they are not spending enough time analyzing exactly what the threat is to their online presence, and what security measures would best serve to protect them," notes PSINet's Stephen Scott.
    http://thewhir.com/features/hackers.cfm

  • "What Next for the Internet?"
    Australian Technology & Business (01/22/03); Kidman, Angus

    The evolution of the Internet will be marked by diversity, and by developments that will serve business users and consumers alike, or offer advantages to one sector while negatively impacting the online experience of the other. Some 200 U.S. universities and industries support Internet2, a collaborative environment chiefly concerned with promoting Internet technologies that further America's dominion over the communications sector; these groups are focusing on real-time network collaboration, better videoconferencing, more predictable networks, etc. Future developments will require programmers to be solidly grounded in XML, even though end users take it for granted, while the value of XML-based standards such as Web Services Description Language will increase. Although Microsoft's Internet Explorer currently dominates the browser market, its reign could end with the advent of interfaces that respond to more basic user cues, such as speech. Concern is growing over the issue of Internet access and content being affected by political factors, as demonstrated by online censorship in China; meanwhile, a debate is brewing in the United States over whether content providers such as the entertainment industry should have the final say on the disposition of their digital works in order to curb online piracy, or whether that decision should be left to consumers--an argument that is likely to continue as broadband services become more affordable. Haptics technology, which allows people to interact with virtual environments via touch, is already finding use in the medical sector and could be employed in many training applications. 
Wireless Internet access is emerging in two forms: Local wireless access to fixed line networks via Bluetooth and Wi-Fi standards, and broadband services through 3G mobile phone spectrum; the rollout of the latter lags behind that of the former because lucrative 3G corporate applications are scarce, while security issues may be hindering the deployment of Wi-Fi services. Finally, LG Technologies' prediction that an Internet-enabled refrigerator will be the locus of all digital household activities may come to pass, if the appropriate infrastructure is set up.
    Click Here to View Full Article

  • "Bell Labs to Collaborate on Flexible Displays"
    Photonics.com (01/23/03)

    Bell Labs will team up with DuPont and Sarnoff to develop thin and flexible displays that use organic light-emitting diodes, in a project funded through the National Institute of Standards and Technology's Advanced Technology Program. Bell Labs researchers will create organic compounds from which they will build thin-film transistors (TFTs) that can be printed onto plastic panels from DuPont using printing processes also developed by Bell Labs. Meanwhile, Sarnoff will develop the technology to enable full-color moving images on the displays. The ultimate goal of the collaborative effort is the creation of bright, high-contrast display panels that can be refreshed rapidly, can be viewed from multiple angles, and can even be rolled up; their production costs should be lower than those for displays made from silicon and glass--some researchers believe they could be manufactured through cheap methods such as ink-jet printing--and there will be no need for backlighting. The flexible displays could be used in a wide array of products, including cell phones, video games, personal digital assistants (PDAs), and large-scale screens. Bell Labs' director of materials research, Elsa Reichmanis, says the convergence of multiple disciplines is key to the project's success. John Rogers, director of nanotechnology research at Bell Labs, says the center's leadership position in the display arena was cemented by a joint venture between Lucent and E Ink that led to the development of flexible displays dubbed "electronic paper." Lucent products could also feature technology yielded from the current flexible display project.
    http://www.photonics.com/readart.asp?url=readarticle&artid=153

  • "File-Sharing Service Says Studios, Labels Misuse Copyrights"
    Los Angeles Times (01/28/03) P. C1; Healey, Jon

    Less than two weeks after U.S. District Judge Stephen V. Wilson ruled that American music labels can sue Sharman Networks, distributor of the software used by the popular Kazaa file-sharing service, for copyright infringement, Sharman has alleged that the entertainment industry is misusing copyrights for the purpose of monopolizing the digital content market. The Kazaa owner has asked Wilson to grant an injunction to block copyright enforcement until such abuse has been halted. Spokespersons from the Recording Industry Association of America (RIAA) and the Motion Picture Association of America dismiss Sharman's claims as an excuse to move the spotlight away from Kazaa's piratical operations; one RIAA spokesman describes it as "akin to an arsonist burning down his home and then seeking sympathy for being homeless." Electronic Frontier Foundation attorney Fred von Lohmann is skeptical that allegations of copyright misuse will ever be thoroughly examined in court--for one thing, claims are difficult to prove, and there are doubts that companies such as Sharman can afford the expense of finding and gathering evidence. Central to Sharman's claim is the entertainment industry's dealings with Altnet, a company that established a secure distribution network within Kazaa that promotes the downloading of sanctioned, copy-protected content. Prior to Sharman's launch in 2002, company founder Nicola Hemming became privy to Altnet's intentions, the counterclaim asserts; her plan was to distribute authorized, copy-protected works on Kazaa via Altnet technology as an alternative to piracy. Sharman attorney Roderick G. Dorman alleges that the copyright owners jointly refused to license Altnet, which violates antitrust laws. He further insinuates that the entertainment companies misused their copyright by supplying content to their own online ventures while shutting out Sharman and Altnet.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Developers Turn to Linux, Stunt Microsoft Growth"
    Reuters (01/26/03); Kennedy, Siobhan

    Last year, a number of large IT vendors lined up behind Linux, which analysts say portends tough times for Microsoft in the enterprise business market. LinuxWorld 2002 featured large showings from the likes of IBM, Hewlett-Packard, Dell, Sun Microsystems, and Advanced Micro Devices, along with the usual crowd of small Linux developers and service companies. Mike Petitjean operates a small software firm that provides services to local governments, and says he switched to the Linux platform because of its cost, reliability, and open nature. A number of organizations in government, academia, and financial institutions have also switched from Unix to Linux; Linux International executive director Jon "Maddog" Hall says groups began to seriously evaluate the business value of Linux in 2002. Goldman Sachs analyst Tom Berquist says Linux is taking market share from Unix in large data centers and in process-intensive applications, but that Microsoft dominates at the day-to-day operational level. However, he notes that Microsoft will come into increasing conflict with Linux as it moves into the higher regions of corporate computing, and as Linux gains acceptance and comes out with better department-level usability. International Data Corp. (IDC) pegs Linux as having 26 percent of the new server market in 2002, compared to about 42 percent for Microsoft and roughly 12 percent for Unix.
    http://www.reuters.com/newsArticle.jhtml?type=topNews&storyID=2111821

  • "Out of This World: NASA Tests Mobile IP in Space"
    Washington Technology Online (01/24/03); Jackson, Joab

    NASA is using the space shuttle Columbia's orbital mission between Jan. 16 and Feb. 1 to test a mobile Internet protocol in space, notes Operating Missions as a Node on the Internet (OMNI) program leader Jim Rash. The shuttle's onboard Linux-based computer maintains a live connection to the Goddard facility in Greenbelt, Md., and relays signals to NASA satellites or ground stations. Aboard the shuttle is an embedded PC module with a 233 MHz processor, 128 MB of RAM, and a 144 MB solid-state disk. In the course of the mission, the Goddard crew will have approximately 140 opportunities to contact the shuttle computer via IP. Once the shuttle establishes contact with the satellites or ground stations, the crew will use the computer to upload or download files and perform administrative tasks remotely. The success of the test may allow IP to extend its scope to more of NASA's core communications, while NASA could save money by maximizing the use of commercial gear and software. The IP test is a component of a larger initiative to test a global positioning system-based transceiver, which is being conducted under the aegis of the Fast Reaction Experiments Enabling Science, Technology, Applications, and Research program.
    http://www.washingtontechnology.com/news/1_1/daily_news/19926-1.html

  • "Software Innovator David Gelernter Says the Desktop Is Obsolete"
    Application Development Trends Online (01/28/03); Vaughan, Jack

    Yale University computer scientist and veteran developer David Gelernter says he is now focusing on creating tools that make it easier for users to find "stuff" on their computers and otherwise improve the end user's computer experience. Gelernter says the mouse, icon, and windows metaphors are no longer able to manage the flood of information on most people's PCs. He says, "As email and the Web became a big thing, it was clear that the hierarchical file systems and tools we've inherited from the 70s would not work." To solve this problem, his company, Mirror Worlds Technologies, has released a beta version of Scopeware, software that runs atop normal desktop operating systems. Scopeware, available free via download, allows users to search for standard documents on their PC by keyword, but presents the results as a visual, time-sequenced narrative. Gelernter says the user should determine the presentation of information, not the machine. "I want my information management software to have the same shape as my life, which is a series of events in time," he says. "I want the flow to determine the shape of the picture I see on the screen." Gelernter says that future iterations of Scopeware could allow a community of users to share documents pertinent to them through peer-to-peer systems. Gelernter was instrumental in devising the parallel programming techniques that allowed for the Linda language; his work also laid the foundation for Java and distributed memory architectures. He says it's now time to create software "for the user as an everyday tool," not to meet the needs of code developers.
    http://www.adtmag.com/article.asp?id=7187
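Gelernter's "flow" can be sketched as a data structure: a stream of documents ordered by time, with keyword search returning a filtered view of the same stream rather than a folder hierarchy. The documents below are invented, and this is only an illustration of the idea, not Scopeware's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Doc:
    timestamp: float  # seconds since epoch
    title: str
    text: str

class Stream:
    """A time-ordered document stream, newest first, filterable by keyword."""
    def __init__(self):
        self.docs = []

    def add(self, doc):
        self.docs.append(doc)

    def view(self, keyword=None):
        # A view is always the same stream, optionally narrowed by keyword.
        hits = [d for d in self.docs
                if keyword is None
                or keyword.lower() in (d.title + " " + d.text).lower()]
        return sorted(hits, key=lambda d: d.timestamp, reverse=True)

s = Stream()
s.add(Doc(1.0, "Budget", "Q1 budget draft"))
s.add(Doc(3.0, "Trip notes", "budget hotel list"))
s.add(Doc(2.0, "Memo", "staffing plan"))
print([d.title for d in s.view("budget")])  # prints ['Trip notes', 'Budget']
```

The point of the design is that time, not folder location, is the organizing axis: every query returns a slice of the same chronological narrative, which is the "shape of my life" presentation Gelernter describes.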

  • "Viruses Get Smarter"
    Computerworld (01/27/03) Vol. 37, No. 4, P. 21; Verton, Dan

    Security experts warn that computer viruses are becoming more subtle and sophisticated, as well as more numerous. Polymorphic programs are one emerging threat, an example being megaworms, which target multiple vulnerabilities and can propagate using multiple techniques. "Today, the line between worms and viruses is blurred as successful designs take on characteristics of both and spread over the Internet," explains Dan Ingevaldson of Internet Security Systems' X-Force Group. "The most successful worms act like a Swiss Army Knife, because they can spread by using many different proven methods, such as mass email, Web server vulnerabilities or peer-to-peer technologies." Sophos technology consultant Chris Wraight notes that such "combined cocktail threats" will be much harder to fight than previous worms and viruses. Virus writers will focus on exploiting new weaknesses, such as attaching viruses to Windows NTFS files as linked alternate data streams (ADS). There are also signs that the goal of malicious code writers is shifting from data destruction to data theft. Furthermore, researchers such as Symantec's Vincent Weafer caution that Linux and Unix systems are not invulnerable, as demonstrated by the Linux Slapper worm epidemic last September. In addition, there are indications that virus and worm writers are studying Microsoft's .Net Framework and developing malicious programs that take advantage of vulnerabilities in the framework and associated executable files. Weafer says the best defense against these looming threats is to keep security up-to-date and eliminate redundant services.
    Click Here to View Full Article

  • "Simply Secure Communications"
    CIO (01/15/03) Vol. 16, No. 7, P. 100; Patton, Susannah

    Virtual private networks (VPNs) can be clumsy and arduous to set up and use, especially for IT managers; in response, vendors are offering alternative access tools based on Secure Sockets Layer (SSL) technology, such as "instant virtual networks." Deployment of SSL VPNs is cheaper than that of IPsec VPNs, notes Aberdeen Group analyst Eric Hemmendinger, while Meta Group's David Thompson says SSL offers greater stability. However, employees who need to connect to non-Web-enabled applications will still require a client/server VPN, and some question the security of SSL systems. Catholic Health System (CHS) is using an SSL VPN so that clinicians can access medical records through Web browsers, while a similar connection was developed for the lawyers of Sonnenschein, Nath, and Rosenthal; CIO Andrew Jurczyk says the SSL system costs approximately 50 percent of an earlier, traditional VPN deployment. Other companies are outsourcing their remote access systems through providers, while screen-sharing technology is another option. Meta Group expects 80 percent of companies to access the Web via SSL by 2006. It is also expected that many companies will opt to integrate VPNs and VPN alternative technologies in the coming years.
    http://www.cio.com/archive/011503/et_article.html

  • "Hidden Pitfalls"
    InfoWorld (01/20/03) Vol. 25, No. 3, P. 1; Angus, Jeff

    Companies that rushed to implement packaged enterprise applications in an effort to avoid expensive internal system development are feeling the pinch of hidden costs. Five layers of potential hidden costs for packaged apps and project planning can be outlined, extending from within the project to outside it. The most obvious, most predictable costs are budgeted costs such as those for staff, software, hardware and operating system upgrades, testing, and consultants; the next layer includes implementation costs that a knowledgeable project manager should consider in scheduling, such as customization and tech support. The third layer covers tangential costs, including integration, software/hardware upgrades, and post-deployment training. The fourth layer contains external costs that still impact departments, such as a decline in effectiveness early on in the learning curve, and required changes to other systems. The fifth and final layer of cost, which is especially difficult to account for, involves the organization structure and superstructure, and can include costs derived from internal and external politics, missed opportunities, and responses to unmet project goals. Atlantic Guild Systems principal Tom DeMarco also points to an unavoidable hidden cost of packaged apps that will hit companies hard: latent turnover, in which many developers, disenchanted by doing mundane maintenance chores because of an immense backlog of packaged application development that dates back to the late 1990s, will defect to other employers once the economy bounces back. Meanwhile, American Management Systems principal Dave Perkins foresees a defection of strategists fed up with an "idea backlog" stemming from the emphasis on packaged applications.
    http://www.infoworld.com/article/03/01/17/030120fepkgapp_1.html

  • "Digital Dilemmas"
    Economist--Survey (01/23/03) Vol. 366, No. 8308, P. 3; Manasian, David

    In the same way previous technological breakthroughs such as the railroad and automobile produced economic bubbles and then rapid societal change, we can expect computer and Internet technologies to vastly change our world in the coming years. Economist legal affairs editor David Manasian says the rapid growth of the Internet resulted in many unrealistic expectations, but the technology that powered it continues to advance unabated. Molecular electronics developed by companies such as Hewlett-Packard and IBM promise to extend Moore's Law for another 50 years, while Intel predicts current silicon-based technology will improve processing power for another 15 years. MIT computer science laboratory director Victor Zue expects broadband Internet connections to be nearly free in developed countries within five years. He is also working on some applications of these advances, especially MIT's Project Oxygen, which aims to enable computing to seamlessly integrate with our natural home and office environments. Tim Berners-Lee, Zue's colleague at MIT and a founder of the World Wide Web, is also pushing for a next-generation, intelligent Web called the Semantic Web. Microsoft researchers are working on technology that would let people digitally archive their entire lives' phone conversations, email, and pictures. As the current economy eventually pulls out of its depression, we can expect another wave of rapid change as society latches onto new applications afforded by the Internet and related technology. At that time government and business will be even more urgently pressed to deal with the resulting political and legal complications.
    http://www.economist.com/displayStory.cfm?Story_id=1534303

  • "Information System's Roles and Responsibilities: Towards a Conceptual Model"
    Computers and Society (09/02); Sagheb-Thrani, Mehdi

    The information system development (ISD) process could benefit from a conceptual model that outlines the relationship between the concepts of information systems' (IS) roles and responsibilities. The definition of IS relevant to the model is a set of reciprocally connected elements that gather or retrieve, process, store, and disseminate information critical to decision-making, coordination, control, and analysis in an organization; how the user of such a system is categorized can be important to the IS' "technical" and "psychological" success, as well as the concept of IS roles and responsibilities. An IS is considered user friendly if the user--anyone directly or indirectly affected by the system--spends more time doing actual work than struggling to operate the system, with a minimum of error. There are different ISes for the three general management levels: Transaction Process Systems (TPS) and Office Automation (OA) systems support the lowest level, operational management; Management Information Systems (MIS) support the middle management level; Decision Support Systems (DSS) support the middle and senior management levels; Expert Systems (ES) serve all three levels; and Executive Support Systems (ESS) support senior management. Also essential is determining whether the IS is defined as a user's subordinate, controller, or colleague--this definition can help people better understand the impetus behind ISD and successful IS deployments. Organizational factors influencing--and influenced by--IS roles and responsibilities include size, culture, politics, environment, operating procedures, and management decisions. Introducing an IS can bring about profound changes in an organization, leading to either centralization or decentralization. 
From the conceptual model, it may be concluded that there is a reciprocity between the concepts of organization, the IS, and IS type; the organizational concept influences the concept of the role of the IS; the concept of IS role impacts IS responsibility; the IS responsibility concept stems from the concept of IS role; and the concept of IS responsibility can help successfully implement an IS project.
    Click Here to View Full Article