Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 5, Issue 586:  Monday, December 22, 2003

  • "Offshore Jobs in Technology: Opportunity or a Threat?"
    New York Times (12/22/03) P. C1; Lohr, Steve

    The offshore outsourcing of U.S. jobs is seen by many as an inevitable and unstoppable trend on the road to a globalized economy, but projections from Forrester and other research firms about the volume of domestic jobs destined to migrate overseas are less alarming than they may seem. American workers have been fearing for their job security ever since Forrester released a report last year predicting that 3.3 million U.S. services jobs would be transferred to lower-wage countries by 2015, with the IT industry at the vanguard of this offshore movement. However, analysts contend that a development project staffed with cheaper overseas IT labor can end up saving a company less than one staffed domestically: the analysis, design, and deployment stages of such projects usually involve face-to-face meetings, where cultural and communications barriers can lead to higher costs and reduced effectiveness. Furthermore, the Forrester report estimates that only 14 percent of the 3.3 million jobs expected to be outsourced by 2015 are in computer services, a number that is less cause for worry when one considers that around 3.5 million new private sector jobs have been created every year for the past decade. In addition, cutting technology services costs boosts efficiency and productivity while controlling inflation. Analysts think the IT workers at the greatest risk of losing their jobs to software development offshoring are the pure programmers, while those with the best job security are the ones who can solve specific business problems with their talent; the move from coder to designer will be beyond the capabilities of some IT workers. Academics and research organizations believe the transition period will be less calamitous for the U.S. IT workforce if certain wage insurance schemes are enacted.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Growth of the Internet May Take Nothing Short of a Revolution"
    Wall Street Journal (12/22/03) P. B1; Gomes, Lee

    Upgrading the Internet so that 100 million U.S. households can access it at 100 Mbps--over 100 times faster than most high-speed home connections today--is the goal of the "100 by 100" consortium organized by four major universities and other research centers with a $7.5 million National Science Foundation grant. But achieving that goal hinges on a contentious debate about whether Internet technology should follow a "revolutionary" or "evolutionary" path. Carnegie Mellon University Professor Hui Zhang argues that an evolutionary strategy cannot be supported by the Internet's current fundamental design: at the root of the problem is the increasing complexity of the network, which is mostly invisible to average users. Zhang says the complexity of the Internet's routers has become so great that only a small number of companies can build them, while maintaining the operations of a large-scale network is becoming a costlier and more difficult proposition. The outcome of the evolution vs. revolution debate will determine how these problems are handled: while the current Internet functions on a "connectionless" scheme, there is advocacy for a future Internet with a "connected" architecture similar to telephone networks. Though Zhang thinks there are plenty of fiber-optic lines in the U.S. to support the backbone of even the swiftest Net, bridging the "final mile" between homes and the Internet is pricey and difficult, though new methods of using wireless communications could be helpful. Those who espouse the revolutionary developmental path will also have to overcome networking companies' resistance to technical changes that may hurt their competitiveness.

  • "Anti-Spam Law Just a Start, Panel Says"
    Network World (12/18/03); Caruso, Jeff

    A panel of experts at Network World's "Spam Forum: The Future of Email" agreed on Dec. 17 that the recently signed federal anti-spam law is a solid first step, but legislation alone is unlikely to significantly curb the glut of junk email clogging in-boxes. Eileen Harrington of the FTC's Bureau of Consumer Protection says the commission has too much to contend with--tracking identity theft and administering the National Do Not Call Registry, for instance--to effectively enforce the federal statute, and also suffers from a lack of funding for anti-spam efforts; she adds that tight wallets have made state anti-spam enforcement even less effective. The new law defines spam as unsolicited bulk email and allows political spam, which can be a problem for people who consider spam to be any undesirable message. Furthermore, the law permits "opt-out" marketing, in which a company can send users unwanted advertising until the users request that the company stop. The panel agreed that legislation needs to be complemented by the dissemination of technical tools and education for marketing firms and email users, while one topic of discussion revolved around how to create disincentives for marketers to spam: Mary Youngblood of ISP Earthlink's Abuse Team noted that the ISP could employ technology to recognize spamming behavior and block traffic from a bulk emailer after a certain threshold has been reached. The panelists saw little potential in a no-spam list--Harrington said that such a measure could take a very long time to move from concept to implementation; Postini President Shinya Akamine opined that the registry would be a major target for hackers; David Silver of Responsys declared that such a list would require specific definitions of acceptable and unacceptable email; and Merrill Lynch's Wilson D'Souza argued that the measure would only encourage spammers to move their operations offshore, beyond the reach of the law.
    Click Here to View Full Article

  • "Little-Known Leaders Make Their Mark"
    SiliconValley.com (12/21/03); Gillmor, Dan

    A number of influential technologists, creative thinkers, and deal-makers move the machine of Silicon Valley forward, away from the limelight: Stanford University Cyberlaw Clinic executive director Jennifer Granick, for example, recently scored a huge legal coup by getting the government to admit the wrongful conviction of Bret McDanel, who had warned customers of a software company's product flaws; McDanel was convicted of a computer crime even though he had done nothing wrong, a point Granick made clear in an appeals court brief. Her Cyberlaw Clinic operates a "Chilling Effects Clearinghouse" in conjunction with other advocacy groups. Broadband pioneer Dewayne Hendricks sits on the FCC's Technological Advisory Council and is also participating in an effort to bring gigabit broadband links to every California home via wireless connections. Hendricks is convinced that the only major obstacles to such connections, which he says would render cable and telephone companies obsolete, are entrenched corporate interests and the lack of political motivation. Wireless entrepreneur Dave Sifry gets credit for the Technorati Weblog search engine that allows people to gauge new developments in the Weblog revolution; Sifry sees Technorati as providing a window into how information is developed from the bottom up through Weblog networks. Don Knuth, author of "The Art of Computer Programming," continues to earn accolades for his definition of "elegant" computer code. He is currently working on Volume 4 of his seminal and definitive work. Well-known open-source pioneer Linus Torvalds points to Brian Behlendorf, one of the creative forces behind the Apache Web server, as another not-so-well-known open-source leader; Behlendorf created the first Web sites for many companies, including the Wired magazine site, and is pleased the Apache Software Foundation is driving continued development of the Apache system.
    Click Here to View Full Article

  • "Mother Nature Recruited for War on Cyber Terror"
    NewsFactor Network (12/19/03); Martin, Mike

    The National Science Foundation (NSF) is sponsoring research at Carnegie Mellon University and the University of New Mexico with the goal of bolstering network security by applying the principle of bio-diversity to computer systems. The most successful diseases are those that attack a group of genetically similar individuals with a common flaw for the pathogen to exploit; David Hart of the NSF notes that this concept is echoed by worms and viruses that exploit the same vulnerabilities in computers running the same software. Bio-diversity is the biological countermeasure for disease, and the University of New Mexico/Carnegie Mellon researchers are exploring its computerized equivalent, dubbed cyber-diversity. "We are looking at computers the way a physician would look at genetically related patients, each susceptible to the same disorder," notes Carnegie Mellon's Mike Reiter, while colleague Dawn Song reports that the project aims to make computers less exploitable by automatically modifying software. Cyber-diversity forces hackers to formulate and implement a different attack strategy for each computer and software version. "Adapting the idea of diversity in biology to computers may not make an individual computer more resilient to attack, but it aims to make the whole population of computers more resilient in aggregate," Song says. Earlier software diversity efforts were too costly and time-consuming, because they focused on having independent teams develop different versions of the same software; the University of New Mexico's Stephanie Forrest says the automated strategy is potentially more economical and could yield greater software diversity. NSF program director Carl Landwehr declares that the cyber-diversity project is representative of the type of creative thinking the NSF's Cyber Trust initiative wants to engender among researchers.
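    The population-level effect Song describes can be illustrated with a toy simulation; the randomized buffer offset, class names, and numbers below are invented for illustration, and the real project modifies software far more deeply than this.

```python
import random

# Toy sketch of cyber-diversity: each installation randomizes an internal
# detail (here, a buffer's start offset), so an exploit hard-coded against
# one machine's layout fails on most of the rest of the population.

class DiversifiedHost:
    def __init__(self, rng):
        # Automatically chosen at install time; differs per machine.
        self.secret_offset = rng.randrange(256)

    def attacked_with(self, guessed_offset):
        # The attack only lands if it guesses this host's exact layout.
        return guessed_offset == self.secret_offset

rng = random.Random(7)
population = [DiversifiedHost(rng) for _ in range(1000)]

# The attacker crafts an exploit against one compromised host...
exploit_offset = population[0].secret_offset

# ...but the same exploit compromises only the handful of hosts that
# happen to share that layout, not the whole population.
compromised = sum(h.attacked_with(exploit_offset) for h in population)
print(f"{compromised} of 1000 hosts fell to the single exploit")
```

With identical (undiversified) hosts the same attack would take all 1,000 machines, which is the aggregate resilience the researchers are after.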

  • "Software Shares Out Spare Processing Power"
    New Scientist (12/21/03); Graham-Rowe, Duncan

    SETI@home author David Anderson has developed a new system that will allow several distributed projects to be run simultaneously on a single computer, and enable users to apportion the amount of resources to be devoted to each project. The Berkeley Open Infrastructure for Network Computing (BOINC), which will be launched in January, is capable of running a number of screen-saver-type applications atop the PC's operating system. The software is particularly attractive to project managers as a measure to deter people from claiming their computers have processed more data than they actually have, because BOINC farms out each problem twice and checks the results. "If the answers are different we have to assume that one of those parties may have cheated," notes Anderson. BOINC is designed to shield users from computer viruses and certain varieties of cyber-attack via strong encryption, but Colin Low of Hewlett-Packard cautions that deploying such software still carries a certain amount of risk. "In fraudulent hands, a system like this could be used as a splendid and general mechanism for installing Trojan horse programs," he warns.
    Click Here to View Full Article
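    The duplicate-and-compare check Anderson describes can be sketched in a few lines; the work unit, the sum-of-squares computation, and the function names below are invented stand-ins for illustration, not BOINC's actual interfaces.

```python
def run_on_client(work_unit, cheat=False):
    """Toy stand-in for a volunteer machine crunching a work unit."""
    result = sum(x * x for x in work_unit)   # the "real" computation
    return result + 1 if cheat else result   # a cheater returns junk

def verify(work_unit, clients):
    """Farm the same work unit out to two clients and compare answers,
    as the article says BOINC does to deter inflated claims of work."""
    a = clients[0](work_unit)
    b = clients[1](work_unit)
    if a == b:
        return ("accepted", a)
    return ("mismatch: possible cheating", None)

work = [1, 2, 3, 4]
# Two honest clients agree, so the result is accepted.
print(verify(work, [run_on_client, run_on_client]))
# A cheating client disagrees with the honest one and is caught.
print(verify(work, [run_on_client, lambda w: run_on_client(w, cheat=True)]))
```

A real scheduler would also handle the case where both replicas are dishonest, typically by sending disputed units to a third machine.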

  • "PDA Translates Speech"
    Technology Research News (12/24/03); Patch, Kimberly

    A prototype two-way Arabic-English speech-to-speech translator that runs on a personal digital assistant has been developed by researchers at Carnegie Mellon University, Cepstral, Mobile Technologies, and Multimodal Technologies, demonstrating that automatic translation via portable device is possible. CMU computer science professor Alex Waibel says the project is part of a series whose goal is to arm the military with automatic translation for medical and force protection scenarios, as well as provide tourists at the 2008 Olympics in Beijing with similar products. The Speechalator, as the tool is called, features a built-in microphone and a language-selection button, while its software is comprised of a speech recognizer, a translator, and a speech synthesis engine, all of which come with certain modifications so that the device can handle spontaneous speech; Waibel says two-way translation can only be facilitated with two sets of each software component. In addition, the device can translate text such as street signs through a camera attachment. So far, the Speechalator must be switched between English-to-Arabic and Arabic-to-English translation modes, achieves an 80 percent accuracy rate under laboratory conditions, and can only handle people speaking about medical information; BBN Technologies researcher Bernard Suhm says the device could be applied to business and travel, but still needs to be subjected to field tests. Waibel says the next development phase is to boost the Speechalator's accuracy so ambient noise does not interfere, widen its coverage by gathering more data about how people communicate in different domains, and create learning algorithms that automatically catalogue different ways of stating the same things. A follow-up prototype that can be used for hotel reservations and medical situations is expected to be ready by next summer.
    Click Here to View Full Article
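    The recognizer-translator-synthesizer pipeline the article describes can be mocked up as three chained functions; the phrase tables, transliterations, and function names below are invented for illustration, and the real engines handle spontaneous speech rather than fixed phrases.

```python
# Toy mirror of the Speechalator's three-stage pipeline. Two-way operation
# needs two sets of components, per the article; here that reduces to the
# choice of phrase table per direction.

EN_AR = {"where does it hurt": "ayna yu'limuk"}   # invented mini-lexicon
AR_EN = {"ayna yu'limuk": "where does it hurt"}

def recognize(audio):
    """Stand-in for the speech recognizer: audio -> normalized text."""
    return audio.lower().strip()

def translate(text, table):
    """Stand-in for the translation engine: text -> target-language text."""
    return table.get(text, "<unknown phrase>")

def synthesize(text):
    """Stand-in for the speech synthesis engine: text -> spoken output."""
    return f"[audio: {text}]"

def speechalate(audio, direction):
    table = EN_AR if direction == "en->ar" else AR_EN
    return synthesize(translate(recognize(audio), table))

print(speechalate("Where does it hurt ", "en->ar"))
print(speechalate("ayna yu'limuk", "ar->en"))
```

The 80 percent accuracy figure in the article reflects how much harder the real recognition and translation stages are than this lookup suggests.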

  • "CERT/CC: Racing to Secure the Internet"
    Web Host Industry Review (12/19/03); Epperson, Wayne

    The CERT Coordination Center at Carnegie Mellon University promotes effective Internet security by employing a cadre of security experts to consult, test products for vulnerabilities, coordinate training courses, and take on other roles in the race between those dedicated to keeping networks secure and trustworthy and the malicious parties attempting to undermine them. CERT/CC team leader for incident handling Martin Lindner says the exchange and dissemination of data is a major challenge, as issuing network vulnerability advisories notifies both the good guys and the bad guys of the flaw's existence. Every day CERT/CC faces "the challenge...to come up with better ways of sharing and disseminating the information so that we defend the best interests of the critical infrastructure," he explains. Lindner notes that a new virus launch is typically followed by an infected user submitting the malware to an anti-virus company, which must then quickly develop and release a signature to shield against the new virus. He points out that every major worm of the last couple of years spread despite the fact that CERT/CC or some other security agency issued advisories calling attention to the software bugs the viruses exploited. Lindner adds that CERT's job has become even harder with the emergence of Web developers, who have their own security issues; "When we talk about needing to educate the developers, it is not just the software developers, it's the Web developers, too," he says. Lindner says network intruders have little incentive to innovate their attack strategies or tools because there are so many systems that do not deploy patches for known software flaws. Almost 3,000 network vulnerabilities were reported to CERT/CC in the third quarter of 2003, while 114,855 security incidents were reported over the same period.
    Click Here to View Full Article

  • "Linux Revolution: Asian Countries Push Open Source"
    Linux Insider (12/17/03); Krikke, Jan

    China's promotion of Linux open-source projects is setting the pace for much of Asia, where announcements of collaborative, government-driven Linux efforts have become a daily routine. The goal of these initiatives, the latest being the Japan-China-Korea (JCK) partnership, is to build a regional software industry where manufacturers do not have to pay royalties on foreign, proprietary products. JCK, which aims to devise open-source business models, train software engineers, and standardize software, splits labor up between the three collaborators: Software development and security will be Japan's responsibility, China will handle the development of PC operating systems, and Korea will create software for personal digital assistants; the Japanese IT Services Industry Association, the Chinese Software Industry Association, and the Federation of Korean Information Industries will coordinate the JCK initiative. A Japanese JCK representative emphasized that the three partners do not like being reliant on software they cannot control, while the People's Daily of China proclaimed "The monopoly of foreign office software in the Chinese market will be broken." The Asian Linux server market will also get a shot in the arm from the JCK partnership. The Asian adoption of Linux is partly spurred by worries about spyware: The Chinese military, for instance, embraced Linux out of fear that sensitive information could be compromised by security leaks. In addition, almost all major Western IT giants want to capture a piece of the Linux market through Chinese partnerships. The Chinese software market is expanding at an annual rate of 20 percent to 25 percent, and is expected to be worth approximately $30 billion in 2005.
    Click Here to View Full Article

  • "Building a Blueprint for Network Security"
    EarthWeb (12/17/03); Rubens, Paul

    Paul Rubens outlines a roadmap for securing a corporate network, and recommends that the patch management policy be evaluated first, as Gartner Group estimates that about 30 percent of damage to networks is attributable to tardy patching. The same study finds that 65 percent of intrusions stem from system misconfigurations; IT managers who are pressed for time, Rubens writes, can hire penetration testers to look for major network vulnerabilities, and he also suggests they conduct internal network security testing. An effective security architecture is a wise move--and one required by regulatory and fiduciary responsibilities. In the end, the architecture is the responsibility of a company's head of information security, and it must clearly designate roles, duties, and a policy framework across the entire hierarchy. Dividing the architecture into domains with individual security needs improves risk management, according to Rubens. The security architecture's effectiveness depends on the division of responsibility between internal staff and outside consultants: the safety of mission-critical elements requires the involvement of internal resources at all levels, while consultants can ensure that the company has an up-to-date risk management strategy. CERT advises companies to use the Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE) method, in which IT department staff and operational or business unit personnel formulate a security strategy based on employee knowledge. Rubens writes that effective security stems from the incorporation of a security culture in which all employees are aware of security measures and realize that they are valuable, not an inconvenience. He concludes that eternal vigilance is the price of good network security, so safety measures must be constantly re-assessed.
    Click Here to View Full Article

  • "Sun Invites IBM, Cray to Collaborate on High-End Computer Language"
    EE Times (12/16/03); Merritt, Rick

    Sun Microsystems wants to work with IBM and Cray on developing a low-level computing language that would be used for the petascale-class computer project, as well as for a broader group of scientific and technical computers. Sun says the Defense Advanced Research Projects Agency favors collaboration on the high-end computer language, rather than competition. The government is sponsoring a program, the High Productivity Computing Systems (HPCS) project, that has the companies competing to design petascale systems by 2010. Sun's Jim Mitchell says, "We think languages are one area where the three of us should cooperate, not compete." In using an architecture-independent, low-level software standard like Java bytecodes, the language would be able to work with the run-time environment of any computer. Sun views the Portable Intermediate Language and Run-Time Environment as a potential open industry standard. IBM and Cray could agree to collaborate with Sun on the new technical computing language, which offers performance benefits to scientific and technical computing. However, both companies' work on the project already includes software plans of their own that involve developing new languages and operating systems. Both IBM and Cray have yet to comment on Sun's offer.
    Click Here to View Full Article
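    The architecture-independence the article attributes to Java-style bytecodes can be seen in a toy stack machine; the instruction set and opcodes below are invented for illustration and are far simpler than any real intermediate language.

```python
# Toy sketch of the portable-intermediate-language idea: a program is
# compiled once to a small instruction set, and any host that provides an
# interpreter (its "run-time environment") executes it unchanged.

def run(program):
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])        # push a constant
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)          # pop two, push their sum
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)          # pop two, push their product
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()

# (2 + 3) * 4, expressed once in the intermediate form; the same list of
# instructions runs on any machine hosting an interpreter like run().
program = [("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]
print(run(program))
```

The portability comes from the interpreter, not the program: porting to a new architecture means reimplementing `run()`, not recompiling every program.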

  • "Smaller, Lighter Power Adapters Take the Weight Off Laptops"
    Penn State Live (12/16/03)

    Penn State researchers, in conjunction with Japan's Taiheiyo Cement and Face Electronics of Virginia, have developed a piezoelectric PC power adapter that is one-fourth the size of current adapters. Penn State electrical engineer Kenji Uchino notes that these smaller, lighter adapters are primarily geared toward the laptop or notebook computer market, but says the devices could be applicable to any appliance requiring an AC to DC converter and transformer. Printers, tape recorders, CD and DVD players, and other devices that run on both battery and wall plugs fall into this category. "Eventually we would like to make [the adapter] the size of a pen, but that is far away," explains Uchino. "When we can do that, the adapter will be a component of the laptop, not attached to the cord as a separate piece." Most laptops need approximately 15 volts of direct current with less than one amp of current and around 12 watts of power; a piezoelectric chip's length and width can be manipulated to convert 115 volts to 15 volts. Unlike conventional electromagnetic transformers, piezoelectric power adapters generate no heat or electromagnetic interference, and because the devices operate in the ultrasonic range, no noise is audible to humans. The piezoelectric transformer Uchino and his collaborators devised was detailed in The Proceedings of the 5th International Conference on Intelligent Materials.
    Click Here to View Full Article
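    The article's electrical figures are easy to sanity-check with Ohm's-law arithmetic; the variable names below are invented, and only the 12 W, 15 V, and 115 V values come from the article.

```python
# Check the cited numbers: a laptop drawing about 12 W at 15 V needs
# I = P / V amps, comfortably under the one-amp ceiling the article cites,
# and the piezoelectric transformer steps 115 V line voltage down to 15 V.

P_watts, V_out = 12.0, 15.0
I_amps = P_watts / V_out
print(f"current draw: {I_amps:.2f} A")            # under the 1 A figure

V_in = 115.0
step_down_ratio = V_in / V_out
print(f"step-down ratio: {step_down_ratio:.2f} : 1")
```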

  • "Embedded Linux: The New Home-Grown RTOS"
    Software Development Times (12/15/03) No. 92, P. 16; Correia, Edward J.

    Tight budgets and shorter project schedules are encouraging companies to adopt Linux as their embedded RTOS rather than build the operating system from the ground up. Wind River Systems' Dave Fraser notes that the challenges associated with developing more sophisticated software make organically built systems software infeasible; in addition to easing RTOS development, Linux offers standardized interfaces, a free OS kernel, widely available source code to help facilitate customization and debugging, and a global development community that can be tapped for expertise, software, and drivers. A recent survey of embedded developers conducted by Venture Development concluded that almost 85 percent of current Linux users and nearly 70 percent of those planning to use Linux turn to multiple vendors for their OS and tool needs. Fraser reports that one thing developers like about Linux is the opportunity to become part of the Linux community, while engineering teams value what Linux can do for their resumes. But it pays to keep several factors in mind: for one thing, Linux's memory protection complicates debugging, and developers must be careful to build competitive advantages under the GNU General Public License that do not have to be publicly disclosed. Another problem is the possibility that using tools from multiple vendors to solve problems can spawn additional difficulties. LynuxWorks CEO Inder Singh remarks that Linux is very appealing to former "roll-your-own" developers because they like to tinker. "Linux has traditionally been something you put together yourself, so people who want to do their own things [can use] Linux as a starting point," he explains; in addition, Linux not only gives do-it-yourselfers a leg up at project commencement, but can help them maintain and upgrade the RTOS later.
    Click Here to View Full Article

  • "The Next 25 Years of IT"
    InfoWorld (12/15/03) Vol. 25, No. 49, P. 54; Yager, Tom; Udell, Jon; Dickerson, Chad

    InfoWorld writers debate what IT advances may emerge in the next quarter-century: Tom Yager contends that only a small part of the path toward pervasive computing has been traversed thus far, but believes the model of tomorrow's pervasive systems lies in wireless phones with versatile, multimedia functionality, ease of use, and standardized technology; the next step includes reducing the cost of these devices, extending their communicability beyond carriers' coverage areas, and embedding cheap, minuscule, efficient, and adaptive cores wherever they might offer practical benefits. Chad Dickerson predicts that a transformation in the IT workforce will take place as outsourcing, increasing network speed, and seamless IT integration make location irrelevant, technology transparent, and information capable of being stored anywhere and transmitted everywhere within seconds. He forecasts that the home will become the new workplace, with IT professionals working for outsourcers and using collaboration tools that make commuting obsolete. Jon Udell has seen little progress toward AI researchers' goal of endowing computers with intelligence by making them capable of speech and comprehension, but points out that making computers mimic intelligence by using human behavior as a template is a much more realistic goal. He writes that adaptive Bayesian spam filters are a sign of innovations to come, starting with technology that can monitor and aid with personal information management tasks, collaborative efforts, and Web services network transactions; after this should come assistants that utilize voice and video, followed by the fulfillment of Apple's Knowledge Navigator vision. Steve Fox believes chipmakers will reach the physical limits of silicon transistors in 15 to 20 years, after which a biologically-inspired replacement for silicon will emerge. 
He notes that DNA computing, neural networking, and other biocomputing initiatives promise to tackle problems current computers are not equipped to handle, such as massively parallel operations and self-healing.
    Click Here to View Full Article
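    The adaptive Bayesian filtering Udell points to can be sketched with a naive Bayes classifier; the tiny training corpus and function names below are invented for illustration, and production filters use far larger corpora and more features than bare words.

```python
import math
from collections import Counter

# Each word votes via log-odds learned from labeled mail; relabeling new
# messages and retraining is what makes the filter "adaptive."

def train(messages):
    counts = {"spam": Counter(), "ham": Counter()}
    totals = {"spam": 0, "ham": 0}
    for text, label in messages:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def spam_score(text, counts, totals):
    score = 0.0
    for word in text.lower().split():
        # Laplace smoothing keeps unseen words from zeroing the estimate.
        p_spam = (counts["spam"][word] + 1) / (totals["spam"] + 2)
        p_ham = (counts["ham"][word] + 1) / (totals["ham"] + 2)
        score += math.log(p_spam / p_ham)
    return score  # > 0 leans spam, < 0 leans ham

corpus = [("cheap pills free offer", "spam"),
          ("free money fast", "spam"),
          ("meeting notes attached", "ham"),
          ("lunch at noon", "ham")]
counts, totals = train(corpus)
print(spam_score("free offer now", counts, totals) > 0)           # spammy
print(spam_score("notes from the meeting", counts, totals) > 0)   # hammy
```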

  • "A Net of Control"
    Newsweek (12/22/03) Vol. 142, No. 25, P. E26; Levy, Steven

    The Internet will eventually become a tool of government and corporate interests, used to enforce censorship, monitor the populace, and suppress creative freedoms. Though the Internet continues to foster free expression and allow unprecedented access to information now, updates to its technical infrastructure promise to clamp down on that free flow. Digital identifiers required for e-commerce and other activities, and "trusted computing" efforts from the likes of Microsoft and chip vendors could turn the Internet into the ideal tool of Big Brother. Autodesk founder John Walker, now retired in Switzerland, in September published a 28,000-word document titled "Digital Imprimatur," named for the idea that future Web publishing will be subject to censors' approval. Walker warns that trends in law, technology, and commerce point toward an Internet tightly controlled by government and business; technologies and laws meant to prevent e-commerce fraud, unsavory Web activities such as child porn and spam, and even terrorism would also eliminate Web anonymity. The new "trusted" system would verify everyone's identity to allow the e-commerce dream of microtransactions and would prevent viruses from infecting computers. Microsoft is working on such a project to be incorporated into its next Windows operating system, code-named Longhorn. The so-called Next Generation Secure Computing Base scheme, formerly named Palladium, would work together with secure chips from Intel and AMD to vet programs before they cause trouble on the user's machine; while preventing hacker programs from executing, the system would also allow corporate interests to enforce inescapable digital rights management controls. It would be possible to disallow access to censored information and require permission before uploading anything to the Web. 
Ironically, some Palladium developers have already forecast the emergence of "dark nets" of under-the-radar Web activity where enclaves of users trade information freely.
    Click Here to View Full Article

  • "Inventing a Better Patent Law"
    Business Week (12/22/03) No. 3863, P. IM5; Reinhardt, Andy

    A debate is raging in Europe about whether software patents should be allowable: Those in favor argue that Europe risks damaging its global competitiveness by not permitting such patents, while opponents counter that innovation would be suppressed and small- and medium-sized businesses would suffer. Andy Reinhardt writes that the issue is prompting a re-evaluation of software patents in the United States, where tech leaders are vociferously complaining that the system is out of control and threatening to strangle creativity. The European Parliament's recent passage of a draft law prohibiting all software patents incited a backlash by trade groups, who claimed that such a provision would leave inventors unprotected and deprive them of rightful remuneration for use of their work; this in turn spurred the European Council of Ministers to reject the legislation until certain inconsistencies were ironed out. Though Reinhardt notes that software-patent critics have a certain justification for their opposition, he does not think banning all such patents outright is an effective solution: Not only would it bring Europe into conflict with the United States and Japan, but it would constitute a breach of World Trade Organization regulations and annul thousands of software patents already approved by the European Patent Office. The author believes people on both sides of the software-patent issue should de-emphasize rhetoric and concentrate on reaching a compromise. Patent critics must realize that a total ban could have a detrimental effect on many European and global industries. There should also be an effort to retain components of existing European patent laws--such as the Paris Convention of 1883--that could prevent American-style problems from cropping up. 
And finally, Reinhardt recommends that Europeans develop a "smart and fast" patent process that allows rapid disclosure of applications, quick approvals, and plenty of opportunities for challenges after the initial grant.

  • "Intel's Tiny Hope for the Future"
    Wired (12/03) Vol. 11, No. 12, P. 236; Koerner, Brendan I.

    Intel expects that its investments in UC Berkeley's smart dust project will enable the chip giant to rule the high-volume sensor market, since the technology is designed to support ad hoc networks of tiny wireless sensors that, in their idealized form, can be deployed anywhere. The payoff of a sensor network explosion for Intel is twofold: Not only will it generate greater demand for silicon, but more high-end PCs will be needed to process the vast volumes of data produced by the networks. Projects employing sensor "motes" underway at Intel Research Berkeley include the planned deployment of about 200 motes on the Golden Gate Bridge to assess structural integrity; a distribution of 80 motes in Sonoma County's redwood forest to monitor temperature and humidity; and a mockup of a futuristic residence that utilizes wireless sensors to assist in the day-to-day activities of Alzheimer's patients. Intel does not hold exclusive rights to the mote technology under its collaborative research agreement with UC Berkeley, a policy that extends to all of Intel's academic research facilities, or "lablets." The TinyOS operating system that serves as the motes' core component is being used and tweaked by hundreds of people, and is available for download on an open-source software development Web site. Technical obstacles that must be conquered if sensors are to flourish include devising a way to program sensor networks to eschew junk information and only relay vital data; creating a more sustainable battery for the motes; and enabling motes to operate consistently without frequent maintenance. The ultimate goal of Intel's smart dust investment is to shrink motes down to rice-grain proportions by 2011, by which time the average price per mote should be about $5. 
The mote initiative, along with research projects into computer languages, wireless personal area networks, optical switches, and other technologies, is part of Intel's push to diversify its customer base as demand for desktop processors slackens.
    Click Here to View Full Article
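    One simple way a mote could "eschew junk information and only relay vital data," as the article puts it, is to report a reading only when it drifts a set threshold away from the last value relayed; the scheme, function name, and sample temperatures below are invented for illustration.

```python
# Toy change-threshold filter: a mote stays quiet (saving battery and
# bandwidth) while readings are stable, and transmits only meaningful
# changes for a base station to process.

def filter_readings(readings, threshold):
    relayed, last = [], None
    for r in readings:
        # Always relay the first sample; after that, only big changes.
        if last is None or abs(r - last) >= threshold:
            relayed.append(r)
            last = r
    return relayed

temps = [20.0, 20.1, 20.1, 22.5, 22.6, 19.0]   # raw sensor samples
print(filter_readings(temps, threshold=1.0))    # only the notable changes
```

Suppressed transmissions are what stretch battery life, which bears directly on the maintenance-free operation the article lists as an open obstacle.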

  • "IT Planning: Cultivating Innovation and Value"
    Syllabus (11/03) Vol. 17, No. 4, P. 10; Norris, Donald M.; Keehn, Anne K.

    When Nicholas Carr suggested in Harvard Business Review that IT has been commoditized to the point where its value as a strategic differentiator is practically nil, many opponents argued that IT can lead to competitive advantage if enterprises can leverage collaboration and innovation to fundamentally change organizational dynamics; the same argument can be applied to colleges and universities, which rarely integrate even successful innovations throughout the institutional culture. Higher education's potential for enterprise-level innovation will be lost if universities continue to follow a model in which innovative learning experiences, cost reductions, and other stakeholder value propositions are limited to individual faculty, courses, and labs. A better solution is the development through teamwork of scalable, enterprise-wide innovations such as the kind implemented by the British Open University, the University of Phoenix, and UMassOnline, among others. IT planning is primarily used by institutions to project more efficient versions of current practices every five years, when it could be harnessed as a strategic tool for concentrating and mustering enterprise-level innovative potential; IT planning should be put into continuous use and act in a regenerative capacity, engaging all echelons of campus leadership. Its objective should be the development of stretched goals, an innovation-supportive culture, and the architecture for making sensible, expeditionary decisions about technology options and uses. Leveraging innovation and creativity in IT usage will allow higher-education institutions to effect a migration from data-centric to process-centric perspectives, and a shift from provider-centric to stakeholder-centric value propositions. 
Institutions must commit to using enterprise resource planning, learning management systems, next-generation content management, enterprise portals, and Web services applications to enable personalized value propositions, facilitate cost reductions, and implement process restructuring across the board. Tremendous potential exists in the concept of a learning grid, in which knowledge exchanges are set up online to provide learning materials to universities on an as-needed basis.
    Click Here to View Full Article