Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 593:  Monday, January 12, 2004

  • "Congressional Leaders Promise Action on Tech Issues"
    CNet (01/10/04); Becker, David

    A panel of eight U.S. senators and representatives promised attendees at the recent Consumer Electronics Show that lawmakers will take coordinated action on some of the technology industry's major concerns, but cautioned against unrealistic expectations. Panelists noted that balancing the encouragement of free international trade with the prevention of U.S. job migration offshore is a formidable challenge, and Rep. Steve Buyer (R-Ind.) warned that free-trade Republicans should not ignore the very real threat offshore outsourcing represents. Legislators voiced their opposition to taxing the Internet: Sen. George Allen (R-Va.) advised lawmakers to be on guard against attempts by state and local authorities to collect broadband service fees, while Sen. John Sununu (R-N.H.) said those same authorities are eager for a piece of the emerging market for Internet-based telephone service--indeed, he believes defining voice over IP and shielding it from taxation will be the chief policy issue on Capitol Hill over the next several months. The panel generally agreed that solving file-swapping and related problems should be the duty of the media industries, and Sununu joined others in saying that the Recording Industry Association of America's legal assault on alleged music file traders goes against the spirit of the Digital Millennium Copyright Act and other U.S. copyright laws. Rep. Joe Barton (R-Texas) declared that it is the responsibility of content creators to devise business models that address contemporary technology and outlooks.
    Click Here to View Full Article

  • "Kazaa Delivers More Than Tunes"
    Wired News (01/09/04); Zetter, Kim

    Bruce Hughes of TruSecure discovered that 45 percent of almost 5,000 executable files downloaded via the Kazaa file-sharing program over the course of a month contained malware. The malicious code took several forms: viruses designed to infect every file in a user's Kazaa download directory, code that steals the user's AOL Instant Messenger password, and embedded Trojan horse programs that let hackers hijack machines for use as spam launching platforms or raid their files and personal information. Hughes says the malware ended up in shared files in three ways: the individual hosting the shared file deliberately inserted the malicious code; the code was a peer-to-peer worm programmed to search the network for exploitable download directories; or the code was designed to automatically contaminate other files in the user's file-share directory upon downloading. Hughes selected the files he downloaded on Kazaa based on popular keyword searches using such terms as "Britney Spears," "nude," "porn," and "Microsoft XP." He reports that a great deal of the malware he found turned up in program files designed to circumvent or crack copyright safeguards placed on software such as Microsoft Office, permitting users to share pirated copies. The Wild List estimates that Kazaa saw a 133 percent increase in the number of virus types circulating through it in 2003. Hughes notes that music, picture, and movie files' lack of executability has kept them immune from malicious code so far, but hackers could eventually devise a method to infect such files. He predicts that the amount of malicious code posted on purpose and unwittingly shared on peer-to-peer file-sharing networks will rise this year, but notes that anti-virus software can detect 80 percent to 95 percent of the malware on Kazaa; unfortunately, people are lax about keeping such software's virus definitions current.
    Click Here to View Full Article
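
    The signature-based detection Hughes refers to can be sketched in a few lines: hash each file in the shared directory and compare it against a database of known-bad signatures. This is a minimal illustration, not a real scanner; the file names are invented and the single "signature" is simply the SHA-256 of the byte string b"foo", standing in for a malicious payload.

```python
import hashlib

# Invented stand-in for a virus-definition database: real antivirus
# products ship thousands of signatures and update them constantly,
# which is why stale definitions miss newer malware.
# (This entry is the SHA-256 of b"foo", standing in for a payload.)
KNOWN_BAD = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_shared_dir(files):
    """Return names of files whose hash matches a known-bad signature.

    `files` maps file name -> file contents (bytes), standing in for a
    Kazaa download directory.
    """
    return [name for name, data in files.items()
            if hashlib.sha256(data).hexdigest() in KNOWN_BAD]

downloads = {"britney_spears.exe": b"foo", "notes.txt": b"bar"}
flagged = scan_shared_dir(downloads)   # flags only the known-bad file
```

    The weakness the article notes falls straight out of the design: a scanner can only flag hashes or patterns already in its database, so an out-of-date KNOWN_BAD set silently passes new malware.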

  • "It Takes a Thief"
    Wall Street Journal (01/12/04) P. R5; Fong, Mei

    A growing number of schools offer courses in hacking and network-penetration techniques so that companies can bolster their defenses against such intrusions. "If we want to improve computer security, we have to teach how attacks work, how viruses work," contends Counterpane Internet Security founder Bruce Schneier. Some courses cover hacking fundamentals, such as cracking passwords and spying on data as it passes through the Internet; others emphasize the exploitation of human weakness, such as sifting through an organization's garbage for nuggets of useful information like passwords and employees' home addresses; still others focus on ethical hacking. Foundstone, which claims to enroll 3,500 students annually, offers courses oriented around structured query language injection attacks, whereby intruders manipulate data on an e-commerce Web site. Meanwhile, the University of Calgary has attracted controversy for offering a course that teaches virus- and worm-authoring methods; its purpose, insists computer science department head Ken Barker, is to help students design more effective anti-virus and anti-hacking strategies. Critics condemn such courses as having the potential for more harm than good: "We've got enough idiots out there writing viruses without training any more," argues David Banes of the MessageLabs security-software firm. Also fueling opponents are the sometimes questionable credentials of the instructors--while some are well-regarded security professionals, others include notorious hackers such as Kevin Mitnick, who did time for his activities. E2 Labs Information Security in India offers a course in hacking co-authored by a teenager.
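
    The SQL-injection technique such courses cover can be shown with a toy example (sqlite3 here; the table and credentials are invented for illustration): splicing user input directly into the query text lets an attacker rewrite the query's logic, while a parameterized query binds the input as data that is never parsed as SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # Vulnerable: user input is spliced directly into the SQL text,
    # so input containing quotes can alter the WHERE clause.
    return conn.execute(
        f"SELECT secret FROM users WHERE name = '{name}'").fetchall()

def lookup_safe(name):
    # Parameterized query: the input is bound as a value, never
    # interpreted as SQL, so the payload below matches nothing.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"   # classic injection string
leaked = lookup_unsafe(payload)   # WHERE clause is now always true
blocked = lookup_safe(payload)    # treated as a literal, matches no row
```

    The unsafe query becomes SELECT secret FROM users WHERE name = '' OR '1'='1', which returns every row; the parameterized version returns nothing, which is why parameter binding is the standard defense taught alongside the attack.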

  • "Privacy Progress at Homeland Security"
    Business Week (01/08/04); Black, Jane

    Terrorism experts complain that the Homeland Security Department's U.S. VISIT program, which requires foreign nationals entering and leaving the United States at airports and seaports to be digitally fingerprinted and photographed, will do little to prevent terrorists from penetrating the country and committing terrorist acts--yet little criticism has been raised about the program's privacy implications. U.S. VISIT's launch coincided with the debut of the Homeland Security Privacy Office's Privacy Impact Assessment (PIA), an outline of the initiative's privacy policy that also charts the movement of data from department to department. The PIA covers five basic issues of data security: who can access the data, security protocols, how and with whom the data can be shared, how long the information will be retained, and how errors can be remedied. Ari Schwartz of the Center for Democracy and Technology calls it "a really positive sign" that Homeland Security is willing to disclose so much about the program to civil libertarians; the department's chief privacy officer, Nuala Kelly, says a civil-liberty-based evaluation program is critical: "We try to encourage all programs to assess privacy for any new substantial change that affects citizens and noncitizens." Though U.S. VISIT's privacy policy currently states that the data in the Arrival & Departure Information System would be retained for 100 years, Kelly believes the retention period will be shortened significantly. She is also aware of the threat of mission creep, in which data originally collected for one purpose is exploited for another, more deceptive one. "We're currently working on documents that will be handed to people at the time they provide their data to inform them of the full range of ways their data may be used," she explains. Despite the unusual level of transparency the PIA offers, privacy proponents still feel the redress protocols need work, and they want more timely responses from Homeland Security officials to their concerns.
    Click Here to View Full Article

  • "Top Networking Technologies for 2004"
    NewsFactor Network (01/09/04); Long, Mark

    Positioning wireless technologies behind corporate firewalls, switching to more scalable servers, and carrying out global services assessments are just some of the technological trends expected to characterize the buildout of next-generation IT networks this year. Intel Communications Group CTO Tim Dunn is convinced that wireless will eventually become embedded in all aspects of corporate operations, foster a shift in corporate users' work styles, and bolster network security. Placing wireless behind the corporate firewall was facilitated by Intel's wide-scale adoption of its Centrino mobile-chip technology in company laptops and personal digital assistants. Establishing an online link for wireless users through a formal IT program was also an important development, and Intel's Daniel Francisco says an internal study estimated that program participants gained 23 minutes of productive time per day. Meanwhile, Doug Oathout of IBM reports that IT managers implementing "pay-as-you-grow" servers "need only buy the right scalable technology upfront and then add more capacity as they actually need it." Avaya's Lawrence Byrd suggests IT managers use a global assessment service before implementing new technologies, to ensure that their networks will work properly, and adds that 2004 will likely see a significant increase in demand for speech-based servers supporting VoiceXML. VoiceXML will allow customers to contact enterprises directly over the Internet via speech commands, and will let staffers use the same tools internally to retrieve corporate email over the phone. Dense "server blades" are also growing in popularity because they offer such a high level of integration; Jeff Benck of IBM's eServer BladeCenter notes that blade servers are space-efficient, simple to implement, and permit a flexible infrastructure to be established without a lot of cables.
    Click Here to View Full Article

  • "New Software Has 'No Boundaries or Rules'"
    Associated Press (01/08/04); Konrad, Rachel

    "No Boundaries Or Rules" (NBOR), the product of 10 years' work by composer and musician Denny Jaeger, is software that boasts an intuitive user interface for numerous PC tasks, including writing, drawing, and compiling multimedia presentations. NBOR transmits large files over the Internet very rapidly and facilitates real-time collaboration; its core component is the Blackspace software, which handles word processing, graphics, desktop publishing, slideshow presentations, audio, photo cropping, drawing, animation, instant messaging, and real-time conferencing. Once Blackspace is opened, users see a blank canvas where text can be arranged into complex visual displays with the mouse, without the pull-down menus, tabs, fonts, icons, and margins of conventional word processing systems. Canvases can be saved as ordinary document files as well as symbols; creators merely have to send the symbol, not all the data, over the Internet, and a recipient outfitted with NBOR can rebuild the entire file in his own Blackspace by just clicking on the symbol. This capability is provided by 500,000 lines of code that Jaeger and eight overseas developers labored over for two years. The software's distributor, also called NBOR, will initially market the product to the education sector and small businesses, according to NBOR Chairman John Doyle. NBOR runs on Windows PCs, with Linux and Macintosh versions expected later this year. A free demo of NBOR will be available for download when the software officially debuts on Jan. 15.
    Click Here to View Full Article

  • "MIT Media Lab Launching Consumer Electronics Push"
    EE Times (01/09/04); Wallace, Richard

    MIT's Media Lab on Jan. 9 announced the launch of a consumer electronics research program as part of a collaborative effort between the facility and the Consumer Electronics Association. MIT Consumer Electronics Laboratory director V. Michael Bove pointed out that accessing technology from the center has long been a difficult proposition for small companies, and claimed that the new initiative "will make it easier for these companies to develop applications from the lab's technology." The program will reportedly concentrate on setting up close hands-on collaboration with industry to develop new kinds of devices and to explore new ways of thinking about consumer services and products. A promotional brochure at the MIT Media Lab's booth at the Consumer Electronics Show states, "The overarching theme is one of co-evolution of design principles and technological discoveries, resulting in simple, easy- and delightful-to-use devices that know a great deal about one another, and the people in their proximity...all conducted by a Lab full of people who simply don't know the meaning of the phrase, 'It can't be done.'" The research will initially emphasize five areas of technology: new power technologies, such as wireless, parasitic, and self-generated power sources; innovative sensors, actuators, and displays; smart devices and ecosystems capable of self-management; new materials and design/fabrication techniques; and cooperative wireless communications.
    Click Here to View Full Article

  • "Internet 6.0"
    Technology Review (01/04); Garfinkel, Simson

    The new Internet Protocol Version 6 (IPv6) is quickly taking hold in Asia but faces a convoluted adoption path in the United States: For most U.S. individuals, businesses, and even government offices, IPv4 remains more beneficial, especially since they have far more IP addresses available to them than the more numerous Asian users. Moving to IPv6 entails rewriting applications from the enterprise level to the desktop, along with expensive upgrades to network hardware; backbone routers use costly chips specialized for IPv4 data packets and would likely run so slowly with IPv6 that they would require replacement. Because of the scale of the conversion, some users and components with only IPv4 capability would be left behind by any wholesale switch to IPv6. A new Internet protocol is also sure to yield serious opportunities for hackers, even though IPv6 mandates IPsec cryptography. Many U.S. homes and businesses now employ network address translation (NAT) technology, which hides the private addresses of machines behind the firewall and translates data crossing the perimeter; although this seemingly obviates the need for more addresses, because an entire office can share a single IP address, it causes the more insidious problems Internet pioneers warned of years ago. Hackers and malicious software can penetrate the perimeter NAT defense more easily as Internet-interfacing devices, applications, and wireless networks proliferate; once in, intruders can launch attacks on the outside Internet under the cover of the NAT firewall. NAT also causes problems for peer-to-peer networks and multimedia groupware applications. In the future, IPv6 will become a standard around the world except in the United States, which will continue to use IPv4 and rely on ISP-level gateways to translate between the protocols; the divide will be similar to the English/metric one, and efforts to change U.S. protocol will be as fruitless as 1970s plans to convert highway speed-limit signs to kilometers per hour.
    Click Here to View Full Article
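
    The NAT behavior the article describes can be sketched as a toy port-translation table (a simplified model, not a real implementation; the addresses are documentation examples): outbound packets are rewritten to the shared public address, and inbound packets are admitted only when they match an existing mapping, which is exactly what blocks unsolicited traffic and complicates peer-to-peer applications.

```python
import itertools

class Nat:
    """Toy port-based NAT: many private hosts share one public IP."""
    def __init__(self, public_ip):
        self.public_ip = public_ip
        self._ports = itertools.count(40000)   # next free public port
        self.out = {}   # (private_ip, private_port) -> public_port
        self.back = {}  # public_port -> (private_ip, private_port)

    def outbound(self, private_ip, private_port):
        # Rewrite an outgoing packet's source to the shared public
        # address, creating a mapping on first use.
        key = (private_ip, private_port)
        if key not in self.out:
            port = next(self._ports)
            self.out[key] = port
            self.back[port] = key
        return (self.public_ip, self.out[key])

    def inbound(self, public_port):
        # Only replies to an existing mapping can get back in;
        # unsolicited inbound traffic has no entry and is dropped.
        return self.back.get(public_port)

nat = Nat("203.0.113.1")
a = nat.outbound("10.0.0.5", 1234)   # first host, first mapping
b = nat.outbound("10.0.0.6", 1234)   # second host shares the public IP
```

    Both hosts appear to the outside as 203.0.113.1, distinguished only by translated port, and a peer that wants to initiate a connection inward finds no mapping at all--the article's point about NAT breaking peer-to-peer and groupware traffic.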

  • "Tapping the Grid"
    Astrobiology (01/12/04)

    The SETI@home distributed computing project is the largest grid network in the world, with 4.7 million volunteers offering spare computing cycles on their PCs; but perhaps the project's most important achievement is not the 1.6 million years of computer processing time already logged, but the ability of a staff of just six to manage about 0.1 percent of the world's entire computing capacity. Dr. David Anderson of the University of California, Berkeley, Space Science Laboratory, SETI@home's project leader, says the distributed computing idea faced a lot of resistance when then-graduate student David Gedye conceived it in 1994. The advantages of distributed computing were clear as PC processing power increased at breakneck speed: Whereas computers once struggled to keep up with processing tasks, an abundance of desktop computing power meant PCs sat idle most of the time. Anderson notes that typing 90 words per minute in Microsoft Word requires just a small fraction of what even a slow Pentium chip can do. He also attributes SETI@home's success to social factors, such as the software's ability to track individual contributions; the current worldwide rankings of contributors are dominated by teams whose members actively recruit other PC users. Besides ramping up the second version of SETI@home, Anderson continues to develop the Berkeley Open Infrastructure for Network Computing (BOINC), a basic distributed computing framework that other scientific programs can use; projects looking to adopt it include new gravity-wave detection efforts. Anderson says for-profit distributed computing firms such as United Devices have turned to helping corporations set up private grids, and that scientific projects requiring high data-to-computing ratios and low latencies will always perform best on traditional supercomputers.
    Click Here to View Full Article
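
    The master/volunteer pattern behind SETI@home and BOINC can be sketched as follows. This is a simplified, hypothetical model, not the actual BOINC API: a server queues independent work units, volunteers fetch and process them, and the server tallies per-volunteer credit--the contribution tracking Anderson cites as a social driver of the project.

```python
from collections import defaultdict, deque

class WorkServer:
    """Toy distributed-computing server in the BOINC style:
    hand out independent work units, collect results, credit volunteers."""
    def __init__(self, work_units):
        self.pending = deque(work_units)   # (unit_id, data) pairs
        self.results = {}
        self.credit = defaultdict(int)     # per-volunteer tally

    def request_work(self):
        # A volunteer asks for the next unprocessed chunk.
        return self.pending.popleft() if self.pending else None

    def submit(self, volunteer, unit_id, result):
        # Store the result and credit the volunteer who computed it.
        self.results[unit_id] = result
        self.credit[volunteer] += 1

# Invented workload: three small chunks of "signal data" to analyze.
server = WorkServer([(i, list(range(i, i + 4))) for i in range(3)])

# One volunteer drains the queue, standing in for millions of PCs
# each doing this independently during idle cycles.
while (unit := server.request_work()) is not None:
    unit_id, data = unit
    server.submit("volunteer-1", unit_id, sum(data))   # "analyze" a chunk
```

    Because each unit is independent, the server never coordinates volunteers with one another--which is why six staff can run the whole grid, and also why workloads needing low latency or heavy inter-node data exchange stay on supercomputers, as Anderson notes.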

  • "Security Threats Won't Let Up"
    InformationWeek (01/05/04) No. 970, P. 59; Hulme, George V.

    The state of information security, which took a hammering last year, is expected to worsen this year as security vulnerabilities increase in severity, the use of spyware grows, and spammers adopt hacking tools and techniques to distribute junk email. To bolster themselves against these threats, businesses may have to add commercially available intrusion-prevention applications to an arsenal that includes fast patching, firewalls, regularly updated antivirus software, and strict remote-user security regulations. A Yankee Group poll of 404 security decision-makers finds that over 50 percent of respondents expect their security budgets to grow significantly over the next three years. Gartner VP John Pescatore notes that virus writers are getting craftier and launching spyware attacks, many of which are designed to fool users into thinking they are dealing with trustworthy parties so that they will give out confidential information that can be exploited. The good news is that more and more effective anti-spyware tools are available from software vendors, while antivirus vendors are enhancing their offerings with spyware-detection and -removal software. In addition, anti-spyware legislation such as an overhauled Safeguard Against Privacy Invasions Act is slated to be introduced in 2004. Meanwhile, Vincent Weafer of Symantec anticipates that spammers will continue to employ Trojan horses and viruses to hijack computers and use them as spam launching platforms; experts also believe hackers will start taking advantage of popular peer-to-peer networks and instant-messaging services, and target cell phones, handhelds, and emerging operating systems as well. Though well-publicized "zero-day" worms are of less concern to security analysts, Pescatore points out that more worms are appearing within one to two weeks after a software flaw is discovered.
    Click Here to View Full Article

  • "The LWAPP Flap"
    Network World (01/05/04) Vol. 21, No. 1, P. 35; Vance, Jeff

    A new Internet Engineering Task Force (IETF) standard for centralized WLAN control is facing some opposition from Aruba Wireless Networks and a reticent Cisco. Proponents say that Lightweight Access Point Protocol (LWAPP), a standard for interoperable WLAN access points and switches, would drive further adoption in the enterprise, where centralized WLAN control is an important issue. Enterprises want to move to "thin" access point technology, in which control functions are centralized on switches instead of at each access point. Without a standard, WLAN administrators must trade off supporting more users against cost-effectiveness; LWAPP would allow enterprises to add WLAN access points and hardware from any supporting vendor, knowing the equipment will plug into their existing network control system. Without centralized control, it is difficult to set up seamless roaming, single sign-on, and load balancing, for example. However, standardizing a protocol before defining access point and switch functions is putting the cart before the horse, says Aruba's Keerti Melkote; he advocates the tunneling protocols Generic Routing Encapsulation (GRE) and IP Security, solutions that LWAPP proponents say would do nothing to control the radio aspect of WLANs. Cisco's Ron Seide says multi-vendor interoperability is a good thing, but that it is premature to support a specific standard at this point; he also notes Cisco has its own WLAN management technology, called the Structured Wireless-Aware Network solution. LWAPP is expected to get its own IETF working group within six months and to be finalized within 18 to 24 months.
    Click Here to View Full Article

  • "Rise of the Machines"
    CIO (01/01/2004) Vol. 17, No. 6, P. 82; Edwards, John

    Advancements in autonomic computing, artificial intelligence, and other technologies are making IT workers fear for the security of their jobs, as are hardware commoditization and the proliferation of the Internet and Web-based software. Experts foresee more growth in offshore outsourcing to lower-wage countries, a trend bolstered by technology enablement. Web services software, which is based on widely accepted industry standards, could pose another threat to IT workers by simplifying the creation, operation, and maintenance of system-to-system links; utility computing, meanwhile, could erode the ranks of IT professionals in small and mid-size companies, as well as IT managers in larger enterprises. Self-managing, self-healing systems built on autonomic or organic computing promise to make many IT specialists superfluous, though Ovum's Alan Pelz-Sharpe doubts the technology will ripple through the tech workforce anytime soon, given the enormous technical challenges involved in creating such systems. Highly intelligent, proactive AI-powered computers that can think independently are also on the horizon, according to BT Exact futurologist Ian Pearson; such systems could supplant low- and mid-level IT workers and/or give companies incentive to shed expensive tech specialists in favor of cheaper entry-level professionals who can instantly access the collective wisdom of their veteran peers. Although the projected extinction of many IT jobs is frustrating, employees can take heart that the next several years will witness technological breakthroughs that open up new opportunities for IT skills, such as virtual reality. "There's no reason at all to believe that we're going to find lots of jobless people in the IT sector," insists Pearson. "If anything, there's going to be a shortage because the new opportunities that are presenting themselves in IT are far greater in volume than the things which are being eradicated." The U.S. Labor Department's Bureau of Labor Statistics estimates that 1.5 million new IT jobs for database administrators, network administrators, and systems analysts will be created by 2010.
    Click Here to View Full Article

  • "Nature's Tiny Helping Hands"
    U.S. News & World Report (01/12/04) Vol. 136, No. 1, P. 46; Schmidt, Karen F.

    MIT materials scientist Angela Belcher is selectively breeding and genetically altering viruses that can help build super-small transistors. Her tailored viruses are little more than DNA in a protein shell, but they have special ends that attach themselves to gold source and drain electrodes. The shaft of the virus body is engineered to attract semiconductor material, which is mixed in a solution poured over the virus. Heat burns the virus away and leaves the semiconductor material in place between the source and drain to create the transistor. Other research is similarly looking to harness biology for semiconductor manufacturing. NASA Ames Research Center scientist Andrew McMillan is creating super-dense memory devices built atop grid arrays of a special microbe protein. The microbe lives in hot springs, and its protein creates cage-like structures to protect it from the heat. These protein structures organize magnetic particles into a potential memory array. McMillan says, "If you could make each 10-nanometer particle represent a bit of data, you'd have the highest memory storage [currently] possible." Technion-Israel Institute of Technology researchers have created nanotransistors in a test tube using DNA and proteins as assemblers. University of California, Santa Barbara, materials scientist Evelyn Hu says the current top-down manufacture of semiconductors is vastly inefficient compared to the bottom-up construction enabled by biological assemblers. MIT's Belcher also notes that her new transistors can be created at lower temperatures and with non-toxic solvents, making them much more environmentally friendly than existing technology.
    Click Here to View Full Article

  • "Not If, But How"
    CIO Insight (12/03) No. 34, P. 69; Perkowski, Mike

    Increasing numbers of CIOs are seeing the value of open-source software--Linux in particular--thanks to a growing consensus that it is cost-efficient, reliable, and supported by more and more vendors; the focus for CIOs now is finding Linux's proper place in their IT programs. CIOs gauge a technology's advantages according to how secure and scalable it is, and though Linux thus far offers plenty of security, it has suffered from scalability deficiencies that make it more suitable for tactical situations than for mission-critical data-center applications. However, scalability concerns are dissipating with the forthcoming Linux 2.6 kernel, which promises support for 16-way servers, and the introduction of 64-way Linux support in new Silicon Graphics servers. Analysts and CIOs agree that IT organizations with heterogeneous environments will have less difficulty with Linux than those with single-platform environments. Concerns over support have made many CIOs hesitant to adopt Linux, but the situation is changing now that all major hardware vendors and growing ranks of software vendors offer industrial-strength support for enterprise clients. CIOs' attitudes toward the Linux community are also shifting: Their admiration for the passion and support typical of Linux developers has long been tempered by the community's unstructured, unprofessional image--but as that image erodes, CIOs grow more confident in the quality of open-source code and the support offered by hardware and software suppliers. "[The open-source community is] no longer this loosey-goosey group of rogue developers; they're elite programmers who really want to do this, rather than being assigned to it," notes Aberdeen Group analyst Bill Claybrook. Still, CIOs must ensure that their development teams conduct compatibility and security testing for new code, and they must also take the terms of the General Public License into consideration.
    Click Here to View Full Article

  • "The Eyes Have It"
    CSO Magazine (12/03); Hapgood, Fred

    Machine vision--and its ramifications for security both inside and outside the enterprise--is on the cusp of wide-scale implementation thanks to advancements in camera, lighting, and processor technology primarily sponsored by the Defense Advanced Research Projects Agency. Unlike flesh-and-blood security staff, machine vision technology is free from distraction, does not forget or become exhausted, and is scalable, easily upgradeable, searchable, archivable, and network-compatible; furthermore, the technology is becoming more and more affordable. The machine vision application with the most momentum is intelligent optical character recognition, which can recognize characters emblazoned on physical objects, and its most critical application seems to be license plate recognition. Machine vision's biggest potential application is centered around object and behavior recognition, which could be employed to detect signs of tampering and damage. The U.S. Bureau of Customs and Border Protection has upgraded its security by installing people recognition systems, and bureau representative Bill Anthony reports that the number of false positives along the Mexican border has dropped to zero thanks to the upgrade. People recognition systems can also be deployed to identify instances of tailgating and piggybacking. One of the ways in which machine vision is expected to revolutionize corporate security is to make CSOs more cognizant of signs of unusual activities, or flags, rather than outright violations. The enterprise-wide perspective afforded by machine vision could also benefit departments and divisions outside of security: For instance, observations of employees working overtime could be useful to the personnel department, while the detection of an odd vibration might tip off facilities management that a key piece of equipment is about to malfunction.
    Click Here to View Full Article

  • "Identity Crisis"
    Wired (01/04) Vol. 12, No. 1, P. 27; Rosen, Jeffrey

    Jeffrey Rosen, author of "The Naked Crowd: Reclaiming Security and Freedom in an Anxious Age," notes that battle lines have been drawn over the design and implementation of a national ID card: On one side are advocates who recommend that the government be granted access to as much information as possible in the interests of homeland security, while libertarians argue that such an allowance is tantamount to privacy infringement. However, the author points out that Verified Identity's Steven Brill put forward a compromise last fall in the form of a voluntary ID card that reportedly protects both security and privacy. Rosen posits that Brill's card is a considerable improvement over Oracle CEO Larry Ellison's proposal that all personal data recorded by state and federal governments be combined into one database, along with biometric identifiers such as iris scans or thumbprints encoded onto a card; such a system would be highly invasive and would enable the government to identify people without giving them any say in the matter. Brill's concept works by having Verified Identity turnstiles electronically apprised of all current V-ID card customers on a daily basis, allowing cardholders to pass without relaying any travel information back to the company. Brill says he will reduce the system's vulnerability to hacking by storing fingerprints only and enlisting an ombudsman to guard against civil-liberties violations. However, Rosen writes that such safeguards will not ensure mass acceptance, even if they keep fears of government surveillance at bay. Furthermore, there is no assurance that an ID card system will forestall another terrorist attack, and Rosen feels ID card proponents and opponents alike should look into effective anti-terrorist security measures.
    Click Here to View Full Article

  • "Software for the Next-Generation Automobile"
    IT Professional (12/03) P. 7; Simonds, Craig

    A dramatic change in the way automakers design and build cars is necessary for manufacturers to remain competitive amid shifting consumer demands, and key to this competitiveness is the rollout of responsive, flexible software architectures that give customers a personalized driving experience enhanced by real-time data collection, with an emphasis on simplicity, upgradeability, and multiple communication modes. All software architectures must take into account the automotive industry's quirky pressures and expedient design techniques, with consumer-product interaction the highest priority. Packaging, weight, power consumption, ease of use, and automotive requirements are all interrelated, which adds to the complexity of designing and deploying electrical and electronic components; the growing popularity of mobile services among consumers must also be considered. Ford Research and Engineering's Vehicle Consumer Services Interface (VCSI) provides consistent, consumer-desired features and services across diverse vehicle lines while simultaneously addressing vehicle design challenges. This is done by allowing manufacturers to develop more generic subsystem components, while Ford suppliers get a universal specification and application programming interface (API) so they can sustain their products' proprietary nature and use the APIs to interact with other systems, both inside and outside the vehicles. The APIs deal with such factors as the human-machine interface, compliance with varying regulations and safe-driver practices, and personalization. VCSI's upgradeability enables manufacturers to provide consumers with the latest bells and whistles throughout the vehicle's life cycle. VCSI can manage non-telematics systems, and the components are reusable across vehicles manufactured in different years.
    Click Here to View Full Article
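
    The core VCSI idea--a universal interface that hides each supplier's proprietary internals while supporting personalization--can be sketched hypothetically. The names `VehicleService`, `ClimateControl`, and the method signatures below are invented for illustration and are not Ford's actual API:

    ```python
    from abc import ABC, abstractmethod

    class VehicleService(ABC):
        """Hypothetical universal API that every supplier subsystem
        implements; proprietary internals stay hidden behind it, so
        other in-vehicle (or external) systems interact only through
        these methods."""

        @abstractmethod
        def get_status(self) -> dict:
            """Report the subsystem's current state in a generic form."""

        @abstractmethod
        def apply_profile(self, profile: dict) -> None:
            """Apply a driver's personalization settings."""

    class ClimateControl(VehicleService):
        """One supplier's implementation; its internals are its own."""

        def __init__(self):
            self._target_temp_c = 21.0

        def get_status(self) -> dict:
            return {"service": "climate", "target_temp_c": self._target_temp_c}

        def apply_profile(self, profile: dict) -> None:
            # Each driver profile may carry preferred settings; unknown
            # keys are simply ignored by this subsystem.
            self._target_temp_c = profile.get("temp_c", self._target_temp_c)

    svc = ClimateControl()
    svc.apply_profile({"temp_c": 19.5})
    print(svc.get_status())  # {'service': 'climate', 'target_temp_c': 19.5}
    ```

    Because every subsystem speaks the same interface, a replacement or upgraded component slots in without changes to the rest of the vehicle's software--the upgradeability and cross-year reuse the article describes.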

  • "Tracking the Digital Puck Into 2004"
    Syllabus (12/03) Vol. 17, No. 5, P. 10; Green, Kenneth C.

    The Campus Computing Project's 2003 survey identifies several key campus concerns: instruction and instructional infrastructure, a new emphasis on administrative systems, and long-term technology budget and resource pressures. The survey data reflect sizable college and university efforts to address media concerns related to the unsanctioned distribution and downloading of commercial digital content on campus networks, although the entertainment industry continues to subscribe to the false notion that all U.S. college students, ages 17 to 67, have round-the-clock access to high-speed campus networks. Some 77.2 percent of respondents reported the deployment of wireless LANs, compared to 29.6 percent three years ago, while Wi-Fi deployments experienced a nearly 100 percent increase among four-year public colleges and community colleges; however, the proliferation of Wi-Fi was accompanied by many institutions adopting a "deploy first, plan later" strategy. Survey results show that 28.4 percent of institutions set up Web-based campus portals in 2003, up from 21.2 percent in 2002, while 20.4 percent declared that their campus portals were "under development" or in the process of being installed. There was a 300 percent increase in the number of schools offering online course registration, online transcripts, and online course reserves in the past five years, while the number of U.S. institutions offering online credit card transactions rose from 18.6 percent in 2000 to 53.3 percent in 2003; on the other hand, an estimated two-year lag between the campus community and the consumer sector in terms of "services on the Web" still existed as of 2003.
    "Assisting faculty integrate technology into instruction" remained the biggest individual short-term (two- to three-year) IT issue on surveyed campuses across all higher education sectors, but a 50 percent decline across all sectors since 1998 was attributed to the growing importance of IT financing, ERP upgrade/replacement, and other issues. The percentage of college courses employing course management tools rose 18.9 percent in the past three years, according to the 2003 survey; 41.3 percent of survey participants reported a falloff in campuses' academic computing budgets in the current academic year--a 29.9 percent increase from three years ago--while growing numbers of schools documented elevated budgets for portals, IT security, and ERP services.
    Click Here to View Full Article