
FREE ThinkPad Mobile Traveler bundle! Stay powered and connected.

Buy select ThinkPad® notebooks and ask for a free ThinkPad Mobile Traveler bundle. Bundle includes ThinkPad nylon carrying case, IBM AC/DC adapter, and Tripp Lite 4-Port USB Mini Hub. A $207.99 value!

Hurry! Offer ends June 16, 2003 or while supplies last. Visit ibm.com/businesscenter/acm or call 800-426-7235, ext. 3559

Offer available from IBM only in the US through 6/16/03. Not combinable with any other offer. Shipping and handling not included. Limit 10 per customer.



ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either IBM or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 504: Friday, June 6, 2003

  • "SCO Suit May Blunt the Potential of Linux"
    Los Angeles Times (06/06/03) P. C1; Menn, Joseph

    SCO Group's insistence that Linux contains proprietary Unix code owned by the company has dampened corporate enthusiasm for the free Linux operating system while angering thousands of open-source programmers. Gartner recently warned firms to limit their use of Linux in "mission-critical" systems because of the uncertainty over the case. If Linux turns out to contain protected Unix code, then Linux users could have to pay royalties to SCO. Yankee Group analyst Laura DiDio, who reviewed the allegedly copied source code this week, says the claims may have merit because even some explanatory comments are the same in Linux, word for word. But long-time open-source advocate Eric Raymond says the situation is not so clear-cut because of Unix's history, noting that the disputed portions of code may actually hail from an open-source effort at the University of California, Berkeley. Besides that possibility, he says more than 100 people claim to have seen the Unix source code without signing non-disclosure agreements over the past 20 years, weakening the trade secrecy surrounding it. Raymond recently wrote, "What is at stake here is not just the disposition of a particular volume of computer code, but what amounts to a power grab against the future." Reflecting widely held sentiment in the open-source community, Raymond says Microsoft is behind the SCO suit in an attempt to torpedo Linux. Others see the suit as a broad attack against the Linux community, while Linux creator Linus Torvalds says the suit justifies his decision to make Linux open source. Torvalds says he is disappointed that SCO has "spread a lot of rumors without showing what they are actually talking about." Although Raymond says the suit could be the "last gasp of proprietary Unix," SCO's Blake Stowell argues that Unix has always been protected and has never been licensed as open source.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Mass. Could Be Fifth State to Adopt Anti-UCITA Law"
    Computerworld (06/04/03); Thibodeau, Patrick

    Massachusetts may join Iowa, North Carolina, Vermont, and West Virginia in the adoption of "bomb-shelter" legislation designed to shield citizens and businesses from the Uniform Computer Information Transactions Act (UCITA). A hearing on the anti-UCITA proposal was held by the Massachusetts Joint Committee on Commerce and Labor on Monday, but a spokesperson for the panel says no action was taken. Thus far, only the states of Maryland and Virginia have adopted UCITA, while John McCabe of the National Conference of Commissioners on Uniform State Laws (NCCUSL) reports that the District of Columbia is considering adopting the bill as well. Carlyle Ring Jr. of the NCCUSL's drafting committee insists that a universal code of conduct for Internet transactions is necessary, and warns that the failure of states to adopt such rules will spur Congress to enact them. Ring blames the stall in state adoption on an 18-month withdrawal of UCITA, authorized by the American Bar Association so it could review the legislation. Opposing UCITA are library and consumer protection organizations, large software users, and most state attorneys general, who claim they successfully blocked UCITA's adoption in Nevada and Oklahoma this year. Critics contend that UCITA chiefly serves software vendors and waives their liability for any product difficulties users encounter, while UCITA advocates say companies can negotiate terms and conditions, and have made an unsuccessful effort to satisfy opponents by eliminating contestable stipulations.
    Click Here to View Full Article

    To learn more about ACM's activities regarding UCITA, visit http://www.acm.org/usacm/Issues/UCITA.html.

  • "Packet Tracking Promises Ultrafast Internet"
    New Scientist (06/05/03); Fox, Barry

    Fast TCP from the California Institute of Technology (Caltech) is a Transmission Control Protocol (TCP) refinement designed to dramatically speed up the sending and receiving of data over the Internet without having to install new equipment on recipient computers. Current TCP methodology involves breaking large files into small packets of approximately 1,500 bytes that include the sender's and recipient's addresses; files are sent packet by packet, with the transmission of each subsequent packet dependent on the receiving computer's confirmation of the previous packet's arrival. However, if the recipient does not verify a packet's arrival, that packet is retransmitted at slower and slower speeds until receipt is acknowledged, meaning even small network errors can seriously impede connections. Fast TCP can piggyback on the existing Internet infrastructure because the packets it uses are equal in size to regular TCP packets. A sender running Fast TCP is equipped with software and hardware that measure packet arrival and receipt-verification times, which allows the computer to detect the onset of delays and anticipate the highest data rate the Net connection can sustain without incurring packet losses. November 2002 saw the first practical trial of Fast TCP, when Caltech, Stanford University, and CERN researchers transmitted data from California to Switzerland at an average rate of 925 Mbps, compared to 266 Mbps for regular TCP. By linking 10 Fast TCP systems, the scientists boosted transmission speeds to over 8.6 Gbps, more than 6,000 times the capacity of an ordinary broadband connection.
    http://www.newscientist.com/news/news.jsp?id=ns99993799
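
    The key idea above is that Fast TCP infers congestion from rising round-trip times rather than from lost packets. The minimal sketch below illustrates that delay-based approach in Python; the class name, constants, and exact update rule are illustrative assumptions, not Caltech's published implementation.

        # Minimal sketch of delay-based window adjustment in the spirit of Fast TCP.
        # The update rule and constants are illustrative assumptions, not Caltech's code.
        class DelayBasedSender:
            def __init__(self, alpha=8.0, gamma=0.5, init_window=10.0):
                self.alpha = alpha          # target number of packets queued in the network
                self.gamma = gamma          # smoothing factor for window updates
                self.window = init_window   # congestion window, in packets
                self.base_rtt = None        # lowest round-trip time observed so far

            def on_ack(self, measured_rtt):
                """Update the congestion window from one round-trip-time measurement."""
                if self.base_rtt is None or measured_rtt < self.base_rtt:
                    self.base_rtt = measured_rtt
                # A rising RTT signals queueing delay, so the ratio base_rtt/measured_rtt
                # falls and throttles the sender before any packet is actually dropped.
                target = self.window * (self.base_rtt / measured_rtt) + self.alpha
                self.window = (1 - self.gamma) * self.window + self.gamma * target
                return self.window

        sender = DelayBasedSender()
        for rtt in [0.100, 0.101, 0.100, 0.105, 0.120, 0.150]:   # seconds
            print(round(sender.on_ack(rtt), 1))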

  • "Senator Wants Limits on Copy Protection"
    CNet (06/04/03); McCullagh, Declan

    Sen. Sam Brownback (R-Kan.) has written a bill that aims to regulate digital rights management (DRM) systems, effectively limiting how copyright owners can control the distribution of digital content through copy-protection technology. "The legislation recognizes that the same DRM technologies used to combat piracy are also sought after by the content industry to create new DRM-enabled business models," declared Brownback. "My legislation gives them a free hand in seeking out DRM technologies that permit them to explore these new opportunities, but ensures their success or failure will rest in the marketplace, where it belongs--not in Congress." Brownback's Consumers, Schools, and Libraries Digital Rights Management Awareness Act states that consumers have the right to resell copy-protected products, while copyright owners cannot force ISPs to reveal the names of alleged peer-to-peer copyright infringers without a court ruling in their favor. Furthermore, manufacturers of digital media would have to clearly label products that include FTC-approved anti-copying measures beginning one year after the legislation's enactment, unless the FTC accepts "voluntary" guidelines outlined by industry organizations. The bill also bans the FCC from forcing manufacturers or vendors of PCs or digital video products to equip such products with specific anti-copying measures, and requires the FTC to furnish a DRM status report two years after the act is enacted, as well as assign an advisory committee to outline how the use of digital media products by consumers, libraries, and schools could be impacted by access control and redistribution control technology. An industry official announced at a privacy and politics summit on Tuesday that Brownback's bill will probably be introduced in the middle of next week, and Public Knowledge attorney Mike Godwin said that his group will back the measure.
    http://news.com.com/2100-1028_3-1013037.html

  • "You Make It, You Take It: PC Law"
    Wired News (06/05/03); Friedman, Gabe

    A bill authored by state Sen. Byron Sher (D-Calif.) and approved by the California Senate on June 4 would require electronics producers to formulate and fund a system for collecting, transporting, and recycling unwanted computers and monitors, and recover 90 percent of e-waste by 2010. Charles Corcoran of the Department of Toxic Substances Control estimates that California residents throw out about 7,500 TVs or computer screens every day, for a yearly total of 2.75 million units. Meanwhile, Californians Against Waste (CAW) reckons that a mere 11.5 percent of all computers discarded last year were recycled by citizens. A 2002 report from the Silicon Valley Toxics Coalition and the Basel Action Network finds that between 50 percent and 80 percent of all U.S. e-waste is exported to third-world countries that lack environmental protections. Under Sher's proposal, electronics manufacturers would have the option of paying the state to recycle discards if they are unwilling to develop their own recycling plans, but Hewlett-Packard's David Isaacs says his company prefers collection and recycling to be split between manufacturer, vendor, government, and consumer. CAW executive director Mark Murray counters that the inclusion of toxic substances in electronic products is a design decision, so reclamation should be wholly the responsibility of producers. Recycling programs supported by most manufacturers currently require consumers to pay a $15 to $50 shipping and recycling fee. A measure similar to Sher's bill was approved by both the state Senate and Assembly, but was killed by Gov. Gray Davis.
    http://www.wired.com/news/business/0,1367,58850,00.html

  • "What Internet2 Researchers Are Dreaming Up"
    NewsFactor Network (06/05/03); Powell, Ellen

    Internet2 (I2) researchers are experimenting with technologies that could radically enhance the network and open up a plethora of new applications. I2, which is supported by a nonprofit consortium of universities and corporate research and development labs, fulfills what I2 spokesperson Greg Wood calls "a need to have a place where the long horizon can foster new technologies to be used in high-performance networks." Recent I2 innovations of note include the linkage of 500 educators and students in Seattle and Virginia into a live, virtual classroom via tele-immersion; an Ohio gall bladder operation remotely directed by a surgeon in Washington, D.C.; a videoconference that allowed far-flung U.S. poets to participate in a reading; and the enablement of real-time, bi-hemispherical astronomical observation through I2-connected telescopes in Hawaii and Chile. I2 also supports distance learning and telemedicine through the use of haptics technology, allowing users to virtually manipulate remote objects through tactile feedback technology. Experts believe that integrating I2 with the conventional Internet will take at least several years, and they are busy laying the foundation for a network that is less chaotic than the I2's commercial precursor. In order for standards to be successfully implemented, a balance between regulation and accessibility must be struck. "We don't want to go to the point where [the network is] codified through some kind of legislation or statute," says John Muleta of OI Systems. I2 researchers are investigating the potential of Internet Protocol Version 6, quality of service, and multicasting, among other technologies.
    http://www.newsfactor.com/perl/story/21668.html

  • "Storage Methods Come and Go, But Tape Holds Its Own"
    New York Times (06/05/03) P. E8; Austen, Ian

    Many storage experts predict that tape has a long and healthy life ahead of it, despite advances in hard disk drives, DVDs, and holography. One of tape's advantages over these alternate media is its storage space, thanks to the material's extreme thinness and its ability to hold large numbers of data tracks. Gordon F. Hughes of the University of California, San Diego's Center for Magnetic Recording Research notes that tape can sometimes be more expensive than ordinary PC hard drives, and suffers from longer recording and retrieval times. On the other hand, Imation's Saurin Shah says that growing worries about terrorism and natural catastrophes are renewing interest in tape as a back-up tool--for example, copying critical data onto tape and transferring it to an off-site storage facility is easier than doing the same with hard drives. Some researchers are trying to ensure tape's future by expanding its storage capacity: For instance, Ohio State University scientists are attempting to increase the number of data tracks on each length of tape by smoothing out irregularities during the manufacturing process. Researchers there are investigating the possibilities of new polymers, or studying the slitting stage of the assembly process to see if improvements can be made. Meanwhile, Imation is considering whether new tape coatings that boost storage capacity can be manufactured by retooling for tape the high-vacuum evaporation system used to coat hard-drive disks. Shah's company is also part-owner of O-Mass, a firm that is developing a new tape drive that ought to squeeze in more data tracks via laser-light reading. Advanced tape cartridges currently boast a maximum storage capacity of 200 GB.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Putting a Lid on Spam"
    Wall Street Journal (06/06/03) P. B1; Mangalindan, Mylene

    Thirty-three U.S. states have enacted anti-spam legislation, but Utah's stands out for the amount of litigation it has spurred. The Utah statute classifies spam as any commercial email sent to people with no prior business relationship with the sender, and requires spammers to clearly mark spam in the subject line, include a legitimate domain name and street address in the message, and allow recipients to opt out of mailing lists. Any spammer who fails to comply with these regulations can be sued by recipients; if the court rules in favor of the plaintiff, the spammer must pay either $10 for every piece of spam sent or $25,000 for each day spam was sent to the plaintiff. Utah lawyers Jesse Riddle and Denver Snuffer have drawn publicity from the approximately 930 lawsuits they have filed against alleged spammers, many of whom are major firms such as J.P. Morgan Chase and Verizon Communications. Riddle claims he and his partner originally targeted purveyors of online pornography or other obviously objectionable content, but the difficulty of tracking such spammers down caused them to go after easier-to-find companies. Riddle's legal pursuit of legitimate firms illustrates the ambiguous definition of spam in Utah's legislation, and one of the anti-spam law's authors, state Sen. Patrice Arents, wants to address this uncertainty by refining the law's spam classifications. Meanwhile, several state lawmakers are calling for Gov. Mike Leavitt to hold a hearing that focuses on the anti-spam law. Critics particularly object to a provision that requires losing defendants to pay the plaintiff's legal bills, while winning defendants are not compensated for legal costs by losing plaintiffs.

  • "Cybersecurity Report Card--Serious Improvements Needed"
    ZDNet (06/02/03); Farber, Dan

    Dan Farber has furnished a report card gauging the progress made by the various parties working either for or against cybersecurity over the last year, and hackers received an "A-" thanks to their growing capability to thwart detection, the ease with which they exploit new security holes, and the increasing availability and sophistication of hacker tools and techniques. IT organizations, which received a "D+," had a hand in the hackers' progress because of their inability to keep pace with patching and with managing elaborate computer networks, although Farber thinks their poor grade could be the result of economic factors and executive management's failure to prioritize cybersecurity. The software developer community also received a "D+" because commercial and in-house software is still riddled with common flaws even though developers are striving to improve quality assurance and coding techniques; in addition, Gartner reckons that fewer than 20 percent of larger enterprises have the personnel and proficiency needed to develop safe software. Although security products show improvement and efforts are underway to standardize and ease security management, the security industry received a "C-" because many companies do not have the means to deploy such products, or face considerable implementation difficulties. Farber gave the U.S. government a "D" grade because it does not have enough money and staff to deal with cybercrime, which fosters a reactive rather than proactive attitude among agencies; furthermore, corporations are hesitant to disclose information about intrusions. End users, whom Farber describes as "caught up in the maelstrom," got a "C," reflecting their general passivity and lack of awareness of cybersecurity issues.
    Click Here to View Full Article

  • "New Nanoscale Device Reveals Behavior of Individual Electrons"
    News@UW-Madison (06/03/03); Beal, James

    A new nanoscale device developed at the University of Wisconsin-Madison allows researchers to observe the behavior of electrons in detail not possible before. Just 100 nanometers wide, the device is a thin membrane stretched over a semiconductor gap. As electrons flow over the membrane, the heat they dissipate creates vibrations that the scientists can measure as changes in voltage. Electrical and computer engineering associate professor Robert Blick says the device measures two-dimensional electron flows, but can also be scaled down to measure single-file, one-dimensional electron flows and even zero-dimensional states, where electrons are part of a single-electron transistor. Blick says semiconductor firms will be able to use the new data to better manage heat dissipation on the chip. In the longer view, the electron-measurement device promises to aid quantum computing, which requires that computers be able to read quantum bits without changing their state; the Heisenberg uncertainty principle holds that observing a quantum system disturbs it. Blick says the new tool will allow researchers to see whether that obstacle is a fundamental challenge to quantum information processing without destroying the qubit state. Blick says, "We can study information processing on the quantum level and see whether the Heisenberg principle gives us a real obstacle, or whether we can find ways around it by using quantum-nondemolition techniques."
    http://www.news.wisc.edu/view.html?get=8710

  • "Group Drafts a Truce in Flaw Dispute"
    CNet (06/04/03); Lemos, Robert

    The Organization for Internet Safety (OIS) issued a draft of bug disclosure guidelines on Wednesday in the hope of settling a dispute between software companies and security researchers over the best time and methods for publishing software vulnerability alerts. Although security has become a higher priority for software providers over the last several years, certain researchers are not giving companies enough time to build patches before publicly posting bug alerts. "[Researchers] don't always understand that sometimes the fix can take longer than a few days," notes Oracle chief security officer Mary Ann Davidson. The guidelines, which are expected to be released in late July at the Black Hat Briefings security conference, give software manufacturers a week to respond to a researcher's bug warning, and 30 days to devise an appropriate patch. Likewise, researchers must wait at least 30 days after the patch is released before they can publicly disclose the software flaw. Some researchers think that this latter policy is an unwise move. "If we don't have details, we are just going on the word of the software vendors and a small group of trusted companies," argues eEye Digital Security's Marc Maiffret. OIS members include Oracle, @stake, Internet Security Systems, SGI, Microsoft, and Guardent.
    http://news.com.com/2100-1002_3-1013423.html
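
    The draft guidelines amount to a simple timeline from a researcher's first report to public disclosure: a week for the vendor to respond, 30 days to produce a patch, and a further 30 days before details are published. A minimal sketch of that timeline follows; the function name and the assumption that each stage begins as soon as the previous one ends are illustrative, not spelled out in the OIS draft.

        # Rough disclosure timeline implied by the draft OIS guidelines.
        # Back-to-back scheduling of the stages is an assumption for illustration.
        from datetime import date, timedelta

        def disclosure_timeline(reported):
            vendor_response_due = reported + timedelta(days=7)
            patch_due = vendor_response_due + timedelta(days=30)
            earliest_disclosure = patch_due + timedelta(days=30)
            return {
                "reported": reported,
                "vendor response due": vendor_response_due,
                "patch due": patch_due,
                "earliest public disclosure": earliest_disclosure,
            }

        for stage, day in disclosure_timeline(date(2003, 7, 30)).items():
            print(f"{stage:28} {day.isoformat()}")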

  • "California Law Raises Bar For Data Security"
    Investor's Business Daily (06/06/03) P. A5; Howell, Donna

    A California law requiring U.S. firms to notify Californians in writing if their personal data has been compromised or corrupted as a result of database intrusions will go into effect on July 1. "It's a law that on its face purports to cover those outside California, as well as those inside, if they have databases with personal information on California residents," notes Christopher Wolf of Proskauer Rose, who adds that his firm's clients are working to adhere to the statute. The law is applicable when an intruder is able to obtain a person's surname and at least his or her first initial, as well as one other piece of data, such as a Social Security, California ID card, or driver's license number, or the number of a debit or credit card with an associated PIN or password. Companies are worried that being required by law to reveal such break-ins will damage their reputations and affect sales. Clients could take their business elsewhere or sue the firms as a result of such disclosures. Foley & Lardner tech attorney Mike Overly says that some firms are following a "safe harbor" strategy, examples of which include the encryption of stored information and the storage of personal details in separate databases. Higher-level encryption is drawing interest in light of security and privacy regulations, and new forms of encryption are forthcoming: RSA Security's Nightingale technology, for instance, will be able to encrypt a single item of data but store it on multiple servers. Other mandates that have raised the bar on privacy and security include the Health Insurance Portability and Accountability Act and the Gramm-Leach-Bliley Act, which serve the health and financial sectors, respectively.
    http://www.investors.com/editorial/tech01.asp
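
    The "safe harbor" approach Overly describes rests on keeping stored personal data encrypted, so that a copied database does not expose usable information. Below is a minimal sketch of field-level encryption using the open-source Python cryptography package; the field values and in-memory key handling are illustrative assumptions, and a real deployment would keep the key in a separate, access-controlled store.

        # Minimal sketch of encrypting a sensitive field before storing it,
        # using the "cryptography" package (pip install cryptography).
        # Key handling here is simplified for illustration only.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()    # in practice, loaded from a key-management service
        cipher = Fernet(key)

        def encrypt_field(value):
            """Encrypt one piece of personal data before writing it to the database."""
            return cipher.encrypt(value.encode("utf-8"))

        def decrypt_field(token):
            """Decrypt a stored value for an authorized application request."""
            return cipher.decrypt(token).decode("utf-8")

        stored = encrypt_field("C1234567")        # e.g., a driver's license number
        print(stored)                             # ciphertext that is safe to store
        print(decrypt_field(stored))              # recoverable only with the key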

  • "IT Boom Reverses Brain Drain"
    Moscow Times (06/04/03) P. 9; Martin, J. Quinn

    As a result of the weakening U.S. IT sector, many Russian programmers and engineers are returning to their native country to seek jobs. Experts say 10 percent of high-tech job seekers in Russia have recently returned from abroad, including some who held H-1B visas. Experts expect that the trend could last for at least another two years, allowing Russia to regain many of the tech professionals it had lost over the course of more than a decade. "When American or European companies need to downsize, foreign employees are often first to go," says Larisa Lukashyova, human resources manager at Moscow-based software firm Spirit. Tech professionals are also returning in high numbers because the Russian IT sector is steadily expanding, she says. The sector has recently been growing at 20 percent annually, and salaries are also rising, though they remain below those in Europe and North America, says Lukashyova. Russians who return from abroad have the added advantage of knowing English, which accounts for their higher salaries compared to residents who have never been overseas, says Oleg Tsetovich, an IT consultant for recruiting firm Avenir & Partners in Russia. He adds that returnees have gained valuable international business understanding, strong corporate contacts, and unique skills.
    http://www.moscowtimes.ru/stories/2003/06/04/044.html

  • "India Fears Impact of Bid to Curb Jobs Exports"
    Financial Times (06/04/03); Alden, Edward; Foremski, Tom; Merchant, Khozem

    India is apprehensive about recent efforts by U.S. trade groups and legislators to stem overseas IT outsourcing as IT sector unemployment rises in the United States. More than 472,000 IT jobs will be relocated to foreign countries by 2015, according to Giga Information Group, compared to 27,000 jobs in 2000. Meanwhile, unemployment in the IT segment reached 5.2 percent in 2002, compared to 3.7 percent in 2000. Firms such as Microsoft plan to take advantage of lower labor costs and productivity gains realized overseas. Microsoft spent $400 million on IT and education deals in India last year, the firm's largest foreign expenditure. Microsoft also wants to double the number of engineers at its Hyderabad, India, facility to 500 within three years. In May, congressional lawmakers introduced bills designed to prevent Indian firms from transferring India-based employees to their U.S.-based subsidiaries, citing abuse of L-1 visa stipulations. Also, in a growing backlash to the loss of tech jobs to overseas workers, legislators in four states have proposed bills barring their state governments from doing business with companies that outsource to other countries. India's National Association of Software and Services Companies (Nasscom) says companies are realizing productivity gains of 20 to 25 percent when outsourcing to Indian firms, and have saved $8 billion over the last four years. Nasscom also believes that legislative efforts to prevent outsourcing could be challenged under WTO rules.
    http://search.ft.com/search/article.html?id=030604000982

  • "Taking Technology to Extremes"
    New York Times (06/05/03) P. E1; Revkin, Andrew C.

    Using a combination of off-the-shelf products and customized solutions, adventurers to the world's most remote places are using technology to keep connected. Because today's technology is smaller and lighter, it is becoming vital to exploration: it displaces fewer other supplies, allows people to communicate with commercial and emotional sponsors, and assists in emergency situations. Extreme frontiers such as the Himalayas and the North Pole are putting technology to the test, challenging the range of satellite phones and quickly draining batteries in sub-zero temperatures. A Chinese TV team recently beat American competitors to become the first to broadcast from the summit of Mount Everest. Explorersweb is one provider of rugged technology, and has found that ingenuity sometimes surpasses complexity: For instance, lithium AA Energizer batteries are harnessed in a custom holder to produce 12-volt current. Swedish adventure trekkers Tom and Tina Sjogren run Explorersweb and sell or rent packages people can use to post Web log updates and photos. At the North Pole, connecting means dealing not only with cold, but with the convergence of longitudes, which can confuse GPS receivers as to whether one is standing at Honolulu's longitude or that of Paris. In the Himalayas, Iridium satellite phones are foiled by signal-blocking mountains, but Thuraya phones work, since they rely on geostationary satellites that hold a fixed position relative to the Earth's surface. Simple coded communications over these phones let explorers make the most of batteries, substituting single digits for voice.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "World's Smallest Robot"
    Popular Mechanics (06/03) Vol. 180, No. 6, P. 44; Eisenstein, Paul

    Scientists at Sandia National Laboratories' Intelligent Systems and Robotics Center say they have developed the smallest untethered robot ever. Standing 1 cm tall, the robot rides on tank-like treads that propel it at just 20 inches per minute. The machine is powered by three watch batteries, has 8 KB of memory, and can sense temperature. Scientists envision several uses for similar-sized robots, ranging from chemical detection to bomb searching. They may also be used for surveillance purposes, emerging as needed to take a picture and then returning to their hideaway. Scientists used stereolithography to make the robot so small. Following the pattern of a computerized drawing, a stereolithographic laser solidifies a liquid polymer in layers until a part is formed. The scientists also assembled commercially available electronics on a glass substrate. Sandia researcher Doug Surgle believes "we've almost reached the limit" on size because of battery size, not because of the size of computer chips or motors. Other robotic advances are occurring in the medical field, where scientists plan to create surgical robots that can be injected into the body. Nanotechnology Development of Britain is working on a robot that can be inserted into the body in pieces and then assemble itself for tasks such as clearing blood clots, smashing kidney stones, or removing cancerous cells.

  • "Adolescent Angst"
    InformationWeek (06/02/03) No. 942, P. 30; Greenemeier, Larry

    An InformationWeek Research poll of 274 business-technology executives who have been using the open-source Linux operating system for 12 months finds that the number of respondents who are extremely satisfied with the OS has fallen over the past year from 86 percent to 74 percent. The percentage of people who cite the lack of Linux business applications as the greatest disadvantage has risen from 32 percent to 38 percent, while 19 percent call incompatibility between Linux distributions the biggest problem, compared to 12 percent a year ago. Over 75 percent of survey respondents list cost, reliability, and performance as the primary reasons their companies chose Linux, but over 25 percent cite proliferating Linux variants as the chief cause of implementation problems, while 20 percent blame inadequate technical support. The continuous improvement of Linux and the rapid rollout of new versions are anathema to companies striving for system stability, notes Brian Stevens of Red Hat. Although Linux's credibility has received a boost thanks to the support of such companies as IBM, Hewlett-Packard, and Oracle, the survey notes a slight decrease in the number of servers running Linux in the past year, accompanied by a slight increase in the number of Windows NT or 2000 servers. Tim Witham of the Open Source Development Lab says that this year's debut of the Linux 2.6 kernel will open up the enterprise sector to the OS. Modifying the kernel will enable programs to run heavier processing loads and access more information more reliably; large-array storage management and configuration will also be made easier, while database performance on eight- and 16-way symmetric multiprocessor servers will be significantly advanced. The majority of respondents expect their companies to continue to use Linux as an OS for database management, Web or intranet services, network file-and-print services, and application development, while fewer than 33 percent expect to run enterprise applications on Linux.
    Click Here to View Full Article

  • "Battle-Tested Tech"
    InfoWorld (06/02/03) Vol. 25, No. 22, P. 42; Epstein, Eve; Fisher, Susan E.; McCarthy, Jack

    Advanced commercial IT for enterprises is being put through its paces in the military; recent conflicts in the Middle East were a testbed for various technologies that emphasized a reversal of the technology development cycle, which traditionally moved from military research to commercial applications. Coordinating the Iraq war as well as the post-war reclamation effort using mostly commercial technologies--including the Windows CE-based Phraselator, the Global Positioning System, GDDS' Common Ground Station, and Iridium--illustrated the importance of an integrated communications system to real-time decision-making, morale boosting, remote reconnaissance, and medical efforts, lessons that can be translated to the enterprise. Intelligence-gathering robots from the likes of iRobot and Cambridge's Draper Laboratory were employed in the war to keep people out of harm's way while surveying dangerous regions, and robots under development at the Defense Advanced Research Projects Agency (DARPA) may allow troop movements to be remotely monitored. These machines could be applied to civilian law enforcement and the mapping of difficult terrain. The Iraq campaign provided an opportunity to test the Defense Department's Common Access Card (CAC) technology, which involved issuing military personnel PKI-equipped smart cards that stored financial and medical data as well as transportation routines, permitted holders to obtain weapons, and allowed battlefield commanders to access encrypted communications; Schlumberger is developing a card that incorporates CAC standards. Meanwhile, the U.S. military used collaboration technology from Groove Networks to eliminate paper-based field reporting in Iraq--technology that DARPA is using to build its Terrorism Information Awareness project. Retailers are looking into the possibilities of radio frequency identification (RFID) tags, which U.S. forces used in the Iraq war for just-in-time logistics and supply-chain management; also used during the war was Cognos Metrics Manager, a tool implemented to monitor, study, and report the readiness of personnel, gear, and operations.
    Click Here to View Full Article

  • "The Post-OOP Paradigm"
    American Scientist (04/03) Vol. 91, No. 2; Hayes, Brian

    Computer programming's sophistication has evolved from binary notation to assembly languages to structured programming to object-oriented programming (OOP), which has been the reigning programming paradigm for about 20 years. The wide adoption of OOP has hinged on its ability to encapsulate data and procedures within a single object, and its support of interactive software that works with a graphical user interface. But there are indications that OOP's appeal among programmers is waning, leading to investigations into new methodologies, many of which are more of a supplement to, rather than a replacement of, OOP. Aspect-oriented programming (AOP), for example, is being probed as a way to handle program features that resist a clean breakdown into objects and classes; giving a class multiple parents--such as establishing a five-pointed star as a subclass of both a pentagon and a self-intersecting polygon--is another partial remedy. AOP advocates believe the method is especially suitable to the deployment of omnipresent jobs such as error-handling and event-logging. People are also pursuing the idea of making programming an automatic function of the computer, a concept that goes by several names, including generative programming. Proponents of another programming technique that focuses on recurring patterns in both problems and solutions take a dim view of generative programming. Their philosophy stems from architect Christopher Alexander's theory that well-designed structures--and by extension software systems--must possess an unfailingly accurate "quality without a name."
    Click Here to View Full Article
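
    To make the summary's two ideas concrete, the sketch below uses Python: a class with multiple parents, echoing the five-pointed-star example, and a small decorator that weaves event logging into a method the way AOP weaves cross-cutting concerns through a program. The class and function names are illustrative assumptions, not code from the article.

        # Illustrative sketch only; names and structure are assumptions chosen to
        # mirror the multiple-inheritance and cross-cutting-concern ideas above.
        import functools

        def logged(method):
            """Event logging as a cross-cutting concern, woven in without touching class logic."""
            @functools.wraps(method)
            def wrapper(self, *args, **kwargs):
                print(f"calling {type(self).__name__}.{method.__name__}")
                return method(self, *args, **kwargs)
            return wrapper

        class Pentagon:
            sides = 5

        class SelfIntersectingPolygon:
            self_intersecting = True

        class FivePointedStar(Pentagon, SelfIntersectingPolygon):
            """Multiple parents let the star inherit from both shape families."""
            @logged
            def describe(self):
                return f"{self.sides} points, self-intersecting: {self.self_intersecting}"

        print(FivePointedStar().describe())   # logs the call, then prints the description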

 
                                                                             