
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 560:  Monday, October 20, 2003

  • "Researchers Take Debugging to the Masses"
    CNet (10/17/03); Shankland, Stephen

    The Cooperative Bug Isolation Project is a joint effort between Stanford University and the University of California at Berkeley designed to improve how software glitches are tracked down and corrected by developing and releasing modified open-source software packages that send debugging data to a central site, where users can participate in the search for errors. "We're actually trying to enlist some of the users' horsepower to really find the bug, to give the engineer some information that will lead him to the bug more directly," explains project member and Berkeley graduate student Ben Liblit. Open-source programmers can augment their own software with "sampler" software that allows the program to capture data during operation. Liblit says it was a chief goal of the project to have the sampler software slow program performance by no more than 5 percent, so the sampler uses randomization to record data only sporadically. Each report also documents whether the program exited properly or crashed; the debugger then correlates data from crashing runs with data from successful ones, and Liblit says the discrepancies it uncovers should allow engineers to trace likely points of failure. Finding software users willing to participate in the Stanford-Berkeley project will be a formidable challenge, according to Illuminata analyst Jonathan Eunice. More contributors could get involved if open-source projects or companies issue feedback-enabled versions of their software, but Liblit reports that project members have no current plans to increase the availability of the sampler-enhanced programs.
    Click Here to View Full Article
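
    The sampling idea can be illustrated with a toy sketch in Python. This is an illustration of the general technique only, not the project's actual sampler (which instruments compiled programs); the predicate label, sample rate, and failing function are illustrative assumptions. Each instrumentation site records an observation only with small probability, keeping overhead low, and each run reports its sparse predicate counts along with whether it exited cleanly or crashed.

      import random

      SAMPLE_RATE = 0.01   # examine roughly 1 in 100 executions of each site

      counts = {}          # predicate label -> [times observed true, times false]
      outcome = "ok"       # set to "crash" if the run fails

      def observe(label, value):
          # Sparsely sampled instrumentation site: most executions record nothing.
          if random.random() < SAMPLE_RATE:
              bucket = counts.setdefault(label, [0, 0])
              bucket[0 if value else 1] += 1

      def divide(a, b):
          observe("divide: b == 0", b == 0)   # cheap check, rarely recorded
          return a / b

      try:
          for x in range(10000):
              divide(x, x - 5000)             # fails once x reaches 5000
      except ZeroDivisionError:
          outcome = "crash"

      # Many such (outcome, counts) reports, gathered at a central site, let
      # engineers look for predicates whose truth correlates with crashing runs.
      print(outcome, counts)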

  • "Tech Layoffs Fading, But No Hiring Boom"
    Investor's Business Daily (10/20/03) P. A1; Graham, Jed

    The technology employment landscape in the United States is looking different as numbers begin to flatten out and some sectors are actually adding workers: Notably, IBM announced plans to hire up to 10,000 new employees next year in "key skill areas" such as Linux, high-value services, and middleware, though previous reports also said the company intends to grow its India workforce from fewer than 5,000 workers now to 10,000 workers by 2005. Analysts say the United States needs to provide high-skills jobs that are not as vulnerable to overseas outsourcing. Monster.com CEO Jeff Taylor says hardware and software job postings on his company's Web site have increased faster than overall job postings, growing by 33 percent and 28 percent, respectively, compared to 11 percent for more general job postings in September. But Taylor also says the technology sector is still waiting for the next big innovation or market driver to reinvigorate hiring. Indeed, the Labor Department says the United States has now lost technology manufacturing jobs for 32 consecutive months. In general, the technology industry is still mixed in terms of hiring, but Quantit Economic Group economist Mat Johnson says the trend is promising, especially with 2,400 new computer system designers hired in September. He says new products and services will grow jobs, but that some industries that overinvested in the late 1990s, such as telecommunications, are still in cost-cutting mode and outsourcing overseas rapidly. Institute of Electrical and Electronics Engineers R&D policy committee Chairman Ron Hira notes a seminal change has taken place in the technology market that cyclical patterns have masked: He says the effects of the technology bubble are covering up a shift of R&D overseas, where coordination between development and manufacturing is easier.

  • "Scientists Explain and Improve Upon 'Enigmatic' Probability Formula"
    Newswise (10/16/03)

    Researchers at the University of California, San Diego's Jacobs School of Engineering report in the Oct. 17 issue of the journal Science that automatic speech recognition, natural language processing, and other machine learning software could be enhanced by applying new knowledge about a mathematical formula that helped British cryptanalysts decrypt the Enigma code during the Second World War. That formula, the "Good-Turing estimator," was a nonintuitive equation that outmatched more intuitive estimators. Good-Turing has been included in applications ranging from information retrieval to spell-checking to speech recognition software, but Alon Orlitsky of UCSD's Electrical and Computer Engineering Department comments that no objective assessment has explained why the formula works so well. Furthermore, scientists have noted that the formula does not perform well in all situations. Orlitsky and his research team claim to have created a new estimator that functions reliably under all circumstances, and propose a natural metric for estimator performance known as attenuation, which assesses the highest possible ratio between the likelihood assigned to each symbol in a sequence under any distribution and the corresponding likelihood assigned by the estimator. The researchers demonstrate that intuitive estimators are capable of attenuating the likelihood of a symbol by an arbitrary amount, and show that Good-Turing never attenuates the probability of a symbol by a factor of more than 2. "While there is a considerable amount of work to be done in simplifying and further improving the new estimator, we hope that this new framework will eventually improve language modeling and hence lead to better speech recognition and data mining software," remarks Orlitsky.
    Click Here to View Full Article
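
    For readers unfamiliar with the formula, below is a minimal sketch of the classic Good-Turing estimate as it is usually stated; it does not reproduce the UCSD team's improved estimator, and the sample string and fallback rule are illustrative assumptions. A symbol seen r times receives the adjusted count (r + 1) * N(r + 1) / N(r), where N(r) is the number of distinct symbols seen exactly r times, and the probability mass reserved for never-seen symbols is N(1) / n.

      from collections import Counter

      def good_turing(sample):
          # Classic Good-Turing estimates from a sequence of observed symbols.
          n = len(sample)
          counts = Counter(sample)                  # symbol -> r, times seen
          freq_of_freq = Counter(counts.values())   # r -> N(r), symbols seen r times

          def adjusted_count(r):
              # r* = (r + 1) * N(r + 1) / N(r); fall back to the raw count when
              # N(r + 1) is zero, a common shortcut in the sparse high-count region.
              if freq_of_freq.get(r + 1, 0) == 0:
                  return r
              return (r + 1) * freq_of_freq[r + 1] / freq_of_freq[r]

          probs = {sym: adjusted_count(r) / n for sym, r in counts.items()}
          unseen_mass = freq_of_freq.get(1, 0) / n  # mass reserved for new symbols
          return probs, unseen_mass                 # note: not renormalized here

      probs, unseen = good_turing(list("abracadabra"))
      print(unseen)   # probability mass set aside for letters never observed
      print(probs)    # smoothed probabilities for the observed letters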

  • "Trouble Grows at the Internet's Root"
    Computer Business Review (10/20/03); Murphy, Kevin

    VeriSign and other organizations responsible for the 13 DNS root servers are voicing increasingly divergent views of how the Internet system should be managed: VeriSign CEO Stratton Sclavos recently told the media that the system needed to be commercialized, a statement Internet Software Consortium (ISC) Chairman Paul Vixie calls a "pre-emptive strike." Many proponents of the current distributed system say VeriSign should no longer control its two root servers because of possible conflicts of interest. In addition, the firm drew tremendous criticism for, and eventually shuttered, its SiteFinder service. The ISC recently announced the formation of a new international DNS effort, the Operations, Analysis, and Research Center (OARC), that will facilitate information sharing and more proactive DNS security among participant organizations; a number of Internet organizations and commercial interests are already signed on, but VeriSign has not yet joined. The OARC is meant to better protect against attacks such as the denial-of-service attack one year ago that effectively closed off access to nine of the 13 root servers. Sclavos said recently that commercializing DNS would make more resources available, and that current university and non-profit DNS custodians are hindering commercial innovation. But Vixie points out flaws in many of Sclavos' arguments, and also says the denial-of-service attack never actually shut down the root servers, but rather clogged the pipelines leading to them. He says the ISC has already mirrored its root server all over the world through IP anycast and continues to make it more resilient. Vixie further disputes VeriSign's $150 million investment claims, since the main effect of that money was to bolster VeriSign's commercial .com and .net services.
    Click Here to View Full Article

  • "E-Vote Firms Seek Voter Approval"
    Wired News (10/20/03); Zetter, Kim

    Electronic voting machine makers are working to quiet criticism from computer scientists and security specialists by acknowledging the need for a paper trail to verify results. Meanwhile, the Information Technology Association of America (ITAA) has drafted a proposal to help vendors of electronic voting systems counter the bad publicity stemming from disclosure of security flaws by organizing a media campaign to "generate positive public perception" of the firms and significantly lower the amount of criticism leveled at them. Voting activists and computer scientists have opposed any e-voting machine that does not provide a printed receipt, despite repeated claims from election officials and the voting industry that such receipts would compromise the efficiency, cost effectiveness, and security of the voting process. Under the ITAA proposal, voting machine companies would adopt an industry code of ethics and pay $100,000 to $200,000 each for the association to launch a campaign on their behalf. The proposal's author, Michael Kerr, says the voting industry is still weighing its options. Stanford University computer science professor and voting activist David Dill argues that the voting industry should devote less time to repairing public perception and more time to patching e-voting technology by "making the voting process transparent, improving certification standards for the equipment and [ensuring] there is some way to do a recount if there is a question about an election." David Allen of Plan Nine notes that Kerr's memo, which lists nine objectives, ranks security improvements as No. 5, behind public relations and lobbying efforts. Bill Stotesbery of voting machine firm Hart InterCivic claims that all industry vendors are working toward products that furnish voter-verifiable paper ballots, an announcement that surprised and pleased Dill, who noted that "there are right ways and wrong ways to do it."
    Click Here to View Full Article

    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "The Future of Data Storage"
    E-Commerce Times (10/15/03); Horowitz, Alan S.

    Advanced Technology Attachment (ATA) drive technology is taking off thanks to the emergence of serial ATA (SATA), which Meta Group program manager Rob Schafer describes as "the wave of the future" in data storage. Although solid-state and fibre-channel technologies offer superior reliability and speed, they are very expensive; tape, on the other hand, is cheap but very slow, while disk technology occupies a price-performance niche between these two extremes. SATA offers significant speed and performance upgrades over standard ATA, and costs dramatically less than SCSI and other high-performance technologies. SCSI itself has been upgraded to Internet Small Computer System Interface (iSCSI), which Gartner's Robert Passmore expects to be supported by products from nearly all storage industry vendors in the next six to 12 months; he also notes that the new storage networking standard promises to cut storage-area network (SAN) connectivity costs, while Dell Computer's Marc Padovani predicts that vendors will move toward SAN interoperability over the next year. None of the experts expect wireless technology to play an important role in data storage in the foreseeable future. Federal regulations that require corporations to save massive amounts of data, such as the Sarbanes-Oxley Act and the Health Insurance Portability and Accountability Act, are fostering an ongoing need for expanding storage capacity, and making solutions such as SATA technology attractive. The growing corporate desire to store and manage information is causing Hewlett-Packard and other companies to pursue information lifecycle management (ILM) initiatives. Passmore remarks that management is becoming more highly valued than technology in the storage industry, as evidenced by the organization of dedicated storage management groups.
    Click Here to View Full Article

  • "New Internet Speed Record Set by Euro-U.S. Labs"
    Reuters (10/15/03)

    Olivier Martin of CERN, the European Organization for Nuclear Research, touts the lab's early-October transmission of data across the Internet to the California Institute of Technology as a milestone. CERN reports that it sent 1.1 TB of data at 5.44 Gbps from its Geneva facilities to Caltech's lab in the United States, a record that more than doubles the Internet's previous top data-transfer speed of 2.38 Gbps, set in February. The CERN-Caltech transmission was more than 20,000 times faster than a regular home broadband connection. The two labs completed the transmission, which covered more than 7,000 kilometers of network, in nearly 30 minutes. Martin is excited about the possibility of achieving even faster transmissions, which would allow instant collaboration between researchers around the world. Caltech's Harvey Newman believes systems operating at 10 Gbps "will be commonplace in the relatively near future."
    Click Here to View Full Article
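
    The reported figures are easy to sanity-check. The short calculation below assumes decimal units for the terabyte figure and infers the home-broadband baseline from the "20,000 times faster" claim; neither assumption is stated in the article.

      data_bits = 1.1e12 * 8            # 1.1 terabytes expressed in bits
      rate_bps = 5.44e9                 # 5.44 gigabits per second

      print(data_bits / rate_bps / 60)  # about 27 minutes, i.e. "nearly 30 minutes"

      # "More than 20,000 times faster" implies a baseline of roughly
      # 5.44e9 / 20000, or about 272 Kbps, a plausible home speed in 2003.
      print(rate_bps / 20000)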

  • "Magnetic Memory Makes Logic"
    Technology Research News (10/15/03); Smalley, Eric

    Harnessing the logic capabilities of magnetic random access memory (MRAM) technology, researchers at the Paul Drude Institute for Solid-State Electronics in Germany have created a new reconfigurable processor scheme. The proposed device would have the nonvolatile characteristics of magnetic memory, thus melding two of the three core computing elements: processing, random access memory, and hard disk memory. Paul Drude researcher Andreas Ney says the device would increase computers' efficiency. The German researchers' design has the two magnetic layers common to MRAM, with three input connections on the top layer and one output connection on the bottom layer. The wires send positive and negative currents that change the magnetic field and allow software to reprogram the device on the fly; the wire configuration also allows AND, OR, NAND, and NOR computing functions. Ney predicts magnetic logic processors will likely be arranged in a square mesh like current RAM chips. Such devices could be used for a number of purposes, unlike many of today's chips that are hardwired for specific tasks. Ney says the next step for his research involves creating more complex logic circuits, writing software compilers that could reprogram the devices, and building prototypes; Ney also says MRAM devices expected commercially next year could conceivably be used as logic devices with a proper addressing scheme, but forecasts that full universal magnetic logic processors will not arrive for at least another 10 years.
    Click Here to View Full Article
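
    Reconfigurable magnetic gates are often described in terms of a three-input majority function in which one input acts as a programming line: fixing that line selects AND-like or OR-like behavior, and inverting the output yields NAND or NOR. The sketch below is a toy truth-table model of that general idea, not the Paul Drude group's actual device; the function names and the output-inversion flag are illustrative assumptions.

      def majority(a, b, c):
          # Three-input majority vote: output is 1 if at least two inputs are 1.
          return 1 if (a + b + c) >= 2 else 0

      def magnetic_gate(a, b, program_bit, invert_output=False):
          # The program_bit plays the role of the current that reconfigures the
          # gate on the fly: 0 -> AND-like behavior, 1 -> OR-like behavior.
          out = majority(a, b, program_bit)
          return 1 - out if invert_output else out

      for a in (0, 1):
          for b in (0, 1):
              print(a, b,
                    magnetic_gate(a, b, 0),                      # AND
                    magnetic_gate(a, b, 1),                      # OR
                    magnetic_gate(a, b, 0, invert_output=True),  # NAND
                    magnetic_gate(a, b, 1, invert_output=True))  # NOR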

  • "Juniper Spearheads Effort to Fortify 'Net"
    Network World (10/20/03); Duffy, Jim

    Juniper Networks last week revealed its Infranet Initiative, a project that aims to devise a user-to-network interface for client/service provider interaction and an inter-carrier interface between service providers that will enable customers and providers to build an "infranet" offering both the ubiquitous connectivity of the Internet and the predictable performance and security of a private network. The global infrastructure stemming from the infranet will be able to support machine-to-machine grid computing, allow Web-enabled applications to reach their full potential, and launch the era of the Internet-based economy, according to Juniper. "I think [the project] has teeth, but I think the teeth that it has initially are as much political as technical," comments CIMI President Tom Nolle. The two infranet interfaces will comply with a series of interconnection standards that set up a "lowest common denominator" necessary for deployment. Lucent is the only Juniper business partner participating in the project thus far, but analysts think Ericsson and Siemens may join the initiative as well. Juniper reports that Cisco would be welcomed as a project participant, although Juniper has not made any overtures to its competitor yet. Juniper claims that the key factor to the infranet's success is not technology, but industry collaboration on the optimal implementation strategy to employ. The company says it is suggesting "selectively open" links between carriers, determined by unified specifications such as the Layer 2 Tunneling Protocol and Resource Reservation Protocol, that "support and reward the delivery of advanced services, such as content distribution and virtual private networks" on a global level.
    Click Here to View Full Article

  • "Progress, Innovation Coming to Cell Phones"
    SiliconValley.com (10/19/03); Gillmor, Dan

    Signs that smarter, more intuitive cell phones are starting to move out of theory and into practice were in evidence at the European Technology Roundtable in Berlin, where several products were unveiled by Orange CEO Solomon Trujillo. The phones Trujillo displayed integrate certain features of both voice phones and PCs while boasting ease-of-use and adaptability. The Orange CEO claimed that such machines will allow every customer to be "a segment of one," and eliminate the need for mobile carriers to assign customers with different requirements to large market segments. By marrying more intelligent phones with carrier-specific network services, customers will be able to select specific device functions; Trujillo, for example, expects future cell phones to be able to access train schedules so that a user can find a train with the departure time he or she desires and purchase a ticket with only a few clicks. "We should not add [new features to the smart phones] too fast, or else we'll fall into the trap of the general purpose platform," cautions 3Com Chairman Eric Benhamou. Orange and its competitors have spent billions of dollars to implement third-generation (3G) mobile services, and are anxious to justify the cost. Trujillo is hopeful that Orange customers will be so enamored of the new services that they will spend enough time online to allow Orange to recoup the enormous 3G costs and still turn a profit. On the other hand, Joe Schoendorf of Accel Partners doubts 3G will yield any financial rewards, and sees better potential in Wi-Fi and its wireless progeny. Dan Gillmor writes that all the competition is healthy and a breeding ground for innovation.
    Click Here to View Full Article

  • "A Connection in Every Spot"
    Wired News (10/16/03); Baard, Mark

    The UbiComp 2003 ubiquitous-computing conference played host to engineers who advocate that technology should be used to establish real-world links between human beings rather than separating people within virtual environments. The UbiComp engineers subscribe to the late Xerox technologist Mark Weiser's belief that computers should be embedded everywhere in the form of minuscule wireless devices designed to maintain people's comfort levels rather than aggravate them with intrusive, invasive technology. Practical applications of this concept have only just begun to emerge, following setbacks stemming from flaws in the Bluetooth wireless standard and poorly conceived marketing schemes. Technology expected to blur the line between the work space and the public space by supporting ad hoc meetings in various locations was showcased at UbiComp through such innovations as Intel Research's Experience UbiComp proactive display project. Participants wore radio frequency ID tags that were scanned as they approached presenters, and pertinent information about the attendees was shown on a plasma screen, which also displayed groups of users arrayed according to their personal and professional preferences. Another notable project is the University of California at San Diego's ActiveCampus: ActiveCampus Explorer, an instant messaging client for handhelds that lists chat buddies according to how far away they are from a specific user, is being tested at UCSD and Northwestern University. Young people are also drawn to public Wi-Fi hotspots and other areas where they expect an always-on Internet connection; "Wireless mobile networks loosen person-to-place relationships," observed William Mitchell of MIT. "They enable the nomadic occupation of space and create demand for multiuse space."
    Click Here to View Full Article

  • "Parsing Hype From Hope: Will ENUM Spark Changes in Telecom?"
    CircleID (10/09/03); Dixon, Rod

    Rod Dixon writes that ENUM may be the next large development in communications technology, following the development of the Internet. With ENUM, any person could use his or her telephone number to represent one contact point, as the ENUM protocol turns a phone number into a machine-readable DNS address. For example, the phone number with country code +1-555-555-7997 would become 1.5.5.5.5.5.5.7.9.9.7, then would be reversed to 7.9.9.7.5.5.5.5.5.5.1, then would take the ENUM suffix to become 7.9.9.7.5.5.5.5.5.5.1.e164.arpa. The DNS can use this last address to link caller and recipient via telephone, pager, mobile telephone, email, or personal digital assistant. Dixon explains that "in essence, your circuit-switched telephone number is assigned a corresponding Web address, or URL that enables an end user to call anyone in the world using the Internet as a telephone network regardless of whether the call recipient has access to an Internet phone or a traditional circuit-switched phone." Dixon argues that ENUM also would serve people by shifting much of the influence local phone companies now hold to end users, customers, and entrepreneurs, and would drive new developments that could result in more e-commerce products. These might include better spam detectors, Voice over IP, smart caller ID, and higher device and appliance interactivity. "In other words, ENUM could lead to the creation of a bottom-up, user-centric communications-phone system that offers services sought by end users, and likely provided by Internet or Web-based service providers rather than traditional telecom providers," writes Dixon. Digital convergence of various information technologies is still not completely accepted, but ENUM could change this, though one potential issue is that greater public control over information through digital networks raises greater concerns about privacy and other rights. Additionally, ICANN likely would manage control over ENUM services, but may not be prepared for such a huge responsibility.
    Click Here to View Full Article
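
    The number-to-address mapping Dixon describes is mechanical, and the small helper below simply reproduces the article's own example. The actual lookup would be an ordinary DNS query against the resulting name, which is not shown here.

      def enum_domain(e164_number):
          # Turn an E.164 phone number into its ENUM domain name.
          digits = [c for c in e164_number if c.isdigit()]  # drop '+', '-', spaces
          digits.reverse()                                   # least-significant digit first
          return ".".join(digits) + ".e164.arpa"

      print(enum_domain("+1-555-555-7997"))
      # -> 7.9.9.7.5.5.5.5.5.5.1.e164.arpa, matching the article's example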

  • "University Students in Mexico Promote Free Software"
    NewsForge (10/16/03); Miller, Robin

    A recent free software conference was held in the southern Mexican oil town of Villahermosa, organized by university students who are also members of the local Linux Users Group, which claims a membership of about 400 people. The gathering, though disrupted by Tropical Storm Larry, drew more than 600 people, including businessmen and professors. Most attendees were university students either from the Universidad del Valle de Mexico (UVM) in Villahermosa or from UNAM, Mexico's largest university and a hotbed of Unix research; GNOME founder Miguel de Icaza hails from UNAM as well. Many of the attendees aspired to help society through their work. A research physician who spoke at the conference claimed the United States imposed too many restrictions on biomedical research, cut funding, and focused on profitable research areas, whereas Mexican medical research is still dedicated to saving lives. Another conference attendee who had worked extensively in political advocacy said she was pursuing computer science education and open source in order to create voting machine systems that could not be tampered with. Free software's goals fit well with the socially progressive attitude of the conference attendees, since the movement puts volunteer researchers on equal footing with corporate researchers. Much of the conference was focused on introducing people to open-source software and Linux, but there were a few heavily technical sessions and the obligatory debate between a Microsoft marketer and an open-source advocate (in this case Mexican free software luminary Fernando Magarinos Lamas).
    Click Here to View Full Article

  • "Feds Take Up Arms as Computer Crime Becomes Multibillion-Dollar Problem"
    Star Tribune (10/06/03); Alexander, Steve

    Fighting computer crime is the FBI's third-highest priority, following terrorism prevention and counter-espionage. The Minnesota Cyber Crimes Task Force--the first of its kind in the United States--will pool the resources of the FBI, the Secret Service, and the U.S. attorney's office when it is up and running by year's end, although it is uncertain whether the office will become a template for national operations. Minneapolis FBI agent Paul McCabe says the task force will probe cyberattacks, Internet fraud, online theft of intellectual property, terrorist communications via the Web, online child pornography, and other types of cybercrimes. The office will consist of 10 FBI staffers, including three "computer forensic examiners" and seven investigators. Assistant U.S. attorney Paul Luehr notes that in the past the FBI and Secret Service would investigate computer crimes separately, with the former conducting probes in Pittsburgh and San Diego and the latter carrying out investigations in San Francisco and New York. Gartner Internet security VP John Pescatore, a former Secret Service agent, contends that the new task force will not work, and the only truly effective solution lies in more secure software. BearingPoint's J. Michael Gibbons thinks the cybercrime situation will not improve until new ways of engineering computer systems emerge in about 10 to 15 years. Gibbons says, "We have to re-engineer everything." Many private enterprises are scrambling to beef up their network security in the wake of recent worm attacks such as SoBig, Blaster, and Welchia, though Pescatore doubts this strategy will be effective, given the erratic nature of corporate cybersecurity spending.
    Click Here to View Full Article

  • "Let the Games Converge!"
    Technology Review (10/15/03); Hadenius, Patric

    Sweden's Zero-Game Studio and Nokia are pursuing the integration of two major computer gaming trends--massively multiplayer games and portability--to develop new gaming paradigms. Nokia's N-Gage device combines a high-end mobile phone with a game deck boasting a color screen and fast graphics, while a Bluetooth component allows people in the same vicinity to engage in multiplayer games. Though early reviews of N-Gage criticize the device as awkward and costly, computer game publishers see an opportunity to find new customers and business channels. Twelve of the world's leading game researchers have set up shop at Zero-Game Studio, a subsidiary of Sweden's Interactive Institute, to explore computer games' full potential. One of the studio's first efforts is "The Visby Game," a real-world adventure scenario that takes place in the Swedish town of Visby, which is also Zero-Game's base of operations; The Visby Game is yielding insights not only into portable gaming, but how location-based games can be used to enhance tourism. "There's a new and emerging game area where mobility and location have particular kinds of impact on the kinds of game playing that are possible," explains Zero-Game research manager Craig Lindley. "This is something that is yet to be explored in depth." Zero-Game programmer Mirjam Eladhari notes that the majority of online multiplayer games take place in a sci-fi or fantasy setting, yet many of her predictions of game development--the combination of multiple game genres, a different choice of fictional themes for game environments--have not panned out.
    Click Here to View Full Article

  • "Aiming for Fast, Universal Access, Researchers Will Rethink the Architecture of the Internet"
    Chronicle of Higher Education (10/17/03) Vol. 50, No. 8, P. A35

    Researchers at Carnegie Mellon University will lead a group of universities and research laboratories across the country in studying the feasibility of using a glass-fiber telecommunications network for the Internet. As part of the five-year "100 Megabits to 100 Million Homes" project, researchers will design and develop fiber-optic networks, test small-scale prototypes, and study the impact of bringing more reliable and faster Internet access to U.S. homes and businesses. The superfast Internet access provided by glass-fiber networks would be 100 times the speed of most DSL connections. "Since we've used copper-based telecommunications networking for so long, this is the first time in 100 years of science that we've seriously re-evaluated network architecture," says Hui Zhang, an associate professor of computer science at Carnegie Mellon and the project's principal investigator. The National Science Foundation will contribute $7.5 million toward the project, which will draw participation from computer scientists, engineers, and economists from Carnegie Mellon, Rice and Stanford Universities, the University of California at Berkeley, the Internet2 consortium, and a number of supercomputing laboratories and for-profit research centers.

  • "Rebel Network"
    New Scientist (10/11/03) Vol. 180, No. 2416, P. 26; O'Brien, Danny

    By combining mesh networking and Wi-Fi, British programmer John Anderson envisions a wireless network with practically no boundaries that provides people with cheap broadband Internet access on demand. Mesh networks are comprised of nodes, with each node serving as an intermediary for the messages of its neighbors; the scope of the network can be broadened with the addition of nodes at the edge, while its resiliency can be boosted by deploying more nodes. Yet costs and technical difficulties have restricted the development of mesh networks to the experimental phase for the most part. Simulating such networks is a formidable challenge, while network performance can be affected by the vagaries of weather, terrain, other devices, antenna orientation, the routing protocols, and the network's decentralized control scheme. Anderson, who was asked by computing consultant Geoff Jukes to help build a prototype mesh network that could be used in a village in Devon, where ADSL connections were nonexistent, modified the Ad Hoc On Demand Distance Vector routing protocol and packaged it in software that he distributed for free; he also preinstalled the software in customized computers with built-in Wi-Fi cards called MeshBoxes. Anderson's open-source strategy has given the programmer a huge resource of user feedback that could lead to system improvements, while the wide availability of Wi-Fi allows anyone to deploy and improve the mesh network. Some mesh networking specialists are skeptical about using Wi-Fi as a component of a large-scale mesh, given that Wi-Fi cannot handle contention--the interference that occurs when two nodes try to communicate at the same time--as well as Ethernet. Another potential hindrance to the adoption of Anderson's network is the wide availability of inexpensive wired broadband in suburban areas.
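
    Anderson's software builds on the Ad Hoc On Demand Distance Vector protocol mentioned above; the essence of on-demand routing, stripped of Wi-Fi, timing, and contention details, is a flooded route request that leaves reverse-path state behind at each node. The sketch below is a rough illustration of that idea over a made-up topology, not the MeshBox implementation.

      from collections import deque

      # A toy mesh: each node knows only which neighbors it can hear over the air.
      neighbors = {
          "A": ["B"], "B": ["A", "C", "D"], "C": ["B", "D"],
          "D": ["B", "C", "E"], "E": ["D"],
      }

      def discover_route(source, destination):
          # On-demand discovery: flood a request outward, remembering at each
          # node which neighbor the request arrived from (the reverse path).
          frontier = deque([source])
          came_from = {source: None}
          while frontier:
              node = frontier.popleft()
              if node == destination:
                  route = []                   # walk the reverse pointers back
                  while node is not None:
                      route.append(node)
                      node = came_from[node]
                  return list(reversed(route))
              for hop in neighbors[node]:
                  if hop not in came_from:     # each node rebroadcasts only once
                      came_from[hop] = node
                      frontier.append(hop)
          return None                          # no path: the mesh is partitioned

      print(discover_route("A", "E"))          # ['A', 'B', 'D', 'E']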

  • "Thwarting Piracy"
    eWeek (10/13/03) Vol. 20, No. 41, P. 37; Vaas, Lisa; McCright, John S.

    Owners of commercial databases hope to secure legal protection for their databases through the proposed Database and Collections of Information Misappropriation Act. Advocates of extending copyright-style protection to databases have been battling over the issue for the past seven years; an earlier version of the bill was known as the Database Protection Act. In September, the House Judiciary and Energy and Commerce Subcommittee held a hearing on the draft legislation, which avoids intellectual property issues and focuses more on data misappropriation. "We're hoping we can use it to prevent database piracy, where somebody takes somebody else's database, slaps their name on it and then goes into competition with the original database producer," says Keith Kupferschmid, who testified in favor of the bill on behalf of the Coalition Against Database Piracy. Advocates of the bill want Congress to realize that data pirates can use the Internet to steal, repurpose, and resell the information in the databases of Reed-Elsevier's LexisNexis services. However, courts often have viewed such databases as collections of facts rather than as creative works. Nonetheless, the new bill still is likely to draw the ire of academics, scientists, and businesses because it would impact the accessibility of commercial databases. Users would no longer have access to information in as timely, organized, and as comprehensive a manner as before, which opponents of the bill say would stifle innovation while strengthening existing monopolies.
    Click Here to View Full Article

  • "Ready to Ware"
    IEEE Spectrum (10/03) Vol. 40, No. 10, P. 28; Marculescu, Diana; Marculescu, Radu; Park, Sungmee

    E-textiles--electronics woven into apparel--promise to enhance the performance of battlefield commanders, firefighters, and athletes, as well as boost functionality and stylishness for average consumers. A SmartShirt or Wearable Motherboard is a cotton/polyester blend threaded with a conductive grid of optical fibers that function as data buses and power lines, conveying information from sensors to a controller, which wirelessly transmits data using Bluetooth or IEEE 802.11b. SmartShirts are being developed in multiple areas: Sensatex has created an intelligent garment designed to prevent sudden infant death syndrome (SIDS) by monitoring a baby's heartbeat, respiration, and temperature, and wirelessly alerting parents of any unexpected changes via PC, watch, or personal digital assistant. Other e-textile technologies under development or in the prototype phase include a sensor-studded carpet from Infineon and its German partners capable of motion detection and illuminating escape routes in the event of fire; an optical fiber display from France Telecom that can relay text and imagery; and International Fashion Machines' Electric Plaid wallpaper, which can vary its colors and patterns as its conductive fibers are heated or cooled. The practical and commercial success of such technologies depends on making them reliable, fault tolerant, and robust. By setting up the garment's electronic components in a network architecture, the workload can be automatically redistributed around malfunctioning nodes. Another key factor is designing e-textiles so that preprogrammed processing nodes can be reprogrammed in response to changing operating conditions. The momentum for e-textiles is expected to build from niche applications such as SIDS protection and military uses, while consumers are likely to consider privacy and safety issues when choosing such products.
    Click Here to View Full Article

 
                                                                             