
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 685:  Monday, August 23, 2004

  • "Stopping Spam at the Source"
    CNet (08/23/04); Reardon, Marguerite

    Driving up the costs of spamming is the goal of new antispam technology standards reviewed this month by the Internet Engineering Task Force (IETF). Spammers' profitability depends on sending as much spam as possible in the shortest time for the least money and effort, and antispammers argue that hindering this process could force some spammers out of business by elevating costs and shrinking profit margins. Experts say email authentication could act as such an obstacle and potentially eliminate millions of email addresses currently employed as spam fronts. Among those who submitted proposals for email authentication standards was Microsoft, whose Sender ID technology was approved by the IETF for fast-tracking; the task force also recommended that Cisco Systems and Yahoo! combine their signature-based authentication standards and resubmit them as a joint proposal. The stakes in the antispam movement have risen significantly as "phishing" attacks increase in frequency and the volume of spam expands, constituting over 65 percent of all email processed by mail servers, by Symantec's estimation. Email authentication is designed to confirm both the legitimacy of the sending IP address and the sender's trustworthiness. Microsoft's Sender ID technology is set up to ensure that the sender's return address is genuine, while DomainKeys from Yahoo! and Identified Internet Mail from Cisco propose that all outbound mail be tagged with encrypted digital signatures so that Internet servers can check and confirm inbound mail's origins. Another proposed solution is TurnTide's antispam router, which TurnTide parent company Symantec claims can reduce network spam by up to 90 percent; the router analyzes packets, estimates the probability that they originate from spammers, and restricts the amount of traffic from those sources using fundamental TCP/IP features.
    Click Here to View Full Article
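
    The path-based half of this scheme can be sketched in a few lines: the receiving server compares the connecting IP address against the list of senders the domain has published. The record table, domains, and addresses below are hypothetical stand-ins for a real DNS lookup.

```python
# Minimal sketch of path-based email authentication (the idea behind
# SPF/Sender ID): the receiving server looks up the IP addresses the
# sending domain has published and rejects mail whose connecting IP is
# not on that list. The "DNS" here is a hypothetical in-memory table.

SPF_RECORDS = {
    "example.com": {"192.0.2.10", "192.0.2.11"},   # published senders
    "bank.example": {"198.51.100.5"},
}

def authenticate_sender(envelope_from: str, connecting_ip: str) -> bool:
    """Return True if the connecting IP is authorized for the sender's domain."""
    domain = envelope_from.rsplit("@", 1)[-1]
    authorized = SPF_RECORDS.get(domain)
    if authorized is None:
        return False            # no published record: treat as unverified
    return connecting_ip in authorized

print(authenticate_sender("alice@example.com", "192.0.2.10"))   # legitimate path
print(authenticate_sender("alice@example.com", "203.0.113.7"))  # forged path
```

    The signature-based proposals (DomainKeys, Identified Internet Mail) attack the same problem from the message side rather than the connection side, which is why the IETF treated the two families separately.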

  • "Lost Votes in N.M. a Cautionary Tale"
    Washington Post (08/22/04) P. A5; Keating, Dan

    A snafu during the 2000 presidential election in which 678 votes cast on computerized voting machines in Rio Arriba County, N.M., were lost illustrates that even the best e-voting systems cannot prevent miscounts due to programming errors caused by inexperienced local election staff. Tampering has become a major concern as many states transition to e-voting systems this year, but the Rio Arriba County incident demonstrates that perfectly innocent mistakes can also undermine election outcomes. A Washington Post review of New Mexico's voting results showed that no votes were recorded in one district of Rio Arriba County despite the fact that 203 voters turned out, while two-thirds of those who voted in the month before Election Day in another district were also disenfranchised. There are three voting districts in the county, but for early voters the county employed only a single ballot listing the names of all the state legislature candidates, who differ in each district. New Mexico computer technician Steve Fresquez recalls that Rio Arriba County had just two early-voting locations, and the county attempted to program one machine to cover all three districts instead of programming separate machines at each location for each district. Although a sophisticated three-step audit process praised by the federal Election Assistance Commission as a "best practice" uncovered the programming error, the lost votes were irretrievable. MIT political scientist Steve Ansolabehere contends that errors will probably crop up when thousands of U.S. counties program ballots for multiple districts with numerous races in each election: "These offices in rural areas do not have the staff with the kind of technical expertise necessary to do electronic voting," he explains. Both critics and advocates of e-voting concur that local workers and volunteers need better training on computerized voting systems.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    For information on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "Computers Can Argue, Researcher Claims"
    NewsFactor Network (08/20/04); Martin, Mike

    University of Southampton computer science professor Nick Jennings contends that computers are capable of arguing, as well as assessing the most successful conflict resolution strategy. The researcher finds that conflict is unavoidable in systems with multiple computer agents, where each agent is focused on its own unique objective. Jennings maintains in a recently published paper that the best solution to such problems is negotiation among agents, a process that must be enabled by artificial intelligence. Agentis CTO David Kinny, who co-authored with Jennings and Michael Wooldridge the precedent-setting "Gaia" paper on agent-oriented analysis and design, concurs with the University of Southampton researcher that agent technology is a promising and vital element. "Negotiation techniques are crucial in open-agent systems, where agents representing different individuals or organizations interact--as well as in any systems where agents have conflicting goals or information," he says. Agents' potential role in e-commerce is one argument for the necessity of negotiation-based conflict resolution, although Kinny notes that negotiation is only one component in a much larger architecture. He explains that creating practical artificial agents requires "the automation of a broad range of decision-making behaviors, not only those concerned with reaching agreements or solving conflicts." Jennings insists that argumentation is only a last-resort strategy for agents.
    Click Here to View Full Article

  • "Legal Victory for File Sharing"
    Los Angeles Times (08/20/04); Healey, Jon

    In a major defeat for the entertainment industry, a three-judge panel of the 9th Circuit Court of Appeals ruled Aug. 19 that the companies that distribute Grokster and Morpheus file-sharing software did not violate copyright law, even if their customers did. This decision came in a three-year-old lawsuit filed against Grokster and StreamCast Networks by the seven leading movie studios and the five top record labels and major music publishers. The court cited the precedent-setting 1984 Sony Betamax ruling, which shields products with significant legitimate uses from copyright lawsuits. Judge Sidney R. Thomas with the appeals panel wrote that file-sharing software distributors cannot be held accountable for merely knowing that their users are pirating copyrighted works; they must possess knowledge about specific acts of piracy in time to halt them, and neither StreamCast nor Grokster had such knowledge, the court found. The court also found that, in keeping with the Betamax decision, StreamCast and Grokster's software has legitimate applications, such as for the distribution of works in the public domain, and as an inexpensive tool artists can use to publicize their works. Thomas opined that the entertainment industry would adapt to file sharing in a way that benefits both sides, arguing, "History has shown that time and market forces often provide equilibrium in balancing interests." Meanwhile, industry representatives such as Motion Picture Association of America President Jack Valenti and Recording Industry Association of America CEO Mitch Bainwol vowed that copyright holders will continue to fight digital infringement and lobby Congress to toughen copyright laws. Adam Eisgrau of P2P United warned lawmakers that acting rashly in the entertainment industry's favor could ultimately choke technological innovation and hurt the economy.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Smart Tiles Add Reality to Virtual Worlds"
    New Scientist (08/18/04); Knight, Will

    Japanese researchers at the University of Tsukuba and ATR Media Information Research Labs have developed CirculaFloor, a prototype intelligent floor tile system that enables people to travel within virtual environments while staying in one place. Current virtual reality systems are less realistic and can even induce dizziness because users remain stationary while they navigate via a joystick or some other controller. Imparting the sensation of walking in a virtual environment is critical "both for a general sense of presence, but also for a number of applications which depend critically on providing a realistic sensation of locomotion," notes the University of Toronto's Paul Milgram. CirculaFloor, which was highlighted at the recent Siggraph 2004 conference, features four square floor tiles individually outfitted with motors and wheels so they can move in any direction, using magnetic sensors to determine the walker's course; the tiles' motion counterbalances the walker's movement. The walker wears a virtual reality headset to perceive the artificial world he or she is interacting with. CirculaFloor's creators maintain that the invention has advantages over the conveyor belt technique in the areas of portability and compactness. University College London virtual reality expert Alasdair Turner thinks CirculaFloor would be an important asset in an experiment to determine the route choices made by pedestrians. One of CirculaFloor's drawbacks is the low speed at which the system operates, and Turner believes increasing the speed is a challenging prospect.
    Click Here to View Full Article
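
    The counterbalancing principle can be sketched as a toy control loop; the stride lengths and the perfect-cancellation assumption below are illustrative, not measurements from the actual system.

```python
# Toy sketch of CirculaFloor's counterbalancing principle: the tiles move
# opposite to the walker's stride, so the walker's position in the room
# stays near the origin while position in the virtual world advances.

def step(room_pos, virtual_pos, stride):
    tile_motion = -stride               # tiles cancel the stride
    room_pos += stride + tile_motion    # net displacement in the room: zero
    virtual_pos += stride               # full displacement in the VR world
    return room_pos, virtual_pos

room, virtual = 0.0, 0.0
for s in [0.5, 0.5, 0.6]:               # three forward strides (meters)
    room, virtual = step(room, virtual, s)

print(room, virtual)   # walker stayed put; virtual travel accumulated
```

    The engineering difficulty Turner alludes to is making the real tiles execute `tile_motion` quickly enough to keep up with a natural walking pace.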

  • "Building Peer-to-Peer Applications"
    IST Results (08/20/04)

    The Information Society Technologies program-funded P2P Architect project offers an open source architectural platform for peer-to-peer (P2P) application development. Project coordinator Simela Topouzidou explains that the platform foregoes a client-server model in favor of a semi-centralized P2P model. "We have built the environment that integrates the various tools into the environment, and prototypes of these can be downloaded from the project Web site," she says. The development of P2P Architect involved the participation of two academic institutions and two tech providers, and Topouzidou notes that the architecture was built in two distinct environments so that applications could be devised for the project's two users, the TOP AD publishing house and Siceas, an Italian bookshop, ticketing, and box office services provider. The TOP AD deployment speeds up the publication of the Kid's Fun magazine by providing an environment that allows the same document to be accessed, viewed, and edited simultaneously by multiple users, who can also track each other's activities concurrently. Siceas, meanwhile, employs a network with 100-plus servers and needs to supply reliable round-the-clock service. The results of the P2P Architect pilots are publicly accessible via an open source agreement, and Topouzidou says the consortium believes that consultancy and support services, rather than the software, will drive revenue. "We anticipate that there will be a community of interested people, and who will contribute to the software's development, enhancement and revision," she says.
    Click Here to View Full Article

  • "Diverse Sciences Propel Bioinformatics"
    eWeek (08/20/04); Tenenbaum, Jessica D.

    Whereas two years ago the Computational Systems Bioinformatics (CSB) Conference was considered a forum for experts in biology and computer science, this year the event has expanded to include chemistry, physics, mathematics, engineering, and medicine. In a keynote speech at this year's CSB event, UC Berkeley professor Eugene Myers explained that the obvious bioinformatics research approach has reached its limits, and future breakthroughs require more cross-disciplinary conferences and researchers. He cited the human genome project as a case in point: Although the numbers of gene sequences archived in public databases have risen exponentially in the last decade, this does not necessarily guarantee a greater understanding of genomics, while the acceleration of experimentation only increases the risk for error. The interdisciplinary nature of bioinformatics was spotlighted at CSB, with discussion topics ranging from fractal geometry to high-throughput microscopy to data mining to text processing to artificial intelligence. Stanford University's Eran Segal demonstrated a technique that blends computer science, expression profiling, statistics, and protein signaling. The first step involves modeling genes and proteins as nodes on a graph, while edges between the nodes symbolize hypothetical interactions within a biological system. Results from DNA microarray experiments are then plugged in as a data set for Bayesian networks, which enables the researchers to computationally clarify the interactions in order to determine the cross-regulatory architecture between genes and proteins. Harvard University's Steve Wong, meanwhile, discussed how digital microscopy and robotic automation could be employed to analyze thousands of cells concurrently for high-throughput drug screening and other tasks.
    Click Here to View Full Article
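
    As a rough illustration of the graph-building step described above (not Segal's actual Bayesian-network method), the sketch below proposes an edge between two gene nodes when their expression profiles are strongly correlated; the expression values and threshold are invented.

```python
# Genes as nodes, with a candidate edge proposed when two expression
# profiles are strongly correlated. (The real approach trains Bayesian
# networks on microarray data; this correlation screen only illustrates
# how hypothetical interactions become edges on a graph.)
from math import sqrt

expression = {                        # hypothetical per-gene expression levels
    "geneA": [1.0, 2.1, 3.0, 4.2],
    "geneB": [0.9, 2.0, 3.1, 4.0],    # tracks geneA closely
    "geneC": [4.0, 1.0, 3.5, 0.5],    # unrelated profile
}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

genes = sorted(expression)
edges = [(g, h) for i, g in enumerate(genes) for h in genes[i + 1:]
         if abs(pearson(expression[g], expression[h])) > 0.95]
print(edges)
```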

  • "Where Computers Go to Die"
    Salon.com (08/17/04); Mieszkowski, Katharine

    California's new e-waste legislation will require electronics buyers to pay $6 to $10 upfront for recycling costs when they purchase a new computer monitor, flat panel display, or television, a solution that has left electronics recycling advocates only partially satisfied. Critics of the new plan say it lacks funding and planning, and that putting the responsibility on the shoulders of consumers and the public sector ensures inefficiency. Other programs, such as those adopted by the European Union and the state of Maine, would instead make manufacturers responsible for the collection and recycling of their own products after they are no longer wanted: Those shared-responsibility plans will translate into product designs that are more easily recyclable and efficient processes, and have the backing of at least one computer maker, Hewlett-Packard. Unlike IBM and Apple, Hewlett-Packard supported the Maine e-waste law, which will go into effect in 2006, because it means the company can recover its materials and has more incentive to design efficiently, says environmental strategies and sustainability director David Lear. Hewlett-Packard and Dell are currently offering recycling programs to their customers in order to test demand for those services. The number of consumers actually willing to participate in any recycling program is a big unknown, but electronics recycling advocates say that anything less than 100 percent participation is dangerous to the environment. The bottle and can recycling program that California's e-waste law was modeled on currently nets about 60 percent of all recyclable bottles and cans, but the percentage of recyclable electronics is likely to be less than that, says Californians Against Waste executive director Mark Murray. Without a state law, about 20 percent of all targeted electronics in California are being recycled.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)

  • "OSDL Introduces Improved Linux Kernel Development"
    Computerworld (08/19/04); Weiss, Todd R.

    The Open Source Development Lab (OSDL) consortium announced the release of an upgraded version of the Scalable Test Platform (STP) used for Linux kernel development on Aug. 19. OSDL's mission is to boost the adoption of Linux worldwide. Version 3 of the STP boasts new features to enhance simulations of enterprise data centers on the Linux kernel; developers will be able to model how different workloads affect the most popular open-source databases, such as MySQL and PostgreSQL. STP supplies test suites that developers can employ to confirm code changes in the kernel through automatic performance tracking and stability quantification, and qualified developers can download the tool for free off the OSDL Web site. OSDL reported that STP enables enterprises to run stability and performance tests on patches and company-specific Linux systems before production implementation. "A mandate from our members and the development community is to make OSDL a leader in Linux testing," stated OSDL CTO Timothy Witham. "That's why we created STP and why we invested more than $15 million in our data center testing systems and make them available to developers around the world." During the migration from development Kernel 2.5 to the production 2.6 release, STP was used to guarantee stability in Linux by conducting over 1,000 kernel tests per month, on average; more than 10,000 additional tests on the 2.6 production kernel have been carried out since then.
    Click Here to View Full Article

  • "Living Well"
    Taipei Times (08/18/04) P. 15

    Computer makers and research groups are working to create tomorrow's digital home environments, tying together many of the technologies and components that are already available today. The Swiss government's FutureLife project is working out possible kinks and using a real-life Swiss family home as a test environment, which has meant at least one evening spent in below-freezing temperatures for the residents who could not get the door's biometric sensor to read their fingerprints. The technology was taken from the banking industry, where it is usually deployed in temperature-stable environments, explains FutureLife's Beat Schertenleib. But the FutureLife test family has spoken well of other benefits, including automatic lawn watering and text-message alerts when a digital drop box receives post-delivered packages. Other European research groups are working on similar projects, including the inHaus and HomeLab projects in Germany: The Fraunhofer Institute's inHaus project makes use of components and devices already available commercially, ranging from energy conservation technology to Internet-controlled devices; one of the main focuses for the inHaus effort involves integrating these technologies and creating simple, easy-to-use interfaces. Sony and Philips are both working on wirelessly networked home entertainment schemes that could form the foundation for more pervasive home networks. These systems rely heavily on computer technology, including the WLAN standard and network servers that archive, serve up, and route digital content to different devices. Both companies are sticking to non-proprietary standards in order to appeal to the widest number of users.
    Click Here to View Full Article

  • "DNA Technique Protects Against 'Evil' Emails"
    New Scientist (08/19/04); O'Brien, Danny

    The bioinformatics research group at IBM's Thomas J. Watson Research Center has adapted an algorithm originally developed to analyze DNA to weed out spam emails. The Teiresias algorithm was designed to look for recurring patterns in different DNA and amino acid sequences indicative of important genetic structures. When fed 65,000 examples of known spam, Teiresias, now renamed Chung-Kwei, treated each email as a DNA-like chain of characters, and spotted 6 million patterns, such as "Viagra," that represented a common sequence of letters and digits that showed up in more than one spam message. The IBM researchers then fed a collection of known non-spam to the algorithm, and eliminated the patterns that manifested in both groups. Chung-Kwei scores incoming email according to the number of spam patterns it contains, and the algorithm correctly identified 64,665 of 66,697 test messages as spam. Furthermore, Chung-Kwei misidentified only one out of 6,000 genuine emails as spam. The algorithm's embedded tolerance for different yet functionally equivalent DNA sequences allows it to deal with popular techniques spammers use to circumvent pattern-recognition schemes, such as replacing letters with symbols. "What is exciting is not the particular algorithm, but the fact that IBM has shown there is the entire field of bioinformatics techniques to explore in the fight against spam," notes SpamAssassin developer Justin Mason. IBM plans to incorporate Chung-Kwei into its commercial SpamGuru product.
    Click Here to View Full Article
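
    The pattern-mining idea can be sketched as follows. The real Teiresias algorithm discovers variable-length patterns with wildcards; this simplification uses fixed-length substrings, and the toy corpora are invented.

```python
# Sketch of the Chung-Kwei scoring idea: mine character patterns that
# recur across known spam, discard any that also occur in legitimate
# mail, then score new messages by how many surviving spam patterns
# they contain.
from collections import Counter

def substrings(text, k=6):
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def mine_patterns(spam, ham, k=6, min_support=2):
    counts = Counter()
    for msg in spam:
        counts.update(substrings(msg.lower(), k))
    spam_patterns = {p for p, c in counts.items() if c >= min_support}
    ham_patterns = set().union(*(substrings(m.lower(), k) for m in ham))
    return spam_patterns - ham_patterns    # drop patterns seen in real mail

def score(msg, patterns, k=6):
    return len(substrings(msg.lower(), k) & patterns)

spam = ["cheap viagra online now", "buy viagra cheap today"]
ham = ["meeting notes for today", "cheap flights for the team offsite"]
patterns = mine_patterns(spam, ham)
print(score("get viagra cheap", patterns) > 0,      # spam-like
      score("lunch meeting at noon", patterns) > 0)  # clean
```

    The ham-subtraction step is what kept Chung-Kwei's false-positive rate so low: a pattern only counts against a message if it never appeared in legitimate mail.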

  • "The Outback, Soccer Robots, & Future Engineers"
    Design News (08/18/04); Field, Karen Auguston

    About 175 middle and high school teachers from around the world have gathered in Austin, Texas, for NIWeek, dedicated to ROBOLAB, a robot development kit combining LEGO pieces and a version of NI's LabVIEW graphical development software that has spread globally since its introduction 14 years ago. One reason for the kit's popularity is the RoboCup Junior International Competition, which this year drew 20,000 students from 29 countries to Lisbon this past July to compete in a soccer-like tournament with robots designed by teams of four or five students each. Australian educator Brian Thomas recounted at NIWeek how a tiny middle school in the Outback with only 10 students made it to this year's competition. In all, Australia fielded some 1,000 teams for the event. Australia, like the United States, faces a shortage of engineering graduates to fill technical needs, and Thomas sees the competition as a way to draw youth to science and technology fields. "For a lot of kids, math and science courses are dull and uninspiring," he says. "We're finding RoboCup is turning them around, even getting them thinking about pursuing a degree in engineering."
    Click Here to View Full Article

  • "Face Synthesis Technology Makes Waves"
    Computerworld Singapore (08/24/04) Vol. 10, No. 26; Sze, Tan Ee

    XID Technologies CTO Dr. Roberto Mariani has been nominated for this year's World Technology Awards (WTA) for his development of an adaptive face recognition technology that uses face synthesis. Face synthesis is a process in which numerical values are applied to specific points on a face in order to generate a feature vector that is placed in a database: Lighting, facial rotation, facial hair, and other various conditions are synthesized to produce many faces from a single 2D facial image, and then feature vectors are extrapolated for each synthesized face and stored in the database. Every time someone goes through the entry channel, that person's image is compared to the database of synthesized feature vectors. XID is marketing the technology in Singapore as the XID SmartID system, which has been implemented in Kaki Bukit's Immigrant Workers Dormitory; roughly 6,000 employees use the system to gain access across 16 channels of entry. XID's face synthesis technology is the recipient of 2002's Singapore Defense Technology Prize Award and an award for Technological Innovation in Biometrics at Biometrics 2002, and was also a finalist for the Global Entrepolis @ Singapore Awards at 2003's Asian Innovation Awards. Mariani is sharing the WTA nomination with I2R lead scientist Dr. Tele Tan. World Technology Network founder and Chairman James Clark explains that WTA nominees are elected by current World Technology Network members who are tasked with singling out "above all others around the globe the ones they think are doing the work that will have the greatest 'ripple effect' over time."
    Click Here to View Full Article

  • "Internationalizing the Web"
    Information Today (08/04) Vol. 21, No. 7, P. 15; Peek, Robin

    The name "World Wide Web" implies that the global network is international at its core, but to the World Wide Web Consortium (W3C), internationalization means making content more in line with the cultural and linguistic characteristics of its various users. An internationalized Web would accommodate the unique symbols, punctuation, alphabets, and text directionality of the entire global audience. Aware that the Web was built around the ISO 8859-1 character set, an extension of ASCII (the American Standard Code for Information Interchange) that supports only Western European languages, the W3C created the i18n activity group in 1995 to study the issue of internationalization. Internationalization would be difficult in HTML, but XML and XHTML offer more flexible solutions. In May 2004, i18n released three draft documents that suggest the Web is ready to embrace internationalization. "Ignoring this advice in this document or relegating it to a later phase in the development will only add unnecessary costs and resource issues at a later date," concludes one document. Though making content internationally friendly would not be too difficult for navigation schemes or static data such as an order form, the same cannot be said for more colorful content. Internationalization of content raises questions of practicality, and may be as much a matter of social engineering as of technological advancement.
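
    The limitation at issue is easy to demonstrate: ISO 8859-1 can represent Western European text but not most of the world's scripts, which is why internationalized content needs a Unicode encoding such as UTF-8. The sample strings below are arbitrary.

```python
# ISO 8859-1 (Latin-1) covers Western European text only; anything
# outside that repertoire fails to encode and needs Unicode (UTF-8).
samples = ["café au lait", "Ελληνικά", "日本語", "עברית"]

for text in samples:
    try:
        text.encode("iso-8859-1")
        fits = True
    except UnicodeEncodeError:
        fits = False
    print(f"{text!r}: latin-1={fits}, utf-8 bytes={len(text.encode('utf-8'))}")
```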

  • "PCAST Will Advise Bush on Nanotech"
    Small Times (08/01/04) Vol. 4, No. 4, P. 8; Gruenwald, Juliana

    The President's Council of Advisors on Science and Technology (PCAST) will serve as the presidential advisory panel on nanotechnology-related issues, as stipulated in the nanotechnology authorization bill that President Bush signed into law last December. The law calls for the creation of a National Nanotechnology Advisory Panel with participants from academia and industry, but PCAST, which was created in 1990, will fulfill the role, says John Marburger, chief science adviser of the Bush administration. PCAST already advises presidents on a wide range of technology and scientific research matters, and Co-Chairman E. Floyd Kvamme believes it is logical for the group to assume the largely oversight role of addressing nanotechnology issues. Moreover, creating a panel from scratch would have left the committee with little time to submit a report on the oversight of the federal nanotechnology policy to Congress by the end of the year. Dell CEO Michael Dell and MIT President Charles Vest are among the prominent members of PCAST. Although PCAST does not have any big names involved in nanotechnology, the group has a subcommittee devoted to nanotechnology. The subcommittee includes nanotechnology experts such as Richard Smalley of Rice University, George Whitesides of Harvard University, and Stan Williams of Hewlett-Packard. Kvamme says he likes the direction of the program so far.
    Click Here to View Full Article

  • "Show Us Your ID"
    Government Technology (07/04) Vol. 17, No. 7, P. 40; Newcombe, Tod

    Growth in multi-party, Web-based applications involves identifying users, and the security assertion markup language (SAML) can help. Counties began using imaging technology to archive land documents in the mid-1990s, but title and mortgage companies still deliver them in paper form because there has been no manageable way to electronically identify all the parties involved in the recording and exchange of such documents. Last year, however, a group of counties, lenders, and title companies created a method, known as the County Land Document Recordation Exchange, to identify parties involved in the electronic exchange and recording of title documents. With the new method, electronic title recordings can take place in minutes instead of weeks. The SAML standard, which lets users log onto one Web site and be recognized by an affiliated organization, enabled this system. "The problem with identity management in the past has been that data and people are spread all over," explains Oblix director Rick Caccia, and state and local governments lacked a cost-effective way to permit single sign-on and transactions across multiple agencies. The Liberty Alliance Project has been the main force behind the growth of federated identity management and the use of SAML, and has developed federated identity management specifications from a number of emerging standards. Caccia says that almost 30 percent of his company's business is with government; the General Services Administration is running tests through its E-Authentication Initiative to demonstrate to government agencies the interoperability of SAML-based solutions. Experts say that federated identity management's potential is enormous at the state and local government level.
    Click Here to View Full Article
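
    The single-sign-on idea can be sketched as a signed assertion handed from an identity provider to an affiliated site. Real SAML uses XML assertions and public-key signatures; the HMAC shared secret and JSON payload below are simplifications for illustration.

```python
# Minimal sketch of federated identity: an identity provider that has
# authenticated a user issues a signed assertion, and an affiliated
# site accepts the assertion instead of running its own login.
import hashlib
import hmac
import json

SHARED_SECRET = b"idp-and-sp-agree-on-this"   # hypothetical trust setup

def issue_assertion(user, issuer="county-idp"):
    payload = json.dumps({"subject": user, "issuer": issuer}, sort_keys=True)
    sig = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload, sig

def accept_assertion(payload, sig):
    expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

payload, sig = issue_assertion("title-clerk-42")
print(accept_assertion(payload, sig))                       # affiliated site trusts it
print(accept_assertion(payload.replace("county", "x"), sig))  # tampered assertion
```

    This is the core of what the land-records exchange needed: one login at the identity provider, recognized by every affiliated county, lender, and title company.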

  • "Don't Just Break Software. Make Software."
    Better Software (08/04) Vol. 6, No. 6, P. 18; Reppert, Tracy

    Harsh criticism has been leveled against using storytest-driven development (STDD) for creating software systems, despite testimony from advocates that it yields benefits for all parties involved: Industrial Logic XP coach Somik Raha explains that STDD boosts the confidence of customers, prevents programmers from being blamed for building unacceptable systems, and removes the stigma programmers tend to place on quality assurance. STDD convenes customers, developers, and testers before coding begins; they cooperate to identify a particular "story," or piece of functionality, to focus on, after which customers and testers specify the criteria that confirm the story works and build an executable document that any team member can access. The traditional acceptance testing scheme has developers build the whole system, declare it complete, and then submit it for acceptance testing, a paperwork-heavy process that happens so late in the development cycle that any defects are costly to remedy. STDD distills the customer's expectations into an executable and readable contract that programmers must satisfy if they wish to claim a functioning system. Raha notes that STDD provides a way to quickly iron out conflicts and lower the risk of building the wrong system. Many people resist adopting STDD despite its advantages, possibly because shifting to the STDD process is such a major change. "It was difficult to trust the process in the beginning, but it's so much better than what we used to do," says Nielsen Media Research senior project manager Pam Smoot. Ironically, the rest of the organization often takes the success of STDD practitioners as an insult, and the successful integration of STDD requires the consideration of such idiosyncratic behavior.
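
    A storytest can be as simple as a readable table of expected behaviors that executes against the code. The story below (a shipping-cost rule) and its numbers are invented to show the shape of the practice, not taken from the article.

```python
# Sketch of a storytest: customers and testers write expected behavior
# as a readable table *before* coding, and the table is executable, so
# "done" means the table passes.

def shipping_cost(order_total):        # the functionality under development
    return 0.0 if order_total >= 50 else 4.95

# Storytest: "orders of $50 or more ship free" -- (input, expected) rows
STORYTEST = [
    (49.99, 4.95),
    (50.00, 0.00),
    (120.0, 0.00),
]

def run_storytest(table):
    failures = [(t, exp, shipping_cost(t)) for t, exp in table
                if shipping_cost(t) != exp]
    return failures                    # an empty list means the story works

print(run_storytest(STORYTEST))
```

    Because the table reads as plainly as the requirement it encodes, it serves as the "executable and readable contract" between customer and programmer.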

  • "Wireless Grids"
    Internet Computing (08/04) Vol. 8, No. 24, P. 24; McKnight, Lee W.; Bradner, Scott; Howison, James

    Wireless grids link mobile, nomadic, and fixed wireless devices such as sensors and mobile phones with wired grids and each other, and these devices can deliver new resources, locations of use, and institutional ownership and control patterns for grid computing via ad hoc distributed resource sharing. Wireless grid applications are divided into three classes that are not mutually exclusive: Those that amass information from the range of input/output interfaces found in nomadic devices, those that leverage the locations and contexts in which the devices exist, and those that leverage the mesh network capabilities of collections of nomadic devices. The wireless grid outlined by the authors draws on three related computing frameworks (grid computing, peer-to-peer computing, and Web services), while the wireless grid infrastructure is split into three categories--physical-layer technologies and policy, network infrastructure requirements, and enabling middleware. The community dedicated to the first category is pushing to supplant the current FCC licensing model with rules-based public access to the airwaves, while researchers focused on the networking layer are investigating power efficiency and coverage in wireless networks. The middleware research community has thus far identified a quintet of abstract needs for ad hoc resource sharing: Resource description, resource discovery, coordination, trust establishment, and clearing. As a demonstration of the potential of wireless grids and distributed ad hoc resource sharing to tap the combined capability of mobile devices in social contexts outside the expected computing environments, Syracuse University's Distributed Ad Hoc Resource Coordination (DARC*) project aims to build a system that allows devices with no previous knowledge of each other to collectively record and mix an audio signal.
    Click Here to View Full Article
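
    Two of the five middleware needs listed above, resource description and resource discovery, can be sketched as devices advertising small capability records that peers query without prior knowledge of one another; the device names and record format are invented.

```python
# Sketch of ad hoc resource description and discovery on a wireless
# grid: each device advertises a small capability record, and a peer
# discovers matching resources with no prior knowledge of the others.

advertisements = [
    {"device": "phone-a", "resource": "microphone", "battery": 0.80},
    {"device": "sensor-7", "resource": "temperature", "battery": 0.35},
    {"device": "laptop-b", "resource": "microphone", "battery": 0.95},
]

def discover(resource, min_battery=0.5):
    """Find peers offering a resource, skipping low-battery devices."""
    return [ad["device"] for ad in advertisements
            if ad["resource"] == resource and ad["battery"] >= min_battery]

print(discover("microphone"))   # candidate recorders for an audio task
```

    A DARC*-style application would then coordinate the discovered microphones to record and mix a shared audio signal, which is where the remaining needs (coordination, trust establishment, and clearing) come in.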


 

 