Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week: Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 744: Friday, January 21, 2005

  • "Another H-1B Battle Coming?"
    CNet (01/20/05); Frauenheim, Ed

    An omnibus bill signed by President Bush last month expands the H-1B visa program's current annual cap of 65,000 workers by granting an exemption for up to 20,000 foreigners earning graduate degrees at U.S. schools, a move critics oppose. Vin O'Neill of the U.S. branch of the Institute of Electrical and Electronics Engineers (IEEE) argues that cap exemptions could bring in over 40,000 additional guest workers this year, warning that "The greater the supply, the more leverage the employer has to drive down wages." IEEE-USA defends the lower visa cap by citing Labor Department data indicating that the average number of unemployed American workers in such high-tech job categories as database administrators and computer programmers fell significantly last year. H-1B supporters counter that exemptions are necessary, given that this year's cap was reached on the first day of the current fiscal year; indeed, some backers suggest that raising or eliminating the visa ceiling may be a sensible move. Information Technology Association of America President Harris Miller says the latest cap exemption will help keep talented foreign workers in the United States, and that guest workers can help U.S. companies sell products in foreign markets. Meanwhile, Rep. Bill Pascrell (D-N.J.) recently hinted at plans to introduce legislation requiring all employers applying for H-1B visas to look for qualified domestic employees first. One recent revision to the H-1B program's rules supposedly requires companies to attest that American workers will not be displaced by visa holders, but the fine print limits this requirement to H-1B-dependent employers and companies found guilty of a "willful failure or misrepresentation" in the previous five years.
    Click Here to View Full Article

  • "The Emerging World of Wireless Sensor Networks"
    InformationWeek (01/20/05); Ricadela, Aaron

    Researchers see the future of networking in wireless sensor technology that will enable the Internet to sense real-world environmental factors such as air temperature, equipment vibrations, or abnormal noise indicators. Though many of the talked-about applications seem somewhat prosaic, wireless sensor technology could be the enabler for the next killer app, says Palo Alto Research Center computer science lab manager Teresa Lunt. The applications people have already come up with for wireless sensor networks replace existing systems with solutions that are easier to deploy, but Lunt says there is still serious technical work to be done, such as balancing processing, power, scalability, and bandwidth requirements; ad-hoc networking so far has been unable to scale to high numbers because of bandwidth issues, since more nodes on the network mean less available bandwidth. Sensor networks are likely to be first deployed in the military, which is always interested in increased sensing capabilities. The Defense Advanced Research Projects Agency is currently investigating sensor technology for use in autonomous airborne and ground vehicles, while other potential applications include in-vehicle sensors that let cars communicate with one another on the highway or prevent accident or injury by sensing dangerous situations. Software will play a critical role in adding value to wireless sensor networks, especially because of the in-network processing issues, says Intel Research associate director Hans Mulder. Intel is working with SAP, Rockwell Automation, and Honeywell to endow sensor networks with improved software capabilities that will add value and make the network more efficient. Mulder says it is too early to tell whether wireless sensor nodes will be a market for Intel, but he says there are services opportunities and important expertise to be gained.
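    Lunt's observation that more nodes mean less available bandwidth per node can be made concrete. The article gives no figures, so the sketch below leans on a standard result not cited in the piece (Gupta and Kumar's analysis that per-node throughput in a shared-medium ad hoc network shrinks roughly as 1/sqrt(n log n)); the constants are arbitrary and purely illustrative.

```python
import math

def per_node_throughput(n, link_capacity=1.0):
    """Illustrative per-node throughput for n nodes sharing an ad hoc
    wireless medium, following the Gupta-Kumar scaling law
    Theta(W / sqrt(n log n)). Units and constants are arbitrary."""
    if n < 2:
        return link_capacity  # a lone node keeps the whole channel
    return link_capacity / math.sqrt(n * math.log(n))

# More nodes on the network -> less usable bandwidth for each node
for n in (10, 100, 1000):
    print(n, round(per_node_throughput(n), 4))
```

Even under this idealized model, growing a network from 10 to 1,000 nodes cuts each node's share of the channel by more than an order of magnitude, which is why scaling sensor networks is a serious engineering problem rather than a deployment detail.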
    Click Here to View Full Article

  • "Celebration of Diversity in Computing Conference Seeks Submissions"
    HPC Wire (01/19/05)

    The 2005 Richard Tapia Celebration of Diversity in Computing Conference, which is scheduled to take place in Albuquerque in October, has put out a call for submissions for papers, panels, workshops, posters, and birds-of-a-feather sessions. This year's conference is themed "A Diversity of Scholars--a Tapestry of Discovery," and researchers are invited to submit abstracts of proposed papers covering collaborative and emerging technologies, multidisciplinary activities in science, and computational mathematics and science. Proposals for panels are recommended to deal with technical areas or issues relating to boosting diversity in the computing field, while workshop proposals should also cover such issues. "The topic of 'computing' conjures up different ideas in different people, and we hope to tap into that diversity when evaluating submissions and designing the conference program," notes Sandia National Laboratories' Pamela Williams, who serves as the conference's general chair. "Our goal is to bring together leading researchers from around the world to give presentations on state-of-the-art topics in the diverse fields of computing." Tapia 2005, whose sponsors include the ACM, the Computing Research Association, and the IEEE Computer Society, is designed to provide a networking environment for minorities throughout the wide spectrum of computing and information technology. Institutions that have committed to support the conference include Sandia and the University of Illinois at Urbana-Champaign. The conference Web site at www.ncsa.uiuc.edu/Conferences/Tapia2005/cfp.html offers details about electronically submitting proposals.
    Click Here to View Full Article
    For more information on the Coalition to Diversify Computing, visit http://www.ncsa.uiuc.edu/Outreach/CDC/.

  • "New Copyright Protection Bills Likely in 2005"
    IDG News Service (01/19/05); Gross, Grant

    Legislative agendas this year will continue the wrangling over software piracy, peer-to-peer file trading, and other issues that have pitted content owners, technology vendors, and consumer groups against one another. Although Sen. Orrin Hatch's (R-Utah) Induce Act is unlikely to be resurrected given the broad opposition it gathered last year, the powerful Hollywood and music industry lobbies that supported the effort are unlikely to relent, says P2P United executive director Adam Eisgrau. Copyright interests want to lower the legal bar in order to make their lawsuits more effective. The U.S. Supreme Court is expected to rule in mid-2005 on a case against peer-to-peer software firms Grokster, StreamCast Networks, and MusicCity.com, after lower courts ruled those companies were not liable for illegal uses of their products. The battle over the Induce Act showed significant support for legal protections of consumer media rights, says Rep. Rick Boucher (D-Va.), whose Digital Media Consumers' Rights Act was stifled in the House Energy and Commerce Committee last year; new committee Chairman Rep. Joe Barton (R-Texas) was a co-sponsor of the Boucher bill, and Boucher says he will introduce similar legislation this year. The Business Software Alliance (BSA) is striking a delicate position after many of its technology industry members opposed the Induce Act for fear it would cramp innovation; the BSA wants law enforcement to apply current laws more strictly. Congress is also likely to investigate whether ISPs need to do more to facilitate law enforcement efforts by handing over subscriber information. Other technology issues on the congressional agenda include spyware penalties and the possible inclusion of IT security requirements in corporate accountability laws such as the Sarbanes-Oxley Act.
    Click Here to View Full Article

  • "Dead Electronics Going to Waste"
    Washington Post (01/21/05) P. A4; Eilperin, Juliet

    The International Association of Electronics Recyclers estimates that Americans discard 2 million tons of electronic products annually, and the number of discarded devices is expected to skyrocket to 400 million by the end of the current decade. Despite the growing threat of e-waste at home and abroad, attempts to curb the problem are spotty and lack a unified framework. The EPA released non-binding e-waste management guidelines last March, and has supported electronics recycling pilots from retailers such as Staples, which takes back products regardless of where they were bought. EPA's Thomas Dunne authorized the formulation of a wide-ranging, voluntary e-waste recycling plan last month. Though federal officials have worked with industry for years to come up with a comprehensive national strategy for e-waste recycling and regulation, a lack of consensus among manufacturers over who should foot the bill is a major impediment. Electronics makers are divided on whether to pay for recycling themselves or add the recycling fee to the purchase price of their products. Some manufacturers have taken steps to dispose of old products more conscientiously: Hewlett-Packard, for instance, has promised to no longer export e-waste for disposal in other countries, and will recycle any PC for a fee using an environmentally sensitive recycling service. Federal officials say they are taking an incremental approach to managing the government's contribution to the e-waste problem, but Ted Smith of the Silicon Valley Toxics Coalition says their efforts should be more comparable to the European Union's mandate, which phases out hazardous materials in electronic products and requires manufacturers to set up systems for recycling dangerous products when they become obsolete.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Artificial Intelligence Alive and Well in a Robot Named Maria"
    Scoop (NZ) (01/19/05)

    Maria is a virtual robot created by Ph.D. student Shahin Maghsoudi and Dr. Tiru Arthanari of the University of Auckland that can interact with people using a 203,000-word vocabulary supplemented by 118,000 logical inference rules and 106,000 rules of grammar. "When humans interact, they rely on their mutual understanding of a huge body of shared knowledge," Maghsoudi explains. "On a daily basis, we add to our personal database of knowledge stored in our brain." He says such a database must be programmed into a robot in order for it to interact with people in a way that feels natural. Arthanari brought his knowledge of statistical inference methods to the project so that the end product could be used as an electronic "assistant" that can answer multiple student questions concurrently and conversationally, 24/7; he created a specialized database that includes subject expertise, knowledge of where to refer students for ancillary data, and content relating to effective teaching principles. A key focus of the project is building trust between the robot and students, says Arthanari, who adds that "The real success will be when students feel that interacting with Maria is just like going to a real teacher who apart from the subject matter knows a little about the students as well." To this end, a portion of the database is devoted to composing profiles of students from keyboard input or a form the student fills out when first interacting with the robot. Maghsoudi, whose graduate work involves developing virtual robots that can be employed as help desk operators, teaching aides, and Web-based marketing assistants, is currently concentrating on giving robots the ability to make common-sense judgments.
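    The article does not describe Maria's actual inference engine or rule format, but the general idea of applying logical inference rules over a fact base can be sketched with a minimal forward-chaining loop; the rules and facts below are invented for illustration.

```python
def forward_chain(facts, rules):
    """Minimal forward-chaining inference: each rule maps a set of
    premise facts to a conclusion, and rules fire repeatedly until
    no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical rules of the tutoring-assistant kind described above
rules = [
    (["student asks about topic X", "topic X is in syllabus"],
     "answer from subject database"),
    (["answer from subject database"],
     "log question to student profile"),
]
facts = forward_chain(
    ["student asks about topic X", "topic X is in syllabus"], rules)
```

A real system with 118,000 rules would need indexing rather than this naive rescan of every rule, but the fixed-point loop is the core mechanism by which stored rules turn raw input into derived knowledge.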
    Click Here to View Full Article

  • "Users Put ObjectWeb Software to Work"
    InfoWorld (01/21/05); Niccolai, James

    The ObjectWeb open-source middleware consortium's annual conference this week spotlighted government and corporate production use of ObjectWeb software such as the JOnAS (Java Open Application Server) application server and the Enhydra Shark open-source workflow engine. "Until this year we were seen as mostly technologists, but now we are seen as providers of tools for building real applications," reported ObjectWeb Chairman Jean-Pierre Laisne of Groupe Bull, which founded the consortium with France Telecom and France's National Institute for Research in Computer Science and Control. France Telecom's Remi Cayuela said JOnAS is being used in his firm's online store, its consolidated address book for mobile, voice, and Internet customers, and its service management applications. Groupe Bull incorporated JOnAS into a system developed for France's Caisse Nationale d'Allocations Familiales (CNAF) that coordinates data mining requests, OLAP queries, and report production for users at over 100 local CNAF branches. Insurance broker Greco International, meanwhile, employs a document management system based on Enhydra that handles some 1.5 million documents. The success of these projects could encourage other users to embrace ObjectWeb software for production use, and analyst Shawn Willet says certifying JOnAS with Sun Microsystems' Java 2 Enterprise Edition 1.4 could make the software more appealing to users. Meanwhile, Cayuela advised customers considering open-source middleware to set up an in-house team of specialists to handle development and implementation, to test the Java virtual machines they plan to use with the middleware, and to settle on a standard package.
    Click Here to View Full Article

  • "UTD Establishes One of the Few Motion Capture and Virtual Reality Laboratories in the Country"
    University of Texas at Dallas (01/18/05)

    A new motion capture and virtual reality laboratory for studying human movement is scheduled to open at the University of Texas at Dallas (UTD) in February. Funding was provided under Project Emmitt, an economic-development agreement between the state of Texas, the University of Texas system, and Texas Instruments that commits up to $300 million to UTD's engineering and computer science programs. The facility, which is a joint venture between UTD's Erik Jonsson School of Engineering and Computer Science and the School of Arts and Humanities, will focus on research areas that include animated gaming, motion pictures, military uses, biomedicine, prosthetics, and human facial recognition technologies under the dual leadership of Institute for Interactive Arts and Technology director Dr. Thomas Linehan and Jonsson School computer science professor Dr. Balakrishnan Prabhakaran. "I look forward to the opportunities the lab will bring to UTD, particularly for the students who will learn here and who will move on to become the great filmmakers, researchers, educators and military leaders of their generations," Linehan declared. The lab will be equipped with a motion capture stage big enough to support five performers, plus 16 cameras that permit concurrent video and motion-data recording matched frame-by-frame; Linehan noted that the system will employ the fastest available active-optical, real-time 3D technology and deliver scalability, ease of use, and highly accurate wide-angle operation. Prabhakaran reported that the lab's systems facilitate 3D motion tracking and can produce multi-attribute values for a specific type of movement. "Students as well as faculty will be able to use the lab to develop algorithms that segment, recognize and index these motion sequences, and these algorithms could have important applications in security and in medical research," he said. Prabhakaran's research in the lab will be supported by a National Science Foundation grant he received for work on animation databases.
    Click Here to View Full Article

  • "Techies Talk Tough in D.C."
    Wired News (01/20/05); Dean, Katie

    Technology and consumer proponents say last year's bitter conflict between intellectual property holders and consumer organizations over the Inducing Infringement of Copyrights Act (Induce Act) was a watershed event that ought to make major tech companies take a more assertive stance when the same legislation inevitably reappears in the new congressional session. "Technology companies are starting to get religion in terms of courting Washington policy makers," reports Public Knowledge President Gigi Sohn. Critics say the Induce Act, which would establish liability among tech companies for alleged copyright infringement committed by their users, threatens to stifle tech innovation. The legislation will most likely remain in limbo until the Supreme Court makes a decision on MGM's infringement lawsuit against the Grokster and StreamCast peer-to-peer file-sharing networks. Mitch Glazier of the Recording Industry Association of America says his organization is waiting for the Supreme Court's ruling before reconsidering the Induce Act, and is planning to expedite its promotion of a compromise measure that passed the Senate in the last session but was waylaid by disagreements in the House. "If technology companies want to turn a corner here and make sure that their needs are taken into account, they need to make these kinds of gestures," argues Electronic Frontier Foundation attorney Jason Schultz. Possible items on tech companies' and consumer groups' political agenda include the codification of the Betamax principle and passage of the Digital Media Consumers' Rights Act, which would permit the circumvention of digital protections on copyrighted content for noninfringing purposes.
    Click Here to View Full Article

  • "Unwrapping the Biometric Present"
    Technology Review (01/18/05); Garfinkel, Simson

    The primary driver of biometrics, which was a major element of the National Intelligence Reform Act President Bush signed into law last month, is the desire to protect U.S. borders and combat terrorism via a homogeneous system that employs biometric technology to screen out terrorists as they attempt to enter the country or access secure sites. While D.C. policymakers struggle to make the concept of a national ID card palatable to the American public, they are also expanding biometrics use inside and outside the country: Under the aegis of the US-VISIT program, the government is electronically scanning fingerprints and taking digital photos of foreigners entering the country, while U.S. forces in Iraq have implemented a system in Fallujah in which all Arab males are fingerprint- and iris-scanned and assigned biometric ID cards they must show at city entrance/exit points, where they are scanned again to confirm their identities. Privacy advocates often regard biometrics with suspicion out of concern that such measures could be used for citizen surveillance. "Biometric technology is inherently individuating and interfaces easily to database technology, making privacy violations easier and more damaging," warns the Electronic Frontier Foundation. However, biometric passports and visa applications are appealing, given the current passport system's vulnerability to abuse. On the other hand, using biometrics for primary identification is undesirable, given the method's inaccuracy and the likelihood of exploitation, so a case can be made for establishing rules and regulations about appropriate and inappropriate uses of biometric data. Meanwhile, reliable and accurate biometric recordings can be a problem in themselves, as database entries may remain susceptible to falsification. Furthermore, such a system cannot account for people who are not on terrorist watch lists but may become terrorists.
    Click Here to View Full Article

  • "Nanotechnology Is Next Big Thing in Electronics and Manufacturing"
    Pittsburgh Tribune-Review (01/21/05); Bails, Jennifer

    Pittsburgh researchers such as Richard McCullough, dean of Carnegie Mellon University's (CMU) College of Science, believe nanotechnology could yield a new generation of cleaner, smaller, lighter, stronger, and more accurate products. Nanotech research has yielded many insights into nanomaterials, whose behavior is governed by quantum-mechanical effects that do not apply at larger scales. University of Pittsburgh engineering professor Hong Soo Kim says the next step is to control nanomaterials' properties to design and fabricate new and useful machines, "but we can't do that without understanding the unique phenomena that happen at the nanoscale." Nano-products in development at Pittsburgh institutions include nanoscale semiconductor chips that combine optics and electronics, more voluminous data storage technology, thin films for measuring body chemistry, atomic corrals that screen out pollutants and chemical weapons from soil and water, tiny computers and cell phones, more durable tennis balls, and advanced cancer treatments. CMU engineering professor Ed Schlesinger says keeping up to speed on nanotech is necessary to sustain the U.S. economy's world dominance. "We need to sell products to the world that the world values, and if the products of the future are based on nanotechnology, that needs to be our focus," he explains. CMU physics professor Randall Feenstra projects that nanotech could spawn a $1 trillion industry within the next 10 years, a prediction bolstered by the federal government's nanotech research investment of $961 million last year and a requested allotment of $982 million this year. Meanwhile, questions about nanotech's potential medical and biological effects have prompted agencies such as the EPA and the National Nanotechnology Initiative to commit sizable sums to nanotech safety research.
    Click Here to View Full Article

  • "Cargo Containers' Electronic Sensor Says 'Do Not Disturb'"
    New York Times (01/20/05) P. E8; Eisenberg, Anne

    Shipping companies are looking to increase the security of cargo containers by equipping them with wireless-enabled devices that record whether a container has been opened. The United States receives approximately 7 million containers each year, and officials worry that terrorist groups could hide weapons inside them or attack ports to shut down trade shipments. General Electric is working with China International Marine Containers Group to test a new system that measures the magnetic field of container doors after they are sealed. The device can be queried wirelessly, either by stationary dockside readers or by a handheld device at the receiving port, allowing port officials to quickly determine whether any incoming containers have been tampered with; the devices also record important data about where specific containers have come from, which would be useful in the event of a terrorist attack on port facilities. In such a scenario, officials would be able to quarantine certain shipments without shutting down operations entirely. Designing the magnetic sensor device required rigorous testing because of the pressures containers undergo at sea; with swells sometimes 40 feet high and containers stacked eight high, the pressure sensors that were tested sometimes gave false alarms. Other container-tracking technologies are more expensive, such as devices that use the Global Positioning System to provide geographic details. ABI Research security expert David Schrier says it is only a matter of time before the government mandates more sophisticated security measures for cargo containers. Compared with the costs of an attack, prevention measures such as the GE device are worth the money, says retired U.S. Coast Guard commander and Council on Foreign Relations senior fellow Stephen Flynn.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Ex-Cybersecurity Czar Focuses on Global Coordination"
    National Journal's Technology Daily (01/11/05); New, William

    Former White House cybersecurity adviser and R&H Security Consulting head Howard Schmidt is leading the development of an international cybersecurity partnership program for the U.S. Computer Emergency Readiness Team (US-CERT) and the Homeland Security Department. The idea, based on a program developed by former Homeland Security cybersecurity division director Amit Yoran, is to coordinate international cybersecurity efforts to more effectively recognize weaknesses and develop ways to measure success. "It's almost like creating a NATO of the cybersecurity world," says Schmidt. A recent cybersecurity stakeholders meeting in Wye River, Md., involved confidential discussion of how much information the government and the private sector should share in order to boost cooperative cybersecurity actions. Schmidt is not concerned about the recent departures of Yoran and US-CERT director Larry Hale, because most of the work required lies in the private sector rather than in government. Schmidt says the project should be in place by this summer and will complement the private-sector National Cyber Security Partnership.
    Click Here to View Full Article

  • "Tsunamis Prompt Interest in 'Net Alerts"
    Network World (01/17/05) Vol. 22, No. 2, P. 33; Marsan, Carolyn Duffy

    The recent tsunami disaster has spurred the Internet Engineering Task Force (IETF) to consider how to establish an Internet-based emergency alert system to reduce casualties in similar scenarios. Cisco fellow and IETF member Fred Baker reasons that one way to evacuate a beach in such an event would be to send a Short Message Service (SMS) alert to every active cell phone in a particular cell near the coastline, the phones' presence being established through the GSM service. The formulation of an online tsunami alert system ties in well with the IETF's Internet Emergency Preparedness (IEPREP) working group, which was formed in response to a federal request following the Sept. 11 tragedy. The Internet, however, is not set up to support pre-emption or prioritization of voice calls using standard protocols, and IEPREP is trying to circumvent this drawback to ensure that important messages are received in a specific time frame. The IETF has been focused on recognizing requirements for Internet-based emergency communications and establishing high-level frameworks to support the creation of those systems by federal agencies. IEPREP intends to publish, by year's end, a document that identifies best practices for implementing Internet-based emergency telecom services using current protocols, although IEPREP co-chair Kimberly King notes that these are best practices, not solutions. "We're going to have to re-charter IEPREP to produce solutions or charter another group to produce solutions," she says. Meanwhile, the IETF's Session Initiation Protocol (SIP) group is developing a header that can identify and prioritize emergency calls, which Columbia University professor Henning Schulzrinne says should be ready in a few months.
    Click Here to View Full Article

  • "Putting Applications to the Test"
    eWeek (01/17/05) Vol. 22, No. 3, P. D1; Coffee, Peter

    Florida Institute of Technology software engineering professor Cem Kaner advises enterprise development managers to deploy consistent, conscientious software application testing, a practice that is gaining importance as a result of regulatory directives and heightened awareness of the consequences of failure, which could include claims of negligence brought against application developers and users in cases where a failure leads to a person's death, for instance. On his Web site, Kaner lists over 100 kinds of coverage tests that may need to be performed by the development team, but these tests are no guarantee against failure; they merely confirm that the applications were built as specified and no rules were broken. Moreover, exhaustive testing is theoretically impossible, and because negligence is defined as the failure to take reasonable precautions, the negligence lawsuit Kaner envisions can be challenged by cost-benefit calculations, provided that at least some degree of conscientiousness and good faith is demonstrated in assessing the costs of testing and the benefits of risk reduction. As a consequence, application testing cannot be viewed as a mechanical or mathematical process. Some formally produced, repeatable tests can be inefficient and too time-consuming, which makes a solid case for less formal, earlier tests coordinated by developers experienced in determining the most likely and least acceptable errors. Test automation is often seen as a logical way to trim fat from the testing process, but this approach has its own problems: the generation of accurate but misleading statistics and false-positive alerts; the obscuring of inappropriate application behavior by confirmation of appropriate behavior; and the failure to spot errors because the person running the automated test did not design it. Many application security problems can also be traced to developers' failure to consider things that should not happen, or things that should be prevented if someone attempts to induce them.
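    The pitfall of automated checks confirming appropriate behavior while obscuring inappropriate behavior is easy to demonstrate. The routine and checks below are entirely hypothetical, invented to illustrate the failure mode the article describes.

```python
def apply_discount(order, percent):
    """Hypothetical buggy routine: returns the correct discounted
    total, but also silently mutates the caller's order dict --
    an inappropriate side effect."""
    order["total"] = order["total"] * (1 - percent / 100)
    return order["total"]

# A naive automated check confirms the appropriate behavior...
assert apply_discount({"total": 100.0}, 10) == 90.0

# ...while obscuring the inappropriate one: the caller's data changed.
order = {"total": 100.0}
apply_discount(order, 10)
assert order["total"] == 90.0  # the input was silently modified; only a
                               # tester asking "what should NOT happen?"
                               # would think to check this
```

The first assertion is exactly the kind of "accurate but misleading" result Kaner warns about: the suite passes, the metrics look good, and the defect ships.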
    Click Here to View Full Article

  • "VoIP: Ready for Prime Time"
    Computerworld (01/17/05) P. 30; Hamblen, Matt

    Voice-over IP (VoIP) technology has started to penetrate the mainstream now that many of its glitches, such as poor reliability and voice quality, have been addressed. Gartner analyst Jeff Snyder projects that VoIP or hybrid VoIP/circuit-switched systems will account for 97 percent of new phone systems deployed in North America in 2007; his company forecasts a 32 percent drop-off in revenues for circuit-switched systems sold in North America this year, while sales revenues for pure IP systems and hybrid systems should increase 32 percent and 30 percent, respectively. Snyder says cost savings are not a compelling enough reason for most organizations to upgrade to VoIP, and argues that sustained system protection should be the primary consideration, as traditional systems will become unavailable as time goes on. The key drivers of VoIP adoption, killer applications, are missing, while Current Analysis analyst Brian Riggs reports that "the majority [of organizations] may not feel the pressure to make the switch right now because their existing systems are working and the applications they have are good enough." Snyder notes that a scarcity of credible VoIP technology integrators is another major obstacle. He says that some previous VoIP failures could be attributed to integrators and IT staffers familiar with data networks but not well-versed in voice networks. But despite these shortcomings, IT managers are highly interested in VoIP, especially as spending strictures are relaxed. "People ask me, 'Do I need to change to VoIP?' and I say, 'You need to be aware of it and informed about it so you can decide when the time is right, but don't feel forced to do it,'" Snyder says.
    Click Here to View Full Article

  • "The BitTorrent Effect"
    Wired (01/05) Vol. 13, No. 1; Thompson, Clive

    The BitTorrent peer-to-peer (P2P) program, which over 20 million people have downloaded thus far, is the brainchild of Bram Cohen, who designed BitTorrent as a free and open-source tool that allows geeks to cheaply exchange Linux software online; however, the program has found its greatest use among TV and movie fans, who use BitTorrent to upload and download massive data files. BitTorrent circumvents the bottlenecks resulting from uploading and downloading big files at unequal speeds by breaking up files and distributing the fragments among several uploaders simultaneously, which means that a file downloads faster the more popular it is. BitTorrent is also considered the catalyst responsible for a sea change taking place in the broadcast media sector, because the program massively reduces the expenses needed to distribute content: For example, a Stanford graduate student successfully distributed a controversial documentary hosted on his Web site using BitTorrent for only $4 in bandwidth. Despite BitTorrent's use for swapping copyrighted files without authorization, Electronic Frontier Foundation lawyer Fred von Lohmann says Cohen cannot be held legally liable, since file-sharing technology has been determined in court to have "substantial noninfringing uses." Union Square Ventures venture capitalists believe the proliferation of peercasted TV could transform networks into aggregators that find shows, distribute them in P2P video torrents, and highlight key segments; their revenues would come from selling ads or subscriptions to their portals. "Eventually the consumer will become the programmer," forecasts Scripps Networks' Channing Dawson. "Content will be accessible anywhere, anytime."
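    The core trick described above, splitting a file into fragments that can be fetched from many uploaders at once, rests on hashing each fixed-size piece so a downloader can verify fragments independently as they arrive. BitTorrent's actual wire protocol is far more involved; this sketch only illustrates the piece-hashing idea (SHA-1 over fixed-size pieces, as in .torrent metadata), with the piece size chosen for illustration.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB, a commonly used piece size

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE):
    """Split a payload into fixed-size pieces and hash each one, as a
    .torrent file's piece list does. Peers can then download pieces
    from different uploaders in parallel and check each piece on
    arrival, without trusting any single source."""
    return [
        hashlib.sha1(data[i:i + piece_size]).hexdigest()
        for i in range(0, len(data), piece_size)
    ]

def verify_piece(piece: bytes, expected_hex: str) -> bool:
    """Check one downloaded piece against its published hash."""
    return hashlib.sha1(piece).hexdigest() == expected_hex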
    Click Here to View Full Article
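
    The piece-splitting scheme described above can be sketched in a few lines. This is a minimal toy simulation under stated assumptions, not BitTorrent's actual wire protocol: the peer names, piece size, and data structures below are invented for illustration, though the core idea (hash every piece so fragments from untrusted peers can be verified, then reassemble) matches how the protocol works.

    ```python
    import hashlib

    PIECE_SIZE = 4  # bytes, for the toy example; real torrents use 256 KB-4 MB pieces

    def split_pieces(data: bytes, piece_size: int = PIECE_SIZE):
        """Split a file into fixed-size pieces, as a .torrent metafile describes."""
        return [data[i:i + piece_size] for i in range(0, len(data), piece_size)]

    def piece_hashes(pieces):
        """A SHA-1 digest per piece lets a downloader verify each fragment."""
        return [hashlib.sha1(p).hexdigest() for p in pieces]

    def swarm_download(hashes, peers):
        """Fetch each piece from whichever peer holds it, verify it, reassemble.
        `peers` maps a peer name to {piece_index: piece_bytes}."""
        assembled = {}
        for index, expected in enumerate(hashes):
            for holdings in peers.values():
                piece = holdings.get(index)
                if piece is not None and hashlib.sha1(piece).hexdigest() == expected:
                    assembled[index] = piece
                    break
        return b"".join(assembled[i] for i in range(len(hashes)))

    file_data = b"peer-to-peer swarms"
    pieces = split_pieces(file_data)
    hashes = piece_hashes(pieces)
    # Each peer holds only some pieces; together the swarm holds the whole file,
    # which is why a popular file (more peers) downloads faster.
    peers = {
        "peer_a": {0: pieces[0], 2: pieces[2], 4: pieces[4]},
        "peer_b": {1: pieces[1], 3: pieces[3]},
    }
    assert swarm_download(hashes, peers) == file_data
    ```

    The sketch omits the trackers, choking, and rarest-first piece selection that the real client uses, but it shows why no single uploader needs the bandwidth to serve the whole file.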

  • "A Conversation With Alan Kay"
    Queue (01/05) Vol. 2, No. 9, P. 20; Feldman, Stuart

    In an interview with Stuart Feldman of IBM Research, Xerox Palo Alto Research Center co-founder and Viewpoints Research Institute President Alan Kay discusses the development of programming languages and their shortcomings. He compares most current software to an Egyptian pyramid composed of millions of bricks stacked atop one another by slave labor, with no structural integrity. Kay argues that the successful commercialization of personal computing and operating systems has in many respects impelled a devolution in computer science. "I think there's this very long lag between what you might call the best practice in computing research over the years and what is able to leak out and be adapted in the much more expedient and deadline-conscious outside world," he explains. Kay dismisses much of programming languages' success as "expeditious gap-filling," and attributes many of the problems that have plagued computing over the last quarter century to systems designed to fix a short-term flaw without consideration of the ideas' long-term scalability. He says that a programming language is ultimately a user-interface design, and envisioning it as such will likely yield much better results regardless of the task at hand. Kay explains, "The number-one thing you want to make the user interface be is a learning environment--something that's explorable in various ways, something that is going to change over the lifetime of the user using this environment." This translates to improvements in both the applications and the user interface itself. Kay draws parallels between the science of computing and the science of bridge construction, in that some people are responsible for erecting bridges while others are responsible for demolishing them and coming up with better theories, with the understanding that bridge-building is a never-ending process.

  • "Taking Handheld Devices to the Next Level"
    Computer (12/04) Vol. 37, No. 12, P. 36; Myers, Brad A.; Nichols, Jeffrey; Wobbrock, Jacob O.

    A team of MIT and Carnegie Mellon University researchers is designing next-generation handhelds and applications that can be employed as more effective and easier-to-use remote controls under the auspices of the Pebbles project. For the household, the researchers have built a personal universal controller (PUC) that controls all computerized appliances by providing appliance-specific control interfaces. A PUC can establish consistency between multiple interfaces, tailor interfaces to the user's needs or preferences, furnish platform-specific interfaces, and combine the functions of multiple appliances. Human trials with the PUC inspired the development of an XML-based high-level specification language that encapsulates all the data the controller needs to automatically generate interfaces, while appliance control is facilitated by software and hardware modules that act as appliance adapters. Among the handheld applications the team developed are Remote Commander, which supports all PC keyboard and mouse functions and displays screen images on the handheld in three distinct modes; SlideShow Commander, which can control a PC running PowerPoint; and Shortcutter, a general-purpose application that allows users to build custom panels to control PC applications. These applications have been employed in initiatives that include the Defense Advanced Research Projects Agency's Command Post of the Future project, and have demonstrated particular value to disabled users. The Pebbles project has produced important design insights, including the need for designers to consider context of use, as well as which interface components should reside on the handheld and which should reside on the original device.
    Click Here to View Full Article
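
    The idea of automatically generating a control interface from an appliance specification can be illustrated with a short sketch. The XML element and attribute names below are invented for illustration and do not match the actual PUC specification language; the point is only the pattern of walking a declarative spec and emitting abstract widgets that a platform-specific renderer would then lay out.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical appliance spec; real PUC specs use different element names.
    SPEC = """
    <spec appliance="Stereo">
      <state name="power"  type="boolean"/>
      <state name="volume" type="integer" min="0" max="10"/>
      <command name="next-track"/>
    </spec>
    """

    def generate_interface(spec_xml: str):
        """Walk the spec and emit one abstract widget per state or command;
        a renderer would map these to platform-specific controls."""
        root = ET.fromstring(spec_xml)
        widgets = []
        for node in root:
            if node.tag == "state" and node.get("type") == "boolean":
                widgets.append(("toggle", node.get("name")))
            elif node.tag == "state" and node.get("type") == "integer":
                widgets.append(("slider", node.get("name"),
                                int(node.get("min")), int(node.get("max"))))
            elif node.tag == "command":
                widgets.append(("button", node.get("name")))
        return widgets

    print(generate_interface(SPEC))
    # -> [('toggle', 'power'), ('slider', 'volume', 0, 10), ('button', 'next-track')]
    ```

    Because the appliance only publishes its state variables and commands, the same spec can drive a handheld screen, a speech interface, or a desktop panel, which is how a single controller stays consistent across many appliances.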