
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 671:  Wednesday, July 21, 2004

  • "Tech Bust Zaps Interest in Computer Careers"
    Los Angeles Times (07/20/04) P. A1; Pham, Alex

    The bursting of the technology bubble and horror stories about unemployment and the offshore outsourcing of tech jobs have dampened students' interest in computer careers, fueling a 23 percent decline in computer science program enrollments across U.S. universities between 2002 and 2003. Between 1999 and 2003, computer science enrollments at MIT fell 44 percent, computer science and engineering enrollments at UC Berkeley dipped 41 percent, and Georgia Institute of Technology enrollments decreased 45 percent. Jeanne Ferrante, associate dean of the UC San Diego school of engineering, says the rising jobless rate for computer scientists and systems analysts over the last few years--and the accompanying erosion of salaries, bonuses, and other amenities--has discouraged students. The enrollment drop-off is happening just as U.S. companies are on the cusp of a hiring resurgence, prompting concerns about a scarcity of domestic tech professionals as well as a stifling of technological innovation. The U.S. Department of Labor forecasts 46 percent growth in the number of jobs for computer software engineers between 2002 and 2012. Tech companies such as IBM and Microsoft are actively trying to allay students' fears of having no job opportunities after graduation: IBM announced plans in 2004 to recruit 4,500 U.S. workers, while Hewlett-Packard CEO Carly Fiorina said her company would probably hire an additional 5,000 employees. Some people see a silver lining in the enrollment decline--computer science students have less difficulty getting into classes, while professors appreciate smaller classes that are easier to manage.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "NU Researcher Glimpses 3rd Generation of Internet"
    Chicago Sun Times (07/20/04); Wolinsky, Howard

    A third-generation Internet envisioned by Northwestern University researcher Joe Mambretti promises to ramp up data transmission speeds thousands of times, and that is only the tip of the proverbial iceberg. As director of Northwestern's International Center for Advanced Internet Research (iCAIR), Mambretti has organized a team working on infrastructure and tools for a 3rd Generation Internet, or I-3G. Notable iCAIR achievements that could become essential components of I-3G include a search engine that can retrieve audio and video tracks based on textual queries as well as display multiple images simultaneously; the tool can be used to build customized digital media Web sites. Another iCAIR breakthrough is a system that can easily accommodate gigabyte-scale files. I-3G capabilities that could be available in the next several years include video-on-demand that can retrieve all analog media; support for hundreds of thousands of stations; and a personalized CAVE, a 3D virtual reality system that people can use to immerse themselves in artificial environments for gaming, travel, and other applications. Mambretti says the adoption of I-3G technology in the U.S. will be driven by the proliferation of high-capacity fiber-optic lines to homes and offices, and a shift away from copper-based networks. However, making local and national policy-makers more receptive to such technology will be a difficult challenge, especially because their view of the Internet as a form of telephone service is so deeply entrenched. "This approach is crippling the adoption of new technologies in this country," Mambretti argues.
    Click Here to View Full Article

  • "Changing the Face of Web Surfing"
    Wired News (07/20/04); Andrews, Robert

    Some Web users are taking it upon themselves to correct or re-design poorly accessible Web sites and open them up to the public, a practice that has earned praise from other users but anger--and even threats of litigation--from sites' owners. Oxford University math graduate Matthew Somerville hosted a streamlined version of Britain's Odeon Cinema chain site that offered greater accessibility than its model, but shut it down when the chain warned him that he was transgressing copyright and data statutes. Somerville insists he did not set up his site for commercial gain, and claims that the official cinema site violates disability discrimination law; Kim Greenston of Odeon counters that Somerville's site is so similar to the official site that customers were unintentionally giving personal information to him. Web Standards Project coalition co-founder and usability expert Jeffrey Zeldman applauds Somerville's efforts, arguing that they have saved Odeon a lot of money on consultancy services. He thinks similar volunteer online accessibility corrections and the buzz surrounding them should make other companies reconsider their own sites' design. Another accessibility hacker, David Jones, cites the inept design of Wales' National Assembly Web site as justification for his decision to republish articles from the official site on his own Assembly Online site. Judy Brewer, director of the World Wide Web Consortium's Web Accessibility Initiative, says the organization's standards guarantee accessibility for all users, but only if designers comply with them. She acknowledges that "Not all businesses have yet understood the advantage in ensuring their Web sites are accessible to people with disabilities, who constitute a significant percentage of the marketplace," but is confident that inaccessible sites will eventually become a thing of the past.
    Click Here to View Full Article

  • "Low Bar for High School Students Threatens Tech Sector"
    IT Management (07/16/04); Befi, Tony

    A growing number of U.S. high school students are not taking the rigorous coursework needed to succeed in college and later in the workplace, writes IBM vice president of systems development and Texas senior state executive Tony Befi. He says the trend runs counter to developments in the workplace itself, where the U.S. Department of Labor estimates that job growth will be fastest in service-providing industries such as IT. Jobs requiring computer skills, problem-solving, and a sophisticated comprehension of customer markets will grow far more rapidly than goods-producing employment. The U.S. Department of Education says seven out of 10 high school graduates have not finished critical courses needed in college and in a services-oriented work environment, while 49 percent of those graduates will have to take remedial classes if they attend college. Because of this trend, experts predict a shortage of 12 million workers for fast-growing sectors of the U.S. economy. The root cause of high-schoolers' underperformance is the mistaken belief among students and parents that high school courses do not matter, and the State Scholars Initiative is countering this misperception by joining businesses and educators in motivating high school students to excel at difficult coursework. The simple, low-cost initiative has produced encouraging results in Texas, where the Houston Independent School District raised the percentage of graduates completing a defined set of courses from 18 percent in 1999 to 70 percent in 2003, despite the poverty of the area. A key element was the enlistment of local business leaders who made clear to students how their choices in high school would affect them in the workplace. The number of students taking tough classes increased, and the expectations of parents and teachers rose commensurately.
    Click Here to View Full Article

  • "Aust Supercomputing Undergoes Renaissance"
    ZDNet Australia (07/20/04); Ferguson, Iain

    The Australian supercomputing sector is being revitalized by growing demand and the advent of commercial cluster computing running on the Linux operating system. The South Australian Partnership for Advanced Computing recently purchased a new Silicon Graphics Altix supercomputer valued at AU$4.5 million; several weeks later, the Australian Partnership for Advanced Computing (APAC) released a request for proposal (RFP) for a four-year, AU$12.5 million contract for a new peak computing system; and the Victorian Partnership for Advanced Computing (VPAC) today announced its intention to issue a AU$500,000 to AU$600,000 RFP for a new cluster system with a peak capacity of 1 teraflop. APAC plans to replace a pair of Hewlett-Packard AlphaServer systems with a supercomputer that APAC head John O'Callaghan estimates should have a peak processing capacity of 10 teraflops. APAC's RFP was released after the Australian government announced a AU$29 million grant to fund the partnership from mid 2004 to mid 2006. APAC is charged with developing a grid infrastructure that meshes APAC and partner facilities so that researchers can seamlessly access computational and data resources from a national facility. O'Callaghan says APAC has undertaken six application projects in the areas of astronomy, bioinformatics, high-energy physics, earth systems, geosciences, and chemistry, while three other projects focus on application infrastructure. VPAC CEO Bill Appelbe says the commercial sector has experienced the sharpest increase in supercomputing demand, and notes that demand has also risen in the academic sector. Supercomputing's leading users include the mining and exploration, large-scale engineering, and automotive industries.
    Click Here to View Full Article

  • "Start-ups Search for Hard-Drive Replacements"
    CNet (07/21/04); Kanellos, Michael

    A number of startups are pursuing unconventional data storage technologies that could challenge hard drives and flash memory; all the contenders offer ways to pack data more densely than silicon circuitry allows while reducing manufacturing costs. NanoMagnetics is using a biologically derived protein called ferritin to carry a magnetized metallic material that can be flipped to represent either a one or a zero; the nanometer-scale technology is similar to experiments done by IBM and other companies with an oily biological coating, but is more resilient to the high temperatures involved in the manufacturing process. NanoMagnetics is currently testing its technology with a potential large customer in Asia, and could have products out by the end of next year if it can convince memory makers to adopt its inkjet particle deposition process. Nanochip is taking the ovonics route, a technique involving super-heated amorphous silicon touted by Intel's Gordon Moore back in the early 1970s. The company expects to ship samples next year, but is working on increasing the speed of its devices, which use relatively slow mechanical actuators to move probe tips as they write data. Nanochip's ovonics device, which is extremely small and dense, is a likely competitor for NAND flash memory and minidrives. ZettaCore offers data storage at the molecular level, with specially made molecules that each represent four bits of data; ZettaCore relies on a chemical process to orient the molecules on a substrate with etched memory cells, and the company has attracted a number of industry luminaries, including Les Vadasz, as well as investment from Intel Capital and Draper Fisher Jurvetson. CEO Randy Levine says ZettaCore has already built working chips with high densities, but needs to refine its manufacturing approach.
    Click Here to View Full Article

  • "IBM to Help Train Students for IT Work"
    eWeek (07/20/04); Taft, Darryl K.

    The new IBM Academic Initiative is a program to aid academic institutions that support open-source software and open standards by contributing hardware and software, and by training students in in-demand IT skills such as programming and architecture as well as technology certifications, says IBM's Buell Duncan. IBM will offer participating schools free software and free or discounted hardware systems, and help bring school curricula up to speed. One element of the initiative is the IBM Scholars Program, which supplies free training, course materials, and technology to certain schools; IBM says over 8,000 faculty members are currently registered with the Scholars Portal, through which schools can avail themselves of approximately 40 IBM software products to augment their instructional infrastructure. The company adds that its IBM Virtual Innovation Center will "loan" hardware for virtual use, and will evaluate the IT aspects of the schools' academic programs using a technical team. Duncan says IBM expects to sign up 250 schools by the end of 2004, while its long-term goal is to bring 1,000 institutions on board. Northface University is one of the schools participating in the initiative's pilot, and Northface CEO H. Scott McKinley believes the IBM program will help keep his school's infrastructure and curriculum up to date. "The industry is experiencing a severe shortage in quality and quantity of IT graduates," he notes. "We as a university have as our mission to close that skills gap."
    Click Here to View Full Article

  • "Handheld PC Virus Holds Ominous Promise"
    New Scientist (07/20/04); Knight, Will

    The underground group 29A has created "Duts," a new proof-of-concept virus designed to infect handhelds running Microsoft's Windows CE software. The virus proliferates by embedding copies of its code inside normal software applications, which handheld users often exchange through infrared transmission. Infected programs can also be sent to PCs via email and then installed on handhelds when the devices are connected. Loading and running a Duts-infected program spreads the virus to all applications within the system's main program folder that exceed 4 KB, although the program asks permission before doing so. Thus, Duts' method of contagion is slow and easily blocked. Some experts believe Duts and the cell phone virus Cabir are signs that malware authors are turning their attention to mobile technology, generating worries of a virus epidemic rapidly transmitted between devices that link to mobile phone networks. "Malicious programs are evolving in yet another direction, bringing the first global outbreak caused by a mobile virus closer and closer," warns Eugene Kaspersky of Kaspersky Labs. On the other hand, Symantec's Kevin Hogan does not think the problem represents an immediate danger, given that long-predicted handheld computer virus outbreaks have yet to materialize.
    Click Here to View Full Article
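
    The 4 KB threshold reported above translates directly into code. The following Python sketch (an illustration only: Duts itself is native Windows CE code, not Python, and the folder path below is a hypothetical stand-in) enumerates the executables a defender's scanner might flag as matching the virus's stated target criterion:

        import os

        PROGRAM_FOLDER = r"C:\sync\ce_programs"  # hypothetical folder of synced CE programs
        SIZE_THRESHOLD = 4 * 1024                # 4 KB, per the report

        def files_matching_criterion(folder):
            """Yield .exe files large enough to match Duts' reported target rule."""
            for name in os.listdir(folder):
                path = os.path.join(folder, name)
                if (name.lower().endswith(".exe") and os.path.isfile(path)
                        and os.path.getsize(path) > SIZE_THRESHOLD):
                    yield path

        for path in files_matching_criterion(PROGRAM_FOLDER):
            print(path)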

  • "Battlefield Tech for Aide Workers"
    Wired News (07/21/04); Harrison, Ann

    Almost 20 military and civilian organizations were represented at the Strong Angel II conference in Hawaii this week to demonstrate communication and collaboration technologies designed to enhance the coordination of humanitarian aid in war-torn locales and disaster sites. The event was launched by the Office of the Secretary of Defense, which asked event director Cmdr. Eric Rasmussen of the U.S. Navy Medical Corps to propose ways to augment field communications between civilian and military relief personnel in Iraq. One of the technologies demonstrated at Strong Angel II was BBN Technologies' e-TAP, a system that can record Arabic TV broadcasts and translate the audio into English in near-real time, as well as provide a synchronized display in a browser that features the video, the Arabic transcript, and the English transcript. Rasmussen explained the rationale for developing e-TAP: "We...recognize that the influence of the press is enormous and that we often have a genuine impediment to understanding the population around us if we don't keep track of what the local media is saying." Another demonstration involved radio traffic from a local fire department that was recorded to a date-stamped MP3 file and transferred into a shared Groove work space, which showed that such broadcasts could be monitored from any location on Earth and plotted on an electronic map. Rasmussen said the Groove software would help cement trust between military and civilian organizations by enabling transparency. "The collaboration that we need between military, [nongovernmental organizations] and U.N. agencies will not comfortably rest on a server owned by somebody else, especially if they have to go through a .mil military server," he asserted.
    Click Here to View Full Article

  • "Swift Searching for Open Source"
    IST Results (07/20/04)

    Spanish university researchers have developed a novel search engine that enables programmers to easily find the pieces of open-source code they need according to that code's function. The AMOS system relies on a simple ontology and search-term dictionary to search its indexed elements quickly. Technical University of Madrid (UPM) AMOS technology manager Manuel Carro said the project was intended for system integrators and programmers who use open-source code, and that the new search engine provides a much-needed function not available through general-purpose search engines such as Google. Programmers often do not know about all the open-source code that is available to them, or where to find it if they are aware of it; AMOS asks users to enter the type of function they require, and returns packages of code and code artifacts that fit those purposes. "It is the first search engine that allows users to find assemblages of packages if, when combined, they perform the desired task," Carro said, noting that AMOS will make code re-use and building-block-style programming easier. UPM developers used the school's own Ciao Prolog logic programming environment to create AMOS, which is licensed under the open-source GPL. The search engine has been tested with 700 elements connected by over 90,000 relationships, but the prototype demonstrator currently available offers 120 elements, mostly for Web development. Carro said AMOS can also be applied to fields other than software, including scientific and legal archives, or any other information store that can be classified.
    Click Here to View Full Article
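
    AMOS's basic contract, as described, is to map a functional query onto packages--or assemblages of packages--that jointly cover the requested capabilities. A minimal sketch of that idea in Python (the package names and capability labels below are invented for illustration; the real system is built on an ontology written in Ciao Prolog):

        from itertools import combinations

        # Hypothetical catalog: each package advertises the functions it provides.
        CATALOG = {
            "htmlgen":  {"html generation"},
            "formmail": {"form handling", "email sending"},
            "sqlstore": {"database access"},
        }

        def find_assemblages(required, catalog=CATALOG, max_size=2):
            """Yield sets of packages that together cover every required function."""
            required = set(required)
            for size in range(1, max_size + 1):
                for combo in combinations(catalog, size):
                    covered = set().union(*(catalog[p] for p in combo))
                    if required <= covered:
                        yield combo

        print(list(find_assemblages({"form handling", "database access"})))
        # -> [('formmail', 'sqlstore')]: no single package suffices, but the
        #    pair jointly performs the desired task.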

  • "Face of the Future?"
    Observer (UK) (07/18/04); McKie, Robin; Smith, David

    Most scientists agree that the humanoid robots popularized in the media will one day become a reality, but opinion is split as to whether that day is close or far off. Future Horizons reports that robot infants for parenthood training, house-cleaning robots, robotic elderly caregivers, window-cleaning robots, and machines that can play against humans in board games are just some of the applications currently under discussion, and the industry analyst estimates that robot revenue will skyrocket from $4.4 billion to $59.3 billion between 2003 and 2010. Future Horizons Chairman Malcolm Penn envisions the ubiquitous presence of domestic robots in five to 10 years, performing tasks such as taking care of pets, dispensing medicine, and cooking. However, the likelihood that androids will penetrate households in so short a time is slim. Stuff editor Ollie English notes that humanoid robots will remain prohibitively expensive for the next 10 to 15 years, while University of Maryland robotics expert Dr. David Akin says the evolution of robots' ability to replicate human behavior is painfully sluggish: the twin Mars rovers, despite their sophistication and autonomy, move very slowly and do not always operate smoothly or flawlessly, while a Pentagon-sponsored desert race between autonomous vehicles was less than stellar. The model for humanoid robots was established by sci-fi writer Isaac Asimov, who posited that such machines would consist of expensive, sophisticated electronic brains mated to simple limbs. But the opposite has proven to be the case, with the brain or microprocessor emerging as a cheap component while artificial limbs and robotic movement remain expensive to produce and slow to develop. In addition, psychologists have observed that the more closely robots resemble a human being, the harder it is for people to accept them.
    Click Here to View Full Article

  • "Teleport Lifts Quantum Computing"
    Technology Research News (07/21/04); Smalley, Eric

    Two research teams have successfully teleported the quantum states of atoms on demand, thus enabling quantum information to be transmitted reliably and stably; this breakthrough is an important step toward the construction of quantum computers that use trapped ions--quantum particles that can perform multiple computations before their fragile quantum states decay. Until now, teleportation was only possible with massless photons, whereas ions are comparatively massive, notes Bell Labs researcher Steven van Enk. The teams, based at the National Institute of Standards and Technology (NIST) in the United States and Austria's University of Innsbruck, demonstrated that quantum bits (qubits) can be stored before and after teleportation so that quantum computers can process data, transfer it to a different part of the computer, and then perform additional data processing. The University of Innsbruck researchers teleported the quantum states of calcium ions contained in a radio-frequency electric field, which was framed by electrodes so that the ions were configured into a row. The NIST researchers' technique used a similar scheme, but employed beryllium ion qubits, which significantly extended the life of the quantum states, according to NIST scientist David Wineland. University of Innsbruck physics professor Rainer Blatt wagers that trapped-ion information processors will probably require quantum registers to store, process, and error-check data, and teleportation would permit the transmission of data between registers. Van Enk says the transfer of information via teleportation enables the performance of two-ion operations on ions that are not in close proximity to each other, and raises the computer's error threshold, thus simplifying the creation of quantum computers.
    Click Here to View Full Article
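
    The protocol both teams implemented is standard quantum teleportation, which can be summarized in a single identity (a textbook derivation, not the notation of either paper). With the unknown qubit state alpha|0> + beta|1> held by ion 1, and ions 2 and 3 prepared in the entangled Bell state, the three-ion state rewrites in the Bell basis of ions 1 and 2 as:

        |\psi\rangle_1 \otimes |\Phi^+\rangle_{23}
            = \tfrac{1}{2}\Big[\, |\Phi^+\rangle_{12}\,(\alpha|0\rangle + \beta|1\rangle)_3
            + |\Phi^-\rangle_{12}\,(\alpha|0\rangle - \beta|1\rangle)_3
            + |\Psi^+\rangle_{12}\,(\beta|0\rangle + \alpha|1\rangle)_3
            + |\Psi^-\rangle_{12}\,(-\beta|0\rangle + \alpha|1\rangle)_3 \,\Big]

    Measuring ions 1 and 2 in the Bell basis therefore leaves ion 3 in one of four states related to the original by a known Pauli correction (I, Z, X, or XZ), which is why two classical bits suffice to complete the transfer--and why the state arrives "on demand" once the correction is applied.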

  • "ICANN Crunch Meeting Begins"
    Register (UK) (07/19/04); McCarthy, Kieren

    ICANN's efforts to gain international backing for its oversight of the Internet face stiff opposition at a meeting this week in Kuala Lumpur, Malaysia. Chief among the complaints is ICANN's proposed budget for 2005, which is 91 percent higher than the $8.27 million allotted in 2003. The Council of European National Top Level Domain Registries has already accused ICANN of a "lack of financial prudence" and has refused to provide any support for the organization, while an alliance of 75 registrars says the proposed method of raising funds would have a disproportionate effect on smaller registrars. ICANN chief Paul Twomey and head of business operations Kurt Pritz, who drew up the budget, say the added funding is needed to prevent ICANN from losing its mandate from the U.S. Department of Commerce. Under a Memorandum of Understanding with Commerce, ICANN is required to fulfill 24 objectives within 2.5 years; it has thus far met seven. If ICANN fails to meet its commitment, the International Telecommunication Union will assume its role. Approval of the proposed budget requires the support of two-thirds of registrars, with voting rights contingent on the number of domains sold. This gives larger registrars, who stand to benefit from the funding measure, a larger say than their smaller counterparts, who stand to lose the most; two-thirds of the increased budget will be financed by registrars.
    Click Here to View Full Article
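
    For scale, the 91 percent increase cited above implies a proposed 2005 budget of roughly

        \$8.27\ \text{million} \times (1 + 0.91) \approx \$15.8\ \text{million},

    a back-of-envelope figure derived from the numbers in the article rather than quoted from it.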

  • "Opposition Grows to Paperless Voting"
    eWeek (07/19/04) Vol. 21, No. 29, P. 20; Carlson, Caron

    There is a growing movement to address concerns about paperless voting systems, as evidenced by the huge turnout of protesters at the "Computer Ate My Vote" rallies held across the United States last week. Prestigious computer scientists, civil rights supporters, and tech professionals are worried that the rush to update the electoral process through the purchase of electronic touch-screen machines mandated by the Help America Vote Act of 2002 has reduced the reliability and security of voting. Critics have long been clamoring for the deployment of voter-verifiable paper audit trails (VVPATs) to ensure accurate recounts, and it seems assured that such options will be available to voters by the 2006 elections. Among those calling for the addition of VVPATs to direct recording electronic (DRE) systems are Democratic National Committee Chairman Terry McAuliffe, who advised the U.S. Election Assistance Commission to consider such a measure, and Rep. Rush Holt (D-N.J.), who proposed a nationwide VVPAT standard in his Voter Confidence and Increased Accessibility Act. Some supporters of Holt's bill say it has been held up because legislators are reluctant to admit their error in passing the Help America Vote Act that contributed to the current voting dilemma. "Nobody wants to admit a mistake," explains computer scientist and "Computer Ate My Vote" participant Barbara Simons. The former ACM president adds, "I think the idea of not being able to do recounts is something all people can understand. Software is buggy. Bugs in these files can cause miscounts." Election Data Services estimates that touch-screen e-voting machines are ready for use in the upcoming presidential election in over 675 counties. Some computer scientists doubt that the addition of printed ballots to e-voting systems will boost voter confidence, and think a much better approach is to jettison DRE systems altogether and start over from scratch.
    Click Here to View Full Article

    For information about ACM's e-voting activities, visit http://www.acm.org/usacm.

  • "Some Say U.S. Supercomputing Needs a Jump-Start"
    Computerworld (07/19/04) Vol. 32, No. 29, P. 7; Thibodeau, Patrick

    The U.S. House of Representatives passed a pair of bills in June designed to build momentum for U.S. supercomputing efforts: the High-Performance Computing Revitalization Act of 2004 and a request for about $200 million in funding for supercomputer development at the Energy Department. The bills were crafted to coordinate federal supercomputing development and require U.S. agencies to make supercomputing resources accessible to researchers, but Ford Motor IT official Vincent Scarafino has told Congress that the federal government needs to expand its role in supercomputing development. He explains that federal supercomputing investment has dipped because of a growing emphasis on parallel processing using lower-cost, off-the-shelf components--and though this has helped cut prices and increase productivity, the United States is lagging behind other countries' supercomputing initiatives as a result. "Instead of learning how to do new things, we're learning how to do old things cheaper," Scarafino argues. He adds that supercomputers that use a vector processor-based architecture are less programming-intensive than parallel processing-based machines. Echoing Scarafino's views is Kevin Wohlever of the Ohio Supercomputer Center's Springfield facility: "If we keep trying to make all codes fit into the cluster environments, we're losing the opportunity to make the codes that run best in the vector environment," he declares. The passage of the federal supercomputing bills was applauded by the Computing Research Association, which nevertheless observed that the proposed federal IT research budget for 2005 is 0.7 percent less than the 2004 budget.
    Click Here to View Full Article
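
    The vector-versus-cluster distinction Scarafino and Wohlever draw is easiest to see in code. A vector machine executes whole-array arithmetic directly in hardware, while a cluster parallelizes element-by-element loops across commodity processors. The Python/NumPy sketch below illustrates the two programming models (an analogy of my own, not code from any DOE system):

        import numpy as np

        a = np.random.rand(100_000)
        b = np.random.rand(100_000)

        # Scalar style: one element at a time -- the pattern clusters
        # parallelize by splitting the index range across many nodes.
        c = np.empty_like(a)
        for i in range(len(a)):
            c[i] = 2.0 * a[i] + b[i]

        # Vector style: a single whole-array operation -- the pattern a
        # vector processor executes directly, with no per-element control
        # flow to restructure.
        c = 2.0 * a + b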

  • "U.S. E-Passport Plan Raises Tech, Diplomatic Hackles"
    EE Times (07/16/04) No. 1330, P. 1; Yoshida, Junko

    The U.S. Department of Homeland Security and other federal agencies are pushing technology vendors to test their prototype electronic passport solutions before the congressionally mandated Oct. 26, 2005, deadline. The U.S. Congress wants the 27 governments with which it exchanges visa waivers to enable biometrically secured, machine-readable passports by that time in order to bolster border security; the Bush administration has asked for a more lenient timetable, and Secretary of State Colin Powell has said that rushing implementation would create more problems. International Biometric Group senior consultant Joseph Kim said Congress set the deadline based on a misunderstanding about the standardization of the technology. The United States, the European Union, and other countries are adhering to standards set by the International Civil Aviation Organization (ICAO), whose specification makes the face image mandatory and permits additional biometric data such as fingerprints, iris scans, or some other identifier; the ICAO further detailed its specifications two months ago, defining the data structures, command sets, and communication requirements for the passports and reader terminals. Vendors are struggling to keep up, not only with the basic technology, but also with making sure their products meet standards and are interoperable. The Homeland Security Department will host testing sessions this month, and the former state-owned German printing house Bundesdruckerei has begun using 32 KB chips from Philips Semiconductors, while Infineon recently announced a prototype 64 KB chip that would hold the facial image and two fingerprints Bundesdruckerei originally wanted. STMicroelectronics ID general manager Andreas Raeschmeier says extensive field-testing is still required and that work needs to be done to ensure the durability of any solution, given its expected 10-year lifespan. The European Union is also demanding tighter security, both on the chip and in terms of how the data will be handled in government systems.
    Click Here to View Full Article
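
    The 32 KB versus 64 KB chip question above is at bottom a capacity question. The Python sketch below models it, following ICAO's published organization of passport contents into numbered data groups (DG1 holds the machine-readable zone, DG2 the face image, DG3 optional fingerprints); the byte sizes are illustrative assumptions, not figures from the article:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class EPassportChip:
            """Illustrative layout of ICAO data groups on a contactless chip."""
            mrz_data: bytes                     # DG1: machine-readable zone
            face_image: bytes                   # DG2: compressed face image
            fingerprints: List[bytes] = field(default_factory=list)  # DG3 (optional)

            def fits(self, capacity_kb: int) -> bool:
                """Check whether the stored data groups fit in the given chip size."""
                used = (len(self.mrz_data) + len(self.face_image)
                        + sum(map(len, self.fingerprints)))
                return used <= capacity_kb * 1024

        # Assumed sizes: ~100-byte MRZ, ~18 KB face image, two ~10 KB fingerprints.
        chip = EPassportChip(b"\x00" * 100, b"\x00" * 18_000,
                             [b"\x00" * 10_000, b"\x00" * 10_000])
        print(chip.fits(32), chip.fits(64))   # -> False True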

  • "Groundbreaking Research: ACM's SIGGRAPH 2004 Papers"
    Video Systems (06/04) Vol. 2, No. 1, P. 23; Doyle, Audrey

    SIGGRAPH's Papers sessions have been consistently popular as a platform for spotlighting important and challenging new areas of research in computer graphics and interactive techniques, and the papers being presented at this year's conference promise to continue that tradition, detailing both incremental and seminal advances. ACM SIGGRAPH 2004 Papers Chair Joe Marks boasts that SIGGRAPH papers are of a consistently high quality, and explains that "computer graphics is unique in that there's one indisputable 'best place' to publish, and that's SIGGRAPH." The Papers Committee conditionally approved 83 papers out of this year's 478 submissions, while a dozen papers were also selected for publication in ACM's Transactions on Graphics (TOG). Marks is particularly excited about sessions that concentrate on capture and display devices, which he thinks will be a key area of research, development, and commercial application over the next decade. Also of interest to Marks are papers that detail interactive techniques, such as "Interactive Modeling" and "Identifying & Sketching the Future." The review process is challenging for the Papers Committee because submissions have increased in the range of 10 percent to 20 percent over the last several years, while the number of reviewers has not risen at the same rate. SIGGRAPH has also had to make do with a smaller budget for the review process because of lower attendance and fewer vendor exhibitions, and 50 percent of this year's budget was funded through donations. In addition, more and more submissions possess an experiential element, meaning that reviewers have had to personally experience the systems and products detailed in the papers in order to evaluate them.
    Click Here to View Full Article

    For more information, or to register for SIGGRAPH 2004, visit http://www.siggraph.org/s2004/.

  • "Is Natural Language Real?"
    Speech Technology (06/04) Vol. 9, No. 3, P. 34; Dahl, Deborah A.

    In 1950, Alan Turing predicted that computers would be able to process unconstrained natural language within approximately 50 years, and we are about halfway there, technologically speaking. Natural language processing technologies are not advanced enough to enable people to address computers as freely as they would address other people, but they still have practical uses. Applications benefiting from natural language processing include foreign language tutoring, spoken and text-based dialogue systems, spell-checking, text summarization, document retrieval, language translation, database queries, Web search, and environment and visual display controls. The most general and reusable natural language processing scheme follows the standard model of linguistic analysis, covering the steps of morphological analysis, lexical lookup, syntactic parsing, semantic analysis, and pragmatic analysis; however, limited domain applications such as commercial spoken dialogue systems are better served through semantic annotations on speech recognition grammars. A more recent scheme based on statistical probabilities is applicable to an application's speech recognition and natural language comprehension features: In the first instance, the system recognizes speech based on the statistical likelihood of words coming in a certain order. Users have a greater latitude of verbal expression because a statistical system can identify many inputs the developer may not have anticipated. A second statistical method squeezes intent out of the user's utterances by classifying those vocalizations into one of several previously defined categories through comparison to established samples. Initiatives to further advance natural language processing are being undertaken by academic institutions, corporate research labs, and federal agencies such as the National Science Foundation.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
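
    The first statistical method described above--scoring recognition hypotheses by how likely each word is to follow the one before it--is, at its simplest, a bigram language model. A toy sketch in Python (the three-utterance corpus is invented; production recognizers train on enormous corpora and smooth unseen word pairs rather than scoring them zero):

        from collections import Counter

        corpus = [
            "show my account balance",
            "show my recent calls",
            "pay my phone bill",
        ]

        # Count adjacent word pairs across the training utterances.
        pairs = Counter()
        unigrams = Counter()
        for sentence in corpus:
            words = ["<s>"] + sentence.split()
            for prev, word in zip(words, words[1:]):
                pairs[(prev, word)] += 1
                unigrams[prev] += 1

        def score(sentence):
            """Product of P(word | previous word); 0 if a pair was never seen."""
            words = ["<s>"] + sentence.split()
            p = 1.0
            for prev, word in zip(words, words[1:]):
                p *= pairs[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0
            return p

        # A plausible word order outscores an implausible one.
        print(score("show my phone bill") > score("show bill my phone"))  # True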

  • "Sensors & Sensibility"
    IEEE Spectrum (07/04) Vol. 41, No. 7, P. 22; Kumagai, Jean; Cherry, Steven

    Personal information can be exploited more creatively and intrusively thanks to the emergence of new tracking and monitoring technologies, database mining programs, and greater tolerance toward surveillance; privacy experts are worried about this trend, especially because legal safeguards have not caught up. Radio-frequency ID (RFID) tags, sensor networks, biometric scanners, and video surveillance are just some of the technologies growing in sophistication and being used to build finer-grained profiles of people's personal details, transactions, habits, and activities that government and commercial networks sift through to supposedly uphold law enforcement, security, and convenience. A new report commissioned by the Defense Department cautions that the federal government is using data-mining tools "to scrutinize personally identifiable data concerning U.S. persons who have done nothing to warrant suspicion," a practice that could potentially curtail religious expression, public discourse, and political dissent without legal and technological oversight. The possibility that pervasive technologies such as RFID tags and the IPv6 Internet addressing scheme could make our lives even more transparent to marketers, law enforcement, government, and others has spurred privacy organizations to call for limits, while a small number of researchers are developing countermeasures. Palo Alto Research Center scientist Teresa Lunt is developing an ID-protective "privacy appliance" that can be affixed to a database to filter incoming and outgoing data like a firewall. Mohan Trivedi of the University of California, San Diego, has invented a video surveillance system that represents people and other objects as colored cubes, and only reveals the true image if suspicious activity is detected. Carnegie Mellon University's Latanya Sweeney, meanwhile, has created software that modifies the results of database searches so that no identifying details are uncovered.
    Click Here to View Full Article
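
    Sweeney's best-known approach to making released data non-identifying is k-anonymity: generalize or suppress the quasi-identifying fields until every released record is indistinguishable from at least k-1 others. A minimal sketch in Python (the records and the single generalization rule are invented for illustration; whether this mirrors the specific software described above is an assumption):

        from collections import Counter

        records = [
            {"zip": "02139", "age": 34, "diagnosis": "flu"},
            {"zip": "02139", "age": 36, "diagnosis": "asthma"},
            {"zip": "02144", "age": 35, "diagnosis": "flu"},
            {"zip": "94305", "age": 61, "diagnosis": "flu"},
        ]

        def generalize(rec):
            """Coarsen quasi-identifiers: truncate the ZIP, bucket age by decade."""
            return {"zip": rec["zip"][:3] + "**",
                    "age": str(rec["age"] // 10 * 10) + "s",
                    "diagnosis": rec["diagnosis"]}

        def k_anonymize(rows, k=2):
            """Release a row only if its quasi-identifier combo appears >= k times."""
            rows = [generalize(r) for r in rows]
            counts = Counter((r["zip"], r["age"]) for r in rows)
            return [r for r in rows if counts[(r["zip"], r["age"])] >= k]

        for row in k_anonymize(records):
            print(row)   # the lone 943**/60s record is suppressed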


 