
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 669: Friday, July 16, 2004

  • "Life After Death for CAPPS II?"
    Wired News (07/16/04); Singel, Ryan

    Whether the proposed Computer Assisted Passenger Pre-Screening System II (CAPPS II) is dead for real or in name only is a matter of debate. Homeland Security's Suzanne Luber says the program itself is being redesigned because of public comments and other "operational factors," but privacy activists such as Bill Scannell and Barry Steinhardt of the ACLU's Technology and Liberty program are confident that the plan is beyond resuscitation. The purpose of the $100 million CAPPS II program was to keep terrorists and violent criminals off commercial flights through the use of commercial databases, intelligence data, a centralized terrorist watch list, and a list of outstanding warrants, but both left-wing and right-wing civil liberties groups sharply criticized the plan as a tool for privacy infringement that only gives the illusion of better airline security. The American Conservative Union (ACU) has promised to take a firm stance against any successor effort that shares a core similarity to CAPPS II. "Renaming a program does not satisfy the civil liberties concerns of conservatives so long as that program turns law-abiding commercial airline passengers into terrorism suspects," asserts the ACU's Ian Walter. Meanwhile, the Government Accountability Office (GAO) reported six months ago that CAPPS II failed to meet seven of its eight criteria for certification. Senate Governmental Affairs Committee chair Sen. Susan Collins (R-Maine), a CAPPS II supporter, believes the program can uphold both airline security and passenger privacy if properly designed and deployed. The rapid rollout of a successor program is unlikely: The Transportation Security Administration would probably have to reissue a Privacy Act notice describing the workings of the system, gather comments on that notice, issue new regulations or a clandestine mandate to force airlines to disclose passenger data to the system, and have the GAO certify it.
    Click Here to View Full Article

  • "Internet Engineers Step Back to See Ethical Picture"
    Investor's Business Daily (07/15/04) P. A6; Riley, Sheila

    The maturation of the Internet is forcing engineers to consider privacy and other ethical ramifications, according to experts. Central Connecticut State University computer science and philosophy professor Brian O'Connell, who heads a group within the IEEE studying technology's societal impact, says the group's purpose is to integrate "technical and social understanding," partly through education. For example, software engineers who build commercial Web sites are encouraged to ask how information will be gathered and presented, and how the privacy of personal data will be preserved; e-business is especially supportive of the group's goals, since trust established by reliable technical features is essential to maintaining the health of an online economy, O'Connell notes. Solving online ethical problems is the mission of academic-industrial collaborations such as those coordinated by Santa Clara University's Markkula Center for Applied Ethics. The facility's director, Kirk Hanson, observes that sensitivity to cyberspace ethics issues has grown considerably. He says, "We study ethical choices so that companies, governments and individuals understand the nature of the choices they have to make." Among the issues the center seeks to address are the corporate responsibility to protect clients, the level of backup a company sustains, and the desire to close the gap, or digital divide, between those who have access to computers and those who do not. VeriSign security director Jerry Brady reports that privacy infringement is the most important issue, although it is not always intentional: Frustrated gamers, for instance, may choose to hack into their opponents' systems if they are not winning. The digital piracy of copyrighted material is another major consideration, one that intellectual property lawyer Craig Cardon traces back to the anonymity people enjoy online. Educating people about the harm they cause through such activities is one solution Brady supports.

  • "New Panels Announced for SIGGRAPH 2004"
    Business Wire (07/14/04)

    The ACM SIGGRAPH 2004 conference, the 31st International Conference on Computer Graphics and Interactive Techniques, which takes place August 8-12 in Los Angeles, will bring back the popular panels program. The roughly 25,000 conference attendees will have a chance to hear top experts in various computer graphics fields discuss pressing topics, as well as the opportunity to ask questions and offer criticism, says SIGGRAPH 2004 Panel Chair Jonathan Gibbs. "Plus, this year's moderators and panelists read like a 'Who's Who' of the SIGGRAPH community," he notes. The panels include "3D Animation: Difficult or Impossible to Teach and Learn?," a forum on the teaching methods and learning processes that 3D animation--a unique blend of computer science and art--demands; and "Next-Generation User Interface Technology for Consumer Electronics," which focuses on using work done within the SIGGRAPH community to improve consumer device interfaces. "Careers in Computer Graphics Entertainment" offers insight from both large and small companies about employment, while "Custom Software Development in Post-Production" covers the creation and use of custom software for high-end work. Such software is often fragile and difficult to use, but nevertheless often produces the most impressive results. "Games Development: How Will You Feed the Next Generation of Hardware?" reviews the increasing pressure on games development teams to exploit ever-increasing hardware capabilities and to pioneer new gaming approaches. "Building a Bridge to the Aesthetic Experience" and "Cultural Heritage and Computer Graphics" will separately address how useful computer graphics are in conveying educational and cultural lessons.
    Click Here to View Full Article
    For more information about SIGGRAPH, visit http://www.siggraph.org/

  • "Sizing Up Robots"
    Journal News (NY) (07/15/04); Alterio, Julie Moran

    Today's robots are a far cry from the thinking, feeling, companionable (and sometimes antagonistic) machines envisioned by science fiction and considered by many people to be the ultimate goal of robotics research. Current robots are nowhere near the humanoid models popularized in movies and stories--nor are they intelligent enough: IBM researcher and MIT graduate Jonathan Connell calls intelligence the most important component in robotics, one that is conspicuously absent. He characterizes the simple act of tying one's shoes as a far more difficult and challenging task for a robot than, say, playing chess, and explains that imbuing a machine with human-like intelligence will require a deep understanding of the human thought process. But while progress on machine intelligence may be slow, advancements in other areas such as machine vision and navigation are proceeding rapidly. Robotics experts openly acknowledge that their enthusiasm--and indeed, their career choice--has often been inspired by the artistry of fictional robots. iRobot co-founder Helen Greiner says she owes her decision to enter the robotics field to R2-D2 from "Star Wars," and adds that the upcoming film "I, Robot" should inspire a new generation of engineers. Early fictional robots were often cast in a sinister light, but Isaac Asimov explored robots' benevolent possibilities with his formulation of the three laws of robotics, a series of principles that prevent robots from harming humans. Robotics pioneer Joseph Engelberger does not think the field of robotics has come very far in the last four decades, and is particularly disappointed at the lack of practical domestic robots. He thinks the technology exists to build a sophisticated robot companion or caregiver for elderly people, but argues that the robotics industry has strayed from Asimov's vision of machines that can positively impact people's lives.
    Click Here to View Full Article

  • "Father of Visual Basic Begs: Stop the Insanity!"
    ITBusiness.ca (07/09/04); Greiner, Lynn

    In his book, "The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity," Visual Basic architect Alan Cooper argues that software has become excessively and unreasonably complicated because programmers and engineers are designing it for their peers rather than for end users. He contends that "In our rush to accept the many benefits of the silicon chip, we [business executives] have abdicated our responsibilities" and put engineers in charge of the high-tech industry. Cooper says the programmers' goal is for the programming process to be simple and seamless, whereas users desire simple and seamless software operation. He categorizes users as either apologists or survivors, the former being users who do not mind software's problems because they relish challenge, and the latter being those who have little knowledge about computers yet have little choice but to cope with bad software. The first half of the book is dedicated to the description and causes of bad software, and Cooper asserts that allowing software engineers to tackle software failures is akin to "asking the fox to solve henhouse security problems." His preferred solution is interaction design, a process that involves constructing profiles of the probable users of software under development--their biographies, needs, and skills--and customizing the interface and feature set to their requirements. Certain areas of Cooper's book, which is expanded from its original 1999 edition, need updating, as they mention problems that have since been addressed.
    Click Here to View Full Article

  • "Debate Over Auctions for Internet Addresses"
    International Herald Tribune (07/14/04); Schenker, Jennifer L.

    At a week-long meeting scheduled to begin in Kuala Lumpur on Saturday, ICANN will consider changing the process for choosing administrators for generic top-level domains, such as .com and .net, from a merit-based to an auction system. ICANN had asked the Paris-based Organization for Economic Cooperation and Development to weigh in on the issue, and OECD has "come down on the side of auctions," according to Sam Paltridge, an economist for the organization. Critics of the current system say ICANN has been too political in awarding contracts for the management of domain names. "There is less risk of decisions being challenged in court and less risk of favoritism" in an auction system, Paltridge says. Meanwhile, VeriSign, which currently oversees the addressing system for .com and .net, opposes auctions, citing the huge monetary commitment that goes into running top-level domains. Robert Shaw, an Internet strategy and policy adviser for the United Nations' International Telecommunication Union, has also raised concern over turning domain management into a "marketplace." Other critics charge that auctions could lead to overbidding, which would shut out smaller contenders. OECD has proposed using a lottery system to award non-profit domain contracts as a way of offsetting this potential disparity. OECD also suggests that auction participants be pre-qualified to ensure their technical and financial viability. OECD's report says that auctions would "provide a transparent and verifiable mechanism for the market to value .net appropriately and avoid the pitfalls associated with comparative selection." ICANN vice president Paul Verhoef says the OECD study "is an important work," but notes that the organization also plans to hear from technical experts as well as the World Intellectual Property Organization.
    Click Here to View Full Article

  • "Probabilities Ease Genetic Logic"
    Technology Research News (07/21/04); Patch, Kimberly

    The use of genetic algorithms--mathematical calculations that "evolve" a random population of solutions or designs to arrive at an optimum generation--is complicated by the enormous volume of data produced; this challenge is often met by using multiple parallel processors, which in turn raises the problem of efficiently passing data back and forth between processors. Researchers at Portugal's University of Algarve have accelerated the process with a probabilistic model-building genetic algorithm that permits models of entire populations to be transferred between processors. A separate compact genetic algorithm runs on each of several processors, each of which shares information with a central coordinator processor at certain intervals; this enables every processor to activate or deactivate at any given time so that the system can scale to more or fewer processors, an ability that gives the system fault tolerance. "In traditional parallel genetic algorithms, population members need to be sent back and forth between the different processors," notes University of Algarve computer science professor Fernando Lobo. "With the compact genetic algorithm...we only need to send [a] probability vector, which can be transmitted much faster than the whole population." Lobo says the researchers' goal is to ease the use of genetic algorithms, and David Goldberg at the University of Illinois at Urbana-Champaign believes their work has expanded the scope of using parallelism by enabling the exchange of data about compressed populations rather than individuals. Lobo explains that an increase in population also results in an increase in communications savings: For instance, a population of 1 million individuals on a 1,000-bit problem occupies a gigabit of information, but the compact genetic algorithm reduces what must be transmitted to 20,000 bits.
    Click Here to View Full Article
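
    A minimal sketch of a compact genetic algorithm in Python may make the compression concrete. The OneMax fitness function and all parameter values are illustrative assumptions, and this is a single-processor toy of the technique Lobo describes, not the Algarve group's code:

      import random

      def onemax(bits):
          # Stand-in fitness function (assumption): count of 1-bits.
          return sum(bits)

      def sample(p):
          # Draw one individual from the probability vector.
          return [1 if random.random() < pi else 0 for pi in p]

      def compact_ga(n_bits=32, pop_size=100, max_steps=20000):
          # The probability vector IS the population in compressed form:
          # n_bits probabilities instead of pop_size * n_bits raw bits.
          p = [0.5] * n_bits
          for _ in range(max_steps):
              a, b = sample(p), sample(p)
              winner, loser = (a, b) if onemax(a) >= onemax(b) else (b, a)
              for i in range(n_bits):
                  # Where the two samples disagree, nudge the vector
                  # toward the winner by one virtual population member.
                  if winner[i] != loser[i]:
                      p[i] += 1.0 / pop_size if winner[i] else -1.0 / pop_size
                      p[i] = min(1.0, max(0.0, p[i]))
              if all(pi < 1e-9 or pi > 1.0 - 1e-9 for pi in p):
                  break  # vector has converged to a single solution
          return p

    In the parallel setting, only p crosses the wire, which is how a population of 1 million individuals on a 1,000-bit problem shrinks from a gigabit of raw bits to the 20,000 bits Lobo cites.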

  • "Bluetoothful Loser"
    Technology Review (07/15/04); Brown, Eric S.

    Short-range Bluetooth networking technology will get an upgrade soon with Enhanced Data Rate (EDR), just as the technology's popularity begins to take hold. Two million Bluetooth devices ship each week, according to the Bluetooth Special Interest Group, many of them smart phones in the European market, PDAs, or wireless headsets. Yet Bluetooth is racing against time as the competing ultrawideband technology nears standardization; ultrawideband allows much faster data transfers and has drawn the support of companies such as Intel away from Bluetooth. Currently, Motorola and the Institute of Electrical and Electronics Engineers are embroiled in a standards debate that has held up ultrawideband and given Bluetooth further chance to embed itself in the market. Bluetooth EDR, expected to arrive next year, offers 2.1 Mbps speeds that actually save power because transfer sessions are completed sooner. Gartner analyst Todd Kort says Bluetooth EDR will have only a short window of opportunity to make an impact before ultrawideband hits the market--and even with faster speeds, Bluetooth EDR is still not competitive with the low-power ultrawideband technology, which can send data at up to 110 Mbps at 10 meters and as much as four times faster within two meters. Ultrawideband is designed as a replacement for PC cables, a role in which Bluetooth never succeeded, and is also cheap to put into mobile devices; and even though the wireless networking Wi-Fi technology is not a direct competitor to Bluetooth, its popularity could make it a threat. IDC analyst Alex Slawsby says the main problem with Bluetooth is the inability of vendors' products to interoperate without configuration, which creates complexity for users.
    Click Here to View Full Article
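
    The power claim is simple arithmetic: a faster radio finishes the same transfer sooner and can shut off. A rough sketch in Python, assuming roughly 0.72 Mbps for classic Bluetooth (an approximation of its asymmetric rate; the 2.1 Mbps EDR figure is from the article):

      # Radio-on time for the same payload at the two rates.
      payload_mbit = 8.0  # a 1 MB transfer
      for name, rate_mbps in (("Bluetooth 1.x", 0.72), ("Bluetooth EDR", 2.1)):
          print(f"{name}: {payload_mbit / rate_mbps:.1f} s of radio-on time")

    At equal transmit power, roughly one-third the airtime means roughly one-third the radio energy per transfer.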

  • "New SGI Supercomputer to Scale Linux to 1,024 CPUs"
    Computerworld (07/15/04); Weiss, Todd R.

    Researchers at the University of Illinois Urbana-Champaign's National Center for Supercomputing Applications (NCSA) will have vastly more computing power to study meteorological data, simulate astronomical phenomena, and perform other computationally intensive operations with the installation of Cobalt, an Altix supercomputer from Silicon Graphics (SGI) that can scale the Linux operating system to 1,024 Itanium 2 processors and 3 TB of shared memory. Older NCSA-based cluster supercomputers employed an individual image of the Linux OS for every node, and featured dedicated memory allocations for each processor. NCSA interim director Rob Pennington explains that Cobalt's increased power will stem from the availability of all the memory for the applications and calculations, which will ease programming, yield better performance, and help accelerate and polish the work being carried out. He notes that two images of Linux will be used in the initial rollout, each one encompassing 512 processors, during Cobalt's testing and configuration. Later on, the full complement of processors will host a single image of the SGI Advanced Linux operating system. SGI says that Cobalt will consist of a symmetric multiprocessor machine linked to a 370 TB shared-file system, and other supercomputers based at the NCSA will have access to the storage. Cobalt's peak performance could exceed 6 TFLOPS, while total disk storage at NCSA will be expanded to 750 TB, or three-quarters of a petabyte. The system's online deployment is expected to be complete by March 1, 2005.
    Click Here to View Full Article

  • "Computer Brains"
    e4Engineering (07/14/04)

    Artificial Development recently announced the completion of the CCortex-based Autonomous Cognitive Model (ACM), a realistic simulation of a functioning human cortex's workflow that runs on the company's 1,000-processor, 500-node Linux supercomputing cluster. The CCortex system is designed to imitate the architecture of the human brain, and boasts a tiered configuration of 20 billion neurons and 20 trillion connections. CCortex is reportedly the first neural system whose complexity rivals that of the mammalian brain, and is up to 10,000 times bigger than any preceding attempt to partly or totally simulate primary traits of human intelligence. June marked the activation of Kjell, the first ACM computer "persona," which is currently undergoing testing. At a recent Workshop on Cognitive Systems hosted by the University of New Mexico and Sandia National Laboratories, Artificial Development CEO Marcus Guillen said that ACM is designed as a testbed for later models; he noted that the Kjell persona features a realistic frontal cortex along with motor and somatosensory areas, but still needs visual and auditory cortex sections, while basal ganglia, the hippocampus, thalamic systems, and other areas are under development. The ACM's learning process consists of a stimulus-reward scheme, and the persona interacts with trainers via a text console interface, writing answers to the trainers' questions. The associative cortex "evolves" competing candidate responses by having large groups of neurons vie for their own associated response until the strongest population prevails. The ACM is or is not rewarded depending on the validity of the "winning" response, and it adjusts the balance between the responses and the strength of the associative neural path as new experiences accumulate.
    Click Here to View Full Article
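
    A toy sketch of the competitive stimulus-reward scheme described above; the response set, noise model, and learning rate are invented for illustration and bear no relation to Artificial Development's implementation:

      import random

      # Competing response "populations" and their associative strengths.
      strengths = {"yes": 1.0, "no": 1.0, "maybe": 1.0}

      def respond():
          # Each candidate response fires with noise; the strongest wins.
          firing = {r: s * random.uniform(0.5, 1.5) for r, s in strengths.items()}
          return max(firing, key=firing.get)

      def train(correct, trials=500, lr=0.05):
          for _ in range(trials):
              answer = respond()
              reward = 1.0 if answer == correct else -1.0  # trainer's verdict
              # Reward strengthens the winner's path; punishment weakens it.
              strengths[answer] = max(0.1, strengths[answer] + lr * reward)

      train("yes")
      print(strengths)  # "yes" should now dominate the competition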

  • "802.11n: The Battle Begins"
    Wi-Fi Planet (07/12/04); Griffith, Eric

    The corporate fight over next-generation Wi-Fi standards is just starting to take shape, with the 802.11n Task Group N (TGn) entering the proposal phase at the IEEE 802 Plenary Session. So far, TGn has a total of 22 complete proposals and 39 partial proposals for the new standard, but none of them will be heard until September; the deadline for proposals is August 13. The plenary session meeting will focus on technical presentations to introduce topics and gather feedback, says TGn chair Bruce Kraemer. TGn is aiming for at least 100 Mbps of actual throughput, not just data rate, for 802.11n, while Agere Systems is leading a group consisting of Intel, Sony, Nokia, Philips, and Atheros that will offer up to 500 Mbps data rate using a 40 MHz channel instead of the 20 MHz channel used by today's Wi-Fi systems. The Agere proposal, named TGn Sync, is backward-compatible with 20 MHz systems, and would standardize on 2x2 multiple-input multiple-output (MIMO) antennas providing 250 Mbps data rate, with an optional 4x4 MIMO configuration for speeds up to 500 Mbps. Agere's Mary Cramer says the proposal goes beyond the channel bonding technique used in Atheros chips because it fills in the gaps between the channels as well; Japan does not allow Wi-Fi outside of 20 MHz channels, but Cramer says, "You can't cripple the standard for one country." Another major contender comes from Airgo and its partners in the World Wide Spectrum Efficiency (WWiSE) group: Airgo CEO Greg Raleigh and chief scientist VK Jones pioneered MIMO, and plan to use 20 MHz channels with 4x4 spatial-multiplexing MIMO. The jockeying between companies is already heating up, with Agere warning about non-IEEE patents that could torpedo WWiSE technology, but even Airgo's Carl Temme says 802.11n will likely be made up of a variety of technologies and approaches as compromises are struck.
    Click Here to View Full Article
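
    The headline rates in the TGn Sync proposal scale linearly with the number of spatial streams; the per-stream figure below is back-solved from the quoted 2x2 = 250 Mbps and is an inference, not a number from the proposal:

      # Raw data rate ~ spatial streams x per-stream rate in a 40 MHz channel.
      base_per_stream = 125  # Mbps (assumed: 250 Mbps / 2 streams)
      for streams in (2, 4):
          print(f"{streams}x{streams} MIMO, 40 MHz: ~{streams * base_per_stream} Mbps")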

  • "New World Computer Chess Champ Crowned"
    New Scientist (07/13/04); Knight, Will

    The latest version of the Junior software program developed by Israeli programmers Amir Ban and Shay Bushinsky has assumed the throne as the world's computer chess champion, based on the results of this year's finals on July 12. Junior differs from most leading chess programs in that it places more emphasis on factors such as mobility and positional advantage than on the value of individual pieces, which enables the program to extrapolate very audacious and atypical maneuvers, although it also becomes vulnerable to human-like errors. ChessBase co-founder Frederic Freidel observes that the various programs vying for the championship boast individual characters dictated by what factors each program's algorithms emphasize. Older chess programs consumed enormous computing power because they analyzed potential maneuvers in painfully encyclopedic detail, whereas current leading programs employ more intelligent algorithms to streamline the amount of required positional searching. Junior and similar programs study approximately 3 million moves every second--far fewer than older programs--but decrease their workload by ignoring unpromising lines of search. Such programs can operate more effectively on less computer power. The specialization of chess programs often lowers their value to artificial intelligence research, but they are still considered valid for research into problem solving and algorithms.
    Click Here to View Full Article
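
    The article does not say which algorithms Junior uses, but the classic technique for safely ignoring lines of search is alpha-beta pruning, sketched here in Python over an abstract game tree (children and evaluate are placeholders a real engine would supply):

      def alphabeta(state, depth, alpha, beta, maximizing, children, evaluate):
          # Minimax with alpha-beta cutoffs: branches that provably cannot
          # change the final choice are skipped, which is what lets modern
          # programs search millions rather than billions of positions.
          kids = children(state)
          if depth == 0 or not kids:
              return evaluate(state)
          if maximizing:
              value = float("-inf")
              for child in kids:
                  value = max(value, alphabeta(child, depth - 1, alpha, beta,
                                               False, children, evaluate))
                  alpha = max(alpha, value)
                  if alpha >= beta:  # opponent would never allow this line
                      break          # prune the remaining siblings
              return value
          value = float("inf")
          for child in kids:
              value = min(value, alphabeta(child, depth - 1, alpha, beta,
                                           True, children, evaluate))
              beta = min(beta, value)
              if alpha >= beta:
                  break
          return value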

  • "The Rise of 'Digital People'"
    MSNBC (07/13/04); Perkowitz, Sidney

    In his book, "Digital People: From Bionic Humans to Androids," Emory University physics professor Sidney Perkowitz puts the development of artificial beings into perspective. He notes that our fascination with artificial beings covers the whole gamut of human reactions in art and literature, ranging from feelings of love and compassion toward them ("Blade Runner") to fears of a robotic takeover or machine-driven annihilation of the human race ("The Terminator") to artificial beings' aspiration to become human ("Star Trek") to visions of robots possessing the intelligence, wisdom, and ethics to improve humanity ("I, Robot"). Perkowitz believes the potential medical and therapeutic benefits of bionic or robotic technologies present the most compelling--and moral--case for the development of artificial beings. However, robot technology is currently limited in terms of intelligence, mobility, perception, and navigation, and no machine is truly conscious or emotional. Nor are robots behaviorally indistinguishable from people, which British mathematician Alan Turing suggested would be the defining criterion for an intelligent machine. But the author postulates that the technology may exist to reach Turing's benchmark through advancements in artificial intelligence, digital electronics, computational technology, nanotechnology, materials science, molecular biology, and other areas. Perkowitz cites Georgia Institute of Technology roboticist Ronald Arkin's book, "Behavior-Based Robotics," which contends that it is enough that artificial beings merely boast a semblance of consciousness, not actual consciousness. The Emory University professor concludes that the creation of artificial beings is attractive because it represents an opportunity to explore the human condition in more detail: "To create artificial minds and bodies, we must first better understand ourselves," he asserts.
    Click Here to View Full Article

  • "Tulsa Leading in Cyberterrorism Training"
    Associated Press (07/12/04)

    University of Tulsa officials this week announced that the school has received a $4.7 million grant from the National Science Foundation (NSF) for its Cyber Corps program, which aims to create a force of computer security specialists to detect cyberterrorism. Tulsa runs the largest Cyber Corps program in the U.S., with 47 graduates in three years and 38 students currently in training. Among the other 20 universities participating are the Naval Postgraduate School, the State University of New York at Stony Brook, and Carnegie Mellon University, all of which also recently received NSF grants. The program was conceived after the September 2001 terrorist attacks in the United States, and started with just six universities. The idea is to recruit, train, and place skilled workers in government agencies that rely heavily on IT. The program offers scholarships and other incentives, some of which are purely intellectual; members get two years of training and then must work for the federal government for two years. Tulsa Cyber Corps graduate Leigh Ann Winters says she enjoys participating in raids with government agents, where she is first on the scene to inspect digital evidence on PC hard drives, mobile phones, and digital cameras.
    Click Here to View Full Article

  • "More Than an Open Source Curiosity"
    CNet (07/15/04); LaMonica, Martin

    Mono version 1.0 has begun shipping, enabling developers to take advantage of Microsoft's .Net development platform to write programs for Linux and other operating systems. The open-source Mono is essentially an independent implementation of the .Net specifications Microsoft submitted to the Ecma International standards body, and Mono project founder Miguel de Icaza says the three years it took to bring Mono to fruition have left the project 18 months behind the official .Net, but that the technology is still a very attractive alternative for developers. Novell, which acquired de Icaza's Ximian and the stewardship of Mono last year, is standardizing on Mono as its internal development platform because it supports Windows, Linux, and the Mac OS with the same tool base, and lets developers avoid platform-specific details. De Icaza says Mono is late to market compared to .Net because it is the nature of open-source software to respond to need rather than generate it; the Mono team, however, is already working on version 2.0 features that will incorporate Microsoft's C# 2.0, for example. Microsoft has not yet defined its Longhorn technology as it has .Net, so Novell cannot do much for the Linux community in terms of making compatible software, though Novell's iFolder already performs many of the data synchronization and backup tasks that Longhorn will include. The Mono project is trying to avoid Microsoft's stable of at least 30,000 patents and will remove any infringing code upon notification, says de Icaza; he is certain the idea of a multi-language virtual machine such as .Net is already covered by prior art, specifically a failed Open Software Foundation development. De Icaza says J2EE has become too difficult for smaller software enterprises to use, and the Mono project conducted a survey in which developers said J2EE was 25 percent less efficient than Microsoft's ASP.Net.
    Click Here to View Full Article

  • "Meet the Eye Cam"
    Newsweek (07/12/04); Levy, Steven

    Ko Nishino and Prof. Shree Nayar of Columbia University have devised a "corneal imaging system" that can capture and study the mirror image of a person's surroundings reflected in the limbus of the eye. A high-resolution camera first takes a picture of a person's face, and software lifts a wide-angle view of the subject's environment as seen in the limbus. The researchers noticed that the eye reflects a much broader scene than what the retina perceives, so the system they developed could enable people to play such scenes back to see what details they may have missed. Nishino and Nayar are also using algorithms and the eye's anatomical details to calculate what people are actually looking at. Nayar postulates that the technology could have security applications: For instance, a corneal image taken at a terrorist's hideout could provide hints as to his location, while surveillance cameras at security checkpoints could potentially be modified to notice whether people are spending an excessive amount of time studying the precautions. The Columbia researchers will detail how their system can enhance digital filmmaking at SIGGRAPH. For example, examining corneal reflections could ease the insertion of an object into a scene or the replacement of a person with a digital stand-in while re-creating the original lighting conditions. The technology could also be employed by computer-interface designers to develop software that displays information relevant to what a user is looking at, while psychologists could draw insight on human reactions by knowing what a person is staring at in a given moment.
    Click Here to View Full Article
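
    The geometric core of such a system is ordinary mirror reflection: from the camera ray and the eye's surface normal, one can recover the direction the reflected scene arrived from. A minimal Python sketch with a spherical stand-in for the cornea (Nishino and Nayar's actual eye model is more detailed):

      import numpy as np

      def reflect(d, n):
          # Mirror-reflect ray direction d about unit surface normal n:
          # r = d - 2(d . n)n
          d = d / np.linalg.norm(d)
          n = n / np.linalg.norm(n)
          return d - 2.0 * np.dot(d, n) * n

      # Toy spherical cornea centered at the origin (assumed geometry).
      surface_point = np.array([0.0, 0.0, 1.0])
      normal = surface_point                    # sphere normal = point - center
      camera_ray = np.array([0.3, 0.0, -1.0])   # ray from camera toward the eye
      print(reflect(camera_ray, normal))        # direction that scene point lies in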

  • "Top Ten Tech Trends"
    PC Magazine (07/13/04); Ulanoff, Lance; Rupley, Sebastian; Cohen, Alan

    Highly promising technologies under development include WiMAX (Worldwide Interoperability for Microwave Access), which is touted as a faster, cheaper, and more efficient substitute for cable modems and DSL lines; WiMAX service providers are hoping that the technology will carve out a niche in a projected $1.2 billion broadband wireless market. WiMAX is designed as a "last mile" technology that can bring wireless broadband into neighborhoods and office parks, as well as to the 20 percent of American households without DSL or cable access. Intentional Software's Charles Simonyi is hoping to evolve software design with self-writing software that uses intentional programming to build elaborate applications based on initial descriptions and diagrams contributed by developers. Researchers are also focusing on sensor "motes," tiny autonomous devices that can configure themselves into wireless sensor networks to monitor a patient's vital signs, building infrastructure, factory equipment, and water and chemical levels, to name just a few applications. The popularity of voice over IP (VoIP) is swelling thanks to services offered by major telecom providers and the likelihood that cable companies will soon join the bandwagon. VoIP involves the transmission of voice traffic over IP networks, which translates into improved call quality, especially for local calls; the technology is expected to drive the development of sophisticated phones with "presence" features, as well as hybrid cellular/Wi-Fi devices. The Smart Skin project at the University of Texas at Arlington has yielded a robust and flexible microsensor that could be embedded in clothing and objects for military, domestic, and medical monitoring applications. Multiple initiatives are afoot to advance machine translation and comprehension of text, a field boosted by the advent of terascale databases that can store more text for training and for comparison of similar phrases to find meaning; another promising development is software that employs new techniques for phrase-based, statistically ranked translation.

  • "Clique Here"
    Information Highways (06/04) Vol. 11, No. 4, P. 12; Pearson, Janice

    People are using social networking services such as Ryze, LinkedIn, Tribe, and Friendster to engage in social activities such as dating and making business contacts. Within companies, social networking could enable employees to tap resources that may not be apparent to them in daily office life, while outside of companies networking could spur higher sales potential and greater innovation via cooperation and collaboration. Computerized social networking is based on the "six degrees of separation" theory, although many networks cap searches at four degrees because networking breaks down, and trust between users and the people they meet becomes harder to establish, the further down the chain users go. Carolyn Burke, regional chair of Ryze Toronto, explains that her service offers a higher level of trust than one might have meeting complete strangers because the Ryze.com site allows users to profile their individual enthusiasms, invite people they already know, and find others with similar or reciprocal interests. Spoke Software's Chris Tolles says that his firm sells social networking software that lets individual business professionals, workgroups, and whole businesses find opportunities based on their relationships. Being a member of a social network can be a status symbol, but there are problems. Networks such as Orkut are notorious for surfacing the same people again and again, thus limiting networking opportunities, and the targeting and exclusion of "Fakesters"--users who build fake profiles--is another troublesome aspect.
    Click Here to View Full Article
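
    The four-degree cutoff maps directly onto a depth-limited breadth-first search of the contact graph; a small Python sketch with an invented contact list:

      from collections import deque

      def degrees_of_separation(graph, start, target, max_degree=4):
          # Shortest chain of introductions, capped at max_degree --
          # past that point the services judge trust too thin to be useful.
          if start == target:
              return 0
          seen, frontier = {start}, deque([(start, 0)])
          while frontier:
              person, depth = frontier.popleft()
              if depth == max_degree:
                  continue  # do not explore past the trust horizon
              for contact in graph.get(person, ()):
                  if contact == target:
                      return depth + 1
                  if contact not in seen:
                      seen.add(contact)
                      frontier.append((contact, depth + 1))
          return None  # no chain within the cutoff

      contacts = {"ana": ["bo"], "bo": ["cy"], "cy": ["dee"], "dee": ["eve"]}
      print(degrees_of_separation(contacts, "ana", "eve"))  # -> 4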

  • "Time for a Redesign"
    CIO Insight (06/04) No. 40, P. 44; Wieners, Brad

    Nielsen Norman Group co-founder and Web design and usability guru Dr. Jakob Nielsen says intranet usability is critical for raising the productivity of workers in the white-collar and service economies: He says, "to get productivity gains today we have to adjust the machines--and by machines now, we really mean software," and adds that adjusting machines to human thinking requires usability studies. Nielsen says the biggest problem in Web design continues to be search on individual Web sites or within intranets, while the second biggest problem is information architecture, which is still oriented around the production of information rather than its consumption. Another notable flaw is a shortage of clarity in content, in which descriptions are not explicit enough to give people the answers they need. According to Nielsen's November 2003 Alertbox column, two-thirds of corporate Web sites do not explicitly summarize the function of the site or company; fail to employ a liquid layout that enables users to adjust the size of the home page; do not use color to differentiate visited and unvisited links; use graphics as ornamentation rather than as illustrations of actual content; and fail to post active links to the home page on the home page. Other Web site pet peeves of Nielsen's include PDF documents, whose linear, letter-sized configuration makes searching and reading hard. Nielsen attributes resistance to conducting usability reviews to several factors, including high-level executives who are unaware of the poor shape of their intranet because they rarely use it; the failure of IT department workers to realize how hard the processes are for mainstream employees; and difficulties in delegating the responsibility of carrying out usability improvements. The usability maven recommends that the existing Web site or intranet design be rigorously studied before any redesign is implemented.
    Click Here to View Full Article


 