ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 619:  Wednesday, March 17, 2004

  • "Software Schools Evolve to Help Students Compete"
    San Jose Mercury News (03/17/04); Heim, Kristi

    China hopes to nurture the next generation of IT leaders and entrepreneurs and raise its profile as a global technology player through the establishment of software colleges at 35 Chinese universities. One such institution, Peking University's School of Software, is equipped with cutting-edge labs funded by major American tech firms such as Oracle, IBM, Microsoft, and Intel, which expect that their contribution will bring graduates into their China-based operations. There is heavy collaboration between the software school and Chinese and foreign corporations, and industry needs form the basis of the school's curriculum. The school's departments are headed by U.S. technology executives and academics, whose expertise includes not just technology, but business management--an area where China's software industry is sorely lacking, according to students. Subjects that students specialize in include integrated-circuit design, digital arts design, information security, and entrepreneurship, while dean Chen Zhong plans to cultivate students' creativity by leveraging Peking University's liberal arts assets. Most instruction is done in English. Chinese software schools are encouraging students to learn through real-world experience by having them participate in actual software development projects. Despite Chinese software professionals' current focus on low-level programming, the number of tech graduates China is churning out is intimidating: The U.S. President's Council of Advisors on Science and Technology estimates that 58 percent of all degrees awarded in China in 2002 were in engineering and physical sciences, compared with 17 percent in America. A total of 60,000 engineering bachelor's degrees were awarded in America in 2002, compared with almost four times that number in China.
    Click Here to View Full Article

  • "W3C Finalizes Internet Voice Standards"
    eWeek (03/16/04); Taft, Darryl K.

    The World Wide Web Consortium (W3C) announced the finalization of two core components of the Speech Interface Framework--VoiceXML 2.0 and the Speech Recognition Grammar Specification (SRGS)--on March 16; the former will be used to integrate Web-based development and content delivery with interactive voice response capabilities, while the latter will uphold VoiceXML's speech-recognition support. W3C officials say the Speech Interface Framework will let users interact with Web services via regular telephones using vocal or keyboard commands. "VoiceXML 2.0, which has been key in the growth of speech applications by providing a standards-based framework, allows businesses to deploy applications today that leverage existing development skills and resources," says IBM's pervasive computing program director Igor Jablokov. "Because it allows speech deployments to be built over a standard Web-application infrastructure, VoiceXML also provides a clear upgrade path as applications grow--unlike closed, proprietary languages." W3C's Janet Daly says VoiceXML 2.1 and 3.0 standards are planned by the Voice Browser Working Group, which is also exploring potential applications of Extensible Hypertext Markup Language plus Voice and Speech Application Language Tags. SRGS enables applications to define the words and phrases users can say in response to vocal prompts, but the standard has also found use in handwriting recognition and other applications. On the same day as the W3C announcement, the VoiceXML Forum declared that its 370-plus member companies are backing the VoiceXML 2.0 specification as well as building VoiceXML applications and services; the forum also presented a plan to launch a VoiceXML Platform Certification Program. Over 10,000 VoiceXML-based applications have been implemented worldwide, the forum estimates.
    Click Here to View Full Article
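
    To give a concrete feel for the markup these standards define, the sketch below assembles a minimal VoiceXML 2.0 dialog with Python's standard library. The element names (vxml, form, field, prompt, grammar) come from the published W3C Recommendations; the form id, field name, and grammar file name are invented for illustration.

```python
# Illustrative only: a minimal VoiceXML 2.0 document built with Python's
# standard library. Element names follow the W3C VoiceXML 2.0 and SRGS
# Recommendations; "order", "size", and sizes.grxml are made-up examples.
import xml.etree.ElementTree as ET

vxml = ET.Element("vxml", version="2.0", xmlns="http://www.w3.org/2001/vxml")
form = ET.SubElement(vxml, "form", id="order")
field = ET.SubElement(form, "field", name="size")
ET.SubElement(field, "prompt").text = "What size would you like?"
# The SRGS grammar constrains what the speech recognizer will accept here.
ET.SubElement(field, "grammar", src="sizes.grxml", type="application/srgs+xml")
ET.SubElement(form, "block").text = "Thank you."

document = ET.tostring(vxml, encoding="unicode")
```

    A voice browser fetches such a document over a standard Web infrastructure and drives the phone call from it, which is the "clear upgrade path" Jablokov describes.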

  • "Congress Let Privacy Programs Be Cut"
    Associated Press (03/16/04); Sniffen, Michael J.

    Last year's termination of the Pentagon's Terrorism Information Awareness (TIA) program was accompanied by the quiet elimination of a pair of privacy tools designed to partially satisfy critics. One such tool was designed for the Genisys program, which analyzed government and commercial records for signs of terrorist planning. Palo Alto Research Center researcher and former Defense Advanced Research Projects Agency (DARPA) program manager Teresa Lunt worked on Genisys software designed to protect identities in the records the government scanned, prevent each intelligence analyst from reviewing unauthorized data, and keep a permanent record so abuses could be tracked. The other privacy tool was developed for the Bio-ALIRT project, conceived as a bio-surveillance system that would look for signs of biological attacks by studying symptoms of patients at emergency rooms and doctors' offices, school absenteeism, and other factors. The Bio-ALIRT privacy software was principally devised by Carnegie Mellon University Professor Latanya Sweeney, who reported that DARPA funded the software's development, but did not pay for any public field tests. TIA may have been killed by congressional decree, but some data-mining projects originally developed under its aegis were retained and transferred to other intelligence agencies. Also preserved was the Novel Intelligence from Massive Data project sponsored by the office of Advanced Research and Development Activity. The Intelligence Authorization Act sanctions continued data-mining research, but declares that the development of policies and procedures designed to support civil liberties and privacy should proceed in parallel with the development of data-mining tools.
    Click Here to View Full Article

  • "California Researchers' New Center Will Study, Test Computer Viruses"
    Oregonian (03/14/04); Keefe, Bob

    A system that private firms and Internet monitors can use to examine, study, and possibly counter computer viruses and other forms of malware before they become a serious threat is being developed by the University of California at Berkeley and the University of Southern California with a $5.5 million federal grant. Berkeley researcher and associate professor Anthony Joseph says the closed network, known as the Defense Technology Experimental Research (DETER) project, should serve as the cyber-equivalent of the Centers for Disease Control and Prevention, where dangerous viruses are studied in isolation. This cybersecurity "test bed" could eventually comprise 1,000 computers distributed among three sites in California and Virginia. Once DETER is fully operational later this year, researchers will be able to introduce computer viruses found in the "wild" of the Internet into the network so they can analyze their propagation mechanisms and start developing countermeasures. DETER will also be used to modify or redesign existing malware in the hopes of getting a jump on future cyberattack strategies and methods. Berkeley's Shankar Sastry reports that the network will be used as a public resource, insisting that "The need for this sort of research was yesterday." Berkeley intends to open the system to private firms, government officials, and any other benevolent cybersecurity organization.
    Click Here to View Full Article
    (Access to this article is free; however, you will need to enter your zip code, year of birth and gender to access the article.)

  • "Rough Ride for Robots, But Humans Smiling"
    MSNBC (03/14/04); Boyle, Alan

    Although the desert may have defeated the robot vehicles racing in the Defense Advanced Research Projects Agency's (DARPA) Grand Challenge on March 13, competitors and DARPA officials said the project was ultimately successful. DARPA director Anthony Tether reported that his agency has already received a return on investment that is four or five times the $13 million funneled into the program, and is confident that the technologies demonstrated in the past week would help the U.S. military's mission to make one-third of its vehicles operate without human intervention by 2015. DARPA organizers wish to assess some of these technologies, and think that another Grand Challenge could be held in a year or two. "We did what we wanted to do, but that doesn't mean that we aren't going to keep coming back and keep doing this until somebody picks up that million-dollar check," said Tether. Grand Challenge project manager Air Force Col. Jose Negron was inspired by the grass-roots robotics movement the contest kindled. "You had teams collaborating--'my technology for your technology,'" he noted. Each robot was followed by a pickup truck with DARPA judges that employed wireless "E-Stop" controls to pause or deactivate the machines when they ran into trouble. None of the participating vehicles crossed the finish line, but the autonomous Humvee designed by Carnegie Mellon University's Red Team traveled 7.4 miles along the 142-mile off-road course--farther than any other contestant--before spinning out.
    Click Here to View Full Article

  • "Patently Unfair?"
    InternetNews.com (03/16/04); Kuchinskas, Susan

    Critics are faulting the U.S. Patent and Trademark Office for allowing patents on Internet technologies and business methods, a practice that allegedly enables a select few patent holders to set terms for e-commerce, digital media, and even the Web itself. The policy, based on the landmark court ruling in the case of State Street Bank & Trust v. Signature Financial Group in 1998, has opened many companies up to patent infringement lawsuits that stretch out for years and cost millions of dollars. Gesmer Updegrove attorney David Jacobs notes that many business process patents would be upheld as valid under scrutiny, since the same standards applied to any other patent award--novelty, non-obviousness, sufficiency, and utility--are applied to them; however, Jacobs points out that obviousness is a notoriously ambiguous criterion. Also adding to the patent office's troubles in determining the proper patent owner is the simultaneous filing of multiple patent applications for the same idea, explains Internet Patent News service publisher Greg Aharonian. Furthermore, the USPTO's stable of approximately 3,600 examiners is overworked: Each individual reviewer handles 40 to 50 cases at a time, while the review process can take as long as 18 months. The office's deputy director of public affairs, Brigid Quinn, estimates that the USPTO's current backlog stands at 500,000 cases; the office receives about 300,000 applications annually, which translates into roughly 1,000 new applications each business day. Aharonian thinks patent applicants should be more thorough when researching the existence of any prior art before applying. Meanwhile, last year the FTC made a list of recommendations the USPTO should institute, such as a process enabling post-grant review of and opposition to patents, and basing patent determinations on "a preponderance of evidence" rather than on "clear and convincing evidence."
    Click Here to View Full Article

  • "Gunning for the U.S. in Technology"
    Business Week (03/16/04); Salkever, Alex

    America's technological superiority is threatened by a confluence of offshore outsourcing and rising international competitiveness in terms of research, entrepreneurship, and innovation: Gartner estimates 500,000 U.S. technology jobs will head overseas in 2004, while U.S. companies seem to announce a new Asian R&D center every week. Although U.S. companies such as IBM, Microsoft, Hewlett-Packard, Texas Instruments, and General Electric remain dominant forces in global technology, other countries have supplanted the U.S. in areas such as mobile phones, information-security technology, and robotics. Asian countries especially are eager to foster home-grown technology and offer startups co-funding or five- and 10-year tax breaks. For its part, the United States under President Bush has significantly boosted research spending as a percentage of domestic discretionary spending, but critics say too much of that money goes towards defense research; basic key sciences such as chemistry, materials science, and physics are being left behind, leaving fewer chances for serendipity lab discoveries such as those that led to Teflon, Velcro, nylon, and X-rays. U.S. science education, while still preeminent globally, is less alluring due to more stringent visa processes and improved educational opportunities in people's home countries. Importantly, many of the top U.S. higher education institutions hosting foreign graduate students reported declines in applications, according to the Association of International Educators. Education spending in the United States also continues to fall while tuition skyrockets, leaving universities in dire financial straits, and experts also note the number of European scientists publishing work in major journals has surpassed the number of U.S. scientists; the U.S. has also begun importing more technology goods than it exports. Still, Sun Microsystems CTO Greg Papadopoulos says the U.S. will continue to shape the global technology environment because it leads in architectural innovation, such as better chip design or materials.
    Click Here to View Full Article

  • "GPS Technology Helps Blind Find Way"
    Wall Street Journal (03/17/04) P. B4D; Prince, Marcelo

    A number of portable devices on the market or under development are designed to help blind or visually impaired users get around, often by employing location data pinpointed by global-positioning-system (GPS) technology. Pulse Data International's BrailleNote is a textbook-size device equipped with a Braille keyboard and display; its uses, beyond sending email and Web browsing, include providing walking directions by voice output or the Braille display when linked to a GPS receiver. BrailleNote also integrates GPS with digital street maps and points of interest to tell blind users their location and surroundings. The Trekker navigation tool from VisuAide includes a GPS and voice receiver, and can store street maps and points of interest in a database; a forthcoming Trekker upgrade can emit vocal directions and create walking routes. Both gadgets suffer from one of GPS' biggest limitations: They cannot work indoors because GPS satellite signals are blocked by buildings. Vectronix is working with VisuAide and Sendero to address this problem by developing a machine that mates a GPS receiver to motion sensors; once users move indoors, the sensors activate to pinpoint their location and relay directions via a BrailleNote or handheld computer with mapping software and databases. Testing demonstrates that the gadget, which was adapted from technology originally developed for military applications, boasts an error rate of roughly 5 percent of the distance traveled, according to Vectronix's Quentin Ladretto. Techniques and devices for blind-pedestrian navigation are being explored and evaluated as part of a project underwritten by the National Institute on Disability and Rehabilitation Research: One initiative spearheaded by University of Minnesota researchers involves organizing a database of floor plans and building interiors that can be downloaded off the Web or transmitted wirelessly to a handheld gadget.
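
    The indoor dead-reckoning idea described above can be sketched in a few lines: once satellite signals are lost, position is advanced by integrating heading and distance reported by motion sensors. The code below is a toy model with invented numbers, not Vectronix's algorithm.

```python
# Toy dead-reckoning sketch of the indoor idea described above: once GPS
# drops out, position is advanced by accumulating heading-and-distance
# steps from motion sensors. All figures here are illustrative.
import math

def dead_reckon(start, steps):
    """Advance an (x, y) position by a list of (heading_degrees, meters) steps."""
    x, y = start
    for heading, dist in steps:
        x += dist * math.sin(math.radians(heading))  # east component
        y += dist * math.cos(math.radians(heading))  # north component
    return x, y

# Walk 40 m north, then 30 m east, from the last GPS fix at the doorway.
pos = dead_reckon((0.0, 0.0), [(0, 40), (90, 30)])
```

    At the quoted error rate of roughly 5 percent of distance traveled, a 70-meter indoor walk like this one could leave the estimate off by about 3.5 meters.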

  • "Scientists Develop Breakthrough Internet Protocol"
    TechNewsWorld (03/15/04); Kroeker, Kirk L.

    University researchers have developed next-generation network protocols that would utilize existing bandwidth much more efficiently than the current transmission control protocol (TCP), which was developed in the 1980s. The Stanford Linear Accelerator Center recently compared six new protocols under development and found North Carolina State University's Binary Increase Congestion Transmission Control Protocol (BIC-TCP) to be the best overall solution. North Carolina State University associate professor Injong Rhee and colleagues also presented the BIC-TCP protocol at the IEEE Communications Society Infocom 2004 meeting this month. Rhee said BIC shuttles data about 6,000 times faster than digital subscriber line (DSL) technology and would prove especially useful for a growing number of international scientific projects that require large sets of data to be transferred long distances; BIC basically finds the fastest routes for data more quickly than regular TCP, and thus handles more traffic. However, Rhee said it was difficult to balance how fast BIC is able to fill available bandwidth with the needs of other protocols. High-speed transmission of large data packets would give a huge boost to many cutting-edge network applications, including telemedicine, multi-user online gaming, and real-time environmental monitoring. Last year's widespread power outages also highlighted the need for redundant backup systems physically far apart. Rhee predicted that BIC would become available within a few years and eventually achieve status as a standard Internet protocol.
    Click Here to View Full Article
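
    The "binary increase" in BIC's name refers to how the congestion window recovers after a packet loss. The toy sketch below (illustrative constants, not the real implementation) shows the core idea: instead of TCP's one-segment-per-round-trip additive increase, BIC jumps toward the midpoint between the current window and the pre-loss maximum each round trip, refilling available bandwidth in logarithmically few round trips.

```python
# Toy sketch of BIC's binary-increase idea (not the real kernel code):
# after a loss the congestion window (cwnd) falls, and growth back toward
# the pre-loss maximum proceeds by binary search -- jumping halfway to the
# target each round trip -- rather than one segment per round trip.
S_MAX = 32  # cap on how far cwnd may jump in a single step (illustrative)

def bic_update(cwnd, w_max):
    """Return the next congestion window given the last known maximum."""
    if cwnd < w_max:
        step = (w_max - cwnd) / 2.0       # binary search toward w_max
        return cwnd + min(step, S_MAX)    # but never jump more than S_MAX
    return cwnd + 1                       # past the old max: probe slowly

# Recover from a loss event: cwnd halved from a maximum of 1,000 segments.
cwnd, w_max = 500.0, 1000.0
rtts = 0
while w_max - cwnd > 1.0:
    cwnd = bic_update(cwnd, w_max)
    rtts += 1
```

    Here the window climbs from 500 back to within one segment of 1,000 in 20 round trips; classic additive increase would need roughly 500.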

  • "Silicon Valley's Changing Landscape"
    Los Angeles Times (03/15/04) P. A1; Menn, Joseph; Pham, Alex

    To compete in an increasingly global market, many companies in Silicon Valley are outsourcing most of their operations overseas, where they are done by people willing to work for substantially less than their American counterparts. "The character of the valley is changing pretty dramatically" with this transition, notes AnnaLee Saxenian, dean of UC Berkeley's School of Information Management and Systems. The landscape of the valley is becoming increasingly dotted with corporate headquarters where top executives run their companies, with the gruntwork--software programming, technical support, and computer system design, among other things--handled offshore in countries such as India and Romania, where local talent is becoming more abundant as well as more skilled. The appeal of cheaper labor is not the only factor driving offshoring; also playing a part is growing demand for advanced technology in the far corners of the world. Fortinet CEO Ken Xie explains, "It's more expected today that companies start to engage in global markets earlier." More sophisticated communications technology such as videoconferencing and high-speed Internet connections is also spurring the migration, since it helps make globally dispersed operations easier to coordinate--and is becoming less costly to implement. And long-distance collaboration is easier thanks to the global appeal of tools such as database software and spreadsheets. The flipside of outsourcing is the erosion of job opportunities for U.S. workers: UC Berkeley economist Cynthia Kroll estimated in February that 15.7 percent of Silicon Valley's workforce was in at-risk fields, compared to 11 percent of the national workforce.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "In E-Mail Warfare, the Spammers Are Winning"
    Baltimore Sun (03/14/04) P. 1A; Shane, Scott; Packard, Jean

    In the arms race between spammers and anti-spam proponents, the bad guys have the upper hand thanks to underhanded tactics such as using computer worms to compromise vulnerable systems and turn them into "zombies" for mass-mailing spam. Spamhaus director Steve Linford predicts that spam will probably account for 80 percent of all email in the United States by summer. Worms are not the only tool in spammers' arsenal: Other methods include counterfeiting return addresses, and tweaking the spam with odd spellings and blocks of random text to thwart electronic filters. The profit potential is irresistible for spammers: Spam can be cheaply distributed to millions of people in a few hours, and profits can be realized even if only one spam recipient out of 10,000 makes a purchase. Spam hurts the productivity of businesses that must use up precious time to get rid of junk email, while filters, despite their increasing sophistication, cannot avoid mistaking legitimate email for spam. One proposed approach for curtailing spam involves challenge-response systems designed to authenticate the legitimacy of email if the sender types in a certain word or code, thus indicating that the sender is an actual person and not a computer program; another is to charge senders a penny for each email they send. Legislation such as the CAN-SPAM Act appears to have had little effect on the spam problem, though the biggest U.S. email providers recently invoked the law to file lawsuits against scores of spammers. Linford believes spam can only be effectively controlled through combined technological, legal, and prosecutorial efforts, though the situation is likely to worsen in the short term.
    Click Here to View Full Article
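
    The one-in-10,000 arithmetic explains why spam persists despite filtering. The figures below are assumptions chosen for illustration (per-message sending cost, margin per sale); only the response rate comes from the article.

```python
# Back-of-the-envelope spam economics. The article's point is that a
# one-in-10,000 response rate can still pay because sending is nearly
# free; the cost and margin figures here are invented for illustration.
messages_sent = 10_000_000
cost_per_message = 0.00001      # assumed: fractions of a cent via "zombies"
response_rate = 1 / 10_000      # one buyer per 10,000 recipients (article)
profit_per_sale = 10.00         # assumed margin per order

cost = messages_sent * cost_per_message
revenue = messages_sent * response_rate * profit_per_sale
print(f"cost ${cost:,.2f}, revenue ${revenue:,.2f}")
```

    Under these assumptions, a ten-million-message run costs about $100 and returns about $10,000, which is why raising the per-message price (the penny-per-email proposal) attacks the economics directly.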

  • "Memories Captured in a Digital Shoebox"
    Financial Times (03/16/04) P. 9; Cane, Alan

    Memorabilia is undergoing a transition from the physical to the digital world as new technologies give users the power to preserve their memories in computer files rather than in shoeboxes. First conceived of at the dawn of the computer age by electronics pioneer Vannevar Bush, who envisioned a device called a memex that would store all personal records and communications for easy retrieval, the concept is not new, but only today has the technology advanced enough to make such concepts possible. Accompanying the memorabilia trend is technology researchers' continued development of tools and software that can tackle the challenge of organizing and navigating through such digital collections. Nokia's "Lifeblog" technology, to be launched on March 17, allows photos, videos, and messages to be digitized into a computer and categorized for rapid retrieval. Designed for consumer use, Lifeblog software works with cell phones featuring built-in digital cameras: Digital memorabilia is transferred from the phone to a PC via the software, which organizes the material into a horizontal diary, whereby all the day's events are archived under that specific date. Christian Lindholm, director of multimedia applications at Nokia, says it is easier to browse and search for particular items using the diary than it is to store items in albums or folders. "You can 'Google' your life," he declares. So that users will not lose their digital memories to system failures or technological obsolescence, Lindholm thinks that successive versions of Lifeblog will enable users to move their archives to a "trusted repository" through the Internet. However, unresolved cost, security, and flexibility issues with digital memories are likely to sustain the appeal of traditional, physical memorabilia.

  • "Optimistic IT Employment Outlook"
    IT Management (03/15/04); McMahan, Steve

    IT spending is perceptibly increasing and buoying the IT job market: A number of industry trends are influencing IT employment opportunities, including businesses' willingness to invest in new services infrastructure, offshore outsourcing, security-related issues, and the emergence of new fields such as biotechnology and nanotechnology. Federal tax relief for business IT investments expires this year, making it more likely that companies will push through much needed projects. A Gartner/SoundView CIO survey already shows that companies are focusing on relatively quick IT projects expected to yield immediate returns rather than long-term enterprise projects, while new caps on H-1B and L-1 visas for foreign workers should also benefit the U.S. technology job market. Even as companies increasingly make use of offshore outsourcing, they often choose to retain sensitive processes because of security concerns; workers knowledgeable in firewall deployment, system upgrades, and the more secure Linux operating system will benefit from this increased security interest. Technology workers who also have administrative and business skills are likely to immunize themselves from offshore outsourcing's impact. Software design and engineering is a needed skill as software vendors shift from a traditional licensing model to one where software is provided as a service via the Internet. Entertainment and advertising firms seeking to make use of new interactive media technology will need specialists in that field, while tech support personnel should be in demand as businesses quickly deploy priority projects such as wireless LANs, security, portal software, VPNs, and storage management software. Recent surveys also show employers are looking for stable hires, while workers should keep their career focus on macro trends and long-term plans.
    Click Here to View Full Article

  • "Can Social Networking Stop Spam?"
    NewsFactor Network (03/15/04); Martin, Mike

    A new algorithm developed by UCLA professors P. Oscar Boykin and Vwani Roychowdhury applies social networking principles to spam filtering. "We routinely use our social networks to judge the trustworthiness of outsiders...to decide where to buy our next car, or where to find a good mechanic," notes Roychowdhury. "An email user may similarly use his email network, constructed solely from sender and recipient information available in the email headers, to distinguish between...'spam,' and emails associated with his circles of friends." The researchers' algorithm processes a specific user's personal email network to concurrently determine both the user's trusted networks of friends and spam-spawned sub-networks, Boykin explains, adding that the algorithm distinguished between spam and legitimate email with no errors or false negatives in a recent test. The researchers studied six weeks' worth of emails from assorted individuals so they could ascertain the "components" of their email network, a component being a set of nodes that are connected to one another in the network, according to Boykin; analyzing "clustering coefficients" in a network--provided the network is big enough--is an easy way to tell spam and non-spam components apart. Boykin says he and Roychowdhury observed that clustering coefficients run high for non-spam components, and are equal to zero for spam components. Roychowdhury's colleague attests that the algorithm can be used to train content-based filters to recognize words and phrases typical of spam and non-spam, once 50 percent of email can be accurately classified as either junk or legitimate email. Boykin points out that the tool also produces white lists and blacklists used to verify that content filters are properly classifying email.
    Click Here to View Full Article
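
    The clustering-coefficient test at the heart of the approach is easy to sketch. In the toy network below (invented addresses, simplified from the researchers' construction), edges join addresses that appear together in a user's message headers: a genuine circle of friends interconnects, while the addresses in a spam blast never co-occur.

```python
# Simplified sketch of the clustering-coefficient idea from Boykin and
# Roychowdhury's approach: a node's clustering coefficient is the fraction
# of its neighbor pairs that are themselves linked. Friends' circles
# interconnect (coefficient near 1); spam sub-networks score zero.
from itertools import combinations

def clustering(graph, node):
    """Fraction of a node's neighbor pairs that are themselves linked."""
    nbrs = graph[node]
    if len(nbrs) < 2:
        return 0.0
    linked = sum(1 for a, b in combinations(nbrs, 2) if b in graph[a])
    return linked / (len(nbrs) * (len(nbrs) - 1) / 2)

# Toy email network: alice's friends email each other; spam senders don't.
friends = {"alice": {"bob", "carol"}, "bob": {"alice", "carol"},
           "carol": {"alice", "bob"}}
spam = {"victim": {"spam1", "spam2"}, "spam1": {"victim"},
        "spam2": {"victim"}}
```

    Here clustering(friends, "alice") is 1.0 while clustering(spam, "victim") is 0.0, the separation the researchers report between non-spam and spam components.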

  • "10-Gigabit Ethernet Comes Alive"
    CNet (03/15/04); Reardon, Marguerite

    The two-year-old 10-Gigabit Ethernet (10-GigE) technology is now reaching a much larger business market thanks to lower prices and the prospect of a one-size-fits-all networking solution that would reduce training costs and complexity. There are no common applications that alone require the bandwidth 10-GigE offers, but its adoption emphasizes a growing need for bandwidth at the midtier of enterprise networking infrastructure and in some data center applications. Companies are using 10-GigE switches to aggregate slower data streams from network nodes, such as PCs and servers with 1 Gbps connections; at 1 Gbps, a full-length DVD-quality movie could be delivered in about 30 seconds, compared to several hours for a normal broadband connection. Another early 10-GigE application is to connect supercomputer clusters, placing the technology in direct competition with InfiniBand. Network engineers see this versatile technology being used widely across many applications, reducing training costs, but the most significant factor in 10-GigE's recent spread has been dramatic price reductions. Yankee Group analyst Zeus Kerravala says two-thirds of the 7,300 10-GigE ports with necessary optical interfaces shipped last year went in the fourth quarter; he says that momentum will continue for several years. Meanwhile, the IEEE 10GBase-CX4 specification ratified in February is generating some excitement because it allows 10-GigE to work on copper infrastructure, which is much less expensive than optical fiber connections. The drawbacks of 10GBase-CX4 are that it requires less common four-lane twin-axial copper wiring and reaches less than 15 meters, compared with up to 10 kilometers for optical links; the IEEE is expected to produce a 10GBase-T standard using ordinary copper wiring in a couple of years.
    Click Here to View Full Article
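
    The transfer-time comparison above checks out with round numbers. The sketch below assumes a 4.7 GB single-layer DVD and a roughly 3 Mbps consumer broadband line of the era; both figures are assumptions, not from the article.

```python
# Sanity-checking the article's transfer-time comparison with assumed
# sizes: a single-layer DVD holds about 4.7 GB, and link speeds are
# quoted in bits per second, so multiply bytes by 8.
movie_bytes = 4.7e9      # assumed single-layer DVD capacity
gigabit = 1e9            # 1 Gbps server connection
broadband = 3e6          # assumed ~3 Mbps consumer broadband

seconds_at_1g = movie_bytes * 8 / gigabit
hours_at_dsl = movie_bytes * 8 / broadband / 3600
```

    That works out to just under 40 seconds at 1 Gbps versus about three and a half hours at 3 Mbps, consistent with the article's "about 30 seconds, compared to several hours."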

  • "The End of Passwords"
    E-Commerce Times (03/13/04); Millard, Elizabeth

    Lavasoft vice president Michael Wood says the way that passwords are currently used poses a danger to companies since individuals could use keylogging spyware to record keystrokes and so learn passwords. However, alternative user authentication technologies such as smart cards have not caught on widely. Users themselves often open the greatest security holes by writing down passwords or using the same password for multiple systems. The recent RSA conference showcased a number of user authentication choices, including SecurID technology, which was created by RSA and Microsoft for Windows in particular. It uses a personal identification number and an authentication token, and generates new passwords every 60 seconds. VeriSign has announced an alliance with Microsoft for authentication services based on the Windows Server 2003 products, and Sun Microsystems says it will create an identity-management solution for Microsoft environments such as Windows. Given the widespread corporate use of Windows, such technologies could change network security. IT departments must find a balance between security and usability, and blended techniques are likely to become more popular this year. Forrester analyst Michael Rasmussen says, "There can be a trade-off on speed for security, depending on your architecture. The decision on what to implement is going to come down to an IT department's preferences and needs."
    Click Here to View Full Article
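
    RSA's SecurID algorithm is proprietary, so the sketch below illustrates only the general idea of a token that changes every 60 seconds: hash a shared secret together with the current time interval, an approach in the spirit of the later-standardized TOTP scheme. The secret and six-digit format are illustrative assumptions.

```python
# Sketch of a time-based one-time code in the general spirit of
# SecurID-style tokens. RSA's actual algorithm is proprietary; this uses
# a plain HMAC over the current 60-second interval, similar to what the
# later TOTP standard does. Secret and digit count are illustrative.
import hashlib
import hmac
import struct
import time

def one_time_code(secret: bytes, t: float, step: int = 60) -> str:
    interval = int(t // step)  # value changes once per 60-second window
    mac = hmac.new(secret, struct.pack(">Q", interval), hashlib.sha1).digest()
    return str(int.from_bytes(mac[:4], "big") % 1_000_000).zfill(6)

code = one_time_code(b"shared-token-seed", time.time())
```

    The server, holding the same secret and clock, recomputes the code and compares; a keylogged code is useless within a minute, which is the property that makes such tokens attractive against spyware.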

  • "Open Source Database Improvements Grow"
    Network World (03/15/04) Vol. 21, No. 11, P. 32; Cox, John

    Open source databases are usually niche products, but upcoming improvements are making the tools increasingly important for a growing stable of industries, including e-commerce applications, high-speed Web searching, content management, data warehouse reporting, and Web portals. A survey by Evans Data estimates that the use of MySQL expanded by over 30 percent last year, compared to only a 6 percent increase in use of Microsoft SQL Server and Access. April will see the release of new database clustering software from MySQL designed to keep applications running even if a server goes down; the product will be available under a commercial license, the GNU General Public License, or an open source license. Version 1.5 of Firebird, which was released last month, ports the source code from C to C++, cleans the code up, fixes numerous bugs, improves memory management, and reportedly speeds up queries by as much as 60 percent or more. Users testify on SourceForge.net's Firebird site and other Web sites that they appreciate the database's compactness, its Java support, its simplicity, its speed, and its uncomplicated deployment on Win32 systems. Open source databases are increasingly being perceived as components in a stack of open source software that allows corporations to build an application infrastructure, whose first iteration was called LAMP, representing the Linux operating system, the Apache Web server, the MySQL database, and either Python, Perl, or PHP as the development language. Supporters of PostgreSQL, whose version 7.5 is due out in June, are touting a new stack or "brighter LAMP" comprised of Linux, Apache, middleware, and PostgreSQL. PostgreSQL 7.5 will boast a port for Win32-based operating systems, a new memory management algorithm, more efficient data partitioning, and support for two-phase commit.
    Click Here to View Full Article
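
    The two-phase commit support noted for PostgreSQL 7.5 follows a well-known distributed-transaction protocol: a coordinator first asks every participant to "prepare" (vote), and commits only on a unanimous yes. Below is a minimal, database-free sketch of that coordinator logic in Python; the Participant class and its vote flag are illustrative stand-ins, not PostgreSQL's actual API.

    ```python
    class Participant:
        """A resource manager that can vote on and then apply a transaction."""
        def __init__(self, name, will_commit=True):
            self.name = name
            self.will_commit = will_commit
            self.state = "idle"

        def prepare(self):
            # Phase 1: durably record intent, then vote yes or no.
            self.state = "prepared" if self.will_commit else "aborted"
            return self.will_commit

        def commit(self):
            self.state = "committed"

        def abort(self):
            self.state = "aborted"


    def two_phase_commit(participants):
        # Phase 1 (voting): ask every participant to prepare.
        if all(p.prepare() for p in participants):
            # Phase 2 (completion): unanimous yes -> commit everywhere.
            for p in participants:
                p.commit()
            return "committed"
        # Any no-vote aborts the whole transaction on every participant.
        for p in participants:
            p.abort()
        return "aborted"
    ```

    A single dissenting vote in phase 1 forces a global abort, which is what makes the protocol safe across independent servers.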

  • "Domain Master"
    Technology Review (03/04) Vol. 107, No. 2, P. 74; Frauenfelder, Mark

    Internet Corporation for Assigned Names and Numbers (ICANN) CEO Paul Twomey says his organization must remain focused on maintaining a single, interoperable Internet while meeting the needs of international constituents. ICANN is responsible to governments, businesses, academics, and Internet users for maintaining and upgrading core Internet identifiers such as IP addresses, protocol parameters, domain names, and the Internet root server system. Controversy about ICANN erupted at the United Nations' World Summit on the Information Society (WSIS) in December, but Twomey says much of the debate did not touch on ICANN's core technical responsibilities; instead, ICANN served as a lightning rod for social, cultural, legal, and economic issues that should be handled by governments outside ICANN's purview. Twomey also says the U.S.-based ICANN suffered from anti-American sentiment abroad: He explains that technical administration of the Internet was originally centered in the United States, where the network began, but has since been slowly moving abroad. He argues that it is vital that technical decisions about the Internet remain in the hands of a partnership of engineers, businesses, academics, and governments, a structure that ensures bottom-up decision-making rather than the top-down structure of government-only organizations such as the WSIS, from whose opening meeting Twomey was ejected along with other non-members. ICANN meetings, he notes, are Webcast and open. Going forward, ICANN's most difficult task will be maintaining a single, interoperable Internet even as different language groups create non-English content and sites. ICANN will govern the top-level technical aspects of this diversification, but vendors and other players will have to design translation systems for a multilingual Internet.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "Closing In on the Perfect Code"
    IEEE Spectrum (03/04) Vol. 41, No. 3, P. 36; Guizzo, Erico

    With turbo codes, French researchers Claude Berrou and Alain Glavieux put an end to more than four decades of speculation over whether data could indeed be conveyed at rates approaching channel capacity, virtually free of errors and with very low transmitting power, using the right error-correction codes, as electrical engineer Claude Shannon theorized. Shannon postulated that the noise bedeviling all communication channels could be overcome by dividing data into strings of bits and appending to each string a set of "parity bits" that help detect and fix errors at the receiving end; the resulting group of bits, or codeword, in the right combination with other codewords, would make channel capacity achievable. However, Shannon's argument required the random selection of infinitely long codewords, an impractical solution. Turbo codes address this complexity problem by dividing it into more manageable units: The scheme eschews the single encoder/decoder setup at the sending and receiving ends in favor of a double encoder/decoder architecture that works in parallel. "What turbo codes do internally is to come up with bit decisions along with reliabilities that the bit decisions are correct," explains Bell Labs researcher David Garrett. In Japan, turbo codes have been incorporated into the standards for third-generation mobile phone systems, and Hirohito Suda of NTT DoCoMo's Radio Signal Processing Laboratory reports that the codes are used to transmit pictures, video, and mail to cell phones and other portable devices. The European Space Agency and NASA have deployed or plan to deploy turbo codes in their space missions, while digital audio broadcasting and satellite links are about to embed the codes in their systems. In addition, turbo codes could help engineers mitigate communication problems such as multipath propagation, and they have injected new vitality into long-dormant codes such as low-density parity-check codes, which also approach capacity via an iterative decoding technique.
    Click Here to View Full Article
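
    The parity-bit idea the article attributes to Shannon can be seen in a far simpler cousin of turbo codes: the classic Hamming(7,4) code appends three parity bits to four data bits so the receiver can both detect and correct any single flipped bit. The Python sketch below illustrates only that basic encode/correct cycle; turbo codes themselves use two parallel encoders and iterative decoding, which this example does not attempt.

    ```python
    def hamming74_encode(d):
        """Encode 4 data bits into a 7-bit codeword (positions 1..7)."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4          # parity over positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4          # parity over positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4          # parity over positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]


    def hamming74_decode(c):
        """Correct at most one flipped bit, then return the 4 data bits."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        pos = s1 + 2 * s2 + 4 * s3  # 1-indexed error position; 0 = no error
        c = list(c)
        if pos:
            c[pos - 1] ^= 1         # flip the offending bit back
        return [c[2], c[4], c[5], c[6]]
    ```

    The three syndrome bits computed by the decoder spell out the position of the corrupted bit in binary, so a single transmission error is located and reversed without retransmission.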


