ACM TechNews sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, which powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 837: Friday, September 2, 2005

  • "How High-Tech Is Coming to the Rescue"
    MSNBC (08/30/05); Boyle, Alan

    Catastrophes, such as Hurricane Katrina, are raising the profile of new search-and-rescue technologies. The researchers developing the technologies say they are not trying to supplant emergency-response workers: "It's us getting the technology to the people who will use it to save people," insists University of South Florida professor Robin Murphy. Tools developed by Murphy's team and Louisiana State University's Emergency Training Institute will be tested in New Orleans. Such tools include miniature robots that can comb through collapsed structures for survivors and bring them provisions and two-way communications; "triage sensors" that use thermal imaging, odor detection, and other techniques to find signs of life from three feet away; robot planes and helicopters equipped with video for aerial surveillance; and night-vision sensors. Institute for the Future director Paul Saffo says sensors could be particularly critical for assessing the extent of Katrina's environmental damage. Syracuse University professor David Warner says he and other communication experts have been focusing on delivering search-and-rescue solutions using commercially available technology. He cites products that can enable first responders to communicate from anywhere to anywhere, such as a DingoTel USB device that can transform walkie-talkies into voice-over-Internet phones and Starband satellite receivers that allow the Internet to reach places where phones are impractical. Eric Frost with San Diego State University's Immersive Visualization Center says a lot of the high-tech search-and-rescue technology is dedicated to establishing telecommunications but notes, "It's really about how you use a whole bunch of things so that you are able to manage the resources for medicine, power, water, and all the 20 or so major things that you need to do in the wake of a disaster."
    Click Here to View Full Article

  • "Open-Source Projects Intertwine for Integration"
    CNet (08/31/05); LaMonica, Martin

    In a direct challenge to IBM and other commercial vendors, three open-source initiatives--ServiceMix, Apache Synapse, and Celtix--have joined forces with the goal of producing more fluid and integrated middleware. The partnership aims to address businesses' growing demand for integration software that links disparate systems. Each of these open-source projects is underpinned by standard protocols: the ServiceMix server software is rooted in the Java Business Integration standard; Synapse is based on the Simple Object Access Protocol (see the illustrative SOAP message below); and Celtix supports an array of languages and protocols. Although these applications are still in their early stages, their development is raising eyebrows among commercial vendors concerned about the deleterious effect that open-source movements will have on the low end of their business. There is no doubt that open-source middleware increasingly influences the direction of the IT industry: Enterprise service buses are a particularly active segment of the market, though projects such as Iona Technologies' Celtix are still many months from commercialization. Still, open-source movements are posing a "with us or against us" challenge to small and large companies alike. The Apache 2.0 license will cover ServiceMix and Synapse, and Celtix is likely to employ the Lesser General Public License, while Iona will address the potential legal ramifications by re-releasing the code issued by other projects under the Apache license. As appealing as the partnership sounds on paper, many logistical obstacles remain, such as coordination among the staffs of the various commercial vendors.
    Click Here to View Full Article
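
    As an illustration of the kind of message an enterprise service bus such as Apache Synapse routes, the following sketch builds a minimal SOAP 1.1 envelope with Python's standard library. The service name, application namespace, and payload are hypothetical and are not part of any of the projects named above.

      import xml.etree.ElementTree as ET

      SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
      APP_NS = "http://example.com/orders"   # hypothetical application namespace

      ET.register_namespace("soap", SOAP_NS)

      # Build <soap:Envelope><soap:Body> around a made-up getOrderStatus request.
      envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
      body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
      request = ET.SubElement(body, f"{{{APP_NS}}}getOrderStatus")
      ET.SubElement(request, f"{{{APP_NS}}}orderId").text = "12345"

      print(ET.tostring(envelope, encoding="unicode"))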

  • "TechScape: Vint Cerf on the InterPlanet"
    Register (U.K.) (08/31/05); Robinson, Bill

    Vint Cerf envisions an interplanetary Internet, or InterPlaNet, as a communications network for people or devices in space or on other planets. Key challenges for realizing this vision include managing flow control across the enormous delays inherent in deep-space communications; the availability of domain names, which prompted ICANN to reserve planetary domains, such as .mars; and overcoming problems with reception quality and satellite positioning. Cerf says a new protocol, Delay & Disruption-Tolerant Networking (DDTN), was developed to address part of this last challenge (a toy sketch of the underlying store-and-forward idea appears below). The next step was garnering the interest of the Defense Advanced Research Projects Agency, which clearly saw the military and tactical advantages of an interplanetary network and committed about $500,000 to InterPlaNet research. Cerf's team has been studying such issues as how to send satellite transmissions at low angles and the impact precipitation can have on interplanetary communications. Cerf believes a potential solution may lie in the Arctic Circle and has conducted tests in which members of the nomadic Sami people communicate via 802.11-enabled laptops in the frozen wilderness of Lapland. The InterPlaNet and the DDTN protocol now play a vital role in the communications systems of the Space Shuttle and the International Space Station. The first steps toward a truly interplanetary Internet have been made with missions to Mars in which IP hardware, software, and antennas are deposited on the planet's surface to set up the infrastructure for faster, more efficient, and less-garbled data transmission.
    Click Here to View Full Article
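
    The store-and-forward idea at the heart of delay- and disruption-tolerant networking can be sketched in a few lines of Python. This is a toy illustration, not the DDTN protocol itself; the node names, bundles, and contact schedule are invented.

      import collections

      class DTNNode:
          def __init__(self, name):
              self.name = name
              self.custody = collections.deque()   # bundles held until forwardable

          def accept(self, bundle):
              """Take custody of a bundle until the next hop is reachable."""
              self.custody.append(bundle)

          def forward_to(self, next_hop, link_up):
              """Hand off queued bundles only while the contact window is open."""
              while link_up and self.custody:
                  next_hop.accept(self.custody.popleft())

      earth, relay = DTNNode("ground-station"), DTNNode("orbiter")
      for i in range(3):
          earth.accept(f"telemetry-{i}")

      # The link is down for two periods, then a contact window opens.
      for t, link_up in enumerate([False, False, True]):
          earth.forward_to(relay, link_up)
          print(f"t={t}: link_up={link_up}, bundles at orbiter={len(relay.custody)}")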

  • "Time-Saving Tool: Google Galvanizes Invention by UCSD Student During Summer of Code"
    UCSD News (08/30/05); Ramsey, Doug

    The University of California, San Diego's James Anderson has invented a technique that allows a file to be transferred automatically from one computer to other devices. The graduate student's invention, known as transparent synchronization, or Tsync, has drawn the attention of Google, which helped Anderson finish the coding and issue an open-source beta version under the GNU General Public License; Tsync retains consistency among the file sets of different computers, PDAs, and third-generation cell phones, ensuring connectivity among the disparate machines. Anderson's work was part of Google's Summer of Code initiative, in which the search-engine company encouraged student programmers to develop innovative open-source code. Anderson owes the idea for his project to the frustration he felt when having to manually transfer files back and forth among the four computers he uses at work and home. Anderson's central innovation is automating the transfer process: existing synchronization programs, such as Rsync and Unison, require manual updates and link files only between two devices, whereas Tsync relies on a configuration file that defines which directories are to be synchronized and which hosts belong to the group (an illustrative sketch of the idea appears below). Peer-to-peer overlay techniques allow many hosts to be kept in sync, and a root fail-over protocol keeps each node connected and manages the update process. Anderson wrote Tsync in C++ and Mace and tested it on clustered systems on wide-area networks.
    Click Here to View Full Article
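
    The sketch below illustrates the general idea of a configuration-driven, change-detecting synchronizer. It is not Tsync's actual format or code (Tsync itself is written in C++ and Mace); the paths and host names are invented.

      import hashlib
      import pathlib

      CONFIG = {
          "directories": ["~/projects/thesis"],           # what to keep in sync
          "hosts": ["desktop.home", "laptop", "lab-pc"],  # hypothetical peer group
      }

      def snapshot(directory):
          """Map each file to a content hash so changed files can be detected."""
          root = pathlib.Path(directory).expanduser()
          return {
              str(p.relative_to(root)): hashlib.sha1(p.read_bytes()).hexdigest()
              for p in root.rglob("*") if p.is_file()
          }

      def changed_files(old, new):
          """Files that were added or modified since the last snapshot."""
          return [f for f, digest in new.items() if old.get(f) != digest]

      # A real tool would watch continuously and push changed_files(...) to every
      # host in CONFIG["hosts"]; only the change-detection step is shown here.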

  • "Spotlight on Copyright Piracy"
    SiliconValley.com (08/30/05); Schoenberger, Karl

    The U.S. Commerce Department estimates that American companies lose $250 billion in annual sales globally due to copyright piracy, and China is considered one of the worst offenders due to its slipshod enforcement and undependable judicial system. Pressuring China to correct these deficiencies and more effectively enforce copyrights is the task confronting Chris Israel, the Commerce Department's newly appointed Coordinator of International Intellectual Property Enforcement. Foreign businesses in China have traditionally tiptoed around the issue of copyright enforcement so as not to offend their hosts and risk losing their chances to penetrate the Chinese market, but software firms and others have become more outspoken with China's admission into the World Trade Organization and rising expectations that the country conform with international business standards. The Business Software Alliance reports that improving Chinese copyright enforcement is a major concern for software companies, given that all but 10 percent of the software in use there is illegal. Symantec's William Plante says the Chinese have an incentive to keep intellectual property enforcement lax, as the export of pirated software has played a major role in the country's recent economic prosperity. Meanwhile, Business Software Alliance CEO Robert Holleyman notes that Chinese corporate end users copy software for internal use without securing lawful licensing permission. He says, "The only legitimate measure of improvement would be when software sales rise at the same rate as PC sales in China, which is not happening."
    Click Here to View Full Article

  • "Computer Program Learns Language Rules and Composes Sentences, All Without Outside Help"
    Cornell News (08/30/05); Lang, Susan S.

    Cornell University psychology professor and computer scientist Shimon Edelman and Tel Aviv University researchers have developed Automatic Distillation of Structure (ADIOS), a technique enabling a computer program to scan text, then autonomously extract language rules and compose new sentences. "This is the first time an unsupervised algorithm is shown capable of learning complex syntax, generating grammatical new sentences and proving useful in other fields that call for structure discovery from raw data, such as bioinformatics," Edelman says. ADIOS repeatedly aligns sentences and scans for overlapping segments in order to uncover complex patterns in raw text. The ADIOS algorithm has been tested on the full text of the Bible in multiple languages, musical notation, biological data, such as protein sequences, and artificial context-free languages with massive sets of rules. Experiments demonstrated that ADIOS, using structured generalization and statistical pattern extraction methodology, can discover complex structures from transcripts of parents' speech directed at very young children. Edelman says this ability may ultimately help researchers learn how children eventually become adept at speaking their native languages. The U.S.-Israel Binational Science Foundation partly sponsored the ADIOS collaboration.
    Click Here to View Full Article
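
    The following toy sketch hints at the kind of pattern discovery ADIOS performs: segments that recur across several sentences become candidate structural units. It is a drastic simplification of the published algorithm, and the miniature corpus is invented.

      from collections import Counter

      corpus = [
          "the cat sat on the mat",
          "the dog sat on the rug",
          "the cat slept on the mat",
      ]

      def ngrams(tokens, n):
          return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

      # Count three-word segments; segments seen in more than one sentence are
      # candidates for structural units.
      counts = Counter()
      for sentence in corpus:
          counts.update(ngrams(sentence.split(), 3))

      patterns = [seg for seg, c in counts.items() if c >= 2]
      print(patterns)   # [('sat', 'on', 'the'), ('on', 'the', 'mat')]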

  • "The Internet: What Lies Ahead?"
    CNN (08/29/05); Sieberg, Daniel

    ICANN Chairman Vinton Cerf, who is often credited as one of the Internet's founding fathers, anticipates that the Net of the future will allow users to operate and manage other Internet-enabled equipment with mobile devices, while further out he envisions an interplanetary network connecting spacecraft, ground stations, and missions. About 66 percent of people surveyed last year by the Pew Internet and American Life Project expect at least one catastrophic cyberattack within the next decade, although former U.S. cybersecurity czar Howard Schmidt is confident that by that time computer systems will be capable of self-configuration and self-repair. He also predicts that all devices and appliances will be owner-controlled and boast a secure Internet protocol address. More than two thirds of the Pew respondents doubt that online voting will be secure and widespread by 2014, although Mike Alvarez, who co-directed last year's Caltech-MIT Voting Technology Project, says the migration to e-voting could help address problems related to polling places, the training of polling place workers, and the challenges voters must contend with. Most Pew respondents expect access to online entertainment--movies, music, and the like--to improve. The home computer or digital hub is also foreseen as the focal point of the living room. The report indicates that government surveillance of citizens will proliferate thanks to the pervasive nature of the Internet, while the line between work and leisure will continue to blur.
    Click Here to View Full Article

  • "AIMing for Business Innovation"
    IST Results (09/01/05)

    The IST-funded AIM project has developed a software platform designed to facilitate business innovation by encouraging the exchange of knowledge among an organization's various employees and departments. AIM coordinating partner Alvaro Gorostiza says employees are typically discouraged from sharing their ideas and knowledge because there is no central repository in which to store and analyze their input; the AIM platform provides access to such a repository. Employees can enter suggestions or information into a central database via a simple interface, and other software components analyze this data and assign it to problem and solution categories (a toy sketch of such categorization appears below). Designers can then draw on it to address problems more quickly and make product development more efficient, while suppliers and customers can also submit feedback to the database through the system, which is modular and Internet-based. The prototype AIM platform was tested at three end-user companies in Britain and Germany, and Gorostiza reports that in all three trials the system reduced the time needed to resolve production and processing difficulties by 15 percent to 20 percent. However, he says, "The real benefits are likely to be much greater over time as the different actors become more accustomed to the system." A commercial version of the AIM platform is planned by the end of the decade.
    Click Here to View Full Article
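
    The sketch referenced above shows, with invented categories and keywords, how free-text suggestions might be assigned to problem and solution categories by simple keyword matching. It is illustrative only and is not a component of the AIM platform.

      CATEGORIES = {
          "production delay": {"late", "delay", "bottleneck", "waiting"},
          "quality defect": {"defect", "scrap", "rework", "tolerance"},
          "cost saving": {"cheaper", "save", "reuse", "energy"},
      }

      def categorize(suggestion):
          """Assign a suggestion to the category whose keywords it matches most."""
          words = set(suggestion.lower().split())
          scores = {cat: len(words & kws) for cat, kws in CATEGORIES.items()}
          best = max(scores, key=scores.get)
          return best if scores[best] > 0 else "uncategorized"

      print(categorize("Rework rates rise whenever the tolerance check is skipped"))
      # -> "quality defect"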

  • "U.S. Tests $3.5m Computerized Lie Detector"
    VNUNet (09/01/05); Jaques, Robert

    The U.S. Department of Homeland Security will provide $3.5 million to Rutgers University researchers to develop next-generation computerized lie detectors that can analyze the veracity of statements by studying subtle facial expressions, hand gestures, and other body-language cues. Rutgers computer science professor Dimitris Metaxas of the Center for Computational Biomedicine Imaging and Modeling says lie-detector tests based on physiological indicators are only about 50 percent reliable even under ideal conditions, but he is convinced that expressions and gestures are both more difficult to conceal and highly consistent regardless of race or culture. Metaxas says this research builds on his earlier work, which focused on computer simulation of facial expressions and organ dynamics, such as blood flow and heartbeat, through image-based modeling. Challenges highlighted by the researchers include incorporating postures, gestures, and other new cues into the computer models, and determining how accurately movement can be captured under real-world conditions. The Rutgers researchers have partnered with Lockheed Martin to develop and integrate 3D sensor technology for motion capture. The team expects its innovation will increase the efficiency of screening people at points of entry and border crossings, aid law enforcement in routine investigations, and improve security at embassies and other buildings.
    Click Here to View Full Article

  • "Predicting How You're Going to Shop Online"
    Israel21c (08/28/05); Hershman, Tania

    IBM Haifa Laboratories researcher Amit Fisher has combined data mining, operations research, artificial intelligence, and economics in a more sophisticated approach to predicting the long-term behavior of online shoppers, so that the most valuable customers can be identified and given preferential treatment to ensure their continued patronage. Fisher chose an Israeli online auction site as his first test case and outlined relevant behavioral variables, such as how many times a person visits the site over a given period, the number of bids they place, the number of items they win, and how much they spend. Fisher's technology used these variables to automatically divide the site's 10,000 users into 10 groups, then collected data every day for a year; upward of 70,000 customer purchases worth more than $18 million were made during this period. The researcher then applied Markov chain theory to estimate the probability of a customer from a specific group performing a specific action, in combination with each action's monetary value to the site (a toy sketch of this approach appears below). Fisher discovered he could almost perfectly anticipate the future behavior of the group with the largest number of people. Companies could use this knowledge to boost their bottom lines by targeting these people and tailoring their services to customers' preferences, as well as by targeting less significant customers.
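
    In the toy sketch below, customer behavior is modeled as transitions between states, each state carries a monetary value to the site, and the chain is rolled forward to estimate a customer's expected value over a year. All states, probabilities, and values are invented and are not Fisher's data.

      import numpy as np

      states = ["browse", "bid", "buy", "churn"]
      value = np.array([0.0, 0.5, 20.0, 0.0])   # hypothetical $ earned per period in each state

      # T[i, j] = probability of moving from state i to state j in the next period
      T = np.array([
          [0.70, 0.20, 0.05, 0.05],
          [0.40, 0.30, 0.25, 0.05],
          [0.50, 0.20, 0.20, 0.10],
          [0.00, 0.00, 0.00, 1.00],   # churned customers stay churned
      ])

      dist = np.array([1.0, 0.0, 0.0, 0.0])     # a new customer starts out browsing
      expected_total = 0.0
      for week in range(52):                    # 52 weekly periods = one year
          expected_total += dist @ value        # expected revenue this period
          dist = dist @ T                       # advance the chain one period

      print(f"expected 1-year value: ${expected_total:.2f}")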

  • "Chair Shines in World Wide Web Consortium"
    Inside CSULB (09/01/05) Vol. 57, No. 16; Hagen, Teresa

    Wayne Dick, chair of Computer Science and Computer Engineering at California State University, Long Beach, is involved in the World Wide Web Consortium's Educational Outreach Working Group. The group is a component of the W3C's Web Accessibility Initiative (WAI), and Dick, who is visually handicapped, plays a vital part in making accessibility recommendations, especially with disabled users in mind. Dick says Web accessibility guidelines are divided into three sets: One set focuses on content accessibility, the second set concerns authoring tools, and the third set covers user-agent accessibility. Discussions of accessibility must cover different kinds of disabilities, and Dick believes innovations designed to help the disabled Web user can also benefit non-disabled users. Dick supervised the introduction of WebAdapt2me, a product enabling users to customize the Web-viewing experience. WebAdapt2me features include spoken text, text/image magnification, independent text-size adjustment, pop-up images, background concealment, browser-control enlargement, and audio feedback as typing cues.
    Click Here to View Full Article

  • "Microsoft Claims Secure Development Success"
    ZDNet Australia (09/02/05); LeMay, Renai

    Microsoft says its Security Development Lifecycle (SDL), created to ensure that developers are writing secure code, is showing early indications of success. The program was developed in the wake of a series of publicized vulnerabilities, and Microsoft's Rick Samona notes that each of the company's server and commercial products must meet SDL requirements before it can be released; he cites Internet Information Services 6 and the SQL Server database as principal SDL success stories. Implementing SDL was not easy, as it entailed a complete overhaul of the way developers address security; new hires must complete their SDL training within 60 days, followed by annual refresher courses. Each trainee is paired with a security expert to ensure clean code through peer review, and 100 functions are prohibited altogether (a minimal sketch of such a banned-function check appears below). The /GS compiler flag adds buffer-overrun protection when the software is built and was especially helpful in containing the Blaster worm. At the end of a product's development, Microsoft runs a beta version for three months, often hinging its shipment on an error-free run throughout that period. Samona points out that SDL often ends up saving the company money and keeping the cost of software down, as much of the estimated $100,000 cost associated with each security alert can be avoided if vulnerabilities are detected early.
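
    The sketch below shows one SDL-style check in miniature: scanning source code for calls to functions on a banned list. Microsoft's actual list is far longer and is enforced through compiler and header machinery rather than a script; the sample code and the four banned names here are merely illustrative.

      import re

      BANNED = {"strcpy", "strcat", "sprintf", "gets"}   # classic overflow-prone calls
      CALL = re.compile(r"\b(" + "|".join(BANNED) + r")\s*\(")

      def audit(source: str):
          """Return (line_number, function) pairs for every banned call found."""
          hits = []
          for lineno, line in enumerate(source.splitlines(), start=1):
              for match in CALL.finditer(line):
                  hits.append((lineno, match.group(1)))
          return hits

      sample = 'char buf[16];\nstrcpy(buf, user_input);   /* flagged */\n'
      print(audit(sample))   # -> [(2, 'strcpy')]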

  • "Starting a New Digital Chapter for Historical Documents"
    IST Results (09/02/05)

    The IST program-funded MEMORIAL project has developed a method for processing digital images of paper documents to the point where portions of text can be identified by special software and rendered as electronic text by commercial optical character recognition (OCR). The software component of the MEMORIAL Digital Document Workbench (DDW) "facilitates the interpretation of typewritten texts and allows the separation of graphical images, such as signatures and stamps from textual ones, as well as the background cleaning of noise and overstricken machine-typed characters for later OCR of the textual images," according to MEMORIAL coordinator Alexander Geschke (a sketch of such pre-OCR cleanup appears below). The workbench also supports assessment of electronic document quality, and a human editor can step in at any point of DDW processing, which is critical for inserting metadata or modifying processing parameters. "The retrieved information can be converted from the original to a highly interactive, editable, and linkable electronic form, suitable for a future Web-based virtual memorial," says Geschke. Such an approach adds legibility to the digital equivalents of typewritten documents, aiding historical preservation.
    Click Here to View Full Article
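
    The sketch referenced above shows a pre-OCR cleanup pipeline in the spirit of the DDW steps Geschke describes: convert a scanned page to grayscale, suppress speckle noise, and binarize it before handing it to an OCR engine. It uses the Pillow imaging library; the file name and threshold are arbitrary and are not MEMORIAL's actual parameters.

      from PIL import Image, ImageFilter

      def clean_for_ocr(path, threshold=160):
          page = Image.open(path).convert("L")             # grayscale scan
          page = page.filter(ImageFilter.MedianFilter(3))  # suppress speckle noise
          # Binarize: dark typewritten strokes become black, paper background white
          return page.point(lambda px: 0 if px < threshold else 255, mode="1")

      # cleaned = clean_for_ocr("scanned_page_017.tif")    # hypothetical file name
      # cleaned.save("scanned_page_017_clean.tif")
      # The binarized image would then be handed to a commercial OCR engine.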

  • "The Technology Dream Deferred"
    Today's Engineer (08/05); Templeton, John William

    America's increasing dependency on "temporary" guest-worker programs and the export of technology-related jobs to countries where labor is cheaper constitute a serious blow to African-American tech professionals, particularly those in rural communities and central cities. The Bureau of Labor Statistics estimates that African-Americans comprise only 5.5 percent of the 3.8 million Americans who work for high-tech companies, while the high-tech sector comes in tenth in terms of the percentage of African-Americans employed by the 12 leading industry sectors. African-Americans account for 7 percent of the 1.6 million computer systems design jobs in the U.S., and 5.6 percent of the 1.03 million-member science and technology management and consulting workforce. In contrast, 15.4 percent of the radio/television/cable media workforce is black, as is 13.9 percent of the telecommunications workforce. The percentage of African-American high-tech workers in IT departments at IT-using companies is greater than that at IT-producing companies, but the increased offshoring of business-processing jobs threatens domestic black employment at those companies as well. Studies of San Francisco Bay Area hiring practices conducted by the Coalition for Fair Employment in Silicon Valley over several years would seem to indicate a bias against African-Americans and preferences for foreign workers brought in on H-1B visas. The only positive trend for African-American tech workers in recent years is the increase in black-owned tech companies, which employ over 15,000 workers and earn six times the revenue earned by African-American businesses overall.
    Click Here to View Full Article

  • "Foreign Workers First?"
    InfoWorld (08/29/05) Vol. 27, No. 35, P. 10; Schwartz, Ephraim

    The U.S. Department of Labor's refusal to disclose job openings submitted by U.S. employers seeking to hire workers on H-1B visas, a disclosure that would give domestic workers a fair shot at those jobs, is a sign of disrespect to American labor, according to University of California, Davis, computer science professor Norman Matloff and others. The U.S. Citizenship and Immigration Services (USCIS) says "the H-1B visa program is utilized...to employ foreign workers in specialty occupations that require theoretical or technical expertise in a specialized field." U.S. employers who wish to bring in foreign workers on an H-1B visa to fill a stateside job are required to file a Labor Condition Application (LCA) describing the available position; the DOL has so far received 51,939 LCAs, which Programmers Guild President Kim Berry wants posted on the department's Web site so anyone can apply for the jobs. The DOL's David James says the department does not have the authority to release this information, but Berry counters that no statute specifies the time and manner of LCA disclosure. Furthermore, a foreign worker with a valid H-1B visa may not begin employment until Oct. 1, 2005. Berry reports that an inordinate number of the openings are for computer programmers and software engineers. He acknowledges that employers are not obligated to hire Americans, but says the DOL should be required to make the information available to the public before the positions are filled. The maximum number of H-1Bs allowed in 2006 stands at 58,200, but a USCIS Web site indicates that 22,383 H-1B visas have already been approved and that 29,556 applications are pending.
    Click Here to View Full Article

  • "BLAST"
    Scientist (08/29/05) Vol. 19, No. 16, P. 21; Harding, Anne

    Since it was first developed 15 years ago, the Basic Local Alignment Search Tool (BLAST) has revolutionized bioinformatics by providing a way to sift gene sequence data for sequence homology. BLAST has no direct applications of its own, but sequence homology informs gene function, which in turn informs the development of drugs. BLAST's chief advantages are its speed and its ability to make supercomputing power accessible from users' desktops. Factors contributing to BLAST's popularity include the homogeneity of life at the sequence level, the rise in computing speeds and decline in machine costs brought by the PC revolution, and the emergence of the Web. BLAST allows researchers to compare their particular sequences against the National Center for Biotechnology Information's GenBank database of more than 40 million sequences, returning results for two-thirds of such queries within 22 seconds. The concept of BLAST originated in 1988 with computer scientist Gene Myers, who envisioned an algorithm that could use a seed string to produce a set of matches very similar to the original (a toy seed-and-extend sketch appears below). Myers favored a deterministic algorithm, but a heuristic model conceived by David Lipman eventually won out. Today there are more than 12 BLAST variants, some optimized for high-performance computing, and BLAST's obsolescence, while inevitable, is not imminent.
    Click Here to View Full Article
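
    The seed-and-extend heuristic behind BLAST can be sketched in a few lines: find short exact "seed" matches shared by the query and a database sequence, then extend each seed outward while the characters continue to agree. Real BLAST scores extensions with substitution matrices and assesses their statistical significance; the sequences below are invented.

      K = 4   # seed length

      def seeds(query, subject, k=K):
          """Positions (i, j) where query[i:i+k] == subject[j:j+k]."""
          index = {}
          for j in range(len(subject) - k + 1):
              index.setdefault(subject[j:j + k], []).append(j)
          return [(i, j)
                  for i in range(len(query) - k + 1)
                  for j in index.get(query[i:i + k], [])]

      def extend(query, subject, i, j, k=K):
          """Grow an exact seed match outward as far as the characters agree."""
          left, right = 0, k
          while (i - left - 1 >= 0 and j - left - 1 >= 0
                 and query[i - left - 1] == subject[j - left - 1]):
              left += 1
          while (i + right < len(query) and j + right < len(subject)
                 and query[i + right] == subject[j + right]):
              right += 1
          return query[i - left:i + right]

      query, subject = "GATTACAGGT", "TTGATTACACCAGGT"
      for i, j in seeds(query, subject):
          print(extend(query, subject, i, j))   # prints "GATTACA" and "CAGGT" hits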

  • "The Invasion of the Chinese Cyberspies (And the Man Who Tried to Stop Them)"
    Time (09/05/05) Vol. 166, No. 10, P. 34; Thornburgh, Nathan; Forney, Matthew; Bennett, Brian

    The revelation that a ring of Chinese hackers, collectively known as Titan Rain, has for some time been launching coordinated attacks on sensitive and seemingly secure U.S. networks to steal data has unsettling implications for U.S. security. The Department of Defense issued a warning that Titan Rain could be not only a coalition of data thieves but also a staging point for more critical attacks that could hijack or cripple certain U.S. military networks. Such threats are compounded by the fact that federal investigators must jump through bureaucratic hoops to gain authorization to track down and neutralize foreign cyberspies, while concerns that such probes could trigger international incidents only add to the delicacy with which investigators must proceed. There is also a shortage of experienced investigators, prompting the intelligence community to encourage, or at least unofficially sanction, freelancers such as former Sandia National Laboratories computer network security analyst Shawn Carpenter, who traced the Titan Rain intrusions to a trio of Chinese routers in the province of Guangdong and dutifully informed the FBI. Sandia dismissed Carpenter because his activities constituted hacking into foreign computers, which is unlawful. Carpenter justifies his actions by saying his case shows the need for reforms if the U.S. is to respond more effectively to cyberthreats. Although Washington has no official position on the power behind Titan Rain, Carpenter and other network-security analysts are convinced that the Chinese government masterminded the attacks.
    Click Here to View Full Article

  • "The Threats Get Nastier"
    InformationWeek (08/29/05) No. 1053, P. 34; Claburn, Thomas; Garvey, Martin J.

    Business technology and security professionals are confident their IT systems are adequately protected against cyberthreats, according to InformationWeek Research's U.S. Information Security Survey 2005, but this attitude belies the fact that worms, viruses, and other forms of malware are more insidious and dangerous than ever. The recent Zotob worm epidemic shows that such threats have not gone away, while the motivation behind such attacks has shifted from bragging rights to financial gain. The most common types of security threats and espionage during the past year were viruses and worms, phishing, denial of service, and Web-scripting language violations, while suspected culprits have included hackers, virus writers, unauthorized and former workers, and organized crime. Seventy-eight percent of survey respondents who believe their vulnerability to cyberthreats has increased or remained steady over the past year say the growing sophistication of such threats is their chief concern, while other anxiety-provoking factors include more ways to attack corporate networks, increased volume of attacks, and more malicious intent. Fifty-one percent of businesses plan to boost their IT security budget this year, while 56 percent of respondents say they are approaching IT security in a more structured way due to the need to conform to government regulations. Enhanced application security, secure remote access, and improved access controls are among the top priorities for these companies. Not only are cyberattacks being launched across multiple modes, but virus writers are taking a cue from hackers and using rootkits to conceal their activities from detection systems. Six percent of companies admit hackers gained access to their customer records, but the actual percentage may be higher if one assumes that some companies are hiding the truth or have been compromised without their knowledge.

  • "Kryder's Law"
    Scientific American (08/05) Vol. 293, No. 2, P. 32; Walter, Chip

    Seagate Technology CTO Mark Kryder, who founded and directed Carnegie Mellon University's Data Storage Systems Center (DSSC), believes new products and applications have made the increasing storage density of hard drives even more important than the expanding power and speed of semiconductors. The emergence of the iPod and the replacement of low-capacity flash memory cards with hard drives are signs of this trend, and Kryder envisions the imminent migration of hard drives into common products, such as automobiles, PDAs, still cameras, and phones. Kryder has played a key role in the advancement of hard-drive technologies since 1982, when he called a conference of hard-drive industry leaders to determine their most important research requirements; the following year he founded the Magnetics Technology Center, which later became the National Science Foundation-funded DSSC. His first priority at the DSSC was to store 4 gigabits of data in a square inch of disk space; by the time he joined Seagate in 1998, the facility was aiming for 100 gigabits. Seagate started shipping drives that store 110 gigabits per square inch earlier this year. However, current hard-drive technologies face the technical challenge of increasingly unstable data storage as the areas magnetized by the disk head shrink. Kryder's team is investigating perpendicular recording as a solution, and Seagate has prototyped the technique to the point that at least 200 gigabits could be crammed into a square inch within the next several years. To fulfill his vision of packing a terabit of data into a square inch, Kryder has set up several projects focusing on such methods as heat-assisted magnetic recording (a quick calculation of the remaining gap appears below); he figures hard drives will be smaller and less expensive than holographic storage when the terabit benchmark is reached.
    Click Here to View Full Article
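
    Taking the figures quoted above as areal densities in gigabits per square inch (an assumption of this sketch), a quick calculation shows how far shipping drives and the perpendicular-recording prototypes remain from the terabit-per-square-inch goal.

      import math

      terabit = 1000          # gigabits per square inch
      shipping = 110          # density Seagate ships today, per the article
      perpendicular = 200     # near-term perpendicular-recording target

      for name, density in [("shipping", shipping), ("perpendicular", perpendicular)]:
          factor = terabit / density
          doublings = math.log2(factor)
          print(f"{name}: needs a {factor:.1f}x increase (~{doublings:.1f} doublings)")
      # shipping: needs a 9.1x increase (~3.2 doublings)
      # perpendicular: needs a 5.0x increase (~2.3 doublings)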

