
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 710:  Monday, October 25, 2004

  • "Electronic Voting Raises New Issues"
    Washington Post (10/25/04) P. A6; Keating, Dan

    Electronic voting systems' lack of a verifiable paper trail, their software's uncertain reliability, and questions of local election officials' competence in operating such machines have fueled an undercurrent of mistrust that both manufacturers and local election administrators dismiss as paranoia. Twenty-eight U.S. states as well as the District of Columbia have deployed e-voting systems, which are expected to be used by one-third of the country's voters in the Nov. 2 presidential election. The paperless nature of the machines, which makes accurate recounts impossible and leaves no clear evidence of hacking or error, is a particularly sore point for critics. "Computer scientists want technical solutions, and the election supervisors just want to get rid of their paper," notes Steve Ansolabehere, former co-director of the Caltech/MIT Voting Technology Project. Such complaints led several states to carry out audits of e-voting systems, which uncovered security vulnerabilities, software bugs, and a lack of software certification by independent testing facilities. The results convinced the state of Ohio to retain its existing voting system, while both California and Maryland will take added security steps and use "parallel testing" to ensure that the machines are counting properly and do not have buried software switches that could purloin votes on Nov. 2. Election officials believe these improvements will bolster the election against the one problem whose existence everyone acknowledges--human error, in the form of unprepared poll workers. "We need to design and test our ballots and train our poll workers so they know what they're doing," insists Caltech/MIT Voting Technology Project co-director Ted Selker.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    For information on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "From Zzstructures to mSpaces: New Ways to Compare Web Navigation Tools"
    Innovations Report (10/22/04); Lewis, Joyce

    Monica Schraefel of the University of Southampton's School of Electronics and Computer Science (ECS) and Michael McGuffin of the University of Toronto's Department of Computer Science earned an ACM Special Research Distinction for a paper, presented at this year's ACM Hypertext Conference in August, that describes "hyperstructures" in graph-theoretic terms. Hyperstructures enable hypertext data such as the Web to be rendered in ways that reveal the multiple relationships between the information in the pages, in addition to the links between pages. Among the hyperstructures the paper focuses on are ECS Visiting Professor Ted Nelson's zzstructures and Schraefel's mSpaces. By expressing both as graphs, Schraefel and McGuffin could compare these two hyperstructures directly (a simple sketch of the idea appears below). By developing formalized descriptions, the researchers aimed to furnish a straightforward method designers could use to compare hyperstructure properties in order to determine the approaches most suitable for their information design requirements. Schraefel says, "By considering new models for representing information which go beyond generic organizing structures like the lists we see from a Google search, we can consider equally new approaches for representing hypermedia information spaces that let us explore the relationships among the information, rather than just the data in a page."
    Click Here to View Full Article
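
    To make the graph-theoretic idea concrete, here is a minimal Python sketch of a zzstructure modeled as an edge-colored directed graph in which each node has at most one link per "dimension"; the class and method names are illustrative, not taken from the paper.

      # A zzstructure as an edge-colored directed graph: every node has at
      # most one outbound and one inbound link per "dimension" (edge color).
      class ZzStructure:
          def __init__(self):
              self.links = {}   # links[dimension][node] -> successor node

          def connect(self, dimension, a, b):
              dim = self.links.setdefault(dimension, {})
              if a in dim or b in dim.values():
                  raise ValueError("one link per node per dimension")
              dim[a] = b

          def rank(self, dimension, start):
              """Walk the chain of nodes along one dimension from 'start'."""
              chain, dim = [start], self.links.get(dimension, {})
              while chain[-1] in dim:
                  chain.append(dim[chain[-1]])
              return chain

      z = ZzStructure()
      z.connect("author", "paper1", "paper2")  # same-author dimension
      z.connect("venue", "paper1", "paper3")   # same-venue dimension
      print(z.rank("author", "paper1"))        # ['paper1', 'paper2']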

  • "Caltech Seeking to Blend Sciences"
    Pasadena Star-News (10/23/04); Groshong, Kimm

    Caltech aims to transform itself and the way its students and faculty conduct research by combining the disciplines of physics, biology, chemistry, computer science, mathematics, and engineering under the aegis of its Information Science and Technology (IST) program, thus easing and expanding cross-disciplinary collaboration. Computer science professor Andre DeHon remarks that such cooperation is important to students headed for graduate school, who must obtain an understanding across all related fields. The questions participating IST researchers will seek to answer are linked by their relevance to information, broadly defined to include its generation, sharing, storage, encryption, retrieval, and societal impact. The IST initiative will be divided among four centers, or cross-disciplinary research groups: The Center for Biological Circuit Design will study the storage, use, and communication of information by organisms; the Social Information Science Laboratory will focus on social systems; the Center for the Physics of Information will advance computer technology by dealing with problems such as what materials future computer chips should use once silicon's capacity has played out; and the Mathematics of Information center will accommodate these various mixed-discipline projects by developing a new mathematical language for computational advancement. IST will involve the participation of about 50 percent of Caltech's students and about 25 percent of its faculty, and the institute expects to raise $100 million to support the program by 2007. The initiative has started to plan new undergraduate and graduate programs, classes, and a campus facility dedicated to collaboration. IST director Jehoshua Bruck reports that Caltech is the ideal environment for such an initiative, since a historical precedent for cross-discipline collaboration already exists and the institute's modest size eases cooperation between students and researchers.
    Click Here to View Full Article

  • "CFI Research Projects Could Rewrite Computing Rules"
    ITBusiness.ca (10/21/04); Sutton, Neil

    The Canada Foundation for Innovation (CFI) announced the allocation of funding to 135 projects distributed throughout 32 Canadian research institutions on Oct. 20. The winning proposals were judged according to their strengths, their researchers' credentials, and the projects' potential to be used as training tools for future generations of Canadian scientists and to ultimately benefit Canadian society and the economy. One of the projects to receive CFI funding involves interactive character animation developed by professor Sageev Oore of St. Mary's University in Halifax. Oore's goal is to create multimedia interfaces that transcend the mouse/keyboard model, and his latest breakthrough allows people to interact with onscreen 3D characters using a tool that would usually be employed to animate puppets: "It's effectively virtual puppeteering where you use input devices that are like puppet sticks except that they're connected to sensors," Oore explains. Another CFI recipient is Queen's University researcher Ying Zou's proposal to improve software upgrades, a collaborative effort involving partners at IBM Canada and the University of Waterloo. Zou is deconstructing the source code of IBM's WebSphere e-commerce applications in order to streamline the upgrading process for developers working on subsequent permutations of the software. "My research tools...help the developer understand existing software...and guarantee the quality of the software won't be decreased," Zou notes; her research will also aid in predicting the performance of a segment of software and its response to structural alterations throughout the development process. Oore and Zou's projects will together comprise a $23.7 million block of CFI funding.
    Click Here to View Full Article

  • "Professor Blames Limited Training for High Level of Software Failures"
    Computer Weekly (10/20/04); Kavanagh, John

    A lack of proper training is a big reason why software failures will continue to occur, according to John Knight, a computer science professor at the University of Virginia. Speaking at a meeting of the BCS Safety-Critical Systems Club, Knight acknowledged that there will always be software failures, but added that the way software is engineered is unacceptable. Knight said students are not given a basic education in software development. "They can often produce software that provides certain basic functionality, but they fail to understand or are completely unaware of topics such as the crucial importance of specification, the difficulties that arise in concurrent programs, the limitations of testing, the effects of rounding error or the lack of timing predictability in processors," said Knight. An electrical engineer can obtain a specific degree in electrical engineering, but computer science undergraduates are likely to have limited coursework in software engineering. And there is no consensus on the areas in which a software engineering student should be trained. "To be a successful and responsible professional in safety-critical systems, a developer must understand the intricacies of a large number of fields, including real-time systems, formal specification and dependability assessment, among others," said Knight.
    Click Here to View Full Article
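
    One of the gaps Knight cites, the effect of rounding error, is easy to demonstrate; the snippet below is a generic Python illustration, not an example from his talk.

      # 0.1 has no exact binary floating-point representation, so
      # accumulating it drifts away from the mathematically exact result.
      total = sum(0.1 for _ in range(10))
      print(total == 1.0)               # False
      print(total)                      # 0.9999999999999999
      print(abs(total - 1.0) < 1e-9)    # True: compare with a tolerance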

  • "A Quantum Leap Forward for Computing"
    IST Results (10/21/04)

    European research into quantum information processing and communication (QIPC) supported by IST's Future and Emerging Technologies branch was detailed last month at the fifth QIPC workshop. QIPC combines quantum physics and computer science to effect the storage, processing, and transmission of information in accordance with the laws of quantum mechanics; advanced simulations, massively parallel processing power, and virtually unbreakable encryption systems are among QIPC's potential applications. Among the projects highlighted at the workshop was the SQUBIT-2 initiative, whose goal is to probe the applications of quantum bits (qubits) made from superconducting electrical circuits known as Josephson junctions. SQUBIT-2 coordinator and Chalmers University of Technology professor Goeran Wendin commented, "Reaching 1,000 state cycles within the superconducting qubit's lifetime, or decoherence time, was a challenging goal but we have achieved it now." Qubits, unlike conventional computing bits, can represent 1 and 0 simultaneously, and manipulating series of qubits via quantum entanglement is the mechanism of quantum calculation. Quantum computers can solve parallel processing tasks much faster and with less memory consumption than conventional computers, and though current quantum computing experiments can manage just a few qubits, it is expected that future quantum systems will comprise numerous qubits, thus facilitating advanced quantum modeling. IST European Commission officials Ralph Dum and Antonella Karlson discussed with workshop participants QIPC's prospects in the upcoming Seventh Framework Program for research, with emphasis on producing a strategic report by December and a QIPC roadmap by next May that will document the current state of QIPC R&D and suggest funding priorities.
    Click Here to View Full Article
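
    The superposition property mentioned above has a standard mathematical form worth stating; the notation below is textbook quantum computing, not specific to the SQUBIT-2 work.

      % A qubit is a unit vector in a two-dimensional complex Hilbert space,
      % and an n-qubit register superposes all 2^n classical bit strings:
      \[
        \lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle,
        \qquad \lvert\alpha\rvert^2 + \lvert\beta\rvert^2 = 1
      \]
      \[
        \lvert\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x \lvert x\rangle,
        \qquad \sum_{x} \lvert c_x\rvert^2 = 1
      \]

    Measurement collapses the register to a single basis state x with probability |c_x|^2, which is why quantum algorithms must arrange for the amplitudes of wrong answers to cancel rather than simply reading out all 2^n values at once.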

  • "NEC Strikes Blow in Supercomputer Battle"
    New Scientist (10/21/04); Biever, Celeste

    Japan's NEC expects to recapture the crown for the world's fastest supercomputer by assembling 512 SX-8 nodes--each capable of performing 128 gigaflops--into a 65-teraflop machine that should outpace its nearest competitor, IBM's 36-teraflop Blue Gene, by 80 percent. The SX-8 nodes, which were announced on Oct. 19, are speedier, more power-efficient, and smaller than the building blocks used in the Earth Simulator, NEC's former reigning supercomputing champion. The SX-8 units build on a vector technology already widely employed by climatologists and biologists for simulation, while Blue Gene is a "massively parallel" system based on experimental technology. David Bailey of the Lawrence Berkeley National Lab's (LBL) National Energy Research Scientific Computing Center cautions that NEC may be jumping the gun with the SX-8 announcement, since the technology's projected supercomputing power has yet to be truly demonstrated. "Each time you double the size of the system, you expose another layer of hardware and software bugs," he notes. Blue Gene, which is built from cheap, low-power, off-the-shelf chips rather than expensive, custom-made hardware, is envisioned by many as the template for future supercomputers. California computer scientist Erich Strohmaier says that vector systems such as the SX-8 and massively parallel systems like Blue Gene could complement each other, since the former architecture is good at handling code made up of "regular" calculations, while the latter could be more suited to "irregular" calculations.
    Click Here to View Full Article
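
    The vector-versus-parallel distinction turns largely on memory access patterns. A minimal NumPy sketch (illustrative only, not benchmark code): vector machines like the SX-8 stream through the first, unit-stride computation, while the second, data-dependent gather is the kind of "irregular" work that favors other architectures.

      import numpy as np

      n = 1_000_000
      a, b = np.random.rand(n), np.random.rand(n)

      # "Regular" calculation: unit-stride accesses a vector pipeline loves.
      regular = 2.0 * a + b

      # "Irregular" calculation: each access depends on an index array and
      # can land anywhere in memory, defeating long vector pipelines.
      idx = np.random.randint(0, n, size=n)
      irregular = 2.0 * a[idx] + b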

  • "Toe-to-Toe Over Peer-to-Peer"
    Wired News (10/21/04); Grebb, Michael

    A recent panel discussion between advocacy groups and recording, movie, and technology industry representatives showed that the two sides remain as far apart as ever on technology that can be used to infringe copyright; however, Recording Industry Association of America legislative counsel Mitch Glazier expected at least one industry-approved peer-to-peer (P2P) music service to emerge in the next few months, though he conceded the market outlook for such a service was unclear. Over the last several months, content owners faced off against technology vendors over the Inducing Infringement of Copyrights Act proposal, which was aimed at P2P services Grokster and Morpheus but contained ambiguous language that could have made other technology a target. Cato Institute telecommunications studies director Adam Thierer said compromise seemed impossible, and that such a situation could lead to compulsory licenses for content distributed online. NetCoalition general counsel Markham Erickson said assurances from the entertainment industry that the so-called Induce Act would only be targeted at illegal P2P activity were not supported by the language of the legislation: He pointed out that even basic Internet technology such as Web browsers allows people to copy files. "The entire Internet is one big copying machine," he said. After Induce Act talks fell through, content owners asked the Supreme Court to consider overturning a string of legal defeats in which district and federal courts ruled that P2P services such as Grokster could not be held liable for the illegal acts of their users. If the Supreme Court hears the case and rules in content owners' favor, pressure for legislation will evaporate, but the technology industry would then likely seek counter-legislation protecting fair-use rights, according to Thierer.
    Click Here to View Full Article

  • "Can You See Me Now?"
    Technology Review (10/21/04); Brown, Eric S.

    Technology writer Eric S. Brown has reason to think that videoconferencing technology could at last migrate from the enterprise market to the consumer market thanks to a number of trends, such as a decline in equipment costs, quality enhancements, and tighter travel budgets. These developments are cited in a June 2004 study from Wainhouse Research that also anticipates a surge in videoconferencing sales revenues from $530 million in 2003 to $1.1 billion in 2008. Although the videoconferencing industry realized early on that an IP-based system would ease scheduling and improve integration with conferencing software, the Web, and other applications, Brown writes that it is only recently that better hardware and software have improved IP systems' reliability and tolerance of latency and other network problems. Other trends of potential benefit to videoconferencing include increased travel time and higher gasoline and air travel prices; upgrades in computer speed; steady improvements in broadband speeds; and the maturation of standards such as H.323 and Session Initiation Protocol (SIP), which can interoperate via the H.350 directory services standard. Brown expects most major voice over IP (VoIP) vendors to also offer video over IP next year, while instant messaging is emerging as another videoconferencing platform through its provision of a massive, ready-made directory of real-time users. Videoconferencing could finally achieve spontaneity with IM, in conjunction with the H.350 standard. Brown also identifies the growing sophistication and proliferation of Web conferencing software, which can bundle voice, video, and data conferencing into a single session, as another trend in videoconferencing's favor.
    Click Here to View Full Article

  • "UF Scientist: 'Brain' in a Dish Acts as Autopilot, Living Computer"
    EurekAlert (10/21/04)

    Researchers hope to design biological computers as well as understand the genesis of neural disorders and outline noninvasive treatments by studying a network of 25,000 living rat neurons cultured in a dish by University of Florida professor Thomas DeMarse. The neural network is connected to a desktop computer via a grid of electrodes arranged on the bottom of the dish so that the living "brain" can interact with an F-22 fighter jet flight simulator. The neurons are fed information about flight conditions from the computer. The cortical cells analyze the data and respond by transmitting signals to the aircraft's control system, causing adjustments in the plane's flight path that are likewise relayed to the neurons in a feedback loop. The neural network can control the pitch and roll of the simulated plane in variable weather conditions, but DeMarse explains that the purpose of the experiment is to understand the basic mechanisms of neuron interaction. The National Science Foundation has awarded DeMarse a $500,000 grant to devise a mathematical model of neural computation. "If we can extract the rules of how these neural networks are doing computations like pattern recognition, we can apply that to create novel computing systems," the UF professor remarks. Steven Potter of the Georgia Tech/Emory Department of Biomedical Engineering says neural networks are being studied so that researchers can better comprehend the changes taking place in mammalian brains during the learning process.
    Click Here to View Full Article
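
    The setup is a closed sensorimotor loop: simulator state is encoded as stimulation, and the culture's activity is decoded into control commands. The Python sketch below shows only the shape of such a loop; the stimulate/decode rules are invented placeholders, not DeMarse's protocol.

      import random

      def stimulate(dish, pitch_err, roll_err):
          """Placeholder: model the culture as a leaky integrator of the
          attitude errors it is fed through the electrode grid."""
          dish["pitch"] = 0.8 * dish["pitch"] + 0.5 * pitch_err + random.gauss(0, 0.005)
          dish["roll"] = 0.8 * dish["roll"] + 0.5 * roll_err + random.gauss(0, 0.005)

      def decode(dish):
          """Placeholder: read activity back out as corrective commands."""
          return -dish["pitch"], -dish["roll"]

      dish = {"pitch": 0.0, "roll": 0.0}
      pitch_err, roll_err = 1.0, -0.5            # initial attitude errors
      for _ in range(100):
          stimulate(dish, pitch_err, roll_err)   # simulator -> culture
          d_pitch, d_roll = decode(dish)         # culture -> control surfaces
          pitch_err += 0.1 * d_pitch             # plane responds, closing loop
          roll_err += 0.1 * d_roll
      print(round(pitch_err, 2), round(roll_err, 2))  # both settle near 0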

  • "Wireless Music's New Social Sound"
    The Feature (10/21/04); Pescovitz, David

    The technology for delivering music wirelessly has changed significantly in the 25 years since the debut of the trend-setting Sony Walkman, but the listening experience has not. "We have all of this network infrastructure to deliver music, but the music we share and the way we listen to it hasn't changed to reflect the social and information dynamics that the infrastructure provides," notes Atau Tanaka, who develops future music systems at Sony Computer Science Laboratory Paris. Efforts to enhance the mobile music listening experience include Tanaka's prototype Malleable Mobile Music system, which is designed to engage users in a social activity. The system consists of sensor-equipped PDAs that concurrently play the same wirelessly streamed music, while each listener chooses a specific element in the music to represent him or her. Each device augments the selected musical component in response to sensor input derived from the listener's movement or grip, and increases the element's prominence as listening partners move closer to one another. Meanwhile, researchers led by Mattias Ostergren at Stockholm's Interactive Institute are developing Sound Pryer, a wireless peer-to-peer system that allows people dispersed throughout different automobiles to listen to music together by having each driver send their digital music stream to other cars within Wi-Fi transmission range. Converging vehicles form ad hoc wireless networks and increase the number of available "stations," and tuning in to a specific car causes the driver's PDA to display an icon representing the shape and color of the car. Tanaka remarks that music has historically existed within multiple contexts, and new technologies "can add some of those elements back in to the listening experience."
    Click Here to View Full Article
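
    As a rough sketch of the mixing logic such a system implies, the Python below gives each listener one musical element whose level rises as other listeners come near; the names, distance model, and gain curve are all hypothetical, not Tanaka's implementation.

      import math

      def proximity_boost(distance_m, max_range_m=30.0):
          """Map a listener-to-listener distance onto a 0..1 boost."""
          return max(0.0, 1.0 - distance_m / max_range_m)

      def mix_levels(elements, positions):
          """elements: {listener: element}; positions: {listener: (x, y)}.
          An element grows more prominent as its owner nears the others."""
          levels = {}
          for name, element in elements.items():
              dists = [math.dist(positions[name], positions[other])
                       for other in positions if other != name]
              boost = max((proximity_boost(d) for d in dists), default=0.0)
              levels[element] = 0.5 + 0.5 * boost  # base level plus boost
          return levels

      elements = {"ann": "bassline", "bo": "vocals"}
      print(mix_levels(elements, {"ann": (0, 0), "bo": (6, 8)}))  # 10 m apart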

  • "OASIS Groups to Tackle Utility Computing"
    InternetNews.com (10/21/04); Boulton, Clint

    The OASIS standards organization has partitioned the Data Center Markup Language (DCML) group into a quartet of technical committees to cultivate utility computing standards. The DCML outlines frameworks for how servers, networks, applications, and services can capture siloed data using an automated, on-demand methodology. Customer demand to trim costs and streamline infrastructure is making improved data center performance highly desirable for next-generation computing schemas. The OASIS DCML Framework Technical Committee will focus on the improvement of the current DCML specification; the Applications and Services Technical Committee will develop a homogeneous data model and interchange format to enable the reference and management of Web services and other application elements; the Network Technical Committee will create a data model and XML-based format for sharing data about networking components in the data center; and the Server Technical Committee will effect information interchange between tools and devices, and boost data center automation. Representatives of BEA Systems, Computer Associates, BMC Software, Inkra Networks, Tibco, Electronic Data Systems, and Opsware are participating in the OASIS DCML technical committees, which will convene for the first time on Nov. 15. Analysts thought that separate utility and on-demand computing initiatives from IBM, Hewlett-Packard, Sun Microsystems, and others could endanger the DCML group's objective, but OASIS' decision to host and develop the standard has assuaged such concerns. Nevertheless, IBM and Veritas co-launched the Utility Computing Working Group under the auspices of the Distributed Management Task Force in February.
    Click Here to View Full Article
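
    Because DCML is XML-based, a description of a data center element travels as a structured document that any tool can parse. The fragment generated below is a hypothetical illustration of that idea only; the element and attribute names are invented, not the real DCML schema.

      import xml.etree.ElementTree as ET

      # Invented markup for describing one server; NOT the DCML schema.
      server = ET.Element("server", id="web-01")
      ET.SubElement(server, "os").text = "Linux 2.6"
      ET.SubElement(server, "cpu", count="2").text = "x86"
      ET.SubElement(server, "service", name="httpd", port="80")

      print(ET.tostring(server, encoding="unicode"))
      # <server id="web-01"><os>Linux 2.6</os><cpu count="2">x86</cpu>
      # <service name="httpd" port="80" /></server>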

  • "Report: Women Execs Hold Their Own in Some But Not All IT Sectors"
    Computerworld Australia (10/21/04); McBride, Siobhan

    The Equal Opportunity for Women in the Workplace Agency's annual Census for Women in Leadership study finds that female executives have made significant gains in Australia's software and services industries, but are still least represented in its hardware sector. The Australian Information Industry Association reports that female participation in the ICT arena has increased from 25 percent to 50 percent, yet current representation is still too low. Citrix marketing director Glynis Marks notes that within the software industry, women have a stronger presence in marketing than in sales and technical departments. "I guess if you look about companies that sustain women in their organization and promote them it really comes down to the culture of that company and their approach to issues like [pregnancy]," explains Microsoft's Tracey Fellows, who was hired as a manager when she was pregnant. Still, she reasons that "software has really come along in the past 10 years, and software also relies on more types of vertical knowledge, so women that have different professional backgrounds, other than technical, can bring this sort of knowledge to the industry." Gartner research director Greta James attributes the disproportionate numbers of female execs in hardware and software to the age of the respective sectors: Whereas software is a relatively new industry, hardware has been around much longer. She notes, however, that groups such as Females in Information Technology & Telecommunications are helping to fill the void by cultivating skilled IT women and providing positive female role models. Meanwhile, IBM and the New South Wales Department of Education and Training will soon kick off the Exploring Interests in Technology and Engineering camp, which provides instruction in emerging technologies to girls between the ages of 13 and 15.
    Click Here to View Full Article

  • "Smart Fabrics Make for Enhanced Living"
    New Scientist (10/20/04); Biever, Celeste

    MIT engineers are working on smart fabrics that can imbue objects with the ability to provide information or sense the environment. Each fabric "patch" features a module equipped with a microprocessor and memory, along with either a sensor, a radio transceiver, batteries, a microphone, or a display. These patches can be joined together via modified Velcro that enables both physical and electrical links; a wire connection between the circuit board and silver-coated contacts in the Velcro allows data and electricity to flow between modules, and the circuit board is waterproofed by a resin coating. Unlike other kinds of apparel-enhancing technology, the patch modules can be re-tasked for different functions. "These smart patches open up the idea of having computer interfaces that you can rapidly customize to suit your life," notes Georgia Institute of Technology pervasive computing researcher Thad Starner. MIT Media Lab developers Adrian Cable and Gauri Nanda have built a module with a radio antenna and receiver that scans for signals from radio frequency identification tags on objects, and fitted it onto a handbag. When a sensor module in the bag's handle detects that the bag has been picked up, the reader runs down the list of objects the computer module has been programmed to look for; failure to confirm the presence of a required item triggers a warning from a voice synthesizer module in another patch. Cable and Nanda plan to upgrade the system's intelligence with the addition of a Bluetooth chip to support an Internet connection and data downloading through a nearby computer.
    Click Here to View Full Article
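
    The bag's behavior boils down to a set difference between a packing list and the RFID tags the reader can currently see. A minimal Python sketch of that logic follows; the function and item names are illustrative, not from the MIT prototype.

      REQUIRED = {"wallet", "keys", "phone"}   # items the bag should contain

      def scan_rfid_tags():
          """Placeholder for the radio module's scan of nearby tags."""
          return {"wallet", "keys"}            # the phone's tag is absent

      def on_pickup():
          """Triggered by the sensor module when the handle is lifted."""
          for item in sorted(REQUIRED - scan_rfid_tags()):
              print(f"Warning: {item} is missing")  # spoken by voice module

      on_pickup()   # -> Warning: phone is missing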

  • "Scientists Can Now Read Your Eyes--Literally!"
    expressindia (10/25/04)

    Columbia University computer science professor Shree Nayar has developed a computerized technique to extract data about a person's surroundings from images reflected on the surface of the human eye. The corneal imaging system's potential applications include surveillance, journalism, filmmaking, and human-machine interfaces. The technique is based on the understanding that the cornea of the eye and the camera capturing the appearance of the eye together act as a combination mirror-and-lens imaging system, allowing a wide-angle view of the surrounding environment to be imaged with maximum resolution focused in the direction of the person's gaze. Nayar's partner on the project, Ko Nishino, says the system can reveal more of the surrounding environment than the person in the image or photo can perceive because the system boasts a wider field of view. Professor John Pavlik of New Jersey State University notes that eyewitness accounts of events could be verified or corrected with Nayar's method. Nayar says the corneal image could be used to ascertain the location of a sought-after individual, and the technique's ability to reconstruct the lighting conditions around that individual could be used by filmmakers, for instance by replacing one actor's face with another's while keeping illumination consistent. Nishino explains that this ability could also find application in computer graphics. Nayar believes the system could expand quadriplegics' ability to interact with many devices through eye movements, while Nishino thinks the method offers a simple way to program robots for complicated tasks: "For example, knowing what part of an object people look at while accomplishing an assembly task can be useful info for designing assembly work flows," he says.
    Click Here to View Full Article
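
    The geometry underneath the technique is ordinary specular reflection: each pixel on the cornea corresponds to a scene ray mirrored about the local surface normal. The Python sketch below shows that single step; Nayar's full system additionally needs a geometric model of the cornea and camera calibration.

      import numpy as np

      def reflect(d, n):
          """Mirror incoming ray direction d about unit surface normal n:
          r = d - 2 (d . n) n, the standard specular-reflection formula."""
          d = d / np.linalg.norm(d)
          n = n / np.linalg.norm(n)
          return d - 2.0 * np.dot(d, n) * n

      # A camera ray hits the cornea where the normal tilts 45 degrees; the
      # reflected direction tells us which scene point that pixel observed.
      d = np.array([0.0, 0.0, -1.0])                 # from camera toward eye
      n = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)   # local corneal normal
      print(reflect(d, n))                           # [0. 1. 0.]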

  • "Whatis Happening to Whois?"
    INTA Bulletin (10/15/04) Vol. 59, No. 19; Barritt, Keith

    ICANN requires all registrars of generic top-level domain names to provide a publicly accessible online list of domain name owners' names and postal addresses through the Whois database, but many domain name owners provide inaccurate or incomplete information due to privacy concerns or fraudulent intentions, writes attorney Keith Barritt, chairman of INTA's Whois Subcommittee. In August, ICANN announced two new policies designed to bolster the utility of Whois, both effective on Nov. 12. The Restored Names Accuracy Policy requires that domain names deleted due to the submission of false contact information or a lack of response, and subsequently restored during the 30-day grace period, be placed on hold until updated and accurate data is provided. The Whois Marketing Restriction Policy requires bulk access agreements that prohibit the use of Whois data for marketing purposes or its redistribution except under specifically mentioned circumstances. Three Whois task forces were organized by ICANN to study the following issues: how to limit access to the database for marketing and data mining purposes; the type of data collected by registrars and how that data is shown to the public; and ways of improving the accuracy of data. Preliminary reports from the task forces were issued in May 2004, but final reports are not yet out. Meanwhile, the U.S. House has already passed a measure currently pending Senate approval--the Fraudulent Online Identity Sanctions Act--that would create a presumption of an online violation of the Trademark Act if a violator willfully provides false contact data when registering a domain. In addition, Barritt says INTA is working with EURid to bolster Whois protection in Europe within the framework of the European Data Protection Directive, and he notes that the Canadian Internet Registry Association is reviewing its Whois policies as well.
    Click Here to View Full Article

    For more information on Whois and ICANN, see http://www.acm.org/usacm.

  • "Still a Ways to Go"
    InformationWeek (10/18/04) No. 1010, P. 32; Brown, Patricia

    The rapid pace of technology change has enabled women to reach top positions in the industry. Over the past 20 years, many women have benefited from what they know, rather than who they know. "I was always given a choice of projects to work on and given opportunities to get into leadership at a young age," Tama Olver, CIO of biotech firm Applera, explains of her start as a computer programmer at Control Data in the late 1960s. Of the companies listed in the Standard & Poor's 500 stock index, eight are headed by women, and three of those are tech firms. Although Carly Fiorina at Hewlett-Packard, Anne Mulcahy at Xerox, and Patricia Russo at Lucent Technologies are success stories, the high-tech industry still has more work to do. According to the U.S. Bureau of Labor Statistics, men filled 69 percent of computer and information-systems manager jobs last year. Ilene Lang, founding CEO of AltaVista and currently CEO of Catalyst, which advocates for women in the industry, says women continue to struggle because companies do not objectively identify and develop their talent, adding that women also lack role models and access to informal networks. Lang says large companies tend to provide better support for women.
    Click Here to View Full Article

  • "15 Innovators"
    CRN (10/18/04) No. 1117, P. 19; Longwell, John; Clancy, Heather; Neel, Dan

    Fifteen innovators are recognized by CRN Magazine for identifying complex technical problems and solving them in ways that became widely accepted, thus giving birth to new technological trends. Marcus Ranum is honored for inventing the first proxy firewall and deploying the first commercial firewall, while Barry Appleman's pioneering work with presence awareness at IBM was key to the ubiquity that instant messaging enjoys today. Dennis Moore of SAP developed xApps, a prototype system for developing composite applications that today are used to justify service-oriented architectures; Red Hat VP Michael Tiemann's GNU C++ compiler has become the cornerstone of many widely adopted open-source technologies; and JBoss Chairman Marc Fleury has upended the J2EE server market with his open-source Java application server. IBM's Alan Ganek is recognized for his work with self-managing and self-healing computers modeled after the human body's autonomic nervous system, Stanford professor Mendel Rosenblum has earned accolades with his breakthrough server virtualization technology, and Oracle VP Andrew Mendelsohn's Oracle Database 10g software suite is widely acknowledged as a milestone of grid computing enablement. Other honorees include Vic Hayes, who supervised the development of the 802.11 specification as chairman of the IEEE's WLAN working group; Research In Motion President Mike Lazaridis, whose company developed the BlackBerry wireless messaging system; and Anand Chandrasekher of Intel, who helped make wireless connectivity pervasive with the Centrino wireless platform. Amedeo CEO T. Peter Brody's radical use of thin-film transistors to control pixels in liquid crystal displays has had a profound impact on today's flat-panel display technologies, and Advanced Micro Devices' Dirk Meyer oversaw the development of the Opteron processor, which made waves by supporting both 32- and 64-bit applications.
    Click Here to View Full Article

  • "ACM's Professional Development Centre Adds Books, ITPro Collection"
    ACM (10/25/04)

    The ACM Professional Development Centre offers members free access to 395 online volumes covering the latest IT topics. The listing page (http://pd.acm.org/books/books.cfm) provides direct links to full-text volumes from Books24x7 as well as to citation information. For an additional $249 USD, ACM members can now gain access to an expanded collection of online books in the ITPro Collection, which provides extensive coverage of more than 100 technology topics. Premier industry publishers Wrox, McGraw-Hill, and Microsoft Press are among the contributors to this collection. Popular book series such as The Complete Reference, Inside Out, and Bibles are also included, as is community-driven content covering specific as well as emerging technologies.
    https://pd.acm.org/itpro

    For more information about the PDC visit
    http://pd.acm.org/

  • "Focusing Enterprise Search"
    InfoWorld (10/18/04) Vol. 26, No. 42, P. 36; Gincel, Richard; Heck, Mike

    Enterprise search platforms (ESPs) comb through multiple repositories of structured and unstructured data to refine search results using a combination of natural language processing (NLP), entity extraction, autocategorization, and other technologies. ESPs can enable businesses to build customized search applications while automating the preparation of documents for archival and tabulation; the platforms also offer a common set of data and search logic that can be adjusted for individual applications to enhance the search results' relevance. Delphi Group's Hadley Reynolds notes that basic federated search merely retrieves irrelevant results from multiple search engines rather than just one, but ESPs can circumvent this difficulty by applying syntax translation, spell-checking, phrase detection, and other linguistic controls to the query before scouring the data repositories (see the sketch following this article). The ESP then returns lists of enhanced query selections, based on the context of the original query, to the user. Reynolds reports that ESPs are ushering in a new search-indexing process for unstructured data with technologies that enable dynamic categorization or targeted text analytics within the document parsing, query evaluation, and relevant data retrieval processes. Meanwhile, entity extraction uses grammatical analysis to allow a search engine to pull terms from indexed content on the fly, and NLP can make weak queries stronger. FAST VP Andrew McKay says the manual definition of document properties is losing ground to intelligent search platforms' "custom logic"-based autotagging ability, while ESPs' automatic generation of metadata elements allows patterns in the content to be uncovered, thus increasing the content's value inside the platform framework. ESPs "must be tied into the collaborative tools of the organization" if they are to work successfully, explains Susan Feldman of IDC.
    Click Here to View Full Article
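
    The query-conditioning step Reynolds describes (normalize, spell-correct, then fan the query out and merge results) can be sketched generically in Python; the correction table and repository interface below are stand-ins, not any vendor's API.

      CORRECTIONS = {"serch": "search", "entreprise": "enterprise"}  # toy map

      def condition(query):
          """Normalize and spell-correct a raw query before federating it."""
          return " ".join(CORRECTIONS.get(w, w) for w in query.lower().split())

      def federated_search(query, repositories):
          """Send one conditioned query to every repository, merge by score."""
          q = condition(query)
          hits = []
          for repo in repositories:
              hits.extend(repo(q))          # each repo yields (score, doc)
          return [doc for _, doc in sorted(hits, reverse=True)]

      # Stand-in repositories: trivial keyword scorers over tiny corpora.
      mail = lambda q: [(0.9, "memo.txt")] if "enterprise" in q else []
      wiki = lambda q: [(0.4, "wiki/Search")] if "search" in q else []
      print(federated_search("Entreprise Serch", [mail, wiki]))
      # ['memo.txt', 'wiki/Search']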


 