
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 795:  Monday, May 23, 2005

  • "Student Interest in Computer Science Plummets"
    Chronicle of Higher Education (05/27/05) Vol. 51, No. 38, P. A31; Foster, Andrea L.

    Student interest in computer science has dwindled over the last several years, which gives technology companies and scientific researchers cause for worry; tech firms cannot maintain their competitiveness without IT professionals, while researchers say they would be hard pressed to solve homeland security problems and achieve scientific breakthroughs if intellectual talent is in short supply. A May report from the Computing Research Association finds that the number of newly declared computer science majors fell 32 percent between the fall of 2000 and the fall of 2004, while a survey from UCLA's Higher Education Research Institute indicates a 59 percent decline over the last four years in the number of incoming freshmen interested in becoming computer science majors. Undergraduates and computer scientists chiefly attribute student disinterest to the dot-com implosion and news-media reports of tech outsourcing, though computer science professors are quick to note that most reports of the field hemorrhaging jobs to other countries are distorted. Other factors blamed for the decline of computer science majors include the lure of more glamorous disciplines such as molecular biology; erosion of students' math skills; the emergence of information technology degrees; and negative stereotypes of computer scientists. Various institutions, the National Science Foundation (NSF) among them, are trying to attract more students to computer science, particularly women, minorities, and the disabled. Institutions attempting to recruit students from such groups will receive $14 million for fiscal 2006 from the NSF's Broadening Participation in Computing program. University of Oregon professor Janice Cuny recommends that colleges reorganize their curricula to enlist more women by promoting the practical aspects of computer science.
    Click Here to View Full Article

  • "Giving Innovation a Boost"
    HPC Wire (05/20/05) Vol. 14, No. 20; Barker, Trish

    The Distributed Innovation and Scalable Collaboration in Uncertain Settings (DISCUS) tool developed by researchers at the National Center for Supercomputing Applications (NCSA) and the Illinois Genetic Algorithms Laboratory (IlliGAL) is designed to enable creativity, innovation, and collaborative work in complex scenarios. Funded by the Technology Research, Education, and Commercialization Center program underwritten by the Office of Naval Research, DISCUS fuses IlliGAL's genetic algorithm research, data mining and text mining research from NCSA's Automated Learning Group, chance discovery and the KeyGraph method, and collaboration tools into an easy-to-use architecture. The DISCUS prototype features information-sharing among group discussions; archival of discussions for later review and examination; data and text analysis through the NCSA's Data to Knowledge and ThemeWeaver tools; user-centric genetic algorithm models to support innovation; and real-time KeyGraph analysis to facilitate scenario study, topic highlighting, and collaborative data-sharing. The prototype was used to help Japan's Hakuhodo Institute of Life and Living collate and analyze data about consumer cell phone preferences in order to help clients discover nascent markets and develop new products. DISCUS not only speeds up focus-group studies, but also lets focus groups be conducted by a smaller number of moderators. DISCUS developers are refining the interface based on evaluations users made during the Hakuhodo experiment, while development of the innovation and creativity support tools to be embedded in the next DISCUS iteration is underway. Researchers at Hewlett-Packard's Japanese division are also discussing the application of DISCUS to the process of aggregating software development needs.
    Click Here to View Full Article
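
    The "user-centric genetic algorithm" support mentioned above can be pictured with a small, purely illustrative sketch of an interactive genetic algorithm in which the fitness signal comes from human ratings (for example, a focus group) rather than from an objective function. The Python code below shows the general technique only; the names and parameters are assumptions, not the IlliGAL/DISCUS implementation.

        # Illustrative sketch of a "user-centric" (interactive) genetic algorithm:
        # the fitness score comes from human ratings instead of an objective function.
        # This shows the general technique only; it is not the IlliGAL/DISCUS code.
        import random

        def evolve(population, rate, generations=5, mutation=0.1):
            """population: candidate ideas encoded as strings of option flags.
            rate(candidate) -> number: a human-supplied score, e.g. from a focus group."""
            for _ in range(generations):
                scored = sorted(population, key=rate, reverse=True)
                parents = scored[: len(scored) // 2]            # keep the better-rated half
                children = []
                while len(children) < len(population) - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, len(a))
                    child = list(a[:cut] + b[cut:])             # one-point crossover
                    for i in range(len(child)):                 # occasional random mutation
                        if random.random() < mutation:
                            child[i] = random.choice("01")
                    children.append("".join(child))
                population = parents + children
            return max(population, key=rate)

        # Stand-in "user" who happens to prefer candidates with more 1s.
        print(evolve(["0000", "1010", "0110", "1111"], rate=lambda c: c.count("1")))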

  • "Innovative Curriculum Helps Blind Students Get Start in Computer Science Studies"
    Winona Daily News (MN) (05/22/05); Fiecke, Shannon

    Saint Mary's University and Winona State University (WSU) coordinate the Computer Science Curriculum Accessibility Program (CSCAP), an effort to help visually impaired students pursue computer science careers that was initially funded by a $450,000 National Science Foundation grant. The first CSCAP students have helped adapt the math and computer science curriculum to the needs of the visually impaired in return for tuition, room and board, tutoring, and equipment. When the program was launched four years ago, computer science publishers had to send textbook authors' raw files on CD so blind students could hear the text with the aid of screen readers. WSU professor Joan Francioni says most companies now provide electronic textbooks, although screen readers still cannot render items such as diagrams and math equations. Instructors have developed alternative methods for conveying such information, including straight lecturing and producing tactile versions of graphics and mathematical notation with a Braille printer. The JavaSpeak tool enables visually impaired students to learn the Java programming language by verbalizing information about a program's structure and semantics. Francioni says CSCAP plans to recruit more students and test new computer tools as it evolves. She says the program's goal is to level the playing field for visually impaired students without watering down the curriculum's rigor.
    Click Here to View Full Article
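
    To illustrate what a tool in the spirit of JavaSpeak does, the hypothetical sketch below walks a program's syntax tree and emits plain-English phrases a screen reader could speak aloud. It uses Python's ast module for brevity and is only an analogy; it is not JavaSpeak itself, which works on Java programs.

        # Hypothetical analogy to JavaSpeak: describe program structure in words
        # that a screen reader can speak. Uses Python's ast module, not Java.
        import ast

        SOURCE = (
            "def area(radius):\n"
            "    if radius < 0:\n"
            "        raise ValueError('negative radius')\n"
            "    return 3.14159 * radius ** 2\n"
        )

        def verbalize(source):
            """Return a spoken-style description of the program's structure."""
            phrases = []
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, ast.FunctionDef):
                    phrases.append(f"function {node.name} taking {len(node.args.args)} argument(s)")
                elif isinstance(node, ast.If):
                    phrases.append("if statement begins")
                elif isinstance(node, ast.Return):
                    phrases.append("return statement")
            return ". ".join(phrases)

        # Text a screen reader would speak aloud for the snippet above.
        print(verbalize(SOURCE))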

  • "NCSA Shows Off Its Latest Technologies"
    Champaign News-Gazette (IL) (05/19/05); Kline, Greg

    The National Center for Supercomputing Applications (NCSA) showcased its ongoing research to partners and prospective partners at the Private Sector Program meeting this month. NCSA researcher Alan Craig described a new Internet search application called VIAS that enables users to easily narrow results, cross-search, and categorize results based on, for instance, the names or qualifications of people. Another NCSA project called Search Customization and Visualization automatically filters results according to users' interests. Many of the NCSA projects displayed at the meeting focused on online collaboration, data mining, or automated information-processing technologies that make large amounts of data more usable. Though search engines such as Google can pull up millions of results for some queries, the difficult part for users is identifying relevant information, says Craig, who is part of the NCSA Visualization and Virtual Reality Group. Craig and colleague Kalev Leetaru have developed an "editable Web browser" that enables non-technical users to easily create and maintain Web pages, something NCSA officials say could be especially useful to small businesses that do not have dedicated IT staff. NCSA's Private Sector Program helps generate new partnerships and lets companies take a look at licensable technologies.
    Click Here to View Full Article

  • "'Madagascar' Pushes Tech Limits"
    CNet (05/23/05); Olsen, Stefanie

    DreamWorks Animation's upcoming film "Madagascar" depicts many colorful furry animals, but rendering such creatures in a believable way--and in large groups--was a challenge for animators. Computing power breakthroughs helped them meet this challenge, and have also helped computer animation reach new heights that have earned animators and producers accolades, awards, and big box office. DreamWorks developed algorithms for lighting, surface animation, and characters in-house. The studio gave "Madagascar's" animal actors a stylized, cartoony appearance by constructing them out of models of tubes, enabling contortions that would be invisible to viewers yet remain true to their character, according to head of production design Kendal Cronkhite. Computer animation has largely removed the need to draw every frame by hand, the painstaking in-betweening that traditional animators rely on to create the illusion of fluid character movement, since the software interpolates between key poses. However, physical models of characters and environments are still a vital starting point for any animated film. Once those are furnished, the production team fleshes out environmental details such as textures and colors, while landscaping, layout, and camera angles are the responsibility of the design team; the last step, imbuing the characters with life, is the animators' job. Competition between animation companies has intensified, raising expectations of even more stunning effects, such as synthetic characters so perfectly rendered as to be virtually indistinguishable from human beings.
    Click Here to View Full Article

  • "Next for BitTorrent: Search"
    Wired News (05/23/05); Poulsen, Kevin

    BitTorrent is preparing a search function for its open-source client that will make downloads much easier to find and manage, but the free search tool could invite a lawsuit from the movie industry. The search engine, built in partnership with Ask Jeeves, which will share the advertising revenue it generates with BitTorrent, will appear on the BitTorrent Web site where new users learn about and download the application. BitTorrent has turned conventional file-sharing on its head by making downloaders share the file rather than relying on a single publisher; as a result, in-demand files are actually more available than obscure ones. The new search engine will build this property into its ranking system, ordering results not only by relevance but also by availability, meaning the number of nodes currently downloading the file. BitTorrent chief operating officer Ashwin Navin says the technology is far more than a facilitator of copyright piracy, and is ideal for people who want to publish their own videos, music, or software without worrying about hosting. The Motion Picture Association of America (MPAA) has targeted popular BitTorrent clearinghouses, and may go after BitTorrent itself now that the search function gives the company a somewhat centralized role, according to Stanford University law professor Mark Lemley. Although the Digital Millennium Copyright Act exempts "information location tools," it requires administrators to disable links upon notice of copyright infringement, a burden that could strain the five-person BitTorrent operation. On the other hand, the MPAA may decide to leave the search function alone, since it makes the most-downloaded infringing files easy to identify.
    Click Here to View Full Article
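
    A minimal sketch of the ranking idea described above: order results by a blend of text relevance and swarm availability. The field names and the simple weighted-sum formula in the Python below are illustrative assumptions, not BitTorrent's or Ask Jeeves' actual algorithm.

        # Rank torrent search results by text relevance and by availability
        # (how many nodes are currently sharing the file), so well-seeded
        # files rise above obscure ones. Field names and weights are assumed.
        from dataclasses import dataclass
        import math

        @dataclass
        class TorrentResult:
            title: str
            relevance: float   # text-match score from the search engine, 0.0-1.0 (assumed scale)
            peers: int         # nodes currently downloading/sharing the file

        def rank(results, availability_weight=0.3):
            def score(r):
                # Log-scale the peer count so swarm size does not grow the score linearly.
                return ((1 - availability_weight) * r.relevance
                        + availability_weight * math.log1p(r.peers))
            return sorted(results, key=score, reverse=True)

        results = [
            TorrentResult("popular-open-source-distro.iso", relevance=0.70, peers=4500),
            TorrentResult("obscure-but-exact-match.iso", relevance=0.95, peers=3),
        ]
        for r in rank(results):
            print(r.title)   # the well-seeded file ranks first despite lower relevance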

  • "Wouldn't It Be Nice..."
    Wall Street Journal (05/23/05) P. R6; Rutkoff, Aaron

    Asked what kinds of technologies and innovations they would like to see, tech pioneers, inventors, and trend observers responded with a wealth of gadgets. Gizmodo.com editor Joel Johnson envisions a music player that allows users to exchange songs between devices, and adding Wi-Fi could enable the instant acquisition of content from strangers' players; Johnson says such devices are technically feasible, but warns that fear of copyright violation is a major impediment. CDW founder Michael Krasny desires a device that integrates the functions of a BlackBerry, a cell phone, an iPod, and a planner, but with a flexible, roll-up display that can be plugged into the gadget instead of a rigid, built-in screen. Gordon Bell of Microsoft's Media Presence Research Group is personally testing a prototype for his group's MyLifeBits project, whose goal is to create an archive of practically every aspect of life experience to act as a surrogate memory. The system involves a wearable gadget that snaps pictures and records audio notes throughout the course of a day, and plans are underway to equip the device with GPS to map the user's exact location. Electronic Frontier Foundation founder John Perry Barlow's immediate need is for a universal plug, compatible with any device, that can eliminate the snarl of cords and plugs typical of current electronic gear, while further off he would like an implantable computer interface. DiamondCluster International's John Sviokla expects the cell phone to become the mainstream successor to the PC, but this can only happen if cell phone companies cede control of the phone's capabilities and features to innovators. Sviokla's personal preferences include a cell phone with a scanner to store business-card information; a GPS; a flash-memory reader so that the device can function as a personal hard drive; a radar detector; and a fingerprint reader.

  • "The Grand Convergence in 2010"
    ZDNet (05/18/05); Farber, Dan

    By the end of the decade, increasing connectivity, ambient intelligence, and more mature semantic technologies will enable powerful new opportunities similar to those brought on by the Web, according to IT research firm Gartner. Though predictions of disruptive technologies are somewhat misleading and foster expectations of sudden and improbable changes, each of the three critical technologies identified by Gartner is definitely on track: Ubiquitous access is rapidly emerging, and Gartner analyst Nick Jones predicts average users will connect to six networks each day and utilize a variety of devices. Ambient intelligence is predicated on a number of different technologies, each of which already exists and is moving along the path toward wider deployment. RFID and other embedded computing capabilities provide objects with intelligence, and the cost barriers are quickly falling so that companies can start employing ambient intelligence for a number of new applications; ambient intelligence also opens up new consumer applications, especially in combination with OLED display technology. OLED displays could be used to present repair manuals, or key performance indicators for businesses or for personal health. Mesh networking will also play an important role in ambient intelligence, and there are already several companies exploring the convergence of mesh networking and sensor technology, notes Gartner. Gartner also says collective intelligence--the gathering and dissemination of information in a community--will bring 10 percent productivity increases by 2015 by providing more accurate forecasts and more democratic decision-making. Semantic technologies are also identified as a key enabler of a new generation of information-sharing among machines.
    Click Here to View Full Article

  • "New Cornell Institute Will Apply Artificial Intelligence to Decision Making and Data Searches"
    Cornell News (05/18/05); Steele, Bill

    Cornell University's new Intelligent Information Systems Institute shows how far the university has come in its involvement in artificial intelligence, a field that had no presence on campus not too long ago. The institute is the result of a $5 million, five-year grant from the U.S. Air Force Office of Scientific Research. In addition to conducting research into helping computers handle enormous amounts of data and problems that involve numerous choices, the researchers will focus on game theory, information retrieval, and automatic verification of software and hardware. Visiting scientists will work closely at the institute with Cornell computer scientists and other faculty in operations research, applied economics, mathematics, and engineering. The facility has a dedicated cluster of 12 Intel processors operating in parallel that researchers can access from their terminals. The institute is about collaboration between scientists, says director Carla Gomes, associate professor of computing and information science and applied economics and management.
    Click Here to View Full Article

  • "Hacking the Grid"
    Red Herring (05/18/05)

    The U.S. power grid is vulnerable to hackers because the Supervisory Control and Data Acquisition (SCADA) systems the grid relies on have become susceptible as a result of control system standardization brought on by industry consolidation; power companies' linkage of business computers to SCADA systems via the Internet; and the wide online dissemination of hacking knowledge and SCADA system mechanics. Documented cases of public utility system hacks date back more than 10 years. The causes of these intrusions include computer worms, disgruntled employees, thrill- or credibility-seeking adolescents, and government-authorized security testing; there is also documentation suggesting that terrorist organizations are accessing Web sites with information that could be used to hijack power grids. The Homeland Security Advanced Research Projects Agency has funded proposed encryption, authentication, grid monitoring, intrusion detection, and open-source security solutions that must be compatible with existing control systems and secure the SCADA systems' multiple access points. Control system break-ins share a certain similarity to routine control functions, so startups are working to more narrowly define control system intrusions and provide software that looks for more nuanced signs of activity than simple network event patterns. Digital Authentication is trying to generate unpredictable encryption keys and location-based communication authentication through the use of physics equations, while Asier Technology is working on a software solution. Despite the threat SCADA system vulnerability presents to national security, power companies have little incentive to make security investments.
    Click Here to View Full Article

  • "The Sociology of Interfaces"
    TheFeature (05/18/05); Frauenfelder, Mark

    Korean and Finnish university researchers have conducted a new study into how cultural differences influence computer interface requirements, focusing on how Korean, Japanese, and Finnish people responded to different mobile data services and how those responses matched cultural aspects. The researchers designed their survey to rank people's responses in four basic cultural dimensions: Uncertainty avoidance, or the effort people take to maintain predictability; context, or the amount of information needed to fully understand something; individualism vs. collectivism; and the proclivity to multitask. The teams from Yonsei University in Seoul, Korea, and the University of Helsinki created 12 video clips of people in Japan, Korea, and Finland engaging in different mobile data services tasks. Different people from each of those countries were asked their opinions about the mobile data services, and then the responses were used to identify 52 main attributes of mobile data services, including speed, screen size, and line spacing. When matched against the cultural dimensions, the researchers found Koreans to be more collectivist than the Japanese and Finns; Koreans preferred to know the most popular ringtone downloads, whereas the Japanese and Finns sought out ringtones that pleased themselves, for instance. Koreans and the Japanese shared other characteristics, such as avoiding uncertainty and requiring high context, but all the groups said streamlined processes were important.
    Click Here to View Full Article

  • "Confronting the Reality of Web Services"
    Harvard Business School Working Knowledge (05/16/05); Grant, Sara

    Harvard Business School professor Andrew McAfee said in the Winter 2005 edition of MIT Sloan Management Review that common standards must be better integrated if Web services are to deliver their promised advantages in terms of communication exchanges and data-sharing. In an interview, McAfee attributes application integration difficulties to technical and organizational problems: The technical challenge is that it is practically assured that any two applications will contain dissimilar data and perform dissimilar business processes, while the organizational challenge is that disputes will inevitably arise during the stakeholders' attempts to establish common definitions. McAfee sees successful examples of both internal and cross-company Web service deployments, although the second type of implementation is comparatively scarce, and mostly restricted to large and technically advanced companies with long-term relationships. He points to Amazon and eBay's efforts to entice small sellers to plug into their IT infrastructures via Web services, although the companies do not renegotiate Web services standards with each seller. McAfee says Web services are thus far used to automate simple business processes, although he expects such processes to gradually increase in sophistication. His advice to those considering a Web services project is to ask hard questions up front: who the collaborators are and how committed they are to the project, how to reach consensus with partners about shared data and business processes, what the ultimate goal of the project is, which interactions should be automated and in what order of priority, and how to ensure the organization captures its knowledge and develops a lasting capability as the project proceeds. McAfee says he skips over questions regarding cost savings or return on investment because "I just don't think there are crisp and meaningful ways to do these calculations."
    Click Here to View Full Article

  • "E-Records R&D Gets Grants"
    Federal Computer Week (05/06/05); Olsen, Florence

    The Library of Congress and the National Science Foundation have announced the winners of $2.8 million in federal grants for research in digital preservation, with results expected from those projects one year from now. The money was made available through a congressionally funded digital preservation program, which is meant to build up national expertise in that field. The recipients of the $99,000 to $500,000 grants include University of Maryland at College Park researchers, for automated collection and verification in distributed digital collections; the University of California at San Diego, for archival of large multimedia collections; an automatic metadata capture project led by the University of Arizona and Raytheon; University of Michigan research into ways data producers and archivists can work together on archive-ready data sets; Old Dominion University research concerning use of existing Internet infrastructure standards for archiving purposes; and a University of North Carolina at Chapel Hill prototype for storing video objects.
    Click Here to View Full Article

  • "A Robot in Your Future?"
    Network World (05/16/05) Vol. 22, No. 19, P. 1; Weinberg, Neal

    Speakers at last week's RoboBusiness Conference expressed dismay that the crop of commercial robot products is meager and for the most part impractical. Putting aside recent innovations such as the entertainment-themed Robosapien and the Roomba robot vacuum cleaner, robotics pioneer Joe Engelberger said the growth of the robotics field is in a state of relative stagnation compared to the computer industry. He was particularly frustrated that robot caregivers for the elderly have not been rolled out by now. Evolution Robotics chief scientist Paolo Pirjanian blamed the robotics industry's slow progress on a number of missing or sparse elements, including venture capital, practical applications, industry standards, and a robotic counterpart to Moore's Law. He added that the cost of developing a robot capable of interacting with its environment puts such machines out of consumers' price range. Pirjanian expects robotics technology to be incorporated into other products, while the emergence of versatile, standalone robots is still a ways off. Other presenters were generally positive about new products: iRobot founder Helen Greiner said her company's PackBot devices, currently employed in the Middle East to survey hostile areas, should be adapted to work in swarms to find enemy soldiers, ordnance, and chemicals, while John Deere's Greg Doherty said his company is interested in automated farm vehicles that use GPS. However, he also said many of the technological innovations needed to perfect their operation, such as real-time environmental modeling, object manipulation, voice recognition, and obstacle avoidance, will not be around for decades.
    Click Here to View Full Article

  • "Particle Smasher Gets a Super-Brain"
    New Scientist (05/21/05) Vol. 186, No. 2500, P. 10; Muir, Hazel

    Once it is fully up and running in late 2007, CERN's Large Hadron Collider (LHC) is expected to generate 15 million GB of data annually, and a distributed computing architecture will be used to store and analyze the data. The LHC will operate for 10 to 15 years, performing experiments to spot the Higgs boson, "supersymmetric" particles, and other exotic particles in order to gain new insights about the nature of the universe. Raw data will be stored in tape silos in a local facility, and transmitted to 12 European, Asian, and North American storage sites; it will then be relayed to approximately 100 smaller sites in 40 nations, and then on to individual universities and institutes. More than 6,000 physicists will log on to local PCs to analyze the data through the LHC's worldwide grid, and they must be allowed to access the data at any time. Last month, Geneva-based computers sustained a continuous 600 MB-per-second data stream to seven U.S. and European data sites for 10 days, transmitting 500 terabytes in all; in September, CERN will attempt to reach another milestone by transmitting data to many other computer centers and sustaining the stream for three months, while the ultimate target is a sustained flow of 1,800 MB per second by the time the collider is fully operational. On March 15, the LHC grid became the largest scientific grid of all time with the interconnection of 100-plus computing centers in 31 countries, but this network provides only 5 percent of the estimated processing power the LHC will ultimately require. CERN must also take steps to prevent any one research team or institute from monopolizing the grid.
    Click Here to View Full Article
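
    A quick back-of-the-envelope check of the transfer figures quoted above, written in Python and assuming the rates are measured in megabytes per second:

        # Sanity-check the quoted numbers, assuming rates are in megabytes per second.
        seconds_per_day = 86_400
        rate_mb_s = 600                                   # sustained rate of the 10-day test
        total_tb = rate_mb_s * seconds_per_day * 10 / 1_000_000
        print(f"~{total_tb:.0f} TB over 10 days")         # ~518 TB, consistent with the ~500 TB reported

        yearly_pb = 15_000_000 / 1_000_000                # 15 million GB per year, in petabytes
        print(f"~{yearly_pb:.0f} PB of raw data per year")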

  • "Processing for Science"
    Scientific American (05/05) Vol. 292, No. 5, P. 30; Choi, Charles Q.

    Virtual supercomputer projects are starting to proliferate thanks in large part to the increase in distributed-computing platforms that are able to host multiple projects. The Berkeley Open Infrastructure for Network Computing (BOINC), the host of SETI@home, Einstein@home, and Climateprediction.net, offers one of the larger platforms. The open-source infrastructure code that BOINC offers saves scientists the time of having to write their own, a task that can take years due to the need for the software to operate in the background on different operating systems and millions of computers, and to guard against faulty results and malicious attacks. "We want to make it easy for scientists to get access to millions of computers' worth of processing power," says David Anderson, director of BOINC and SETI@home. He expects the number of @home projects to jump from around 60 today to several hundred or more over the next few years, and the number of participating personal computers to rise from about 1.3 million to 30 million. The distributed-computing projects send scientific tasks to computer users who want to contribute spare processor power, and Anderson estimates that a typical computer can handle up to 12 @home projects. He says a service that rotates projects on individual computers is possible down the road.
    Click Here to View Full Article
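
    One common way volunteer-computing platforms guard against faulty or malicious results is to replicate each work unit across several volunteers and accept an answer only when a quorum of independent results agree. The Python sketch below illustrates that general idea; the function names and quorum policy are assumptions for illustration, not BOINC's actual API.

        # Accept a work-unit result only when enough independent volunteers agree.
        from collections import Counter

        def validate_workunit(results, quorum=2):
            """results: list of (volunteer_id, answer) pairs for one work unit.
            Returns the agreed answer, or None if the work unit must be reissued."""
            counts = Counter(answer for _, answer in results)
            answer, votes = counts.most_common(1)[0]
            return answer if votes >= quorum else None

        replicas = [("host-a", 42.0017), ("host-b", 42.0017), ("host-c", 41.9)]
        print(validate_workunit(replicas))   # 42.0017 is accepted; the outlier is discarded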

  • "Linux in the Datacenter"
    Business Communications Review (05/05) Vol. 35, No. 5, P. 20; Mancill, Tony

    Vesta senior IVR programmer Tony Mancill writes that the maturation of the open-source Linux operating system, along with that of traditional hardware platforms and commercial software, is a positive development because it makes all of these platforms suitable options for the majority of IT projects. Hardware consolidation reduces the challenges developers face in integrating Linux with their servers, and accelerates the speed of implementation. In addition, the availability and stability of relational database management systems, open-source application development languages, and Web servers such as Apache and Tomcat provide an open-source infrastructure that the mainstream is accepting more readily. Meanwhile, issues discouraging software vendors and customers from embracing Linux are being addressed through the efforts of Red Hat and IBM. Red Hat's strategy has been to concentrate on support in the server room and on providing a fairly stable target for solution providers to build against. IBM's approach involves endorsing Linux, helping with the Linux ports to the iSeries and zSeries mainframes, and sponsoring open-source software projects. Java Virtual Machines (JVMs) also generate excitement because they can make the applications that run inside them almost completely independent of the underlying hardware and operating system; this allows software developed on one processor and machine architecture to run on a dramatically dissimilar processor and architecture. Moreover, running multiple JVMs on a single computer can shield other applications from the effects of a crashing application.
    Click Here to View Full Article

  • "How to Hook Worms"
    IEEE Spectrum (05/05); Riordan, James; Wespi, Andreas; Zamboni, Diego

    IBM Zurich Research Laboratory research scientists James Riordan, Andreas Wespi, and Diego Zamboni detail an intrusion-detection system designed to specifically target computer worms, which Mi2g says were partly responsible for more than $68 billion in damages in February 2004 alone. The majority of intrusion-detection systems employ a dual-tier strategy in which "sentinel" programs are posted on both network-linked host computers and on the network itself, but this approach generates many false alarms and exhibits little resistance to both malicious attacks and accidental failures. The researchers' system, dubbed Billy Goat, runs on a network-connected dedicated machine and can identify worm-infected machines anywhere within the network. The genesis of Billy Goat was Riordan, Wespi, and Zamboni's realization that computers linked to the network frequently got automated requests from other machines that did not dovetail with their normal operation; worms were behind a large percentage of these requests, because they usually locate new computers to target by randomly searching through Internet addresses. Billy Goat is assigned to unused, unadvertised addresses where the illegitimacy of received requests is a given, and the system responds to requests by providing bogus virtual services, effectively fooling worms into disclosing their identity and making them easy for Billy Goat to reliably track. The system tries to attract many different kinds of worms by presenting multiple feigned services, while new fake services can be created by standard programming tools and interfaces supported by the virtualization infrastructure; Billy Goat also follows a distributed architecture that permits the coexistence of multiple Billy Goats on a network. The researchers claim Billy Goat can detect worm-infected machines within seconds of contamination, and provide their addresses as well.
    Click Here to View Full Article
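
    The core trick, listening at unused and unadvertised addresses where any incoming request is illegitimate by construction, can be sketched in a few lines of Python. The address, port, response, and logging below are illustrative assumptions, not IBM's implementation.

        # Minimal Billy Goat-style sentinel: any host that connects to this unused
        # address is flagged as a likely worm-infected machine.
        import socket
        from datetime import datetime, timezone

        UNUSED_ADDR = "192.0.2.10"   # placeholder for an unadvertised, unused address
        FAKE_PORT = 445              # a port commonly probed by worms (assumed example)

        def run_sentinel():
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((UNUSED_ADDR, FAKE_PORT))
            srv.listen()
            while True:
                conn, (src_ip, _src_port) = srv.accept()
                # Legitimate traffic never arrives here, so flag the source address.
                print(f"{datetime.now(timezone.utc).isoformat()} suspected worm at {src_ip}")
                conn.sendall(b"\x00")   # feign a service response to keep the worm engaged
                conn.close()

        if __name__ == "__main__":
            run_sentinel()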


 