
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 835:  August 29, 2005

  • "NSF Preps New, Improved Internet"
    Wired News (08/26/05); Baard, Mark

    On Aug. 24, at a meeting of ACM's Special Interest Group on Data Communication (SIGCOMM), the National Science Foundation (NSF) announced the launch of the Global Environment for Networking Investigations (GENI), an initiative that could yield a better, more secure Internet architecture. NSF said GENI will "enable the vision of pervasive computing and bridge the gap between the physical and virtual worlds by including mobile, wireless and sensor networks." SIGCOMM Chairwoman and GENI organizer Jennifer Rexford stressed the lack of privacy on the Internet, but cautioned that an overabundance of privacy could complicate the detection of network attacks; she thinks GENI could yield a network that balances security and privacy by providing a massive range of opt-in and opt-out features. GENI stemmed from the concept of a "clean slate" Internet envisioned by luminaries such as MIT researcher David Clark, who thinks new network architectures are necessary to address the various threats that run rampant because the current Internet architecture was never designed to contend with them. GENI organizers say the increasing ubiquity of sensors and communications devices could also support a new network architecture that allows privacy to be tailored to individuals and communities. The GENI experimental facility will probably be linked to the Internet2 and National LambdaRail experimental networks, but NSF said GENI will stand out by featuring experimental hardware, "new classes of platforms and networks," and "new computing paradigms enabled by pervasive devices."
    Click Here to View Full Article

  • "Linux Kernel Update Improves Event Monitoring"
    eWeek (08/26/05); Galli, Peter

    Though some developers had anticipated it sooner, the next few weeks should see a stable Linux kernel update promising improvements in file system event monitoring, a set of system calls that lets users load a new kernel from the one currently running, and support for the Xtensa architecture. Linux pioneer Linus Torvalds had hoped to release the complete 2.6.13 update first, but instead opted for an -rc7 release candidate containing mostly minor, targeted changes. The 2.6.13 kernel is expected to usher in substantive changes, however, such as the Inotify file system event-monitoring interface designed to replace Dnotify. The Xtensa processor core to be supported in the next iteration of Linux is configurable, extensible, and synthesizable, and is geared toward embedded system-on-chip (SoC) applications. Among the other enhancements forthcoming in the 2.6.13 version will be the Kexec set of system calls, execute-in-place support for s/390 applications, and an augmentation to the Completely Fair Queuing disk I/O scheduler. Future versions of Linux will contain the Xen virtualization technology, the FUSE facility that enables a completely functional file system to be implemented in a user-space program, and the second version of the Oracle Cluster File System; the future of Red Hat's Global File System, which enables cluster nodes to read and write to one shared file system simultaneously, is uncertain. Though significant progress has been made on the development of the Linux kernel, there is still work to be done in mapping the development process and in open-sourcing a disparate array of device drivers. Other areas of ongoing development include memory add/delete, system throughput, and serviceability.
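
    As an illustration of the event-monitoring interface Inotify provides (Dnotify required holding an open file descriptor for every watched directory, while Inotify delivers events for all watches through a single queue), below is a minimal sketch that reads events via ctypes. It assumes a Linux system whose kernel and C library expose the inotify calls; the watch path and buffer size are arbitrary choices for the example.

        import ctypes, ctypes.util, os, struct

        # Reach the inotify wrappers in the C library (Linux only).
        libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)

        # Event masks, as defined in <sys/inotify.h>.
        IN_MODIFY, IN_CREATE, IN_DELETE = 0x002, 0x100, 0x200

        fd = libc.inotify_init()
        if fd < 0:
            raise OSError(ctypes.get_errno(), os.strerror(ctypes.get_errno()))
        libc.inotify_add_watch(fd, b"/tmp", IN_CREATE | IN_DELETE | IN_MODIFY)

        # Each event is a struct inotify_event: int wd, uint32 mask,
        # uint32 cookie, uint32 len, then len bytes of NUL-padded name.
        buf = os.read(fd, 4096)  # blocks until at least one event arrives
        offset = 0
        while offset < len(buf):
            wd, mask, cookie, name_len = struct.unpack_from("iIII", buf, offset)
            name = buf[offset + 16:offset + 16 + name_len].split(b"\0", 1)[0]
            print(f"event {mask:#x} on {name.decode(errors='replace') or '/tmp'}")
            offset += 16 + name_len
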
    Click Here to View Full Article

  • "Intel Opens Doors on Autonomic Computing"
    Electronic News (08/25/05); Deffree, Suzanne

    Intel's Justin Rattner and IBM's Alan Ganek cited "user-aware" technologies as a critical step toward autonomic computing, which will probably not reach the mainstream for quite a while. Speaking at the Intel Developer Forum, Rattner said the future of electronics will be fueled by the need for less complicated, more intuitive methods of handling technologies that in turn help people reach their goals. He described a user-aware platform as any device that is self-maintaining, can recognize and locate users, and attempts to anticipate users' needs; such devices will need an awareness of their surrounding environment and their own operations, as well as higher levels of intelligence to comprehend user requirements and to interact with other electronics to fulfill those requirements without causing harm. Intel said this intelligence will be facilitated by the addition of multiple processing cores on each processor. The company predicted the eventual incorporation of hundreds of cores within a single processor, so that each chip can dynamically assign individual cores or entire clusters, along with the required bandwidth and memory, to specific operations while shielding each task's dedicated portion of computing resources with virtualization software. Ganek noted that most computer users must perform tedious tasks they do not want to do and are not particularly proficient at. He said the biggest challenge to autonomic computing is a dearth of standards and the difficulty of developing standards for a holistic autonomic perspective across various technologies.
    Click Here to View Full Article

  • "Squirrel Helps With Mobile Calls"
    BBC News (08/26/05); Alexander, Luke

    MIT research student Stefan Marti has come up with a unique solution to the irritation of intrusive cell phone calls: An Autonomous Interactive Intermediary (AII) that takes the form of a cute animatronic animal that answers phone calls, determines if its owner is occupied or asleep, assesses the importance of incoming calls, and takes messages. The intermediary alerts the user to calls through movement rather than sound. The AII prototype is designed to look like a squirrel and is bound to a physical location because of its need to communicate with a computer, but the technology could conceivably be shrunk down to the size of a mobile phone. The device is based on the principle that a computer ought to communicate information in a manner that is responsive to surrounding social situations, and Marti observes that mobile communication technologies do not adapt to the fluid nature of such situations; this in turn breeds anger and even malice toward technology. Marti believes people's future interaction with technology will be synonymous with their interaction with each other, either through agents such as the AII or by technology being so deeply embedded in our lives as to become invisible. Marti says future technology must "have a deeper understanding of how humans like to interact, what humans want, and eventually what humanity stands for."
    Click Here to View Full Article

  • "Model-Driven Development Today"
    Builder AU (08/26/05); Overington, Matthew

    Companies such as Borland and Rational envision model-driven development (MDD) as a partial solution to the vexing problem of making software development more transparent, predictable, and accountable. The promise of MDD lies in its ability to help developers rapidly generate quality, maintainable code by freeing them to move effortlessly back and forth between the various levels of abstraction. "MDD...allows different people with different experience and ideas to work together on a project at whatever layer of abstraction they're most comfortable with," says Rational architect Davyd Norris, who believes all abstraction layers are encompassed in the model. Embarcadero Technologies APAC regional director Philip Ball says heavy use of and compliance with models gives the development process additional transparency, accelerates development time, and makes bugs less damaging. He calls MDD one component of a movement to give metadata a bigger role in development: "There's a need to store metadata so that developers can get a more holistic view of not only their application, but where it fits into the bigger picture in an entire software lifecycle," Ball explains. MDD is not a panacea, but rather a methodology that complements the writing of quality code.
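
    To make the generation idea concrete, here is a toy sketch of turning a declarative model into code; the model format and the emit_class helper are hypothetical inventions for illustration, whereas real MDD tools work from richer artifacts such as UML models.

        # Toy model-to-code generator: edits made at the model level
        # propagate into the generated source automatically.
        MODEL = {
            "entity": "Customer",
            "fields": {"name": "str", "email": "str", "balance": "float"},
        }

        def emit_class(model: dict) -> str:
            """Generate a Python class definition from the declarative model."""
            args = ", ".join(f"{f}: {t}" for f, t in model["fields"].items())
            lines = [f"class {model['entity']}:",
                     f"    def __init__(self, {args}):"]
            lines += [f"        self.{f} = {f}" for f in model["fields"]]
            return "\n".join(lines)

        print(emit_class(MODEL))
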
    Click Here to View Full Article

  • "Penn State IST Researchers to Enhance Search Engine"
    Penn State Live (08/26/05)

    Penn State and University of Kansas researchers have received a $1.2 million National Science Foundation grant to augment and upgrade the CiteSeer academic search engine, which has given the public access to over 700,000 computer and information sciences documents since its launch eight years ago. Projected advantages of the Next Generation CiteSeer include more document archiving, new kinds of searching, personalized recommendations and services, and synchronous live-object collaboration. The four-year NSF grant will fund the growth of CiteSeer's database and the addition of new offerings such as a parsing service that facilitates header analysis and acknowledgement extraction, as well as an improved indexing service for documents and document citations. CiteSeer will receive a boost in reliability and ease of use through an open-source architecture, which, as an aggregation of Web services, will also expedite greater access to CiteSeer metadata. Interest in turning CiteSeer into a collaborative resource was driven by the expansion of the computer and information sciences communities. Lee Giles, David Reese Professor of Information Sciences and Technology and co-creator of CiteSeer at NEC Labs, will serve as principal investigator for the NSF Computing Research Infrastructure Collaborative Grant.
    Click Here to View Full Article

  • "New Cybersecurity Center to Warn Law Enforcement of Critical Infrastructure Attacks"
    InformationWeek (08/24/05); Greenemeier, Larry

    A pilot of the Philadelphia-based Cyber Incident Detection and Data Analysis Center (CIDDAC) is enabling numerous private enterprises to anonymously share cyberthreat and attack data with other enterprises and the government without fear of law enforcement audits. CIDDAC avoids audits by not being a government entity and by not disclosing from whom the information was collected. Currently, enterprises are not sharing important information because a resulting security audit makes their valuable proprietary information available to the press and the public under the Freedom of Information Act. CIDDAC members are voluntarily participating and have donated about $100,000 to the project, while the Homeland Security Department's Science and Technology Directorate has provided $200,000 in funding. CIDDAC needs an additional $400,000 to move out of the pilot phase and into a permanent phase, in which it can charge members $10,000 per year to participate. AdminForce Remote developed CIDDAC's real-time cyberattack-detection sensor technology, which gathers information from member networks. The intrusion-detection device alerts law enforcement and other CIDDAC members to developing threats on member networks without releasing identification data. Both the FBI and the Homeland Security Department will receive CIDDAC reports and will use them to begin informal investigations. The SANS Institute runs a similar program, the Internet Storm Center, using the DShield intrusion-detection technology, freeware that the SANS Institute maintains. The Internet Storm Center, which is free to use, enables users to anonymously submit firewall log data and read 30 days' worth of log submissions.
    Click Here to View Full Article

  • "QA Group of W3C Releases Specification Guidelines"
    LinuxElectrons (08/21/05)

    The W3C's Quality Assurance (QA) Working Group will complete the Specification Guidelines this month; the guidelines advise writers and editors on how to produce workable technical specifications. Steve Bratt, the W3C's COO, said the creation of standards will help software developers better understand the technologies of the W3C. The QA Activity was created on the heels of a 2001 workshop with three principal goals: offering guidelines to improve W3C specifications, ensuring adherence to those guidelines through the review of draft specifications, and working with W3C groups to develop test suites and other aids to interoperable implementation. The drafting of documents such as the Specification Guidelines will help authors create content that can be implemented according to their original intent. Since its inception, the QA effort has won the support of W3C members and the general developer community, including Boeing, NIST, the Open Group, and Microsoft. The W3C has also formed a QA Interest Group to solicit ideas for improvement from members of the developer community who may not have had time to participate in the Working Group, thereby ensuring a comprehensive approach to the process of defining standards.

  • "Play and Learn"
    Age (AU) (08/27/05); Stonehouse, David

    The idea of video games as a learning tool is gaining credibility and drawing controversy with the publication of books such as "Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter," by Steven Johnson. Johnson cites scientific studies showing that video games can make users better at visual recognition and more mentally energy-efficient than non-players, as well as increase their sociability, confidence, and comfort in problem solving. He also disputes arguments that games incite violence and aggression among young people with the most recent Child Well-Being Index from Duke University, which estimates an almost two-thirds decline in violent crime among American teens and adolescents in the last 30 years. Johnson theorizes that computer games "function as a kind of a safety valve--they let kids who would otherwise be doing violent things for the thrill of it get out those kind of feelings sitting at home at a screen." Elyssebeth Leigh of Sydney's University of Technology thinks video games teach children how to use technology, and allow them to harmlessly experiment with their surroundings and learn about choices, strategy, risks, and consequences. University of Wisconsin researcher and author James Paul Gee says it is too early to see definite proof that gamers are applying the lessons they learn in the real world, although he notes that games are a good tool for teaching language early in school. Colorado educational psychologist Jane Healey feels that too much use of video games and increased dependence on computers in schools can be detrimental, and she advises parents and educators to carefully supervise children's activities in this regard.
    Click Here to View Full Article

  • "Black Hat Work for the White Hat Good"
    Press-Telegram (Cal) (08/25/05); Butler, Kevin

    Researchers at Cal State Dominguez Hills believe their cyber monitoring technology, which fuses data from multiple servers, will help protect sensitive electronic information from hackers. Using a two-year, $130,000 grant from the U.S. Department of Defense, computer science department professor and chairman Mohsen Beheshti, associate professor Richard Wasniowski, and their students will conduct real cyber attacks to assess the effectiveness of the fusion technique. The system is designed to collect and compare data from four computer servers and to detect patterns in requests for access. The DOD is optimistic about the new fusion technology and holds an option for a third year, which would increase the grant amount to $200,000. Cal State Dominguez Hills also plans to offer a new computer security course.
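
    To make the fusion idea concrete, here is a minimal sketch that correlates access requests across several servers' logs; the log locations and the "timestamp source_ip path" line format are hypothetical, and this illustrates only the compare-and-detect step, not the researchers' actual system.

        from collections import defaultdict
        from pathlib import Path

        # Hypothetical logs from the four monitored servers, one
        # "timestamp source_ip path" entry per line.
        LOGS = [Path(f"/var/log/fusion/server{i}.log") for i in range(1, 5)]

        def fuse(logs):
            """Map each source IP to the set of servers it has touched."""
            seen = defaultdict(set)
            for server_id, log in enumerate(logs, start=1):
                for line in log.read_text().splitlines():
                    if not line.strip():
                        continue
                    _, source_ip, _ = line.split(maxsplit=2)
                    seen[source_ip].add(server_id)
            return seen

        # A source probing several servers is a pattern no single server's
        # logs reveal on their own, but the fused view makes visible.
        for ip, servers in sorted(fuse(LOGS).items()):
            if len(servers) >= 3:
                print(f"suspicious: {ip} touched servers {sorted(servers)}")
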
    Click Here to View Full Article

  • "Fine Arts Scholars Join Computer Scientists to Explore Cultural Creativity"
    AScribe Newswire (08/25/05)

    Longtime strangers creative arts and computer science could share a future together, as a new program at the University of Illinois demonstrates. Jonathan Fineberg, an art history professor at the university's Urbana-Champaign campus, recently partnered with art and design professor Kevin Hamilton and computer science professor Roy Campbell to teach an interdisciplinary course that elicited a scientific response to Ilya and Emilia Kabakov's "Palace of Projects" art installation; the results were overwhelmingly favorable, and the course boasted an equal distribution of male and female students with artistic and scientific concentrations. Fineberg reports that in the course, which emphasized a collaborative, team-oriented approach, each student contributed uniquely in accordance with his or her area of specialization. The computer scientists had a strong grasp of the capabilities of computers, while the graphic designers and artists offered insight into visualization and aesthetics. The resulting product, known as "Project 66," is Web-based to allow for remote access but is designed as an installation, which Fineberg is currently discussing for exhibit with museums in New York and London. The course is one product of Illinois' Seedbed Initiative, which the university launched in 2002 to develop alternative approaches to learning, such as the Cultural Computing Program (CCP). Overseen by the computer science department, the CCP seeks to foster collaboration among disciplines and export computer science to the arts and humanities, with the ultimate goal of "creating and transforming culture with computers." The CCP has attracted national attention and is currently devoting much of its attention to infusing video games with art and culture instead of war and violence.
    Click Here to View Full Article

  • "Computer Programmers Need to Speak the Right Language"
    New Zealand Herald (08/24/05); Bland, Vikki

    Many New Zealand computer programmers agree that proficiency in multiple programming languages and other computing fundamentals is necessary to find and maintain gainful employment, and they advocate solid technical training. Dave O'Rourke, a systems integration manager for a retail software developer, says New Zealand universities are not producing enough programming graduates to satisfy demand, which necessitates the hiring of immigrants. Microsoft New Zealand's Sean McBreen says programming graduates will be better prepared for the workplace and more apt to remain in New Zealand rather than relocate overseas if teachers, students, and the business community communicate openly about trends and required programming skills. O'Rourke's employer insists that natural aptitude is also a prerequisite for programming graduates. Programming skills commonly in demand in New Zealand include enterprise resource planning, relational database management development, Linux, graphical user interface programming and geometric transformations for games and design applications, C++, C#, Visual Basic, and Delphi. Employers tend to prefer students who have studied math, computing, or science to a high secondary-school level.
    Click Here to View Full Article

  • "A Proposal for Governing the 'Net"
    Network World (08/22/05) Vol. 22, No. 33, P. 63; Johnson, Johna Till

    The architects of the Internet designed it to be governed by a combination of democratic and free-market principles, which was seen as favorable to the previous "monarchy" of Ma Bell. The Internet Engineering Task Force (IETF) was set up to be a forum where different stakeholders could work together on technical aspects of the Internet. In this opinion piece, Johna Till Johnson suggests that these governance models worked in the early days of the Internet, but may now be in need of an upgrade. Specifically, she suggests that the IETF is not able to fully tackle the most important concerns facing the Internet today, which are regulatory and operational, not technical, in nature. However, the international nature of the Internet makes federal government intervention difficult. As an alternative, Johnson recommends the establishment of a new body, modeled after the IETF, that would provide a forum for economic and operational concerns pertaining to the Internet. This hypothetical, self-regulating body, called the International Association of Networking Service Providers, would be tasked with ensuring the cooperation of service providers in streamlining Internet operations for the greater good.
    Click Here to View Full Article

  • "Mesh Scheme Crosses Borders"
    Wireless Week (08/15/05) Vol. 11, No. 17, P. 16; Smith, Brad

    To widen the scope of a local area network, groups as different as the Telluride Bluegrass Festival and the Chinese government are implementing wireless mesh networks. Mesh networks often build on Wi-Fi technology and pose a threat to the emergence of the not-yet-certified WiMAX scheme; costs for mesh networks are relatively low, as the only installation work required is supplying power to the various nodes. The IEEE is now considering standardizing mesh networks under the 802.11 framework that governs Wi-Fi. The bid for standardization has two major competitors: the Wi-Mesh Alliance, which includes Nortel Networks, Philips Electronics, and InterDigital Communications, and SEEMesh, which includes Intel, Motorola, Nokia, and Texas Instruments. A standard is necessary if a provider uses equipment from multiple vendors. Nortel is partnering with Wireless Valley Communications to create a mesh network that covers 90 percent of Taipei through 10,000 access points. At the Le Mans sports car race in France, Motorola helped create a mesh network for General Motors' Corvette racing team that linked the drivers and crews through live video feeds and network radios; Motorola's network uses its Mesh Enabled Architecture (MEA) radios instead of Wi-Fi, claiming they are more resistant to interference and more reliable in harsh environments. The popularity of mesh networks is on the rise, and their standardization by the IEEE would likely result in even more pervasive adoption.
    Click Here to View Full Article

  • "It Pays to Be Persistent"
    Government Computer News (08/22/05) Vol. 24, No. 24; Jackson, Joab

    The Energy Department's Information Bridge program is seeking to eliminate Document Not Found messages through the use of persistent identifiers, or permanent Web addresses that ensure the future recovery of content. Previously, agencies had posted content on the Web only to see it moved or its relevance compromised by the proliferation of copies. On the surface, the implementation of persistent identifiers is a relatively simple process in which a document is identified by its Persistent Uniform Resource Locator (PURL). The Information Bridge initiative will assign persistent identifiers to a wide variety of Energy material, including scholarly articles, working papers, and informal presentations. To address the need for a naming system, the department's Office of Scientific and Technical Information (OSTI) used PURL, which adds to conventional URLs the guarantee that the document will always be available. Thus far, the project has assigned PURLs to roughly 110,000 documents. A dedicated server, or resolution service, redirects incoming Web requests for a given PURL to the appropriate server. Although the actual processes at work are relatively simple, the key to smooth implementation in an environment as large as Energy is custodial oversight. The Defense Department has undertaken a similar endeavor, though instead of PURLs it employs the Handle System, which assigns a permanent tag on top of a conventional URL. The projects at Defense and Energy answer the 2002 E-Government Act's call for the adoption of standards that enable the organization and classification of information, of which persistent identifiers are an integral part. Energy has joined CrossRef, the academic linking service, which will make its collections available to all CrossRef users through digital object identifiers. Several government agencies are moving toward Uniform Resource Names (URNs), multi-part addresses enabling the generic use of persistent identifiers.
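
    As a concrete illustration of the resolution step, here is a minimal sketch of a PURL-style redirect service; the mapping table, document number, and target URL in it are hypothetical, and a production resolution service adds the custodial interfaces for updating targets when documents move.

        from http.server import BaseHTTPRequestHandler, HTTPServer

        # Hypothetical table mapping persistent names to current locations;
        # custodians update the right-hand side when a document moves.
        PURL_TABLE = {
            "/docs/110045": "https://example.gov/archive/current/110045.pdf",
        }

        class PurlResolver(BaseHTTPRequestHandler):
            def do_GET(self):
                target = PURL_TABLE.get(self.path)
                if target:
                    self.send_response(302)  # the persistent name never changes;
                    self.send_header("Location", target)  # only this target does
                    self.end_headers()
                else:
                    self.send_error(404, "Unknown persistent identifier")

        HTTPServer(("", 8080), PurlResolver).serve_forever()
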
    Click Here to View Full Article

  • "R&D in India: The Curtain Rises, the Play Has Begun..."
    Knowledge@Wharton (09/06/05)

    Experts split the evolution of Indian research and development into a three-act play, with advanced product development as the first act, basic research the second, and the country's transformation into an R&D giant the third; the general consensus is that Indian R&D has just started the first act. Rafiq Dossani of the Stanford University Institute for International Studies says R&D is following U.S. multinationals' export of IT work to India. Experts say hiring local engineers and researchers makes sense, since they understand emerging markets and can better customize products to those markets; critics, however, allege that offshoring R&D could threaten America's innovation capability. Deo Bardhan at UC Berkeley's Haas School of Business says the globalization of R&D probably constitutes both an opportunity and a threat, although it is too early to evaluate its ultimate effect. Some U.S. companies are keeping quiet about the extent of their Indian R&D operations, while firms with a long-entrenched presence in India are more open to disclosure, having achieved a level of comfort cultivated by years of exposure to the nation's massive base of engineering, scientific, and design talent. Microsoft is attracted to India because of its heterogeneous culture, weather, religion, language, and geography, which represent diverse challenges to product transfer; one Microsoft representative says, "Solutions that work in India are more likely to transfer to other locations because they will have been tested across these barriers." Experts predict gradual R&D growth in India as companies struggle to resolve internal issues about outsourcing and work management. Saikat Chaudhuri of the Wharton School sees the intellectual property regime, the defection of talent overseas, and lower levels of basic research as the three major challenges to India's transition into an R&D powerhouse.
    Click Here to View Full Article

  • "From the Lab: Information Technology"
    Technology Review (08/05) Vol. 108, No. 8, P. 83; Baker, Monya

    A team of researchers from the University of Southern California has developed a computer graphics technique that will enable filmmakers to alter the lighting conditions of film after it has been shot and to recreate the live-action lighting conditions of a setting in which the actors were never present. Their method, reported in ACM Transactions on Graphics, had an actor performing inside a two-meter-wide spherical structure filled with LED light sources capable of simulating 180 lighting conditions; the actor was filmed at the speed with which the lighting changed, recording eight seconds of footage that was then downloaded to the researchers' computers, which applied algorithms to recreate the desired conditions. The work could lead to editing techniques that save directors the considerable time and money they spend setting lighting conditions on each day of shooting to ensure consistency in the final product. To improve information retrieval on the Web, a team of IBM researchers has developed an algorithm that analyzes content and provides results through a more comprehensive understanding of the search terms. Sorting the results into groups of similar topics helps reduce the time spent searching, as search terms often have multiple meanings and contexts and so produce a large portion of irrelevant results. Meanwhile, a Microsoft team has developed software that tags audio files with small samplings of their content, known as fingerprints, to make searching them easier. These fingerprint thumbnails are more likely to contain unique features of a song, such as its title or chorus. As digital music libraries continue to grow, the Microsoft tool reduces the chance of duplicate files appearing through its duplicate detector, which compares songs in search of a matching fingerprint and then compiles a list of repeated songs for the user.
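
    A minimal sketch of the duplicate-detection step follows; hashing a fixed-size sample of each file stands in for Microsoft's content-derived audio fingerprints, and the library path and file pattern are arbitrary choices for the example.

        import hashlib
        from collections import defaultdict
        from pathlib import Path

        def fingerprint(path, sample_bytes=64 * 1024):
            """Hash a small sample of the file -- a crude stand-in for a
            fingerprint derived from distinctive audio content."""
            with open(path, "rb") as f:
                return hashlib.sha1(f.read(sample_bytes)).hexdigest()

        def find_duplicates(library):
            """Group files by fingerprint and keep groups with repeats."""
            groups = defaultdict(list)
            for song in Path(library).expanduser().rglob("*.mp3"):
                groups[fingerprint(song)].append(song)
            return {fp: paths for fp, paths in groups.items() if len(paths) > 1}

        for fp, paths in find_duplicates("~/music").items():
            print(fp, "->", [p.name for p in paths])
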
    Click Here to View Full Article

  • "Beware the March of This IDE: Eclipse Is Overshadowing Other Tool Technologies"
    IEEE Software (08/05) Vol. 22, No. 4, P. 108; Goth, Greg

    The Eclipse Foundation reports that the open source Eclipse Project owns at least three-quarters of the development tools platform market for the most sophisticated aspects of its technology, while a survey of over 500 attendees at the March 2005 Java Symposium found that 53 percent of Java developers used Eclipse as their primary integrated development environment (IDE). The Eclipse platform's core functionality is its accommodation of tools developed as plug-ins, and one of the Eclipse Foundation's major challenges will be balancing the need to cultivate the fast growth of numerous Eclipse projects against the need to release new technology in reasonable timeframes, say Eclipse Foundation executive director Mike Milinkovich and IBM Rational Software's Lee Nackman. Nackman adds that reconciling overall platform stability with growth will also be important, as people's increasing reliance on Eclipse will raise pressure to decelerate the platform's rate of change. BEA's Tim Wagner says the foundation must never forget the distinction between its outside image and internal reality, noting that "Inside Eclipse, it's...how can we manage what is becoming a very large, diverse software artifact built by people all over the globe and keep it orderly and under control?" By consensus, developers' acceptance of the foundation as an organization wholly independent from IBM has been key to the IDE's fast expansion. Microsoft and Sun Microsystems have kept aloof from Eclipse, although there are unofficial collaborations between Microsoft and Eclipse, such as the VSTSEclipse project. Bill Weinberg of the Open Source Development Labs believes cross-fertilization of Sun and Eclipse technology is inevitable, given the ubiquity of Solaris and Sun's Java tools.
    Click Here to View Full Article


 