
ACM TechNews sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 826:  August 8, 2005

  • "SpelBots Score With Technology, Education"
    CNN (08/08/05); Walton, Marsha

    The participation of Spelman College students in the RoboCup 2005 contest is a major victory against the stereotypical image of technology as a white, all-male field. The Spelman team comprised six young African American women who exhibited a close-knit spirit of collaboration. "You always have that feeling that you always have somebody else by your side," says team member Ebony Smith. Another team member, Brandy Kinlaw, adds that no one is chastised for mistakes or failures. This year's RoboCup competition involved 24 college teams that augmented Sony AIBO robot dogs so they could play soccer without human assistance. The dogs are equipped with cameras to identify colors, such as the orange soccer ball and the blue and yellow of the goals. The contest provides women such as the Spelman students with a learning experience that could help shape their future research and career tracks; Kinlaw notes, for instance, that working with the dogs yields insights on how the joints of prosthetic limbs move. Ebony O'Neal, another Spelman team member, envisions using robotics and artificial intelligence to create new medical technologies.
    Click Here to View Full Article

    For information on the Coalition to Diversify Computing, visit
    http://www.ncsa.uiuc.edu/Outreach/CDC/.
    For information on ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "You Say You Want a Web Revolution"
    Wired News (08/05/05); Singel, Ryan

    The emergence of Internet-based software applications, such as Google Maps and Amazon's A9 search engine, could pose a challenge to traditional PC operating systems. These new programs are powered by AJAX, short for asynchronous JavaScript and XML, which dramatically enhances the ability of Web pages to interact with data. AJAX's central departure is its ability to pull in new data without reloading the page, which has historically been the major frustration of Web applications. Though Microsoft remains committed to its Windows operating system, it has introduced Atlas, an AJAX-based framework geared toward Web developers working in ASP.NET. Jesse James Garrett, a co-founder of Adaptive Path who coined the term AJAX, believes the new generation of Internet-based software applications is revealing the true potential of the Web. While devoting its primary attention to enhancing Windows desktop applications for its new operating systems, such as the forthcoming Vista, Microsoft may unveil Atlas as early as next month. Microsoft is hedging its bets with Atlas, though group product manager Forest Key touts the incorporation of Windows Presentation Foundation, a user interface development framework, into Vista as a platform to do "amazing things that approach cinematic user interfaces" well beyond existing AJAX applications. Key and other experts agree that the more oblique interfaces will fall by the wayside as users come to expect richer and fuller applications, but that desktop applications will not be totally eclipsed by Internet-based programs; the emergence of AJAX has nonetheless clearly signaled that Web-based software can function independently and with great success.
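
    As a rough sketch of the pattern described above--not code from Google Maps, A9, or Atlas--the following uses the browser's XMLHttpRequest object, the underpinning of AJAX, to fetch fresh data and update a single element in place rather than reloading the whole page; the endpoint URL and element ID are invented for the example.

    // Minimal AJAX sketch: request data asynchronously and update the page in place.
    // The endpoint "/stock-quote" and element id "quote" are hypothetical placeholders.
    function refreshQuote(symbol: string): void {
      const request = new XMLHttpRequest();
      request.open("GET", "/stock-quote?symbol=" + encodeURIComponent(symbol), true); // true = asynchronous
      request.onreadystatechange = () => {
        // readyState 4 means the response has arrived; update the page without a reload.
        if (request.readyState === 4 && request.status === 200) {
          const target = document.getElementById("quote");
          if (target !== null) {
            target.textContent = request.responseText;
          }
        }
      };
      request.send();
    }

    // Poll every few seconds; only the "quote" element changes, never the whole page.
    setInterval(() => refreshQuote("ACME"), 5000);
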
    Click Here to View Full Article

  • "Enhancing Employability of ICT Professionals"
    IST Results (08/05/05)

    The IST-funded Indic@tor study concentrated on the working lives of European ICT professionals in small and midsized enterprises (SMEs) and factors that shape their expertise, with the goal of devising a methodology for identifying and enhancing their employability. The four-year project involved a survey of 1,106 ICT employees and 967 supervisors in Britain, Norway, the Netherlands, Poland, Germany, Greece, and Italy using a Web-based personal development questionnaire. The survey yielded a general profile describing ICT professionals as relatively happy in their work life, healthy, and well educated. They boast an average 5.5 years of experience in their current field of expertise, as well as good relationships with their supervisors and jobs with significant learning value. However, the survey indicates a lack of time to learn and practice new skills, and limited networking opportunities with other ICT pros. In addition, European ICT SMEs possess neither the resources to develop human capital nor an employability enhancement culture. The study suggests that ICT workers could augment their own employability by furthering their education; maintaining their health and keeping personal/professional conflicts to a minimum; acquiring experience in line management; nurturing a beneficial and developmental relationship with their line manager; networking with individuals inside and outside the organization; selecting organizations or department/work units that offer more time to develop new skills; and remaining in a position long enough to master areas of expertise before moving on. Recommended employability enhancement practices for organizations include the design and deployment of human resources systems that improve job satisfaction and opportunities for gaining new skills, augment the learning value of ICT jobs, and enable more interchange between workers in different parts of the organization.
    Click Here to View Full Article

  • "National IDs in Need of a Fix"
    PCWorld.com (08/04/05); Yegyazarian, Anush

    The nationally standardized ID card created by the Real ID Act of 2005 will require few practical changes, since current government photo ID cards are already used for so many purposes, although cardholders will have to pay more and wait longer for the driver's licenses that will double as national IDs. However, there will be changes behind the scenes that could increase the possibility of privacy leaks. For example, the databases housing the personal data will not be confined to a single state but will be linked together nationwide, and the licenses will also have to include nationally uniform machine-readable technology. In addition, the licensing agency will have to check information with several sources, including a check on immigration and citizenship status, before issuing a license. The law offers little guidance on what privacy and security measures will be needed to safeguard the data, although it does require background checks on employees as well as physical security of the premises. The Department of Homeland Security should therefore require very high standards for states' data security, and there should also be serious limits on how the data can be bought and sold. The law also lacks exceptions for people who in the past were able to keep their real addresses off of their licenses, such as battered women or judges. Another concern is how people and organizations could abuse the ID cards' machine-readable nature--not just for fraud but for other uses, such as marketing-data collection, that many cardholders might object to.
    Click Here to View Full Article

    For information on ACM's activities regarding the Real ID Act, visit http://www.acm.org/usacm.

  • "Commercializing Open-Source Stirs Debate"
    eWeek (08/04/05); Vaughan-Nichols, Steven J.

    Moves by the Debian Core Consortium (DCC) and Mozilla to commercialize open-source projects are riling members of the open-source community who view such maneuvers as sell-outs, notes Illuminata analyst Gordon Haff. "From articles to online discussion boards to even personal 'real world' discussions, there is increasingly the sense of an open source orthodoxy that must be defended at all costs, and not just from its enemies," he says. One vocal opponent of open-source commercialization is OpenBSD operating system leader Theo de Raadt, who says he is not surprised at the development, in light of news reports indicating that U.S. firms appear to be entitled to more rights than either nonprofits or individuals. Others say owning a for-profit can make sense for larger software projects that require a full-time staff to ease the burden on volunteers and keep the project sustainable, while open-source proponent Bruce Perens says "most open source projects are able to muster a large programming staff without directly paying for" it. Other open-source figures see nothing particularly earth-shaking about commercialization, nor do they consider such a move an act of betrayal. Open Source Initiative co-founder Eric Raymond points out that open-source commercialization dates back at least to 1993, when the first CD-ROM-based Linux distribution came out.
    Click Here to View Full Article

  • "A Standards Truce in the Browser War?"
    CNet (08/04/05); Festa, Paul

    As tensions have eased between Microsoft and Web standards advocates, industry watchers are optimistic that Microsoft may be softening its stance toward standards compliance. The most visible signal of this trend is the partnership Microsoft forged last month with the Web Standards Project (WaSP), an organization whose relationship with the software giant had long been characterized by antagonism and public clashes. Developers stand to reap the most significant benefit from Microsoft's new stance, as they have devoted countless hours over the years to coding pages to conform to Internet Explorer. WaSP co-founder Jeffrey Zeldman, no longer active with the organization, said the standards initiative began "because we were wasting too much time--and charging our clients too much money--working around browser differences instead of focusing on brand and usability issues." Because Microsoft holds 90 percent of the browser market, it faces no real mandate to adhere to standards, but standards advocates believe voluntary compliance would dramatically improve the progression of the Web, in addition to yielding considerable time savings for developers. In the past, WaSP publicly advocated alternatives to Microsoft's Internet Explorer through its satellite Web site Browse Happy; the reconciliation came about through overtures exchanged on the blogs of Microsoft's Robert Scoble and WaSP's Molly Holzschlag. The ensuing dialogue led to collaboration between Holzschlag and the Microsoft developers at work on Visual Studio, ASP.NET, Internet Explorer, and other products. Microsoft has historically embraced standards at times, but critics argue that the company has done so only when it expected a direct benefit, such as in the early days when Internet Explorer was struggling to overtake Netscape's dominant position.
    Click Here to View Full Article

  • "An Exploratory Assessment of the Pedagogical Effectiveness of a Systems Development Environment"
    RedNova (07/30/05); Meso, Peter; Liegle, Jens

    Georgia State University information systems professors Peter Meso and Jens Liegle have used the theory of technology acceptance to evaluate the new Visual Studio .NET suite of technologies as a pedagogical tool for teaching a course in technical information systems (IS). A performance-based comparison of students who used the .NET technology to design and construct an object-oriented distributed system against those who used the more conventional J2EE technology demonstrated a correspondence between the variables that encouraged the selection of .NET and those outlined by the technology acceptance model (TAM)--namely, usefulness and ease of use. Usefulness, as it pertains to IS technology for building IS solutions, is defined as how well a system lets designers develop efficient, scalable, robust, reliable, and secure IS solutions that also uphold integrity and data quality. An easy-to-use technology, meanwhile, delivers intuitive features that guide the user through established automated processes for turning specifications into functional program modules or components, meshing them into a functional unitary system, and deploying that system into production. Teams that chose .NET and those that chose J2EE performed equally well during the project's implementation/deployment and presentation stages, in keeping with the researchers' hypothesis that the two technologies were roughly equivalent in usefulness and ease of use. However, .NET users reported substantially fewer technical difficulties in the deployment stage. Meso and Liegle write that TAM offers a well-known framework with well-entrenched decision criteria for assessing a technology prior to classroom use, which suggests that TAM and other information diffusion models can be used to effectively gauge a specific technology's suitability and fit for teaching technical IS courses.
    Click Here to View Full Article

  • "A New Way to Authenticate Your Identity?"
    Associated Press (07/30/05)

    Congressional lawmakers are hoping to improve identity security by restricting access to Social Security numbers--for example, by removing the numbers from benefit checks, increasing the use of encryption when handling them, and limiting the ways in which they can be sold. However, even with these added restrictions, many companies and organizations are putting identity data at risk by using Social Security numbers as employee or student ID numbers or as account passwords. ChoicePoint's James Lee thinks banning Social Security number harvesting by data brokers would be deleterious to the accuracy of background checks and other reports, given how important the numbers are for distinguishing people with similar names and analyzing financial histories. Jody Westby of PricewaterhouseCoopers suggests increasing identity security by implementing universal use of fraud alerts, which would require all card issuers and loan providers to contact applicants and verify their identity before granting credit. Many security companies are working on new technology that will go to further lengths to protect Social Security numbers and other personal identity data. Liberty Alliance technologists have labored for several years on a method in which people can log in to one network and be automatically authenticated at another using an encrypted numeric token.
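
    The article does not detail the Liberty Alliance design, so the sketch below shows only the general single-sign-on idea of a short-lived signed token that one site issues and a partner site verifies without a second login; the shared secret, token format, and expiry window are invented for the example, not anything from the actual specifications.

    // Toy single-sign-on token sketch (not the Liberty Alliance protocol):
    // the identity provider signs a short-lived token, and a partner site
    // verifies the signature instead of asking the user to log in again.
    import { createHmac, timingSafeEqual } from "crypto";

    const SHARED_SECRET = "demo-secret-shared-out-of-band"; // illustrative only

    function issueToken(userId: string, now: number = Date.now()): string {
      const payload = `${userId}.${now + 5 * 60 * 1000}`; // user ID plus expiry time
      const signature = createHmac("sha256", SHARED_SECRET).update(payload).digest("hex");
      return `${payload}.${signature}`;
    }

    function verifyToken(token: string, now: number = Date.now()): string | null {
      const [userId, expiry, signature] = token.split(".");
      if (!userId || !expiry || !signature) return null;
      const expected = createHmac("sha256", SHARED_SECRET)
        .update(`${userId}.${expiry}`)
        .digest("hex");
      const ok =
        expected.length === signature.length &&
        timingSafeEqual(Buffer.from(expected), Buffer.from(signature));
      return ok && Number(expiry) > now ? userId : null; // null means reject
    }

    // Site A issues the token at login; site B accepts it without a second login.
    const token = issueToken("alice");
    console.log(verifyToken(token)); // prints "alice" while the token is still fresh

    A production federation scheme would rely on asymmetric signatures and standardized assertions rather than a shared secret, but the handoff pattern--issue a token on one network, verify it on another--is the same.
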
    Click Here to View Full Article

  • "Next Version of GPL Coming in 2007"
    IDG News Service (08/04/05); Martens, China

    The General Public License (GPL) will be updated to meet the growing demand for free software. The most popular license for free software was last updated in 1991, and GPL 2 helped make free software accessible to more than a very select community, according to Eben Moglen, a member of the board of the Free Software Foundation. Moglen, who also chairs the Software Freedom Law Center and is a professor of law and legal history at Columbia University Law School, will address the issue of a new version of the GPL during a talk at the LinuxWorld show next week in San Francisco. Clarifying the language of the GPL would make the license more accessible to lawyers around the world, according to Moglen. "The GPL needs to recognize global copyright more explicitly," says Moglen, who adds that the license also needs to be updated to take advantage of the latest technology, such as new Web services. The foundation continues to receive recommendations on the draft license, and a draft discussion disclosing the rationale for the decisions made could be available at the end of 2005 or in early 2006. Moglen wants to release GPL 3 in early 2007.
    Click Here to View Full Article

  • "Key Bugs in Core Linux Code Squashed"
    CNet (08/03/05); Evers, Joris

    Six critical defects in the core file system and networking code of Linux version 2.6.9 were discovered in December by Coverity, but a recent scan of Linux version 2.6.12 found no such programming errors, says Coverity CEO Seth Hallem. This indicates the maturation of Linux as an operating system as well as the security of its core code. The scan did uncover 1,008 defects overall, up from the 985 defects found in the earlier analysis, according to Coverity. The critical bugs--those causing vulnerabilities--were fixed, but the newer kernel still contains flaws. The Coverity scan measured a decrease in overall bug density from 0.17 bugs per thousand lines of code to 0.16. Hallem says flaws in the file system and networking code were rated more serious because all Linux users rely on those components. Since Microsoft does not make its Windows operating system kernel source code available, a side-by-side comparison of Linux with Windows is not possible, laments Coverity.
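
    The density figure in the summary is simply defects divided by thousands of lines of code; the snippet below redoes that arithmetic and back-calculates the code-base sizes the rounded numbers imply (an illustration of the metric, not Coverity's own data).

    // Defect density = defects / (lines of code / 1000).
    // The implied code sizes are back-calculated from the article's rounded figures.
    function defectDensity(defects: number, linesOfCode: number): number {
      return defects / (linesOfCode / 1000);
    }

    function impliedLines(defects: number, densityPerKloc: number): number {
      return (defects / densityPerKloc) * 1000;
    }

    const older = { defects: 985, density: 0.17 };  // Linux 2.6.9 scan
    const newer = { defects: 1008, density: 0.16 }; // Linux 2.6.12 scan

    // Roughly 5.8 million lines before and 6.3 million after: more code and
    // slightly more total defects, but fewer defects per thousand lines.
    console.log(impliedLines(older.defects, older.density).toFixed(0)); // ~5794118
    console.log(impliedLines(newer.defects, newer.density).toFixed(0)); // ~6300000
    console.log(defectDensity(newer.defects, impliedLines(newer.defects, newer.density))); // round-trip check: 0.16
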
    Click Here to View Full Article

  • "Wireless Sensors Land Anywhere and Everywhere"
    Electronic Design (07/21/05) Vol. 53, No. 16, P. 65; Allan, Roger

    The commercialization of wireless sensor networks has begun thanks to advances in hardware device miniaturization, lower power consumption levels, and small software operating systems, and wireless sensor-net technology is expected to become a pervasive element of our daily lives once certain technical kinks are ironed out. Applications once thought to be unrealistic are now achievable because of wireless sensing technology, which facilitates signal monitoring in hard-to-access locations and makes factory-floor cabling redundant. The fundamental components of wireless sensor nets are minuscule "mote" computers that run on batteries and use radio to communicate with each other as well as configure themselves into ad hoc networks. "Wireless sensor nets will become most ubiquitous in commercial markets for the near future, with applications ranging from security and bio-detection to building and home automation, industrial control, pollution monitoring, and agriculture," says Avaak CTO Bar-Giora Goldberg. Sensors' presence in the automotive industry is particularly strong in tire-pressure monitoring systems and automatic remote-meter-reading applications, and wireless sensor nets also hold promise for the homeland security market. The sensor market segment exhibiting the fastest growth is the image sensor segment, which is being fueled by breakthroughs in affordability, image resolution, and low power dissipation. Sensor nets' potential will undoubtedly expand as sensors, transceivers, antennas, batteries, controllers, and communication protocols and topologies continue to improve. It is expected that dust-sized sensors will eventually become de rigueur.
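
    None of the commercial systems mentioned above are represented here, but the self-organizing behavior the article describes--battery-powered motes discovering neighbors over radio and relaying readings hop by hop to a base station--can be illustrated with a toy simulation; the node layout, radio range, and hop-count routing below are invented for the example.

    // Toy wireless-sensor-net simulation: motes within radio range of one another
    // build a hop-count gradient from the base station, then forward readings
    // along it. Positions, range, and node names are made up for illustration.
    interface Mote {
      id: string;
      x: number;
      y: number;
      hops: number;          // hops to the base station, Infinity until discovered
      parent: string | null; // next hop toward the base station
    }

    const RADIO_RANGE = 12;

    const motes: Mote[] = [
      { id: "base", x: 0,  y: 0,  hops: 0, parent: null },
      { id: "m1",   x: 10, y: 0,  hops: Infinity, parent: null },
      { id: "m2",   x: 10, y: 10, hops: Infinity, parent: null },
      { id: "m3",   x: 20, y: 5,  hops: Infinity, parent: null },
    ];

    const inRange = (a: Mote, b: Mote) =>
      Math.hypot(a.x - b.x, a.y - b.y) <= RADIO_RANGE;

    // Repeatedly let every mote adopt the cheapest in-range neighbor as its parent,
    // mimicking the periodic beacons real motes use to organize themselves.
    function buildRoutes(): void {
      for (let round = 0; round < motes.length; round++) {
        for (const mote of motes) {
          for (const neighbor of motes) {
            if (mote !== neighbor && inRange(mote, neighbor) && neighbor.hops + 1 < mote.hops) {
              mote.hops = neighbor.hops + 1;
              mote.parent = neighbor.id;
            }
          }
        }
      }
    }

    // A reading travels parent to parent until it reaches the base station.
    function routeToBase(fromId: string): string[] {
      const path = [fromId];
      let current = motes.find((m) => m.id === fromId);
      while (current && current.parent) {
        path.push(current.parent);
        const parentId = current.parent;
        current = motes.find((m) => m.id === parentId);
      }
      return path;
    }

    buildRoutes();
    console.log(routeToBase("m3")); // ["m3", "m1", "base"]: two radio hops to the sink
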

  • "From Push to Pull: The Next Frontier of Innovation"
    McKinsey Quarterly (08/05); Brown, John Seely; Hagel III, John

    Most companies mistakenly assume that "push" resource mobilization systems marked by top-down, centralized, and inflexible programs of previously specified operations and behavior cultivate efficiency; in fact they impede participation in the distributed networks that are now crucial to competitive advantage, limiting the number and diversity of participants and, by extension, innovation and learning. Modularly designed, decentralized "pull" systems with loosely coupled components offer more flexibility and thus greater participation, and their increasing importance will force executives to rethink their company's prerequisites for success. Pull systems represent a more versatile strategy to mobilize resources that may exist inside or outside the organization. Pull models continuously attempt to widen the scope of resources available to participants while helping them discover the most appropriate options, as well as enabling even peripheral participants to creatively exploit opportunities as they come up. The pull model is particularly pronounced in the mass media, which has undergone a transformation through the digitization of content and the emergence of new means of access, assembly, and distribution via the Internet. Pull resource mobilization platforms are also gaining ground in product businesses, especially those marked by compressed life cycles and rapidly developing customer demand. Areas that have benefited from the pull approach include supply chain management and manufacturing, product innovation, employee learning and education, and open-source software.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)

  • "Preserving Maps for the Future"
    Federal Computer Week (08/01/05) Vol. 19, No. 25, P. 54; Sternstein, Aliya

    Preserving cartographic data is critical, but determining which geographic information is worthy of preservation and managing it is a formidable challenge, according to officials of the U.S. Geological Survey (USGS). The USGS National Geospatial Programs Office was established last August to collate and make available data from federal, state, local, and private sources for the Geospatial One-Stop and National Map projects; the former initiative is a Web portal and the latter is a continuously updated online topographic map. Hank Garie with the National Geospatial Programs Office says the federal government lacks the resources to map the entire country, while librarians say better approaches to preservation are needed if future generations are to clearly understand geospatial documents. Linda Zellmer, director of Indiana University's Geology Library, says large states may not be able to afford digital maps, while another problem is a lack of understanding of geographic information system terminology among many users. "The system should be intuitive enough so that anyone can use it," she says, adding that a librarian should not have to serve as a mediator in accessing maps. The National Map project is of particular concern to librarians: Donna Koepp with the Harvard College Library's Social Sciences Program points to a lack of assurance from government officials that libraries will be able to freely access the map, that earlier versions of the map will be preserved, and that librarians will be able to download and print full-size maps. She also notes that screen size will make the viewing of full-size maps difficult.
    Click Here to View Full Article

  • "Development Teams Get Bigger, Richer"
    Embedded Systems Programming (07/05) Vol. 18, No. 7, P. 38; Turley, Jim

    Embedded Systems Programming's survey of European and North American embedded systems developers shows a general increase in the size and funding of project-development teams. The average size of respondents' teams was 13.1 people, which was relatively consistent across the military, security, automotive, and networking sectors. Sixty-five percent said their team members all worked in one locality, either a single building or city, while the roughly one-third of development teams spread across multiple sites or multiple countries were usually to be found in defense, telecom, aerospace, and industrial control organizations. A quarter of respondents said their development teams have grown since their last project, while 24 percent reported an increase in funding between projects. The survey confirms a software bias among development teams across all industries: Fifty-three percent said more team members were programmers than engineers; two-thirds reported spending more time developing software than hardware; and 44 percent indicated higher spending levels for software development, most likely on programmers' salaries. A distressing finding is that developers spend more time testing and debugging software or hardware than they do creating it, which would indicate that either developers are severely underskilled or debugging tools are dramatically deficient. Programmers also devote more time to debugging than hardware engineers do, a trend most probably caused by software development's dominant role in projects.
    Click Here to View Full Article

  • "The State of Surveillance"
    BusinessWeek (08/08/05) No. 3946, P. 52; Yang, Catherine; Capell, Kerry; Port, Otis

    Future surveillance technologies may be more effective terrorism deterrents, and the public's apparent acceptance of the increased privacy infringement they entail is encouraging research in this area. Scientists at the University at Buffalo and elsewhere are investigating systems that could turn one's breath, saliva, or body odor into a biometric ID; however, biometrics' tendency to generate false positives is a problem, while the increased use of biometrics for identity authentication, building access, and so on raises the risk of theft or forgery. Biometrics-based surveillance technologies being funded by the U.S. government include camera-based software that can identify people from a distance according to their strides, and systems for tracking known criminals by comparing their irises to prints in a database. Faster and cheaper data processing and storage systems carry with them the likelihood of serious privacy infringement, by establishing an infrastructure for stitching together a detailed picture of a person's daily activities from the various pieces of data he discloses throughout the day. Furthermore, the government is focusing on the use of software to mine multiple databases in order to extract relationships between people to aid in criminal investigations, which could also be applied to eavesdropping. The most highly prized concept is a universal sensor capable of identifying any known pathogen or toxin, which would be distributed in networks. Among the most uncomfortable aspects of high-tech surveillance is the fact that citizens will be at the mercy of the government and corporations, which will reserve the most advanced surveillance technologies for themselves. In addition, widespread consumer use could lead to abuses, such as the victimization of individuals with unpopular views.
    Click Here to View Full Article

  • "Tomorrow: A Sneak Preview"
    Business 2.0 (08/05) Vol. 6, No. 7, P. 77; Lotsson, Anders

    Emerging technologies are taking shape in centers spread throughout the globe, in states of development ranging from R&D to beta testing to early adoption. Hotspots on the U.S.'s West Coast include Seattle, where a pilot Wi-Max wireless Internet network has been deployed, and Berkeley, home of wireless smart dust network technology the Energy Department is using to test building energy conservation schemes. Also being beta-tested are biometric payment systems in three Southern U.S. states, while Motorola is developing a prototype nano-emissive display in Illinois. Plastic spray-on electronics being beta-tested by Plastic Logic and Cambridge Display Technology in England promise to offer more design flexibility and lower cost for incorporating circuitry into consumer appliances. In Singapore, radio-frequency ID (RFID) tags are being used to monitor the movements of incoming and outgoing cargo, track books in libraries, and collect road tolls. Bangalore, India, is a hotspot for inexpensive computers such as Encore and Picopeta's Simputer, a handheld pen device used by illiterate villagers as well as the military. Japanese companies such as Kawada Industries are pushing the envelope in the field of assisted-living robots, while high-bandwidth, fourth-generation wireless telephony is under development in Beijing. Machine language translation technologies are being used at the European Union's Brussels headquarters, and commercial transactions facilitated by mobile phones are catching on in Helsinki, Finland.
    Click Here to View Full Article

  • "Wide Open Spaces"
    CIO Insight (07/05) Vol. 1, No. 55, P. 36; Baker, Edward

    University of California at Berkeley political science professor and author Steven Weber says the open-source movement is commoditizing information technology at the software level and eliminating certainty in the belief that tight protection of intellectual property is the only way to build a sustainable value-creation system. Commoditization gives rise to competition, which is fundamentally beneficial to the industry, Weber argues. He says the focus on intellectual property rights has led to "the tragedy of the anti-commons," a situation in which small companies or individuals who wish to experiment with patented inventions cannot because of the prohibitively high cost of determining what permissions must be obtained to use the products. Weber says the open-source movement can be of value to managers because participation in open-source encourages creativity and learning during the creative process. This shows managers that there are other motivations besides money that most organizations do not take advantage of. "It's critical to recognize that if you give people the infrastructure to create their own products, they're likely to figure some out, because they know what they need better than you do," Weber reasons. "I think the open-source community, at least at the level of underlying operating systems, has done that, not necessarily because that was what they intended to do, but they created an ecology in which that's possible." Weber says the open-source community believes embedded software and devices and utility grid computing--not the desktop--constitute the next big growth market for IT infrastructure.
    Click Here to View Full Article

  • "'Madagascar' Tech Turns Imagination Into Reality"
    eWeek (08/01/05) Vol. 22, No. 30, P. N2; Galli, Peter

    At a recent media conference about the DreamWorks animated film "Madagascar," DreamWorks CEO Jeffrey Katzenberg stressed that technological advances are the key element in translating imaginary worlds into animated reality. He explained that every component of the animation process is shaped by technology, noting for example that the last 13 years have witnessed the expansion of the color palette from just four colors to 250. "Madagascar's" formidable production design, animation, and rendering needs were met through a combination of Hewlett-Packard systems running Linux software and DreamWorks' in-house E-motion operating system, which has been continually developed for more than two decades. E-motion uses a programming language similar to C, which gave technical users a certain degree of control over their work and saved time by making some components programmable. DreamWorks Animation's Rex Grignon said films such as "Madagascar" could not have been created without technological advances made over the last five years; "Our desktop machines have benefited from the surge in available memory, and that allows us to now deal with those elements that were previously very complex far more easily at the desktop level," he said. Maintaining the proprietary E-motion software is enormously expensive, but Grignon said it is advantageous. "One of the best things about this is that I can talk to the people who wrote the actual software, explain to them my specific needs or problems, and know that they will start working on that immediately," he noted. The production team's servers, laptops, desktops, and notebooks were HP models powered by AMD Opteron processors, while Opteron-based HP ProLiant servers comprised the render farm for production.
    Click Here to View Full Article


 