ACM TechNews sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, which powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 785: Friday, April 29, 2005

  • "House Signs Off on Supercomputing"
    Federal Computer Week (04/27/05); Sternstein, Aliya

    This week the U.S. House of Representatives approved the High-Performance Computing Revitalization Act of 2005, which calls on the Department of Energy and the National Science Foundation to ensure the availability of supercomputers to American engineers and scientists, and names the director of the White House Office of Science and Technology Policy as the official tasked with supervising all federal high-performance computing initiatives. The legislation requests improved supercomputing hardware, software, standards, and training, but does not authorize new funding, which is an area of concern for research and development organizations. Computing Research Association (CRA) Chairman James Foley said the Defense Advanced Research Projects Agency's de-emphasis on basic research in favor of technology with more short-term applications has "left gaps in the federal portfolio that threaten to constrain future innovation in IT." In addition, budgetary cutbacks are forcing layoffs of federal government supercomputing employees. For instance, NASA officials recently announced that as much as one-quarter of the Ames Research Center's supercomputing and space exploration robotics work force would be let go. Unlike last year's congressionally approved Energy Department High-End Computing Revitalization Act, the House legislation pertains to all IT R&D efforts in agencies within the purview of the House Science Committee, not just the Energy Department.
    Click Here to View Full Article

  • "Adventures in the Skin Trade"
    Technology Review (04/29/05); Brown, Eric S.

    Nearly 10 years after MIT Media Lab researchers Thomas Zimmerman and Neil Gershenfeld created a prototype intra-body communication network that tapped the naturally occurring electrical fields of human skin to transfer data between devices, NTT Labs in Japan has developed an application based on that research in the form of RedTacton "human area networking" technology. First demonstrated at Siggraph two years ago as ElectAura-Net, the RedTacton system is a transceiver consisting of an optical receiver circuit outfitted with a photonic electrical field sensor and a crystal to send data over the human body between wearable devices at a maximum transmission rate of 10 Mbps. No direct connection to the skin is required, and NTT's Hideki Sakamoto believes the pocket-sized device will soon be small enough for embedding within cell phones. He says the system presents no danger to the user since no current flows into the body, while atmospheric factors such as temperature, rain, or electrical storms cannot affect signal integrity--although electromagnetic waves sometimes can. Sakamoto says the major advance for RedTacton is the elimination of interference between wireless devices. He expects RedTacton to be incorporated into security applications before commercialization, and the system's immunity to wireless snooping would permit such activities as private touch-based purchases; however, RedTacton could also be used for Big Brother-type surveillance. Problems with instability, noise, and power consumption have been eliminated from the technology since the 2003 Siggraph demo, according to Sakamoto. Even with RedTacton's convenience and potential usefulness, Sakamoto does not discount the presence of psychological factors that may hinder its mainstream acceptance.
    Click Here to View Full Article

  • "NCSA Allows Faster, Better, Cheaper Engineering"
    HPC Wire (04/29/05) Vol. 14, No. 17; Ricker, Kathleen

    Large construction projects such as highway improvements involve high costs, traffic disruption, and worries over quality, but civil engineering researchers at the University of Illinois, Urbana-Champaign (UIUC), have created software to help optimize those projects. Faster, cheaper, and better are three axes that must be balanced in engineering and software development projects, and in the case of large public works projects, finding the optimal compromise is an enormously complex task. A large highway construction project often involves up to 700 separate tasks, and modeling such a project means weighing an exponential number of possible combinations. UIUC assistant civil engineering professor Khaled El-Rayes partnered with the National Center for Supercomputing Applications (NCSA) to make his software run in parallel so that state and federal highway agencies would be able to run it on office computers over a weekend; the software is based on a genetic algorithm and tailored to provide three-dimensional time-cost-quality trade-off analysis. El-Rayes would like to add other considerations as well, such as safety, service disruption, and environmental impact. He used records from the Illinois Department of Transportation to develop quality measurements. For a large construction project encompassing roughly 700 activities, the software model takes 430 hours of computation on a single processor, but that time was narrowed to just 55 hours--a weekend--on a distributed office grid of 10 to 15 processors. The software model is also expected to help other large public works, such as arena or bridge construction.
    Click Here to View Full Article
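
    The item above describes a genetic algorithm searching for the best time-cost-quality compromise across hundreds of activities. The short Python sketch below shows the general shape of such a search; the activity options, weights, and selection scheme are invented for illustration and are not El-Rayes' actual model.

      import random

      # Hypothetical options per activity: (duration_days, cost, quality_score).
      # A real project would have roughly 700 activities; three keep the sketch readable.
      ACTIVITY_OPTIONS = [
          [(10, 50_000, 0.90), (7, 80_000, 0.85), (14, 40_000, 0.95)],
          [(20, 120_000, 0.80), (15, 160_000, 0.75)],
          [(5, 30_000, 0.90), (4, 45_000, 0.80), (6, 25_000, 0.95)],
      ]
      WEIGHTS = (0.4, 0.4, 0.2)  # relative importance of time, cost, quality

      def fitness(chromosome):
          """Lower is better: normalized time and cost minus a quality bonus."""
          time = sum(ACTIVITY_OPTIONS[i][g][0] for i, g in enumerate(chromosome))
          cost = sum(ACTIVITY_OPTIONS[i][g][1] for i, g in enumerate(chromosome))
          quality = sum(ACTIVITY_OPTIONS[i][g][2] for i, g in enumerate(chromosome)) / len(chromosome)
          return WEIGHTS[0] * time / 40 + WEIGHTS[1] * cost / 300_000 - WEIGHTS[2] * quality

      def random_chromosome():
          return [random.randrange(len(opts)) for opts in ACTIVITY_OPTIONS]

      def crossover(a, b):
          cut = random.randrange(1, len(a))
          return a[:cut] + b[cut:]

      def mutate(c, rate=0.1):
          return [random.randrange(len(ACTIVITY_OPTIONS[i])) if random.random() < rate else g
                  for i, g in enumerate(c)]

      def evolve(generations=200, pop_size=50):
          population = [random_chromosome() for _ in range(pop_size)]
          for _ in range(generations):
              population.sort(key=fitness)
              parents = population[: pop_size // 2]      # keep the fitter half
              children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                          for _ in range(pop_size - len(parents))]
              population = parents + children
          return min(population, key=fitness)

      best = evolve()
      print("best option per activity:", best, "score:", round(fitness(best), 3))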

  • "Bringing the Internet to the Whole World"
    Washington Post (04/29/05) P. E1; Krim, Jonathan

    Advanced Micro Devices intends to bridge the digital divide between computing haves and have-nots by distributing low-cost PCs, with the goal of making Internet-connected computers available and affordable to half the world's population by 2015. AMD's Personal Internet Communicator (PIC), which costs roughly $200, is a stripped-down PC pre-loaded with a Microsoft operating system as well as Web browsing, email, instant messaging, spreadsheet, and word processing programs. The PIC also offers connections for standard monitors, a printer, and either dial-up or high-speed Net access, while distributors should be able to install specialized software for specific customers later on. The device cannot run most popular software programs or support games, but such restrictions were incorporated to fortify the system against malware, keep costs low, and promote ease of use for first-time users. The product targets people who earn a yearly income of $5,000 to $10,000, which University of Michigan business professor C.K. Prahalad argued is the demographic corporations should focus on if they wish to make profits while also helping eliminate poverty. Each PIC, which costs AMD about $160 to manufacture, is sold to Internet providers, who then sell it to consumers; using Internet providers gives AMD CEO Hector de Jesus Ruiz more control over machine distribution. Enrique Camacho with Internet provider Cable & Wireless says the PIC, which is also sturdy and low-power, could become very popular in Third World schools. Ruiz acknowledges that putting PICs in the hands of half the world by 2015 "will take a lot more effort than we probably thought," and says Third World governments must pull their weight by reducing taxes on the machines.
    Click Here to View Full Article

  • "License Proliferation: When More Is Less"
    Linux Insider (04/26/05); Albert, Philip H.

    With more than 50 open source software licenses available, developers are often at a loss when deciding which license to use. Nobel Laureate Herb Simon was once quoted as saying an abundance of options leads people to choose less optimal options for the sake of minimizing risk; in the open source licensing context, this means developers may not take the time to perform an exhaustive evaluation of the risks and benefits associated with each choice. Eric Raymond has commented that patent attorneys advise developers to do only a cursory review of intellectual property issues, but sound legal practice always involves evaluation of risks. With a more streamlined open source licensing system, developers would be able to more accurately target the license that meets their needs, whether that is return on investment, widespread adoption of the technology, or creation of a vibrant community of contributors. Thankfully, companies such as Intel and the Open Source Initiative (OSI) itself have taken early steps to reduce the number of open source licenses: Intel voluntarily de-listed its Intel Open Source License, which was basically a BSD license with an export clause added. Companies have often used open source licensing as another front in ideological wars, but they are still coming together to try to rationalize the number of licenses, as with a joint IBM-Sun-Computer Associates effort to create a common commercial open source license based on Sun's Common Development and Distribution License. OSI says it will also institute new audit rules for open source license submissions, requiring that new licenses not be duplicative, that they be clearly written, and that they be reusable; the OSI will also classify licenses as "ordinary," "preferred," or "deprecated." The group has also set up a license proliferation committee to work on streamlining the available licenses.
    Click Here to View Full Article

  • "Mining Reality"
    TheFeature (04/28/05); Pescovitz, David

    As a participant in MIT Media Lab's Reality Mining project, graduate student Nathan Eagle has compiled about four decades' worth of continuous data on human behavior using communication, location, activity, and proximity information taken from 100 cell phone users, and this data could be used to enable new mobile phone functionalities. Reality Mining, as Eagle describes it, is the capturing of objective data on human behavior so that behavioral patterns can be more accurately predicted. The grad student says he splits mobile phone data into three categories: location, determined through the use of cell tower IDs; communication, taken from logs that list call recipients and frequency; and proximity, revealed by Bluetooth scans every five minutes. Eagle says this data can facilitate behavior prediction by identifying patterns indicating life routines, which could be applied to the customization of a phone's appearance and operation for specific user demographics. He notes, for example, that about three out of every 10 MIT students in his study use their phone's clock as an alarm each day, but setting the alarm takes 10 keystrokes; from this observation, the value of a phone that assigns a single button to the alarm clock becomes clear. The grad student says he intends to apply the human behavior data he has collated to various disciplines, including computer science, social science, and physics. Eagle has created an application, Serendipity, designed to identify people with common interests by comparing the similarity of their behavior patterns and/or their physical proximity; Eagle admits that Serendipity raises privacy issues.
    Click Here to View Full Article
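
    The three data categories Eagle describes--cell tower IDs, call logs, and periodic Bluetooth scans--lend themselves to simple pattern extraction. The toy Python sketch below infers a crude daily routine and recurring contacts from invented log records; the field names and thresholds are assumptions for illustration, not the Reality Mining code.

      from collections import Counter, defaultdict

      # Hypothetical records: (hour_of_day, cell_tower_id) for the location category
      # and (hour_of_day, nearby_device_id) for five-minute Bluetooth proximity scans.
      location_log = [(8, "tower_17"), (9, "tower_42"), (9, "tower_42"), (13, "tower_42"),
                      (18, "tower_17"), (22, "tower_17"), (9, "tower_42"), (23, "tower_17")]
      proximity_log = [(9, "phone_B"), (10, "phone_B"), (14, "phone_C"), (9, "phone_B")]

      def routine_by_hour(log):
          """For each hour of the day, the most frequently seen cell tower -- a crude routine."""
          seen = defaultdict(Counter)
          for hour, tower in log:
              seen[hour][tower] += 1
          return {hour: towers.most_common(1)[0][0] for hour, towers in sorted(seen.items())}

      def frequent_contacts(log, min_sightings=2):
          """Devices seen nearby often enough to suggest a recurring social tie."""
          counts = Counter(device for _, device in log)
          return [device for device, n in counts.items() if n >= min_sightings]

      print(routine_by_hour(location_log))     # e.g. {8: 'tower_17', 9: 'tower_42', ...}
      print(frequent_contacts(proximity_log))  # e.g. ['phone_B']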

  • "New Intelligent Dictionary Searches Make Sense"
    IST Results (04/29/05)

    The IST-funded BENEDICT project has yielded a semantic, context-sensitive dictionary lookup that can facilitate more intelligent online dictionary searches. The BENEDICT dictionary does more than scan the text around the search word and provide the proper base form and syntactic category; it also highlights the correct meaning of the entry via the Domain Detection System (DDS). BENEDICT coordinator Jukka-Pekka Juntunen explains how the system works: once the user selects an onscreen word with the mouse, the BENEDICT tool appears and reads the text with optical character recognition, tags the surrounding text semantically, looks the word up in the dictionary, and highlights the corresponding meaning with the DDS. However, Juntunen acknowledges that semantic dictionary lookup is a formidable challenge, as there is frequently little available onscreen context; "The biggest problem is that users expect a feature like this to be perfect and often ask just the difficult questions," he says. The project involved the development of a semantically sensitive Finnish-English-Finnish dictionary and a French-English-French dictionary. Juntunen says BENEDICT has demonstrated a fuzzy search feature that helps users find a word that they have only heard and do not know how to spell. Prototype mobile versions of BENEDICT are currently available for the Microsoft PocketPC and Nokia Communicator operating systems.
    Click Here to View Full Article
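
    Juntunen's fuzzy search feature--finding a word the user has only heard and cannot spell--can be approximated with approximate string matching. The sketch below uses Python's standard difflib against a toy headword list; it is a stand-in for, not a description of, BENEDICT's actual algorithm.

      from difflib import get_close_matches

      # Toy English headword list; the project's real dictionaries are
      # Finnish-English-Finnish and French-English-French.
      HEADWORDS = ["receive", "recipe", "reception", "recite", "rescind", "resume"]

      def fuzzy_lookup(heard_spelling, cutoff=0.6):
          """Suggest dictionary entries for a word the user has only heard spoken."""
          return get_close_matches(heard_spelling, HEADWORDS, n=3, cutoff=cutoff)

      print(fuzzy_lookup("reseeve"))   # likely ['receive']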

  • "IT Industry Seeks to Set Parameters for Growth"
    Moscow Times (04/27/05) P. 1; Pronina, Lyuba; Levitov, Maria; Belton, Catherine

    Russian President Vladimir Putin has a vision to transform Russia into an IT giant, and he recently disclosed that this vision involves creating 10 Russian IT parks, aided by tax breaks, by the end of the decade; Russian IT and communications minister Leonid Reiman says the state is planning a five-year IT investment of $644 million to make Russia one of the top 10 global IT players by 2010, by which time the country should be pulling in annual outsourcing revenues of $40 billion. However, industry players see the need for immediate tax breaks and an appropriate road map for IT parks if Russia is to retain its competitive advantage in terms of technical expertise. Companies are displeased with the plan to construct IT parks from scratch and institute tax breaks only within them: Russoft director Valentin Makarov warns that the parks will be able to accommodate no more than 15% of industry players, which will lead to corruption. Meanwhile, Luxoft CEO Dmitry Loshchinin says Russian competition with outsourcing behemoth India is almost inconceivable given Russia's tax burden--Russian companies pay 30% on revenues, while Indian companies pay only 5%. Special tax breaks instituted for IT parks could also be exploited by non-IT companies, since the legislation's definition of IT services outsourcing is hazy. Still, Russia prides itself on having a greater concentration of scientists per capita than India, Germany, Britain, or France, and the Auriga software company says the number of Russians earning advanced degrees in computer science and software engineering has risen significantly since 2003. "We should not strive to provide outsourcing services at a lower price [than India], but create new products and become providers of more complex solutions," argues Makarov, citing Russian IT workers' outstanding mathematical and creative skills.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "Just Like the Real Thing"
    Straits Times (04/26/05); Tan, Aaron

    The National University of Singapore's (NUS) Mixed Reality Lab (MRL) and Nanyang Technological University's Center for Advanced Media Technology (CamTech) are engaged in research into "mixed reality" technologies that enhance real-world views with virtual object or data overlays. One project is MRL's Human Pacman game, in which participants wear special headgear that fuses their point of view with digital "cookies" and "ghosts" distributed throughout the NUS campus. Another prototype mixed reality technology is a Malaysian tourist book that displays 3D virtual objects of key sights as users read the pages and listen to recorded narration. The Augmented Reality Chinese Character Learning system developed by CamTech director Dr. Wolfgang Muller-Wittig and his team generates 3D pictorial representations of Chinese characters using mixed reality software and a Webcam. The Webcam captures the image of a place card with a Chinese character printed on it, and the software identifies markers on the card that signify the specific character, producing a corresponding virtual object. Muller-Wittig says mixed reality technology can also be applied to architecture. Mixed reality has been identified as an area of concentration in the Infocomm Development Authority (IDA) of Singapore's ITR5 technology roadmap, which cites the need to comprehend and manage expanding volumes of data. "Mixed reality will allow us to communicate ideas and learn through perception, rather than just in a textual mode," asserts IDA deputy director for technology direction Raymond Lee.
    Click Here to View Full Article

  • "Speech Recognition Software Slowly Making Progress"
    TechNewsWorld (04/27/05); Korzeniowski, Paul

    The future of speech recognition is promising if not spectacular, as the technology is experiencing gradual but steady growth. "Because of the growing emphasis on customer service recently, many companies have become interested in speech recognition systems," says Datamonitor analyst Daniel Hong, who also says speech recognition product and service pricing declined 30% last year. This trend stems from the simplification of product development via vendor adoption of standards such as VoiceXML (VXML) and Speech Application Language Tags (SALT). However, the existence of different standards complicates interoperability between different speech recognition systems. Many speech recognition adopters are companies seeking to streamline or automate customer service, thus saving money and sparing customer service representatives from repetitive chores. Gartner analyst Steve Cramoysan says speech recognition rates have substantially improved thanks to new algorithms and more powerful processors, but Yankee Group analyst Art Schoeller says speech recognition systems still lack the sophistication to sift through the diverse reasons customers call for help. Usage of these systems is mainly restricted to closed questions, which give users only a handful of possible answers; companies must therefore devote a considerable amount of time to the design of speech recognition applications, and vendors charge buyers substantial fees for the professional services teams that assist with the design and development of such applications. Another reason for speech recognition technology's slow progress is vendors' hyping of easier-to-deploy Web-based customer service.
    Click Here to View Full Article

  • "Florida Planning Son of Matrix"
    Wired News (04/25/05); Singel, Ryan

    Florida officials have put out a request for information on a sequel to the controversial Multistate Anti-Terrorism Information Exchange (Matrix) project, which combined government and commercial personal data to help track down terrorists and other criminals. The document outlines a more ambitious network than the original Matrix because it includes requirements for financial and insurance information that was not included in the original database. That requirement also makes data broker ChoicePoint a likely candidate to build the system, since the company is the only vendor with a centralized insurance claim database. ChoicePoint was recently discovered to have unwittingly sold profiles on hundreds of thousands of individuals to identity thieves. The first Matrix project began with $12 million from the departments of Justice and Homeland Security, but fizzled out after funding was not renewed and nine of the original 13 participating states pulled out. Florida officials conceptualized the first project and hope to sign on other states for the second version. The database was accessed 1,866,202 times during its three-year duration, but less than 3% of the searches were related to terrorism even though the system was promoted mainly for that purpose. Civil liberties groups say sharing government information is a good idea, but they worry about the inclusion of commercial data that is often error-prone.
    Click Here to View Full Article

  • "A Crisis of Prioritization"
    Computerworld Australia (04/27/05); Bajkowski, Julian

    A new report from the President's Information Technology Advisory Committee (Pitac) warns that the emphasis on bolstering national security in the wake of the 2001 U.S. terrorist attacks has left a critical element--the cybersecurity of civilian technological infrastructures--severely underfunded. The report concludes, "The information technology infrastructure of the U.S., which is now vital for communication, commerce, and control of our physical infrastructure, is highly vulnerable to terrorist and criminal attacks." The Pitac study notes that in terms of research and development priorities, research emphasis is just as important as funding levels, if not more so. Pitac calls for the incorporation of holistic security within current and nascent architectures, which entails a change in thinking and IT design philosophy rather than pouring vast amounts of money into piecemeal patches that address only immediate problems. The committee says the federal government must guide the rehabilitation of the IT industry, asserting that "an expanded portfolio of U.S. federal cybersecurity R&D efforts is required because today we simply do not know how to model, design, and build systems incorporating integral security attributes." The Pitac report has many supporters in Australian government and industry, and AusCert director Grahame Ingram says vendors have started to make security a much more integral component of software and hardware design in the last few years. Professor Bill Caelli of Queensland University of Technology calls the Pitac report alarming and cites the need for a top-level reconsideration of embedding security within IT.
    Click Here to View Full Article

  • "What Does the Future of Communications Hold?"
    TechRepublic (04/21/05); Artner, Bob

    Communications channels are merging with one another, as well as with new search capabilities, says INBOX conference chair Martin Hall, whose gathering features discussion and exhibits about enterprise messaging. With email now a mission-critical business application, companies need to consider corporate policies that deal with government regulations and outbound email management. Email messages are not analogous to telephone conversations; they are digital business records, and businesses should therefore control what is being sent out of the company. Integration among instant messaging, VoIP, email, and now RSS feeds is occurring slowly, but is evident in the new Mac OS X and Microsoft Windows versions including RSS reader capabilities; that said, email, instant messaging, VoIP, and desktop providers are all readying strategies to control the interface and promote their own brand in an integrated communications environment. On mobile devices, hardware limitations and a relatively immature market for mobile email and messaging mean vendors have more motivation to integrate messaging, email, and voice than with desktop systems. Hall says authentication, accreditation, and reputation are all being added to communications to make them more secure and to weed out spam and phishing attacks. Sender ID is in the implementation stage, while the digital-signature approaches DomainKeys and Cisco's Identified Internet Mail (IIM) are being encouraged to merge their specifications; further out, signing solutions and frameworks for reputation will be implemented. Desktop search is an important developing area and will likely involve email applications as well, as seen with Google's Gmail, but Hall says, "The battle will not just be about Web page search, but email search too."
    Click Here to View Full Article
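
    The authentication approaches Hall mentions (Sender ID, DomainKeys, Cisco's IIM) attach verifiable evidence of a message's origin. The sketch below illustrates only the underlying sign-and-verify idea using a shared-secret HMAC; real DomainKeys/IIM use public-key signatures over canonicalized headers with the verification key published in DNS, so treat this as a conceptual stand-in.

      import hashlib
      import hmac

      # Hypothetical signing key held by the sending domain's mail server.
      DOMAIN_KEY = b"example.com-outbound-signing-key"

      def sign_message(headers: str, body: str) -> str:
          """Signature the sending server would attach to an outgoing message."""
          mac = hmac.new(DOMAIN_KEY, (headers + "\r\n\r\n" + body).encode(), hashlib.sha256)
          return mac.hexdigest()

      def verify_message(headers: str, body: str, signature: str) -> bool:
          """Receiver-side check that the message was not forged or altered in transit."""
          return hmac.compare_digest(sign_message(headers, body), signature)

      headers = "From: alice@example.com\r\nTo: bob@example.org\r\nSubject: Q2 report"
      body = "Please find the report attached."
      signature = sign_message(headers, body)
      print(verify_message(headers, body, signature))                  # True
      print(verify_message(headers, body + " (tampered)", signature))  # False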

  • "Software to Simulate Human Heart"
    Daily Californian (04/27/05); Lu, Andrea

    A cross-disciplinary team of university researchers has created the first-ever computer simulation of the human heart's fluid dynamics, using algorithms developed at New York University and computer software developed at the University of California, Berkeley. UC Berkeley computer science professor Katherine Yelick has worked on the project since 1993, and helped develop Titanium, a Java-based parallel programming language, to help create highly parallel software. Modeling fluid dynamics in the heart is much more complicated than modeling the fluid dynamics of a car engine, for example, because the tissue walls of the heart expand and contract. In order to realistically capture that movement and fluid interaction, Yelick teamed up with mathematicians from New York University who created an algorithm for flexible structures called the "immersed boundary method." The eventual goal is to create a software model of the entire human body, but that project could take as many as 20 years. Within five years, a detailed organ system can be modeled, Yelick estimates. The computer simulations will help design more effective medical aids such as prosthetics and artificial heart valves. Yelick says communication between different fields of expertise is increasingly difficult because of the highly specialized nature of mathematics, biology, and other disciplines. However, she says students are being taught communication skills that will help them conduct similar cross-disciplinary work in the future.
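
    The immersed boundary method mentioned above couples a moving elastic boundary to a fluid on a fixed grid. The one-dimensional toy below shows the core loop--compute boundary forces, spread them to the grid, update the fluid, interpolate velocities back, move the boundary--with a deliberately simplified "fluid solve" and invented parameters; it is an illustration of the idea, not the heart code.

      import numpy as np

      N, length, dt = 64, 1.0, 0.01
      h = length / N                               # grid spacing
      grid_x = np.arange(N) * h

      boundary_x = np.array([0.30, 0.50, 0.70])    # Lagrangian boundary points
      rest_x     = np.array([0.35, 0.50, 0.65])    # positions the elastic "tissue" prefers
      stiffness  = 5.0

      def delta(r):
          # Smoothed discrete delta function (a simple hat of width 4h, unit area).
          return np.maximum(0.0, 1.0 - np.abs(r) / (2 * h)) / (2 * h)

      u = np.zeros(N)                              # fluid velocity on the grid
      for step in range(200):
          F = stiffness * (rest_x - boundary_x)    # 1. elastic forces on the boundary
          f_grid = np.zeros(N)
          for Xk, Fk in zip(boundary_x, F):        # 2. spread forces onto the fluid grid
              f_grid += Fk * delta(grid_x - Xk)
          u = 0.9 * u + dt * f_grid                # 3. toy fluid update: damping plus body force
          U = np.array([np.sum(u * delta(grid_x - Xk)) * h for Xk in boundary_x])
          boundary_x = boundary_x + dt * U         # 4. boundary moves with the local fluid velocity

      print("final boundary positions:", np.round(boundary_x, 3))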

  • "Biotech Data's Big Bang"
    EE Times (04/25/05) No. 1368, P. 1; Brown, Chappell

    The automation of biochemistry is fueling an explosion in biotech data worldwide, shrinking the time it takes for such data to double in volume from 18 months a few years ago to between three and six months today. Organizations charged with biotech data management are using a broad spectrum of new tools offered by major companies as well as specialized services from smaller startups and consultants that operate on high-performance biological data servers or supercomputers. The tapping of computers, databases, and algorithms to process the swelling volume of biotech information is formally known as bioinformatics, but the definition is mutable given the field's rapid expansion. "It's not a single science, actually, but an aggregate of interdisciplinary areas," notes Isidore Rigoutsos, manager of IBM's bioinformatics and pattern-matching initiative. Storing and managing the biotech data requires reengineering database management systems, followed by the development of algorithms for analysis and modeling of data for particular functions. IBM's scalable Blue Gene supercomputer boasts processor configurability for fulfilling the computational demands of specific projects, while biologists and biotech researchers are looking to supercomputing as a tool for "in silico" experiments in which the use of living tissue or lab-on-a-chip technology is unnecessary. Stanford University researcher Vijay Pande, meanwhile, devised a distributed computing scheme to model protein folding without supercomputers by utilizing 200,000 desktop systems that execute segments of a protein-folding algorithm when idle. With in-silico systems a distant vision, companies desiring a niche in bioinformatics are integrating sophisticated technologies, including robotic systems and gene probe arrays, with large-scale computing capability.
    Click Here to View Full Article
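
    Pande's distributed approach splits a long protein-folding computation into independent work units that idle desktops process and return. The sketch below mimics that pattern locally, with a process pool standing in for volunteer machines; the per-unit computation is a placeholder, not folding physics.

      from concurrent.futures import ProcessPoolExecutor

      def simulate_segment(work_unit_id: int) -> float:
          """Placeholder for one expensive, independent work unit."""
          total = 0.0
          for i in range(1, 200_000):
              total += (work_unit_id * i) % 7      # stand-in arithmetic, not protein folding
          return total

      if __name__ == "__main__":
          work_units = range(16)                   # independent segments of the larger job
          with ProcessPoolExecutor() as pool:      # stands in for 200,000 volunteer desktops
              results = list(pool.map(simulate_segment, work_units))
          print("aggregated result:", sum(results))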

  • "Presidential Panel Recommends Steps to Promote Computational Science"
    Chronicle of Higher Education (04/29/05) Vol. 51, No. 34, P. A33; Kiernan, Vincent

    The President's Information Technology Advisory Committee recommended in an as-yet-unreleased report a restructuring of universities and federal agencies to promote multidisciplinary computational science, which plays an "integral role" in solving a multitude of problems ranging from traditional science and engineering to public health to homeland security. Daniel Reed, vice chancellor for information technology at the University of North Carolina at Chapel Hill and chairman of the panel subcommittee that wrote the draft, called for more strategic thinking and more coordinated work. University of Washington computer science professor Edward Lazowska noted that one way to encourage universities to acquire multidisciplinary expertise would be to give officials reason to expect that such capabilities would favor them in competing for federal grants. Meanwhile, the report summary said federal agencies also need reorganization, and recommended a government-commissioned study by the National Academies on restructuring agencies' research priorities "to support revolutionary advances in computational science." The panel also called for the creation of a long-term computational science road map by the academies, and for additional federal financial support for computational science initiatives such as archives of data being compiled with advanced digital tools. The report summary recommended that researchers with federal grants be required to place their software and data in these archives, and that the government additionally fund the development of new computational science hardware and software as well as supercomputer centers for researchers.

  • "Satisfied But Uncertain"
    InformationWeek (04/25/05) No. 1036, P. 32; Chabrow, Eric

    The 2005 InformationWeek Salary Survey of 12,158 IT professionals finds an overwhelming degree of job satisfaction among respondents, although approximately two-thirds do not think an IT career and the potential for salary advancement are as promising today as they were five years ago. This is despite the fact that IT's unemployment rate is almost two percentage points below the average for all occupations. A majority of respondents say challenge and responsibility are primary considerations, but just 33% of IT staff and 47% of IT managers feel their positions are challenging. Deirdre Woods, CIO and associate dean of the University of Pennsylvania's Wharton School of Business, says decent salaries and solid health-insurance benefits are taking precedence over entrepreneurism among business-technology workers. The InformationWeek survey finds two-fifths of IT staffers and managers to be somewhat or actively looking for employment elsewhere, although a desire for job security can dampen a willingness to take risks, which could be detrimental to corporate IT organizations. A majority of respondents agree that the current outsourcing trend is having an adverse effect on the IT profession, with 68% reporting fewer jobs, 61% reduced worker morale, 53% lower salaries for new hires, and 42% fewer advancement opportunities as a result of outsourcing. These claims are contested by University of Michigan business professor C.K. Prahalad, who told attendees at InformationWeek's Spring Conference that the demand for American exports is being fueled by the IT explosion in regions such as China and India. The poll shows that IT professionals expect companies to help them advance, with two out of five staffers and managers anticipating further education and training, and 20% expecting to be remunerated for certification.
    Click Here to View Full Article

  • "Whatever Happened to Machines That Think?"
    New Scientist (04/23/05) Vol. 186, No. 2496, P. 32; Mullins, Justin

    The excitement generated by the field of artificial intelligence, and the support it garnered, waned dramatically in the 1990s as AI projects that promised to deliver convincing computer conversationalists, autonomous servants, and even conscious machines failed to pan out because their core principle--that computers can be made intelligent by human standards of intelligence--was flawed. This has prompted a reevaluation of what constitutes intelligence in many circles, although the Turing test still remains a key benchmark for gauging machine intelligence. The AI field was split into those who believe systems can become intelligent through symbolic reasoning, and those who favor biologically inspired approaches such as artificial neural networks and genetic algorithms. Carnegie Mellon University researcher Tom Mitchell is working to bridge the gap between these two approaches in his analysis of how the human brain reacts to spoken nouns and later verbs and sentences via functional magnetic resonance imaging. He thinks such research could perhaps lead to a mind-reading computer program. Meanwhile, former Stanford University computer scientist and AI research veteran Doug Lenat has been developing Cyc, an AI system that can learn by tapping a vast database of common-sense assertions, for over two decades. Lenat believes that once Cyc becomes freely available over the Web, the input contributed by millions of users will give the system the accumulated knowledge to correctly answer most questions within three to five years. Cyc is rekindling interest for AI, as are efforts in Europe, Japan, and America to build systems that can address uncertainty through statistical reasoning.
    Click Here to View Full Article
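
    Symbolic systems such as Cyc reason over large stores of common-sense assertions. The few lines below show the flavor of forward-chaining inference over such assertions; the facts and rules are invented examples, vastly simpler than Cyc's knowledge base or inference engine.

      facts = {("is_a", "Fido", "dog"), ("is_a", "dog", "mammal")}

      def apply_rules(fs):
          isa = [f for f in fs if f[0] == "is_a"]
          derived = set()
          # Transitivity: X is_a Y and Y is_a Z  ->  X is_a Z.
          derived |= {("is_a", x, z) for (_, x, y1) in isa for (_, y2, z) in isa if y1 == y2}
          # Invented common-sense rule: every mammal is warm-blooded.
          derived |= {("warm_blooded", x) for (_, x, y) in isa if y == "mammal"}
          return derived

      # Forward chaining: keep applying rules until no new assertions appear.
      while True:
          new_facts = apply_rules(facts) - facts
          if not new_facts:
              break
          facts |= new_facts

      print(("warm_blooded", "Fido") in facts)     # True: derived, never directly asserted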

  • "Does Trusted Computing Remedy Computer Security Problems?"
    IEEE Security & Privacy (04/05) Vol. 3, No. 2, P. 16; Oppliger, Rolf; Rytz, Ruedi

    Rolf Oppliger and Ruedi Rytz with the Swiss Federal Strategy Unit for Information Technology weigh the benefits and drawbacks of trusted computing, and conclude that the technology is unlikely to completely inoculate PCs against the threat of malware. Trusted computing initiatives are consistent in their basic principle to convert software-open computer systems into software-closed or software-controlled systems, which cannot be done without a secure, reliable bootstrap framework. Software-open systems are key to the PC explosion because they allow the operating system and the application software to be easily modified, upgraded, and extended; they are also key to PCs' insecurity, which threatens users' personal data as well as the security and availability of the Internet. The authors point out that commercial antivirus software is ineffective against detecting and eliminating unknown malware, while the ability to introduce malicious code at any point in the software life cycle complicates testing and detection. Not only that, but typical computer memory architecture stores programs and data in the same place, which enables malware to alter data and programs simultaneously. The separation of programs and data--a prerequisite for a more secure architecture--is also difficult. Trusted computing allows software to be authenticated and authorized to confirm its genuineness and integrity before execution, but the technology cannot ensure that software running on a computer system does not contain exploitable programming errors or malware; this situation makes trusted computing effective against manual malware execution, but useless against malware that takes advantage of glitches, flaws, and vulnerabilities in authorized software for its own purposes. The authors write that trusted computing-enabled systems are more easily securable, but their degree of protection reflects how the systems are designed and implemented.
    Click Here to View Full Article
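
    The authentication-before-execution idea Oppliger and Rytz describe can be illustrated with a hash allowlist: refuse to launch any binary whose digest is not known to be good. This is a deliberate simplification--real trusted-computing platforms anchor such measurements in hardware and cover the whole boot chain--and the allowlisted hash below is hypothetical.

      import hashlib
      import subprocess
      import sys

      # Hypothetical allowlist of SHA-256 digests of authorized programs.
      AUTHORIZED_HASHES = {
          "3f2acd1a9a7a43d3a2f0b6c1e8d9f0a1b2c3d4e5f60718293a4b5c6d7e8f9012",
      }

      def sha256_of(path: str) -> str:
          digest = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(8192), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      def run_if_authorized(path: str) -> None:
          """Execute the program only if its digest appears on the allowlist."""
          if sha256_of(path) not in AUTHORIZED_HASHES:
              raise PermissionError(f"{path} is not on the allowlist; refusing to execute")
          subprocess.run([path], check=True)

      if __name__ == "__main__":
          run_if_authorized(sys.argv[1])           # e.g. python check_then_run.py ./some_tool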


 