
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 757:  Wednesday, February 23, 2005

  • "High-Tech Tension Over Illegal Uses"
    Washington Post (02/22/05) P. E1; Krim, Jonathan

    In March, the Supreme Court will hear the case of the Grokster file-trading service to determine whether the service bears any liability for digital piracy committed by its users. The entertainment industry argues that this piracy is Grokster's key source of revenue, as evidenced by the service's refusal to crack down on such activity; consumer electronics groups counter that if the court's decision favors the entertainment companies, hundreds of existing products could be affected and innovation could be strangled by the mere potential for misuse. "If it's so risky for me to try out new things or put new things on the market, you are really going to devastate people's willingness to innovate," warns Time Trax Technologies CEO Elliott Frutkin. The case will revisit the landmark Betamax decision of 1984, in which the court ruled that manufacturers of video recorders were not liable for unlawful acts of users provided the product was "merely capable" of substantial legal applications. Major tech companies have urged the Supreme Court to consider whether Grokster actively encourages and aids digital piracy rather than how often the service is used for unauthorized file-trading, but entertainment industry officials counter that such a standard would allow companies to dodge prosecution for practices that are tacitly accepted but not openly promoted. Technologists complain that "acceptable use" is a relative term in the entertainment industry. Consumer Electronics Association head Gary Shapiro likens the industry's credo that all unauthorized use of content constitutes infringement to living in a dictatorship in which "You are not breaking the law, but you want to keep your head down and not be noticed because the dictator randomly kills." Some small companies are trying to avoid any hint of liability by embedding copying controls in their file-sharing software, while some observers expect the final outcome to be the same regardless of the Supreme Court's decision: the democratization of access to information will make the organization and customization of content, rather than its generation, the chief industry driver.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "OASIS Patent Policy Sparks Boycott"
    CNet (02/22/05); Festa, Paul

    In response to the Organization for the Advancement of Structured Information Standards' (OASIS) revised patent policy, which the organization has promoted as a compromise to make its process more accommodating of open-source software developers, leading open-source and free-software proponents have signed an email calling for a boycott of specifications developed under the policy. "We want organizations like OASIS to develop policies so any group that wants to use an industry standard can know in advance whether or not someone's going to come along and reach into their pocketbook," says Rosenlaw & Einschlag attorney Lawrence Rosen. OASIS' patent policy revisions establish three modes for standards work: reasonable and nondiscriminatory (RAND) licensing, royalty-free (RF) on RAND terms, and RF on limited terms. In an interview, OASIS CEO Patrick Gannon says he doubts the email's signatories read the policy, insisting that royalty-bearing OASIS standards are relatively rare. He explains, "Our policy states that standards may incorporate work that is patented, but that they have to disclose it. And in almost all cases, that results in a royalty-free license for that work." Gannon also says many OASIS members are participants in open-source development who gave the revised patent policy a passing grade, adding that his organization has gone out of its way to have the policy reviewed by intellectual property lawyers. Advocates such as Bruce Perens label patents in standards bodies unfair, citing open source's substantial contributions to the industry; conversely, supporters of RAND-friendly policies argue that excluding royalty-bearing technologies harms standards.
    Click Here to View Full Article

  • "Added Reliability for Safety-Critical Software"
    IST Results (02/22/05)

    The IST-funded ATASDAS project has developed a toolkit for enhancing the reliability of safety-critical software. Project coordinator David Escorial of Spacebel says the tools identify the key software elements of the system architecture and facilitate analysis of software-hardware interaction. In addition to performing automated dependability analysis, the ATASDAS tools apply graph-theory heuristics and algorithms to describe the software system in terms of variables such as the software architecture, data-dependency graphs for critical inputs and outputs of parallel processes, a call graph for parallel processes, and appropriate system-characterization metrics. C/C++, Ada, Modula-2, and other languages employed in such applications are supported by the ATASDAS toolkit, which has been commercialized as imPROVE-C for distribution by TNI Software. The toolkit also recognizes reliability standards such as IEC 61508. Escorial notes that the toolkit "helps us to carry out the analysis in a rigorous automated way, using formal methods, and it also helps us to properly identify reliability sources." He says the toolkit will be marketed toward sectors where safety is a key concern, such as land and air transportation.
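    The summary mentions call graphs and dependency analysis; the sketch below is mine, not ATASDAS/imPROVE-C code (whose internals the article does not describe), and illustrates one such graph-theoretic check: walking a toy call graph backwards to find every component that can influence a safety-critical one.

```python
# Minimal sketch of call-graph reachability analysis of the kind the
# article describes; the graph, names, and criteria are illustrative,
# not ATASDAS/imPROVE-C's actual model.
from collections import defaultdict

def reverse_reachable(call_graph, critical):
    """Return every component whose execution can reach a critical one."""
    # Invert the call graph so we can walk from critical nodes to callers.
    callers = defaultdict(set)
    for caller, callees in call_graph.items():
        for callee in callees:
            callers[callee].add(caller)
    seen, stack = set(critical), list(critical)
    while stack:
        node = stack.pop()
        for parent in callers[node]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

calls = {
    "main": {"read_sensor", "actuate"},
    "read_sensor": {"checksum"},
    "actuate": {"checksum", "log"},
}
# Everything that can influence the safety-critical 'actuate' path.
print(reverse_reachable(calls, {"actuate"}))  # {'actuate', 'main'}
```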
    Click Here to View Full Article

  • "Software Gives Descriptive Directions"
    Technology Research News (03/02/05); Smalley, Eric

    MIT researchers have developed software that automatically generates directions by modeling the geographical relationships between spaces and their functions, incorporating landmarks into the directions as well. The Location Awareness Information Representation (Lair) software taps a database containing information about places, paths, and place functions. Places are defined by six properties: name; on (the paths the place lies along); star (the geometry of path intersections); view (other places visible from the location in question); contained (the larger places that incorporate the location); and function (the place's uses and the activities that can be performed there). The indoor walking directions Lair generates identify turns and confirm travel directions through the use of landmarks such as doors, and describe hallway intersections as well as the spaces a route traverses or passes. Walking directions are extracted from a map by arranging waypoints into sets representing segments of a path along the route and assigning turns where paths intersect. Lair uses landmark visibility to produce "you will see" phrases and keep directions short, and taps geographic similarities to decide where to segment instructions. Another tool, the Interactive Simulator for Lair Exploration, lets users make queries about places and routes, and could be used with handheld devices that track a user's whereabouts with an indoor equivalent of the Global Positioning System. The researchers are trying to better understand how people use their surroundings to make decisions in order to improve Lair, and MIT researcher Gary Look says they plan to analyze the quality of Lair's directions and to enable the software to define places and paths automatically from architectural drawings. The research, funded by MIT, was presented at the Intelligent User Interfaces conference (IUI'05) in January.
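    As a rough illustration of the six-property place model described above, here is a minimal sketch; the record layout and the sample place are invented for illustration and are not Lair's actual data structures.

```python
# Sketch of the six-property place model the article describes;
# the concrete values are invented examples, not Lair data.
from dataclasses import dataclass, field

@dataclass
class Place:
    name: str                                     # human-readable identifier
    on: list = field(default_factory=list)        # paths the place lies along
    star: dict = field(default_factory=dict)      # geometry of path intersections
    view: list = field(default_factory=list)      # other places visible from here
    contained: list = field(default_factory=list) # larger places enclosing this one
    function: list = field(default_factory=list)  # activities possible here

lobby = Place(
    name="Main Lobby",
    on=["Corridor A"],
    star={"Corridor A": ["Corridor B"]},  # Corridor B branches off here
    view=["Reception Desk", "Elevator Bank"],
    contained=["Building 32"],
    function=["wait", "ask for directions"],
)
# Landmark visibility drives the "you will see" phrases in generated directions.
print(f"Walk along {lobby.on[0]}; you will see the {lobby.view[0]}.")
```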
    Click Here to View Full Article

  • "Cornell Scientists Tackle 'Hard' Problems by Teaching Computers to Solve Tough Tasks the Human Way"
    Cornell News (02/21/05); Steele, Bill

    Cornell University Intelligent Information Systems Institute director Carla Gomes and associate computer science professor Bart Selman have developed new methods for solving hard "combinatorial" computer problems, which they detailed on Feb. 21 at the annual meeting of the American Association for the Advancement of Science in Washington, D.C. The combinatorial nature of these problems means the computer must find, from a large set of variables, the most effective combination of values to assign in order to satisfy specific constraints. The standard strategy is to try possible combinations and choose the one with the best outcome: the computer selects different mixes of value settings, building a continuously growing tree of possibilities, and repeats the process until all possibilities have been compared or a satisfactory answer is found. However, the search can sometimes commit to a branch of the tree that takes far too long to complete, behavior often seen when the computer tackles "heavy-tailed" phenomena such as chess or economic trends. Among the techniques Gomes and Selman have devised is one that finds a small number of core variables whose values can be set ahead of time; for instance, an airline scheduling problem with thousands of variables can become far easier if just a dozen of those variables are fixed in advance. Gomes notes that "Humans are very good at seeing the big picture and seeing what's critical." Real-world problems that Gomes believes could benefit from such strategies include power outage prediction and management.
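    A toy sketch of the core-variable idea follows (the problem and variables are invented; this is not Gomes and Selman's code): a naive enumeration over boolean variables tries far fewer assignments once a couple of core variables are fixed in advance.

```python
# Toy illustration of the "fix a few core variables first" idea the
# article describes: a naive search over boolean variables, counting
# how many assignments it tries. The constraint problem is invented.
from itertools import product

def satisfies(assign):
    # An arbitrary toy constraint: a XOR b, and c implies d.
    return assign["a"] != assign["b"] and (not assign["c"] or assign["d"])

def search(variables, preset):
    tried = 0
    free = [v for v in variables if v not in preset]
    for values in product([False, True], repeat=len(free)):
        tried += 1
        assign = dict(preset, **dict(zip(free, values)))
        if satisfies(assign):
            return assign, tried
    return None, tried

vars_ = ["a", "b", "c", "d"]
_, n_blind = search(vars_, {})                      # explore all combinations
_, n_core = search(vars_, {"a": True, "b": False})  # two core variables fixed
print(f"tried {n_blind} assignments blind, {n_core} with core variables set")
```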
    Click Here to View Full Article

  • "Computer Vulnerabilities Given Unified Rating System"
    New Scientist (02/21/05); Biever, Celeste

    A group of software and security companies that includes Microsoft, Qualys, Cisco Systems, and Symantec has developed the Common Vulnerability Scoring System (CVSS) as part of its responsibilities to the Homeland Security Department's National Infrastructure Advisory Council. CVSS, which was unveiled at the RSA Security Conference on Feb. 17, offers a harmonized system for rating the severity and urgency of computer vulnerabilities. Each company's unique vulnerability scoring scheme currently causes confusion among systems administrators, a problem CVSS seeks to address. Counterpane Security's Bruce Schneier says, "We need a way to prioritize [vulnerabilities] and to know in real time which ones are important." The baseline CVSS metric evaluates flaws according to seven traits, including how much access to sensitive hard-drive data a vulnerability gives a hacker, how much it lets a hacker alter or destroy data, and whether it enables a hacker to crash the system. Another CVSS metric assesses vulnerabilities by age, as exploits are more likely to have been developed for older flaws. Qualys intends to include CVSS scores with the vulnerabilities it regularly publicizes in the SANS Top 20 newsletter. Though Core Security Technologies' Ivan Arce approves of the CVSS concept, he cautions that software vendors could introduce bias by exaggerating CVSS ratings in vulnerability alerts concerning their own products.
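    As a rough sketch of trait-based scoring, the toy function below combines impact traits of the kind the article lists into a 0-10 score; it is emphatically not the published CVSS formula, whose weights and metrics the article does not give.

```python
# Hypothetical severity score combining the kinds of traits the article
# lists (confidentiality, integrity, availability impact); this is NOT
# the real CVSS formula, just a sketch of what trait-based scoring means.
def toy_severity(confidentiality, integrity, availability, remotely_exploitable):
    """Each impact rated 0.0 (none) to 1.0 (complete); returns 0-10."""
    impact = (confidentiality + integrity + availability) / 3
    exposure = 1.0 if remotely_exploitable else 0.6  # invented weighting
    return round(10 * impact * exposure, 1)

# A remotely exploitable flaw that fully compromises data and uptime.
print(toy_severity(1.0, 1.0, 1.0, True))   # 10.0
# A local-only flaw that can crash the system but not read or alter data.
print(toy_severity(0.0, 0.0, 1.0, False))  # 2.0
```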
    Click Here to View Full Article

  • "Health Industry Under Pressure to Computerize"
    New York Times (02/19/05) P. B1; Lohr, Steve

    Speaking at the Healthcare Information and Management Systems Society conference, national health information technology coordinator David Brailer told health industry officials and technology vendors that they faced a government mandate if they could not voluntarily agree on electronic patient record standards. The Bush administration, Congress, and the industry itself see electronic patient records as necessary because they would deliver cost savings of as much as 10 percent for total health care spending, reduce medical errors, and improve medical diagnosis and treatment effectiveness. The Certification Commission for Healthcare Information Technology was formed last year from vendors, consultants, and health care institutions to work out standards for such electronic records, but commission member Wes Rishel said the industry needed the government to act as a decision-maker in case no compromise can be reached on standardization. Much of the health care industry is high-tech, especially in revenue-generating areas such as diagnosis, surgery, and treatment, but the IT side is woefully under-invested. For example, whereas the banking industry spends almost $15,000 on IT per worker, per year, the health care industry spends just about $3,000 annually per worker. The large percentage of small physician practices in the health care industry could also scuttle any attempt to establish nationwide electronic patient records, since a recent study pegged the price for a small provider to set up electronic records at $30,000. The government could provide incentives in the form of federally backed loans or extra Medicare reimbursement, said Brailer. Right now, only about 10 percent of the health care industry uses electronic patient records, though not according to a standardized framework. When roughly half the industry adopts standardized electronic records, the realized benefits will help drive adoption in the rest of the industry, Brailer said.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Two UO Professors Study Multi-Tasking, Computer Efficiency"
    Oregon Daily Emerald (02/18/05); Sylwester, Eva

    University of Oregon professor Ulrich Mayr notes that multi-tasking diminishes productivity, and he and computer and information science assistant professor Anthony Hornof have researched the problem in detail. "As soon as the computer starts giving you tasks and you start accepting tasks from the computer, people start thinking they can handle more distractions than they really can," observes Hornof, who adds that awareness of computer distractions is growing. Mayr studies how long people take to switch between tasks by having subjects perform simple computer activities in a specified sequence; he concludes that switching back and forth between even a small number of tasks can be difficult because suppressing one task increases the time it takes a person to return to it. Still, Hornof reports that students are particularly good at certain kinds of multi-tasking, such as listening to a lecture while simultaneously taking notes. Mayr finds that the similarity between two tasks affects how efficiently they can be performed concurrently, citing the difficulty of driving a car to one destination while telling someone by phone how to get to a different location. Computers involve constant decision-making that can dramatically affect productivity, and Mayr suggests users could be more productive by deactivating email alerts and setting aside dedicated time to check email. "Ultimately the responsibility is on the user to develop their ability to focus on one task at a time if they really mean to be productive in each one of those tasks," stresses Hornof, who recommends yoga, meditation, and similar activities as techniques to improve concentration.
    Click Here to View Full Article

  • "Virtual Reality in the Round"
    Fraunhofer-Gesellschaft (02/24/2005)

    The Fraunhofer Institute for Computer Architecture and Software Technology (FIRST) will exhibit a cylindrical display system at the upcoming CeBIT 2005 event, showcasing a breakthrough in virtual reality technology. "When modeling virtual objects, designers, architects and engineers will no longer be surrounded by three-dimensional images as before--they can now install a true-to-life simulation in the display column, walk around it and work on it," boasts FIRST scientist Ivo Haulsen. "This gives the impression that the object they are working on is a hologram." The pillar, which is 1.6 meters in diameter and two meters tall, builds on X-Rooms technology and is chiefly geared toward advertising and presentation applications. Haulsen remarks that a key breakthrough in the virtual column's development is the successful rear projection of imagery across the cylinder's entire circumference, with special software to correct distortions via automatic calibration. The system also eliminates the problems of uneven coloring, brightness, undesirable overlaps, and time-consuming fine adjustments. In the lower section of the column are eight commercial projectors and four mirrors that deliver the rear-projected image, while the upper section is enclosed by a semitransparent viewing surface. Five standard computers equipped with the calibration software control the column, and 3D effects are facilitated using stereoscopic glasses that employ dual projection.
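    One step any such multi-projector calibration must perform is edge blending, so that the overlapping light from adjacent projectors sums to uniform brightness around the cylinder. The sketch below is an illustrative guess at that step (the slice and overlap angles are invented), not FIRST's actual calibration software.

```python
# Sketch of the edge-blending step multi-projector calibration performs:
# each of 8 projectors covers a 45-degree slice of the cylinder plus a
# small overlap, and weights ramp down in the overlap so the doubled
# light sums to one. Angles and overlap width are invented.
N_PROJECTORS = 8
SLICE = 360 / N_PROJECTORS   # 45 degrees per projector
OVERLAP = 5.0                # degrees of shared coverage at each seam

def blend_weight(theta, projector):
    """Brightness weight of `projector` at cylinder angle `theta` (degrees)."""
    center = projector * SLICE + SLICE / 2
    # Angular distance from the projector's slice center, wrapped to [-180, 180).
    d = abs((theta - center + 180) % 360 - 180)
    inner = SLICE / 2 - OVERLAP / 2              # fully owned region
    if d <= inner:
        return 1.0
    if d >= SLICE / 2 + OVERLAP / 2:
        return 0.0
    return (SLICE / 2 + OVERLAP / 2 - d) / OVERLAP  # linear ramp in overlap

# At the seam between projectors 0 and 1, the blended weights sum to 1.
theta = 45.0
total = sum(blend_weight(theta, p) for p in range(N_PROJECTORS))
print(round(total, 3))  # 1.0
```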
    Click Here to View Full Article

  • "He Paved the Way for the PC Revolution"
    Mercury News (02/21/05); Ha, K. Oanh

    Douglas Engelbart, whose pioneering work on PC workstations gave rise to the computer mouse, video conferencing, email, and hyperlinks, has earned a place in the Silicon Valley Engineers Hall of Fame, into which he will be formally inducted on Feb. 23. In a recent interview, Engelbart says the impetus for his work was "to increase [mankind's] collective IQ" through computing-facilitated collaboration as a way to solve the world's problems, which are too complex for any one person to address. The researcher estimates that we are in a very early stage of this revolution, noting that the creation of the Web and increased collaboration are steps in the right direction. However, Engelbart says the computing paradigm remains largely unchanged, illustrating his point by noting that people still view digital content as entire pages rather than in finer-grained pieces. To bring his vision of tapping the collective IQ closer to fruition, Engelbart proposes the construction of a "dynamic knowledge repository" that stores the various thoughts representing the best human understanding of a situation, giving users a comprehensive, integrated view of an argument, including the perspectives of both sides and the evidence they offer. Engelbart says people must adopt a strategy similar to the boot-up mechanism of a computer, in which a small amount of memory is retained after shutdown so that the machine can restart and tap additional memory to become more capable. "If you can reach into the future and get some new stuff, you can use that to your advantage and make the future even more effective," he argues. The goal of the Bootstrap Institute, which Engelbart founded, is to help people tackle global problems through collaboration.
    Click Here to View Full Article

  • "Fight Over 'Forms' Clouds Future of Net Applications"
    CNet (02/17/05); Festa, Paul

    A breakaway working group of the World Wide Web Consortium (W3C) is nearly finished with the Web Forms 2.0 specification and will soon submit the draft to the W3C membership, forcing discussion of the difficult issue of electronic forms. The Web Hypertext Application Technology Working Group (WHAT-WG) includes browser makers Apple Computer, Opera Software, and the Mozilla Foundation--all of whom want to avoid a situation in which the competing XML-based XForms technology requires an entirely new generation of browsers. Participants on both sides of the debate are wary of proprietary application platforms. Opera chief technology officer and WHAT-WG founder Hakon Lie says XForms, which was finalized as a W3C recommendation in 2003, is not worth having to replace all Web browsers: "The XForms group tried to do the right thing, but as a result they dropped backwards compatibility," he says. The WHAT-WG draft proposal will force W3C members to decide whether to back evolutionary standards development or push revolutionary new standards. W3C HTML and forms working groups chair Steven Pemberton compares XForms to HTML 4, which ran in old software but required new software to add functionality. Pemberton says Web Forms 2.0 is unsuitable for serious applications and was created from the browser makers' point of view, and that forms experts such as Oracle, Sun Microsystems, IBM, and Novell have XForms implementations that provide a solid basis for future e-commerce. The forms dispute threatens to unravel the push to create an open-standards foundation for Internet applications; standard Web forms technology lies at the heart of many key Internet functions, but updates are needed to keep pace with modern application demands such as integration with back-end databases.
    Click Here to View Full Article

  • "California Researchers Collaborate With Perlegen Sciences on Map of Human Genetic Variation Across Populations"
    UCSD News (02/17/05); Ramsey, Doug

    Detailed in the Feb. 18 issue of Science is a study by researchers at Perlegen Sciences, the University of California, San Diego's California Institute for Telecommunications and Information Technology (Calit2), and the UC Berkeley-affiliated International Computer Science Institute (ICSI) in which genetic variation was mapped across three distinct human populations. Perlegen sequenced the single-letter variations in the DNA of 71 people of African American, European American, and Han Chinese American origin, and the more than 100 million genotypes produced by the sequencing were analyzed by Calit2 and ICSI. ICSI researcher Eran Halperin and Calit2 researcher Eleazar Eskin co-created the HAP software tool for converting genotypes into haplotypes and partitioning the human genome into "blocks" of limited diversity; the analysis found more blocks in the African American map than in the Han Chinese or European American maps, implying that the samples of African American descent displayed the greatest genetic diversity. Calit2's OptIPuter cluster processed the vast amount of data in less than 12 hours. Perlegen researchers designed a mathematical algorithm to identify "tag" single-nucleotide polymorphisms (SNPs) that serve as guides for finding common variations in the human genome. "We've effectively figured out how to reduce the genotyping burden by identifying a reduced set of tag SNPs, thus decreasing the difficulty and cost of association studies," reports Perlegen scientist David Hinds. Perlegen chief scientific officer David Cox says the project represents a significant step in the search for genetic markers affiliated with Alzheimer's, cancer, and similar diseases. "Genome-wide analysis may soon become a standard methodology in the search for more effective, individualized treatments," he says.
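    The tag-SNP idea can be sketched as a greedy covering problem: pick a small subset of SNPs whose patterns stand in for the rest. The toy code below uses invented data and a deliberately crude notion of "cover"; Perlegen's actual algorithm is more elaborate and is not described in the article.

```python
# Toy greedy sketch of the general tag-SNP idea: choose a small subset
# of SNPs that "covers" the rest (here, cover = identical toy genotype
# pattern across samples). Data and SNP names are invented.
def pick_tags(snps):
    """snps: {snp_name: genotype_pattern}; returns a small tag set."""
    tags, covered = [], set()
    while len(covered) < len(snps):
        # Greedily take the SNP whose pattern matches the most uncovered SNPs.
        best = max(
            (s for s in snps if s not in covered),
            key=lambda s: sum(1 for t in snps
                              if t not in covered and snps[t] == snps[s]),
        )
        tags.append(best)
        covered |= {t for t in snps if snps[t] == snps[best]}
    return tags

snps = {
    "snpA": "AABB", "snpB": "AABB", "snpC": "ABAB",  # snpA/snpB move together
    "snpD": "ABAB", "snpE": "BBAA",
}
print(pick_tags(snps))  # ['snpA', 'snpC', 'snpE'] tags all five SNPs
```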
    Click Here to View Full Article

  • "Researchers Find Security Flaw in SHA-1 Algorithm"
    IDG News Service (02/16/05); Roberts, Paul

    Chinese university researchers have discovered a technique that significantly improves the chances of finding collisions in the SHA-1 hash algorithm. SHA-1 (Secure Hash Algorithm) is used by a wide range of companies and generates unique fixed-length fingerprints of data that underpin digital signatures, says Counterpane Internet Security founder Bruce Schneier. The new technique finds collisions--occurrences of two different inputs producing the same fingerprint--roughly 2,000 times faster than a brute-force search. Although SHA-1 remains considerably difficult to crack in practice, and the attack could take a single personal computer up to 1,000 years to carry out, the research is likely to prompt a rethinking of SHA-1's safety among cryptographers. Similar research has previously helped cryptographers crack other algorithms, and Schneier wonders whether consumers would be satisfied with a hash function that protects data 999 times out of 1,000. The researchers--Xiaoyun Wang and Hongbo Yu of Shandong University, and Yiqun Lisa Yin--are expected to publish their paper, "Collision Search Attacks on SHA-1," on the International Association for Cryptologic Research Web site.
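    The reported speedup is consistent with the attack's known parameters: a brute-force birthday search on SHA-1's 160-bit output costs roughly 2^80 hash operations, and the new technique reduces that to about 2^69.

```python
# The ~2,000x figure follows from the attack's known work factors: a
# birthday-bound collision search on SHA-1's 160-bit output costs about
# 2**80 hash operations; the Wang-Yin-Yu attack costs about 2**69.
brute_force = 2 ** (160 // 2)  # birthday bound: 2**80 operations
attack = 2 ** 69               # cost reported for the new technique
print(brute_force // attack)   # 2048, i.e. roughly 2,000 times faster
```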
    Click Here to View Full Article

  • "Bridging the Digital Divide"
    Guardian Unlimited (UK) (02/17/05); Witchalls, Clint

    MIT Media Lab director Nicholas Negroponte and colleagues want to create a $100 laptop that would provide computer and Internet access to schoolchildren in the developing world. Negroponte has extensive experience providing IT access to impoverished areas such as Costa Rica, India, and Senegal, and his family recently set up an English-language and IT skills school in Cambodia that relies on networked laptops. Besides relying on open-source software, the MIT project is investigating many novel technologies that will help drive down the cost of the laptops and enhance their utility: In order to avoid pricey LCD screens, the group is developing a flat rear-projection screen and is also considering electronic ink technology pioneered by MIT researcher Joseph Jacobson; the laptops will also be mesh-enabled for greater resource sharing--one machine would act as the print server, another as a mass storage device, and another as a DVD player, for example. Flash memory will replace hard drives in the laptops in order to increase their durability, and the researchers are even considering whether a collection of laptops could form a phased-array antenna that would allow satellite Internet access. Another novel technology is generating "parasitic power" from keystrokes. Negroponte says technology is essential to addressing basic needs because it enables cheaper and better education, which is the key to erasing poverty; and while desktop systems would be cheaper and easier to maintain, laptops are of much greater benefit to learning, both in school and at home. And while the MIT team is working on the technical details, Negroponte blames telecom monopolies in the developing world for hindering greater advances. He suggests making telecom deregulation a condition for World Bank loans.
    Click Here to View Full Article

  • "U.N. to Control Use of Internet?"
    WorldNetDaily (02/22/05)

    A panel created by the United Nations is addressing the issue of Internet governance ahead of the World Summit on the Information Society, which is slated to be held in Tunis in November. Cybercrime, spam, and other issues are on the panel's agenda. On Monday, members of the panel expressed hope that an international system could be created in which the United Nations would govern the Internet. Countries in the developing world are calling for the International Telecommunication Union (ITU) or some other UN body to govern cyberspace, including issues such as domain names. These countries believe the ITU is better suited to Internet governance than current Internet overseer ICANN, claiming that ICANN is under the control of the United States. "There is an issue that is out there and that needs to be resolved," said Nitin Desai, special adviser to U.N. Secretary-General Kofi Annan. The first World Summit on the Information Society was held in 2003, and French Prime Minister Jean-Pierre Raffarin used the forum to push for an international form of Internet governance that would establish international rules that citizens could rely on.
    Click Here to View Full Article

  • "New-Look Passports"
    Economist (02/19/05) Vol. 374, No. 8414, P. 75

    The United States seeks to improve homeland security by mandating biometric passports equipped with digital photos, digitized fingerprints, and iris scans, but concerns about privacy infringement, reliability and interoperability problems, and a rushed implementation raise serious doubts about the technology's effectiveness. The computer chips the passports are to carry are intentionally designed for remote readability, and the data on them will be deliberately unencrypted. The first measure means a passport can be read without its bearer's knowledge, while the second means that anyone with a suitable reader--including identity thieves and terrorists--can access the passport's information. Addressing the second problem, ironically enough, would cancel out the benefit of remote readability. Possible safeguards include enclosing the passport chip within a Faraday cage, or locking the chip electronically. The interoperability of biometric passport technology is also questionable, as each country will choose its own chip manufacturers, and the common standard to which the passports will be designed remains unsettled. Reliability is a further concern, as the significant error rates of facial-, fingerprint-, and iris-recognition technology raise the risk of false positives at border crossings and other points of entry.
    Click Here to View Full Article

  • "New Way of Wireless"
    CIO (02/15/05) Vol. 18, No. 9, P. 87; Gruman, Galen

    The latest iterations of the global system for mobile communications (GSM) and code division multiple access (CDMA) cellular networks deliver significant speed increases, while other wireless broadband offerings promise even higher throughput; but sorting through the morass of high-speed wireless technologies to choose the best option requires careful consideration. GSM offers less throughput than CDMA, but has the advantage of worldwide support. GSM's next step, the Enhanced Data GSM Environment (EDGE), provides 50 Kbps to 200 Kbps of throughput, while UMTS offers speeds of 100 Kbps to 350 Kbps; CDMA's latest permutation, CDMA2000 1xEVDO, supports 100 Kbps to 300 Kbps, and Verizon plans to take EVDO national by year's end. Further out are GSM's high-speed downlink packet access (HSDPA) and high-speed uplink packet access (HSUPA) extensions, which should boost throughput beyond 500 Kbps. Flarion Technologies' Flash orthogonal frequency division multiplexing (OFDM) promises real-world peak speeds of 1 Mbps for mobile users and over 1 Mbps for stationary users, but the scheme's need for licensed spectrum and for new transmitters and receivers at cell sites may discourage carriers. Fixed WiMax reportedly delivers subnet roaming, quality of service, multiple access queues, and peak connection speeds of 10 Mbps over five miles, and can be deployed in unlicensed or licensed spectrum; it is also less vulnerable to disruption by environmental factors than free-space optical technologies. But fixed WiMax-enabled chipsets are not expected for some time, and the mobile WiMax standard is not due to be finalized until next year. Fixed WiMax is generally considered the most sensible wireless broadband solution for enterprises that wish to link remote users to backbones and regional connectivity.
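    Some quick arithmetic on the quoted figures puts the options in perspective: the time to move a 5 MB file at each technology's upper quoted speed (real-world rates will vary).

```python
# Quick arithmetic on the throughput figures quoted in the article:
# time to move a 5 MB file at each technology's upper quoted speed.
# (Figures are as quoted; real-world rates vary widely.)
FILE_BITS = 5 * 8 * 1_000_000  # 5 megabytes in bits

peak_kbps = {
    "EDGE": 200,
    "UMTS": 350,
    "CDMA2000 1xEVDO": 300,
    "HSDPA/HSUPA": 500,       # quoted as "beyond 500 Kbps"
    "Flash-OFDM": 1_000,
    "Fixed WiMax": 10_000,
}
for tech, kbps in peak_kbps.items():
    seconds = FILE_BITS / (kbps * 1_000)
    print(f"{tech:18s} {seconds:7.1f} s")  # EDGE ~200 s ... WiMax ~4 s
```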
    Click Here to View Full Article

  • "Postmodern Software Development"
    Internet Computing (02/05) Vol. 9, No. 1, P. 4; Filman, Robert E.

    Robert Filman of the Research Institute for Advanced Computer Science (RIACS) at NASA Ames draws parallels between the evolution of art and the evolution of software development, with object-oriented programming representing the computer science equivalent of modernism. He says the programming equivalent of postmodernism could overcome a number of object orientation's shortcomings. Code carries all meaning in object-oriented software, which the author characterizes as "notoriously unreliable and nonautomatable"; type systems, however, allow meaning to be extracted outside of code production, and Filman expects annotation in programming to be used more richly and linked to program analysis, comprehension, and tuning in the future. He writes that current technologies give developers little option but to disperse crosscutting concerns throughout a system, but postmodern programming environments could permit the construction of working systems while also supplying mechanisms for expressing these concerns discretely. To address poorly functioning software, Filman suggests that postmodern systems could be built with tools that handle unpredictable failures by treating such malfunctions as the norm rather than the exception. Postmodern programming would also provide ways to define and sustain complex collections of elements. Whereas current programming languages let users make an inquiry and receive a response, postmodern software might augment user-system interaction with alternative techniques such as context-sensitive evaluations, event-based systems, and conversational communication. Finally, Filman thinks postmodern programming-language analysis could achieve a balance between universal, generic program-communication techniques and domain- or style-specific notations, a balance dramatically different from today's programming-language analysis.
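    The crosscutting-concerns point is the territory of aspect-oriented programming; the Python sketch below (my illustration, not Filman's) states a failure-handling concern once, as a decorator, instead of dispersing retry logic through every function--and it also treats failure as the norm rather than the exception, in the spirit of his suggestion.

```python
# Illustration (mine, not Filman's) of stating a crosscutting concern
# once instead of dispersing it: a decorator that treats failure as the
# norm and retries, applied declaratively to any function that needs it.
import functools
import time

def resilient(retries=3, delay=0.1):
    def wrap(fn):
        @functools.wraps(fn)
        def run(*args, **kwargs):
            for attempt in range(retries):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == retries - 1:
                        raise  # out of retries; let the failure propagate
                    time.sleep(delay)
        return run
    return wrap

@resilient(retries=5)
def fetch_reading(sensor_id):
    ...  # body that may fail unpredictably; the retry policy lives elsewhere
```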


 