
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 727:  Monday, December 6, 2004

  • "Nano World: Software to Speed Nanotech"
    United Press International (12/03/04); Choi, Charles Q.

    Federal, corporate, and academic consortia are being launched to spur the development of software that can simulate the expected behavior of materials at the nanoscale in order to refine nanotechnology R&D. IDC says this approach could yield a return on investment of $3 to $9 on every dollar spent, while Gerhard Klimeck with the National Science Foundation-sponsored Network for Computational Nanotechnology reports that development time could be cut by 50%. Accelrys recently launched a consortium whose members include Corning, Fujitsu, Sweden's Uppsala University, and London's Imperial College to develop software that will scale up existing programs to simulate tens of thousands of atoms. Accelrys chief science officer Scott Kahn says current cutting-edge computer systems can accurately model smaller-than-nanoscale molecular systems, but says the consortium aims to provide "applications at the nanoscale for the engineer, who doesn't want to look at the molecular characteristics, but at those bits of information they can use, like their current requirements, electrical resistance [or] amount of heat given off." Klimeck's consortium is committed to developing open source nanotech software products. Klimeck says he hopes his consortium will serve as a center where researchers lacking proper experimental means can perform their trials over the network using a wide array of software hosted by the group. He estimates the rollout of software that could "significantly" affect the design of next-generation devices is about two years off, while about five years may pass until experimentally verified, commercial molecular electronics emerges. Klimeck foresees the consolidation of nanotech software firms in the next decade.
    Click Here to View Full Article

  • "Japan's Tech Industry Banks on the 'Cool Factor'"
    CNet (12/06/04); Kanellos, Michael

    Japan's tech sector is working to jump start a rise in its economic fortunes with a renewed emphasis on consumer electronics, spurred by a fundamental rethinking of Japanese business ethics to accommodate dramatic changes in the global economy. This reassessment has led to the Japanese adoption of offshore outsourcing and the forging of alliances and mergers with outside firms, while long-standing rivalries in such fields as processor and memory chip production are giving way to new partnerships in an effort to reduce costs and help fund the construction of manufacturing facilities. Consumer technologies considered to be most critical to Japan's economic resurgence include digital cameras, high-definition televisions and DVD players, and their associated components. ISuppli notes that Japanese firms dominate the digital-camera market, as well as the markets for digital-camera parts, but Korean, American, and other outside competitors are emerging with increasingly popular products such as camera cell phones and silicon sensors. Competition in the DVD industry is driving Japanese manufacturers to develop and roll out more advanced DVD recorders, though sales have been meager thus far. Japan is also making significant investments in emergent technologies in the hopes of getting a jump on rivals if the economic recovery continues. The National Science Foundation reports that approximately $900 million in government funding will go toward nanotechnology research this year, while NEC is seeking licensing revenues from companies that want to fabricate or sell products based on its patented single-wall carbon nanotube technology. Other new tech areas Japan is pushing itself to dominate include alternative energy products such as hybrid automobiles.
    Click Here to View Full Article

  • "Who's Recycling Techno Trash?"
    Associated Press (12/05/04); Simon, Ellen

    A four-year-old National Safety Council report found that less than 10% of electronic waste is being recycled; reasons provided for this figure include consumers' ignorance of recycling programs and pickup and drop-off locations, as well as bans on dumping electronic products into landfills because of their hazardous materials content. Silicon Valley Toxics Coalition executive director Ted Smith says his organization calculates that 60% to 80% of the e-waste sent in for recycling is exported to China, while Inform President Joanna Underwood notes that most cell phone recyclers spruce up discarded products and sell them in developing markets that lack a safe disposal infrastructure. Smith thinks electronics manufacturers need an incentive to design more environmentally friendly products, and his organization wants producers to be made responsible for taking back discarded products in order to spur them to eliminate hazardous materials from their wares. Among the solutions being developed or adopted is next-generation recyclable technology, such as compostable cell phone covers, and converting used products into new devices. Recyclers focused on increasing the volume of material they recycle include GreenDisk, which is collaborating with the U.S. Postal Service on a strategy to transport used electronics gear to postal processing centers in mail trucks after mail deliveries are concluded. Some manufacturers have embedded recycling into their business models, though these efforts sometimes keep a low profile. Hewlett-Packard, for example, builds scanners from new polymers and recycled soda bottles, while Motorola sponsors a program where consumers can download a prepaid postage label online for mailers that can contain used mobile phones from any manufacturer.
    Click Here to View Full Article

  • "The End of the Beginning"
    Federal Computer Week (12/06/04); Sarkar, Dibya

    America is entering a new stage in its response to the Sept. 11 terrorist attacks, in which officials are considering architectural issues rather than stop-gap security fixes. Technology upgrades are still needed at the perimeter, but government efforts need to prioritize core issues such as intelligence gathering, information sharing, basic communications interoperability, and improved management and processes. New York State Police staff inspector Tom Cowper says ambitious data-mining projects are needed, such as the Defense Advanced Research Projects Agency's Terrorism Information Awareness system; Cowper acknowledges the threats to civil liberties, and says civil libertarians and other groups need to be engaged so that an acceptable solution can be found. As it is, objections have stalled development completely and there is little effort to find ideological middle ground. SRA International CEO Ernst Volgenau says a decentralized electronic information analysis system might work better, and he proposes a framework that meshes news and other open information with intelligence from field operatives; the emphasis of such a system is on processing available data quickly. Battelle Memorial Institute is working on computer simulation technology that would forecast terrorist threats, based on what the government knows about such groups' organization and operations, and though such a system might not provide effective specific pointers, it would point government efforts in the right direction. Other experts say the government needs to adopt more streamlined and comprehensive procurement policies for homeland security, invest more in public and private technology research, and leverage new biometric identification technology.
    Click Here to View Full Article

  • "Former Cybersecurity Czar: Code-Checking Tools Needed"
    IDG News Service (12/02/04); Gross, Grant

    Former Department of Homeland Security (DHS) National Cybersecurity Division director Amit Yoran called for software assurance tools that look for flaws in software code at the E-Gov Institute Homeland Security and Information Assurance Conferences on Dec. 2. He estimated that around 95% of software bugs are caused by 19 "common, well-understood" programming errors, and said automated tools that sift through software code for such errors were a long-term priority of his division. Yoran noted that government research into software assurance tools is at a very early stage, and said he expects 10 or more years to pass before such tools are mature and widely adopted. As for why he left his job at DHS, Yoran commented that he lacked a clear objective as to what strategies to implement in order to secure cyberspace, while another problem was DHS' lack of operational or regulatory control over the Internet. Nevertheless, Yoran praised President Bush's National Strategy to Secure Cyberspace for bypassing federal regulation in favor of collaborating with the private sector on cybersecurity recommendations. He reported that Bush's plan, in combination with DHS initiatives, can facilitate a more effective strategy than the current approach of finding bugs, evaluating whether to fix them, and patching them when they become too problematic for companies. Yoran said he expects the cybersecurity arena to undergo a dramatic change in several years as Web services, radio frequency identification tags, remote Internet access, and other technologies are adopted by companies and federal agencies.
    Click Here to View Full Article
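
    To make the idea of automated code-sifting concrete, the short Python sketch below scans C source files for a handful of well-known risky calls and reports where they appear. It is only an illustration of the class of tool described above: the patterns, messages, and function names are assumptions chosen for this example, not the 19 common errors Yoran refers to or any shipping analysis product.

      # Toy code-sifting tool: flag a few widely known risky C library calls.
      # The pattern list and messages are illustrative assumptions only.
      import re
      import sys

      RISKY_PATTERNS = {
          r"\bgets\s*\(": "gets() has no bounds check (buffer overflow risk)",
          r"\bstrcpy\s*\(": "strcpy() copies without a length limit",
          r"\bsprintf\s*\(": "sprintf() can overflow its destination buffer",
          r"\bsystem\s*\(": "system() with untrusted input invites command injection",
      }

      def scan_file(path):
          """Return (line number, message) pairs for each risky pattern found."""
          findings = []
          with open(path, encoding="utf-8", errors="replace") as src:
              for lineno, line in enumerate(src, start=1):
                  for pattern, message in RISKY_PATTERNS.items():
                      if re.search(pattern, line):
                          findings.append((lineno, message))
          return findings

      if __name__ == "__main__":
          # Usage: pass one or more C source files on the command line.
          for path in sys.argv[1:]:
              for lineno, message in scan_file(path):
                  print(f"{path}:{lineno}: {message}")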

  • "Tom Hanrahan, Linux Engineering, OSDL"
    InternetNews.com (12/03/04); Wagner, Jim

    IBM Linux program manager Tom Hanrahan was appointed to head the Open Source Development Lab's (OSDL) Linux engineering effort in September, showing the extent to which big-business IT firms have embraced open source development. Hanrahan says open source development presents significantly different opportunities and challenges than proprietary software development. The availability of so many contributors accelerates the development process while ensuring it is thoroughly reviewed; at the same time, groups such as the OSDL can only influence, not dictate, which direction development heads. Hanrahan says IBM employees are trained upon entering the IBM Linux Technology Center about the open source model of software development so that they have a community-oriented mindset when participating in Linux projects. The OSDL provides important support and testing services to the Linux community and commercial developers who want to write Linux-compatible programs: The lab offers test information and equipment that people can use when they need to test on a specific architecture, for example. In addition, the OSDL provides independent software developers with guidance on where Linux development is headed so that they can better ensure compatibility when developing their own applications. Hanrahan wants Linux to continue its penetration of the enterprise markets, especially in financial services and telecommunications. In terms of desktop development, Linux needs to operate more fluidly in a heterogeneous environment in order to speed adoption; in addition, desktop Linux needs more support from software vendors and a larger pool of compatible applications.
    Click Here to View Full Article

  • "UNSW Inks Quantum Computing Pact"
    Computerworld Australia (12/01/04); Gedda, Rodney

    The Center for Quantum Computer Technology at the University of New South Wales (UNSW) has joined forces with IBM's Thomas J. Watson Research Center to collaborate on research and the development of commercial applications. UNSW's quantum computing center is a cooperative of eight universities in Australia, and will provide a wide range of views on how to proceed with a technology in which there are so many uncertainties. "By using this approach IBM is gaining access to not just one point of view but eight--plus the multiple points of view that are inside IBM," says Henry Chesbrough, executive director of the center for open innovation at the University of California, Berkeley, and author of "Open Innovation." John Harvey, corporate affairs executive at IBM Australia, notes: "We are doing an open innovation model where we are taking work from outside and combining it with our own for a better and lower-cost solution." Researchers at the quantum computing center plan to develop a quantum computing chip over the next three years. Center COO Dr. Richard Sharp expects quantum computing to have an impact on searches of large databases, engineering modeling, and sorting through genomics data.
    Click Here to View Full Article

  • "The Future of Innovation in the U.S."
    Always On (11/30/04); Russell, Chuck

    Chuck Russell with Collective Intelligence cites the observation that basic research spurs innovation, which in turn generates economic growth in the form of new high-tech jobs, new infrastructure technologies, and unique, tech-enabled business models. Indeed, several studies estimate that basic research is directly responsible for 45% to 75% of innovation, and the U.S. government was found to be sponsoring 85% of all basic research last year. However, Russell warns that government funding of basic research has been insufficient since 1980, and this trend must be reversed if the country is to retain its innovation leadership. He refers to a recent BusinessWeek survey of U.S. executives, 46% of whom called reduced research and development spending the biggest threat to innovation in the U.S. economy, followed by public education (45%) and corporate bureaucracy (36%). Federal funding for basic research experienced moderate growth in the 1990s, but federal spending levels for engineering and physical and mathematical sciences have fallen 0.09% since 1980 and now account for just 0.16% of GDP. Ninety-three percent of basic research funding in fiscal year 2004 will go to the departments of Health and Human Services, Defense, Energy, and Agriculture, along with NASA and the National Science Foundation. Harsh restrictions on discretionary spending mean that most federal basic research will experience budget gains that hardly surpass the projected 1.3% rate of inflation. Life sciences are expected to receive 54.3% of the research budget, followed by engineering (16.9%), physical sciences (10%), environmental sciences (7%), mathematics and computer sciences (5.2%), social sciences (2.2%), and psychology (1.9%); the remaining 2.5% will be apportioned to "other sciences."
    Click Here to View Full Article

  • "Robots Making Good on Futuristic Promise"
    Fort Wayne Journal Gazette (IN) (12/06/04); Voland, Gerard

    Gerard Voland, dean of Indiana University-Purdue University's School of Engineering, Technology, and Computer Science, writes that robots are becoming a pervasive presence throughout society, both in our homes and in our working lives. Robots have been playing a key industry role since the 1960s, performing jobs deemed too hazardous or boring for human workers, resulting in increased safety and productivity levels; robots are also being employed as surgical assistants, allowing surgeons to perform smaller, more accurate incisions and procedures that translate into less pain and faster recovery times for patients. Inexpensive, self-navigating machines have emerged for household use, such as automated vacuum cleaners and lawn mowers, while other domestic bots are designed specifically for entertainment purposes. Unmanned robots are also serving as explorers in space, military scouts, security measures, and early-warning disaster systems. Bipedal machines such as Sony's Qrio and Honda's ASIMO have rekindled visions of even more humanoid robots in the future tirelessly performing functions that may try the patience of humans, such as caregivers for the elderly. Voland expects future machines to exhibit new configurations and behaviors, based on the progress research labs are making in designing modular, shape-shifting robots that can change their form to carry out specific chores. He also foresees robots capable of automatically broadening their performance scope by incrementally meshing basic skills into new and more valuable functionality.
    Click Here to View Full Article

  • "Berners-Lee Maps Vision of a Web Without Walls"
    eWeek (12/02/04); Vaas, Lisa

    At the World Wide Web Consortium's (W3C) 10th birthday celebration on Dec. 1, W3C director Tim Berners-Lee outlined the Semantic Web, his vision of a future Internet providing a common architecture that transcends applications, enterprises, and community walls so that data can be shared and reused regardless of where it originates, using XML-based application integration and Uniform Resource Identifiers as its foundation. The W3C took significant steps toward realizing Berners-Lee's dream with its February 2004 release of recommendations for the Resource Description Framework and the Web Ontology Language, and recently started holding educational workshops geared toward probable Semantic Web early adopters. Berners-Lee reported that attendance was huge at an October workshop for the life sciences sector, because attendees knew the Semantic Web would be very beneficial for researchers who must regularly deal with a massive amount of data. "People [were] explaining why, in their area, when they started to use Semantic Web ideas, they could do things much more powerfully than they could before, when they realize they're communicating with people and trading information across the barrier," he noted. Berners-Lee drew similarities between the Semantic Web and enterprise application integration (EAI) systems, but saw one key difference--EAI system users may not necessarily retain access to data, while the technology provider definitely does. He said he hopes that "in the future, adapters won't be necessary--most products will come Semantically compatible." Existing prototype Semantic Web applications include MIT's Haystack, which allows users to work with data no matter where it comes from by eliminating the partitions separating repositories such as email clients, address books, calendars, and the Web.
    Click Here to View Full Article
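
    A small sketch of the data model behind this vision may help: the Semantic Web expresses facts as subject-predicate-object triples whose terms are URIs, so data from different applications can be merged and reused without custom adapters. The Python example below uses the third-party rdflib package; the example.org namespace, the resource names, and the "performedBy" property are invented for illustration and are not part of any W3C vocabulary.

      # Sketch of the triple-based data model: facts are (subject, predicate,
      # object) statements whose terms are URIs. Requires the third-party
      # rdflib package; all example resources below are invented.
      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import RDF

      EX = Namespace("http://example.org/lab/")        # hypothetical vocabulary
      FOAF = Namespace("http://xmlns.com/foaf/0.1/")   # widely used FOAF namespace

      g = Graph()
      researcher = EX["alice"]
      experiment = EX["experiment42"]

      g.add((researcher, RDF.type, FOAF["Person"]))
      g.add((researcher, FOAF["name"], Literal("Alice Example")))
      g.add((experiment, EX["performedBy"], researcher))

      # Serialize to Turtle so any RDF-aware application can reuse the data.
      print(g.serialize(format="turtle"))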

  • "At Museums, Computers Get Creative"
    New York Times (12/02/04) P. E1; Hafner, Katie

    Museums across the country are using computers to engage visitors rather than simply provide more static information. Computers have long had a place alongside museum displays, but were often little more than a repository for information that did not fit on an exhibit label, says Institute for Learning Innovation associate director Lynn Dierking. The San Francisco Exploratorium offers a popular "Energy from Death" exhibit that features beetles eating the carcass of a dead mouse, while officials at the science museum say a computer display next to the terrarium is just as popular; it provides pre-recorded video of the carcass over days and allows users to speed, slow, and pause the action. Other computer installations are similarly breaking new conceptual ground, such as interactive systems at the Powerhouse Museum in Sydney, Australia, that help people calculate the impact of their lifestyle on the environment. At the Museum of Tolerance in Los Angeles, a "voting theater" polls the audience and provides anonymous displays of how they answered questions. The National Archives in Washington, D.C., recently opened a new section meant to let visitors experience what it is like to explore backroom stores. A horizontal track runs in front of a series of boxes, and visitors move a screen along the track and get a virtual view into each box's contents; a box labeled "Kent State" provides video of demonstrations before four students were shot by National Guard troops in 1970, for example. New wireless handheld devices at the Getty Museum in Los Angeles remove clutter and allow visitors to create their own tours of the art museum.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Buildings Become Information Systems"
    Computerworld (11/29/04) Vol. 32, No. 48, P. 38; Weitzner, Daniel J.

    Discrete building maintenance and security functions are being integrated to allow those systems to interface with one another and other IT systems, writes Daniel Weitzner, principal research scientist at the MIT Computer Science and Artificial Intelligence Laboratory. Just as other enterprise systems are leveraging shared data, building security, energy management, and other functions are beginning to plug into enterprise IT frameworks to provide lower operating costs, more flexible workspaces, and physical infrastructure aligned with corporate organization and policy. Secure access control is made to better reflect policies by restricting workers' physical access according to their authorization levels, and RFID-type security systems can identify workers and keep logs for audit; security itself is enhanced by digital video monitoring. Though these measures make buildings more transparent and controllable, they also pose new and difficult privacy challenges that need to be addressed through a thoughtful combination of policy and design. Energy efficiency can be increased as devices and systems feed back usage data to a central system that optimizes power utilization, while building signage will also likely improve the usefulness of physical infrastructure by providing timely information to occupants as needed. All these functions are made possible by a mix of existing open data standards such as XML and SOAP message formats, as well as new standards such as an International Organization for Standardization digital control technology for buildings. With such capabilities, building systems will become just another class of information system with their own interfaces to other enterprise applications, but administrators need to plan ahead for how they will implement these technologies to best increase building functionality while preserving worker privacy. On the technology side, the control of physical infrastructure increases the need for robust and secure technology.
    Click Here to View Full Article
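
    The feedback loop described above, in which devices report usage data in open formats such as XML so a central system can optimize power, can be sketched in a few lines. In the hypothetical Python example below, the element names, the 50 kW threshold, and the shed-load policy are all assumptions made for illustration rather than any published building-automation schema.

      # Hedged sketch of a building-energy feedback loop: a device encodes a
      # usage reading as XML and a central service picks a control action.
      # Element names, threshold, and policy are illustrative assumptions.
      import xml.etree.ElementTree as ET
      from datetime import datetime, timezone

      def make_reading(zone, kilowatts):
          """Encode one usage sample as XML, as a device-side agent might."""
          reading = ET.Element("reading")
          ET.SubElement(reading, "zone").text = zone
          ET.SubElement(reading, "kilowatts").text = f"{kilowatts:.2f}"
          ET.SubElement(reading, "timestamp").text = datetime.now(timezone.utc).isoformat()
          return ET.tostring(reading, encoding="unicode")

      def decide(xml_text, limit_kw=50.0):
          """Central-system side: parse the reading and choose a control action."""
          reading = ET.fromstring(xml_text)
          zone = reading.findtext("zone")
          kilowatts = float(reading.findtext("kilowatts"))
          action = "shed-load" if kilowatts > limit_kw else "no-change"
          return zone, action

      message = make_reading("floor-3-east", 62.5)
      print(message)
      print(decide(message))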

  • "The Phone That Knows You Better Than You Do"
    New Scientist (11/27/04) Vol. 184, No. 2475, P. 21; Biever, Celeste

    An assistive cell phone that can anticipate user behaviors is on the horizon, thanks to the efforts of MIT researchers Sandy Pentland and Nathan Eagle. The phone can map out its user's lifestyle by logging when they make voice or text calls or employ alarm clocks or other phone applications, and deduce their socialization patterns with embedded Bluetooth radio links that measure the proximity of associates' own Bluetooth phones. Eagle says the phone's mobile message logging software, dubbed Context, can remind users of previous activities, the last time they saw a friend or associate, or when they met someone new; the software can also analyze the frequency of meetings with associates to ascertain the strength of user relationships. "No one has ever been able to capture this kind of data before, because cell phones and Bluetooth were never as ubiquitous," Eagle observes. Context, which was engineered at the University of Helsinki and the Helsinki Institute for Information Technology, logs the ID code of every Bluetooth chip it passes, the locale of each new phone mast it contacts, the number of people who are phoned or texted, and each incidence in which an application is employed. Every datum is time-stamped and sent to the user's network servers to be stored, and Context learns by cueing users to enter their location and activity each time the phone comes into the range of a new cell mast. MIT students are testing the system using Nokia 6600 smartphones with embedded Context; data recorded by the software is being constantly sent to a central MIT server, where a pattern-recognition program is calculating the likelihood of individual users' behavior, location, and socialization habits.
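
    As a rough illustration of the kind of inference described above, the Python sketch below counts time-stamped Bluetooth sightings per device and ranks "relationship strength" by how often and how recently each device was seen. The device IDs, timestamps, and scoring heuristic are invented assumptions; they are not the MIT group's actual data or model.

      # Toy relationship-strength estimator from time-stamped Bluetooth
      # sightings. All data and the scoring heuristic are illustrative.
      from collections import defaultdict
      from datetime import datetime

      sightings = [            # (Bluetooth device ID, timestamp) pairs, as logged
          ("phone:ab12", datetime(2004, 11, 30, 9, 5)),
          ("phone:ab12", datetime(2004, 12, 1, 12, 40)),
          ("phone:ee07", datetime(2004, 12, 1, 12, 41)),
          ("phone:ab12", datetime(2004, 12, 3, 18, 20)),
      ]

      def relationship_scores(log, now):
          """Score each device by sighting count, discounted by days since last seen."""
          seen = defaultdict(list)
          for device, stamp in log:
              seen[device].append(stamp)
          scores = {}
          for device, stamps in seen.items():
              days_since = (now - max(stamps)).days
              scores[device] = len(stamps) / (1 + days_since)
          return sorted(scores.items(), key=lambda item: item[1], reverse=True)

      for device, score in relationship_scores(sightings, datetime(2004, 12, 6)):
          print(f"{device}: {score:.2f}")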

  • "Examining the Proposed Internationalization of TLDs"
    CircleID (11/30/04); Seng, James

    In this rebuttal to John Klensin's previous suggestion that language translation be applied to top-level domains (TLDs) in order to solve the problem of Internationalized Domain Names (IDNs) without creating a new TLD for each language, James Seng argues that Klensin's proposal has five major flaws. First, Seng, the former co-chair of the IETF IDN Working Group, points out that the IETF IDN Working Group has already considered and thrown out a similar proposal. On technical grounds, Seng says Klensin's proposal would cause excess confusion and complexity because it would only call for translations of TLDs. Because IDNs are also used in second-level domains and various technical protocols, translating only TLDs would unnecessarily disrupt these protocols. Seng also takes exception to Klensin's assertion that maintenance of a 300-name translation table would be a relatively easy task for programmers, noting that the length of time it has taken browsers to accept IDNA is an indication of how difficult it would likely be to get the translation table up and running. On the issue of translation, Seng argues that languages are not simply determined by countries, but also vary according to localities, making it difficult for countries to agree on something as simple as the translation of the country name itself. Finally, Seng points out that Klensin's proposal, which calls for one company to control all names under the .com TLD in one language, will not be easily accepted by the global Internet community. Seng is currently the assistant director of the Infocomm Development Authority (IDA) of Singapore, responsible for the Next Generation Internet.
    Click Here to View Full Article
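
    Seng's technical objection can be illustrated directly: under the existing IDNA standard, every non-ASCII label of a domain name is encoded separately, so internationalization is not confined to the TLD. The short Python snippet below uses the interpreter's built-in "idna" codec on a made-up domain name.

      # IDNA encodes each non-ASCII label of a domain name separately, so
      # translating only the TLD would not spare the rest of the name.
      # The sample domain is invented for illustration.
      labels = ["bücher", "beispiel", "com"]    # hypothetical IDN: bücher.beispiel.com

      encoded = [label.encode("idna").decode("ascii") for label in labels]
      print(".".join(encoded))                  # -> xn--bcher-kva.beispiel.com

      # The second-level label still ends up encoded even if the TLD stays in
      # English, which is Seng's point about second-level domains and the
      # protocols that must handle full names.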

  • "Virus Names Could Be Standardized"
    Computer Business Review (11/25/04)

    Arbitrary naming of viruses results in confusion about whether an attack is a new virus or just a variation of a previous virus, and is an important issue currently facing the security industry. The recent mobile exploit for Internet Explorer 6 was named Bofra by some vendors, but designated a MyDoom variant by others because it shared code. US-CERT, the U.S. Computer Emergency Readiness Team, proposes the establishment of a malware identifiers database similar to the Common Vulnerabilities and Exposures (CVE) list currently managed by Mitre and sponsored by US-CERT. The database would assign virus identifiers, such as CME-1234567, for a standardized approach to viruses. The proposal even includes room for more media-friendly virus names, such as Blaster and Slammer, according to US-CERT. Currently, malware is named by the companies that discover it, and names are often derived from anything from file names to the content of email messages or plain text found in the code. In a letter to the SANS Institute, US-CERT representatives wrote that "By building upon the success of CVE and applying the lessons learned, US-CERT...hopes to address many of the challenges that the anti-malware community currently faces."
    Click Here to View Full Article
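
    A common identifier list would essentially be a shared cross-reference from vendor-specific names to one neutral ID. The toy Python lookup below illustrates the idea; the CME-style identifier format comes from the proposal described above, while the alias entries themselves are invented for illustration.

      # Toy cross-reference from vendor-specific malware names to a common
      # CME-style identifier. The entries are invented for illustration.
      COMMON_IDS = {
          "CME-0000001": {"Bofra.A", "MyDoom.AG", "W32/Bofra-A"},
      }

      def lookup(vendor_name):
          """Return the shared identifier for a vendor-specific name, if any."""
          for common_id, aliases in COMMON_IDS.items():
              if vendor_name in aliases:
                  return common_id
          return None

      print(lookup("MyDoom.AG"))   # CME-0000001
      print(lookup("Bofra.A"))     # CME-0000001 -- same incident, different vendor name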

  • "Plotting a Better Wireless Landscape"
    InfoWorld (11/29/04) Vol. 26, No. 48, P. 48; Gruman, Galen

    The reliable and robust performance of wireless local area networks (WLANs) will be a formidable challenge as broader networks are deployed and issues of quality of service (QoS), WLAN management, roaming, and mobile interoperability with other wireless technologies come to the fore. The least urgent of these four issues is interoperability, because the hardware does not yet exist; Proxim's Jeff Orr reports that achieving compatibility with 802.16 wide-area wireless will be easiest, since it uses Ethernet and SNMP and thus supports the same security mechanisms and protocols as 802.11. Traffic management will become a pressing issue with broader WLAN rollouts, particularly for networks with VoIP deployments and those in high-traffic zones. The IEEE is working on 802.11e, a QoS standard that will allow network administrators to prioritize user classes and application types on 802.11b networks, as well as help access points (APs) keep radio range and bandwidth usage optimized based on traffic patterns via standardized power settings. WLAN management becomes a problem as more and more APs are established, and the current solution for enterprises is to standardize on wireless APs, gateways, switches, and routers from one vendor. The IEEE is working on the 802.11s standard for managing wireless back-haul links and producing mesh networks that eliminate the need for each AP to support a direct connection to the wired LAN, while the IETF's CAPWAP taxonomy aims to build a common understanding of diverse WLAN management device protocols and interfaces so that vendors and managers can implement the most suitable options. The 802.11f standard allows roaming among APs on the same network segment, but 802.11 roaming is flimsy and streamed connections can be disrupted because of reauthentication during roaming. The 802.11r task group seeks to mitigate these problems with faster algorithms and preauthentication.
    Click Here to View Full Article
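
    The class-based prioritization at the heart of 802.11e can be pictured with a simplified scheduler: frames are sorted into the standard's four access categories and the highest-priority non-empty queue is served first. The Python sketch below is a conceptual illustration only; real EDCA arbitrates with per-category contention parameters rather than the strict-priority dequeue shown here.

      # Simplified illustration of access-category prioritization. Real 802.11e
      # EDCA uses per-category contention parameters, not strict priority.
      from collections import deque

      ACCESS_CATEGORIES = ["voice", "video", "best_effort", "background"]  # high -> low

      queues = {category: deque() for category in ACCESS_CATEGORIES}

      def enqueue(frame, category="best_effort"):
          queues[category].append(frame)

      def dequeue():
          """Send the oldest frame from the highest-priority non-empty category."""
          for category in ACCESS_CATEGORIES:
              if queues[category]:
                  return category, queues[category].popleft()
          return None

      enqueue("email packet")
      enqueue("VoIP frame", "voice")
      enqueue("backup traffic", "background")

      while any(queues.values()):
          print(dequeue())   # VoIP frame first, backup traffic last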

  • "A Conversation With Bruce Lindsay"
    Queue (11/04) Vol. 2, No. 8, P. 22; Bourne, Steve

    IBM Fellow and guiding force in relational database management systems development Bruce Lindsay explains that information processing will not be reliable and dependable unless you "engineer for failure at all the different levels of the system, from the circuit level on up through subsystems like the database or the file system and on into the application programs themselves." Lindsay categorizes a distributed system as either a system that uses another system for a specific function--such as Web services--or a distribution of machines cooperating on a single service, which is where most people focus on error detection and recovery; he sees two key components of collaboration within distributed systems--determining the various "players" on a system, and verifying that players deemed out of commission are indeed out of commission. Lindsay says file systems have evolved from recovering after emergency restarts with a full scan (the file system check) toward a database-inspired logging scheme that records metadata changes and then writes them to disk lazily. The advantages of this technique include a speedier emergency restart of file systems, since only the most recent changes have to be repeated or confirmed. "The decision on how to respond to errors--whether to give no answer or even a partial answer--really depends on what the answer is going to be used for and what the reaction of your client or customer is going to be to a wrong answer or lack of answer," remarks Lindsay. System error messages generally fall into two classes: a notice that the system anticipated the problem and can tell the user what it is (indicating a relationship between user input and the error), or a message that a bug exists and the system needs more information to diagnose it, which results in a large amount of information being dumped into the error log data-sets. It is Lindsay's belief that, overall, both software and hardware developers have not invested enough money in building systems that can accurately state and describe errors.
    Click Here to View Full Article
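
    The logging approach Lindsay describes can be reduced to a toy model: record each metadata change in an append-only log before applying it lazily, so an emergency restart only has to replay the unapplied tail of the log rather than scanning everything. The Python sketch below is a minimal illustration under those assumptions, not the design of any real journaling file system.

      # Minimal write-ahead-logging sketch: changes are logged first, applied
      # lazily, and recovery replays only the unapplied tail of the log.
      class JournaledStore:
          def __init__(self):
              self.log = []        # append-only (key, value) records, written first
              self.state = {}      # "on-disk" metadata, updated lazily
              self.applied = 0     # index of the first log record not yet applied

          def write(self, key, value):
              self.log.append((key, value))        # 1. log the change durably

          def flush(self):
              for key, value in self.log[self.applied:]:
                  self.state[key] = value          # 2. apply logged changes lazily
              self.applied = len(self.log)

          def recover(self):
              """After a crash, redo only the unapplied tail of the log."""
              replayed = len(self.log) - self.applied
              self.flush()
              return replayed

      store = JournaledStore()
      store.write("/home/a", "inode 12")
      store.flush()
      store.write("/home/b", "inode 13")           # "crash" before this change is applied
      print(store.recover())                       # replays just 1 record
      print(store.state)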

  • "Black Hole Computers"
    Scientific American (11/04) Vol. 291, No. 5, P. 52; Lloyd, Seth; Ng, Y. Jack

    Physicists posit that all physical systems store information merely by existing, and can process that information through dynamic evolution over time; by that reasoning, the universe is capable of computation. Physicists are extending that theory to black holes, based on their suspicion that information can escape from these stellar phenomena. A black hole's computational ability stems from information's quantum-mechanical nature, and a black hole's total memory capacity is proportional to the square of its computation rate. This notion dovetails with the holographic principle, which states that the maximum data storage capacity of any region of space appears to be proportional to its surface area rather than its volume. The practical operation of a black hole computer entails the encoding of data as matter or energy, sending it down the black hole--which performs the computation programmed into the data--and then capturing and deciphering the hole's Hawking radiation output with a particle detector. The exactitude with which the geometry of spacetime can be measured is limited by the same physical laws that govern the power of computers. This precision turns out to be lower than physicists used to think, which implies that the size of discrete "atoms" of space and time is bigger than anticipated. With these principles, it can be postulated that the universe is performing a kind of self-computation, computing all matter--from the smallest quantum fields to the most expansive galaxies--and thus mapping out its own spacetime geometry to the highest precision permitted by physical laws.
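
    The scaling claim above--memory capacity proportional to the square of computation rate--follows from two standard results, sketched here with numerical factors suppressed: the Bekenstein-Hawking entropy bounds the memory, the Margolus-Levitin theorem bounds the operation rate, and a black hole's area grows as the square of its mass while its energy grows only linearly with mass.

      I \sim \frac{S_{\mathrm{BH}}}{k_B \ln 2} \propto \frac{A c^3}{\hbar G} \propto M^2    (memory in bits)
      \nu \lesssim \frac{2E}{\pi \hbar} = \frac{2 M c^2}{\pi \hbar} \propto M               (operations per second)
      \Rightarrow \quad I \propto \nu^2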


 