
ACM TechNews is sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 662:  Monday, June 28, 2004

  • "One Small Step in Uphill Fight as Linux Adds a Media Player"
    New York Times (06/28/04) P. C1; Lohr, Steve

    Novell and Red Hat are the latest Linux companies to announce the inclusion of RealNetworks' media player in their desktop Linux distributions. The move bolsters Linux's position on the desktop against Microsoft, which has come under legal attack in Europe because it bundles its media player with its desktop operating system. The European Commission recently ruled that Microsoft did not have to unbundle its product until the legal proceedings had been completed. Asian distributor TurboLinux and Sun Microsystems have already signed distribution agreements with RealNetworks, and RealNetworks' Dan Sheeran says the RealPlayer 10 software has become the de facto media player standard for the Linux desktop. The Linux-specific version of RealPlayer builds on RealNetworks' open-source Helix development effort, which mainly focuses on media software for mobile phones, and also includes proprietary codecs used for encoding and decoding digital media files. Open-source proponents want to create a rich application environment like the one Microsoft has built up over two decades, and Linux already has a number of mainline applications for word processing, Web browsing, email, and data presentation. Open Source Development Labs CEO Stuart Cohen says the inclusion of RealPlayer shows mainstream software is coming to the Linux desktop sooner than most people had expected. Experts predict Linux, formerly the domain of technicians, will see quick adoption among ordinary users: Gartner estimates current penetration at 1 percent of all desktop operating systems, compared to 2.8 percent for Apple's Macintosh and 96 percent for Windows.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Scientists Help Create Spacecraft That Think for Themselves"
    Newswise (06/24/04)

    Machine-learning and pattern-recognition software that can be applied to a wide array of spacecraft is an area of research for scientists at the Jet Propulsion Laboratory (JPL), Arizona State University (ASU), and the University of Arizona (UA). The Autonomous Sciencecraft Experiment (ASE) is currently focused on developing smart software for NASA's EO-1 satellite so that the satellite can relay the most timely data first while keeping less-timely data in reserve for later transmission. Felipe Ip, a Ph.D. student in UA's Hydrology and Water Resources Department, is developing software that can detect floods; it works by comparing the satellite's camera images to images recorded in its computer memory, and any significant deviations between the compared images cause the satellite to alert scientists and take more pictures (see the sketch below). The software has already detected flooding on the Diamantina River in Australia, and in July it will operate almost completely autonomously in a second round of testing. The JPL and ASU researchers' software efforts concentrate on the detection of volcanic activity and ice field changes, respectively. The ultimate goal of the ASE is the detection, mapping, and analysis of similar events throughout the solar system by smart spacecraft. "By using smart spacecraft, we won't miss short-term events such as floods, dust storms, and volcanic eruptions," notes Ip. "Finally, instead of sifting through thousands of images, I can actually get some sleep at night, knowing that a smart robot is on alert twenty-four-seven."
    Click Here to View Full Article
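
    The change-detection step Ip describes can be illustrated with a minimal sketch: compare a fresh image against a stored reference and raise an alert when enough pixels differ. The Python below is an illustrative toy under that assumption, not the ASE flight software; the thresholds and the synthetic "flood" band are made up.

        import numpy as np

        def significant_change(new_image, reference_image,
                               pixel_delta=0.1, area_fraction=0.15):
            """Return True if enough of the scene differs from the stored reference.

            Images are greyscale arrays scaled to [0, 1]. Both thresholds are
            illustrative values, not parameters of the actual ASE software.
            """
            diff = np.abs(new_image.astype(float) - reference_image.astype(float))
            changed = np.mean(diff > pixel_delta)  # fraction of noticeably changed pixels
            return changed > area_fraction

        # Synthetic demo: a dry reference scene versus the same scene with a "flooded" strip.
        reference = np.zeros((64, 64))
        latest = reference.copy()
        latest[20:40, :] = 0.8                        # bright band standing in for floodwater
        print(significant_change(latest, reference))  # True -> alert scientists, take more pictures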

  • "H-1B Increase Faces Stiff Resistance"
    Computerworld (06/25/04); Thibodeau, Patrick

    Initiatives to raise the federal H-1B visa cap, whose 65,000 limit was reached in February, have made little progress, given record unemployment levels for engineers and a welter of state legislation to restrict offshore outsourcing. The IEEE-USA estimated last month that the jobless rate for U.S. systems analysts and computer scientists rose from 5.4 percent in the fourth quarter of last year to 6.7 percent in the first quarter of 2004; meanwhile, the average number of people employed in the field fell from 722,000 in 2003 to 672,000 in the first quarter of this year. Bills that could affect offshoring have been introduced in 37 U.S. states: In California, a bill from Carol Liu (D-South Pasadena) banning government agencies from contracting with IT services companies that employ foreign labor has been passed from the state assembly to the state senate. Connecticut CIO Rock Regan says such legislation could have a major influence on state governments as their IT workforces near retirement age. High-tech companies are supporting a proposal from Rep. Lamar Smith (R-Texas) that seeks to add 20,000 visas to the H-1B cap by exempting U.S. university graduates with a master's degree or higher. The bill is advocated by the American Council on International Personnel, whose director of government relations, Lynn Shotwell, reports that many companies were able to satisfy their H-1B requirements for the current fiscal year by planning ahead, but notes that the situation may change in 2005.
    Click Here to View Full Article

  • "Wi-Fi Finds the Way When GPS Can't"
    New Scientist (06/28/04); Biever, Celeste

    Wi-Fi is emerging as a back-up technology for GPS systems, whose signals are often blocked by tall buildings and other structures in urban areas. Wi-Fi base stations are proliferating in public places, and the University of Washington's Anthony LaMarca and scientists at Intel's American and British research labs have created Place Lab, software that continuously records the radio signal strengths from nearby base stations. Place Lab is programmed to recognize where a signal comes from using a database that stores the locations of 26,000 U.S. and U.K. base stations, and can pinpoint a user's location by tapping the signal strength from a minimum of three base stations (see the sketch below). Place Lab can provide accuracy to within 20 to 30 meters, compared to the GPS average of 8 to 10 meters, but LaMarca thinks the gap could be closed with the development of better algorithms that account for building materials, the height of the base station above the ground, and other factors. The software is freely available for downloading at www.placelab.org, and once a user has Wi-Fi there is no need to purchase additional hardware to use Place Lab. The software has already been tested in a number of cities, including San Diego, Seattle, Berkeley, Manhattan, and Cambridge; another test is planned by University of Glasgow researchers, using volunteers with Wi-Fi-enabled personal digital assistants at the Edinburgh Festival in August. Other applications in development by Place Lab team members include a "location-aware" version of Microsoft's Instant Messenger software, which will enable users to send their Place Lab data so that "buddy lists" show friends' physical locations as well as their online presence.
    Click Here to View Full Article
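
    One simple way to turn signal strengths from three or more known base stations into a position fix is a signal-strength-weighted centroid, sketched below in Python. This illustrates the general idea only; it is not Place Lab's actual algorithm, and the access-point coordinates and RSSI values are invented.

        from dataclasses import dataclass

        @dataclass
        class Beacon:
            lat: float
            lon: float
            rssi_dbm: float  # received signal strength; higher (less negative) = closer

        def weighted_centroid(beacons):
            """Estimate position as a centroid of known base-station locations,
            weighted by received signal strength. An illustration of the general
            idea, not Place Lab's algorithm."""
            if len(beacons) < 3:
                raise ValueError("need signal strength from at least three base stations")
            weights = [10 ** (b.rssi_dbm / 10.0) for b in beacons]  # crude dBm -> linear weight
            total = sum(weights)
            lat = sum(w * b.lat for w, b in zip(weights, beacons)) / total
            lon = sum(w * b.lon for w, b in zip(weights, beacons)) / total
            return lat, lon

        # Invented access points around downtown Seattle and their observed signal strengths.
        seen = [Beacon(47.6205, -122.3493, -48.0),
                Beacon(47.6097, -122.3331, -71.0),
                Beacon(47.6152, -122.3410, -60.0)]
        print(weighted_centroid(seen))  # a rough fix, typically within tens of meters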

  • "EBay Shows It's Sold on Software Makers"
    Los Angeles Times (06/28/04) P. C1; Gaither, Chris

    This year's EBay Live convention was preceded by the first-ever EBay Developers Conference, which the company set up for the growing number of software entrepreneurs creating EBay-specific applications. People began selling EBay add-on software as early as 1997, when David Eccles wrote his Cricket Jr. sniping program that enabled buyers to make last-second auction bids. Developing programs for EBay at that time was very difficult because the company did not make its APIs available and people had to rely on code gleaned from the Web site itself: Whenever the Web site changed, third-party applications would often stop working. The first outreach to EBay developers came in November 2000 with the release of APIs, and this year the EBay Developers Program includes more than 7,500 members; the company currently counts 550 officially approved programs for the site, compared with fewer than 200 programs at the same time last year. EBay is a unique opportunity for independent software developers because of its considerable and growing share of e-commerce revenue. The market has even attracted a number of venture capitalists who were out scouting at the recent developers conference; their question is how big any software company can get when it is completely dependent on a single platform such as EBay. For its part, EBay CEO Meg Whitman said the company was careful not to stifle much-needed innovation by swiping ideas. EBay makes a percentage from every transaction, and third-party software that increases business on the site is encouraged. Some developers have even succeeded where EBay failed, such as Bonfire Media's mobile phone application that lets people follow auctions and place bids on the go.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Java's Place in the Universe of More Than Just Passing Interest"
    SiliconValley.com (06/27/04); Gillmor, Dan

    The chief value of Sun Microsystems' Java technology is its cultivation of a wide-ranging community or ecosystem that has helped Sun sustain its status as a feasible choice for purchasers of large-scale computer systems, according to Sun CEO Scott McNealy. Sun estimates that some 1.75 billion devices worldwide are running the programming language and software environment. An early roadblock to fulfilling the promise of Java was Sun's failure to ensure compatibility across its "virtual machines." Microsoft, meanwhile, intentionally modified its version of Java so that what functioned on its Windows operating system would be problematic on other systems; however, Microsoft recently settled a huge lawsuit over such practices, and both Microsoft and Sun have been ensuring interoperability between their products since then. McNealy claims that Java has played a major role in preventing Microsoft from monopolizing Web-development technology. Java's remarkable success on non-PC hardware may be partly attributable to Sun's relaxing its control, and J.P. Rangaswami of investment bank Dresdner Kleinwort Wasserstein believes Sun's willingness to foster a community was Java's salvation. However, some community members would like Java to be fully open source, a topic that will be discussed at the JavaOne conference, which begins Monday in San Francisco. Sun VP and lead Java advocate James Gosling argues that making Java a completely open-source product would complicate interoperability, compatibility, and reliability, which are major requirements for Java developers.
    Click Here to View Full Article

  • "The Robot That Does a Simple Job Very Well May Be Wave of Future"
    Wall Street Journal (06/28/04) P. B1; Gomes, Lee

    University of Notre Dame professor Steven B. Skaar believes many academicians are in a state of denial about their lack of progress in the field of robotics. The slow advancement of robot technology is attributable to the unexpected complexity of performing simple physical activities, a problem that has also inhibited progress in the field of artificial intelligence. Researchers are focusing on squeezing intelligence into machines, but Skaar believes the "teach and repeat" approach to robotics, in which machines carry out actions they have previously been "taught," may offer the best chance of fulfilling robots' original promise (see the sketch below). The problem is that such an approach is not very intellectually stimulating to researchers. Teach and repeat is the operating principle behind a robotic wheelchair Skaar created, which the professor describes as unique despite the fact that the device takes several minutes to cover a few dozen feet. Another of Skaar's breakthroughs is a robot arm that employs a "camera-space manipulation" system to stack pallets of paper bags at a Chicago facility. Yet Skaar has received only $100,000 in research and development grants, and the lack of investment could be due to a number of factors, including a less than enthusiastic industry response to news of robot breakthroughs, whose hype often belies their actual performance. Skaar is concerned that his graduate students may have trouble finding work in their field as a result of his relative isolation.
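
    In its simplest form, teach and repeat reduces to recording a sequence of poses while a human guides the machine and then replaying them. The Python sketch below is a toy rendering of that idea under the obvious interpretation; it is not the control scheme of Skaar's wheelchair or his camera-space-manipulation arm.

        from dataclasses import dataclass

        @dataclass
        class Pose:
            x: float        # meters
            y: float        # meters
            heading: float  # radians

        class TeachAndRepeat:
            """Record poses during a human-guided 'teach' pass, then replay them."""

            def __init__(self):
                self.path = []

            def teach(self, pose):
                self.path.append(pose)   # sample poses while the operator drives

            def repeat(self, drive_to):
                for pose in self.path:   # replay the taught path waypoint by waypoint
                    drive_to(pose)

        robot = TeachAndRepeat()
        for p in [Pose(0.0, 0.0, 0.0), Pose(1.0, 0.0, 0.0), Pose(1.0, 1.0, 1.57)]:
            robot.teach(p)
        robot.repeat(lambda pose: print(f"driving to ({pose.x}, {pose.y})"))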

  • "VP Tool Re-Creates Hallucinations"
    Technology Research News (06/23/04); Patch, Kimberly

    Researchers at Australia's University of Queensland have created software that can simulate hallucinatory experiences in a 3D virtual reality environment modeled after a psychiatric ward. The hallucinations--unsettling images and malevolent, disembodied voices--are based on descriptions of actual delusions taken from patient interviews, while the ward model was constructed from photos of a local psychiatric unit. The software can serve as a tool for behavioral therapy as well as a technique for doctors to better understand the hallucinatory experience from the patient's perspective. The program operates on a university-based VR system comprising three projectors and a curved screen with a 150-degree field of view. The patient wanders through the 3D environment using a mouse and keyboard, while hallucinations such as a yawning abyss or a distorted mirror image are triggered manually by the user or automatically when the user approaches certain objects (see the sketch below). Multiple voices that utter abusive comments simultaneously are triggered randomly when the patient is near items such as TVs and stereos. University of Queensland research fellow Geoffrey Ericsson notes that the realization of the project would not have been possible without the increased affordability of quality graphics hardware and memory, as well as a scenegraph whose frame rates were high enough to realistically depict the model. The scientists' next goal is to interview more patients in order to compile a sizable archive of hallucinations that can be stitched together into imagery appropriate for individual sufferers.
    Click Here to View Full Article
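
    The automatic triggering described above amounts to a proximity check between the patient's position and objects in the ward model. The Python sketch below is a toy version of that logic with invented coordinates and trigger names; it is not the University of Queensland system.

        import math
        from dataclasses import dataclass

        @dataclass
        class Trigger:
            name: str       # e.g. "distorted mirror image", "voices near the stereo"
            x: float
            y: float
            radius: float   # how close the patient must come, in meters

        def fire_triggers(patient_x, patient_y, triggers):
            """Return the hallucinations that should play for the patient's current position."""
            fired = []
            for t in triggers:
                if math.hypot(patient_x - t.x, patient_y - t.y) <= t.radius:
                    fired.append(t.name)
            return fired

        ward = [Trigger("distorted mirror image", 2.0, 5.0, 1.0),
                Trigger("overlapping abusive voices", 8.0, 3.0, 1.5)]
        print(fire_triggers(8.5, 3.2, ward))  # patient is near the stereo -> voices play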

  • "DARPA Funds New Photonic Research Center at Illinois"
    Innovations Report (06/22/04)

    The Defense Advanced Research Projects Agency (DARPA) has earmarked $6.2 million to be allocated to the University of Illinois at Urbana-Champaign over four years for the establishment of a photonic research facility committed to the development of ultra-fast light sources for high-speed signal processing and optical communications systems. Professor Norman Cheng of the university's micro and nanoelectronics laboratory serves as the director of the DARPA-funded Hyper-Uniform Nanophotonic Technology (HUNT) Center. He explains that the center's chief area of research will be "the high-performance optical switching and data routing technologies needed for flexible connections-on-demand and efficient bandwidth delivery in next-generation communications systems." A major area of concentration for the HUNT Center is the augmentation of laser technology made possible by the ultra-fast light-emitting transistor discovered by Nick Holonyak Jr. and Milton Feng. The device can concurrently modulate electrical and optical signals, which could raise a semiconductor light source's modulation bandwidth to over 100 GHz, up from 20 GHz. Cheng says that large-capacity seamless communications would be enabled by long-wavelength quantum-dot microcavity laser technologies, and researchers will focus on techniques to control quantum dots' size, distribution, and optical quality and embed them into the active region of long-wavelength quantum-dot lasers and light-emitting-transistor-based lasers. Nanopatterning, nanoscale semiconductor growth and characterization, and nanostructure device design and fabrication are some of the suggested methods.
    Click Here to View Full Article

  • "Better Videoconferencing Requires Less Computer Network Jitter, New Software Tools"
    ScienceDaily (06/24/04)

    Ohio State University engineers have discovered that users' attitudes toward Internet videoconferencing greatly depend on the degree of "network jitter"--the disruption of a video signal that creates a jumpy, freeze-frame effect--they experience. Ohio Supercomputer Center (OSC) systems developer and OSU doctoral student Prasad Calyam says this finding goes against conventional wisdom, which holds that the biggest headache for users is long delays; delays actually turned out to be the least aggravating factor for users in the study, which involved the participation of 25 sites in 14 nations. The researchers collected their data through H.323 Beacon, an open-source, free-to-download software program designed by Calyam that measures the performance of H.323, a protocol for transmitting real-time audio, video, and other data across networks. "We can deploy [Beacon] on the network and get meaningful results that can accelerate problem resolution," says Paul Schopis, director of network engineering and operations at OARnet, OSC's networking division. Network jitter is caused by variability in the delay of data packets across a network; that variability stems from network congestion, packets arriving out of order, and irregular packet sizes, and it gives rise to fragmented signals and out-of-sync audio and video (see the sketch below). Schopis says the Windows-interoperable Beacon software can be used by less experienced end users, can be configured to automatically notify system administrators when a problem crops up, and is able to fix problems on the user's system. Many experts maintain that high-quality videoconferencing can only be effectively supported across dedicated private networks or alternative networks such as Internet2. Calyam says the results of the study, combined with software such as Beacon, could improve not just commercial videoconferencing services but computer networks in general.
    Click Here to View Full Article
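
    The jitter the study measured is, roughly, packet-to-packet variability in transit time. One standard way to quantify it is the smoothed interarrival-jitter estimate defined for RTP in RFC 3550, sketched below in Python with made-up timestamps; this is a generic illustration, not the H.323 Beacon implementation.

        def interarrival_jitter(send_times, recv_times):
            """Smoothed jitter estimate in the style of RFC 3550.

            send_times and recv_times are parallel lists of per-packet timestamps
            in seconds. The estimate grows when transit time varies from packet to
            packet, which is what produces the jumpy, freeze-frame effect."""
            jitter = 0.0
            prev_transit = None
            for sent, received in zip(send_times, recv_times):
                transit = received - sent
                if prev_transit is not None:
                    delta = abs(transit - prev_transit)
                    jitter += (delta - jitter) / 16.0  # exponential smoothing per RFC 3550
                prev_transit = transit
            return jitter

        # Made-up packets sent every 20 ms; the third one is delayed by congestion.
        sent = [0.000, 0.020, 0.040, 0.060, 0.080]
        received = [0.050, 0.070, 0.120, 0.112, 0.130]
        print(f"jitter estimate: {interarrival_jitter(sent, received) * 1000:.2f} ms")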

  • "OASIS Passes Flaw-Reporting Standard"
    InternetNews.com (06/23/04); Boulton, Clint

    The Organization for the Advancement of Structured Information Standards (OASIS) has approved Application Vulnerability Description Language (AVDL) version 1.0 as a standard for enabling products to share information about security bugs in Web services and applications. Prior to AVDL's development, network managers had little option but to arduously sift through vulnerability reports, and then take the proper corrective action and draft firewall rules to protect the applications, notes OASIS AVDL Technical Committee co-chair Kevin Heineman. AVDL can shave considerable time off this process by importing flaw assessment data from AVDL-enabled application scanners, and providing appropriate rule configuration ability for firewalls, automatic remediation for patch-management software, and the inclusion of vulnerability data in event correlation products. "By employing solutions based on the AVDL OASIS standard, companies can reduce the threat they face from the moment a vulnerability is discovered to the time it takes them to first shield, then patch, their systems," explains Gartner analyst John Pescatore. He estimates that up to 80 application bugs are reported weekly, which makes AVDL essential for firms with networks and data centers that extensively use commercial software from Oracle, Microsoft, and other vendors. AVDL is designed to complement OASIS' Web Application Security standard, a specification for facilitating communication between intrusion detection products and firewalls during cyberattacks. AVDL is already being used by businesses and federal agencies such as the National Nuclear Security Administration and the U.S. Department of Energy's central security incident response organization.
    Click Here to View Full Article

  • "Student Team's Robot Safely Detects Land Mines"
    EE Times (06/21/04) No. 1326, P. 4; Johnson, R. Colin

    A group of Johns Hopkins University engineering students has developed a prototype remote-controlled robot that can traverse uneven terrain to locate and mark land mines safely and inexpensively. Edoardo Biancheri, Dan Hake, Dat Truong, and Landon Unninayar spent nearly a year designing the machine as part of their Engineering Design Project course. The gauntlet was thrown down by Carl Nelson of Johns Hopkins' Applied Physics Laboratory, who was developing sophisticated sensor designs for military use. He envisioned the robot as a vehicle that could get the mine-sniffing devices into off-road areas that were too dangerous or rough for effective manual mine scanning. The robot travels on treads driven by two electric motors powered by an acid battery, while its movements are remotely directed from up to 500 feet away through a radio-controlled transceiver. A color video camera captures images of any objects the machine encounters, which an operator can view on a display built into a joystick controller; the robot's weight is distributed to reduce the chances that it will trigger a mine, while the bulk of the machine is composed of plastic or composite materials to ensure that any metal the sensors pick up is coming from mines rather than the vehicle itself. The joystick controller beeps whenever the machine encounters a mine, and the operator can command the robot to mark the spot with spray-paint. The students could not spend more than $8,000 to design and build the prototype, but their effort cost only $5,000. The team reckons that the cost of production versions could run as low as $1,000.
    Click Here to View Full Article

  • "Jeffrey Naughton Speaks Out"
    SIGMOD Record (06/04) Vol. 33, No. 2; Winslett, Marianne

    University of Wisconsin at Madison computer science professor, ACM Fellow, and winner of the National Science Foundation's Presidential Young Investigator Award Jeffrey Naughton clarifies a statement he made that database systems are control freaks by explaining that many people refuse to use such systems because they are forced to comply with the systems' rules, which can be very strict and limiting. His suggestion to database researchers is to "look at the problems facing these people, and try to figure out whether there is any way to provide some of the benefits of a database system without falling into the same kind of old megalomaniacal control over everything that goes with the data." Naughton cannot provide a clear answer on what direction database systems research should be going, but he does find criticism within the database community to be a positive development; defining a database researcher's role is also important, and Naughton does not think that the entire database community should be viewed as an extension of development organizations. His observation is that the real contribution of academics is their students. Naughton believes database researchers have a responsibility to investigate data models that may not be successful, as opposed to models that will definitely be unsuccessful. The professor's advice is to accept the fact that XML has set up permanent shop in the database sector, but he does not agree that XML constitutes "the next big data model." Naughton believes that future trends are leaning toward a relational or even object-relational model, while one development he anticipates within the next decade is closer integration with text. Naughton would prefer that database theoreticians focus on areas that interest them rather than uninteresting areas that may have some applications.
    Click Here to View Full Article

  • "Winning the War on Spam"
    Discover (06/04) Vol. 25, No. 6, P. 24; Johnson, Steven

    The current model for fighting spam is to treat it as a disease, with spam-blocking software, blacklists, and other techniques acting as disease-fighting antibodies. Some technology experts say this thinking is flawed because it does not address the root cause of spam, which is its profitability: If millions of identical messages are sent out, the cost is still basically the same as if the spammer sent only one message. Ferris Research estimated businesses spent $10 billion fighting spam last year, not to mention the inconvenience caused to home users and the millions of hours spent deleting junk mail. Over the past several decades, environmentalists figured out that industrial pollution, like spam, actually costs more than it appears to: People buying gas at the pump pay for the oil extraction, refining, and transportation, but do not pay for the associated damage to the environment; in this sense, email is simply too cheap to reflect the exorbitant costs that spam imposes on users and the Internet infrastructure. Although some experts have advocated a small monetary charge for email, this system would not only be difficult to implement, but would unfairly punish those who could benefit from email most. Microsoft researcher Cynthia Dwork has another solution that involves payment for email, except in computation time rather than money: She suggests making sending computers solve a puzzle so that each email message would cost about 10 seconds of computational time (see the sketch below). Dwork's scheme depends on a variable element in the puzzle, so that the puzzle's complexity can be increased to keep pace with Moore's Law; though this 10-second tax would not likely affect regular users much, since they could do other tasks on their PCs in the meantime, it would mean a single computer could send out only roughly 8,000 emails per day instead of the millions it can currently churn out. Spammers would have to buy more machines, which would put many of them out of business.
    Click Here to View Full Article
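
    The pay-in-computation idea is easiest to see in a hashcash-style proof of work, sketched below in Python: the sender must find a nonce whose hash clears a difficulty target, which costs real CPU time per message, while the receiver verifies the stamp with a single hash. This is a generic illustration, not Dwork's specific puzzle construction, and the difficulty setting here is arbitrary.

        import hashlib
        from itertools import count

        def mint_stamp(message, difficulty_bits=20):
            """Find a nonce whose SHA-256 over (message, nonce) falls below a target.

            A generic hashcash-style proof of work illustrating payment in
            computation, not Dwork's construction. Raising difficulty_bits keeps
            the cost meaningful as hardware speeds up."""
            target = 1 << (256 - difficulty_bits)
            for nonce in count():
                digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
                if int.from_bytes(digest, "big") < target:
                    return nonce

        def verify_stamp(message, nonce, difficulty_bits=20):
            """Checking a stamp costs one hash, so legitimate receivers pay almost nothing."""
            digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
            return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

        stamp = mint_stamp("subject: lunch on tuesday?")          # ~one million hashes on average
        print(verify_stamp("subject: lunch on tuesday?", stamp))  # True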

  • "Internet Takedown"
    Government Technology (06/04) Vol. 17, No. 6, P. 24; McKay, Jim

    The United States is depending more and more on the Internet to conduct business and government functions, but this poses a risk given the Internet's vulnerability. Experts say that the chances of a major disruption--whether from a deliberate attack or from an accident like the 2003 blackout--are growing. "The problem with the Internet is we developed it so fast and furiously, and didn't take a step back and build it foundationally with security in mind," says Phyllis Schneck, chairwoman of the FBI's InfraGard board of directors. There is no real short-term solution beyond reducing the severity and number of disruptions, including those caused by viruses. Walter Tong of the Georgia Technology Authority says hackers are the most worrisome threat at present. A company with excellent security could still be at risk if it is connected to one with poor security, and Carnegie Mellon University Software Engineering Institute fellow Watts Humphrey says today's software is so defective that hackers easily find flaws in it. There is no real agreement as to how much damage an accident or a hacker can cause, though studies at Ohio State University suggest that storing key Internet routing information in only a few nodes is not a good idea, since damaging one could affect many areas. Critical infrastructure such as emergency services and transportation uses the Internet, which puts those systems at risk until some technological solution is developed, such as a parallel network with secure routers. John McCarthy, executive director of the George Mason School of Law's Critical Infrastructure Protection Project, is involved with a partnership among the District of Columbia, Maryland, Virginia, and the Homeland Security Department to find out which infrastructures are essential, how they are interdependent, and how to protect them. He believes that every sector should understand its role in protecting critical infrastructure, and that state governments must determine which infrastructures are most important.
    Click Here to View Full Article

  • "A Road Map to Agile MDA"
    Software Development (06/04) Vol. 12, No. 6, P. 51; Ambler, Scott W.

    Despite being a vocal critic of the Object Management Group's Model Driven Architecture (MDA), Ronin International senior consultant and author Scott W. Ambler has organized an agile MDA scheme that involves active stakeholder collaboration, regular development of working software, the integration of testing into the modeling initiative, and the adoption of tools that support these processes. Ambler notes that MDA establishes a boundary between the specification of system functionality and the specification of its deployment by defining Platform Independent Models (PIMs) and Platform Specific Models (PSMs); the basic goal is for developers to define their business needs and the analysis of those needs within a PIM, choose the target platform for their system, and then automatically convert the PIM into one or several PSMs (see the sketch below). A solid MDA implementation can significantly boost developer productivity, if developers follow Ambler's suggested roadmap. The first step is to bring stakeholders into modeling efforts using inclusive modeling methods such as essential user interface models and simple tools such as whiteboards. Working software must also be crafted, and Ambler recommends that the modeling process be incremental: "Agile modelers will focus on a few requirements, generate the PSM from those requirements, and then create the software that fulfills those requirements," he writes. The third step is to adopt several models to facilitate the job, while the fourth step is to test continuously. The fifth and final step is to accept that writing source code will still be necessary. Ambler explains that MDA will not be viable for firms unless they have modelers who are highly skilled or have the potential to become highly skilled; the ability to hold onto these modelers; stakeholders who can either actively participate in the development of inclusive models or comprehend the PIMs developers build; a kit of modeling tools that fulfill actual business requirements; and a willingness to tie the firm's prosperity to that of the tool supplier.
    Click Here to View Full Article
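
    The PIM-to-PSM transformation at the heart of MDA can be reduced to a toy example: a platform-independent description of an entity is mechanically mapped to artifacts for two target platforms. The model format and mapping rules in the Python below are invented for illustration and are far simpler than what real MDA tools generate.

        # Platform-independent model (PIM): an invented, minimal description of one entity.
        PIM = {
            "entity": "Customer",
            "attributes": [("id", "Integer"), ("name", "String"), ("email", "String")],
        }

        SQL_TYPES = {"Integer": "INTEGER", "String": "VARCHAR(255)"}  # one platform mapping

        def to_sql_psm(pim):
            """Transform the PIM into a platform-specific artifact for a relational database."""
            cols = ",\n  ".join(f"{name} {SQL_TYPES[t]}" for name, t in pim["attributes"])
            return f"CREATE TABLE {pim['entity']} (\n  {cols}\n);"

        def to_java_psm(pim):
            """Transform the same PIM into a class skeleton for a Java target platform."""
            fields = "\n".join(f"    private {t} {name};" for name, t in pim["attributes"])
            return f"public class {pim['entity']} {{\n{fields}\n}}"

        print(to_sql_psm(PIM))
        print(to_java_psm(PIM))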

  • "Strategic Supercomputing Comes of Age"
    Science & Technology Review (06/04) P. 12; Parker, Ann

    The National Nuclear Security Administration's Advanced Simulation and Computing (ASC) Program combines the resources and expertise of Sandia, Los Alamos, and Lawrence Livermore national laboratories with those of IBM, Intel, and other computing industry leaders to provide the processing power to study the reliability, safety, and performance of America's nuclear stockpiles through simulation. Program co-founder Randy Christensen reports that the project has started to move out of the proof-of-principle stage thanks to the development of a computing architecture and scalable, high-fidelity, 3D simulation codes. The next step will concentrate on the enhancement of physics models within those codes so that nuclear weapons' behavior can be more clearly understood as part of the Stockpile Stewardship Program. The "entry point" for carrying out high-fidelity, fully-fledged weapons systems simulations will be Livermore's 100-teraops ASC Purple supercomputer, with rollout slated for next year. Its processing power will be complemented with that of IBM's Blue Gene/L, and collectively these machines should increase the predictive capability for processes such as mixing in gases and dislocation dynamics in metals by more closely linking simulations and experiments and gauging the uncertainty of simulated results. Livermore researchers first relied on huge vector-architecture supercomputers developed by Seymour Cray, but they eventually made the transition to scalar architecture and massively parallel ASC machines with off-the-shelf processors. The terascale systems are enabling physicists to anticipate material behavior from first principles and move equation-of-state models out of the descriptive domain and into the predictive domain. "It's a new world in which simulation results are trusted enough to take the place of physical experiments or, in some cases, lead to new experiments," explains Elaine Chandler, manager of the ASC Materials and Physics Models Program.
    Click Here to View Full Article


 