ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 559: Friday, October 17, 2003

  • "Octopus or Eagle Eyes? Outfitting a Robot for Its Mission"
    New York Times (10/16/03) P. E7; Austen, Ian

    A robot does not necessarily need a sophisticated imaging system in order to navigate, and some researchers are turning to unusual real-world examples to develop vision systems that emphasize practicality and simplicity rather than superior image quality. An octopus' relatively poor vision system, which can differentiate between vertical and horizontal lines, is the model for the o-retina, a silicon retina chip developed by Dr. Albert H. Titus of the State University of New York at Buffalo. Retina chips, like complementary metal oxide semiconductor chips and charge-coupled devices, employ analog signals to convert different intensities of light into different electrical voltages; but the o-retina does not convert those voltages into digital code. "We're trying to mimic animals, and animals aren't digital," notes Titus. Furthermore, the o-retina's pixels replicate the cells in an octopus retina by linking vertically and horizontally, enabling the chip's software to find patterns between cells without processing a complete image of the object (a minimal software sketch of this idea follows this item). Conventional sensors, in contrast, map light elements to every pixel to produce pictures. Titus plans for a later version of the o-retina to be able to distinguish light with variable polarization. He says the overarching goal of such research is to create retina chips that can imitate the vision systems of many animals, allowing, say, the depth perception of an eagle and a lion to be combined for robots that operate in deserts.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
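
    The article gives no implementation details for the o-retina, but the core idea, comparing responses along vertical and horizontal pixel linkages rather than reconstructing a full image, can be sketched in software. The Python below is a rough illustration under invented assumptions (3x3 difference kernels and a toy 8x8 image); it is not the chip's actual circuitry or algorithm.

        # Classify a line as vertical or horizontal by comparing aggregate
        # responses of direction-sensitive kernels, without ever producing
        # a full processed image. Kernels and test image are invented.
        import numpy as np

        VERTICAL_K = np.array([[-1, 0, 1],
                               [-1, 0, 1],
                               [-1, 0, 1]], dtype=float)  # responds to vertical edges
        HORIZONTAL_K = VERTICAL_K.T                        # responds to horizontal edges

        def convolve2d(img, kernel):
            """Valid-mode 2-D correlation with a 3x3 kernel."""
            h, w = img.shape
            out = np.zeros((h - 2, w - 2))
            for y in range(h - 2):
                for x in range(w - 2):
                    out[y, x] = np.sum(img[y:y + 3, x:x + 3] * kernel)
            return out

        def line_orientation(img):
            """Return 'vertical' or 'horizontal' from summed edge energy."""
            v = np.abs(convolve2d(img, VERTICAL_K)).sum()
            h = np.abs(convolve2d(img, HORIZONTAL_K)).sum()
            return "vertical" if v > h else "horizontal"

        img = np.zeros((8, 8))
        img[:, 4] = 1.0                  # a bright vertical bar
        print(line_orientation(img))     # -> vertical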

  • "Older IT Workers Becoming Hot Commodity"
    EarthWeb (10/16/03); Gaudin, Sharon

    Older IT professionals with business skills have become a highly valued asset in the post-dotcom-bubble era, especially since many of the now-defunct businesses of that era failed because they lacked experienced personnel. A new report from Challenger, Gray & Christmas indicates a revived interest in managers and executives 50 years and older: The average job search time for over-50 job seekers has dipped 19 percent, from approximately five months in the fourth quarter of 2002 to four months at the end of the third quarter this year, compared to a 1.8 percent drop in search times for seekers under 50 in the same period. Younger seekers' median job search time currently stands at 3.8 months, placing the over-50 crowd and the under-50 crowd virtually neck-and-neck. Illuminata analyst Gordon Haff notes that older job seekers employed by bigger, more deeply entrenched companies may have a tougher time getting another job. "If the company is perceived as unimaginative and slow and boring, those characteristics get carried over to workers," he explains. Challenger, Gray & Christmas CEO John Challenger reports that the hiring prejudice against older workers that characterized dotcoms at the height of the bubble was also present in big companies, which preferred younger IT staffers because they may have had fewer family commitments and were willing to work for less. However, Challenger says that attitude has also undergone a reversal as major companies realize that "With an older worker, you may get more loyalty, more commitment and sometimes more of a work ethic."
    Click Here to View Full Article

  • "Digging for Nuggets of Wisdom"
    New York Times (10/16/03) P. E1; Guernsey, Lisa

    Text-mining software is becoming more powerful and helping researchers, analysts, and companies find obscure conceptual links in large collections of material. University of Pennsylvania cancer researcher Michael N. Liebman, for example, uses text-mining tools to help identify the effect of late pregnancy on post-menopausal cancer by putting the software to work on thousands of medical research publications; he says the collection of many weak observations from many different sources strengthens that finding, even if the original authors may not have noticed its significance (a toy sketch of this pooling idea follows this item). Information specialists say cheaper text-mining programs are entering the mainstream and can help people make sense of voluminous public information, such as the 858-page congressional inquiry report regarding pre-9/11 intelligence failures. Text-mining is unlike Web searches because it identifies important people or ideas and analyzes the document context using algorithms. Text-mining is also a derivative of data-mining, as used in statistical analysis of structured databases, but focuses on unstructured data found in email messages, journal publications, or transcripts of phone conversations. Industry analysts say text-mining can help companies make use of previously untapped documents that are said to hold up to 80 percent of corporate knowledge. Fireman's Fund Insurance used text-mining to dramatically improve its automated fraud-detection systems: Fireman's Fund operations research director Marty Ellingsworth watched the company's human investigators at work and found they often gleaned the most important clues from claims adjusters' notes, so Ellingsworth used text-mining to enable that ability in the system. Still, effective text-mining requires expertise, says Liebman, who spent months preparing a conceptual framework for the software that adds meaning to its results.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
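
    None of the tools named in the article publish their internals, but the pooling idea Liebman describes, many weak co-occurrences adding up to a visible link, can be shown in a few lines of Python. The corpus, terms, and scoring below are invented for illustration and are not any vendor's algorithm.

        # Pool weak signals: count how often pairs of terms co-occur across
        # documents. No single document states the link outright, but the
        # aggregate ranking can surface it. All data here is made up.
        from collections import Counter
        from itertools import combinations
        import re

        docs = [
            "late pregnancy alters breast tissue differentiation",
            "tissue differentiation influences post-menopausal cancer risk",
            "late pregnancy and hormone exposure studied in cancer cohorts",
        ]

        def terms(text):
            return set(re.findall(r"[a-z-]+", text.lower()))

        pair_counts = Counter()
        for doc in docs:
            for a, b in combinations(sorted(terms(doc)), 2):
                pair_counts[(a, b)] += 1

        # Frequently co-occurring pairs hint at conceptual links worth a
        # researcher's attention.
        for (a, b), n in pair_counts.most_common(5):
            print(f"{a} <-> {b}: co-occur in {n} documents")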

  • "Anti-Spam List Wouldn't Fly, Experts Warn"
    Investor's Business Daily (10/17/03) P. A5

    Experts argue that fundamental distinctions between phone and email systems and the marketers who use them would be insurmountable barriers to a workable do-not-spam list, and even antispam advocates admit that such a measure would not cure the spam problem. The FTC is skeptical that a government-run do-not-spam list would be effective: The phone network is strictly regulated and features central control and strong anti-spoofing measures, but email systems are decentralized and information about spammers is easy to falsify. A no-spam list can quickly become outdated because people switch email addresses more often than phone numbers. In addition, Direct Marketing Association CEO Bob Wientzen doubts that spammers, who already regularly violate consumer-protection statutes, will adhere to any list. Critics note that spammers based overseas would be difficult to track down, while the existence of a no-spam list raises the question of what would happen if the list's security were compromised. Still, Sen. Charles Schumer (D-N.Y.) has introduced legislation calling for the creation of a national no-spam list, and similar bills have been passed by State Senates in Michigan and Louisiana. Furthermore, the Direct Marketing Association and at least three private firms have instituted do-not-spam lists, but critics dismiss such measures as toothless.

  • "U. Researchers Revamp Credit Card"
    DailyPennsylvanian.com (10/16/03); Eckstut, Mer

    The University of Pennsylvania is in the process of obtaining a full patent on its new "smart card" technology. Unveiled in July at a European conference on object-oriented programming, the smart card technology allows users to program spending limits into credit cards. Carl Gunter, a professor of Computer and Information Science, came up with the idea of creating an open application programming interface (API) for credit cards after his nanny exceeded the number of minutes he allowed her to use on his cell phone. Gunter wanted to give users of appliances with tiny computers the opportunity to customize those computers. The smart card project was part of a larger, two-year project on Open Embedded Systems. The smart card technology works with a modified credit card reader connected to a home PC, as well as special software people can use to program spending limits into cards (a hypothetical sketch of such a policy check follows this item). Businesses could use smart cards to limit where and how much employees spend, and parents could use the technology in a similar manner to rein in their children's spending. Penn wants to license the smart card technology to private companies.
    Click Here to View Full Article
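
    The article does not publish the API itself, so the following Python is a hypothetical sketch of the kind of policy check a programmable card could enforce; the class name, rule shapes, and dollar figures are all invented.

        # A made-up spending policy in the spirit of the article: approve a
        # charge only if it fits a monthly cap and any per-merchant cap.
        from dataclasses import dataclass, field

        @dataclass
        class SpendingPolicy:
            monthly_limit: float                       # total allowed per month
            per_merchant_limits: dict = field(default_factory=dict)
            spent_this_month: float = 0.0

            def authorize(self, merchant: str, amount: float) -> bool:
                """Approve the charge only if every configured limit holds."""
                if self.spent_this_month + amount > self.monthly_limit:
                    return False
                cap = self.per_merchant_limits.get(merchant)
                if cap is not None and amount > cap:
                    return False
                self.spent_this_month += amount
                return True

        # A parent caps a child's card at $200/month, $25 per restaurant charge.
        card = SpendingPolicy(monthly_limit=200.0,
                              per_merchant_limits={"restaurant": 25.0})
        print(card.authorize("restaurant", 20.0))   # True
        print(card.authorize("restaurant", 40.0))   # False: over per-merchant cap
        print(card.authorize("bookstore", 190.0))   # False: would exceed monthly cap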

  • "State E-Business Gets a Boost"
    Madison Capital Times (WI) (10/16/03); Richgels, Jeff

    Armed with a two-year, $600,000 grant from the National Science Foundation, the newly created Wisconsin E-Business Institute will partner with state plastics companies to develop e-business strategies. The institute was established to complement the Consortium for Global E-commerce, formed five years ago to bring together technology experts and state industries in an effort to develop understanding of how the Internet can improve business.
    Click Here to View Full Article

  • "U.S. Admits Convicted Man Is No Hacker"
    Los Angeles Times (10/16/03) P. C1; Menn, Joseph

    Federal prosecutors this week moved to reverse the conviction of Bret McDanel, a former employee of Tornado Development who was tried and convicted as a criminal hacker for warning customers about a software bug that Tornado had failed to correct. The now-defunct Tornado provided a service that allowed clients to retrieve email, voicemail, and faxes via a Web site, but the bug McDanel complained of could have allowed all Tornado users to see each other's mail. McDanel notified Tornado customers that their information was compromised, and published details of the flaw on a Web site. This action prompted the FBI to bring him into court for allegedly violating the Computer Fraud and Abuse Act, which bars acts that harm the "integrity" of a computer system. McDanel's appellate attorney Jennifer Granick appealed her client's conviction to the U.S. Ninth Circuit Court of Appeals, and Assistant U.S. Attorney Ronald Cheng declared that "[McDanel's] release of vulnerability information did not by itself cause an 'impairment to the integrity of a computer system,'" asking that the conviction be reversed on the grounds of government error. McDanel spent 16 months in jail, and his reputation may have suffered irreparable damage despite the reversal. "While Bret unfortunately served his time in prison, by pursuing this appeal he's demonstrated that people have the right to tell the truth about when a computer is insecure," said Granick. Larry Lessig of Stanford University is concerned that the Digital Millennium Copyright Act could also be used to secure such a conviction.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Bad Grades for a Voting Machine Exam"
    Salon.com (10/15/03); Manjoo, Farhad

    Computer programmer Jeremiah Akin, a Peace and Freedom Party representative who witnessed a Sept. 9 logic-and-accuracy test of Sequoia Voting Systems' touchscreen voting machines in Riverside County, Calif., says the results did not make him breathe easier--in fact, they cast the security problems of electronic voting machines in an even bleaker light. He set down his observations in a 22-page document, in which he indicated that Registrar of Voters Mischelle Townsend and her office made "misleading or inaccurate" statements, and implied that the Registrar neither understood certain computer programming and operation fundamentals nor was qualified to evaluate the security and reliability of e-voting systems. In an interview, Akin remarks that Townsend rigorously defended e-voting during the test, which he characterized as more of a sales pitch than an actual test; moreover, she claimed that computer scientists are ignorant of the electoral process, which Akin finds unconvincing. Worse still, he notes that most of the other test observers were not familiar with the technology, and adds that many of them signed off on the test before it was completed. Akin recounts that no one was really allowed to press the touch screen during the test: People only saw the output, not the input. He says that he took a lunch break, then returned to the test area to take the test cartridges out of the machines, only to find that they had already been removed. A cartridge was eventually secured, and Akin discovered that, contrary to statements from Townsend and Riverside IT manager Brian Foss, the Sequoia machines use Microsoft software, which makes the computers on which the voting software runs vulnerable to modification. Akin was told by election officials that the machines produced a printed audit trail, when what critics actually asked for was a voter-verified paper trail to address concerns that the machines might record the wrong votes.
    Click Here to View Full Article

    To read about ACM's activities and concerns regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "Cold War Encryption Laws Stand, But Not as Firmly"
    CNet (10/15/03); McCullagh, Declan

    U.S. District Judge Marilyn Hall Patel dismissed a lawsuit filed against the federal government by University of Illinois at Chicago math professor Daniel Bernstein, who charged that the government tried to stifle his publication of a simple encryption program on the grounds that it violated Cold War-era encryption laws, laws Bernstein's lawyers deemed unconstitutional. Sources involved in the case report that the dismissal was predicated on the Bush administration's promise not to enforce certain parts of the regulations. Bernstein's case is one of three that have prompted the government to reduce the scope of attempts to control privacy-shielding encryption technology found in many email readers and all Web browsers. Such encryption once held the same munition status as tanks and fighter jets and its regulation was handled by the State Department, but the Clinton administration loosened the regulations and passed responsibility to the Department of Commerce. Justice Department attorney Tony Coppolino suspended the latest version of the encryption rules at a 2002 hearing, promising that cryptographers pursuing valid research would not face government prosecution. Bernstein's attorneys, whose ranks include lawyers from the Electronic Frontier Foundation, argued that the encryption laws were changed in order to hinder their client's case; they also submitted a letter to the government in January 2000 claiming the laws forced cryptographers to disclose their research to the government, in violation of the Constitution. Meanwhile, the 6th Circuit Court of Appeals ruled in another case that computer source code is protected under the First Amendment. Bernstein declared in an email statement, "I hope the government sticks to its promises and leaves me alone--but if they change their mind and start harassing Internet-security researchers, I'll be back."
    Click Here to View Full Article

  • "NSF Awards $5.46 Million to UC Berkeley and USC to Build Testbed for Cyber War Games"
    UC Berkeley News (10/15/03); Yang, Sarah

    Researchers seeking to bolster computer networks' cyber-defenses will soon start conducting assaults on a cybersecurity testbed that the University of California, Berkeley, and the University of Southern California's Information Sciences Institute will develop with a three-year, $5.46 million grant from the National Science Foundation (NSF), which is co-funding the project with the Homeland Security Department. "Through this project we will develop traffic models and architectures that are scaled down from the actual Internet, but still representative enough that people can have confidence in it," remarks UC Berkeley professor and principal project investigator Shankar Sastry. The Cyber Defense Technology Experimental Research (DETER) network will support a simulation of the entire Internet that government, industry, and academic researchers can use to test their cybersecurity solutions. The network will be deliberately subjected to worms, viruses, denial-of-service attacks, and other malicious code (a toy propagation sketch follows this item). DETER will exist separately from the actual Internet, and when complete will comprise about 1,000 computers with multiple network interface cards and at least three permanent hardware clusters. The UC Berkeley researchers will concentrate on the design and construction of the DETER infrastructure, while a related NSF-backed initiative led by UC Davis and Pennsylvania State University will focus on a testing and assessment framework. "Projects such as these demonstrate how NSF contributes both to cutting-edge research and the nation's security," declares Mari Maeda, acting division director for Advanced Networking Infrastructure and Research at NSF.
    Click Here to View Full Article
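
    The DETER design itself is not described beyond the cluster counts above, but the kind of experiment such an isolated testbed supports is easy to sketch: a random-scanning worm released against a fixed population of hosts. Every parameter below (population, scan rate, vulnerable fraction) is invented for illustration.

        # Simulate a random-scanning worm on an isolated host population.
        # Infection spreads only to vulnerable, not-yet-infected hosts.
        import random

        random.seed(42)
        HOSTS = 1000                 # roughly the article's machine count
        VULNERABLE_FRACTION = 0.3    # invented
        SCANS_PER_TICK = 5           # probes per infected host per tick

        vulnerable = set(random.sample(range(HOSTS),
                                       int(HOSTS * VULNERABLE_FRACTION)))
        infected = {next(iter(vulnerable))}   # patient zero

        for tick in range(1, 31):
            newly = set()
            for host in infected:
                for _ in range(SCANS_PER_TICK):
                    target = random.randrange(HOSTS)   # random-scan targeting
                    if target in vulnerable and target not in infected:
                        newly.add(target)
            infected |= newly
            print(f"tick {tick:2d}: {len(infected)} infected")
            if infected >= vulnerable:   # every vulnerable host compromised
                break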

  • "The Most Popular Operating System in the World"
    Linux Insider (10/15/03); Krikke, Jan

    The world's most popular operating system is ITRON, a Japanese real-time OS kernel that can be customized for any small-scale embedded system. ITRON, currently found in scores of electronic gadgets, including CD players, mobile phones, and digital cameras, is the first open-source specification for The Real-Time Operating system Nucleus (TRON), which was originally conceived by Professor Ken Sakamura as a social architecture as important as the water supply system or electric grid. Japanese embedded systems use ITRON as a de facto standard, and the specification is currently employed in some 3 billion microprocessors. Successive TRON-based specs include Business TRON (BTRON), a multilingual, pervasive computing environment that features a programmable graphical user interface; and Communications and Central TRON (CTRON), which serves as a real-time, multitasking operating system similar to Unix and has become the de facto standard for the Japanese telecom industry. The TRON initiative might have made a deeper impact much earlier if not for U.S. interference: Matsushita's BTRON PC sparked a furor when the Japanese government announced plans to install the system in Japanese schools in 1989, and the U.S. government called the maneuver "actual and potential market intervention" and threatened to impose sanctions, a gesture that scared the Japanese into jettisoning the initiative. A recent alliance between Japan's T-Engine Forum--a TRON project offshoot--and Linux developer MontaVista seeks to standardize CPU-level embedded systems by bundling together MontaVista Linux and TRON's real-time OS, security infrastructure, and middleware modules. This development could add to proprietary embedded software vendors' worries about increasingly popular open-source solutions.
    Click Here to View Full Article

  • "WiMax Promises Breakthrough in Broadband Access"
    IDG News Service (10/15/03); Lemon, Sumner

    WiMax, or 802.16a, is a wireless networking standard that reportedly eases the provision of broadband access and can help reduce installation costs for broadband service providers. WiMax offers higher bandwidth and greater transmission range than Wi-Fi, enabling service providers to supply broadband connectivity directly to homes without worrying about so-called last-mile problems, explains Anand Chandrasekher of Intel's Mobile Platforms Group. The popular 802.11b Wi-Fi standard has a maximum data transfer rate of 11 Mbps and a maximum range of 1,000 feet in open areas, while WiMax boasts a 70 Mbps data transfer rate and a 30-mile transmission radius (a back-of-envelope area comparison follows this item). Though installing a single broadband access point takes an average of 20 minutes, Chandrasekher remarks that installation can take as long as two hours in some situations, which can significantly raise costs for the service provider. "WiMax would eliminate that, because with WiMax, you'd be able to broadcast the broadband capabilities, and in the home environment, you could have an access point," he says. Chandrasekher anticipates a time when WiMax is employed to wirelessly link homes to broadband networks via Wi-Fi access points, and believes users will one day be able to tap into WiMax networks from their laptops or cellular handsets. Intel revealed at the Intel Developer Forum in Taipei that it did not expect WiMax to be commercialized for about two years; Intel plans to start producing WiMax-enabled chips in the second half of 2004. Chandrasekher is hopeful that WiMax will play an important role in overcoming the last-mile barrier that has impeded U.S. adoption of broadband, particularly in rural regions. He adds that WiMax will help intensify competition in Asian countries where broadband already has deep penetration.
    Click Here to View Full Article
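
    Taking the quoted figures at face value and treating each range as the radius of an idealized circular cell (a strong simplification; real radio propagation is far messier), the coverage-area gap is easy to compute:

        # Back-of-envelope area comparison of the ranges quoted above.
        import math

        WIFI_RANGE_FT = 1_000            # 802.11b open-area figure
        WIMAX_RANGE_FT = 30 * 5_280      # 30 miles in feet

        wifi_area = math.pi * WIFI_RANGE_FT ** 2
        wimax_area = math.pi * WIMAX_RANGE_FT ** 2
        print(f"one WiMax cell ~= {wimax_area / wifi_area:,.0f} "
              f"802.11b cells by area")   # ~25,091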

  • "IT Jobs Contracted From Far and Wide"
    Globe and Mail (CAN) (10/14/03); Saunders, John

    There are two forms of IT offshoring, nearshoring and farshoring, and each presents advantages and disadvantages to North America. The United States appears to be hurting more because U.S. corporate IT operations are increasingly being farshored, or transferred to cheaper workers in far-flung countries such as India and Russia. Canada, while suffering an IT job drain to farshoring as well, is also bringing in IT work as a nearshore outsourcer, with the United States its primary customer. It remains to be seen whether Canada's farshore losses are offset by its nearshore gains. Everest Group Canada President Frank Koelsch thinks that Canada's industry, on the whole, is encountering less difficulty than its U.S. counterpart. He contends that farshoring's grip on the country is mild, possibly because Canadian companies are smaller and more conservative than U.S. companies. Yet the major Canadian banks, among others, see opportunities in farshoring, and both American and Canadian IT service firms are establishing overseas branches to satisfy customer demand, restore former profit margins, and shore themselves up against the threat of India's three biggest outsourcers: Tata Consultancy Services, Wipro Technologies, and Infosys Technologies. All three firms are eyeing General Motors, a car company that contracts out all of its IT operations, which until recently were primarily handled by Electronic Data Systems (EDS). IT workers at EDS Canada are concerned that their jobs could be threatened as Wipro sets up shop near GM in Windsor, Ontario. Toronto lawyer Alison Youngman believes Canada should boost its standing as a nearshore IT outsourcer for U.S. customers through a national campaign.
    Click Here to View Full Article

  • "Leading Humanity Forward"
    Star (Malaysia) (10/14/03); Asohan, A.

    Reading University cybernetics professor Kevin Warwick, who gained notoriety by exploring "cyborg" technology through surgical implants in his own body, says the purpose of cybernetic research is twofold: to help disabled people better control their bodies or manipulate objects, and to enhance the performance and capabilities of humans in general. He says his goal is "to upgrade [people], to take humans as we are now and to give ourselves extra abilities." Warwick's first experiment, Project Cyborg 1.0, involved the implant of a transponder chip in his arm that could interact with an intelligent building system, enabling Warwick to turn on lights and open doors without any manual action. The follow-up experiment, Project Cyborg 2.0, was based on the implant of an electrode microarray connected to his nervous system and attached to a radio transceiver to establish a two-way link between his nerves and a computer; the implant allowed Warwick to remotely control devices such as a robotic hand, but more ambitious plans for Project Cyborg 2.0 were abandoned or scaled back because of bureaucratic red tape, ethical concerns, or unexpected results. The professor had wanted to see whether thought communication between two people with similar implants was possible, and he enlisted his wife as a test subject, but ethical approval was granted only on the condition that her nerve implant be external rather than internal. The attempt to communicate physical emotions to one another was also limited by the complexity of the emotions themselves. A much more successful Project Cyborg 2.0 experiment proved that Warwick, when blindfolded, could navigate by ultrasonic signals fed to his implant through a sensor-studded baseball cap. The ultimate goal of these experiments is a brain implant, Project Cyborg 4.0, which Warwick hopes will dramatically expand human communications, perhaps even support telepathy. Warwick also subscribes to the belief that machine intelligence will inevitably overtake human intelligence, if one defines intelligence as "the mental ability to sustain successful life."
    Click Here to View Full Article

  • "W3C XForms 1.0 Hailed as Standard"
    InternetNews.com (10/14/03); Boulton, Clint

    XForms is touted as the successor to Hypertext Markup Language (HTML) forms, adding more functionality and flexibility to the Web. The World Wide Web Consortium's (W3C's) official recommendation of XForms 1.0 as a standard is seen as a big step forward for the technology as it races against Microsoft's InfoPath (formerly called XDocs) technology, which will be bundled with new iterations of Office software. Not surprisingly, Microsoft is not on the W3C XForms working group, but large vendors IBM, Adobe, Novell, Oracle, and Sun are. The W3C views XForms as an open alternative to both Microsoft's InfoPath and proprietary technology offered by Adobe, which aims to make its portable document format (PDF) the foundation for creating online forms either in PDF or in XML Data Package. XForms will allow cell phone, PDA, and PC users alike to access Web forms without compromising functionality. The technology separates function from presentation more cleanly than HTML does and lets developers build in richer features, such as the ability to designate particular fields as email addresses or to waive credit card requirements when a purchase is made in cash. Government agencies are also eager to use XForms since it will help them put more services online, says W3C's Janet Daly. Meta Group analyst Thomas Murphy, however, remains wary about XForms' ability to compete against Microsoft since InfoPath already has a clear market path and XForms has just achieved recommendation status; he says the critical issue is how XForms is packaged and brought to market.
    Click Here to View Full Article

  • "Computer Evolution"
    EE Times (10/13/03); Brown, Chappell

    Computing's next evolutionary step involves direct machine-to-machine communication and a melded network infrastructure that anticipates users' needs. Intel research director David Tennenhouse spoke at a recent MIT conference about a future he is helping to shape through a number of seminal technology projects. After working with the most rudimentary batch processors, computer scientists pioneered interactive computing in the 1960s, Tennenhouse said. In the future, users will need to relinquish some of their direct control over the system and allow computers more freedom to get results from other computers. Tennenhouse said users should need to work with only one computer in order to receive a highly developed answer, spared the constant computer-to-computer interaction going on behind the scenes. With revolutionary new types of hardware, network architecture, and computer-learning software, the distinction between individual computers and the network would begin to blur. Tennenhouse, a former director of DARPA's Information Technology Office, says the Internet would become a more powerful asset, anticipating user needs instead of just reacting to commands.
    Click Here to View Full Article

  • "New Scheduling Method Raises Efficiency of Electronics Recycling"
    Purdue University News (10/13/03); Venere, Emil

    Purdue University assistant professor of industrial engineering Julie Ann Stuart has developed a software solution for boosting the efficiency of electronics recycling operations by improving the management of incoming products from storage to breakdown. "In recycling you have a different objective when you schedule jobs than you do in manufacturing, and you need different key measurements to achieve that objective," Stuart explains. "We created the key measurements, and we identified the new objective, which may open up an area of research for a whole new class of scheduling problems." Unlike manufacturers, recyclers are not beholden to due dates, and the speed at which raw materials are extracted from discarded machines is not a major factor; far more critical to recyclers is the need to continuously provide enough space to receive and store products immediately before recycling. A lack of adequate staging space can force recyclers to turn away shipments or store them in trailers, which means lost income in the first instance and storage rental costs in the second. Rather than moving the products that can be most quickly disassembled out of the staging area first, Stuart's technique selects the largest objects with fast disassembly times first (the two orderings are compared in a toy sketch following this item), a strategy that can save a lot of money in overhead by allowing recyclers to scale down their staging area; this would also increase capacity by enabling recyclers to process more products in the same space. The International Association of Electronics Recyclers estimates that over 1.5 billion pounds of obsolete electronics are processed every year, and if such figures help make e-waste recycling a state and federal priority, then the importance of an effective scheduling method is clear, Stuart attests.
    Click Here to View Full Article
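
    The article describes Stuart's heuristic only at a high level, so the Python below is a toy interpretation: it scores each ordering by how long staging floor space stays occupied, and compares "fastest disassembly first" with "largest items with fast disassembly first." The items, footprints, and times are invented.

        # Compare two job orderings by total space-hours: the sum, over each
        # item's teardown, of the staging footprint still waiting. Lower is
        # better because floor space frees up sooner.

        # (name, staging footprint in sq ft, disassembly time in hours)
        items = [
            ("TV", 12.0, 1.0),
            ("PC tower", 4.0, 0.5),
            ("Copier", 30.0, 2.0),
            ("Monitor", 6.0, 0.75),
            ("Mainframe rack", 60.0, 3.0),
        ]

        def space_hours(order):
            remaining = sum(fp for _, fp, _ in order)
            cost = 0.0
            for _, footprint, hours in order:
                cost += remaining * hours   # space held during this teardown
                remaining -= footprint      # freed once the item is gone
            return cost

        fastest_first = sorted(items, key=lambda it: it[2])
        largest_fast_first = sorted(items, key=lambda it: it[1] / it[2],
                                    reverse=True)

        print("fastest first:     ", space_hours(fastest_first))       # 599.0
        print("largest/fast first:", space_hours(largest_fast_first))  # 471.5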

  • "Recognizing Excellence: ACM Calls for Award Nominations"
    ACM, New York, NY (10/17/03)

    ACM is calling for nominations for its 2003 awards recognizing outstanding technical and professional achievements in computer science and information technology. These awards, most with cash prizes, offer a unique opportunity to bring broad recognition to peers, colleagues, mentors, professors, and protégés for their contributions to CS and IT. They also represent noted milestones for career development and promotional opportunities in the IT field. The competitive process, administered by individual award committees, evaluates the top contenders; winners are selected based on specific criteria established for each award.
    For more information on how to submit nominees, visit http://www.acm.org/awards/award_nominations.html.

  • "2003 R&D 100 Awards Celebrate High-Tech: Software"
    R&D Magazine (09/03) Vol. 45, No. 9, P. 30

    Recipients of the 41st Annual R&D 100 Awards in the software category include NASA Glenn Research Center and ZIN Technologies' Microgravity Analysis Software System (MASS), which is employed in the Principal Investigator Microgravity Service Project to gather and study microgravity acceleration data from the International Space Station; MASS could be used to draw new insights about space's microgravity environment, as well as boost the fuel economy of cars and aircraft, help lower vehicle exhaust emissions, and improve fire safety. Procter & Gamble and Los Alamos National Laboratory's PowerFactoRE software, which is used to optimize current and future manufacturing processes, was honored for its ability to lower operational costs and capital spending by allowing users to anticipate, forestall, and cut reliability degradation, repair process time, and the frequency of equipment malfunctions. NASGRO 4.0 Fracture Mechanics & Fatigue Crack Growth Analysis software from NASA Johnson Space Center, Lockheed Martin Space Operations, and the Southwest Research Institute was cited because of its ability to scan space hardware, heavy equipment, pressure vessels, gas turbine engines, and offshore structures in order to gauge their resistance to fracturing. Pacific Northwest National Laboratory's Starlight Information Visualization System, whose applications include strategic planning, fraud detection, and law enforcement, enables users to mine vast amounts of data to unearth buried information or outline trends in three dimensions. The purpose of Lawrence Berkeley National Laboratory's EnergyPlus software is to simulate a commercial or residential building's expected energy management so it can be designed for optimal energy efficiency. EnergyPlus can also contribute to the development of energy standards, and be employed as a training tool and to ascertain new technologies' energy impact. Finally, R&D honored Idaho National Engineering and Environmental Laboratory's Change Detection System software, which is used to precisely compare similar images to uncover even the most subtle inconsistencies; the tool is mainly designed for retinal scans, fingerprint analysis, medical image comparison, and homeland security.
    Click Here to View Full Article

 
                                                                             