       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 425: Wednesday, November 20, 2002

  • "Comdex: Panel Says to Accept the Net is Vulnerable"
    InfoWorld.com (11/19/02); Weil, Nancy

    A panel of security experts at the Comdex trade show said that the Internet's vulnerability to exploitation is a fact of life, and everyone--businesses as well as home users--must pitch in to make it more secure. Counterpane Internet Security CTO and panelist Bruce Schneier attributed much of the insecurity of today's software to the fact that vendors cannot be held liable for poor-quality products, and said they would only make security a priority if enough consumers demanded it. Meanwhile, the security software industry is adding to the problem because it is in the habit of designing safeguards in response to threats rather than in anticipation of them. Several panelists said that the proliferation of wireless networks will require companies to deploy virtual private networks (VPNs), firewalls, and other tools, and to implement policies prohibiting workers from operating their own wireless equipment on corporate LANs. Some, however, believed that employees would ignore such policies, and urged companies to work to ensure network security regardless. Schneier described wireless security standards as "robustly insecure," and therefore unreliable. He added that corporate and federal officials should devote more attention to "actual criminals and not hackers," categorizing hackers as mostly kids who commit minor offenses that do little damage, and cybercriminals as malicious individuals who steal information and cause severe damage. Some panelists hailed President Bush's security initiative, which issued recommendations for more public-private collaboration and education, as a step in the right direction.
    Click Here to View Full Article

  • "Study Details Technology's Role in Boosting Productivity"
    SiliconValley.com (11/19/02); Sylvester, David A.

    U.S. businesses increased productivity by about 2 percent each year from 1995 to 2000, and one-third of that increase was attributable to technology, according to consulting firm McKinsey & Co. The new report helps clarify what made technology more useful in some sectors than in others. Retailers and financial services companies, for instance, were able to use technology to boost productivity far more than hotels and hospitals were. McKinsey analysts say businesses that re-engineered their operations to take advantage of technology benefited the most. One example is Wal-Mart, which used advanced inventory-tracking systems to find out which products sold well and where. The report also says changing government regulations played a key role in some cases: after the Securities and Exchange Commission updated rules to allow more incremental pricing of stocks, trading volumes increased, facilitated by the Internet and greater computerization. About this time last year, McKinsey released a report suggesting that much of the productivity increase was concentrated in technology-heavy fields such as computer manufacturing and semiconductors. A rise in living standards is directly attributable to productivity increases, since companies pay higher salaries to more productive employees. Still, McKinsey Global Institute director Diana Farrell says that precisely determining tech's effect on the economy is difficult, because some companies have not measured the benefits of tech equipment on their business. (A rough back-of-the-envelope reading of the headline figures follows this item.)
    http://www.siliconvalley.com/mld/siliconvalley/4559923.htm
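
    Taken at face value, the headline figures above imply a modest but real cumulative effect. The short calculation below is a back-of-the-envelope sketch of that arithmetic (assuming simple compounding and an even one-third technology share); it is illustrative only and not part of the McKinsey report.

        # Back-of-the-envelope sketch (not from the McKinsey report): compound
        # ~2% annual productivity growth over 1995-2000 and attribute one-third
        # of the cumulative gain to technology, per the figures cited above.
        annual_growth = 0.02      # ~2 percent per year, as cited
        years = 5                 # 1995 through 2000
        tech_share = 1 / 3        # share attributed to technology

        cumulative = (1 + annual_growth) ** years - 1
        tech_contribution = cumulative * tech_share

        print(f"Cumulative productivity gain: {cumulative:.1%}")              # ~10.4%
        print(f"Portion attributable to technology: {tech_contribution:.1%}") # ~3.5%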

  • "Copy Control Complaint Desk Opens"
    Medill News Service (11/19/02); Madigan, Michelle

    Critics of the 1998 Digital Millennium Copyright Act (DMCA) and its provisions will be able to submit their opinions to the Copyright Office either by mail or online until Dec. 18. The last time the office accepted DMCA-related comments, in October 2000, 235 submissions were registered in the first round, says Copyright Office senior attorney Rob Kasunic. The first round of the latest comment review will revolve around proposed exemptions to the law, and participants will be able to avail themselves of an online comment form. Kasunic notes his office will use the comments to propose revisions to the librarian of Congress, with decisions expected next October. The Electronic Frontier Foundation (EFF), which has long opposed the DMCA, will participate in the review, and EFF attorney Fred von Lohmann says that the organization is "encouraging others to make a clear record that the DMCA is being used in ways Congress never intended." The law has been criticized as anathema to fair-use rights and scientific research, and has been the subject of legal episodes, such as the arrest of ElcomSoft programmer Dmitry Sklyarov for distributing a program that bypassed e-book copy protection, and a threatened lawsuit against Professor Edward Felten for publishing research on how to circumvent a company's digital watermark technology. However, von Lohmann notes that the Copyright Office was "unsympathetic" to many such concerns during the last review two years ago. Copyright industry attorney Steve Metalitz is worried that amending the DMCA would only reduce the legal rights of content industry groups, and give hackers more room to pirate material.
    http://www.pcworld.com/news/article/0,aid,107129,00.asp

    To learn more about ACM's concerns regarding DMCA, visit http://www.acm.org/usacm.

  • "H-1B Program Gets More Heat"
    Investor's Business Daily (11/20/02) P. A4; Angell, Mike

    The H-1B visa program that American employers use to bring in foreign labor to fill mostly IT-related jobs has attracted criticism from both domestic and foreign employees. Norman Matloff of the University of California, Davis, argues that industry really hires overseas personnel because they work cheap, using an alleged shortage of American tech talent as an excuse. Meanwhile, Immigrants Support Network director Murali Devorakonda says that employers often take advantage of H-1B visa holders, either underpaying them or using the temporary nature of the visa as an excuse to dismiss them once they have completed a project. Furthermore, many employers use intermediary companies known as "body shops" that claim part of a foreign worker's salary as a commission. The annual H-1B worker cap currently stands at 195,000, and roughly 890,000 visa holders reside in the United States; the congressional vote on whether to maintain the visa cap will take place in 2003. The Information Technology Association of America (ITAA) announced in May that 1.1 million American tech jobs would be created in the next 12 months, 50 percent of which would go unfilled because of the dearth of qualified domestic workers. However, the association has since lowered the anticipated number of new jobs to 800,000 to account for the tech downturn. ITAA President Harris Miller insists that the H-1B visa program is doing its job--the fact that only 80,000 visas have been approved this year is evidence that the supply of foreign workers is adjusting to the decline in tech jobs.

  • "Secret U.S. Court OKs Electronic Spying"
    CNet (11/18/02); McCullagh, Declan

    The Foreign Intelligence Surveillance Court of Review has overturned an earlier ruling from the Foreign Intelligence Surveillance Court declaring that domestic police agencies and spy agencies must be kept separate in order to protect Americans' privacy, thereby widening law enforcement's authority to conduct electronic surveillance, wiretapping, and secret searches against people suspected of espionage and terrorism. When the lower court made its ruling in May, Justice Department lawyers argued that the enactment of the USA Patriot Act nullified the need for a wall between local and federal law enforcement, and also established that the types of monitoring authorized by the Foreign Intelligence Surveillance Act (FISA) could now be carried out if terrorist or espionage acts represented a "significant purpose" of an investigation rather than its primary purpose. U.S. Attorney General John Ashcroft, who requested the extension of powers, said the reversal will help usher in a new period of collaboration between police and federal agencies, and called it a "victory for liberty, safety, and the security of the American people." The ACLU and the National Association of Criminal Defense Lawyers had filed friend-of-the-court briefs urging the appeals court to uphold the lower court's ruling. "Because the FISA now applies to ordinary criminal matters if they are dressed up as national security inquiries, the new rules could open the door to circumvention of the Fourth Amendment's warrant requirements," warned Robert Levy of the Cato Institute. "The result: rubber-stamp judicial consent to phone and Internet surveillance, even in regular criminal cases, and FBI access to medical, educational and other business records that conceivably relate to foreign intelligence probes."
    http://news.com.com/2100-1023-966311.html

  • "Nearly 1 Million IT Jobs Moving Offshore"
    EarthWeb (11/19/02); Gaudin, Sharon

    Almost 1 million IT jobs will be farmed out overseas over the next 15 years, predicts a new report from Forrester Research, which warns that such a development could threaten the positions of entry- to mid-level American programmers unless they acquire more business-centric skills. Forrester's John McCarthy says the IT industry migration will be the first step toward the offshore transition of 3.3 million American jobs and $136 billion in wages to nations such as India, Russia, the Philippines, and China. He predicts that the next 16 months will be a heavy period of offshore migration, followed by a two-year lull, with even more overseas transference expected between 2005 and 2015. In addition to cheaper labor and less restrictive labor laws overseas, the quality of work by foreign IT professionals is better, according to McCarthy. "[Overseas workers have] done the most to turn IT development away from a mystical black art to a real business process... 'Just wing it' is not part of the culture there," he says. Illuminata analyst Gordon Haff insists that high-end, strategic IT jobs will stay in the United States, but people with low-end skills such as basic programming and maintenance programming are likely to see their job security threatened as the offshore transition occurs.
    http://itmanagement.earthweb.com/career/article.php/1503461

  • "Faster Than Speed of Byte"
    San Jose Mercury News Online (11/19/02); Stober, Dan

    The NEC Earth Simulator in Japan is currently the world's fastest computer, but IBM aims to recapture that title with a $267 million contract to build a pair of supercomputers for nuclear weapons modeling and other research at Lawrence Livermore National Laboratory. One machine, dubbed ASCI Purple, will comprise 12,600 IBM Power5 microprocessors and compute 2.5 times faster than the Earth Simulator, while its companion, Blue Gene/L, will feature 130,000 chips and compute 400,000 times faster than the average home PC. (A rough back-of-the-envelope comparison of these figures follows this item.) Physicist Bruce Goodwin of the Livermore lab says that ASCI Purple will finally give researchers the processing power needed to build a realistic 3D model of a detonating hydrogen bomb. The machine will also have 50 trillion bytes of temporary memory and 2 petabytes of storage capacity, according to IBM. Michael Nelson, IBM's director of Internet technology and strategy, is particularly enthusiastic that the Power5 chips used by the Livermore supercomputers will start showing up in commercial servers next year, and later become available in desktops and perhaps video games. He adds that the lab computers and their commercial counterparts will be capable of self-healing. Meanwhile, Goodwin notes that Blue Gene/L, which will be primarily dedicated to DNA research and weather forecasting, has the potential to become the first true petaflop machine. The NEC Earth Simulator holds the No. 1 spot on the most recent list of the Top 500 Supercomputer Sites, but machines in U.S. weapons labs have claimed second through fifth place.
    http://www.bayarea.com/mld/bayarea/news/4553593.htm
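
    The article's speed figures are relative; the sketch below turns them into rough absolute numbers, taking the Earth Simulator's roughly 35.9-teraflop Linpack rating and a 1-gigaflop home PC as assumed baselines that do not appear in the article.

        # Illustrative arithmetic only; the Earth Simulator's ~35.9-teraflop
        # Linpack figure and the 1-gigaflop home PC are assumptions, not
        # numbers taken from the article or from IBM.
        earth_simulator_tflops = 35.9

        # ASCI Purple is described as 2.5 times faster than the Earth Simulator.
        asci_purple_tflops = 2.5 * earth_simulator_tflops
        print(f"ASCI Purple, implied: ~{asci_purple_tflops:.0f} teraflops")   # ~90

        # Blue Gene/L is described as 400,000 times faster than a home PC.
        home_pc_gflops = 1.0
        blue_gene_tflops = 400_000 * home_pc_gflops / 1_000
        print(f"Blue Gene/L, implied: ~{blue_gene_tflops:.0f} teraflops")     # ~400

        # Spread across its 130,000 chips, that works out to roughly:
        per_chip_gflops = blue_gene_tflops * 1_000 / 130_000
        print(f"Per chip: ~{per_chip_gflops:.1f} gigaflops")                  # ~3.1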

  • "For W3C, It's a Question of Semantics"
    CNet (11/18/02); Festa, Paul

    The World Wide Web Consortium (W3C) last week revised several documents detailing the Resource Description Framework (RDF) and the Web Ontology Language (OWL), which relate to the Semantic Web, an envisioned future version of the Web in which machines can better understand the context of information. Such a system would, for example, help computers establish the similarity between the terms "book" and "manual," or relate "purchase order" to "PO." (A small illustrative example follows this item.) However, some technologists and industry analysts argue that the project is a retooled artificial intelligence proposition that postpones more important Web services development. ZapThink senior analyst Ron Schmelzer estimates that the development of a Semantic Web could take as long as 10 years, and adds that the effort will require industry leadership and a concentration on the development of commercial products in order to succeed. W3C members bristle at the idea that the Semantic Web is AI. "The conceptual models behind RDF are predicated on work in the digital library community," insists Eric Miller of the W3C. "You can think of this as a common framework that supports thesaurus, taxonomies and classification schemes."
    http://news.com.com/2100-1001-966208.html
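
    As a concrete illustration of the sort of statement the Semantic Web is meant to capture, the sketch below uses Python's rdflib library (an assumed tool; the W3C documents themselves are language-neutral) to assert that two hypothetical vocabulary terms, Book and Manual, name the same class.

        # Minimal sketch, assuming the rdflib library is available; the ex:
        # namespace and its terms are hypothetical, not W3C vocabulary.
        from rdflib import Graph, Namespace
        from rdflib.namespace import OWL, RDF

        EX = Namespace("http://example.org/vocab#")

        g = Graph()
        g.bind("ex", EX)
        g.bind("owl", OWL)

        # Declare two classes and state that they describe the same concept,
        # so an RDF-aware agent can treat data typed as ex:Manual as ex:Book.
        g.add((EX.Book, RDF.type, OWL.Class))
        g.add((EX.Manual, RDF.type, OWL.Class))
        g.add((EX.Book, OWL.equivalentClass, EX.Manual))

        for subject, predicate, obj in g:
            print(subject, predicate, obj)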

  • "Say 'Cheese' for the Robot Photographer"
    Associated Press (11/18/02); Fredrix, Emily

    Computer science professor Bill Smart of Washington University in St. Louis has created a robot programmed to move through its surroundings taking candid photographs of people, as a way to integrate and demonstrate the applicability of undergraduate student projects in navigation, interaction, and determining photo composition. The machine, dubbed Lewis, scans for faces with a video camera and uses a digital camera to capture single frames. Lasers and sonar gauge the robot's distance from objects, while software analyzes the captured frames' composition to find skin and faces. (A generic sketch of this kind of skin detection follows this item.) The 1.4-meter-tall Lewis houses a pair of computers powered by four car batteries. Although the 300-pound machine is programmed to avoid collisions with people, a handler is always present to shut it off in case a collision becomes unavoidable. The pictures the robot takes can be viewed and printed out via a wireless computer-robot connection. Smart says, "With a lot of robot stuff, it's difficult to get people enthusiastic. We're using photography as a framework. You can't show one piece of these applications on their own." The quiet robot runs on programs written by students.
    http://www.msnbc.com/news/835084.asp
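
    The article does not detail Lewis' vision software; the snippet below is a generic sketch of the skin-color thresholding commonly used as a first pass for finding faces in a frame, written with the OpenCV library (an assumed dependency) rather than the Washington University code.

        # Generic skin-color segmentation sketch (not the actual Lewis software):
        # convert a captured frame to YCrCb space and threshold the chroma
        # channels, a common first pass before looking for face-sized regions.
        import cv2
        import numpy as np

        def find_skin_regions(frame_bgr):
            """Return a binary mask of likely skin pixels in a BGR frame."""
            ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
            # Rough chroma bounds for skin tones; real systems tune these carefully.
            lower = np.array([0, 133, 77], dtype=np.uint8)
            upper = np.array([255, 173, 127], dtype=np.uint8)
            mask = cv2.inRange(ycrcb, lower, upper)
            # Clean up small speckles before searching for face-sized blobs.
            kernel = np.ones((5, 5), np.uint8)
            return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

        if __name__ == "__main__":
            frame = cv2.imread("captured_frame.jpg")   # hypothetical file name
            if frame is not None:
                skin = find_skin_regions(frame)
                print(f"Skin-like pixels: {skin.mean() / 255:.1%} of the frame")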

  • "A Vote for Less Tech at the Polls"
    Wired News (11/19/02); Glasner, Joanna

    Critics of computerized touch-screen voting machines, which saw use in the recent congressional election, maintain that a paper record is especially necessary to ensure that the votes cast are accurate. Bryn Mawr College computer science professor and voting technology expert Rebecca Mercuri explains that a paper trail is critical because computers are prone to malfunctioning; she adds that voting machines are especially vulnerable, since they are only used a few times a year. Stanford Research Institute Computer Science Laboratory principal scientist Peter Neumann also supports paper ballots because he says current touch-screen systems are not reliable: "If there's no assurance your vote goes through, it's irrelevant." There is also concern about the fact that the software used by such machines is not open source and therefore not publicly available for inspection, while further speculation posits that the software running on individual machines could differ from the code supplied by the machine vendors. Todd Urosevich of Election Systems & Software argues that exposing the software could threaten security--for example, it could help certain people rig election results by allowing them to manipulate code. Stephen Ansolabahere of the Caltech-MIT Voting Technology Project notes that voters reported incidents during this month's election in which the candidates they voted for with touch-screen systems were not the ones the machines displayed. Still, he adds that preliminary reviews find that voters had less trouble with computerized systems this time than they did back in 1998. Despite the glitches and criticisms, Sequoia Voting Systems estimates that 22.3 million registered voters used touch-screen machines in the recent election, and expects even more voters to use them by 2004. (A minimal sketch of how a paper audit would cross-check an electronic tally follows this item.)
    http://www.wired.com/news/business/0,1367,56370,00.html

    To read more about ACM's activities involving e-voting, visit http://www.acm.org/usacm.
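
    The paper-trail safeguard the critics describe amounts to an independent cross-check of each machine's electronic tally. The sketch below illustrates that idea with made-up precinct numbers; it is not drawn from any vendor's system.

        # Illustrative sketch with made-up numbers (not vendor code): compare
        # each precinct's electronic tally against a hand count of its
        # voter-verified paper records and flag any disagreement for a recount.
        electronic_tally = {
            "precinct_1": {"Candidate A": 412, "Candidate B": 388},
            "precinct_2": {"Candidate A": 530, "Candidate B": 529},
        }
        paper_audit = {
            "precinct_1": {"Candidate A": 412, "Candidate B": 388},
            "precinct_2": {"Candidate A": 529, "Candidate B": 530},
        }

        for precinct, counts in electronic_tally.items():
            if counts == paper_audit.get(precinct):
                print(f"{precinct}: electronic and paper counts match")
            else:
                print(f"{precinct}: counts disagree -- trigger a full recount")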

  • "Internet, Grid to Forge Brave New Computing World"
    Grid Computing Planet (11/14/02); Shread, Paul

    A recent report from PricewaterhouseCoopers forecasts that the Internet will evolve into a "global networked computing utility" stemming from the intersection of grid computing, ubiquitous computing, IP dial tone, and computing as a utility; these trends are advancing as the result of growing Internet usage and increasing demands of enterprise computing. PricewaterhouseCoopers' Eric Berg says his firm anticipates that research into grid computing schemes, protocols, and middleware will likely resolve problems with reliability, security, and business models that are holding up the implementation of Web services. The report also predicts that the global telephone network will transition from separate sections of voice-over-Internet Protocol (IP) enclosed by legacy telephone system protocols to a model that supports end-to-end IP use, thus ending telecom carriers' domination of the development of new telephone services. Meanwhile, enterprises are expected to move away from proprietary computing infrastructure and increasingly rely on service providers for their computing needs. As computing becomes ubiquitous, the report predicts that IT will become so deeply embedded in everyday life that it will become invisible, but only if user interfaces, system software, and pervasive networks develop accordingly. The report also finds that current developments in computing and communications are being driven by several trends, particularly the growing emphasis on the flexibility, reliability, and manageability of scalable computing and communications infrastructures. Bolstering this trend are the movement of resource management tools to distributed computing environments, system vendors fortifying their systems with additional redundancy, new architectures for data centers and Web applications, and new system and network management tools.
    Click Here to View Full Article

  • "Webs Within Web Boost Searches"
    Technology Research News (11/20/02); Patch, Kimberly

    Web researcher Filippo Menczer of the University of Iowa is working to build a mathematical model that will provide more comprehensive search engine results based on text similarity and links between Web pages. While current search engine techniques already use text and links as the basis of rankings and results, Menczer is studying a sample of 150,000 related pages to find out exactly how useful those approaches are. Among the discoveries so far is that the likelihood of pages being linked together correlates with their similarity in content. (A rough sketch of the kind of text-similarity measurement involved follows this item.) Menczer's model also encompasses factors previously unaccounted for, such as a page's popularity, which makes it even more likely to be linked to. He says his mathematical model will eventually yield more relevant pages and encourage natural collaborative activities among groups with similar interests on the Web. Menczer is also looking into creating visual maps showing relationships based on textual similarity, meaning, and links. Physics professor Shlomo Havlin of Israel's Bar-Ilan University says Menczer's work is important to better understanding the relationship between ideas and networks on the Internet. That knowledge can then be applied to other areas of research, such as network stability and immunization against computer viruses.
    Click Here to View Full Article
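
    As a rough illustration of the kind of measurement such a study involves (this is not Menczer's model), the sketch below computes the cosine similarity between the word-count vectors of two pages; across a large sample, scores like this would be compared against whether the pages actually link to each other.

        # Illustrative sketch only: bag-of-words cosine similarity between two
        # pages, the text-similarity side of the link/content relationship
        # described above. The sample strings are invented for the example.
        import math
        from collections import Counter

        def cosine_similarity(text_a, text_b):
            a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
            shared = set(a) & set(b)
            dot = sum(a[w] * b[w] for w in shared)
            norm_a = math.sqrt(sum(v * v for v in a.values()))
            norm_b = math.sqrt(sum(v * v for v in b.values()))
            return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

        page1 = "grid computing links idle machines into one virtual supercomputer"
        page2 = "a virtual supercomputer built by linking idle machines with grid software"
        print(f"similarity: {cosine_similarity(page1, page2):.2f}")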

  • "Intel's 'Hyperthreading' Not Enough to Sew Up PC Sales"
    Boston Globe (11/18/02) P. C1; Bray, Hiawatha

    Hyperthreading technology from Intel promises--and appears to deliver--greater performance for processors. Picking up where the defunct Digital Equipment left off, the company developed the ability to build CPUs with two sets of "registers" that function in compatible operating systems as two separate chips, allowing two separate programs to run simultaneously. Dual-CPU machines may offer similar capabilities, but they are expensive. Normally it may seem as if a computer with a conventional CPU is running two or more programs at the same time, when in fact the CPU is running them one at a time, very fast. Hyperthreading, which Intel first embedded in its Xeon chips and recently debuted in the Pentium 4, is supposed to relieve the performance slowdowns that occur when one of the applications a PC is running demands excessive CPU power. (A simple timing sketch illustrating the idea follows this item.) However, Mark de Frere of Advanced Micro Devices and Insight 64 analyst Nathan Brookwood agree that people rarely run two demanding applications at once. Although Intel cited research finding that 75 percent of PC users run multiple programs simultaneously, de Frere counters that most are "soft" applications such as email and Web browsers--applications that require so little data that the performance gains from hyperthreading would be negligible. Intel argues that hyperthreading offers enough added performance to change the way people use PCs.
    http://digitalmass.boston.com/news/globe_tech/upgrade/2002/1118.html
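
    The scenario described above, in which two genuinely CPU-hungry programs compete for one processor, can be illustrated with a small timing experiment. The sketch below uses Python's multiprocessing module as a rough stand-in for two hardware threads; it is not Intel benchmark code, and the speedup it shows (if any) depends entirely on the machine it runs on.

        # Illustrative timing sketch (not Intel benchmark code): run two
        # CPU-bound tasks back to back, then concurrently. With two hardware
        # threads or cores available, the concurrent run should finish sooner;
        # with only one, the two runs take roughly the same total time.
        import time
        from multiprocessing import Process

        def busy_loop(n=10_000_000):
            total = 0
            for i in range(n):
                total += i * i
            return total

        def sequential():
            busy_loop()
            busy_loop()

        def concurrent():
            workers = [Process(target=busy_loop) for _ in range(2)]
            for w in workers:
                w.start()
            for w in workers:
                w.join()

        def timed(label, fn):
            start = time.perf_counter()
            fn()
            print(f"{label}: {time.perf_counter() - start:.2f} s")

        if __name__ == "__main__":
            timed("sequential", sequential)
            timed("concurrent", concurrent)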

  • "MEMS Really Is the Word at Munich Electronics Trade Show"
    Small Times Online (11/19/02); James, Kyle

    MEMS developers and manufacturers spoke excitedly about the prospects of their sector at the recent Electronica 2002 trade show in Munich. MEMS sensors for automobiles have entered mass production, making up one-eighth of the market for automobile sensors and growing fast. Wicht Technologie Consulting's (WTC) Henning Wicht also says medical devices will be a major market for MEMS technology in the future, as the technology is useful for stimulators and for measurement and diagnostic tools. Attendees were generally optimistic, noting that the MEMS industry has grown at a steady pace despite the overall tech sector slump. Wicht's group conducted a market study on radio-frequency MEMS (RF MEMS) and found that those components would make up a more than $1 billion market by 2007. RF MEMS consolidate different wireless components onto a single chip and consume far less power than traditional wireless components. Missile systems, military radar, the Global Positioning System, and next-generation mobile phones are just a few applications of RF MEMS. Attendees also worried about Asia's fast-growing MEMS capability, and encouraged one another to press hard to keep Europe's lead. European Commission representatives who spoke at the conference said a new EU program is in development to help turn European research successes into products for the commercial market.
    http://www.smalltimes.com/document_display.cfm?document_id=5065

  • "New Way to Dramatically Increase Data Storage Capacity"
    Newswise (11/19/02)

    Researchers led by chemistry professor John Fourkas of Boston College's Merkert Chemistry Center report in the December issue of Nature Materials that they have discovered stable and cheap fluorescent materials capable of storing 3D data at high densities. The team uses laser light to "write" fluorescent spots at specific points in the material via multiphoton absorption, which makes it possible to read the data back at relatively low laser intensities. Unlike previous materials used for multiphoton data storage, the new materials can be scanned repeatedly and show little degradation after prolonged use. The amount of data that can be stored on a disc made from the new materials is almost 20 times greater than the storage capacity of a standard DVD. (A rough capacity calculation follows this item.) In addition, the materials are moldable, their properties can be chemically tweaked, and they can be fabricated in a variety of forms, including crosslinked polymers and molecular glasses. Thus, later generations of data-storage devices may no longer need to be disk-shaped.
    http://www.newswise.com/articles/2002/11/DATASTOR.BOC.html
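
    Taking the 4.7 GB capacity of a single-layer DVD as an assumed baseline (the article gives only the "almost 20 times" multiple), the implied capacity works out roughly as follows.

        # Rough arithmetic sketch; the 4.7 GB single-layer DVD baseline is an
        # assumption, since the article cites only an "almost 20 times" multiple.
        dvd_capacity_gb = 4.7
        multiple = 20

        implied_capacity_gb = dvd_capacity_gb * multiple
        print(f"Implied capacity of the fluorescent medium: ~{implied_capacity_gb:.0f} GB")  # ~94 GB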

  • "Loosening Up the Airwaves"
    National Journal (11/16/02) Vol. 34, No. 46, P. 3410; Munro, Neil

    The federal government is working to create new opportunities for the wireless industry in America, hoping to spur investment and make the market more competitive on the global scene. The FCC and the Commerce Department's National Telecommunications and Information Administration (NTIA) are working together to free up more of the electromagnetic spectrum. Earlier this year, the government took a portion of spectrum dedicated to military use and gave it to the industry. In addition, restrictions on spectrum license ownership for mobile carriers will soon be lifted, according to the FCC, which will allow operators to buy out competitors in tough local markets. The FCC hopes this deregulation will spur investment so that carriers can roll out advanced technology, such as location-based services. Both the FCC and the NTIA support the new unlicensed 802.11 technology used for short-range wireless data connections, as well as ultrawideband, which can be used for data transfer and other applications. FCC Chairman Michael Powell declared last month that he supports recommendations from his Spectrum Policy Task Force, which suggested that license holders be allowed more flexibility in how they use their portion of spectrum, such as using it for different applications during the day and at night.

  • "Think the 'Digital Divide' Is Closing? Think Again"
    Maryland Daily Record--TechLink (11/02) P. 11; Armstrong, Mario

    Educators, politicians, parents, community activists, and businesses must take a creative approach to bridging the "digital divide" as the Bush administration seeks to cut key technology programs and put more money toward anti-terrorism efforts, writes technology advocate Mario Armstrong. The White House wants to terminate community technology centers and programs that helped teachers improve their computer skills, as well as other initiatives such as the Commerce Department's Technology Opportunities program, because it believes many of those programs are duplicative. According to a new Commerce Department report, the Bush administration believes the digital divide is closing now that more than half the nation is online and 2 million new Internet users are going online each month. However, only 25 percent of the nation's poor were online in 2001, along with only 39.8 percent of African Americans and 31.9 percent of Hispanic Americans. What is more, simply having access to the Internet does not mean Americans will use the technology as anything more than consumers; some technology programs, for example, were a resource for training for IT-related jobs. More than 350,000 tech jobs remain unfilled today, which indicates the workforce still needs technical training, and minorities and women remain two largely untapped sources of new talent. A recent Information Technology Association of America study found that "women and some minorities are underrepresented in most technical fields," including information technology. Armstrong says boosting entrepreneurship, better use of information and technology, and a strong commitment to education are necessary to overcome the loss of key government programs and finally end the digital divide.
    http://www.mddailyrecord.com/specialpubs/techlink/1102j-digspin.html

  • "Digital Entertainment Post-Napster: Music"
    Technology Review (11/02) Vol. 105, No. 9, P. 56; Kushner, David

    The music industry's triumph over Napster, which record labels shut down in an effort to protect their content from piracy and boost flagging CD sales, was short-lived, since more open peer-to-peer networks have proven harder to dislodge. In response, the industry is trying to standardize copy-protected CDs, a move that could curtail fair-use rights, according to consumer advocates such as DigitalConsumer.org co-founder Joe Kraus. Copy-protected CDs that play on stereo audio equipment but cannot be copied onto computer hard drives are already available: for instance, over 30 million CDs worldwide use Midbar Tech's Cactus Data Shield, including an album released by Universal Music Group in the United States. Also debuting this year are CDs that carry two sets of the same songs, one that can be played on a computer under certain constraints and another that plays on the stereo with no restrictions. Because the private sector is having difficulty developing a copyright-protection standard, Sen. Fritz Hollings (D-S.C.) has proposed legislation that would require digital-media device manufacturers to install federally approved copy-protection measures in their products. Complicating the situation are hackers, who are constantly striving to crack new protection schemes and share their strategies with others. Also critical to the adoption of copyright-protection technologies is the creation of a labeling system that lets consumers know whether CDs have been modified so as not to play on certain machines. Europe is being used as a testbed for copyright-protection systems developed by record companies, and these systems should show up in the United States by year's end.

  • "A Many-Handed God"
    Electronic Business (11/02) Vol. 28, No. 11, P. 40; Poe, Robert

    Nanotechnology can be applied to many industries, but it is the electronics industry that is attracting the most interest. Although nanotech is already being used in disk drive heads and magnetic media, for instance, research is proceeding apace into such materials as carbon nanotubes, which could be used to improve displays and sensors, while longer-term efforts into molecular electronics and molecular computing could yield even greater benefits. However, it is difficult to ascertain the potential market for nanotech-based products, given the field's youth and its focus on enabling other technologies; Canadian Nanobusiness Alliance President Neil Gordon estimates that current nanotech sales are somewhere between $30 million and over $100 million, while the National Science Foundation (NSF) forecast in March 2001 that a worldwide nanotech market surpassing $1 trillion annually could emerge within 10 to 15 years. Mihail Roco of the NSF calculates that private business is spending about $500 million on nanotech this year, with major companies such as IBM and Hewlett-Packard standing out with projects and products such as "pixie dust" used by disk drive magnetic media. Meanwhile, the proposed $710 million fiscal 2003 National Nanotechnology Initiative (NNI) spending budget will be distributed among 10 government agencies, all of which will have to align their own agendas with NNI's research strategy of conducting fundamental research, meeting "grand challenges," setting up centers of excellence, building a research infrastructure, and studying nanotech's ethical, legal, and societal implications. Startups such as Nantero and Zettacore are also focusing on nanotech with projects such as nanotube-based random-access memory (NRAM) and multi-porphyrin nanostructures. Other countries are pursuing their own nanotech development efforts in apparent response to the NNI mandate.
    Click Here to View Full Article

 
                                                                             