Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either IBM or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 499: Friday, May 23, 2003
- "Data Collection Is Up Sharply Following 9/11"
Wall Street Journal (05/22/03) P. B1; Davis, Ann
Data collection efforts have expanded in the wake of the Sept. 11 attacks, but the same commercial and government databases that would ostensibly be used to thwart future terrorist incidents could also be used to gather information on innocent American citizens. Claiming most of the attention--and criticism--from privacy advocates are the CAPPS airline passenger profiling system and Total Information Awareness, which was recently renamed Terrorist Information Awareness; but lesser-known systems such as the Violent Gang and Terrorist Organization File (VGTOF) and the FBI's Terrorism and Intelligence Data Information Sharing Data Mart are also being developed to connect previously uncommunicative databases and combine public records with intelligence based on investigative conjecture. VGTOF, originally devised to gather information on gangs, has been extended to include anyone being probed by the FBI in domestic or overseas terrorist investigations, including people with no records of criminal activity. More than 7,000 people are listed in the database, along with gang members numbering in the tens of thousands. The Data Mart imports data from federal agencies and connects to local law enforcement databases, and employs text-mining software to scan for possible signs of terrorist activity throughout more than 1 billion documents collated from FBI field offices. Furthermore, the scope of police intelligence files is being widened through projects such as Rissnet. However, how often such systems produce false positives is unknown, because the databases are not open to the public. Meanwhile, the General Accounting Office believes the meshing of all these databases will be hindered by non-interoperable operating systems and computing languages.
- "Computing's Lost Allure"
New York Times (05/22/03) P. E1; Hafner, Katie
Fewer new university students are entering computer science courses as job prospects in that field continue to diminish. At the University of California at Berkeley, computer science professor Brian Harvey says just 350 students enrolled in his introductory course this spring, compared to 700 students who signed up in the fall of 2000. At that time, many computer science graduates had several job offers lined up and some were even offered signing bonuses to start before graduation. At Carnegie Mellon University, the computer science department has seen a 36 percent enrollment drop-off from a peak in 2001. MIT reports 20 percent fewer freshmen signing up for electrical engineering and computer science, compared to recent years. University of Texas computer sciences chairman J. Strother Moore worries that the decrease may hurt not only the department but also the strength of the future economy. Even as freshman enrollments are down, graduate programs in computer science are growing because of students who want to avoid the current job market. Another effect of the downturn in programming job opportunities is that more students are studying for second degrees or in more applied fields of computing, such as business information technology or bioinformatics. Carnegie Mellon computer science department head Randal Bryant says one benefit of lower enrollments is that students are sincerely interested in the field rather than just their career prospects. Meanwhile, corporations, concerned about a lack of future computer scientists, are intensifying their efforts to promote the field to children, even those in kindergarten. IBM's Dr. Gabby Silberman says he is concerned about the enrollment decline because so much of what the world depends on today is based on computing technology.
(Access to this site is free; however, first-time visitors must register.)
- "The Computer World Could Use More IT Girls"
Los Angeles Times (05/21/03); Margolis, Jane
UCLA education researcher Jane Margolis writes that future social, economic, and political trends in the United States depend on the type of people attracted to computer technology and the values they carry. She observes that girls are sorely lacking in the technology sector, though as many females as males surf the Web and use instant messaging. Only one quarter of the 2002 IT workforce consisted of women, even though almost 50 percent of the U.S. workforce was female. Furthermore, women account for approximately 20 percent of all computer science majors, and girls make up just 17 percent of all high school students taking the Advanced Placement Computer Science exam. Margolis argues that the lack of a female contribution to computer technology design will reverberate throughout the nation's economic and social architecture, lock women out of educational and economic opportunities, and result in products that do not fulfill women's needs, an example being voice-recognition systems chiefly calibrated to males' distinct vocal nuances. She cites research showing that the prevailing view of computer science culture is being dictated by "a small substrata of [game-oriented] male students," while women are struggling to connect computing to wide-ranging areas, such as medicine and social issues. Computer science education in the schools comes up short in this regard. Margolis also believes that the gaming industry needs to renovate the products it markets, the majority of which offer ultraviolent male-oriented fare that presents women as sex objects.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
To learn more about ACM's Committee on Women in Computing, visit http://www.acm.org/women
- "W3C Makes Patent Ban Final"
CNet (05/21/03); Festa, Paul; Bowman, Lisa M.
World Wide Web Consortium (W3C) director Tim Berners-Lee announced on Tuesday that his organization had finalized the Royalty-Free Patent Policy, which excludes royalty-bearing patents from Web services standards development except in cases where W3C working groups see no way to circumvent technologies based on such patents. Berners-Lee wrote in the decision paper that "The decision to base the Web on royalty-free standards from the beginning has been vital to its success until now...By adopting this Patent Policy, with its commitment to royalty-free standards for the future, we are laying the foundation for another decade of technical innovation, economic growth and social advancement." The finalized draft represents a compromise between the open-source community, which thinks that patents hinder standards development and adoption, and proprietary software companies that do not like the idea of freely releasing intellectual property. Open Source Initiative co-founder Bruce Perens praised the W3C's decision, but warned that the standards-setting process is unprotected from submarine patents that are filed, but not granted, while a W3C technical recommendation is still under development. The W3C's patent policy has prompted major industry players such as Microsoft and IBM to go over the heads of the consortium, although IBM's Scott Cook wrote in an email interview that his company approves of the completed policy. In the meantime, IBM, Microsoft, and SAP have submitted certain Web services standards proposals to the Organization for the Advancement of Structured Information Standards.
- "Take Tech Threats Seriously, Feds Say"
Medill News Service (05/22/03); Wenzel, Elsa
Charles McQueary of the Homeland Security Department told the House Select Committee on Homeland Security's Subcommittee on Cybersecurity, Science, and Research and Development on Wednesday that the department's budget for biological and nuclear attack defense research exceeds the budget for critical infrastructure protection by more than a hundredfold. He also expressed worry that the tech budget, which currently stands at $5 million, is not sufficient to overhaul emergency crew communications, disseminate information to the public, and deploy other types of homeland security technology. McQueary and several subcommittee members--Reps. Zoe Lofgren (D-Calif.) and Christopher Cox (R-Calif.) among them--want technological safeguards and response systems rapidly developed, but do not want short-term solutions to devour all their funding. "These issues are difficult to grasp, and not as easy to comprehend as the threats to our borders and infrastructure," Lofgren declared. McQueary wants $30 million more earmarked for a private-industry technology clearinghouse, $10 million more for academic programs, and $90 million more for threat evaluation, including cybersecurity. He noted that the Homeland Security Department is debating which 10 universities will serve as "centers of excellence" where terrorist countermeasures can be developed. Subcommittee Chairman Rep. Mac Thornberry (R-Texas) said that sifting through the numerous proposed solutions will be problematic, because time is not on the United States' side. Rep. Gregory Meeks (D-N.Y.) recommended that first-response systems be low-tech in order to alert the largest number of people in the event of terror attacks.
- "Bill in Congress Would Curb L-1 Visa Use for Foreign Workers"
Computerworld (05/21/03); Thibodeau, Patrick
U.S. Rep. John Mica (R-Fla.) has unveiled new legislation designed to protect U.S. high-tech workers from losing jobs to lower-paid foreign workers, targeting criticism of the L-1 visa program and how third-party outsourcers use it to bring cheap labor into the country. Under the L-1 visa program, companies with subsidiaries overseas are allowed to transfer executives and workers to the United States. However, the program has come under heavy criticism because foreign outsourcing firms are using it to place skilled tech workers in the United States. Mica's bill would prevent third-party outsourcers from serving as "body shops," but it faces some serious challenges. Displaced U.S. workers have expressed concern about the bill because it leaves a loophole for companies to hire workers overseas and then send them to the United States, while high-tech companies say they have legitimate reasons for bringing in foreign workers. For example, high-tech companies may need foreign employees who helped develop software to assist clients in using the applications. More than 325,000 foreign workers currently hold L-1 visas, which enable them to work in the United States for seven years.
Click Here to View Full Article
- "ACM Honors Peter Franaszek for Contributions to Data Encoding"
AScribe Newswire (05/21/03)
ACM will give the 2002 Paris Kanellakis Theory and Practice Award to Dr. Peter Franaszek for his contributions to digital data encoding, which helped usher in revolutionary breakthroughs in digital media's recording density and digital communication systems' transmission bandwidth. Constrained channel coding is a key ingredient of communications and digital recording systems, one that facilitates clock synchronization, rapid data synchronization, and configuration of the signal spectrum. IBM researcher Franaszek's theoretical research laid the groundwork for several codes currently embedded in pervasive industry standards. Franaszek earned a Brown University Sc.B. in electrical engineering, and an M.A. and Ph.D. from Princeton in the same field. He co-invented the 8B/10B code, which is used in the Gigabit Ethernet and InfiniBand standards, and created the (2,7) RLL code. Franaszek is the author of over 40 technical papers and the owner of 44 patents; he counts basic computer organization, data management, and representation among his current research interests. Franaszek will receive the Kanellakis Award on June 7 at the ACM Annual Awards Banquet in San Diego, adding it to past honors that include the IEEE's 1989 Emanuel R. Piore Award and several IBM Corporate and Outstanding Innovation awards.
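The run-length constraint behind a (2,7) RLL code is easy to state: a valid coded bit sequence places at least two and at most seven 0s between consecutive 1s, which bounds the spacing of signal transitions and supports the clock synchronization described above. The following minimal sketch checks that constraint (an illustration only, not Franaszek's actual encoder; the function name is ours):

```python
def satisfies_rll(bits: str, d: int = 2, k: int = 7) -> bool:
    """Return True if the binary string obeys the (d, k) run-length
    constraint: every pair of consecutive 1s is separated by at
    least d and at most k 0s."""
    ones = [i for i, bit in enumerate(bits) if bit == "1"]
    for left, right in zip(ones, ones[1:]):
        zeros_between = right - left - 1  # count of 0s in the gap
        if zeros_between < d or zeros_between > k:
            return False
    return True
```

For example, "1001000001" satisfies the (2,7) constraint (gaps of two and five 0s), while "101" fails the minimum of two 0s between 1s.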
Click Here to View Full Article
- "Will Taxing E-Mail Stop Spam?"
IDG News Service (05/22/03); Gross, Grant
A number of proposals that aim to halt the relentless growth of unsolicited commercial email, or spam, are being debated in Congress. A Wednesday hearing of the Senate Commerce, Science, and Transportation Committee brought several proposals into focus, including Sen. Mark Dayton's (D-Minn.) suggestion to impose a small tax on each piece of commercial email sent, as well as organize a federal anti-spam SWAT team. Sens. Ron Wyden (D-Ore.) and Conrad Burns (R-Mont.) used the hearing as a platform to promote their Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act, which would ban deceptive email headers and force mass emailers to include opt-out policies for users. The measure was criticized by Orson Swindle of the FTC, who said that companies are averse to dealing with spam complaints because they could negatively impact their marketing opportunities, while users should not have to opt out of every piece of spam they get. He supports a solution that would allow users to only receive email from people in their address books, and thinks that subjecting spammers to criminal penalties could be an effective deterrent. Meanwhile, Sen. Charles Schumer (D-N.Y.) called for the passage of an international spam treaty that would prevent spammers from relocating their operations overseas if they are outlawed in the United States. Scelson Online Marketing's Ronald Scelson said that current ISP policies, which involve cutting off email marketers' network access in response to consumer complaints, are forcing legitimate marketers to mask sender information; he introduced his own spam solution in which customers can bounce bulk messages back to the sender by checking a "no bulk email" box, which would be included in all email applications.
- "Linux and the Law"
NewsFactor Network (05/22/03); Wrolstad, Jay
Supporters of the open-source movement are gaining powerful allies as Linux and other open-source products find favor with governments in the United States and abroad. Venezuela, Britain, Germany, and France have started to accept open-source software, while legislation is pending in California, Texas, and Oregon that could make open-source offerings more advantageous--in fact, Oregon legislators are debating a bill that would require agencies to justify spending on proprietary software. Rep. Phil Barnhart (D-Ore.), who sponsored the Oregon bill, believes open-source software will enable the state to save a tremendous amount in licensing fees, assert more control over its computer systems, and achieve database interoperability. Aberdeen Group's Bill Claybrook adds that open-source software could be a less expensive option for schools and government entities that are unlikely to use the innumerable features of proprietary offerings. Stacey Quandt of Giga Information Group partly attributes the introduction of the Oregon bill to political pressures. Meanwhile, Mike Wendy of the Initiative for Software Choice is convinced the Oregon, Texas, and California open-source bills are nonessential, because "The market is working. Linux and other open-source products are proliferating." Quandt says that government interest in open source is making more people aware of the technology and its viability, while Claybrook observes that Microsoft is under pressure to lower its licensing fees as a result.
- "At Bell Labs, Hard Times Take Toll on Pure Science"
Wall Street Journal (05/23/03) P. A1; Berman, Dennis K.
Lucent Technologies' Bell Labs operation has become just like any other corporate research laboratory, heralding the end of free-rein "pure" research that is not tied to any specific company product or service. Lucent trimmed Bell Labs' research budget to around $115 million, about one-third of the amount it received in the mid-1990s. Research lines have been jettisoned where possible or shut down depending on their relevance to Lucent's recovering telecommunications business. Bell Labs' decline is a "national tragedy," according to MIT President Charles Vest, and represents a general shift away from basic research in America. The National Academy of Sciences says the government today funds about one-third of all basic research in the United States, compared to about two-thirds of U.S. research during the Cold War era. Recent market pressures have caused commercial sponsors to back away as well, and last year Xerox spun off its Palo Alto Research Center as an individually accountable entity. Bell Labs President William O'Shea explains that corporate research today needs to be more responsive to shareholders instead of wide-ranging national interests, as when the group was funded by the AT&T telecommunications monopoly; during that time, Bell Labs scientists won six Nobel Prizes and assisted NASA in the Apollo space program. Today, research groups collaborate with business colleagues to help solve customers' immediate problems and work on projects that have clear product-oriented goals. Optical networking researcher Rod Alferness says the new focus is in some ways satisfying: "We get joy out of doing research, and double joy out of seeing it in real use," he says.
- "The Pixel as Shutterbug, Embedded in Your Screen"
New York Times (05/22/03) P. E6; Austen, Ian
Low-temperature polysilicon thin film transistor liquid crystal displays (LCDs) are being developed by Toshiba and other manufacturers to capture and copy images, and could later be tweaked for more sophisticated functions, such as recharging batteries. Toshiba engineers are working on a prototype LCD in which the glass is first heated by a laser, and then cooled to produce crystals that electrons can pass through more easily, allowing thin film transistors to be built directly on the display. Conventional glass, in contrast, can only support simple transistors for activating and deactivating pixels, while driver transistors that control display operation must be positioned around the screen; the need to link pixels and drivers can cause reliability problems, according to Toshiba's Masato Kemmochi. The pixels of Toshiba's prototype image-capturing LCD serve as shutters, while each pixel is split into a trio of subpixels--one red, one green, and one blue--to provide color. Embedded in each subpixel are image-capturing semiconductors, allowing the screen to function as a scanner. The subject to be copied is placed on the scanner, and the screen illuminates the object with its backlight, taking a picture by opening all the subpixels. With such a display integrated in a laptop or tablet PC, students could capture images of library book pages, while handheld organizers with the same technology could read bar codes, notes Steve Vrablik of Toshiba America Electronic Components. Last fall, Sharp unveiled a polysilicon display that supported a replica of a late 1970s processor, furthering the idea that the technology could be used to build a complete system on a chip, which would save power, space, and money.
(Access to this site is free; however, first-time visitors must register.)
- "Canadian Security Conference Features Weird and Woeful Predictions"
IT World Canada (05/15/03); Careless, James
The 15th annual Canadian IT Security Symposium was characterized by predictions of both positive and negative developments. In the former category, John Heidenreich of IBM Research forecast that machines built from nanotechnology will start appearing by 2010, supercomputers will equal the human brain in terms of processing power, and technology will generally become "faster, better, and cheaper." However, his projections came with several caveats: For one thing, new computers will have to be developed to help people analyze data that is growing at an exponential rate; future IT failures are likely to have catastrophic effects; and the point of developing autonomic technology is to aid people in managing IT systems rather than give computers the ability to think. Symantec CTO Robert A. Clyde made dire predictions of serious security problems as a result of the growth of wireless technologies such as Wi-Fi and Bluetooth. International Data estimates that mobile Internet users will total 589 million in 2005, and Clyde said that many of these people will be ill-equipped to defend themselves from hackers and viruses. He is especially concerned with the proliferation of 802.11b wireless access points, many of which are "rogue" points that are highly vulnerable to hackers. Clyde's solution is to devise fundamental wireless safeguards, such as setting corporate standards for wireless products and operating systems, determining what kinds of data and/or applications can be stored on devices protected by firewalls and encryption, and homogenizing and directing wireless purchases via a single corporate organization. He also recommended routing all Wi-Fi access points through a firewall before allowing them on a wired corporate network, updating virus filtering, and encrypting all remote connections.
- "Peer-to-Peer Peering Pondered"
IDG News Service (05/23/03); Gross, Grant
The House Committee on Government Reform listened to the privacy dangers surrounding peer-to-peer (P2P) technology in the second of three hearings regarding file-sharing networks; the first hearing focused on pornography trading over networks such as Kazaa, and the third will look at government file-sharing networks. FBI Cyber Division deputy assistant director James Farnan said his unit had received no complaints about identity theft via P2P, though he said the potential for crime on those systems is great. The University of California at Berkeley and the University of Minnesota conducted a one-week survey of Kazaa users and found roughly 1,000 open email in-boxes, though the number is small considering the 70 million people who use the Kazaa client. University of California at Berkeley graduate student Nathaniel Good said he also found spreadsheets with credit card data and salary agreements on sharable portions of Kazaa users' computers. The universities' study recommended an easier-to-use interface and more user education for P2P networks, measures that would help people better protect themselves against inadvertent sharing. Kazaa lawyer Philip Corwin said the newest Kazaa client limited downloads to a special folder in its default setting, and promised to consider the recommended changes. Corwin also warned lawmakers of the music industry's effort to break into file-sharers' computers through new invasive software. Others at the hearing spoke about the spyware and adware that is included in some P2P software; MIT security architect Jeffrey Schiller said limiting file-trading to only music files would help solve some of the privacy problems, but noted that it would remove some of the legal defenses P2P entities have deployed against the music industry.
- "Ethernet Inventor Looks Ahead"
Boston Globe (05/19/03) P. C2; Bray, Hiawatha
Bob Metcalfe says May 22, 1973 was the birth date of Ethernet, the computer interconnect technology that allows PCs to connect in networks. That is the date Metcalfe sent a memo describing his idea to supervisors at Xerox's Palo Alto Research Center. Metcalfe admits to building on foundational work done at the University of Hawaii, but it was Metcalfe's mathematics expertise that allowed him to push that networking technology to unprecedented speeds. He says 3Com, the firm he started to sell Ethernet gear, received an unintended boost from IBM, which steered the nascent PC market toward its own Token Ring technology. Because IBM pressed PC manufacturers to exclude Ethernet, Metcalfe says, 3Com was able to sell more expensive add-on Ethernet equipment to users who realized its intrinsic value. Metcalfe, now a venture capitalist with Polaris Venture Partners, says the next technology innovations will likely face the same questions as Ethernet did. At that time, most users shrugged off the idea of connecting computers and insisted on carrying data on disk from machine to machine; today, Ethernet is standard issue in computers, even those that connect via wireless networking.
Click Here to View Full Article
- "Spinning Around"
Sydney Morning Herald (05/20/03); Adams, David
Today's knowledge workers are inundated with information--too much to be useful, says Web site content management expert Gerry McGovern. He says society is working under the principles of the physical economy, where "more is better," but should in fact adjust to the new digital economy, where hardly anything is scarce. Cheaper devices, faster processors, and more abundant storage all result in too much information to be useful, says McGovern. Author David Shenk wrote in his 1997 book, "Data Smog: Surviving in the Information Glut," that people historically consumed as much as possible in order to survive, and have carried that habit over into the information age. This results in "information obesity," writes Shenk. McGovern, who spoke recently at the Australian Computer Society's annual conference, says part of the responsibility lies with content authors. He says about 70 percent of all Web sites remain unread and that many corporate intranets are self-defeating. In addition, immersing oneself in data does not result in better decisions because people are too close to the situation to gain proper perspective. Silicon Valley is emblematic of an attitude where the more a person works, reads, and converses, the smarter and more productive they are. A 2000 Pitney Bowes study on information overload found that top performers used self-messaging, previewing, and knowledge indexing techniques to manage information.
- "The Future of 3D Graphics"
Extreme Tech (05/16/03); Stam, Nick
The WinHEC conference featured a session in which Nvidia graphics architect David Kirk detailed expected changes in 3D graphics over the next decade. Kirk, who received Siggraph's 2002 Computer Graphics Achievement Award, mused that graphics performance will continue to more than double every year for the foreseeable future. He forecast that graphics processing units (GPUs) will soon be capable of real-time shading and image generation, which in the past have eluded engineers. A challenge still waiting to be met is the incorporation of global illumination into real-time 3D graphics rendering. Although both CPU and GPU architectures boast programmability, high-level language support, and similar computation capability, Kirk postulated that there will still be a fundamental difference between the two frameworks, given that CPUs and GPUs have distinct purposes: GPUs are mainly used to efficiently process huge amounts of stream data, while CPUs' primary goal is to process data using diverse methods. Creative GPU applications that Kirk foresees in the near term include real-time sub-surface scattering, while current GPU applications of note include special visual effects, dynamic simulation, and scientific computations. Notable academic research projects involving graphics include Stanford University's rendering of caustics, real-time radiosity simulations and realistically rendered shadows for complex objects at the University of North Carolina, and Caltech's harnessing of GPUs to operate brute-force filtering algorithms to clean up noisy data. Kirk argued that the use of 3D graphics to enhance people's sensory perceptions remains a distant vision, because no one yet has successfully developed a noninvasive, lightweight headset display that supports high-quality graphics.
- "Web Access for All"
eWeek (05/19/03) Vol. 20, No. 20, P. 54; Donston, Debra
Section 508 of the Rehabilitation Act mandated that all federal agencies and government contractors make their Web sites accessible to the disabled by June 2001, but many organizations have yet to comply because the requirements are not always easy to interpret. Furthermore, Meta Group estimates that it can cost an organization as much as $200,000 to build adherence to Section 508 and/or the World Wide Web Consortium's Web Accessibility Initiative guidelines into a Web site, while incorporating accessibility into the site design process can cost up to about $50,000. In addition, Meta analyst Jennifer Vollmer reports that companies are stretched thin in order to comply with the Patriot Act, the Gramm-Leach-Bliley Act, and the Health Insurance Portability and Accountability Act, to say nothing of Section 508. "Companies are being forced to prioritize compliance, and my gut instinct is that accessibility isn't at the top of the list," she explains. Vollmer believes that even companies that are not required to satisfy Section 508 requirements will be forced to do so in order to avoid becoming defendants in a massive lawsuit. It is the companies' responsibility to compare how much time it will take them to make existing content accessible and build accessible new content against the potential rewards they can reap in terms of boosted sales revenues and compatibility with both internal and external customers. "Someone in the organization should take charge of accessibility and what the company's doing about it and then work with IT, HR and legal to form an accessibility board to come up with a strategy to move the company forward," Vollmer recommends. She also suggests that building new accessible content should be a priority.
- "Proof of Concept"
Technology Review (05/03) Vol. 106, No. 4, P. 28; Garfinkel, Simson
Net Effect columnist Simson Garfinkel suggests that some of the computer virus outbreaks that have plagued the Internet in recent years are merely proof-of-concept tests for a truly devastating online assault organized by a secret information-warfare lab. He notes that major worms such as Code Red, Klez, and Slammer share a similar morphology: They breach systems, usually through a known vulnerability, using an "exploit"; they target victim computers with a "propagation engine"; and they inflict the actual damage with a "payload." Garfinkel observes that the worms boasted unique propagation engines, but only exploited known security holes and delivered ineffective payloads, which appears to support his proof-of-concept theory, at least in some cases. Computer security researcher Fred Cohen does not think that these worms were created by government-run outfits, but acknowledges that smaller labs or individuals, either rogue or merely looking for customers, could be behind them. "Law enforcement agencies know that hackers from the 1990s are now selling their services to organized crime and terrorists; why not the virus writers?" writes Garfinkel. An alleged presidential order for government-drafted cyberattack guidelines last summer could be the first step toward the authorization of information warfare, and Garfinkel is concerned that such a maneuver could encourage America's enemies to release viruses with payloads that could cause perhaps $100 billion in damage.
- "Electronics: A Voyage of Discovery"
Industry Week (05/03) Vol. 252, No. 5, P. 24; Tereshko, John
Breakthroughs in nanotechnology will greatly influence next-generation electronics and data storage, and open up new computing methodologies, according to Clayton Teague of the National Institute of Standards and Technology's Manufacturing Engineering Laboratory. Results of the nanotech revolution will likely include reduced data-storage costs, a rethinking of equipment maintenance, and a shift in the basic challenges of new-product development. IBM's Millipede project, in which data is written and read on a surface by the tip of a scanning tunneling microscope, could lead to cheaper and denser memories to replace solid state flash storage technology, says Tom Albrecht of IBM's Zurich lab; Millipede might find its way into many portable electronics. Meanwhile, other firms are grappling with the issue of adapting product and process research so that it can weather the disruption that will accompany nanotech's penetration. "The challenge is to study the potential of nanoelectronics to affect customer needs, the product solutions we offer, and how we, ourselves, manufacture them," explains Rockwell Automation CTO Sujeet Chand. Jay Lee of the University of Wisconsin, Milwaukee's Center for Intelligent Maintenance Systems expects the future of both industrial components and consumer products to be influenced by the intersection of nanoelectronics and intelligent maintenance systems; the Association of Manufacturing Technology's Paul Warndorf predicts the coming of low-cost, nano-based multifunctional sensors. Uzi Landman of Georgia Tech's Center for Computational Materials Science believes that organizations that wish to leverage nanotech will need engineers familiar with quantum mechanics. He forecasts that "there will be a new continuum from the abstract physicist to the engineering people who are actually going to build devices."
Click Here to View Full Article