Volume 4, Issue 392: Wednesday, August 28, 2002
- "Tech's Major Decline"
Washington Post (08/27/02) P. E1; McCarthy, Ellen
The implosion in the U.S. technology industry is cutting into the number of computer science majors that U.S. universities produce. The Computing Research Association reports that undergraduate computer science enrollment declined by 1 percent last year, and educators say the drop is worsening. For example, Virginia Tech enrollment is expected to fall 25 percent this year, while George Washington University's computer science enrollment has already declined by more than 50 percent. In fact, enrollment growth in undergraduate computer science departments has stopped completely. Colleges such as George Mason University are trying to stave off the enrollment decline by offering more general IT majors. Judy Hingle of the American College Counseling Association explains that students also avoid computer science courses because they are perceived as difficult or "uncool." Many in the industry are trying to counter the stereotypical view of a computer science career as a profession characterized by isolation and geekiness. Information Technology Association of America President Harris N. Miller says, "Our concern as an industry is that if they begin to again see major declines in enrollment, down the road four years, as the economy picks up, once again companies are going to find themselves in a shortage situation."
- "EU Copyright Directive 'All Bad News'"
VNUNet (08/27/02); Middleton, James
The Britain-based Campaign for Digital Rights sharply criticizes the proposed European Union Copyright Directive (EUCD) in a detailed study released last week. The EUCD is similar to America's Digital Millennium Copyright Act (DMCA), in that it would make the circumvention of digital copy protection controls illegal. "As it stands, the U.K. implementation of the EUCD will hinder research into cryptography; make criminal current common practices of the music industry; give software companies unwarranted control over the creation of software products interoperable with their own; and provide an inadequate and entirely impractical mechanism for beneficiaries of the Directive's exceptions to obtain access to copyrighted works protected by technological measures," declares Julian Midgley of the Campaign for Digital Rights. The critique reports that the legislation would only grant exceptions to government authorities, educators, and disabled persons. However, these parties would need to secure written permission from the Secretary of State for Trade and Industry in order to bypass the safeguards. The U.K. Patent Office issued a consultation paper on the EUCD in early August, and the deadline for response submissions is Oct. 31.
- "What Are the Real Risks of Cyberterrorism?"
ZDNet (08/26/02); Lemos, Robert
Cyberterrorism warnings tend to exaggerate: they imply massive losses of human life and property, when in fact government and security experts note that bombing a target is still far more damaging--and easier--than hacking into a computer. An atmosphere of panic in the wake of the Sept. 11 attacks has pervaded the media and the government, with legislators such as Rep. Lamar Smith (R-Texas) claiming that "a few keystrokes and an Internet connection is all one needs to disable the economy and endanger lives," but such rhetoric has made "cyberterrorism" a confusing term. There is no denying the vulnerability of water utilities and power companies, which are increasingly reliant on computer-controlled devices and are often privately owned with inadequate information security. But it is unlikely that disrupting their operations via cyberattack would result in deaths. Last month, Gartner and the U.S. Naval War College simulated an online attack on the nation's critical infrastructure, and concluded that such an assault would severely disrupt communications but would not lead to disaster; furthermore, to accomplish such a "digital Pearl Harbor," the terrorists would need years of preparation time, sophisticated skills, specific knowledge, and hundreds of millions of dollars. Also, in the event of a cyberattack, hospitals, prisons, and other critical metropolitan systems would continue to function because they run on independent generators, while utilities and infrastructure operators would be able to protect the public with backup systems. Security experts agree that the Internet itself is most vulnerable to hacking, but the biggest loss incurred by hacker intrusions is one of data, not life.
- "Cells Light Way for Flat-Panel Displays"
NewsFactor Network (08/27/02); Lyman, Jay
Flat-panel, color displays of the future could be based on light-emitting electrochemical cells (LECs), according to Penn State University researchers, who disclosed their findings this week at the 224th national meeting of the American Chemical Society in Boston. Electrical engineering graduate student Cheng Huang reports that polymer LECs are more energy efficient than light-emitting diodes (LEDs), and could boast greater stability, speed, and longevity as well. The researchers devised an LEC composed of bilayer luminescent polymers positioned between a pair of electrodes. Adjusting the voltage allowed the cell to emit either a red/orange light or a yellow/green light at a response time faster than that of the human eye. Huang observes that organic LECs can only generate a single color. The Penn State team is now aiming for full color representation, which relies on blending red, green, and blue light together. Huang notes that the researchers have yet to produce an appropriate red and green, and still lack a blue LEC.
- "Buggy Software Still Takes a Toll"
CNet (08/27/02); Gilbert, Alorie
Although software quality is showing improvement, business customers continue to blame faulty software for lost productivity and finances. A federal report estimates an annual loss of almost $60 billion due to buggy software. Sometimes the problem may not strictly lie within the software: Inflated customer expectations, disorganized enterprise data systems, and poor deployment and project management have also been cited, but clients and companies agree that they have made progress in the installation and implementation of business applications. Vendors have started to admit that they released faulty software in the rush to get their products onto the market in order to maintain their competitive edge. Others claim that they are working to produce better software through more thorough testing and quality assurance, because such a strategy yields long-term benefits that are more valuable than short-term sales increases. For example, Manugistics' flawless execution (FLEX) program, launched last year, dedicates 30 percent more staff to quality assurance, according to the company. Software companies that take more responsibility for product support and maintenance can also boost customer confidence. Despite these efforts, analysts note that Wall Street is still pressuring companies to raise their revenues. And despite their best efforts, software firms admit that, given the complexity of modern programs, completely bug-free software is nearly impossible to achieve.
- "Technical Degrees Still Command Highest Salaries"
Internet.com (08/23/02); Pace, Chris
Students who possess undergraduate and graduate technical degrees usually earn the highest starting salaries, compared to those with degrees in other fields, according to the most recent New York Times Job Market (NYTJM) survey. The poll finds that undergraduate degree holders can receive starting salaries between $38,000 and $52,000 per year, while graduate degree holders can earn between $55,000 and $78,000 per year. For students who are already enrolled or who intend to enroll at a college/university in the next year, technical degrees are second only to business degrees, which reflects job seekers' belief that the latter are essential to getting a good position. However, respondents said that simply holding a certain degree does not ensure a job: To truly qualify, undergraduates must also demonstrate strong ethics, multitasking and teamwork skills, strategic thinking, and a tolerance for doing grunt work. Meanwhile, successful graduates are expected to possess functional work experience or specific industry work experience. Hiring managers suggested that students will increase their chances of getting hired if they are involved in on-campus courses, internships, extracurricular activities, work/study, and research that involve a demonstration of leadership.
- "Hoping for Very Big, Yet Extremely Small, Discoveries"
Scripps Howard News Service (08/20/02); Vorenberg, Sue
Five new U.S. nanoscience centers are being built under the aegis of the Department of Energy. One facility, the Center for Integrated Nanotechnologies, will be run by Los Alamos and Sandia national laboratories in New Mexico; the other centers will be located at the Brookhaven, Lawrence Berkeley, Argonne, and Oak Ridge national labs. Associate director for the New Mexico facility Don Parkin says that nanoscience research can yield stronger metals by building materials from the ground up at the atomic level. Center director Terry Michalske adds that researchers are blending chemistry and physics in an effort to develop substances capable of self-repair. "[Nanoscience] reaches into new medicines and health care, engineering--some say every aspect of our lives will be affected," he contends. Sandia researchers recently discovered that atoms can be assembled into light-transmitting silicon crystals that could lead to speedier, smaller, and less expensive computers and communications systems. Michalske predicts that the federal government will probably earmark a nanoscience research budget of $700 million for 2003, up from $620 million this year. He says that for the past 20 years the labs' nanoscience work has been limited, but the new centers will accelerate research so that products can be rolled out faster.
- "Battling the Bugs"
Financial Times (08/27/02) P. 12; London, Simon
The National Institute of Standards and Technology (NIST) recently reported that software bugs cost American companies approximately $60 billion in 2001, while Bill Guttman of Carnegie-Mellon University's Sustainable Computing Consortium pegs the real cost closer to three or four times that. Software is lagging behind advances in computer hardware, while NIST senior economist Greg Tassey concludes that the pervasiveness of software means that "the economy's ability to develop and use new computer-based products will determine a major share of realized growth for the foreseeable future." However, although software buyers have grudgingly accepted faulty software as a fact of life in the past, there are indications that their patience is eroding. Major technology companies such as Oracle and Microsoft have launched initiatives to make their products more secure and reliable. Meanwhile, Sun Microsystems co-founder Bill Joy recommends that software programmers should drop "preindustrial" programming languages such as C and Cobol in favor of more modular languages such as Java. Effecting this change will take time, because it also means prioritizing integration, testing, and debugging over the writing of new code--and writing new code is the practice programmers are taught to prize as students. NIST's Mark Skall says that programmers are receiving inadequate training because of this outdated emphasis. Furthermore, individual companies lack an incentive to produce software of better quality, and perhaps the only solution would be a collaborative effort between firms, customers, and academia that is prodded by a standards body such as NIST.
- "Blue-Laser Storage Moves Closer to Reality"
IDG News Service (08/27/02); Williams, Martyn
A new blue-laser optical disc format developed by NEC and Toshiba is expected to be announced either this week or next week, and the companies plan to present the product to the DVD Forum as a next-generation successor to DVD. The technology could end up in a standards battle with Blu-ray Disc, another next-generation format proposed by nine companies earlier this year. Such a battle could delay the technology's consumer rollout. CD and DVD systems currently use red lasers to record data, but the shorter wavelength generated by blue lasers can increase the data storage capacity of 4.7-inch optical discs. Each side of current DVDs can store about 5 GB, but the rewritable Blu-ray discs can store about 27 GB; the storage capacity of Toshiba and NEC's format has not been disclosed. A blue laser costs about $1,000, but several companies have announced plans to mass-produce the technology, which could lower the price to under $100 per laser. Regardless, the technology is still a ways off from reaching consumers. Meanwhile, a standards battle over recordable DVD standards remains unresolved.
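The wavelength-to-capacity relationship the article describes can be sketched with a back-of-envelope diffraction estimate: the focused laser spot scales roughly as wavelength divided by the lens's numerical aperture (NA), so recording density scales roughly as (NA/wavelength) squared. The figures below are typical published DVD and Blu-ray parameters assumed for illustration; NEC and Toshiba did not disclose their format's actual values.

```python
# Optical-disc capacity scales roughly with (NA / wavelength)^2, because
# the diffraction-limited spot diameter is ~ wavelength / NA, and bits are
# packed in two dimensions on the disc surface.

def capacity_estimate(base_gb, base_nm, base_na, new_nm, new_na):
    """Scale a known per-layer capacity by the (NA/lambda)^2 density ratio."""
    density_ratio = (new_na / new_nm) ** 2 / (base_na / base_nm) ** 2
    return base_gb * density_ratio

# DVD: ~4.7 GB per layer, 650 nm red laser, NA 0.6 (assumed typical values).
# Blue-laser system: 405 nm, NA 0.85 (Blu-ray-like assumptions).
est = capacity_estimate(4.7, 650, 0.6, 405, 0.85)
print(round(est, 1))  # ~24 GB per layer, in line with Blu-ray's quoted ~27 GB
```

The estimate lands near Blu-ray's announced figure; the remaining gap comes from factors the sketch ignores, such as modulation coding and track-pitch choices.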
- "Nano Research Challenges Storage Unit"
United Press International (08/26/02); Burnell, Scott R.
New research shows that magnetic storage technology may be viable for several more years than previously thought thanks to advances in nanotechnology. Scientists had worried that, as magnetic storage bits are squeezed into tighter spaces, they would become volatile and lose information. Larry Bennett of George Washington University and Ed Della Torre of the National Institute of Standards and Technology challenge the current equations that predict the limit of how densely magnetic storage sites can be built. Their experiments with cobalt/platinum structures show that other factors, such as the chemistry of the materials, affect the potential size of the storage sites. University of Illinois at Chicago professor Vitali Metlushko says the new view may be complicated by the difficulty of managing materials as small as those in Bennett's experiment, but that storage nanotechnology benefits greatly from multidisciplinary contributions. Among the challenges is controlling the thickness of the layers and making the material flawless and almost totally smooth. Notre Dame University researcher Gyorgy Csaba notes that other kinds of research may lead to the replacement of electronic transistors with domain arrays. He cites "nanopillars" of magnetic materials being used in lieu of transistors.
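The volatility worry the researchers address follows from the textbook Neel-Arrhenius relaxation law (a general estimate, not the specific equations Bennett and Della Torre are challenging):

```latex
\tau = \tau_0 \exp\!\left( \frac{K_u V}{k_B T} \right)
```

Here \tau is the average time before a grain's magnetization flips spontaneously, \tau_0 \approx 10^{-9}\,\mathrm{s} is an attempt time, K_u is the material's magnetic anisotropy energy density, V is the grain volume, and k_B T is the thermal energy. Because \tau depends exponentially on V, shrinking bits erodes stability very quickly; a common rule of thumb requires K_u V / k_B T \gtrsim 40 to retain data for about ten years. The article's point is that material chemistry enters this picture too, so simple volume scaling is not the whole story.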
- "Bluetooth Rolls Toward Deployment in Cars"
EE Times Online (08/23/02); Murray, Charles J.
Chipmakers are working on Bluetooth technology that can withstand the extreme conditions often found inside an automobile as well as provide the processing power needed to run a variety of applications. Bluetooth in cars and trucks would enable a number of functions, such as hands-free cell phone operation, advanced stereo systems, and downloadable music and videos from drive-up kiosks. Cambridge Silicon Radio on August 21st announced that it had begun mass production of its Bluecore-2 External chip, while Infineon Technologies and Broadcom also are working on Bluetooth chips sturdy enough to survive the extreme temperatures and shocks suffered in a car. Analysts say that consumers will have little patience for faulty technology in their cars, unlike on their desktop computers, where technical glitches are a matter of course. Still, Cambridge Silicon Radio predicts that one in five new cars will have Bluetooth installed by 2007. Hurdles to be overcome include eliminating interference from other radio devices, such as satellite radio systems, and preventing data from leaking from one car's network to a nearby Bluetooth-enabled car. Still, automakers worldwide are ready to push the technology, and Chrysler says that this fall it will unveil an aftermarket version of its UConnect Bluetooth communications system. Future car-based communications systems will allow drivers to listen to phone calls through the car's stereo system, while directing their speech at the windshield. General Motors engineer Pom Malhorta says Bluetooth has the edge in car-based systems because it's "fairly low cost compared to other solutions and fairly low complexity, too."
- "Selling the Connected Home"
Network World Fusion (08/19/02); Kistner, Toni
The Internet Home Alliance (IHA) and the Zanthus research firm recently issued a report detailing what kinds of households are most amenable to the concept of the connected home, why many consumers resist the notion, and what consumer requirements are currently unfulfilled. According to extensive surveys, the type of household most likely to welcome a connected home consists of two teenagers and two college-educated parents aged 35 to 44 who earn a combined annual income of $75,000 or more. Such families are characterized as technology early adopters with a preference for tools that can slash time off household chores and increase quality time; 10.1 million households, or 17 percent of American homeowners, account for this market. The majority of resistance or neutrality to the concept of the connected home (58 percent) comes from older, more conservative respondents who view the technology as an invasion of privacy. This view is also shared by many early adopters. Many critics think automated meal preparation is too impersonal, and worry that the connected home will further erode the boundary between home and work life. Respondents' unmet needs were categorized into household chores, home systems, personal communications, and scheduling. Forthcoming technologies designed to serve these areas include remote-controlled interconnected kitchen appliances, a self-adjusting, energy-efficient thermostat, and an IHA-piloted suite comprised of cell phones, pagers, and a kitchen-based centralized computer monitor.
- "The NSF Looks Toward the Future"
PC Magazine Online (08/21/02); Rupley, Sebastian
National Science Foundation (NSF) assistant director Peter Freeman says that his agency has organized an initiative to bolster the country's high-end computer and supercomputing efforts. He adds that by investing in improved supercomputing frameworks, the NSF will enable the U.S. supercomputing effort to catch up with those in other countries. In addition to providing services to science and engineering researchers, the NSF is concentrating on research that revolves around supercomputers, high-bandwidth networks, middleware, sophisticated mass storage, and applications software. "In all of these areas we're trying to invest in both operational activities that use currently available technologies to support research at sites such as the San Diego Supercomputer Center, and we're also investing in research that can further develop advanced technologies," Freeman declares. He expects that such efforts will advance data-mining applications to the point where events like Sept. 11 can be more accurately predicted and medical diagnoses can be made with more precision. Freeman notes that the foundation is also giving more emphasis to technologies with longer-term prospects, such as nanotechnology and biotechnology. The NSF and the Commerce Department recently issued a report that recommends research and development investments in wearable computers, and anti-stress and anti-aging technologies, among others.
- "Purdue Retools Nanotech Research to Benefit University and Region"
AScribe Newswire (08/23/02)
Purdue University's Schools of Engineering plan to spruce up and increase their engineering space by 60 percent via a $400 million renovation that will include a $56.4 million core facility for nanotechnology research. The initiative's guiding force, engineering dean Linda P.B. Katehi, believes nanotech will profoundly affect engineering education and Indiana's economy. She says the Birck Nanotechnology Center will boost the schools of engineering's competitive edge across academia, science, and the economy, and foster research that will yield products generated by companies spun off from public and private partnerships. Purdue's nanotech efforts will be consolidated at the center, which will be located in the Discovery Park research complex. Its doors are expected to open in the first quarter of 2005. Katehi, the school's first woman engineering dean, describes the five-year investment in the schools' transformation as "unprecedented." Educators say the initiative is a reflection of a paradigm change in the discipline of engineering that will resonate throughout the next century. Under the plan, $250 million will be spent on new construction, $100 million on new equipment, and $60 million on renovations designed to meet needs for the next 15 to 20 years.
- "The Seven Deadly Security Sins"
NewsFactor Network (08/22/02); Lyman, Jay
Most network security holes are attributed to basic failings, such as poor password management, misconfigured servers, and inadequate or nonexistent patching. But security experts also point to lesser-known dangers such as network sharing between business partners and bad IT hiring standards. Gartner research director John Pescatore explains that misconfigured servers as well as "sheer incompetence" are often the result of recruiting people who have exaggerated their credentials. He also blames the lack of patching on companies that increase the network load without expanding IT staffs in order to save money. Furthermore, Pescatore comments that giving another company total network access is folly, although security specialist Ryan Russell acknowledges that sharing networks without considering security first is sometimes inherent in the pursuit of business goals. Yankee Group analyst Matthew Kovar observes that a major mistake is changing a system or network without reevaluating its security. He adds that the sheer number of security alerts issued by software vendors is prompting companies to overlook many of them. Other factors contributing to security vulnerabilities include a lack of product testing prior to commercial release and default settings of newly deployed software.
Potomac Tech Journal (08/19/02) Vol. 3, No. 33, P. 11; Goodman, Joshua
The TIC summer camp in Washington, D.C., has been teaching schoolchildren computer skills for 20 years now, including programming, animation, Web design, and digital music creation. Camp director Karen Rosenbaum says the TIC camp offers a unique opportunity for children to learn computer skills they would not normally learn at a public school. For one, the 4 to 1 student-teacher ratio found at TIC cannot be replicated at public schools, and TIC adheres to a non-traditional idea of instruction, where students largely direct their own learning. While Rosenbaum admits that the $700 fee for the two-week course can exacerbate a "digital divide," she says that need-based scholarships help with that situation. A more pressing concern is increasing the number of girls who attend the camp, according to her. Rosenbaum says that girls who do not start learning computer skills before their college years are inhibiting their opportunities in the future. Currently, just about one-third of the students are girls, which she says does not bode well for diversity in the future IT workforce. Rosenbaum says that although practical programming skills are stressed, the camp also focuses on teaching students to work collaboratively. All projects are worked on in pairs, for example. Rosenbaum says the goal is to get kids "turned on to learning." She notes that although their projects are often geared toward fun or juvenile results, such as an explosion, the process of getting there usually requires a sophisticated grasp of computer skills.
- "P2P Getting Down to Some Serious Work"
Network World (08/19/02) Vol. 19, No. 33, P. 30; Fontana, John
Peer-to-peer (P2P) technology is gaining ground in the corporate IT environment, helping make networks more robust and assisting applications. In addition, P2P is joining with other network technologies--most notably grid computing--that are also making waves in the business world and bolstering distributed computing. In essence, P2P technologies allow end-user devices to link directly to each other without an intermediary server, although some hybrid P2P setups do mix with a client/server schema. Together with grid computing, Burton Group CEO Jamie Lewis says that P2P technology helps make more computing resources available over the network and notes that P2P will be especially important in the establishment of Web services. She says, "Peer-to-peer is an architecture and one that will play a substantial role in distributed computing as endpoints in the network gain more power." This April, the P2P Working Group merged with the Global Grid Forum, signaling how intertwined the two technologies have become for progressive IT thinkers. MeshNetworks is also using P2P to extend the range of wireless LANs by allowing end-user devices to become nodes linking other users to the network. The technology was derived from work done by the Defense Advanced Research Projects Agency (DARPA), which developed peer technology to enable soldiers to instantly connect with others on the battlefield.
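The defining property of a pure P2P link--end-user devices connecting directly, with no intermediary server routing the traffic--can be sketched in a few lines. This is a minimal localhost illustration, not code from any particular P2P product; all names and port choices are ours.

```python
import socket
import threading

# Two "peers" on one machine: peer B opens a listening endpoint, and
# peer A dials B's address directly. No central server sits in between.

def accept_one_message(listener, inbox):
    """Peer B: accept a single direct connection and store the message."""
    conn, _addr = listener.accept()
    data = b""
    while True:                       # read until the sender closes
        chunk = conn.recv(1024)
        if not chunk:
            break
        data += chunk
    inbox.append(data.decode())
    conn.close()

def send_direct(address, message):
    """Peer A: connect to the other peer's address and send data directly."""
    with socket.create_connection(address) as s:
        s.sendall(message.encode())

# Peer B binds and listens before peer A dials, so there is no race.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
peer_b_address = listener.getsockname()
listener.listen(1)

inbox = []
t = threading.Thread(target=accept_one_message, args=(listener, inbox))
t.start()
send_direct(peer_b_address, "hello from peer A")
t.join()
listener.close()
print(inbox[0])  # hello from peer A
```

The hybrid setups the article mentions would add a server only for discovery (finding peer B's address), with the data still flowing peer to peer.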
- "The Programmable Building"
Technology Review (08/02) Vol. 105, No. 6, P. 74; Zacks, Rebecca
Director of the MIT Media Lab's Center for Bits and Atoms Neil Gershenfeld believes that buildings will be much more efficient and flexible if their myriad systems are connected to the Internet, and his team is devising small, inexpensive networking devices for that very purpose. His laboratory has developed a new way to turn on the lights using a device built from off-the-shelf components that Gershenfeld says acts as both a switch and an Internet site. The switch can be attached to a track near an AC outlet plugged into a light, while touching metal pads on the switch and the outlet allows them to exchange data. Afterwards, the switch, which draws power from the track, can be used to turn the light connected to the outlet on and off. Furthermore, Gershenfeld says that the switch can be removed and plugged in at another location, yet will still control the same light. Another device on the same track enables Web-based control by converting incoming signals into a lower-bandwidth format, thus allowing wireless devices such as PDAs to control the lighting. Gershenfeld says that the construction and management of buildings could be streamlined with such a system, while office reconfiguration could be achieved at less cost. His systems will be tested when the Media Lab's new building opens in 2004.
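Gershenfeld's idea of a switch that "acts as both a switch and an Internet site" can be imitated in miniature with a stock HTTP server. The sketch below is hypothetical--it is not the Media Lab's actual design--and only illustrates the principle of device state living behind a network interface that any connected client (a PDA, in the article's scenario) can query or toggle.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class LightSwitchHandler(BaseHTTPRequestHandler):
    """A 'light switch' whose state is exposed over HTTP."""
    state = {"on": False}            # shared switch state

    def do_GET(self):
        if self.path == "/toggle":   # flipping the switch is just a request
            LightSwitchHandler.state["on"] = not LightSwitchHandler.state["on"]
        body = ("on" if LightSwitchHandler.state["on"] else "off").encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):    # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), LightSwitchHandler)  # OS-chosen port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any networked client can now flip the light and read its state.
r1 = urlopen(f"http://127.0.0.1:{port}/toggle").read().decode()
r2 = urlopen(f"http://127.0.0.1:{port}/").read().decode()
print(r1, r2)  # on on
server.shutdown()
```

In the real system the endpoint lives in a tiny embedded device on the powered track rather than a PC, but the control model is the same.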
- "Homeland Insecurity"
Atlantic Monthly (09/02) Vol. 290, No. 2, P. 81; Mann, Charles C.
Bruce Schneier, author of "Applied Cryptography," says that most of the computer security measures being planned and developed in the wake of Sept. 11 will be ineffective, and in some cases could make Americans even more vulnerable. Schneier says people frequently assume that deploying security tools will cure all ills, when in fact they will only reduce risks to acceptable levels at best. Studying how well security measures fail is of critical importance. Attacks can be prevented by scaling down and compartmentalizing security systems, and incorporating redundancy; failure should be embedded into their design so their basic operation is not hobbled by any single error; and people rather than computers should be the ones who make the decisions. Schneier says that many of the security measures the government is considering--new security infrastructure, biometric scanners, smart cards, database networks, etc.--are not "ductile" and fail badly. Almost every homeland security proposal calls for the establishment of national databases, but this would decompartmentalize much government data and provide malicious hackers with an attractive target, similar to the credit card numbers kept in centralized private-sector repositories. Furthermore, many people are tempted to circumvent secure networks because of the difficulty they have working with them; solutions Schneier suggests include changing software licenses and implementing narrowly-focused, often physical, safeguards. But his biggest argument is the inclusion of people with the authority to make critical decisions, for electronic decision-making cannot match human judgment.