Volume 5, Issue 459: Wednesday, February 19, 2003
- "Bill Would Ban Spam E-Mail in California"
Los Angeles Times (02/18/03) P. B1; Vogel, Nancy
California Sen. Debra Bowen (D-Marina del Rey) has authored a bill that would make it illegal to send spam email from California or to a California-based email address, a crime punishable by a maximum fine of $500. However, experts such as Jupiter Research's Jared Blank doubt that legislation will curb the growth of unsolicited commercial email. "Somebody sitting in China sending you emails about Viagra is not going to care what California's rules are," he points out. Bowen wrote a 1998 law that required spammers to label their messages as such by including ADV or ADV:ADLT in the subject heading, as well as include a toll-free number or return email address that recipients could use to opt out of receiving spam, but Louis Mastria of the Direct Marketing Association reports that most spammers continue to send junk email regardless of the mandate. Jupiter estimates that the number of spam messages received in the United States grew from 140 billion to 261 billion between 2001 and 2002, an 86 percent increase. Ferris Research analyst Marten Nelson is confident that the spam industry will ultimately be undone by technology rather than legislation: He predicts that spam will be under control within five years once enough email blocks are in place. Mastria's group prefers that the federal government enact legislation that would require spammers to institute an "opt-out" policy, but consumer groups think an "opt-in" approach, in which spammers need to get consumers' permission prior to sending them unsolicited email, is more practical. Coalition Against Unsolicited Commercial Email co-founder Ray Everett-Church lauds Bowen's new bill, and thinks there is merit in implementing it nationally.
(Access to this site is free; however, first-time visitors must register.)
- "Diversity in the High-Tech Workplace"
SiliconValley.com (02/14/03); Fortt, John; Davis, Jack
The workforces of the 10 highest-grossing high-tech companies in Silicon Valley have grown in diversity, but the gains have gone mainly to Asians rather than other minorities, while executive ranks remain predominantly white. About one in three jobs created between 1996 and 2000 was filled by an Asian employee, but most Asians were concentrated in technical and engineering fields rather than retail and management. This growth is attributed to the fact that Asian countries churn out almost three times as many science and engineering graduates as the United States annually, according to the National Science Foundation; in addition, nearly 8 percent of science and engineering degrees from U.S. universities are awarded to Asians. Meanwhile, about 10 percent of the workforce in 2000 was either black or Latino, while women made up less than a third of the workforce by the time the most recent tech boom ended. Workers at major Silicon Valley companies often use their experience to launch startups, but even the CEOs of these startups are mainly white, notes Intel's Sriram Viswanathan. Tech leaders attribute the low percentage of Asian executives to a lack of communication and language skills, as well as personal contacts. Clarinet Systems President Wen Chang adds that exposing Asian children to successful role models will make them more inclined to pursue careers in sales and marketing. Kathleen Allen, who teaches entrepreneurship at the University of Southern California, notes that about one-third of her current class is made up of Asian engineering graduates. Among Asian countries, China and India specialize in producing engineers, but Vietnam, the Philippines, and other Southeast Asian nations tend to produce professionals who migrate to medicine or other fields.
- "Drive Resumes for Standard Software License"
IDG News Service (02/13/03); Gross, Grant
The National Conference of Commissioners on Uniform State Laws (NCCUSL) is set to renew its drive to get standard software licenses established in all states. The group of lawyers, professors, and judges, backed by the software industry, has labored since 1999 to install the Uniform Computer Information Transactions Act (UCITA) in all 50 states. UCITA was dealt a fresh blow this month when the measure failed to receive approval from the American Bar Association (ABA). UCITA backers pulled the bill before it reached the ABA's House of Delegates, but only after it failed to win the backing of six sections of the association. While proponents of the bill say it helps drive down the cost of selling software by establishing standard licensing practices, opponents such as the American Library Association and the Association for Computing Machinery (ACM) argue that UCITA takes away consumer rights. The law has passed only in Maryland and Virginia, and NCCUSL legal counsel John McCabe admitted it would have little impact unless adopted by a large number of states. He said the ABA rejection of UCITA would not affect the current round of lobbying in state legislatures, given the level of debate surrounding the bill. He pointed out that submission of UCITA to the ABA was a courtesy and not a prerequisite for its adoption by states.
To read more about ACM's arguments against UCITA, visit http://www.acm.org/usacm.
- "Cyber-Security Strategy Depends on Power of Suggestion"
Washington Post (02/15/03) P. E1; Krim, Jonathan
The Department of Homeland Security's national cybersecurity plan, released on Friday, lacks substantive action on the part of the federal government, according to many experts. Many technology firms opposed active interference from the government, such as many of the measures promoted by Richard A. Clarke, the President's former cybersecurity advisor, who resigned recently. He advocated, for example, suspending wireless Internet operations until better security solutions were developed and having large federal agencies use their buying power to raise software security benchmarks. Instead, the Department of Homeland Security's plan suggests that individual Internet users install firewall software, that businesses revise their IT security plans, and that a consolidated federal center for monitoring and responding to computer attacks be established. Purdue University professor and security guru Eugene H. Spafford said, however, that the recommendations relied on a private industry that was unlikely to respond as needed without government involvement. SANS Institute director Alan Paller commented that the plan was a good threat analysis, but lacked some of the more actionable proposals in circulation. Michael Wendy, policy counsel for the IT trade group CompTIA, praised the plan because it focused on important behavioral aspects of computer security, such as certification and training, instead of relying on "silver bullet" technological solutions.
Eugene Spafford is co-chair of ACM's U.S. Public Policy Committee; http://www.acm.org/usacm.
- "Digital Vaccine May Make Computer Networks Tolerant to a Fault"
NewsFactor Network (02/18/03); Martin, Mike
Computer science graduate students at the University of California at Irvine have developed techniques that allow software engineers to inoculate computer systems against faulty data. Systems such as those used in international intelligence, space exploration, and the military need to be protected against faults. By purposefully injecting faulty data into the network, computer science researcher Martin Mathis says software engineers can easily monitor how the system responds and gauge the fault tolerance of the system. Protections can also be built in so that the software is not vulnerable to the same type of threat again. For every data input, there are three possible output results: Correct, incorrect but acceptable, and incorrect and unacceptable. In the space shuttle, two pairs of computers running the same program simultaneously control the spacecraft. If the first pair's operations disagree in any aspect, the second pair will take over. In case both pairs are lost, the shuttle crew can turn on a fifth reserve control system that runs different software, but with the same functionality. The technique developed at the University of California allows engineers to test software systems and find the exact ranges of incorrect but acceptable data outputs. Mathis says the method would be specifically applicable to middleware such as CORBA, where object information is exchanged between different computer systems, such as in e-commerce environments.
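The article gives no implementation details of the Irvine technique, but its core loop is easy to picture: corrupt an input on purpose, run the system, and sort each output into one of the three categories the article names. The sketch below is purely illustrative; the function names, the toy "system," and the tolerance values are this editor's assumptions, not anything from the UC Irvine work.

```python
import random
from collections import Counter

def classify(expected, actual, tolerance):
    # The article's three output categories: correct; incorrect but
    # acceptable (within a known tolerance); incorrect and unacceptable.
    if actual == expected:
        return "correct"
    if abs(actual - expected) <= tolerance:
        return "incorrect but acceptable"
    return "incorrect and unacceptable"

def inject_faults(system, test_cases, tolerance, noise=0.5, trials=100):
    # Deliberately corrupt each input, then tally how the system's
    # outputs degrade -- a crude gauge of its fault tolerance.
    random.seed(42)  # reproducible fault injection
    tally = Counter()
    for _ in range(trials):
        for x, expected in test_cases:
            faulty_input = x + random.uniform(-noise, noise)
            tally[classify(expected, system(faulty_input), tolerance)] += 1
    return tally

# Toy "system" under test: it doubles its input; outputs within 0.2 of
# the true answer are deemed acceptable.
report = inject_faults(lambda x: 2 * x, [(1.0, 2.0), (3.0, 6.0)], tolerance=0.2)
print(dict(report))
```

In practice the interesting part is mapping out exactly where the "incorrect but acceptable" band ends, which is what the Irvine researchers say their method pins down.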
- "'Selfish Routing' Slows the Internet"
By choosing the fastest route for data packets passing through the Internet, individual systems hamper the overall flow of information, according to Cornell University researchers speaking at the annual American Association for the Advancement of Science meeting. They said current routing systems investigate network paths and choose the fastest routes for the data packets assigned to them, without regard to overall Internet performance. When all Internet-connected systems act in such a manner, the result is an equilibrium called Nash flow. Cornell University researchers Eva Tardos and Tim Roughgarden found that, on the Internet, the resulting Nash flow can be one and one-third times less efficient than if a centralized traffic management system were imposed. Roughgarden said the benefits of changing the current system depend on the actual effects on the real Internet, noting that his study was done using mathematical models. However, he suggested that routers be programmed to take into account not only the potential speed of a specific path, but also the effect a data packet would have on that route. By adding this altruistic factor, average Internet speeds could be expected to increase, he said. Roughgarden and Tardos' research will be published in the Journal of the Association for Computing Machinery in March.
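The one-and-one-third figure can be reproduced with the classic two-link network commonly used to illustrate selfish routing; the network and numbers below are the standard textbook example, not taken from the article itself.

```python
# Two parallel links carry one unit of traffic from source to sink.
# Link A's delay grows with its load (delay = x, the fraction of
# traffic using it); link B's delay is a constant 1. Selfish packets
# all pick link A, since its delay never exceeds the alternative's.
def total_delay(x):
    # x = fraction of traffic on link A; the rest takes link B.
    return x * x + (1 - x) * 1

nash = total_delay(1.0)  # the Nash flow: everyone routes selfishly
optimal = min(total_delay(i / 1000) for i in range(1001))  # centralized split
print(nash / optimal)  # -> 1.333..., one and one-third times less efficient
```

Splitting the traffic evenly (half on each link) gives a total delay of 0.75 versus 1.0 for the selfish routing, which is exactly the four-thirds ratio the Cornell researchers cite as the worst case.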
- "Word 'Bursts' May Reveal Online Trends"
New Scientist (02/18/03); Knight, Will
Cornell University computer scientist Jon Kleinberg believes that new online trends can be identified faster through computer algorithms that search for surges or "bursts" in the usage of specific words. "The key is to find unexpected changes in the frequency of the appearance of words," explains Christos Papadimitriou of the University of California at Berkeley. Kleinberg's algorithms could be used, for instance, to spot emergent fads by finding word bursts in the thousands of Weblogs peppering the Web, which would be a significant advantage for advertisers. The technique could also be used to comb through other kinds of data--a company's maintenance staff could detect new problems by identifying word bursts within emails sent to customer support, for example. Kleinberg demonstrated the feasibility of his method by studying all State of the Union speeches delivered by American presidents dating back to 1790, making connections between word bursts and historical events of the day. Usage increases of the word "depression" marked speeches given between 1930 and 1937, while "atomic" had greater usage than any other word between 1949 and 1959. Kleinberg will present his findings in Denver at the American Association for the Advancement of Science's annual conference on Tuesday.
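Kleinberg's actual method models bursts with a state machine over word arrival rates; a much cruder frequency-ratio heuristic nonetheless conveys the idea. The function below, its thresholds, and the toy State of the Union snippets are all illustrative assumptions, not Kleinberg's algorithm or data.

```python
from collections import Counter

def word_bursts(periods, ratio=1.5, min_count=3):
    """Flag words whose share of one period's text is at least `ratio`
    times that word's share of the whole corpus.

    `periods` maps a label (e.g. a year) to a list of words."""
    overall = Counter()
    for words in periods.values():
        overall.update(words)
    total = sum(overall.values())
    bursts = {}
    for label, words in periods.items():
        counts = Counter(words)
        n = len(words)
        flagged = [w for w, c in counts.items()
                   if c >= min_count and (c / n) >= ratio * (overall[w] / total)]
        if flagged:
            bursts[label] = sorted(flagged)
    return bursts

# Toy corpus: "depression" surges in the 1933 text, echoing the
# article's State of the Union example.
periods = {
    "1925": "the economy grew and the nation prospered".split(),
    "1929": "the market crashed and the banks failed".split(),
    "1933": "the depression deepened the depression spread the depression".split(),
}
print(word_bursts(periods))  # -> {'1933': ['depression']}
```

Common words such as "the" stay below the ratio threshold because their baseline frequency is already high, which is the essential filtering effect any burst detector must achieve.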
- "Commerce Proposes IT Policy Restructuring"
InternetNews.com (02/14/03); Mark, Roy
The U.S. Commerce Department is proposing merging various agencies that oversee information technology and telecommunications policy into one agency in order to streamline policy management. Secretary of Commerce Don Evans wants to combine the National Telecommunications and Information Administration (NTIA), the e-commerce functions of the International Trade Administration (ITA), and the Technology Administration (TA). Evans says, "Convergence is the business model in the digital economy--it should be the business model in the federal government." He says "telecom and technology operate together...we need to adjust our structure to keep pace with the world." The new agency, headed by Under Secretary for Technology Phil Bond, will tackle issues related to spectrum management, telecom and e-commerce policy, and technical standards.
- "Faster Video for Wireless Devices?"
CNet (02/14/03); Junnarkar, Sandeep
Truong Nguyen, an engineering professor at the University of California, San Diego, has received more than $200,000 in grants over three years to develop technology intended to improve the quality of video delivered via wireless devices such as cell phones. The grants were provided by Skyworks Solutions and the state government's Industry-University Cooperative Research Program. Nguyen says his research targets video decoders at the appliance level. For example, he says he has been able to stream video at a rate of 20 frames per second at 65 Kbps, which he says is twice the number of frames typically streamed. This results in a less shaky video sequence despite low data rates. Nguyen's research may eventually help increase consumers' adoption of video applications via cell phones and other devices, which are slowly being developed by American wireless carriers. Meanwhile, Lucent Technologies announced recently it was gearing up to license new technology that creates faster wireless networks, while Intel introduced a cell phone processor that would facilitate digital photography, Web surfing, and color screens. Gartner Dataquest estimates that such wireless applications could generate $20 billion in revenues by 2006, despite such setbacks as Verizon Wireless' decision last year to postpone a next-generation data-only cellular network.
- "Tracking the Killer Worm"
NewsFactor Network (02/18/03); Brockmeier, Joe
The recent outbreak of the SQL Slammer worm could have been far more damaging than it was, according to Giga Information Group security analyst Michael Rasmussen, who believes the attack was launched as a proof of concept. Slammer could be a harbinger of far more catastrophic worms unleashed upon the Internet in the future, and it serves as yet another wake-up call to enterprises, which are notoriously lax when it comes to effectively securing their networks. Vincent Weafer of Symantec's Security Response notes that Slammer's proliferation had "no dependency on human interaction," and that it spread six times faster than typical worms. Contributing to the worm's fast propagation was its reliance on transmission via UDP, a protocol that is less secure than TCP. More and more network vulnerabilities are being disclosed every year, which means that companies would be wise to prepare themselves for the next major attack, says Aberdeen Group security analyst Eric Hemmendinger. He advises companies to keep abreast of patches and vulnerability updates; the considerable cost of this approach can be mitigated by deploying automation and intrusion-prevention solutions. Further security could be gained by shutting down redundant services, Hemmendinger adds. Switching to a different platform does not ensure security either, as Rasmussen points out that commercial Unix systems and even open-source systems are exploitable.
- "Disk-Drive Capacity Continues to Grow"
SiliconValley.com (02/16/03); Gillmor, Dan
Dan Gillmor predicts that if the current pace of innovation keeps up, disk-drive users will soon have more inexpensive storage capacity than they know what to do with. Keeping pace with storage advancements is the shrinking size of disk drives. A gigabyte of disk-drive capacity currently costs less than a dollar, and Gillmor expects to see small media players capable of storing between 40 GB and 60 GB in 2004. As more everyday appliances are equipped with recording and data-storage devices, controversy over privacy issues is likely to grow. "For example, what are the privacy implications of our automobiles keeping track of where we drive and at what speed, as the car makers, insurance industry and government snoops will surely wish?" Gillmor inquires. He writes that the telecommunications industry is unlikely to grant users enough bandwidth for centralized multimedia delivery; this will make information storage using disks at the network edge a necessity. Gillmor also observes that increasing storage capacity has spurred the commotion over digital copying, while the addition of portable drives into the mix will fuel further attempts by the entertainment industry to clamp down on consumers' rights to copy digital content.
- "Robots Are Getting More Sociable"
MSNBC (02/18/03); Boyle, Alan
Scientists are trying to build sociable robots that could be used not only to better understand human social interaction, but to assist people both physically and psychologically. "Robots have always been an intriguing mirror to our own conception of what it means to be a human," observes MIT professor Cynthia Breazeal, who developed the Kismet robot and is now working on a more advanced machine called Leonardo. In the physical arena, researchers such as Yoseph Bar-Cohen at NASA's Jet Propulsion Laboratory are developing electrically driven plastic muscles that could be utilized in the next generation of prosthetic and robotic appendages. Meanwhile, University of Texas graduate student David Hanson has designed K-Bot, a robot with a realistic, human-like face that is adept at mimicking human expressions it picks up through its camera-equipped eyes. Although K-Bot resembles Disney's animatronic puppets, Hanson notes that it is intelligent, and is able to identify facial expressions and respond with its own expressions in real time. He plans to equip K-Bot with a speaker so that the machine can generate speech, and make its software smarter. More sophisticated robot faces could be used for cognitive research or even to train human beings in socialization skills. "The ultimate goal is to create a compassionate, sociable robot that begins to approach various aspects of human intelligence, and someday become our peer," Hanson explains.
- "Conversation With Marc Andreessen"
Wired News (02/14/03); Glasner, Joanna
- "U.S. Backs Merging Net, Phone Numbers"
CNet (02/13/03); McCullagh, Declan
The Commerce Department has recommended that the United States join a new electronic numbering system that will let people use one identifier for various purposes, such as faxes, mobile phones, instant messaging, and email. The ENUM system is intended to help converge the Internet and the phone network, and users will be identified by their phone number, including the country code. Commerce Department assistant secretary Nancy Victory says the United States should also push for ENUM's implementation, with principles supporting competition, interoperability, and privacy, and minimizing regulation. "Domestic implementation of ENUM must be done in a manner that maximizes the privacy and security of user data entered in the ENUM DNS domain," she notes. Thirteen nations, members of the International Telecommunication Union, have agreed to the proposal and are planning trials.
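The mapping from an E.164 phone number to a DNS name under ENUM is mechanical, as specified for the e164.arpa tree: strip everything but the digits, reverse their order, separate them with dots, and append the suffix. A minimal sketch (the phone number shown is a fictional example):

```python
def enum_domain(e164_number: str) -> str:
    # Strip to digits, reverse their order, dot-separate, append e164.arpa.
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

# A DNS lookup on the resulting name would then return records pointing
# to the user's contact URIs (SIP address, email, fax, and so on).
print(enum_domain("+1-202-555-0100"))  # -> 0.0.1.0.5.5.5.2.0.2.1.e164.arpa
```

Reversing the digits lets DNS delegation follow the phone network's own hierarchy, with the country code at the top, which is why the country code must be part of the identifier.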
- "High Impact"
InformationWeek (02/10/03) No. 926; Hayes, Mary
The question as to whether IT is making a fundamental difference to society at large is a matter of debate, but there are individuals striving to put their technology skills, expertise, and corporate contacts to altruistic use. Stanford University professor Eric Roberts, whose work includes leading the development of worldwide academic computing curricula guidelines sponsored by the IEEE CS and the ACM, believes the IT industry needs something similar to medicine's Hippocratic Oath. Both inside and outside the classroom, Roberts is churning out a new crop of socially responsible computer scientists. Former Sybase executive VP Bob Epstein aims to eliminate the disconnect he perceives between corporate goals and the conservation of the environment through the nonprofit he co-founded, Environmental Entrepreneurs (E2). The organization, whose 200 members include Apple Computer founder Steve Jobs and Google CEO Eric Schmidt, was instrumental in the passage of the Global Warming Bill by the California Legislature and Gov. Gray Davis this past summer. AT&T Labs technical staffer Lorrie Cranor, a fervent proponent of online privacy, has channeled her interest into initiatives such as the Platform for Privacy Preferences Project (P3P), a standard way for Web sites to clarify their privacy policies in a format computers can understand; she also developed the AT&T Privacy Bird application, which notifies users about sites' P3P-compliant privacy policies. Dot-com entrepreneur and Overstock founder Patrick Byrne established Worldstock Socially Responsible Goods as a steady online retail outlet for products made by poor or disenfranchised people around the world. Some 2,160 craftspeople in over 40 countries have had their goods sold on Worldstock, which investigates vendor backgrounds to ensure that the goods are produced under humane conditions.
- "Survival Guide: Perspectives From the Field"
Washington Technology (02/10/03) Vol. 17, No. 21, P. 34; Wakeman, Nick
James Lewis of the Center for Strategic and International Studies argues that doomsday scenarios of cyberattacks on America's critical infrastructure are overblown. He points out that most cyberattacks are one-time assaults that result in minimal damage to the infrastructure. "The whole experience with other kinds of attacks is that you need to do a lot of them over a long period of time to bring a country to its knees," he observes. This is extremely difficult, given the proliferation of redundant systems and the rapidity with which countermeasures are deployed after a cyberattack is launched--factors that are often overlooked in cyberattack scenarios, according to Lewis. He postulates that the only kind of cyberattack capable of doing real damage is one that is launched in tandem with a physical attack; in such a scenario, a physical attack's effects could be multiplied if a hacker disrupts communications by blocking channels and releasing bogus information. In Lewis' opinion, the real threats are posed by the wide availability of infrastructure-related information on the Internet, which terrorists can use to put together a coordinated attack strategy. Hackers can also break into networks and scoop up information they can exploit. Lewis also says the government has been lax in keeping sensitive information offline.
U.S. Black Engineer & Information Technology Magazine (02/05/03); Witherspoon, Roger
The coming wave of technological development will present great opportunities for minorities and the underprivileged, while at the same time threatening to widen the "digital divide" between the technological haves and the have-nots. Wake Forest University's Nat Irvin, executive professor of future studies, says that black communities need to thoroughly debate future tech trends and their societal ramifications. "We have not made the transition from focusing on the past and starting to look 20 and 40 years out to determine what is happening and how we will fit in," he explains. Leroy Jones of Dell Computer argues that the "uncool" image many young people have of technology must be reversed. Meanwhile, IBM's Mark Dean, recipient of the 2000 Black Engineer of the Year award, anticipates that computer use will shift from computing-only mode to data management and data-centric functions, which will in turn lead to new products and new business and financial models. Nanotechnology is considered by many to be "the ultimate science," according to Gary Harris of Howard University; there are, however, many unanswered questions about nanotech's potential impact on biological systems, as well as ethical issues. Dr. Irvin contemplates that one of nanotech's negative consequences could be "genetic discrimination," in which the technology is used by the wealthy to incorporate genetic advantages into their offspring. Dr. Harris envisions another scenario in which nanobots are designed to do physical harm to people of a certain race or ethnicity. Dr. Irvin says more black people need to get involved in the ethical and legal debate of such issues.
For information regarding ACM's joint participation (with CRA and IEEE) on the Coalition to Diversify Computing, visit http://www.npaci.edu/Outreach/CDC.
- "Supercomputing Resurrected"
Technology Review (02/03) Vol. 106, No. 1, P. 52; Tristram, Claire
Japan stole the supercomputing speed record from the United States with the launch of NEC's Earth Simulator last March, an event that signals a deep gap in the U.S. supercomputing development effort, according to high-performance computing expert Gordon Bell. Although the United States has distinguished itself with advances such as Hewlett-Packard's ASCI Q supercomputers, it has fallen behind because of a drop-off in government funding, which in turn has eroded private investment; another hindrance has been the American effort's reliance on massively parallel configurations, whereas Japan's strategy is to build government-funded, specialized supercomputers. American computer vendors may downplay the Earth Simulator's triumph, but the fact remains that specialized supercomputing supports the simulation of complex systems, without which basic science and national security research cannot move forward. Designing supercomputers from the bottom up, as the Earth Simulator was, is simply more effective than connecting clusters of commoditized, general-purpose processors, Bell explains. The latter approach, while admittedly cheaper, requires "parallel programs" that are very difficult and arduous to write--as a result, the relative thriftiness of building supercomputers out of off-the-shelf parts may be cancelled out by the cost of parallel software development; a far simpler tactic is vector computing, a key element of the Earth Simulator. Furthermore, the Japanese approach enables computers to solve real-world problems. The chief motivator for U.S. spending on supercomputer research is the need for additional computing power to model nuclear weapons performance, while the Earth Simulator has spurred the Defense Advanced Research Projects Agency (DARPA) to invest in supercomputing research. 
Also receiving DARPA funding is research into new supercomputing architectures, but this will not bear any significant fruit without a radical transformation of the design and engineering cycles, and a technique to flawlessly mass-produce the technology.