
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 659:  Monday, June 21, 2004

  • "Fractals Show Machine Intentions"
    Technology Research News (06/23/04); Smalley, Eric

    Swiss and South African researchers have combined fractals and clustering algorithms into a visual interface that can be used to understand a machine's internal states in much the same way that people use body language to interpret behavior. The algorithm clusters snapshots of a machine's sensory input, computational processing, and output, while a fractal generator creates a fractal pattern in the center of the display; these fractals radiate outward in concentric rings so that observers can see how the machine's state changes over time. In one possible scenario, the display of a fractal pattern derived from snapshots that correlate with a high level of sensory stimulation would let observers perceive that the machine is sensing a change in its surroundings. Jan-Jan van der Vyver of the University of Zurich and the Swiss Federal Institute of Technology notes that the researchers made sure not to use any human-like representations that people might link to specific machine behaviors or objectives, which are unlikely to dovetail with the machine's actual state. Jeffrey Nickerson of the Stevens Institute of Technology is unsure that the fractal display is essential, as autonomous machines could be programmed to explicitly express their intentions. The feasibility of the fractal interface was demonstrated two years ago at Switzerland's Expo.02, where the display was connected to a neural net that controlled the cameras, microphones, pressure sensors, speakers, and light projectors of a smart room. Van der Vyver says that self-developing technologies such as genetic algorithms will contribute to the emergence of truly autonomous systems, but predicts that about five years will pass before practical applications of the fractal interface debut.
    Click Here to View Full Article
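
    To illustrate the general approach described above, the following Python sketch clusters machine-state snapshots and maps each cluster to the parameters of a small Julia-set fractal. It is a minimal sketch only: the feature layout, the number of clusters, and the cluster-to-fractal mapping are assumptions made for illustration, not details taken from the Zurich research.

      # Illustrative sketch: cluster state snapshots, then pick a fractal per cluster.
      # The mapping below is an assumption, not the researchers' actual method.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # Fake "snapshots" combining sensory input, internal processing, and output activity.
      snapshots = rng.normal(size=(500, 12))

      # Cluster the snapshots so that similar machine states share a visual signature.
      kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(snapshots)

      def julia_patch(c: complex, size: int = 64, max_iter: int = 40) -> np.ndarray:
          """Escape-time Julia-set raster for constant c (one 'ring' of the display)."""
          xs = np.linspace(-1.5, 1.5, size)
          z = xs[None, :] + 1j * xs[:, None]
          counts = np.zeros(z.shape, dtype=int)
          for _ in range(max_iter):
              mask = np.abs(z) <= 2.0
              z[mask] = z[mask] ** 2 + c
              counts += mask
          return counts

      # One Julia constant per cluster; a real system would derive this from the centroids.
      constants = [complex(-0.4 + 0.05 * k, 0.6 - 0.05 * k) for k in range(5)]

      # The fractal shown for the latest snapshot is chosen by its cluster label.
      label = int(kmeans.predict(snapshots[-1:])[0])
      frame = julia_patch(constants[label])
      print(f"current state cluster: {label}, fractal frame shape: {frame.shape}")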

  • "Supercomputer Ranking Method Facing Revision"
    CNet (06/18/04); Shankland, Stephen

    The Linpack test currently used to rank supercomputers on the biannual Top500 list is limited since it chiefly measures processor performance while failing to account for other supercomputing aspects, such as data transfer speed or integer operations. To address these failings, University of Tennessee professor Jack Dongarra has developed the HPC Challenge Benchmark Suite with government funding. The suite consists of seven tests that are run concurrently, although Dongarra says there is no meaningful way to encapsulate their various results into a single composite score. Such tests include Stream, a measurement of the memory-to-processor data transfer speed; Ptrans, an evaluation of processor-to-processor communication rates; DGEMM, which measures the rate of double-precision matrix-matrix multiplication; and b_eff, a yardstick for network response time and data capacity. Dongarra, one of the Linpack test's authors, says the HPC Challenge Benchmark should not be considered a replacement for Linpack, which is essential for enabling historical comparisons in high-performance computing (HPC). Cray in particular has welcomed the new test suite, which is not so surprising since its Top500 rankings have been less than impressive, but Dave Turek of IBM's "Deep Computing" team views the benchmark with more than a little skepticism. He argues that a lack of clear understanding of what the benchmark represents can sow additional confusion, and adds that the tests are biased towards the needs of intelligence organizations such as the FBI or CIA. The HPC Challenge Benchmark is the product of a program the Defense Advanced Research Projects Agency established after a Japanese supercomputer wrested the supercomputing crown from the United States in 2002.
    Click Here to View Full Article
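
    The following Python sketch gives a rough sense of what two of the tests named above measure; it is not the HPC Challenge code itself, and it uses NumPy timing as a stand-in for the suite's carefully tuned kernels.

      # Illustrative micro-benchmark in the spirit of the DGEMM and Stream tests;
      # not the actual HPC Challenge implementation.
      import time
      import numpy as np

      n = 1024
      a = np.random.rand(n, n)
      b = np.random.rand(n, n)

      # DGEMM-style: time a double-precision matrix-matrix multiply, report GFLOP/s
      # (roughly 2*n^3 floating-point operations for an n x n product).
      t0 = time.perf_counter()
      c = a @ b
      t1 = time.perf_counter()
      print(f"DGEMM-like: {2 * n**3 / (t1 - t0) / 1e9:.2f} GFLOP/s")

      # Stream-style triad (z = x + s*y): estimate memory-to-processor bandwidth.
      size = 10_000_000
      x = np.random.rand(size)
      y = np.random.rand(size)
      t0 = time.perf_counter()
      z = x + 2.5 * y
      t1 = time.perf_counter()
      bytes_moved = 3 * size * 8          # read x, read y, write z (8 bytes per double)
      print(f"Stream-like triad: {bytes_moved / (t1 - t0) / 1e9:.2f} GB/s")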

  • "Galileo: Challenge to U.S. Might?"
    Wired News (06/18/04); Shachtman, Noah

    The European Union's Galileo satellite navigation system will work with the U.S.-controlled global positioning system (GPS), according to an agreement between the two partners. Although Galileo was originally planned as a GPS competitor, European and U.S. officials have decided to conjoin the two systems, which will roughly double the number of available satellites and should improve service quality as well as the range of services offered. Military applications were a major sticking point during the negotiations, since satellite location is a major component of modern battlefield systems, including smart bombs and autonomous flying drones; the U.S. military uses a designated portion of the GPS signal, called M-Code, that theoretically would remain protected during a major world conflict while public GPS services were shut down. European space and military officials wanted to place their own government-use signal, "public regulated service" (PRS), alongside M-Code so that it would be difficult for the Americans to shut down. A compromise solution places PRS in an entirely different portion of the frequency range from GPS. With the agreement finished, the European Union still faces the challenge of launching all 30 satellites over the next few years, and European officials plan on securing a private partner to help finance the venture, but a company has yet to be announced. Many observers also wonder what incentive a company would have to sell special services over a system that is already basically free. PRS will add some capabilities, such as special broadcasts meant for airline pilots that would correct upper-atmosphere distortions and a "search and rescue" frequency dedicated to listening for distress calls. The first Galileo satellite will launch in October 2005 for testing purposes, and should be followed by others until the full system goes online in 2008.
    Click Here to View Full Article

  • "Superhighway Code"
    The Engineer (06/10/04)

    The State of Minnesota is testing a new traffic data system that depends on information coming from individual vehicles instead of from cameras or observation helicopters; Ford intelligent vehicle technologies project manager Ron Miller says the goal is to make each car a traffic and weather sensor capable of communicating with outside systems and other cars, which will enable authorities to monitor and respond to traffic conditions in real time. The Vehicles and Infrastructure Integration project is a national effort, according to Miller. More than 50 state police cars, ambulances, and other state-owned vehicles in the Minneapolis and St. Paul region will be equipped with a bevy of sensors this year, while data collected will include vehicle speed, location, and bearing, as well as environmental indicators such as whether wipers or headlights are turned on. Data will be fed into the Minnesota Condition Acquisition Reporting System, which subsequently puts announcements out on message signs, traffic information phone lines, and on the Web. Miller's team is also working on concept cars, including a modified Ford Explorer that is full of sensors and innovative driving aids. An active night vision feature projects an infrared laser onto the area in front of the car, wider than the headlights can illuminate, and captures the reflected scene with an infrared-sensitive camera; the result is shown on a night vision display on the dashboard. Another feature uses magneto-resistive sensors to detect nearby cars and warn drivers of blind-spot dangers when changing lanes; drivers are notified with light indicators on their side mirrors and by audible alerts. The team is not just investigating technology, but how to integrate it into the car naturally "so it looks like it belongs there," Miller says. PowerLine Communication is another important area because it cuts down on the number of dedicated wires needed to send data and power to different embedded systems.
    Click Here to View Full Article
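
    As a rough illustration of the probe-vehicle pipeline described above, the Python sketch below aggregates per-vehicle reports (speed, wiper and headlight state) by road segment and emits an advisory when enough probes suggest slow, wet traffic. The field names and thresholds are hypothetical, not taken from the Minnesota system.

      # Hypothetical probe-data aggregator; field names and thresholds are illustrative.
      from collections import defaultdict
      from dataclasses import dataclass
      from statistics import mean

      @dataclass
      class ProbeReport:
          segment_id: str      # road segment the vehicle is on
          speed_mph: float
          wipers_on: bool
          headlights_on: bool

      def segment_advisories(reports, min_reports=5, slow_mph=25.0, wet_fraction=0.6):
          """Return advisories for segments where probes suggest slow, wet traffic."""
          by_segment = defaultdict(list)
          for r in reports:
              by_segment[r.segment_id].append(r)
          advisories = []
          for seg, rs in by_segment.items():
              if len(rs) < min_reports:
                  continue  # too few probes to trust the estimate
              avg_speed = mean(r.speed_mph for r in rs)
              wet_share = sum(r.wipers_on for r in rs) / len(rs)
              if avg_speed < slow_mph and wet_share >= wet_fraction:
                  advisories.append(f"{seg}: slow traffic ({avg_speed:.0f} mph), likely rain")
          return advisories

      reports = [ProbeReport("I-94_E_12", 18.0, True, True) for _ in range(6)]
      print(segment_advisories(reports))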

  • "DS-UWB Enables Convergence"
    Network World (06/14/04) Vol. 21, No. 24, P. 27; Gifford, Ian

    The Institute of Electrical and Electronics Engineers (IEEE) is highly interested in turning Ultrawideband (UWB) into a wireless standard for enabling mobile communications devices to support multimedia-heavy applications. Direct Sequence UWB (DS-UWB), developed by the UWB Forum, integrates single-carrier spread-spectrum design with wide coherent bandwidth. It transmits data as energy pulses at a rate exceeding 1 billion pulses per second, supporting data rates of 28 Mbps, 55 Mbps, 110 Mbps, 220 Mbps, 500 Mbps, 660 Mbps, and 1,320 Mbps. DS-UWB has the edge over legacy wireless technologies in four areas--quality of service, low cost, high data rates that scale to 1 Gbps or higher, and battery life. The technology generates the shortest possible pulses on the widest possible bandwidth, allowing strong, high-data-rate connections in a high multipath environment (such as households) and precise geolocation of UWB devices. This makes the technology well suited to applications such as low-power handhelds and high-rate data transfers. DS-UWB will also keep interference to a minimum because its emissions appear as continuous white noise at lower levels than those of competing technologies. Combining a DS-UWB physical layer with the 802.15.3 networking standard can facilitate the delivery of high-rate wireless video, audio, and data transfers by convergence devices, a critical ability for next-generation devices.
    Click Here to View Full Article

  • "Survival of the Fastest: Scientists 'Selectively Breed' Winning Formula One Cars"
    Innovations Report (06/17/04); Moore, Judith H.

    University College London researchers have employed genetic algorithms to build a computer model designed to optimize the performance of Formula One race cars so that they can win competitions. The fastest cars are "evolved" through the selective combination of the vehicles' optimal settings, and results demonstrate the possibility of trimming the best lap time by 0.88 seconds per lap. The simulated vehicle had 68 configurable parameters that influenced suspension, tire pressure, the engine, steering control, and fuel consumption. "Each best performance solution was treated as though it had its own genes that define those parameters," explains Peter Bentley of the Digital Body Group at UCL's Department of Computer Science. "These winning solutions were then bred to produce the next generation, which combined the best settings of both parent cars until eventually we evolved the ultimate Formula One vehicle setup." A series of five experiments was carried out using simulations of England's Silverstone track and Germany's Nurburgring track devised by Electronic Arts. Krzysztof Wloch of UCL's Department of Computer Science says the British course permitted the testing of vehicles configured for low downforce because of its slow corners and fast sweeping turns, while the convoluted German course was better suited for tests of vehicles tuned for high downforce. Bentley reports that Formula One cars currently use performance-monitoring software, but the UCL system allows the car's configuration to be optimized in the middle of a race; this would permit settings to be tweaked to minimize any damage the car sustains, for instance. A paper detailing the UCL team's breakthrough will be presented at a Seattle conference in June and reported in this week's edition of New Scientist.
    Click Here to View Full Article
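
    The sketch below shows the shape of such a genetic algorithm in Python: each "genome" is a vector of 68 setup parameters, fitness is a simulated lap time, and the fastest setups are bred by crossover and mutation. The lap-time function here is a stand-in invented for illustration; it is not the Electronic Arts simulation or the UCL team's code.

      # Minimal genetic-algorithm sketch; the lap_time() objective is a stand-in.
      import random

      N_PARAMS = 68        # mirrors the number of setup parameters in the study
      POP_SIZE = 40
      GENERATIONS = 100

      def lap_time(setup):
          # Stand-in objective: pretend the ideal value of every parameter is 0.5.
          return 60.0 + sum((p - 0.5) ** 2 for p in setup)

      def crossover(a, b):
          # Uniform crossover: each parameter comes from one parent or the other.
          return [random.choice(pair) for pair in zip(a, b)]

      def mutate(setup, rate=0.05, scale=0.1):
          return [min(1.0, max(0.0, p + random.gauss(0, scale)))
                  if random.random() < rate else p for p in setup]

      population = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP_SIZE)]
      for _ in range(GENERATIONS):
          population.sort(key=lap_time)            # fastest setups first
          parents = population[: POP_SIZE // 4]    # keep the best quarter
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP_SIZE - len(parents))]
          population = parents + children

      best = min(population, key=lap_time)
      print(f"best simulated lap time: {lap_time(best):.3f} s")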

  • "Secretary of State Finally Sees the E-Voting Light"
    SiliconValley.com (06/17/04); Gillmor, Dan

    Dan Gillmor asserts that California Secretary of State Kevin Shelley is on the right track in requiring paper trails for electronic voting machines. The secretary wants to mitigate the risks of electronic voting by issuing standards that, among other things, require counties with touch-screen systems to implement a voter-verifiable paper trail to ease recounts by July 1, 2006. Shelley's office also announced an agreement with Santa Clara and other California counties to provide paper ballots to voters who prefer them. Shelley cited the flaws of vendor-funded "independent testing" when requiring that e-voting machines' programming code be disclosed to the state for evaluation. He says his actions were spurred by shaky voter confidence in e-voting technology, reports of performance problems and potential software bugs and security holes, and equipment malfunctions during California's March primary. After it was concluded that e-voting machine supplier Diebold had deceived the state in its claims of product reliability and security, Shelley decertified certain Diebold systems and instituted the new direct recording electronic (DRE) standards for counties. Meanwhile, the League of Women Voters recently amended its position on DRE voting when influential members such as former ACM President Barbara Simons aggressively protested against the use of paperless systems. Now the organization's leadership is endorsing a proposal that requires "voting systems and procedures that are secure, accurate, recountable and accessible." Gillmor writes that Shelley has emerged as a key figure in the e-voting controversy who he hopes will serve as a catalyst to the rest of the United States in making DRE voting more secure and reliable.
    Click Here to View Full Article

    For more on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "Monitoring Complex Systems"
    EarthWeb (06/17/04); Spafford, George

    Unless monitoring systems are kept up-to-date and given sufficient priority, they will make infrastructure less secure, not more secure. As networks become more complex, attention must be paid to the monitoring systems that watch systems and send IT administrators alerts when something goes wrong. Too often, monitoring systems are merely an afterthought to network engineering and tacked on haphazardly; such deployments affect the reliability of monitoring systems and cause those systems to either produce false alerts or alerts that are not aligned with actual needs. IT administrators who perceive their monitoring system as ineffectual will ignore alerts or even falsify required reports because they see the practice as a waste of time. Monitoring systems that are perceived as unreliable, therefore, pose a significant security risk to the organization because hackers are likely to choose points of entry that already produce a high number of false alerts. In order to solve this problem, monitoring systems must be given careful thought, first in regard to what parameters must be monitored and second in regard to how the system is deployed. Monitoring systems, like any other system, are vulnerable to single-point-of-failure incidents, but doubling up on different types of sensors can increase the quality of alerts and, in consequence, administrators' confidence in those systems. Besides incorporating monitoring into network design and deploying appropriate technology, organizations should also plan on continuous improvement to their monitoring systems, while change advisory boards that review, approve, and schedule adjustments, together with open lines of communication between functional groups, help keep monitoring systems at the forefront of network issues.
    Click Here to View Full Article
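
    As a small illustration of the redundant-sensor point above, the Python sketch below escalates an alert only when two different sensor types flag the same host within a short window, so a single noisy sensor produces fewer false pages. The sensor names and the time window are hypothetical.

      # Hypothetical alert-corroboration sketch; sensor names and window are illustrative.
      from datetime import datetime, timedelta

      ALERT_WINDOW = timedelta(minutes=5)

      def correlate(alerts):
          """alerts: (timestamp, sensor_type, host) tuples. Escalate a host only when
          at least two different sensor types flag it within ALERT_WINDOW."""
          escalations = []
          by_host = {}
          for ts, sensor, host in sorted(alerts):
              recent = [(t, s) for t, s in by_host.get(host, []) if ts - t <= ALERT_WINDOW]
              if any(s != sensor for _, s in recent):
                  escalations.append((ts, host))
              by_host[host] = recent + [(ts, sensor)]
          return escalations

      now = datetime(2004, 6, 21, 9, 0)
      alerts = [
          (now, "network-ids", "web01"),                          # network sensor fires
          (now + timedelta(minutes=2), "host-agent", "web01"),    # host sensor agrees -> escalate
          (now + timedelta(minutes=3), "network-ids", "db01"),    # lone alert, not escalated
      ]
      print(correlate(alerts))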

  • "Panel Urges U.S. Strategy to Counter China's Tech Rise"
    EE Times (06/17/04); Leopold, George

    Congress has received a new report that expresses concerns about the impact of Chinese technology developments on U.S. economic and national security. The annual report of the U.S.-China Economic and Security Review Commission addresses the issue of the challenge that Chinese technology advancements pose to U.S. competitiveness. The panel that monitors developments in China noted that the nation is pursuing domestic standards for wireless encryption software, "exclusive technology formats" for cell phones and for DVD players (the Enhanced Versatile Disc), as well as draft standards for radio-frequency identification technology. The report also says Taiwan has not stopped moving semiconductor manufacturing to mainland China, where a value-added tax on chip imports can be sidestepped. Meanwhile, the Chinese military is working with local research institutions such as the Chinese Academy of Sciences in areas such as remote sensing, semiconductors, and laser technology. "The extent to which China uses its enhanced technology capabilities to accelerate its military modernization programs is of direct national security concern to the United States," Commission Chairman Roger Robinson said during a House Armed Services Committee hearing June 16, 2004.
    Click Here to View Full Article

  • "A Golden Vein"
    Economist Technology Quarterly (06/04) Vol. 371, No. 8379, P. 22

    Mining databases for useful information is regarded with some suspicion by companies, which tend to use it alongside traditional analytics, but vendors of business intelligence (BI) systems say the technology is improving rapidly and offering unprecedented opportunity. Harnessing customer data to provide more personalized service could allow Wal-Mart, for example, to mimic the intimacy of a smaller local store. And grocery chains use analytic systems to find nuance in their sales data, understanding that an under-selling product might actually be key to retaining the store's most valued customers. Specifically, data mining is becoming more useful in three areas: real-time monitoring, predictive analysis, and analysis of unstructured data. Real-time analysis means banks and credit card companies can watch customers' transactions as they are processed and flag possible cases of fraud; Visa's European operations implemented a real-time monitoring system called VISOR in January and reduced instances of fraud from 1,576 per month to just 458 per month, and achieved a far lower false-positive rate. Real-time data mining is made possible by advances in databases themselves, since data is better structured for quick analysis, and by gains in computer processing, especially the use of parallelism. Predictive analysis has delivered less solid results because trained human analysts are still needed to get the best out of it. However, analyzing unstructured data is opening up entirely new territories for companies: IBM's Web Fountain research project is a good example, as it harnesses the entire Web for analytic purposes; Semagix uses Web Fountain, together with natural-language processing techniques, to better identify money-laundering schemes.
    Click Here to View Full Article
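
    The Python sketch below illustrates the flavor of such real-time checks (it is not Visa's VISOR system): a running per-card profile is kept online, and a transaction is flagged when it deviates sharply from that profile. The scoring rule and thresholds are illustrative assumptions.

      # Illustrative online fraud check; thresholds and the scoring rule are assumptions.
      from collections import defaultdict
      from math import sqrt

      class CardProfile:
          """Running mean/variance of transaction amounts (Welford's online algorithm)."""
          def __init__(self):
              self.n, self.mean, self.m2 = 0, 0.0, 0.0
          def update(self, amount):
              self.n += 1
              delta = amount - self.mean
              self.mean += delta / self.n
              self.m2 += delta * (amount - self.mean)
          def std(self):
              return sqrt(self.m2 / self.n) if self.n > 1 else 0.0

      profiles = defaultdict(CardProfile)

      def flag_transaction(card_id, amount, z_threshold=4.0, min_history=10):
          """Return True if the transaction should be held for review."""
          p = profiles[card_id]
          flagged = (p.n >= min_history and p.std() > 0
                     and abs(amount - p.mean) / p.std() > z_threshold)
          p.update(amount)   # every transaction still updates the profile
          return flagged

      for amt in [42, 38, 55, 40, 47, 39, 51, 44, 46, 43, 41]:
          flag_transaction("card-123", amt)
      print(flag_transaction("card-123", 2500))   # large outlier -> True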

  • "Net Visionary Urges E-Mail ID Standard"
    CNet (06/17/04); Olsen, Stefanie

    In his opening address to the first Email Technology Conference on June 17, TCP/IP co-author and chief corporate strategist for MCI Vint Cerf said an email authentication standard is a critical step toward effectively combating the growing problem of junk email. Brightmail estimates that spam accounts for up to 64 percent of all email, and its proliferation is partly due to the use of the Simple Mail Transfer Protocol, which gives recipients no way to verify senders' identities. One sender verification scheme Cerf advocated is digital signatures, which are affixed to email so that delivery of the message can be blocked until the signature is authenticated. However, he pointed out that such a scheme could be hindered in public forums that lack a central technology governance authority. Cerf warned that active HTML and XML have elevated the risk of spam delivered to the PC, and he advised consumers to bolster their defenses against spam, spyware, and malware by getting into the habit of running filters and anti-spyware programs. The FTC recommended in its report on the proposed federal No-Spam list that the industry create a standard email sender authentication system. Many leading corporations are pursuing such a system: Microsoft, for instance, has entered into an agreement to integrate its Caller ID for Email with Sender Policy Framework, while Yahoo! and others are evaluating encryption protocols to confirm senders. Recently announced anti-spam products include a customizable spam-filtering system from Cloudmark, and IronPort Systems' Virus Outbreak Filters, which are designed to recognize and corral suspicious email or malware before it can contaminate the entire network.
    Click Here to View Full Article
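
    The Python sketch below shows the workflow Cerf describes in miniature: the sender attaches a signature computed over the message, and the receiving server verifies it before accepting delivery. A shared-secret HMAC stands in for the public-key signatures that real proposals use, so this is an illustration of the idea, not an implementation of SPF, Caller ID for Email, or any other standard.

      # Toy sender-authentication sketch; a shared secret stands in for real key material.
      import hashlib
      import hmac

      SHARED_SECRET = b"demo-key"   # real schemes publish a public key (e.g., in DNS) instead

      def sign_message(sender: str, body: str) -> str:
          payload = f"{sender}\n{body}".encode()
          return hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

      def accept_delivery(sender: str, body: str, signature: str) -> bool:
          """Receiving server: accept the message only if the signature verifies."""
          expected = sign_message(sender, body)
          return hmac.compare_digest(expected, signature)

      sig = sign_message("alice@example.com", "Quarterly report attached.")
      print(accept_delivery("alice@example.com", "Quarterly report attached.", sig))    # True
      print(accept_delivery("mallory@example.net", "Quarterly report attached.", sig))  # False: spoofed sender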

  • "Overview: 100 Best Places to Work in IT"
    Computerworld (06/14/04); Brandel, Mary

    The top 100 companies in this year's Computerworld Best Places to Work in IT survey are characterized by their ability to deliver the benefits IT employees value the most after standard perks such as paid vacation and health care: technology, training, and schedule flexibility. The economic turbulence of the last few years has dramatically affected the provision of such benefits by companies. "Leading Geeks" author and C2 Consulting principal Paul Glen explains, "When you're dealing with a down economy, you have to focus on the basics--cool work, great relationships, fair pay and a reasonable belief that the future holds more of the same." Cutter Consortium fellow Tom DeMarco observed that in the past year some companies attempted to boost their efficiencies in the wake of layoffs by dumping more work on their remaining personnel, which hurt employee morale and hindered innovation. Many companies that made the Best Places list are distinguished by strong IT governance mixed with a central data approach that ties into clearly defined business strategies. Leading companies also empower employees by allowing workers to voice their views and suggest creative ideas to open-minded, higher-ranking staffers. Access to cutting-edge technology and the feeling that employee contributions are benefiting society are also critical, while another element workers respond well to is the opportunity to become competent in multiple disciplines; complementing this is positive reinforcement from the company through additional responsibilities, promotions, and citations. But all of these benefits are worthless if employees feel less than secure about the stability of their jobs.
    Click Here to View Full Article

  • "Shortage of Computer Security Experts Hampers Agencies"
    National Journal's Technology Daily (06/10/04); New, William

    Homeland Security Department chief security officer Jack Johnson warns there is a severe lack of IT security professionals in government, and that the government needs to train the "next generation" of cyber experts. Johnson says his agency lacks the IT workforce it needs to build required security systems, and would contract that job out to private-sector workers, except that there are only so many cleared contractors. At the Homeland Security Department, Johnson and CIO Steve Cooper have split data security tasks, with Johnson handling unclassified data and Cooper dealing with more sensitive material. Cooper is currently working on a Homeland Security Information Network he says will be on par with Defense Department security by the end of this year, and is also redesigning personnel security in order to lessen internal cybersecurity threats. Federal Aviation Administration (FAA) deputy director Thomas O'Keefe says that more research and development is needed for cybersecurity, along with more collaboration among industry and researchers. He argues that information-sharing among government security professionals needs to be more efficient and effective than information-sharing among Internet criminals. O'Keefe notes that the nation's air-traffic control system is completely separate from the Internet, protecting it from viral outbreaks. The FAA is moving to an IP-based system, but will still keep its network separate from the general Internet.
    Click Here to View Full Article

  • "R&D Envy"
    InformationWeek (06/14/04) No. 998, P. 38; Babcock, Charles

    Meta Group analyst Howard Rubin reports that IT research currently accounts for less than 4 percent of most enterprise IT budgets, compared to 8 percent to 10 percent a few years back. But despite increased budgetary thriftiness and scrutiny engendered by the economic downturn, some companies are still openly pursuing experimental business technology research with no clear return on investment or time frame. Securing funding for IT research and development requires gaining the trust of business partners by convincing them that the money spent on R&D efforts is important to the company. One option is to look for funding outside the organization: Realizing that he lacked the financial means to provide a proof of concept and draft an implementation strategy for a project to connect far-flung law-enforcement agencies in North Dakota through a centralized state system, independent consultant (and later state CIO) Curtis Wolfe asked for help from Search, a project-specification consulting consortium that used Justice Department funding to supply what Wolfe needed. Thanks to Search's help, North Dakota was awarded $2 million from the Justice and Homeland Security departments to deploy the integrated Law Enforcement Records Management System. Hilton Hotels dedicates less than 1 percent of its IT budget to research, but stretches the value of its investment by contributing to the FedEx Institute of Technology at the University of Memphis, and retaining a room at a Los Angeles-based hotel where interaction between guests and new technologies is tested, among other things. How a company handles IT R&D is determined by how important the organization's leadership considers R&D to be.
    Click Here to View Full Article

  • "Vigilantes on the Net"
    New Scientist (06/12/04) Vol. 182, No. 2451, P. 26; Moran, Barbara

    Counterstrike software is viewed as a panacea by companies frustrated by ineffective laws and enforcement against hackers and other online miscreants, but critics claim that such a tactic is unethical, possibly unlawful, and could provoke an all-out war in cyberspace. Most organizations' response to cyberattacks is to bolster their defenses with firewalls, honeypots, and other measures, but network managers are locked into an unending game of one-upmanship with hackers; furthermore, small companies may not have the financial resources to upgrade their protection. It was this conundrum that prompted Tim Mullen of AnchorIS to develop software that strikes back at malware such as the Nimda worm by sending its own mutual exclusion (mutex) program back to the machine the worm came from and causing it to reboot (thus canceling the worm's mutex), while the user of the worm-sending machine is informed of his culpability via a pop-up window. Symbiont's iSIMS software is more sophisticated, and offers more aggressive counterstriking options: The product analyzes attacks to determine their point of origin, the damage they could cause if not stopped, and possible response strategies, leaving the final decision to the individual client. Offensive measures iSIMS is capable of include altering routing data on a malware-laden packet so that it is directed back to its source, and a last-resort option of sending code to the attacking computer that stops the attack. A key concern of critics is that counterstrike software can target innocent users such as owners of "zombie" computers who are unaware that their machines have been hijacked, or people whose addresses have been deliberately spoofed by hackers. In one scenario, malicious parties could exploit counterstrike software and goad two organizations to attack each other. Lawrence Berkeley National Laboratory engineer Eugene Schultz contends that the mentality behind counterstrike software is typical of "a small number of...hotheads...who want to get back at people."

  • "In Dust We Trust"
    Economist (06/10/04)

    Wireless sensor networks, more commonly known as "smart dust," have been aggressively hyped as a technology with wide-ranging military, commercial, and domestic applications. Each sensor packages a microprocessor, a power source, and a two-way radio into a small form factor; their wireless connections are capable of self-configuration, and coupling this capability with an internal power supply makes the sensors easy to install. A standard communications protocol for smart dust, 802.15.4 (the basis of the ZigBee specification), has been drafted by the Institute of Electrical and Electronics Engineers, and the initial ZigBee specifications, which will center around residential, industrial, and building controls, should be finalized by year's end. BP is sponsoring the most ambitious smart dust experiment to date by equipping an oil tanker with 160 "mote" sensors designed to track vibrations in the vessel's various mechanical systems to see if equipment malfunctions can be predicted. The company has also installed sensors in trucks to trace routes and keep tabs on drivers' behavior. Dust Networks founder Kris Pister says he was asked to assess whether a new building's wired sensor system could be replaced with 50,000 motes, but has concluded that such a deployment is not feasible with current sensor technology. Meanwhile, Ohio University researchers are engaged in a U.S. military project to enclose 10 square kilometers of corn fields in a "fence" comprising 10,000 sensors. The concept behind smart dust--the incorporation of tiny sensors into an environment--has raised fears of privacy infringement similar to those generated by the proposed deployment of radio-frequency identification tags, but for now mote technology is being employed for mundane applications such as monitoring seismic activity in a California building, controlling the lighting system of a Chicago edifice, and taking temperature and pressure readings in an underground station in London.

  • "Decoding Application Security"
    CSO Magazine (05/04); Violino, Bob

    The World Wide Web has made business easier, but it has made information security more expensive and difficult. Application security is a major issue for chief information security officers (CISOs). Security product vendors are introducing new products intended to provide application-level security that firewalls cannot, but CSOs and CISOs say that enterprises should proceed cautiously as the processes and products mature. Web application attacks use application flaws to get into systems or computers, and defensive measures include code inspection, outside scanning for flaws, and application-security gateways that scan incoming network traffic more deeply than conventional firewalls. Web-application security monitors applications to make sure they behave the way they are supposed to, explains Gartner's Richard Stiennon, an approach he says is more effective than trying to learn every attack signature. Yankee Group predicts that the market for application security products and services will go from 2002's $140 million to $1.74 billion by 2007. The technologies currently available are working well, say early adopters. New York State Office of Cyber Security & Critical Infrastructure Coordination director Will Pelgrin says the state is looking into application-security products, and has included application-security best practices in its state agencies' security policy. The Department of Energy is evaluating a NetContinuum gateway, and senior security analyst John Dias says the agency's vulnerability to application-level attacks has dropped. However, the technologies are hindered somewhat by their impact on application performance, complex implementation, an unproven track record, and funding and training issues.
    Click Here to View Full Article
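
    The Python sketch below illustrates the positive-model idea Stiennon describes: instead of matching known attack signatures, a gateway checks that every request parameter conforms to what the application is supposed to accept. The parameter schema is hypothetical.

      # Hypothetical allowlist ("positive model") validation; the schema is illustrative.
      import re

      # Expected shape of each parameter for one form, defined by the application owner.
      PARAM_RULES = {
          "account_id": re.compile(r"^\d{1,10}$"),
          "email":      re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$"),
          "comment":    re.compile(r"^[\w\s.,!?'-]{0,500}$"),
      }

      def request_allowed(params: dict) -> bool:
          """Accept the request only if every parameter is expected and well-formed."""
          for name, value in params.items():
              rule = PARAM_RULES.get(name)
              if rule is None or not rule.fullmatch(value):
                  return False
          return True

      print(request_allowed({"account_id": "12345", "comment": "Great service!"}))   # True
      print(request_allowed({"account_id": "1; DROP TABLE users"}))                  # False: unexpected input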

  • "Women, Minorities, Persons With Disabilities in Science, Engineering"
    Newswise (06/16/04)

    U.S.-based Asian/Pacific Islanders with bachelor's degrees in science and engineering (S&E) are attracting wages that surpass those of their college-age white peers, according to the new report, "Women, Minorities, and Persons With Disabilities in Science and Engineering 2004." The online report also shows that the number of African Americans, Hispanics, and American Indian/Alaska Natives earning S&E degrees continues to grow steadily, although at a slow pace. There has been a tremendous increase in the number of associate and bachelor's degrees awarded since 1997, but women have not followed the trend. Women earned 37 percent of bachelor's degrees in computer science in 1985, but just 28 percent of computer science bachelor's degrees in 2001. Women represent 41 percent of all S&E graduate students, but only 20 percent of women are pursuing engineering degrees. In comparison, nearly 70 percent of Asian/Pacific Islanders who are S&E graduate students have chosen engineering, computer sciences, and biological sciences. Such fields were pursued by 42 percent of whites, and one-third of blacks, Hispanics, and American Indian/Alaska natives. A similar percentage of graduate students with disabilities has chosen engineering, computer sciences, mathematics, and the life and physical sciences.
    Click Here to View Full Article


 