ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 571: Friday, November 14, 2003

  • "Study: Tech Has Glass Ceiling"
    SiliconValley.com (11/12/03); Guido, Michelle

    Despite the United States' vanguard position in technological advancement, there is still a dearth of female IT leaders, according to a new study from Catalyst, a nonprofit research and advisory group. The study draws on five roundtable discussions with 75 senior executives, male and female, who attributed the scarcity of women in the upper ranks of IT to several factors: a lack of mentors, role models, and networks for women; companies' failure to strategically and objectively recognize and develop women's skills; an exclusively male corporate culture that frowns upon the advancement of women; and the difficulty of balancing family and home life with career goals. "What is surprising is that in an industry that thinks of itself as a meritocracy, women and men both perceive a lack of acceptance of women," observes Catalyst President Ilene H. Lang. Catalyst's Kara Helander also notes that a running theme within the discussion groups was the assumption that women are less prepared than men to assume leadership responsibilities. "People assumed that women are too emotional to be effective leaders, that a woman who has a family won't be willing to travel--which can automatically exclude her from a more high-profile job," she explains, adding that such attitudes allow managers to relegate women to support work, which significantly lowers their opportunities for advancement. To reverse this trend, the Catalyst report advises companies to give women access to career development programs, offer them networking and mentoring opportunities with other women, and foster greater workplace flexibility. The report also cited several companies that have taken positive steps toward improving women's chances of attaining leadership positions.
    Click Here to View Full Article

    For information on ACM's Committee for Women in Computing, visit http://www.acm.org/women.

  • "Net 'Outcry' Spurs Review of Patent"
    Washington Post (11/13/03) P. E1; Krim, Jonathan

    The U.S. Patent and Trademark Office decided Nov. 12 to reexamine a Web browsing patent held by the University of California at San Francisco and Eolas Technologies CEO Michael D. Doyle, in response to what deputy commissioner for patent examination policy Stephen G. Kunin called a "substantial outcry" from the Internet community. The outcry erupted in August, when a jury ruled that Microsoft's popular Internet Explorer software infringed on the Eolas patent and ordered the company to pay $520.6 million in damages. Internet standards bodies, technologists, and companies are defending Microsoft because the widely used Internet Explorer has become a de facto standard across a significant portion of the World Wide Web. Until the case is resolved, Microsoft will proceed with plans to modify Internet Explorer so that it no longer infringes the Eolas patent, which will force Web site providers to add code to their pages if the pages are to keep running smoothly in the altered browser; pages that are not updated will trigger a pop-up box every time users invoke an interactive feature, asking whether they wish to continue. In an appeal for patent review made on behalf of the World Wide Web Consortium, Web pioneer Tim Berners-Lee argued that the patent office had missed or disregarded similar technology that existed before UC San Francisco filed its patent application in 1994. The timing of the reexamination is fortuitous, as the patent office is being criticized for approving an excessive number of poor-quality patents, while companies, patent attorneys, and top agency officials are urging Capitol Hill to use patent application fees to fund an upgrade of the patent office. The review could take up to a year to complete.
    Click Here to View Full Article

  • "Europe's IT Skills Shortage Evaporates"
    Web Host Industry Review (11/13/03); Eisner, Adam

    Bleak projections of IT staff shortages in the European Union, forecast in the late 1990s by International Data Corp. (IDC) and other research firms, have generally not come to pass. More European tech workers are being forced to seek employment locally in response to the United States' recent decision to dramatically reduce the number of work visas for foreign IT professionals. Signs of an impending staff shortage did appear before the bursting of the tech bubble and the telecom market crash effectively erased it, making jobs harder to come by while raising the availability of qualified personnel. The outsourcing of IT jobs to India and other countries outside the EU is also creating headaches for European economies. Pierre Audoin Consultants recently reported that Romania could become a center of outsourcing thanks to its abundance of highly skilled, English-speaking IT employees and tech graduates, as well as the return of IT specialists from the United States and Western Europe. Other eastern European countries are also gaining ground as outsourcing destinations. The European IT job market is expected to continue its sluggish growth: The European Information Technology Observatory predicts that Europe's IT services sector will grow only 2.8 percent this year and 4.7 percent next year. A more optimistic forecast anticipates significant growth in broadband technology and services as high-speed Internet proliferates throughout the continent, though not enough to produce a major IT staff shortage.
    Click Here to View Full Article

  • "Whatever Happened to Bluetooth?"
    IDG News Service (11/14/03); Lemon, Sumner; Lawson, Stephen; Krazit, Tom

    Bluetooth is gaining momentum as its growing volume of shipped products pushes prices down, but the technology is still too complex for many users, say experts. Bluetooth Special Interest Group (SIG) executive director Mike McCamon says most of the 1 million Bluetooth-enabled devices shipping right now are mobile phones, with 65 percent of devices going to Europe and just 10 percent to the Americas. The new Bluetooth Specification Version 1.2, announced this month by the Bluetooth SIG, adds new features, faster connection setup, and adaptive frequency hopping designed to mitigate interference; devices using the new standard are expected to reach the market in coming months. Gartner analyst Ken Dulaney warns that many users will be disappointed when some Bluetooth devices fail to work together, since Bluetooth is only a base technology and relies on device profiles, not all of which every vendor or product supports. Devices from the same vendor should work better together, and there are some nifty applications, such as a Bluetooth PC hub from Logitech that connects the user's Bluetooth-enabled PC, keyboard, mouse, and mobile phone. Still, Logitech's Alexis Richard says Bluetooth is not ready for mainstream use yet. Beyond the current incompatibilities between vendors, which Dulaney expects will be resolved over time, Bluetooth is inherently limited by its maximum bandwidth of about 500 Kbps; by comparison, the new Ultra Wideband (UWB) technology reaches 500 Mbps. Intel CTO Pat Gelsinger expects UWB devices to adopt Bluetooth's upper-layer protocols so that the two technologies can coexist. Dulaney says vendors should focus not on replacing Bluetooth but on building out its intended strong points, such as usability and pervasiveness.
    Click Here to View Full Article
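
    Version 1.2's adaptive frequency hopping is conceptually simple: the radio keeps a map of the 79 Bluetooth channels, marks the ones suffering interference (from a nearby Wi-Fi network, say) as bad, and confines its pseudo-random hop sequence to the channels that remain. Below is a minimal Python sketch of the idea; the error rates and the 0.3 threshold are invented for illustration, though the 79-channel count and the 20-channel minimum set do come from the specification.

        import random

        NUM_CHANNELS = 79        # Bluetooth hops across 79 1-MHz channels at 2.4 GHz
        MIN_GOOD_CHANNELS = 20   # the spec mandates hopping over at least 20 channels

        def update_channel_map(error_rates, threshold=0.3):
            """Mark a channel 'good' if its observed packet-error rate is acceptable."""
            good = [ch for ch in range(NUM_CHANNELS)
                    if error_rates.get(ch, 0.0) < threshold]
            if len(good) < MIN_GOOD_CHANNELS:
                # Never shrink below the mandated minimum; keep the least-noisy channels.
                good = sorted(range(NUM_CHANNELS),
                              key=lambda ch: error_rates.get(ch, 0.0))[:MIN_GOOD_CHANNELS]
            return good

        def next_hop(good_channels, rng):
            """Pseudo-random hop restricted to channels currently marked good."""
            return rng.choice(good_channels)

        # Toy run: channels 0-21 overlap a Wi-Fi network and show high error rates.
        errors = {ch: 0.8 for ch in range(22)}
        good = update_channel_map(errors)
        rng = random.Random(42)
        print([next_hop(good, rng) for _ in range(10)])   # hops avoid channels 0-21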

  • "Is That You, Son? Voice Authentication Trips Up the Experts"
    New York Times (11/13/03) P. E8; Eisenberg, Anne

    Though computer-based speaker recognition has made significant progress in the last few years, boosting its accuracy and consistency remains a tough challenge. Speaker recognition expert George Doddington observes that factors such as age, emotion, and metabolism can subtly change vocal characteristics, while the variable ways voices are recorded can also affect the technology's accuracy. Speaker recognition technology has improved in its ability to pick up traits not directly tied to the speaker's vocal mechanism; Douglas Reynolds of MIT's Lincoln Laboratory has helped develop programs with a greater analytical range, taking into account characteristics such as pauses, pitch, and pronunciation style. IBM researchers, meanwhile, are working on conversational biometrics technology that analyzes multiple sources of conversational data, and which could be used as an authentication tool for transactions such as accessing credit card account information over the phone. Ganesh Ramaswamy of IBM's Thomas J. Watson Research Center says, "We look not just at the voice, but at what you say and how you say it." IBM enrolls people in the program by asking them to read from a magazine for 30 seconds. "Any magazine is fine," Dr. Ramaswamy says. "When people speak this long, you get enough of an idea of the frequency content of the various sounds in their voices." The program also models other details, such as pronunciation. In cases where voice characteristics alone may not be enough to verify a speaker's identity, the system is programmed to ask the person questions and judge whether his or her answers are sufficient. The National Institute of Standards and Technology (NIST) holds an annual contest in which speaker recognition programs compete to see which can verify speakers most accurately. The programs are tasked with determining whether two fragments of telephone conversation come from the same speaker, and NIST mathematician Alvin F. Martin notes that the programs have no advance knowledge of the words they will have to contend with. "Instead, they are matching speech on different things at different times," he says.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
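
    Reynolds is best known for the Gaussian-mixture-model approach that underlies many speaker verification systems: enrollment speech is reduced to short-term spectral feature vectors, a mixture of Gaussians is fitted to them, and a test utterance is scored by comparing its likelihood under the speaker's model against a general background model. The Python sketch below shows that scoring scheme with scikit-learn; the random vectors merely stand in for real acoustic features such as MFCCs, so this illustrates the statistics rather than a working recognizer.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Stand-ins for per-frame spectral feature vectors -- purely synthetic.
        speaker_frames    = rng.normal(loc=0.0, scale=1.0, size=(500, 13))
        background_frames = rng.normal(loc=1.5, scale=2.0, size=(5000, 13))

        # Enroll: fit one GMM to the target speaker, one to the background population.
        speaker_gmm    = GaussianMixture(n_components=8,  random_state=0).fit(speaker_frames)
        background_gmm = GaussianMixture(n_components=32, random_state=0).fit(background_frames)

        def verification_score(test_frames):
            """Average log-likelihood ratio: positive favors the claimed speaker."""
            return speaker_gmm.score(test_frames) - background_gmm.score(test_frames)

        same_speaker = rng.normal(0.0, 1.0, size=(200, 13))
        impostor     = rng.normal(1.5, 2.0, size=(200, 13))
        print(verification_score(same_speaker) > 0)   # True: accept
        print(verification_score(impostor) > 0)       # False: reject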

  • "Tech Peddles Its Influence on Campus"
    CNet (11/11/03); Borland, John

    The relationship between the tech industry and universities is becoming so intricate that schools and companies openly collaborate on projects and even curricula that further major companies' agendas, a trend critics argue is endangering academic independence. Industry and corporate-program advocates counter that this concern is exaggerated: For one thing, corporate donations are small compared with both companies' internal research investments and the federal budgets that fund the lion's share of computer science research; in addition, computer companies usually ask only for nonexclusive, sometimes royalty-free, licenses to any intellectual property yielded by research seed grants. Industry's strongest rebuttal to charges of undue influence is the simple fact that profitable near-term results from academic research are a rarity. Nevertheless, there are clear indications that deep corporate relationships are changing the nature of university research--Alan Shaw of the University of Washington, for one, observes that most academic researchers' emphasis appears to have shifted from long-term to short-term results. Corporate relationships and donations are also finding favor at universities for practical reasons, as levels of public and private funding have declined significantly in the last few years. But glowing testimonials from proponents of corporate donations do not sway skeptics, whose suspicion is aroused by such trends as schools' continued use of Windows when freely available Linux software offers comparable power at no charge. Supporters of corporate/academic partnerships, such as the University of Washington's Ed Lazowska, note that the use of popular technologies prepares students for entry into the job market, while corporate executives say such ventures give their companies access to outside ideas and potential employees. U.S. Education Secretary Rod Paige and other educators maintain that corporate influence on academic institutions can be checked through awareness of corporate/academic relationships and meticulous oversight by teachers, administrators, and departments.
    Click Here to View Full Article

  • "Plastic Memory Could Replace Silicon"
    Associated Press (11/12/03); Fordahl, Matthew

    Princeton University and Hewlett-Packard Labs researchers have developed a new polymer-based memory technology that promises greater storage capacity at lower cost than silicon chips. The memory, which combines a small amount of silicon with a flexible foil substrate and a plastic film, can be fabricated without vacuum chambers, high temperatures, or lithography. It is not rewriteable, but reading and writing data requires no power-hungry laser or motor, and data is retained even without a power supply. Nor does information have to be stored in transistors: Data is written by channeling a strong current through a polymer fuse so that it blows and changes its conductivity, and read back by applying smaller currents to sense whether specific junctions are open or closed. Researchers say the memory's simpler production requirements translate to lower per-megabyte costs for consumers, and the memory can be stacked in three-dimensional layers. MIT's Vladimir Bulovic reports that Advanced Micro Devices and Intel are pursuing projects to develop rewriteable polymer-based memory. The new memory technology could be packaged in a small memory-card format and used in cell phones and digital cameras. HP Labs scientist Warren Jackson says the fact that data cannot be rewritten is an advantage for some applications, such as storing music files, photographs, and even accounting data.
    Click Here to View Full Article
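
    The write-once fuse mechanism is easy to model: every bit begins as an intact, conductive fuse; writing a 1 irreversibly blows the fuse; and reading merely senses whether a junction still conducts. A toy Python model of such an array, with all the electrical detail abstracted away:

        class WriteOnceMemory:
            """Toy model of fuse-based write-once memory: bits flip 0 -> 1, never back."""

            def __init__(self, size):
                self.fuses_blown = [False] * size   # intact fuse = conductive = 0

            def write_bit(self, address, value):
                if value == 1:
                    self.fuses_blown[address] = True   # strong current blows the fuse
                elif self.fuses_blown[address]:
                    raise ValueError("bit %d already written; a blown fuse "
                                     "cannot be restored" % address)

            def read_bit(self, address):
                # A small sense current detects whether the junction still conducts.
                return 1 if self.fuses_blown[address] else 0

        mem = WriteOnceMemory(8)
        for i, bit in enumerate([1, 0, 1, 1, 0, 0, 1, 0]):
            mem.write_bit(i, bit)
        print([mem.read_bit(i) for i in range(8)])   # [1, 0, 1, 1, 0, 0, 1, 0]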

  • "Wireless Networks Gain Spectrum"
    Washington Post (11/14/03) P. E1; Stern, Christopher

    In an attempt to spur the penetration of high-speed Internet access in rural communities and other underserved regions, the FCC yesterday reserved a section of airwaves for wireless networks in the 5 GHz band of the radio spectrum, which has traditionally been used by satellite companies and the military. Any company is free to use the new swath of spectrum without a license; the only caveat is that the frequencies must be shared among the companies using them. "[M]aking more spectrum available for this important application will foster facilities based competition and significantly advance the public interest," declared FCC Chairman Michael K. Powell in a public statement. To reduce the interference and congestion that frequency sharing invites, the FCC has limited the amount of power any device can use for signal transmission. Certain public interest organizations have found fault with the FCC's decision, claiming that signals in the 255 MHz of newly allocated spectrum cannot pass through walls or even trees, which would significantly restrict transmission range. Most consumer wireless devices--cell phones and TVs among them--operate below 3 GHz and can transmit signals over long distances with considerably lower power than 5 GHz devices, but FCC officials said several startups have proven that information can be delivered over relatively long distances on the new frequencies. Consultant Gregory L. Rohde says the FCC's decision will give a much-needed jolt to the rural wireless industry, while consultant Coleman D. Bazelon argued that companies could build more robust networks if they controlled the airwaves through licensing.
    Click Here to View Full Article

  • "20-Year Science Map From Dept. of Energy"
    InformationWeek (11/11/03); Chabrow, Eric

    The U.S. government plans to team up with U.S. computer vendors to develop supercomputing capability specifically for science and industrial applications. In a new 48-page report, "Facilities for the Future of Science: A 20-Year Outlook," the Department of Energy details a plan to set up computing facilities that make "ultrascale scientific computing" available for public science research. Businesses would be able to use the technology to create virtual prototypes of new designs and products, which is expected to drastically shorten the time it takes to bring new products to market and to lower investment costs. "There is no real way to measure the competitive advantage this kind of computing power can give us," Energy Secretary Spencer Abraham said Monday at the National Press Club. For example, the report notes that General Motors spent several years and millions of dollars working on its current prototype and simulation of jet engines, a process that could be completed in less than a day with ultrascale scientific computing. The proposal takes Japan's Earth Simulator supercomputer into consideration, and seeks to boost public scientific research computing capability by a factor of 100. Abraham believes ultrascale scientific computing will be a boon for the U.S. economy.
    Click Here to View Full Article

  • "Standard for Securing Domain Name System Nears Finalization"
    InternetWeek (11/11/03); Gonsalves, Antone

    The IETF reportedly will soon finalize DNS-Sec, a security extension that authenticates information passing through the Internet's domain name system, making it easier to identify people who send spam, worms, and viruses. DNS-Sec attaches a digital signature to each domain name and associated IP address in a DNS server, so that a browser looking up the IP address of a domain name can verify that it is getting the authentic address for that name. This makes it harder for spammers and virus writers to use faked address headers. Researchers have been working on the mechanism for roughly 10 years, but it will only work if the groups that control top-level domains, companies with individual DNS servers, ISPs, and telecommunications firms all put it in place. The mechanism also puts more information into DNS storage, which means server managers could have to upgrade their systems and add another layer of administration to oversee the security mechanism, at a cost to the groups that store domain names in the servers. John Pescatore of Gartner says of DNS-Sec, "It's got to be something that's forced from the top." He emphasizes that the mechanism will work only if everyone employs it. ICANN is in favor of DNS-Sec, and deployment of the mechanism could coincide with deployment of IPv6. Cisco engineer Patrik Faltstrom, a member of the IETF committee developing DNS-Sec, says, "It's the sum of all of these mechanisms, or security tools, that make the Internet secure...DNS-Sec is very important, and will move security on the Internet a big step forward."
    Click Here to View Full Article
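
    The signature-checking idea can be demonstrated in miniature with the modern dnspython library (which implements the DNSSEC RFCs as they were eventually finalized, not the 2003 draft): request a zone's DNSKEY records with their signatures attached, then verify the signature over the key set. The zone and resolver address below are placeholders.

        import dns.dnssec
        import dns.message
        import dns.name
        import dns.query
        import dns.rdatatype

        zone = dns.name.from_text("example.com")

        # Ask a resolver for the zone's public keys, with signatures included.
        request = dns.message.make_query(zone, dns.rdatatype.DNSKEY, want_dnssec=True)
        response = dns.query.udp(request, "8.8.8.8", timeout=5)

        # The answer section holds the DNSKEY RRset and the RRSIG RRset covering it.
        rrsets = {rrset.rdtype: rrset for rrset in response.answer}
        dnskey_rrset = rrsets[dns.rdatatype.DNSKEY]
        rrsig_rrset = rrsets[dns.rdatatype.RRSIG]

        try:
            # Verify the signature; a tampered or spoofed answer fails validation.
            dns.dnssec.validate(dnskey_rrset, rrsig_rrset, {zone: dnskey_rrset})
            print("DNSKEY RRset verified")
        except dns.dnssec.ValidationFailure as exc:
            print("validation failed:", exc)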

  • "Advances in Car Technology Bring High-Class Headaches"
    USA Today (11/12/03) P. 1B; O'Donnell, Jayne

    Auto technology can give drivers on-the-road directions, monitor multiple aspects of vehicle performance, and help prevent serious injury in accidents, but the growing complexity of these value-added systems is also causing serious problems; ironically, most of the problems are in high-end cars reputed for their quality. AutoSpies.com founder Donald Buffamanti says he has received hundreds of reports from car owners about computer-related malfunctions, such as cars stalling while cruising on the freeway and seats adjusting without warning while driving. The amount of electronics in cars is increasing rapidly: a new Audi A8L has 36 computers and 1,600 trouble codes; AAA's John Nielsen says, "These things make the space shuttle look antiquated." BMW's Gordon Keil readily admits new technology is by definition immature, while Mercedes director of vehicle electronics Stephen Wolfsried notes that advanced electronics makes possible significant improvements in emissions reduction, safety, performance, and comfort. J.D. Power research shows new-car complaints holding steady even though basic vehicle operations have improved vastly--new electronic systems introduce new problems, offsetting improvements in other areas. Some features are admittedly unnecessary, says Honda executive engineer Erik Berkman, such as the driver-specific programmable door-lock preferences on the Acura TL. But Cadillac engineer David Leone says his company is getting a handle on the increasing complexity by reducing the number of dedicated wires and increasing the number of redundant chips. He says new remote diagnostic technology can save drivers time and trouble by letting their cars be inspected on the road via satellite communications. The new Cadillac XLR sports car, like some Mercedes models, also includes cruise control that adjusts according to the distance between cars instead of only according to speed.
    Click Here to View Full Article
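
    Those trouble codes are reachable through the standardized OBD-II diagnostic port that U.S. cars have carried since 1996. As a rough sketch of what a scan tool (or a remote diagnostic service) does, here is how the third-party python-OBD library reads stored codes, assuming an OBD-II adapter is plugged into the car; the example output is hypothetical.

        import obd

        # Connect to the car's diagnostic port via an attached OBD-II adapter.
        connection = obd.OBD()   # auto-detects the adapter's serial port

        if connection.is_connected():
            # Ask the engine controller for its stored diagnostic trouble codes.
            response = connection.query(obd.commands.GET_DTC)
            for code, description in response.value:
                print(code, "-", description)   # e.g. "P0300 - Random Misfire Detected"
        else:
            print("no adapter found")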

  • "Academics Can Be Fun and Games"
    Wired News (11/13/03); Dean, Katie

    The University of Southern California plans to launch a minor degree program in gaming in late November 2004, in what is reportedly a first for a major research university. The minor, which is nearing finalization, is an interdisciplinary course of study developed by Anthony Borquez of the USC School of Engineering in conjunction with the USC School of Fine Arts, School of Cinema-Television, and Computer Science department. The degree will require students to complete between six and eight courses, with a concentration on art and animation, design and management, or programming; topics of study will include 3D modeling, programming, animation, mobile game development, and game production. Electronic Arts animator and USC instructor Scott Easley says the course will allow students to gain the skills needed to qualify for a job in the gaming industry. USC students will be able to tap into a video game library, video cards, and hardware spread out between two laboratories, and employ 50 PC workstations with attached GameCube, PlayStation 2, and Xbox consoles. Gaming programs were previously restricted to specialized institutions such as DigiPen and art schools, while New York University and the University of Washington offer certificate programs in video games. Borquez originally proposed a gaming program for USC about five years ago, but the idea failed to generate enthusiasm among the administration and students. University of California, Irvine, studio art professor Robert Nideffer also tried and failed to set up a gaming minor at his school three years ago, but since then UC Irvine has received approval for a game culture and technology facility.
    Click Here to View Full Article

  • "Image Processing Means You Can See Both the Wood and the Trees"
    EurekAlert (11/11/03)

    Mathematician Gemma Piella has reworked current wavelet techniques to bring more detailed pictures to image processing. Developed while she was a doctoral student in the Netherlands, the new technique can produce a complete picture of an object, from close up and from a distance, at the same time. For example, Piella's "multiresolution" techniques would allow a user to see, in a single detailed view, the green surface of a forest as well as its trees, leaves, and bark. Piella used a mathematical operation to modify the upward and downward movements of the wavelets so that the small waves can follow the trajectory between start and end points. The wavelets can thus act on geometrical information in the signal being processed, which keeps small details visible even in low-resolution images. Piella's advance could be useful in areas such as medicine, enabling a CT scan and an MRI scan of the brain to be combined into images showing brain tissue as well as bone.
    Click Here to View Full Article
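
    One concrete use of such multiresolution analysis is image fusion, as in the CT/MRI example: decompose each registered image into a pyramid of wavelet coefficients, keep the stronger coefficient at every position and scale, and reconstruct a single picture carrying detail from both sources. A minimal sketch with the PyWavelets library, using random arrays as stand-ins for real scans:

        import numpy as np
        import pywt

        def fuse_images(img_a, img_b, wavelet="db2", level=3):
            """Fuse two registered images by keeping the larger wavelet coefficient."""
            coeffs_a = pywt.wavedec2(img_a, wavelet, level=level)
            coeffs_b = pywt.wavedec2(img_b, wavelet, level=level)

            # Approximation coefficients, then detail triples at each scale.
            fused = [np.where(np.abs(coeffs_a[0]) > np.abs(coeffs_b[0]),
                              coeffs_a[0], coeffs_b[0])]
            for details_a, details_b in zip(coeffs_a[1:], coeffs_b[1:]):
                fused.append(tuple(
                    np.where(np.abs(a) > np.abs(b), a, b)
                    for a, b in zip(details_a, details_b)
                ))
            return pywt.waverec2(fused, wavelet)

        rng = np.random.default_rng(1)
        ct, mri = rng.random((128, 128)), rng.random((128, 128))  # stand-in scans
        print(fuse_images(ct, mri).shape)   # (128, 128)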

  • "Behavior-Monitoring Machines"
    Technology Review (11/12/03); Sherman, Erik

    Behaviometric software learns the behavioral patterns of people and things through observation, and fine-tunes the kinds of behavior--normal and abnormal--it looks for. The effectiveness of such software lies in developers' ability to measure and classify behavior so that the software is not overtaxed by too much detail. A breakthrough application developed at the Technion-Israel Institute of Technology can identify people typing on a keyboard with almost total accuracy without attending to what is being typed; the software focuses on the number of milliseconds it takes a person to type each character, and accounts for slight variations by grouping each typist's individual times. The finite number of key combinations allows researchers to build a probabilistic model for an individual user using existing algorithms. Filtering must be used to prevent behaviometric software from being overwhelmed by details: VistaScape Security Systems, for instance, uses software to examine video surveillance footage at high-risk locations, matching generic object image models against rules established by the client. The software is programmed to ignore details that are ruled irrelevant--an object's color, for example--and target details that might distinguish normal from suspicious behavior, such as an object's size, speed, and direction of movement. The human factor is also a vital component--Living Independently enlists gerontologists to help refine its software, which is designed to monitor elderly people living at home and alert family members, hospital services, and others when their behavior diverges from the norm. Machine learning must be embedded in behaviometric software because "normal" behavior can change over time.
    Click Here to View Full Article
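
    The keystroke-timing idea reduces to a compact recipe: collect the millisecond intervals a user takes for each key pair during enrollment, model each as a distribution, and flag a typing sample whose timings stray too far from the profile. A simplified Python sketch; the data, the per-digraph normal model, and the z-score threshold are illustrative assumptions, not the Technion algorithm itself.

        from statistics import mean, stdev

        def build_profile(enrollment):
            """Per-digraph timing model: mean and std-dev of inter-key milliseconds."""
            return {digraph: (mean(times), stdev(times))
                    for digraph, times in enrollment.items()}

        def matches_profile(profile, sample, max_z=2.5):
            """Accept the sample if every observed digraph time is close to the profile."""
            scores = []
            for digraph, observed_ms in sample:
                if digraph in profile:
                    mu, sigma = profile[digraph]
                    scores.append(abs(observed_ms - mu) / sigma if sigma else 0.0)
            return bool(scores) and max(scores) <= max_z

        # Enrollment: milliseconds this user took to type each key pair, per session.
        enrollment = {"th": [110, 115, 108, 112],
                      "he": [95, 99, 92, 97],
                      "er": [140, 150, 138, 145]}
        profile = build_profile(enrollment)

        print(matches_profile(profile, [("th", 111), ("he", 96), ("er", 142)]))  # True
        print(matches_profile(profile, [("th", 60), ("he", 210), ("er", 90)]))   # False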

  • "Student Wins Award for Novel Web Idea"
    Daily Trojan (11/14/03) Vol. 150, No. 57, P. 1; Patke, Meghan

    Shou-De Lin, a Taiwanese graduate student at the University of Southern California, earned the award for best paper at the 2003 Institute of Electrical and Electronics Engineers (IEEE) Web Intelligence Consortium International Conference last month with a document detailing his "geocoder" project. Lin's computer science professor Craig Knoblock says the geocoder reverses Web sites such as MapQuest by enabling people to enter latitude and longitude coordinates in a Web search engine, and retrieve a corresponding address. "[Lin] had a clever idea about essentially combining different services on the Web together, and the thing that was interesting was that he was able to take a service that was not reversible and add components to make it so," explains Knoblock. Lin says the goal of the project was to mesh Web agents and search engines in order to accommodate the agents' limitations. Lin was the first student to earn the IEEE prize, and received another award at the ACM SIG KDD International Conference on Knowledge Discovery and Data Mining in August for a paper he authored with Information Sciences Institute project leader Hans Chalupsky entitled "Using Unsupervised Link Discovery Methods to Find Interesting Facts and Connections in a Bibliography Dataset." Of his IEEE honor, Lin says, "I personally feel that it's a motivation for me to keep on going in this direction, to study artificial intelligence, as my main research focus is that."
    Click Here to View Full Article
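
    The inversion trick Knoblock describes--making a one-way service run backward by composing it with other components--can be illustrated with a toy forward geocoder: given only an address-to-coordinates lookup, build the reverse lookup by forward-geocoding candidate addresses and returning the one nearest the query point. Everything below, addresses and coordinates included, is made up for illustration.

        import math

        # Stand-in for a forward-only service like MapQuest: address -> (lat, lon).
        FORWARD_GEOCODER = {
            "941 Bloom Walk, Los Angeles, CA":      (34.0205, -118.2891),
            "3551 Trousdale Pkwy, Los Angeles, CA": (34.0224, -118.2851),
            "1 World Way, Los Angeles, CA":         (33.9425, -118.4081),
        }

        def distance(p, q):
            """Rough planar distance; adequate for ranking nearby candidates."""
            return math.hypot(p[0] - q[0],
                              (p[1] - q[1]) * math.cos(math.radians(p[0])))

        def reverse_geocode(lat, lon, candidate_addresses):
            """Invert the forward service: geocode candidates, keep the nearest."""
            return min(candidate_addresses,
                       key=lambda addr: distance((lat, lon), FORWARD_GEOCODER[addr]))

        print(reverse_geocode(34.0210, -118.2880, list(FORWARD_GEOCODER)))
        # -> '941 Bloom Walk, Los Angeles, CA'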

  • "TIA Is Dead--Long Live TIA"
    IEEE Spectrum (11/03) Vol. 40, No. 11, P. 22; Cherry, Steven M.

    The Defense Advanced Research Projects Agency's (DARPA) Information Awareness Office and its controversial Terrorism Information Awareness (TIA) program were defunded by congressional decree in late September, but eight Information Awareness Office projects will be transferred to other parts of DARPA rather than disbanded. Meanwhile, the National Foreign Intelligence Program (NFIP) will conduct related research. TIA-funded surveillance projects that were savaged by the media and are now defunct include a Georgia Institute of Technology initiative to build "gait signatures" for people by analyzing how they walk; and LifeLog, which aimed to record everything a person says and does through a massive multimedia diary. Electronic Privacy Information Center general counsel David Sobel reports, "Killing the Information Awareness Office is a positive first step, but it doesn't eliminate the government's data-mining initiatives. It drives them underground." The eight surviving Information Awareness Office programs, which add up to $51.2 million in funding, are Bio-Event Advanced Leading Indicator Recognition Technology, Rapid Analytical Wargaming, Wargaming the Asymmetric Environment, and five programs to translate and analyze spoken and written natural language. According to a report from the joint House-Senate appropriations conference committee that voted to defund TIA through 2004, NFIP will develop "processing, analysis, and collaboration tools for counterterrorism foreign intelligence" which are prohibited by law from being used in a domestic capacity. The CIA, the FBI, and the National Security Agency are among the intelligence branches that jointly manage NFIP. TIA-like projects could also migrate to the Department of Homeland Security Advanced Research Projects Agency, which received a first-year budget of $800 million.
    Click Here to View Full Article

  • "The Mind of a Hacker"
    InformationWeek (11/10/03) No. 963, P. 42; Hulme, George V.

    The word "hacker" has acquired a bad reputation in light of the harm inflicted on people and computer networks by worms and viruses, identity theft, and digital extortion, while laws such as the Digital Millennium Copyright Act (DMCA) are driving what many people see as legitimate and important security research even further underground. An InformationWeek survey of hackers uncovered some disturbing trends: One respondent, who uses the alias "unnamed," says motivation can differ with each person; reasons given include hacking being a source of intellectual stimulation, a hunger to learn how things work and how they can be manipulated, an opportunity to strike a blow against authority, the government, or capitalism, and the desire to make one's reputation. Most hackers insist they do not engage in unlawful activity, and argue that the disclosure of software flaws can actually help improve security--a claim that is met with scorn by security and business-technology professionals. With the DMCA, the USA Patriot Act, and other legislation that criminalizes hacking gaining ground, it is becoming increasingly difficult to draw the line between hackers and legitimate security researchers. Hacking is becoming a more subterranean activity as a result, and analyst Mark Loveless notes that the underground culture fosters a collaborative attitude to develop tools without considering their ultimate use. Loveless characterizes this attitude as "refreshing," but such a view is distressingly naive at a time when fears of cyberterrorism, identity theft, and infrastructure attacks facilitated by such tools are prevalent. University of Pittsburgh researcher Bernhardt Lieberman thinks schools should offer, and businesses should pay for, courses to teach students about hacking's ethical ramifications. EEye Digital Security co-founder and former hacker Marc Maiffret confides that "A lot of people doing this stuff like doing it because they're doing something illegal or edgy. It's about the thrill of it."
    Click Here to View Full Article

  • "The Code Warriors"
    Time (11/10/03) Vol. 162, No. 19, P. IB3; Roston, Eric

    Boosting homeland security--perhaps the overriding concern of the 21st century--follows a two-pronged strategy of shoring up the safety of both physical and information assets. Security incidents reported to the CERT Coordination Center totaled 114,855 between January and September 2003, compared with 9,859 in all of 1999; Gartner estimates that security spending has increased 28 percent annually since 2001, while overall tech budgets have risen a mere 6 percent. Columbia University computer science professor Sal Stolfo maintains that cyberattacks on the nation's IT networks could trigger a cascade effect that impacts other vital infrastructures, while the National Academy of Sciences reported in 2002 that people's willingness and competence to contend with threats, relative to the threats' magnitude, has worsened since 1991. The group Professionals for Cyber Defense and Michael Vatis, former director of the National Infrastructure Protection Center, suggest that a Manhattan Project for security be set up to take charge of shielding these essential networks. Meanwhile, consultant and entrepreneur Dan Geer says the federal government must institute laws making tech companies responsible for security flaws that put critical infrastructures at risk. At the core of network vulnerability is the open nature of the Internet and its developers' failure to anticipate its potential for abuse; but the human factor is also culpable, as demonstrated by organized crime and other malevolent parties who target databases for extortion, and by consumers and tech firms that fail to install or activate simple security measures. Companies are struggling to automate security and reduce software complexity, which IBM Security & Privacy Research director Charles Palmer calls "the enemy of security." The software industry is looking to emulate credit card companies' effective surveillance of customer behavior, while Geer and others say most desktops are especially vulnerable because they run a standard operating system sustained by the monocultural business model of companies such as Microsoft.
    Click Here to View Full Article

  • "AI Loves Lucy"
    Computerworld (11/10/03) Vol. 31, No. 51, P. 36; Rosencrance, Linda

    Cyberlife Research founder Steve Grand plans to create an artificially intelligent machine that can think and learn through experimentation with Lucy, an android version of an infant orangutan he has developed. Such a breakthrough, Grand hopes, will yield insights on how the cerebral cortex evolves in response to experience, and enable people to design and construct biologically-inspired computational architectures and related applications that boast greater adaptability, intelligence, and robustness. The researcher plans to mimic cerebral evolution in Lucy using computer-simulated neural networks whose interactions will form the basis of the robot's intelligence. Grand boasts that this method has so far enabled Lucy to recognize a banana regardless of color, size, position, or distance. Lucy will undergo an upgrade with 15 new computers Grand purchased using a $68,000 grant from the National Endowment for Science, Technology, and the Arts; the researcher plans to extend Lucy's sensory range and mobility by giving her a more substantial body with improved eyesight and hearing, as well as appendages. One of his goals is to make Lucy capable of learning to crawl and walk, and repeat simple sounds. "With Lucy, [Grand] appears to be taking no shortcuts with sensory inputs or motor outputs, as he is striving to integrate real vision and audition, as well as voice, arms and legs," notes Apple Computer scientist Larry Yaeger, who feels that machine intelligence should be attempted from an evolutionary, rather than engineering, approach. But he adds, "if anyone on the face of the earth can engineer intelligence from scratch, I believe it would be Steve Grand."
    Click Here to View Full Article
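
    The "computer-simulated neural networks" Grand relies on are, at bottom, layers of weighted sums and nonlinearities adjusted by feedback. The numpy sketch below trains one such tiny network on XOR, a classic task that needs a hidden layer; it illustrates only the generic mechanism and has nothing Lucy-specific about it.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy task: learn XOR, which a single layer cannot represent.
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([[0], [1], [1], [0]], dtype=float)

        W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer of 8 units
        W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # single output unit

        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        for step in range(5000):
            # Forward pass.
            h = sigmoid(X @ W1 + b1)
            out = sigmoid(h @ W2 + b2)
            # Backward pass: gradient of squared error through both layers.
            d_out = (out - y) * out * (1 - out)
            d_h = (d_out @ W2.T) * h * (1 - h)
            W2 -= 0.5 * h.T @ d_out
            b2 -= 0.5 * d_out.sum(axis=0)
            W1 -= 0.5 * X.T @ d_h
            b1 -= 0.5 * d_h.sum(axis=0)

        print(out.round(2).ravel())   # should approach [0, 1, 1, 0]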
