
ACM TechNews sponsored by AutoChoice Advisor -- Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 735:  Monday, December 27, 2004

  • "Researchers Get EU Funding for Linux Project"
    IDG News Service (12/22/04); Niccolai, James

    An organization of European research institutions and open-source software firms has secured $4.5 million in funding to create new software development and management tools for simplifying complex IT projects that use Linux and other open-source software. The European Union committed $2.9 million to the Environment for the Development and Distribution of Free Software (EDOS) project in an effort to boost European competitiveness in the IT sector, notes Nuxeo CEO Stefane Fermigier. He says most contemporary Linux deployments consist of thousands of individual software "packages," and the tools EDOS aims to develop could significantly ease their integration and management. EDOS is focusing on the development of two specific tools: A distributed peer-to-peer application that helps system builders insert and mesh together software packages running across scores of PCs and servers, and an automated quality testing suite. Fermigier says the tools will be particularly beneficial for consultants who build customized versions of Linux for projects, and could also abbreviate development cycles for vendors of Linux operating systems. Participants in the EDOS group include project leader and Linux software vendor Mandrakesoft, Nuxeo, the University of Zurich, Tel-Aviv University, the French National Institute for Research in Computer Science and Control, Italy's CSP Torino, the University of Geneva, SOT in Finland, Nexedi SARL, and the University of Paris 7. EDOS is envisioned as a 30-month project with deliverables expected every six months, and Fermigier says the first priority is to study the problems and potentially devise specifications and prototypes.
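
    As a rough illustration of the dependency problem EDOS aims to tame (a minimal sketch; the package names and data below are hypothetical, not drawn from the project), a distribution tool must order thousands of interdependent packages so that each is installed after everything it requires:

        # Minimal sketch of distribution-scale dependency resolution;
        # the package data here is hypothetical.
        from graphlib import TopologicalSorter  # Python 3.9+

        # Each package maps to the set of packages it depends on.
        packages = {
            "glibc": set(),
            "zlib": {"glibc"},
            "openssl": {"glibc", "zlib"},
            "curl": {"openssl", "zlib"},
        }

        # A valid install order lists every dependency before its
        # dependents; a CycleError flags inconsistent metadata.
        print(list(TopologicalSorter(packages).static_order()))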
    Click Here to View Full Article

  • "Just How Old Can He Go?"
    New York Times (12/27/04) P. C1; Lohr, Steve

    Inventor and computer scientist Ray Kurzweil is attempting to extend his life via a rigorously controlled diet that includes 250 daily nutritional supplements, which he interprets as a "reprogramming" of his biochemistry. It is Kurzweil's contention that death could one day be postponed indefinitely thanks to continuing advancements in knowledge and technology. In their book, "Fantastic Voyage: Live Long Enough to Live Forever," Kurzweil and Frontier Medical Institute founder Terry Grossman outline a three-stage, 20- to 25-year process of biotechnology evolution. Kurzweil thinks artificial intelligence and nanotechnology will allow people to reconstruct their bodies in whatever way they desire by the late 2020s, while advanced gene process knowledge will make the reversal of disease and aging through biotechnology therapies possible within 15 to 20 years. The authors say people have a better chance of being alive and healthy by the time nanotech-based life extension technologies arrive if they reprogram their bodies through diet, exercise, and nutritional supplements, as Kurzweil has done. Among the authors' dietary recommendations is the consumption of less food than a person needs, with an emphasis on strict carbohydrate, dairy, and fat limitations, along with a high vegetable intake. University of Illinois at Chicago professor S. Jay Olshansky cautions that the mega-vitamin dosages Kurzweil and Grossman advocate are foolhardy, given that such supplements' benefits have not been properly assessed. Inventions Kurzweil is credited with include an early optical-character-recognition program, the first commercial multiple-word speech-recognition system, a text-to-speech voice synthesizer for the visually impaired, and other pattern recognition technologies that employ AI.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Sprucing Up Open Source's GPL Foundation"
    CNet (12/23/04); Shankland, Stephen

    The General Public License (GPL) at the core of many important open-source software projects is being updated, though a release date for version 3 is not yet set, according to GPL author Richard Stallman. A number of concerns must be addressed in the next update, perhaps most importantly how patented technology is handled: The GPL is currently ambiguous about what happens to companies' patented technology when it is involved in GPL-licensed software, and this needs to be clearly defined. Sughrue Mion attorney Frank Bernstein suggests emulating the Apple Public Source License or the Common Public License used by IBM, both of which cover included patents and punish companies that sue over patented technology by terminating their rights to use and distribute the software. Other open-source experts see GPL version 3 as a way to battle software patents in general. "We need to find some way to monkey-wrench the awful, broken software-patent oligopoly before it does more serious damage," says Open Source Initiative President Eric Raymond. Open-source advocate Bruce Perens suggests a mutual-defense clause for different open-source licenses that would punish companies that sue over patent infringement by terminating their licenses to all products that fall under the clause. Another issue to be addressed is the use of GPL software with proprietary hardware, such as the TiVo digital video recorder, which only allows use of one version of Linux, or trusted computing schemes that would require all executable software to be cryptographically signed before being used on a PC. Web services present another challenge because they redefine the meaning of distribution, reports Hewlett-Packard's Martin Fink, who wonders whether GPL distribution extends to Web services, where portions of a program are spread over many systems.
    Click Here to View Full Article

  • "Dumber PCs That Use the Net for Processing, Storage Get Hot--Again"
    Wall Street Journal (12/27/04) P. B1; Delaney, Kevin J.

    The thin-client computing model, which was a major bust in the 1990s, may be enjoying a comeback with the increasing pervasiveness of broadband Internet connections, open-source software, and moves to build a Third-World computing infrastructure through the deployment of low-cost, rugged computing devices. Remote management of advanced applications and file storage via powerful servers and hard drives offered by Yahoo! and other ISPs is helping make the thin-client PC model more commercially viable, along with Internet companies' promotion of the "Web as platform" technology concept. In addition, thin-client computers offer fewer hackable vulnerabilities thanks to their simplicity. These developments push a computing model in which the user's PC terminal need only support a Web browser and a display. Certain executives in Silicon Valley contend that Google is advancing a strategy that employs PCs with open-source operating systems and browsers to access Web-based applications hosted on Google servers, while Ask Jeeves and other online search companies are pursuing similar tactics. There are drawbacks to the thin-client/Web-based computing model: Its support for video editing or graphics-heavy video games is poor, and Internet companies must overcome consumers' privacy concerns before being trusted with their personal information. However, the abundance of inexpensive PCs and the profusion of increasingly intelligent Web-connected handhelds have made dedicated thin-client devices less of a necessity in the eyes of tech thinkers.

  • "Rage Against the Machines"
    Technology Review (12/24/04); Delio, Michelle

    Kent Norman, cognitive psychologist and director of the University of Maryland's Laboratory for Automation Psychology and Decision Processes, studies how technology users vent their frustration at the systems they work with, often by hurling verbal abuse at the machinery, and sometimes by physically attacking it in creative ways. A three-year online survey conducted by Norman reveals that most respondents are fairly competent technically, leading the psychologist to conclude that "Geeks have real problems with technology designed by other geeks." Previous survey statistics have influenced Norman's assessment that approximately 10 percent of all new computers and tech equipment given as gifts over the winter holidays will suffer serious damage over the next few weeks as a result of user rage, with most of the damage inflicted by non-technical owners. Norman believes geeks will demonstrate even more venom in their technological torture. He observes that "Geeks have as many or more frustrations than the rest of the population because we attempt to push the technology harder and have higher expectations than most," and cites one report of an owner who poured two gallons of gasoline over his PC and set it ablaze. Other unique forms of tech torture that survey respondents have admitted to include crashing cars into equipment, tossing keyboards into swimming pools, and shooting computers. Norman sees psychological value in destroying machinery creatively, a hobby he practices in earnest at his lab, which is stocked with over 20 years' worth of surplus and obsolete gear. Norman documents the equipment's destruction on video and distributes the films online in the hopes that frustrated computer owners will get a vicarious thrill rather than actually damage their own equipment--or at least be inspired by the videos to practice safe and effective tech torture.
    Click Here to View Full Article

  • "Technologies for the Blind"
    Design Engineering (12/16/04)

    A collaborative venture between University of California, Santa Cruz, researchers and the nonprofit Smith-Kettlewell Eye Research Institute has yielded prototype assistive devices for the sight-impaired that use computer vision technologies stemming from robotics research. A team of students led by UCSC professor Roberto Manduchi has developed a flashlight-sized "virtual white cane" that takes range measurements via laser, in combination with a digital camera and a computer processor that scans and meshes spatial data as the user waves the device over a scene. The device emits audio signals to alert the user about distances as well as obstacles, and future prototypes will be equipped with a tactile interface. Manduchi and Smith-Kettlewell researcher James Coughlan are collaborating on a system that employs a camera-outfitted device to sense and collect data from small, cheap, maintenance-free colored tags or labels that can help blind users navigate and reach specific destinations. A third project, described by Manduchi as "MapQuest for the blind," aims to enable users to explore electronic maps using vibrations generated by a "force-feedback mouse." The project focuses on the software component of the force-feedback mouse, as the interface itself is readily available. Manduchi remarks that "The people at Smith-Kettlewell are helping us to understand the real needs of the blind, and they have blind engineers who test the systems we develop."
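
    The article does not give the cane's actual mapping from distance to sound, but a minimal sketch with invented thresholds and pitches conveys the idea of turning range readings into audio cues:

        # Hypothetical sketch of a range-to-audio mapping such as a
        # "virtual white cane" might use; distances and pitches invented.
        def range_to_tone(distance_m: float) -> float:
            """Map a laser range reading (meters) to a tone frequency
            (Hz): nearer obstacles yield higher, more urgent pitches."""
            NEAR, FAR = 0.3, 5.0  # assumed usable sensing window
            d = min(max(distance_m, NEAR), FAR)
            # Linear interpolation: 5 m -> 200 Hz, 0.3 m -> 2000 Hz.
            return 2000 - (d - NEAR) / (FAR - NEAR) * 1800

        for d in (0.3, 1.0, 2.5, 5.0):
            print(f"{d:>4} m -> {range_to_tone(d):6.0f} Hz")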
    Click Here to View Full Article

  • "Next-Generation Computer Chip to Hold 2 Engines"
    Associated Press (12/27/04); Fordahl, Matthew

    The world's top semiconductor companies are planning to roll out next-generation dual-core PC chips beginning next year. Both Intel and AMD plan to start shipping such microprocessors for desktops, laptops, and high-end servers in 2005, while IBM will collaborate with Sony and Toshiba on multicore Cell chips for video game consoles, high-definition TVs, and home servers, to debut commercially in 2006. Dual-core chips will run at slower clock speeds than single-engine chips, but they will consume less power and be able to multitask, giving the chip industry a breather as it attempts to keep pace with Moore's Law. As chip transistors become smaller and smaller, power consumption and heat output are of increasing concern. Intel executives claim that the company's decision to terminate a next-generation Pentium 4 project, as well as plans for a 4 GHz Pentium based on its current Prescott core, in favor of multicore chips was a matter of common sense rather than an issue of heat and power consumption. Taking full advantage of multicore chips involves redesigning programs so that they know the processor's capabilities, and Intel reports that it made progress in this area with its introduction of Hyper-Threading in 2002. Hyper-Threading, which is supported by Windows XP and most Linux distributions, fools the operating system into thinking that the computer has multiple processing engines, thus harnessing idle computing power to accommodate additional processing. Current software applications will run on a multicore chip under Windows, but those that have not already been optimized for Hyper-Threading will need to be rewritten to fully exploit dual-core processors.
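
    A generic sketch of the restructuring this implies (not tied to Hyper-Threading or either vendor's designs): work must be split into independently schedulable units before a second engine can help.

        # Generic illustration: dividing a computation between two OS
        # processes, which the scheduler can place on separate cores.
        from concurrent.futures import ProcessPoolExecutor

        def busy_sum(lo: int, hi: int) -> int:
            return sum(i * i for i in range(lo, hi))

        if __name__ == "__main__":
            n = 10_000_000
            with ProcessPoolExecutor(max_workers=2) as pool:
                halves = [pool.submit(busy_sum, 0, n // 2),
                          pool.submit(busy_sum, n // 2, n)]
                print(sum(f.result() for f in halves))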
    Click Here to View Full Article

  • "Who Really Wielded the Paintbrush?"
    New York Times (12/23/04) P. B1; Eisenberg, Anne

    New image-processing software from Dartmouth College may help art historians gauge the authenticity of artwork as well as ascertain how many artists contributed to a piece. The software uses wavelet-based statistical analysis to determine the unique characteristics of lines, curves, and brush-stroke textures taken from scanned images of the artwork. The technique is detailed in the December edition of Proceedings of the National Academy of Sciences by Dartmouth professors Daniel Rockmore and Hany Farid, the latter drawing upon his work with computational algorithms designed to analyze digital images for signs of tampering. The method was used to distinguish drawings attributed to the Flemish master Pieter Bruegel from those judged to be imitations, and the results were in keeping with art experts' opinions. Another test case examined the brush strokes of the Italian Renaissance painting "Madonna With Child" by Perugino, and the results suggested that more than one set of hands worked on the piece. Farid says the painting was converted to gray scale so that simple color differences between faces would not register, but Metropolitan Museum of Art curator Laurence Kanter says the technique should account for such elements in order to attain more credibility among art historians. Another Met curator, Nadine Orenstein, notes that the computer would need to tap a massive database for every artist, leading her to conclude that "It's going to be decades before they have a connoisseur meter and go around to auction houses." Still, conservators such as Ellen Handy of the City College of New York's art department think the software could complement expert evaluations.
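
    A toy version of the wavelet step, using the PyWavelets library as a stand-in (the statistics the Dartmouth team computes are considerably richer than this), might look like:

        # Toy sketch of wavelet-based stroke statistics in the spirit
        # of the Dartmouth method; the real analysis is far richer.
        import numpy as np
        import pywt  # PyWavelets

        def stroke_signature(gray_image: np.ndarray, levels: int = 3):
            """Summarize a scanned artwork by the spread of its wavelet
            detail coefficients at each scale and orientation."""
            coeffs = pywt.wavedec2(gray_image, "db4", level=levels)
            # coeffs[0] is the coarse approximation; each later entry
            # is a (horizontal, vertical, diagonal) detail triple.
            return [tuple(float(np.std(band)) for band in detail)
                    for detail in coeffs[1:]]

        # Random stand-in for a scanned drawing; real signatures would
        # be compared across works attributed to the same hand.
        print(stroke_signature(np.random.rand(256, 256)))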
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Flexible Scanner Works on Curved Surfaces"
    New Scientist (12/23/04); Biever, Celeste

    University of Tokyo electrical engineer Takao Someya and colleagues have developed a flexible, credit card-sized scanner that uses a polymer matrix composed of thousands of light-sensitive plastic photodiodes placed underneath a grid of plastic transistors. The device can be plugged into a mobile phone, which will feed the scanner power as well as store and display the images it captures. Because the scanner is flexible, it can copy material off of curved surfaces. Each transistor stores the electrical charge generated by its individual photodiode in response to light; the plastic's transparency allows ambient light to pass through the membrane to reach the scanned object, and the diodes are shielded from the ambient light by the transistors, so that the diodes only receive the light reflecting off bright areas. The developers claim the scanner can scan text with adequate resolution "to image all the letters on a wine bottle." The prototype boasts 36 dots per inch (dpi) resolution, but the scalability of the polymer electronics reportedly makes 250 dpi resolution possible. Someya has had to use complex bench-top electronics to scan images, but the engineer plans to build a custom chip that will read out the data and pipe it to a mobile phone or a USB storage device for later viewing; a color version of the scanner that employs red, green, and blue-sensitive cells is also on the horizon. Someya believes the scanner will be able to read images better than cell phone cameras because the device can obtain a higher resolution image from unusually shaped surfaces.
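
    The row-by-row readout such an active-matrix sensor implies can be sketched as follows (a hypothetical illustration: the grid size is only a rough estimate of 36 dpi over a credit-card area, and Someya's custom readout chip is yet to be built):

        # Hypothetical active-matrix readout: select one row of
        # transistors at a time and sample every column's stored charge.
        import numpy as np

        def read_matrix(stored_charge: np.ndarray) -> np.ndarray:
            rows, cols = stored_charge.shape
            image = np.zeros((rows, cols))
            for r in range(rows):                  # select row r
                image[r, :] = stored_charge[r, :]  # sample column lines
            return image

        # 36 dpi over a credit-card-sized area is roughly a 120 x 76 grid.
        charge = np.random.rand(120, 76)
        print(read_matrix(charge).shape)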
    Click Here to View Full Article

  • "Poland Pushes EU Software Patents Off the Agenda"
    Computer Business Review (12/22/04)

    The formal adoption of the European Union's software patents directive has been impeded again by a last-minute request from Wlodzimierz Marcinski, undersecretary of state in Poland's Ministry of Science and Information Technology, that the directive be removed from the agenda of an Agriculture and Fisheries Council meeting. The Foundation for a Free Information Infrastructure says Marcinski made this request on behalf of his country, which believes the issue merits further discussion. Last month, the Polish government rejected the proposed directive after meeting with Microsoft, Sun Microsystems, and Novell representatives, who corroborated that the measure makes all software potentially patentable; the government asserted in its rejection that the directive's wording is inconsistent and unclear. Belgium's minister for economic affairs Marc Verwilghen told the Belgian parliament earlier this month that the directive has lost its supporting majority due to changes on Nov. 1 to the weighting accorded to member countries' votes. The European Parliament proposed amendments to the directive that would supposedly patch a loophole critics claimed permitted widespread software patents, but the revisions were stripped from the directive before a political consensus was reached in May. Since the Agriculture and Fisheries Council meeting was the last of the year, the directive cannot now be ratified before the end of 2004. Should the directive become a common position of the Council of Ministers early next year, there will be a three- to four-month window for the European Parliament to approve, reject, or amend the directive before returning it to the Council for another reading. If amendments are made, the directive would be sent back to the Council, which would have three to four months to accept or refuse them.
    Click Here to View Full Article

  • "A Look Ahead to Grid in 2005"
    Computerworld (12/23/04); Foster, Ian

    Grid computing will continue to mature in 2005 with the acceptance of open standards, increases in complementary Web services and service-oriented architectures (SOAs), implementations at Fortune 1000 companies, and the continuation of "big science" experiments, writes Globus Alliance co-founder Ian Foster. Standards are crucial to the further adoption of grid computing because they reduce complexity and future risk, and enable organizations to achieve return on investment faster. Grid computing implementations are still struggling to meet the real needs of applications, but large vendors are working toward that goal from the top down with their broad strategies, while smaller vendors are developing innovative solutions for application subsets, says IDC analyst Dan Kusnetzky. SOA and grid computing feed each other's growth because SOA makes software applications modular and therefore easier to deploy on the grid, while grid computing provides optimized infrastructure resources. The combination allows companies to increase business responsiveness and reduce costs, but recoding applications to run on this new architecture remains a significant barrier, says Nemertes Research analyst Andreas Antonopoulos. Grid computing has plenty of resources to draw upon today; the problem is one of controlling the available resources. Virtualization and orchestration become more difficult as new types of resources come online, including sensors and networks, says Nortel Labs director Franco Travostino. The key to grid computing success in 2005 is a focus on value, meaning limited deployments of high quality and fewer, more thoroughly tested tools, according to SAP enterprise grid computing development manager Alexander Gebhart.
    Click Here to View Full Article

  • "Maintaining Cryptographic Security in the Information Age"
    IST Results (12/23/04)

    Traditional approaches to data encryption and decryption are threatened by the emergence of more powerful computer systems, so researchers are looking to quantum computing in the hope of maintaining the security of information. IST has focused on this area, first through the STORK project, which built a cryptography brainstorming agenda, and then through the ECRYPT Network of Excellence project in cryptology; the next step is to make quantum cryptographic methods practical, available, and affordable for industry and commerce in general, which is the mission of the IST-funded PROSECCO project. "We are trying to find new quantum cryptographic applications that offer higher levels of security than classical cryptography, and also if classical cryptography can be made more secure against quantum attacks," explains PROSECCO project coordinator Joern Mueller-Quade. Quantum information exchange uses photon transmission, and such a transmission cannot be intercepted without disrupting the photon sequence, which signals the attempt to eavesdrop. The most obvious quantum cryptographic application lies in the distribution of secret encryption/decryption keys, which is significantly less vulnerable than classical key-distribution techniques. Mueller-Quade reports that PROSECCO is more interested in the development of more secure digital signatures than in key exchange methods. Meanwhile, IST's SECOQC initiative is developing a quantum technology-based tool that will let organizations securely share critical data by enabling cryptographic key generation and distribution management. Project coordinator Christian Monyk notes that unlike existing high-security cryptography, quantum cryptographic keys do not have to be distributed beforehand; developing economical and commercially practical optical devices for generating, sensing, and guiding single photons is a key challenge for SECOQC.
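
    The article names no specific protocol, but the best-known quantum key-distribution scheme, BB84, can be simulated classically in a few lines: bits sent in mismatched measurement bases are discarded, and an eavesdropper's measurements would disturb the bits that remain.

        # Classical simulation of BB84-style quantum key distribution;
        # BB84 is the standard example, not named in the article.
        import secrets

        n = 32
        alice_bits  = [secrets.randbelow(2) for _ in range(n)]
        alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0/1 basis
        bob_bases   = [secrets.randbelow(2) for _ in range(n)]

        # Bob recovers Alice's bit only when their bases agree; on a
        # mismatch his outcome is random, so those positions are dropped.
        bob_bits = [a if ab == bb else secrets.randbelow(2)
                    for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

        key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
               if ab == bb]
        print("sifted key:", key)
        # An eavesdropper measuring in random bases would corrupt about
        # 25% of the sifted bits, which Alice and Bob can detect by
        # publicly comparing a sample before using the key.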
    Click Here to View Full Article

  • "Go Ahead, Just Try to Disappear"
    Los Angeles Times (12/27/04) P. A1; Colker, David

    Global Positioning System (GPS)-equipped mobile phones and other devices are rapidly emerging as tools allowing people to track the whereabouts of truant children, spouses, workers, pets, and others, but some perceive such technologies as a growing threat to personal privacy. "When a worker far away knows that every move they make is monitored by someone--without information about just what they are doing--it takes on a punitive sense," notes Lancaster University management professor Lucas Introna. SpyGear Store operator Greg Shields reports that women who suspect their husbands of philandering account for 60 percent of his business's geolocation gear sales. Meanwhile, CMS Worldwide expects the number of new cars equipped with GPS navigation systems to increase from 3.9 million now to 6.5 million in 2008. Cell phones with GPS debuted in 2001, when the FCC mandated that mobile phone carriers equip their handsets with geolocation technology in order to make 911 emergency calls easier to track; companies that opted for GPS are expected to have at least 95 percent of their subscribers converted to GPS phones by the end of next year. James Dempsey of the Center for Democracy and Technology believes that the commercial value of location services is so great that such services would spread even without a federal mandate. Mark Frankel of the American Association for the Advancement of Science is troubled about the practice of tracking people without their awareness, even if it is for their own good; adolescents, for instance, could regard such a measure as a betrayal. Frankel also argues that privacy plays an important role in the development of people's personality in their teens.
    Click Here to View Full Article

  • "Researchers Seek Truth in the Secret Life of the Battery"
    New York Times (12/23/04) P. E8; Eisenberg, Anne

    Texas Instruments researchers have developed a monitor that director of battery management Scott Eisenhart claims can calculate the remaining power in a laptop battery to within 1 percent accuracy. Each battery pack in the system features two chips that continuously calculate the remaining energy; algorithms in the chipsets constantly take impedance measurements and adjust their predictions of the battery's remaining charge accordingly. Impedance--opposition to the flow of electrical current--changes in response to numerous factors (temperature, age, use, etc.), and Gold Peak Industries' Jiang Fan says a battery's rate of chemical reaction is affected by the buildup in impedance as electrolytes dry up. Battery Design President Robert Spotnitz says the gas or fuel gauges currently used to measure remaining run time on laptops are essentially no good, because the estimates they provide are often way off the mark. TI's Dave Heacock notes that newer laptops may boast gauges that monitor charge in and out of the battery, along with microprocessors that measure remaining battery energy, but large errors can still crop up because they are not programmed for constant, real-time monitoring. Fan says TI may have circumvented a major problem with its impedance-monitoring gauge: "In older systems, all the modeling factors for prediction have been determined, and they can't be changed once the battery is installed," he explains. TI does not expect to roll out the new gauges in laptops until the middle of 2005 or later, so that customers will have time to engineer the device into their products. The technology's success could lead to its inclusion in other devices such as digital cameras, where the gauges could measure the number of pictures that can be taken before the battery's charge is depleted.
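
    A stripped-down illustration of the idea (the model and constants below are invented; TI's actual algorithms are proprietary): the gauge derates its run-time estimate as measured impedance climbs above that of a fresh cell.

        # Invented toy model of an impedance-tracking fuel gauge.
        # Remaining run time shrinks as measured impedance rises,
        # because more of the cell's energy is lost internally.
        def remaining_minutes(charge_mah: float, load_ma: float,
                              impedance_ohm: float,
                              nominal_ohm: float = 0.10) -> float:
            # Derate capacity by the rise in internal impedance
            # relative to a fresh cell (assumed nominal value).
            derate = nominal_ohm / max(impedance_ohm, nominal_ohm)
            return 60.0 * (charge_mah * derate) / load_ma

        for z in (0.10, 0.15, 0.25):  # impedance rises with age, temperature
            print(f"{z:.2f} ohm -> {remaining_minutes(2200, 1500, z):5.1f} min")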
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Eyes, Wallets Wide Open to XML Traffic Woes in 2005"
    SearchWebServices.com (12/20/04); Mimoso, Michael S.

    Even as enterprises quickly adopt XML Web services, they are beginning to feel the consequences of increased XML traffic on server processing and network bandwidth. Meanwhile, the World Wide Web Consortium (W3C) is working on a binary XML standard that some vendors fear poses a threat to ASCII text-based XML implementations and products. Those fears are unfounded, since current text-based applications can continue to operate alongside binary XML, says Elite IT Services senior consultant Rick Hogenmiller. Text application messages would need an encoder and decoder, but the added processing requirements would be insignificant compared to the overall efficiencies gained from binary XML. The development of a single binary XML standard is going well, and the standard should be accepted by vendors once ratified by the W3C in about two years, predicts Burton Group senior analyst James Kobielus. Meanwhile, a market for XML acceleration products has cropped up to help companies deal with increased XML processing and bandwidth demands. Other experts, however, say XML is already overused. U.S. government technologist Sam Chance says binary XML would be no better than native binary protocols such as Remote Method Invocation. In addition, XML's use of port 80 on servers creates security problems, he says.
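
    The efficiency argument is easy to see with generic compression standing in for a real binary-XML encoding (the W3C format was not yet finalized): verbose tag names make text XML highly compressible, and the receiver can restore ordinary text for existing tools.

        # Stand-in illustration: zlib compression in place of a real
        # binary-XML encoding. The sample message is invented.
        import zlib

        message = ("<order><customer>ACME Corp</customer>"
                   "<item sku='1234' qty='10'>widget</item></order>") * 50

        text = message.encode("ascii")
        binary = zlib.compress(text)
        print(f"text: {len(text)} bytes, encoded: {len(binary)} bytes")

        # Decoding restores ordinary text XML, so existing text-based
        # tools keep working, as Hogenmiller argues.
        assert zlib.decompress(binary) == text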
    Click Here to View Full Article

  • "Tech Trends That Should End"
    eWeek (12/27/04) Vol. 21, No. 51, P. 54; Chen, Anne; Sturdevant, Cameron; Rapoza, Jim

    EWeek editors reflect on the past year by identifying bad trends and poor decisions that should be corrected or avoided in 2005. Software patching is a chaotic, expensive process made all the more so by the accelerating frequency and severity of exploits, and the solution is to significantly lower the need for patches, or eliminate it altogether. Email auto-response features in anti-virus gateways only add to the spam problem, and shutting them down will offer considerable relief; more adaptive anti-spam, anti-virus, and anti-spyware measures and strategies are also needed. Technology legislation such as the CAN-SPAM Act and the Induce Act has hindered rather than promoted tech innovation, while the world's major content holders are refusing to distribute content online out of dissatisfaction with current digital rights management, even though the absolutely piracy-proof DRM scheme they want is a pipe dream. The editors also recommend rapid standardization of information lifecycle management so that vendors can develop compatible systems. Criticism is also leveled against "self-indulgent multimedia content" that makes delivering content to alternative devices or sight-impaired users more difficult and complicates Web site malware screening; such splashy Web pages are of no use to people seeking decision-support data or online transactions. The Sarbanes-Oxley Act is not a tech regulation measure, yet tech vendors have aggressively advertised compliant products as if it were, exploiting the fears of client companies in a way that could be construed as improper.
    Click Here to View Full Article

  • "The Death of Domestic Programming"
    DM Review (12/04); Ruggiero, Russell; Brooks, Rex

    The economic recession cannot be blamed on American business alone, but myopia among American business and government is eroding a major asset in the form of domestic programmers and the future IT talent pool, write IT analyst Russell Ruggiero and Starbourne Communications Design President Rex Brooks. Depressed levels of IT spending following the Y2K flameout have spurred companies to offshore domestic programming work to lower-wage countries in order to maximize their return on investment. A recent report from the Bureau of Economic Analysis indicates that overseas professionals employed by American companies outnumber American workers employed by foreign companies, while U.S. businesses and organizations are expected to have offshored between 30 percent and 40 percent of their programming work by 2009. Offshore outsourcing and the bleak domestic economic outlook have created a dispiriting job market for this year's crop of U.S. computer science graduates, and these factors will probably discourage many high-school students from pursuing degrees in computer science. Domestic IT job layoffs and increased offshore outsourcing will reduce America's capacity for innovation and economic competition. Such trends inhibit many U.S. companies' ability to move enterprise-level applications into production and cultivate in-house IT expertise, while giving foreign nations an opportunity to nurture domestic talent and become economically predominant. The authors recommend two sweeping philosophical changes to reverse the shrinkage of America's IT workforce: permitting businesses to lower their tax liability for employee development training, in order to better promote such training; and giving employees and their dependents equal medical insurance coverage, a strategy the authors consider critical to solidifying worker loyalty.
    Click Here to View Full Article

  • "Is Cyberterrorism Being Thwarted?"
    Optimize (12/04) No. 38, P. 19; Sachs, Marcus; Crews, C. Wayne

    There is a difference of opinion between Competitive Enterprise Institute VP C. Wayne Crews and SANS Institute Internet Storm Center director Marcus Sachs over whether the federal government's attempts to protect the Internet from cyberterrorists are effective. Crews maintains that the government's solution to cyberterrorism--more cybersecurity legislation--is ineffective, and should be put aside in favor of greater federal-industry collaboration. He argues that the government should make a better effort at catching computer criminals rather than imposing new regulations, while the White House should avoid proposed cybersecurity measures that could whittle away individual privacy. Besides collaboration with the private sector, Crews recommends that the government "get its own house in order" by deploying network safeguards and establishing internal government-security product standards. Sachs, former director of the Department of Homeland Security's National Cybersecurity Division, says the government was focusing on cybersecurity even before 9/11 with the creation of organizations that include the FBI's National Infrastructure Protection Center and the Defense Department's Joint Task Force for Computer Network Defense; meanwhile, the private sector was similarly focused on cybersecurity with the formation of sector-specific facilities for information-sharing. Sachs says the al Qaeda attack forced industry leaders to rethink their business-continuity tactics and spurred governments to incorporate safeguards into both physical and Web infrastructures. He also extols the DHS Cybersecurity Division's cybersecurity-enhancing collaborations with academia, industry, and the international community, and notes that it has become standard practice among individuals to use antivirus software and approach unfamiliar Web sites with caution, while businesses are embedding security in all information systems, following best practices, and organizing internal cyberpolicies.
    Click Here to View Full Article


 