ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 565: Friday, October 31, 2003

  • "E-Mail Providers Devising Ways to Stop Spam"
    Washington Post (10/30/03) P. E1; Krim, Jonathan

    Instead of trying to detect and block unwanted email, technologists at ISPs and email providers have spent the last nine months developing ways to modify the Internet architecture to recognize good email. Several projects focused on combating the practice of identity "spoofing"--the counterfeiting of legitimate email addresses by spammers to throw trackers off--have started to bear fruit. "We have to allow legitimate senders of emails to distinguish themselves from spammers," notes Harry Katz of Microsoft, which teamed up with Yahoo!, America Online, and EarthLink to devise a "trusted sender" system that reportedly is close to being announced. Project Lumos from the Network Advertising Initiative aims to certify email as well as rank the reputations of bulk emailers; for Project Lumos to work effectively, bulk emailers must willingly comply with technical standards for adding data to a message's header, while ISPs would adjust their incoming mail servers to recognize the new data and impede bulk email that lacks such information. Certification would require mailers to follow certain consumer-friendly protocols; repeated failures to comply would result in a low rating by the system, and mail from low-scoring mailers would be automatically blocked. The Sender Permitted From (SPF) initiative conceived by Pobox.com CTO Meng Wong is an anti-spoofing project in which companies that run outgoing mail servers must electronically "publish" the numeric Internet addresses of all verified machines that transmit messages from their domains; the incoming mail server would then check to confirm that messages claiming to be from a certain numeric address indeed originate from that address. Hans Peter Brondmo of bulk mailer Digital Impact believes an Internet address check scheme such as Wong's will be instituted by year's end. Katz points out that these various antispam initiatives will not work unless enough participants join in to threaten business for non-participating firms.
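The SPF-style check described above can be sketched in a few lines. This is an illustrative model only, not the real SPF protocol or its DNS record syntax; the domains, addresses, and function names are invented for the example:

```python
# Sketch of an SPF-style check (hypothetical data, not the real SPF
# protocol): the receiving mail server verifies that the connecting IP
# address is one the sender's domain has published as authorized.

# Hypothetical "published" records a domain's administrators might advertise.
PUBLISHED_SENDERS = {
    "example.com": {"192.0.2.10", "192.0.2.11"},
    "example.org": {"198.51.100.7"},
}

def spf_style_check(mail_from_domain: str, connecting_ip: str) -> str:
    """Return 'pass' if the IP is published for the domain, 'fail' if it
    is not, and 'none' if the domain publishes no sender list at all."""
    published = PUBLISHED_SENDERS.get(mail_from_domain)
    if published is None:
        return "none"          # domain has not opted in; cannot verify
    return "pass" if connecting_ip in published else "fail"
```

The "none" case matters: as Katz notes, the scheme only bites once enough domains publish records that unverifiable mail becomes suspect.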
    Click Here to View Full Article

  • "Boucher Calls Copyright Office 'Misguided'"
    InternetNews.com (10/30/03); Mark, Roy

    The U.S. Copyright Office's recent decision to deny consumers a fair-use exemption to the Digital Millennium Copyright Act (DMCA) that would allow them to make back-up copies of legally purchased, digitally recorded material was decried by Rep. Rick Boucher (D-Va.) on Oct. 30 as "misguided." He explained that consumers will be forbidden from backing up their DVDs, and also banned from engaging in specific kinds of encryption research and security testing, and from bypassing access control measures even when they are outdated, damaged, or not working properly. The DMCA exemptions the Copyright Office did grant on Oct. 28 covered decrypting lists of Web pages or directories impeded by Internet filtering software, bypassing outmoded digital rights management devices that block access because of damage or malfunction, accessing video games and programs in outdated formats, and accessing electronic books for which the publisher has deactivated the read-aloud function or the use of screen readers to convert the text into a specialized format. In January, Boucher introduced a bill, H.R. 107, that would permit the fair-use circumvention of digital copy controls as well as allow the production and distribution of circumvention hardware and software as long as its uses, for the most part, do not constitute copyright infringement. "Now that it is clear that the Copyright Office is not going to interpret the DMCA in ways that will permit ordinary fair use activities, the need for the enactment of H.R. 107 is more apparent than ever," Boucher declared. His legislation is supported by the ACM, Sun Microsystems, Gateway, the Consumer Electronics Association, and Verizon, among others.
    Click Here to View Full Article

    For information on ACM's activities involving copyright issues, visit http://www.acm.org/usacm.

  • "E-Vote Software Leaked Online"
    Wired News (10/29/03); Zetter, Kim

    The software code used in Sequoia Voting Systems' AVC Edge touch-screen kiosk has been made available on the Internet, according to an anonymous tipster who found the program on an IT contractor's public FTP server. Jaguar Computer Systems' own Web site pointed users to the public portion of its FTP server, saying many commonly used files were there; Jaguar provides election support services to a California county where Sequoia AVC Edge machines were used in 2000 and last month's recall vote. The Sequoia code was already in binary form and is not as easy to decipher as the Diebold source code discovered earlier this year on an unprotected FTP server. Johns Hopkins researcher Avi Rubin, who wrote a controversial report on the Diebold system's insecurity, says the Sequoia code would reveal perhaps 60 percent of what was in the actual source code. Sequoia touts its proprietary operating system, which it says lends more security than Diebold's Windows-based system; however, inspection of the software shows Sequoia uses WinEDS (Election Database System for Windows) running on top of Windows systems, and that it also uses an unpatched version of MDAC 2.1 (Microsoft Data Access Components), which has been subsequently updated to version 2.8 because of vulnerabilities with the previous iterations. Stanford Research Institute lead computer scientist Peter Neumann says MDAC is off-the-shelf software and not included in certification and testing, and could provide sanctuary for a Trojan horse planted by an election official or company employee. Electronic Frontier Foundation attorney Cindy Cohn blasts both Diebold and Sequoia for their insistence on keeping source code secret. Instead, she says the software should be made open so that the outside community can point out vulnerabilities and help make it more secure. Neumann says, "You need to use a machine with accountability and an audit trail."
    Click Here to View Full Article

    For information on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "Torvalds and Morton Release Linux Kernel 2.6"
    Linux Insider (10/28/03); Kroeker, Kirk L.

    A new Linux 2.6 test kernel from Linux inventor Linus Torvalds and kernel manager Andrew Morton is available and ready for testing in the enterprise space, according to an announcement from the Open Source Development Labs (OSDL). "Now is when we want big companies and software vendors to step in and hammer on the kernel so we can get their ideas into the final production release of 2.6 Linux," OSDL Fellow Torvalds explains. The 2.6 kernel boasts notable improvements over the three-year-old 2.4 kernel, including increased scalability, an augmented driver layer, and expanded support for embedded device applications such as cheap, low-power CPUs with no memory-management units. Torvalds says the kernel is in position to be employed on 32-way systems; the kernel development team has inserted a new CPU scheduler, memory management, and file system code, and the kernel can support as much as 8 GB of memory on IA-32 systems. The enhanced driver layer was developed to boost the performance and manageability of disks and other I/O devices, and includes a new device mapper and logical volume management. The Linux 2.6 kernel also allows users to hot-plug Firewire, USB, and other devices, as well as access smoother and better-performing mouse, video, and sound functions. The platform is able to support professional music-studio quality sound through the Advanced Linux Sound Architecture. OSDL encourages Linux 2.6 kernel users to use OSDL test workloads, which can be accessed at the OSDL and Sourceforge Web sites, and discuss the results with the lab. OSDL's Scalable Test Platform and Patch Lifecycle systems, as well as a compile regression test platform, comprise the lab's Linux test infrastructure.
    Click Here to View Full Article

  • "Digital Copyright Challenges Get Chilly Response"
    NewsFactor Network (10/29/03); Morphy, Erika

    The Library of Congress granted only a handful of exemptions to the Digital Millennium Copyright Act (DMCA) out of many proposals submitted during the first review period, and critics such as Electronic Frontier Foundation (EFF) legal director Cindy Cohn lament that the refused proposals could have helped strike a much-needed balance between the rights of consumers and copyright owners. "We are obviously disappointed that the Librarian chose to ignore the views submitted by consumers," declared EFF staff attorney Gwen Hinze. Approved exemptions include an allowance of access to electronic books that have had their read-aloud function disabled by publishers, a provision for the visually handicapped; three exceptions related to the bypassing of outdated digital-rights management devices and computer programs and video games in antiquated formats; and permission to decode Web pages or directories blocked by software filters. Exemptions that were refused included one from 321 Studios calling for the copying or backing up of purchased DVDs; the company announced plans to appeal the decision. The Librarian's argument in refusing the exemption was that there are plenty of reasonably priced, commercially available DVDs for consumers to buy as replacements for broken or defective products--a move that 321 Studios says mainly serves to put more consumer dollars into the entertainment industry's coffers. "The intent of the [DMCA] was to stamp out piracy--it was never intended to stamp out fair use rights of average Americans," asserts 321 Studios' Julia Bishop-Cross. Other denied exemptions include allowances for software enabling consumers to play DVDs on alternative systems and foreign DVDs on American players, as well as file-sharing software that lets consumers exchange CDs and DVDs with friends and family.
    Click Here to View Full Article

  • "Are We Ready for E-Voting?"
    E-Commerce Times (10/30/03); Millard, Elizabeth

    Developments are underway in the United States and Europe to make electronic voting a reality, but e-voting seems unlikely to fully emerge for a number of years. "I think the big problem is that everyone's rushing to make use of electronic voting when there aren't good standards, particularly security," notes the Stevens Institute of Technology's Arnold Urken, who established one of the first election testing labs for e-voting, but had to shut it down because election officials were unwilling to do thorough testing. He also points out that security standards must be complemented with improved software tools and transaction methods that instill reliability. Urken further observes that the U.S. government is in the habit of purchasing second-rate e-voting technology, a practice that discourages the country from developing sturdy verification protocols. Gartner VP Christopher Baum explains that e-voting faces many political hurdles, such as a lack of consensus on how ballots should be handled. "That doesn't mean that everyone has to use the same technology, but the ballots will have to be treated in the same way by everyone," he says. Another factor that could stall e-voting is a lack of federal investment in the technology, despite promises of funding from the government. Despite these challenges, e-voting proponents are optimistic that an e-voting system can be established by the time the 2012 presidential election rolls around: Baum predicts that as much as one-quarter of all votes in the 2012 election could be cast electronically, and a major leap forward will be the setup of e-voting centers that spare voters the inconvenience of traveling, which could foster a dramatic increase in voter turnout. Urken says e-voting could facilitate a shift in the way the government operates by solidifying its role as a vendor of information.
    Click Here to View Full Article

  • "Ideas Unlimited, Built to Order"
    New York Times (10/30/03) P. E1; Schiesel, Seth

    Technological inventions do not always meet the needs of users; more often they cater to their wants. In a survey of prominent people and technologists, the New York Times found that many would-be technological inventions are not about technology itself, but about meeting the sometimes extravagant wants of users. Cartoonist Scott Adams would like to have a device that could locate his house cats, while Electronic Frontier Foundation co-founder John Perry Barlow wishes for an instantly updateable brain implant that could act as a universal remote. Conversely, real estate tycoon Donald J. Trump would like brain implants for his contractors so that they could understand what he wanted without him having to shout. FCC Chairman Michael K. Powell says he would like a device that delivers up-to-date personal information--the type needed to fill out forms at the doctor's office, the Department of Motor Vehicles, and for mortgage refinancing. Novelist William Gibson suggests a hosted Web service that filters information read online, and highlights it in different colors depending on whether it is misperceptions, political spin, or outright lies. Sun Microsystems co-founder Bill Joy asks for a hat that could filter out unwanted sounds, such as a nearby obnoxious conversationalist at a restaurant, glasses that could block unwanted sights, and another device that could mute rude or angry speech. Such inventions would give the users the character of the proverbial three monkeys that neither hear, see, nor speak evil, he says. Tennis professional Martina Navratilova wants a technological tennis linesperson that's always right, while comedian Margaret Cho would like a laptop computer that doesn't get hotter than your body temperature, yet is light enough to hold in one hand, is powered by solar energy and other light sources, and can read your mood, changing colors and playing a song that snaps you out of a bad mood, for example.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "On the Road to a Great I.T. Career"
    NewsFactor Network (10/30/03); Park, Michael Y.

    The IT industry promises good career opportunities in certain areas, such as defense, research and development, utility computing, and wireless technology. New entrants to the job market may also want to consider getting more staid jobs outside of the big city, since companies across the board are ramping up their IT operations, even small firms in outlying areas. Traditional positions such as application development, server management, and network and server operations provide a good technology base for future CEOs, according to Challenger, Gray & Christmas CEO John Challenger; he also advises U.S. workers to stay ahead of the globalization trend by getting into defense and security fields, since those jobs are unlikely to be outsourced for security reasons. Morningstar analyst Nicolas Owens also says the military and other government agencies requiring security clearances are good places for new IT workers to go for job security, since they are markets protected from offshore outsourcing. He notes that the consolidation trend among defense contractors means faster growth of standards and more rapid integration efforts at agencies such as Homeland Security. Boston Consulting Group partner Mike Busch says companies right now want technology-savvy business people, and adds that the combination of an MBA and technology degree is very powerful. More technically oriented workers, however, will want to focus on emerging areas such as utility computing and research, which Busch estimates will grow quickly in the next five years. Challenger explains that in the end, predicting the future in the mercurial IT world may not be the best option for college students and other new workers, since the most important factor should be a passion for whatever job a person is doing.
    Click Here to View Full Article

  • "UA Researchers May Lead Way to Faster Computers"
    Arizona Daily Wildcat (10/31/03); Nowe, Ashley

    The National Science Foundation awarded a $1.23 million research grant to eight University of Arizona scientists to develop a protein-based wire technology that could dramatically reduce the size of computer chip components and boost computer speeds. Associate professor of bioengineering James B. Hoying explains that these ultra-thin protein strands or microtubules are so small that 25,000 could be lined up shoulder to shoulder within a single millimeter. The protein replicates in response to intestinal E. coli bacteria, forming a straw-like microtubule that the research team turns into a wire by sheathing it in a metallic coat. In order to build a working electronic circuit, the protein wires must be coaxed into connecting to the proper points, an ongoing challenge the UA research team faces. "Among all the hundreds of strands that grow from the starting point, only one or two will connect where we want it to," notes materials science and engineering professor Pierre Deymier. The failure of even one of the wires to connect will render the chip inoperative. In an ideal scheme, the researchers would position the protein at a starting point and add the E. coli bacteria to trigger replication, and then insert another short protein that links to a properly formed wire at the end-point. The protein wire technology could be applied to sensors and miniature electronic circuit structures as well as computers, while Hoying envisions water-based computers as a possible future development. "This will be a whole new paradigm of technology, leading to the hybrid between biology and technology," he boasts.
    Click Here to View Full Article

  • "Israeli Processor Computes at Speed of Light"
    Reuters (10/29/03); Cohen, Tova

    Israeli firm Lenslet claims to have created the first commercially available optical DSP, capable of performing 8 trillion operations per second. The DSP with attached optical accelerator advances digital technology 20 years, boasts Lenslet founder and CEO Aviram Sariel, who says licensing the technology is not out of the picture. The rather large Enlight processor uses 256 lasers to perform computations, and is targeted at high-end applications such as high-resolution radar, electronic warfare, airport luggage screening, weather forecasting, and cellular transmissions. Gartner analyst Jim Tully says he is unfamiliar with the Enlight processor, but notes that other chip firms are already researching on-chip optical connections, and that traditional semiconductor chips benefit from existing manufacturing investment. The final Enlight product will measure 15 cm square and cost about the same as a multi-DSP board, according to the company. Sariel says, "Optics is the future of every information device," while Tully says "it's conceivable this technology could become mainstream inside chips in 10 years time."
    Click Here to View Full Article

  • "Grateful for Robot Contest, Entrants Give Tanks"
    CNet (10/30/03); Kanellos, Michael

    Developing unmanned vehicles that the U.S. military can use is the goal of the Defense Advanced Research Projects Agency's (DARPA) Grand Challenge, a 250-mile off-road race from Los Angeles to Las Vegas scheduled to take place in March 2004. DARPA announced Oct. 30 that 106 teams--far more than were originally anticipated--have applied as Grand Challenge participants, while 86 teams have sent in technical papers describing their proposed robots. It was hoped that the final roster of contestants would be complete near the end of October, but DARPA has extended certain deadlines because of the huge number of applicants. The development team whose autonomous vehicle successfully completes the race within the allotted 10-hour timeframe will be awarded $1 million; the course, which will include rough terrain and at least one overpass that thwarts Global Positioning System (GPS) navigation schemes, will not be disclosed until two hours before the race, and the vehicles must guide themselves without human assistance, except for emergency stops and restarts. Grand Challenge representative Don Shipley announced earlier in the year that the Defense Department wants one third of all combat vehicles to be autonomous by 2015, and this mandate prompted the development of the competition. Notable Grand Challenge participants include Carnegie Mellon University's Red Team, which is reportedly converting a Humvee into a robot vehicle at a cost of over $1 million. A California Institute of Technology group is refining a Chevy Tahoe, with corporate sponsorship from Northrop Grumman and IBM, among others. A much smaller-scale effort is Team Loghiq, which primarily consists of two brothers sponsored by Via Technologies of Taiwan.
    Click Here to View Full Article

  • "Spotlight: People Are Robots, Too. Almost"
    Jet Propulsion Laboratory (10/28/03)

    Robots that can imitate human mental processes are under development in the Telerobotics Research and Applications Group at NASA's Jet Propulsion Laboratory (JPL). JPL robotics engineer Barry Werger reports that "by mimicking human techniques, [robots] could become easier to communicate with, more independent, and ultimately more efficient." Robot control methodologies fall into two general categories: Deliberative control, in which robots plan sequences of action using painstakingly detailed maps and models; and reactive control, whereby the machines base their actions on environmental observation. JPL's Telerobotics Research and Applications Group is developing machines with behavior-based control, which involves robots following a plan that can be adjusted in response to unexpected changes or impediments. Researchers at JPL follow either a fuzzy logic or neural network approach to implement behavior-based control. Whereas robots that employ fuzzy logic follow an unchanging set of knowledge, machines using neural networks--large networks of simple elements configured in much the same way neurons are arranged in the human brain--learn from example and experience. Embedding fuzzy logic into robots' engineering technology enables them to operate in a human-like way and react to visual or audible cues. Meanwhile, neural network robots can be trained to recognize and navigate environments through example, using experience to adjust to unforeseen circumstances. Technology that relies on behavior-based control is already employed in digital cameras, computer programs, washing machines, car engines, and by the postal service. As the technology continues to improve, the time may be coming when robots can perform space missions with a minimum of human aid.
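The blending of fixed fuzzy rules with reactive sensing can be sketched in a few lines. This is an illustrative toy, not JPL's software; the membership ramp, distances, and function names are all invented for the example:

```python
# Minimal fuzzy-logic steering sketch (illustrative only): the robot blends
# an "avoid obstacle" behavior with a "head toward goal" behavior, weighting
# each by a fuzzy membership degree computed from the sensed distance.

def membership_near(distance: float) -> float:
    """Degree (0..1) to which an obstacle counts as 'near' (invented ramp)."""
    if distance <= 0.5:
        return 1.0
    if distance >= 2.0:
        return 0.0
    return (2.0 - distance) / 1.5   # linear ramp between 0.5 m and 2.0 m

def steer(goal_heading: float, obstacle_heading: float,
          obstacle_dist: float) -> float:
    """Blend turning toward the goal with turning away from the obstacle.
    Headings are in radians relative to the robot's current direction."""
    near = membership_near(obstacle_dist)
    avoid = -obstacle_heading        # turn away from the obstacle
    return near * avoid + (1.0 - near) * goal_heading
```

A far-off obstacle yields a membership of zero, so the goal-seeking behavior dominates; as the obstacle nears, the avoidance behavior smoothly takes over rather than switching abruptly, which is the practical appeal of fuzzy control.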
    Click Here to View Full Article

  • "Is There an Echo in Here? Software Lets Architects Predict"
    New York Times (10/30/03) P. E4; Eisenberg, Anne

    Architects are using auralization software to model the acoustical qualities of planned structures. Such technology can allow architects to catch sound-related problems before construction and redesign accordingly; auralization can also demonstrate predicted acoustics to clients so that they will be more inclined to agree with architects that changes are necessary. Predictive sound programs used to be employed mainly in the design of concert halls and other high-profile performance areas, but the growing power and speed of computers has allowed architects to use auralization to design more common structures, such as airport terminals. There are drawbacks to the technology: Robert C. Coffeen of the University of Kansas reports that auralization programs are less adept at handling the scattering of sound than they are at simulating sound reflections. "Auralization isn't there to tell us exactly 100 percent how a room will sound, but to show changes in what it will be like as we change materials and surface shapes within the hall," he explains. Lily M. Wang of the University of Nebraska notes that there is a short supply of certain recordings for auralization tests--namely, sound samples recorded in an anechoic (echo-free) chamber. One sample notably missing was an anechoic choral recording (an important element for the aural design of churches, for instance), so Wenger's Ronald Freiheit arranged a choral recording session in collaboration with 3M. Rendell R. Torres of Rensselaer Polytechnic Institute points out that the lack of an acoustical simulation is "like trying to describe a painting without actually showing it to anyone."
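A back-of-envelope version of one quantity such reflection simulations compute is the arrival delay of a single wall echo, found by mirroring the source across the wall (the image-source method). Real auralization tools trace thousands of such paths and convolve the result with anechoic recordings; the geometry below is invented for illustration:

```python
# Delay of one wall reflection via the image-source method (2-D sketch).
# Points are (x, y) in meters; the wall is the vertical plane x = wall_x.

import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature

def reflection_delay(src, listener, wall_x):
    """Seconds by which the wall echo lags the direct sound: mirror the
    source across the wall, then compare straight-line path lengths."""
    image = (2 * wall_x - src[0], src[1])        # mirrored source
    direct = math.dist(src, listener)            # direct path length
    reflected = math.dist(image, listener)       # image-to-listener path
    return (reflected - direct) / SPEED_OF_SOUND
```

Delays of tens of milliseconds are heard as distinct echoes, which is the kind of problem auralization is meant to surface before construction.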
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "ICANN Heads for 'Substantive' Meeting"
    CNet (10/28/03); McCullagh, Declan

    ICANN Chairman Vinton Cerf says he looks forward to dealing with "really substantive issues" during ICANN's meeting this week in Tunisia, when the organization is expected to address VeriSign's SiteFinder service, intellectual property rights in domain names, and IPv6. Cerf says this represents a significant step for ICANN and for himself. ICANN's wrangling with VeriSign over SiteFinder, the rise of denial-of-service attacks, and other issues have drawn greater public attention to Internet governance of late. Cerf comments on another key issue, claiming that by 2006 to 2007 "there will be a need to have a fully functional IPv6." Recently, 3Com, Cisco Systems, AT&T, BellSouth, and other companies declared their support of IPv6, which the U.S. Department of Defense intends to utilize exclusively within the next four years. Additionally, ICANN President Paul Twomey has commented on concerns about Internet stability as well as plans to review intellectual property rights related to use of country names in domain names.

  • "Chasing the Bugs Away"
    InfoWorld (10/27/03) Vol. 25, No. 42, P. 49; Krill, Paul

    A 2002 report from the National Institute of Standards and Technology estimates that software bugs cost the American economy $59.5 billion annually, a third of which could be saved by more thorough testing for defects. Many enterprises are setting up best practices in the software development stage, and turning to third parties to test software for errors as well as check code afterwards. Cigital President Jeff Payne attributes software glitches to three factors: The complexity of software, the futility of creating flawless rules to completely eliminate bugs, and the generally poor testing and validation efforts of software developers and builders. "When you get to business management, there's often a disconnect with the software development side and [the fact] that services and tools exist out there to make it easy to remove these defects," observes Reasoning product management director Jeff Klagenberg, who argues that management must accommodate software quality issues. Most analysts contend that developers should test software as they go in order to boost the quality of the end product, as well as entrench procedures designed to keep business-side requirements in focus. Cigital's approach is to particularize what is to be built, and then architect and design prior to coding and testing. Companies can also save money by shaving off reworking time and late life-cycle testing costs through the use of software quality reviews and artifact analyses, while Agitar CEO Alberto Savoia believes bug detection should be an area of concentration for software developers rather than a responsibility relegated wholly to quality-assurance staff. ZapThink senior analyst Jason Bloomberg favors extreme programming and agile software methodologies that "more tightly link developers to the users who will use the final product," with the latter especially valued as a process to make sure that software is attuned to fluid business needs.
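The test-as-you-go practice the analysts recommend is simply writing the check alongside the code instead of deferring all validation to a QA phase. The function and test below are invented for illustration, not drawn from any product named above:

```python
# "Test as you go": the developer writes a unit test next to the function,
# so defects surface at development time rather than in late-cycle QA.

def normalize_whitespace(text: str) -> str:
    """Collapse runs of whitespace to single spaces and trim the ends."""
    return " ".join(text.split())

def test_normalize_whitespace():
    assert normalize_whitespace("  a\tb\n c  ") == "a b c"
    assert normalize_whitespace("") == ""    # edge case caught early
```

Catching the empty-string case here costs seconds; catching it after release is the kind of rework the NIST figures price in billions.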
    Click Here to View Full Article

  • "Salary Squeeze"
    Computerworld (10/27/03) Vol. 31, No. 49, P. 37; King, Julia

    An annual Computerworld salary survey of almost 15,000 IT professionals finds that for the second consecutive year pay raises were moderate, while some workers experienced salary reductions per company mandate. Respondents received an average raise of approximately 6.7 percent, and though fewer workers reported pay cuts this year, the average salary shrinkage was 13 percent of their base pay, the same as last year. Nearly all of the polled professionals over 40 consider themselves fortunate just to be working in the midst of an economic recession. CP Kelco CIO Joyce Young notes that middle managers in particular are earning less because of the economy and current levels of IT unemployment, while DFS Group CIO Ron Glickman points out that offshore outsourcing also plays a key role in flat or declining IT worker wages. DFS Group's outsourcing contract with New Jersey-based Cognizant Technology Solutions has helped the company be more productive on just 50 percent of the operating budget, Glickman notes. On the other hand, Gartner analyst Diane Berry warns that economy-driven IT layoffs can hurt companies once the economy recovers and they need to hire expert IT workers to stay competitive. Some companies have tried to avoid breaking the "psychological contract," as Berry calls it, in unique ways: Barton Malow CIO Phil Go says his company makes regular salary adjustments of between 3 percent and 6 percent, while Synygy executive Chetan Shah has cut the salaries of average and underperforming IT staff in order to fund raises and bonuses for more skilled and outstanding workers. Fifty-four percent of all IT workers surveyed by Computerworld expressed satisfaction with their overall compensation, while only 22 percent reported dissatisfaction, despite the longer hours and heavier workloads they must contend with.
    Click Here to View Full Article

  • "Emerging Technology: Wireless Security--Is Protected Access Enough?"
    Network Magazine (10/03) Vol. 18, No. 10, P. 33; Dornan, Andy

    The Wi-Fi Alliance released Wi-Fi Protected Access (WPA) to correct security flaws that have long plagued the IEEE 802.11 standard; such flaws include inadequate encryption in the standard's Wired Equivalent Privacy (WEP) feature, a major problem being key reuse, in which attackers exploit the patterns established by repeated use of the same encryption key to build a crib sheet. Many WLAN vendors responded to this vulnerability by lengthening the WEP key, but given the severity of WEP's flaws, this offers limited protection. The IEEE is designing a full WEP replacement, 802.11i, which is awaiting finalization and may not proliferate widely for two years; furthermore, this most secure Wi-Fi iteration will not work with the entire installed base of Wi-Fi hardware. The 802.11i standard ejects RC4 encryption in favor of Rijndael (AES), the official cipher of the U.S. government; the draft also includes an alternative encryption scheme, the Temporal Key Integrity Protocol (TKIP), that enables users of non-AES hardware to join 802.11i networks without lowering security. Another component of WEP is a CRC-32 checksum meant to assure recipients that their information has not been changed in transit; the catch is that while accidental alterations are easily detected, deliberate attacks are not, and 802.11i's AES-CCMP design adds safeguards against both unintentional and intentional changes. WPA melds TKIP with several 802.11i components, including the 802.1X authentication protocol, which controls how TKIP and AES keys are exchanged and allows deployers to employ any kind of authentication. Other 802.11i-derived technologies in WPA include Key Hierarchy, which permits users to employ different keys for different purposes, and Cipher Negotiation, a necessity when not all clients linking to a network support the same security measures. 
Nevertheless, WPA and 802.11i cannot fix the biggest WLAN security problem: encryption is deactivated by default on shipped hardware, and most users never turn it on.
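The two WEP weaknesses described above are easy to demonstrate in a few lines. The sketch below is illustrative only: a fixed hypothetical byte sequence stands in for RC4 keystream output under a reused key/IV, and the `wep_like_*` helper names are invented for the example, not WEP's actual framing or API.

```python
import zlib

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical fixed keystream standing in for RC4 output under a reused key/IV.
keystream = bytes((i * 73 + 41) % 256 for i in range(64))

def wep_like_encrypt(plaintext: bytes) -> bytes:
    # WEP-style framing: append a CRC-32 integrity check value (ICV),
    # then XOR the whole frame with the keystream.
    icv = zlib.crc32(plaintext).to_bytes(4, "little")
    return xor(plaintext + icv, keystream)

def wep_like_decrypt(ciphertext: bytes):
    data = xor(ciphertext, keystream)
    body, icv = data[:-4], data[-4:]
    return body, zlib.crc32(body).to_bytes(4, "little") == icv

# Key reuse: XORing two ciphertexts cancels the shared keystream,
# leaking the XOR of the two plaintexts (the "crib sheet" material).
p1, p2 = b"attack at dawn!!", b"attack at dusk!!"
c1, c2 = wep_like_encrypt(p1), wep_like_encrypt(p2)
assert xor(c1, c2)[:len(p1)] == xor(p1, p2)

# CRC-32 is linear under XOR, so an attacker can flip plaintext bits
# and patch the encrypted checksum without ever knowing the key.
delta = xor(p1, b"attack at dusk!!")   # desired plaintext change
zeros = bytes(len(delta))
icv_delta = (zlib.crc32(delta) ^ zlib.crc32(zeros)).to_bytes(4, "little")
forged = xor(c1, delta + icv_delta)
body, ok = wep_like_decrypt(forged)
assert ok and body == b"attack at dusk!!"  # forgery passes the checksum
```

The second half is exactly why the article notes that deliberate attacks evade the CRC-32 check: because the checksum is linear, XOR-ing a chosen difference into the ciphertext and the matching CRC difference into the encrypted ICV yields a frame that still verifies.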
    Click Here to View Full Article

  • "The State of Information Security 2003"
    CSO Magazine (10/03); Berinato, Scott

    A CIO magazine/PricewaterhouseCoopers survey of more than 7,500 respondents outlines the evolution of information security. Among its conclusions: information security has improved little in the last two years; respondents are just starting to realize that solving information security problems will be a daunting challenge, yet at the same time are looking for simpler solutions; and confidence correlates clearly with solid security. Security experts such as Bruce Schneier attribute most information security problems to deployment rather than spending--more is invested in tech implementations than in risk-management training, awareness, and education, and few companies use their technology to its full potential because they let it run without interpreting the data it generates. More confident organizations usually possess a stronger security infrastructure, devote more of their IT budgets to security, and are less likely to have security controlled by the IT department. Key strategies for effecting this model include hiring a CSO or forming an executive security committee, and carrying out independent audits of security practices. Fear of attack is often the chief justification for security deployments; CSOs should instead focus on objective security evaluations and institute solid security requirements that vendors and business partners must meet. eBay VP Howard Schmidt thinks the predominance of minor security breaches signifies a dearth of IT security discipline, and that the solution is to re-target security efforts to cover both small-scale and large-scale threats, hire a disciplinarian, and adhere vigorously to security regulations. 
The survey's measurement of security spending per employee illustrates the value of applying this per capita metric to the enterprise and comparing the figure to the industry average, to the overall average of $964, and to the expenditures of very confident versus not very confident companies.
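The per capita comparison the survey suggests is straightforward arithmetic; a minimal sketch with hypothetical budget and headcount figures (only the $964 overall average comes from the article):

```python
# Hypothetical figures for illustration; only the $964 average is from the survey.
total_security_spend = 1_200_000   # annual security budget, in dollars (assumed)
employees = 1_500                  # company headcount (assumed)
survey_average = 964               # overall per capita average reported

per_capita = total_security_spend / employees
print(f"per capita security spend: ${per_capita:.2f}")                # $800.00
print(f"share of survey average: {per_capita / survey_average:.0%}")  # 83%
```

A figure well below the $964 benchmark, or below the spending of highly confident peers, is the kind of gap the metric is meant to surface.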
    Click Here to View Full Article

  • "TR100: Internet"
    Technology Review (10/03) Vol. 106, No. 8, P. 88; Bruno, Lee

    The honorees in this year's TR100 awards in the Internet sector believe all networks will one day feature ubiquitous voice and data services via the adoption of 802.11 wireless data networks and the proliferation of Net-based voice transmission. These developments, along with the expected growth of the low-cost wireless sensor market, will add intelligence to networks in which authority for controlling data is spread out across appliances and sensors that study and share information on traffic patterns. John Apostolopoulos at Hewlett-Packard has developed a method for simultaneously transmitting video data along multiple routes to bolster the security of streaming video on the Internet. Other TR100 honorees in the field of Net security include CipherTrust CTO Paul Q. Judge, who wrote IronMail software that halts spam and computer viruses before they can penetrate a network. Lafayette Project co-founder Meg Hourihan and Google programmer Evan Williams designed Blogger, a wildly popular Web application that simplifies the creation of Web logs (blogs). Jason Hill of JLH Labs created the freely distributed TinyOS software that enables wireless sensor devices to send messages to their neighbors as needed, and boosts the resilience and accelerates the deployment of sensor networks; Ember co-founder Andrew Wheeler, meanwhile, has developed wireless sensor networks designed to raise industrial efficiency. Lih Y. Lin of the University of Washington and AT&T's Jennifer Yates both helped speed up optical networks--the former by building pivoting micromirrors to directly switch light-wave signals, and the latter by developing the General Multi-Protocol Label Switching standard widely used by the telecom industry. 
Other notable Internet innovators include CollabNet's Brian Behlendorf, who developed software tools that improve the productivity of groups collaborating over the Net; Jud Bowman of Pinpoint Networks, who created a software platform that bridges wireless applications and handsets; Yahoo!'s Rasmus Lerdorf, who created the PHP hypertext preprocessor, a server-side scripting language for bringing live data to the Web; Voxiva founder Paul Meyer; Digital Envoy's Sanjay Parekh; and Martin Wattenberg of IBM.
