
ACM TechNews sponsored by AutoChoice Advisor -- Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 599:  Wednesday, January 28, 2004

  • "Worm Slowing, But Still Dangerous"
    Wired News (01/28/04); Delio, Michelle

    The MyDoom email virus may be losing speed, but experts warn that its effects will linger. MyDoom, which has wrested the dubious honor of most virulent email virus away from Sobig.F, installs backdoors in victim computers that could allow hackers to hijack the infected machines and use them as platforms for all kinds of mischief, such as denial-of-service attacks and spamming. Keynote analysts note that general Internet performance was restored to close to normal speeds as of 3 p.m. EST on Jan. 27, but warn that there could be significant degradations when MyDoom launches a nearly two-week-long denial-of-service attack against the SCO Group's Web site beginning Feb. 1. Though the virus is programmed to stop proliferating on Feb. 12, the infected machines will be left vulnerable to spammers and hackers. Security experts attribute the virus' rapid spread to the way many users were fooled by the text accompanying the MyDoom-tainted email attachments; Trend Micro's David Perry explains that MyDoom represents the first time that a worm's social engineering has had the corporate sector in its crosshairs. Worse, MessageLabs' Natasha Staley thinks MyDoom is merely a proof of concept, and a harbinger of even more damaging malware on the horizon. Though Sophos and F-Secure have issued free tools designed to remove MyDoom from contaminated systems, many home users will likely ignore them because they are unaware that their computers have been compromised; the worm has no obvious effect on an infected PC. In an online letter to the open-source community, open-source proponent Bruce Perens states that MyDoom's attack on the SCO site is not to be commended, even though SCO is threatening litigation against Linux users on the grounds of copyright infringement.
    Click Here to View Full Article

  • "Virginia Tech Migrates G5 Supercomputer to Apple Xserves"
    E-Commerce Times (01/27/04); Weisman, Robyn

    Virginia Tech has announced that its G5 Mac-based supercomputing cluster, which ranked third in speed last fall, will immediately begin migrating from Power Mac G5 desktops to Apple's Xserve G5 1U servers; the switch should be completed within four months. Virginia Tech's Lynn Nystrom says the transition will reduce the space the cluster takes up by two-thirds and will better address heating and cooling issues. International Data analyst Mike Swenson says the migration is intriguing, at least superficially, given the similarities between the G5 Power Mac's and the Xserve's processors and configurations. He adds that the compactness of the 1U form factor makes controlling heat difficult, but Apple's Doug Brooks counters that the Xserve consumes less power, and therefore runs cooler, than either the G5 tower or rival 1U rack-mount servers. He also posits that the switch will make the cluster's scalability more flexible, while system administrators will be able to manage the cluster remotely and check the health and condition of each individual Xserve. Tom Goguen, Apple's server software director, says the Xserves run Mac OS X Server 10.3, or "Panther," which "is fully optimized for the G5 platform...[offering] management tools and managed software solutions [that provide] Virginia Tech with much more flexibility on how they can choose to deploy the cluster." Nystrom says that NASA, the National Security Agency, and other agencies are courting Virginia Tech with proposals to use the G5 cluster for research projects, and the university is considering the fee structure for such services.
    Click Here to View Full Article

  • "Rover Engineers Hope They Found Problem"
    New York Times (01/27/04) P. A14; Chang, Kenneth

    The rover Spirit could return to its mission on Mars within a week if NASA engineers have isolated the cause of its recent malfunction, and they believe they have, having successfully replicated the computer crashes the rover has been suffering from. Mission managers suspect that the crashes were caused by an overload of open files, and one manager, Dr. Mark Adler, is confident that the rover's full functionality could be restored within a week once the flaw is patched. The problem is related to the rover's flash memory, which retains data when the power is turned off and is used to store essential information such as data from scientific experiments; flash memory can hold an effectively unlimited number of files. The rover also carries random access memory (RAM), whose speed advantages over flash memory are offset by its inability to retain data in the absence of power. Spirit's software is slow to read or write files in flash memory, so the machine copies frequently used data to RAM, but the finite supply of RAM was exhausted because so many files were open. The vehicle's flight software became trapped in a crash/restart loop, and the rover was draining its batteries because it could not properly shut down at night. NASA flight controllers will attempt to download some of the data in Spirit's flash memory that should reveal what the rover was doing when the malfunction occurred; Spirit's counterpart, Opportunity, will also require patching for the same flaw. "It's not a theory that this is a problem on the vehicle, in the sense that we've verified it in the test bed," notes Dr. Adler. "So we know, even if this doesn't turn out to be what hit Spirit, it is something we have to work around on both vehicles."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
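
    A minimal sketch in Python of the failure mode described above, assuming made-up sizes: per-file bookkeeping for flash must be mirrored into a fixed RAM pool at mount time, so an ever-growing file count in flash eventually exhausts RAM and forces the reset loop engineers observed.

      RAM_BYTES = 128 * 1024          # fixed RAM pool for file metadata (hypothetical)
      METADATA_BYTES_PER_FILE = 256   # RAM cost of tracking one flash file (hypothetical)

      def boot(files_in_flash: int) -> bool:
          """Return True if flash can be mounted without exhausting RAM."""
          return files_in_flash * METADATA_BYTES_PER_FILE <= RAM_BYTES

      files = 100
      while True:
          if not boot(files):
              # RAM exhausted during mount: crash, reset, and retry with the same
              # flash contents -- the crash/restart loop described above.
              print(f"{files} files: RAM exhausted, stuck in a reset loop")
              break
          print(f"{files} files: boots normally")
          files += 200                # files accumulate in flash over the mission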

  • "H-1B Visas Going Fast"
    CNet (01/26/04); Frauenheim, Ed

    As of Oct. 1, 2003, the annual cap of H-1B visas fell from 195,000 to 65,000, and U.S. Citizenship and Immigration Services recently reported on its Web site that some 43,500 visas had either been approved or were in the pipeline for approval in the first quarter of fiscal year 2004. The agency's Chris Bentley characterizes the current pace of H-1B applications as "somewhat accelerated" from last year. Cognizant Technology Solutions CEO Lakshmi Narayanan contends that IT services firms with a significant portion of India-based operations may be among the biggest H-1B applicants. India-based companies with U.S. operations widely use both the H-1B and L-1 visa programs, and this has been a sore point for U.S. workers laid off in recent years, as well as critics who claim the programs support the offshore outsourcing of technology jobs. Visa advocates caution that the elimination of the visa program would accelerate, not decelerate, offshore outsourcing. In addition to the lowered visa quota, it is now cheaper for companies to secure H-1Bs: The application fee has fallen from $1,000 to a mere $130 as of October 2003. Ron Hira of the Rochester Institute of Technology expects the number of India-based companies applying for H-1Bs to increase now that rules pertaining to "H-1B-dependent employers" have expired; the rules required firms with major percentages of H-1B workers to assert that they looked for American workers before applying for another H-1B visa. A new temporary worker program proposed by President Bush could impact or even jettison the H-1B visa program.
    Click Here to View Full Article

  • "SGI Sheds More Light on Multi-Paradigm Computing"
    Computer Business Review (01/27/04); Aslett, Matthew

    Silicon Graphics (SGI) CTO Dr. Eng Lim Goh detailed Project Ultraviolet, an initiative first unveiled at the Supercomputing 2003 trade show in November as a plan to devise a new class of "multi-paradigm" supercomputers by meshing the best constituents of vector, scalar, and other computing architectures. The resulting single architecture will implement whatever computing approach is best suited to make the most of a specific application. SGI research also discovered that disparate operations within one application could yield more performance improvements through a combined processing approach than they would through additional processors. "The world has been stove-piped into scalar or vector and there is nothing in between," explained Goh. "The ideal architecture would be a mixture of both." SGI is attempting to create a memory controller that can mate with the Intel processor to make vector application management less of a headache for the scalar processor; the multi-paradigm architecture must also be standards-based and take the nascent Processor in Memory (PIM) and field programmable gate array (FPGA) models into consideration. Existing applications should be able to run on Project Ultraviolet technologies without the need for recoding; still, Goh pointed out that the code will require recompiling in order to take full advantage of all the paradigms. The full Ultraviolet project, still some three years away, aims to link FPGAs into memory via a port on the memory controller, although SGI will roll out FPGA support in 2005 with the delivery of an interface to FPGA capabilities in an independent chip that will be combined with the memory controller.

  • "The Machine That Invents"
    St. Louis Post-Dispatch (01/25/04); Hesman, Tina

    Imagination Engines CEO Stephen Thaler's experiments with neural networks revealed that disrupting their connections with noise caused the networks to generate new ideas. These trials were the beginnings of Thaler's Device for the Autonomous Generation of Useful Information, a.k.a. the Creativity Machine, which offers a far more flexible model for autonomous computing than logical, rule-based artificial intelligence systems, according to University of Memphis computer scientist Robert Kozma. Creativity Machines feature built-in critic networks that choose the best ideas and provide positive feedback, which gives the network the impetus to generate even better ideas. The Creativity Machine designed the Oral-B CrossAction toothbrush via brainstorming between two neural networks; in another experiment, the device composed 11,000 new songs from a sampling of Top 10 hits spanning 30 years, and Thaler wagers that the machines' composing skills could be refined with a human-trained critic network. Other possible applications of the Creativity Machine include collision-avoidance and intrusion-prevention technology that could alert motorists to obstacles and pedestrians or alert security systems to intruders, while spy agencies want to use Thaler's invention to map out the Internet and identify anomalous behavior. Lloyd Reshard at Eglin Air Force Base says the technology's biggest hurdle is packaging it into an off-the-shelf product. There are also those who fear that such a machine could replace people in certain professions, or give rise to a "Terminator" scenario in which the machines decide to eliminate mankind. Rusty Miller, a government contractor at General Dynamics, dismisses such concerns, arguing that a much greater danger lies in terrorists and other malevolent parties using the Creativity Machine for nefarious purposes.
    Click Here to View Full Article
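
    The general scheme lends itself to a short sketch: perturb a trained network's weights with noise so it produces novel outputs, then let a second "critic" score the candidates and keep the best. This is a generic illustration of the noise-plus-critic idea, not Thaler's patented implementation; the generator weights and the toy critic below are made up.

      import numpy as np

      rng = np.random.default_rng(0)
      W = rng.normal(size=(8, 4))      # stand-in for a trained generator's weights

      def generate(noise_scale: float) -> np.ndarray:
          """Run the generator with noise-perturbed weights to get a novel output."""
          W_noisy = W + rng.normal(scale=noise_scale, size=W.shape)
          return np.tanh(W_noisy @ rng.normal(size=4))

      def critic(candidate: np.ndarray) -> float:
          """Toy critic: rewards strong but unsaturated outputs (illustrative only)."""
          return float(np.abs(candidate).mean() - 0.25 * np.abs(candidate).max())

      # Generate many noisy "ideas" and let the critic pick the most promising one.
      candidates = [generate(noise_scale=0.5) for _ in range(100)]
      best = max(candidates, key=critic)
      print("best candidate:", np.round(best, 2))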

  • "Biggest Web Problem Isn't About Privacy, It's Sloppy Security"
    Wall Street Journal (01/26/04) P. B1; Gomes, Lee

    Web security leaves a lot to be desired, as evidenced by embarrassing incidents at companies such as the online restaurant reservation service OpenTable.com; Web designers need constant reminding of the security issues they should bear in mind as they create Web sites, a situation that MIT doctoral student and security consultant Kevin Fu calls "depressing." Upon signing up at OpenTable, new customers are given personal cookies that store specific customer numbers so that the site recognizes returning customers and sends their personal data back to their browsers. However, it turned out to be easy to alter the number in one's cookie to that of another user and fool OpenTable into sending that other person's data as well. Furthermore, a coder wrote a simple program that could cycle through all possible numbers and revisit OpenTable repeatedly to steal registered users' data. OpenTable reports that it patched the problem with encryption software when it was notified last week. Richard M. Smith of computerbytesman.com says a better way to close such security holes is to employ only very large random numbers. Open Web Application Security Project director Mark Curphey points out that such security flaws are all too common, with broken authentication and unvalidated input--both exemplified by OpenTable's troubles--constituting the biggest problems. The Web security focus has shifted: Whereas it first concentrated on fortifying the connection between users and Web sites against hacker intrusions, it is now turning its attention to subtler exploits, such as how apparently harmless data given to trusted users can be abused.
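
    The remedy Smith describes is easy to sketch in Python: replace a small sequential customer number with a large random token drawn from a cryptographic source, so an attacker cannot simply walk through the ID space. The function names here are illustrative, not OpenTable's code.

      import secrets

      def sequential_cookie(customer_number: int) -> str:
          # Guessable: an attacker can enumerate neighboring IDs (1001, 1002, ...).
          return str(customer_number)

      def random_cookie() -> str:
          # Unguessable: 128 bits from a cryptographically secure generator.
          return secrets.token_hex(16)

      print(sequential_cookie(1001))   # "1001" -- trivially enumerable
      print(random_cookie())           # e.g. "f3a9..." -- infeasible to guess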

  • "That Gibberish in Your In-Box May Be Good News"
    New York Times (01/25/04) P. 3-16; Johnson, George

    Researchers at the recent 2004 Spam Conference laid out highly technical solutions for eliminating spam and gave little attention to legal remedies such as lawsuits and enforcement of the CAN-SPAM Act. Filtering technology, in particular, is taking its toll on spammers, forcing them to send increasingly meaningless messages. The downward slide in the coherence of spam messages has inspired several evocative analogies: an evolutionary dead-end species with too many detrimental mutations, or the HAL computer in "2001: A Space Odyssey" as it is slowly disconnected and dies a confused death. After computer scientist Paul Graham urged the adoption of Bayesian email filters in an August 2002 manifesto, the anti-spam community armed itself with this potent weapon. Using a statistical method invented by 18th-century Englishman Thomas Bayes, Bayesian filters rank words according to how often they appear in spam versus legitimate messages; spam that makes it past the filter results in corrections to the filter, so spammers are forced to devolve their messages into meaninglessness. Recent examples of spammers' desperation are one-word messages, emails with a single image accompanied by a link, and email subject lines such as "Seakrets of ((eks-eks-eks)) stars." To confuse filters, spammers insert "word salad," strings of words reminiscent of Dadaist poetry, sometimes written in white letters on white backgrounds so as to be invisible. Presentations at the 2004 Spam Conference promise more technical tools for clamping down on spam, such as spam researcher Terry Sullivan's "10-dimensional high-fidelity model of historical spam space" analysis of spam evolution; Sullivan found that spam evolves in punctuated equilibrium, much as biological species are thought to evolve.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
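
    The core of a Bayesian filter fits in a few lines of Python. This minimal sketch, with made-up training messages and simple Laplace smoothing, scores each word by its frequency in known spam versus legitimate mail and sums the log ratios for a new message; real filters add tokenization, better smoothing, and feedback from user corrections.

      import math
      from collections import Counter

      spam_docs = ["cheap pills buy now", "buy cheap stars now"]
      ham_docs = ["lunch meeting moved to noon", "draft of the paper attached"]

      spam_counts = Counter(w for d in spam_docs for w in d.split())
      ham_counts = Counter(w for d in ham_docs for w in d.split())
      spam_total = sum(spam_counts.values())
      ham_total = sum(ham_counts.values())

      def word_score(word: str) -> float:
          """Log ratio of the word's smoothed frequency in spam vs. ham."""
          p_spam = (spam_counts[word] + 1) / (spam_total + 2)
          p_ham = (ham_counts[word] + 1) / (ham_total + 2)
          return math.log(p_spam / p_ham)

      def spam_score(message: str) -> float:
          """Positive sum means the message's words look more like spam."""
          return sum(word_score(w) for w in message.split())

      print(spam_score("buy cheap pills"))       # positive: spammy
      print(spam_score("meeting about paper"))   # negative: looks legitimate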

  • "Most Flexible Electronic Paper Yet Revealed"
    New Scientist (01/26/04); Knight, Will

    Philips researchers have created the most flexible electronic display to date by printing organic electronics on a 25-micron-thick polyimide substrate, and MIT researcher Joe Jacobson calls the breakthrough "an important milestone and another step closer towards 'real' electronic paper." The 80,000-pixel display produced by the Philips technique measures 12 centimeters diagonally and can be rolled into a tube only 2 centimeters in diameter; the screen generates a greyscale image and takes about one second to refresh, which makes it impractical for moving images. Deposited atop the polyimide substrate is a 200-micron-thick layer of E Ink's "electronic ink," which consists of thousands of capsules containing negatively charged black particles and positively charged white particles. Applying a voltage through the organic circuit to a specific part of the screen draws either the black or the white particles to the surface, turning the specified area black or white. Philips is keeping the manufacturing method's details under wraps, but plans to mass-produce such displays in a few years. Bas Van Rens, general manager of Polymer Vision, which will commercialize the Philips displays, claims the screens are far more sophisticated than similar products in that they offer better resolution, more complex electronics, and more manageable size. Products that could emerge from electronic paper once the technology is perfected include updateable roll-up newspapers and collapsible mobile phones.
    Click Here to View Full Article
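
    The capsule behavior reduces to a tiny model: the polarity of the voltage applied across a capsule determines which pigment is pulled to the viewing surface, and the pixel then holds that state with no power applied. Which color maps to which polarity depends on the panel's wiring; the mapping below is an assumption for illustration.

      def drive_capsule(polarity: int) -> str:
          """Return the color shown after driving the capsule with +1 or -1."""
          # Oppositely charged pigments migrate toward opposite electrodes;
          # the +1 -> black mapping here is assumed, not taken from Philips.
          return "black" if polarity > 0 else "white"

      pixel = drive_capsule(+1)   # drive the capsule once...
      print(pixel)                # ...it keeps displaying "black" with no power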

  • "Time to Redial: VOIP (Voice Over Internet Protocol) Makes a Comeback"
    Knowledge@Wharton (01/27/04)

    Voice over Internet Protocol (VoIP) technology is making a comeback after going through a buzz phase a few years ago; at that time, many people tried out free phone services, but few were willing to replace their traditional phones with the unreliable, poor-quality VoIP service available over 56K modems. With the spread of broadband and new enthusiasm from phone and cable companies, VoIP is building momentum: All the major phone carriers and cable firms have announced at least limited rollouts of VoIP service as a replacement for traditional phones, and analysts expect the market to reach some $4 billion by 2007. In-Stat/MDR senior analyst Daryl Schoolar says 4 million households will use VoIP telephony in three years' time, a number constrained by the spread of broadband Internet connections; fewer than 40 percent of American households are expected to have broadband by 2007. Regulatory issues also dog the technology: Though VoIP advocates mostly want government uninvolved in VoIP deployment--as it is with the Internet--law enforcement is calling for the same wiretap capabilities available on traditional phone networks. There is also a fear that rapid VoIP adoption would cause traditional phone network investment to stagnate, thus threatening universal access and 911 emergency services. VoIP has the potential to become a hot political topic this election year, with Democrats supporting regulation and Republicans advocating a hands-off approach, according to Wharton business and public policy Professor Gerald R. Faulhaber. Despite the threats, analysts say that VoIP will remain a long-term option, especially among businesses that frequently communicate overseas. Cable firms will likely do better in the home market, as they have with broadband Internet, but phone companies stand to benefit from the cost-efficiencies of replacing expensive traditional infrastructure.
    Click Here to View Full Article

  • "For Brazil Voters, Machines Rule"
    Wired News (01/24/04); Mira, Leslie M.

    For the longest time, it was easy to rig Brazilian elections because they relied on a strictly paper-based voting process. Now, even as electronic voting systems draw fire in the United States for being insecure and unreliable, Brazil's populace is praising e-voting, which was introduced in 1996 and spread to all precincts in 2000. The e-voting technology used in Brazil is partly supplied by Diebold Election Systems, the same company that has been heavily criticized for its touch-screen voting systems in North America, but there are noticeable differences: The voting machines, known as "urnas," are portable and have no touch screens--instead, voters punch in numbers corresponding to a candidate, and confirm their vote with a green button once they have viewed the candidate's image and name to make sure their selection is correct. Daniel Wobeto of the electoral commission in Rio Grande do Sul reports that machine software is published on the Internet prior to elections, while the urnas' voting data is stored on a floppy disk and sealed within the machine with tamper-evident tape. The urnas are more affordable than the touch-screen systems in the United States, and they make voting easier for citizens with poor literacy. They are also voter-verifiable with the addition of a plugged-in printer, but the Brazilian government recently approved a law that will eliminate printed e-voting receipts to reportedly save money and accelerate the electoral process. Computer scientists and other activists call this strategy a bad move. Tribunal Regional Eleitoral computer programmer Claudio Luiz, who calls the urnas "faster and safer" than paper-based voting, says the technology will soon be employed by high-school student councils.
    Click Here to View Full Article
    To read about ACM's activities in the area of e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.
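
    The voting flow described above can be sketched in a few lines of Python: the voter keys in a candidate's number, the machine displays the matching name for verification, and a confirm step (the green button) records the vote locally. The ballot below is hypothetical.

      candidates = {"13": "Candidate A", "45": "Candidate B"}   # made-up ballot
      tally = {number: 0 for number in candidates}

      def cast_vote(keyed_number: str, confirmed: bool) -> str:
          name = candidates.get(keyed_number)
          if name is None:
              return "unknown number: nothing recorded"
          if not confirmed:                # green button not yet pressed
              return f"showing '{name}', awaiting confirmation"
          tally[keyed_number] += 1         # stored locally, like the urna's sealed floppy
          return f"vote for '{name}' recorded"

      print(cast_vote("13", confirmed=False))
      print(cast_vote("13", confirmed=True))
      print(tally)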

  • "Data Storage Worlds Uniting"
    CNet (01/25/04); Frauenheim, Ed

    In an effort to reduce costs and simplify file-sharing, companies are attempting to merge the best elements of network-attached storage (NAS) and storage area network (SAN) technologies. Perhaps the most evident trend in this wider convergence is NAS gateways, which connect SANs with Ethernet networks; SANs usually rely on faster Fibre Channel to connect speedily with local servers, though this traditional setup means poor resource utilization. Making data storage available across the enterprise is a goal of most companies and is similar to the virtualization efforts going on in the server arena. Downsides to NAS gateways include increased "vendor lock-in," since NAS gateways, also called NAS heads, from Network Appliance, IBM, and EMC each work only with their own brands of SAN systems. Other approaches to NAS-SAN convergence include offering block- and file-level data access in the same system; IBM's SAN File System is such a "unified storage product," using both SAN block-level and NAS file-level approaches to data access. The system uses Internet Protocol and software loaded on servers to keep track of metadata, the result being a single file system for SANs that can manage petabytes of data. BlueArc's Titan NAS gateway cannot manage such large amounts of storage by itself, but clustered Titans can; in addition, BlueArc CTO Geoff Barrall says the devices use field programmable gate arrays--programmable silicon--to quickly handle tasks usually assigned to software. Barrall says the Titan product will also be able to increase its data-transfer speed with technological advances and support larger file systems. In the future, Ethernet upgrades and iSCSI will mean tighter convergence of the NAS and SAN approaches.
    Click Here to View Full Article
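
    A NAS gateway's role can be sketched abstractly in Python: it answers file-level requests on the Ethernet side by issuing block-level reads against SAN storage. The block layout and file table below are invented for illustration.

      san_blocks = {0: b"hello ", 1: b"world"}        # stand-in for blocks on a SAN LUN
      file_table = {"/exports/readme.txt": [0, 1]}    # file -> ordered list of blocks

      def nas_read(path: str) -> bytes:
          """File-level read (what an NFS/CIFS client sees) assembled from block reads."""
          return b"".join(san_blocks[block] for block in file_table[path])

      print(nas_read("/exports/readme.txt"))          # b"hello world"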

  • "Iridescent Software Illuminates Research Data"
    NewsFactor Network (01/27/04); Martin, Mike

    The job of sifting through journals and scientific literature to unearth information relevant to studies could become less burdensome--and less costly--for researchers with the help of Iridescent, a software program developed by bioinformatics scientists at the University of Texas Southwestern Medical Center. "This work is about teaching computers to 'read' the literature and make relevant associations so they can be summarized and scored for their potential relevance," explains Iridescent co-developer Dr. Jonathan Wren at the University of Oklahoma, who adds that humans would have to read tens of thousands of documents to accomplish the same task. UT professor Dr. Harold Garner, who also contributed to Iridescent's development, says the program mimics the scientific thought process. Iridescent builds a network of related objects by identifying shared statistical relationships between object sets in the National Library of Medicine's Medline bibliographic database, an exponentially expanding archive. "Having assimilated all of Medline, Iridescent can compile diverse facts to present a list of 'hypotheses' to the user for finding hidden knowledge in the data," Garner comments. The UT professor says Etexx Biopharmaceuticals employs Iridescent to devise new ways to apply existing FDA-approved drugs to cardiac diseases that otherwise have no therapeutics. Garner adds that the program dramatically lowers investments in time and money for high-throughput screening, toxicology testing, and manufacturing qualifications.
    Click Here to View Full Article
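
    The underlying statistical idea can be sketched simply: count how often pairs of terms co-occur across abstracts, compare that with what independence would predict, and surface unexpectedly strong pairs as candidate hypotheses. This is a generic co-occurrence sketch over toy data, not Iridescent's actual scoring method.

      from collections import Counter
      from itertools import combinations

      # Toy stand-ins for Medline abstracts, reduced to sets of terms.
      abstracts = [
          {"geneX", "inflammation", "heart"},
          {"geneX", "inflammation"},
          {"heart", "drugY"},
          {"inflammation", "drugY", "heart"},
      ]

      n = len(abstracts)
      term_freq = Counter(t for a in abstracts for t in a)
      pair_freq = Counter(p for a in abstracts for p in combinations(sorted(a), 2))

      def association_score(t1: str, t2: str) -> float:
          """Observed co-occurrence rate over the rate expected under independence."""
          observed = pair_freq[tuple(sorted((t1, t2)))] / n
          expected = (term_freq[t1] / n) * (term_freq[t2] / n)
          return observed / expected

      # Pairs scoring well above 1.0 suggest relationships worth a researcher's time.
      for (t1, t2), _ in pair_freq.most_common():
          print(t1, t2, round(association_score(t1, t2), 2))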

  • "The Mac Turns 20"
    PC World (01/23/04); Watt, Peggy

    Remembering Apple's launch of the Macintosh is like rehashing the early history of PC innovations: Many of the people who helped launch the Macintosh gathered at the Computer History Museum in Mountain View, Calif., and were asked what Apple and the rest of the PC industry have learned from each other. Former Microsoft application development manager Vern Raburn says Steve Jobs got many of his Macintosh ideas from Xerox PARC research developments, such as the graphical user interface, mouse, laser printing, and even the desktop trashcan icon. Even though Jobs did not come up with the original ideas himself, Apple was the first company to market those features in real products, which helped break the PC free from menu-driven tasks. The Macintosh platform continues to be a source of multimedia innovations: Former Apple desktop publishing head John Scull says he worked on making the LaserWriter into a marketable product with only a summer intern on staff; PageMaker developer Aldus was a key ally that launched its product first on the Mac, then on Windows. Scull remembers that Kodak dismissed desktop publishing as "a toy." Apple found other key software allies in Lotus, Software Publishing, and Microsoft, though Microsoft delivered the most, with Excel, the first graphical version of Word, and other programs. Another important Apple contribution was the use of networking, along with file translation programs and emulators, as a way to achieve compatibility. Marketing, computer performance, and open architecture are three things Apple learned from the PC industry, but consultant Rob Enderle says Apple is still learning other lessons, such as licensing; Apple recently licensed its iPod design to Hewlett-Packard. Former Apple public relations manager Barbara Krause says that though Apple employees worked in something of "a reality distortion field" during the early days, the company's focus on "the computer for the rest of us" had a dramatic impact.
    Click Here to View Full Article

  • "Grids Moving Beyond Science"
    Network World (01/26/04) Vol. 21, No. 4, P. 1; Mears, Jennifer

    Mainstream organizations are starting to embrace grid computing--which has been for the most part confined to academic and scientific pursuits--to make their IT resources more efficient and to squeeze more performance out of business applications. This adoption is in an early phase: Only 4 percent of 180 companies surveyed last summer by Summit Strategies reported that they had deployed a grid, while 12 percent said they were assessing the technology; 18 percent did not expect major grid evaluations for at least a year, and 25 percent noted that grid computing was likely to play an extremely important or very important role in their IT infrastructure over the next three years. "If you've got particularly transactional applications, there's really no business case right now to put them into a grid environment," points out Summit's Mary Johnston Turner, but this could change with the integration of grid standards and Web services via the Global Grid Forum's Open Grid Services Architecture. At The Globus Alliance's recent GlobusWorld conference, IBM announced new Web service specifications designed to provide such integration. Analysts believe the adoption of such standards will spur wider acceptance of grid computing in the corporate arena, and note the importance of advanced workload management, resource usage monitoring, and security enhancements to the evolution of specifications. This integration is concurrent with the introduction of consulting and professional services that aim to help corporate clients comprehend and design grid environments. In a 2003 Platform Computing study of 50 companies, 89 percent of respondents reported that organizational politics constituted the biggest obstacle to grid implementation. Platform's chief business architect Ian Baird says that users will be able to manage workloads across shared resources through grid middleware offerings from the likes of Platform and DataSynapse.
    Click Here to View Full Article

  • "The Tyranny of Copyright"
    New York Times Magazine (01/25/04) P. 40; Boynton, Robert S.

    Copyright holders are asking for tougher copyright laws to curb what they see as piracy of intellectual property encouraged by the spread of the Internet, but a growing protest movement--the so-called "Copy Left"--contends that such an approach is anathema to democratic freedoms and is strangling innovation, creativity, and cultural progress. Copy Left activists argue for a return to a more Jeffersonian copyright system in which individual creators are granted the exclusive right to profit from their intellectual property for a limited time, after which their works revert to the public domain. The advent of the Internet underscores two revisions to copyright law: The first, instituted almost 100 years ago, bans the "copying" of one's creation by others, while a 1976 amendment dictates that fixing anything in a tangible medium constitutes automatic copyrighting; together, these provisions make every work automatically shielded by copyright law, and make the online publication of any work tantamount to copying. According to the Copy Left's "Creative Commons" model, individual creators should be allowed to control the use and copying of their works as they see fit. Yale law professor Yochai Benkler believes following a cultural commons model is a better option for fostering creativity, and makes more economic sense: He makes the case that the telephone industry reaps far more profits than the recording industry because users prefer telecommunications as a medium for interaction, socializing, and work, whereas the recording industry merely disseminates content to people through tightly controlled channels and technology that permit little more than passive consumption. Copy Left opponents such as Columbia Law School professor Jane Ginsburg maintain that the Creative Commons model hamstrings authorship and the dues authors are rightfully owed, while Stanford Law School professor Paul Goldstein believes a micropayment licensing model is more efficient and democratic than the Creative Commons. William Fisher of Harvard Law School's Berkman Center offers a restructured business model in which all works that can be transmitted over the Internet are registered with a central office that tracks how frequently each work is used and remunerates artists on that basis; the system would ensure that creators are compensated while everyone can access every creative work without restriction.

  • "Know Thy Neighbor"
    New Scientist (01/17/04) Vol. 181, No. 2430, P. 32; Buchanan, Mark

    Virtually all networks appear to share the "six degrees of separation" architecture outlined by Harvard's Stanley Milgram, but the sometimes striking dissimilarities between laboratory models and real-world networks reveal insights critical to the development of practical network science applications, which could include a more searchable and navigable Internet and automated traffic-flow management. An extension of the "six degrees" theory is the "small worlds" model mapped out by Cornell researchers Duncan Watts and Steve Strogatz, who determined that adding a few random long-distance links to a grid of closely connected neighbors makes the network more navigable; follow-up research by Cornell computer scientist Jon Kleinberg found that a searchable network is generated when the grid has a greater number of shorter links than longer links. Watts believes the key to unraveling the mystery of a searchable society resides in the mechanics of social organization. In 2003, Watts, Peter Dodds of Columbia's Earth Institute, and Mark Newman of the Santa Fe Institute applied Kleinberg's observations to the navigation of social networks, and discovered that social groupings fulfill two functions: They identify people with group labels and are the jumping-off point for establishing social connections. In the same way that people in Milgram's experiment were able to determine the most efficient route through a social network by applying similarities between themselves and their intended target, search engines could use labels akin to social groupings to hit target Web pages faster. However, recent discoveries show that, in the real world, the small-world model breaks down if each network link has an associated "cost" that generates inequality. Boston University's Gene Stanley, Bar-Ilan University's Shlomo Havlin, and Lidia Braunstein of the National University of Mar del Plata learned that the lowest-cost path between two nodes remains close to the shortest path as long as the variations in the cost of the links are small, but the situation changes if these variations increase; variance in the cost of a network's links makes the network less searchable, which supports the case for maintaining link uniformity.
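
    The Watts-Strogatz effect is easy to reproduce numerically. The sketch below builds a ring in which each node knows only its nearest neighbors, then adds a handful of random long-range links and re-measures the average shortest-path length, which collapses; all parameters are arbitrary illustrative choices.

      import random
      from collections import deque

      def ring_graph(n: int, k: int = 2) -> dict:
          """Each node linked to its k nearest neighbors on each side."""
          return {i: {(i + d) % n for d in range(-k, k + 1) if d != 0}
                  for i in range(n)}

      def add_shortcuts(g: dict, count: int) -> None:
          """Add random long-distance links between node pairs."""
          for _ in range(count):
              a, b = random.sample(list(g), 2)
              g[a].add(b)
              g[b].add(a)

      def avg_path_length(g: dict, samples: int = 200) -> float:
          """Estimate the mean shortest-path length via BFS from random sources."""
          total = pairs = 0
          for _ in range(samples):
              src = random.choice(list(g))
              dist, queue = {src: 0}, deque([src])
              while queue:
                  u = queue.popleft()
                  for v in g[u]:
                      if v not in dist:
                          dist[v] = dist[u] + 1
                          queue.append(v)
              total += sum(dist.values())
              pairs += len(dist) - 1
          return total / pairs

      random.seed(1)
      g = ring_graph(500)
      print("ring only:     ", round(avg_path_length(g), 1))
      add_shortcuts(g, 25)   # a few random long-range links
      print("with shortcuts:", round(avg_path_length(g), 1))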

  • "The Code Warrior"
    Vanity Fair (01/04) No. 521, P. 94; Shnayerson, Michael

    F-Secure security specialist Mikko Hypponen characterizes 2003 as the worst year in virus history, and singles out the month of August as the nadir. August 2003 marked the emergence of Blaster, self-replicating malware that served as both a virus and a worm, and that only needed to be linked to the Internet to spread. Once that problem was seemingly resolved, the latest permutation of the SoBig email virus struck. SoBig is a program that has become more and more insidious with each new version, and its sophistication sends a clear message that its authors are highly organized and more than likely adults with far more sinister agendas than malcontented teenage hackers. Hypponen is convinced that the August SoBig variant was launched as a means for financial gain, and explains that the virus writers had the program install back doors in infected machines so they could send spam to people in the compromised systems' address books, in addition to the many addresses provided by the spammers. Most of SoBig's victims were home users, who were more apt to click on the virus-laden emails because they were being flooded with so many, according to Vincent Weafer of Symantec Security Response. Hypponen attributes the U.S. blackout of Aug. 14 to the Blaster worm, albeit indirectly: He postulates that though human error was the most probable direct cause, "the reason why they made those mistakes was that they weren't getting the right information from the sensors they were using to monitor the power grid," and Blaster was spreading through the same communications channels the sensors used. F-Secure founder Risto Siilasmaa says the blackout was clear evidence of how computers are becoming more and more deeply embedded in society's infrastructure, and the growing visibility of indirect damages caused by malware; he feels that virus writers will turn their attention to smart phones next.


