
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 651:  Wednesday, June 2, 2004

  • "Gaps Seen in 'Virtual Border' Security System"
    New York Times (05/31/04) P. C1; Markoff, John; Lichtblau, Eric

    Technologists and engineers are highly skeptical that the U.S.-Visit program, a government plan to establish a system that combines biometrics and databases to monitor foreigners entering the United States in real time, is viable. Critics such as RAND Corporation scientist Willis Ware note that other government agencies--the IRS, the FAA, and NASA among them--have had a poor track record in computerization projects involving unproved technology. The Department of Homeland Security intends to announce on June 1 which of three contractors bidding on the U.S.-Visit project will receive the contract, but critics argue these contractors have been given too much leeway for defining the technologies and capabilities of the U.S.-Visit system, which could cost up to $15 billion over 10 years. Furthermore, border control systems will be of little use against malicious individuals the government does not know about. Computer security experts also claim that even an effective U.S.-Visit system may not justify its expense: An October 2002 report on seven biometric technologies from the General Accounting Office concluded that the deployment of such a system could run as high as $2.9 billion, while annual operational costs could reach $1.5 billion. VAS International has also been protesting the use of biometrics in certain efforts, warning that past initiatives resulted in the generation of false positives and negatives; VAS President Marc E. Nolan said relying on facial recognition or any other single technique will fail because of a lack of standards. "To have one of them be the panacea for all of them isn't going to work," he contended. Critics are also doubtful that U.S.-Visit would be able to accommodate the massive volume of data generated by a real-time tracking system.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Spam's Assault Going Beyond Annoying E-Mail"
    Los Angeles Times (05/31/04) P. A1; Gaither, Chris

    Spam is relentlessly moving beyond desktop email and invading Web logs, cell phones, instant messaging, and Internet bulletin boards. Privacy Inc. co-founder Doug Peckover observes, "[Spam is] like water flowing down a hill--you try to block it, and it just flows elsewhere." Forrester Research estimates that about 50 percent of all European mobile phone users have received text-message ads, and critics warn that a similar trend will erupt in the United States when the Cellular Telecommunications and Internet Association compiles a directory covering about three-quarters of the U.S. mobile phone population. Although it is against the law in California to send unsolicited commercial text messages to mobile phones, most spammers exhibit a flagrant disregard for such legislation. Radicati Group expects instant-messaging spam, also known as "spim," to swell 300 percent this year to 1.2 billion messages, although Yankee Group analyst Paul Ritter maintains that spim accounts for only 5 percent to 8 percent of all corporate instant messages. Even more alarming is the proliferation of comment spam on Web pages, which spammers employ to trick search engines into giving their sites high rankings. Another source of consternation is the intrusion of spam onto message boards, in which spambots search the Web for sites that accept visitor postings and insert advertisements; one of the more egregious examples of this practice was a memorial site that was assaulted by spam. Marketers are trying out new tactics in the wake of federal mandates to curb telemarketing and email ads: "The direct-marketing industry is throwing the spaghetti up against the wall and seeing what sticks," comments American Teleservices Association executive director Tim Searcy.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Government Data-Mining Lives On"
    CNet (06/01/04); McCullagh, Declan

    A new report from the General Accounting Office (GAO) estimates that federal agencies are engaging in 199 data-mining programs, many of which bear an unsettling resemblance to the Total Information Awareness initiative killed by congressional mandate because of its privacy implications. Among the programs cited in the report are a quartet of Defense Intelligence Agency projects that involve sifting through data culled from Internet searches and the intelligence community to identify foreign terrorists or Americans linked to foreign terrorism; an Incident Data Mart from the Homeland Security Department for spotting potential terrorist activities by mining logs of any "event involving a law enforcement or government agency for which a log was created;" an FBI data mart assembled from Bureau records, those of other federal agencies, and public data sources to identify potential terrorists' illegal entry into the United States; and the Department of Education's Project Strikeback, which is set up to look for "anomalies" indicating "terrorist activities" by comparing names in the department's databases with FBI-furnished records. There is a positive side to government data mining: Among its potential or established benefits are the identification of credit card abuse by federal employees, improved computer network security, and the detection of tax evasion. However, civil liberties groups have warned that data mining is paving the way for a scenario in which every American citizen's activities, communications, and movements are registered and archived in computer databases. Rep. Ron Paul (R-Texas) says, "Government is by nature inefficient and harmful when it gets involved in these programs."
    Click Here to View Full Article

  • "Science Middleware Initiative Goes Open-Source"
    IDG News Service (05/31/04); Weil, Nancy

    The National Science Foundation's Middleware Initiative (NMI) is quickly picking up U.S. academic support, as well as interest from industry middleware groups and international partners. NMI recently made the fifth release of its software available for free download; the release incorporates a number of middleware projects begun not within NMI itself but at other academic Internet organizations such as the Internet2 community. Pennsylvania State University lead systems programmer Renee Shuey attributes the growing interest in NMI to the program's involvement of key academic middleware researchers, who have contributed technology such as the Shibboleth federated identity management environment; Shibboleth and other NMI components are used at Penn State to let physics students access online course material at partner North Carolina State University using their Penn State account and password. The 80,000-student university is also planning to use Shibboleth in its upcoming Napster university service that will be made available to students later this year. Developers contribute and use NMI components according to their different needs, and the project has served as a sort of clearinghouse for university middleware: Among the projects using NMI technology are the Network for Earthquake Engineering Simulation and the Biomedical Informatics Research Network. "What's fascinating is the emergence of distributed collaborations within and among scientific communities that, for many of them, are a huge change in their culture and the way they conduct their work," says NMI program officer Kevin Thompson. The program expands mostly by word of mouth, beginning with one researcher at an institution who serves as an evangelist to peers. Collaboration is ongoing between university middleware and grid programs and their corporate-world counterparts, such as the Liberty Alliance and WS-Security, says Internet2 middleware and security director Ken Klingenstein.
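
    The Penn State example captures the core idea of federated access: the home campus authenticates the student and vouches for that identity with a signed assertion, and the partner site trusts the signature instead of ever handling the student's password. The toy Python sketch below illustrates only that idea; it is not Shibboleth's actual SAML-based protocol, and the shared key, attribute names, and lifetimes are invented for illustration.

      # Toy federated sign-on: an identity provider (home campus) signs an assertion,
      # and a service provider (partner campus) verifies it without seeing a password.
      # Real Shibboleth exchanges SAML assertions with public-key signatures, not HMACs.
      import hashlib, hmac, json, time

      FEDERATION_KEY = b"shared out-of-band for this toy example only"

      def issue_assertion(user, affiliation):
          """Identity provider: runs only after its own password check has succeeded."""
          claim = json.dumps({"user": user, "affiliation": affiliation,
                              "issued": int(time.time())}).encode()
          signature = hmac.new(FEDERATION_KEY, claim, hashlib.sha256).hexdigest()
          return claim, signature

      def accept_assertion(claim, signature, max_age=300):
          """Service provider: trusts the issuer's signature instead of re-authenticating."""
          expected = hmac.new(FEDERATION_KEY, claim, hashlib.sha256).hexdigest()
          if not hmac.compare_digest(signature, expected):
              return None
          attributes = json.loads(claim)
          return attributes if time.time() - attributes["issued"] < max_age else None

      claim, signature = issue_assertion("physics_student", "member")  # at the home campus
      print(accept_assertion(claim, signature))                        # at the partner site
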
    Click Here to View Full Article

  • "Europe Braces for Patent Rules"
    Wired News (05/28/04); Andrews, Robert

    The European Parliament's upcoming vote on a software-patenting law recently passed by regional ministers could become a flashpoint for computer programmers concerned that its enactment will trigger U.S.-style patent wars that could establish multinational monopolies and hamstring the industry. Though amendments that declared general software ideas unpatentable were introduced into the law by Parliament members last year, those revisions were overruled by the European Council's decision last week. A second reading of the new Directive on the Patentability of Computer-Implemented Inventions will take place in Strasbourg, France, in September, and coders are rallying to pressure Parliament candidates to vote against the law, or be voted out of office. Richard Stallman, spiritual leader of the free software movement, characterized British European Parliament member Arlene McCarthy as a stooge for the multinationals, and accused her of lying to voters about her patent plans' benefits. McCarthy countered that critics were feeding "misinformation" about the law's implications to the public. "Patent protection is vital if we are to challenge the U.S. dominance in the software-inventions market," she asserted. "Good patent law for computer-implemented inventions will protect software-development companies and give them a return on their investment through license fees."
    Click Here to View Full Article

  • "Using Tech to Fix Elections"
    Linux Insider (05/27/04); Murphy, Paul

    Paul Murphy, author of "The Unix Guide to Defenestration," describes the electronic voting process as a quagmire of inexperienced and poorly educated voters and election officials, practically nonexistent audit trails, systems dominated by insecure Microsoft technologies, and easily corruptible vote-accumulation protocols. He argues that implementing e-voting should be regarded as "criminally stupid" given such concerns as the lack of vote verification mechanisms and public vote-counting procedures; the impossibility of conducting meaningful recounts in close elections; untrustworthy machines; and the inability to compare e-voting systems because vendors--with the exception of Diebold--have refused to make their systems available for analysis. Murphy contends that a successful e-voting system should consist of an eligibility verification process kept separate from the e-voting record, a voter-verifiable paper trail, and a fast, accurate, and manually verifiable vote-accumulation process that does not permit illegal access. Murphy describes how such a system--whose cost would be about 60 percent less than that of a Diebold or similar system--should be designed: It would be a touch-screen model with a smart Sun Ray display and a printer to produce a paper ballot, while ballots would be prepared beforehand as text files for automatic processing into the Web pages that voters view; there would be at least one voting station at each polling place, as well as at least one training station with a wall projector showing a video introduction to e-voting on a continuous loop; each smart display would be logged in by election officials using an assigned ID, and the paper ballots would be used for audits, recounts, and dispute resolution. If the technology could be set up and used in schools on nonelection days, Murphy postulates that polls could be consolidated within or near schools.
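
    One step in Murphy's design, turning pre-prepared ballot text files into the Web pages voters see, is easy to sketch. The Python fragment below is illustrative only: the ballot file format and the HTML layout are invented here, since the article does not specify them.

      # Toy ballot renderer: turn a plain-text ballot definition into the HTML page a
      # voting station would display.  Format and markup are invented for illustration.
      from html import escape

      BALLOT_TEXT = ("Governor: Candidate A, Candidate B, Candidate C\n"
                     "Ballot Measure 12: Yes, No\n")

      def render_ballot(text):
          parts = ["<form method='post' action='/cast'>"]
          for line in text.strip().splitlines():
              contest, choices = line.split(":", 1)
              name = escape(contest.strip())
              parts.append(f"<fieldset><legend>{name}</legend>")
              for choice in choices.split(","):
                  value = escape(choice.strip())
                  parts.append(f"<label><input type='radio' name='{name}' "
                               f"value='{value}'> {value}</label><br>")
              parts.append("</fieldset>")
          parts.append("<button type='submit'>Print paper ballot</button></form>")
          return "\n".join(parts)

      print(render_ballot(BALLOT_TEXT))   # the page a voting station would serve
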
    Click Here to View Full Article

    For information about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Giant Grid Discovers Largest Known Prime Number"
    IT Management (06/01/04); Gaudin, Sharon

    Researchers have discovered the largest known prime number, described as 2 to the 24,036,583rd power minus 1, using a grid of approximately 240,000 networked computers. The number, which features 7,235,733 decimal digits, is the 41st Mersenne prime to be discovered, and was found through the Great Internet Mersenne Prime Search (GIMPS) project created by retired computer programmer George Woltman. He explains that when the number of digits exceeds 20,000 or 30,000, the Mersenne algorithm becomes essential to finding large prime numbers; Woltman notes that the grid used in the GIMPS project, which encompasses virtually all time zones, has between 20,000 and 40,000 computers active at any one time. He calculates that the grid carries out 20 trillion operations per second, using the computing power of machines located at universities, businesses, and households. "[The project] serves as a benchmark for how fast computers are and how far distributed computers have come," Woltman declares. Each machine on the grid runs software freely available for downloading on the GIMPS Web site, and tests one or two numbers each month, depending on each system's computing power. Entropia's Scott Kurowski manages the main server with which the individual computers in the grid communicate. The 2.4 GHz Pentium 4 Windows XP machine that found the latest prime number was run by GIMPS volunteer Josh Findley, a consultant with the National Oceanic and Atmospheric Administration, and the discovery was confirmed on half of a Bull NovaScale 5000 HPC running Linux on 16 Itanium II 1.3 GHz CPUs, run by Tony Reix of Grenoble, France.
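
    GIMPS hunts for primes of the form 2^p - 1; for a prime exponent p greater than 2, the classic Lucas-Lehmer test decides whether the resulting Mersenne number is prime. A minimal Python sketch of that test follows; real GIMPS clients rely on heavily optimized FFT-based multiplication, so code like this is practical only for small exponents and is not the project's actual implementation.

      # Lucas-Lehmer test: for an odd prime p, 2**p - 1 is prime exactly when s == 0
      # after p - 2 iterations of s -> s*s - 2 (mod 2**p - 1), starting from s = 4.
      def is_mersenne_prime(p):
          m = (1 << p) - 1              # the Mersenne number 2**p - 1
          s = 4
          for _ in range(p - 2):
              s = (s * s - 2) % m       # GIMPS performs this squaring with FFT arithmetic
          return s == 0

      # Recovers the first few Mersenne-prime exponents: 3, 5, 7, 13, 17, 19, 31
      print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 29, 31) if is_mersenne_prime(p)])

    Checking an exponent the size of 24,036,583 with code like this would be hopelessly slow; the optimized, distributed client is what makes the search practical.
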
    Click Here to View Full Article

  • "PC In a Pocket Promises Big Performance"
    Financial Times (06/02/04) P. 9; Taylor, Paul

    The days of pocket-sized PCs are fast approaching as a handful of American startups ready new ultra-personal computers (UPCs) that can run the full version of Microsoft's Windows XP operating system and all of its accompanying applications; by contrast, the software used by personal digital assistants and other handhelds is far more limited. Antelope Technologies, OQO, and Vulcan have developed Transmeta chip-powered UPCs for business travelers and individual users who prefer not to be saddled with the weight of traditional notebook PCs. Antelope's modular Mobile Computer Core has been available since late 2003, while OQO's Elizabeth Bastiaanse says her company's UPC should make its market debut later this year at a cost of less than $2,000. The OQO device, which will be geared toward vertical markets such as health care professionals, mobile sales forces, and field engineers, features a 20 GB hard drive, a touch-sensitive screen, and an easy-to-use miniature QWERTY keyboard. Vulcan's UPC, dubbed the FlipStart, resembles a pocket-sized "clamshell" notebook with a thumb-operated keyboard and a flip-up display, and is equipped with a 30 GB hard drive. The development of the UPC model has been helped along by the shrinkage of electronic components as well as the advent of high-performance, low-power microprocessors. UPCs should be especially appealing to enterprises with large mobile workforces and to executives who spend much of their time on the road or in conference. Some doubt that the UPC model will be widely accepted, however, given how conservative companies tend to be about adopting new technology.
    Click Here to View Full Article

  • "Engineers Visualize Electric Memory as It Fades"
    Newswise (06/02/04)

    Researchers at the University of Wisconsin-Madison and Argonne National Laboratory have examined the atomic arrangement of ferroelectric memory, which can store data without power, to learn why it loses reliability over time, a problem known as fatigue. A better understanding of the structure of electronic memory would help in the design of better devices. It is ferroelectrics' atomic configuration that determines the material's ability to retain data, and the application of an electrical pulse to the material changes the data and the atomic structure. Each pulse causes ferroelectrics' storage and retrieval capabilities to gradually diminish until they either lose the data or cease working. "We'd like to understand how [the material] switches so we could build something that switches faster and lasts longer before it wears out," explains UW-Madison professor Paul Evans. The project involved the use of Argonne's Advanced Photon Source to track changes in the atoms' position to see how well the ferroelectrics in an operating device switched or recalled data. X-ray analysis determined that progressively larger areas of the device stopped functioning as the mechanism was subjected to continuous electrical pulses. Evans observes that at higher voltages the material loses the ability to switch because an essential element of the material has changed. Evans says, "Wouldn't it be nice to have a computer that doesn't forget what it's doing when you turn it off?"
    Click Here to View Full Article

  • "Who Tests Voting Machines?"
    New York Times (05/30/04) P. WK8

    The so-called independent testing of electronic voting machines that election officials rely on is in desperate need of reform and transparency, according to this editorial. Testing companies are in the pocket of voting machine manufacturers, which means that they often rush through e-voting system evaluations and do not disclose flaws for the sake of expediency; the New York Times maintains that these testing outfits should be replaced by government-funded and -coordinated labs. Independent testing labs are notoriously tight-lipped about their testing procedures, but the requirements for ensuring that e-voting machine software is bug-free cannot be fulfilled by current technology, according to experts such as Stanford University professor David Dill. Voting software can contain over 1 million lines of code, and each line would have to be carefully examined to root out any bugs, which Dill calls "an impossible task." Adding to the untrustworthiness of independent testing labs is their reliance on 2002 standards regarded as insufficient by computer experts: Harvard-affiliated computer scientist Rebecca Mercuri says there are no requirements to analyze any commercially available software employed by voting machines, even though it can harbor flaws that could compromise the integrity of the entire system. The editorial recommends that the Election Assistance Commission step up its efforts to establish government-run testing labs, deploy transparency and rigorous standards in the testing process, severely penalize voting machine companies and election officials for passing off uncertified equipment as certified, and require election officials to have a nonelectronic backup system available on Election Day. The most effective measure for securing elections, however, is the inclusion of a voter-verified paper trail.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

    For more on e-voting, visit http://www.acm.org/usacm.

  • "Software Helps Rights Groups Protect Data"
    SiliconValley.com (05/31/04); Schoenberger, Karl

    Benetech CEO Jim Fruchterman's open-source Martus software was spawned from a need to aid human-rights activists in abusive Third World countries by offering a sophisticated but simple data encryption program that could be used by people with minimal technology skills. There is no need for users to manually enter a cryptography key with Martus--simply entering their name and password is enough. The Martus prototype was introduced two years ago: Since then, the program has spread to at least 50 countries, and Fruchterman estimates that Benetech has distributed about 500 Martus program CDs while an additional 500 copies have been downloaded for free off of www.martus.com. Human-rights workers gave Martus a passing grade after the program was tested in Sri Lanka and deployed in training sessions in the Philippines. The Martus system features automatic and redundant data backup on dedicated servers in Toronto, Seattle, Budapest, and Manila. Over the past three years, the Martus project has received $1.5 million in grants from the likes of the MacArthur, Soros, and Ford foundations, while links between the project and non-governmental human-rights groups were forged by the Asia Foundation. Apart from the U.S. National Security Agency and other major code-cracking organizations, no one can decrypt Martus-encrypted files without the original user's cryptography key and password; however, Pretty Good Privacy code developer Phil Zimmermann notes that Martus could be thwarted by keystroke-sniffing programs such as spyware.
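
    Martus's key selling point is that users never handle cryptographic keys directly; a name and password suffice. The Python sketch below shows the general idea of deriving an encryption key from a passphrase. It is not Martus's actual key handling; it assumes the third-party "cryptography" package, and the salt handling and parameters are illustrative only.

      # Illustrative only: derive a symmetric key from a name and password, then
      # encrypt a bulletin with it.  Martus's real design differs; this shows the
      # user experience of "no key to manage" rather than its actual cryptography.
      import base64, hashlib, os
      from cryptography.fernet import Fernet    # pip install cryptography

      def key_from_credentials(name, password, salt):
          raw = hashlib.pbkdf2_hmac("sha256", (name + "\x00" + password).encode(),
                                    salt, 200_000)
          return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 32-byte key

      salt = os.urandom(16)                     # stored alongside the encrypted record
      key = key_from_credentials("field worker", "long memorable passphrase", salt)
      token = Fernet(key).encrypt(b"bulletin: witness statement ...")
      print(Fernet(key).decrypt(token))         # same name, password, and salt recover it
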
    Click Here to View Full Article

  • "Slow Going for Linux in Iraq"
    Wired News (05/28/04); Asaravala, Amit

    Extremely few Iraqis have experience with Linux, but the Iraqi Linux User Group (LUG) out of Baghdad University is trying to change that: Ashraf Tariq and Hasanen Nawfal, both engineering students at the school, launched the Iraqi LUG with the aim of "putting Linux on every server in Iraq," according to Nawfal. The Iraqi LUG has a long way to go, but it has begun at Baghdad University, where the group held two seminars and passed out 200 CDs loaded with Mandrake Linux. The group believes that while pirated software abounds today, government, commercial, and academic institutions in Iraq will eventually have to start paying for Windows and other proprietary software; the Iraqi LUG sees its job as advocating an open-source alternative before the country has to pay out millions in software licenses. As Iraqi interest in computing ramps up, the opportunity for Linux to make an early entrance is passing quickly. The U.S. Commerce Department, however, imposes restrictions on U.S.-developed Linux distributions that contain strong encryption, because officials fear the technology could be used for secret terrorist communication--but that argument is flawed: proprietary software with strong encryption, such as Microsoft Windows and Sun Solaris, has already been approved for export to Iraq without a license because those products are considered "mass market." Additionally, some Linux distributions do not come from the United States at all, such as Brazil's Conectiva and Germany's SuSE. Other Microsoft alternatives are also not well known: Minnesota Public Radio correspondent Adam Davidson recently gave a Slashdot.org interview from Iraq in which he said he had met only one Iraqi who knew of Apple Computer, while only a few of his acquaintances knew of Linux.
    Click Here to View Full Article

  • "Outsourcing Ax Falls Hard on Tech Workers"
    Los Angeles Times (05/30/04) P. A1; Vieth, Warren

    Technology workers held enviable positions in the U.S. job market several years ago, but now face layoffs due to their companies' offshore outsourcing programs. Offshoring is billed as a way to generate more valuable jobs in the United States, but in reality it is purely meant to increase profitability, says Washington Alliance of Technology Workers President Marcus Courtney. He notes that just five years ago, observers estimated that hundreds of thousands more IT workers would be needed in the United States; today, salaries in the technology industry have stagnated, and many who are laid off either cannot find work in the field or are forced to take less desirable positions. The situation has created new political and economic divisions, with many U.S. technology workers questioning the globalization policies that once affected only blue-collar jobs. Having laid-off IT workers train their foreign replacements has been an especially sensitive issue, with some critics calling the practice inhumane and members of Congress trying to limit such activities. At Agilent Technologies, the worldwide headcount has shrunk from 44,000 in 2001 to just 28,000 employees under what the company calls its Workforce Management Program; the firm recently hired India-based Satyam Computer Services to take over its IT infrastructure development and management, and asked some of the U.S. workers whose jobs were eliminated to train their replacements. Cliff Cotterill, a veteran of Hewlett-Packard and then Agilent, was let go because of the Satyam contract, but he holds Agilent's executive management more culpable than the Indian IT workers whom he helped train before leaving. "I've occasionally thought they should reopen the House Un-American Activities Commission and bring all the CEOs up to Congress," he says.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Technology Applications for the Health Sector"
    ITWeb (South Africa) (05/28/04)

    Telkom has launched a new South African Center of Excellence (CoE) at the University of the North near Polokwane to develop speech technology applications for the medical sector. Such applications will, for instance, allow patients to receive correct medication-dosage information over the telephone, said Telkom CTO Reuben September. The University of the North is the fourth academic institution to set up a Telkom CoE, following the University of the Western Cape, the University of Transkei, and the University of Zululand. Hewlett-Packard and Marpless are industry partners with the CoE, which is concentrating on automatic speech recognition and speech synthesis. "HP's commitment is about a transfer of knowledge and skill to tutors and students alike, as well as access to the technologies that enable an environment that is driven by services and is rich in content and available universally," said HP's Trevor Belcher. The AST Group at Stellenbosch and the group at the University of Pretoria are developing dictionaries and concomitant tools for automatic recognition of continuous speech in indigenous South African languages such as Sepedi, Setswana, Tshivenda, and Xitsonga. VoiceXML technology will allow researchers to build applications that convert text into speech without requiring an extensive understanding of the complexities involved. September was hopeful that the CoE program will allow commercial services to be rolled out in three to four years: "This will enable people to transact their business and interact with computers using only speech," he proclaimed.
    Click Here to View Full Article

  • "The Scope of Network Distributed Computing"
    InformIT (05/28/04); Goff, Max

    In the first chapter of his book "Network Distributed Computing: Fitscapes and Fallacies," author Max Goff provides an overview of numerous network distributed computing (NDC) areas, noting that nearly every aspect of NDC is interconnected with every other. Goff stresses that ubiquitous computing is needed to achieve the complete "ephemeralization" of all systems, be they monetary, electrical, or related to physical infrastructure; the author argues that a line must be drawn between pervasive and ubiquitous computing, with the former term representing the ubiquitous distribution of information and the latter representing the ubiquitous need for information. Goff reasons that component reuse and a composable NDC architecture could become reality through the adoption of Web Services and the Semantic Web, the latter of which is an effort to embed deeper meaning into Web content so that software agents can carry out complex user commands. Peer-to-peer computing is characterized as an organic approach to NDC software architecture, which could become the only kind of scheme that can adequately sift through the intricate relationships stemming from the widening scope and data distribution of the World Wide Web. Spaces computing, a concept pioneered by Yale University's David Gelernter, puts forward the idea of a Mirror World--a software architecture encompassing all real-world aspects that can be measured, tracked, transported, or numbered, and that facilitates sophisticated computing transactions such as the solving of complex problems by tapping into myriad CPUs. Goff recommends that NDC developers concentrating on collaborative computing gain as much knowledge about human beings as they can if they wish to architect effective groupware. Dependable, fault-tolerant systems are increasing in importance as the implications of network and system failures grow along with the global reliance on NDC, and Goff lists Wilfredo Torres-Pomales' checkpoint-restart and recovery-block techniques as approaches that could arguably yield more dependable systems. Other NDC research and development areas of note include cluster concepts, distributed storage, security, languages, distributed agents, middleware, distributed algorithms, network protocols, distributed databases, mobile and wireless computing, distributed file systems, massively parallel systems, real-time and embedded systems, operating systems, and distributed media.
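
    The recovery-block technique Goff cites is straightforward to express in code: run a primary routine, check its result with an acceptance test, and fall back to alternates when the test fails. The Python sketch below is a generic illustration of the pattern, not code from Torres-Pomales' work; the sample routines are invented.

      def recovery_block(acceptance_test, *alternates):
          """Try each alternate in order; return the first result that passes the test."""
          for routine in alternates:
              try:
                  result = routine()
              except Exception:
                  continue                        # a crash counts as a failed alternate
              if acceptance_test(result):
                  return result
          raise RuntimeError("all alternates failed the acceptance test")

      # Toy demonstration: a corner-cutting primary routine backed by a trusted one.
      data = [5, 3, 8, 1]
      print(recovery_block(
          lambda xs: xs == sorted(data),          # acceptance test: output really is sorted
          lambda: data[:2],                       # faulty primary alternate (fails the test)
          lambda: sorted(data),                   # dependable secondary alternate
      ))                                          # -> [1, 3, 5, 8]
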
    Click Here to View Full Article

  • "Gunning for Speed"
    Business Week (06/07/04) No. 3886, P. 132; Port, Otis; Tashiro, Hiroko

    The dislodging of U.S. supercomputers' ranking as the world's fastest machines by NEC's 41-teraflop Earth Simulator in Japan about two years ago served as a wake-up call to Washington, which realized that the United States was lagging behind other nations not just in the speed and efficiency of its supercomputers, but in their application to "grand challenge" problems in science. Before Earth Simulator's debut, American supercomputing efforts in computational biology were overtaken by Japan, whose Riken institute built a 78-teraflop machine for simulating protein function, while a 1-petaflop machine is planned for 2006. Smaller supercomputers have become a crucial ingredient in the improved design of products and their time-to-market--for example, such machines are employed by Japanese automobile manufacturers to run aerodynamic simulations to design quieter cars. Supercomputer design falls into two fundamental categories: Vector architecture consisting of "heavyweight" chips built to deal with complex scientific and mathematical problems, and scalar architecture in which commercial off-the-shelf (COTS) chips are clustered together. Cutting-edge research such as the kind carried out by Earth Simulator favors the vector approach, while most U.S. supercomputing efforts opt for the cluster scheme because COTS chips are cheaper, and offer incalculable "time-to-insight" advantages. Experts note that supercomputing clusters eliminate research teams' dependence on national supercomputing resources that may not be readily available, and can accelerate the time it takes to develop applications. A next-generation COTS/vector hybrid may emerge thanks to projects such as IBM's Blue Gene and Cray's Red Storm. Meanwhile, the Defense Advanced Research Projects Agency is backing a competition to develop a standard petaflop vector/scalar architecture to debut as a prototype by 2009.
    Click Here to View Full Article

  • "True Grid"
    CIO (05/15/04) Vol. 17, No. 15, P. 44; Lindquist, Christopher

    Enterprises are cutting information-processing time and making faster business decisions through grid computing, which is a less expensive option than a complete hardware upgrade; the end result for business users is faster responses, less time to market for new products, and reduced prices per unit of processing power. Among grid computing's biggest advantages are affordable scalability and the ability to more effectively leverage existing hardware resources. The utilization rate of CPUs in the average desktop PC at any one time is estimated to be less than 20 percent, while grid computing promises to raise that figure to 80 percent, or even 98 percent according to some claims. One of the challenges to mainstream acceptance of grid computing is overcoming a widespread fear that control will be lost, departmental budgets will be cut, and sensitive data will become vulnerable. Licensing and pricing are another hurdle, as per-CPU pricing becomes unworkable with a grid architecture: In the end, per-use price models could become prevalent once the tools for tracking such usage are fully developed. Meanwhile, grid vendors and customers both see the necessity for standardization, and a number of efforts are underway, including the Global Grid Forum, the Globus Alliance, and the Enterprise Grid Alliance. Standards currently under development include the Open Grid Services Architecture, the Open Grid Services Infrastructure, and the Web Services Resource Framework. For the moment, the most suitable grid applications are those that run identical or similar computations on massive numbers of data items, with no single calculation dependent on previous calculations; cryptography is one such application. Applications that depend more on data handling than on CPU power--enterprise resource planning, for example--do not yet complement grid computing very well.
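
    The article's closing point, that the best grid workloads run the same computation over many independent pieces of data, is easy to demonstrate in miniature. In the Python sketch below, the standard multiprocessing pool stands in for a grid scheduler; the prime-counting workload and chunk sizes are invented for illustration.

      # Embarrassingly parallel workload: every chunk is processed independently, so a
      # scheduler can farm chunks out to any idle machine.  multiprocessing.Pool stands
      # in for the grid middleware here.
      from multiprocessing import Pool

      def count_primes(chunk):
          """Independent unit of work: no shared state, no dependence on other chunks."""
          lo, hi = chunk
          def is_prime(n):
              return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
          return sum(1 for n in range(lo, hi) if is_prime(n))

      if __name__ == "__main__":
          chunks = [(i, i + 100_000) for i in range(0, 1_000_000, 100_000)]
          with Pool() as pool:                          # one worker per local CPU core
              counts = pool.map(count_primes, chunks)   # completion order does not matter
          print(sum(counts), "primes below 1,000,000")  # 78,498
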
    Click Here to View Full Article

  • "Is It Time to Make the Network Smarter?"
    Business Communications Review (05/04) Vol. 34, No. 5, P. 8; Sevcik, Peter

    Internet infrastructure needs to become more sophisticated in order to provide better reliability, security, and performance, writes networking expert Peter Sevcik: Although the idea of a dumb network served the Internet well during its nascent days by allowing greater innovation and easy access, it is now time to reassess old assumptions. Core systems should not be burdened with tasks such as deep packet inspection, but there can be changes to the network infrastructure edge that would greatly alleviate current problems. By consolidating some of the end-system intelligence at the network edge, ISPs would be able to better deal with spam, viruses, and denial-of-service attacks, for example. In addition, a smarter control plane is needed to better serve real-time Internet applications such as VoIP telephony, which use a number of simultaneous, coordinated network connections to function. Another reason to increase network infrastructure sophistication is to better provide for global enterprise networking, which currently depends on numerous local ISPs and higher-level ISPs at great distance. Traditional carriers have already begun discussing a second enterprise-level Internet that would have stricter conduct requirements but provide premium service for mission-critical applications; this secondary business-class network, dubbed the Infranet by Juniper, would replace current private networks. As more companies add edge network devices to bolster their use of the common Internet, carriers' revenue streams from private networks are threatened. Although the enterprise-level Internet would likely mean lower revenues for carriers than private networks, at least it would prevent a complete loss. The author writes that more public discussion is needed concerning a more sophisticated public Internet, and a solution must come soon lest network players take individual action.

  • "Innovators 2004"
    InfoWorld (05/24/04) Vol. 26, No. 21, P. 46; Connolly, P.J.; Rash, Wayne; McAllister, Neal

    Seventy-five percent of technology developers singled out by InfoWorld for making contributions that have led to significant improvements in our daily lives focus on security, XML, or Web services solutions. Dr. Dan Boneh of Stanford University and Dr. Matt Franklin of the University of California, Davis, realized a simple way to encrypt email by developing an algorithm that converts a simple identity string into a public-private key pair, a scheme that allows users to encrypt email without pre-registering. Lumeta chief scientist Bill Cheswick is mentioned for his work with IP Sonar, a tool that checks for network leaks, and Bram Cohen is noted for creating the BitTorrent peer-to-peer file-transfer client that has drawn praise for shifting file hosting duties from the server to the client. Miguel de Icaza, a driving force behind the Gnome project, Ximian, and Mono, has worked tirelessly to bridge the usability chasm between proprietary workstation systems such as Windows and open-source systems such as Unix and Linux; Micah Dubinko of Verity is focusing on converting inflexible, presentation-centered Web-based forms into tools that are adaptable across an array of devices through XForms. Sana Security chief scientist Steven Hofmeyr bucks traditional IT security strategies by modeling Web security systems such as Primary Response on the human immune system, an approach that eliminates the need for frequent IT updates, maintenance, and patch implementations. IBM's Norman Rohrer made critical contributions to the PowerPC 970FX chip, a device that is more energy-efficient while simultaneously capable of faster processing speeds; StorageTek senior fellow James Hughes is lauded for developing P1619 technology, which uses Advanced Encryption Standard to keep storage data secure while allowing companies to share networks with people whom they do not necessarily trust. Sourcefire's Martin Roesch is cited for Snort, a pet project that became the industry standard for intrusion detection, and Real-time Network Awareness (RNA), a more advanced tool for "passive" network security in which the network tells administrators about itself and then incorporates that information into an intrusion detection system.
    Click Here to View Full Article


 