
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 689:  Wednesday, September 1, 2004

  • "Reining in Tech"
    San Francisco Chronicle (08/30/04) P. C1; Evangelista, Benny

    The entertainment industry learned its lesson from Napster and is now going after threatening technologies before they are even made into real products: The recording industry is asking the FCC to bar makers of high-definition digital radio (HD radio) receivers from installing automatic recording technology, while the movie industry and the National Football League (NFL) made a similar FCC request concerning TiVo's planned service that would let subscribers view recorded shows over the Internet, although that request was denied. Electronic Frontier Foundation attorney Fred von Lohmann says this type of pre-emptive attack stifles technology development and could be used against any emerging technology. However, the Motion Picture Association of America's Fritz Attaway describes the content industry's efforts not as stopping technology, but as ensuring that "technology can provide a secure environment for our content." A recent federal appeals court ruling that cleared file-sharing software firms of aiding copyright infringement acknowledged that the pace of technological advance has left the entertainment industry with far less time to react to new developments. Napster was shut down by court order within a couple of years, but by then it had spawned an entire industry and underground culture; recording industry executives fear HD radio technology could further open up content piracy because users might be able to set devices to automatically record certain songs when broadcast. The music firms say they want HD radio devices limited to the kind of manual recording capabilities available now with standard radio. Although HD radio is just starting to take off, new digital radios in the United Kingdom include flash memory cards that let users save the previous 10 to 30 minutes of broadcast. The Internet TiVo service was possibly saved by the inclusion of authorization devices that would ensure only subscribers could access content, though the NFL said that scheme would still not prevent users from watching local games that would otherwise be blacked out.
    Click Here to View Full Article

  • "Battle of the Ballot Heats Up"
    MSNBC (08/30/04); Boyle, Alan

    Voting watchdogs will carefully monitor elections between now and Nov. 2 to avoid the pitfalls that dogged the 2000 presidential election, but ElectionLine.org director Doug Chapin is convinced that the 2004 race will be just as quarrelsome, simply because so many more people are focused on the voting issue. Comparing the 2004 election season to a forest-fire season, Chapin says, "The woods really aren't any drier in 2004 than they were four years ago, but more people have matches." The Florida primary scheduled for this week will be scrutinized heavily, thanks to the state's notorious voting controversies and their role in the 2000 fiasco. Nevada's Sept. 7 primary, meanwhile, will be the first real test of electronic voting systems equipped with voter-verifiable paper trails, seen by many as a solution to e-voting problems such as outages and bugs that have led to switched or lost ballots. The Verified Voting Foundation has recruited over 1,300 computer experts to observe voting machine testing and help ensure that bugs are spotted before ballots are lost. Michael Alvarez, a California Institute of Technology professor and co-director of the Caltech-MIT Voting Technology Project, expects voting-technology challenges in practically every battleground state by Nov. 2; he and his Caltech-MIT associates will release a series of recommendations for voters and election officials in a few weeks, and the researchers will also compare the numerous voting technologies in use on Election Day. Experts advise voters not to wait until Election Day to make sure their ballots will be properly counted: Correct registration is a must, and Alvarez notes that absentee voting by mail is not infallible, since many absentee ballots are disqualified because of simple errors such as missing voter signatures, incorrect addresses, or ballots mailed too late.
    Click Here to View Full Article

    For more information on ACM's proposed statement on e-voting and current results of the member opinion poll, visit https://campus.acm.org/polls/.

  • "Bit by Bit Computers Correct Us"
    Philadelphia Inquirer (08/30/04) P. D1; Flam, Faye

    Essays and tests are being graded by computer programs that use artificial-intelligence pattern-recognition techniques, although experts caution that such programs can miss the finer points of writing, such as relevance and context. Designers and advocates counter that although judging creative writing and poetry may be beyond the programs' abilities, they are still valuable assessment tools as long as students make a good-faith effort; Harry Barfoot of Vantage Learning, a provider of computerized essay scoring, notes that scores on standardized writing tests improved significantly in schools that employed such programs. E-rater, an essay-grading program from the Educational Testing Service (ETS), is "trained" to recognize patterns that indicate language structure and elements associated with both good and bad writing by being fed countless essays and newspaper articles. The program is used to score the essay portion of the Graduate Management Admission Test, which was originally graded by two human assessors while a third stood by to evaluate the essay if the first two graders' scores differed by more than one point. ETS executive director Richard Schwartz explains that at most two human judges are used now, with the second brought in only if the first judge's score and E-rater's differ significantly. However, Andy Jones of the University of California at Davis showed that such programs can be tricked into giving high scores to nonsensical material as long as the writer employs many of the kinds of words and phrases used in good essays. MIT research scientist Henry Lieberman points to grading programs' lack of common sense as the reason they are such poor judges of essay content, although he and other researchers acknowledge that computer intelligence is improving. In addition, Barfoot remarks that students favor grading programs because they offer nearly instantaneous feedback.
    Click Here to View Full Article
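
    Jones' trick works because such scorers reward surface features of text rather than meaning. The toy scorer below (a sketch with invented features and weights, not E-rater's actual model) makes the failure mode concrete: a nonsense passage stuffed with transition words can outscore a coherent one.

```python
# Toy feature-based essay scorer (invented features and weights, NOT
# E-rater). Surface features alone cannot judge relevance, which is why
# nonsense built from "good essay" vocabulary can earn a high score.
import re

CUE_WORDS = {"therefore", "moreover", "consequently", "furthermore", "however"}

def score_essay(text: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    avg_sentence_len = len(words) / max(len(sentences), 1)
    vocab_richness = len(set(words)) / len(words)        # type-token ratio
    cue_density = sum(w in CUE_WORDS for w in words) / len(words)
    # Invented weights standing in for what a trained model would learn.
    return 2.0 * vocab_richness + 0.1 * avg_sentence_len + 30.0 * cue_density

coherent = "The experiment failed. However, the data suggested a new method."
nonsense = "Moreover, turnips adjudicate. Therefore, clouds spreadsheet furthermore."
print(score_essay(coherent))   # ~5.3
print(score_essay(nonsense))   # ~15.2 -- the gibberish scores higher
```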

  • "Computers: Scientific Friend or Foe?"
    USA Today (08/31/04) P. 6D; Vergano, Dan

    The growing presence of computers and technology is opening up new avenues of scientific research while simultaneously raising the risk of both intentional and unintentional scientific misconduct or fraud. The National Cancer Institute's John Weinstein notes that scientists have become highly dependent on programs such as the Excel spreadsheet, which a June BMC Bioinformatics report found silently converts some gene names into dates, making proper genomic data retrieval or analysis impossible. Other notable cases that reflect similar concerns include a January computer glitch that disrupted communications between the Mars rover Spirit and NASA mission controllers until the probe's flash memory was updated; a July warning from the editors of the Journal of Cell Biology that popular photo-manipulation programs used to clean up digital images can falsify data; and a 2003 report on the space shuttle Columbia tragedy that rebuked NASA scientists' practice of glossing over critical analyses with PowerPoint presentations. "What we're seeing here are surface issues that point to deeper problems with the way we use existing software to present and handle data," explains Leo Herbette of software development firm Exploria. Illustrating critics' argument that current computing technologies may inhibit the exploration of big problems, he notes that programs presenting data one slide or image at a time prevent researchers from making connections between data produced by different experiments. Meanwhile, John Krueger of the Public Health Service's Office of Research Integrity posits that a generation gap between less computer-savvy senior researchers and younger, more technologically adept researchers has led to inadequate mentoring in acceptable data management and presentation. However, he points out that growing computational capability has also made it easier to spot research fraud.
    Click Here to View Full Article
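
    The gene-name hazard is easy to reproduce: symbols such as SEPT2 or DEC1 look like dates to a spreadsheet's type inference. A minimal guard might flag at-risk symbols before the data ever reaches a spreadsheet (the month-prefix heuristic below is simplified; real pipelines should export such columns as text):

```python
# Flag gene symbols a spreadsheet may silently reinterpret as dates,
# the hazard described in the BMC Bioinformatics report. SEPT2 and DEC1
# are real affected symbols; the heuristic itself is a simplification.
import re

MONTHS = {"JAN", "FEB", "MAR", "MARCH", "APR", "MAY", "JUN", "JUNE",
          "JUL", "JULY", "AUG", "SEP", "SEPT", "OCT", "NOV", "DEC"}
DATE_LIKE = re.compile(r"^([A-Z]+)(\d{1,2})$")

def at_risk(symbol: str) -> bool:
    """True if a spreadsheet is likely to coerce this gene symbol to a date."""
    m = DATE_LIKE.match(symbol.upper())
    return bool(m) and m.group(1) in MONTHS

for gene in ["SEPT2", "DEC1", "TP53", "BRCA1"]:
    if at_risk(gene):
        print(f"{gene}: format the cell as text before import")
```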

  • "Humanising the Internet With Virtual Companions"
    IST Results (08/30/04)

    France-based La Cantoche Production has earned a nomination for this year's IST Prize for its Living Actor technology, which eases the creation and deployment of 3D interactive characters that enhance the online experience. The software, based on high-performance compression algorithms and a user-friendly, scenario-based application concept, allows the characters to travel across the computer screen independently of other software applications while understanding users' keyboard, mouse, or voice commands. Living Actor characters also display human-like behavior and are multilingual. In IST's AsAnAngel project, Cantoche helped deploy a conversational and customizable virtual assistant that supports Internet navigation by Web and cell phone users. Cantoche also plays a role in the IST project Humaine, whose goal is to pave the way for "emotion-oriented" systems that can record, simulate, and shape human emotional states and processes: Cantoche is readying tools and virtual characters that will be used to research the parts emotions play in persuasion and communication. "These two projects have helped us to develop strong relationships with key universities and laboratories in order to share knowledge and develop new technology," notes Cantoche co-founder Benoit Morel. "Involvement is also important for our clients as it provides recognition of our expertise."
    Click Here to View Full Article

  • "Fab Labs Bring "Personal Fabrication" to People Around the World"
    Newswise (08/31/04)

    MIT's Center for Bits and Atoms (CBA) is deploying "fab labs" so that people around the world can avail themselves of the technology to fabricate practically anything from cheap and widely available materials in order to tackle local challenges. "Instead of bringing information technology to the masses, the fab labs bring information technology development to the masses," explains CBA director Neil Gershenfeld. "For our education and outreach efforts, rather than telling people about what we're doing, we thought we'd help them do it themselves." The fab lab concept stems from CBA research on the "personal fabricator," which is defined as a machine that can manufacture any device, itself included. CBA is using a $13.75 million Information Technology Research award from the National Science Foundation to probe the convergence of computer science and physical science. CBA's latest fab lab was deployed on the campus of Ghana's Takoradi Technical Institute in July and August: Students have used the facility to design and fabricate fluorescent pink key chains with modern manufacturing tools, and the lab is also focusing on projects such as solar-powered cooking, cooling, and cutting machinery, as well as antennas and radios for wireless networks. There are several fab labs in India committed to the development of agricultural technology and 3D scanning and printing for rural craftsmen, while one in Norway above the Arctic Circle is working on wireless networks and animal radio collars as assistive technologies for nomadic herding. All fab labs are outfitted with open-source computer-aided design and manufacturing software, computer-controlled fabrication tools, laser cutters for 2D and 3D structures, a 3D precision milling machine, and a sign cutter for plotting interconnects and electromagnetics.
    Click Here to View Full Article

  • "University Challenge"
    Sydney Morning Herald (08/31/04); Head, Beverley

    Universities are constantly monitoring industry and technology trends and struggling to keep their curricula up to date, according to experts in Australia. Murdoch University professor Duane Varan travels to the United States three times a year to check up on the latest technologies pertinent to his interactive TV research institute, and finds a completely different landscape each time; to cope, he says, universities need to find industry partners with whom they can collaborate. The Australian government has created the IT Skills Hub to maintain closer connections between higher education and the IT industry, while the Australian Information Industry Association works with technical schools to keep curricula relevant to industry needs. Monash University IT faculty dean Ron Weber, whose department is one of the largest university IT groups in the world, says a balance is needed between practical skills and fundamental technical theory. Changing a curriculum wholesale can be expensive because it requires maintaining two parallel tracks while students already in the pipeline finish their studies; a more practical approach is to change the curriculum incrementally, which also permits continuous updating. Other experts note that cross-discipline education is another challenge for universities, but one that has already spawned numerous derivative courses: At Swinburne University of Technology, for example, students can choose among computing, computer science, multimedia design, Internet and communications technologies, and information systems. Even as courses proliferate, enrollment in university IT courses in Australia has dropped, falling 2 percent in 2003, and applications for IT degrees have fallen more than 25 percent in the last year, according to IT Skills Hub chief executive Brian Donovan, who worries that universities will lower their entrance requirements.
    Click Here to View Full Article

  • "The Socialization of Collaboration"
    Computerworld (08/25/04); Greif, Irene

    Technologies such as email, instant messaging, and Web conferencing have played major roles in accelerating the pace of business, increasing competitiveness, and transforming communications, writes IBM Fellow Irene Greif. She observes that email and instant messaging have reduced individuals' reliance on social hierarchies: Email bolsters the informal social connections among individuals that cultivate solid decision-making and innovation, while instant messaging has become an essential work tool as well. Portal and workflow technologies are more attuned to the top-down flow of corporate messages and processes, while properly employed team software can significantly increase productivity. Greif explains that an increase in online communication can have negative side effects for both IT departments and knowledge workers, but these can be mitigated by blending the individual, corporate, and team viewpoints as a criterion for strategic IT investment, and by using attention- and activity-based management tools to integrate services based on email, portal, and team technologies. The author cites IBM Research attention-management prototypes, such as a technology that visually identifies email messages that are from the same sender or that are part of the same thread so that users can more easily find the information they require and react accordingly. Accompanying the advent of new communications media is a dramatic increase in the volume of data, making the management of corporate knowledge documents a complex challenge; research has found that people locate attachments more easily in email than in file systems. Team spaces take data out of private email and into a common area, increasing the likelihood that it will be suitably managed by IT personnel, although employees will use team spaces regularly only if they are as simple to use as email or deliver clear benefits to individuals or teams.
    Click Here to View Full Article
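
    The article does not detail how the IBM prototypes group related mail, but one standard ingredient is threading via the In-Reply-To header defined for Internet mail. A minimal sketch (the message records and helper below are illustrative, and it assumes a parent message arrives before its replies):

```python
# Rough sketch of grouping mail into threads using the standard
# In-Reply-To header -- illustrative only, not IBM's prototype.
from collections import defaultdict

def thread_groups(messages):
    """messages: list of dicts with 'id', optional 'in_reply_to', 'sender'."""
    root_of = {}                       # message id -> thread root id
    def find_root(mid):
        while mid in root_of and root_of[mid] != mid:
            mid = root_of[mid]
        return mid
    threads = defaultdict(list)
    for msg in messages:
        parent = msg.get("in_reply_to")
        root = find_root(parent) if parent else msg["id"]
        root_of[msg["id"]] = root
        threads[root].append(msg["id"])
    return dict(threads)

mail = [
    {"id": "a1", "sender": "alice"},
    {"id": "b1", "in_reply_to": "a1", "sender": "bob"},
    {"id": "a2", "in_reply_to": "b1", "sender": "alice"},
]
print(thread_groups(mail))   # {'a1': ['a1', 'b1', 'a2']}
```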

  • "Is Encryption Doomed?"
    Technology Review (09/01/04); Garfinkel, Simson

    Scientists at the Crypto conference last month announced flaws in the MD5 algorithm, the latest round in a seemingly perpetual cycle of developing and breaking codes. But researchers are also looking at a 30-year-old mathematical challenge that could conceivably end this cycle by undermining the fundamental basis for encryption. "Polynomial," or P, problems are those that computers can solve efficiently, in time that grows only polynomially with the size of the input; "nondeterministic polynomial," or NP, problems are those whose proposed solutions can be checked quickly but that appear very difficult to solve, and they include foundations for encryption such as factoring large numbers. Researchers are attempting to prove that solving all NP problems by polynomial means is impossible, essentially trying to answer a question posed by mathematicians Stephen Cook and Leonid Levin in 1971: whether P is equal or not equal to NP. Cook and Levin established the existence of a large number of NP-"complete" problems, problems that capture the full difficulty of the class. If a polynomial-time solution to one NP-complete problem could be found, it would be relatively simple to extrapolate the method and solve every NP problem; this question was the basis of a one-ounce-of-gold wager between RSA co-developer Len Adleman and Michael Sipser, the new head of the MIT Mathematics Department. Though the bet was made when both researchers were graduate students at the University of California, Berkeley, Adleman says mathematicians may not be much closer to finding an answer to the P vs. NP question; any breakthrough would require completely new ideas and will not come by incremental advances. "From my perspective, we are no nearer to solving the problem now than we were when bell-bottom pants were cool," Adleman says.
    Click Here to View Full Article
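
    Stated precisely (these are the standard textbook definitions that the article paraphrases):

```latex
\[
\mathrm{P} = \{\, L \mid L \text{ is decidable in time } O(n^{k}) \text{ for some constant } k \,\},
\qquad
\mathrm{NP} = \{\, L \mid \text{candidate solutions for } L \text{ are verifiable in polynomial time} \,\}.
\]
A problem $C$ is NP-complete if $C \in \mathrm{NP}$ and every $L \in \mathrm{NP}$
is polynomial-time reducible to $C$; hence a polynomial-time algorithm for any
one NP-complete problem would yield $\mathrm{P} = \mathrm{NP}$ and collapse the
hardness assumptions behind much of modern cryptography.
```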

    Leonard Adleman, Ronald Rivest, and Adi Shamir won ACM's acclaimed A.M. Turing Award in 2002 for their contributions to public-key cryptography. While at MIT in 1977, the team developed the RSA code, which has become the foundation for an entire generation of technology security products. Click on http://www.acm.org/announcements/turing_2002.html for more details.

  • "Distributive Computing Spreads Out"
    Alameda Times-Star (08/30/04); Fischer, Douglas

    The Berkeley Open Infrastructure for Network Computing (BOINC), launched by UC Berkeley researchers, is a software platform that lets people divide their PCs' idle computing power among as many distributed-computing projects as they wish, ranging from the search for intelligent extraterrestrial life to new drug development, meteorological simulation, and the hunt for the largest known prime number. Distributed-computing projects combine the spare computing power of numerous PCs into a "virtual" processing center that can outrace even the mightiest supercomputer, and BOINC creator David Anderson says BOINC manages the problem of surplus power by re-routing participants to other projects and letting users control the amount of computing time dedicated to each one. "If we take this idea of using home PCs to do science out of the techie science-fiction corner and into the mainstream...it becomes essentially the norm--you buy a home PC and this is what most people do," he explains. BOINC's shortcomings include a Web site that is less user-friendly than popular sites such as America Online, and a stable of only three distributed-computing programs, although more are slated for development. Sun Microsystems engineer Kirk Pearson recommends that newcomers start with a specific program before moving on to BOINC. Among the projects that could benefit from a program such as BOINC is Climateprediction.net, a climate-change modeling initiative with 65,000 participants running meteorological simulations of 45-year periods; project coordinator David Frame reports that a single simulation takes approximately a month to run on a modern PC.
    Click Here to View Full Article
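
    The resource-share idea is simple to sketch: give each project a weight, track the processor time delivered so far, and hand the next slice of idle time to the project furthest behind its target share. The scheme below is an illustration of proportional sharing, not BOINC's actual scheduler, and the project names and numbers are invented.

```python
# Toy resource-share scheduler in the spirit of BOINC's user-controlled
# project weights (illustrative only). Each project is owed idle CPU time
# in proportion to its share; work goes to whichever is furthest behind.

def pick_next_project(shares, time_used):
    """shares: project -> weight; time_used: project -> seconds so far."""
    total_share = sum(shares.values())
    total_time = sum(time_used.values()) or 1.0
    def deficit(p):
        target = shares[p] / total_share      # fraction of time owed to p
        actual = time_used[p] / total_time    # fraction delivered so far
        return target - actual
    return max(shares, key=deficit)

shares = {"seti": 50, "climate": 30, "primes": 20}
time_used = {"seti": 600.0, "climate": 100.0, "primes": 100.0}
print(pick_next_project(shares, time_used))   # "climate" is furthest behind
```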

  • "Interview: Web Standards Need Support"
    IT Week (08/25/04); Neal, David

    Ilog head of products Jean-Francois Abramatic, recently reappointed to the World Wide Web Consortium's (W3C) advisory board, reports that the Internet has changed significantly since the W3C's formation in 1994: Whereas 10 years ago the Web was relatively small and had only a few major specifications such as HTML and HTTP, today its ubiquity has wide-ranging societal implications, and elements such as XML have spawned whole families of specs. Abramatic remarks that a standards body must establish its pertinence, and notes that almost from the beginning the consortium was given the job of guiding the industry on which technologies to focus on. "We may not get all the community [to back particular efforts] but enough to generate significant momentum," he says. Abramatic says the W3C furnished tutorials and best-practice guidelines for Web accessibility so that the community could be well educated on the issue. "Because more content goes on the Web every day, our efforts with the WAI [Web Accessibility Initiative] must be permanent and amplified," he asserts, adding that accessibility should be a priority in technological design. During his tenure as W3C chairman between 1996 and 2001, Abramatic learned that organizations such as ICANN and the Internet Engineering Task Force should not be tasked with enforcing accessibility compliance standards. He observes that designing standards takes time because everyone involved must be allowed to contribute their own input in order to build a deployable standard, and concludes that a key element is trusting the members of the design team.
    Click Here to View Full Article

  • "Computers Add Sophistication, But Don't Resolve Climate Debate"
    New York Times (08/31/04) P. D3; Revkin, Andrew C.

    There are many competing computerized climate models that use different techniques, but Ronald J. Stouffer of the Geophysical Fluid Dynamics Laboratory says these models point to the same conclusion: that world temperatures are rising sharply, owing primarily to human influences such as the emission of greenhouse gases from fossil fuels. These findings were endorsed by an update on federal climate research released by the White House last week, although both environmentalists and industry-backed organizations found fault with the report. The former criticized the Bush administration for refusing to require cuts in emissions while simultaneously acknowledging the global-warming problem and its causes, while the latter contended that the simulations behind the findings were erroneous. Some climate researchers charge that modelers are constructing complex models of even more complicated real-world phenomena involving many climate-shaping factors that are still not scientifically understood. However, Dr. Gerald A. Meehl of the National Center for Atmospheric Research reports that computer-based climate simulations have become more powerful and sophisticated over the last several years and more closely mimic actual meteorological processes. Still, there is little disagreement among those working in the field that the climate models need further improvement: They explain that it is critical to gain more insight into aerosol emissions by incorporating their unique properties into simulations to more accurately model their ability to raise or lower temperatures. Climate-modeling critic and MIT meteorologist Dr. Richard Lindzen says modelers plug in values so that their simulations will show dramatic future warming trends not because such results are probable, but because they are desirable.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "The Downloading of the President '04"
    Salon.com (08/24/04); Manjoo, Farhad

    Electronic voting machines are drawing fire for their vulnerability to tampering, susceptibility to bugs, and lack of an audit trail. But of more concern to people such as former ACLU Miami president Lida Rodriguez-Taseff is whether solid e-voting procedures and practices will be in place by the time the presidential election rolls around to prevent errors attributable to late polling-place openings, poorly trained election officials, uninformed voters, and election laws that prevent the rigorous examination of machines to track down the causes of malfunctions. She argues that the Republican and Democratic parties are to blame for this state of affairs through their failure to properly educate the voting public on what to expect from e-voting, although other voting experts blast activists' skepticism toward such systems over the last several years. There is some legitimacy to the claim that this criticism is causing voters to lose confidence in the security and reliability of elections, and MIT professor Ted Selker advocates parallel testing and other thorough procedures to enhance the security of paperless voting systems and reassure voters. Absentee voting, which the Democratic Party has suggested as an alternative to on-site poll voting, is disparaged by both Selker and Rodriguez-Taseff as even more prone to abuse and error than e-voting; the GOP, meanwhile, wholeheartedly supports e-voting. For all the backlash it has drawn, e-voting criticism has produced some positive results: E-voting system manufacturers notorious for poor product security have been harshly audited, and some companies and election officials have implemented security refinements. Activism has also spurred politicians to push for reforms, such as Nevada Secretary of State Dean Heller's mandate that all e-voting systems in the state furnish voter-verifiable paper trails for the presidential election.
    Click Here to View Full Article

    For more information on ACM's proposed statement on e-voting and current results of the member opinion poll, visit https://campus.acm.org/polls/.

  • "Speech Applications Voice New Strengths"
    eWeek (08/30/04) Vol. 21, No. 35, P. 37; Caton, Michael

    Organizations evaluating interactive voice response (IVR) systems must choose between Voice XML (VXML) products and products based on Microsoft's Speech Application Language Tags (SALT) as the basis for their speech-application platforms; both bundle speech recognition, dual-tone multifrequency (DTMF), and text-to-speech technologies into a single application design-and-development model. Both platforms keep the speech interface separate from business logic and data; determine how an application handles user interaction by guiding voice input through grammars and telephone keypad input via DTMF; and define the application's flow and the validation of speech or touch-tone input using Web-style page-building methods. For this last capability, VXML defines its own XML document format, while SALT embeds its tags in existing Web markup. Architecturally, SALT diverges from VXML in defining a way to structure multimodal applications, and the close association between SALT and Microsoft developer tools such as Visual Studio .Net means that anyone versed in Visual Studio development languages can build a speech application. Meanwhile, the majority of traditional IVR and telephony application providers support VXML, which widens companies' options for platforms and development languages. The World Wide Web Consortium is subjecting the latest iteration of VXML, Version 2.1, to final review; Version 2.1 is expected to add capabilities previously beyond the standard's purview so that developers can build stronger applications with more effective exception handling. VXML is considered more effective for enterprises seeking to build richer applications more deeply rooted in underlying corporate applications and data, while SALT lets organizations exploit commonplace programming skills and build IVR applications whose scope extends beyond telephony.
    Click Here to View Full Article

  • "In Search of Better Video Search"
    InformationWeek (08/30/04) No. 1003, P. 44; Ricadela, Aaron

    TV news broadcasters, PC users, and intelligence analysts are just some of the people who could benefit from video search technologies being developed by IBM, Microsoft, and academic researchers. IBM recently demonstrated Marvel, a prototype computer system that employs statistical methods to establish relationships between video footage elements such as color, sound, shapes, and patterns, and label the footage appropriately so that users can find individual shots. Project coordinator John Smith's intelligent information management group has authored algorithms that can recognize 140 concepts culled from an archive of ABC News and CNN broadcasts held by the University of Pennsylvania, and the team is planning a November report on their work to boost Marvel's accuracy by combining concepts. Smith's group is also collaborating with Columbia University's digital video multimedia lab to integrate computer vision and image understanding with machine learning strategies so that news footage from American and foreign broadcasters can be mined for related topics. Columbia University electrical engineering professor Shih-Fu Chang observes that video searching is complicated by video's lack of textual or graphical structure. Meanwhile, Carnegie Mellon University's Informedia initiative is developing a technique to answer video search queries by evaluating elements such as shapes, text, and colors. And Microsoft Research has developed a system that enables users to view all relevant shots in a home movie by moving their mouse pointers over objects they wish to see.
    Click Here to View Full Article
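
    The article gives no internals for Marvel, but the flavor of statistical concept labeling can be sketched as nearest-prototype matching over shot features. In the toy below, the three features and both concept prototypes are invented for illustration; real systems learn far richer representations.

```python
# Toy concept labeler: score a shot's feature vector against concept
# prototypes by cosine similarity. Purely illustrative -- not Marvel.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Invented 3-dim features: (dominant green, crowd noise, fast motion)
prototypes = {
    "sports_field": (0.9, 0.8, 0.7),
    "news_studio":  (0.1, 0.0, 0.05),
}

def label_shot(features, threshold=0.8):
    scored = {c: cosine(features, p) for c, p in prototypes.items()}
    return [c for c, s in sorted(scored.items(), key=lambda kv: -kv[1])
            if s >= threshold]

print(label_shot((0.8, 0.9, 0.6)))   # ['sports_field']
```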

  • "Tests Reveal E-Passport Security Flaw"
    EE Times (08/30/04) No. 1336, P. 1; Yoshida, Junko

    The first interoperability test between electronic-passport chips and readers was a terrible muddle, but vendors say future government testing will let them polish their products. Last month, the National Institute of Standards and Technology (NIST) performed the interoperability tests for the Department of Homeland Security over three days at facilities in Morgantown, W.Va.; a second round of testing began in late August in Sydney, Australia, and will feature chip and reader vendors trying to sort out varying interpretations of the International Civil Aviation Organization's (ICAO) e-passport specification. The tests showed that Type B contactless interface technology faced more difficulties than the Type A interface; ICAO requires readers to support both standards, while cards need use only one. Though the interoperability issues could be likened to those that crop up during normal PC-industry plugfests, experts were especially worried by NIST researchers' ability to scan e-passport data from as far as 30 feet away; ICAO specifications call for data to be transmitted over just a few inches in order to protect personal data. Users could insert foil sheets to shield their e-passports from remote readers, but ICAO specifications require nothing more than public-key-infrastructure-enabled digital signatures. Biometric data is recommended, however, and some European countries are planning active, on-chip authentication schemes. Vendors are also proactively working on security components that government agencies can implement at their discretion, such as modifications from Infineon that make its chips more difficult to hack. Gemplus' Jean-Paul Caruano said the ICAO specification describes basic access control through encryption, while security expert Bruce Schneier warned that e-passport data readable from a distance could aid thieves or terrorists targeting certain nationalities.
    Click Here to View Full Article
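
    Basic access control, mentioned by Caruano, works by deriving encryption and MAC keys from data printed in the passport's machine-readable zone (MRZ), so a reader must at least see the opened passport before the chip will talk. Below is a sketch of just the key-derivation step as published in the ICAO 9303 specification; the MRZ field values are sample strings, not a real passport, and the subsequent challenge-response session is omitted.

```python
# Sketch of ICAO 9303 Basic Access Control key derivation. Follows the
# published spec's steps; the session protocol (challenge-response,
# 3DES encryption and MACing) is not shown.
import hashlib

def adjust_parity(key: bytes) -> bytes:
    """Set each byte's low bit so the byte has odd parity (DES convention)."""
    out = bytearray()
    for b in key:
        ones = bin(b >> 1).count("1")
        out.append((b & 0xFE) | (0 if ones % 2 else 1))
    return bytes(out)

def bac_keys(doc_number: str, birth: str, expiry: str):
    """Each argument is an MRZ field followed by its check digit."""
    mrz_info = (doc_number + birth + expiry).encode("ascii")
    k_seed = hashlib.sha1(mrz_info).digest()[:16]
    def derive(counter: int) -> bytes:
        return adjust_parity(
            hashlib.sha1(k_seed + counter.to_bytes(4, "big")).digest()[:16])
    return derive(1), derive(2)      # (K_enc, K_mac)

# Sample MRZ fields (document number, birth date, expiry date, each with
# a trailing check digit) -- not from any real passport.
k_enc, k_mac = bac_keys("L898902C<3", "6908061", "9406236")
print(k_enc.hex(), k_mac.hex())
```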

  • "Organized Crime Invades Cyberspace"
    Computerworld (08/30/04) Vol. 32, No. 35, P. 19; Verton, Dan

    Antivirus researchers say a surprising increase in virus and worm activity is linked to an underground economy in identity theft and spam. F-Secure antivirus research director Mikko Hypponen says the connection is not new, though until recently virus writers were thought to be merely a rogue subculture. He says MyDoom marked the start of a concerted effort to make money from virus and worm infections: Although the MyDoom worm gained notoriety for its denial-of-service attacks against SCO and Microsoft, the more significant activity was going on behind the scenes, where someone scanned millions of IP addresses for backdoors left open by the virus; a network was thus set up, ready to service the underground spam market. F-Secure analysts decoding encrypted messages in a version of Bagle found warnings addressed to the author of the Netsky.R virus: Bands of hackers, likely Russian immigrants living in various European countries, had been using Bagle and other malware to expand their spam proxy networks, but the Netsky.R author used his own infection to clean out those spammers' viruses and was running denial-of-service attacks against their front Web sites. Symantec director Brian Dunphy says a recent variant of MyDoom featured peer-to-peer networking capabilities that let the author update infected machines and protect his network against rivals. Viruses and worms are also being used to install Web servers on vulnerable systems; Web sites selling subscription services often run on such compromised machines, and some support identity-theft rings, harvesting credit card numbers and other information to sell underground.
    Click Here to View Full Article

  • "Just a Note to Say..."
    GPS World (08/04) Vol. 15, No. 8, P. 33; Cameron, Alan

    New location-aware services are emerging thanks to the advent of new global positioning system (GPS)-enabled tools. Natalia Marmasse of the MIT Media Laboratory has developed comMotion, a system that can sift through data collected by the user and relay it in the appropriate location-based context. The prototype comMotion portable unit consists of a PC, a GPS receiver, a wireless modem, an earphone, and a microphone, while the software employs a location-learning agent that notes sites the user visits frequently and allows them to be tagged with associated to-do lists; once a site is tagged, the user can record items or tasks either by typing or by speech, and when the GPS unit determines that the user is at or near a defined location, the system raises a visual or auditory reminder. Jim Spohrer's WorldBoard concept proposes an augmented-reality system on a planetary scale: Such a system would allow users to retrieve information (be it handwritten, auditory, 3D images, or dynamic media-rich Web pages) about anything residing on a Web server that corresponds to its physical location, using the object's GPS coordinates as a guide. The GeoNotes Java application devised by Swedish Institute of Computer Science researchers was deployed three years ago on an open-access network based in Stockholm. GeoNotes, once downloaded to a PC or personal digital assistant linked to the Internet and to a Lucent base station, uses wireless local-area-network technology to determine a user's position so that he or she can write location-specific "tags" and graffiti as well as browse, read, search, and comment on the GeoNotes of other users. WaveMarket, meanwhile, delivers a location-based friend-finder service with WaveIQ, a software suite comprising a hosted "superblog" that functions as a "multiple-channel informational clearinghouse," a map interface for mobile phone displays, and a framework that enables wireless operators to alert subscribers when "friends" enter or exit a defined location.
    Click Here to View Full Article
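
    comMotion's trigger logic reduces to a proximity test between the current GPS fix and each tagged site. A minimal sketch using the standard haversine great-circle formula (the tagged place, the note, and the 150-meter radius below are invented for illustration):

```python
# Minimal comMotion-style location trigger: fire a reminder when a GPS
# fix comes within a radius of a tagged place. Haversine is the standard
# great-circle distance formula; the data here is made up.
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

tagged_places = {  # hypothetical tagged location with its to-do note
    "grocery": ((42.3601, -71.0942), "buy milk"),
}

def check_reminders(fix_lat, fix_lon, radius_m=150):
    for name, ((lat, lon), note) in tagged_places.items():
        if haversine_m(fix_lat, fix_lon, lat, lon) <= radius_m:
            print(f"Reminder near {name}: {note}")

check_reminders(42.3605, -71.0945)   # ~50 m away -> fires the reminder
```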

  • "MDA: A Motivated Manifesto"
    Software Development (08/04) Vol. 12, No. 8, P. 35; Booch, Grady

    IBM Fellow and Unified Modeling Language co-creator Grady Booch writes that the increasing need for systems that operate around the clock is boosting the value of Model Driven Architecture (MDA), which offers a way to design and build systems that can keep functioning even while components are being replaced. He explains that MDA is not particularly hard to comprehend, and outlines seven key reasons why organizations should consider adopting it. MDA enforces a clean separation of concerns among different project stakeholders in order to facilitate iterative teamwork, delivering applications that fulfill both functional and non-functional requirements. Booch notes that MDA promotes proficiency by using patterns to capture best practices, and advises developers to select a tool that lets them add their own pattern models to the model library, along with corresponding deployments in the code library, for incorporation into their applications. Companies embracing MDA will attract, hire, and retain MDA specialists because MDA methods yield rapid returns on investment, while a combination of on-the-job training and formal guidance can turn a typical development team into MDA experts in as little as two months. Another advantage Booch cites is that MDA lets business stakeholders work with a Platform-Independent Model, which is easy to comprehend and free of deployment dependencies. He notes that standards-based MDA prevents vendor lock-in, and that MDA enables more advanced automated testing. Specialized MDA toolsets are widely available and increasing in number, and other notable benefits include interoperability and portability, future-proofed applications, and a consistent metamodel.


 