
ACM TechNews sponsored by AutoChoice Advisor
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 632: Friday, April 16, 2004

  • "Can Software Kill You?"
    TechNewsWorld (04/13/04); Germain, Jack M.

    Faulty software is becoming a larger concern now that computers are involved in nearly every aspect of people's lives. Bad code can even lead to deaths, as at the National Cancer Institute in Panama, where 21 patients died in 2000 due to radiotherapy overdoses caused by improper programming. Software glitches have also been behind major vehicle recalls in North America, some affecting critical safety functions such as brake warning lights. Yet according to the National Institute of Standards and Technology, software developers spend almost none of their development budget on code-checking and correction. While no software can be made perfect, a more responsible attitude can improve software quality, according to Cigital Labs CEO Jeffrey Payne. In fact, the Institute of Electrical and Electronics Engineers reported that simple peer review would eliminate 60 percent of all software bugs. Many software developers are wary of the upfront cost of building better software and instead rely on patching, says Christopher Nolan of application testing and monitoring firm Empirix. He says software used for purposes other than those originally intended also leads to failure. Quality assurance processes offer a way to drastically improve software quality, says Nolan: companies should first conduct risk analysis and then apply resources more effectively. Software testers are also important, and need to be able to think up extraordinary cases in which software might fail. Agitar CTO Alberto Savoia says computers themselves can be used to improve software quality through testing applications; with the increase in available computing power, testing tools can now scan individual software modules for bad code before the entire program is run, he says.
    Click Here to View Full Article
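
    The module-level checking Savoia describes can be illustrated with a minimal, hypothetical Python unit test; the dose function and its tests below are invented for illustration and are not from the article:

    import unittest

    def radiation_dose(rate_gy_per_min, minutes):
        # Hypothetical dose calculation, used only to show module-level testing.
        if rate_gy_per_min < 0 or minutes < 0:
            raise ValueError("inputs must be non-negative")
        return rate_gy_per_min * minutes

    class DoseTests(unittest.TestCase):
        def test_typical_dose(self):
            self.assertAlmostEqual(radiation_dose(2.0, 1.5), 3.0)

        def test_rejects_negative_input(self):
            with self.assertRaises(ValueError):
                radiation_dose(-1.0, 5)

    if __name__ == "__main__":
        unittest.main()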

  • "Hackers Strike Advanced Computing Networks"
    TechNews.com (04/13/04); Krebs, Brian

    A number of hackers have compromised U.S. research computing laboratories and networks in recent weeks, doing little damage but raising fears that hugely disruptive attacks are possible. Experts note that, much as a Canadian teenager used University of California, Santa Barbara supercomputers to knock out Amazon.com, eBay, and CNN.com in 2000, whoever took over these research resources could have done much worse. Among the facilities compromised are the Department of Energy's Argonne National Laboratory, the National Center for Supercomputing Applications, and the San Diego Supercomputer Center--all part of the TeraGrid research network. That network was disabled for several days while investigators, possibly including the FBI, gathered evidence about the intrusions. As many as 20 universities and research laboratories could have been targeted, according to sources who asked to remain anonymous because of the ongoing investigation. Stanford University, which is not part of the TeraGrid, has quarantined at least 30 Linux and Solaris machines to reevaluate the maintenance and protection of those Unix-based systems. Stanford computer security officer Tina Bird said the school was alerted by the FBI about the rash of intrusions, and that the focus on Unix systems instead of Microsoft technology came as a surprise. Argonne National Laboratory TeraGrid engineering director Pete Beckman said the attacks seemed to be exploratory rather than focused on stealing scientific data or causing damage to other Internet targets. TruSecure chief scientist Russ Cooper, however, said the large-scale intrusion was worrying, especially since those systems were supposed to be among the most secure national resources. In unrelated investigations, U.S. intelligence agencies have monitored al-Qaeda operatives probing the computer networks of critical infrastructure facilities such as dams and power plants.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "DRC Investigation Finds Public Websites 'Impossible' for Disabled People"
    PublicTechnology.net (04/16/04)

    The Disability Rights Commission (DRC) in the United Kingdom has condemned Web developers and online companies for throwing up the same barriers to access for disabled people as exist in the physical world. The results of the study and the DRC's recommendations show that the Web could be made much more accessible to disabled users at relatively modest expense compared with what is required for physical services. The DRC report was compiled with the help of City University's Center for Human Computer Interaction Design in London, and surveyed 1,000 public-facing Web sites. An automated test of the 1,000 sites showed 81 percent did not meet minimum accessibility requirements as defined by the World Wide Web Consortium, and that the average home page presented 108 barriers to access for disabled persons, including complex page structures, disorienting navigation, undescribed images, and little contrast between background and content. Disabled users further evaluated 100 of the Web sites, finding that more than a quarter of the most basic tasks were difficult or impossible for some users. Blind users were the most disenfranchised, even when using screen-reader technology. Of the 400 Web developers surveyed, only 9 percent said they had expertise in accessibility, while another 9 percent said they used disabled users to test their sites' accessibility. DRC Chairman Bert Massie said that while the Web promised equal access, it so far had failed disabled people, excluding them from online discussion, job opportunities advertised online, convenient consumer services, and cheaper goods and services. Legal requirements for equal access are already on the books in the United Kingdom, and Massie said it was only a matter of time before disabled people brought legal challenges against noncompliant companies.
    Click Here to View Full Article
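
    As a rough illustration of how one of the cited barriers (undescribed images) can be detected automatically, the short Python sketch below flags img tags without alt text; it is a toy check, not the tool used in the DRC study:

    from html.parser import HTMLParser

    class AltTextAudit(HTMLParser):
        # Counts <img> tags that lack a non-empty alt attribute.
        def __init__(self):
            super().__init__()
            self.images = 0
            self.missing_alt = 0

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                self.images += 1
                alt = (dict(attrs).get("alt") or "").strip()
                if not alt:
                    self.missing_alt += 1

    page = '<html><body><img src="logo.gif"><img src="chart.png" alt="Sales chart"></body></html>'
    audit = AltTextAudit()
    audit.feed(page)
    print(f"{audit.missing_alt} of {audit.images} images lack alt text")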

  • "Making Software Systems Evolve"
    IST Results (04/14/04)

    The European Union's IST programme is funding a project to make software evolvable, enabling an organization to change its support software without disrupting the operation of the business. Participants in the ARCHWARE project, which will be completed by year end, want to establish a formal architectural specification language that can be used across domains, in an effort to ease the implementation of systems as they change throughout their lifecycle. ARCHWARE is focusing on an architecture description language (ADL) for software, an open source approach to modeling and encoding software activities that would add flexibility to an organization's systems. Evolvable systems would lower development and maintenance costs, particularly with regard to keeping systems compliant as user requirements change, and are critical for software such as Enterprise Resource Planning (ERP) systems. Project coordinator Ferdinando Gallo of Consorzio Pisa Ricerche (CPR) in Pisa touts open source as a good business model because it involves selling knowledge about a product, building a new software paradigm, and becoming the expert. "Others come and create further value by building on that foundation," says Gallo. "In the process, they help with the evolution of the software."
    Click Here to View Full Article
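
    The evolvability ARCHWARE aims for can be sketched, very loosely, in a few lines of Python: if connectors reach components only through named roles, an implementation can be swapped at run time without disturbing its clients. The registry below is a toy illustration of that general idea, not ARCHWARE's ADL:

    class ComponentRegistry:
        # Connectors look up components by role, so an implementation can be
        # replaced while the system runs, without touching calling code.
        def __init__(self):
            self._components = {}

        def bind(self, role, component):
            self._components[role] = component

        def call(self, role, *args, **kwargs):
            return self._components[role](*args, **kwargs)

    registry = ComponentRegistry()
    registry.bind("tax", lambda amount: amount * 0.20)   # original business rule
    print(registry.call("tax", 100))                     # 20.0
    registry.bind("tax", lambda amount: amount * 0.22)   # evolved rule, swapped in live
    print(registry.call("tax", 100))                     # 22.0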

  • "Blogs: Here to Stay--With Changes"
    Christian Science Monitor (04/15/04) P. 17; Lamb, Gregory M.

    Estimates put the number of blogs on the Web at 2 million, while recent research from the Pew Internet & American Life Project says 11 percent of Americans have read one. Experts say the cultural influence of blogs is significant, but has not reached its full potential. Blog pioneer Dave Winer derides some popular blogs from politicians as gimmicks run by advertising agencies. A genuinely influential political blog would enable substantive public discussion between a city council member and their constituents, and would turn political discourse from sound bites into something more substantial, he says. Harvard Law School's Berkman Center for Internet and Society, where Winer is a fellow, recently hosted BloggerCon II, which was conducted in blog fashion by encouraging audience comment facilitated by moderators. "The Weblog Handbook" author Rebecca Blood countered the idea that blogs were supplanting traditional journalism, noting that most of the time blogs merely comment on or add value to the actual reporting done by newspapers and magazines. Moreover, the blog format emphasizes timeliness, placing the most recent entries on top, while newspapers and magazines use their format to weight stories according to value, not just timeliness. Winer commented that the so-called blogosphere was still not well defined, saying, "When you try to draw a circle around blogs and say what they are, you always come up with exceptions." Blogs gained notoriety partly from people writing from exotic locations or positions privy to influential stories, but Berkman Center executive director John Palfrey says that as blogs become more pervasive, companies and politicians will use them to gauge public sentiment. And while blogging will not likely provide anyone's staple income, it can serve as a "reputation builder" that establishes someone as an expert in their field.
    Click Here to View Full Article

  • "Researchers Awarded $2 Million to Create High-Tech Tools for Fighting Wildfires"
    AScribe Newswire (04/15/04)

    University and government researchers are collaborating to create a complex simulation and sensor system that will help predict the behavior of wildfires quickly. The planned system involves sensors placed around a wildfire that take in data such as temperature, wind direction and speed, and environmental moisture. That information will be sent back to a remote supercomputer, fed into an ongoing computer simulation, and then relayed back to firefighter team leaders. Computer simulations take time, notes University of Colorado at Denver mathematics professor Jan Mandel, who is leading the Data Dynamic Simulation for Disaster Management project. He says people expect computers to analyze information as it comes in and produce actionable information immediately, a scenario his project will help make reality. Funding is provided by the National Science Foundation, and other collaborators include the University of Kentucky, Texas A&M University, Rochester Institute of Technology, and the National Center for Atmospheric Research (NCAR). The $2 million, four-year project involves research in supercomputing, high-speed networking, satellite and sensor monitoring, meteorology, and mathematical theory. NCAR scientist Janice Coen says simulating complex, fast-changing phenomena such as wildfires is a tremendous technological challenge, especially when the computing resources are far away and the data must be kept secure. Information sent to firefighters on the ground also has to be presented in a meaningful way, such as images, that does not require time-consuming analysis. By getting more information about future wildfire behavior, lives and property can be saved and natural resources protected; fires that would burn out naturally can be allowed to do so when that benefits the local ecosystem, for example.
    Click Here to View Full Article
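
    A minimal Python sketch of the data flow described above, with an invented spread-rate formula standing in for the real coupled weather/fire simulation (all field names and numbers here are illustrative):

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class SensorReading:
        temperature_c: float
        wind_speed_ms: float
        wind_dir_deg: float
        fuel_moisture: float   # fraction of water in fuel, 0..1

    def predicted_spread_rate(readings):
        # Very rough head-fire spread estimate (m/min) from the latest field data;
        # the formula is made up purely to illustrate assimilating sensor input.
        wind = mean(r.wind_speed_ms for r in readings)
        moisture = mean(r.fuel_moisture for r in readings)
        base = 2.0   # assumed no-wind spread rate
        return base * (1 + 0.5 * wind) * max(0.1, 1 - moisture)

    readings = [SensorReading(34.0, 6.2, 215.0, 0.08),
                SensorReading(36.5, 7.1, 220.0, 0.06)]
    print(f"estimated head-fire spread: {predicted_spread_rate(readings):.1f} m/min")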

  • "NASA Gives Mars Rovers Software Upgrade"
    Associated Press (04/13/04)

    NASA completed a software upgrade for both Mars rovers on Tuesday that should make their computer systems more reliable and enable the robots to travel farther across the red planet. NASA began sending the Spirit rover's new flight code on Thursday and the Opportunity rover's new code on Friday; the software had to travel across 188 million miles of space. The Spirit rover has already successfully rebooted and will use its new programming to independently traverse Mars' rocky surface, particularly the Gusev Crater, an area that stumped the rover's previous navigation software. Opportunity, working on the other side of the planet, does not need the navigation code, but will make use of new "deep sleep" software to partially power down at night and cut electricity to a malfunctioning heater that turns on unnecessarily. Both rovers will benefit from new software that fixes their computer memory problems. Opportunity is expected to successfully reboot with the new software Tuesday night.
    Click Here to View Full Article

  • "Friend or Foe? A Digital Dog Tag Beams the Answer"
    New York Times (04/15/04) P. E4; Shachtman, Noah

    The U.S. military is working on new battlefield identification technology that promises to cut down on friendly-fire incidents. In the Persian Gulf war in 1991, 24 percent of American casualties were caused by mistaken attacks, but Army combat identification researcher Pete Glikerdas says the goal is to get that figure down to about 3 percent. In Iraq today, the Army's Blue Force Tracking system uses satellites and an encrypted network to create a composite map of the battlefield with allied positions, though the system takes several minutes to update. The speed of information needs to be increased so that shooters can find out quickly whether a particular target is friend or foe, says Joint Forces Command Lt. Col. Bill McKean. He is working on "cooperative" identification schemes where the shooter can query targets, and friendly targets can respond instantly. Different radio technologies are under development for aircraft, ground vehicles, and even soldiers' rifles. Col. McKean says more powerful computer processors allow radio frequency tags that are inexpensive and small enough to be issued to ground troops. The tags would receive and respond to signals sent by fighter planes, avoiding the type of tragedy that occurred in Afghanistan two years ago, when an American pilot dropped a bomb on a group of Canadian soldiers. Currently, pilots have to rely on grainy images from their plane's sensors to determine whether a target is friendly or not. Similar radio frequency tags are being tested for tanks and armored vehicles, but use millimeter waves to send encrypted identification requests up to four miles away. Smaller sensor tags worn by soldiers would alert a friendly shooter when their rifle's range finder hits them with a laser beam.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
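
    The "cooperative" query-and-answer idea can be sketched as a simple challenge-response check: the shooter sends a random challenge, and a friendly tag proves it holds a pre-shared key by returning a keyed hash. The Python sketch below assumes a shared mission key and is only an illustration of the concept, not the millimeter-wave protocol under development:

    import hashlib
    import hmac
    import secrets

    MISSION_KEY = secrets.token_bytes(32)   # hypothetical key loaded into friendly tags

    def make_challenge():
        # The shooter's interrogator sends a fresh random nonce toward the target.
        return secrets.token_bytes(16)

    def tag_response(key, challenge):
        # A friendly tag answers with a keyed hash of the challenge.
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def is_friend(key, challenge, response):
        # The shooter verifies the answer; anything else is treated as unknown.
        expected = hmac.new(key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = make_challenge()
    print(is_friend(MISSION_KEY, challenge, tag_response(MISSION_KEY, challenge)))  # True
    print(is_friend(MISSION_KEY, challenge, b"\x00" * 32))                          # False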

  • "Computer Science Degree Still Leads to Employment"
    South End (04/12/04); Doppelt, Zach

    Stu Zweben, chair of the Ohio State University Department of Computer Science and Engineering, says the reason fewer undergraduates are majoring in computer science is broader than the outsourcing of jobs overseas. In addition to the poor job market, Zweben says many students are pursuing information technology-related coursework, such as media technology and Web services, that is not part of computer science programs. "Some of these are more attractive to students, and they were not available five years ago," says Zweben. Moreover, Zweben adds that the barriers and restrictions placed on computer science programs in the 1990s to make them more competitive have discouraged many students from pursuing a degree in the field. Zweben also serves as chair for the Computing Research Association, which has completed its annual Taulbee Survey of doctorate-granting university programs across North America. The latest Taulbee Survey reveals a 23 percent decline in newly enrolled computer science majors in 2003, and a 3 percent decline in Bachelor of Science recipients.
    Click Here to View Full Article

  • "Government, Firms Unveil Cybersecurity Framework"
    National Journal's Technology Daily (04/12/04); Leventhal, Ted

    A corporate security task force formed by the IT industry and the Homeland Security Department has published a report that provides a framework for security governance. The framework, which says that information security is a fiduciary responsibility for CEOs, uses existing federal and international information security guidelines and suggests linking security duties to corporate functions. Task force co-chair Bill Connor says many executives are aware of cybersecurity threats, but do not know what to do about them. FTC commissioner Orson Swindle says corporate managers will eventually accept the need for more information security, adding that companies should make a public commitment to securing their information. The report suggests that participants promote the framework in the business community, and that the Homeland Security Department endorse it. The guidelines are supported by the Business Software Alliance, TechNet, the Cyber Security Industry Alliance, and the Information Technology Association of America. House Homeland Security Committee Chairman Christopher Cox (R-Calif.) says private-public partnerships are necessary to make cyberspace secure. The task force that drafted the report, "Information Security Governance: A Call to Action," was formed at a cybersecurity summit in December.
    Click Here to View Full Article

  • "Major Network Project, Partnership"
    Newswise (04/13/04)

    The Department of Energy's Oak Ridge National Laboratory (ORNL) and the National LambdaRail (NLR) have agreed to share resources, creating one of the most advanced environments for computationally intensive science and network research. NLR focuses on high-speed research networks shared between research universities and private industry groups, while ORNL has critical high-speed switching and time-sharing technologies for supercomputer applications. By combining resources, the organizations will make more powerful infrastructure available to users on a time-sharing basis. General-purpose networks aggregate large numbers of users and amounts of data, but scientific networking needs are different, explains ORNL project director Bill Wing. "Scientists need large amounts of dedicated bandwidth, but for relatively short periods of time," he says. Researchers will now be able to schedule tremendously high-speed connections using an advanced switching infrastructure. The five-year agreement involves no financial remuneration or legal obligations, but ORNL will provide four 10-Gbit lambdas (light wavelengths used to move data) between Chicago and Atlanta. NLR will provide Department of Energy scientists access to two 10-Gbit lambdas on its nationwide infrastructure as well as membership in the organization. NLR researchers will also have access to the ORNL Center for Computational Sciences computing facility and the Spallation Neutron Source when it comes online in 2006.
    Click Here to View Full Article
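
    The usage pattern Wing describes--large dedicated bandwidth for short periods--amounts to reserving whole wavelengths by time slot. The toy Python scheduler below illustrates that booking model; the class, slot format, and wavelength names are invented for illustration:

    class LambdaScheduler:
        # Toy reservation book: each wavelength serves one experiment per time slot.
        def __init__(self, wavelengths):
            self.bookings = {wl: {} for wl in wavelengths}   # wavelength -> {slot: user}

        def reserve(self, user, slot):
            for wl, slots in self.bookings.items():
                if slot not in slots:
                    slots[slot] = user
                    return wl
            return None   # all wavelengths busy in that slot

    sched = LambdaScheduler(["lambda-1", "lambda-2"])
    print(sched.reserve("climate-run", "2004-04-20T02:00"))  # lambda-1
    print(sched.reserve("genomics",    "2004-04-20T02:00"))  # lambda-2
    print(sched.reserve("astro",       "2004-04-20T02:00"))  # None (slot full)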

  • "The Changing Face of Open Source"
    InternetNews.com (04/09/04); Kuchinskas, Susan

    The open source community continues to conjure up the image of programmers working on a project in their spare time, but since the mid-1990s much of the code writing has been carried out by developers who work for IT shops. In fact, many programmers involved in open source projects today have been hired purposely by companies to develop open source solutions that address a particular business need. Today, only a few developers write most of the code. "There might be hundreds of people who contribute one line of code," says Bob Bickel of JBoss, a developer of open source server software. Bickel further explains that not many developers can afford to work for free, adding that the in-house model has become the way to fund software that a company plans to utilize. Corporate pressure drives companies to demand schedules and roadmaps, and they are not willing to wait for volunteer programmers to find some free time to continue work on an open source project. "The most efficient way is to hire someone to code what you need," says Henry Hall, president of software consultancy Wild Open Source. Most companies now develop open source software in-house, releasing it to the community for review and sometimes for enhancement. Still, others note that the open source community remains wide and varied, pointing to the 79,599 projects listed on SourceForge.net as well as the site's 830,490 registered users. Linux International executive director Jon Hall says that "there will continue to be people who work on the Linux kernel just because they like it...Maybe their management will smile upon it and give them a little slack time during the day. If not, they'll work on it at home as a hobby."
    Click Here to View Full Article

  • "Sturdy Quantum Computing Demoed"
    Technology Research News (04/14/04); Smalley, Eric

    Quantum computing uses the quantum states of particles as the basis for computer logic, but those states can be erased by the slightest interference from light, heat, or magnetism. Researchers at the University of Toronto have created a method for protecting quantum computing information from the effects of environmental noise, called decoherence. The team used nuclear magnetic resonance (NMR) to create a four-qubit quantum computer, then encoded the computing instructions into the interactions between those qubits. Qubit pairs share a common waveform that changes simultaneously in response to outside stimuli, so the researchers used the symmetrical portions of the waveform to conduct their calculations rather than the individual qubits themselves. University of Toronto researcher Jason Ollerenshaw likened the scheme to a chess game, where some spaces are vulnerable to an opponent's attack but others are protected. The team ran a simple list-check program that uses far fewer steps than normal programs, and also tested the Deutsch-Jozsa algorithm, which in effect examines two possible outcomes simultaneously. Ollerenshaw said that NMR is not a good platform for future quantum computing because its signal-to-noise ratio limits computers to about 10 qubits, but that it is an excellent platform for quantum computing experiments because the technology is well understood. Another University of Toronto team used an optical quantum computer prototype to demonstrate Ollerenshaw's qubit encoding method, running the Deutsch-Jozsa program. The noise introduced in the experiment was artificially produced, and the next step will be to test the method against naturally occurring noise. Ollerenshaw estimates that practical quantum computers are more than 20 years away.
    Click Here to View Full Article
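
    To make the Deutsch-Jozsa idea concrete, the small Python/NumPy sketch below simulates the one-qubit Deutsch algorithm on a classical machine: a single query to the oracle reveals whether a function f is constant or balanced, which classically requires two evaluations. This is a generic textbook simulation, not the Toronto group's noise-protected encoding:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I2 = np.eye(2)

    def oracle(f):
        # U_f maps basis state |x>|y> to |x>|y XOR f(x)> (a 4x4 permutation matrix).
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(f):
        state = np.kron([1.0, 0.0], [0.0, 1.0])    # start in |0>|1>
        state = np.kron(H, H) @ state               # put both inputs in superposition
        state = oracle(f) @ state                   # one oracle query
        state = np.kron(H, I2) @ state              # interfere
        prob_one = np.sum(np.abs(state[2:]) ** 2)   # probability first qubit reads 1
        return "balanced" if prob_one > 0.5 else "constant"

    print(deutsch(lambda x: 0))   # constant
    print(deutsch(lambda x: x))   # balanced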

  • "IETF to Lead Anti-Spam Crusade"
    Network World (04/12/04) Vol. 21, No. 15, P. 1; Marsan, Carolyn Duffy

    The Internet Engineering Task Force is tackling the issue of email authentication standards with a new, high-profile working group called MARID. The name comes from the group's task: to build message transfer agent (MTA) authorization records in DNS. The resulting standard will not eliminate all spam, but will dramatically reduce the rapidly growing burden, perhaps by 50 to 80 percent, according to former IETF chair and DNS inventor Paul Mockapetris. A scheme to identify spoofed email addresses will not stop spam from coming from trusted sources, such as partner businesses whose computer systems have been compromised by a worm. The MARID solution will be incorporated into the DNS system and create rules for email servers sending messages to certain domains or networks. MARID co-chair and VeriSign network engineer Andy Newton says the problem of spam is growing so quickly that he expects political infighting to be limited. Still, the urgency and strong support behind the MARID effort do not mean it will be free of controversy: several large email firms such as Microsoft and Yahoo have already proposed their own email authentication schemes to the MARID working group, though IETF officials say no vendor-proposed solution would likely be accepted without significant alteration. The IETF is also working on other spam solutions in its Anti-Spam Research Group, such as real-time exchange of spam filters and email abuse reporting standards. Anti-Spam Research Group co-chair John Levine says MARID will be most beneficial only when it is pervasively deployed on email servers, though he expects large companies such as Citibank to install any MARID solution early in order to improve internal email management. MARID will involve software updates to email servers, but will not mean significant changes to either DNS or the Simple Mail Transfer Protocol.
    Click Here to View Full Article
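
    MARID's record format was still being worked out at the time, but the core check is simple: look up the records the claimed sender domain publishes and see whether the connecting mail server's address is listed. The Python sketch below illustrates that logic with an in-memory dictionary standing in for DNS; the domains and addresses are examples only:

    # Published "authorized mail server" records; a dict stands in here for the
    # data that MARID-style proposals would place in DNS.
    AUTHORIZED_SENDERS = {
        "example.com": {"192.0.2.10", "192.0.2.11"},
        "example.org": {"198.51.100.25"},
    }

    def sender_permitted(mail_from_domain, connecting_ip):
        # True if the connecting MTA's IP is listed for the claimed sender domain,
        # False if it is not, None if the domain publishes no record (the receiver
        # then falls back to its normal policy).
        record = AUTHORIZED_SENDERS.get(mail_from_domain)
        if record is None:
            return None
        return connecting_ip in record

    print(sender_permitted("example.com", "192.0.2.10"))   # True  -> accept
    print(sender_permitted("example.com", "203.0.113.9"))  # False -> likely spoofed
    print(sender_permitted("example.net", "192.0.2.10"))   # None  -> no record published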

  • "Logic From Chaos?"
    Economist (04/01/04) Vol. 371, No. 8369, P. 82

    Dr. William Ditto of the University of Florida believes that harnessing chaos is another way to build a powerful computer. In March, Dr. Ditto spoke at the annual conference of the American Physical Society about his efforts to base a computer on the mathematics of chaos. Although chaotic systems are extremely sensitive to their initial state, Dr. Ditto notes that carefully choosing the starting conditions, and letting the system evolve for only a short period of time, keeps its behavior predictable and reproducible enough to compute with. Dr. Ditto believes chaotic elements such as electric circuits, lasers, or neurons can be used in place of the logic gates of conventional computers. In 1998, Dr. Ditto and Sudeshna Sinha of the Institute of Mathematical Sciences in Chennai, India, showed that a chaotic element could be made to perform a chosen logic operation by varying a threshold value, and could be dynamically reprogrammed so that an entirely different function is carried out with each tick of the clock. How the inputs, outputs, and thresholds of a chaotic system are designed would determine the computer's ability to handle numerous logical operations simultaneously, allowing calculations to proceed faster than with conventional logic elements. So far, Dr. Ditto and his team have built an electronic logic element from components such as resistors and capacitors, and have made another logic element by placing a pair of leech neurons on a microchip.
    Click Here to View Full Article
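
    The threshold idea can be demonstrated with a toy Python version of a chaos-based gate built on the logistic map: the two binary inputs perturb the map's state, the map is iterated once, and the output is read against a threshold. Changing only the initial state and the threshold turns the same element into an AND, OR, or XOR gate. The parameter values below were chosen so the truth tables work out for this toy map; they are illustrative, not taken from Ditto and Sinha's papers:

    def logistic(x):
        # Fully chaotic logistic map.
        return 4.0 * x * (1.0 - x)

    def chaotic_gate(x0, threshold, a, b, delta=0.25):
        # Encode the two binary inputs as perturbations of size delta, iterate the
        # chaotic map once, and compare against a threshold to read the output bit.
        x = x0 + delta * a + delta * b
        return 1 if logistic(x) > threshold else 0

    # The same chaotic element realizes different gates when x0 and threshold change.
    GATES = {"AND": (0.0, 0.75), "OR": (0.125, 11 / 16), "XOR": (0.25, 0.75)}
    for name, (x0, thr) in GATES.items():
        table = [chaotic_gate(x0, thr, a, b) for a in (0, 1) for b in (0, 1)]
        print(name, table)   # AND [0,0,0,1]  OR [0,1,1,1]  XOR [0,1,1,0]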

  • "Network of Traffic Spies Built Into Cars in Atlanta"
    IEEE Spectrum (04/04); Guizzo, Erico

    Engineers at the Georgia Institute of Technology have made a significant advance in traffic analysis by developing Global Positioning System (GPS)-enabled monitoring devices for automobiles. The monitoring devices, which are connected to the speedometers of older cars and to the onboard engine-diagnostic computers of newer vehicles, are being used by 500 drivers as part of a project cosponsored by the Georgia Department of Transportation and the U.S. Federal Highway Administration. The device tracks second-by-second position, speed, acceleration, and other data and sends it to a remote server over a cellular network. The traffic monitoring system is considered less expensive than placing wire loops under asphalt to detect vehicles as they pass, and more reliable than cameras connected to computers that track traffic by analyzing pictures. While such systems are often used to monitor highways, Steve R. Gordon, a researcher at Oak Ridge National Laboratory's Center for Transportation Analysis in Knoxville, Tenn., believes GPS monitoring devices in cars on other busy roadways would be helpful in delivering data to traffic management agencies as they consider changing signal timing, diverting traffic from congested routes, and providing travel advisories for electronic signs. "It's one of the most cost-effective ways to reduce congestion, minimize emission, and reduce fuel consumption," says Gordon. Researchers from Georgia Tech are scheduled to meet in May with Georgia and U.S. officials to discuss how to make effective use of the data gathered by the GPS monitoring devices.
    Click Here to View Full Article
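
    A minimal Python sketch of the kind of aggregation a traffic-management server could perform on second-by-second probe reports like those described above; the record fields and segment names are invented for illustration:

    from collections import defaultdict
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class ProbeReport:
        vehicle_id: str
        timestamp: float      # seconds since start of day
        segment: str          # road segment the GPS fix maps to
        speed_kmh: float

    def segment_speeds(reports):
        # Average reported speed per road segment; slow segments flag congestion.
        by_segment = defaultdict(list)
        for r in reports:
            by_segment[r.segment].append(r.speed_kmh)
        return {seg: mean(speeds) for seg, speeds in by_segment.items()}

    reports = [ProbeReport("car-017", 28800, "I-75N:mile-249", 31.0),
               ProbeReport("car-042", 28801, "I-75N:mile-249", 27.5),
               ProbeReport("car-017", 28860, "I-85N:mile-88", 92.0)]
    print(segment_speeds(reports))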

  • "The Intelligent Internet"
    Futurist (04/04) Vol. 38, No. 2, P. 27; Halal, William E.

    According to George Washington University's TechCast forecasts, 20 commercial areas of Internet use should achieve 30 percent "take-off" adoption levels in the latter half of the current decade, while a new generation of intelligent systems will emerge in parallel with these trends thanks to advances in speech recognition, virtual environments, powerful processors, artificial intelligence, and flat-screen wall monitors. The leading e-commerce applications--broadband, business-to-business, wireless, online finance, entertainment-on-demand, online publishing, e-tailing, e-training, and electronic public services--are expected to rise from adoption levels of between 5 percent and 20 percent today to 30 percent by 2010. More sophisticated applications such as e-voting, e-health, virtual reality, and the global grid will reach mainstream adoption levels sometime after 2010. By about the end of the decade, the PC will evolve into a conversational human-machine interface that can understand users, obey their instructions reliably, and even anticipate their needs. Important milestones in this evolution will be the pervasiveness of reliable speech recognition and of virtual robots and environments online by 2010; more powerful computer processors, many of which are already available; intelligent, adaptive computers within 10 years; and widespread adoption of flat wall monitors in a few years. "It will be routine to meet in full-immersion virtual reality for business meetings and casual conversations in five to seven years," predicts computer scientist Ray Kurzweil. Nascent markets of "wired generation" college students and consumers of online entertainment will be ideal for TeleLiving technologies. "What the graphical user interface was in the 1990s, the natural user interface will be in this decade," contends IDC's Robert McClure.


