

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 878:  Wednesday, December 14, 2005

  • "Tech Group Blasts Federal Leadership on Cyber-Security"
    Washington Post (12/13/05); Krebs, Brian

    The Cyber Security Industry Alliance on Tuesday lambasted the Bush administration for its failure to address the proliferation of online crime, arguing that the absence of leadership and accountability endangers the economic and national security of the U.S. The group called attention to a body of recommendations it made a year ago that have gone largely ignored, charging that cyber security has become a diminished priority for the administration, while computer-based crime has reached unprecedented levels, with profits now rivaling the narcotics trade. The group criticized the administration for failing to nominate a top-level executive official to oversee the response to cyber crime over different agencies, despite a congressional directive to do so last year. The alliance also charged that R&D funding for cyber security has held steady at less than 2% of the overall R&D budget, despite the recommendations of Bush's technology advisory committee (PITAC). The administration has also cut cyber-security funding for the DHS by 7% after the agency, along with several others, was issued failing marks for its cyber-security efforts. Industry leaders have grown weary of the government's sluggishness to address the issue, complaining of the needlessly lengthy software certification process, and the secrecy with which the government addresses security issues. "The only leadership I see right now on this issue in the federal government is in trying to hide attacks that have been successful," said the SANS Institute's Alan Paller. There are some bright spots, however, such as the recommendation of the Senate Foreign Relations Committee that the Senate consider the Council of Europe's Convention on Cyber Crime, which would facilitate the prosecution of cyber criminals abroad, as well as several pending privacy and data breach notification bills.
    Click Here to View Full Article

  • "Can State Ignore Its E-Vote Laws?"
    Wired News (12/14/05); Zetter, Kim

    North Carolina election officials will appear in court this week to defend against allegations they ignored the state's laws on the certification of e-voting machines. The Electronic Frontier Foundation (EFF) has leveled the charges against the Board of Elections and the Office of Information Technology Services, claiming they certified machines despite the vendors' failure to place the source code in escrow, as required by the Public Confidence in Elections bill, passed in North Carolina after disruptions in e-voting systems in the elections of 2002 and 2004. While limited to North Carolina, the case has national implications, as many states are scrambling to meet a Jan. 1 deadline to qualify for federal funding for new systems. Vendors who fail to place their source code into escrow and furnish a list of all programmers involved in the development of the software face a felony charge and up to $100,000 in civil penalties for each violation. Under the law, election officials must also examine the source code, which the EFF alleges they failed to do. Diebold was denied a temporary exemption from the statute when the company argued that it was unable to disclose the source code of the commercial Windows CE software that powers its machines, though it would have no problem revealing its own code. The state certified Diebold and Election Systems & Software (ES&S) anyway, despite lingering concerns over the latter's use of Adobe Acrobat software, the source code of which ES&S could not legally provide. Counties must purchase machines by Jan. 20 in order to have new systems in place for North Carolina's May primary. While states are not required to upgrade their systems, the 2002 Help America Vote Act mandates that each precinct have at least one machine that is accessible to disabled voters, a requirement satisfied only by e-voting machines with audio assistance.
    Click Here to View Full Article

  • "About That Engineering Gap..."
    BusinessWeek (12/13/05); Wadhwa, Vivek

    A team of Duke researchers has found that statistics commonly cited to compare the number of engineering graduates in the U.S. with those of other nations are inaccurate: the U.S. actually produces more graduates than India, and China's numbers are inflated, writes Duke University professor Vivek Wadhwa. While the Duke team argues that the U.S. is far from losing its competitive edge, many students are laboring under the perception that engineering is a doomed profession, raising the concern that fear could turn the outsourcing myth into a reality. Commonly referenced figures put the respective outputs of engineers in the U.S., China, and India at 70,000, 600,000, and 350,000. An exhaustive investigation of how those numbers were formulated found that the word "engineer" has translation difficulties in many Chinese dialects, that several universities reporting figures were operating under different definitions, and that nearly half of those counted received short-cycle degrees representing two or three years of schooling, roughly equivalent to an associate degree. In their revised data, the Duke researchers counted 222,335 U.S. engineers, 215,000 Indian engineers, and 644,106 Chinese engineers; if the criteria are narrowed to include only four-year degrees, the figures shift to 137,437 in the U.S., 112,000 in India, and 351,537 in China. Far from finding the U.S. in trouble, the study concludes that the U.S. higher education system is still the world's finest, and that students must not be discouraged from studying math and science by unfounded fears of outsourcing.
    Click Here to View Full Article

  • "Award-Winning Mobile Multimedia Project Bridges Art and Engineering"
    Innovations Report (12/13/05); Ahola, Eeva

    Helsinki University doctoral candidate Jurgen Schieble has developed MobiLenin, an interactive entertainment system that was recognized as the Best Arts Paper at ACM's Multimedia 2005 conference in Singapore. Combining Schieble's musical skill and mobile technology expertise, MobiLenin displays its creator on a large screen playing a song in many different styles, allowing audience members to determine which style should be played by voting on their smart phones. MobiLenin was well-received in live testing in Oulu, the city that boasts the interdisciplinary Oulu Rotuaari program where MobiLenin has its origins. Rotuaari Director Timo Ojala compares the program to a "living lab" that concentrates on "street research," gauging consumer response to mobile applications through testing in a live setting. While the program is likely to conclude in the spring, Ojala is satisfied with the extensive and respected research that has emerged from the initiative. Schieble plans to expand MobiLenin and apply it to his own concerts, and will continue to evangelize the creative linkages between art and engineering as he participates in forthcoming projects at MIT, Stanford, and Yahoo!'s Berkeley Labs.
    Click Here to View Full Article
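
    The interaction at the heart of MobiLenin, audience members voting on their phones to steer the performance, is simple to prototype. Below is a minimal, hypothetical sketch of per-round vote tallying in Python; the style names and the idea of a fixed voting window per song segment are illustrative assumptions, not details of Schieble's actual system.

        from collections import Counter

        # Hypothetical song styles an audience could vote on (illustration only).
        STYLES = ["rock", "reggae", "techno", "ballad"]

        def tally_round(votes):
            """Pick the style for the next segment from one round of phone votes.

            votes: iterable of style names submitted by audience members.
            Falls back to the first style if no valid votes arrive.
            """
            counts = Counter(v for v in votes if v in STYLES)
            if not counts:
                return STYLES[0]
            # most_common(1) returns [(style, count)]; ties resolve by first occurrence.
            return counts.most_common(1)[0][0]

        if __name__ == "__main__":
            # One round of votes collected from smart phones during a song segment.
            round_votes = ["reggae", "rock", "reggae", "techno", "reggae"]
            print(tally_round(round_votes))  # -> reggae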

  • "It's Gee-Whiz for the Golden Years"
    Washington Post (12/13/05) P. C1; Musgrove, Mike

    In a departure from the traditional focus on youth, miniaturization, and a dizzying array of features, technologists convened at an exhibition this week in Washington, D.C., to showcase a variety of innovations aimed at helping senior citizens manage and improve their lives, such as Pearl, a personal robotic assistant that can tote groceries or dirty dishes when prompted by a basic voice command. Intel demonstrated a watch that could project a reminder to take medication onto a television, or make a phone call, for those wishing to keep the message discreet. The industry of technology built for seniors is expected to surge as baby boomers near retirement. Many of the products use pattern recognition software to assist with patient monitoring, providing a bevy of information that could aid doctors in making diagnoses and reporting emergencies. A wireless monitoring device from Home Free can fit into a watch or a pendant, tracking when the wearer goes to sleep and how often he or she wakes up, in an effort to alert family members to a potential problem before something actually goes wrong. A smart walker developed at Carnegie Mellon helps its users navigate the hallways of a building, and can be summoned by remote control. Other technologies are less visible, such as a bed containing sensors to monitor breathing rate and blood pressure. Researchers have developed games to help Alzheimer's patients keep their minds sharp, and a version of solitaire that monitors players' moves to determine whether they are making uncharacteristic mistakes (a generic sketch of that kind of monitoring follows this item). Addressing the concern that seniors are not ready for a barrage of new technology, Intel's Eric Dishman said, "I think we're treating today's seniors like they are unable to learn something," adding that "we have to stop infantilizing them."
    Click Here to View Full Article
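
    The solitaire example above hints at how such monitoring can work: establish a baseline of a person's normal behavior and flag large deviations from it. The snippet below is a generic, hypothetical anomaly check of that kind, not a description of any product shown at the exhibition.

        from statistics import mean, stdev

        def is_uncharacteristic(history, latest, threshold=3.0):
            """Flag a new measurement (e.g., seconds per move or mistakes per game)
            that deviates strongly from a person's own baseline."""
            if len(history) < 5:
                return False  # not enough data to establish a baseline
            mu, sigma = mean(history), stdev(history)
            if sigma == 0:
                return latest != mu
            return abs(latest - mu) / sigma > threshold

        if __name__ == "__main__":
            past_move_times = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]  # seconds per move
            print(is_uncharacteristic(past_move_times, 4.3))   # False: typical
            print(is_uncharacteristic(past_move_times, 12.0))  # True: far outside baseline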

  • "The Baltic Life: Hot Technology for Chilly Streets"
    New York Times (12/13/05) P. A1; Landler, Mark

    Estonia has become an incubator of successful technology companies one year after joining the European Union, and companies in the Estonian capital of Tallinn especially have begun to attract significant foreign capital. The former Soviet republic, in Eastern Europe on the Baltic Sea, is home to the music-swapping software Kazaa, online gambling software company Playtech, and telephone-over-IP provider Skype. Skype has 170 employees and approximately 70 million users who have downloaded its software, and it was purchased by eBay for $2.5 billion. The success of Estonian technology companies shows the dynamism of the Internet and how a few programmers with good ideas and minimal investment can succeed even today. Skype itself was founded by two men from Sweden and Denmark, and currently has marketing, development, and corporate offices throughout Europe in addition to the software development hub in Tallinn. The Estonian IT industry employs roughly 2,500 people--small by American standards--and universities there graduate 200 IT professionals each year. Severe and lengthy winters are a prime reason Estonian software developers have done well cracking code, says Jaan Tallinn, who developed the peer-to-peer technology for both Kazaa and Skype.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Computer Languages Multiply, Pleasing Many--But Not All"
    Wall Street Journal (12/14/05) P. B1; Gomes, Lee

    While the proliferation of languages has been a boon to software programmers, the extensive variety often frustrates their bosses and confounds the larger software companies. C and the subsequent C++ may be the most popular languages in use today, but any programmer working on the Web must also list languages such as Perl, Python, PHP, and Tcl on his or her resume. The explosion has been partially fueled by the ability of an individual programmer or a small group to create and market a language, as was the case with Ruby on Rails, a Web framework built on the Ruby language, which became an overnight sensation thanks to a 15-minute demonstration video the Danish programmer David Hansson circulated over the Web. Once a language has gained a core following, blogs and Web sites appear to track its development. Many languages owe their origins to small design firms trying to make a commercial success of themselves, while others are labors of love, as is the case with many open-source projects. As new languages continue to emerge, however, more programmers are defecting from mainstream systems such as .NET and Java in favor of niche offerings that are better tailored to a specific project. CIOs are often assailed by complaints from their programmers when they try to impose restrictions on the number of languages that are permissible. And while it has been demonstrated theoretically that each language is the rough equivalent of any other, a consensus on a single language is no more likely to emerge among programmers than a universal embrace of a single car model is among motorists.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)

  • "To Tomorrow and Beyond"
    Financial Times Survey (12/14/05) P. 7; Barron, Alasdair

    The world's fastest supercomputers, already at work solving some of the most complex scientific problems, are likely to see a fourfold increase in performance as the next phase of technology takes hold. In the business community, supercomputers will move beyond their roles of processing transactions and supporting databases to offer such features as decision support, risk analysis, and portfolio optimization. In other commercial applications, supercomputers are helping improve simulation in the auto industry, and complex fluid dynamics analysis helped Procter & Gamble resolve inefficiencies in its Pringles potato chip baking process. Supercomputers are also in use in a variety of climate modeling applications in which scientists hope to better predict natural disasters such as hurricanes and tornadoes through advanced simulation techniques. Ten manufacturers, only three of which are based in the U.S., have announced their intention to create petaflop systems by 2010, promising four times the performance of today's teraflop supercomputers. High performance, or technical, computers have become an economic imperative for many companies, with sales having increased 30.2% in 2004. Many in the industry bemoan the shortage of talent that the surging industry requires. The next stage of high performance computing will be marked by clustered systems, in which several linked PCs create a supercomputer, and by affordable gaming consoles with 2-teraflop capabilities. Both the gaming and PC industries are angling to convert their products into home entertainment centers as the commodity-driven phase of computing approaches.

  • "Taking on Rootkits With Hardware"
    CNet (12/13/05); Evers, Joris

    While Intel's recent announcement that it is working to protect against rootkits took many by surprise, the company's Travis Schluessler notes that security technologies have long been a development priority for Intel, even though PC security has traditionally been the province of the software community. Intel's approach is to blend hardware security with software to address sophisticated, deeply embedded vulnerabilities, such as the rootkit that was recently discovered on millions of CDs distributed by Sony. Intel's System Integrity Services project employs a technique known as platformization, in which the parts of a system, working in concert, outperform the sum of their individual capacities. Because many hackers gain access to a PC through its memory, a hardware-based approach to security helps restrict the root-level privileges that can be exploited to compromise software. Intel's System Integrity Services platform is designed to detect when a protected program is altered or prevented from running by a rootkit, virus, or worm by creating an isolated execution environment. Intel's technology is complementary to antivirus programs and other software solutions, as it screens for malware that might not carry an antivirus signature, and it could be used on any PC to protect any application, so long as the application's contents in memory do not change (a simplified software sketch of this kind of integrity monitoring follows this item). Intel's program is designed to hold up as new threats continue to emerge, since it simply monitors for changes in critical programs, such as the firewall or antivirus software, which are frequent targets of malware. Because the system is isolated, it is designed not to have a noticeable effect on processing speed, though it does consume some memory bandwidth. The system, which is at the prototype stage, also allows for legitimate uses of rootkits, enabling users to define which changes in a program are authorized.
    Click Here to View Full Article
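
    Intel's prototype works in hardware and its details are not public, but the basic idea described above, watching a protected program for unauthorized change rather than matching malware signatures, can be illustrated in a few lines of Python. The sketch below hashes a file (a memory snapshot could be treated the same way) and raises an alert when its contents differ from a known-good baseline; the file path and polling interval are assumptions for the example only.

        import hashlib
        import time

        def fingerprint(path):
            """Return a SHA-256 digest of the file's current contents."""
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        def watch(path, interval=10.0):
            """Periodically re-hash a protected binary and report any change.

            Like the approach described above, this looks only for deviation from
            a known-good baseline, so it needs no signature for specific malware.
            """
            baseline = fingerprint(path)
            while True:
                time.sleep(interval)
                if fingerprint(path) != baseline:
                    print(f"ALERT: {path} has been modified")
                    return

        if __name__ == "__main__":
            # Hypothetical path to a protected agent, e.g. an antivirus executable.
            watch("/usr/local/bin/protected_agent", interval=5.0)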

  • "Technology Leadership Is Key to Security"
    Financial Times (12/13/05) P. 17; McCormick, David

    Leadership in the technology realm is fast becoming the linchpin of U.S. security and prosperity, writes U.S. Undersecretary of Commerce David McCormick, adding that such leadership demands a delicate balance between attracting the top researchers in the world and ensuring that sensitive information to which foreign visitors may be privy does not pass into the wrong hands. While typically viewed as a trade-off between security and commerce, the restrictions on exporting sensitive U.S. technologies must support both, as preserving leadership depends on keeping the technologies close at hand, while an overly restrictive policy would undermine their commercial value. The U.S. is still the world leader in technology and research, though that position is in jeopardy if current trends continue, such as the closing gap between the U.S. and other nations in the number of patent applications and science and engineering degrees awarded. While federal education funding has increased 33% in the last five years, and federal research funding has nearly doubled to $132 billion, Asian universities are producing 47% of the world's engineering students, and almost half of U.S. patents come from foreign-born inventors. Foreign-born researchers accounted for almost 57% of postdoctoral science and engineering positions at U.S. universities in 2001, a figure that highlights the imperative of attracting the best minds in the world, but that also raises security concerns, given the widespread practice of foreign countries sending scholars and other professionals to the U.S. to collect sensitive data. The Commerce Department's solution preserves U.S. security without compromising research: access to sensitive technology is based on the country in which a foreign researcher holds citizenship or last held residence, which the department considers a more accurate indicator of true allegiance than country of birth.

  • "IT Sector Joins Forces to Remove Barriers to Women"
    Computerworld Australia (12/12/05)

    Australia's tech industry has developed a mentoring program with hopes of solidifying a role for women executives in information technology. The Women in IT Executive Mentoring (Witem) Program is designed to provide women with a fast track for career development, coaching and visibility from top executives, opportunities to improve skills and performance, and a clear career path. "Ultimately our goal is to establish a legacy of cross-organizational corporate culture that facilitates the sharing of ideas and experience, increases loyalty and commitment by employees, and fosters an environment that attracts, retains, and progresses female talent," says Joe Kremer, managing director of Dell Australia. Other participants include Altiris, Cisco, EMC, Ingram Micro, Intel, LAN Systems, and Lexmark. Women who have experience, commitment, and a drive to succeed in IT will be selected for the 12-month program. Witem comes at a time when women account for just 20.5% of the IT industry, compared with about 44% of the overall work force. Also, the number of women studying computer science is on the decline; they have accounted for approximately 25% of computer science students in recent years, according to the Australian Bureau of Statistics.
    Click Here to View Full Article

  • "E-Voting Deadline Looms"
    Technology Review (12/13/05); Wood, Lamont

    Election officials in the states that accepted a portion of the $3.2 billion Congress offered to improve voting systems through the Help America Vote Act must decide on replacement equipment by Jan. 1. E-voting systems have been the source of considerable controversy, however, as even direct-recording systems, designed to produce a measurable paper trail, have drawn criticism from disability advocates, who contend that the visually impaired have no way of knowing how their votes are cast. The controversy is also punctuated by security concerns, such as the recent suit filed by the Electronic Frontier Foundation against the North Carolina Board of Elections for certifying machines without obtaining access to their source code, as state law requires. Federal election standards are voluntary and vague, and even if any new measures are taken up, they will not be in place in time for the 2006 election. The Government Accountability Office has detailed numerous instances of e-voting machines malfunctioning and disrupting elections. Voter advocacy groups charge that the machines are susceptible to programmer error, and that optical scanners are more reliable, as they produce an auditable paper trail. Thirty U.S. states have laws requiring voting machines to produce a paper trail, compelling those that have already adopted e-voting machines to supplement them with printers to generate a written, auditable record. Voting machine manufacturers contend that auditing capabilities are already in place in each system, and that adding a printer is a prohibitive and unnecessary cost. E-voting machines were used by 25% of voters in the 2004 election, compared with 39.8% who voted via optical scan.
    Click Here to View Full Article

  • "Program Sorts Terrorism Data"
    Daily Bruin (12/08/05); Lowe, Shauntel

    UCLA researchers Rafail Ostrovsky and William Skeith have created code that organizes the online gathering, encryption, and discarding of terrorist-related information without anyone having knowledge of what data is kept by the agency and what data is thrown out (a toy illustration of the underlying idea follows this item). Agencies currently use a method in which they gather loads of possibly terrorist-related information off the Internet and then go through it in a classified setting so the top-secret process of analyzing the data is protected. Ostrovsky, a UCLA computer science professor, and Skeith, a graduate student, have been working on the code for the past year and a half, but the actual software program has not been developed yet; Ostrovsky estimates it may cost between $300,000 and $400,000 and take six months to build. Ostrovsky cites the Sept. 11 government report, which indicated that not enough information was collected fast enough, as motivation for developing the program. "This is one tool in the arsenal of many tools since 9/11 that are being developed, that can potentially help to stop future terrorist attacks," he says. Industry experts are hopeful the program can be used to fight terrorism around the world.
    Click Here to View Full Article
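
    The article does not describe the construction, but the general idea behind this kind of private searching can be illustrated with an additively homomorphic cipher: the agency publishes an encrypted dictionary in which every word maps to an encryption of 1 (a keyword of interest) or 0 (not of interest), and the collector multiplies the ciphertexts for the words it sees in each document, producing an encryption of the number of matches without ever learning which words mattered. The toy Paillier sketch below uses tiny, insecure demo primes and is purely illustrative; it is not the researchers' actual scheme, parameters, or code.

        import math
        import random

        # Textbook Paillier with insecure, demo-sized primes (illustration only).
        p, q = 293, 433
        n, n2, g = p * q, (p * q) ** 2, p * q + 1
        lam = math.lcm(p - 1, q - 1)

        def L(u):
            return (u - 1) // n

        mu = pow(L(pow(g, lam, n2)), -1, n)

        def encrypt(m):
            r = random.randrange(1, n)
            while math.gcd(r, n) != 1:
                r = random.randrange(1, n)
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

        def decrypt(c):
            return (L(pow(c, lam, n2)) * mu) % n

        # Agency side: every dictionary word gets a ciphertext, but E(1) and E(0)
        # are indistinguishable to the party doing the collecting.
        dictionary = ["uranium", "holiday", "detonator", "football"]
        keywords = {"uranium", "detonator"}           # the agency's secret interests
        encrypted_dict = {w: encrypt(1 if w in keywords else 0) for w in dictionary}

        def score(document_words):
            """Collector side: multiplying Paillier ciphertexts adds their plaintexts,
            so the result encrypts the number of keyword hits, unreadable to the collector."""
            acc = encrypt(0)
            for w in document_words:
                if w in encrypted_dict:
                    acc = (acc * encrypted_dict[w]) % n2
            return acc

        if __name__ == "__main__":
            doc = ["holiday", "uranium", "detonator"]
            print(decrypt(score(doc)))  # agency decrypts: 2 keyword matches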

  • "Steady Growth Triggers Optimism Among IT Pros"
    IT World Canada (12/08/05); Lombardi, Rosie

    The Canadian high-tech sector is on an even keel, with pay levels showing smooth and stable growth, reveals a recent survey by Mercer Human Resource Consulting. Mercer says the steady environment has generated a greater sense of optimism among IT professionals. Employers across the sector are projecting an average salary increase of 3.4% for 2006, which compares favorably with the cost of living. The survey also indicates a steady pace of growth among technical experts such as hardware and software engineers who are supporting new business development, along with growth in the total number of employees compared with 2004. Employers are also more willing to reward top performers, with total cash compensation for senior high-tech sales professionals up between 15% and 45%, says Bushen, a sign that companies are having better years and will pay more for highly skilled staff with a strong track record of developing new business and increasing revenue. Nevertheless, author and industry expert Rajesh Setty says that as IT skills become increasingly commoditized, IT professionals worldwide are being challenged to differentiate their skills. Instead of chasing the latest hot technology, Setty recommends that technology professionals develop expertise in specific areas such as user training.
    Click Here to View Full Article

  • "From "Toy Story" to "Chicken Little""
    Economist Technology Quarterly (12/05) Vol. 377, No. 3456, P. 24

    Computer animation in the movies has made astounding leaps and bounds since the first full-length CGI film, Pixar's "Toy Story," caused a sensation at the box office. The animation technique has remained consistent: Drawings and illustrations of characters and other elements are converted into digital wireframe models that are "rigged" to move, colored and textured via "shader" software, and finally animated. Hardware and software advances can facilitate more sophisticated modeling and rendering, and enhance the imagery by adding more realism and detail to environments and characters. But the increasing speed of computers seems to be lagging slightly behind the growing complexity of CGI movies. Among the technological challenges companies are still struggling with are the realistic depiction of fur and water, and crowd animation. The usual approach to CGI crowds is to animate a small number of characters and duplicate them, explains DreamWorks Animation CTO Ed Leonard. He notes that a newer method involves assigning behavioral traits to groups and characters and animating them as a coherent group (a simple illustration of this approach follows this item). The most formidable challenge for computer animators is depicting people, particularly human faces, realistically; attempts to do so in such films as "The Polar Express" have turned off viewers because of a lack of subtlety in digital interpretations of human movements and expressions. Leonard is less an advocate for realism in computer-generated characters than a supporter of characters capable of identifiable emotional expressions.
    Click Here to View Full Article
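
    Leonard's "behavioral traits" method is essentially agent-based simulation: each crowd member follows simple rules, and coherent group motion emerges. The sketch below is a deliberately minimal, generic example of the idea, with a single cohesion trait pulling each agent toward the crowd's center; it does not reflect any studio's actual tools.

        import random

        class Agent:
            """One crowd member with a position, a velocity, and a cohesion trait."""
            def __init__(self, x, y, cohesion):
                self.x, self.y = x, y
                self.vx, self.vy = 0.0, 0.0
                self.cohesion = cohesion  # how strongly this agent follows the group

        def step(agents, dt=1.0):
            """Advance the crowd one frame: each agent steers toward the group center."""
            cx = sum(a.x for a in agents) / len(agents)
            cy = sum(a.y for a in agents) / len(agents)
            for a in agents:
                a.vx += a.cohesion * (cx - a.x) * dt
                a.vy += a.cohesion * (cy - a.y) * dt
                a.x += a.vx * dt
                a.y += a.vy * dt

        if __name__ == "__main__":
            crowd = [Agent(random.uniform(0, 100), random.uniform(0, 100),
                           cohesion=random.uniform(0.01, 0.05)) for _ in range(50)]
            for frame in range(10):
                step(crowd, dt=0.1)
            print(f"frame 10: first agent at ({crowd[0].x:.1f}, {crowd[0].y:.1f})")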

  • "New Brain Trust to Work Like the Web"
    eWeek (12/12/05); Baker, M.L.

    Informatics software maker Teranode and the nonprofit Science Commons plan to launch NeuroCommons.org, a shared platform that will make it easier and faster for researchers to obtain information on brain function and disease, in the second half of 2006. The company will provide the infrastructure for the data repository and the means for storing data about genes and proteins in common formats, while the nonprofit will develop the interface for finding and analyzing content and will build a community of users. The interface will enable neurologists to find the datasets and protocols of other researchers, in addition to working models of the interaction of genes, proteins, and brain regions, similar to the way in which a Web search engine finds relevant Web sites. In addition, finding content and analyzing the results could be automated, which means researchers will not have to spend days searching literature and databases for data, according to Teranode CMO Matthew Shanahan. "The thought that a scientist can do that manually efficiently doesn't make sense; you really need the aid of software now," says Shanahan. Teranode will put the data into the Resource Description Framework (RDF), the markup language of the Semantic Web, giving users the ability to intuitively search and mark up each other's data (a small example of RDF follows this item).
    Click Here to View Full Article
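
    RDF represents data as subject-predicate-object triples, which is what lets datasets from different labs be merged and queried uniformly. The snippet below uses the open-source rdflib package to build a tiny, entirely hypothetical statement about a gene and a brain region; the namespace and property names are placeholders, not NeuroCommons' actual vocabulary.

        from rdflib import Graph, Literal, Namespace, RDF

        # Hypothetical namespace; NeuroCommons would define its own vocabularies.
        EX = Namespace("http://example.org/neuro#")

        g = Graph()
        g.bind("ex", EX)

        gene = EX["BDNF"]
        region = EX["hippocampus"]

        # Triples: the gene is a Gene, has a label, and is expressed in a brain region.
        g.add((gene, RDF.type, EX.Gene))
        g.add((gene, EX.label, Literal("brain-derived neurotrophic factor")))
        g.add((gene, EX.expressedIn, region))
        g.add((region, RDF.type, EX.BrainRegion))

        # Serialize to Turtle, one of the standard textual RDF formats.
        print(g.serialize(format="turtle"))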

  • "The Engineers Are Feeling Gloomy"
    Portland Business Journal (OR) (12/09/05); Earnshaw, Aliza

    Findings from a new survey are in line with recent reports that have revealed a lack of optimism from engineers about the future of their field. The more than 4,000 engineers surveyed by Portland technology public relations firm McClenahan Bruer Communications and CMP Media of Manhasset, N.Y., are also concerned about the quality of math and science education in the U.S., and the ability of the country to continue to be the leader in technology and innovation. Engineers also say they are frustrated because fewer Americans are graduating with engineering degrees, more software and electronic design work is being sent abroad, and U.S. students are not performing as well as foreigners in math and science. "There's no money in it, there's nothing but layoffs, and it's all being outsourced to India," says one engineer. Others talk about the lack of respect engineers receive compared to lawyers and physicians, and some respondents say they would not advise their children to follow their career path. Just 10% of engineers said they were confident that the U.S. would always remain the leader in technology and innovation, and many are concerned about the decline in federal government spending on basic research.
    Click Here to View Full Article

  • "Streamlined Databases Drive Military Simulation"
    Military & Aerospace Electronics (11/05) Vol. 16, No. 11, P. 34; Ames, Ben

    The fidelity and reliability of military simulations are steadily improving thanks to advances in display technology, data processing, portability, and, most importantly, software. Among the challenges of modern-day military simulation are the construction of digitized environments from databases compiled from classified government information; accurately simulating complex vehicles such as helicopters; enabling perfect synchronization of the various simulated elements; and recording each simulation run for instructional replay. Frank Deslisle of L-3 Link Simulation & Training notes that technology has improved to the point where simulations of geospecific areas can be built within a matter of months or days instead of years, and Link has made strides in display technology through such innovations as the SimuSphere visual display system and the advanced helmet-mounted display. Flight simulators used by the U.S. Army's 160th Special Operations Aviation Regiment-Airborne are fed accurate real-life scenarios from classified databases, and conventional simulation technology is unsuitable because of the time it takes to integrate databases. Linking several simulators in different locations for team training is another challenge, but CAE's Dave Graham says his company will soon supply the 160th regiment with a unified environment and database that allows smooth storage and retrieval of any subsystem conforming to the common-database standard. Simulators can be built for less money now thanks to the growing use of commercial off-the-shelf components, while splitting functions up among a set of PCs is another cost-saving strategy, one that requires designers to frame-synchronize results (a minimal sketch of frame synchronization follows this item). Some flight simulations require supercomputers to accurately, reliably, and quickly model elements such as missiles in flight, and managing the supercomputers' heat output is a core challenge.
    Click Here to View Full Article
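
    Frame-synchronizing results across the PCs of a split-up simulator essentially means that no machine may begin frame N+1 until every machine has finished frame N, which is what a barrier primitive provides. The sketch below uses Python's multiprocessing Barrier purely to illustrate the pattern; the node roles and frame count are assumptions, not any vendor's design.

        from multiprocessing import Barrier, Process

        NUM_NODES = 3   # e.g., one PC each for visuals, flight dynamics, and weapons
        FRAMES = 5

        def node(name, barrier):
            for frame in range(FRAMES):
                # ... compute this node's share of the frame here ...
                print(f"{name}: finished frame {frame}")
                # No node proceeds to the next frame until all have reached the barrier.
                barrier.wait()

        if __name__ == "__main__":
            barrier = Barrier(NUM_NODES)
            procs = [Process(target=node, args=(f"node-{i}", barrier))
                     for i in range(NUM_NODES)]
            for p in procs:
                p.start()
            for p in procs:
                p.join()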


 