
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 821:  Wednesday, July 27, 2005

  • "Bush's High-Tech Report Card"
    SiliconValley.com (07/25/05); Puzzanghera, Jim

    Silicon Valley executives give the Bush administration's efforts to shore up the U.S. high-tech industry a failing grade, complaining that too many goals remain unrealized. The White House received high marks for prioritizing education, passing class-action lawsuit reforms, and strongly promoting international trade; but the administration has raised the ire of critics with its failure to more aggressively facilitate the expansion of high-speed Internet access, raise the National Science Foundation's federal budget for basic physics and engineering research, or impede the passage of new regulatory accounting rules for stock options that many Silicon Valley companies are opposed to. A frequent complaint among high-tech executives is that leading figures in the Bush administration--including Bush himself, Vice President Cheney, and Commerce Secretary Carlos Gutierrez--follow an "old-economy" way of thinking and are loyal to industries with the same mindset. Co-chair of the President's Council of Advisors on Science and Technology Floyd Kvamme believes the Bush administration essentially understands the high-tech industry's needs, but admits the White House handled the stock options issue poorly. White House Office of Science and Technology Policy director John Marburger claims physical-science research funding has risen substantially since 2001, but argues the high-tech industry must take a more pragmatic view since so many issues are vying for the White House's attention in the post-9/11 environment. Rep. Anna Eshoo (D-Calif.) characterizes the Bush administration's high-tech agenda as "rudderless and visionless." Information Technology Industry Council President Rhett Dawson is more charitable, noting that the removal of obstacles to innovation, though important to the White House, must compete with other high-tech industry priorities.
    Click Here to View Full Article

  • "Japan Plans Mind-Boggling Number-Cruncher"
    New Scientist (07/26/05); Knight, Will

    Japan announced plans on July 25 to construct an amazingly powerful supercomputer for modeling climate change, the formation of galaxies, and new drug behavior that will likely cost between $712 million and $890 million and involve the participation of NEC, Hitachi, Kyushu University, and other Japanese institutions. The machine is expected to boast a peak performance of 10 petaflops, making it five times faster than the collective computing speed of the 500 fastest existing systems in the world. The current record-holder is Blue Gene/L, a joint venture of IBM and the U.S. government with a peak speed of 136.8 teraflops. Blue Gene/L and other leading supercomputers are designed around specialized, custom-made hardware instead of off-the-shelf chips, and the University of Tennessee's Jack Dongarra believes Japan's 10 petaflop machine will employ a hybrid architecture that perhaps utilizes several different types of processors. The development of Blue Gene/L and other systems was spurred by Japan's capture of the top supercomputing spot with NEC's Earth Simulator three years ago, and the announcement of the 10 petaflop machine is likely to rekindle East-West rivalry for the supercomputing throne. The Japanese supercomputer should be completed in 2011, according to officials from Japan's Ministry for Education, Culture, Sports, Science, and Technology. Dongarra says the announcement "opens up new horizons in science."
    Click Here to View Full Article

  • "Companies See 'Crisis' in R&D"
    National Journal's Technology Daily (07/25/05)

    A lack of leadership in federal cybersecurity research and development funding will have serious long-term consequences for the United States without a quick resolution, according to a Cyber Security Industry Alliance (CSIA) report released on July 25. The organization says cybersecurity has been sidelined as a federal R&D priority, even though cybercrimes have risen 1,295 percent over a recent five-year period. CSIA also criticized President Bush's June decision to disband the President's Information Technology Advisory Committee. "The loss of this independent committee's expertise and advice reduces the priority level of cybersecurity R&D, which will continue to dissipate without an advisory body to oversee R&D," warned CSIA executive director Paul Kurtz. The group noted that the Homeland Security Department's science and technology division plans to devote just 2 percent of its $1 billion budget for 2005 to cybersecurity. The CSIA report recommended that coordination of private and government cybersecurity initiatives should be the responsibility of a single federal office, while a national, long-term plan for federal and private computer system security should be mapped out within a decade. The group also called for additional congressional hearings on cybersecurity, and a government/industry alliance to develop more R&D projects.
    Click Here to View Full Article

  • "Insecurity at Black Hat"
    CNet (07/27/05); Evers, Joris

    Issues to be discussed at this week's Black Hat Briefings conference and the subsequent DefCon event include new kinds of hacker exploits and targets, such as weaknesses in antivirus software. Internet Security Systems (ISS) researchers Neel Mehta and Alex Wheeler plan to disclose vulnerabilities in antivirus products, which are becoming increasingly attractive to cybercriminals as their presence on PCs, servers, network gateways, and mobile devices expands. Mehta said he and Wheeler will demonstrate hacks that exploit known and patched flaws in antivirus software instead of new, as-yet-unpublished security bugs. Mehta said ISS has found flaws in offerings from McAfee, Symantec, F-Secure, and Trend Micro in the past year. A recently released Yankee Group research paper concludes that hackers are probing security software for weaknesses as Microsoft Windows becomes increasingly difficult to penetrate. Few Black Hat papers and presentations will concentrate on Windows hacks; other forms of hacker intrusion to be detailed at the conference include the penetration of Windows PCs through the use of USB keys, workarounds for Oracle database encryption, and potential vulnerabilities in Cisco's Internetwork Operating System.
    Click Here to View Full Article

  • "Privacy Guru Locks Down VoIP"
    Wired News (07/26/05); Zetter, Kim

    PGP email creator Phil Zimmermann will present his prototype for encrypting VoIP calls later this week at the BlackHat security conference in Las Vegas. Zimmermann says VoIP calls are susceptible to eavesdropping, and that hackers can sabotage calls and reroute them to an alternate number; he is concerned that just as people embraced email with little concern over the security risks, VoIP is attracting users who unwittingly assume that the calls they place are as secure as those that pass over the Public Switched Telephone Network. VoIP usage has swelled from 5 million customers in 2004 to 11 million today. Internet phone calls are not as easy to intercept as emails, but freely available software can tap into a VoIP network and record calls. VOIP Security Alliance Chairman David Endler notes that a protocol already exists for encrypting VoIP data, and VoIP phones with that option are already available. However, he says the lack of VoIP attacks as well as the complexity involved in using public key infrastructure-based encryption has likely turned off users. Zimmermann says his program doesn't use PKI, which he sees as needlessly complex. Though Zimmermann is cagey about many of the details of his program, he notes that he is still working through some non-security related bugs. The software will only work if both users have it installed on their phone or computer, and Zimmermann believes it will function both as a feature manufacturers can add on to VoIP phones and as software to be installed on a user's laptop. He believes his VoIP encryption program will receive a warm embrace from a security-conscious consumer base that seeks to shield itself from privacy encroachments such as the Patriot Act.
    Click Here to View Full Article

  • "Change in Daylight-Saving Time Could Confuse Some Programs"
    IDG News Service (07/26/05); Cowley, Stacy

    Congress is expected to soon pass the Energy Policy Act of 2005, which proposes a four-week extension of daylight-saving time (DST), an adjustment that could confuse applications and electronic gadgets programmed to re-calibrate their internal clocks according to the "summer schedule" the United States has adhered to for almost 20 years. Such adjustments, which are handled inconsistently worldwide, have long been a source of irritation for programmers and systems administrators. However, industry officials do not foresee costs and disarray on the order of Y2K: No Forrester Research or Gartner analysts are investigating the effects of a DST rescheduling, according to representatives from both research firms. In addition, several large vendors expect the proposed DST change's impact to be minor. "When the operating systems are updated to recognize the new dates, most of our products would automatically use the updated information," says Bob Gordon of Computer Associates International. A Slashdot discussion on DST effects attracted a wealth of comments, including one from a consultant who saw the change as a business opportunity; he posted that "You might say there is nothing to really worry about here, but all the more reason to sell yourself to clients." The pending energy bill would move the start of DST from April to March, while its end would subsequently shift from October to November. Legislative supporters claim the adjustment would save 100,000 barrels of oil per day.
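    The programs at risk are those that hard-code DST transition dates rather than consulting an updatable timezone database. A brief modern illustration of the database-driven approach (Python's zoneinfo module, which postdates this article and ships the maintained tz database rules) shows why: the UTC offset of a local time is looked up per date, so a rule change only requires a database update, not a code change.

    ```python
    from datetime import datetime
    from zoneinfo import ZoneInfo  # tz database rules, Python 3.9+

    eastern = ZoneInfo("America/New_York")

    # The UTC offset of a local time depends on whether DST is in
    # effect on that date; the tz database decides this per year,
    # so updated rules flow in without touching application code.
    winter = datetime(2025, 1, 15, 12, 0, tzinfo=eastern)
    summer = datetime(2025, 7, 15, 12, 0, tzinfo=eastern)
    print(winter.utcoffset())  # UTC-5, standard time
    print(summer.utcoffset())  # UTC-4, daylight time
    ```

    Software that instead bakes in "first Sunday in April" arithmetic keeps computing the pre-2005 transitions forever, which is exactly the failure mode the article describes.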
    Click Here to View Full Article

  • "Two Professors Go Fishing for Phishers"
    San Francisco Chronicle (07/25/05) P. E1; Kirby, Carrie

    Stanford computer science professors John Mitchell and Dan Boneh are leading a team developing anti-phishing tools designed to help email users avoid bogus Web sites and prevent crooks from stealing other people's passwords. The SpoofGuard software plug-in the team created last year examines each site visited by users for signs of phoniness, and alerts them if it spots anything suspicious. A second plug-in, PwdHash (password hash), scrambles the password typed into a site and creates a unique sign-on for each visited site; should a user sign on to a spoofed version of a legitimate site and be fooled into typing in his password, PwdHash will prevent the phishers from acquiring the password the authentic site received. In addition, PwdHash mitigates users' tendency to employ the same password at many different sites: because each site receives a different hashed password, thieves who capture one PwdHash-scrambled password cannot reuse it to log on elsewhere. PwdHash will be unveiled at a Baltimore security conference next week, while Boneh expects to release a third tool, the SpyBlock Trojan horse key-logging software deterrent, in six months. The tools are freely available as browser plug-ins on the Stanford Web site, although the researchers would prefer that such solutions be embedded within the major browsers.
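    The core idea behind a per-site password hash can be sketched in a few lines. This is an illustration of the general technique, not Stanford's actual PwdHash construction; the function name and encoding choices here are invented for the example.

    ```python
    import base64
    import hashlib
    import hmac

    def site_password(master_password: str, domain: str) -> str:
        """Derive a site-specific password by keying a hash with the
        master password and the visited site's domain. A password
        captured at a spoofed domain is then useless at the real one,
        and no two sites ever see the same credential."""
        digest = hmac.new(master_password.encode(), domain.encode(),
                          hashlib.sha256).digest()
        # Truncate and encode so the result fits typical password fields.
        return base64.urlsafe_b64encode(digest)[:16].decode()

    # The same master password yields a different credential per site,
    # so the phishing site learns nothing about the bank's password.
    bank = site_password("hunter2", "mybank.com")
    spoof = site_password("hunter2", "mybank-login.example")
    assert bank != spoof
    ```

    The derivation is deterministic, so the user types one master password everywhere while each site, legitimate or spoofed, receives a distinct value.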
    Click Here to View Full Article

  • "China Not a Big Tech Innovator, But It's Spending a Lot on R&D"
    Investor's Business Daily (07/26/05) P. A1; Tsuruoka, Doug

    Some experts say China's lack of technology innovation is keeping its economy from joining the ranks of world leaders such as the United States, although the country is spending billions of dollars on research and development. "China is making a lot of progress in technology, but they are still primarily a tech follower," says Rand political scientist Roger Cliff. He says the chief source of R&D funding is the Chinese government, which takes a heavy-handed approach to investment. On the other hand, a June report from Pentagon analyst Michael Pillsbury cites notable Chinese tech achievements such as a supercomputer that makes 11 trillion calculations per second, the design of a Pentium-style chip, and research into the development of molecular-scale nanotech devices. But CNA analyst Dean Cheng points out that Chinese software development has been poor, compared to hardware development. He also says the Cultural Revolution's bloody purges left the country's educational system in disarray and an entire generation illiterate. The result has been a shortage of Chinese researchers, even though the government is trying to step up its production of technical graduates. As China becomes more prosperous, the gap between the haves and the have-nots (which comprise about 75 percent of the Chinese population) will widen, cultivating social unrest that could represent a major threat to China's economy, according to Cheng.

  • "The Automated 'Virtual Commentator' for Video Content"
    IST Results (07/27/05)

    A "virtual commentator" algorithm developed by the IST-funded COGVISYS project can automatically create textual descriptions of video streams by assessing specific cues from the video input signal, which are translated into conceptual representations and then converted into natural language descriptions of video sequences. This technique in turn enables the checking of the computational processes that are employed to produce descriptions of complex external events from video recordings. Project coordinator Hans-Hellmut Nagel says the COGVISYS system could potentially be used as a video content search tool. A user seeking a video sequence on the Internet can enter a text search query, and "the system searches for suitable videos, interprets the content on the fly, and decides if the match is good enough to report back to [the user]," he explains. Working demonstrators of the COGVISYS approach were successfully developed as a vehicle driver assistance application, an application for converting U.S./U.K. sign language into text and then into speech, and a TV program analysis application. Nagel believes the COGVISYS system could find use in the domain of elderly care, as a tool that monitors the activities of aged residents in conjunction with a household video camera, and alerts caregivers when anomalous actions are observed. He says such technology could be commercially rolled out within a decade. Part of the COGVISYS software is available for download under an open source license, and some applications can be viewed on the project Web site.
    Click Here to View Full Article

  • "Dual Roles: The Changing Face of IT"
    IT Management (07/26/05); Gaudin, Sharon

    The erosion of U.S. computer programming and other low- to mid-level tech jobs as a result of offshoring and outsourcing is causing the IT worker's profile to change, concludes a new report from Forrester Research. Surviving senior-level job holders--CIOs, project managers, vendor managers and the like--must supplement their IT knowledge with refined business skills as well as a business-oriented mindset. Forrester research director Laurie Orlov estimates that five years ago the ratio of white-collar IT jobs to blue-collar jobs was 1 to 1, but Forrester analysts expect that ratio to have shifted to 3 to 1 by the end of the decade. Orlov also stresses that the new breed of IT worker, while more knowledgeable of the business side of his or her industry, must still stay up-to-date on the latest technologies. "They need to be conversant with [IT] so they can inspect the technologies being used [by outsourcers] and map them to your own architectural standards," she explains. Dice President Scott Melland agrees with Orlov's assessment about the changes the IT industry is experiencing, but is unconvinced that this shift is unfolding as quickly as the Forrester report indicates. "One of the big differences today, in terms of managing IT operations, is that the job is becoming much more one of managing inhouse and outsourced resources," he comments.
    Click Here to View Full Article

  • "Traffic Model Maps Congestion"
    Technology Research News (08/03/05); Patch, Kimberly

    Researchers at Oxford University in England have modeled networks in an effort to determine the optimum passage of traffic. Their model, a wheel with spokes on the inside, applies with equal validity to the flow of blood, street traffic, and computer networks, because in all those situations there are costs and benefits for each pattern of travel. Going around the periphery means no congestion, but a greater distance traveled; cutting through the central hub minimizes the distance traveled, but heavy traffic adds time to the trip. Oxford physics professor Neil Johnson found that by quantifying the cost of traveling through the center based on the time added by congestion, his team could determine the optimum number of connecting roads to be added to a hub before the benefit of shortening the distance was outweighed by increases in travel time. Their study found the ideal number of connections to the hub in a 1,000-node network to be 44. To reduce street traffic, planners could engineer optimal travel efficiency through the introduction of tolls or longer traffic lights in a city's center, and Johnson also raised the possibility that a network could adapt and change its pattern in response to traffic, as he suggests fungi may do as they transport nutrients from one point to another. The next level of research the team plans to undertake will involve more complex networks drawing on the findings of the current study, "like a network within a network," said Johnson. In partnering with another research team, the Oxford group will examine the patterns of fungi and how biology regulates and optimizes their traffic flow.
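    The article does not give the Oxford team's cost function, but the trade-off it describes can be captured in a toy model: on a ring of n nodes with k spokes to a central hub, more spokes shorten the walk to the hub yet funnel more traffic through it. The cost formula and constants below are illustrative inventions, not the paper's; they exist only to show that such a trade-off yields an interior optimum for k.

    ```python
    def trip_cost(n: int, k: int, c: float) -> float:
        """Toy average trip cost on a ring of n nodes with k spokes.
        Walking to the nearest of k evenly spaced spokes averages
        n/(4k) steps at each end, crossing the hub costs 2 hops, and
        congestion adds a delay c*k that grows as more spokes funnel
        traffic inward. (Illustrative model, not the paper's.)"""
        return 2 * (n / (4 * k)) + 2 + c * k

    def best_k(n: int, c: float) -> int:
        """Number of spokes minimizing the toy trip cost."""
        return min(range(1, n), key=lambda k: trip_cost(n, k, c))

    # Interior optimum: too few spokes means long walks, too many
    # congests the hub. With these made-up constants the optimum for
    # a 1,000-node ring lands in the mid-forties, the same regime as
    # the study's reported answer of 44.
    print(best_k(1000, 0.25))
    ```

    The qualitative shape, a distance term falling in k against a congestion term rising in k, is the point; any model with both terms produces a finite optimal number of hub connections.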
    Click Here to View Full Article

  • "Grid Group Issues Security Requirements"
    eWeek (07/25/05); Vaas, Lisa

    The Enterprise Grid Alliance (EGA) issued a list of security requirements that identifies and addresses vulnerabilities common to grids, including provisioning and deprovisioning and the potential weaknesses of a grid management entity, in the hope of spurring more widespread adoption of grid technology by enterprises. Though the list is not binding in any way, it can nonetheless serve as a checklist for any enterprise assessing its security profile. One recommendation is to ensure the solidity of standard security between the management entity and the grid components. The EGA Security Requirements expand on the group's May release of its Reference Model, which defined a body of grid terms and outlined the management and life cycles of the elements necessary for enterprise grids. EGA believes that grids may offer greater security than siloed areas, as they offer centralized management within a grid, and even if one component succumbs to a DoS attack, the application has a good chance of continuing to function. The industry still has reservations about the security of grids, as some believe that systems on public networks compromise application architecture. There also remains doubt whether the EGA standards will bring some cohesion to a diffuse field that can include linked storage systems, linked processing nodes, and virtual access environments. The goal of standards such as EGA's is to minimize the labor associated with operating and administering grids while enhancing security.
    Click Here to View Full Article

  • "The Weird Web and Other Safety Concerns"
    InternetNews.com (07/25/05); Needle, David

    Tech visionary Bill Joy, speaking at the recent AlwaysOn Innovation conference, discussed how during his tenure at Sun Microsystems he envisioned the concept of a "Here Web" that is always accessible through mobile devices. He also talked about the as-yet-unrealized "Weird Web," in which humans are connected to any number of objects via sensors. Joy said the Here Web concept is unfolding at a much slower pace than he expected, but cited manifestations of its presence in such technologies as iMode cell phones that support interactive gaming in Japan, and the Palm Treo combination Web browser/personal information manager/mobile phone. Joy discussed the Device to Device Web paradigm, in which "All devices that have electricity will get connected in a worldwide embedded sensor net." Technologies such a network could support include pacemakers that call doctors through cell phone links, and shoes that communicate with a PC to record the wearer's walking performance. Bill Joy's controversial opinion that technology needs strict management in order to avoid potentially disastrous consequences, which many people have criticized as alarmist, reared its head at the conference. Silicon Graphics scientist Jaron Lanier argued that society cannot advance without new technology, and pointed to the U.S. government's "tremendous retreat" from long-term science funding as the latest sign of an anti-science trend sweeping the country.
    Click Here to View Full Article

  • "Making Purchases at Your Fingers' Ends"
    Baltimore Sun (07/24/05) P. A1; Walker, Andrea K.

    Biometrics, the technology that identifies a person through certain physical attributes, is enjoying greater popularity in the consumer retail arena, where some companies offer a payment option in which a customer presses a finger to a screen and the payment information linked to that fingerprint is retrieved and billed directly. Amid rising concerns over identity theft, many consumers are looking for more secure methods of payment, though privacy advocates fear that biometric technologies pose a greater threat to personal security, as they involve storing considerable personal information in databases. Companies such as BioPay and Pay By Touch are betting that a consumer's desire for convenience will be the major engine to propel their biometric retail applications, though initial response has been muted due to retailers' fear of bugs. Biometrics has a long history dating back to at least the 14th century, and 20 years ago the Army tried using handprints for ATM use, but Jim Wayman, director of San Jose State University's Biometrics Identification Research Program, says, "Much of this stuff has never gotten out of the pilot stages." But as prices drop and the technology develops, many are betting that it will be the way of the future. Finger scan devices are making inroads in smaller retail settings, as national chains are typically slower to commit to new technologies. To sign up, customers visit a store that offers the service, have their fingers scanned, and enter the credit or debit card information they wish to have linked to their fingerprint. The machine converts a set of data points from the fingerprint into an encrypted number or an equation and adds it to a database. The companies offering the technology are quick to remind users that the process cannot be reversed to forge a fingerprint from the data points.
    Click Here to View Full Article

  • "Co-opting the Creative Revolution"
    BBC News (07/15/05); Twist, Jo

    Digital thinkers say organizations will have to get used to distributed groups of people working together to innovate on content now that more powerful and easy-to-use computing tools are in their hands. At the Technology, Entertainment and Design (TED) conference in Oxford, U.K., digital futurists cited as an example the emergence of the mountain bike in the United States, and how a few northern Californian consumers collaborated to create something that specifically met their need. Similarly, people are taking advantage of blogging services, peer-to-peer distribution of content, grid computing, and open source software to produce and share content. For example, people are using tag-based applications, and the keywords make it easier to classify content usefully and for others to find it. The people who are using tagging tools may have no idea how a real library is organized. Digital authority Clay Shirky says these people may not be credentialed librarians, but they are likely to determine how content is classified online in the years to come. Over the next 50 years, companies and other organizations will struggle with the creative contributions of Internet users due to patent and copyright concerns, digital experts say.
    Click Here to View Full Article

  • "Buggy Software: Up From a Low-Quality Quagmire"
    Computerworld (07/25/05) P. 23; Hildreth, Sue

    CIOs are studying how software bugs are introduced into the application development process and why they seem so resistant to prevention in an effort to stave off the tremendous losses in revenue, production, data, and customer satisfaction such flaws can entail. Experts on bad software blame the problem on poor application life-cycle management (ALM), and note that initiatives to improve software quality must encompass every stage of the software's existence--from planning through development, testing, and maintenance. Gartner analyst Theresa Lanowitz estimates that about 90 percent of all IT organizations are in the dark when it comes to effective ALM, and she concludes that the majority "waste quite a bit of their budget because they have bad business practices, fail to deliver on requirements, and fail to manage projects to meet schedule, cost, and quality goals." Clear communication between developers, testers, and business users must be established at the outset of the application's life cycle, and Tescom Software Systems' Arthur Povlot says most quality assurance problems can be traced to poor requirements. Sorin Fiscu with the Berkshire Life Insurance Company of America says developers should subject their code to specific QA tests before passing it on to the QA staff, while configuration management and change management policies and tools can help enforce a standard code creation and testing process. Once the code is passed off by developers, it must be rigorously tested for functionality, integration, performance, security, and any program changes or updates. Povlot recommends the creation of test cases for all the application's most crucial requirements. To maintain the software's quality after deployment, data collected during production must be re-entered into the requirements planning of the next iteration.
    Click Here to View Full Article

  • "Seamless Communications Closer as 3G, LAN Fuse"
    Nikkei Weekly (07/18/05) Vol. 43, No. 2192, P. 16; Hoyama, Taisei

    Researchers at Japan's National Institute of Information and Communications Technology (NICT) are working on a next-generation mobile network project that integrates third-generation cellular and wireless LAN technologies, overcoming protocol incompatibilities to combine the ubiquity of the former with the economy of the latter. The potential for a phone conversation to transfer from a cellular network to a LAN without interruption could have the commercial appeal of lowering a user's phone bills. To fully integrate all the wireless networks, NICT researchers must use IP for all data transmissions. Aiding this is the trend of telephone carriers building IP-based networks, which reduce business costs, leading to the more widespread adoption of mobile phone networking based on IP. The four-year, $9 million project, at the Yokosuka Radio Communications Research Center, involves 20 companies, including Sharp Corp., Denso Corp., and Mitsubishi Electric, as well as three universities. NICT lead wireless applications researcher Masahiro Kuroda says, "The technology will allow you to easily select the (communications) system that best fits your environment at any particular time and location." The researchers are also working on technology that would enable cell phones to automatically switch between 3G and wireless LAN networks. In addition, they are pursuing a plan dubbed "Metro Mobile Ring Network" that would connect fixed wireless base stations with a cable ring to enable high-speed mobile Internet technology. Such a system would make it possible for cell phone users on a fast-moving train, for example, to watch on-demand movies.

  • "Voting Machine Standards Move Forward"
    Today's Engineer (07/05); Costlow, Terry

    The widespread adoption of electronic voting systems in U.S. states and territories necessitates the continuous upgrading of voting machine and voting machine software standards, and the IEEE and the National Institute of Standards and Technology (NIST) are working on such standards. The IEEE Standards Coordinating Committee 38 (SCC 38) is developing a pair of e-voting standards: One that covers e-voting equipment requirements and assessment techniques for voting machine manufacturers or buyers, and another for formats to be employed by voting system components for electronic data interchange. The U.S. Election Assistance Commission (EAC) is considering proposals for voluntary voting systems guidelines from NIST's Technical Guidelines Development Committee, and IEEE SCC 38 Chairman Stephen Berger says "a lot of the IEEE material will be included in the EAC document and later we'll publish a separate IEEE standard that other countries could adopt." A representative of the EAC says the commission is focused on the augmentation of existing standards rather than the development of completely new standards. Updated standards can not only keep costs to a minimum, but also assure buyers that the products they are purchasing are appropriate. Stanford University professor David Dill says the deployment of e-voting equipment has been hampered by a dearth of standards, which provokes anxiety among equipment purchasers that the standards might change. Issues over the susceptibility of electronic equipment and the data they capture and store to tampering must be resolved through the collective effort of voting equipment designers, developers, and implementers, while standards committees are establishing the foundation for secure systems.
    Click Here to View Full Article

    For more on e-voting, visit http://www.acm.org/usacm.

  • "One Week With the Gurus"
    Software Development (07/05) Vol. 13, No. 7, P. 46; Wayne, Rick; Morales, Alexandra Weber; Lum, Rosalyn

    The 18th annual Software Development West Conference and Expo was a beehive of discussion about topics ranging from project agility to programming languages to user interface design. Keynote speaker and IBM computing pioneer Jerry Weinberg wryly observed that computing technology's awesome advances over the past half-century or so have been tempered by mostly unchanged human attitudes; debugging methodology, for example, is as hard now as it was 50 years ago because many general principles of computing and testing have not changed, yet people do not seem to realize this. Sun Microsystems Java architect David Hecksel moderated a panel discussion on agility where he talked about his "System and Method for Software Methodology Evaluation and Selection," for which he is seeking a patent. He described his invention as not so much an enforceable patent as a blueprint for an application. The panelists voiced divergent views on programmers' need for documentation, but most agreed that extreme programming, though useful, requires considerable customization for projects. A panel of elite computer language inventors discussed how literature, mathematics, and learning play into software development: JavaScript, C, and English were cited by panelists as the best languages, while SQL co-creator Don Chamberlin said computer languages were overstuffed with features because of "a tendency to feel like something that's more complicated is better." Perl creator Larry Wall reported an apparent breakdown in communications between language and mathematics, while Python creator Guido van Rossum disputed the idea that math forms the basis for programming. Classic System Solutions President Jim Hobart presented a tutorial on modern UI design that emphasized listening skills as paramount.
    Click Here to View Full Article
    (Access to this full article is available to paid subscribers only.)
