ACM TechNews is sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 808: Friday, June 24, 2005

  • "Bush Dissolves IT Advisory Group"
    Federal Computer Week (06/23/05); Sternstein, Aliya

    The President's IT Advisory Committee (PITAC), a congressionally mandated committee established in 1997 by President Clinton to provide guidance on various IT policies, has been shut down by President Bush. The latest executive order for PITAC expired on June 1, and no new members have been appointed. The group consisted of academic and industry experts who oversaw the Federal Networking and IT Research and Development (NITRD) Program. PITAC's last report, issued June 16, outlined a multi-decade roadmap for computational science, called for a fast-track study to find ways to use government funding to connect computational science research from various institutions, and provided guidance on topics such as high-end computing and software sustainability centers. PITAC members said the group next had planned to broadly re-examine IT research as a follow-up to its 1999 report. Dan Reed, vice chancellor and CIO at the University of North Carolina at Chapel Hill and leader of PITAC's computational science subcommittee, says, "People are a little demoralized about the fact that PITAC hasn't been renewed." Reed, the next chairman of the Computing Research Association, hopes PITAC will be re-formed soon.
    Click Here to View Full Article

  • "Election Auditing Is an End-to-End Procedure"
    Science (06/24/05) Vol. 308, No. 5730, P. 1873; Selker, Ted

    Electronic voting machines are becoming more accurate: the most significant drops from 2000 to 2004 in the residual vote rate, which measures uncounted votes, occurred in states that rely on electronic methods of ballot casting. Because of lackluster record-keeping, though, lingering doubts that undermine public trust in the machines remain. Congress' proposal to attach printers to direct recording electronic (DRE) voting machines to generate a paper trail has many drawbacks, including increased costs and the possibility of voter tampering. Resolving ongoing security concerns will require better code, but even more important is exhaustive testing to determine conclusively that no malware is present. To foil potential hackers, Maryland and California have employed a method known as parallel monitoring, in which a random set of machines is taken out of service on Election Day. Test ballots are then cast and checked for accuracy, and because parallel monitoring runs concurrently with live voting, hackers have no advance knowledge of which units actually count (a simple simulation of this spot-check appears below). Keeping ballots simple for voters and poll workers to use remains the most important step toward ensuring an accurate election. Other measures are in the works, such as the Low-Error Voting Interface (LEVI), which promises to improve accuracy by 50 percent. The best solution will be a more accurate, user-friendly DRE, with built-in redundancy and manipulation guards, that enables voters to view and correct their votes, combined with a secure paper trail. Audio prompts have also been shown to help voters navigate the ballot. In the end, however, some of the most obvious fixes, such as improved supervision and training of poll workers and clear ballot designs, will have the most immediate impact, and they need to be thoroughly reassessed in a public, non-partisan fashion that will better inform polling-machine legislation, writes Caltech/MIT Voting Technology Project co-director Ted Selker.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

    For information about ACM's e-voting activities, visit http://www.acm.org/usacm.
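
    As a rough illustration of the parallel-monitoring procedure described above, the sketch below pulls a random sample of machines, feeds each one a known script of test ballots, and flags any machine whose tally disagrees with the hand count. The fleet size, sample size, and tallying interface are made up for illustration; this is not Maryland's or California's actual protocol.

        # Simulate parallel monitoring: audit a random sample of machines
        # with a known ballot script and flag any tally that disagrees.
        import random

        def parallel_monitor(machines, test_ballots, sample_size=5):
            """machines maps a machine id to a function(ballots) -> tally dict."""
            expected = {}
            for ballot in test_ballots:              # tally the script by hand
                expected[ballot] = expected.get(ballot, 0) + 1
            suspect = []
            for machine_id in random.sample(list(machines), sample_size):
                if machines[machine_id](test_ballots) != expected:
                    suspect.append(machine_id)
            return suspect

        def honest_machine(ballots):
            """A well-behaved stand-in machine simply counts what it is given."""
            tally = {}
            for ballot in ballots:
                tally[ballot] = tally.get(ballot, 0) + 1
            return tally

        fleet = {"DRE-%d" % n: honest_machine for n in range(50)}
        print(parallel_monitor(fleet, ["A", "B", "A", "A", "C"]))   # [] means no discrepancies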

  • "An Army of Soulless 1's and 0's"
    New York Times (06/24/05) P. C1; Labaton, Stephen

    By luring Internet users with an enticing offer just one click away, hackers are seizing control of thousands of computers that they can then deploy to attack other Web sites or crack security codes. These computers, known as zombies, are compromised when their users take the bait and click on the offer, which immediately downloads software onto the computer, enabling it to be controlled remotely, frequently without the user's knowledge. By marshaling thousands of computers to request a given Web site's pages simultaneously, an approach known as a denial-of-service attack, hackers can effectively shut down a site. Al Jazeera, Microsoft, and the White House have all had their sites come under this sort of attack. The number of zombie computers is growing: CipherTrust reports that in May, 172,000 new zombies were identified each day, compared with 157,000 the previous month, with hackers preying especially on Chinese computers that lack software protection against such attacks. High-speed connections have enabled hackers to target individuals within households, who are typically the most vulnerable. Due to their high bandwidth, computers on college campuses are also popular targets. One case currently being prosecuted in New Jersey involves an online merchant that hired a hacker to create zombie networks to attack the sites of two competitors, so that anyone attempting to view those sites would be met with an error message. Cautious clicking and comprehensive antispam and antivirus software are the best ways for users to protect themselves, but individual users, who often "don't have the knowledge to protect themselves," said CipherTrust's Dmitri Alperovitch, "pose a threat to all the rest of us."
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Net Guru Predicts Another 10 'Wild' Years"
    CNETAsia (06/23/05); Yu, Eileen

    The future of the Web demands a class of politicians sufficiently well-versed in technology to introduce intelligent, forward-thinking legislation, according to David Farber, computer science professor at Carnegie Mellon University. Fearful that the Web could come to generate more problems than solutions, Farber believes "the next 10 years will be as wild as the last 25." In a climate where hackers are increasingly motivated by personal greed, Farber insists that security must remain the industry's top focus. Farber noted that "politicians don't like the Internet...they don't like losing control." It is important not to lose sight of the Internet's value as a communication tool, however, said Tan Geok Long, CTO of Infocomm Development Authority of Singapore, adding that "government has to ensure that the quality of information is high so user trust can be maintained." Farber emphasized the challenge of regulating the Internet without trampling on First Amendment rights. In detailing his concerns for the Internet's future, Farber lamented that "software vendors are saying that researchers should not be allowed to look at the source codes, or say anything [publicly] if they find a security flaw. Or otherwise the vendors would threaten to throw them in jail. And that's wrong."
    Click Here to View Full Article

  • "Pentagon Creating Student Database"
    Washington Post (06/23/05) P. A1; Krim, Jonathan

    The Department of Defense on Wednesday started working with BeNow, a private marketing company, to compile a database of high-school students between the ages of 16 and 18 to help the military locate possible recruits during a period of falling enlistment in certain branches. The database will include personal data such as birth dates, Social Security numbers, email addresses, grade-point averages, ethnic background, and what subjects the students are studying. Privacy activists say the plan appears to be an effort to circumvent laws that limit the government's right to obtain or retain citizen data by using private companies to perform the work. Certain data on high-school students is already provided to military recruiters under a different program created by provisions of the No Child Left Behind Act. School systems that do not provide that data risk losing federal money, although individual parents or students can withhold the information their districts would otherwise send to the military. Under the new system, additional information would be obtained from commercial data brokers, state drivers' license files, and other sources, including data already possessed by the military.
    Click Here to View Full Article

  • "Quantum Computer Springs a Leak"
    New Scientist (06/25/05) Vol. 186, No. 2505, P. 18; Buchanan, Mark

    Physicists from Leiden University in the Netherlands have demonstrated the limits of quantum computing designed around ever-smaller quantum bits, or qubits, the units in which stored information is manipulated. Qubits involved in computation must be isolated from their environment, since any outside disturbance can lead to "decoherence" and ruin calculations. Coherence is more difficult to maintain in larger qubits, since there is more chance of interaction with the surroundings. To limit this effect, researchers are trying to develop microscopic qubits using superconducting circuits on silicon chips or drops of semiconducting material containing free electrons; in essence, qubits can be constructed out of individual electrons and photons. But the Dutch researchers have shown that there is a "universal decoherence rate" for qubits: the smaller the system, the shorter the time before decoherence sets in. This means that quantum data will inevitably leak away after a certain time, even with no outside disturbance. Instead of staying in a superposition of two states, a qubit will spontaneously collapse into one state or the other (see the illustration below).
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)
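
    In generic textbook terms (a standard illustration, not the Leiden group's specific calculation), a qubit starts out in a superposition and decoherence appears as the decay of the off-diagonal entries of its density matrix over some characteristic time tau, which here stands in for whatever limit the universal rate implies:

        |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

        \rho(t) = \begin{pmatrix} |\alpha|^2 & \alpha\beta^{*}\,e^{-t/\tau} \\ \alpha^{*}\beta\,e^{-t/\tau} & |\beta|^2 \end{pmatrix}
        \;\longrightarrow\;
        \begin{pmatrix} |\alpha|^2 & 0 \\ 0 & |\beta|^2 \end{pmatrix}
        \quad \text{as } t \gg \tau

    Once the off-diagonal terms vanish, the qubit behaves like an ordinary bit that reads 0 with probability |alpha|^2 and 1 with probability |beta|^2; the Leiden result implies that tau cannot be extended indefinitely by isolation alone.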

  • "Supercomputers Tackle More Everyday Tasks"
    Investor's Business Daily (06/23/05) P. A5; Brown, Ken Spencer

    Supercomputers are gaining practical applications and becoming widely used, from the manufacturing of consumer products such as potato chips to graphics animation. As prices drop and systems gain speed, many businesses have found that advanced computer systems are no longer relevant only to government. Suzy Tichenor of the High Performance Computing Initiative believes that "high-performance computing really needs to become part of the fabric of the innovation infrastructure in the U.S." She argues that increased spending and training will lead to greater adoption of supercomputers, but adds that it is important not to overlook existing technologies. Supercomputer technology helps companies such as Procter & Gamble model product and packaging innovations without costly physical testing. Falling prices have also been a significant catalyst in the animation industry, allowing for cheaper, faster rendering of images. Technology such as Hewlett-Packard's "utility rendering" has enabled DreamWorks Animation SKG to strive for more than twice the annual output of rival Pixar. Other applications include clothing design and crime solving, where the technology can reconstruct the circumstances of a crime from inputs such as eyewitness accounts and bullet trajectories.

  • "Better PC Security Years Away"
    TechNewsWorld (06/22/05); Mello, John P.

    The immediate future of secure computing will more closely resemble a mainframe than a PC, at least until an enhanced operating system and better hardware are developed. In the meantime, researchers are working on technologies to improve PC security, such as the Trusted Platform Module (TPM), which establishes a protected hardware area inside a PC to support security programs. Intel, AMD, and Microsoft are also jumping on board with their own PC security technologies. Intel's Chad Taggard said, "What we're doing with this hardware and the Trusted Platform Module is taking best known security methods and putting them where people can't tamper with them." AMD's technology closes the "warm boot hole" that let hackers access data in a computer that had just been restarted with its power still on, by wiping the immediate memory. Microsoft's next Windows version, code-named Longhorn, will be vital to its Next-Generation Secure Computing Base (NGSCB), as well as to the future of the secure PC in general, though by some estimates the technology will not be fully realized until 2009 or 2010. Computer Associates' John Bedrick cautioned, "These aren't going to be a panacea for everything." He added that while there are no sure bets, "what we all try to do is improve what we have and try to get ahead of the curve as much as possible," allowing that hackers will evolve just as security technologies do.
    Click Here to View Full Article

  • "Sun Tries Sharing Java Again; Still Not Open Source"
    CNet (06/21/05); Shankland, Stephen

    At next week's JavaOne conference in San Francisco, Sun Microsystems will discuss plans for GlassFish, a project that provides a window into the code under its Java Research License (JRL), though it stops well short of offering open-source access. Part of the company's "share" campaign, "GlassFish is a renewed partnership between Sun and the larger enterprise Java community," says Sun's Web site. The software, which will allow users to view source code and post suggestions for its improvement, is "Sun's way to try to generate a community to fix bugs and create test cases and add value to the Java platform for free," says analyst Anne Thomas Manes. Part of Sun's reluctance to release Java as open source stems from the fear that others could take the code and generate incompatible versions. As a compromise, Sun sees GlassFish as reaching out to developers who have long clamored for involvement in the code's evolution, though important pages such as the frequently asked questions are available only to users who sign a licensing agreement. While the arrangement is similar to Sun's previous Sun Community Source License (SCSL), GlassFish will employ a simpler mechanism for viewing the code, though if the code is intended "for a productive use" or for distribution, Sun's Web site says users must sign a commercial agreement.
    Click Here to View Full Article

  • "Is IT Unfriendly to Women?"
    TechRepublic (06/21/05); Armstrong, Judy

    Women, who comprise 20 percent of the IT labor force but more than 50 percent of the overall workforce, face significant barriers in the IT industry. The Department of Labor Women's Bureau reports that women earn just 9 percent of the bachelor's degrees related to engineering and fewer than 28 percent of those awarded for computer science, down 37 percent in the last 20 years. Especially rare are women at the CIO and executive levels. Many women are deterred from pursuing IT careers because the time commitment often comes at the expense of their family, and a recent survey found that more than half of the women in IT are working more hours than they had expected. The general absence of women deprives the field of the distinct approach they take to problem-solving and the compassion they bring as managers, as well as simply reducing the overall supply of talent. As children, girls often do not receive the same encouragement from educators to use computers, and many commercial products, such as computer games, are marketed to boys. Adopting the traditionally male approach to networking in the name of advancement would help women advance, as would the industry's adoption of more flexible scheduling and telecommuting to accommodate family commitments. Women also must rise to the challenges of IT, confidently pursuing the projects that no one else wants to tackle in order to stand out and further their careers. Many female CIOs and executives are met with resistance from their staff simply because they are women, but the best defense against such discrimination is to keep an even keel, concentrate on building relationships, and avoid overcompensating to prove one's competence.
    Click Here to View Full Article

    For information on ACM's Committee on Women and Computing, visit http://www.acm.org/women.

  • "When Computers Play Games, Artificial Intelligence Is the Key to Victory"
    Stanford University (06/20/05); Madden, Kendall

    The achievement of better general game playing (GGP) is the subject of an article by Stanford computer science professor Michael Genesereth appearing in this summer's AI Magazine. "Programs that think better should be able to win more games," said Genesereth, who believes game programs that think for themselves will transcend the Deep Blue model, which merely reflects the intelligence of the programmer. Unlike Deep Blue, the celebrated chess program, GGP relies on a computer's ability to learn and comprehend rules (a minimal sketch of such a general player appears below). Reflecting a return to original computer science theory, GGP learns information and applies it to new situations. Hodge-podge, a game that runs chess, checkers, and tic-tac-toe at the same time, highlights the difference between human and computer intelligence because the computer cannot distinguish among the different sets of rules. Genesereth's research in GGP seeks to overcome that problem. He envisions GGP yielding value for businesses that want to avoid costly reprogramming when new regulations arise. Stanford will host a GGP competition this July at the American Association for Artificial Intelligence conference in Pittsburgh. As he seeks to bridge the gap between human and artificial intelligence, Genesereth defines intelligence as "synthesizing a wide array of information and making a decision."
    Click Here to View Full Article
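
    As a rough sketch of the idea (the interface and the toy game below are hypothetical, not Genesereth's actual game-description language), a general player knows nothing about any particular game; it only uses a uniform rules interface, so the same search code plays whatever game it is handed:

        # A hypothetical general game player: the search code below never
        # mentions chess, checkers, or tic-tac-toe; it only calls the
        # uniform rules interface. (Single-agent max search for brevity;
        # a real player would also model opponents.)

        class Game:
            def initial_state(self): ...
            def legal_moves(self, state): ...        # moves available in a state
            def next_state(self, state, move): ...   # result of applying a move
            def terminal_value(self, state): ...     # None if not over, else a score

        def best_move(game, state, depth=4):
            def value(s, d):
                outcome = game.terminal_value(s)
                if outcome is not None or d == 0:
                    return 0 if outcome is None else outcome
                return max(value(game.next_state(s, m), d - 1)
                           for m in game.legal_moves(s))
            return max(game.legal_moves(state),
                       key=lambda m: value(game.next_state(state, m), depth - 1))

        class PickUpSticks(Game):
            """Toy game: remove 1 or 2 sticks; emptying the pile scores 1."""
            def initial_state(self):
                return 5
            def legal_moves(self, state):
                return [m for m in (1, 2) if m <= state]
            def next_state(self, state, move):
                return state - move
            def terminal_value(self, state):
                return 1 if state == 0 else None

        game = PickUpSticks()
        print(best_move(game, game.initial_state()))   # the generic player picks a move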

  • "A Wireless World, Bound to Sockets"
    Washington Post (06/19/05) P. A1; Noguchi, Yuki

    Tech companies are focusing more on preserving and prolonging battery life as consumers accumulate gadgets such as handhelds, cell phones, iPods, laptops, and digital cameras. For consumers who have embraced the wireless world, recharging these devices has become a troublesome issue. A professional on a business trip may have to carry several chargers for the various gadgets, and sit on the floor of an airport to get close enough to a wall plug. Those who are not on the road are likely to charge their wireless devices in the kitchen, competing with toasters and coffeemakers for outlets, and the devices dominate sockets when it comes time to turn in for bed. Device users continue to demand longer battery life, although improvements to power and circuitry are being made each year. Intel expects to have new battery technology available by 2008 that will make laptops usable for approximately eight hours without external power. Convergence of these technologies onto one gadget is lauded in the tech community, but manufacturers have been slow to standardize chargers. The average consumer has 5.5 power devices, according to Charles R. Mollo, CEO of Mobility Electronics, which makes iGo, a universal power adapter that can charge several different devices at a time.
    Click Here to View Full Article

  • "New Version of Linux Kernel Released"
    Techworld (06/21/05); Broersma, Matthew

    A new version of the Linux kernel has been released for the first time since Linus Torvalds changed the system used to manage the kernel source code, a switch that has slowed the pace of development. Version 2.6.12 of the Linux kernel, which comes more than three months after version 2.6.11, offers support for Trusted Platform Module (TPM) chips, a hardware-based security scheme that stores cryptographic keys, passwords, and digital certificates on the motherboard. A driver has been introduced to support the embedding of security measures in hardware, including TPM devices from National Semiconductor and Atmel. Also, enhancements have been made to IPv6, SELinux, the Software Suspend feature, and the device mapper; upgrades have been made to drivers for DVB, USB, networks, and sound chips; and improvements have been made to the CIFS, JFS, and XFS file systems. Another major change is the addition of an address space randomization feature, which randomizes the memory layout of processes to make attacks such as buffer-overflow exploits harder to carry out (a quick way to observe the effect appears below).
    Click Here to View Full Article
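
    Address space randomization can be observed from user space. The snippet below is a generic illustration (it assumes a Linux machine with glibc, and is not tied to the 2.6.12 implementation): it prints where the C library's printf landed in memory, an address that changes from run to run when randomization is active.

        # Print the address of a libc symbol; with address space
        # randomization enabled, the value differs on each run.
        import ctypes

        libc = ctypes.CDLL("libc.so.6")   # assumes Linux with glibc
        print(hex(ctypes.cast(libc.printf, ctypes.c_void_p).value))

    Running the script twice and comparing the two addresses is enough to see whether randomization is in effect.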

  • "File Systems That Fly"
    InformationWeek (06/20/05) No. 1044, P. 43; Ricadela, Aaron

    As more and more supercomputers are cobbled together from inexpensive, off-the-shelf PCs, disk drives, Ethernet cables, and Linux, the latest developments in file-system software for these clusters will alter how companies purchase storage and help close the gap between the slow movement of data and the huge jumps made in the speed of microprocessors and memory. Input-output speeds rise considerably with cluster file systems such as the Department of Energy-backed, open-source Lustre, which boosts speeds from the range of hundreds of megabytes per second to 2 GB per second per computer, and scales in proportion with the number of attached computers. Hewlett-Packard, a distributor of Lustre, will soon release the second version of Scalable File Share, a Lustre-based product that will boost the speed of clustered Linux machines to 35 GB per second and double the total storage to 512 TB. By distributing blocks of data across hundreds or thousands of servers, Lustre avoids the trap of traditional file systems, where one computer opening a piece of data can block or slow another computer's access (a toy sketch of this striping idea appears below). As clusters now claim 296 of the world's 500 fastest supercomputers, companies are also starting to redefine the way they shop for storage, placing more of a premium on speed than on size. "Lustre has gotten a remarkable amount of traction," says Myricom CEO Chuck Seitz of a technology relied upon increasingly by the science and business communities. Increased disk-communication speeds will be especially welcome in the banking, automobile, and aerospace industries, among others that rely on input-output-intensive software. While it shows great promise, Lustre is still relatively new and unproven, and it is incompatible with systems not powered by Linux or Catamount, drawbacks that have kept some managers from rushing to adopt it.
    Click Here to View Full Article
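
    The performance gain comes from striping: rather than one server holding an entire file, fixed-size blocks are dealt out across many storage servers so clients can read and write them in parallel. The sketch below is a toy round-robin striping scheme that illustrates that general idea; it is not Lustre's actual layout, metadata protocol, or API.

        # Toy striping: block i of a file lives on target i % len(targets).
        # Real cluster file systems add metadata servers, locking, and
        # recovery on top of this basic placement rule.
        BLOCK_SIZE = 1 << 20                       # 1 MB stripes

        def stripe(data, targets):
            """Write data round-robin across the targets (plain dicts here)."""
            for offset in range(0, len(data), BLOCK_SIZE):
                block_no = offset // BLOCK_SIZE
                targets[block_no % len(targets)][block_no] = data[offset:offset + BLOCK_SIZE]

        def reassemble(length, targets):
            """Read the blocks back in order and concatenate them."""
            blocks = (length + BLOCK_SIZE - 1) // BLOCK_SIZE
            return b"".join(targets[b % len(targets)][b] for b in range(blocks))

        servers = [dict() for _ in range(4)]       # four stand-in storage servers
        payload = b"x" * (3 * BLOCK_SIZE + 123)
        stripe(payload, servers)
        assert reassemble(len(payload), servers) == payload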

  • "Large Users Hope for Broader Adoption of Usability Standard"
    Computerworld (06/20/05) P. 1; Thibodeau, Patrick

    A three-year-old usability standard should gain greater acceptance in the business community when it is approved by the International Organization for Standardization (ISO). The standard, called the Common Industry Format for Usability Test Reports (CIF), is expected to gain momentum once it is adopted internationally, and the European demand for communication across borders will help solidify its position. CIF reports usability test results in a common format that gives prospective software buyers an idea of "the real costs of ownership," said Jack Means of State Farm. CIF, developed jointly by Microsoft, Intel, IBM, and others, owes the impetus for its creation largely to Boeing, which was experiencing costly usability issues. Thomas Tullis of Fidelity Investments, another supporter of CIF, says the standard helps companies make sound purchasing decisions because "the usability of the software that you buy on an enterprise-wide basis potentially has a really significant impact on the productivity of your employees." CIF will enjoy greater acceptance once customers start to ask vendors for usability reports before they purchase software, though that could substantially alter how developers build applications. The standard was accepted by the ISO's technology standards committee last month and now awaits full ISO approval.
    Click Here to View Full Article

  • "Humanistic Approaches for Digital-Media Studies"
    Chronicle of Higher Education (06/24/05) Vol. 51, No. 42, P. B26; Murray, Janet

    The absence of a universal standard for evaluating the effectiveness of technical applications in the humanities leaves designers unsure of how best to improve their digital projects, such as online newspapers and interactive museum exhibits. In developing a curriculum for teaching principles of digital media, an interdisciplinary approach is needed, one that nourishes dialog between the humanities and the more technical fields of computer science, engineering, and architecture. The digital-media program at Georgia Tech draws students from diverse academic backgrounds and insists that they spend some time working collaboratively outside their area of specialization to gain the breadth of experience they will need in their professional lives. One required course, "Project Studio," challenges students to produce a finished digital-media product, such as a computer game, in one semester, with faculty direction on how to apply fundamental design principles to a project that will enhance their professional portfolios. To get past the interdisciplinary quagmire that stymies many emerging academic departments, Georgia Tech had its digital and new-media faculty compose lists of texts for students to select from in preparation for their final exams, a process that highlighted the areas of common interest in each other's work. As other schools adopt digital-media programs at the undergraduate, graduate, and PhD levels, it is imperative for the emerging discipline to avoid the fissure between analysis and creation found across academia, where the evaluators and the creators seldom communicate, writes Janet H. Murray, a professor at the Georgia Institute of Technology's School of Literature, Communication, and Culture.
    Click Here to View Full Article

  • "Lift Off at Last?"
    InfoWorld (06/13/05) Vol. 27, No. 24, P. 39; Snyder, Jason

    The 2005 InfoWorld Compensation Survey reveals that although company performance, salaries, bonuses, job opportunities, and spending in the information technology sector are all improving, dissatisfaction and uncertainty about the industry are also on the rise. More IT workers have low morale because their workload has increased over the years as their companies trimmed staff and spending, and they (IT staff more than managers) are also concerned about an increase in outsourcing work overseas. While overall IT salaries increased for the first time in three years, rising 2.7 percent, senior managers saw a 3.3 percent dip in their base pay but a 36 percent surge in bonuses, which suggests that companies want to tie upper-level compensation to company performance. The survey of 1,510 IT professionals also reveals that the percentage of middle managers who say they are final decision-makers has increased by more than 50 percent, but the mounting years of little improvement in budgets and take-home pay have left more mid-level managers open to work opportunities elsewhere. At the same time, respondents say their employers do not respect IT and view it as a cost center, and 45 percent say they are dissatisfied with IT's perceived value, compared with 37 percent a year ago. New job opportunities and higher salaries will result in some job-hopping, but IT workers feel no more secure about their jobs than they did last year. Mark Ohlund, vice president of technology strategy at a third-party logistics company, says IT will earn respect once it starts improving how technology supports the business side of a company. Bonuses tied to business performance appear to be one strategy companies are using to get that point across to IT managers.
    Click Here to View Full Article

  • "The People Own Ideas!"
    Technology Review (06/05) Vol. 108, No. 6, P. 46; Lessig, Lawrence

    Stanford Law School professor and author Lawrence Lessig writes that Brazil and other nations are pushing a movement to erect a "free-culture" economy atop a platform of free software. He draws a parallel between the free-software and free-culture movements by noting that each grew out of a development that stripped a practice of its freedom (the advent of proprietary code in the case of free software, and a broadening of copyright regulation's scope in the case of free culture), and that activists in both movements are attempting to restore that freedom through the use of technology and law. When American copyright law was amended to automatically shield any creative work with a federal copyright in the absence of registration, renewal, or notice, the consequences for digital content were not immediately felt. But the emergence of digital technologies has dramatically extended copyright regulation to encompass ordinary uses of the material. Copyright owners' determination to regulate the use of digital technologies through digital rights management (DRM) is an issue of concern to the free-culture movement, because DRM effectively prevents people from applying other people's creative works in new contexts, a practice essential to cultural growth that Lessig terms "remixing." He illustrates his point by noting that current copyright rules make the remixing of copyrighted digital content an act of infringement. Brazil's free-culture movement does not reject fundamental economic principles, as many assume; rather, it seeks to restructure copyright law to align it more closely with digital technology so that creativity and cultural growth can continue to flourish.
    Click Here to View Full Article


 