

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 850:  Wednesday, October 5, 2005

  • "E-Voting Report Could Push Audit Trails"
    CNet (10/04/05); McCullagh, Declan

    Electronic voting machines should be equipped with voter-verifiable paper audit trails, says a recent report from an election commission headed by former President Jimmy Carter and former Secretary of State James Baker III. The report claims such a measure would enable recounts when required, make citizens more confident that their votes are accurately recorded, and permit a random assortment of e-voting machines to be tested for accuracy. Researchers have warned that a lack of audit trails makes e-voting machines susceptible to problems such as viruses, software bugs, and malign programming. The report will probably spur initiatives by state and federal lawmakers to institute some form of audit trails. Electionline.org estimates that at least 18 U.S. states have enacted rules of some kind requiring polling places to provide voter-viewable hardcopy records.
    Click Here to View Full Article

  • "Text Hackers Could Jam Cellphones, a Study Says"
    New York Times (10/05/05) P. C1; Schwartz, John

    Metropolitan cell phone networks could be crippled by hackers who launch denial-of-service attacks against the phones' Internet-accessible text-messaging services, according to a study from Pennsylvania State University researchers. The study's lead researcher, computer science and engineering professor Patrick McDaniel, says hackers could hinder voice calls by clogging the control channel for cell phone calls with text messages. McDaniel and colleagues say they validated the feasibility of this scenario by demonstrating it on a small scale with their own cell phones, and their findings were corroborated by government regulators and phone company engineers. Cellular companies insist they have established deterrents to address the threat, though experts such as Cigital CTO Gary McGraw believe the solutions will likely be inelegant. The Penn State researchers' report cites the impracticality of severing the link between the phones' short messaging services and the Internet gateways, but suggests security could be added by restricting the message traffic that is fed into the network. Fencing in voice and data in next-generation cell phones to prevent traffic jams from blocking voice calls is another recommendation of the paper, which will be posted online and presented at the 12th ACM Conference on Computer and Communications Security (CCS'05) in November. Aviel D. Rubin, technical director of Johns Hopkins University's Information Security Institute, says, "Anytime a vulnerability in the physical world exists that can be exploited via computer programs running on the Internet, we have a recipe for disaster."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
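
    A minimal back-of-the-envelope sketch of the saturation mechanism the researchers describe: if each text message occupies a shared control channel for some holding time, a modest message rate can consume all of a sector's call-setup capacity. This is not the Penn State methodology; the channel count and holding time below are illustrative assumptions only.

# Toy model of SMS traffic saturating a cell sector's control channels.
# SDCCH_CHANNELS and SMS_HOLD_TIME are assumed values for illustration,
# not figures from the study described above.

SDCCH_CHANNELS = 12   # assumed dedicated control channels per sector
SMS_HOLD_TIME = 1.5   # assumed seconds a channel is occupied per message

def saturating_rate(channels=SDCCH_CHANNELS, hold_time=SMS_HOLD_TIME):
    """Messages per second that keep every control channel busy,
    leaving none free for voice-call setup signaling."""
    return channels / hold_time

def utilization(msgs_per_sec, channels=SDCCH_CHANNELS, hold_time=SMS_HOLD_TIME):
    """Fraction of control-channel capacity consumed by SMS traffic."""
    return msgs_per_sec * hold_time / channels

if __name__ == "__main__":
    print(f"~{saturating_rate():.0f} msgs/sec saturate one sector under these assumptions")
    for m in (2, 4, 8):
        print(f"{m} msgs/sec -> {utilization(m):.0%} of control capacity")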

  • "3-D Movies Piggyback on Digital Cinema Supply Chain"
    InformationWeek (10/03/05); Sullivan, Laurie

    Popular filmmakers are pushing for a return to 3D cinema thanks to the emergence of more advanced technologies that promise to make the dubious 3D experience of yesterday obsolete. Standards are currently lacking, but 3D film technology has hitched itself to the supply chain for digital cinema. Notable advances over older 3D film technologies include In-Three's dimensionalization process for converting standard 2D films into 3D during postproduction. Digital 3D can employ a single standard digital light processing cinema projector and a single server capable of dual streaming, though linking two servers together is another option for theaters. The battery-powered LCD eyewear this approach requires, however, is bulky and expensive, though DLP Cinema's Glenn Kennel says work is under way on cheap, lightweight glasses that could cut the cost tenfold. Real D, meanwhile, uses Z-Switch technology involving an active polarizer attached to the front of the projection lens, which polarizes each alternating image so it can only be viewed through the lens of disposable, low-cost glasses. Real D CEO Josh Greer expects roughly 1,000 screens to become capable of running 3D movies within a year and a half, "assuming we can get enough projectors." A panel of industry executives discussed the business, production, and distribution trends of digital cinema and 3D at an event hosted by the Entertainment Technology Center at the University of Southern California and ACM's Los Angeles Siggraph chapter.
    Click Here to View Full Article

  • "Female Equation"
    Washington Times (10/03/05) P. B1; Widhalm, Shelley

    There are fewer women professionals in the math and computer science fields because fewer female college and university students are pursuing studies in those areas. Jill Landsman, with the Technology Student Association in Virginia, attributes this downward trend to a lack of female role models. Mary Jean Harrold, with the National Science Foundation's Advance program, says girls taking computer science classes may perceive a computer science career as socially isolating and personally unrewarding. National Alliance for Partnership Equity executive director Mimi Lufkin thinks such views are nurtured by the competitive environment of computer science classes and their emphasis on theory and individual performance rather than practical application and teamwork, while additional discouragement can come from the media and parents. Anita Borg Institute for Women and Technology CEO Telle Whitney says, "What women often express is that they do feel alone. They look around and don't see people who look like them." Colleges in the Washington, D.C., metro area are attempting to provide female role models for students: Sanjay Rai, dean of Montgomery College's science, engineering, and mathematics department, says more than half of his department's personnel are women. American University recruits female faculty members to encourage higher enrollments of female students in its math department, according to AU professor Mary Gray. In addition, the AU math department encourages students to socialize with faculty members or each other through special events.
    Click Here to View Full Article

  • "The Time Is Now: Bust Up the Box!"
    New York Times (10/05/05) P. E1; Markoff, John

    Thanks to the millions of miles of fiber-optic cable telecom companies have laid, the vision of seamless and inexpensive interconnection among computers is fast becoming a reality. The era of integration is exemplified by Google, which holds an inordinate network capacity to power its more than 100,000 processors spread across a dozen data centers around the world. New Web services are linking programs running on disparate servers and in different locations, a so-called "mash-up" of services that often is offered for free; Google is a frequent target of mashing efforts, as consumers, scientists, and corporations alike tap into Google's extensive network to distribute their services. The optical circuits that can carry up to 10 billion bits of information per second, known as lambdas, are linking machines around the world to create a new breed of supercomputers that transmit data at the speed of light. The exponential increases in network speed have virtually eliminated geographical boundaries, and facilitated technologies such as Speciflic 2.0, a form of distributed cinema to be unveiled later this month at the University of California, San Diego, that will project a movie onto clustered displays featuring both live and filmed actors. "The story isn't just told, it's experienced," said Adriene Jenik, an associate professor of computer and media arts at UCSD. The university's iGrid workshop has showcased a network capable of transmitting 100 billion bits of data per second, as well as the optical networks enabling next-generation supercomputing efforts such as National LambdaRail and TeraGrid. Hurricanes Katrina and Rita have underscored the practical necessity for networks that can transmit information without delay, as some relief efforts were impeded by slow computer connections.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
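
    Some quick, idealized arithmetic on the link speeds the article quotes (10 billion bits per second per lambda, and the 100 billion bits per second shown at iGrid) conveys the scale; the one-terabyte payload below is an arbitrary example, and protocol overhead and latency are ignored.

# Idealized transfer times at the optical link speeds quoted above.
# The 1 TB dataset size is an arbitrary example, not from the article.

def transfer_seconds(num_bytes, bits_per_second):
    """Time to move num_bytes at the given line rate, overhead ignored."""
    return num_bytes * 8 / bits_per_second

ONE_TB = 10**12  # one terabyte (decimal)

for label, rate in (("10 Gb/s lambda", 10e9), ("100 Gb/s iGrid demo", 100e9)):
    minutes = transfer_seconds(ONE_TB, rate) / 60
    print(f"{label}: 1 TB in ~{minutes:.1f} minutes")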

  • "USC's Michael Arbib"
    Technology Research News (10/03/05); Smalley, Eric

    University of Southern California Fletcher Jones Professor of Computer Science Michael Arbib describes the exchange of ideas between brain studies and machine design as his area of interest. He believes exploiting advances in data collection as they take place will help us better understand the brain and mind. Arbib envisions emotional robots in the future, but says emotions that are useful for robot-to-robot interaction do not necessarily have to resemble those of "mammalian humans." Arbib says the ability to learn will be essential if machines are to perform effectively in unfamiliar and dynamic surroundings, but perhaps even more critical will be a series of "high-level descriptors" that can quickly establish practical initial conditions for learning by rapidly sorting out environmental features that can be explained by schemas obtained elsewhere. To determine the key social challenges associated with current cutting-edge technologies, Arbib suggests that we "turn the question round and get our heads straight about what are the important social questions, then ask where the solutions lie--whether in education, technology, politics or elsewhere." Future technological breakthroughs he foresees include the integration of molecular biology, medicine, nanotechnology, and computer-aided diagnosis, which will yield a methodology for developing drugs customized for the individual patient's body chemistry and ailments.
    Click Here to View Full Article

  • "A New Battlefield: Ownership of Ideas"
    International Herald Tribune (10/03/05); Kanter, James; Shannon, Victoria; O'Brien, Kevin J.

    The generation of ideas is an essential component of economic growth and competition, but experts in the governmental, academic, and corporate domains agree that this concept is under threat as ideas are redefined by governments and companies as assets that must be jealously guarded. The free exchange of ideas is battling against software patents and other measures to control innovative concepts for profit; companies, inventors, distributors, and even entire countries are ensnared in this tug of war, which experts warn could stifle true innovation and harm competition. "Our standards-setting process risks being corrupted by having people filing for, and getting, any patents they want," says Josh Lerner of Harvard Business School. Governments have boosted the appeal of patent ownership by providing companies more safeguards against copying and other challenges through long terms of ownership and sympathetic rulings. Meanwhile, patent offices around the world are evaluating patents with less rigor in an effort to reduce their massive application backlogs, resulting in the approval of patents of dubious quality. Attempts to rein in the rampant patenting of ideas in Europe and elsewhere are being undone by diverging opinions on the best way to protect ideas. Critics say the rising tide of patent applications complicates the task of evaluating the value of the underlying concepts. Brandeis University dean of arts and sciences Adam Jaffe says, "Today's collaborative technologies are presenting a real challenge for patent law, and for the kinds of thinking that emerged at the time of Thomas Edison and Alexander Graham Bell, when individual inventions seemed much more distinct, much less complex."
    Click Here to View Full Article

  • "Emotional Intelligence May Be Good Predictor of Success in Computing Studies"
    EurekAlert (10/04/05)

    The emotional intelligence of students plays an indirect role in how well they excel in information technology studies, according to a study by researchers at Virginia Tech's Pamplin College of Business. The study involved the participation of more than 600 undergraduates, both minorities and non-minorities, at over 40 U.S. institutions. The experiment evaluated how well students in computer science and information systems functioned under stress and how their grades reflected their levels of emotional intelligence, described as "the ability to perceive, assess, and positively influence personal and others' emotions." Research team member France Belanger says coping tactics and emotional intelligence were measured to determine whether something greater than innate intelligence is needed to tackle the challenges of rigorous curricula, and the researchers concluded that students with higher emotional intelligence levels were more self-confident and aware that they could effectively cope with any problems, which subsequently fed into their enhanced academic performance. "One of the implications of these findings is that computing curricula might need to be redesigned to include emotional intelligence training, which is a learnable skill," notes Belanger. The study is part of a three-year, National Science Foundation-funded project exploring how intrapersonal as well as interpersonal variables play into the recruitment and retention of students in IT studies, with particular emphasis on minorities.
    Click Here to View Full Article

  • "This Laser Trick's a Quantum Leap"
    Wired News (10/04/05); Hudson, John

    Researchers at the Australian National University's Laser Physics Center have successfully slowed down a pulse of laser light, trapped it within a praseodymium-doped silicate crystal, and released it. ANU's Dr. Matthew Sellars says this represents a milestone in the quest for quantum computing, since information can be mapped onto the slowed-down light, transferred to the crystal, and recalled when the pulse is released. "If we can store the light pulses for a very long time, we have a memory that operates on a quantum scale," he says. Scientists can use photons to map data onto light beams, and photons exhibit a property known as quantum superposition, in which their spin is oriented both up and down at the same time until they are measured or observed. Superposition can be harnessed to generate a basic quantum unit of information, or qubit, that can represent several values simultaneously. This gives quantum systems far more efficiency than classical systems, which translates into superior processing speeds for quantum computers.
    Click Here to View Full Article
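
    A minimal numeric illustration of the superposition idea described above, using nothing more than a normalized two-component state vector: measurement probabilities come from the squared amplitudes, and combining qubits multiplies the number of amplitudes, which is where the efficiency claim comes from. This is a textbook-style sketch, not the ANU experiment.

# A qubit as a normalized state vector; measurement probabilities are
# the squared amplitudes (the Born rule). Illustrative only.
import numpy as np

qubit = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of up and down

p_up, p_down = np.abs(qubit) ** 2
print(f"P(up) = {p_up:.2f}, P(down) = {p_down:.2f}")

# n qubits together carry 2**n amplitudes: two qubits span a
# 4-dimensional state space via the tensor (Kronecker) product.
two_qubits = np.kron(qubit, qubit)
print("two-qubit amplitudes:", np.round(two_qubits, 3))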

  • "Linus Torvalds Outburst Sparks Fierce Debate: Does Open Source Software Need Specs?"
    SYS-CON (10/03/05)

    In a recent posting on the Linux Kernel Mailing List, Linus Torvalds blasted specs as a method for developing software, claiming they pay more attention to theory than reality and introduce a needless level of abstraction that often falls flat when put into practice. Torvalds cited the OSI network model protocols, which he characterized as "classic spec-design, which had absolutely zero relevance for the real world." He admitted that specs could serve as a useful framework for discussion, but that they should have no bearing on the actual design. Feedback to Torvalds' posting was varied, as some immediately took him to task, insisting that specs, like actual code, were by definition approximations, while others supported his view that their use is limited. One respondent noted that many original Unix specs were quite good, while later standards such as HTML and SOAP were well-written, but ultimately functioned poorly due to sloppy designs. One post criticized Torvalds for being too narrow in his definition, claiming that his view of a spec as an unchanging template incapable of adapting throughout the development process is not held by anyone in the industry. Another reader claimed specs are essential to commercial projects, as they serve essentially as a contract, defining the scope and function of a given body of code. By way of mediation, one respondent sought to clarify Torvalds' position, arguing against the misconception that programs can be written through the progressive revision of specifications; rather, specs are useful insofar as they guide discussion, but they must yield to technical requirements as the development process unfolds.
    Click Here to View Full Article

  • "Looking Into the Future"
    The Shorthorn (10/04/05); Dowden, Cole

    University of Texas at Arlington professors Diane Cook and Larry Holder have been awarded a $500,000 Defense Advanced Research Projects Agency (DARPA) grant for SUBDUE, a project to develop a computer program that perceives interesting patterns in data presented in graph form. The program can easily analyze large data fragments as well as examine and pinpoint relationships between data points. SUBDUE represents a positive instance of the phenomenon under study as a graph and then compares that graph against additional graphs; this gives researchers the opportunity to blend the information and outline a "predictive model to identify emerging criminal networks." The program essentially learns from itself, and DARPA believes Cook and Holder's research has potential analysis applications for a broad spectrum of data. The employment of artificial intelligence in the program is not without controversy, since opinions on the subject range from fears that AI could subjugate mankind to skepticism that our current technology can deliver on AI's promises. Other funders of the SUBDUE project include the National Science Foundation, NASA, and the Texas Higher Education Coordinating Board.
    Click Here to View Full Article
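
    As a rough sketch of the general idea behind this kind of graph mining, the toy below counts repeated labeled substructures (here, single edges) and ranks them by frequency. The real SUBDUE system grows larger candidate substructures and scores them by how well they compress the graph; this shows only the simplest first step, and the example data is invented.

# Toy frequent-substructure count over a labeled graph: the simplest
# first step of SUBDUE-style pattern discovery. Example data invented.
from collections import Counter

edges = [  # (source_label, relation, target_label)
    ("person", "calls", "person"),
    ("person", "calls", "person"),
    ("person", "wires_money", "account"),
    ("person", "calls", "person"),
    ("account", "transfers", "account"),
    ("person", "wires_money", "account"),
]

for (src, rel, dst), n in Counter(edges).most_common():
    print(f"{n}x  {src} -[{rel}]-> {dst}")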

  • "E-Voting Experts Call for Revised Security Guidelines"
    Security Focus (10/03/05); Lemos, Robert

    The National Science Foundation-funded A Center for Correct, Usable, Reliable, Auditable, and Transparent Elections (ACCURATE) submitted its suggested reforms to the U.S. Election Assistance Commission's recommended process for assessing electronic voting system security on the last day of the public comment period on the EAC's Voluntary Voting System Guidelines. The center's researchers stated that security is not built into the design of current voting systems, while current testing procedures ignore security in favor of functionality. ACCURATE director and Johns Hopkins University computer science professor Avi Rubin criticized e-voting machines for their lack of public testing and increasing incomprehensibility to average voters. The ACCURATE researchers have recommended the establishment of public and transparent procedures for testing and certifying e-voting systems, and the collection of data on Election Day to ensure better system evaluation. Many system vendors have balked, asserting that such measures would threaten their intellectual property or permit reckless claims to be made against their products. Some technologists believe complete transparency should be incorporated into e-voting systems by basing them on open-source software, and a template for such a system is being developed with funding from the Open Voting Consortium. Rubin said open source software will not solve all problems associated with e-voting system security, noting that the code would require intense auditing and careful maintenance. In addition, he warned that the inclusion of any proprietary software in the system would endanger the system's overall security.
    Click Here to View Full Article

  • "Can Sun, or Anyone, Make DRM Better With Open Source?"
    IT Manager's Journal (09/30/05); Lyman, Jay

    Sun Microsystems is looking for partners for its open source digital rights management (DRM) initiative aimed at fostering innovation and compensating rights holders, though some are concerned about the restrictions Sun's DReaM (DRM everywhere available) Project could entail. Sun's Open Media Commons is the company's latest effort to achieve open, royalty-free DRM across the industry. Sun also plans to apply DRM more broadly and to develop a forum for creators, owners, and consumers, one that will receive its initial impetus from the virtual meeting place Sun has created for all those involved with the project. Sun intends to make source code available to non-participating members of the community under defined terms of usage and licensing. The notion of competing proprietary DRM models impedes innovation, Sun maintains, arguing that its solution will both encourage innovation and compensate rights holders. Opponents of DRM, such as the Electronic Frontier Foundation (EFF), argue that openness does not mitigate the inherent restrictions imposed by DRM. The EFF believes Sun's initiative will actually hinder the open source community, as it creates a de facto monopoly that is more restrictive than keeping content in the hands of Hollywood. By contrast, others in the industry believe Sun's project will lead to a more interoperable DRM that will be a boon to both users and rights holders. Industry debate aside, the jury is still out on whether rights holders will buy into Sun's initiative, which will likely determine the ultimate success of the project.
    Click Here to View Full Article

  • "Cray's Rottsolk: HPC's 'Eternal Optimist'"
    HPC Wire (09/30/05) Vol. 14, No. 39; Meade, Peter

    James Rottsolk recently relinquished his position as CEO of Cray, though he intends to stay involved in the company he co-founded, as he will serve on the board and remains committed to making it profitable again. Since posting $21 million in profits in 2003, Cray has fallen on hard times, though Rottsolk believes his easing the transition to new CEO Peter Ungaro will help him focus on the company's much-needed rejuvenation efforts as it launches several ambitious projects. Industry analysts agree that much of the brilliance of Cray's research went unrewarded because it was out of sync with the market, as was especially the case with its ill-fated merger with Silicon Graphics. Ungaro's tenure has been promising, as Cray has made moves toward regaining its industry-leadership status with a renewed focus on design, rather than high-performance computers created with commodity hardware. A unique challenge to the supercomputer industry is its close association with government-funded organizations that operate under rigid budget controls, such as Oak Ridge National Laboratory, which has imposed stifling delays on the sales process. Although Oak Ridge and other government-funded groups have been valuable customers for Cray, the company is striving to make inroads in the commercial sector, where it will have to demonstrate its unique value over the existing microprocessors that can be bought off the shelf. As the marketplace moves toward component composition, Cray could become more of a boutique firm that, while it may have difficulty competing with clustered solutions, will always have a niche in the high-performance computing industry.
    Click Here to View Full Article

  • "IT Groups Push Congress to Raise H-1B Visa Limits"
    Computerworld (10/03/05) P. 12; Thibodeau, Patrick

    High-tech industry organizations such as the Information Technology Association of America are pushing for Congress to raise the H-1B cap, which reached its 65,000-annual limit two months before the beginning of the 2006 fiscal year, while industry lobbyists and other reform advocates expect any amendments to ultimately be incorporated into a broader immigration reform bill. Suggestions that may end up in legislative proposals include a flexible cap that would permit the number of new visas to "rise as needed," according to American Council on International Personnel executive director Lynn Shotwell. Another revision could allow foreign workers, specifically those who hold advanced degrees from American universities, to become permanent residents and thus avoid H-1Bs entirely. ITAA President Harris Miller is uncertain that Congress will take action on the issue this year, but he cited the importance of foreign workers to U.S. businesses and warned that America is damaging its own global competitiveness through measures such as the already depleted fiscal 2006 H-1B cap. Last year, Congress approved 20,000 additional H-1B visas for foreign nationals with advanced degrees from U.S. schools, while the new E-3 visa program sets aside 10,500 visas for Australian residents hired by American companies.
    Click Here to View Full Article

  • "The WiMAX Wait Is Over"
    Electronic Design (09/15/05) Vol. 53, No. 20, P. 55; Frenzel, Louis E.

    WiMAX is set to move beyond the design stage and into the world of real products in the coming months. Initially developed for metro-area networks and standardized as 802.16d, WiMAX can be deployed on both licensed and unlicensed spectra, with the most common bands being 3.5GHz in Europe and Asia and 2.3GHz to 2.5GHz and 5.8GHz in the U.S. Applications can either use frequency-division duplexing (FDD)--a full-duplex technology usually implemented in licensed spectra that offer paired frequencies--or time-division duplexing (TDD), used for unlicensed applications. The typical basestation radius will be between 2km and 10km, although there is a maximum range of about 30 miles, and each basestation in a fixed service is expected to handle several hundred consumer connections as well as a few dozen T1/E1-like business connections. It is expected that the chief application for WiMAX will be last-mile/first-mile service, while another major application would be inexpensive backhaul for Wi-Fi hot spots and cell sites. Eventually, some experts believe that the dominant form of WiMAX will be a mobile/portable version that will allow computers to roam with cell phone-like handoffs. The IEEE is working on the 802.16e standard for this, with ratification expected in late 2005 or early 2006; yet another standard, which could be called 802.16f, would provide roaming and handoffs between WiMAX and Wi-Fi. With certification being handled by the WiMAX Forum, semiconductor companies have already created chipsets that will make it easy to design CPEs and basestations for WiMAX.
    Click Here to View Full Article

  • "Fortifying DOD's Network Defenses"
    Federal Computer Week (09/26/05) Vol. 19, No. 33, P. 60; Tiboni, Frank

    As attacks on Defense Department (DOD) computer networks increase, Purdue University computer science professor Eugene Spafford calls for the creation of a new generation of computer systems and security tools. However, such a project will require long-term research. Meanwhile, Spafford recommends six steps to better protect DOD computer networks: Basing security purchases on effectiveness rather than cost; severely limiting access to computer systems; removing all unnecessary systems; narrowing the number of users that can add hardware and software to the networks; requiring training and supervision of all network users; and implementing network-monitoring practices. Spafford laments that the government is not currently funding long-term cybersecurity research that is key to designing a new and highly effective network security system for federal agencies. Most security used to protect federal agency networks is designed for commercial use and not to protect highly sensitive data. SANS Institute research director Alan Paller says network security is not about implementing the latest security methods but more about preventing attacks up to 18 months in advance. An anonymous Defense Information Systems Agency official reports a change in DOD security that involves moving to a service-oriented architecture to facilitate data sharing among agencies as well as more effective IT services. Also, the new structure makes the Joint Task Force-Global Network Operations in charge of defending, operating, and maintaining the DOD's information infrastructure, according to the official who says, "We have many challenges in synchronizing the many IT efforts and security for [networks] across [the DOD's] vast infrastructure."
    Click Here to View Full Article

  • "Are Attackers Winning the Arms Race?"
    InfoWorld (09/26/05) Vol. 27, No. 39, P. 22; Grimes, Roger

    The severity and speed of malware attacks, as well as the skill of those who orchestrate them, are increasing as hacking becomes more professional and profit-oriented. Forty-nine percent of 474 individuals surveyed in this year's InfoWorld Security Research Report said increasingly sophisticated cyberattacks represented the most serious security challenge their companies will face in the next 12 months, while 57% listed viruses as the top network security threat. Respondents reported facing an average of 368 intrusion attempts each in the preceding 12 months, and an average of 44% of those attacks were successful. Malware's formerly stagnant nature is shifting toward a "mothership approach" in which a malicious program, once it has infected a computer, links to outside servers and downloads new instructions or programs. Hackers are designing worms to configure into bot networks that hijack thousands of PCs, which are "rented out" to criminal businesses or organizations. Much of present-day malware exploits patched and unpatched vulnerabilities in Internet browsers, while the interim between the announcement of a vulnerability and the emergence of an exploit is shrinking. The InfoWorld poll found that anti-spyware software and appliances will experience the biggest purchasing increases in the next year. Strong adoption continues for intrusion detection and intrusion protection systems, but a greater number of administrators are enabling those products' blocking functionality.
    Click Here to View Full Article

  • "A Conversation With Roger Sessions and Terry Coatta"
    Queue (09/05) Vol. 3, No. 7; Allman, Eric

    In a discussion moderated by Sendmail CTO Eric Allman, ObjectWatch CEO Roger Sessions and Silicon Chalk VP Terry Coatta converse in depth about objects, components, and Web services. Sessions and Coatta agree that the CORBA architecture failed because it favored application programming interfaces at the expense of interoperability; Sessions foresees a similar fate for J2EE. However, he disagrees with Allman's assertion that CORBA is more successful than Web services, arguing that "relatively few" CORBA apps have succeeded. Coatta attributes a wide array of emerging Web services applications to Microsoft vastly simplifying Web service creation. Sessions thinks this transparency eliminates a deeper understanding of why, when, and where Web services should be built, which is needed to ensure their effectiveness; he therefore expects architectures to be very shoddy. Sessions characterizes the differences between objects and components as a design distinction, but Coatta makes the case for a component-based architecture with the argument that component technology supplies an interception point. Sessions says the differentiating factor between the CORBA specifications and the Web services standards is the need for developers to understand the former. "You need to understand architecturally what you need to do to build effective Web services, but as far as how the Web services standards move information around, that's not your problem," he concludes.
    Click Here to View Full Article
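
    Coatta's "interception point" argument lends itself to a small sketch: when clients reach a component only through an interface, a proxy can be slid between caller and implementation to add logging, caching, or access checks without touching the component itself. The names below are invented for illustration and do not come from the conversation.

# Interface-mediated component access creates a natural interception
# point; here a logging proxy wraps the real component. Names invented.
from typing import Protocol

class OrderService(Protocol):
    def place_order(self, item: str, qty: int) -> str: ...

class RealOrderService:
    def place_order(self, item: str, qty: int) -> str:
        return f"order-{item}-{qty}"

class LoggingProxy:
    """Intercepts every call before delegating to the wrapped service."""
    def __init__(self, inner: OrderService) -> None:
        self._inner = inner

    def place_order(self, item: str, qty: int) -> str:
        print(f"[intercepted] place_order({item!r}, {qty})")
        return self._inner.place_order(item, qty)

service: OrderService = LoggingProxy(RealOrderService())
print(service.place_order("widget", 3))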


 