
ACM TechNews is sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 629: Friday, April 9, 2004

  • "See-Through Voting Software"
    Wired News (04/08/04); Zetter, Kim

    VoteHere is proposing a vastly more transparent approach to electronic voting than other companies offer, allowing public inspection not only of its source code but also of the entire voting transcript. The electronic voting systems firm does not make voting machines itself, but licenses its VoteHere Technology Inside software, which can be integrated into current touch-screen voting setups. So far, only Sequoia Voting Systems has signed up, though it has not said when it will install the software on its machines and will have to obtain federal and state recertification when it does. VoteHere uses a unique audit scheme that generates a random code for each candidate on a ballot. Voters print out a receipt with the code, then cast their ballot; after the election, they can check whether their vote was counted correctly by viewing results on the county Web site. VoteHere CEO Jim Adler says this scheme allows independent verification not only by voters themselves, but also by independent watchdog groups and computer experts. Even if only a small percentage of voters retain their receipts and check their votes after the election, that would provide enough oversight to ensure no one tampered with the results. Microsoft cryptographer Josh Benaloh has studied VoteHere's methodology and technology, and says making the vote transparent to the public is the best way to guard against fraud. While the paper receipt is nice, Benaloh says it is not as important as the encryption in VoteHere's scheme. The company's source code was copied by a hacker in December, but Adler says VoteHere had already decided to make its source code public by that time.
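
    The receipt-and-check idea can be pictured with a toy sketch (a simplification for illustration only, not VoteHere's actual cryptographic protocol; the function names and the use of plain random codes rather than cryptographic commitments are assumptions):

```python
import secrets

# Toy model of receipt-based vote verification (illustration only; VoteHere's
# real scheme relies on cryptography, not bare random codes).

def issue_ballot(candidates):
    """Assign a fresh random code to each candidate on one ballot."""
    return {c: secrets.token_hex(4) for c in candidates}

def cast_vote(ballot_codes, choice):
    """The voter keeps only the code printed next to the chosen candidate."""
    return ballot_codes[choice]

def publish_results(cast_receipts):
    """The election office publishes the set of counted receipt codes."""
    return set(cast_receipts)

def voter_check(my_code, published):
    """After the election, a voter confirms their code appears in the tally."""
    return my_code in published

ballot = issue_ballot(["Alice", "Bob"])
receipt = cast_vote(ballot, "Alice")
bulletin_board = publish_results([receipt])
print(voter_check(receipt, bulletin_board))  # True if the vote was counted
```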

  • "Taking High-Tech to a New Dimension"
    Monterey County Herald (04/08/04); Howe, Kevin

    The Web3D 2004 Symposium sheds light on attempts to share information among computer devices for a number of 3D-display applications, ranging from game-playing to war-fighting. Sun Microsystems game technologies chief architect Doug Twilleager described a scenario in which 3D content could be delivered wherever the user happens to be. Currently, computer devices are merely "content destinations" that serve up what content creators want, but in the future, standard interfaces would enable devices to be "content portals" that provide users with the 3D applications of their choice. Twilleager said making 3D content widely available on a variety of devices would increase their usefulness and users' level of participation. This is especially important in the computer games market, he said, imagining a time when he would be able to play the computer game of his choice on the TV in his hotel room. Navy Capt. Chris Gunderson said the military is also interested in harnessing 3D technology across a wider range of devices, regardless of which proprietary hardware and software is used. He said 3D and 4D computer graphics provide the military with information critical to fighting elusive enemies, noting that good pictures quickly convey a large amount of information. He cited a need for more standardization among information systems before information can be harnessed in this way. Existing military information systems are like custom houses and cannot be reconfigured to meet different tasks as needed, he said. Monterey Bay Aquarium Research Institute software developer Mike McCann described how his group uses many small photos taken by a robot submarine to construct a larger map of the ocean floor.

  • "W3C Advances Specs for Web Interoperability"
    InternetNews.com (04/08/04); Kerner, Sean Michael

    New technical recommendations from the World Wide Web Consortium (W3C) spell an end to the browser wars of the late 1990s and create a new basis for Web interoperability. The Document Object Model (DOM) Level 3 Core is an API standard that helps developers create applications that work on different browser and server platforms. DOM started back in 1997, when Netscape's Navigator and Microsoft's Internet Explorer were battling for browser dominance, says W3C DOM activity lead Philippe Le Hegaret. Using Netscape 3.0 and Internet Explorer 3.0 as baselines, the DOM effort sought to enable browser compatibility. DOM Level 2 Core is widely known among Web developers as the level that defined DOM support for Cascading Style Sheets. Both DOM Level 1 Core and Level 2 Core are widely used in Web applications, but Le Hegaret predicts the Level 3 Core will take some time to be adopted by Web user agents. He describes DOM Level 3 Core as the final product of the DOM Working Group, though further extensions will be made for specific users, such as the SVG Working Group. DOM Level 3 Core is very broad-based and only deals with browser technology in some portions of the specification, according to Mozilla Foundation chief architect Brendan Eich. He says the Load and Save specification, the other specification recently approved as a W3C recommendation, will be especially useful in building Web services capabilities into the Mozilla browser. The Load and Save Recommendation allows XML documents to be parsed into a DOM tree in memory, and an in-memory DOM tree to be serialized back out as an XML document.
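
    The load/save round trip can be illustrated roughly in Python, using the standard library's xml.dom.minidom (which follows the general DOM model but is not a W3C DOM Level 3 Load and Save implementation, so the method names differ from the LS specification):

```python
from xml.dom.minidom import parseString

# "Load": parse an XML document into an in-memory DOM tree.
doc = parseString("<order><item sku='42'>widget</item></order>")

# Work with the tree through the DOM API.
item = doc.getElementsByTagName("item")[0]
item.setAttribute("qty", "3")

# "Save": serialize the modified DOM tree back out as an XML string.
print(doc.toxml())
```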

  • "Intel Readies Earth-Friendly Chips"
    TechNewsWorld (04/07/04); Koprowski, Gene J.

    Intel and other chip firms are promising to eliminate lead and other hazardous materials from their semiconductor products. By the end of this year, both Intel and National Semiconductor say they will offer lead-free and nearly lead-free chips and chip packages. Reducing lead and other hazardous materials in chip packages will allow for easier extraction of precious metals during the recycling process later on, said National Semiconductor executive vice president Kamal Aggarwal. Flame retardants based on bromine and antimony are also targeted for elimination. Electronics makers began moving toward more environmentally friendly products and processes four years ago, pressured by increasingly strict government regulations and market demand for lighter products. Reducing lead in electronics will make them lighter, a boon for mobile applications such as laptops and cellphones. Meanwhile, Intel is also promoting low-power technology with four new Pentium M and Celeron M processors that run at lower clock speeds than other chips but consume just seven watts of power while operating at just over one volt. The power-efficient chips will give laptop manufacturers more leeway to extend battery life or reduce weight by using smaller battery cells. Other electronics manufacturers, including Samsung and NEC, have promised to eliminate lead from their products within a couple of years using chips or chip packages from Intel and National Semiconductor.

  • "College Campus Is IT Hot Spot"
    Federal Computer Week (04/07/04); Hasson, Judi

    An increasing number of colleges and universities are offering majors in information technology, a trend that should prove beneficial to the federal government. The list of colleges offering IT degrees includes Harvard University, which offers an online degree in IT for undergraduate and graduate students. Henry Leitner, assistant dean for IT at Harvard's Extension School, says the program, now in its sixth year, has about 200 students and graduates a dozen each year. "Colleges and universities realize there's a shift in our society, and a lot of students are starting to look for more hardcore technical degree programs at a rate far greater than any other time in the educational cycle," says Ira Hobbs, co-chairman of the CIO Council's Workforce and Human Capital for IT Committee. American University in Washington, D.C., introduced a master's degree program in IT management last September, and many of its students are government employees or federal contractors who want to improve their career opportunities. Meanwhile, Pennsylvania State University graduated the first class from its School of Information Sciences and Technology, which it started in 1999. "It's really a degree meant to use technology as a solution, not to create the technology--that's what computer scientists and engineers do," says James Thomas, dean of the school.

  • "Panelists Call for Lightweight Linux"
    IDG News Service (04/07/04); McMillan, Robert

    Linux complexity was a hot topic at this week's ClusterWorld Conference & Expo in San Jose, with Linux users and distributors divided on whether the open-source operating system should become more lithe or more complex. Linux is big in the high-performance computing community, having largely pushed aside proprietary vendors such as Cray. But despite claiming more than half of the spots on the Top500 Supercomputer Sites list, Linux clusters can still be improved upon, according to conference panelists. Argonne National Laboratory computer scientist Rusty Lusk called for a simpler Linux that would be architecturally more modular and easier to manage. Currently, there are too many different software library versions, creating "dependency" problems and unnecessary complexity. Simple distributions that are easy to build on would be a tremendous boon to customers, said Henry Hall, president of systems integrator Wild Open Source. A tool that lets users track and audit changes made to the basic kernel is also important, he said. But Scyld Linux co-creator Donald Becker predicted Linux development would lead to a more complex operating system, albeit one that is easy to customize through componentization. No mainstream Linux distribution has yet addressed the unique needs of the high-performance computing community head-on, said SUSE's Timothy Beloney. SUSE parent firm Novell recently formed a team to work on the issue, and is especially looking at how to make Linux more customizable.

  • "A Pearl for the Elderly"
    Pittsburgh Post-Gazette (04/04/04); Rotstein, Gary

    A research team from Carnegie Mellon, the University of Pittsburgh, the University of Michigan, and Stanford University continues to test the elder-care robot Pearl at the Longwood at Oakmont retirement community in Pittsburgh. Pearl is one facet of the Personal Robotic Assistants for the Elderly project, which also includes the high-tech walker IMP and a handheld device designed to remind people of things they need to do. While Pearl can guide herself through an area at a pace equivalent to a slow human walk and verbalize scripted reminders, such as telling a person it is time to take their medicine, the four-foot-high robot will not hear, talk, recall, and respond the way its inventors want for at least another decade. The researchers want Pearl to sense that something is wrong if a person has not visited the bathroom in some time, stays in the bathroom too long, or does not move from a chair; to seek an explanation and summon help if it receives no answer; and to pick up and move things for people. "From a robotic domain, topics such as living with a person, sharing the same space, interacting with a person, are the cutting edge in robotics," says Professor Sebastian Thrun, who runs the artificial intelligence lab at Stanford. The next-generation walker uses sonar detectors, a laser range finder, and mapping software to determine where a person wants to go (even if the person gripping it does not), and the researchers also envision IMP moving itself out of the way after it has been used and returning when the person summons it. The handheld piece of artificial intelligence would give Pearl and IMP the ability to know when a user has completed essential tasks or when to offer reminders.

  • "Nonlinear Nets Tackle Wireless Apps"
    EE Times (04/06/04); Brown, Chappell

    A new neural network design promises tremendous wireless bandwidth at low power, mimicking biological systems that handle complex tasks with minimal energy use. Unlike conventional linear circuit designs, the so-called echo state networks (ESNs) use nonlinear dynamics to operate close to performance limits. ESN inventor Herbert Jaeger, at the International University Bremen in Germany, has set up a Web site soliciting corporate sponsorship of his work. Jaeger says his ESN research began with enabling dynamic channel assignment in next-generation cellular network applications. Nonlinear networks are widely recognized as more efficient and powerful than linear ones, but they defy attempts at mathematical modeling except for very simple systems. Jaeger overcame the modeling barrier with a black-box approach in which the input is fed through a neural network whose internal connections are assigned random synaptic weights; feedback loops within the network produce an echo effect that makes the system nonlinear. Only the output weights are adjusted, using model input-output data sets, to train the system, much as conventional back-propagation networks are trained. In benchmark tests, the ESNs were more efficient than standard techniques by a factor of 2,400. Jaeger implemented the networks on FPGAs (field-programmable gate arrays) out of practicality: although an analog approach would have yielded better performance, he said the ESNs have not yet been reworked to minimize the noise generated by analog circuits.
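
    A minimal echo state network sketch in Python may help make the structure concrete (a generic textbook-style illustration under assumed settings such as a 200-node reservoir, spectral-radius scaling, and a ridge-regression readout; it is not Jaeger's FPGA implementation):

```python
import numpy as np

# Minimal echo state network: a fixed random recurrent "reservoir" is driven
# by the input, and only the linear readout weights are trained.
rng = np.random.default_rng(0)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))      # random input weights (never trained)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))        # random recurrent weights (never trained)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))         # scale spectral radius below 1 (echo property)

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ u + W @ x)             # feedback loop produces the echo effect
        states.append(x.copy())
    return np.array(states)

# Toy task: predict sin(t + 0.2) from sin(t).
t = np.linspace(0, 40, 2000)
u_seq = np.sin(t).reshape(-1, 1)
target = np.sin(t + 0.2)

X = run_reservoir(u_seq)[200:]                    # discard the initial transient
y = target[200:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)  # ridge-regression readout

pred = X @ W_out
print("training MSE:", np.mean((pred - y) ** 2))
```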

  • "Data Finds a Place on the Grid"
    Computerworld (04/05/04) Vol. 32, No. 14, P. 22; Thibodeau, Patrick

    Large companies and public institutions are beginning to investigate data grids, which have garnered far less media attention than computing grids but work on the same principles. Just as computing grids allow resources to be viewed singularly and applied to a specific problem, data grids allow a single view of data stored in disparate physical locations. Data grids use metadata frameworks and middleware to enable the pooling of data resources, regardless of the operating systems or formats used in the databases. Data and computing grids can operate together. So far, mostly research institutions have adopted data grids, which are best suited to organizations with large and scattered data repositories. Standards need to be established and widely adopted among vendors for data grid applications to be most effective, says Pfizer research information architecture director Paul Lewis. While current data grid products are being deployed within organizations, the most advantageous deployments will be across companies and supply chains, say experts. Globus Alliance grid standards co-director Ian Foster compares the protocols that enable remote data access to Internet Protocol. He says fixed-file-based data is easy to link up to the grid, but more work needs to be done on relational and XML databases. European research groups are linking up their data repositories in the DataGrid project and are gathering resources for an even larger endeavor called Enabling Grids for E-science in Europe (EGEE). Middleware and security are outstanding issues, says DataGrid project head Fabrizio Gagliardi. Eventually, data grids may reduce hardware costs because enterprises can mix more heterogeneous vendor offerings.
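
    The core idea of a metadata catalog that presents one logical view over physically scattered replicas can be sketched as follows (a simplified illustration; the class names, sites, and URLs are invented for the example and do not reflect the Globus or DataGrid middleware APIs):

```python
from dataclasses import dataclass, field

# Toy metadata catalog: one logical name can resolve to several physical
# replicas stored on different systems and in different formats.

@dataclass
class LogicalFile:
    name: str
    replicas: list = field(default_factory=list)   # entries of (site, url, format)

class MetadataCatalog:
    def __init__(self):
        self.files = {}

    def register(self, logical_name, site, url, fmt):
        entry = self.files.setdefault(logical_name, LogicalFile(logical_name))
        entry.replicas.append((site, url, fmt))

    def resolve(self, logical_name, preferred_site=None):
        """Return one physical replica for a logical name, preferring a site if asked."""
        replicas = self.files[logical_name].replicas
        for site, url, fmt in replicas:
            if site == preferred_site:
                return url, fmt
        return replicas[0][1], replicas[0][2]

catalog = MetadataCatalog()
catalog.register("trial42/results", "site-us", "gridftp://us.example.org/trial42.db", "relational")
catalog.register("trial42/results", "site-eu", "https://eu.example.org/trial42.xml", "xml")
print(catalog.resolve("trial42/results", preferred_site="site-eu"))
```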

  • "Ballot Box Debate"
    eWeek (04/05/04) Vol. 21, No. 14, P. 29; Carlson, Caron

    The success of online voting in the Michigan Democratic Party caucus on Feb. 7, 2004, did not impress the computer scientists involved in the federal SERVE (Secure Electronic Registration and Voting Experiment) project. A month later, the Pentagon decided to shelve SERVE, which would have enabled some military personnel and other U.S. citizens overseas to vote in the November election, over security concerns. According to former ACM president Barbara Simons, who served on the peer review panel that recommended ending SERVE in January, there was little motivation to tamper with the Michigan Democratic caucus, a low-stakes contest that Sen. John Kerry was expected to win. The peer review team is concerned that online voting could lead to automated vote buying and selling, privacy violations, and the reversal of election outcomes, all without anyone knowing. Simons and the other members of the peer review team are also concerned about e-voting systems used at the polls, and they see potential problems in proprietary software, a lack of protection against insider fraud, and a lack of voter verifiability. "You're basically handing over your rights in a democracy to a handful of companies with secret code," says Simons. "I believe our democracy is at stake." Some lawmakers have addressed the concern for voter verifiability by requiring their states to make verifiable paper ballots part of the electronic process.

  • "Researchers Develop Electronic Nose for Multimedia"
    ScienceDaily (04/02/04)

    Researchers at the University of Alberta are working to bring the sense of smell to the personal computer. Dr. Mrinal Mandal, a professor in the Department of Electrical and Computer Engineering, and Rafael Castro, a master's student, have developed a device that recognizes odors from 10 different smell groupings, such as fruits, coffees, spices, and gases. Mandal and Castro built the device, which attaches to a PC and works as an electronic nose, from inexpensive electronic parts that can be found in a hardware store. Once connected, the smell-capturing system gives the PC the ability to identify the odors it picks up. As a multimedia application, the electronic nose could one day enable a person a thousand miles from home to open an emailed photo of a favorite meal cooked by their mother and smell the food. The researchers had to create approximately 1,000 smell channels to simulate the nose's several million smell receptors and roughly 1,000 receptor types, and they had to develop a pump cleaning system to deal with smells that stick or linger. Mandal and Castro say they still have to develop a low-cost smell generation system; they expect to have an inexpensive device available in the next five to 10 years.
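
    One simple way to picture the classification step is a nearest-centroid classifier over the sensor channels (an assumed approach for illustration; the article does not describe the Alberta device's actual signal processing):

```python
import numpy as np

# Toy odor classifier: each sniff is a vector of ~1,000 sensor-channel readings,
# and a new sample is assigned to the nearest group centroid. Illustration only.
rng = np.random.default_rng(1)
n_channels, groups = 1000, ["fruit", "coffee", "spice", "gas"]

# Pretend training data: a characteristic response pattern per odor group plus noise.
patterns = {g: rng.random(n_channels) for g in groups}

def sample(group):
    return patterns[group] + rng.normal(0, 0.05, n_channels)

centroids = {g: np.mean([sample(g) for _ in range(20)], axis=0) for g in groups}

def classify(reading):
    """Return the odor group whose centroid is closest to this reading."""
    return min(centroids, key=lambda g: np.linalg.norm(reading - centroids[g]))

print(classify(sample("coffee")))   # expected: coffee
```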

  • "The Next Job for the W3C"
    SD Times (04/01/04) No. 99, P. 1; Lee, Yvonne L.

    World Wide Web founder Tim Berners-Lee says many important Web innovations are in the pipeline, including voice browsing, greater Web collaboration, and the Semantic Web. He describes current voice recognition software as "less hopeless" than before, making voice browsing possible. Users would be able to dramatically change the way they interact with their computers, ordering pizza over the Internet through voice commands, for example. Berners-Lee says the Web should be a more collaborative space where people can do all the things they do on their desktop with others via the Web, such as editing documents simultaneously. New Web frameworks for this type of pervasive Web collaboration may be needed, including the ability to authenticate collaborative partners and make simple annotations on others' Web sites. Simpler processes are needed to update Web content used in ad hoc collaboration, such as photographs. Berners-Lee says the Semantic Web will make browsing online less time-consuming and more accurate. The recent release of the RDF format for publishing and reading Web data is a huge step toward allowing people to easily cross-reference personal and corporate information, for example. Within companies, Semantic Web concepts promise to break down information silos and let people easily see the connection between, say, bug reports and customer satisfaction. Instead of putting all the data in huge databases, companies should be able to administer their own data and put it on a Web site linked to other groups. As with any powerful new technology, rolling out the Semantic Web will require a conscientious approach. Just as consumers were unwilling to take up e-commerce until companies became strict about privacy, the Semantic Web will also require privacy guarantees.
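
    The cross-referencing idea can be sketched with plain subject-predicate-object triples (a simplified illustration; a real deployment would use RDF serializations and a query language rather than hand-rolled tuples, and the identifiers below are invented):

```python
# Toy illustration of the Semantic Web linking idea: data from two "silos"
# (bug reports and customer records) expressed as triples, then joined by
# following shared identifiers.
triples = [
    ("bug:1417", "affectsProduct", "product:crm-suite"),
    ("bug:1417", "severity", "high"),
    ("ticket:88", "reportedBy", "customer:acme"),
    ("ticket:88", "relatesToBug", "bug:1417"),
    ("customer:acme", "satisfactionScore", "2.1"),
]

def objects(subject, predicate):
    return [o for s, p, o in triples if s == subject and p == predicate]

# Cross-reference: which customers' satisfaction scores connect to bug 1417?
for s, p, o in triples:
    if p == "relatesToBug" and o == "bug:1417":
        for customer in objects(s, "reportedBy"):
            print(customer, objects(customer, "satisfactionScore"))
```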

  • "The Mainframe's Mid-Life Crisis"
    Darwin (03/04); Van Hoboken, Rob

    Mainframes, the backbone of many organizations' IT infrastructure, are now accessible through network services, and companies need to reconsider how they secure them, writes consultant Rob van Hoboken. Experienced mainframe help is hard to find and often overworked, and overconfidence and complacency are common problems in mainframe security. Mainframes are especially vulnerable to aging software, malicious data access, and mistakes by less-experienced staff. To improve mainframe security, organizations can create a security dashboard that gives an overview of who is accessing data, what is accessed most, dormant accounts, weak passwords, and other information. Delegating some mundane security functions to less-specialized staff, or handling them with mainframe or enterprise security software, can free experts for other tasks. Mainframe operating system and application log-on processes should be configured to record who accessed what data and to help detect access violations; automated tools are available to assist with this task. Intrusion-detection systems and real-time alerting for misconfigurations or access violations are also beneficial, as are tools that protect against mistakes made through inexperience. Van Hoboken says that although mainframe security issues are often overlooked in the media, it is vital to protect the mission-critical services mainframes support.
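
    One small piece of such a dashboard, flagging dormant accounts from log-on records, might look like the following sketch (an illustration only; the account names, dates, and 90-day threshold are assumptions, and real mainframe reporting would draw on audit records rather than a hand-built dictionary):

```python
from datetime import date, timedelta

# Toy dashboard check: flag accounts with no recent log-on activity.
last_logon = {
    "OPER01": date(2004, 4, 2),
    "BATCH07": date(2003, 9, 15),
    "AUDIT02": date(2004, 1, 3),
}

def dormant_accounts(records, today, max_idle_days=90):
    """Return accounts with no log-on within max_idle_days."""
    cutoff = today - timedelta(days=max_idle_days)
    return sorted(user for user, seen in records.items() if seen < cutoff)

print(dormant_accounts(last_logon, today=date(2004, 4, 9)))   # ['AUDIT02', 'BATCH07']
```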

  • "Who's Responsible for Cybersecurity?"
    Network World (04/05/04) Vol. 21, No. 14, P. 8; Marsan, Carolyn Duffy

    Software vendors are taking responsibility for their insecure products, pressured by customers who are tired of being blamed for the poor state of cybersecurity. In the past, software vendors discouraged regulation of their industry while issuing numerous security recommendations for users, but with cyberattacks at all-time highs, the debate has shifted direction. The Business Roundtable, an association of top U.S. CEOs, will release guidelines this month outlining how vendors can improve their products. A March survey by that group of 100 CEOs found that all of them had increased cybersecurity outlays since Sept. 11, 2001, by an average of 10%, but the group notes that costs associated with cyberattacks have grown exponentially. The National Cyber Security Partnership (NCSP), which includes top IT lobbying representatives, has asked the Department of Homeland Security to examine possible action on improving security in the IT industry. The group specifically mentioned "liability and liability relief, regulation and regulatory reform, tax incentives, enhanced prosecution, research and development, education, and other incentives." Other NCSP recommendations include software security accreditation, best practices for designing secure software, improved university computer security programs, and guidelines for patch management. NCSP enterprise task force chair Marc Jones says that while improvements in software development are necessary, their effect will not be immediate. SANS Institute research director Alan Paller says the software industry should build security into products rather than expect customers to be experts: products should ship with security features turned on and vulnerable services turned off, and patches should be distributed via automatic updates. Stratex Networks CIO B. Lee Jones says government regulation should focus on liability and incentives, not on mandating technical specifications.

  • "Managers Rush to Certify IT Security Workers"
    Federal Times (03/29/04) Vol. 40, No. 8, P. 16; Breen, Tom

    Government managers are finding that they must hire, retain, and train the employees who protect the government's information systems. "It is urgent that we hire people with the skill sets necessary to do this incredibly important job, and to keep them once we train and certify them," says Veterans Affairs Department information security official Bruce Brody. Managers are trying to decide whether to train internally or hire outsiders, and some IT security professionals say that security concepts are not really included in general computer training and certification programs. This spring, the Department of Homeland Security is making a big push in the area of security certification. Overall, the federal government employs more than 10,000 people classified as computer security professionals, far more than two years ago, while several thousand more work for private contractors on government projects. The federal government is beginning to raise pay for IT security managers to avoid losing them to private contractors. So far, the Veterans Affairs Department is the only federal agency offering cybersecurity certification, but a number of private organizations do as well, including the SANS Institute in Bethesda, Md., and the International Information Systems Security Certification Consortium in Vienna, Va. The Veterans Affairs Information Security Professionalization Program is modeled after some private-sector programs, and Brody says it was created to ensure that all the agency's IT personnel share the same knowledge base. He says a security certification program for the entire federal government would benefit from economies of scale and encourage more involvement from agencies.

  • "Search Engines--The Future"
    Computerworld (04/05/04) Vol. 32, No. 14, P. 26; Anthes, Gary H.

    Web search engines will be much more sophisticated, powerful, and accurate in the future. As useful and even life-changing as today's search engines are, experts say there are obvious areas for improvement: searches should be personalized, for example, not targeted at a general Web audience or skewed in favor of advertisers. Colorado State University Professor Adele Howe and graduate student Gabriel Somlo have created the QueryTracker software agent, which filters results from normal Web search engines and improves on user queries. Based on what it learns about user preferences, QueryTracker is able to compensate for poor Web searching skills, which Howe says is probably the most inhibiting factor for most people right now; too many people use just one word to search the Web and turn up millions of hits. Dalhousie University Professor Jeannette Jenssen does not rely on traditional search engines, but has instead developed entirely new Web crawlers that index Web content according to preferences and seek out indirect links. Her crawler knows when it is on the trail of good content based on past user behavior and indexed Web paths. In the future, more powerful computers will enable vastly more powerful Web searches that consider many aspects of a document together, not just isolated keywords or the number of sites linking to the page, according to Indiana University Professor Filippo Menczer. That type of Web search is already being done in IBM's laboratories, where the WebFountain Linux cluster continuously trawls the Web, extracting semantic meaning and attaching XML metadata. "We are tagging the entire Web, all of Usenet news, all the wire services and so on," says WebFountain chief architect Dan Gruhl. IBM researchers are also working on complementary sentiment analysis systems that would let companies search the Web for popular sentiment on a specific product, for example.
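
    The preference-based re-ranking idea can be pictured with a small sketch (an assumed, simplified approach for illustration; it is not the actual QueryTracker algorithm, and the profile weights below are invented):

```python
# Toy personalized re-ranking: results whose text overlaps a learned profile of
# user-preferred terms are promoted ahead of generic matches.
profile = {"python": 3.0, "tutorial": 1.5, "numpy": 2.0}   # assumed learned term weights

results = [
    ("Monty Python sketches archive", "monty python comedy sketches"),
    ("NumPy tutorial for beginners", "numpy python tutorial arrays"),
    ("Python snake care guide", "python snake reptile care"),
]

def score(text):
    """Sum the profile weights of preferred terms appearing in the result text."""
    return sum(weight for term, weight in profile.items() if term in text.split())

reranked = sorted(results, key=lambda r: score(r[1]), reverse=True)
for title, _ in reranked:
    print(title)   # the NumPy tutorial rises to the top for this profile
```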

  • "Visualization Aids Steering of Complex Simulations on the Grid"
    Scientific Computing & Instrumentation (03/04) Vol. 21, No. 4, P. 17; Walton, Jeremy; Wood, Jason; Brodlie, Ken

    Data transfer optimization and biological process modeling are just two of the Grid-enabled applications that can benefit from visualization tools that allow simulation progress to be monitored and steered, delivering meaningful results faster so that better-informed decisions can be made. The content modules and connections that make up visualization applications can be selected and replaced using a visualization toolkit such as IRIS Explorer, while extensible applications allow users to insert their own modules; such extensions are exemplified by collaborative modules that establish communication between geographically separated applications. Extending an existing visualization system spares users the burden of adapting to a new package and lets existing modules and applications be reused in the steering examples. The Grid environment integrates well with the steering model, permitting simulations to run remotely while being viewed and steered locally through visualization. In one demonstration, the Grid simulates the release of a hazardous chemical gas over a region faster than real time: a domain expert can run the simulation and analyze the results through visualization while receiving up-to-date weather data from the meteorological office through collaborative steering. The visualization tools must be flexible enough to deliver different representations of the same data without ceding collaborative control. Complementing IRIS Explorer is a steering library split into two APIs--one instruments the simulation code, while the other meshes the matching capability with the visualization system. Used together, the steering library and IRIS Explorer enable simulation and visualization modules to be placed closer to one another on the grid, optimizing network data transfer and aiding comprehension of the model and the physical forces it simulates.
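
    The steering loop itself can be pictured with a toy sketch in Python (an illustration only; the real IRIS Explorer steering library exposes this through compiled-language APIs rather than Python queues, and the emission-rate model below is invented):

```python
import threading, time, queue

# Toy computational steering loop: a simulation publishes its state each step
# and applies parameter changes sent from a "steering" client while it runs.
state_out, control_in = queue.Queue(), queue.Queue()

def simulation(steps=50):
    emission_rate = 1.0                   # steerable parameter
    concentration = 0.0
    for step in range(steps):
        try:
            emission_rate = control_in.get_nowait()   # apply any pending steering command
        except queue.Empty:
            pass
        concentration += 0.1 * emission_rate          # stand-in for the real dispersion model
        state_out.put((step, emission_rate, concentration))
        time.sleep(0.01)

threading.Thread(target=simulation, daemon=True).start()

for _ in range(50):                       # "visualization" side: monitor and steer
    step, rate, conc = state_out.get()
    if step == 20:
        control_in.put(0.2)               # steer: cut the emission rate mid-run
print("final concentration:", round(conc, 2))
```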


