
ACM TechNews sponsored by Parasoft
Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 728:  Wednesday, December 8, 2004

  • "Tighter Cyber Protection Is Urged by Computer-Security Industry"
    Wall Street Journal (12/07/04) P. A3; Bank, David

    Computer-security industry executives today intend to release a series of recommendations to the Bush administration for addressing cybersecurity issues, to which many feel the White House has given short shrift. "We believed [cybersecurity] would be an agenda item visible at the highest levels of government," notes Symantec CEO John Thompson. "That has certainly not been the case." Actions that the Cyber Security Industry Alliance wants the White House to take include pressuring the Senate to ratify Europe's cybercrime treaty, which defines computer crimes and common cybercrime prosecution strategies; requiring government contractors and suppliers to secure their information systems; promoting private-sector information security and encouraging corporate directors to realize their obligations under the Sarbanes-Oxley Act of 2002; creating a backup communications network in case of a major Internet blackout; sharing data about attacks on government computer networks with the private sector; increasing the budget of the federal Information Sharing and Analysis Centers; and, most important of all, establishing the position of assistant secretary in the Homeland Security Department in charge of managing the national computer and communications infrastructure. This last recommendation is not supported by Homeland Security, nor is it endorsed by former Homeland Security cybersecurity division director Amit Yoran, who nevertheless acknowledged in an interview that department efforts to fortify Internet security and prevent cyber-assaults on critical infrastructure networks had stagnated. Software companies and vendors are facing pressure from customers who desire less buggy and insecure software, as well as more vendor accountability for flawed products. Homeland Security's Michelle Petrovich claims the department is aggressively pursuing many of the recommendations.

  • "UCSD Engineers Join International Consortium to Advance 'Invisible' Computing"
    UCSD News (12/06/04); Ramsey, Doug

    Computer researchers from Europe, the United States, and Australia are developing standard methods, tools, and middleware that will enable networking among pervasive embedded devices. "Invisible, or pervasive, computing is already all around us," says Reconfigurable Ubiquitous Networked Embedded Systems (RUNES) technical director and University College London computer scientist Steven Hailes. The European Union-funded RUNES project is meant to work out standard ways for those different devices to function together in an "always-on" environment, despite running on different networks. Besides the ubiquity of embedded processors for industrial, personal, automotive, and household uses, the rapid expansion of the Internet and the emergence of cheap, low-powered radio technologies have paved the way for radical new applications, says California Institute for Telecommunications and Information Technology [Cal-(IT)2] San Diego division director Ramesh Rao. The RUNES project will study the obstacles to connecting these devices and programming applications for distributed resources. RUNES has wide implications for business and everyday living, including applications such as wearable computing, smart homes, and health monitoring, says European Union project officer Franck Boissiere. Middleware will play an important role in RUNES research because of the tremendous complexity involved in managing cooperation between distributed systems, says Cal-(IT)2 researcher Ingolf Krueger, whose group will be in charge of creating design and programming models for such networked embedded systems. Besides UCSD and the University of California, Berkeley, the RUNES consortium includes 22 European partners and corporate partners such as Kodak and Ericsson.
    Click Here to View Full Article

  • "Florida E-Vote Study Debunked"
    Wired News (12/07/04); Zetter, Kim

    Numerous academics are disputing a published UC Berkeley study's conclusion that President Bush received an abnormal number of votes in Florida counties that used touch-screen voting machines in the recent election, contending that the researchers' equation was flawed. Voting activists cite the study as evidence of election fraud or defective voting equipment, but Drexel University decision sciences professor Bruce McCullogh says the Berkeley researchers analyzed the vote outcomes using one statistical model while discounting other models that would have produced dramatically different results. "They either overlooked or did not bother to find a much better-fitting [statistical] regression model that showed that e-voting didn't account [for the voting anomalies]," he explains. Another point the dissenting academics raise is the lack of a thorough peer review of the analysis, although Berkeley sociology professor Michael Hout and his student team insisted that their findings were reviewed by seven professors. "If I were to get this article as [an academic] reviewer, I would turn it around and say they were fishing to find a result," remarks MIT political science professor Charles Stewart, noting that only two out of 15 Florida counties employing e-voting systems had anomalous results. Hout vouches for the report while admitting that his team was not able to study other data that might have led to a different conclusion, such as analysis of votes cast by absentee paper ballots in the counties with touch-screen machines. The disputation of the study will probably not stop planned investigations into election issues by the Social Science Research Council, the Government Accountability Office, and others.
    Click Here to View Full Article

  • "When Technology Gets Personal"
    BBC News (12/06/04); Twist, Jo

    BT futurologist Ian Pearson foresees a "pervasive ambient world" where people are surrounded by or perhaps even physically integrated with intelligent objects. Smart fabrics and textiles developed through breakthroughs in micro- and nano-engineering are already on the market: Anti-odor socks and stain-resistant seats no longer exist solely in the realm of science fiction thanks to the development of nanoscale titanium oxide coatings; MP3 jackets in which conductive fabric is linked to keyboards sewn into sleeves are now commercially available; and tiny structures modeled after shark skin have been incorporated into professional swimming suits to reduce drag. Pearson says this is only the tip of the iceberg, and speculates about potential future advances such as wearable technology that runs on body heat and intelligent electronic contact lenses that can operate as TV screens. However, neuroscientist Baroness Susan Greenfield warned at a recent Royal Society of London conference that such technology raises significant ethical questions. Wearable and implantable communication and monitoring devices must address the issue of privacy, which Pearson considers to be of paramount importance. "We are looking at electronics which are really in deep contact with your body and a lot of that information you really don't want every passer-by to know," he comments, noting that security must be built in. Baroness Greenfield points out that the fusion of technology and biology could also dramatically alter the way the human brain functions. For instance, successful attempts to grow human nerve cells on circuit boards have demonstrated the potential of an implanted neural human-computer interface for paralytics.
    Click Here to View Full Article

  • "Caltech Computer Scientists Embed Computation in a DNA Crystal to Create Microscopic Patterns"
    Caltech (12/06/04)

    By creating a DNA crystal that computes as it grows, building a microscopic pattern of fractal Sierpinski triangles as the computation unfolds, California Institute of Technology professor Erik Winfree has for the first time experimentally demonstrated his theory that any algorithm can be embedded in the growth of a crystal. Winfree, an assistant professor of computation and neural systems and computer science, and colleagues write in the December issue of Public Library of Science Biology that DNA "tiles" consisting of just 150 base pairs each can be programmed to spontaneously configure themselves into Sierpinski triangle-bearing crystals. The crystal computes using a well-established algorithm that starts with a sequence of 0s and 1s and redraws the sequence repeatedly, filling up consecutive rows on a piece of paper, performing binary addition on adjacent digits each time until a Sierpinski triangle composed of 1s and 0s emerges. The Caltech researchers represented written rows of binary 1s and 0s as rows of DNA tiles in the crystal, and addition was mimicked by designing each tile's loose or "sticky" ends to guarantee that whenever a free tile stuck to tiles already in the crystal, it represented the sum of the tiles it was glued to. Paul W.K. Rothemund, a Caltech senior research fellow in computer science and computation and neural systems, notes that scientists in the field of algorithmic self-assembly have "proposed a series of ever more complicated computations and patterns for these crystals, but until now it was unclear that even the most basic of computations and patterns could be achieved experimentally." The crystals' application to nanotechnology may hinge on whether the patterns can be converted into electronic devices and circuits.
    Click Here to View Full Article
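    The computation itself is a standard paper-and-pencil procedure: starting from a row of 0s and 1s, each new row is the sum modulo 2 of adjacent digits in the previous row, and the 1s trace out a Sierpinski triangle. Below is a minimal sketch of that procedure in Python; the crystal, of course, carries it out chemically with DNA tiles rather than in software.

        def sierpinski_rows(width=32, rows=16):
            # Seed row: a single 1 at the left edge, the rest 0s.
            row = [0] * width
            row[0] = 1
            history = [row]
            for _ in range(rows - 1):
                # Binary addition (sum mod 2, i.e., XOR) of each digit with its
                # left neighbor produces the next row of the pattern.
                row = [row[i] ^ (row[i - 1] if i > 0 else 0) for i in range(width)]
                history.append(row)
            return history

        for r in sierpinski_rows():
            print("".join("1" if bit else "." for bit in r))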

  • "Faster Python Grabs Programmers"
    Government Computer News (12/03/04); Jackson, Joab

    The newly released Python version 2.4 will increase the responsiveness of the open-source programming language and enable Python-coded applications to run faster. Version 2.4 includes a new module that uses small Python programs in lieu of shell scripts for more security and visibility when handling errors. Python enthusiasts say the programming language may produce slower running applications than commercial products, but that the flexibility and easily readable syntax make it a better choice in many cases. Federal contractor Development InfoStructure was able to complete a Web content management system for the DisabilityInfo.gov site in just two weeks using Python to add version control to existing open-source content management software, says chief technical officer Martin Hudson. Python is easier to use than object-oriented languages such as C++ and Java because its syntax is easy to read and there is more flexibility in declaring data types. Python applications run directly from the source code and are slower than compiled programs, but Python version 2.4 aims to compensate for that deficiency by improving the language's responsiveness so that it can be used to tackle even larger projects. Los Alamos National Laboratory programmer J.D. Doak said at the PyCon 2004 meeting earlier this year that his environmental control simulation for nuclear fuel processing facilities was coded using Python and SimPy, a Python extension; although Doak was concerned about performance, the combination provided a sufficiently fast product. The invaluable benefit to using Python, however, was the ability to make quick changes according to facility designers' needs.
    Click Here to View Full Article
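    The "new module" for replacing shell scripts is presumably subprocess, which was added in version 2.4 to give Python programs explicit control over external commands and their error status. A minimal sketch of the idea, shown here in its modern Python 3 form:

        import subprocess

        # Run an external command and inspect its exit status and error output
        # directly, instead of relying on a shell script's side effects.
        result = subprocess.run(["df", "-h"], capture_output=True, text=True)
        if result.returncode != 0:
            print("command failed:", result.stderr.strip())
        else:
            print(result.stdout)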

  • "Getting Out (of) the E-Vote"
    Electronic Design (12/02/04); Schneiderman, Ron

    Despite assurances from election officials and voting machine vendors that the e-voting systems used in the recent national election operated more or less smoothly, voters, activist organizations, and others reported more than 1,000 incidents of system crashes, malfunctions, bugs, disenfranchisement, overvoting, inadequate security, and even election fraud. Almost 450 ballots cast at two polling places in Raleigh, N.C., were not counted on e-voting machines; an e-voting system in Ohio added nearly 4,000 votes to President Bush's count in a precinct where only 800 voters resided; and the Caltech/MIT Voting Technology Project discovered that some voting machines were stored in locked facilities with no guards. Most problems involved people who cast votes for John Kerry, but who claim the machines showed a vote for Bush when the voters were asked to confirm their selection. "If we continue to use the kind of insecure [machines] that were used in this election, it is only a matter of time before somebody exploits them," warns Johns Hopkins University professor Avi Rubin; worse still, such exploitation may be undetectable. Everyone involved with the problem says security issues will be addressed, while a growing movement to add voter-verifiable paper trails to e-voting machines is prompting some vendors to design and offer products with such capabilities. Participants in the Caltech/MIT project are calling for stronger e-voting security and more random testing of machines on election day, while proposals for audio recording systems that boost e-voting machines' intuitiveness and document voter selections are being considered. The IEEE and Vote-Here founder Jim Adler have teamed up to develop a book that details technology's potential to augment e-voting security, reliability, and accuracy.
    Click Here to View Full Article
    For information on ACM's e-voting activities, visit http://www.acm.org/usacm

  • "The Challenges of Integrating Cellular and Wi-Fi Networks"
    TechNewsWorld (12/07/04); Korzeniowski, Paul

    In-Stat/MDR analyst Neil Strother believes the integration of cellular and Wi-Fi networks is inevitable, but its emergence and architecture are still vague. Datacomm Research President Ira Brodsky notes that bundling Wi-Fi with cellular services would reduce coverage problems related to large buildings or regions outside of local cellular towers' range, while ABI Research analyst Phil Solis says the integration would allow carriers to reduce network usage and slow down the rate of network upgrades; in addition, meshing these two services could contribute to carriers' bottom lines by helping operators enlist new customers or retain existing clients and thus reduce their churn rate. Consumer benefits of cellular/Wi-Fi integration include the elimination of dropped calls and lower operating costs for businesses. Wi-Fi networks' service is currently restricted to areas with high volumes of business and consumer users, but cellular carriers and hot-spot experts are collaborating to broaden their network coverage. Furthermore, the separation of Wi-Fi and cellular networks means users need two distinct wireless handsets to access them, so handset vendors are devising compact, dual-purpose devices that can pick up different kinds of radio signals without interference; Brodsky says Wi-Fi-related battery drain is the major technical challenge. Carriers will also have to meet quality of service challenges as they integrate Wi-Fi and cellular networks, and they have designed their networks around the engineering criterion that wireless connection latency must not exceed 50 milliseconds. "As users roam from cellular networks to Wi-Fi links, one carrier gains minutes and another one loses minutes," observes Strother. "How will customers be charged and how will carriers compensate one another are important decisions that have to be made."
    Click Here to View Full Article

  • "Study Finds Patterns in Web Site User Motivations and Questions"
    Computerworld (12/03/04); Poteet, David

    A User Interface Engineering (UIE) study sees distinct patterns concerning the questions that motivate users to visit Web sites, based on analysis of more than 3,000 posts on discussion forums about chronic neurological diseases. The study's aim was to better understand Web site content in order to address usability problems, based on the theory that users approaching a Web site typically suffer from a gap between their current knowledge and the knowledge they need to reach their goals. UIE reasoned that discussion forums are the most likely resource users would turn to when the Web site lacks the information they need, and the study outlines 14 specific types of posts in each neurological illness forum examined, such as dealing with friends and family, dealing with the medical profession, and sharing experience with medications; the same 14 post types were uncovered in other kinds of forums. The posts, or topic perspectives, were abstracted into more generic terminology. These perspectives fit into three steps that take place as a person attempts to understand his needs, find an effective solution, and live with that solution: Identifying needs is the first stage, which requires Web site content that focuses on helping people recognize their goals without offering a specific solution too soon; isolating alternatives is the second stage, where content must demonstrate the advantages of a solution or contrast it with other solutions; the third stage involves refining the solution, and content is geared toward helping people incorporate the solution into their lives. Most of the sites studied by UIE concentrate on the second stage, and UIE President Jared Spool says the user question patterns recorded in the study indicate that many current sites only address a small percentage of user needs. Applying the study's conclusions to the design and organization of Web site content and user interfaces can improve site usability.
    Click Here to View Full Article

  • "Locking Up All of That "'Free Information'"
    IT Management (12/07/04); Robb, Drew

    Free access to information is a credo of the open source movement, but the flipside of that philosophy is the risk of the unintentional or intentional publication of sensitive data--a risk that has increased significantly with the expansion and growing permeability of enterprise security perimeters. Indeed, the threat is so great that the federal government has gotten involved, mandating severe penalties for information leaks in such legislation as Gramm-Leach-Bliley and HIPAA. The apparent theft of a pair of zip drives containing sensitive information from the Los Alamos National Laboratory prompted U.S. Energy Secretary Spencer Abraham to order 17 federal installations to halt carrying out classified operations on systems with removable media storage. Open source software is touted as a more secure option than proprietary software since the free availability of source code to programmers increases the odds that flaws will be found and patched faster. Gartner security consultant John Pescatore remarks that bugs found in Windows 2000, XP, and Server 2003 continue to be easily hackable, noting that "Enterprises that are dependent on Windows systems must invest both in means to patch faster and in host-based intrusion prevention software for all Windows PCs and servers." However, open source security tools can be used without necessitating a switch to an open source operating system such as Linux; these tools offer the advantages of high support levels and no-cost downloading. The open source Snort intrusion detection system, for example, can identify denial of service attacks, buffer overflows, and other problems by performing real-time traffic analysis, packet logging, protocol analysis, and content searching and matching. Snort is one of approximately 2,000 open source security measures available on the SourceForge Web site.
    Click Here to View Full Article
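    As a toy illustration of the content searching and matching such a system performs--this is not Snort's code or its rule syntax, and the byte-pattern signatures are invented--consider the following sketch:

        # Invented byte-pattern signatures mapped to alert descriptions.
        SIGNATURES = {
            b"/etc/passwd": "possible path-traversal probe",
            b"\x90" * 16: "long NOP sled, possible buffer-overflow attempt",
        }

        def inspect(payload):
            """Return alerts whose byte patterns appear in a packet payload."""
            return [alert for pattern, alert in SIGNATURES.items() if pattern in payload]

        print(inspect(b"GET /../../etc/passwd HTTP/1.0"))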

  • "Paraplegic Fitted With Brain Sensor Ushers in Cybernetic Age"
    San Francisco Chronicle (12/05/04) P. B1; Duncan, David Ewing

    Matthew Nagle is a 25-year-old paraplegic with a sensor implanted directly into his brain that allows him to control a computer as well as an artificial hand by thought. Cyberkinetics Neurotechnology Systems, the human-brain interface's developer, is using Nagle as a test case. Cyberkinetics, co-founded by Brown University neuroscience department director John Donoghue, is the first organization approved by the FDA to run tests on implanted electrode arrays using as many as five disabled human subjects. Nagle's Braingate Neural Interface System is equipped with 100 electrodes implanted above an area of the brain that controls motor activity, and Nagle is plugged into a computer via a fiber-optic cable attached to a cranial node. The computer translates Nagle's neural impulses into commands for moving a cursor or opening and closing the artificial hand. The interface comes from Donoghue's study of how the brain converts thought into physical action by analyzing the mechanics of neuron excitation, and early experiments involved collaboration with University of Utah researchers and the implantation of electrodes into the brain of a primate trained to play computer games using a joystick. Nagle's implant only allows a one-way transmission of Nagle's thoughts to the computer, although there are neural implants that can be triggered or controlled by outside commands--which raises the specter, however unlikely at this point, of mind control. Donoghue's breakthrough also revives visions of being able to download thoughts and consciousness into a computer and transmit them via email, or thought-controlled aircraft for military use; challenges include making the technology less invasive as well as addressing ethical issues.
    Click Here to View Full Article

  • "Open-Source Practices Moving Into Enterprise Development"
    Application Development Trends (12/07/04); Waters, John K.

    Even as open-source applications grow deeper roots in enterprise infrastructure, many businesses are also leveraging open-source-style development techniques to become more productive and meet the needs of their customers the first time. Companies are increasingly incorporating the concepts of iterative development, collaboration, and component reuse in their software development efforts, says VA Software product development senior vice president Colin Bodell. VA Software founded the SourceForge.net portal in 1999 and offers a commercial development tool called SourceForge Enterprise Edition that encapsulates an open-source approach to software development for use in companies. With rapid, agile processes focusing on users' immediate needs, open-source-style development has proven to be a highly productive form of software development that rewards ideas and technologies that cater to what people want. Today, many companies need increased flexibility with their software development efforts since employees are seldom able to work on a project for a sustained period of time. With iterative development, new team members are able to quickly contribute to the project even on a short-term basis, and there is always a buildable version of the product ready for testing, says Bodell. With close communication between developers, project management, and customers, open-source-style development projects ensure requirements are fulfilled the first time, and not in the second, third, or fourth iterations. There is a common misperception that open-source software is all about Linux, but there are a number of other applications coming off the SourceForge.net site, says Bodell.
    Click Here to View Full Article

  • "Linux Camp Takes New Tack on Kernel"
    eWeek (12/06/04) Vol. 21, No. 49, P. 9; Galli, Peter

    The stability and maturation of the latest Linux kernel are allowing developers to transition from the conventional development model to one that will permit more frequent releases. Dan Frye with IBM's Linux Technology Center reports that providing a smoother, speedier, and more regular development cycle in which new features have to be tested only once is the motivation behind the decision to put the features and technologies into 2.6.x, the production 2.6 kernel. The current production kernel was issued in January, with the latest version, 2.6.9, released nine months later. "There still hasn't been a single patch that has made me or Andrew [Morton, who maintains the 2.6 kernel] say, 'Hmm, that looks too fundamental; it really needs 2.7,'" remarks Linux kernel creator Linus Torvalds. "So right now, I'm trying to concentrate on being good about merging 'regular' things into 2.6.x." Frye thinks that a 2.7 kernel may be completely unnecessary, or at least will not be required for a long time. The past six months saw the emergence of some basic problems that raised the need for a 2.7 kernel, but Frye says the development team managed to split the problems into less troublesome segments for 2.6. An anonymous IT manager reports satisfaction with the new development process as long as it supports a stable, reliable, protected, and thoroughly tested operating system.
    Click Here to View Full Article

  • "Cornucopia: A Radically Different Approach to TLDs"
    CircleID (12/03/04); Watson, Brett

    In this proposed new approach to assigning top-level domains (TLDs), Brett Watson, a PhD student in computer science, suggests creating a cornucopia of meaningless TLDs that fit some arbitrary formula--for example, a group of domains made up of one letter followed by two digits--to make it easier for users to register the name of their choice. With a cornucopia of meaningless TLDs, the second level part of the name becomes the most important. A person wishing to register the name "X" would be assigned an available domain from the cornucopia chosen at random, such as "X.p45." If for some reason the user did not approve of the TLD, the user would be asked to choose a new name; this process would eliminate the prospect of people fishing for a potentially meaningful TLD, such as "b52." Watson says this system has many potential benefits: it allows people to register generic or popular domain names that are already claimed under the short list of existing TLDs; it promises to offer a more balanced distribution of names; it should eliminate the scarcity of domain names that has driven prices up; it should eliminate the possibility of cybersquatting and the need for dispute resolution; and the "meaninglessness" of the cornucopia would aid in the internationalization of the domain name system. On the negative side, however, new technical considerations would need to be taken into account, the economic ramifications are not entirely clear, and, as always, there are some legal issues that would come into play, particularly in terms of trademark laws. Still, Watson suggests that the cornucopia at least be considered as a solution to root zone growth.
    Click Here to View Full Article
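    A hypothetical sketch of the assignment scheme, using the article's one-letter-plus-two-digits formula; the function names and collision handling below are illustrative only, not part of Watson's proposal:

        import random
        import string

        def random_tld():
            # A meaningless TLD matching the arbitrary formula: one letter
            # followed by two digits, e.g. "p45".
            return random.choice(string.ascii_lowercase) + f"{random.randint(0, 99):02d}"

        def register(name, taken):
            # Pair the requested second-level name with a randomly drawn TLD,
            # redrawing only if that exact combination is already registered.
            while True:
                candidate = f"{name}.{random_tld()}"
                if candidate not in taken:
                    return candidate

        print(register("X", taken=set()))   # e.g. "X.p45"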

  • "In Praise of P2P"
    Economist (12/02/04) Vol. 373, No. 8404, P. 35

    Peer-to-peer (P2P) networking technology is the Internet evolved, say proponents who also worry about legal concerns. The music recording industry has already successfully shut down Napster and is currently lobbying to outlaw P2P technology; movie and music firms have also asked the Supreme Court to hold P2P software firms liable for copyright infringement committed using their products; and the Justice Department has suggested P2P networks could be used by terrorists. Yet P2P innovation continues as small firms and open-source groups develop more uses, including super-efficient voice calls, workplace collaboration, and distributed data storage. P2P networks are actually closer to what the Internet was intended to be--a network of equal peers that was not reliant on centralized services. The Internet today is more like "television with packets" with a hub-and-spoke distribution system, says technology consultant Clay Shirky. P2P traffic consumes a huge amount of Internet resources, some of it legal but most of it for illegally shared copyrighted works. P2P network-services firm CacheLogic estimates that more than half of all Internet traffic is actually P2P activity, and that the BitTorrent application alone makes up 35 percent of all Internet traffic; BitTorrent is an open-source program that up-ends many traditional assumptions about the Internet, including the idea that popularity means bandwidth penalties. With BitTorrent, users pass around small pieces of large files in a system called swarming, and users that share more content are rewarded with faster connections. Meanwhile, many legal P2P distribution services are emerging, including LionShare, a Penn State University program for globally distributing academic information; Red Swoosh, a network set up to legally distribute copyrighted content; and the Internet Archive, a nonprofit repository of digital content that uses five P2P systems to manage bandwidth efficiently and inexpensively.
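    A toy sketch of the swarming idea, with invented peers and piece counts; real BitTorrent adds rarest-first piece selection and the tit-for-tat upload incentives that reward generous sharers, both omitted here:

        PIECES = set(range(8))                     # piece indices of one file

        peers = {
            "seed": set(PIECES),                   # holds the complete file
            "alice": {0, 1, 2},
            "bob": {5, 6},
        }

        def swap_round(peers):
            # Each peer fetches one missing piece from any peer that has it.
            for name, have in peers.items():
                for piece in PIECES - have:
                    if any(piece in other for o, other in peers.items() if o != name):
                        have.add(piece)
                        break

        while any(have != PIECES for have in peers.values()):
            swap_round(peers)

        print({name: sorted(have) for name, have in peers.items()})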

  • "Does E-Science Work?"
    Chronicle of Higher Education (12/10/04) Vol. 51, No. 16, P. A25; Young, Jeffrey R.

    Although e-science collaborations between researchers at multiple institutions were expected to boost innovation, a study of 62 projects funded under the National Science Foundation's Knowledge and Distributed Intelligence program concludes that such initiatives have been less productive than traditional one-campus efforts. The report finds that logistical issues, such as setting aside time to share ideas about the efforts' core goals while ensuring that routine tasks are carried out on schedule, represent the biggest obstacles to multi-campus projects. Study co-author and MIT professor Jonathon Cummings notes that scientists usually find themselves preoccupied with mundane tasks such as "scheduling, timing, and overall logistics" instead of "the discovery process," while cultural issues such as differing intellectual property policies or procedures for human experimentation are also major impediments. Cummings and Carnegie Mellon University computer science professor Sara Kiesler report that the most successful distributed projects involved regular face-to-face interaction between participants. "In multi-institutional collaborations, leaders and members had to figure out how to keep communication going, perhaps through the excuse of special workshops or conferences, to create successful projects," the authors write in their report, which will be published next year in Social Studies of Science. NSF program director Susan Iacono says Cummings and Kiesler's study prompted the agency to amend the grant-approval procedure for its Information Technology Research program; researchers are now asked to outline their strategies for coordinating multi-campus collaboration in detail. Certain researchers believe some problems are so intricate and far-flung that e-science collaboration is their only option, while tools that promise to enhance such projects are being developed.

  • "Building an Intelligent IT Infrastructure"
    Intelligent Enterprise (12/04/04) Vol. 7, No. 18, P. 18; Ferguson, Roy; Charrington, Samuel

    Organizations seeking to cope with the rising complexity of today's IT infrastructure are seeking out a new class of solutions often called "intelligent infrastructure," but in order to make such an infrastructure work for them, organizations should look carefully at what they hope to achieve and what technology options are available, recommend Newmetrics CEO Roy Ferguson and Tsunami Research VP Samuel Charrington. Organizations should also put together an effective plan for successfully introducing the technologies and realizing their benefits. An intelligent infrastructure is dependable, manageable, affordable, and adaptable, so that capacity for new applications can be added. Some technologies that can be seen as first-generation intelligent-infrastructure solutions are automation and configuration management, autonomic computing, grid computing, and virtualization and rapid processing. The authors write that as intelligent infrastructure moves into its second generation, it will take the form of a fully distributed, application-aware architecture, with management intelligence uniting infrastructure into a single logical system. Also in the second generation, a simple and flat architecture will fully virtualize underlying resources, and it will produce significant simultaneous cost reductions covering application development, infrastructure, and management. The second-generation architecture will facilitate "hive" computing that distributes work across various machines, enables machines to cope with failed hardware and scalability issues, and lets new computers enter and exit the hive without making manual configuration changes or shutting down the application. When putting together a plan for adopting an intelligent infrastructure, Ferguson and Charrington say the Pareto selection method comes in handy to help determine which 20 percent of applications tend to cause 80 percent of the organization's current problems.
    Click Here to View Full Article
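    A small sketch of that Pareto-style selection, assuming nothing more than a count of reported problems per application (the counts below are invented):

        incidents = {"billing": 120, "crm": 45, "intranet": 20, "hr": 10, "wiki": 5}

        def pareto_set(counts, threshold=0.8):
            # Rank applications by problem count and keep the smallest set
            # that covers roughly the threshold share of all problems.
            total = sum(counts.values())
            covered, chosen = 0, []
            for app, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
                chosen.append(app)
                covered += n
                if covered / total >= threshold:
                    break
            return chosen

        print(pareto_set(incidents))   # ['billing', 'crm'] covers over 80% here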

  • "Wired and Ready to Wear"
    Military & Aerospace Electronics (11/04) Vol. 15, No. 11, P. 28; McHale, John

    Commercial companies are driving the development of wearable electronics for military use, with functions that range from networking soldiers with ground, sea, and air forces to language translation to delivering situational awareness that gives the infantry the best chance to survive and defeat an enemy. Microvision's Nomad helmet-mounted display is designed to improve tactical area situation awareness for commanders in the field by allowing them to access see-through, daylight-readable vehicle displays while keeping their heads outside vehicles. A Microvision white paper claims that situational awareness will enable armor units to "confidently and swiftly take the fight to the enemy while avoiding friendly fire incidents." The U.S. Air Force will test 10 prototype Itronix GoBook tablet computers powered by direct-liquid fuel cells from Medis Technologies, provided by General Dynamics C4 Systems as a potential successor to current Air Force ground air-traffic control systems. This would enable extended field operations without the need for centralized recharging equipment and spare batteries. "This science and technology initiative will equip the U.S. Air Force with an enhanced Battlefield Air Operations kit, including an advanced computing platform with increased endurance for dismounted missions in an overall lighter, wearable and more deployable package," comments GD C4S general manager Chris Marzilli, who adds that the system could also find use in Defense Department products such as Joint Tactical Radio System Cluster 5 software-defined radios. Xybernaut's patented "Wearable Computer System," which can be worn in a person's collar, can support adjustable components such as a display, a monitor, a microphone, and an audio headset. Xybernaut President Steven Newman says hands-free computing and communication access is critical to war fighters, first responders, field service technicians, and mechanics.
    Click Here to View Full Article

  • "The View From the Top"
    IEEE Spectrum (11/04) Vol. 41, No. 11, P. 36; Applewhite, Ashton

    Many distinguished science and engineering luminaries harbor similar attitudes about the most profound technology of the last four decades, the technology likely to have the biggest impact in the next 10 years, and the most surprising examples of technological development. Experts often reiterated the importance of the integrated circuit, the computer, and the Internet in the first instance, wireless communication, embedded computing, and distributed sensing in the second, and the rate of technological change in the third. Intel CEO Craig Barrett warned that the United States will lose its leadership in the world's tech sector unless it invests more in science and engineering education, infrastructure, and research and development, while Mathkumalli Vidyasagar with Tata Consultancy Services envisions the Internet and wireless technology bridging the digital divide through broadband and native-language email, text processing, database management, and data retrieval. Harvard University electrical engineering research fellow Federico Capasso predicted that quantum mechanics will play an influential role in communications, computing, and sensors in the next several decades, while National Academy of Engineering President William Wulf pointed to the University of Virginia's work on a sensor-equipped chip designed to detect and transmit data about structural integrity. MIT professor Mildred Dresselhaus commented on a changing industry focus from software to hardware--specifically nanolevel electronics--due to hardware's approaching scalability threshold, and fellow MIT researcher Neil Gershenfeld foresees digital manufacturing furthering the advancement of personal fabrication. Understanding natural, organizational, and social environments in innovative ways through distributed sensing was viewed as critical by Priscilla Nelson with the National Science Foundation, while Fujitsu Laboratories President Kazuo Murano believes robots will be a key technology for addressing social issues. UC San Diego's Francine Berman talked about how increased dependence on the Internet, cell phones, and data storage is bringing the need to create a solid but evolving "cyberinfrastructure" to the fore.
    Click Here to View Full Article


 