
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 695: Friday, September 17, 2004

  • "American Programmers Still Alive and Kicking"
    IT Management (09/16/04); Pastore, Michael

    Edward Yourdon, author of the upcoming book "Outsource: Competing in the Global Productivity Race," foresaw rough waters ahead for U.S.-based programmers as far back as 1989, when he noted that Indian workers adhered to very high standards of quality and productivity and were willing to offer their services cheaply; this was one of the earliest inklings of the offshore outsourcing movement, which has surged in recent years thanks to the economic downturn and improved global collaboration and communications technologies. Yourdon thinks the next phase of the offshoring boom could be fueled by some as-yet unimaginable technology that transcends linguistic and cultural barriers, although he finds the economic cycle harder to forecast. There is little argument, however, that the offshoring wave is unlikely to reverse itself, and the migration of tech positions to overseas workers has U.S. professionals doubting the security of their jobs, while younger people who want to pursue tech careers are being forced to rethink their options. Yourdon believes entry-level employees and the bottom 10% to 20% of the workforce will bear the brunt of the outsourcing trend. "One of the biggest issues we'll struggle with as a nation and a society is how we'll subsidize and take care of entry-level workers," he predicts. Yourdon suggests that students increase their value to employers by gaining specialized training in another field besides computer science, such as law or biology; complicating matters is the fact that students' desire to study math, science, and technology can wither before they even enter college. He also notes that current industry workers must be willing to continue their education and gain new skills to remain viable, yet is puzzled that this willingness appears to be so scarce.
    Click Here to View Full Article

  • "Looking for a User-Friendly Internet"
    Wall Street Journal (09/16/04) P. B4; Dvorak, Phred

    The Internet can be a double-edged sword for visually impaired users: It can expand their independence, or isolate them further from important information if it is too complex to navigate. Solving this problem has been the goal of Chieko Asakawa, an IBM Japan researcher focused on accessibility for the blind, for the past eight years. Asakawa, who is herself blind, began her tenure at IBM by developing software that allowed Japanese Braille-card punchers to perform their work on a PC. Her exploration of the Internet started in the mid-1990s, when she used Netscape and IBM screen-reader software to help her browse; however, the software was only programmed to read English, and could not translate fill-out forms or tables with vertical columns. These limitations prompted Asakawa to devise software that added aural cues to aid navigation--for example, it assigned a male voice to read text and a female voice to read links. The first version of the software, dubbed the Home Page Reader, was in Japanese, followed by versions in English and nine other languages. However, Asakawa learned that as the number of Web sites grew, the percentage of those sites that her software could read shrank, due to the increased use of graphics, animation, and other non-textual components. So the IBM researcher developed aDesigner, a program Web designers can use to graphically evaluate how accessible their pages are to the visually impaired. The program reveals how long it will take reader software to reach a piece of data at the top of a Web page, and grades the accessibility of page elements by lightening or darkening backgrounds.
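    The article credits aDesigner with showing how long reader software needs to reach a given piece of data on a page. As a rough illustration of that idea only (not IBM's tool or its actual metric), the sketch below walks a page's text in document order with Python's standard html.parser and, assuming a fixed reading rate, reports how many seconds of speech precede each link.

        from html.parser import HTMLParser

        WORDS_PER_SECOND = 3.0   # assumed reading rate, not aDesigner's figure

        class ReachTimeEstimator(HTMLParser):
            """Estimate how many seconds of reading precede each link on a page."""
            def __init__(self):
                super().__init__()
                self.seconds = 0.0
                self.link_times = []              # (seconds_to_reach, href) pairs

            def handle_starttag(self, tag, attrs):
                if tag == "a":                    # record the time at which a link is reached
                    self.link_times.append((round(self.seconds, 1), dict(attrs).get("href", "")))

            def handle_data(self, data):
                self.seconds += len(data.split()) / WORDS_PER_SECOND

        page = "<h1>Weather</h1><p>" + "word " * 90 + "</p><a href='/forecast'>Forecast</a>"
        est = ReachTimeEstimator()
        est.feed(page)
        print(est.link_times)                     # [(30.3, '/forecast')] -- about 30 s of reading first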

  • "DHS Moves Ahead With Cybersecurity R&D Efforts"
    Computerworld (09/15/04); Verton, Dan

    The Department of Homeland Security (DHS) is engaged in several pilot cybersecurity efforts designed to address the scarcity of real-world incident data, such as the Protected Repository for Defense of Infrastructure Against Cyber Threats (Protect) program. The goal of Protect is to convince major private-sector infrastructure companies to voluntarily provide real-world attack data that can be used to test prototype cybersecurity measures, says Douglas Maughan with the Homeland Security Advanced Research Projects Agency. He says the program would be dependent on a trustworthy access repository process featuring a government-backed data repository hosted by a third party, with written contracts with data suppliers; researchers can apply to participate in Protect, while data owners would be permitted to block access for specific researchers. Meanwhile, DHS' Cyber Defense Technology Experimental Research test bed aims to contribute to the creation of next-generation critical infrastructure security technologies by building a homogeneous emulation cluster residing at the University of Utah's Emulab facility. The initiative, which lets researchers concentrate on security hole prevention and detection as well as assess operational systems' security and dependability, has so far received $14 million in funding. Sept. 20 marks the first meeting of the DHS' Border Gateway Protocol steering committee, which is readying R&D pilots to build safe protocols for the routing framework that links ISPs and subscriber networks, which is highly susceptible to human error and router-directed assaults. Another DHS-organized steering committee will analyze and develop cybersecurity pilots for the Domain Name System that will study such dangers and vulnerabilities as denial-of-service attacks and unsanctioned root servers and top-level domains.
    Click Here to View Full Article

  • "Supercomputers Aid Hurricane Forecasting"
    Associated Press (09/16/04); Fordahl, Matthew

    Weather forecasting has advanced significantly with supercomputing: Supercomputer-aided modeling has helped usher in increasingly accurate five-day forecasts and a 50% reduction in hurricane track error in the National Hurricane Center's three-day projections. Supercomputers responsible for these upgrades include a machine at the U.S. Navy's numerical weather computing center composed of hundreds of microprocessors performing billions of operations every second. Refining weather forecasting requires more than additional hardware, because the numerical models are subject to constant amendment as researchers increase their atmospheric knowledge and make their algorithms more precise. The Fortran-based models try to take all atmospheric phenomena into consideration and analyze them on a global scale by enveloping the Earth in a 3D grid. The computer then processes initial observations of humidity, air pressure, temperature, wind speed, and other variables at each of the grid's intersection points so that a predictive model can be inferred; a drastically simplified illustration of this kind of grid update appears below the article link. National Oceanic and Atmospheric Administration researchers plan to debut a finer-grained weather model in the next several years that takes the interaction of land, atmosphere, and ocean into account. The simulation will use data collected from a Doppler-radar-equipped Gulfstream IV jet, and will run at a Maryland supercomputer center that currently processes 116 million daily observations using IBM server clusters with hundreds of commoditized PC chips.
    Click Here to View Full Article
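    The grid-and-time-step scheme described above can be illustrated with a toy model. The sketch below is a drastic simplification of an operational Fortran forecast model: it tracks a single field (temperature) on a small 2D grid and advances it with an explicit diffusion step in NumPy, just to show how each grid intersection point is updated from its neighbors at every time step.

        import numpy as np

        def step(temp, diffusivity=0.1):
            """Advance a 2D temperature grid one time step by simple diffusion:
            each point relaxes toward the average of its four neighbors. Real
            models integrate far richer physics (wind, pressure, humidity...)."""
            laplacian = (np.roll(temp, 1, axis=0) + np.roll(temp, -1, axis=0) +
                         np.roll(temp, 1, axis=1) + np.roll(temp, -1, axis=1) - 4 * temp)
            return temp + diffusivity * laplacian

        # Initial "observations": a warm spot on an otherwise uniform 20x20 grid.
        temp = np.full((20, 20), 15.0)
        temp[10, 10] = 30.0
        for _ in range(48):                       # 48 toy time steps
            temp = step(temp)
        print(round(temp[10, 10], 2), round(temp.mean(), 2))  # the hot spot diffuses; the mean holds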

  • "They're Robots? Those Beasts!"
    New York Times (09/16/04) P. E1; Kirsner, Scott

    Some robotics researchers are looking to the animal world for inspiration, and they think the biomimetic devices they create will be able to function in places inaccessible to current-generation robots. Examples of biomimetic machines include segmented snake- and trunk-shaped robots from Carnegie Mellon University; Mecho-Gecko, a device designed by the University of California, Berkeley, and iRobot; Northwestern University's RoboLobster; MIT's fish-inspired RoboPike and RoboTuna; and RHex, a six-legged machine based on the cockroach. Although researchers often tout the robots' battlefield and homeland defense potential to secure funding, military applications are just one consideration: Robotic whales, for instance, could swim with their real-world counterparts, gathering and transmitting data to marine biologists or students. Medical applications for such machines could include minimally invasive surgery (an area CMU's Dr. Howie Choset is exploring with his snakebots), while legged robots could lay the groundwork for improved prosthetic limbs for amputees. Researchers think biomimetics will advance significantly with the advent of electrically-driven artificial muscles, but in the meantime they are studying animal biology to imbue their robots with capabilities that current machines lack. University of California professor Robert Full has developed an adhesive substance based on a gecko's toes that Boston Dynamics may incorporate into a climbing robot, although the company is chiefly focused on BigDog, a Pentagon-sponsored quadruped machine that runs; BigDog is currently awaiting a vision system from NASA's Jet Propulsion Laboratory. Although many biomimetic robots have legs, users of robot technology prefer wheels and tank treads because of their increased efficiency, notes iRobot President Helen Greiner.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Human Errors Silenced Airports"
    Los Angeles Times (09/16/04) P. A1; Alonso-Zaldivar, Ricardo; Malnic, Eric; Oldham, Jennifer

    A software glitch led to a three-hour shutdown of Southern California's air traffic control radio system, cutting off radio communications and leading to five incidents in which planes breached the required separation distance from one another. FAA officials said the radio system, known as the Voice Switching and Control System (VSCS), contained a software glitch discovered a year ago as the agency began upgrading the systems nationwide. Originally based on a Unix system built by Harris, the upgraded touch-screen system used Dell computers running a Microsoft operating system; the new system automatically shuts down after 49.7 days of continuous operation to prevent a data overload that could feed controllers incorrect information without warning of a malfunction (a note on the 49.7-day figure appears below the article link). FAA officials blamed an improperly trained technician for failing to manually reset the internal clock during maintenance, leading to the initial failure, while the back-up radio system's subsequent failure was also attributed to a technician's mistake. A technicians union advisor, Richard Riggs, said the software glitch should have been fixed when it was first discovered and before the new systems were deployed at 21 regional air traffic control centers. FAA officials have so far corrected the error only in the Seattle air traffic control center, but have deployed an early warning system in the Southern California center that is intended to prevent another outage. The three-hour radio communications shutdown left planes above Southern California, Arizona, and New Mexico without air-traffic control instructions until communications tasks were handed off to other regional centers. In two cases, pilots had to take evasive maneuvers to avoid danger, while Los Angeles International Airport officials said approximately 30,000 travelers were affected at their airport alone.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
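    The report does not explain where the 49.7-day restart interval comes from, but the figure is exactly what a 32-bit counter ticking in milliseconds would produce, since such a counter overflows after 2^32 milliseconds. That reading is an assumption rather than something stated in the article; the arithmetic itself is straightforward:

        MS_PER_DAY = 1000 * 60 * 60 * 24            # 86,400,000 milliseconds per day
        days_until_overflow = 2 ** 32 / MS_PER_DAY  # 4,294,967,296 ms until a 32-bit counter wraps
        print(round(days_until_overflow, 1))        # 49.7 -- matches the reported shutdown interval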

  • "Nose-Steered Mouse Could Save Aching Arms"
    New Scientist (09/16/04); Biever, Celeste

    Dmitry Gorodnichy of Canada's Institute for Information Technology has created the "nouse," a tool that enables PC users to navigate using the movements of their nose and eye blinks. The nouse can navigate around 2D computer software using a single Webcam, and 3D software with two Webcams. The interface is equipped with tracking software that observes the image captured by the Webcam to determine where a user's nose is pointing, and produces signals that drive the movement of the cursor; the nouse camera takes a picture of the user at the beginning of a session, and from this image it extracts approximately 25 pixels comprising the tip of the nose and reads each pixel's luminosity level (a toy sketch of this kind of patch tracking appears below the article link). Meanwhile, motion detection software registers eye blinks to trigger activations similar to mouse clicks. Earlier face-tracking interfaces that focus on the user's eyebrows or mouth can run into trouble because the tracking points become distorted when the viewing angle changes even slightly, a problem the nouse avoids because the software can discern the nose tip's distinctive pixel pattern from any angle, according to Gorodnichy. In the journal Image and Vision Computing, Gorodnichy explains that the technology could have applications in video gaming and virtual environment navigation; he also envisions the nouse as a helpful tool for disabled users. Cybernet Systems' Charles Cohen believes Gorodnichy's invention will probably complement the traditional mouse and keyboard, while Jupiter Research analyst Joe Laszlo doubts that users will take the nouse seriously.
    Click Here to View Full Article
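    Gorodnichy's exact algorithm is not described in detail here, but the idea of locking onto a roughly 25-pixel nose-tip patch and following it between frames can be illustrated with a brute-force template match. The sketch below (an illustration only, not the nouse software) finds where a small luminosity patch best fits in a new frame by minimizing the sum of squared differences.

        import numpy as np

        def track_patch(frame, template):
            """Return the (row, col) where `template` -- e.g. a ~25-pixel patch around
            the nose tip captured at the start of a session -- best matches `frame`."""
            th, tw = template.shape
            best_score, best_pos = None, None
            for r in range(frame.shape[0] - th + 1):
                for c in range(frame.shape[1] - tw + 1):
                    score = np.sum((frame[r:r + th, c:c + tw] - template) ** 2)
                    if best_score is None or score < best_score:
                        best_score, best_pos = score, (r, c)
            return best_pos

        rng = np.random.default_rng(0)
        frame = rng.random((40, 40))              # stand-in for a webcam luminosity image
        template = frame[18:23, 21:26].copy()     # 5 x 5 = 25-pixel "nose tip" patch
        print(track_patch(frame, template))       # (18, 21) -- the cursor would follow this point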

  • "Berners-Lee Calls for More Voice Apps"
    InternetNews.com (09/14/04); Naraine, Ryan

    In a keynote address at the SpeechTek conference, World Wide Web creator and World Wide Web Consortium (W3C) director Sir Tim Berners-Lee called for further development of voice recognition systems, whose current shortcomings are causing user frustration that could be detrimental to the industry. "Generally, I'm impressed with what voice technology could do but when it can't understand that I'm shouting 'yes!' into the telephone, there are limitations," he declared. Berners-Lee said end users crave a smooth, coherent experience when they interact with voice-activated phone systems, noting that it is the responsibility of voice technology companies to make those systems capable of accurately understanding mumbles and distorted phrasing, as well as the context of certain vocal commands. He said W3C-backed standards will drive more sophisticated use of voice technology, and invited developers to participate in the W3C's voice browser and multimodal interaction activities to develop standards. The W3C director called attention to the recent publication of the consortium's recommendation for Speech Synthesis Markup Language 1.0, which enables speech- or hearing-impaired people to access VoiceXML-based services using text phones, and is also designed to aid the construction of mobile phone and personal digital assistant applications. Berners-Lee said that enterprise adoption of the Semantic Web could be encouraged by improved voice technologies, and that the Semantic Web could in turn help solve problems with combining voice commands and existing back-end databases.
    Click Here to View Full Article

  • "Too Hot to Handle"
    San Francisco Chronicle (09/13/04) P. H1; Yi, Matthew

    PC processors and graphics chips are generating uncomfortable amounts of heat for users, highlighting one of the most pressing technical concerns for the semiconductor industry. The excessive heat is a result of greater numbers of transistors being packed into smaller spaces. Chipmakers increase the density of their chips as a way to boost performance, since denser chips pack more transistors and can run faster; as a result, products such as the 3.4GHz Pentium 4 Extreme Edition house as many as 178 million transistors, which collectively draw approximately 100 watts of electricity. Chip surfaces reach about 160 degrees Fahrenheit, while the temperature inside PCs is normally between 95 and 105 degrees Fahrenheit. Intel, AMD, IBM, and Sun Microsystems have either adopted or plan to use dual-core architectures for their chips, which increase processing capability while generating less heat than a larger, single-core chip (a first-order power calculation illustrating the dual-core argument appears below the article link); another heat-reduction technique is to use new materials, as with silicon-on-insulator technology. The main recourse of chipmakers is to continue down the path of Moore's Law and create smaller transistors on the chip. AMD vice president Craig Sanders says a physical limit lies somewhere below 20nm, but Intel fellow Stephen Pawlowski says no one knows how small chip manufacturing techniques can go. IBM chief technical officer Bernie Meyerson says the semiconductor industry has yet to learn how to build chips in a balanced way, and notes that a relatively new approach is ratcheting up speed on parts of the chip according to demand. Many of today's PCs dissipate heat with heat sinks or fans, but the latest technology involves water-filled copper tubes or liquid cooling systems that work like a car's radiator. Meanwhile, increasingly powerful graphics chips, which can exceed 100 watts, only add to the heat problem.
    Click Here to View Full Article
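    The dual-core argument rests on a textbook relationship not spelled out in the article: first-order CMOS switching power scales as P = alpha * C * V^2 * f, so two cores at a lower clock and supply voltage can match or exceed one fast core's throughput while dissipating less heat. The numbers below are hypothetical and normalized, not actual Pentium 4 parameters.

        def dynamic_power(alpha, capacitance, voltage, frequency):
            """First-order CMOS switching power: P = alpha * C * V^2 * f."""
            return alpha * capacitance * voltage ** 2 * frequency

        single_core = dynamic_power(alpha=1.0, capacitance=1.0, voltage=1.4, frequency=3.4e9)
        # Two cores at roughly 60% of the clock and a slightly lower supply voltage:
        dual_core = 2 * dynamic_power(alpha=1.0, capacitance=1.0, voltage=1.2, frequency=2.0e9)
        print(round(dual_core / single_core, 2))   # ~0.86 -- more aggregate throughput, less power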

  • "IETF Deals Microsoft's E-Mail Proposal a Setback"
    IDG News Service (09/14/04); Roberts, Paul

    Microsoft's SenderID technology was sent back for revision after the Internet Engineering Task Force (IETF) group studying the anti-spam proposal said it contained vague intellectual property claims. Open source groups such as the Debian Project and the Apache Software Foundation have already said they cannot use the technology because its licensing terms are incompatible with their own open source licenses. The IETF Mail Transfer Agent Authorization Records in DNS (MARID) working group voted to reject the current specification because of intellectual property concerns surrounding the algorithms used for "purported responsible address" checks. Microsoft has refused to discuss the scope of its intellectual property claims on those technologies, causing many technologists to fear possible licensing requirements in the future. SenderID was created in May by merging Microsoft's CallerID for email with the Sender Policy Framework (SPF) created by Pobox.com's Meng Weng Wong. SPF records have already been published by tens of thousands of Internet domains, allowing email operators to check DNS servers for the authenticity of email "envelopes," which are sent ahead of actual email messages; the new SenderID specification would extend those checks to the email header as well and prevent spammers from faking the "from" address (a minimal sketch of an SPF record lookup appears below the article link). In a statement, Microsoft said the MARID decision was not a rejection of SenderID, but rather part of a refinement process that would likely mean alternative mechanisms for conducting "purported responsible address" checks would be included. Wong agrees with the IETF decision, saying standards must be protected from intellectual property disputes.
    Click Here to View Full Article
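    For readers unfamiliar with SPF's mechanics, the receiving side of the scheme starts with a DNS TXT lookup for the sending domain's published policy. The sketch below shows only that first step, assuming the third-party dnspython package; the full SPF evaluation (and SenderID's "purported responsible address" check of the header) additionally matches the connecting client's IP address against the record's mechanisms.

        import dns.resolver                        # third-party "dnspython" package

        def fetch_spf(domain):
            """Return the domain's published SPF policy string, if any."""
            for rdata in dns.resolver.resolve(domain, "TXT"):
                txt = b"".join(rdata.strings).decode()
                if txt.startswith("v=spf1"):
                    return txt                     # e.g. "v=spf1 mx ip4:192.0.2.0/24 -all"
            return None

        # A receiving mail server would compare the connecting client's IP address
        # against the mechanisms in this record before accepting the message.
        print(fetch_spf("example.com"))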

  • "Proving That Shape-Shifting Robots Can Get a Move On"
    Newswise (09/16/04)

    A major challenge in creating self-reconfigurable, or shape-shifting, robots is establishing control and planning methods that will prevent the machines from falling apart or getting stuck as they move, and a team of Dartmouth College researchers led by Daniela Rus has devised such methods, as detailed in the September 2004 issue of the International Journal of Robotics Research (IJRR). "These latest papers show it is possible to develop self-reconfiguration capabilities in a way that has analytical guarantees," explains Rus, former director of the Dartmouth Robotics Lab and a MacArthur Foundation Fellowship recipient. Shape-shifting robots are designed to adapt their configuration to changing needs or tasks, preferably without human assistance, and Rus has spent the last decade making strides in both the underlying technology and the control methodology. One of her breakthroughs is the design of 3D self-reconfigurable robots composed of "expanding cubes," a realization of the "lattice robot" concept. Lattice robots self-reconfigure by planning how to reshape themselves between two configurations, and also plan the series of shapes needed to meet more complex challenges (a toy example of module-by-module lattice reconfiguration appears below the article link). Shape-shifting robotics researchers generally agree that lattice robot reconfiguration should be governed by distributed rather than centralized methods, and the paper in this month's IJRR, along with an earlier report in the same journal last September, details such techniques: Both papers provide rules that dictate how such devices should travel over terrain, build structures to climb over impediments, or enter closed spaces through small passages. Unlike conventional robotics control methods, which are tied to particular hardware, Rus' work outlines control and planning for a whole class of lattice robots. The National Science Foundation has funded her work through awards for the past eight years.
    Click Here to View Full Article
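    The papers' analytically guaranteed planners are far more sophisticated than anything that fits here, but the basic lattice-robot idea, reconfiguration as a sequence of discrete module moves on a grid that never break connectivity, can be shown with a toy rule. In the sketch below (an illustration only, not Rus' algorithm), a connected row of cube modules "crawls" east by repeatedly relocating its rearmost module to the front.

        def crawl_east(modules, steps):
            """Toy lattice reconfiguration: detach the westernmost module and
            re-attach it just east of the easternmost one, `steps` times.
            The row of modules stays connected throughout."""
            modules = set(modules)                 # each module occupies an (x, y) lattice cell
            for _ in range(steps):
                tail = min(modules)                # westernmost cell
                head = max(modules)                # easternmost cell
                modules.remove(tail)
                modules.add((head[0] + 1, head[1]))
            return modules

        row = {(x, 0) for x in range(4)}           # a 4-module chain occupying x = 0..3
        print(sorted(crawl_east(row, steps=3)))    # [(3, 0), (4, 0), (5, 0), (6, 0)]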

  • "Dozens of Experts Take on Cyberterror"
    Seattle Post-Intelligencer (09/13/04); Shukovsky, Paul

    Government and business leaders from across the Pacific Northwest conducted a cyberterror simulation last week to assess the vulnerability of computer-controlled critical infrastructure. The public-private partnership attracted more than 100 experts from several states, the Department of Homeland Security, the military branches, Microsoft, Boeing, the FBI, a number of U.S. and Canadian utilities, the Bonneville Power Administration (BPA), and the Los Alamos, Sandia, and Argonne national laboratories. In opening remarks, Maj. Gen. Timothy Lowenberg, adjutant general of the Washington National Guard, described cybertechnology as a great strength for the nation, but also as an area of tremendous weakness. The exercise, dubbed Blue Cascades II, gave experts an opportunity to determine how telecommunications, utilities, and other major systems rely upon one another, such as how a power failure brings banking and finance to a halt, for example. Participants signed an agreement not to reveal the result of the exercise, and a reporter was asked to leave after introductions. In exercises conducted by the BPA, systems were found to be secure from attacks. However, "there are some utilities that operate on the Internet, and that's a vulnerability," said BPA security manager Robert Windus.
    Click Here to View Full Article

  • "Software Tutors Offer Help and Customized Hints"
    New York Times (09/16/04) P. E3; Hafner, Katie

    Carnegie Learning's Cognitive Tutor program is employed at 1,700 U.S. middle schools and high schools to offer math students more personalized instruction using artificial intelligence. The educational software monitors students' performance while providing customized feedback and hints, and paints a picture of a specific user's strengths and weaknesses in order to suggest improvement strategies. Cognitive Tutor's AI component presents drills tailored to a student's deficiencies and observes the work step by step, detecting when the user runs into trouble and offering hints when needed (a simplified sketch of how a tutor might track per-skill mastery appears below the article link). Carnegie Mellon University professor and Carnegie Learning co-founder Ken Koedinger remarks that the problem-solving process differs for each person, and Cognitive Tutor is unique among computer-aided instruction systems because it takes this variance into account. The software is visually arranged to get students to fill up an onscreen "skillometer" with gold bars as their skill levels increase, while users can request hints by clicking on a light bulb icon. Some critics argue that intelligent tutoring systems' tendency to aggressively guide students toward correct answers when they encounter difficulty actually discourages the development of skills for dealing with more complex problems. The programs are well suited to math and science problems, but the software is harder to apply to more abstract disciplines such as psychology, economics, and business. Still, CMU researchers have documented significant performance improvements among students who use Cognitive Tutor.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
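    Carnegie Learning's internals are not described in the article; one standard approach from the intelligent-tutoring literature for the kind of per-skill tracking described above is Bayesian knowledge tracing, sketched below. The parameters (slip, guess, and learn rates) and the idea of a mastery threshold are illustrative assumptions, not Cognitive Tutor's actual values.

        def update_mastery(p_known, correct, slip=0.1, guess=0.2, learn=0.15):
            """Bayesian knowledge tracing update after one observed problem step."""
            if correct:
                evidence = p_known * (1 - slip)
                posterior = evidence / (evidence + (1 - p_known) * guess)
            else:
                evidence = p_known * slip
                posterior = evidence / (evidence + (1 - p_known) * (1 - guess))
            return posterior + (1 - posterior) * learn   # chance the skill was just learned

        p = 0.3                                    # prior estimate for, say, "solving linear equations"
        for outcome in [True, True, False, True]:  # observed steps: right, right, wrong, right
            p = update_mastery(p, outcome)
        print(round(p, 2))                         # 0.92 -- a skillometer bar could fill as this rises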

  • "Digital Alchemy"
    Technology Review (09/17/04); Hellweg, Eric

    The computing industry has been pursuing the vision of software emulation for nearly three decades, but such efforts have yielded few results with application beyond enabling one specific program to run on one other kind of processor, and the resulting products are often hobbled by significant performance degradation. Transitive Software, a startup company, claims to have made a breakthrough with Quick Transit, a program that reportedly "allows software applications compiled for one processor and operating system to run on another processor and operating system without any source code or binary changes" with minimal performance degradation. Transitive CEO Bob Wiederhold explains that his company has been developing Quick Transit for nine years, and its announcement has been met with both excitement and skepticism (a conceptual toy example of cross-instruction-set translation appears below the article link). If Quick Transit actually delivers on its promises, it could make the migration of a company's software to new hardware simpler and less expensive, an essential step for firms that want to upgrade or switch servers. The emulation program could also allow a single server to run multiple tasks, facilitating server consolidation and lessening budget overhead and management headaches. Transitive says a half-dozen as-yet-unidentified companies have signed up for Quick Transit, and Wiederhold expects the first customer announcement in the coming months. The emulator will initially be marketed to large computer makers, although the technology could eventually reach consumer markets such as video games.
    Click Here to View Full Article
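    Transitive has not disclosed how Quick Transit works, so the sketch below is only a generic, heavily simplified picture of dynamic binary translation: instructions for a made-up "source" machine are translated once into host-executable form (here, Python closures), cached, and then run natively, which is how such systems amortize translation overhead.

        def translate(instr):
            """Translate one toy source-machine instruction into a host-native closure."""
            op = instr[0]
            if op == "LOAD":                       # ("LOAD", register, immediate)
                _, reg, value = instr
                return lambda regs: regs.__setitem__(reg, value)
            if op == "ADD":                        # ("ADD", dest, src1, src2)
                _, dest, a, b = instr
                return lambda regs: regs.__setitem__(dest, regs[a] + regs[b])
            if op == "PRINT":                      # ("PRINT", register)
                _, reg = instr
                return lambda regs: print(regs[reg])
            raise ValueError("unknown opcode: " + op)

        program = [("LOAD", "r1", 2), ("LOAD", "r2", 40), ("ADD", "r0", "r1", "r2"), ("PRINT", "r0")]
        translated = [translate(i) for i in program]   # translate once and cache...
        registers = {}
        for host_op in translated:                     # ...then execute the host-native code
            host_op(registers)                         # prints 42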

  • "As WGIG Forms, Ideas About Defining Its Scope Circulate"
    CircleID (09/14/04); Mueller, Milton

    The Internet Governance Project has issued a set of reports on the current "state of play" in Internet governance, commissioned by the United Nations ICT Task Force as input for the U.N. Secretary-General's Working Group on Internet Governance, writes Syracuse University School of Information Studies professor Milton Mueller. The report attempts to define exactly what is meant by the "Internet" and what falls under "Internet governance," and identifies the organizations and agreements that affect the Internet globally. The report grew out of issues raised at February's ITU Geneva Workshop and the UN ICT Task Force's Global Forum in New York City in March. It defines the "Internet" in terms of the protocols used, noting that physical-layer and layer-2 issues are out of scope and that only layer 3 and above applies. It stipulates that issues beyond ICANN's reach should also be categorized as governance, including protection of intellectual property rights in digitized data that use the Internet for dissemination; surveillance and privacy issues pertaining to ISPs and Web users; and trade and e-commerce issues. The report also calls for government authorities to "find a foundation of legitimacy and accountability" for non-state bodies such as the IETF and ICANN. U.N. Secretary-General Kofi Annan is expected to name the members of the working group in a week or two.
    Click Here to View Full Article

  • "Cursor on Target"
    Military Information Technology (09/02/04) Vol. 8, No. 7; Miller, William

    The Cursor on Target (CoT) project is a joint venture between the Air Force Electronic Systems Center (ESC), MITRE, the Navy, the Air Force Special Operations Command, and the Air Force Research Laboratory to synchronize the battlefield operations of space, air, and ground forces through a software platform that facilitates the sharing of mission-critical data between field command and control systems without a heavy reliance on human decision-making. The prototype software meshes the communications of all military components into one network using common message elements, and inserts a common data overlay that removes user format incompatibilities. CoT's key application is as a replacement for the voice and physical interface used by field-based combat controllers to send targeting data. CoT is designed to route all required data and tasking orders to the target as needed when command center staff move the cursor over the target and signal their authorization by clicking. "The decision to put the cursor on the target and bring firepower on it always has to be a human decision," assures Colonel Mike Therrien, who directs the Command Interoperability Program for ESC's C4ISR Enterprise Integration Office. Tests have demonstrated that CoT software enables sensor-to-shooter paths that speed up the targeting operation by almost 70% while boosting firepower accuracy. MITRE's Richard J. Byrne says the Extensible Markup Language in which CoT messages are expressed lets users exchange mission-critical information by focusing on the most important data elements, in much the same way that people in foreign lands are trained to communicate with natives using keywords rather than learning entire languages (a notional example of such a message appears below the article link).
    Click Here to View Full Article
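    The article does not reproduce the CoT message format, so the element and attribute names below are illustrative assumptions rather than the official schema. The sketch simply builds a small "what, where, when" message with Python's standard xml.etree.ElementTree to show why a handful of common XML elements is enough for very different systems to exchange targeting data.

        import xml.etree.ElementTree as ET
        from datetime import datetime, timezone

        def make_target_message(uid, target_type, lat, lon):
            """Build a notional what/where/when targeting message as XML."""
            event = ET.Element("event", uid=uid, type=target_type,
                               time=datetime.now(timezone.utc).isoformat())
            ET.SubElement(event, "point", lat=str(lat), lon=str(lon))
            return ET.tostring(event, encoding="unicode")

        # Any system that understands these few elements can consume the message,
        # regardless of its own native data format.
        print(make_target_message("demo-0001", "hostile-vehicle", 34.05, -117.60))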

  • "The SOA Puzzle: Five Missing Pieces"
    InfoWorld (09/13/04) Vol. 26, No. 37, P. 42; Sleeper, Brent

    The promise of service-oriented architecture (SOA) is impeded by challenges in the areas of reliable asynchronous messaging, orchestration, security, legacy support, and semantics, and IT managers' decision to follow either a conventional enterprise application integration strategy or a Web services SOA scheme hinges on the IT unit's goals as well as which challenge it is confronted with. Aiaz Kazi of Tibco calls messaging reliability an essential element of enterprise-quality integration, but it is perhaps the toughest challenge to meet because reliability standards such as WS-Reliable Messaging are not in wide use--for now, the only real strategy for accommodating the reliability gap is to build applications around it, according to Thomson Prometric's Chris Crowhurst. Orchestration, a blanket term for the design and execution of composite Web services, comprises the spine of business process management solutions, and is key to many SOA integration strategy adopters; proposed orchestration standards of note include Business Process Execution Language and the complementary WS-AtomicTransaction and WS-Composite Application Framework, which are designed to effect the transactions comprising long-running business processes. There are numerous proposed security standards for Web services-style interactions on the boards, three of which--SAML, WS-Security, and WS-Policy--collectively yield a solid basis for planning ongoing strategy. A relatively pragmatic, short-term solution to the security question is to use transport-level mechanisms such as SSL that are employed to protect Web-based applications. Embedding legacy systems and packaged applications into SOA requires legacy application adapters, but IT managers currently have no choice but to use costly proprietary adapters to connect outmoded systems to a common application architecture. Finally, patching the semantics gap so vital to well-designed SOAs is beyond the capabilities of technology and software products, but the pressure on business and IT managers to classify and deploy functions and data models for industry- and function-specific processes can be lessened by employing prebuilt elements and robust consulting experience.
    Click Here to View Full Article

  • "The Next Threat"
    Forbes (09/20/04) Vol. 174, No. 5, P. 70; Lenzner, Robert; Vardi, Nathan

    There is growing evidence that terrorist cells such as al Qaeda are attempting to become skilled in hacking and other forms of cyberwarfare, and experts warn that cyberterrorists could cripple the World Wide Web, interfere with military communications systems, or disrupt electrical grids to catastrophic effect. But few federal agencies or corporations have considered or followed recommendations for shoring up both public and private infrastructure, despite the imminence of the cyberterrorist threat. Reasons for the sluggish response include political in-fighting, beliefs among government officials that the threat is exaggerated, indecision over who should foot the bill for implementing tougher cybersecurity, and regulatory and financial stumbling blocks that are hindering the growth of corporate security spending. American businesses are reluctant to pass on the costs of cybersecurity upgrades to customers, either because they are tightly regulated or are faring so poorly that a price hike could kill them. Rep. William Thornberry (R-Texas) thinks tax incentives would be a far more productive tool to encourage corporate spending than government regulations, while the major automated control system providers contend that customers flatly refuse anything with a price tag, even if it is more secure. However, the deployment of such control systems to run utility grids and other key components of U.S. infrastructure is the reason why America is so vulnerable to cyberattack: Ted Lewis of the Naval Postgraduate School reports that almost 300 facilities responsible for 80% of America's electricity use employ poorly shielded control systems, which lack encryption and are easy to manipulate. Of particular concern are weaknesses demonstrated in the Border Gateway Protocol, which could be exploited to manipulate routing information and corrupt the Internet, and the Domain Name System, which is underpinned by poorly secured root servers.
    Click Here to View Full Article

  • "Exploring the Ultrawideband"
    Science & Technology Review (09/04) P. 12; Heller, Arnie

    A slew of commercial products, many with national and homeland security applications, stem from Lawrence Livermore National Laboratory's micropower impulse radar (MIR) technology, which emits millions of electromagnetic pulses in the ultrawideband (UWB) range each second to facilitate low-power, noninterfering data transmissions. Recent work by Livermore researchers is focused on using MIR to create new classes of UWB-based sensing, imaging, and communication devices characterized by portability, low cost, ruggedness, power efficiency, and resistance to detection, interception, and jamming. The pulses sent out by MIR units can pass through most low-conductivity materials as well as moderately conductive substances such as the human body. Portable MIR motion detectors equipped with directional, high-gain antennas have been developed for search and rescue operations in situations where people may be trapped under deep piles of rubble; the antennas can catch motion caused by breathing and heartbeat from as far away as 100 meters. MIR has been a key technology for several Defense Department projects, including one that reduces the maintenance costs of U.S. Marine Corps and Navy V-22 Osprey tiltrotor aircraft via a device that sends UWB pulses to the spinning rotors overhead, measuring the vertical distance to determine whether rebalancing is needed (the basic pulse-radar range arithmetic is sketched below the article link). The HERMES Bridge Inspector project uses an array of MIR-based modules mounted in a trailer to generate images of bridge deck interiors for deterioration analysis, while the Urban Eyes imaging system employs a pair of MIR sensors to produce a real-time view of movement behind walls. Another MIR-based Livermore technology is Guardian, a portable motion sensor system that networks individual nodes composed of a transmitter, receiver, global positioning system module, and processor into a "mother" node that routes data to a remote monitoring station.
    Click Here to View Full Article
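    The rotor-clearance measurement described above rests on the basic pulse-radar range relation: distance equals the speed of light times the pulse's round-trip time, divided by two. The echo delay below is invented purely for illustration.

        SPEED_OF_LIGHT = 299_792_458.0             # meters per second

        def range_from_echo(round_trip_seconds):
            """Range to a reflector from a radar pulse's round-trip time: d = c * t / 2."""
            return SPEED_OF_LIGHT * round_trip_seconds / 2

        print(round(range_from_echo(20e-9), 2))    # a 20-nanosecond echo => ~3.0 meters away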


 

 