Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 617: Friday, March 12, 2004

  • "Firms Push to Expand Visa Program"
    Wall Street Journal (03/11/04) P. A2; Munoz, Sara Schaefer

    Although chances for approval are slim, U.S. manufacturers and high-tech firms are lobbying Congress to authorize an exception to current H-1B visa policy in order to work around the 65,000 yearly visa cap, which was reached by February. The exception would grant H-1Bs to foreign applicants who have earned master's or doctorate degrees at American universities. Supporters claim that such admissions will actually help reduce offshore outsourcing, as admitting more foreign professionals into the country will lessen U.S. businesses' need to export jobs. "They have been educated in American universities but still have the knowledge, cultural skills and language skills of their home country," notes immigration adviser Elizabeth Dickson. U.S. immigration statistics estimate that workers with an education equivalent to a master's degree or higher, across all professions, made up 42 percent of H-1B visa holders in 2002. The only exception to the H-1B limit currently allowed is for visa holders who work at universities or nonprofit research and development agencies. Proponents of the proposed exception face a tough battle, as opponents of visa expansion include organized labor and legislators such as Connecticut Sen. Chris Dodd (D) and Rep. Nancy Johnson (R). But the lobbyists may yet arouse sympathy from the likes of Republican Sens. Orrin Hatch (Utah) and Saxby Chambliss (Ga.).

  • "Legislators Urge E-Voting Halt"
    Wired News (03/11/04); Zetter, Kim

    California Sens. Don Perata (D-Oakland) and Ross Johnson (R-Irvine) sent a letter to Secretary of State Kevin Shelley urging him to declare a moratorium on paperless electronic voting systems, arguing that a debacle akin to Florida's woes in the 2000 election could be in the offing if the problems that plagued the machines in the March 2 primary are any indication. If the paperless systems are decertified as the senators recommend, California election officials would have little choice but to use optical scan machines, which employ a voter-verifiable paper ballot. The legislators said that e-voting decertification is justified by the malfunctions paperless machines suffered on March 2, which prevented many people from voting; they added that the upcoming presidential election is too important to leave to systems without a paper trail. "None of us want California to be the sequel to Florida," said Perata. Tom Stanionis, data processing manager for Yolo County's election division, said that e-voting machine vendors, in making their products as foolproof as possible, have placed an unreasonable burden on the shoulders of poll workers; as a result, poll workers gave incorrect ballots to voters on March 2, resulting in votes being cast for candidates and races that were not in the voters' district. In addition, Stanionis noted that half of the 14 counties that were using touch-screen machines in last week's primary suffered from counting problems that held up results well past the midnight deadline. The data processing manager thinks decertifying paperless systems until they are improved to voters' satisfaction is a good idea, but feels that election officials may be overwhelmed with this responsibility, given the timing of the directive. However, Sen. Johnson stated at a press conference, "California has a lemon law that protects consumers if they buy a bad car. So far, e-voting in California is a lemon."
    Click Here to View Full Article

    For information about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "DARPA Takes Aim at IT Sacred Cows"
    Government Computer News (03/11/04); Jackson, Joab

    In its push to usher in the age of network-driven warfare, the Defense Advanced Research Projects Agency (DARPA) is considering upgrading some of the key elements of IT, or scrapping them altogether. Program managers at the DARPATech conference such as Col. Tim Gibson of DARPA's Advanced Technology Office complained that many of these elements are fundamentally flawed and uphold inflexible, insecure, and unreliable computer and networking architectures, discouraging military commanders from applying IT to vital tasks. "We don't expect computers to work, we expect them to have a problem," Gibson remarked. He said the packet-based nature of Internet Protocol probably needs to be revamped in order to make message delivery more secure, accurate, and reliable. Gibson maintained that "we must absolutely have some mechanism for assigning network capabilities to different users and that capability has to scale to large numbers of devices automatically." The IP approach's lack of dynamic scalability makes the military's need to quickly establish ad-hoc networks a difficult proposition. Advanced Technology Office program officer Anup Ghosh wondered whether the von Neumann architecture--the basic design paradigm for nearly all modern computers--should be discarded. The drawbacks of this architecture include the likelihood that other programs could be affected by a malfunctioning application, and the possibility that program bugs could be exploited to launch a system-wide attack. Program manager Reggie Brothers noted that the seven-layer Open Systems Interconnection model, regarded as the cornerstone of network protocol construction, cannot accommodate wireless communications devices; he believes a mesh rather than a stack configuration would be a better solution.
    Click Here to View Full Article

  • "Autonomous Vehicles to Put Embedded Technology to the Test"
    Embedded.com (03/09/04); Murray, Charles J.

    The Defense Advanced Research Projects Agency's Grand Challenge, a March 13 off-road race between driverless vehicles stretching about 250 miles between Barstow, Calif., and Las Vegas, will serve as a testbed for embedded technology that could be applied to defense, agriculture, industry, and especially automotive transportation. The vehicles will be equipped with a formidable array of sensors, software, and computers whose tasks will range from avoiding obstacles to driving motors to plotting a course using approximately 1,000 global-positioning-system "way points" that will not be disclosed until just a few hours before the race--all without human intervention. The participants are substituting drivers with by-wire systems that augment the steering wheel, accelerator, brake pedal, and gear shift with motors. SciAutonics' entrant, a revamped all-terrain vehicle (ATV), combines GPS with an inertial-measurement unit (IMU) and a magnetometer; the IMU gauges the vehicle's acceleration and rotation values to determine whether the vehicle is still on its desired course, while the magnetometer furnishes heading data when the vehicle is stationary. Virginia Tech University's entrant, dubbed "Cliff," is designed to recognize obstacles through a combination of radar, laser range finders, a visible-light camera, differential GPS, and dedicated cFP-2010 compact field point (CFP) computers. The CFP units, which transmit data to a trio of PXI-8176 controllers containing a geographic database, make navigation decisions based on information picked up by the sensors. To process the huge amounts of data coming through the sensor array and the GPS gear, Carnegie Mellon University's vehicle, a redesigned Hummer, boasts a four-way Itanium-based computer system with three dual-Xeon processor systems. Brad Chen of Intel's Performance Tools Lab notes that third-party GPS software, Intel compilers, and Intel VTune performance analyzer software were integrated with navigation and obstacle-avoidance software written specifically for the Grand Challenge.
    Click Here to View Full Article
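    The basic navigation step the entrants must perform, taking a GPS fix and computing the bearing to the next way point so the by-wire steering can close the heading error, can be sketched in a few lines. This is an illustrative toy under assumed function names, not any team's actual code; real entrants also fuse IMU, magnetometer, and obstacle-sensor data.

```python
import math

def bearing_to_waypoint(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees clockwise from north,
    from the current GPS fix (lat1, lon1) to a way point (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def steering_correction(current_heading, target_bearing):
    """Signed turn in degrees needed to face the target, wrapped to (-180, 180]."""
    return (target_bearing - current_heading + 180.0) % 360.0 - 180.0

# One step of the loop: head toward a way point due east of the vehicle.
turn = steering_correction(350.0, bearing_to_waypoint(0.0, 0.0, 0.0, 1.0))
print(round(turn, 1))  # 100.0 -- turn right 100 degrees
```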

  • "Java Stewards Announce New Version of Java Community Process"
    eWeek (03/09/04); Taft, Darryl K.

    A new version of the Java Community Process (JCP), JCP 2.6, was announced on March 9 by the committees that oversee Java. JCP executive relations manager Aaron Williams stated that JCP 2.6 will offer more transparency and earlier developer involvement than before by making the first draft review of specifications open to the public. He added that the balloting process for the JCP will shift from after the first review period to after the second, noting, "We believe with these changes we can move first review periods down to four to five months, from about nine now." The result will be faster completion of Java Specification Requests (JSRs). JCP 2.6 gives more flexibility to JSRs by allowing some JSRs to extend over different Java platforms. Furthermore, JCP 2.6 will have more documentation and tools such as a community page, while every JSR will feature a home page on JCP.org. Sun Microsystems reported that membership in Java User Groups has swelled by over 150 percent since the beginning of 2003, while the number of groups in general has doubled to 600 over the same interval. John Zukowski, Java developer and vice chairman of the Association for Computing Machinery WebTech chapter in Boston, praised Java user groups for their networking opportunities, and as a resource for the latest Java news and community developments.
    Click Here to View Full Article

  • "In Search of the Deep Web"
    Salon.com (03/09/04); Wright, Alex

    Google, Yahoo!, and many other companies are working to make search engines capable of mining the deep Web--a vast, largely untapped reservoir of structured or semi-structured data; this approach should not only yield more focused search results, but also break the control over customer transactions wielded by organizations that function as data access intermediaries. Most of the deep Web data is hidden in databases behind registration gateways, dynamically generated links, and session cookies, but new search engines designed to get around these barriers are in the works: BrightPlanet, for one, has developed a technique for brokering queries across multiple deep Web data sources simultaneously, combining the results and allowing users to compare changes to those results over time. Dipsie will launch an engine that employs a modified spider program that cuts through drop-down lists, forms, session cookies, and other obstacles by imitating a "well-formed user." Deep Web search engines will likely migrate away from the one-dimensional listings-style result pattern typical of current engines into a more mutable format that can also aggregate results into countless combinations. Electronic journal subscriptions and paid search services are revenue resources of increasing importance to scholarly publishers whose customers desire journal content to be Web-accessible; to satisfy customer demand while keeping their data safe, the major publishers have set up private permission-based search gateways. However, Peter Lyman of the UC-Berkeley School of Information Management and Systems comments that "The cost of creating formally quality-controlled information may drive people to consider lower-cost alternatives" such as deep Web search engines. The popularity of deep Web engines will spur organizations to supply more consistent, predictable structures, and to abandon the development of their own public interfaces; the end result will be a transition from internal-facing systems to external-facing systems.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
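    The brokering technique attributed to BrightPlanet, fanning one query out to many deep Web sources at once and merging the ranked results, can be illustrated with a small sketch. The source functions below are hypothetical stand-ins for real gateway adapters; only the fan-out-and-merge pattern reflects the technique described.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-source adapters standing in for real deep Web gateways,
# each returning scored hits in a common shape.
def search_source_a(query):
    return [{"source": "A", "title": f"{query} report", "score": 0.9}]

def search_source_b(query):
    return [{"source": "B", "title": f"{query} database entry", "score": 0.7}]

def broker_query(query, sources):
    """Send the query to every source in parallel, then merge the
    result lists into a single list ordered by relevance score."""
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(lambda search: search(query), sources)
    merged = [hit for hits in result_lists for hit in hits]
    return sorted(merged, key=lambda hit: hit["score"], reverse=True)

for hit in broker_query("mars", [search_source_a, search_source_b]):
    print(hit["source"], hit["title"])
```

A production broker would additionally handle per-source authentication, session cookies, and query translation; the merge step here is a plain score sort.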

  • "Hebrew University Scientist Co-Directing European Research Project for Internet of Future"
    Innovations Report (03/11/04)

    Scientists are working on ways to measure the growth of the Internet and make future networking more efficient. The European Union-funded EVERGROW project involves IT companies, 25 universities, and more than 100 scientists. The four-year program started at the beginning of the year and has a budget of 5.6 million euros. Co-coordinator Prof. Scott Kirkpatrick, of the Hebrew University of Jerusalem, says the study will observe and measure the Internet in an effort to find a better computational solution for the operation of global communications. Users today can access a number of Internet functions from one small instrument, including voice, written messaging, image transfer, and remote command controls. EVERGROW participants say that advances in the architecture and functioning of overlay communication networks are needed before it can be determined whether existing networks can handle the huge loads of spreading technology. The project is headquartered at the Swedish Institute of Computer Science.
    Click Here to View Full Article

  • "Robotic Skeleton Takes Load Off Humans"
    Daily Californian (03/11/04); Chen, Regina

    Researchers at UC Berkeley have developed the Berkeley Lower Extremity Exoskeleton (BLEEX), a self-powered robotic skeleton designed to frame the human body. Homayoon Kazerooni, a mechanical engineering professor who is the director of the Robotics and Human Engineering Laboratory, believes BLEEX is a breakthrough because of the natural way in which humans and the robot interact. The exoskeleton consists of metal leg braces, a computerized power unit, and a structure that resembles a backpack; the device enables users to carry heavy loads across long distances. During BLEEX experiments, a man wearing the 100-pound frame and a 70-pound backpack felt as if he were carrying just a few pounds. The exoskeleton is easy to use, requiring wearers only to balance it while its computer controls the frame so that it moves in sync with the operator. The researchers believe BLEEX could be used by medics to carry injured soldiers off battlefields, by firefighters to carry equipment up flights of stairs, by rescue workers to bring food and emergency supplies to areas that cannot be reached by vehicle, and by hikers to ease their treks. The UC Berkeley team wants to make parts of the exoskeleton more compact, the engine more powerful and quieter, and the frame strong enough to carry 120 pounds. BLEEX is a key development because researchers envision robotics as a way to help computers interact with the real world.
    Click Here to View Full Article

  • "Invasion of the Robots"
    CNet (03/10/04); Kanellos, Michael

    Innovations in robot technology are expected to spark a revolution in health care, domestic assistance, military operations, and other areas that will foster the creation of a multibillion-dollar market within several years. Companies making notable strides include iRobot, which developed the Roomba robot vacuum cleaner and the PackBot, a machine designed to perform reconnaissance, supply, and rescue missions over rough terrain for the U.S. military; Seegrid, which has produced software that enables a mobile device to generate a single-pass 3D map of its surroundings; and Intuitive Surgical, whose da Vinci Surgical System allows doctors to perform more precise, less invasive surgery through a set of robotic appendages that "see" in three dimensions, which raises the odds that patients will have a swifter, more comfortable, and less expensive recovery. Many innovative robot technology projects are taking place at Carnegie Mellon University, or are being spurred by CMU researchers such as William Whittaker, who is creating robots that can map mine shafts, drive harvesters, clean up nuclear waste, and remove obstructions from sewer lines. CMU is also participating in the upcoming Grand Challenge, a race between unmanned vehicles sponsored by the Defense Advanced Research Projects Agency. Partly responsible for the rush in robotics innovation are steady performance improvements and falling prices for processors, sensors, and other robot components; some robotics companies are also cutting manufacturing costs by collaborating with other industries. Robots' commercial success hinges on customer satisfaction, so developers are designing products that stress practicality rather than technological prowess. For the robotics market to reach its full potential, researchers must meet several challenges, including refining the machines' dexterity and gripping abilities, and establishing a way for robots to communicate with each other and coordinate their activities.
    Click Here to View Full Article

  • "Brain Circuitry Findings Could Influence Computer Design"
    MIT News (03/08/04)

    Using funding from the National Institutes of Health and the RIKEN-MIT Neuroscience Research Center, neuroscientist Guosong Liu has uncovered new data about neurons that could have a bearing on future computer design. Liu has learned that neurons use a trinary system of zeros, ones, and minus ones to process information, compared to the binary system of zeros and ones that computers employ; this could account for the brain's ability to disregard data in certain situations. Liu explains that a key component in the operation of brain circuits is connecting the proper positive (excitatory) wires with the proper negative (inhibitory) wires, and his research illustrates that neurons are made up of myriad processing modules that individually collect a set number of excitatory and inhibitory signals, and that can act as powerful processing nodes when the two inputs are correctly linked. Upon completion of processing, the modules transmit signals to the cell body, where they are combined and relayed to the next cell. "With cells composed of so many smaller computational parts, the complexity attributed to the nervous system begins to make more sense," notes Liu. This research supports a growing perception among neuroscientists that the basic processing unit, or transistor, of the brain is not the cell after all, but a smaller element. Liu discovered that the cellular modules automatically configure along the cell's surface as the brain develops, and possess an embedded intelligence that appears to allow them to generate new circuit connections when others are broken. Liu's work adds credence to MIT neuroscientist Tomaso Poggio's theory that neurons process information via an excitatory/inhibitory mechanism.
    Click Here to View Full Article
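    The trinary idea, excitatory (+1), inhibitory (-1), and silent (0) signals pooled by independent dendritic modules before the cell body combines their outputs, can be captured in a toy model. This is a deliberately simplified illustration, not Liu's actual model; the thresholds and signal values are assumptions made for the sketch.

```python
def module_output(excitatory, inhibitory, threshold=1):
    """Toy dendritic processing module: pools +1 (excitatory) and -1
    (inhibitory) inputs and emits one of three states."""
    net = sum(excitatory) + sum(inhibitory)
    if net >= threshold:
        return 1    # drive the cell body
    if net <= -threshold:
        return -1   # actively suppress the cell body
    return 0        # disregard -- the state plain binary logic lacks

def cell_body(module_signals, threshold=1):
    """The cell body combines module outputs into a spike decision
    relayed to the next cell."""
    return 1 if sum(module_signals) >= threshold else 0

# Two modules excite, one suppresses: the cell still fires.
print(cell_body([module_output([1, 1], [-1]),
                 module_output([1], []),
                 module_output([], [-1, -1])]))  # prints 1
```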

  • "Robots That Build (But Still Won't Do Windows)"
    New York Times (03/11/04) P. D10; Wertheim, Margaret

    The brainchild of University of Southern California engineering professor Dr. Behrokh Khoshnevis is a computer-controlled robotic gantry that can build concrete walls layer by inch-thick layer, and Khoshnevis believes such technology could pave the way for completely automated construction. "Our goal is to completely construct a one-story 2,000-square-foot home on site in one day, without using human hands," the professor explains. It is expected that later versions of the robot gantry will be capable of building compound curves as well as right angles, while Los Angeles architect Greg Lynn believes such machines could change the face of architecture. "I'm convinced this will allow you to make beautiful, innovative and as yet unimagined kinds of houses," he says. The robot came out of Khoshnevis' desire to apply the principles of rapid prototyping to architecture, and the professor envisions a larger version of the device that "prints out" houses in concrete. Khoshnevis also holds patents in sintering, a chemical process in which powders are turned into solid shapes by computer control. After the machine is perfected, the drying process for concrete will need to be accelerated in order for the technology to become commercially viable. Khoshnevis notes that architects could sell fundamental designs that customers could then adapt and tailor for the concrete printer using special software. Khoshnevis' project is a joint effort between USC's engineering school and its Information Sciences Institute; funding has come from the National Science Foundation and the Office of Naval Research. Khoshnevis is seeking $5 million to continue developing the robot technology, known as contour crafting.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Productivity's Technology Iceberg"
    MIT Technology Insider (03/10/04); Brynjolfsson, Erik

    Director of MIT's Center for eBusiness Erik Brynjolfsson says the tremendous growth in U.S. productivity is directly related to the way companies use information technology, but dismisses the notion that technology alone can boost efficiency. Innovations introduced half a decade ago are finally bearing fruit and helping to support the current "jobless" recovery, while Brynjolfsson maintains that they will help spur persistent business growth and raise living standards for both workers and society in general in the long run--provided that managers do not focus solely on computer hardware investments. The largest and most important investments are funneled into the development of business processes designed to harvest technology's rewards. Brynjolfsson writes, "The unsung heroes of the IT revolution have not been the microchip and the Web browser, but rather the creative, diligent, and painstaking work done by those who have been rethinking supply chains, customer service, incentive systems, product lines, and 1,001 other processes and practices affected by computers." Current productivity growth, he observes, is driven by intangible capital investments. According to research at the Center for eBusiness, the most productive digital organizations are those that set up a communications channel between business managers and technologists, automating mundane functions while allowing staffers to make decisions on the fly with the proper authority and access to data. Brynjolfsson points out that there is a profound shortage of managers who are skilled in both business and technology, and writes that their innovations collectively contribute to substantial productivity gains. "To sustain the productivity surge, today's managers must develop incentives that encourage their workers--as well as themselves--to be more creative, self-starting, educated, and willing to experiment," he concludes.
    Click Here to View Full Article

  • "Alliance Sets Blade Standards in Motion"
    InternetNews.com (03/09/04); Singer, Michael

    Blade server technology is getting a further boost from the joint efforts of two standards bodies, the Distributed Management Task Force (DMTF) and the Blade Systems Alliance (BladeS), whose partnership focuses on server management and utility computing standards. The DMTF's Web Based Enterprise Management and Common Information Model standards are used for platform-independent management information exchange. Companies are rapidly embracing blade server technology because of more UNIX to Linux migrations, increasing numbers of Linux supercomputing clusters, and the cost-benefits of server virtualization software coupled with blade technology. Blade servers are particularly popular with ISPs and ASPs for email, Web hosting, and domain-name serving. Industry expert Rob Enderle says Hewlett-Packard and Transmeta are producing the most interesting blade server products with their innovative power-saving and heat-reduction technologies: Those products will help companies significantly cut down on their server room cooling costs while simultaneously increasing the density and lowering the cost of blade servers. Despite the technological innovation, Enderle says IBM has the advantage in the market because of its superior marketing and services organization. The blade server market is set to grow rapidly, reaching $3.7 billion in 2006 and $6 billion in 2007, according to IDC; in 2007, IMEX Research says blade servers will make up nearly 25 percent of the total server market. Enderle says innovation on the thermal front, AMD's 64-bit competition with standard Intel parts, and new PC architecture technology such as PCI Express should ensure the blade server market remains tumultuous for some time.
    Click Here to View Full Article

  • "For Rural Pennsylvania, Wireless Is the Ticket to the 21st Century"
    EurekAlert (03/09/04)

    Lehigh University electrical and computer engineering professor Shalinee Kishore has won a five-year grant from the National Science Foundation to develop multitier wireless networks and demonstrate their viability in Pennsylvania's Susquehanna County, which extends across 823 square miles of rural, sparsely populated terrain that discourages the installation of wired and cable networks to support modern telecommunications such as high-speed Internet access and lower-speed voice, data, and messaging services. There is certainly a call for such technology in the region: Volunteer companies have expressed a desire to communicate wirelessly with each other at times of emergency; Wi-Fi Global Positioning System capabilities could prove very useful for hunters and farmers; and high-school students will no doubt want Wi-Fi Internet access. "Because of this lack of access, the county's residents feel they are being left behind the times," notes Kishore. The Lehigh professor reports that multitier wireless technology provides pervasive lower-rate coverage and targeted high-speed access at the same time via a network of user terminals and base stations--also known as radios--whose coverage areas diverge in their order of magnitude. The technology will need to employ bandwidth conservatively, since the FCC has allocated a limited amount of bandwidth in order to ensure that radios will not interfere with each other. Kishore wants to make the most of what little bandwidth is available and keep interference generated by high spectral reuse to a minimum. She will investigate centralized and non-centralized architectures as well as analytical techniques that consider signal processing, access control methods, and radio resource allocation.
    Click Here to View Full Article

  • "Physics: "Putting the Weirdness to Work""
    Business Week (03/15/04) No. 3874, P. 103; Carey, John

    Government and corporate scientists are working to understand the laws of physics at the atomic scale, where a single atom is able to exist in many places at once and can be strangely "entangled" with another atom even at far distances: Researchers theorize that these bizarre quantum effects can be harnessed for super-fast computation, and some companies are already selling quantum encryption products. Experts predict that a quantum computer would be able to factor a 400-digit number in just 30 seconds, compared with about 10 years on today's supercomputers. Large IT companies such as IBM and Hewlett-Packard are working on such machines, as is the Pentagon's Defense Advanced Research Projects Agency. At the National Institute of Standards and Technology (NIST), Nobel Prize-winning physicists are working on creating quantum computers from a recently discovered material called Bose-Einstein condensate, where millions of atoms are clumped together but each is present everywhere within the clump at once. NIST physicist William D. Phillips is trying to create a computer by shining intersecting laser beams through the Bose-Einstein condensate. As atoms naturally fall into the valleys between intersecting light waves, they line up in a way that might provide for quantum computation; each quantum bit of information, called a qubit, can represent both a 0 and 1 value, so that the computational power of qubit strings grows exponentially with additional qubits. A quantum computer with 300 qubits could process more combinations than there are atoms in the universe. Another NIST researcher is using colored lasers in a similar Bose-Einstein condensate experiment, while yet another NIST quantum computing effort has successfully built logic gates using qubits encoded in trapped ions.
    Click Here to View Full Article
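    The exponential claim is easy to check with exact integer arithmetic: an n-qubit register spans 2**n basis-state amplitudes, and 2**300 already dwarfs the commonly cited order-of-magnitude estimate of 10^80 atoms in the observable universe.

```python
def amplitudes(n_qubits):
    """Number of basis-state amplitudes an n-qubit register spans: 2**n."""
    return 2 ** n_qubits

# ~10^80 is a common order-of-magnitude estimate, used here for comparison.
ATOMS_IN_UNIVERSE = 10 ** 80

print(amplitudes(300) > ATOMS_IN_UNIVERSE)  # True: 2**300 is roughly 2.0e90
print(len(str(amplitudes(300))))            # 91 -- digit count of 2**300
```

Each added qubit doubles the count, which is why the gap between 300 qubits and any classical register is unbridgeable by scaling conventional hardware.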

  • "Stay Just a Little Bit Longer"
    Computerworld (03/08/04) Vol. 32, No. 10, P. 46; Melymuka, Kathleen

    Bob Morison of The Concours Group cites Bureau of Labor Statistics estimates that the United States will suffer a shortage of 10 million workers by the end of the decade, but even more detrimental will be a paucity of skilled labor stemming from the mass retirement of aging baby boomers. His colleague Tamara Erickson adds that a study of government agencies predicts three-quarters of all U.S. companies will face a shortfall in qualified IT personnel in the next three to four years. Morison notes that IT workers well-versed in how applications uphold business processes possess skills that cannot be outsourced, making them difficult to replace. Erickson suggests that if companies institute flexible retirement packages that allow staffers to gradually phase out rather than retire abruptly, there is more likelihood that those employees will sustain a working relationship with the company even after retirement. Another suggestion is to add flexibility to work locations and schedules so that employees can reasonably balance their professional life and their life outside of work, an option that is especially appealing to older workers. "Many mature people need to work, and others want to because they enjoy the action, but on their own terms and not full time," remarks Morison. Erickson acknowledges that burnout is another factor to consider, but reports that often at the root of this problem is a lack of involvement with the work. Corporations must therefore determine how to rekindle employees' sense of involvement, and the Concours Group consultants think this can be accomplished through training and learning.
    Click Here to View Full Article

  • "Should Embedded Linux Be Standardized?"
    Software Development Times (03/01/04) No. 97, P. 26; Correia, Edward J.

    Embedded Linux system developers are weighing the costs and benefits of standardization, and their ultimate decision could drastically affect future embedded development. The Embedded Linux Consortium's Platform Specification (ELCPS) 1.0, published more than two years ago, was touted as the authoritative embedded Linux development standard, but support for ELCPS trickled away, mainly because former member companies saw little customer value in the initiative. Momentum is building around the Consumer Electronics Linux Forum (CELF), which plans to issue CELF 1.0 in May; the organization is dedicated to creating augmentations for embedded Linux that improve power efficiency, boot and shutdown times, footprint, and other capabilities that CELF members think will add value through consistent, universal use. MontaVista Software's Scott Hedrick believes CELF's benefits will extend to the enterprise, and CELF Chairman Scott Smyres thinks the standard could add intelligence to network devices if the forum follows the advice of IBM, a member of the CELF steering committee. Furthermore, Smyres has not dismissed the possibility of synergy between CELF and other organizations such as ELC or the Open Source Development Labs. Failing to standardize embedded Linux development will lead to market fragmentation, which FSM Labs CEO and co-founder Victor Yodaiken characterizes as "exceptionally dangerous," adding that Linux's appeal to enterprises is obvious, given the rising costs of developing niche operating systems. "Linux fragmentation will make the technology too expensive to deploy and to move from one CPU to another and keep compatibility at the API level," warns Wind River Systems' Michael Genard. Hedrick says MontaVista thinks concentrating on kernel.org and associated trees, as well as standard Linux evolution, is the best strategy to drive standardization.
    Click Here to View Full Article

  • "The 100-Million-Mile Network"
    Baseline (02/04) No. 27, P. 24; Carr, David F.

    The Deep Space Network extends across more than 100 million miles of space as the means of communication between the Spirit and Opportunity rovers on Mars and their controllers on Earth. The robot explorers usually send their data to the orbiting Mars Odyssey and Global Surveyor probes, which relay the data to Earth, where it is picked up by one of three listening stations located in California, Spain, and Australia. The global positions of these stations ensure that at least one will be able to communicate with the spacecraft being monitored at any time. Communications between the rovers and the orbiters use the UHF frequency, while long-haul messages and critical commands are transmitted on the higher X-band frequency. The Deep Space Network must maintain communications with the rovers 24/7. Weeks after its successful landing, the Spirit rover fell silent, forcing engineers to scramble for a way to diagnose the problem and restore communications using what commands they could send and what little data they could receive on the Deep Space Network. When a connection is lost, the network's reception gear slips into a "closed-loop" mode and continually scans a range of frequencies around the expected one, searching for a signal that can be locked on to. Spirit's problem was the result of its flash memory becoming overloaded by the data collected during its journey through space and its surface exploration, so NASA engineers used the Deep Space Network to delete excess files and transmit a patch telling Spirit and Opportunity how to use their flash and random-access memories more conservatively. NASA scientists now say the $860 million Mars exploration mission should last beyond the planned three months. Chief scientist Steve Squyres says, "We built margin [of error] on top of margin [of error], specifically to allow for the fact that things go wrong on a place like Mars."
    Click Here to View Full Article
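
    The "closed-loop" reacquisition behavior described above--sweeping a window of frequencies around the expected carrier until a signal can be locked on to--can be illustrated with a minimal sketch. This is not NASA code; the function name, parameters, and frequency values are all hypothetical, chosen only to show the scan-and-lock idea.

    ```python
    # Illustrative sketch (hypothetical, not NASA software): a receiver in
    # "closed-loop" mode sweeps expected_hz +/- window_hz in step_hz increments
    # and locks onto the first frequency whose measured power clears a threshold.

    def reacquire(expected_hz, window_hz, step_hz, power_at, threshold):
        """Return the first frequency in the scan window whose power meets
        the threshold, or None if no signal is found."""
        freq = expected_hz - window_hz
        while freq <= expected_hz + window_hz:
            if power_at(freq) >= threshold:
                return freq          # signal found: lock on here
            freq += step_hz
        return None                  # nothing in the window; keep scanning later

    # Toy scenario: the carrier has drifted 300 Hz above the expected frequency.
    drifted = lambda f: 1.0 if abs(f - 8_400_300) < 50 else 0.0
    lock = reacquire(8_400_000, 1_000, 100, drifted, 0.5)  # finds 8_400_300
    ```

    In practice the scan would run continuously and the "power" measurement would come from the receiver hardware, but the loop above captures the essential search-around-the-expected-frequency behavior the article describes.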

  • "The World Is Your Perimeter"
    CSO Magazine (02/04); Lindquist, Christopher

    Corporate security perimeters are becoming less protective as more and more corporate information systems employ tools and processes outside the conventional firewall, and as employees, partners, and customers use multiple devices to access corporate data; coping with this trend requires a new security paradigm to replace the castle-and-moat model, but no alternative metaphor has yet emerged as the best solution. Amy Ray, Bentley College trustee professor of computer information systems, thinks CSOs should get out of the knee-jerk habit of only patching existing systems and focus on prioritizing security investments according to a strategic understanding of information systems' architecture and use, although she admits that this is a difficult proposition for companies under pressure to comply with security mandates such as the Gramm-Leach-Bliley Act. Richard Baskerville, chairman of Georgia State University's CIS Department, says that information security organizations ought to consider splitting their security staff into two categories: one to handle centralized information resources and a second to deal with more decentralized resources. Whale Communications CEO Elod Baron and IBM security research head Charles Palmer agree that inverting the current security perimeter model is a good idea--Palmer argues that maintaining a list of everyone who is not authorized to be inside corporate walls is unworkable. Rainbow eSecurity's Bernie Cowens says a much better solution is to make security portable and identity-enabled, though Palmer thinks a plan suggested by the Trusted Computing Group, in which most computing hardware would be outfitted with chips that facilitate easy and secure authentication, is more effective. He also expects security will be augmented on a more detailed level with technologies such as applications featuring built-in rules about normal behavior. Carnegie Mellon University computer science professor Mike Rider thinks security will evolve into a paradigm resembling a wall of organic cells designed to be diverse and redundant to fend off intruders.
    Click Here to View Full Article