
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 665:  Wednesday, July 7, 2004

  • "Programming Doesn't Begin to Define Computer Science"
    Pittsburgh Post-Gazette (07/04/04); Morris, Jim

    Jim Morris, computer science professor and dean of Carnegie Mellon University's West Coast Campus, writes that the fall-off in college-level computer science enrollments stems chiefly from a misrepresentation of the field's goals: The computing industry's cyclical boom-bust pattern owes much to students buying into the idea that computer science can make them wealthy, only to be discouraged by the bursting of the tech bubble and the offshoring of computer jobs. Morris notes that the number of biological science graduates has risen 70 percent since 1990 while the number of computer science graduates has declined 10 percent, evidence, he argues, that the intellectual stimulation and humanitarian goals of the traditional sciences hold far more appeal. The author asserts that the science of computing is the missing ingredient in current computer science education, and says the problems computer science aims to solve have practical applications with significant social and economic impact. For instance, solving the problem of reproducing intelligence in machines is critical to the advancement of robotics, which in turn will play a key role in space exploration. Likewise, computer security cannot improve without innovative mathematics, and people can become skilled in experimental technique by studying how humans interact with computers. Furthermore, Morris points out that a computer science education can prepare people for careers in diverse fields such as medicine, law, biology, and business. The author believes that students should obtain a "liberal science" education with computing as their first focus of study. Morris writes that students should receive education in certain computer sciences before college, but notes that high school curricula currently lack instruction in "the visions and grand challenges of computer science."
    Click Here to View Full Article

  • "Activist: E-Voting to Be a 'Train Wreck'"
    Associated Press (07/06/04); Konrad, Rachel

    E-voting reform crusader Bev Harris is the bane of politicians, company executives, and election officials who support electronic voting: Working for free out of Seattle, she has pursued the issue of insecure e-voting with a passion for the last two years. She was the person who first spotted Diebold source code on the Internet in January of last year, using Google to search for "Global Election Systems," the name of the software company Diebold acquired in 2002; the code she downloaded onto seven CDs was the basis for the Johns Hopkins and Rice University reports on insecure e-voting systems. Among the findings that galvanized computer scientists against e-voting was the fact that Diebold's default password for voting administrator smartcards was "1111," and Johns Hopkins Information Security Institute director Avi Rubin concluded that any competent 15-year-old could compromise the Diebold system, though the company insists its code has been updated since then. Rubin says Harris' tenacity and fervor have put off some people, but that her aggressive style is needed to push for reform. California Secretary of State Kevin Shelley this year limited the use of Diebold machines and required paper receipts of all votes. Still, Harris says this November's election will be a tremendous fiasco eclipsing that of the presidential race in 2000; she has tracked down county election registrars, company executives, and e-voting software programmers who boast they can manipulate election results, while her Web site details links between politicians and e-voting companies, which she says amount to an inexcusable conflict of interest. Harris says current e-voting systems are not so much a computer or even a conspiracy problem as a situation in which the normal checks and balances have been removed. She asserts that a lack of oversight guarantees theft.
    Click Here to View Full Article
    For information on ACM's activities regarding e-voting, visit http://www.acm.org/usacm

  • "New on Campus: Faster Network for File-Sharers"
    Wall Street Journal (07/07/04) P. B1; Glassberg, Hope

    Students are making use of file-sharing programs that run on restricted networks such as Internet2, which offer better security and faster speeds than open networks such as Kazaa. One such service is i2hub, which works only at Internet2-connected schools, making students' activities more difficult to track. A recent i2hub test in New York demonstrated that the program could download a newly released song in about three seconds and a bootlegged movie in little more than an hour; individual songs can take several minutes and movies several hours to download over traditional DSL or cable Internet connections. Internet2, which operates independently of the Internet, was established to support rapid file-sharing between universities for academic pursuits, but that has not stopped students from using services such as i2hub to share copyrighted materials. I2hub creator Wayne Chang, a student at the University of Massachusetts at Amherst, estimates that about 70,000 students currently subscribe to the service, which is used at approximately 190 Internet2-connected schools. He admits that he does not know what kinds of files i2hub users are sharing, and the i2hub.com Web site is unclear on usage restrictions, although it states that the program "is for educational purposes only." Internet2 communications director Greg Wood says he does not support the use of the network for unlawful file-swapping, but notes that universities could employ i2hub to share large data files between them.

  • "Smarter Spacecraft: Science-Hunting Software for Robotic Explorers"
    Space.com (07/06/04); Malik, Tariq

    The goal of NASA's Autonomous Sciencecraft Experiment (ASE) is to enable spacecraft to carry out scientific investigations without human assistance, and project participants such as the Jet Propulsion Laboratory (JPL), Arizona State University (ASU), and the University of Arizona are making significant strides in the development of the initiative's software component. For example, the Earth Observing-1 (EO-1) satellite has studied the volcanic activity of Mount Erebus in the Antarctic using NASA-developed software that autonomously looks for science targets through the interaction of three elements: a series of science algorithms that dictate what the satellite should scan for in incoming data, such as volcanic heat signatures; an onboard planner program that determines which science investigations the satellite should pursue autonomously; and a spacecraft command language that facilitates communication with onboard instrumentation. "Not only does this software detect activity, it starts making its own measurements before researchers even look at the data," says JPL researcher Ashley Davies, who adds that EO-1 also tracks flood and ice activity using software created by ASU and University of Arizona researchers. The ASE software also has the potential to sidestep the communication time lags inherent in interplanetary travel: Davies says a spacecraft could use the software to keep tracking a target until researchers have a chance to view the data, rather than flying past the target while waiting for the data to be processed. Although future missions to Mars or Jupiter could benefit from ASE software, planetary researchers caution that the software would rely heavily on previous knowledge of the area to be probed. "It really depends on the specific mission and whether you know enough to give those smarts," explains JPL scientist Joy Crisp.
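
    The detect-then-replan cycle described above can be pictured with a short sketch; the threshold, function names, and data layout below are illustrative assumptions, not the actual JPL flight software.

      # Minimal sketch of an ASE-style autonomy loop: a science algorithm
      # flags interesting pixels, and an onboard planner schedules follow-up
      # observations. All names and values are hypothetical.

      HEAT_THRESHOLD = 320.0  # Kelvin; illustrative volcanic-heat trigger

      def detect_targets(image):
          """Science algorithm: return coordinates whose brightness
          temperature exceeds the volcanic-heat threshold."""
          return [(x, y) for (x, y), temp in image.items()
                  if temp > HEAT_THRESHOLD]

      def plan(targets, schedule, max_slots=10):
          """Onboard planner: queue a follow-up observation for each
          detected target that fits in the remaining observation slots."""
          for target in targets:
              if len(schedule) < max_slots:
                  schedule.append(("observe", target))
          return schedule

      # One pass of the loop: a two-pixel thermal image triggers one
      # autonomous re-observation, with no ground commands involved.
      image = {(10, 12): 340.5, (11, 12): 290.0}
      print(plan(detect_targets(image), []))  # [('observe', (10, 12))]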
    Click Here to View Full Article

  • "Knowing Their Politics By Their Software"
    New York Times (07/05/04) P. C1; Lohr, Steve

    Both the Republican and Democratic parties are relying more on the Internet to advance their political causes, and in the process are adopting either proprietary or open-source technology in line with their ideological affiliations. Both President Bush's re-election Web site and the Republican National Committee (RNC) use Microsoft's Web server, while presidential candidate Sen. John Kerry (D-Mass.) and the Democratic National Committee (DNC) run open-source software such as the Apache Web server. Choices in software are an extension of political leanings, according to Plus Three founder and avowed liberal David Brunton, whose consulting firm did much of the work on Kerry's and the DNC's Web sites, as well as work for the AFL-CIO. Open-source advocates say strict intellectual property protections limit innovation and keep industries such as pharmaceuticals and entertainment from experiencing faster development and lower costs, while proponents of proprietary technology argue that intellectual property protections give companies, including startups, better chances at raising capital and profiting from innovation. Linux creator Linus Torvalds denies any party affiliation for open source, and says the only common ideological denominator among open-source developers is individualism, which gives them libertarian tendencies and a distrust of large corporations. Representatives of both parties say their technology choices were based not on ideology but on the practical advantages of each. RNC network and online services director Steve Ellis says Microsoft's Internet Information Services and Windows 2000 operating system provide better data security, even though computer security expert Richard M. Smith notes that the Apache Web server has a better track record.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Digital Evolution Reveals the Many Ways to Get to Diversity"
    Newswise (07/01/04)

    Using powerful computers, a team of researchers at Michigan State University (MSU), the Keck Graduate Institute (KGI), and the California Institute of Technology has tackled the riddle of how evolution leads to species diversity by studying the evolution of artificial life forms in a virtual petri dish. Within this artificial environment are digital organisms that replicate themselves and obtain rewards by performing mathematical calculations; the rewards consist of additional computer time that can be used for reproduction, and each organism's "species" is determined by its mathematical function. Natural selection and evolution are driven by the random insertion of mutations into the copies, while the research team observes how the organisms adapt and evolve in dissimilar environments within the virtual habitat. MSU Hannah Distinguished Professor of microbial ecology Richard Lenski explains that the experiment allows the scientists to "look at changes in species diversity across thousands of generations, and see how the ecological relationship between environmental productivity and species diversity could be understood from an evolutionary perspective." MSU assistant professor of computer science and engineering Charles Ofria says the results demonstrate that productivity's effects are limited in environments with finite resources. "What we've learned is that if there isn't just one way to succeed, you'll see diversity," he concludes. The research is sponsored by the National Science Foundation under the auspices of its biocomplexity program, while KGI and the MSU Foundation provide additional funding. The research team detailed their experiments in the July 2 issue of Science.
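
    The selection mechanism the researchers describe, reproduction opportunities earned by computational work plus random mutation of copies, boils down to a loop like this toy sketch; the bit-counting "task" and all parameters are stand-ins, not the team's actual platform.

      import random

      # Toy digital-evolution loop: each "organism" is a bit string;
      # performing a (here, trivial) computation earns extra reproduction
      # chances, and copies mutate at random. Purely illustrative.

      GENOME_LEN, POP_SIZE, MUTATION_RATE = 16, 50, 0.02

      def fitness(genome):
          # Stand-in for "performing mathematical calculations": organisms
          # earn merit per 1-bit, so better performers receive more
          # "computer time" for reproduction.
          return sum(genome) + 1

      def mutate(genome):
          return [b ^ 1 if random.random() < MUTATION_RATE else b
                  for b in genome]

      population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                    for _ in range(POP_SIZE)]

      for generation in range(100):
          # Reproduction probability is proportional to merit.
          weights = [fitness(g) for g in population]
          population = [mutate(random.choices(population, weights)[0])
                        for _ in range(POP_SIZE)]

      print("mean task performance:",
            sum(fitness(g) for g in population) / POP_SIZE)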
    Click Here to View Full Article

  • "Evolution Could Speed Net Downloads"
    New Scientist (07/05/04); Knight, Will

    Caching data at different locations can lower costs and boost download speeds, but determining where data should be stored and for how long is a difficult proposition. Frederik Theil and Jurgen Branke of the University of Karlsruhe and Pablo Funes of Icosystem employed "genetic algorithms" to "evolve" caching configurations for Internet servers that, in simulation, sped up Net downloads beyond what existing caching architectures achieve. Efficient algorithms were evolved from a group of randomly generated algorithms, with those that best lowered network traffic and improved the download rate selected to "breed" a new population. The algorithms decide whether each piece of data should be stored, and for how long, by assessing known variables such as the number of requests for a specific piece of data, its overall size, and how many points it must pass through. Different caching strategies were tried out on a simulator that modeled a branch of the Internet in which data could be duplicated and stored at each major junction. The evolved algorithms performed at double the speed of the best existing caching strategy when tested on a 300-node simulated network. Funes says future networks might be set up to determine which users deserve the most help: "Sophisticated network behaviors might implement rules for reciprocity and trust," he notes, "and conversely, for not cooperating with others who try to abuse our resources."
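
    In outline, the evolutionary setup works like the sketch below: a caching policy is a weight vector over the known variables, policies are scored on simulated traffic, and the best ones "breed" with mutation. The simulator and parameters are illustrative assumptions, not the researchers' code.

      import random

      # Evolve caching policies: each policy weighs request frequency,
      # object size, and hop count to decide what to cache.
      random.seed(1)
      OBJECTS = [{"freq": random.randint(1, 100),
                  "size": random.randint(1, 50),
                  "hops": random.randint(1, 10)} for _ in range(200)]

      def should_cache(policy, obj):
          score = (policy[0] * obj["freq"]
                   - policy[1] * obj["size"]
                   + policy[2] * obj["hops"])
          return score > 0

      def fitness(policy, capacity=500):
          """Count requests served from cache under a capacity limit."""
          used, hits = 0, 0
          for obj in sorted(OBJECTS, key=lambda o: -o["freq"]):
              if should_cache(policy, obj) and used + obj["size"] <= capacity:
                  used += obj["size"]
                  hits += obj["freq"]
          return hits

      def breed(a, b):
          child = [random.choice(pair) for pair in zip(a, b)]  # crossover
          return [w + random.gauss(0, 0.1) for w in child]     # mutation

      population = [[random.uniform(-1, 1) for _ in range(3)]
                    for _ in range(30)]
      for gen in range(40):
          population.sort(key=fitness, reverse=True)
          elite = population[:10]   # the best traffic-reducers survive
          population = elite + [breed(random.choice(elite),
                                      random.choice(elite))
                                for _ in range(20)]

      print("best hit count:", fitness(population[0]))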
    Click Here to View Full Article

  • "Getting Better...Virtually"
    ABCNews.com (07/07/04); Onion, Amanda

    Psychotherapists see value in treating patients through virtual reality--specifically, computer-animated characters representing patient and doctor that interact in adaptable artificial environments, which experts believe is conducive to a more relaxed and open doctor-patient relationship. Virtual reality is already employed as a therapeutic tool for people trying to overcome addictions or phobias, but computer programmers and therapists are attempting to extend the technology's use. "The most powerful application of virtual reality will be the clinician's ability to create an interactive environment that addresses the personality and cognitive style of the particular client or clients," says Rider University psychologist John Suler. Although whimsical virtual interpretations of the patient and the counseling environment can be helpful, there is also value in rendering more realistic virtual scenarios; one example is software that Australian researchers have developed to re-create the hallucinations of psychotic patients in an effort to better understand and treat them. Another potentially useful application of virtual reality is as a training tool for psychologists, who could practice on completely virtual patients. Virtual therapy is not without controversy: Daniel Lieberman, director of George Washington University's Clinical Psychiatric Research Center, states, "allowing the patient to 'hide' behind an avatar [a computer generated icon] seems to me to promote the fiction that personal change can be accomplished without great effort and endurance." Some people are concerned that virtual reality and other technologies could strip away human elements that are essential to successful therapy.
    Click Here to View Full Article

  • "Virtual Project May One Day Let Your Work Jump From Computer to Computer Without Interruption"
    Pittsburgh Post-Gazette (07/05/04); Spice, Byron

    Intel Research Pittsburgh's Internet Suspend/Resume project is designed to let users transfer their work from computer to computer without interruption using the Net, distributed file systems, and virtual machines. Such a concept is expected to be particularly appealing in a world where computers are ubiquitous--so ubiquitous that lugging around a laptop seems archaic and cumbersome. Carnegie Mellon University computer science professor and Intel Research Pittsburgh lab founding director Mahadev Satyanarayanan notes that upgrading machines would become a simple process with Internet Suspend/Resume technology, while hard drive malfunctions would no longer be catastrophic. In addition, Internet Suspend/Resume's Rollback feature would enable users to eliminate viruses by backtracking to an arbitrary point in time before the contamination and resuming work there. This recovery is possible partly because data is stored in distributed files on Internet servers as well as on each PC's hard drive. Internet Suspend/Resume revives the concept of virtual machines through its use of virtualization software wedged between the computer hardware and its operating system and other software. The management of corporate or university computer systems would become less expensive and complicated thanks to Internet Suspend/Resume, according to Intel Pittsburgh senior staff researcher Michael Kozuch. Virtual machines are also being employed in The Collective, a Stanford University project that aims to streamline system administration tasks, primarily for home users: Although the project's virtual appliances would make users more mobile, Stanford computer science professor Monica Lam says the approach's chief benefit would be increased efficiency and security for computers.
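
    One way to picture the mechanism: the virtual machine's suspended state is stored as content-addressed chunks in a distributed file system, so any networked PC can fetch the chunks and resume. This is a loose, hypothetical sketch of the idea, not Intel's implementation.

      import hashlib

      # Sketch of suspend/resume over a distributed store: VM state is
      # split into chunks, stored by content hash (so unchanged chunks
      # need not be re-sent), and reassembled on another machine.

      STORE = {}  # stand-in for Internet servers plus local disk cache

      def suspend(vm_state: bytes, chunk_size=4):
          """Chunk the state, store chunks by hash, return a recipe."""
          recipe = []
          for i in range(0, len(vm_state), chunk_size):
              chunk = vm_state[i:i + chunk_size]
              digest = hashlib.sha1(chunk).hexdigest()
              STORE[digest] = chunk   # only new chunks actually transfer
              recipe.append(digest)
          return recipe

      def resume(recipe):
          """On any machine: fetch chunks by hash, rebuild the state."""
          return b"".join(STORE[d] for d in recipe)

      state = b"registers+memory+disk-at-suspend-time"
      recipe = suspend(state)
      assert resume(recipe) == state  # work continues where it left off

    Keeping earlier state recipes around is also what makes Rollback-style recovery cheap: resuming from a pre-infection recipe discards the virus along with everything recorded after it.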
    Click Here to View Full Article

  • "Memory Cards Make Connections"
    Technology Research News (07/07/04); Smalley, Eric

    Sony Computer Science Laboratories researchers have converted Sony's Memory Stick flash memory card into a virtual wire that enables people to link two networked devices simply by plugging in a matched pair of cards that share the same ID and key. These so-called TranStick cards are visually coded with symbols and colors, and Sony researcher Yuji Ayatsuka says the cards are particularly well-suited to applications whose connections change regularly, such as household audio/visual devices. The scientist explains that TranSticks offer an easier linkage technique than software-based methods: "With software-based approaches a user has to select a target from a list, possibly including lots of names or icons on a display, or a system [has to detect the] correct target automatically," notes Ayatsuka, who adds that an environment with many networked devices can complicate this approach. A TranStick card can pinpoint its intended target by looking for the device its counterpart is plugged into, so a user need not know the names and addresses of the paired devices to establish a connection; the search for a paired TranStick is encrypted using the cards' secret keys. The TranStick cards can also be employed to let a pair of computers share memory on a server. Ayatsuka thinks the system could be ready for practical use in two years; he and colleague Jun Rekimoto detailed the TranStick technology at the Computer-Human Interaction Conference in late April.
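
    The article does not spell out the pairing protocol, but the effect it describes, locating whichever device holds the matching card without knowing device names or addresses, could work roughly like this hypothetical challenge-response sketch:

      import hashlib, hmac, os

      # Hypothetical pairing sketch: a device broadcasts its card's public
      # ID plus a fresh challenge, and the responder proves it holds the
      # same secret key without ever transmitting the key itself.

      CARD_ID, SECRET = "pair-42", b"shared-secret-on-both-cards"

      def advertise():
          """Device A broadcasts the card ID and a random challenge."""
          return CARD_ID, os.urandom(16)

      def respond(card_id, challenge, my_id=CARD_ID, my_key=SECRET):
          """Device B answers only if its card carries the same ID,
          with an HMAC proving possession of the shared key."""
          if card_id != my_id:
              return None
          return hmac.new(my_key, challenge, hashlib.sha256).digest()

      card_id, challenge = advertise()
      answer = respond(card_id, challenge)
      expected = hmac.new(SECRET, challenge, hashlib.sha256).digest()
      print("paired device located:",
            hmac.compare_digest(answer, expected))  # True -> virtual wire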
    Click Here to View Full Article

  • "Early Row Signals Challenges for Next Net Summit"
    IDG News Service (07/05/04); Blau, John

    Committees tasked with focusing on Internet governance and funding were established at the first World Summit on the Information Society (WSIS) in Geneva last year, and certain goals were agreed upon, such as ensuring that at least half the world's population has access to some form of electronic media by 2015. Little has been done in terms of implementation since, however, setting the stage for a showdown at the next WSIS, scheduled for November 2005 in Tunisia. The first of three preparatory meetings for the upcoming summit was held on June 26, and controversy ruled as Tunisian human rights activist Souhayr Belhassen criticized her nation's violations of free speech and privacy rights; Tunisian government representatives attempted to censor Belhassen, but the EU forced the Tunisian delegation to back off. The issue of Internet governance, as it pertains to policy making, has been the sticking point in arguments pitting the United States, the European Union, and other developed nations against China, Brazil, and other developing countries: The former group is content to let ICANN continue administering policy, while the latter wants strong government intervention. Another dispute surrounds the developing world's call for a "digital solidarity fund" to expand access to the Internet, which contrasts with the U.S. and EU preference for maintaining current funding programs and voluntary contributions. Security and open-source software, which poor nations cite as a means to develop internal technology at relatively low cost, are also bones of contention.
    Click Here to View Full Article

  • "Idle Computers Are a Researcher's Dream"
    Idaho Statesman (07/02/04); Howard, Julie

    Boise State University (BSU), in collaboration with Micron Technology, has set up a grid computing network on campus that so far links almost 100 classroom desktops into a supercomputer using a program known as Condor. When the machines are not in use by students, they are busy computing complex research equations. The Condor system was deployed at a cost of $10,000, compared with a $400,000 supercomputer also installed at BSU. BSU professor Elisa Barney Smith aims to use the grid to process computations related to her research on making optical character recognition systems capable of recognizing faded or blurred copies or faxes, a breakthrough that would ease the conversion of printed materials to digital format. Barney Smith says computing time can be reduced to a few hours thanks to the number of machines the grid currently supports. BSU systems administrator Angus McDonald predicts that the Condor grid's computer population will increase by the hundreds as more professors become interested and demand grows. Micron's Brooklin Gore says that harnessing idle computers through a grid architecture is becoming increasingly popular in the enterprise arena. "The technology has been used in academia and research labs in earnest [for the past three or four years], and it's just starting to become popular for businesses," he notes, pointing out that computing power demands have been highest in the technology and bioscience sectors.
    Click Here to View Full Article

  • "CERN Openlab Adds a New Dimension to Grid Computing"
    Innovations Report (07/06/04); Grey, Francois

    The server and storage technical results of the first global science grid, the Large Hadron Collider Computing Grid (LCG) project, were announced by the CERN openlab for DataGrid applications at the facility's annual sponsors meeting on June 22. The successful meshing of the LCG with a cluster of 40 Hewlett-Packard servers running 64-bit Intel Itanium 2 chips was demonstrated, proving that the 32-bit-processor-based LCG can branch into a diversified environment. This is critical, as the grid must scale up its power and capacity to handle the Large Hadron Collider (LHC) project's massive data storage and analysis needs. CERN openlab researchers also concluded testing of IBM's StorageTank storage management technology, employed in IBM's SAN File System; more than 100 concurrent SAN File System clients and over 28 TB of storage distributed among 10 storage servers were recently managed. Other landmark technical achievements facilitated by the CERN opencluster include sustaining storage-to-tape rates of over 1 Gbps for hours, in keeping with the maximum rates at which LHC data must be stored to a primary tape backup, and an Internet2 land speed record obtained in October 2003 with the help of the HP server nodes with Intel Itanium chips. CERN openlab researchers sponsored by Oracle recently boosted the availability of the CERN grid computing environment by reducing downtime in its catalog. "The CERN openlab provides a role model for how CERN and its academic partners may in future wish to organize collaboration between the private and public sector, in order to develop the many new technologies that will surely be needed for endeavors beyond the LHC," says CERN director general Dr. Robert Aymar.
    Click Here to View Full Article

  • "Battlefield Robots Leap From Science Fiction to Reality"
    National Geographic News (07/01/04); Handwerk, Brian

    The U.S. military is deploying and testing robots in combat situations in the hopes that such machines will reduce casualties by performing dangerous tasks such as explosives placement, mine disposal, nuclear and biological agent detection, and hazardous material cleanup. Robots currently employed in military operations include unmanned flying surveillance drones and ground vehicles, while both the military and the private sector are working on smaller devices that individual soldiers can carry and deploy. Harmonizing the development and implementation of American military robotics is the job of the U.S. Department of Defense Joint Robotics Program headed by Cliff Hudson, who reports that many robots are in Afghanistan and Iraq to help find and disarm explosives. One deployment involves small, remote-controlled bomb-sniffing robots equipped with cameras, thermal imagers, microphones, and manipulator arms for lifting objects. Hudson says soldiers are adapting readily to tele-operated systems thanks to their familiarity with videogames. Robots are also being armed: Examples include automated aircraft that fire missiles and the Marine Corps Gladiator, an unmanned vehicle designed to travel ahead of troops to evaluate a locale's threat level and respond to assaults with deadly force. The Unmanned Systems Branch of the U.S. Navy's Space and Naval Warfare Systems Center is developing a movable airstrip and refueling station for unmanned aerial vehicles with the help of scientists at NASA's Jet Propulsion Lab, who plan to contribute automation technology used in the current Mars mission. Hudson foresees the emergence of autonomous and partly autonomous military robots that carry out missions entirely through sensors and computer technology.
    Click Here to View Full Article

  • "Thought for Food"
    USC Viterbi School of Engineering (06/30/04); Mankin, Eric

    University of Southern California (USC) computer scientists and nutrition experts have teamed up to build a system that boosts the use of fresh produce at community food pantries. Low-income people often lack the culinary knowledge or time to prepare vegetable dishes, and food pantries must deal with an ever-changing stock of donated food; by using a computerized system to create custom recipes according to available utensils and supplies, the researchers have been able to dramatically boost recipients' vegetable consumption--a key element in fighting disease and reducing health care costs. The system relies on computer software that produces individualized output depending on available resources and the preferences of the user. At the food pantries, volunteers collect information from recipients such as native language, food preferences, available kitchen supplies, and family makeup; the software then creates a custom recipe list based on what is available at the time in the pantry. Surveys show that people who receive the customized recipes accept more fresh produce from the pantry, keep the brochures much longer than general information, and try more of the recipes on the list. The team is developing a better infrastructure with tablet PCs and high-speed, high-quality printing, and USC computer scientist Eduard Hovy says similar technology could be used in supermarkets to help customers with food preparation tips and recipe ideas. The technology has also been licensed by a Canadian firm that is using it to create custom daily and weekly reminders for people who are trying to stop smoking. Further out, Hovy says, the software could even be used to deliver personalized advertisements.
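
    The matching step can be pictured as a simple filter over a recipe database; the recipe data and field names below are invented for illustration.

      # Keep only recipes whose ingredients are in today's pantry stock,
      # whose equipment the household owns, and which exist in the
      # recipient's language. All data here is hypothetical.

      RECIPES = [
          {"name": "Zucchini saute", "lang": {"en", "es"},
           "needs": {"zucchini", "onion"}, "tools": {"skillet"}},
          {"name": "Kale soup", "lang": {"en"},
           "needs": {"kale", "potato"}, "tools": {"pot"}},
      ]

      def custom_list(pantry_stock, kitchen_tools, language):
          return [r["name"] for r in RECIPES
                  if r["needs"] <= pantry_stock     # ingredients on hand today
                  and r["tools"] <= kitchen_tools   # recipient can cook it
                  and language in r["lang"]]

      print(custom_list({"zucchini", "onion", "carrot"}, {"skillet"}, "es"))
      # ['Zucchini saute']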
    Click Here to View Full Article

  • "Not the Usual Channels"
    Economist (07/03/04) Vol. 372, No. 8382, P. 65

    Error-correcting codes allow the Cassini spacecraft now orbiting Saturn to transmit images back to Earth with antennae using just a bit more power than a light bulb. Since Cassini was built in the mid 1990s, such codes have improved dramatically, and they promise to revolutionize not only space exploration but also terrestrial consumer electronics applications such as cellular data transmission. Today's advances in coding theory mean transmissions can be virtually guaranteed against error, such as the errors often introduced on the long journey through space. Claude Shannon established the theoretical limit for error-free transmission rates in 1948, and two coding technologies have come very near that limit. Turbo coding, developed by French telecommunications researchers, is being deployed in wireless 3G networks to bolster data transmission; the technique essentially doubles up the convolutional coding method used on the Cassini spacecraft, in which extra bits are added to the data and the entire set of bits is tallied upon reception to detect errors. Turbo coding dramatically improves error detection, but it works best with large data sets and is not worthwhile for voice communications, which use small groupings of bits. Low-density parity check (LDPC) coding is an even newer technology that outperforms turbo coding, though with the same limitation regarding data size: LDPC builds on work done by MIT researcher Robert Gallager in the 1960s, who was unable to exploit the theory because of insufficient computing power. LDPC multiplies a block of data by a "sparse matrix" containing only a few one entries and many zero entries; the two resulting blocks of data look different, but together they let the receiver determine where defects are by putting the results in context. The new LDPC coding techniques could allow super-fast wireless transmission that is still accurate--perhaps live data feeds from Saturn-orbiting satellites to users' mobile phones.
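
    The sparse-matrix idea can be shown with a toy parity-check example: multiplying the received block by a small sparse matrix H (mod 2) yields all zeros for a clean block, while a single flipped bit produces a "syndrome" matching the column of H at the error position. Real LDPC decoders use iterative message-passing over far larger matrices; this miniature only illustrates the principle.

      # Each row of H is one parity check touching only a few bits.
      H = [[1, 1, 0, 1, 0, 0],
           [0, 1, 1, 0, 1, 0],
           [1, 0, 1, 0, 0, 1]]

      def syndrome(received):
          """Multiply the received block by H, mod 2."""
          return [sum(h * r for h, r in zip(row, received)) % 2
                  for row in H]

      codeword = [1, 0, 1, 1, 1, 0]       # satisfies all three checks
      assert syndrome(codeword) == [0, 0, 0]

      received = codeword[:]
      received[2] ^= 1                     # the channel flips bit 2
      s = syndrome(received)               # nonzero: an error is detected
      columns = [[row[i] for row in H] for i in range(6)]
      print("error detected at bit", columns.index(s))  # bit 2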
    Click Here to View Full Article

  • "OGC Looks to Enable the Sensor Web"
    GeoWorld (06/04) Vol. 17, No. 6, P. 22; Botts, Mike

    The Open GIS Consortium (OGC) Sensor Web Enablement (SWE) effort seeks to make sensor networks available over the Web, using existing OGC registry standards as well as new encoding and interface standards for discovering and interacting with sensor networks. Sensor networks already affect numerous areas of our lives, for example predicting the weather through Doppler radar, weather stations, and satellite imagery; the need to leverage these networks in enterprise architectures is what drives the OGC standardization effort. New encodings under the SWE project include the XML-based sensor model language (SensorML) schema and the observations and measurements (O&M) schema, for the discovery of sensors and the secure transmission of sensor data. The SWE Web services approach also requires three standard interfaces: the sensor observations service (SOS) for requesting and retrieving sensor data, the sensor planning service (SPS) for tasking sensors to acquire specific readings, and the Web alert/notification services (WAS/WNS) that allow the publishing of messages and alerts. Used in conjunction with existing OGC standards, SWE-enabled sensor networks and systems would, for example, let the Federal Emergency Management Agency (FEMA) respond better to a large explosion and chemical fire. During the Sept. 11 attacks, NASA satellite thermal sensors were needed but unavailable to the relevant agencies because there was no way to immediately detect and interface with those sensors. SWE would allow that, and could also hook FEMA and other agencies up to biochemical sensors mounted on cellular towers for detecting concentrated plumes of airborne chemicals. With SWE alerts, appropriate containment and medical teams could be dispatched to the right areas quickly.
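
    Because the SWE interfaces are Web services, a client request for sensor readings might look roughly like the sketch below; the endpoint, offering name, and parameters are hypothetical, since the SWE specifications were still being drafted at the time.

      from urllib.parse import urlencode

      # Hypothetical sensor-observations-service (SOS) style request: a
      # client asks a Web service for readings from a registered sensor.
      # The URL and parameter names are invented for illustration; the
      # real request and response formats are defined by the draft SWE
      # XML schemas.

      params = urlencode({
          "service": "SOS",
          "request": "GetObservation",
          "offering": "tower-chem-sensor-17",   # hypothetical offering
          "observedProperty": "airborne_chemical_concentration",
      })
      url = "http://example.org/sos?" + params   # placeholder endpoint

      # urllib.request.urlopen(url) would return observations encoded
      # with the O&M schema for the client to parse.
      print(url)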
    Click Here to View Full Article

  • "The H-1B Cap: Friend or Foe?"
    Training (06/04) Vol. 41, No. 6, P. 43; Dolezalek, Holly

    The maximum number of new H-1B visa applications permitted annually reverted to 65,000 in September 2003, and opinion is split on whether this is a positive or negative development. The cap was reached less than halfway into fiscal 2004, and Information Technology Association of America President Harris Miller argues that this is bad news for recruiters. "They can go to one of the top universities in the country and get one of the top [foreign-born] students coming out of a master's or Ph.D. program that they'd love to hire, but they can't hire them because they can't get the visa to let them stay and work in this country," he laments. Miller also claims the cap will ramp up the offshoring of U.S. jobs because it leaves firms that want to field teams of foreign and American workers little choice but to send those teams overseas. Institute of Electrical and Electronics Engineers-USA President John Steadman testified before the Senate Judiciary Committee last September that importing foreign professionals through the H-1B and L-1 visa programs only exacerbates the outsourcing situation: Those workers leverage their language skills, contacts, and close ties to overseas business partners to effect offshoring, and they take their knowledge of U.S. markets, technology, and business practices back to their native countries. Steadman also stated that hiring additional foreign labor is not justified, given U.S. unemployment levels: He pointed out that almost 540,000 new H-1B applications were approved by the INS between 2000 and 2002, while the unemployment rate for U.S. electrical and electronics engineers climbed from 1.3 percent to 2.4 percent over the same period. Intel human resources attorney Patrick Duffy argued at the hearing that the diversity of electrical engineers and the lack of interchangeability between their disciplines negated Steadman's conclusions. "The real issue here is the lack of highly-educated U.S. candidates for the jobs for which we experience shortages," he wrote.

  • "Cyber Crumbs for Successful Aging With Vision Loss"
    IEEE Pervasive Computing (06/04) Vol. 3, No. 2, P. 30; Ross, David A.

    Researchers at the Atlanta VA Rehabilitation Research and Development Center have collaborated with Charmed Technology to develop Cyber Crumbs, an electronic system that functions like a trail of bread crumbs to help visually impaired people efficiently navigate through unfamiliar environments. Several location awareness technologies were tested as candidates for Cyber Crumbs' tech infrastructure, and the MIT Media Lab's Locust IR system emerged as the best technology because it enabled users to follow the route without breaking stride and did not require any additional effort on their part to obtain navigational data. The Atlanta team approached Charmed Technology to supply the hardware component of Cyber Crumbs by modifying interactive conference badges based on the Locust System to function as cyber crumbs and reader badges, and both organizations worked together to create software that would direct people to one of two specific sites in the VA hospital. In the Cyber Crumbs system's current form, the user wears a reader badge and walks along a route peppered with cyber crumbs that are asleep most of the time to conserve battery power; the badge sends out a wake-up signal in one-second intervals, and a crumb activates and transmits its ID code and text message once every second when a person comes within 22 feet of the device. The prototype system was tested on 10 visually handicapped people between 37 and 85 years old, and results indicated that Cyber Crumbs reduced average performance times by 50 percent and average distance traveled by 25 percent when compared to baseline figures. Participants generally praised the system as a tool for giving them support for independent travel and awareness of their surroundings, and appreciated Cyber Crumbs' ability to identify important objects or locations en route. Aspects of Cyber Crumbs they did not like included a lack of reliability due to signal reflections, the time it took for the system to locate crumbs, and a lack of crumb personalization. The VA hospital and Charmed Technology are working to address these issues.
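
    The once-per-second wake-up cycle reduces to a simple range query, as in this illustrative sketch; positions, IDs, and messages are invented.

      import math

      # One badge/crumb exchange as described above: each second the
      # reader badge sends a wake-up signal, and any crumb within about
      # 22 feet wakes, replies with its ID code and text message, then
      # sleeps again to conserve battery.

      WAKE_RANGE_FEET = 22.0

      CRUMBS = [  # (x, y) position in feet, ID, guidance message
          ((5.0, 3.0),  "crumb-01", "Hallway B: elevator 10 feet ahead"),
          ((40.0, 2.0), "crumb-02", "Turn left for the clinic entrance"),
      ]

      def wake_cycle(badge_pos):
          """One one-second cycle: collect replies from crumbs in range."""
          replies = []
          for pos, crumb_id, message in CRUMBS:
              if math.dist(badge_pos, pos) <= WAKE_RANGE_FEET:
                  replies.append((crumb_id, message))  # crumb transmits
          return replies

      print(wake_cycle((0.0, 0.0)))   # only crumb-01 is within 22 feet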
    Click Here to View Full Article


 