
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 881:  Wednesday, December 21, 2005

  • "Does U.S. Face an Engineering Gap?"
    Christian Science Monitor (12/20/05) P. 1; Clayton, Mark

    Many scientists and politicians have raised alarms about an October report from the National Academies that suggested that China and India are far outstripping the United States in terms of producing new engineers. The high numbers reported for Chinese and Indian engineers, however, were swelled by more inclusive definitions of engineers--which in some cases included graduates of two-year technical schools and even auto mechanics. According to a more recent study from Duke University, more specific comparisons show that U.S. competitiveness is not eroding all that quickly, and may not be eroding at all. "Inconsistent reporting of problematic engineering graduation data has been used to fuel fears that America is losing its technological edge," says the Duke study, adding, "A comparison of like-to-like data suggests that the US produces a highly significant number of engineers, computer scientists, and information technology specialists, and remains competitive in global markets." For example, a comparison of how many engineering bachelor's degrees have been awarded is clouded by the fact that many of the Indian diplomas were actually three-year degrees. Some researchers argue that the fears of an engineering gap with China and India are overblown and that businesses are attempting to muddy up the issue as they look overseas for cheaper engineering talent. "Business groups have been very smart about trying to change the subject from outsourcing and offshoring to the supposed shortfall in US engineers," says Ron Hira, an outsourcing expert at Rochester Institute of Technology. Experts say that in some ways the debate over the engineering gap resembles the controversy over the "missile gap" with the Soviets during the Cold War.
    Click Here to View Full Article

  • "Diebold Hack Hints at Wider Flaws"
    Wired News (12/21/05); Zetter, Kim

    The discovery of flaws in Diebold's optical-scan voting machines in Florida's Leon County should give election officials pause about machines supplied by other manufacturers as well, said Hugh Thompson, the adjunct computer science professor who conducted the test last week. After Thompson and Finnish computer scientist Harri Hursti altered votes on a Diebold machine without leaving any evidence of tampering, election officials announced that they would discontinue use of Diebold machines in favor of Election Systems & Software (ES&S) systems, though Thompson cautions that ES&S products are not necessarily more secure. For election officials scrambling to meet a Jan. 1 deadline to select an e-voting system to replace punch-card and lever machines and qualify for federal funding, optical-scan machines have become a popular alternative to touch-screen systems, which do not produce a paper trail. The memory card used to record the ballots is unencrypted and not protected by a password, which enabled Thompson and Hursti to access it with a laptop and a card reader. Anyone with programming skills and access to the machines could rig an election by exploiting the vulnerability, they said. Because the machines could be programmed to begin an election with a given number of votes added to one candidate's total and the same number subtracted from the other's, the total number of votes cast would still be accurate, so election officials would be unlikely to notice (a worked illustration follows the link below). The security flaw is also capable of concealing itself, leaving no trace of tampering.
    Click Here to View Full Article
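
    The vote-shifting scheme described above works because the tampering is zero-sum: whatever is added to one candidate's count is removed from the other's, so the grand total still matches the number of ballots cast. A minimal Python sketch (candidate names, ballot counts, and the preload value are all hypothetical, not taken from the article):

        # Hypothetical illustration of the zero-sum vote offset described above.
        # Candidate names, ballot counts, and the preload value are invented.

        PRELOAD = 50  # votes shifted before the polls open

        def tally(ballots, preload=0):
            """Count ballots; a nonzero 'preload' models a rigged starting offset."""
            counts = {"candidate_a": preload, "candidate_b": -preload}
            for choice in ballots:
                counts[choice] += 1
            return counts

        # 1,000 legitimate ballots, split 480 / 520
        ballots = ["candidate_a"] * 480 + ["candidate_b"] * 520

        honest = tally(ballots)
        rigged = tally(ballots, preload=PRELOAD)

        print(honest, sum(honest.values()))   # {'candidate_a': 480, 'candidate_b': 520} 1000
        print(rigged, sum(rigged.values()))   # {'candidate_a': 530, 'candidate_b': 470} 1000

    Either way the total equals the 1,000 ballots cast, which is why a routine reconciliation of vote totals would not flag the rigged outcome.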

  • "Research Funds Headed for Small, But Noticeable, Cuts"
    National Journal's Technology Daily (12/20/05); Barrett, Randy

    Federal funding for research and development is poised to emerge slightly weakened from a difficult budget rescission process that is imposing 1 percent cuts on all government programs to reduce the deficit. While the cut is smaller than some had predicted, reducing the amount earmarked for research and development to $135 billion for fiscal 2006 will still have a noticeable impact. The Defense Department's research budget would increase by $1.5 billion to $73 billion, while the NSF stands to lose $42 million, cutting funding for its research program to $4.15 billion. The proposed budget, which has cleared the House of Representatives but must still be passed by the Senate, adds $3 billion to the Pentagon's science and technology budget, bringing it up to $14 billion. Allotments for research projects throughout the country will bring an additional $6.7 billion to the military's basic and applied research accounts. While the cuts may force some organizations to tighten their belts, things could have been a lot worse, said Robert Boege, executive director of the Alliance for Science and Technology Research in America.
    Click Here to View Full Article

  • "The Internet Is Broken--Part 2"
    Technology Review (12/20/05); Talbot, David

    Designed in the late 1960s, the Internet's original protocols were meant to move packets of data easily between a few hundred government and academic users, without regard to what information the packets carried. This means the protocols were not designed to distinguish unwanted traffic such as malicious viruses, or to handle Internet nodes that move, such as PDAs that connect to the Internet from many different places. The Internet's evolution thus brought numerous patches, such as firewalls, antivirus software, and spam filters--but the patches providing mobility for mobile devices have their downsides, and the security patches are clearly not keeping pace. Many experts believe there are even more fundamental reasons for concern about the current state of the Internet, arguing that its patchwork nature makes it overly complicated and difficult to manage, understand, and improve. Tom Leighton, the co-founder of Akamai--a company that itself makes money by patching up some of the Internet's shortcomings--also criticizes this patchwork nature and is pushing for fundamental architectural changes. An MIT mathematician and chairman of the Cyber Security Subcommittee of the President's Information Technology Advisory Committee, Leighton says the job of replacing the Internet's current architecture needs to start with setting goals: first, a basic security architecture; second, a more practical architecture for efficient traffic routing and collaboration among Internet service providers; third, the ability for future computing devices of any size, including sensors and embedded processors, to connect to the Internet; and fourth, technology to make the network more resilient and easier to manage. Organizations working on promising technologies toward these goals include the National Science Foundation, PlanetLab, and the Internet2 consortium.
    Click Here to View Full Article

  • "A 'No' Vote on Machines"
    Mercury News (12/21/05); Garcia, Edwin

    Citing security concerns over the systems' memory cards, California Secretary of State Bruce McPherson suspended the use of thousands of Diebold optical-scan and touch-screen e-voting machines in 17 counties pending federal review. McPherson's office has requested that Diebold provide its source code for review by the federal Independent Testing Authorities. Some analysts believe that suspending use of the AccuVote-OS optical-scan and AccuVote-TSX touch-screen machines was politically motivated, given the vocal reservations many voters have expressed about Diebold systems. Diebold has already been fined $2.6 million for distributing systems with software that had not been approved, and McPherson barred the use of the TSX machine this summer after testing exposed paper jams and screen freezes. Diebold machines were used in a November election in 16 counties after the state had granted them a conditional certification. Simply examining the source code does not prevent a hacker from loading new, malicious code onto the machines through the memory card, some experts claim. McPherson's decision has also been criticized for passing the responsibility for testing on to the federal government and an independent testing authority that is largely supported by the voting machine manufacturers. The testing is expected to take around two weeks, though it casts a shadow of uncertainty over election officials in precincts that had planned to use the machines in upcoming elections.
    Click Here to View Full Article

  • "Hispanic IT Students Get Boost From Latin American Grid, IBM Awards"
    InformationWeek (12/16/05); Jones, K.C.

    The Latin American Grid initiative will receive BladeCenter equipment with attached fiber channel storage, middleware, and infrastructure software from IBM, which is providing the equipment through its Shared University Research program. The LA Grid is a cooperative effort involving Florida International University, the University of Puerto Rico at Mayaguez, the University of Miami, the Barcelona Supercomputing Center, and Monterrey Tech, in addition to IBM's T.J. Watson Research Center. The participants are involved in research in health care, life sciences, nanotechnology, and region-specific projects such as the study of hurricanes. The LA Grid has the potential to help prepare students for successful careers in IT and boost economies by bringing IT businesses and jobs to Hispanic communities in the United States as well as Latin America and Spain. Currently, about 4 percent of computer science graduates are Hispanic, according to the 2003-2004 Computer Research Association Taulbee Survey, while just 1.1 percent of graduates with Master of Science degrees or Ph.D.'s are Hispanic.
    Click Here to View Full Article

  • "Playing Favorites on the Net?"
    CNet (12/21/05); Broache, Anne; McCullagh, Declan

    A bill expected early next year in the U.S. House of Representatives, along with recent comments made by executives from BellSouth and AT&T, has raised the prospect of a two-tiered Internet in which some services--especially video--would be favored over others. Although no broadband provider has proposed to block certain Web sites, some could choose to deliver their own video content faster than a similar service provided by rivals. The prospect of a two-tiered Internet worries e-commerce and Internet companies, including Yahoo!, Google, and Microsoft, which are lobbying to maintain the principle of "network neutrality," which holds that network owners must not pick favorites among the many technologies, applications, and users that travel across their networks. All three of these companies are very interested in online video services, which would compete with AT&T and BellSouth's planned Internet-based television services. There are several bills before Congress that address the issue of network neutrality, including a draft bill prepared by Rep. Joe Barton (R-Texas) that would prohibit Internet providers from blocking or unreasonably impairing or interfering with "the offering of, access to, or the use of any lawful content, application or service provided over the Internet." However, any attempt to develop broad network neutrality rules is likely to be hampered by the fact that problems remain mostly hypothetical. FCC Chairman Kevin Martin told a roomful of communications company executives last week that his agency was hesitant to adopt formal Net neutrality rules because "there hasn't been significant evidence of a problem."
    Click Here to View Full Article

  • "Google Offers a Bird's-Eye View, and Some Governments Tremble"
    New York Times (12/20/05) P. A1; Hafner, Katie; Rai, Saritha; Kramer, Andrew E.

    Google promotes its free Google Earth software, which displays satellite and aerial imagery on a 3D globe, as a navigation and instructional aid, but government officials from several nations complain that Google Earth's ability to display structures and sites in detail constitutes a security threat. India has very strict regulations about satellite and aerial photography, while Russian, Thai, and South Korean officials are also perturbed. U.S. experts generally concur that calling Google Earth a threat to security is wrong-headed, because the images Google receives from various sources are directly available from imaging companies: "Google Earth is not acquiring new imagery," explains Globalsecurity.org director John Pike. "They are simply repurposing imagery that somebody else had already acquired. So if there was any harm that was going to be done by the imagery, it would already be done." Google policy counsel Andrew McLaughlin says the company has entered into negotiations with several countries in the last few months over the regulation of Google Earth content, but he reports that none of the nations has yet asked Google to remove or obscure any information; neither has the U.S. government ever asked the company to remove data. Dave Burpee of the National Geospatial-Intelligence Agency says services such as Google Earth contain no restricted or classified content. Google Earth was developed by Google's Keyhole subsidiary, and Keyhole founder John Hanke is an advocate for freely available information; he argues that such transparency can aid disaster relief, land conservation, and other beneficial services.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Humans Down But Not Out Against Chess Computers"
    Reuters (12/16/05); Winfrey, Michael

    While human chess players have failed to recapture supremacy over computers since IBM's Deep Blue defeated Garry Kasparov in 1997, Bulgaria's Veselin Topalov, a current top-rated player, notes that computers can still be defeated, though with the psychological element of the game removed, humans cannot count on benefiting from their opponents' unforced errors. "You have to find a special strategy completely different from what you would do against humans," Topalov said. In a recent exhibition match in Spain, three machines handily dispatched three former champions, winning five games, drawing six, and losing just one. While computers have the advantage of performing many more calculations than humans can and of never tiring, Topalov believes that humans have the best chance in a long match because computers can still make strategic errors. In addition to being able to perform roughly 200 million times more calculations than a person, computers have superior memories and the ability to mine databases containing millions of games to help inform their strategy (a toy sketch of this kind of calculation follows the link below). Many top players have enlisted computers to help improve their own game, using them to gain insight into a grandmaster's strategies and to gauge the effectiveness of their own moves. Topalov says that as long as computers make mistakes, there is a small window of opportunity for human players to beat them.
    Click Here to View Full Article
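
    The "many more calculations" mentioned above are, at bottom, an exhaustive game-tree search. Below is a toy minimax sketch in Python; an abstract scored tree stands in for real chess positions, and real engines add far more (alpha-beta pruning, opening books, endgame databases):

        # Toy minimax over an explicit game tree; leaves are position scores.
        # This only sketches the exhaustive look-ahead a chess engine performs
        # millions of times per second -- real engines are vastly more elaborate.

        def minimax(node, maximizing=True):
            if isinstance(node, (int, float)):       # leaf: an evaluated position
                return node
            scores = [minimax(child, not maximizing) for child in node]
            return max(scores) if maximizing else min(scores)

        # A two-ply tree: the machine picks the branch whose worst reply is best.
        tree = [[3, 5], [2, 9], [0, 7]]
        print(minimax(tree))   # -> 3, from the branch [3, 5]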

  • "Sensors Bring Us Closer to a User-Friendly Future"
    Globe and Mail (CAN) (12/15/05) P. B13; Schick, Shane

    Canadian researchers are working to bring "ubiquitous computing" into reality by developing new concepts and technologies for the human-computer interface (HCI). The term ubiquitous computing was coined by the late Xerox researcher Mark Weiser, who drew a distinction between a ubiquitous computing environment and the desktop-oriented one we currently inhabit, characterizing the former as a domain where humans and computers comfortably coexist, and the latter as a place where interaction between people and computers is "distant and foreign." Weiser envisioned ubiquitous computing as a component of computing's "third wave," in which public and private computers serve anyone at any location. The concept implies that each device should perform a single function well, but that data should be easily exchanged or controlled between devices, which runs counter to most product designs. The result, ubiquitous computing proponents argue, is a better user experience. Queen's University professor Roel Vertegaal has designed prototype headgear to flesh out his vision of an "attentive user interface" that makes devices aware of user activities so they can be less intrusive. One concern about ubiquitous computing is the potential involvement of surveillance system technologies, but Vertegaal notes that people feel more comfortable around sensors, which are the more likely candidate for ubiquitous computing. "Ideally we would design [computers and sensors] so you wouldn't be able to see them," he explains.
    Click Here to View Full Article

  • "Microsoft's Vision of the Future"
    BBC News (12/16/05); Simmons, Dan

    Microsoft is not focusing solely on operating systems, and demonstrations in Brussels suggest the company could be a major innovator in fun gadgets over the next few years. The Personalized Facial Sketch technology, which automatically creates a caricature from a portrait for use as an emoticon to personalize email or instant messaging, will be available outside of Japan next year. The Whereabouts Clock is still a concept: it would track a mobile phone's signal and cross-reference the cell it is in with pre-programmed locations. For example, a mother could use the technology, which could be up to two years away, to determine whether her child has left school for home, and some bosses might want to use it to track their staff. Another development targeted at the home is Homenote, a digital Post-It note that allows users to send text or email to a location, such as the microwave in the kitchen, rather than to an individual. Snarf, an application that sorts emails according to the user's relationship with the sender, was also on display (a sketch of the idea follows the link below). "Email tools should know the difference between strangers and people you interact with frequently, and help you sort and prioritize your mail accordingly," says researcher Marc Smith.
    Click Here to View Full Article
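
    The article does not describe Snarf's actual ranking method; the Python sketch below only illustrates the general idea of sorting mail by a sender-relationship signal, here the invented metric of how often the user has previously written to each sender:

        # Hypothetical sketch of relationship-based mail sorting in the spirit
        # of Snarf: rank incoming messages by how often the user has previously
        # written to each sender. Data structures and the metric are invented.

        from collections import Counter

        def rank_inbox(inbox, sent_mail):
            """Order inbox messages by the user's history of writing to each sender."""
            messages_to = Counter(msg["to"] for msg in sent_mail)
            return sorted(inbox,
                          key=lambda msg: messages_to[msg["from"]],
                          reverse=True)

        sent_mail = [{"to": "[email protected]"}] * 12 + [{"to": "[email protected]"}] * 2
        inbox = [{"from": "[email protected]", "subject": "Buy now"},
                 {"from": "[email protected]",   "subject": "Lunch?"},
                 {"from": "[email protected]",    "subject": "Quick question"}]

        for msg in rank_inbox(inbox, sent_mail):
            print(msg["from"], "-", msg["subject"])
        # colleague first, manager second, the stranger last

    (All addresses above are placeholders for the example, not real contacts.)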

  • "Copyleft Hits a Snag"
    Technology Review (12/21/05); Fitzgerald, Michael

    Cyber-lawyer Lawrence Lessig has found that his copyleft licenses, while intended to facilitate the distribution of intellectual property, may actually impede it because they demand that derivative content be licensed under the same terms, meaning that works using one license cannot be combined with material under a different license. Lessig co-founded Creative Commons in 2001 with the intention of producing flexible versions of copyright licenses so that elements of multiple works could be shared freely. Creative Commons has six main versions of its license offering varying degrees of flexibility and modification. The number of link-backs to Creative Commons' site from the sites of writers, artists, or publishers who use the license has jumped tenfold in the past year to 45 million. Other copyleft licenses, such as the Free Documentation License (FDL) used by Wikipedia, also demand that derivative works carry the same distribution terms. As more audio and video content is created and shared, the incompatibility problem could worsen, and lawsuits could emerge if a large institution were found to be enabling mash-ups that violate the distribution terms of their source licenses. Lessig says the problem is that the law has not caught up to the networking era, and there may not be a simple solution. It is easy enough to alter the licensing language, though Software Freedom Law Center founder Eben Moglen warns that if the law is not revised, works that blend licenses cannot be shared. To resolve that issue, Lessig is developing new language that will give people creating works licensed under the FDL free access to Creative Commons licensing. Lessig is also trying to put together a legal advisory board to standardize copyleft licenses to permit sharing. He believes that by catching the licensing flaw early, the copyleft community will be able to stave off major litigation.
    Click Here to View Full Article

  • "PC or People: Who's the Boss?"
    CNet (12/20/05); LaMonica, Martin

    Clearing up haziness over whether the human operator or the PC is really in control is a priority for Microsoft, which has brought human-machine interface expert and designer Bill Buxton into its research division to address software design issues. Buxton says his job is to determine how technologies can be made to support good user experiences, culturally as well as individually. Designing ubiquitous computing to be more manageable is one area of concentration, and Buxton envisions two possible development tracks for ubiquitous computing: one in which a "society of devices" begins to complement each other and collectively improves the user experience while lowering the complexity of real-world engagement, and another in which chaos ensues. "There's an increasing awareness that every time you introduce any technology into a society or culture, it will have an impact," he says. "Once you acknowledge that, then it behooves you to make your best effort to understand what that impact will be and design it so that it's a positive contribution." Buxton says a design goal of embedded computing is to make the machine invisible and unnoticeable. He prefers the concept of a device that performs a single task very well by itself but can function cooperatively with other devices over the idea of a "super appliance" overstuffed with functionality. Buxton believes the PC will not go away, but its status as the leading human-computer interface will change as problematic features and functions migrate to more appropriate, PC-complementary platforms.
    Click Here to View Full Article

  • "Clarkson Engineer and 'Spoofing' Expert Looks to Outwit High-Tech Identity Fraud"
    Clarkson University News (12/09/05)

    Clarkson University engineering professor Stephanie C. Schuckers is an expert on "spoofing" of biometrics and is examining ways to outwit sophisticated high-tech identity fraud. "Today, biometric systems are popping up everywhere--in places like hospitals, banks, even college residence halls--to authorize or deny access to medical files, financial accounts, or restricted or private areas," she says. "As with any identification or security system, biometric devices are prone to 'spoofing' or attacks designed to defeat them." Cadavers' fingers or fake fingers molded from plastic--or even Play-Doh or gelatin--could be misinterpreted as authentic, says Schuckers. "My research addresses these deficiencies and investigates ways to design effective safeguards and vulnerability countermeasures. The goal is to make the authentication process as accurate and reliable as possible." With research funding from the National Science Foundation, the Office of Homeland Security, and the Department of Defense, Schuckers is currently designing methods to correct for spoofing vulnerabilities in fingerprint scanners. The research into fingerprint scanners is a joint initiative involving Clarkson researchers along with researchers from West Virginia University, Michigan State University, St. Lawrence University, and the University of Pittsburgh. Schuckers and her research team used dental materials and Play-Doh molds to make casts from live fingers, collected cadaver fingers, and found a 90 percent false verification rate with their faked samples. The spoofing rate dropped to less than 10 percent after the team improved the scanners with a computer algorithm that detects the perspiration pattern seen in live fingers (a simplified sketch of the idea follows the link below).
    Click Here to View Full Article
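
    The article gives no algorithmic detail beyond "perspiration pattern," so the following Python sketch only illustrates the general idea: compare two fingerprint captures taken a few seconds apart and treat a measurable intensity change as a sign of a live, sweating finger. The threshold, image sizes, and simulated data are all invented for the example.

        # Hypothetical sketch of perspiration-based liveness detection.
        # Live fingers sweat, so ridge intensity changes between two captures
        # taken a few seconds apart; a cadaver finger or Play-Doh cast stays
        # essentially static. The threshold below is an assumption.

        import numpy as np

        LIVENESS_THRESHOLD = 4.0   # mean absolute change in grey levels (assumed)

        def is_live(scan_t0, scan_t1, threshold=LIVENESS_THRESHOLD):
            """Return True if intensity changes enough to suggest perspiration."""
            change = np.abs(scan_t1.astype(float) - scan_t0.astype(float))
            return change.mean() > threshold

        # Simulated 8-bit scans: a "live" finger darkens slightly, a fake doesn't.
        rng = np.random.default_rng(0)
        base = rng.integers(80, 180, size=(64, 64))
        live_later = np.clip(base + rng.integers(3, 12, size=base.shape), 0, 255)
        fake_later = base.copy()

        print(is_live(base, live_later))   # True  -- intensity shifted over time
        print(is_live(base, fake_later))   # False -- static, likely a spoof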

  • "Aging Boomers Seek TECH Rx"
    EE Times (12/19/05) No. 1402, P. 1; Merritt, Rick

    Presenters at last week's White House Conference on Aging called for the creation of research and development initiatives for applying technology to enhance the lives of the elderly population, which is expected to expand globally from 600 million people now to 1.2 billion in about two decades. "By and large, getting funding seems to be extraordinarily difficult in [the field of elderly-assistive technology]," said University of Michigan professor Martha Pollack. "It seems to fall between the cracks of agencies that don't work on health issues, like the National Science Foundation, and government health agencies that don't sponsor technology research." Researchers stressed the need for core work in areas such as sensor networks, artificial intelligence, user interfaces, robotics, location-based services, and privacy and security to meet senior citizens' needs. But they also emphasized the importance of engineers collaborating with service providers, policy makers, caregivers, and users. Pollack testified before Congress last year that lawmakers should establish a joint funding program involving the NSF and either the National Institute on Aging or the National Institute of Biomedical Imaging and Bioengineering that would act as a stable funding source for assistive technology. Intel group manager and Center for Aging Services Technologies chairman Eric Dishman said delegates at last week's White House conference recommended, among other things, the development of incentives to spur proper use of health information technology. He expects the recommendations also to include a request for a national commission to encourage innovation and R&D that improve the aging experience, along with incentives for widespread implementation of telehealth technologies.
    Click Here to View Full Article

  • "Neural Network Sorts the Blockbusters From the Flops"
    New Scientist (12/17/05) Vol. 188, No. 2530, P. 25

    Oklahoma State University information scientist Ramesh Sharda has developed an artificial neural network that learns what makes a movie successful, using data on 834 movies released between 1998 and 2002. Sharda's research found that the neural network can judge a film based on seven key factors: the "star value" of the cast, the movie's age rating, the time of release relative to competing movies, the film's genre, the degree of special effects used, whether or not it is a sequel, and the number of screens it is expected to open on. The movie is then placed in one of nine categories ranging from "flop" to "blockbuster" (a sketch of such a classifier follows the link below). Sharda is currently expanding the system to include DVD sales and building a Web site where users can enter a movie's parameters and the software will forecast its performance.
    Click Here to View Full Article
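
    Sharda's actual model is not described beyond its seven inputs and nine output categories. The Python sketch below uses synthetic data, invented feature encodings, and scikit-learn's generic MLPClassifier rather than Sharda's network; it only makes the shape of the problem concrete.

        # Sketch of a 7-input, 9-class neural network in the spirit of the one
        # described above. Training data and encodings are synthetic; this does
        # not reproduce Sharda's model, only the problem setup.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        FEATURES = ["star_value", "age_rating", "release_competition", "genre",
                    "special_effects", "is_sequel", "opening_screens"]
        CATEGORIES = list(range(9))   # 0 = "flop" ... 8 = "blockbuster"

        rng = np.random.default_rng(42)
        X = rng.random((834, len(FEATURES)))      # 834 films, as in the study
        y = rng.integers(0, 9, size=834)          # synthetic success labels

        model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
        model.fit(X, y)

        new_film = rng.random((1, len(FEATURES))) # encoded parameters of one film
        print("Predicted category:", model.predict(new_film)[0])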

  • "The Future That Never Was"
    Electronic Business (12/05) Vol. 31, No. 12, P. 46; James, Geoffrey

    Many electronic products with revolutionary potential have failed to pan out for lack of consumer appeal, technological advancement, or timely marketing. The Beckman Model 2230 analog computer flopped because of miniaturization and programmability issues, which allowed IBM's 360 digital computer to become the de facto mainframe standard; had the Beckman machine featured digital emulation, things might have turned out differently, and the market debut of cell phones, Wi-Fi modems, and other analog-intensive products might have come earlier. AT&T's video-enabled PicturePhone failed to find market acceptance due to a lack of long-distance capability, which video compression technology would have addressed. Had both the phone and the compression product been released concurrently, before another factor--consumers' fear that video telephony would intrude on their privacy--could undermine the concept, AT&T would probably rule the cable TV market and be the single U.S. ISP today. The Xerox Alto workstation, which was comparable to an Apple Macintosh in form factor and function, might have jump-started the computer industry 10 years earlier had it been rolled out quickly instead of languishing for a decade. Another failed product, the Heathkit Hero robot, might have been a trendsetter if its multi-sensor functionality had been paired with generalized artificial intelligence, which was undeveloped at the time. Without that critical ingredient, high cost and complicated programming doomed to obscurity what could have been the first successful robotic servant. Generalized AI might also have enabled Thinking Machines to deliver on its highly hyped, massively parallel Connection Machine supercomputer, whose promise was never realized.
    Click Here to View Full Article


 