ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 8, Issue 886: Friday, January 06, 2006

Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.

  • "U.S. Falling Behind China, India for New Engineering Graduates"
    Investor's Business Daily (01/05/06) P. A1; Mandaro, Laura

    Technology and engineering firms fear that a stagnating rate of graduates in the hard sciences could erode the United States' lead over foreign competitors in software development and technological innovation. Rising interest in science and engineering in nations such as China, which currently produces 21 percent of the world's engineering graduates, and India and Russia, which produce 8 percent each, has only heightened the concern in the United States, which produces just 6 percent. The share of U.S. freshmen who intend to major in computer science dropped from 16 percent in 2000 to 9.6 percent in 2002. This trend has prompted major computer companies such as Microsoft and Intel to pursue offshore outsourcing not only for rudimentary coding assignments, but also for significant research and development initiatives. While the Cold War competition with Russia drew many students to engineering in the 1960s, and the 1990s had the appeal of soaring salaries, the post-dot-com economy has eroded the appeal of the technology industry for many future job-seekers. The unemployment rate for computer scientists from 2000 to 2004 was the highest of any four-year period since 1972, at one point eclipsing the national unemployment rate. Despite such perceptions, the job market has since rebounded: computer science graduates from the best schools can look forward to starting salaries of roughly $70,000, and the Labor Department predicts that demand for science and engineering occupations will increase by 47 percent over the next five years, compared with a more moderate 15 percent increase for jobs in other fields.

  • "Think Tank, House Eye H-1B Abuses"
    EE Times (01/02/06); Schiff, Debra

    Recent studies have found widespread exploitation of the H-1B visa program among employers, in addition to existing evidence of alarmingly low wages. Legislation pending in the House, introduced by Rep. Bill Pascrell Jr. (D-N.J.), is aimed at curbing H-1B abuses ranging from discrimination based on immigration status to outright fraud, though wage discrimination remains the most controversial issue. Adding to the controversy is a recently released report from the Center for Immigration Studies (CIS) that highlighted a $13,000 wage discrepancy between domestic employees and workers in the country on H-1B visas. The proposed Defend the American Dream Act aims to reform H-1B policy by mandating that companies base their estimates of prevailing wages on local data, maintaining the current cap of 65,000 visas, and shortening a foreign worker's stay to either a non-renewable three-year term or a two-year term renewable for another two years. The bill grew out of the efforts of Sona Shah, an Indian-American who sought to combat discrimination against U.S. workers and the exploitation of foreign labor. The CIS report also highlighted the practice of consultancies, or body shops, that contract out H-1B workers on a daily basis at low wages to perform basic IT tasks while categorically refusing to employ U.S. workers. University of California, Davis, computer science professor Norman Matloff argues that companies save money with H-1B employees either by hiring workers with comparable experience and qualifications at a lower salary, or by employing a younger foreign worker at a fraction of the cost of an older American. The proposed bill would authorize random audits of visa applications and impose stiffer penalties on violators, as well as tripling the H-1B processing fee to $4,500.

  • "Phone Companies Set Off a Battle Over Internet Fees"
    Wall Street Journal (01/06/06) P. A1; Searcey, Dionne; Schatz, Amy; Young, Shawn

    BellSouth, AT&T, Verizon, and other large phone companies want to collect fees from Internet content providers for fast access to high-quality music, movies, and other material that is sent over their telecommunications networks. Providers that pay such fees would find their user transactions prioritized, which means consumers would be able to access content faster. Network operators are attempting to prioritize Internet traffic amid rising consumer demand for the reliable delivery of phone service, real-time video games, and video. Critics claim exacting a toll for priority content delivery is tantamount to extortion, which puts competition and ultimately consumers at a disadvantage. Vonage CEO Jeffrey Citron said this setup essentially means customers will be charged for bandwidth twice, and he is concerned such a move will encourage cable companies that offer broadband to also append additional charges. Phone companies cite the need to find new sources of revenue as cable and Internet-based companies steal customers away by offering cheaper phone service. Washington is taking a cautious attitude to the phone companies' plans: "We need a watchful eye to ensure that network providers do not become Internet gatekeepers, with the ability to dictate who can use the Internet and for what purposes," admonished FCC Commissioner Michael Copps. Mark Cooper of the Consumer Federation of America is worried that a fee system for prioritized content delivery would allow the phone companies to block smaller Internet companies' products from the marketplace.

  • "That's Why It's Called Research"
    HPC Wire (01/06/06) Vol. 15, No. 1; Bell, J. William

    In a recent interview, Duncan Buell, interim dean of the College of Engineering and Information Technology at the University of South Carolina, outlined his thoughts on reconfigurable computing. The field emerged shortly after the advent of the FPGA, which lets users tailor a device's arithmetic logic to their individual computing requirements. Recent years have seen significant improvements in chip performance, with multiple processors running in parallel, a development that could have a major impact on the scientific disciplines. Buell believes that programming such hardware will need to be possible in a familiar language such as C or C++ to engage the scientific community. Reconfigurable computing will need a few evangelical pioneers to forge the path, perform the initial experimentation, and report back to the rest of the community, though Buell says any application promising a significant improvement will require at least six months of development effort.

  • "Security Flaws on the Rise, Questions Remain"
    Security Focus (01/05/06); Lemos, Robert

    Pervasive bugs in Web applications contributed to the first major increase in publicized security vulnerabilities in three years, though different databases offer competing figures on the number of security risks discovered in recent years. A recent examination of four major databases consistently indicated a spike in vulnerabilities stemming from easily discovered flaws in Web applications and a doubling of the number of errors found in software, and security analysts believe that such vulnerabilities will not disappear any time soon. The National Institute of Standards and Technology (NIST) has developed the National Vulnerability Database that uses the Common Vulnerability Scoring System to produce a standardized reading of security flaws. Because each of the four databases surveyed uses different cross-referencing techniques and editorial policies, meaningful comparisons are difficult. CERT, which was one of the databases surveyed, reported 5,198 vulnerabilities in 2005, though that finding has been disputed. Whatever the figure, CERT's conclusion that 2005 saw a spike in vulnerabilities is legitimate and widely agreed upon. Most vulnerabilities are not catastrophic, however. "Web-based vulnerabilities are all over the place and they are really easy to find--they are the low-hanging fruit," said Symantec's David Ahmed. "We have had high-profile vulnerabilities, but that is not what is driving this increase." Computer scientists are more concerned with flaws embedded in the software developed by major companies. It should also be noted that any analysis of software vulnerabilities does not concern products developed in the current year. "These numbers are showing the state of practice from a few years ago, rather than what the current state of practice is today," said CERT's Jeff Havrilla.

  • "Semantic Web Travel Services on a Voyage of Discovery"
    IST Results (01/05/06)

    The IST-funded SATINE program is developing a secure interoperability structure for utilizing peer-to-peer tourism networks on the Web based on semantic technologies, ultimately seeking to overcome the incompatibility issues that plague existing databases. The tourism industry currently operates under the services powered by Global Distribution Systems (GDS), which are inaccessible to many smaller enterprises. GDSs are also outmoded systems with unfriendly interfaces that depend largely on private networks. The travel industry has united to form the Open Travel Alliance and has developed an XML specification to facilitate communication between partners on issues such as checking for availability, booking, and insurance. By semantically enriching travel Web services and offering them in a distributed environment, the SATINE project improves on this initiative. The SATINE project also developed enrichment techniques for e-business XML registries based on OWL-S ontologies that describe the semantics of Web services. Enabling a machine to correctly interpret travel terms such as 'booking' is a difficult task which, once accomplished, creates interoperability among incompatible systems. "The creation of complex services through the orchestration of simple Web services is an important task that is of particular relevance in the travel business: Apart from the typical examples, like the composition of package tours, more sophisticated services like a flight booking based on the availability of tickets for a certain cultural event are conceivable," said project coordinator Asuman Dogac, a computer engineering professor at Middle East Technical University and director of its Software Research & Development Center.

  • "The Patent Epidemic"
    Business Week (01/09/06) Vol. 3966, No. 60; Orey, Michael

    A host of cases pending before the Supreme Court this year has the potential to broadly reshape current patent law, as the justices are taking a closer interest in issues such as the definition of an obvious invention, which would be ineligible for patent protection. Many businesses, including Microsoft and Cisco, have been lobbying the Supreme Court to restrict over-patenting, which they believe can curb innovation. With roughly 400,000 patent applications filed each year, the Patent and Trademark Office issued 181,000 patents in 2004, compared to 99,000 in 1990. Recent years have seen the proliferation of patents relating to computer software, business practices, and genetics, though many critics of the patent process allege that most amount to little more than rehashed combinations of existing methods and technologies. Microsoft is currently embroiled in between 35 and 40 patent suits, while Cisco is involved in seven, sparking widespread concern throughout the industry that engineers can inadvertently infringe on existing patents in the normal course of their work. As a result, many companies are pursuing a strategy known as defensive patenting, in which they increase their patent applications to guard against infringement suits; Cisco, for example, has recently increased the number of patents it applies for annually from a few hundred to roughly 1,000. While companies are attempting to ward off lawsuits by building up their patent portfolios, many analysts believe the erosion of the obviousness test is to blame for such strategies: courts and patent examiners are not permitted to deem an invention obvious unless there is a previously documented reference to the combination of elements that comprises it.

  • "Computers Estimate Emotions" (01/04/06)

    Researchers from the Fraunhofer Institute for Computer Graphics Research IGD in Rostock will present at CeBIT 2006 techniques they have developed to give a computer the ability to understand the emotional state of its user. The scientists have developed a glove with sensors for recording factors such as heartbeat and breathing rate, blood pressure, skin temperature, and electrical resistance of the skin. "It is connected to a device that evaluates and saves the data," explains Christian Peter, engineer at the department for Human-Centered Interaction Technologies. "We are also working on techniques that will enable computers to interpret facial expressions and extract emotional elements from voice signals." The researchers face the difficult task of training the computer beforehand because emotions are not easy to interpret, but they say they have achieved some success in this area. A computer that is able to determine that a user is frustrated may be able to respond in a manner that would not prompt someone to smack the monitor or damage the unit in some other way. CeBIT is scheduled for March 9-15, 2006, in Hanover.

  • "Better Robots Could Help Save Disaster Victims"
    New Scientist (01/05/06); Kleiner, Kurt

    The development of search-and-rescue robots continues to be held back by a lack of funding from government and industry, according to William L. Whittaker, a roboticist at Carnegie Mellon University in Pittsburgh. Roboticists say new search-and-rescue robots would have aided life-saving efforts such as the one at the Sago Mine in Tallmansville, W.Va., where 12 miners died. Although the rescue workers made use of a robot, it was a commercial model not designed to navigate a mine, and it became bogged down after moving 21 meters into the tunnel. Robin Murphy, director of the Center for Robot-Assisted Search and Rescue at the University of South Florida, describes the robot as slow, ineffective, and designed more for bomb disposal. Whittaker is designing a robot that would use instruments such as laser rangefinders to create detailed three-dimensional maps of a mine's tunnels as altered by an accident. His colleague Howie Choset plans to give a robot the ability to move like a snake through small spaces in a collapsed mine or building. In addition to monitoring conditions and determining whether it is safe for rescue workers to enter, the next generation of robots may also allow survivors to talk to rescuers and bring them food, oxygen, and medicine.

  • "Panel Urges Paper Record of Electronic Votes"
    Richmond Times-Dispatch (VA) (01/06/06); Whitley, Tyler

    A Virginia legislative subcommittee has recommended that the state test a voting system that creates a paper trail to verify the accuracy of electronic machines, though paper trails may be unnecessary if the electronic machines in a pilot program prove reliable. Subcommittee chairman Timothy Hugo said that he intends to introduce legislation permanently mandating a paper trail for every vote cast, though the measure is likely to be defeated by election officials and registrars. Under the recommendation, the State Board of Elections will establish a monitoring program in several precincts to compare electronic returns with paper records. A permanent paper trail could be required if officials discover significant inaccuracies, though the pilot program will not begin before 2007. Computer scientist Alex Blakemore, co-founder of Virginia Verified Voting, says the pilot program study needs to be objective to be useful, and "the devil is in the details." Hugo believes that paper records could restore voter confidence in a tarnished election process. Virginia plans to use $30 million in federal funding to replace its punch-card and mechanical lever machines with touch screen and optical-scan systems under the Help America Vote Act. The new machines were used in the statewide election of November, which saw the closest contest in modern history in the race for attorney general. A recount shifted 37 votes from Democrat Creigh Deeds to his victorious Republican opponent, Robert McDonnell.
    For news on ACM's e-voting activities, visit

  • "Euro Rules Force Cleaner Gadgets"
    Wired News (01/04/06); Captain, Sean

    Europe's new environmental rules for improving the disposal and recycling of electronics are likely to have a major impact on high-tech companies based in the United States. The standards for limiting toxic materials in electronic products apply to products sold in the European Union. Companies are "not going to make one product line for the European Union and another line for the rest of the world," says Richard Goss of the U.S.-based trade organization Electronic Industries Alliance. The rules, known as RoHS (for "the restriction of the use of certain hazardous substances in electrical and electronic equipment"), take effect July 1 and limit two brominated flame retardants, polybrominated biphenyl and polybrominated diphenyl ether, as well as the heavy metals lead, mercury, cadmium, and hexavalent chromium, in almost all electronics products. RoHS has also inspired China to attempt to create similar rules. What is more, California's Electronic Waste Recycling Act, which takes effect Jan. 1, 2007, and affects computer monitors, laptop computers, and TVs, bases its heavy-metal rules on RoHS. European officials say the rules will make it easier and safer to recycle or dispose of electronic products.

  • "Job Jitters Just Won't Stop"
    InformationWeek (01/02/06) No. 1070, P. 54; McGee, Marianne Kolbasuk

    While there is still room for growth in the IT sector for workers with select skills, hiring across the industry will remain slow in 2006 and raises will be scant, with just 23 percent of technology professionals recently surveyed reporting that they plan to hire IT managers this year. While 47 percent say that they plan to increase their IT staffs, companies are specifically seeking applicants with skills in business applications, project management, and software development. Economic pressures, system consolidation, and outsourcing are the principal reasons why 12 percent report that they will reduce their staffs. Companies are also increasingly pursuing applicants who can merge IT skills with business expertise. By the third quarter of 2005, unemployment in the IT sector had decreased by 1.5 percent from the same time a year earlier, with 3.44 million IT professionals employed, up from 3.32 million in 2004. Health care and other industries actually report difficulty recruiting top-tier IT talent, with financial services and other sectors offering far higher salaries. Generous compensation is the surest way to prevent IT talent from defecting to other companies or industries, and only 1 percent of respondents predict that IT salaries will go down in 2006, while 83 percent believe that they will increase; 15 percent expect no change.

  • "Quantum Cryptography: When Your Link Has to Be Really, Really Secure"
    EDN Magazine (12/16/05) Vol. 50, No. 26, P. 41; Schweber, Bill

    Quantum cryptography (QC) can deliver utterly secure data transmission through the harnessing of the laws of physics, photon quantum states, and Heisenberg's uncertainty principle. BBN Technologies devised a fully operational, multi-node QC system that has been running for over two years, connecting a trio of Boston-area institutions through a 12-mile loop of unused dark optical fiber. The system, which was developed under a 2002 Defense Advanced Research Projects Agency grant, was based on the random polarization of photons, and subsequent selective polarization filtering and polarization-direction detection. A QC system can be employed for either one-time pad or key-exchange cryptography. The generation of a single photon with known quantum states involves the stimulation of a nonlinear crystal by a laser pump, which consequently creates twin photons with identical quantum states, also known as "entangled" photons. Completing a quantum-encrypted link between sender and receiver requires a setup that includes all-optical, electronic, and electro-optical components, including sources, delay lines, phase shifters, couplers, splitters, and optical fibers that incorporate elements that both do and do not maintain polarization. Although very sophisticated, the system can operate by itself with autocalibration, start-up mode, and self-test mode. Continuous data throughput is also supported. BBN Technologies' Chip Elliott says the next step is to make the systems smaller, cheaper, and more hardware-based.
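    The random-polarization and selective-filtering procedure described above is the heart of quantum key distribution. As an illustration only, the toy simulation below sketches the classical bookkeeping of such a protocol (randomly chosen bases, then "sifting" to keep the positions where sender and receiver agree); it models no real quantum channel or eavesdropper, and the function name is invented.

```python
import random

def sift_key(n_photons, seed=0):
    """Toy simulation of quantum-key sifting (no eavesdropper).

    The sender encodes each bit as a photon polarized in one of two
    randomly chosen bases; the receiver measures each photon in its
    own randomly chosen basis.  The parties then publicly compare
    bases (never bit values) and keep only the positions where the
    bases matched -- roughly half of the photons sent.
    """
    rng = random.Random(seed)
    sender_bits = [rng.randint(0, 1) for _ in range(n_photons)]
    # 0 = rectilinear basis, 1 = diagonal basis
    sender_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    receiver_bases = [rng.randint(0, 1) for _ in range(n_photons)]

    # Keep only the bits measured in the same basis they were sent in.
    return [bit for bit, sb, rb in
            zip(sender_bits, sender_bases, receiver_bases) if sb == rb]

shared_key = sift_key(1000)
print(len(shared_key))  # roughly half of the 1000 photons survive sifting
```

    The sifted bits can then serve as a one-time pad or as key material for a conventional cipher, matching the two uses the article mentions.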

  • "Networking Tomorrow's Battlefields"
    Military & Aerospace Electronics (12/05) Vol. 16, No. 12, P. 26; McHale, John

    Situational awareness will evolve as all battlefield elements--soldiers, commanders, vehicles, etc.--are networked together through a combination of technologies, including Internet Protocol (IP), wireless networking, and software-defined radio. Facilitating network-centric warfare is a goal of the U.S. Army's Warfighter Information Network-Tactical (WIN-T) program, which General Dynamics' Bill Weiss says is designed to provide warfighters with "access to critical battlefield information, seamless connectivity to the global information grid, unified network operations, joint interoperability, and security across a host of platforms and points of presence." WIN-T constitutes a secure, high-bandwidth, wireless communications network that will connect soldiers on the battlefield to voice, data, and video, incorporating intelligence, reconnaissance, surveillance, netted weapons, and the Future Combat Systems. One of the WIN-T program's most challenging aspects is the incorporation of on-the-move technologies--radio, satellite, cellular, and IP capabilities--that maintain warfighters' linkage to the network and each other, regardless of whether they are stationary or moving; this will enable commanders to receive the right information constantly. The Joint Common Decision and Execution Capability (CDEC) system developed by Raytheon Network Centric Systems is described by Raytheon's Thomas Flynn as an IP-enabled, "remoted, distributed, and nondedicated" system of elements networked into a single entity. This will help support interdependent elements and semi-automated operations in tomorrow's battlefield. BAE Systems is developing the Adaptive Joint C4ISR Node (AJCN), which can interoperate with American and coalition systems and support real-time network-centric connectivity. 
    BAE's Matt Merryman says AJCN can establish a wide area network from the air with a common data link that combines command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) components.

  • "Are the Bad Guys Winning?"
    Campus Technology (01/06) Vol. 19, No. 5, P. 20; Gale, Doug

    Cybersecurity authority Eugene Spafford is not a big believer in the current strategies for cracking down on electronic vulnerabilities, which were reported at a rate of more than 10 per day in 2004 and have increased 20-fold since 1995. In recent testimony before the House Science Committee, Spafford, professor and executive director of the Center for Education and Research in Information Assurance and Security at Purdue University, said no one should be shocked if there are more breaches, defacements, and viruses in the immediate future. "The software and hardware being deployed today have been designed by individuals with little or no security training, using unsafe methods, and then poorly tested," said Spafford. "This is being added to the fault-ridden infrastructure already in place and operated by personnel with insufficient awareness of the risks." Spafford, a former member of the President's Information Technology Advisory Committee, believes systems that are simpler, sturdier, and better-made would solve the problem. However, the revenue stream of hardware and software vendors depends on the regular introduction of new and more powerful products that make systems more complex and vulnerable, and research is focused more on short-term patches than on making computer architectures more secure. Spafford says that left unchanged, the current system could implode. Alternatively, the market could start demanding and rewarding vendors that make systems simpler and more secure, or people could limit their use of IT to avoid security problems.
    Eugene Spafford is also chair of ACM's U.S. Public Policy Committee (USACM).

  • "How to Erase Hidden Database Design Errors"
    Software Test & Performance (12/05) Vol. 2, No. 11, P. 28; Sweeney, Mary Romero

    The impact of database design on performance is frequently neglected, and six notable design flaws can degrade it, according to database expert Mary Romero Sweeney. Removing referential integrity constraints will eventually leave systems inundated with invalid data records and poor performance, but such removal may be necessary in certain cases; Sweeney recommends "careful, considered analysis" to thoroughly understand the performance consequences of using or forgoing referential integrity. Database indexing strategies should also be adjusted through regular examination of the types of queries being performed on various tables, since an overabundance of indexes can slow down queries, while adding appropriate indexes to a table can enhance performance. The denormalization of data, or the violation of established normal forms in basic database design, does not need to be carried out as often as it usually is, the author maintains. Sluggish performance can result from inefficient locking architectures, and Sweeney advises frequent review and re-strategizing of the locking scheme by staff proficient with the organization's database management system, who should also vigilantly monitor these conditions as the company's data changes. The author discourages embedding SQL statements directly in the application code of today's database applications, favoring instead code that calls and executes stored procedures. Sweeney says organizations that do not physically partition files and data do application performance a disservice. She also recommends a good profiler and warns that overreliance on standard performance tools and tuners can cause important design issues to be overlooked.
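    Two of the practices described above, declaring referential-integrity constraints and indexing the columns that frequent queries filter on, can be sketched in a few lines. The following snippet uses Python's built-in SQLite module purely for illustration; the table and column names are invented, and SQLite is standing in for whatever database management system an organization actually runs.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE orders (
                    id INTEGER PRIMARY KEY,
                    customer_id INTEGER NOT NULL REFERENCES customers(id),
                    placed TEXT)""")

# Without an index, looking up a customer's orders scans the whole table;
# an index on the queried column lets the planner seek directly to the rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (1,)
).fetchall()
print(plan)

# The foreign-key constraint rejects orphaned rows outright, keeping the
# invalid records Sweeney warns about from ever entering the system.
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders VALUES (1, 1, '2006-01-06')")
try:
    conn.execute("INSERT INTO orders VALUES (2, 99, '2006-01-06')")
except sqlite3.IntegrityError:
    print("orphan row rejected")
```

    Examining query plans this way, per table and per query type, is one concrete form of the regular indexing review the article recommends.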
