
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 688:  Monday, August 30, 2004

  • "Supercomputer Seeks Comeback"
    Wired News (08/30/04); Gohring, Nancy

    The vector supercomputer is once again available in the United States after a long quiet period, but industry observers say cluster computing may soon close the gap in the ability to perform complex calculations. Vector computing suffered when SGI acquired Cray in the mid 1990s and used the company's patents and cash to develop cluster computing. Additionally, a Cray lawsuit against NEC prompted a 454 percent tariff on NEC's products, making new vector supercomputer technology unavailable in the United States during that period. But while NEC went on to build the record-setting Earth Simulator for the Japanese government, cluster computing brought supercomputing to the masses in the United States. Now, with Seattle-based Tera Computer having bought Cray in 2000 and assumed its name, the vector supercomputer market is attempting a comeback in the United States. Cray vice president Peter Ungaro says that even though the market is small, it presents a huge opportunity for Cray, which shares the field only with NEC. U.S. government agencies are wary of using foreign products, and often use vector supercomputers for tasks such as simulating nuclear weapon explosions. As long as agencies are willing to pay a premium for the speed vector supercomputers afford, analyst Frank Gillett says there may always be a niche for those products. Meanwhile, Oak Ridge National Laboratory scientific applications technical leader Trey White says cluster supercomputers will hit a size limit beyond which too many resources are consumed by communication and synchronization. He believes even the 130,000-processor Blue Gene supercomputer IBM is building for Lawrence Livermore National Laboratory will not be able to handle certain applications that vector supercomputers can. SGI's Jeff Greenwald insists cluster supercomputing solutions are still developing new capabilities, and will soon be able to complete the type of workloads previously reserved for vector machines.
    Click Here to View Full Article
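
    The vector-versus-cluster distinction is about how work is expressed: a vector machine applies one instruction to whole arrays of data at once, while a cluster splits the arrays among many communicating nodes. A minimal Python/NumPy sketch of the programming model (an illustration only, not code from any machine discussed above):

        # Illustration only: array-at-a-time ("vector-style") arithmetic versus
        # an element-at-a-time scalar loop over the same computation.
        import time
        import numpy as np

        n = 1_000_000
        a, b = np.random.rand(n), np.random.rand(n)

        start = time.perf_counter()
        c_loop = [a[i] * b[i] + 1.0 for i in range(n)]   # scalar-style loop
        loop_time = time.perf_counter() - start

        start = time.perf_counter()
        c_vec = a * b + 1.0                              # one whole-array operation
        vec_time = time.perf_counter() - start

        print(f"loop: {loop_time:.3f}s  vector: {vec_time:.3f}s")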

  • "Tech Workers Stay Put as Economy Perks Up"
    USA Today (08/30/04) P. 1B; Kessler, Michelle

    Voluntary turnover among technology workers continues to fall: just under 9 percent of IT workers voluntarily left their jobs last year, according to a survey by Aon Consulting of 595 of the world's biggest tech firms. Aon also reports that 11.2 percent of the IT workforce was laid off last year, down from 20.3 percent in 2001, but notes that salaries are now rising for every category of tech worker and 29 percent of firms are hiring normally. The lack of voluntary movement surprises analysts given the improving economy; the study's author, John Radford, says, "It's as low as I've seen it, and I've been tracking these numbers since the early 80s." Experts say the lack of movement is likely due to the hangover from the tech bubble's burst as well as offshoring concerns. Still, they say employers shouldn't become complacent, since IT pros should see better job offers as the employment picture brightens.
    Click Here to View Full Article

  • "A PC That Packs Real Power, and All Just for Me"
    New York Times (08/30/04) P. C3; Markoff, John

    Cheap desktop PCs and Linux-powered clusters seemingly put an end to the era of custom-designed workstations, such as those once hawked by Sun Microsystems, Silicon Graphics, and Apollo Computer. But a new firm called Orion Computers is hoping to revitalize the market by offering single-user systems that pack significant processing power into a small footprint. Orion is selling a $10,000 desktop workstation that uses 12 Efficeon processors from Transmeta, delivering performance roughly 10 times that of a regular desktop PC while drawing about the same amount of power. A larger, $100,000 workstation uses 96 processors and draws 1,500 watts, the maximum available from a standard electrical outlet. Although experts are divided over Orion's prospects, the company offers interesting insight into the development of business computing. Many billion-dollar, computing-intensive firms today, such as Google, do not even consider custom-designed computing solutions. Meanwhile, power draw is becoming an increasingly important consideration in computing, says consultant John Mashey, who has worked as a computer designer at Silicon Graphics and Mips. Orion founders Colin Hunter and Ed Kelly, both Transmeta veterans, say they chose the Efficeon processor because of their stringent high-performance, low-power goals. IBM has opted for a similar approach with its Blue Gene supercomputer, which relies on many low-power processors packed tightly together. Analyst Addison Snell says Orion allows engineers and other heavy computing users to bypass queues for laboratory computing clusters.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Open-Source Community Skeptical About Microsoft's Sender ID License"
    eWeek (08/25/04); Vaughan-Nichols, Steven J.

    Microsoft's Sender ID technology requires a license that is at odds with open-source software licenses, making it unlikely that the Internet Engineering Task Force's (IETF) email authentication standard will succeed as currently described. Sender ID is a fusion of Microsoft's earlier Caller ID technology with the Sender Policy Framework (SPF) devised by Pobox.com founder Meng Weng Wong. But even as the company prepares to release the developer implementation of Sender ID, the license Microsoft has attached to the technology prohibits modification and the transfer of intellectual property rights. Open-source software, on the other hand, is distributed under licenses such as the General Public License that not only protect but insist on those rights, making it unlikely that companies relying on open-source software will be able to use Sender ID. In addition, Microsoft's Sender ID license requires users to notify the company of how they plan to implement the technology, which could give Microsoft critical information about competitors' plans. This last condition alone has been reason enough for other technologies to be rejected by the Open Source Initiative and the Free Software Foundation, says open-source expert and lawyer Lawrence Rosen. Negotiations are ongoing between lawyers such as Rosen and Microsoft's legal representatives, and while Microsoft seems flexible on other issues, it will not drop its prohibition on sublicensing. Sendmail chief technology officer Eric Allman says Microsoft has agreed not to involve end users in any license agreement, but still requires licensing agreements from companies such as Sendmail, which would probably prevent the standard from being as widely deployed as possible. Microsoft's Sender ID could be dropped altogether in favor of a tweaked version of the original SPF technology, says Allman.
    Click Here to View Full Article
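
    For context, the SPF half of Sender ID works by letting a domain publish, in a DNS TXT record, which hosts may send mail on its behalf. A hedged sketch of what such a record looks like and how its terms break down (the record and parser below are illustrative, not from the article or any product):

        # Parse a Sender Policy Framework (SPF) version 1 record. The record is
        # a made-up example; real ones live in the sending domain's DNS.
        record = "v=spf1 ip4:192.0.2.0/24 mx include:example.net -all"

        def parse_spf(txt: str):
            """Split an SPF record into (qualifier, mechanism) pairs."""
            parts = txt.split()
            if not parts or parts[0] != "v=spf1":
                raise ValueError("not an SPF v1 record")
            terms = []
            for term in parts[1:]:
                qualifier = term[0] if term[0] in "+-~?" else "+"  # "+" (Pass) is the default
                terms.append((qualifier, term.lstrip("+-~?")))
            return terms

        print(parse_spf(record))
        # [('+', 'ip4:192.0.2.0/24'), ('+', 'mx'), ('+', 'include:example.net'), ('-', 'all')]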

  • "Have E-Books Turned a Page?"
    CNet (08/27/04); Becker, David

    Book publishers are starting to understand what consumers really expect from e-books after stumbling for more than a decade. Sales continue to grow steadily; the Open eBook Forum says e-book sales totaled $3.23 million in the first quarter of 2004, up 28 percent from a year earlier. Nevertheless, industry observers expect e-book adoption to continue at a gradual pace until book publishers come up with a digital distribution system similar to what the music labels have devised. "They're more traditional, they're very decentralized, and it just takes them longer to work out issues," says analyst Jean Bedford. The book publishing industry is still struggling to determine which devices are most suitable for e-books, while studies show that consumers want to read digital content on the PCs, handheld devices, and personal digital assistants they already own. Meanwhile, top publishers have been slow to make their titles available in e-book form because of concerns about illegal copying, and digital rights management issues continue to plague the industry. Mike Violano of eReader says, "There are far too many standards and ways of doing things now, and that's a source of frustration for customers," while Bedford says, "There's no good DRM, period. Publishers all want heavy-duty DRM, but the problem is that anything you do gets in the way of buying and using e-books." Other analysts say e-books represent an enormous shift in behavior that will take time, and the technology must improve to make reading digital books as convenient as paper. Sean Devine of Safari Books says, "The market will really change at a point where the readability on screen is as good as paper." Meanwhile, as the publishing industry standardizes on PDF and XML technologies for its production process, it is putting in place the infrastructure to move its content to the digital retail market.
    Click Here to View Full Article

  • "Japan Helps Drive Surge in Global Patent Applications"
    IDG News Service (08/27/04); Kallender, Paul

    An increase in the number of international patent applications signals renewed interest in research and development and a worldwide economic recovery, according to the World Intellectual Property Organization (WIPO). Although the United States continues to lead with 17,278 international patent applications in the first six months of 2004, Japan has extended its international patent prowess, filing 10,393 applications compared with 8,349 for the same period last year. In 2003, Japan surpassed Germany for the No. 2 spot, and while Germany's application numbers have remained essentially flat, the Japanese have sped ahead thanks to research and development boosts at large electronics firms such as Sony and Matsushita, which owns the Panasonic brand. WIPO figures do not include all patent applications, some of which are filed solely in-country, but they do offer a representative sample of worldwide patent activity. Chinese companies have also increased their patent applications, rising to No. 14 in the global rankings with 856 applications at this year's midpoint. In Japan, Matsushita and Sony broke into the top five global patent applicants for the first time in 2003--company-specific numbers for 2004 are not yet available. Meanwhile, 2003 research and development expenditures at Matsushita increased 10.5 percent while Sony increased spending by 6.9 percent. Samsung Electronics and LG Electronics have also significantly raised their patent application standing in recent years: Samsung more than doubled its applications in 2003 with 194 submissions, and LG Electronics likewise doubled its filings, submitting 156 applications in 2003.
    Click Here to View Full Article

  • "Wi-Fi Goes Airborne"
    Technology Review (08/27/04); Hellweg, Eric

    Earlier this year, Lufthansa began offering a new service from Boeing--broadband-speed wireless Internet connections while in the air. This month, wireless connection company iPass announced a partnership with Boeing's Connexion unit, and in roughly six months corporate frequent fliers on some flights will be able to access files on their intranets while aloft. A few flights already offer the airborne Wi-Fi service, and users can securely log on to corporate networks. The service uses a Boeing ground station to transmit an Internet connection through a satellite gateway to a satellite on which Boeing rents transponder space; the satellite beams the signal to an antenna on the aircraft's fuselage. Airlines charge $15 for access on flights lasting less than three hours, $20 for flights between three and six hours, and $30 for flights over six hours; per-minute plans are also available. The service gives airlines some immediate revenue, and offers cost savings as well--Boeing is working with airlines on applications that could permit real-time diagnostic monitoring of planes in flight, serving both pilots and air traffic control while replacing existing onboard connections that operate at just 9.6 Kbps. Boeing has worked for nearly four years to get FAA and airline approval for its Wi-Fi service. Connexion's Stan Deal says, "The last barrier to break was bringing a broadband connectivity solution to the airline that was affordable for passengers and for the airline to equip...it took a lot of system engineering." Lufthansa's Michael Lamberty says, "We'd also like to use the connection to send satellite maps and weather maps to the pilots."
    Click Here to View Full Article

  • "Kids Still Shunning IT in Worrying Numbers"
    silicon.com (08/26/04); Sturgeon, Will

    U.K. secondary students remain largely unenthused about IT, according to new General Certificate of Secondary Education (GCSE) and other government schooling data. Just 1.1 percent of all secondary school students in the United Kingdom chose A-Level computing this year, down from the already paltry 1.4 percent who enrolled last year--in absolute terms, 1,400 fewer students took A-Level computing qualifications. Among students younger than 16, interest in IT is similarly low: just 1.7 percent of GCSE students, or 8,488, chose IT as a subject, compared with 9,280 music students and 17,831 expressive arts/drama students. Rob Chapman, whose organization The Training Camp offers IT educational opportunities, says the U.K. government needs to do more to stress the importance and benefits of an IT career. "The government should be treating IT in the same way as it treats the 'three R's.' IT is now as fundamental to society as reading, writing, and arithmetic," he says. Explaining the salary opportunities available to qualified IT professionals could be one way to encourage greater numbers of IT students, as could countering the perception that IT is only about technical computer work. Projecting the image of IT as creating services for people and business could help dispel "geekiness" myths and encourage more girls to enroll in IT courses, says NeoOne managing director Hetty Browne.
    Click Here to View Full Article

  • "Where the Fantastic Meets the Future"
    Business Week (08/25/04); Shapiro, Sarah R.

    Bell Labs physical sciences vice president Cherry Murray says corporate research and development is more important than ever, and is becoming faster and more global in nature. Whereas the federal government used to fund two-thirds of research and development, business has now taken on the lion's share of research funding. The global economy and communication technologies such as the Internet have also ensured that corporate research and development is no longer focused on a single market, and that new technologies get to market faster: wavelength division multiplexing systems prototyped at Bell Labs in the 1980s were brought to market in just seven years, for example. Murray says research and development now occur simultaneously, with new technologies being added to still-developing products when appropriate. Businesses are also globalizing their research and development in order to get closer to their customers and ensure world-class talent in their facilities. China, India, Taiwan, Japan, and Europe are all focused on building out their own high-tech industries and continue to press the United States, especially in the physical sciences and engineering. Overall, Murray says this is a good thing because it will create a prosperous global economy. The availability of venture capital for startup firms is also bringing the science and business fields closer together. At Bell Labs, about 20 percent of research goes toward long-term projects, and about 10 percent of researchers work on basic science with no specific product in mind at all; at the same time, colleagues keep watch for technology that might successfully cross over into their own projects. To take advantage of future technology, engineering graduates must be able to think at the system level and leverage computing power to model complex systems.
    Click Here to View Full Article

  • "NSF Funds Database Gymnastics"
    Government Computer News (08/16/04) Vol. 23, No. 23, P. 48; Jackson, Joab

    The National Science Foundation (NSF) is funding a slew of database Web services projects that would enable local, state, and federal agencies to easily pull data from diverse database sources. Commercial database solutions do not provide this type of cross-agency interoperability, so the NSF, through its Middleware Initiative, is forging the way by funding 31 projects worth a collective $25 million. The University of Southern California has already demonstrated one Web services project that links 11 databases in an application measuring freight traffic on Los Angeles freeways. The system connects to databases covering sea and air shipping by defining a uniform way to categorize and describe the data; this ontology allows researchers to quickly create workflow models for capturing data from different sources. Virginia Tech and Purdue University researchers have created a similar database Web services scheme that allows citizens and Indiana Family and Social Services case workers to view the information needed to complete social services online. Previously, the only way to complete these services was to contact individual state agencies by fax or phone, or to make multiple in-person visits. The system uses Common Object Request Broker Architecture (CORBA) server technology to reconcile differences among the Oracle, IBM, and ObjectStore database management systems used. Each of these projects requires ontologies that merge data sets from various sources. Related NSF projects include automatic mapping technologies that match similar data, such as addresses, phone numbers, or ZIP codes; the University of Southern California and Washington University in St. Louis, for instance, used machine translation techniques to automatically match columns of similar data despite different column headers.
    Click Here to View Full Article
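
    The column-matching idea in the last sentence can be illustrated with a toy value-overlap heuristic: if two columns share most of their values, they probably describe the same field even when their headers differ. A sketch under that assumption (illustrative only, not the researchers' code, which used machine translation techniques):

        # Match columns across two databases by Jaccard similarity of value sets.
        def jaccard(a: set, b: set) -> float:
            return len(a & b) / len(a | b) if a | b else 0.0

        # Hypothetical sample columns: same data, different headers.
        db1 = {"zip": {"90001", "90002", "90003"},
               "tel": {"213-555-0100", "213-555-0101"}}
        db2 = {"postal_code": {"90001", "90002", "90404"},
               "phone": {"213-555-0100"}}

        for col, vals in db1.items():
            best = max(db2, key=lambda c: jaccard(vals, db2[c]))
            print(f"{col} -> {best} (score {jaccard(vals, db2[best]):.2f})")
        # zip -> postal_code (score 0.50)
        # tel -> phone (score 0.50)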

  • "Mobile Java Gets More Juice"
    Wireless Week (08/26/04); Smith, Brad

    Nokia and Vodafone have agreed to work together in an open-standards development initiative to make the Mobile Java implementations of different device manufacturers compatible. Device fragmentation has been a problem for Mobile Java, formally Java 2 Micro Edition (J2ME), an open-standards set of specifications that has been implemented in different ways on various handsets; as a result, developers have had to write their applications several times for the different handsets on a carrier network. Nokia and Vodafone intend to work within the Java Community Process and use existing technology as they define J2ME services. Their work will involve the mass-market and smart-device specifications, JSR 248 and JSR 249, which refer to the Connected Limited Device Configuration and the Connected Device Configuration. Sun Microsystems will handle licensing and development of technology compatibility kits and reference implementations for the initiative, which is supported by a number of companies and industry groups, including the World Wide Web Consortium. Analyst John Jackson views the effort as a "significant step toward accelerating the development and distribution of more robust mobile applications."
    Click Here to View Full Article

  • "From Factoids to Facts"
    Economist (08/26/04)

    Microsoft researcher Eric Brill is working on new Web search technology that provides single-word answers to direct questions. The Ask MSR prototype (MSR stands for Microsoft Research) essentially distills information from the Web for users, a natural progression in Web search technology: AltaVista pioneered Web indexing and Google found a better way to return relevant results, while the new technology promises to get to the heart of the matter by providing answers. Ask MSR restructures simple questions such as "When was Marilyn Monroe born?" into declarative phrases like "Marilyn Monroe was born," and then searches using those phrases. The results are processed and ranked according to frequency, and the correct answer lands in the top three results 75 percent of the time. Oftentimes the wrong results are obviously wrong, so human intelligence acts as an effective final filter. Eventually, Brill and collaborator Radu Soricut of the University of Southern California aim to build a Web search system that provides much more detailed answers from Web search results. Their jointly written paper, "Beyond the Factoid," envisions 50-word answers that draw on the native intelligence of the Web rather than traditional artificial intelligence; as the Web grows, the accuracy of the answers should grow commensurately. Brill and Soricut's technique relies on the same "noisy channel" model used in computerized spell checkers and speech recognition software: trained on a broad set of paired questions and answers, such as those stored in lists of frequently asked questions, the system forms the Web search queries on which its answers are based.
    Click Here to View Full Article
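
    The rewrite-and-count approach is simple enough to sketch. The snippet below is a toy reconstruction of the idea described above, not Microsoft's code: the question is turned into a declarative phrase, and whatever word most often follows that phrase in retrieved text is proposed as the answer (the "search results" here are hard-coded stand-ins):

        import re
        from collections import Counter

        def rewrite(question: str):
            """Turn 'When was X born?' into the declarative phrase 'X was born'."""
            m = re.match(r"When was (.+) born\?", question)
            return [f"{m.group(1)} was born"] if m else []

        # Stand-ins for real web search snippets.
        snippets = [
            "Marilyn Monroe was born 1926 in Los Angeles.",
            "Actress Marilyn Monroe was born 1926 and died in 1962.",
            "Some sources claim Marilyn Monroe was born 1927.",
        ]

        counts = Counter()
        for phrase in rewrite("When was Marilyn Monroe born?"):
            for s in snippets:
                # Count the token immediately following the declarative phrase.
                for m in re.finditer(re.escape(phrase) + r"\s+(\S+)", s):
                    counts[m.group(1).strip(".,")] += 1

        print(counts.most_common(3))  # [('1926', 2), ('1927', 1)]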

  • "Survey Reveals What's Hot & What's Not: Top 10 Online Computer Science Degrees"
    Virtual University Gazette (08/04)

    Among the 10 most popular online graduate technology degrees, the top five are closely contested, says Vicky Phillips, CEO of GetEducated.com, which conducted the annual survey. The top five degrees are computer information systems, management of technology, management of information systems, general computer science, and Internet technologies. "Employers today will pay a premium for someone who presents with hard tech skills topped off by the ability to motivate others and manage large scale projects," especially at the graduate level, notes Phillips. The remaining five most popular online master's degrees in technology are computer programming, computer network management, computer security, database technologies, and e-commerce. All the degrees reflect what is going on in the overall job market across the country, according to Phillips. Furthermore, people who earn managerial degrees also obtain such desirable skills as problem-solving, communication, and collaboration, which transfer across different technology fields. Electrical engineering ranks 11th and computer engineering 13th, which is unusual given engineering's "promising" salary levels and current strength, says Phillips. She also forecasts that demand for computer security degrees will rise over the next three years in the wake of increasing post-Sept. 11 security threats.
    Click Here to View Full Article

  • "Encryption Gets a Boost"
    Federal Computer Week (08/23/04); Olsen, Florence

    The National Institute of Standards and Technology (NIST) re-energized the stodgy cryptography industry a few years ago when it held a global competition inviting cryptographers to submit their best encryption algorithms. The competition helped the agency develop the Advanced Encryption Standard (AES), a significant advance that has helped make cryptographic technologies more common and more invisible. Cryptographic experts say crypto-technologies both keep sensitive electronic data private and guarantee that data has not been altered. AES was quickly accepted by the information security industry and many other governments; it runs faster than older algorithms and is more efficient, and encryption specialists say it is ideal for small wireless devices or systems with limited processing capacity. Meanwhile, elliptic curve cryptography (ECC) has boosted the popularity of public-key systems, which generate and distribute cryptographic keys. Like AES, ECC is faster than older technologies, and National Security Agency officials have been using ECC and AES to modernize many federal agencies' public-key systems. Stephen Stark, IT manager of the Energy Department's Nevada nuclear test operations laboratory, says new capabilities for managing encryption keys and policies from a central server can improve information security and lighten security administrators' burdens. He says encryption schemes that rely on users to enable encryption, such as PGP, are not sufficient because people often forget to use them. Another challenge is finding a single solution that can be applied across an entire organization.
    Click Here to View Full Article
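
    As a concrete illustration of what AES gives applications, the sketch below encrypts and authenticates a message with AES in GCM mode. It assumes the third-party Python "cryptography" package (pip install cryptography) and is a generic example of the standard, not any agency's deployment:

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=128)  # AES-128; 192/256 also supported
        aesgcm = AESGCM(key)
        nonce = os.urandom(12)                     # 96-bit nonce; never reuse per key

        ct = aesgcm.encrypt(nonce, b"sensitive electronic data", None)
        pt = aesgcm.decrypt(nonce, ct, None)       # raises if the data was altered
        assert pt == b"sensitive electronic data"

    GCM mode delivers both properties the article mentions: the data stays private, and any tampering makes decryption fail.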

  • "Mesh Networks: The PacketHop Initiative"
    Siliconindia (08/04) Vol. 7, No. 7, P. 22; Sastry, Ambatipudi

    The PacketHop system uses standards-based mobile mesh networking technology to guarantee compatible links to existing infrastructure, writes PacketHop chief technology officer Ambatipudi Sastry. PacketHop's value to homeland security and public safety was demonstrated this past February in a Golden Gate Safety Network exercise that tested the network's survivability, immediacy, security, and interoperability through its support of multi-agency communication, coordination, and response. Mobile broadband connectivity was maintained across disparate networks, numerous devices, and rugged terrain. Other markets where mobile mesh networking can play a significant role include the home networking, automotive, enterprise, and consumer sectors. PacketHop uses distributed encryption, key management, and authentication techniques to sustain secure mesh operation even if all connections to centralized management entities are lost. To ensure roaming, self-contained dynamic Internet Protocol addressing and rendezvous technologies let devices join or leave a mobile mesh and link to public or private fixed infrastructure while maintaining ties to critical services. Quality of service is guaranteed through standards-based protocols that enable multicasting, management of multimedia data flows, and layer 2-aware routing. Both distributed and centralized management and policy control are used for network management, and PacketHop interoperates with existing wireless LAN infrastructure with no need to upgrade or replace wireless access points. PacketHop's mobile mesh networking software enables the instant configuration of a wireless broadband network when loaded onto an array of IP-compatible, radio-equipped devices; the complete solution comprises network client software, a network controller, and a network management system. In addition, the company has built a kit of applications that function in a server-free, peer-to-peer environment across all IP devices and networks.
    Click Here to View Full Article
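
    The core mesh idea, every node forwarding traffic for every other so connectivity can hop around failures, can be sketched as a shortest-hop route search over whatever radio links currently exist. This toy breadth-first search illustrates mesh routing in general, not PacketHop's software, and the node names are invented:

        from collections import deque

        # Hypothetical current radio connectivity between responder units.
        links = {"command_post": ["fire_truck"],
                 "fire_truck": ["command_post", "ambulance"],
                 "ambulance": ["fire_truck", "helicopter"],
                 "helicopter": ["ambulance"]}

        def find_route(src, dst):
            """Shortest hop-count path, or None if the mesh is partitioned."""
            queue, seen = deque([[src]]), {src}
            while queue:
                path = queue.popleft()
                if path[-1] == dst:
                    return path
                for nxt in links.get(path[-1], []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])

        print(find_route("command_post", "helicopter"))
        # ['command_post', 'fire_truck', 'ambulance', 'helicopter']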

  • "Is Technology Adoption Quickening?"
    Electronic Business (08/04) Vol. 30, No. 8, P. 16; Haughey, James

    Personal computer adoption took nearly a generation in wealthy countries, but the market remains largely untapped on the world stage, considering that fewer than 10 percent of adults and teens across the globe have a PC. A variety of social acceptance issues slowed the uptake of PCs from the time they first appeared in the late 1970s. Prices are much more affordable these days, and within three years PCs in China and India will be comparable in price to units sold in the United States in the early 1990s. How quickly the rest of the world embraces PCs will have a huge impact on the future growth of the electronics market. In China and the Middle East, PCs would be viewed as more useful tools if Internet access were not restricted, and steady adoption in those markets would begin to bring down prices. Moreover, governments in developing countries will have to deal with monopolies so that the cost of electricity, telephone lines, wireless access, taxes, tariffs, and the delivery of replacement or upgrade parts does not hinder adoption. Nonetheless, the cellular handset market may offer the most growth for the electronics industry over the rest of the decade: personal communications devices are being adopted more quickly in developing countries because they are cheaper, easier to use, and require no electricity service, complex local-language software, or even literacy.
    Click Here to View Full Article

  • "Computing the Cosmos"
    IEEE Spectrum (08/04) Vol. 41, No. 8, P. 28; Hellemans, Alexander; Mukerjee, Madhusree

    The Virgo Consortium, an international group of scientists, intends to simulate the evolution of the universe on a supercomputer in order to tackle some of the most fundamental cosmological mysteries. The machine recruited to model the universe is an 812-processor IBM Unix cluster at the Max Planck Society's Computing Center in Germany, and the consortium plans to store all its output data in public databases accessible to scientists worldwide, allowing comparisons to be drawn between the simulated universe and its real-world counterpart. The Virgo group decided to start the simulation 380,000 years after the Big Bang, based on the cosmic microwave background. This past June marked the completion of the first simulation, and over the next few months a detailed picture of the universe's broad distribution of matter is expected to emerge as the data is fully processed. Because the interaction of the astronomical number of particles involved in that distribution is beyond the ability of any existing or foreseeable computer to simulate, the Virgo astrophysicists approximated those conditions with 10 billion mass points that are evenly distributed and interact exclusively through gravity. The simulation comprises a virtual cube big enough to contain the universe's largest cosmic structures and detailed enough to image the formation of the cosmic web. Some 100 million trillion gravitational interactions would have to be calculated for an exact simulation, an impossible task for the IBM cluster, so the researchers split the virtual cube into several billion smaller volumes and lumped together the mass points within these volumes during the calculations, while calculations of short-distance interactions were accelerated via tree algorithms devised by astrophysicist Volker Springel. The Virgo scientists hope their next simulation will accurately portray the formation of galaxies and other celestial objects and phenomena that come close to their real-world equivalents.
    Click Here to View Full Article
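
    The scale problem the article describes is easy to see in code: direct summation of gravity costs on the order of N-squared pairwise interactions per step, which is exactly why the Virgo team needed Springel's tree algorithms to group distant particles. The NumPy sketch below computes one direct-summation step for a tiny N (an illustration of the underlying physics, not the consortium's code):

        import numpy as np

        G, soft, dt, n = 1.0, 0.01, 0.001, 100
        pos = np.random.rand(n, 3)      # particle positions
        vel = np.zeros((n, 3))          # particle velocities
        mass = np.ones(n) / n           # equal mass points, as in the simulation

        def accelerations(pos):
            diff = pos[None, :, :] - pos[:, None, :]             # pairwise offsets (n, n, 3)
            inv_d3 = ((diff ** 2).sum(-1) + soft ** 2) ** -1.5   # softened 1/r^3
            np.fill_diagonal(inv_d3, 0.0)                        # no self-interaction
            return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

        vel += accelerations(pos) * dt  # "kick": update velocities from gravity
        pos += vel * dt                 # "drift": advance positions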


 