Volume 5, Issue 508: Monday, June 16, 2003
- "Computing's Big Shift: Flexibility In the Chips"
New York Times (06/16/03) P. C1; Markoff, John
Adaptive computing employs chips with circuitry that can be reconfigured on a moment's notice; chips with this ability can handle several functions that would otherwise require separate chips, making them ideal for wireless devices that must interoperate with different protocols and networks, or for consumer products upgradeable via Internet download. Proponents of the new technology say it is the next revolution in computing, comparable to the advent of the microprocessor in 1972, and that adaptive chips can combine the flexibility of a microprocessor with the performance of specialized hardware. Adaptive chips differ from field-programmable gate arrays (FPGAs) in that they are much easier to reprogram. Advances in static RAM (SRAM) have contributed to the viability of adaptive computing, since SRAM memory chips can now replicate entire chip circuit designs. Early Intel executive and Silicon Valley financier Gordon A. Campbell is a major backer of several small adaptive computing firms, including QuickSilver Technology, which is finalizing a chip that will enable people to watch a DVD while recording a television show. Reconfigurable computing pioneer Andrew Singer heads another Campbell-backed startup, Rapport Technologies. A decade ago, Singer worked at display firm Radius, which sold an FPGA-equipped monitor that switched between vertical and horizontal views depending on how the user positioned the screen. Larger firms are also exploring adaptive computing, including Intel, IBM, Hewlett-Packard, and Infineon. Still, adaptive computing has skeptics, including Transmeta founder David R. Ditzel, who says, "It sounds like an intriguing idea, but it's not clear what the applications will be to make it commercially viable."
http://www.nytimes.com/2003/06/16/technology/16CHIP.html
(Access to this site is free; however, first-time visitors must register.)
- "ROI, Security Driving IT Employment Trends"
EarthWeb (06/13/03); McMahon, Steve
The IT market is showing signs of stabilization, despite last year's mass layoffs and nearly flat salaries, according to studies from META Group, International Data Corp. (IDC), and the Information Technology Association of America (ITAA). IDC predicts worldwide IT spending will rise 5.8 percent and U.S. spending 4.4 percent this year, and a 2002 Forrester Research study found that eight out of 10 IT employees said their executive officers use IT ardently. In addition, ITAA notes that IT managers plan to create 1.1 million positions in 2004, and Gartner foresees a rapid increase in demand for skilled IT workers: statistical analysis shows that demand for software engineers, computer support experts, systems analysts, information systems managers, and network and computer systems administrators will double over the next seven years. The factor most responsible for revitalizing demand for IT skills appears to be the need for cost-cutting applications and improved security via the integration of existing software systems. There is also a greater concentration on deploying technologies that offer a solid return on investment and provide infrastructure and support. Other developments that could affect IT employment include the rapid growth of Linux, the anticipated shift of 58 percent of industry profits from hardware to software, services, and consulting within two years, and a rise in spending on computers and peripherals. IT professionals will be able to leverage these trends if they stick to long-term career strategies and capitalize on macro- rather than micro-level trends. Accomplishing this means continuously upgrading one's skills and remaining on the cutting edge of technology.
http://itmanagement.earthweb.com/career/article.php/2221691
- "TSA Modifies Screening Plan"
Washington Post (06/14/03) P. E1; O'Harrow Jr., Robert
The Transportation Security Administration (TSA), partly in response to hundreds of written complaints from people and organizations, has revised plans for a second-generation Computer Assisted Passenger Prescreening System (CAPPS II) in order to reduce its intrusiveness, according to documents and government officials. An earlier CAPPS II draft called for analysis of passenger records through federal computer systems and artificial intelligence, and indicated that officials wanted the authority to use such records broadly. The revised version would require each passenger to provide his or her name, address, date of birth, home phone number, and passenger name record. Selected details about each passenger would be given to commercial data services, which would return a risk score indicating whether each passenger is who he or she claims to be and has roots in the community; the draft privacy notice says that the commercial services would not be permitted to retain the scores in a "commercially usable form." The amended CAPPS II draft will likely include the appointment of a "passenger advocate" to handle complaints about various problems, while passengers would be screened by a "black box" system containing records about suspected terrorists. Center for Democracy and Technology staff counsel Lara Flint calls the announcement a positive development, but says she is still awaiting solid evidence that TSA is following through on its promises. The draft privacy notice declares that "TSA recognizes that inaccuracies in the commercial data may exist and that the CAPPS II system must allow for and compensate for such inaccuracies." Even with the new rules, some privacy experts remain unconvinced, because a great deal about CAPPS II is still undisclosed.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
- "Your Blink Is My Command"
ABCNews.com (06/13/03)
Ted Selker of MIT and Roel Vertegaal of Ontario's Queen's University are focused on the development of context-aware computers that can pick up on "implicit communication" relayed through eye or body movements to carry out commands. Much of the technology the two colleagues are working on incorporates eye-recognition systems. For example, a toy dog has been modified to bark when it receives infrared signals as a person wearing special eyeglasses stares at it; it is also programmed to stop barking if the person is blinking a lot, or is not looking in its direction. The dog can tell when two people wearing the glasses are looking at each other because lights on both pairs blink when eye contact is made. Selker and Vertegaal are also developing Attentive TV, in which the eyes of a person viewing a program on a computer are monitored with a camera; Vertegaal explains that the program starts or stops depending on where the eyes are focused. Another technology being worked on is "Eyepliances" that allow users to activate or deactivate appliances by staring at them and issuing voice commands. Meanwhile, users can more efficiently deal with interruptive phone calls through Eyeproxy, a device with quivering eyeballs that takes messages or patches calls through depending on whether the user decides to look at them. The technology Selker and Vertegaal are developing is five to 15 years away from commercialization.
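The gaze-contingent behavior of the toy dog boils down to a small decision rule. The following Python sketch is purely illustrative--the sensor interface, threshold, and function name are assumptions, not details from the article:

    # Hypothetical sketch of the toy dog's gaze-contingent rule: bark only while
    # a wearer of the infrared-emitting glasses is looking at it, and fall silent
    # if the gaze is absent or the person is blinking frequently.

    BLINK_THRESHOLD = 0.5  # assumed blinks per second above which barking stops

    def should_bark(gaze_detected: bool, blink_rate: float) -> bool:
        """Return True if the dog should bark after this sensor reading."""
        return gaze_detected and blink_rate < BLINK_THRESHOLD

    print(should_bark(gaze_detected=True, blink_rate=0.2))   # True: attentive gaze
    print(should_bark(gaze_detected=True, blink_rate=1.5))   # False: heavy blinking
    print(should_bark(gaze_detected=False, blink_rate=0.0))  # False: looking away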
Click Here to View Full Article
To read more about the attentive user interface work by Ted Selker and Roel Vertegaal, see the
March 2003 issue of Communications of the ACM.
- "Hacker Alert"
Wall Street Journal (06/16/03) P. R9; Richmond, Riva
On July 1, 2003, a precedent-setting California law takes effect that requires companies to immediately notify California residents of online intrusions that may have compromised their personal information and made them vulnerable to identity theft; organizations that fall under the law's jurisdiction include those located in California as well as those that do business in the state. In anticipation of July 1, corporate lawyers have been advising clients to boost computer system security and establish incident-response and notification protocols. Stewart A. Baker of Steptoe & Johnson recommends that organizations first formulate a strategy for assessing security intrusions and the probability that personal data has been exposed. The next step is to establish a procedure by which management, IT security staff, and legal counsel can determine whether notification is necessary and who specifically should be notified. Legal experts say companies are also responsible for ensuring that effective countermeasures--data encryption, firewalls, intrusion-detection systems, and authentication--are in place. Mark Bohannon of the Software and Information Industry Association worries that the law's language concerning which security intrusions are covered and exactly when notices of security breaches must be released is opaque, which could leave organizations vulnerable to class-action suits and other forms of litigation. The California statute is the template for Sen. Dianne Feinstein's (D-Calif.) Database Security Breach Notification Act, a proposed national disclosure plan encompassing all entities that conduct interstate commerce, including those located outside the United States. Feinstein's bill would disallow private lawsuits, instead authorizing the FTC to fine companies $5,000 for each violation, or $25,000 for every day that companies fail to comply.
- "Government to Investigate IT Visas"
VNUNet (06/11/03); Fielding, Rachel; Jones, Rob
The National Audit Office (NAO) and Work Permits (UK) will independently evaluate the UK's method of providing visas for overseas IT workers. At issue is the practice of firms advertising positions at low wages and then hiring foreign workers when no local candidates apply. In the first quarter of 2003, a total of 4,800 IT visas were issued to foreign IT workers, about half the number issued in the same period last year. Work Permits (UK)'s investigation will last three months and rely on a panel of researchers who will cooperate with the Professional Contractors' Group (PCG). Visa issuance procedures may be altered if problems are discovered, says PCG external affairs director Ian Durrant, and PCG says advertising procedures will also be scrutinized. The NAO is currently evaluating the situation to see whether a full investigation is necessary, according to an NAO spokeswoman. Recruiting firm The Skills Market says 26 percent of IT professionals in the United Kingdom were unemployed in 2003's first quarter, and PCG estimates that 30 percent of its members have lost jobs in the IT field. Despite these statistics, however, IT positions were taken off the UK government's skills shortage list in September 2002.
http://www.vnunet.com/News/1141538
- "Poker Playing Computer Will Take on the Best"
Edmonton Journal (06/12/03); Cormier, Ryan
A team of artificial intelligence researchers at the University of Alberta has spent the last 10 years developing a computer program that can play poker, and they believe the program could conceivably outclass all human players within a year. The pseudo-optimal poker program (PsOpti) is unique in that it is capable of bluffing and of working with imperfect information. "If you do not bluff, you're predictable," notes Jonathan Schaeffer of the university's Games Research Group. "If you're predictable, you can be exploited." PsOpti is based on the game-theory framework developed by Nobel Prize-winning mathematician John Nash. Ph.D. student and project researcher Darse Billings says the approach attempts to outline an outcome for the game that is fair to everyone. Schaeffer says that most earlier game research was based on games with perfect information, and adds that poker and other games with imperfect information have far more important real-world applications. Reasoning with imperfect information, as a poker player does, could be useful in areas ranging from international negotiations to purchasing an automobile.
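To make the Nash-equilibrium idea concrete, consider a toy 2x2 zero-sum "bluff or check" game whose optimal strategy must be mixed--exactly the unpredictability Schaeffer describes. The payoff matrix and Python code below are invented for illustration and are not PsOpti's actual method, which solves a vastly larger abstraction of Texas Hold'em:

    # Equilibrium mixed strategy for a 2x2 zero-sum game (illustrative only).

    def solve_2x2_zero_sum(a, b, c, d):
        """Row player's equilibrium for the payoff matrix [[a, b], [c, d]]
        (row player's payoffs), assuming no saddle point so optimal play is mixed."""
        denom = a - b - c + d
        p_row1 = (d - c) / denom          # probability of playing the first row
        value = (a * d - b * c) / denom   # expected payoff to the row player
        return p_row1, value

    # Rows: Bluff or Check with a weak hand; columns: opponent Calls or Folds.
    #            Call  Fold
    #   Bluff     -2    +3
    #   Check     +1    -1
    p_bluff, value = solve_2x2_zero_sum(-2, 3, 1, -1)
    print(f"Bluff with probability {p_bluff:.2f}; game value {value:.2f}")
    # -> Bluff with probability 0.29; game value 0.14

Because the equilibrium bluffing probability lies strictly between 0 and 1, a player who never bluffs (or who always bluffs) can be exploited, which is the intuition behind Schaeffer's remark.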
Click Here to View Full Article
- "Hobbyist Wins a Patent for PC's"
New York Times (06/16/03) P. C2; Chartrand, Sabra
Claude M. Policard, a technician at a banking services company, has turned his hobby of compiling a digital music archive on his PC into a patented design for a two-in-one desktop that is secure against computer viruses; the patent was awarded last week. The PC is equipped with two separate operating systems, two hard drives, and two memory banks so that personal computing files and Internet data are stored independently. Software programs allow the user to build spreadsheet or word-processing files on one hard drive, while the second drive is used to access email and Internet downloads; the user can toggle back and forth between the master computing system and the Internet system either through a switch embedded in the PC case or through a third microprocessor that uses keyboard key sequencing. Both operating systems could also be monitored by this microprocessor. According to Policard's patent, the Internet system is connected only to "components that cannot be affected by malicious software." Furthermore, the Internet computer system can be outfitted with countermeasures against known viruses, and it is also programmed to snare new viruses. "The big advantage of the patent is that any new virus will not pass into the main computer system," Policard declared last week. Data can still pass between the master computer and the Internet computer, even though the desktop runs on independent systems.
http://www.nytimes.com/2003/06/16/technology/16PATE.html
(Access to this site is free; however, first-time visitors must register.)
- "Java Should Be Open-Source, Creator Says"
Computerworld (06/13/03); Sliwa, Carol
Sun Microsystems vice president and Java creator James Gosling says the strength of the developer community and the variety of interests behind Java are robust enough for Java to become open-source. "My personal feeling is that we're over the edge, but I also feel a little nervous about that," he says. Gosling admits that many people at Sun Microsystems would disagree with him, because of worries that Microsoft could fragment and weaken Java, leveraging its market strength to push through incompatible technology. Currently, Sun controls Java development through the Java Community Process (JCP), and is fostering Java-based open-source projects through the new Java.net online community and the Java Research License, which allows non-commercial development using Java's core. JCP program chair and Sun chief engineer Rob Gingell says internal debate over making Java open-source heightened in May among Sun field engineers who work with customers. However, the contentious issue in that discussion was more about open-source development style than intellectual property issues. Gosling also says Sun's new Project Rave Java development toolkit should make the programming language a more effective channel of creativity. Project Rave tools make it easy for low-end programmers to code in Java while ensuring their work has the necessary hooks and framework for expansion. Gosling, who unveiled the programming language eight years ago, says it's only within the last year that he began to believe that Java could survive as an open-source platform, but he admits that he's not always convinced that he's right.
- "Bend It Like Robo-Beckham"
Salon.com (06/10/03); Gutkind, Lee
Carnegie Mellon University robotics professor Manuela Veloso concluded that true advancement of robotics technology cannot take place without collaboration. This was her motivation for organizing RoboCup, an effort to spur research and development by inspiring teams of engineers to create machines that can play soccer, the ultimate goal being robot players capable of defeating World Cup champions by 2050. This encapsulates what Veloso describes as "the essential loop: Developing robots that function simultaneously with action [movement], cognition [thinking], perception [vision and awareness]." The RoboCup matches pit robots developed by international teams against each other, and this year's championship competition will take place this July in Padua, Italy. The dedication and ambition of RoboCup participants illustrate an important facet of roboticist subculture: The deep, often emotional, relationship between scientists and their machines. Students such as Sarah McGrath of the University of Manitoba say the frustration they feel while struggling to get their robots to function properly fuels their determination to solve the problems. McGrath calls the driving force behind her efforts "the fascination of creating something, no matter how long it takes."
Click Here to View Full Article
- "Shock Waves Tune Light"
Technology Research News (06/11/03); Smalley, Eric
MIT researchers have discovered through computer models that exposing a photonic crystal to shockwaves can induce a dramatic Doppler shift and narrowing of bandwidth in lightwaves passing through the crystal. The findings could lead the way to quantum computer advances and less expensive, faster telecommunications. MIT scientist Evan Reed says the effects are the result of a temporary change in the spacing between the crystal's perforations that occurs when the shockwave travels through the lattice. He notes that the lightwave undergoes a visible color change during the Doppler shift, which is 10,000 times greater than normal; this effect could be harnessed to adjust the frequencies of lightwaves produced by cheap light sources so that they match the frequencies carried by optical fibers. Reed adds that the bandwidth-narrowing effect--the result of wedging the lightwave between a shockwave and a reflective surface, according to the simulation--could be used to improve the efficiency of solar cells. Reed also indicates that the shockwave effect could be applied to quantum cryptography and other single-photon quantum information initiatives. Theoretically, a photonic crystal altered by a shockwave does not absorb and re-emit photons, enabling quantum properties and the data they represent to be preserved. Reed explains that actual shockwaves are not necessary to induce the effects, and asserts that "Methods involving nonlinear optical materials, acousto-optical materials or [microelectromechanical systems] devices are likely to be more useful than shockwaves due to their non-destructive and repeatable nature."
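For a sense of scale (the following back-of-the-envelope estimate is not from the article): light reflected from a boundary moving at speed v ordinarily acquires a fractional Doppler shift of only about 2v/c. Taking an assumed, purely illustrative shock speed of 5 km/s,

    \[
    \frac{\Delta f}{f} \;\approx\; \frac{2v}{c}
    \;=\; \frac{2 \times 5\ \mathrm{km/s}}{3\times10^{5}\ \mathrm{km/s}}
    \;\approx\; 3\times10^{-5},
    \]

so an enhancement of 10,000 times would put the shift at a few tens of percent of the optical frequency--large enough to appear as a color change, consistent with Reed's description.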
Click Here to View Full Article
- "Pentagon to Move to Next-Generation Internet"
Reuters (06/13/03); Navarro, Marisa
The Pentagon on Friday announced that it will move to an IPv6-based Internet infrastructure by 2008 in order to overcome IPv4's limited numbering system, security shortcomings, and packet-loss problems. Pentagon CIO John Stenbit says today's technology-driven armed forces need a robust Internet infrastructure to tie together the Defense Department's Global Information Grid, which includes sensors, videoconferencing, weapons, information systems and connected devices, and aircraft. Stenbit says it will take five years to switch over to the new system, but the department must start making the switch with all new purchases after Oct. 1, 2003. Stenbit says, "My best guess is that it's going to happen commercially before 2008...if we don't start buying the stuff today, we're in trouble when it happens."
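The "limited numbering system" motivating the move is easy to quantify. A minimal Python sketch (not from the article) using the standard ipaddress module compares the two address spaces:

    # Compare the IPv4 and IPv6 address spaces that motivate the transition.
    import ipaddress

    ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
    ipv6_total = ipaddress.ip_network("::/0").num_addresses       # 2**128

    print(f"IPv4 addresses: {ipv4_total:,}")    # 4,294,967,296 (~4.3 billion)
    print(f"IPv6 addresses: {ipv6_total:.2e}")  # ~3.40e+38

IPv6's 128-bit addresses remove the practical ceiling that IPv4's 32-bit space imposes on a network of sensors, weapons, and connected devices like the Global Information Grid.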
Click Here to View Full Article
- "Info With a Ball and Chain"
Newsweek (06/23/03)
An era of online content restriction is emerging, thanks to the advancement of digital-rights-management (DRM) software. David Weinberger, author of "Copy Protection Is a Crime Against Humanity," argues that DRM by itself is not evil, but its real-world applications are, because "There's no user demand for it. It's being forced upon us by people with vested interests." DRM advocates argue that the technology gives users more options; for example, safeguards encourage entertainment companies and other copyright holders to distribute their content online. In addition, Eric Cullen of Microsoft claims that DRM software supports legal consumer usage while restricting illegal usage. However, the software can also block legal applications--for instance, DVD anti-copying measures prohibit "fair use" activities, and the only way to carry out such activities is to subvert the measures, which is illegal under the Digital Millennium Copyright Act (DMCA). Princeton computer scientist Edward Felten contends that DRM corrupts the Internet's fundamental purpose--the open exchange of information so that everyone benefits. There are signs that Congress has started to take the copy protection issue seriously: Sen. Sam Brownback (R-Kan.) will soon present a proposal "to ensure that our nation's media producers and distributors do not clamp down on the ways in which [consumers] traditionally and legally use media products."
http://www.msnbc.com/news/926304.asp
- "Lessons From Building the "Spatial Web""
CIO (06/01/03) Vol. 16, No. 16, P. 130; Schell, David
Members of the Open GIS Consortium (OGC) are committed to interoperability for the computer processes carried out by geospatial technology users. OGC members believe that people and organizations cannot adequately deal with the concept of space if geoprocessing is not interoperable. If geospatial technology users are to gain a sense of clarity, orientation, and behavior about activities in a given space and time, integrated spatial analysis and display capabilities are necessary, and the resulting interoperability would allow data, components, and other resources to be combined. The Web, which now has a spatial processing layer sitting on top of it, has become an example of the open platform sought by the OGC. The OGC's interoperability focus comes at a time when vendors are pushing proprietary strategies and when differences persist between spatial technologies such as GIS and earth imaging. Nonetheless, the consortium has a program that encourages technology suppliers to test their developments for industry-wide interoperability. For example, the OGC produced SensorML and other Sensor Web components within five months of the EPA expressing the need for a real-time modeling environment based on Web access to thousands of geolocated sensors. The OGC's interoperability approach is expanding the market for technology vendors as a result of the demand for integrating spatial capabilities into information systems, and it could very well influence emerging platforms in other areas of the IT industry.
http://www.cio.com/archive/060103/et_pundit.html
- "A Game of Chance"
New Scientist (06/07/03) Vol. 178, No. 2398, P. 36; Mackenzie, Dana
The Advanced Encryption Standard (AES) certified by the National Institute of Standards and Technology (NIST) in 2000 may not live as long as originally conceived, according to a cryptic disclosure last year. Schlumberger-Sema cryptographer Nicolas Courtois and Macquarie University's Josef Pieprzyk unveiled a flaw in AES security by formulating an attack strategy that the standard is not prepared to handle. Although this was not an indication that AES had been successfully subverted, Crypto-Gram editor Bruce Schneier warned that the possibility exists. At the core of AES is Rijndael, an algorithm that requires a hacker to sift through 2^128 keys to find the key needed to crack an AES-encrypted message. Courtois and Pieprzyk said the same job can be done by searching through 2^100 keys, and although that in itself is an outrageously arduous task--one that would take longer than the universe's current age to accomplish--it significantly reduces the longevity NIST promised. Courtois and Pieprzyk cited AES' substitution box (S-box) as the most vulnerable point, and argued that Rijndael's creators ended up making S-box computation predictable precisely because they were trying so hard to make it unpredictable. AES was supposed to be effective for a century, but Courtois and Pieprzyk's announcement may have reduced the standard's usefulness to a mere decade. "Cryptography needs to prevent not only present, but also any future attacks, even if they are unlikely to happen," Courtois explains. "That is why the current version of AES will probably be discarded."
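To put those exponents in perspective (the arithmetic below does not appear in the article):

    \[
    \frac{2^{128}}{2^{100}} \;=\; 2^{28} \;\approx\; 2.7\times10^{8},
    \qquad
    2^{100} \;\approx\; 1.3\times10^{30}.
    \]

In other words, the Courtois-Pieprzyk attack would be roughly 270 million times faster than exhaustive key search, yet still require on the order of 10^30 operations--far beyond any foreseeable computer, which is why the result threatens AES' projected lifetime rather than its present-day security.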
- "Supercomputers for the Masses?"
eWeek (06/09/03) Vol. 20, No. 23, P. 54; Taschek, John
High-performance computing clusters (HPCCs) are replacing monolithic water-cooled machines as the supercomputer of choice for technical computing tasks. By making use of off-the-shelf components and the Linux operating system, HPCCs cost less to build and maintain than traditional supercomputers. Stanford University's Bio-X cross-discipline research laboratory, for example, houses an HPCC made up of 300 Dell computers running Red Hat Linux and connected via Fast Ethernet. System architect Steve Jones says Gigabit Ethernet would boost the performance of the HPCC considerably, but that cost concerns prevent him from using that technology. Still, the Stanford cluster is expected to appear on the Top 500 Supercomputers list, which ranks the world's fastest systems twice a year; increasingly, HPCCs are populating the top echelons of the list. Dell clustering group director Reza Rooholamini says HPCCs are making their way into the commercial realm as well, though mostly for technical applications such as oil exploration, vehicle design modeling, and bioinformatics. Setting up databases to work with HPCCs is complex, though solutions are available from IBM and Oracle, and recompiling existing applications for use with HPCCs is nearly impossible. However, Microsoft's Greg Rankich expects the intersection of grid computing technology and HPCCs to have a significant impact on the commercial world, and estimates grid-enabled applications will arrive on the scene in three years.
http://www.eweek.com/article2/0,3959,1121367,00.asp
- "Getting In on the (Copyright) Act"
Electronic Business (06/01/03) Vol. 29, No. 8, P. 20; Barlas, Stephen
Rep. Rick Boucher's (D-Va.) Digital Media Consumers' Rights Act attempts to amend the controversial Digital Millennium Copyright Act (DMCA) in order to satisfy critics who complain that the law's provisions allow copyright owners to strangle innovation and trample over consumers' fair-use rights. "The broad range of impacts the DMCA has had were not foreseen when that law was passed," explains Consumer Electronics Association VP Michael Petricone. Under the DMCA's anticircumvention provision, copyright holders can sue peer-to-peer networks to flush out people who allegedly pirate digital content, users are forbidden from bypassing anti-copying technology on DVDs or CDs under any circumstances, and software developers can carry out reverse engineering only to make products compatible. An attorney for Skylink Technologies, which is being sued by garage door opener manufacturer The Chamberlain Group for reverse engineering a technological safeguard so it could supposedly access copyrighted software, states, "Companies are trying to apply the DMCA in contexts that were never intended." Boucher's amendment loosens the reverse engineering ban, permits users to circumvent copy controls for fair-use purposes, and requires CD makers to clearly label products with playability-limiting features. However, congressional support for Boucher's proposal is paltry. Brad Williams of Gateway thinks content owners such as the music and movie industries should devote more time to developing new products and services instead of pursuing legal action against alleged digital copyright infringers.
Click Here to View Full Article
To read more about ACM's reactions to the DMCA, visit http://www.acm.org/usacm
- "Self-Repairing Computers"
Scientific American (06/03) Vol. 288, No. 6, P. 54; Fox, Armando; Patterson, David
The growing complexity of computer systems adds up to their increased fragility and unreliability, which is why recovery-oriented computing (ROC) is so important, write Armando Fox of Stanford University and David Patterson of the University of California, Berkeley. Researchers from both schools are developing guidelines for building "ROC-solid" computing systems based on four criteria: Fast error recovery, improved tools for locating the source of bugs in multicomponent systems, the injection of test errors to gauge system behavior and aid systems operators in their training, and programming systems to support an "undo" function. Fox and Patterson also call for the development and circulation of benchmark software designed to test how fast a computing system recovers. Stanford graduate students George Candea and James Cutler are working on micro-rebooting, a technique whereby a computer's subcomponents can restart independently, obviating the need to shut down the entire system just to correct an error affecting only a few elements; a system-wide reboot can last up to 60 seconds, while micro-rebooting has a duration of less than one second, according to initial results. Because failures are often attributed to unpredictable interplay between system components, Berkeley's Mike Chen and Stanford's Emre Kiciman and Eugene Fratkin have developed PinPoint, a program that traces errors and uses data-mining to determine the elements most likely responsible. The Berkeley and Stanford researchers have proposed a software version of test circuits that inject artificial errors to gain insight on how well systems recover and how they can be bolstered. A trio of Berkeley grad students has devised a benchmark application that could help customers decide whether to purchase a system by testing its failure performance. Finally, an undo function that enables operators to delete unintended inputs is missing in large computer systems; as a demonstration, Patterson and Aaron Brown of Berkeley have created a prototype email system with an operator undo utility.
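The micro-rebooting idea lends itself to a compact illustration. The Python sketch below is hypothetical--the component names and health-check interface are invented and do not come from the Stanford/Berkeley code:

    # Hypothetical sketch of micro-rebooting: restart only the failed
    # subcomponent instead of rebooting the entire system.

    class Component:
        def __init__(self, name: str):
            self.name = name
            self.healthy = True

        def restart(self) -> None:
            # Restarting a single component takes well under a second,
            # versus tens of seconds for a whole-system reboot.
            print(f"micro-rebooting {self.name}")
            self.healthy = True

    def sweep(components: list) -> None:
        """One supervision pass: micro-reboot only the components that failed."""
        for comp in components:
            if not comp.healthy:
                comp.restart()

    # Example: only the session cache has failed, so only it is restarted.
    parts = [Component("web frontend"), Component("session cache"), Component("database proxy")]
    parts[1].healthy = False
    sweep(parts)  # prints: micro-rebooting session cache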
Click Here to View Full Article