
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 674:  Wednesday, July 28, 2004

  • "Researchers Aim for Plain-English Debugging"
    Associated Press (07/26/04); Crissey, Mike

    The National Science Foundation has invested $1.2 million in Carnegie Mellon University professor Brad Myers and grad student Andrew Ko's Whyline (Workspace that Helps You Link Instructions to Numbers and Events) project, a debugging program that enables users to ask questions about computer glitches in plain English. If a program appears to go wrong during testing, the user can hit a "Why" button, which halts the program, displays questions based on programmed events, and highlights the lines of code related to each question in a window. A second window displays what happened while the program was running, complete with a timeline and flow chart. Myers and Ko tested Whyline with a group of grad students whose programming experience ranged from neophyte to expert, and found that users could detect bugs eight times faster and get 40 percent more programming done with the tool. So far Whyline has only been employed to debug programs in Alice, an academic programming language that renders interactive 3D worlds using a limited command vocabulary. A program's complexity determines how easy or difficult it is for Whyline to present the correct questions and answers, and porting Whyline to a more complex language, such as Java, could limit its helpfulness even further. Whyline is a component of the national End Users Shaping Effective Software (EUSES) initiative, whose goal is to make computers more user-friendly by fundamentally changing their appearance and operation. Professor Andreas Zeller of Germany's Saarland University has developed AskIgor, a Web-based debugging program that attempts to explain the cause of program errors once programmers tell it when the program works and when it malfunctions.
    Click Here to View Full Article

    The September 2004 issue of Communications of the ACM will feature an article on the Whyline project written by Myers, Ko, and John Pane.
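
    As a rough illustration of the question-asking idea (not the actual Whyline, which operates on Alice programs), the Python sketch below records a trace of assignment events and answers a "why" question by pointing to the statement responsible for a value; the instrumented "program" and its line numbers are invented for the example:

        # Toy illustration (not the actual Whyline): record a trace of
        # assignment events, then answer "why" questions by pointing to
        # the most recent event that produced the value in question.
        from dataclasses import dataclass

        @dataclass
        class Event:
            line: int          # source line that executed (hypothetical)
            variable: str      # variable that was assigned
            value: object      # value it received

        trace = []

        def record(line, variable, value):
            trace.append(Event(line, variable, value))
            return value

        # A tiny "program" instrumented by hand for the sketch.
        x = record(1, "x", 10)
        y = record(2, "y", x * 0)      # bug: should have been x * 2
        z = record(3, "z", y + 5)

        def why(variable, value):
            """Answer 'Why does <variable> have <value>?' from the trace."""
            for event in reversed(trace):
                if event.variable == variable and event.value == value:
                    return f"{variable} == {value!r} because line {event.line} assigned it"
            return f"{variable} never held {value!r} in this run"

        print(why("z", 5))   # points at line 3; inspecting line 2 reveals the bug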

  • "Horizon Programs Introduce Girls to Career Possibilities in Technical Fields"
    Newswise (07/23/04)

    Clarkson University's Horizons programs encourage middle-school girls to pursue technology-oriented careers. The National Council for Research on Women reports that the number of women working in engineering and computing fields has stayed roughly the same for the last two decades, despite a rise in the number of women scientists. The Horizons camps offer girls a view into possible technical careers through hands-on workshops that introduce them to fields such as robotics and chemistry, as well as the opportunity to meet women role models. The girls are targeted at an age when they are defining their identity and forming conceptions about what is possible for their future education and career, says program director and school psychologist Bobbi Laird. Horizons sends invitations to participating schools throughout New York, each of which nominates two seventh-grade girls to attend the camp. The first-year program involves science, mathematics, and computer science courses and workshops where students can investigate career opportunities and receive leadership training; participants have the option of re-enrolling for a second-year camp, which involves more sophisticated hands-on projects such as building working robots and conducting environmental analysis. Laird says the program introduces the girls to women role models who are dynamic, critical-thinking leaders and who are interested in helping people. Horizons is part of Clarkson University's Pipeline Programs and Academic Success office, which offers a continuum of support for under-represented students in technology and science throughout their education.
    Click Here to View Full Article

  • "Broadband: Why Policies Must Change"
    CNet (07/27/04); Borland, John

    Both Democratic presidential candidate John Kerry and his Republican rival George W. Bush are citing the importance of universal broadband access in their campaigns, but the winner of the presidency will face an uphill battle in delivering on his promise. Federal communications policy is a tangled mess characterized by life-or-death struggles between phone companies and cable networks, while telecommunications rules established when broadband was still a question mark will probably be reopened on Capitol Hill next year, which means more lobbying efforts and political dissension. Recommendations that could help lead to a national broadband policy rest on three central pillars: Incentive-driven competition; authorization for municipalities to construct their own high-speed networks; and the revision of communication regulations instituted before the emergence of the modern Internet. Affordable loans, tax credits, and other forms of financial aid to firms building their own networks so they can compete with phone and cable monopolies are examples of incentives. Hundreds of U.S. municipalities have launched or are planning to launch their own broadband networks, mostly in underserved regions, while cable and telecommunications companies are trying to block them legally. This has prompted officials such as American Public Power Association general counsel Richard Geltman to call for congressional intervention to protect local broadband projects, even if state legislators balk. Communications analyst Kevin Werbach observes that "existing legal structures force regulators and other policy makers to make absolute category distinctions that don't work when you're talking about Internet-based applications," and his solution is an Internet-based "layered" regulatory system that applies similar policies to corresponding components of disparate networks. Fearing government intrusion, phone and cable companies prefer the selective removal of federal controls by the FCC until all industries are minimally regulated.
    Click Here to View Full Article

  • "Powerful New Computer Cluster Will Tackle Complex Problems in Physics and Computer Science"
    Currents--UC Santa Cruz (07/26/04); Stephens, Tim

    The University of California-Santa Cruz is home to a computer cluster of 36 dual-processor Apple Xserve G5 nodes from the Hierarchical Systems Research Foundation, co-founded by UCSC alumnus David Doshay, who is working closely with physics grad student John Donohue on a project that will employ the cluster for research on magnetic phase transitions and protein folding. Magnetic phase transitions are an example of the multiple-length-scale phenomena that interest Doshay, and modeling such phenomena requires a huge amount of computational power. Another project the cluster is being used for is UCSC computer science professor Charlie McDowell's work on developing a computer program that plays the ancient Chinese game of go. Doshay says the development of chess-playing programs has been far more successful than that of go-playing programs: Chess programs have reached the skill level of grand masters, while go programs can only play at casual amateur levels. The go-playing program developed on the cluster could complement the physics project. "The software infrastructure we're developing is intended to make it easy for someone from another field, like physics, to program the cluster to do what they need to do--it's all about ease of programming," explains McDowell. Doshay says he plans to boost the cluster's computing power by increasing the number of nodes. "Even 72 processors is limiting for these kinds of problems, and there is plenty of room in the racks for expansion," he remarks.
    Click Here to View Full Article

  • "High-Tech Employment Numbers Drop in Second Quarter"
    IDG News Service (07/27/04); Gross, Grant

    The continued offshoring of IT jobs by U.S. companies has fueled a decline in employment for domestic software engineers, programmers, hardware engineers, and computer scientists and systems analysts between the first and second quarters of 2004, according to Bureau of Labor Statistics (BLS) numbers analyzed by the IEEE-USA. There were 2.96 million computer-related jobs in the U.S. in the second quarter of 2004, whereas last year's average stood at 2.98 million. Overall, the number of computer-related U.S. professionals fell by about 9,000 between the first and second quarters of this year. High-tech employment trends reported by the IEEE-USA on July 26 include a drop in the number of employed U.S. software engineers from 856,000 to 725,000 between the first and second quarters; a decline in the number of U.S. computer scientists and systems analysts from 672,000 to 621,000 during the same period; a decrease in computer programmers from 591,000 to 575,000; and 3,000 fewer employed computer hardware engineers, making the second-quarter total 83,000. Conversely, employed U.S. electrical and electronics engineers enjoyed an increase from 327,000 to 351,000, although this is 12,000 fewer than the 2003 average. Unemployment rates from the BLS showed significant decreases in each of the measured IT fields between the first and second quarters, but Gary Steinberg of the BLS reported that those statistics and the IEEE-USA's statistics emphasize different factors. The IEEE-USA's Chris McManes noted that the plummeting unemployment numbers accompanied by decreasing employment are odd, and said, "We think a lot of that would be...people being discouraged and leaving the field."
    Click Here to View Full Article

  • "UC San Diego Launches University-Industry Research Alliance to Address Challenges to Future Shared Networked System Infrastructures"
    UCSD News (07/23/04); Ramsey, Doug

    The new Center for Networked Systems (CNS) at the University of California, San Diego (UCSD) joins academic and industry efforts in designing new converged network systems. Grids and pervasive computing have made networks far more central to today's computing paradigm than in the past, says CNS founding director Andrew Chien, the Science Applications International Corporation professor in Computer Science and Engineering at the Jacobs School. To best design large, networked systems, researchers from academia and industry need to work in close collaboration. CNS participants include Hewlett-Packard, Alcatel, Qualcomm, AT&T, the California Institute for Telecommunications and Information Technology, and the San Diego Supercomputer Center. These groups have committed roughly $9 million to the center, in addition to the more than $10 million in ongoing related research at UCSD. A multi-disciplinary and collaborative approach is especially necessary for designing distributed, heterogeneous network systems, says UCSD Jacobs School of Engineering Dean Frieder Seible. UCSD has a reputation for excellence in networking research, including grid technology, network measurement, and the monitoring of network threats such as worms and denial-of-service attacks. CNS will begin by forming a funding committee to select its first multi-year projects, which will likely include measurement projects enabling network system managers to model large-scale networks and assess security; new routing architectures and optical networking technologies will also be investigated. A total of 16 researchers from UCSD will work on CNS projects in conjunction with other academic researchers and experts from AT&T, Qualcomm, and other participant groups.
    Click Here to View Full Article

  • "Real-Life Science in the Lab of Tomorrow"
    IST Results (07/28/04)

    The purpose of the IST Program-funded Lab of Tomorrow is to improve learning by providing students with real-life examples of science theory that interest and motivate them; these examples are derived from data extracted by tiny programmable devices that can be incorporated into apparel, balls, and other objects. Two specific devices have been developed by the Lab of Tomorrow: A wearable "Sensvest" computer system that monitors heart rate, acceleration, and body temperature as the user moves, and a ball with embedded 3D sensors that can be used to analyze acceleration; these gadgets are wirelessly linked to a base station, from which students can collect data through a classroom computer interface. "For students and teachers it represents a major qualitative upgrade to physics teaching, something that is particularly important at a time when studies show interest in science is declining among students of high school age," notes project manager Sofoklis Sotiriou of Greece's Ellinogermaniki Agogi. "We believe the use of advanced technology keeps the motivation of students high because it connects real-life situations with science." Sotiriou says trials showed that students' learning capacity, interest, and motivation improved significantly with the use of the technology. The Lab of Tomorrow technology is also well suited to the rigorous curricula typical of most European educational systems, although Sotiriou is critical of the systems' relative rigidity and resistance to change, which complicates the deployment of technology in classrooms. He says the Lab of Tomorrow consortium intends to lobby Europe's education ministries for the incorporation of new technologies into learning, and also to investigate applications outside of education. Sotiriou believes the devices could be commercialized for use in health care as well as in professional and nonprofessional sports.
    Click Here to View Full Article
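
    The article does not publish the Lab of Tomorrow data formats, so the sample rate and accelerometer readings in the following Python sketch are assumptions; it only illustrates the kind of analysis students might perform on data from the sensor ball:

        # Hypothetical sketch: sample rate and (ax, ay, az) tuples below
        # are illustrative assumptions, not the project's actual format.
        import math

        SAMPLE_RATE_HZ = 50                      # assumed sensor sampling rate
        samples = [                              # (ax, ay, az) in m/s^2
            (0.0, 0.0, 9.8),
            (1.2, 0.3, 9.6),
            (2.5, 0.8, 9.1),
            (1.0, 0.2, 9.7),
        ]

        dt = 1.0 / SAMPLE_RATE_HZ

        # Magnitude of acceleration per sample, with gravity removed crudely.
        magnitudes = [math.sqrt(ax**2 + ay**2 + (az - 9.8)**2)
                      for ax, ay, az in samples]

        # Rough change in speed over the window, by numerical integration.
        delta_v = sum(m * dt for m in magnitudes)

        print("peak acceleration: %.2f m/s^2" % max(magnitudes))
        print("approx. speed gained: %.3f m/s" % delta_v)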

  • "It Should Be So Simple"
    MSNBC (07/25/04); Roberts, Timothy

    The move to simplify gadgets and software is finally gaining momentum after more than 10 years of industry deliberation, partly due to a growing consumer backlash against complexity. Examples of easy-to-use products with simple designs include the PalmPilot, the Google Web site, Apple Computer's iPod music player, an unadorned plug-in box containing integration software from Cast Iron Systems, boxed network security software from Fortinet, an upcoming accessory-free cell phone from Kyocera Wireless, and IBM's Autonomic Computing division, whose goal is to ease the installation, usage, and maintenance of systems by making them self-managing. "Computing ought to be as simple as talking on the telephone or turning on the faucet," notes Stanford University computer science professor Alex Aiken, who adds that computing has yet to reach that level of simplicity. The move toward ease-of-use is being driven by economics, according to IBM Fellow Curt Cotner. The cost of hardware and software has been decreasing, and companies are now focusing on how to lower costs in terms of people, since fewer people are needed to maintain simpler systems. This strategy is key to IBM's goal of becoming the vendor of choice for small- and medium-sized enterprises that cannot afford expansive IT workforces. Cast Iron Systems CEO Fred Meyer comments that software has been made so complex that it cannot support 90 percent of its desired applications, and he thinks software designers and computer manufacturers are simplifying their products because of user demand. A major challenge to maintaining simplicity is muzzling engineers' urge to bundle snazzy features into the newest gadgets: "The desire not to over-engineer a phone is just as difficult as it is to add new features," says Kyocera's John Chier.
    Click Here to View Full Article

  • "I, Pool Shark"
    Globe and Mail (CAN) (07/24/04) P. F9; McIlroy, Anne

    Queen's University roboticist Michael Greenspan has developed Deep Green, a computerized billiards player that Greenspan and his team want to make capable of besting the greatest human pool champions. Greenspan's goal was challenged by pool jockeys, who asserted that the game was too complex, and by robotics experts, who argued that the project was too simple. Deep Green consists of a mechanized arm supported by an overhead frame, with the arm's movements guided by a camera. The goal of Greenspan and his research team thus far has been to enable the robot to recognize individual billiard balls, so that it will not sink the eight ball, and to consistently sink balls from any table position. The machine can accurately sink the cue ball itself from anywhere on the table, but its accuracy will need to improve before Deep Green can use the cue ball to sink the remaining balls; the robot also makes weak breaks, a flaw Greenspan will need to correct either by making the mechanical arm more powerful or by programming it to move faster. Greenspan plans to pick the brains of experienced local billiards players so that he and his students can apply strategic thinking to Deep Green. Greenspan thinks Deep Green could become the pool equivalent of Deep Blue in five to 10 years, but the robot may also pave the way for machine vision breakthroughs such as better color interpretation. University of Alberta researcher Jonathan Schaeffer explains that his and Greenspan's emphasis on the entertainment applications of artificial intelligence can extend to more serious applications: "If you can't solve...problems in the simple domain of a game then you can't hope to solve them in the more complicated real world," he contends.
    Click Here to View Full Article
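
    The article does not describe Deep Green's algorithms; the Python sketch below shows only the standard "ghost ball" aiming geometry any billiards robot must solve, with made-up coordinates:

        # Standard "ghost ball" aiming geometry (an illustration only; the
        # article does not describe Deep Green's actual computations).
        import math

        BALL_RADIUS = 0.028575   # meters, regulation pool ball

        def aim_point(object_ball, pocket, radius=BALL_RADIUS):
            """Point the cue ball's center must reach at impact so the
            object ball travels toward the pocket."""
            ox, oy = object_ball
            px, py = pocket
            dx, dy = px - ox, py - oy
            dist = math.hypot(dx, dy)
            # Ghost-ball center sits two radii behind the object ball,
            # on the line from the pocket through the object ball.
            return (ox - 2 * radius * dx / dist, oy - 2 * radius * dy / dist)

        def cue_direction(cue_ball, object_ball, pocket):
            """Unit vector the cue ball should be struck along."""
            gx, gy = aim_point(object_ball, pocket)
            cx, cy = cue_ball
            dx, dy = gx - cx, gy - cy
            dist = math.hypot(dx, dy)
            return (dx / dist, dy / dist)

        print(cue_direction(cue_ball=(0.5, 0.5),
                            object_ball=(1.0, 1.0),
                            pocket=(1.27, 1.27)))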

  • "Online Popularity Tracked"
    Technology Research News (08/04/04); Patch, Kimberly

    Cornell University and Internet Archive researchers have extended the idea of a baseball player's batting average to measuring the popularity of items available for sale or download on the Internet, by computing the ratio of the users who download an item to the users who read the item's description. "The batting average addresses the more subtle notion of users' reactions to the item description as it appears in the fraction of users who go on to download the item," explains Cornell computer science professor Jon Kleinberg, whose work was sponsored by the National Science Foundation and the David and Lucile Packard Foundation. From the batting average's tendency to experience sudden upward or downward shifts, Kleinberg concluded that popularity is influenced by a high-traffic Web site's decision to link to or mention an item at a specific point in time, which attracts many users whose interests are often more varied than those of the users usually drawn to the item's description. The researchers used data from the Internet Archive to infer that abrupt changes in items' batting averages were consistent with real-world events that steered what was usually a new mix of users toward an item's description. "For each item, we can imagine keeping a running history of the on-site spotlighting and active external links that have affected the item over the previous years and months, together with a summary of the effect on the item's popularity," notes Kleinberg, who extends this notion to reviews of items. Sudden shifts in batting averages were tracked with a pattern-recognition algorithm based on Hidden Markov Models, which infer a sequence of hidden states from observed data and can anticipate future states. The researchers are developing models that can deduce a user's actions and intentions when visiting high-traffic sites such as Amazon or the Internet Archive.
    Click Here to View Full Article
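
    A minimal Python sketch of the batting-average metric described above, with invented daily counts; the simple threshold test stands in for the Hidden Markov Model machinery used in the actual research:

        # "Batting average" per day: the fraction of visitors who viewed an
        # item's description and went on to download it.  The threshold test
        # below is a crude stand-in for the researchers' HMM-based detector.
        daily_counts = [          # (description_views, downloads), one pair per day
            (200, 20), (180, 19), (210, 22),      # steady ~10% batting average
            (900, 450), (850, 400),               # external link drives new users
        ]

        def batting_average(views, downloads):
            return downloads / views if views else 0.0

        averages = [batting_average(v, d) for v, d in daily_counts]

        # Flag days where the batting average shifts sharply.
        for day in range(1, len(averages)):
            if abs(averages[day] - averages[day - 1]) > 0.2:
                print(f"day {day}: abrupt shift to {averages[day]:.2f} "
                      f"(was {averages[day - 1]:.2f})")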

  • "UK Researchers Shortlisted for 1m Award"
    ZDNet UK (07/26/04); Parsons, Michael

    The Malicious- and Accidental-Fault Tolerance for Internet Applications (MAFTIA) project, administered by Newcastle University's Dr. Robert Stroud in collaboration with research teams in Portugal, Switzerland, and France, was shortlisted for the fifth annual 1-million-euro EU Descartes Prize, which will be split equally between two recipients to be announced on Dec. 2. MAFTIA is designed to boost the security of large network infrastructures by supplying a new way to think about and construct networks that can close the gap between "dependability and security," as described on Newcastle University's Web site. "You have to avoid a single point of trust and therefore a single point of failure," explained Stroud. Among the protocols and implementations MAFTIA outlines to enable intrusion-tolerant systems are middleware protocols for secure group communication, a scheme for creating inclusive trusted third-party services, a framework for a large-scale distributed intrusion detection system, and the design and deployment of an intrusion-resilient distributed authorization service. The MAFTIA researchers attempted to build a formal middleware verification and evaluation tool using "a rigorous model for reactive cryptographic systems that allows for formal specification and verification of security properties under a standard cryptographic semantics." Stroud said IBM's Zurich Research Labs has devised a series of protocols to clone services online, a formidable challenge given the Internet's inherent instability. He noted that IBM assembled services such as a replicated certification authority, enabling trustworthy third-party services to be duplicated so that deployments can differ in the event of an attack.
    Click Here to View Full Article
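
    A minimal Python sketch of the "no single point of trust" idea Stroud describes, using majority voting across replicated services; MAFTIA's real protocols are cryptographic and far more involved, and the replica functions below are invented for illustration:

        # Query several independent replicas of a service and accept the
        # majority answer, so a minority of faulty or compromised replicas
        # cannot dictate the result on its own.
        from collections import Counter

        def honest_replica(request):
            return "certificate-OK"

        def compromised_replica(request):
            return "certificate-FORGED"

        replicas = [honest_replica, honest_replica, compromised_replica]

        def tolerant_query(request, replicas):
            """Return the answer given by a majority of replicas, or None."""
            answers = Counter(replica(request) for replica in replicas)
            answer, votes = answers.most_common(1)[0]
            return answer if votes > len(replicas) // 2 else None

        print(tolerant_query("verify cert 42", replicas))   # 'certificate-OK'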

  • "Quantum Computing, Secure Communication Closer to Reality"
    UCLA News (07/21/04)

    UCLA researchers report in the July 22 edition of Nature that they have successfully flipped a single electron spin in a conventional off-the-shelf transistor chip and detected the current changes caused by the electron's flipping, a significant breakthrough in the push to make practical quantum computing a reality. "We have gone from [manipulating] millions [of electron spins] to just one," boasts UCLA physics professor Hong Wen Jiang, who adds that the research proves that an ordinary transistor can be modified to suit quantum computing. UCLA professor and director of UCLA's Center for Nanoscience Innovation for Defense Eli Yablonovitch elaborates: "This means that conventional silicon technology is adaptable enough, and powerful enough, to accommodate the future electronic requirements of new technologies like quantum computing, which will depend on spin." Flipping a single electron was a tough challenge, but one that was dwarfed by the difficulty of detecting the flip itself. Jiang and grad student Ming Xiao operated the transistor at more than 400 degrees below zero Fahrenheit; Jiang and Yablonovitch envision flipping a single electron on a transistor operating at room temperature, which would be much more practical from a commercial perspective. The electron's spin was flipped by directing microwaves of a specific frequency onto the transistor. The UCLA team's work was underwritten by the U.S. Defense Advanced Research Projects Agency, the Defense MicroElectronics Activity, and the Center for Nanoscience Innovation for Defense. Yablonovitch notes that quantum computing's applications could include secure communications, eavesdropping, rapid code-breaking, and perhaps riskless voting. Research groups from IBM and the Netherlands have also reported the detection of a single electron spin, although the techniques they used differed. Measuring the entanglement of two electron spins is the next challenge.
    Click Here to View Full Article

  • "In Most of Europe, Electronic Voting Loses Out to Paper Ballots"
    Dow Jones Newswires (07/26/04); Miller, John

    European citizens and governments generally prefer traditional paper-based voting because of unresolved reliability and security issues surrounding electronic voting. The EU Commission will not endorse e-voting, while major e-voting deployments in Ireland and the United Kingdom were terminated or heavily criticized because of security or technical lapses. EU Commissioner Erkki Liikanen doubts that a total pan-European adoption of e-voting will ever come about because of concerns with "technical" problems and "social acceptability." E-voting critics point out that a paper ballot offers an audit trail that e-voting lacks, and cite e-voting's vulnerability to hackers and computer bugs as additional shortcomings. Fueling the arguments of paper ballot supporters are incidents such as a 2003 Belgian election in which almost 4,100 extra votes for Maria Vindevoghel's Communist Party were recorded in a precinct of Brussels due to a malfunction triggered by a cosmic ray. But glitches such as this have not dissuaded e-voting's proponents, as Belgian citizens are generally satisfied with e-voting, while the Dutch have been voting electronically for over three decades. "It doesn't bother anybody because we're used to it," explains Judich Sleider of the Dutch Ministry of the Interior. The government of Ireland planned to adopt the Dutch e-voting system by purchasing 7,000 machines from Nedap in a move the Irish Prime Minister touted as symbolic of the country's modernization, but the project was scrapped before the June election when an independent commission found that the devices had not undergone rigorous testing for bugs or hacker attacks. Nevertheless, the Irish government has requested a follow-up assessment by the commission, and plans to use e-voting for the 2006 general elections.

    For information on ACM's e-voting activities, visit http://www.acm.org/usacm.

  • "Finding a Way to Make the Net Truly Global"
    Star (Malaysia) (07/27/04); Moreira, Charles F.

    The first-ever ICANN meeting on Internationalized Domain Names (IDNs) was held in Kuala Lumpur, Malaysia, where participants debated how best to incorporate non-Latin scripts into the Internet infrastructure. ICANN Chairman Vint Cerf and Internet Engineering Task Force representative John Klensin argued for a moderated approach that preserves interoperability between different scripts. If the underlying issues are not given paramount importance, Internet architecture might become fragmented so that language groups could communicate internally, but not with other groups. Cerf advocated using Unicode applications with XML and HTML presentation formats while preserving the 7-bit American Standard Code for Information Interchange (ASCII) that is currently used pervasively in Internet architecture. Other delegates to the ICANN meeting, however, said local groups should play a more important role in the development and deployment of scripts in the Internet infrastructure: Open Forum of Cambodia advisor Norbert Klein noted that Cambodian authorities were not consulted when their Khmer script code table was developed by the Unicode Consortium, and that the script left out a number of necessary characters and included some wrong ones; Cambodia thus had to arrange workarounds with Microsoft to include application support for Khmer script. Multilingual Internet Names Consortium Chairman Khaled Fattal said no language should be dominant on the Internet, and that eventually Internet technologies and standards should allow languages to be automatically translated for users. The consortium advocates whole addresses written in uniform script that can be read from left to right as well as from right to left. Cerf and Klensin said that type of capability would require significant changes to core Internet infrastructure.
    Click Here to View Full Article
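
    The article names no specific encoding, but IDNA/Punycode is the standard mechanism that carries Unicode domain labels over the ASCII-only DNS core Cerf argued should be preserved; the short Python sketch below uses the language's built-in "idna" codec (IDNA 2003 rules) to illustrate the round trip with a made-up domain name:

        # Unicode domain label -> ASCII-compatible form and back again.
        unicode_name = "bücher.example"

        ascii_name = unicode_name.encode("idna")   # what the DNS actually carries
        print(ascii_name)                          # b'xn--bcher-kva.example'

        restored = ascii_name.decode("idna")       # what the user sees again
        print(restored)                            # 'bücher.example'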

  • "Search Engine Experts Look Forward to Completely Digital Lives and Backwards to Washington's Letters"
    Innovations Report (07/26/04); Branton, Lorna

    Researchers gathered at the University of Sheffield in England to commemorate 10 years of search engine technology and present new innovations that will shape how people store and search for information. University of Sheffield computer scientist Mark Sanderson claimed the first Web search engine, called Jumpstation, was created at the University of Stirling and released in 1994. The conference keynote speaker was Microsoft scientist Gordon Bell, who presented his MyLifeBits concept for storing every piece of information in a person's life, including emails, videos, images, songs, telephone calls, and documents: "Within five years, PCs will be large enough to store everything we read, write, hear, and see," he said, and searching this trove of data will require new methods, such as associating items with a life event that occurred at about the same time. MyLifeBits was focused on researching those possibilities, Bell said. University of Massachusetts researchers presented innovative search technology that enables people to easily find information in handwritten documents, even historical documents with ornate handwriting. The team used about 1,000 pages of George Washington's letters for their project and achieved a lower error rate than with normal handwriting recognition technology. The University of Massachusetts technique relies on transcriptions of a sample of the handwriting so that the computer can learn the style; in the case of George Washington's letters, the team fed the computer 100 transcribed pages. Eventually, the search tool will allow people to find information in common handwritten documents, such as letters, notebook entries, and lecture notes, said researcher Raghavan Manmatha.
    Click Here to View Full Article

  • "Life Has Gotten Even Shorter in Digital Age"
    USA Today (07/27/04) P. 1B; Baig, Edward C.

    Planned product obsolescence is a necessity to manufacturers whose profitability depends on regular turnover, while professional and consumer archivists are struggling with the dilemma of saving data produced by out-of-date machines. A 2002 survey for the Consumer Electronics Association concluded that consumers expect an average lifespan of more than 11 years for color TVs and over nine years for home stereos, while cell phones and personal digital assistants are expected to last less than five years. Usually a product's lifespan is shortened by the emergence of more advanced products rather than mechanical failures, although Donald Norman, author of "Emotional Design: Why We Love (or Hate) Everyday Things," reports that computer users are more willing to continue to use obsolete machines because they still have practical value. A more difficult question concerns the preservation of digital data, which can be lost if it is stored in a format that is no longer supported or read by current technology. Rand senior computer scientist Jeff Rothenberg believes the solution to format obsolescence is to design future systems that can emulate outdated machines, as migrating from one format to another will inevitably result in lost or corrupted data. When it comes to backing up digital data, many consumers are lazy: A recent consumer poll by InfoTrends/CAP Ventures found that only 24 percent of respondents upload digital pictures to an online photo service, and InfoTrends founder Kristy Holch says that consumers are unaware that retrieving digital photos will become harder in the next several decades. Various digital archival technologies--CDs, DVDs, advanced printers, etc.--have undergone accelerated aging tests, but their longevity, though impressive, cannot account for all variables. Parties on both sides of the issue support the adoption of impartial testing standards.
    Click Here to View Full Article

  • "Producing the Future of IT at MIT"
    Computerworld (07/26/04) Vol. 32, No. 30, P. 26; Anthes, Gary H.

    MIT VP for information services and technology Jerrold M. Grochow says MIT has left an important legacy to the computer world, and lists Unix ancestor Multics, Project Athena, and TCP/IP as significant technologies that the university developed or contributed to that have had a lasting impact. Multics principles, for example, are embedded in every major contemporary operating system, while Project Athena developed the Kerberos security protocol currently employed by Sun Microsystems, Apple Computer, and Microsoft. Grochow says MIT's physicists are experimenting with transmitting half a terabyte of data between MIT and CERN in Geneva, and pushing for close-to-real-time transmission. Grochow observes that MIT is investigating the feasibility of changeable computing environments enabled by radio-frequency ID and other embedded, wearable technologies that allow the network to adjust in response to a person's presence. MIT has implemented two-way security measures that look for both external and internal threats, and Grochow thinks major corporations are learning the wisdom of adopting such a strategy. The university has embraced safeguards that are less firewall-based and more application- and role-based: An example is Kerberos security, in which everyone on campus carries a security ID that is consistently recognized and interrogated by each application, while people's roles are distinct for each application. Grochow argues that it is impossible to anticipate what gadgets will become computing devices and what types of services they will want, and notes that security and technical issues will complicate the situation. "We have to look pretty expansively in the future at the kinds of computing devices people are going to have and carry around and embed in their clothes," he suggests.
    Click Here to View Full Article
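
    A toy Python sketch of the "one campus-wide identity, per-application roles" pattern Grochow describes; the principals, applications, and roles below are invented for illustration and are not MIT's actual configuration:

        # One shared identity (a Kerberos-style principal), with each
        # application consulting its own role table for that identity.
        ROLES = {
            # (principal, application) -> set of roles
            ("jdoe@ATHENA.MIT.EDU", "grades"):   {"student"},
            ("jdoe@ATHENA.MIT.EDU", "payroll"):  set(),          # no access
            ("asmith@ATHENA.MIT.EDU", "grades"): {"instructor"},
        }

        def authorize(principal, application, required_role):
            """Every application checks the same identity but its own roles."""
            return required_role in ROLES.get((principal, application), set())

        print(authorize("jdoe@ATHENA.MIT.EDU", "grades", "student"))    # True
        print(authorize("jdoe@ATHENA.MIT.EDU", "payroll", "student"))   # False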

  • "Faster, Cheaper, Better"
    Economist (07/24/04) Vol. 372, No. 8385, P. 72

    The supercomputer field is experiencing a renaissance after being eclipsed by the emergence of the Internet in the late 1990s: Unlike in the 1980s, when the most talented people in computer science worked to custom-build the fastest computers, many of today's supercomputers have been put together using off-the-shelf processing and networking components; these systems are cheaper than those built from scratch, and make use of commercial products such as the chips used in Sony PlayStation2 game consoles or Apple G5 processors. Notably, IBM has begun selling supercomputer-type machines to corporate buyers, and the Council on Competitiveness recently organized a supercomputer conference that drew companies as diverse as film studios and packaged-goods maker Procter & Gamble. Supercomputer development has traditionally been driven by scientific applications, specifically the modeling of climate change and nuclear explosions, and the current top three computers in the world are used for these purposes. Another area of interest for supercomputer developers is the modeling of how proteins fold, which last year yielded an important insight into the ability of a progesterone molecule to bind two proteins instead of just one. The commoditization of supercomputing, however, has outpaced basic supercomputing research, according to a report from the president's science advisor; in response, Congress has raised funding for basic supercomputer research that will look into more customized systems. International Supercomputer Conference speaker Steve Wallach said during last month's meeting that the software side of supercomputing needs to be reworked: Hardware is incredibly powerful, but it is inhibited by compilers that often leave a portion of commercially available systems idle when conducting real-life computing tasks.
    Click Here to View Full Article

  • "While Rome Burns?"
    Software Development (07/04) Vol. 12, No. 7, P. 48; Wayne, Rick; Morales, Alexandra Weber; Lum, Rosalyn

    "Code Complete" author Steve McConnell told SD West 2004 attendees that good software relies chiefly on personal discipline rather than technology, and expressed hope that premature optimization will be eliminated by faster hardware; he urged developers to concentrate on code clarity and understandability, and harbored little concern for offshoring. IBM Rational's Grady Booch delivered a keynote speech detailing software's evolution into a ubiquitous element of virtually everything, and stated that companies' competitiveness is directly related to how well their software teams innovate. He warned that software developers must anticipate future obstacles such as the limits of Moore's Law, and consider complexity, process, teams, and tools in order to wring better performance out of platforms and applications. A panel of database experts aired their views on a number of emerging technologies: Radio frequency ID tags were seen by some as overhyped bar codes, and by others as a potentially major disruptive technology. The experts concurred that open source offerings' market presence is increasingly apparent, while one expert said the economic value of grid computing has led to a tipping point for the cost of writing additional layers for enterprise applications. Consultant James Hobart told his "Designing Mobile Applications" class that mobile applications developers should concentrate on usability rather than technology, and predicted that the time is soon coming when mobile devices will adapt to users' behavior and expectations. Inastrol President Terry Moriarty held a tutorial in which she opined that the schism between data and object modeling is widening, but she hopes to start a company that can bridge the gap between the opposing camps by translating business rules into developer jargon via standards that move from language to model to code. Finally, Internet Access Methods Consulting President Gerry Seidman detailed JXTA peer-to-peer technology to his class, explaining that JXTA essentially enables secure ad hoc networks to self-discover, authenticate, and operate with little, if any, centralized management.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)


 