ACM TechNews is sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, which powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 773: Friday, April 1, 2005

  • "Father of Palm Handhelds Focuses on Making Computers Even Brainier"
    USA Today (03/30/05) P. 3B; Maney, Kevin

    Palm handheld creator and Redwood Neuroscience Institute founder Jeff Hawkins recently unveiled a for-profit company, Numenta, that will concentrate on expanding computer intelligence through the application of his brain research. Hawkins theorizes that the human brain's ability to recognize patterns and make predictions stems from the storage and use of memories to build models of the world. This idea can be translated into computer software through algorithms developed by Redwood Institute mathematician Dileep George, who founded Numenta with Hawkins and Donna Dubinsky; the software will be Numenta's core technology. Hawkins expects his theories to be practically applied to computers within five years, giving them the ability to search digital photos and model weather systems more accurately through pattern recognition, for instance. The technology could allow scores of far-flung weather sensors to continuously relay data to a computer that would assemble each piece of input into a cohesive whole, in a manner similar to the optic portion of the brain's neocortex. This would enable the system to visualize global weather in real time and predict future patterns. Oil exploration could also benefit from such innovations, as the computer might be able to predict the location of oil deposits using geologic data. Dubinsky, who will serve as Numenta's CEO, hopes to see chips that can be incorporated into computers and other products to give them brainlike qualities.
    Click Here to View Full Article
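
    The memory-and-prediction idea can be made concrete with a toy sketch. The code below is purely illustrative and is not Numenta's (unpublished) algorithms: it stores short patterns it has seen and uses those stored memories to predict the most likely next element of a sequence.

        # Toy illustration of prediction from stored sequence memories; all names
        # and the example data are invented for illustration.
        from collections import defaultdict, Counter

        class SequenceMemory:
            def __init__(self, context_len=2):
                self.context_len = context_len
                self.memory = defaultdict(Counter)  # context tuple -> counts of what followed

            def learn(self, sequence):
                for i in range(len(sequence) - self.context_len):
                    context = tuple(sequence[i:i + self.context_len])
                    self.memory[context][sequence[i + self.context_len]] += 1

            def predict(self, recent):
                counts = self.memory.get(tuple(recent[-self.context_len:]))
                return counts.most_common(1)[0][0] if counts else None

        mem = SequenceMemory()
        mem.learn("the cat sat on the mat and the dog sat on the mat and the bird sat on the rug".split())
        print(mem.predict("on the".split()))  # prints 'mat', the most frequent continuation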

  • "A Miss Hit"
    Guardian Unlimited (UK) (03/24/05); Schofield, Jack

    Wired magazine editor-in-chief Chris Anderson wrote an article last year titled "The Long Tail" arguing that Internet firms such as Amazon.com, eBay, and Google make much of their money from obscure information and products rather than just popular items; the meme gathered currency among bloggers, who applied the "long tail" concept to other fields and industries. Sales of entertainment media, the popularity of Web sites, and the frequency of words in language all follow a basic pattern: a peak in sales or usage near the front, with popularity falling away as the product or information ages. Bookstores carry only publications that still sell well and cinemas show only about a dozen movies at a time, but the Internet allows companies to make money from infrequent sales of less popular items. This less-popular stage of a product's lifecycle is called the long tail, and the concept is tightly linked to digital technology because digital products are not limited in the ways a cineplex or a Blockbuster movie rental outlet is, for example. When an old movie is made available for download online, it is not guaranteed wide popularity, but it is likely to generate some interest simply because of its greatly expanded potential market. Google is a long tail business because its AdWords model serves targeted advertisements to many small audiences, whereas Web sites once sold comparatively few, expensive banner ads. Interestingly, the long tail concept somewhat contradicts the Creative Commons movement because it makes old content valuable, whereas Creative Commons is premised on the idea that old content is not worth commercial distribution. The long tail idea also runs counter to current entertainment industry efforts to keep the prices of online digital products in line with physical retail products, since long tail thinking holds that less popular content will make money if sold online at low cost.
    Click Here to View Full Article
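
    The economics above lend themselves to a quick back-of-the-envelope check. The sketch below assumes, purely for illustration, that item popularity follows a Zipf-like curve (the item at rank r sells in proportion to 1/r) and compares the sales share of the "head" a physical store could stock with the tail an online catalogue can carry; the catalogue size and shelf limit are invented numbers.

        # Long-tail illustration under an assumed Zipf-like popularity curve.
        CATALOGUE_SIZE = 1_000_000   # items an online retailer can list (assumption)
        SHELF_LIMIT = 40_000         # items a large physical store might stock (assumption)

        sales = [1.0 / rank for rank in range(1, CATALOGUE_SIZE + 1)]
        total = sum(sales)
        head = sum(sales[:SHELF_LIMIT])

        print(f"head share: {head / total:.1%}")            # the top sellers a store can shelve
        print(f"tail share: {(total - head) / total:.1%}")  # everything else

    Under those assumptions the tail accounts for roughly a fifth of total sales, revenue a shelf-limited retailer cannot capture but an online catalogue can.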

  • "Cybersecurity Regs Would Be Tricky"
    Federal Computer Week (03/25/05); Olsen, Florence

    Although the federal government believes establishing a broad regulatory framework for cybersecurity would help prevent cyberterrorism, a 57-page report from the Congressional Research Service suggests the Internet is too vast to control, especially since cooperation and coordination among Internet users worldwide is most likely impossible. Further challenges for such guidelines include differing opinions on the best forms of cybersecurity and the rapid pace of technological change, which often outstrips efforts to implement regulatory standards. The framework, which is still a priority for the House Homeland Security Committee's Economic Security, Infrastructure Protection, and Cybersecurity Subcommittee, would require standards and certifications, increased training and education, frequent internal and external audits, mandatory reporting of security breaches, development of cybersecurity insurance within the insurance industry, and the incorporation of cybersecurity into public- and private-sector architecture plans. According to the report, the most direct means of gaining more control over cybersecurity would be for Congress to grant the Homeland Security Department or a similar agency regulatory authority over the cyberspace industry.
    Click Here to View Full Article

  • "Kevin Mitnick and the Art of Intrusion--Part 1"
    VNUNet (03/22/05); Phillips, Tim

    Hacker-turned-security-consultant Kevin Mitnick, who has compiled stories of exploits in his book "The Art of Intrusion" as a guide to hackers' goals and attack strategies, notes that the companies he offers his services to are too preoccupied with regulatory compliance and making money to fortify their network security. He reports that many of his clients have inadequate security and commit blunders such as allowing users to write passwords down when they change them and to store passwords in email. Mitnick says many people assume that the defenses they set up--firewalls, intrusion detection systems, and so on--can manage themselves, and thus fail to configure them properly. He laments that some clients did not heed his advice and implement security appropriately, as he found when he re-examined their defenses during follow-up consultations. Mitnick recommends that system administrators consider wireless security if wireless networks have been deployed, and be cognizant of physical security. He says protection should not be restricted to the network perimeter, and argues that firewalls and intrusion detection systems should be employed for both detection and defense. The ex-hacker explains that intrusion strategies have not really changed, noting that there are few major differences between war dialing and war driving, for instance.
    Click Here to View Full Article

  • "On the Trail of the Zombie PCs"
    New Scientist (03/26/05); Biever, Celeste

    Programmers participating in the German Honeynet Project are detailing their attempts to track down and monitor malicious "bots" used to turn vulnerable PCs into "zombie" computers that hackers use to coordinate exploits ranging from identity theft to spamming to disruption of online businesses. The German computer experts last week published their first paper describing their efforts to counter the menace using an army of bogus zombie computers to infiltrate chat rooms and networks inhabited by zombie machines and their controllers. A fake zombie is generated by running Mwcollect2 code that mimics vulnerable operating systems on a Linux computer, forming a honeypot that attracts and contains malicious bot programs. The Honeynet team can also observe a bot's operations with Mwcollect2: For instance, when a bot hosted by a fake zombie attempts to enter a chat room, the Linux computer records the bot's passwords and logins, its chat room nicknames, and the IP address of the chat room controller's server. This data is used in conjunction with the Drone software package to build convincing spy bots that log in and out of chat rooms and download any new malicious programs the hacker orders his bots to retrieve. However, hackers can be tipped off to the spy bots' presence because the spy bots are programmed to be non-malevolent. German Honeynet Project founder Thorsten Holz has thus far monitored the activities of more than 160 bots, some of which individually control 50,000 infected systems; he estimates that the global zombie population totals about 600 million. Holz and Symantec note that zombie controllers are becoming increasingly crafty, producing zombie networks that try to keep a low profile by using bots that run from multiple servers and that chat less frequently.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)
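
    Mwcollect2 itself is considerably more sophisticated, but the general honeypot pattern the researchers describe can be sketched simply: listen on a port that bots commonly probe, accept whatever connects, and record the connection metadata and first bytes of traffic without ever executing the payload. The port number and log format below are illustrative assumptions.

        # Minimal honeypot-style listener (illustrative; not Mwcollect2).
        import socket
        from datetime import datetime, timezone

        LISTEN_PORT = 13500   # stand-in; real bots probe low Windows RPC/SMB ports

        def run_listener():
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("0.0.0.0", LISTEN_PORT))
            srv.listen(5)
            while True:
                conn, (ip, port) = srv.accept()
                conn.settimeout(5.0)
                try:
                    first_bytes = conn.recv(512)   # capture, never execute
                except socket.timeout:
                    first_bytes = b""
                finally:
                    conn.close()
                stamp = datetime.now(timezone.utc).isoformat()
                print(f"{stamp} {ip}:{port} sent {len(first_bytes)} bytes: {first_bytes[:32]!r}")

        if __name__ == "__main__":
            run_listener()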

  • "One-on-One"
    Government Security News (03/05) Vol. 3, No. 6, P. 12; Goodwin, Jacob

    Department of Homeland Security (DHS) top cybersecurity official Andy Purdy says the most worrisome threat to national cybersecurity is the growing black market in hijacked computer networks and tools used to spread spyware or launch denial-of-service attacks. These networks of hijacked computers are currently being used by criminals with financial motives, but could potentially be harnessed by terrorists or hostile states, warns Purdy. Alarmingly, a friendly nation alerted the DHS to a botnet with more than 1 million zombie nodes. With such resources, cyberterrorists could corrupt data and cause cascading failures in critical infrastructure and economic systems. The reliability of data is especially important for the economy, says Purdy. Other failures, such as the Northeast power blackout of 2003 and the Comair computer crash that stalled domestic air travel over the holidays, give some indication of the effects a sophisticated cyberattack could have. Purdy notes that many of the most financially devastating attacks have not been solved, and that potential cyberterrorists could act with similar stealth. The National Strategy to Secure Cyberspace changed the government's fundamental stance from apprehension to preparedness in the face of unknown but possible threats. Purdy is acting director of the DHS National Cyber Security Division and says the group has consistently received adequate support from DHS leadership, despite the resignation of previous director Amit Yoran last September. Given that DHS leadership has rejected none of the recommendations or issues the division has raised, Purdy questions the need for a new, higher-level office.
    Click Here to View Full Article

  • "Retirement Having Little Effect on IT Skills, Survey Finds"
    Government Computer News (03/25/05); Miller, Jason

    The federal IT workforce is aging, but its IT skills remain strong, according to a new study by the CIO Council and the Office of Personnel Management. The CIO Council and OPM surveyed 22,104 IT workers in 12 General Schedule or Foreign Service job categories that involve IT or have key technology components, and found that 42 percent were between the ages of 46 and 55. The second IT Workforce Capability Assessment Survey reveals that 12 percent intend to retire within three years, although 19 percent are eligible to retire. In the 2004 survey, requirements analysis and systems life cycle management replaced computer languages and knowledge management among the top 10 technical competencies of the federal IT workforce. Though the IT workers are most skilled in desktop applications, the Windows operating system, and document management, they have the least expertise in the Unified Modeling Language, Apple Macintosh operating systems, and biometrics. Certification is widespread in network support, project management, and operating systems among the IT workforce, which is largely involved with customer support, project management, applications software, IT security, systems analysis, and data management.
    Click Here to View Full Article

  • "Study Seeks Hi-Tech Mountaineers"
    BBC News (03/23/05)

    Researchers at the University of Aberdeen and the Macaulay Institute are recruiting mountain climbers and hillwalkers to participate in a year-long study of how well geovisualization software conveys terrain. Geovisualization can be used to create, inspect, and manipulate 3-D representations of mountainous terrain on the screens of PCs, laptops, and other portable computers. The researchers plan to compare the technology with conventional Ordnance Survey maps as a way of representing terrain. Conventional maps rely on a walker to interpret contour lines and estimate the height of routes, whereas the same section of terrain can be viewed directly as an animated computer model with the software. "We plan to gather evidence that will allow a better evaluation of the potential impact of such Geovisualization methods for mountain environments in Scotland," says Aberdeen's David Pearson, who is leading the study. "Computer-based Geovisualization is a very effective tool for representing, exploring, and analyzing data of such complexity."
    Click Here to View Full Article
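
    To illustrate the difference between reading contour lines and viewing terrain directly, the sketch below renders a small synthetic elevation grid as a rotatable 3-D surface. The terrain data is invented, and the study's actual software is not identified in the article.

        # Illustrative 3-D terrain view of a made-up elevation grid; real use would
        # load digital elevation model data for the area of interest.
        import numpy as np
        import matplotlib.pyplot as plt

        x = np.linspace(-3, 3, 120)
        y = np.linspace(-3, 3, 120)
        X, Y = np.meshgrid(x, y)
        Z = 800 * np.exp(-(X**2 + Y**2) / 2) + 150 * np.exp(-((X - 1.5)**2 + (Y + 1)**2))

        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")
        ax.plot_surface(X, Y, Z, cmap="terrain", linewidth=0)
        ax.set_xlabel("east (km)")
        ax.set_ylabel("north (km)")
        ax.set_zlabel("elevation (m)")
        plt.show()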

  • "Gadgets Rule on College Campuses"
    USA Today (03/29/05) P. 1B; Davidson, Paul

    U.S. college campuses have been invaded by an army of laptops, cell phones, and other wireless technologies students use to maintain constant connections between their classwork, their peers, and their professors. Critics such as college consultant Warren Arbogast warn that the push to digitize campuses with information technology is raising the cost of tuition and college budgets while offering no proof that such investments improve students' productivity or marketability. Others caution that the technologies can be a distraction from studying as well as an easy path for cheating and plagiarism. Infotech Strategies Chairman Ken Kay sees a need for anytime-anywhere access and high-bandwidth information on campus, arguing that it helps cultivate "problem-solvers and analytical thinkers" that the country needs in order to sustain its global competitiveness. Some professors are concerned that the connectivity provided by IT is eroding the importance of face-to-face interaction to students. Penn State University education professor Donald Heller is worried that students who rely heavily on cell phones and instant-messaging "won't develop the face-to-face communication skills needed to be successful." Some students are also feeling overwhelmed by technology: University of North Carolina-Chapel Hill sophomore Harmony Davies remarks that her teachers seem to be stressing the importance of email to such a degree that she feels she is lagging if she fails to check it every day.
    Click Here to View Full Article

  • "NFC: Nearly There"
    Wireless Week (03/15/05) Vol. 11, No. 7, P. 48; Brown, Karen

    Philips and Nokia continue to make progress on near-field communications (NFC), but advocates of the technology must narrow the scope of its potential uses, convince manufacturers to include it in their electronic devices, and find big customers, as RFID and contactless payment have in Wal-Mart and McDonald's, respectively. NFC is a wireless personal area network scheme that would allow people to wave a handset to perform simple data transfers, including transactions. The technology is based on the ISO 18092 standard and connects devices within a range of about 10 centimeters at throughputs of up to 424 Kbps. NFC has an advantage over Bluetooth in that its tags are activated to set up connections only when they come within range of other NFC-enabled devices, which means handset batteries would not be easily drained. Philips has two chips in the test phase and envisions the technology in a range of devices rather than a single segment, while Nokia expects to have the chip in a product, a shell for its 3220 handset, later in the year. Nokia says an NFC-enabled mobile phone would allow a person entering a train station to automatically download train schedules, wave the device at a contactless reader to pay for tickets, and touch the phone to another NFC-enabled device to share information with another traveler, such as the URL for a Web site. NFC makes a strong case as an emerging wireless technology because it would make wireless transactions straightforward.
    Click Here to View Full Article
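
    The "touch a reader, get a URL" exchange described above is typically carried as an NDEF record over the short-range link. The sketch below builds a minimal single-record NDEF message for a URI; it shows only the record layout, not the ISO 18092 radio protocol, and the URL is a placeholder.

        # Build a minimal NDEF well-known "U" (URI) record, the payload an NFC touch
        # could hand to a phone.  The URL is a placeholder.
        def ndef_uri_record(url: str) -> bytes:
            # URI identifier codes from the NFC Forum URI record type definition.
            prefixes = {"http://www.": 0x01, "https://www.": 0x02, "http://": 0x03, "https://": 0x04}
            code, rest = 0x00, url
            for prefix, value in prefixes.items():
                if url.startswith(prefix):
                    code, rest = value, url[len(prefix):]
                    break
            payload = bytes([code]) + rest.encode("utf-8")
            header = 0xD1   # MB=1, ME=1, SR=1 (short record), TNF=0x01 (well-known type)
            return bytes([header, 0x01, len(payload)]) + b"U" + payload

        print(ndef_uri_record("http://www.example.com/timetable").hex())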

  • "The Brain Behind the Big, Bad Burger and Other Tales of Business Intelligence"
    CIO (03/15/05) Vol. 18, No. 11, P. 48; Levinson, Meridith

    The restaurant industry is the exception to the rule when it comes to effectively using business intelligence (BI) software to extract useful insights that can be exploited to boost the bottom line. CKE Restaurants, parent company of the Hardee's fast food chain, employed a BI system to weigh numerous factors in deciding whether rolling out the Monster Thickburger nationally would yield a solid return on investment. This move is typical of the strategic decisions major restaurant chains have been making with the help of BI software over the past decade, and the success they owe to BI stems from the software's central role in their day-to-day business operations. Restaurant chains are better equipped to use BI effectively than other industries because heavy competition has made their culture adaptable to rapid change; their BI efforts are closely calibrated to their business tactics; the insights generated by their BI systems can help improve both operations and profit margins; and they have overcome key obstacles to BI success. During the planning stages of a BI implementation, executive decision-making should be thoroughly analyzed so that the information executives require to make rapid and accurate decisions is understood. Data quality is also important, and Ruby Tuesday CIO Nick Ibrahim recommends that companies devise plans that outline how the data will be used once it is acquired, as well as practices for avoiding redundant data and techniques for presenting data in a sensible manner. Performance metrics that are most appropriate to the business must be developed, and the context that shapes those metrics must also be provided. CIOs are advised to be particularly sensitive to users' feelings and concerns so that users are more accepting of BI.
    Click Here to View Full Article
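
    The article does not describe the chains' actual models, but the kind of calculation a BI system feeds in a decision like the Thickburger rollout can be sketched: project incremental margin from test-market data and compare it with the cost of going national. Every figure below is an invented placeholder.

        # Back-of-the-envelope rollout check with invented placeholder figures; a BI
        # system would supply the real test-market numbers.
        test_stores = 90
        test_weekly_units = 45_000      # new-item sales across the test stores
        cannibalized_units = 12_000     # sales lost from existing menu items
        unit_margin_new = 1.10          # contribution margin per new item ($)
        unit_margin_old = 0.95          # margin on each cannibalized item ($)
        total_stores = 3_000
        weekly_rollout_cost = 400_000   # marketing, training, supply chain ($/week)

        weekly_margin_per_store = (test_weekly_units * unit_margin_new
                                   - cannibalized_units * unit_margin_old) / test_stores
        projected_weekly_gain = weekly_margin_per_store * total_stores - weekly_rollout_cost
        print(f"projected incremental margin: ${projected_weekly_gain:,.0f} per week")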

  • "The Changing Landscape of Networks"
    Siliconindia (03/05) Vol. 8, No. 2, P. 30; Sindhu, Pradeep

    The time has come to build a Multi-Protocol Label Switching (MPLS)- and IP-based "infranet" that blends the reliability and functionality of private networks with the pervasiveness of the Internet, writes Juniper Networks founder and CTO Pradeep Sindhu. This will eliminate the barriers that hinder the creation of economically feasible, advanced applications that follow a global, any-to-any paradigm. Infranets are designed to support new and emerging services and applications such as grid computing, while boasting the flexibility needed for managing sudden changes in networking requirements. Security, control, reliability, and quality of service guarantees are automatically bundled into each application so that service providers can bill appropriately and reap financial rewards from an assortment of services. Infranets can also enable these rich applications to be supported across multi-provider networks. Sindhu writes that the infranet must include supplemental intelligence to support quality assurance, accounting, and billing for new services, and additional control so that service providers can share traffic over their networks and be remunerated accordingly. Service providers that follow the infranet approach are better positioned to ensure security and performance for value-added applications. "Infranets can accommodate both the legacy model of distributed computing and the new service-oriented architecture of Web services," Sindhu explains, adding that the infranet is critical to finally realizing a successful Internet Protocol business model.

  • "Will .Mp and .Mobi Make Life Easier for Mobile Users"
    IEEE Pervasive Computing (03/05) Vol. 4, No. 1, P. 6; Cole, Bernard

    Two new top-level domains--.mp and .mobi--designed to accommodate the viewing constraints of mobile Internet devices are expected to be available by mid-2005, despite opposition from some who say the domains violate the device independence of the Internet. The .mp domain is already in the "sunrise" phase, meaning corporate trademark owners have begun registering their names under the domain. The domain has not received as much criticism as .mobi because it was already established and approved as the country-code domain for the Commonwealth of the Northern Mariana Islands before it was set aside for the mobile community. Saipan Datacom, registrar for .mp, plans to offer both commercial and personal sites under the domain, with an emphasis on individual and small business users. While .mp is signing up Web developers, .mobi is still encountering opposition from the Device Independence Working Group (DIWG) and other critics who say the Internet does not need another domain exclusively for mobile devices. ICANN recently approved the new domain, and the sunrise period for second-level domains and Web sites under .mobi is planned for mid-2005, says Nokia's William Plummer. Plummer disagrees with the argument that .mobi will fragment the Internet, saying ".mobi will accelerate convergence of mobility and the Internet, not send them on different paths." Plummer says .mobi will use only IPv6, making possible a nearly unlimited number of domain names, and will offer security features specific to mobile devices.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)

  • "Hack License"
    Technology Review (03/05) Vol. 108, No. 3, P. 75; Garfinkel, Simson

    In reviewing McKenzie Wark's book "A Hacker Manifesto," Simson Garfinkel outlines Wark's central argument and notes flaws that, in the reviewer's opinion, make the title of the book misleading. Wark, a professor at New School University, sees recent skirmishes over copyrights, trademarks, and patents as the latest manifestation of the ancient battle between producers and the ruling classes, represented in his book by hackers and "vectoralists," respectively. According to Wark, vectoralists rework laws and technology to dam up the free vectors through which information flows throughout society while charging for use of the others. However, Garfinkel notes that Wark paints hackers in a generally favorable light without acknowledging that ethical "white hat" hackers would not be needed without "black hat" hackers who engage in criminal activity. Nor does Wark mention that many white hats were formerly black hats. Garfinkel also calls attention to Wark's failure to make any reference to hardware or hardware hacking: Continued hardware innovations dictated by Moore's Law are not possible without vectoral control, while hardware hacking--the traditional modification of hardware to add capabilities not envisioned by the original designers--is often practiced in the same antivectoralist spirit that Wark says motivates software hackers. Garfinkel cites Richard Stallman, author of "The GNU Manifesto," and Lawrence Lessig, author of "The Future of Ideas," for their views on hacking. Stallman's unfavorable attitude toward intellectual property laws makes him a kindred spirit to Wark, while Lessig argues that both technology and legislation are blocking the public's access to its cultural heritage; his proposed workaround, Creative Commons licensing, establishes a body of work that can be freely cited, reprinted, and built upon.
    Click Here to View Full Article

  • "If Smallpox Strikes Portland..."
    Scientific American (03/05) Vol. 292, No. 3, P. 54; Barrett, Chris L.; Eubank, Stephen G.; Smith, James P.

    A group of Los Alamos National Laboratory researchers has developed EpiSims, software that can run computerized simulations of disease epidemics in order to determine the best ways health officials can respond. A disease's trajectory through the social network, as well as the points where intervention would be most effective, are simulated by modeling the interactions of each individual in a population. The researchers built EpiSims atop the TRANSIMS urban planning model and utilized high-performance supercomputing clusters to facilitate individual-based modeling on a scale of millions of people; EpiSims allows a virtual pathogen to be released into these populations so its spread can be monitored and different types of intervention can be tested for effectiveness. One experiment simulated a smallpox outbreak in a highly accurate model of Portland to determine the feasibility and efficacy of mass vaccination, targeted vaccination, and quarantine. The results indicated that, no matter which response strategy was followed, time was the most critical factor in limiting casualties. EpiSims is also being adapted to simulate an H5N1 flu virus pandemic and possible intervention scenarios as part of the National Institute of General Medical Sciences' Models of Infectious Disease Agent Study. The Los Alamos researchers will incorporate historical data about pandemic flu strains and information about H5N1 into their model of the virus. The researchers are also building and connecting other sociotechnical system models to provide virtual laboratories where solutions to a broad spectrum of real-world problems can be investigated.
    Click Here to View Full Article
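
    EpiSims models millions of individuals on supercomputing clusters; the sketch below shows only the underlying idea in miniature: an SIR-style infection spreading across an explicit contact network. The network size, contact counts, and rates are invented for illustration.

        # Miniature individual-based epidemic simulation (SIR states on a random
        # contact network); all parameters are invented for illustration.
        import random

        random.seed(1)
        N, CONTACTS, P_INFECT, DAYS_INFECTIOUS = 500, 8, 0.05, 5

        contacts = {p: random.sample([q for q in range(N) if q != p], CONTACTS) for p in range(N)}
        state = {p: "S" for p in range(N)}   # S=susceptible, I=infectious, R=recovered
        days_left = {0: DAYS_INFECTIOUS}
        state[0] = "I"                       # patient zero

        for day in range(1, 61):
            newly_infected = []
            for p, s in state.items():
                if s != "I":
                    continue
                for q in contacts[p]:        # infectious person meets daily contacts
                    if state[q] == "S" and random.random() < P_INFECT:
                        newly_infected.append(q)
                days_left[p] -= 1
                if days_left[p] == 0:
                    state[p] = "R"
            for q in newly_infected:
                if state[q] == "S":
                    state[q], days_left[q] = "I", DAYS_INFECTIOUS
            if not any(s == "I" for s in state.values()):
                break

        recovered = sum(s == "R" for s in state.values())
        print(f"day {day}: {recovered} of {N} people have been infected")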

  • "Location-Aware Networking: We Know Where You Are"
    Network Magazine (03/05) Vol. 20, No. 3, P. 46; Greenfield, David

    The deployment of enhanced 911 (E911), which is already required in several states, calls for the implementation of a location-aware infrastructure that supports VoIP over Wi-Fi and similar technologies, but enterprises should examine the opportunities--and drawbacks--that location-aware networks present for applications beyond E911 compliance. A deployment roadmap for VoIP E911 consists of three stages: I1, in which residential or retail 911 calls are routed to the local Public Safety Answering Point through the PSTN or the cable or CLEC network; I2, whereby complete VoIP services are deployed across the PSTN; and I3, which delivers E911 over IP. Enterprise VoIP networks will need to track caller location in order to comply with E911, and the options open to Wi-Fi vendors, in order of increasing accuracy, are tracking via base station or user access point, triangulation, and RF fingerprinting. An additional software infrastructure must be deployed to support the various location-aware applications, and the IETF's Geographic Location/Privacy (Geopriv) Working Group has defined such an infrastructure. Location information is stored as location objects in a special server, and these objects carry a description of their location and terms for how the information may be used; a Wi-Fi device's location is detected by a "location generator" that relays the information to the location server as a location object, and from there the data goes to the client at login via DHCP or the TIA's Link Layer Discovery Protocol-Media Endpoint Discovery. The AP network transmits additional location objects to the server as the device moves through the Wi-Fi space. Privacy is maintained by tracking the object the individual carries rather than the individual, while data usage rules carried within each object further enhance privacy. Such rules include giving location information a 24-hour life span to deter the creation of historical tracking databases.
    Click Here to View Full Article
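
    Geopriv's wire formats are defined in IETF specifications; the sketch below is only a schematic of the idea described above, a location object that carries its own usage rules, including the 24-hour retention limit, which any consumer of the object is expected to honor. The field names are illustrative, not the actual protocol.

        # Schematic Geopriv-style location object: position data travels with its
        # own usage rules.  Field names are illustrative, not the IETF wire format.
        from dataclasses import dataclass, field
        from datetime import datetime, timedelta, timezone

        @dataclass
        class LocationObject:
            device_id: str
            latitude: float
            longitude: float
            created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
            retention: timedelta = timedelta(hours=24)   # usage rule carried with the data
            retransmission_allowed: bool = False         # another typical usage rule

            def expired(self, now=None):
                """Consumers must discard the object once its retention window passes."""
                now = now or datetime.now(timezone.utc)
                return now - self.created > self.retention

        loc = LocationObject(device_id="wifi-handset-42", latitude=40.7128, longitude=-74.0060)
        print(loc.expired())                                   # False right after creation
        print(loc.expired(loc.created + timedelta(hours=25)))  # True once the 24-hour rule lapses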

  • "Visual Modeling's New Look"
    Application Development Trends (03/05) Vol. 12, No. 3, P. 31; Scheier, Robert L.

    Visual modeling, when properly executed, can enhance software application development by reducing manual coding, incorporating reusability into whole segments of application design, making maintenance less of a headache for IT organizations, and ensuring that business and development teams are working on the same problems. Visual modeling communicates in an easy-to-understand way the requirements for new applications and their technical design while concealing the complexity of the underlying code and business needs. Modeling aligns best with applications whose size and complexity justify its use, and whose developers and project managers are adept enough to use modeling appropriately. Each subsequent visual model describes the final application more explicitly: the first model consists of use cases; the second presents workflow diagrams outlining the business needs the application must satisfy and the class hierarchies of the application's component objects; the third is a platform-independent model that identifies the necessary technical elements; the fourth is a platform-specific model indicating the particular products to be employed for each element; and the final model is a detailed physical design that clearly illustrates where the various application components are located and how they work together. Modeling can often be hindered by "analysis paralysis," in which customers waste time attempting to model their needs perfectly, and IBM Rational Software distinguished engineer Alan Brown suggests concentrating on a particular series of business problems and modeling them via a set of "relatively short iterations." Noblestar CTO Paul Pocialik says code-level developers need solid skills in business and interpersonal communication, in addition to technical skills. Kathrein's Reinhard Rossmy says a project should be split 30-30-30-10 between analysis, modeling of class diagrams and systems architecture, source code writing/compilation/testing, and deployment, respectively.
    Click Here to View Full Article

  • "Where Do System Standards Go From Here?"
    Business Communications Review (03/05) Vol. 35, No. 3, P. 38; Waclawsky, John G.

    Cisco Systems' John G. Waclawsky writes that successful standards expand the general market because they focus on creating end-user value by reducing product or product components' complexity and cost, and making businesses capable of developing new products or services. Despite the fact that the component ecosystem is the channel through which every new application that yields substantial end-user value today has come to market over the last 10 or 15 years, the ruling telcos and their equipment providers refuse to jettison their ineffectual systems-based model, and have fought, wormed their way into, and co-opted the component-based standards bodies to perpetuate their monopolization. Waclawsky attributes the cooperation between incumbent telcos and their suppliers to a common fear of a competitive market stemming from commoditization driven by component standards. From this rationale, he reasons that "the incumbents are struggling to maintain their legacy, monopoly-based telephony business model, while they invent themselves as wireless, data, and TV services providers under the umbrella of monopoly protection." However, Waclawsky believes such a business model is ultimately doomed, since the market will not likely countenance it forever. He recommends the modularization of system standards in terms of both process and output. The adoption of a categorization and project development strategy, along with a testing architecture not unlike the Good Housekeeping seal, could yield significant benefits for incumbent suppliers as well as new innovators. The end result would be a "cookbook" of technically workable possibilities that integrators could apply to the construction of future network environments.


 