ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 796:  Wednesday, May 25, 2005

  • "C++ Gets a Multicore Tune-Up"
    TechWeb (05/24/05); Wolfe, Alexander

    University of Waterloo computer science professor Peter Buhr is offering a new set of extensions for the C++ programming language that aims to help software developers take advantage of multicore and multithreaded processors. Buhr has released the micro-C++ project under an open-source license and will present the technology to the Gelato Federation technical meeting this week in San Jose. Intel and Hewlett-Packard provided financial support for Buhr's project through the Gelato group, which promotes Itanium and Linux systems. Micro-C++ is not limited in terms of operating system or processor technology, and basically enables programmers to easily separate threads in their code using four new classes not included in the original C++ language; the code is then translated into normal C++ and converted into an executable image with help from a compiler and a micro-C++ runtime library. Buhr says a lot of software development has focused on Java over the last six years, but now people are turning again to C++. There are other projects and technologies that deal with C++ for multicore and multithreaded processing, including Posix threads, the Boost.org project, the Adaptive Computing Environment toolkit, and SourceForge's C++ threads library, but none of these technologies has won widespread support or been incorporated into the official C++ language. Buhr also works on the C++ subcommittee that is exploring revisions for easier multicore programming, but he says there is currently no clear path for adding that functionality.
    Click Here to View Full Article
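
    The summary above describes micro-C++'s central idea: a thread of control is declared as a class, with Buhr's translator and runtime supplying the underlying thread machinery. The sketch below is only a loose analogy, not micro-C++ syntax and not Buhr's API; it shows the same declare-a-class, run-it-concurrently pattern using Python's standard threading module, with class and task names invented for illustration.

        # Loose analogy only (Python, not micro-C++): a concurrent task
        # expressed as a class; names are invented for illustration.
        import threading

        class Worker(threading.Thread):
            """Each instance represents one independently scheduled task."""

            def __init__(self, label, items):
                super().__init__()
                self.label = label
                self.items = items
                self.total = 0

            def run(self):
                # Body executed on the task's own thread of control.
                self.total = sum(x * x for x in self.items)

        if __name__ == "__main__":
            tasks = [Worker(f"task-{i}", range(i * 1000, (i + 1) * 1000))
                     for i in range(4)]
            for t in tasks:
                t.start()   # launch each task concurrently
            for t in tasks:
                t.join()    # wait for completion
                print(t.label, t.total)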

  • "ACM Joins World Community Grid"
    AScribe Newswire (05/24/05)

    The ACM joined IBM's World Community Grid on May 24, and will encourage its 80,000 members to join the legions of participants contributing idle computing time to conduct humanitarian scientific research. "[ACM] has the potential to double our membership and in turn double the power that World Community Grid has to work on projects that help humanity," declared IBM International Foundation President Stanley Litow. World Community Grid has donated 10,000 years of computer run time to research in half a year by tapping over 106,000 participating personal and business computers. This has led to the completion of more than 60 percent of the Human Proteome Folding Project, which is producing data on individual protein behavior in the human body that will yield insights into the development of disease cures. World Community Grid's goal is to organize the world's largest distributed computing architecture for humanitarian works. The grid can operate five to six projects annually for nonprofit and public organizations. World Community Grid will consider projects in a variety of fields, including environmental research disciplines such as climatology, pollution, ecology, and preservation; basic research into human health and welfare; and medical research in such areas as genomics, epidemiology, proteomics, and biological systems. "World Community Grid provides a unique technology solution to some pressing global problems," said ACM President David A. Patterson. "We are asking our members all over the world to join with us in this innovative initiative to improve the lives of our member communities."
    Click Here to View Full Article

    For more information on ACM's involvement in World Community Grid, visit this site.

  • "'Future' Reviews Human-Machine Connection"
    eWeek (05/24/05); Louderback, Jim

    The Future in Review (FIRE) conference featured views from leading technologists who offered predictions about lifetime storage, the impact of multiprocessing on software applications, the increasing importance of statistical analysis, and other issues. A CTO roundtable with AMD's Fred Weber, Microsoft's Rick Rashid, and Hewlett-Packard's Dick Lampman discussed the near-term possibility of mobile systems that record people's daily lives. Terabyte hard drives will be able to hold every conversation a person has, along with a picture taken every minute of that person's life, and Microsoft researchers are currently working on a "black box for humans" that uses sensors such as accelerometers and motion detectors to record individual human activity, said Rashid; Weber touted fast-declining prices of LCD displays, and predicted LCDs would become as ubiquitous as mirrors. Multiprocessing lets software developers implement new algorithms that take advantage of parallel computing, which Weber said would soon lead to amazing new applications. One Microsoft research project is using data collected from Web searches, combined with statistical analysis, to intelligently answer people's natural language questions. RSA CEO Art Coviello said there was currently a lot of experimentation going on in terms of Web services, but that a lack of sophisticated standards was hindering an interoperable Web services platform; he also said Web applications are still relatively simple and that many are basically front-ending old applications. Pharming techniques could potentially lead to wholesale identity theft in the next year and a half if anti-spyware software is compromised, and many companies still use inadequate password authentication regimes in which passwords are easily broken. Regular computer users also need to start using basic protections such as personal firewalls, updated operating systems, and other protective software.
    Click Here to View Full Article

  • "Software Antagonists Square Off in EU Parliament"
    Reuters (05/23/05); Lawsky, David

    Technology companies and open-source advocates are clashing over a broader patent protection scheme from the European Union. Supporters of open source contend that Linux and other freely distributed software would be threatened by the current draft, which extends patent protection to portions of software used to help programs communicate with each other. Companies such as Microsoft and Apple Computer claim that without such safeguards, open-source companies would be able to expropriate their development costs. Writers of free software need data on how proprietary programs interact with other software in order to design their programs to ensure interoperability. A technical improvement's qualifications for patentability are determined by its uniqueness and non-obviousness, but extending such a definition to software is a thorny prospect; open-source advocates want determination of patentability to reside in how the invention interacts with physical "forces of nature," while proprietary software sympathizers demand more abstract definitions that cover interoperability information. Clifford Chance's Thomas Vinje warns that antitrust enforcement will become tougher with such measures in place. Former French prime minister and European Parliament member Michel Rocard has proposed a revised version of the EU patent directive that shields open-source software, and Vinje says the absence of such an amendment could effectively kill open source. The directive aims to help harmonize the rules for "computer-implemented inventions" across the EU, which Francisco Mingorance of the Business Software Alliance says would spare small companies the headache of defending the same idea in multiple countries.
    Click Here to View Full Article

  • "'Sound the Alarm'--Congress Gets Candid"
    HPC Wire (05/20/05) Vol. 14, No. 20; Curns, Tim

    Prior to a hearing before the House Science Committee last week, Reps. Frank Wolf (R-Va.), Vern Ehlers (R-Mich.), Sherwood Boehlert (R-N.Y.) and Don Manzullo (R-Ill.) announced plans to hold a national "Innovation Summit" later this year. Wolf said he recently met with a prominent group of scientists who opined that America's science and innovation effort is stagnating or in decline, signaling a destabilization in the foundations of U.S. leadership. In addition, fewer American scientists are earning patents and Nobel Prizes, or producing published papers. "The time has come to sound the alarm," Wolf argued, warning that a fall-off in long-term basic scientific research could have a major impact on national security and the U.S. economy. "We worked to include language in the recently passed supplemental appropriations bill directing the secretary of Commerce to work with groups...to put on a national conference this fall in Washington to begin focusing like a laser beam on this issue," he said. Wolf stressed that alacrity is critical to the summit's organization, but he was heartened by the country's tradition of taking the initiative. He also said he wrote a letter to President Bush in which he called for a three-fold increase in U.S. innovation investment over the next 10 years, espousing his belief that innovation could become a national priority through the White House's leadership. Another priority for Wolf is increasing the importance of math and science to young students, and he said such subjects should be taught as early as elementary school.
    Click Here to View Full Article

  • "Database Hackers Reveal Tactics"
    Wired News (05/25/05); Zetter, Kim

    Three young hackers suspected of breaking into the LexisNexis database claim the intrusion was done to make a name for themselves rather than to commit identity theft. One of the suspects is also a member of the Defonic Crew hacking group, and says his hack of America Online encouraged him and other Defonic members to take on bigger hacking challenges; "Shasta," a hacker who is not a suspect in the LexisNexis case, says the successful AOL intrusions bred carelessness among Defonic Crew when it came to not leaving a trail. Last March, LexisNexis admitted that intruders penetrated a database belonging to its Seisint subsidiary and used name searches to appropriate the personal data of up to 310,000 people, but the hacker suspects claim they were unaware of this until a friend of one of them, pretending to be a teenaged girl, engaged in an online chat session with a Florida policeman with a Seisint account. The suspect coaxed the officer to click on an attachment containing a Trojan horse with promises of erotic content, and the program downloaded to his computer and gave the hacker total access to his files, including one linking to Seisint's Accurint service. Another suspect in the LexisNexis breach used a Java script to find other active Accurint accounts, and uncovered an account belonging to a Texas police department; he then contacted Seisint posing as a LexisNexis tech administrator and coaxed an employee to reset the account's password so he could create new accounts in the police department's name. A separate investigation that may be related to the LexisNexis case led to several arrests in California, and Santa Clara County Deputy District Attorney Jim Sibley theorizes that more than one hacker group may have breached LexisNexis, given its shoddy security.
    Click Here to View Full Article

  • "Enron Offers an Unlikely Boost to E-Mail Surveillance"
    New York Times (05/22/05); Kolata, Gina

    The public disclosure of reams of email messages investigated in the Enron probe gave scientists the opportunity to test their theory that a group's intentions could be inferred by tracking emailing and word usage patterns without actually reading the messages. After just a few months of scrutiny, about six research groups say they can capture important data and are refining their ability to categorize and analyze it. Carnegie Mellon University computer science professor Dr. Kathleen Carley says the Enron data showed an explosion of activity among top executives before the investigation, while the sudden cessation of communications among them, and the accompanying uptick in communications with legal counsel after the probe began, indicated growing nervousness. Queen's University computer scientist Dr. David Skillicorn says analysis revealed a junior-level executive of significance who was not listed in Enron's organizational charts, and such a detection could be applied to probes of terrorist networks. Each crisis was marked by a surge in email, and certain messages featured word choices, routing patterns, and other indicators that enabled analysts to separate these emails from extraneous business or personal messages. The scientists expect intelligence agencies to be conducting similar classified investigations of international email traffic, but University of Tennessee computer scientist Dr. Michael Berry is concerned that civilian email surveillance could have Orwellian overtones. For example, companies could use such techniques to keep tabs on employee attitudes and activities without actually eavesdropping on email exchanges, while advertisers could customize pitches based on word searches on individual email accounts. "Will you let your email be mined so some car dealer can send information to you on car deals because you are talking to your friends about cars?" Berry asks.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
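
    The researchers' actual models are not spelled out in the article; as an illustrative sketch only, the fragment below shows the kind of traffic-only analysis described above: counting messages per sender per week from (sender, date) metadata and flagging weeks whose volume spikes well above that sender's average, without reading any message bodies. The function names, threshold, and sample data are assumptions made for illustration.

        # Illustrative sketch: flag per-sender weekly email volume spikes from
        # metadata alone (no message bodies are read); the threshold is arbitrary.
        from collections import defaultdict
        from datetime import date

        def weekly_counts(messages):
            """messages: iterable of (sender, date) pairs -> {(sender, week): count}."""
            counts = defaultdict(int)
            for sender, d in messages:
                year, week, _ = d.isocalendar()
                counts[(sender, (year, week))] += 1
            return counts

        def spikes(counts, factor=3.0):
            """Return (sender, week, count) entries well above the sender's average."""
            per_sender = defaultdict(list)
            for (sender, week), n in counts.items():
                per_sender[sender].append((week, n))
            flagged = []
            for sender, weeks in per_sender.items():
                avg = sum(n for _, n in weeks) / len(weeks)
                flagged.extend((sender, wk, n) for wk, n in weeks if n > factor * avg)
            return flagged

        if __name__ == "__main__":
            baseline = [("exec_a", date(2001, m, 1)) for m in range(3, 9)]   # one message a month
            surge = [("exec_a", date(2001, 10, d)) for d in range(1, 11)]    # burst in early October
            print(spikes(weekly_counts(baseline + surge)))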

  • "Survey: Little U.S. Interest in Next Generation Internet"
    IDG News Service (05/24/05); Gross, Grant

    The implementation of next-generation Internet Protocol (IPv6) is a low priority for IT decision-makers in U.S. industry and government, according to a recent Juniper Networks survey, despite warnings that without wide IPv6 adoption the United States could lose its global technology leadership. Just 7 percent of 349 respondents ranked IPv6 as "very important" to reaching their IT goals, while 35 percent failed to see a compelling reason for adopting IPv6; 30 percent said budget issues were a major obstacle, and 17 percent cited technical transition difficulties. The top IT priorities among respondents, in descending order, were improving quality of service, enhancing and streamlining cybersecurity, and making network management easier. IPv6 can facilitate such improvements, said Juniper Networks' Rod Murchison at the Coalition Summit for IPv6. U.S. IT decision-makers appear to think that IPv6's only advantage is more IP address space than IPv4, when the protocol can also substantially reduce the costs of Web multicasting, enable an easier transition to VoIP, improve Internet connection quality, simplify network management, and provide less complex security configuration, according to speakers at the summit. Summit Chairman Alex Lightman urged Congress to make the adoption of IPv6 among federal agencies mandatory, and to establish a national IPv6 coordination office with an annual budget of approximately $10 billion.
    Click Here to View Full Article

  • "The Valley in a World Gone 'Flat'"
    Mercury News (05/23/05); Helft, Miguel

    Silicon Valley will face competition from all over the world in the near future as other regions gain access to the same knowledge and other crucial resources needed to innovate, says New York Times columnist Thomas Friedman in an interview. Friedman's new book, "The World Is Flat," argues that a series of technological and political events in the 1990s have set the stage for a more competitive world where smart people in other countries no longer have to emigrate in order to play on the global scene. Though this transformation is at its beginning stages, areas that have solid infrastructure, a good education system, a stable political environment, and rule of law will be able to compete in a flat world faster. Friedman says this reality is already changing the way U.S. startups think, forcing them to become multinational companies from the beginning in order to win venture capital. Overall, markets will not only become more competitive, but also grow much faster and become more complex, while knowledge workers will have to be able to move horizontally, from search engines to search engine optimization, for example. Friedman also says America's absolute standard of living does not have to fall, even as its relative lead over other countries diminishes; he notes that massive U.S. investment in Japan and Germany after World War II coincided with six decades of growth in the U.S. standard of living. In order to keep its innovative edge, U.S. leaders need to ensure every young person has access to a university education, and the United States should also invest in solving a major problem, similar to the space-race effort following Sputnik. Friedman proposes making America energy-independent, which would boost the number of scientists and engineers in the United States, bolster national security, and lower the average price of goods.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Scientist Blames Web Security Issues on Repeated Mistakes"
    E-Commerce Times (05/24/05); Germain, Jack M.

    BBN Technologies researcher Peiter Zatko believes the Internet's vulnerability to catastrophic failure is rooted in scientists and engineers repeatedly committing the same mistakes, but he does think this situation can be remedied and is heartened by industry's growing awareness of the problem. His view is that programmers must stop coding programs riddled with access holes that stem from calls within a program for certain convenience actions. Zatko says the abuse of the Internet's critical infrastructure makes an all-in-one security solution impossible, and partially attributes the infrastructure's weakness to engineers pushing the Internet well beyond its intended use. He says the addition of utilities and telephone service to the Internet puts further strain on the network. Zatko recommends that scientists cross-field their knowledge in order to find effective solutions to the Internet's security flaws, insisting that "We need to break up the old boy network." He sees the technology industry's reversion to dedicated services instead of multipurpose devices as a positive step, and advises the continuation of this trend. Zatko expects the repeated abuse of the Internet to halt once it becomes too dangerous, too complicated, and too costly to use safely. Once that point is reached, people will start clamoring for government regulation, he predicts.
    Click Here to View Full Article

  • "Debating the Safety of a Tiny Technology"
    Baltimore Sun (05/22/05) P. C1; Cohn, Meredith

    Critics of a State Department initiative to embed radio frequency identification (RFID) tags in passports warn that such measures will make U.S. citizens easy targets, while advocates claim the technology will ensure tighter security and more efficient confirmation of travelers' IDs at national borders. The companies developing and testing e-passport technology are following standards set by the International Civil Aviation Organization, which require that the tags can only be decoded by readers no more than four inches away. Although government and industry officials argue the difficulty and cost for outsiders to read the tags reduces the danger of eavesdropping, the State Department is considering additional data protection measures. Kelly Shannon with the State Department's Bureau of Consular Affairs says passports equipped with covers of aluminum foil or similar material could be immune to eavesdroppers when closed. The Smart Card Alliance's Randy Vanderhoof says the data stored on the RFID tags would be less susceptible to tampering than printed information, while the cost of the decoding software and the limited range in which the tags can be read are additional deterrents. Opponents such as the Association of Corporate Travel Executives counter that even if enemies cannot decode the information in the passport tags, their very presence will label their owners as legitimate targets. The organization wants the data on the tags to be encrypted, or for the passports to be read by special readers used exclusively by border control agents; another preference is that the passports use contact chips rather than contactless ones. An April survey found that 98 percent of the group's members are opposed to the chips.
    Click Here to View Full Article

  • "Hardware Today: Grid Computing Means Business"
    ServerWatch (05/23/05); Robb, Drew

    Dr. Carl Kesselman with the USC/Information Sciences Institute believes grid technology use is poised to explode. "Increased deployment of grid technologies in the commercial sector will break the traditional silos that characterize current infrastructure deployments," he says. Major hardware vendors are investing heavily in grid technology in anticipation of this trend. Grid computing is chiefly perceived as a way to combine the performance of many systems into a virtual supercomputer in order to solve problems beyond the capacities of even the biggest supercomputers, but many people see its computing resource optimization ability as a tool for connecting information distributed among multiple locations. An information grid can also decouple applications from dedicated hardware, facilitating runtime decisions as to the most efficient operational point for applications. By making unharnessed capacity available on demand, grids could potentially eliminate low server utilization rates. "The real business value of the grid lies in the ability to lash together disparate and widely dispersed computing and data resources across an organization in new ways," says IBM's Al Bunshaft. He adds that companies can then exploit these resources to enhance collaboration, give workers on-demand IT resource access, and speed up business, analytical, and scientific processes.
    Click Here to View Full Article
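
    Real grid middleware layers scheduling, data movement, and security across organizations, which no short snippet can capture; purely as a toy illustration of the idea described above, farming independent work units out to whatever spare capacity is available, the sketch below spreads tasks across a pool of local worker processes standing in for grid nodes. The worker count and the sample task are assumptions for illustration.

        # Toy illustration of farming independent work units out to spare capacity;
        # real grid middleware adds scheduling, data staging, and security layers.
        from concurrent.futures import ProcessPoolExecutor

        def simulate(job_id):
            """Stand-in for a compute-heavy unit of work (e.g., one model run)."""
            total = sum(i * i for i in range(200_000))
            return job_id, total

        if __name__ == "__main__":
            jobs = range(8)
            with ProcessPoolExecutor(max_workers=4) as pool:   # pretend these are grid nodes
                for job_id, result in pool.map(simulate, jobs):
                    print(f"job {job_id} -> {result}")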

  • "Complexity, Chemistry, Commuting and Computing"
    ITWorld.com (05/19/05); McGrath, Sean

    The programming language wars are likely never to go away because they represent different people's opinions about how to manage inherent complexity, writes XML expert and Propylon CTO Sean McGrath. Tesler's law states that for every business process, there is a base level of complexity that can never be erased, only moved. Sometimes eliminating complexity is not even a goal, but simply offloading it to somewhere else is: It all depends on point-of-view and thresholds for risk, stress, time, and other factors. In terms of software programming, different programming languages deal with complexity at different levels; Perl arms algorithm designers with a number of language features with which to handle complexity, while Java and C# rely on standard libraries such as JDK and CLR, respectively. Python programmers use a core set of complexity management devices repeatedly for a variety of situations. Each of these approaches deals with inherent complexity in a different way, and which one is correct will depend on the programmer or team's tolerance for complexity and point-of-view. The same concept was behind the RISC and CISC debate, in which some saw a small number of fast instructions as more useful than a broader range of instructions running slower. Instead of focusing on eliminating complexity with a preferred tool, programmers should realize there is inherent complexity that is addressed in different ways with different tools.
    Click Here to View Full Article

  • "Cyberinfrastructure for E-Science"
    Science (05/06/05) Vol. 308, No. 5723, P. 817; Hey, Tony; Trefethen, Anne E.

    Participants in Britain's e-Science program are developing an infrastructure for a new generation of multidisciplinary and collaborative science software applications capable of searching, accessing, moving, manipulating, and mining data contained in massive, distributed repositories. One example representative of the e-Science initiative is the creation of a "middleware" infrastructure for experiments with CERN's Large Hadron Collider, which will generate several petabytes of data per year to be distributed to more than 1,000 collaborating physicists. The infrastructure will allow researchers to establish suitable data sharing/replication/management services and enable decentralized computational modeling and analysis. Another example is an Integrative Biology project whereby research groups based in England and New Zealand will use a virtual organization to routinely combine their activities for heart disease research. E-Science application developers will be able to tap the tools, educational content, documentation, and experience from the Web services community when constructing their applications by exploiting Web services technology developments. This will free up the e-Science community to create application domain-specific higher-level services while the IT industry handles the design of the underlying infrastructure's building blocks. E-Science application projects are being complemented by a research program dedicated to the exploration of long-term computer science challenges driven by e-Science needs. The Open Middleware Infrastructure Institute was recently set up to make sure that middleware components developed in England are compatible with those developed elsewhere.

  • "The Robot Army That Thinks for Itself"
    New Scientist (05/21/05) Vol. 186, No. 2500, P. 30; Biever, Celeste

    Robots such as "Grunts" from Frontline Robotics could patrol areas in teams using Wi-Fi to share input, but their usability is limited by cost and size issues; moreover, such machines are not fully autonomous and cannot function without centralized control. Researchers such as MIT postgraduate student James McLurkin and HRL Research Labs scientist David Payton are trying to overcome these limitations by developing decentralized swarm robots funded by the Pentagon. McLurkin has created small wheeled robots that detect obstacles and talk to one another via infrared, and these machines successfully filled 280 square meters of floor space by spreading out evenly during a 2004 government demonstration. His robots can also form into herds, regroup, move in circles, play a tune, and automatically recharge themselves. Payton, meanwhile, has devised "pherobots" that transmit infrared "pheromones" when they encounter an object of interest. Each pherobot picks up and re-transmits these signals to its nearest neighbors until a trail forms. Payton admits that swarm robots are unlikely to catch on in the military until the ability to move in three-dimensional space is added. Still, NASA and Boeing personnel engaged in the U.S. military's Future Combat Systems project have expressed great interest in testing robot swarms.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)
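
    The "pheromone" trail described above behaves like a hop-count gradient: the robot that finds a target emits a signal, each neighbor re-broadcasts it one hop further out, and any robot can then follow decreasing hop counts back toward the source. Payton's actual implementation is not detailed in the article; the sketch below is a generic hop-count flood over an assumed neighbor graph, with robot names and topology invented for illustration.

        # Generic hop-count "pheromone" flood over a robot neighbor graph;
        # a robot follows strictly decreasing hop counts toward the source.
        from collections import deque

        def flood(neighbors, source):
            """neighbors: {robot: [adjacent robots]} -> {robot: hops from source}."""
            hops = {source: 0}
            queue = deque([source])
            while queue:
                robot = queue.popleft()
                for peer in neighbors[robot]:
                    if peer not in hops:                # first signal heard wins
                        hops[peer] = hops[robot] + 1    # re-transmit one hop further
                        queue.append(peer)
            return hops

        def next_step(neighbors, hops, robot):
            """Head toward the source by picking the neighbor with the lowest hop count."""
            return min(neighbors[robot], key=lambda peer: hops.get(peer, float("inf")))

        if __name__ == "__main__":
            graph = {"a": ["b"], "b": ["a", "c", "f"], "c": ["b", "d"],
                     "d": ["c", "e"], "e": ["d"], "f": ["b"]}
            gradient = flood(graph, source="e")         # robot "e" found the target
            print(gradient)
            print("f moves toward", next_step(graph, gradient, "f"))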

  • "Fiber Wars"
    Red Herring (05/09/05)

    U.S. phone companies are pouring investment into fiber-to-the-home (FTTH) service after federal regulations pulled back line-sharing requirements, but companies still face challenges from municipalities that have gone ahead with their own fiber initiatives. Verizon, SBC, and BellSouth have all announced major fiber investments as they strive to capture so-called "triple play" packages where they provide voice, data, and video to customers. A household that pays $40 per month for each of those services justifies the $1,500 average investment required to connect homes in 2005, especially considering their traditional phone subscriptions are under pressure. In 2004, the average connection price was $1,650, down from $7,500 in 1993. Yankee Group analyst Matt Davis says the regulation specter could appear again as phone companies introduce video-over-broadband services, which would put them under cable TV regulations. The companies are taking different technological approaches, with Verizon touting on-site "optical network terminals" that provide three phone lines, a cable TV connection, a data line, and 12-hour battery backup in case of a power outage. The cable TV jack is intended for broadband TV service expected to start later this year. Verizon's FiOS Internet service offers speeds up to 30 Mbps at premium rates. SBC is offering FTTH to new housing, but fiber-to-the-node for existing structures, where DSL links run data between the house and neighborhood switch. BellSouth's fiber-to-the-curb offering uses copper between the home and the street, but with next-generation DSL technology will be able to offer up to 24 Mbps speeds. Meanwhile, the Fiber to the Home Council of North America says the number of community-driven fiber projects has doubled year after year, reaching 214 communities--mostly isolated cities that see fiber as an economy booster. Phone companies regularly oppose these efforts in court and with media blitzes.
    Click Here to View Full Article

  • "Taking on the Cheats"
    Nature (05/19/05) Vol. 435, No. 7040, P. 258A

    The incidence of plagiarism by academics--particularly self-plagiarism--is increasing, but evaluating the scope of the problem is difficult. Academic publishers and editors hope to use software designed to identify plagiarism in student essays to uncover academic plagiarism. Student anti-plagiarism services compare essays against vast volumes of documents compiled from Web trawls and acquisitions from media outlets, showing supervisors which portions of the essays appear to be plagiarized and from where the plagiarized material originates. Publishing experts say the software could be easily adapted for academic documents by attaching it to publishers' online peer review management systems. Anti-plagiarism services can be used to spot plagiarism itself, while software such as University of Arizona, Tucson, computer scientist Christian Collberg's Self-Plagiarism Detection Tool (SPlaT) can identify duplicate publication by capturing papers from authors' Web sites and comparing them against one another and against other papers added manually. SPlaT is free and already available. Meanwhile, Cornell computer science PhD student Daria Sorokina has reworked an existing algorithm to compare two documents and find any shared run of at least six consecutive words. Cornell physicist Paul Ginsparg says the software has already found many instances of "awkward overlap" between articles posted on the arXiv physics preprint server. Ginsparg plans to post his results on the arXiv Web site so that authors can respond, in an effort to refine the algorithm. Blackwell Publishing President Bob Campbell believes publishers' collaborative development of an industry-wide detection system will deliver a comprehensive plagiarism solution, but establishing the system will take a few years even with smooth collaboration.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

    To read more on SPlaT and self-plagiarism, see Collberg and Stephen Kobourov's article "Self-Plagiarism in Computer Science," in the April 2005 issue of Communications of the ACM (Vol. 48, No. 4, p. 88).
    Click Here to View Full Article
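
    The six-consecutive-word test attributed to Sorokina is essentially word-level shingling: slide a six-word window over each document and report any window the two documents share. Her actual algorithm is not published in the article; the sketch below is a straightforward rendering of that idea, with helper names chosen for illustration.

        # Word-level shingling sketch: report any run of six consecutive words
        # shared verbatim by two documents (names are illustrative).
        import re

        def shingles(text, n=6):
            words = re.findall(r"[a-z0-9']+", text.lower())
            return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

        def shared_runs(doc_a, doc_b, n=6):
            """Return the n-word sequences appearing verbatim in both documents."""
            return shingles(doc_a, n) & shingles(doc_b, n)

        if __name__ == "__main__":
            a = "We propose a novel method for detecting overlap in scholarly articles."
            b = "Their report reuses a novel method for detecting overlap in submitted work."
            for run in sorted(shared_runs(a, b)):
                print(run)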

  • "Coming: Sensors and Pixels Everywhere"
    Computerworld (05/23/05) P. 34; Rosencrance, Linda

    Anatole Gershman, Accenture Technology Laboratories' global director of research, foresees the advent of three technologies that will drive business applications over the next three to five years: intelligent sensor networks, scalable intelligence methods, and pixel-driven technology that facilitates remote presence and action as well as information management. He says, "Those are technologies that enable us to sense--intelligent sensor networks; to think--technologies that enable our systems to think, which is scalable intelligence; and technologies that enable us to act on all this intelligence." Gershman details the Reality Online vision, in which technology lets people link to and view physical realities, which are reflected in their systems, in real time. He lists intelligent shopping carts as an example, noting an Accenture prototype that models a specific customer and his or her shopping habits so that it can predict what purchases the customer is likely to want or need, and communicate with the customer to facilitate such acquisitions. Further out, Gershman points to the incorporation of radio frequency identification tags in consumer apparel that can be used to access unique services via sensors, tagging, and tracking technologies. The so-called Online Wardrobe system would allow consumers to keep tabs on the clothing they already own and help them buy coordinating items either online or in actual stores. Gershman also predicts that camera phones will enable customers to show as well as tell merchants what kinds of items they are looking for.
    Click Here to View Full Article

  • "Building Large-Scale ZigBee Systems With Web Services"
    Sensors (05/05) Vol. 22, No. 5, P. 24; Enwall, Tim; Bahl, Venkat

    Large-scale ZigBee systems can be enabled for network discovery, extraction, commissioning, configuration, management, security, event/rule logic, and data management applications via Web service "brokers," write Tendril Networks CEO Tim Enwall and Ember's Venkat Bahl. Application developers could tap a common suite of foundational software design and run-time tools and services offered through standards-based Web services. Service brokers can function as structured mechanisms to regulate communications, such as routing requests along node-to-application, node-to-node, and application-to-node pathways. With such software services, developers can immediately concentrate on application-specific material, the rules governing the physical environment, the data aggregation and synthesis necessary for effective decision making, and the human and computer communications to be relayed to facilitate the appropriate user outcomes. Moreover, if the developer is familiar with the ways in which the broker's services operate, then the developer only has to know what a ZigBee system is capable of. Knowing the various ins and outs of MEMS sensors, ZigBee mesh networking routing algorithms, wireless network reliability, node operating systems, internode networking stacks, protocols, and how they are integrated into the application is therefore unnecessary. A service broker must provide a logical abstraction layer that virtually maps out the network and its capabilities for the developer; a set of core services that allow the existing infrastructure and new network entries to be discovered, and that shield the network from the unsanctioned introduction of network components; a rules engine to enable an application's algorithms and hierarchical processing across a spectrum of networks; understandable, manageable, and optimized data flow as well as universal availability of preprocessed data throughout the enterprise; and simulation capability.
    Click Here to View Full Article
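
    The broker responsibilities enumerated above (node discovery, routing between nodes and applications, a rules engine, managed data flow) can be pictured as a single service interface that applications call instead of talking to individual radios. The sketch below is only a shape-of-the-API illustration; every class and method name is invented and does not reflect Tendril's or Ember's actual software.

        # Shape-of-the-API sketch of a Web-service "broker" fronting a ZigBee
        # network; all names are invented for illustration.
        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class Node:
            node_id: str
            capabilities: List[str]

        @dataclass
        class ServiceBroker:
            """Single point that applications talk to instead of individual nodes."""
            nodes: Dict[str, Node] = field(default_factory=dict)
            rules: List[Callable[[str, float], str]] = field(default_factory=list)

            def discover(self, node: Node) -> None:
                # Core service: admit a new node into the logical network map.
                self.nodes[node.node_id] = node

            def add_rule(self, rule: Callable[[str, float], str]) -> None:
                # Rules engine: application logic evaluated on incoming readings.
                self.rules.append(rule)

            def report(self, node_id: str, reading: float) -> List[str]:
                # Node-to-application path: run every rule over a sensor reading.
                return [rule(node_id, reading) for rule in self.rules]

        if __name__ == "__main__":
            broker = ServiceBroker()
            broker.discover(Node("sensor-12", ["temperature"]))
            broker.add_rule(lambda nid, r: f"ALERT {nid}: {r}" if r > 30.0 else f"ok {nid}")
            print(broker.report("sensor-12", 31.5))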


 