Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 667:  Monday, July 12, 2004

  • "For Hackers, Shop Talk, a Warning and Advice"
    New York Times (07/12/04) P. C3; Thompson, Nicholas

    This year's Hackers on Planet Earth (HOPE) conference featured speakers such as Apple Computer founder Stephen Wozniak, who bemoaned that people today consider hackers to be synonymous with terrorists to such a degree that the government has instituted excessively harsh penalties against violators of computer fraud regulations. Wozniak described hacking as mainly "just some kid trying to do something funny," illustrating his argument with his own hacking escapades, which included such pranks as manipulating the phone system to place a free call to the pope. Wozniak told the younger attendees that they should follow a code of ethics and resist the temptation to do harm, a view espoused by many veteran hackers. HOPE conference head of security Mike Roadancer said he thinks younger hackers have a strong need for guidance and discipline. A recurring contention among speakers and participants at the conference was that they hack chiefly to expose security holes in corporate computer systems in the hopes that their actions will lead to improved data protection and privacy. "If a hacker breaks into a company's system, and that system isn't properly secured, that company should be held liable," remarked veteran hacker John T. Draper. A good portion of the event was devoted to arguing the need for the government to loosen its monitoring and control of computer networks. Sessions were held to help hackers become more competent, while others concentrated on tools that could help penetrate or secure computer systems.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "IT Jobs Slowly Grow, Wages 'Still Settling'"
    IT Management (07/09/04); Doody, Angela

    There is definite, albeit slow, growth in the IT job sector, according to findings from the ComputerJobs.com job-placement firm, the Bureau of Labor Statistics, and Challenger Gray & Christmas, and this is a positive development after dramatic layoffs and a surge in high-tech offshoring over the past several years. ComputerJobs.com has witnessed 30 percent to 40 percent growth in the number of jobs in the areas of network and database security in the past six months, while the Bureau of Labor Statistics reports that the number of jobs in computer systems design and programming rose from 1,103,500 to 1,115,500 between April and June. Meanwhile, Challenger Gray & Christmas CEO John Challenger notes that his firm has registered 13,500 new jobs in the U.S. computer industry in the last two months. However, Challenger cautions tech professionals not to get their hopes up about dramatic wage increases. "Wages are still settling some," he explains, while ComputerJobs.com's Michael Turner points out that employers rather than employees are the ones dictating hiring terms now. A recent ComputerJobs.com poll reveals that jobs that call for advanced skills are experiencing the best wage increases, while help desk personnel and programmers are earning less. Turner reports that most of the firms posting ads on his Web site for network security staff or security project managers and SQL database systems experts are hiring people with between three and five years of experience, as well as professionals skilled in business, management, or finance. He adds that many IT workers are obtaining certifications and specialized skills, which is also a heavy consideration for college graduates as they enter the technical job market.
    Click Here to View Full Article

  • "Doubts Over Touchscreen Tech Choice for Venezuela Recall"
    Associated Press (07/12/04); Olson, Alexandra

    Venezuela will use untested touchscreen e-voting systems for its recall referendum on Hugo Chavez's presidency, in spite of general consensus among computer scientists that such machines are vulnerable to tampering and malfunction, and while the wounds of an embarrassing Venezuelan e-voting snafu in 2000 are still fresh. The glitch forced the Supreme Court to delay the biggest election in the country's history, and killed any new deal with the software's U.S. supplier, Election Systems & Software. The pro-Chavez majority on the Venezuelan Elections Council approved a contract with Boca Raton, Fla.-based Smartmatic and its Venezuelan partners, software provider Bitza and the CANTV phone company. Smartmatic's products, unlike many e-voting systems in the United States, will provide a printed record of each vote to support recounts of contested elections. The Smartmatic contract includes the deployment of 20,000 touchscreen systems and the operation of regional elections in September, while a contract for the referendum is being worked out. Elections council President Francisco Carrasquero contends that e-voting will eliminate the practice of ballot-stuffing that was symptomatic of Venezuelan elections before Chavez captured the presidency. Suspicions of political bias on the part of Bitza were raised by the Miami Herald in a May article reporting that a Venezuelan state industrial development fund had poured venture capital into the software company, which is tasked with integrating manual votes into the electronic system; Bitza replied that it would buy back the government's 28 percent investment. The referendum will be the first test of Smartmatic's system, and the election council promises that the system will be audited prior to the vote.
    Click Here to View Full Article
    For information on ACM's activities regarding e-voting, visit http://www.acm.org/usacm

  • "Linux Helps Make Weather Forecasts More Accurate"
    NewsForge (07/07/04); Lyman, Jay

    Climatologists and other scientific researchers are increasingly adopting Linux as a way to both reduce high-performance computing (HPC) costs and improve speed. Oak Ridge National Laboratory (ORNL) is using a new 456-processor, Linux-based Altix system from SGI to simulate the weather: Because the SGI system performs six times more powerfully than the previous system, ORNL researchers are able to run more detailed models faster, and produce more accurate and timely weather forecasts. ORNL associate laboratory director Thomas Zacharia says the gains come not from Linux alone, though it is a sturdier operating system, but also from improved mathematical algorithms, better interconnect technology, and advanced processor and memory components; he says many other weather modeling computers have moved to the Linux platform, including the Red Storm computer from Cray and IBM's BlueGene computer. SGI's Jill Matzke says the HPC community is migrating to standards-based technology such as Linux and Intel processors because of cost-effectiveness and system portability. NASA Ames Research Center deputy division chief John Parks says proprietary technology still holds advantages in some cases, but that Linux is consistently gaining technological ground: Linux does not leverage some hardware features as well as proprietary solutions do, for example, and generally lags in input/output performance. Parks and other scientists expect Cray to release a Linux version of its X1 climate and weather simulator. The major advantage of Linux for HPC systems is the ability to run on a variety of hardware platforms, but a number of customized applications will have to be ported to the Linux platform, slowing adoption.
    Click Here to View Full Article

  • "Glimmer of Hope in Copyright Measures"
    SiliconValley.com (07/11/04); Gillmor, Dan

    Dan Gillmor is heartened by congressional legislation concerning copyright, specifically Rep. Rick Boucher's (D-Va.) Digital Media Consumers' Rights Act and growing criticism against Sen. Orrin Hatch's (R-Utah) Inducing Infringement of Copyright Act. Boucher's bill seeks to restore people's right to make "fair use" of copyrighted material; current copyright law allows owners of intellectual property to prevent fair use with technological measures, and criminalizes people who offer ways or products to circumvent the restrictions. Boucher says his proposal attempts "to restore the classic balance" between users' rights and creators' rights, and his bill has gained influential sponsors such as Reps. Joe Barton (R-Texas), John Doolittle (R-Calif.), Christopher Cox (R-Calif.), and Zoe Lofgren (D-Calif.). Hatch's bill, which was supposedly created to prevent children from being cajoled into downloading copyrighted material and porn, could also ban any technology with the potential for copyright infringement, regardless of any legitimate use it may have. Gillmor says the bill's language is vague enough to enable copyright holders to leverage the law against any technology they are opposed to. He adds that such a measure would choke technological innovation, and he is encouraged that opposition to the bill was motivated by the Senate's decision to send the bill to the floor without holding committee hearings; a coalition of trade groups and companies fired off a letter to Hatch contending that the bill threatens to invalidate a Supreme Court ruling that technology with many legitimate uses cannot be banned because it can be employed for illegal uses. The letter also noted that the bill could implicate product reviews that detail how the product operates, and urged that hearings be held before sending the proposal to the Senate floor. Such hearings could be conducted as early as this week.
    Click Here to View Full Article

  • "Cell Phone Mission Connects FAU Team, Motorola Vision"
    South Florida Sun-Sentinel (07/12/04); Katz, Ian

    Student researchers at Florida Atlantic University (FAU) in Boca Raton are developing fast-prototyping technology and techniques that will enable Motorola to produce new cell phone designs in a matter of days. FAU computer science director Ravi Shankar says the electronics industry has experienced a 1,000-fold increase in productivity over the last 20 years, enabling companies such as Dell to build made-to-order computers on demand. Armed with similar capability, Motorola would be able to quickly upgrade an existing model with new functionality--improved video quality, for example--on a much faster schedule. Motorola currently takes about two years to develop a new product, and the time interval means marketing and strategy executives have to see that far into the future to be successful. Motorola recently lost several market-share points to rivals because it made large investments in 3G smart phones, which turned out to be not as popular as Motorola had predicted. Rival manufacturer Nokia is also tapping university talent at the Royal Institute of Technology in Stockholm, and Stanford University and the University of California at Berkeley also have famously successful ties to local IT companies. FAU Science and Engineering Department Chairman Borko Furht says his university has the ambition to become "the MIT of the South," and emulate the academic-industry collaboration of California. The 14 FAU students who are working on the five-year Motorola project are focusing on segmenting product development into four categories: Specification, design, development, and prototyping; although each group's focus does not overlap with another's, the entire team must work together, says director Shankar.
    Click Here to View Full Article

  • "Easy Authoring for Mixed Reality"
    IST Results (07/09/04)

    The European Union-funded AMIRE mixed reality (MR) project has completed a toolkit that enables people without development expertise to create MR applications quickly and professionally. MR applications enable users to receive relevant, real-time data based on their location, and require a number of other technologies, including portable computers, wireless networking, and GPS. AMIRE is an Information Society Technologies project now in its 27th month, and is tasked with facilitating the efficient creation and modification of MR applications. The new toolkit provides reusable elements such as gems, components, and frameworks so that designers do not have to recreate those assets. The goal is to integrate MR into real-world applications, and the AMIRE project has already succeeded in testing the toolkit in two MR applications--one in the Guggenheim Museum in Bilbao, Spain, and the other for oil refinery workers in Austria. The museum application provided location-specific exhibit information to visitors as they toured the facility. The oil refinery MR application helped apprentices at the plant and benefited factory maintenance. The goal of the project was to remove the programming aspect from the creation of MR applications, and enable non-experts to author and modify designs, says project coordinator Jose Luis Los Arcos. While the AMIRE project lessens the cost of MR application development, real-world deployment is still limited by the cost of hardware and communications, he says. Potential sectors that would benefit from MR include manufacturing, tourism, and health care.
    Click Here to View Full Article

  • "Can Computers Argue?"
    Innovations Report (07/05/04); Lewis, Joyce

    Computers can not only argue, they can also evaluate conflict-resolution strategies that include reformulating actions or evading confrontation, according to a new paper co-authored by professor Nick Jennings of the School of Electronics and Computer Science at the University of Southampton. Computer agents are autonomous systems used in commercial and industrial projects, and are considered one of the most important new technologies in computer science and a new way of thinking. Jennings will present his paper at the third international joint conference on Autonomous Agents & Multi-Agent Systems (AAMAS 2004), which takes place July 19-23 at Columbia University in New York; he is general co-chair. AAMAS 2004 is co-sponsored by SIGART, the ACM Special Interest Group on Artificial Intelligence. Jennings' paper will assess the effectiveness of argumentation-based negotiation (ABN) for agents working in multi-agent systems. He says, "Artificial intelligence programs of this kind can deal with difficult problems and aid humans in many different environments. But to improve their performance, we need to ensure they have the ability to overcome real-world problems such as conflict." Agents offer much potential for use in e-commerce, but also in robotics, computer games, and information retrieval.
    Click Here to View Full Article

  • "Cybersecurity Research Underfunded, Executives Say"
    Government Computer News (07/08/04); Jackson, Joab

    The National Science Foundation (NSF) can only fund about 10 percent of the research proposals it receives for improving IT security, according to testimony at a House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census hearing this week. "There are good ideas in the cybersecurity area that we're simply not able to fund," said NSF computer and information science and engineering directorate assistant director Peter Freeman. He said the foundation has received over 150 proposals for a current solicitation in computer security, about a third of which show promise, but that the agency has only enough money to fund 10 percent of the total. Hratch Semerjian, the acting director of the National Institute of Standards and Technology (NIST), said computer security deserves more emphasis and that it is an important part of nearly every new application developed by the institute. The NSF has requested $751 million for networking and IT research next year, while NIST has requested $57.9 million for computer science research, with another $6 million specifically for cybersecurity. Overall, Rep. William Clay (D-Mo.) says federal spending on IT-related R&D will total about $2.2 billion this year, but would fall to $2 billion in 2005 under President Bush's budget proposal.
    Click Here to View Full Article

  • "Transparent Desktop Opens Doors"
    Wired News (07/09/04); Kahney, Leander

    Facetop is a videoconferencing system that enhances collaboration by combining a live video feed of users with a shared, transparent image of the desktop on one screen. The position of the users' fingertips is tracked, and collaborators can use natural pointing gestures to open and close files, choose text, or communicate ideas about the document. Facetop was initially designed by University of North Carolina at Chapel Hill computer science professor David Stotts and graduate student Jason Smith as a tool for pair programming, but the system can be applied to medical imaging, remote teaching, and other areas. Lectures or PowerPoint presentations could also be augmented with Facetop by projecting the speaker in the background of the document, where he or she can point to key sections. Smith explains that users have no problem flipping back and forth between the desktop and the subject: "It's like being in a room of conversations but having no trouble paying attention to only one," he says. The system runs on the Mac OS X operating system, whose Quartz rendering engine can make any segment of the interface transparent. Facetop has been deployed on two Apple PowerBooks and a pair of $100 FireWire cameras; the researchers expect the system to work well on the Internet, although up to now it has only been tested on Ethernet networks. The rollout of Facetop for the PC will probably be held up until the 2006 release of the Longhorn upgrade to the Windows operating system, which will feature a similar graphics subsystem.
    Click Here to View Full Article

  • "How the World Is Learning to Love ICANN"
    Register (UK) (07/08/04); McCarthy, Kieren

    The international community is warming somewhat to the new ICANN management as that team tries to rally support for itself before the "memorandum of understanding" it has with the U.S. government expires on Sept. 30, 2006. Anger over past neglect and bullying tactics from ICANN management led the United Nations to set up a "preparatory committee" investigating Internet governance; the group will present its final report at the World Summit on the Information Society (WSIS) this November. If ICANN opposition continues to grow, observers speculate the U.N. International Telecommunication Union (ITU), which runs the WSIS meeting, may take over ICANN functions. New ICANN CEO Paul Twomey has launched a diplomatic blitzkrieg, enlisting business managers and lawyers to do the governance job that was previously handled by North American computer scientists. ICANN is currently working to change the redelegation of country-code top-level domains (ccTLDs), such as .de for Germany and .uk for the United Kingdom. The Internet Assigned Numbers Authority (IANA) has slowly been taken over by ICANN, which used its monopoly on ccTLDs to force national governments and other parties into binding legal arrangements if they wanted to manage their ccTLD. ICANN recently resolved several outstanding redelegation disputes involving the ccTLDs for Nigeria, the Occupied Palestinian Territory, and Libya, and ICANN policy and development support head Paul Verhoef expects to resolve another 10 to 15 cases soon. Many countries, however, have abstained from dealing with ICANN because they do not want to lend it legitimacy. ICANN and the ITU are jointly hosting a ccTLD operation workshop this July in Kuala Lumpur.
    Click Here to View Full Article

  • "Corporate Governance Task Force Pushes Security Best Practices"
    Enterprise Systems (07/07/04); Schwartz, Mathew

    A new report from the National Cyber Security Partnership's (NCSP) corporate governance task force says getting executives involved in security is the best way to protect the nation's critical infrastructure. The report, "Information Security Governance: A Call to Action," suggests more federal funding for software development tools that root out defects, a management framework for information security governance, and more executive-level and boardroom-level attention to security. Unisys managed security services global director John Summers says the report's aim was to help governments and companies correctly implement and secure an electronic infrastructure. He says, "One of the challenges that all organizations are trying to address--the government in particular--is what is the right way to implement [and] secure an electronic infrastructure." Unisys is assisting the Transportation Security Administration with network implementation, including IT security. Summers believes that critical infrastructure industries are moving from treating security as an imperative to making it routine. Companies usually want others to define security standards and responsibilities, but it is hard to define best practices when things are still evolving, he explains. To complement the NCSP report, Summers recommends the National Institute of Standards and Technology's security infrastructure best practices, which are intended for federal agencies. He says it is too soon for regulations because threats and responses are changing too quickly, and that security is more about risk management; security assessment should involve the needs of the business overall.
    Click Here to View Full Article

  • "The New Geek"
    PC Magazine (07/13/04); Lohr, Steve

    A new breed of IT professional is emerging, one with a technical background who can channel his skills into multiple disciplines: These so-called "New Geeks" are expected to help usher in what IBM's Irving Wladawsky-Berger calls the "post-technology" era, in which technology tools are applied to business and societal problems rather than to the tools themselves. The technologies with the most potential are designed to bypass institutional barriers and traditional hierarchies to communicate, share data, and automate transactions, and such technologies include more advanced social networking, speech recognition and natural-language programs, search software, virtual-team software, and intelligent agents to help reduce the complexity of e-commerce. This trend is expected to lead to higher productivity levels, economic growth, and standards of living. The New Geek movement is also reflected in the growth of interdisciplinary programs in the computer science departments of major universities. Meanwhile, analysts expect microprocessors, memory, communications speeds, and hard drive storage to continue to improve exponentially over the next decade. Professor Erik Brynjolfsson of MIT's Sloan School of Management remarks that tech investments by themselves offer little in terms of productivity, but combining them with specific work practices produces the largest gains. The best-performing firms employ teams more frequently than their competitors, and follow a model in which easily quantifiable work is centralized and computerized while tasks that require interpersonal skills and local knowledge are decentralized. New Geeks will be optimally positioned to design and implement such tech-driven enhancements, and Brynjolfsson says the people most likely to benefit in the new global labor market will be those who can find and create innovative tech applications.
    Click Here to View Full Article

  • "Semantic Web Is 2 Steps Closer"
    eWeek (07/05/04) Vol. 21, No. 27, P. 46; Chen, Anne

    At ACM's 13th International World Wide Web Conference in May, World Wide Web Consortium director Tim Berners-Lee declared that the Semantic Web is now ready for its second phase of development thanks to the ratification of the Resource Description Framework (RDF) and the Web Ontology Language (OWL). He said developers should focus on creating Semantic Web applications in order "to bootstrap and justify the Semantic Web in the short term." The Semantic Web is conceived as a framework that can structure Web page content so that computers, not just users, can understand the data they are displaying. RDF meshes together an array of applications, providing syntax via XML and nomenclature via URIs, while OWL defines structured, Web-based ontologies that facilitate richer, cross-application data integration and interoperability. Berners-Lee advised attendees to avoid seeking out a killer Semantic Web application, arguing that the technology's justification will come out of the emergence of new links among information. He also thought MIT's Haystack tool, which was spotlighted at the conference, showed promise. The client-side Java application gives different classes of user data--email, addresses, and Web bookmarks, for instance--a consistent interface. The trade-off is the application's size, which slows it down and occupies 512 MB of memory.
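    RDF's core idea is that page content becomes machine-readable as subject-predicate-object triples, with URIs naming both resources and properties. The sketch below illustrates that model using only the Python standard library; the example vocabulary and resource URIs are hypothetical, and a real application would use a full RDF parser rather than raw XML handling.

```python
# Illustrative sketch: pull (subject, predicate, object) triples out of a
# tiny RDF/XML fragment with the standard library. The "ex:" vocabulary
# and resource URI are made-up examples, not part of the article.
import xml.etree.ElementTree as ET

RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
EX_NS = "http://example.org/terms/"  # hypothetical vocabulary

doc = f"""<rdf:RDF xmlns:rdf="{RDF_NS}" xmlns:ex="{EX_NS}">
  <rdf:Description rdf:about="http://example.org/page">
    <ex:title>Semantic Web Is 2 Steps Closer</ex:title>
    <ex:author>Anne Chen</ex:author>
  </rdf:Description>
</rdf:RDF>"""

triples = []
root = ET.fromstring(doc)
for desc in root.findall(f"{{{RDF_NS}}}Description"):
    subject = desc.get(f"{{{RDF_NS}}}about")      # the resource being described
    for prop in desc:
        # ElementTree tags look like "{namespace}local"; joining the two
        # halves yields the predicate URI for the property element
        ns, local = prop.tag[1:].split("}", 1)
        triples.append((subject, ns + local, prop.text))

for t in triples:
    print(t)
```

    Because every statement reduces to the same triple shape, independently written applications can merge data about the same URI without agreeing on a schema in advance, which is the interoperability point Berners-Lee makes.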
    Click Here to View Full Article

  • "Wireless Mesh Links Everyday Devices"
    EE Times (07/05/04) No. 1328, P. 43; Poor, Robert

    Low-cost radio-frequency integrated circuits (RFICs) plus control software imbue everyday objects with the ability to network with one another, transmitting control and sensing data. New embedded RFIC technology is best used with mesh networking techniques, as opposed to the star architecture commonly used for existing wireless technology such as Wi-Fi; this type of networking provides reliability, extends battery life, and eases deployment and operation. In any wired or wireless network setting, nodes must overcome attenuation, interference, and multipath obstacles, and mesh networking deals with attenuation and interference more effectively because nodes connect to their nearest neighbors, reducing distance and its side effects. Reduced transmission distances also decrease the threat of multipath signals, where multiple signals cancel each other out upon reaching the antenna. Low-power networking protocols such as IEEE 802.15.4 do not reach the data transmission speeds of Wi-Fi, but high data rates are not necessary or even desirable for the type of sensing and control operations mesh networking is used for; always-on, high-speed connections would quickly drain the batteries of deployed nodes, which ideally will last for several years before exhausting their charge. Currently available embedded RFICs use just 50 milliwatts of power when active, and would last just 2.5 days if constantly active; the average light switch, however, only requires transmission of four bits each day, so existing RFICs could be expected to have a duty cycle of just 0.1 percent, operating up to six years without recharge. Embedded RFICs deployed in a mesh network would also reduce human operation costs, since the network could self-repair or be patched with the addition of extra repeater nodes instead of having to physically reposition the deployed nodes.
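    The battery-life claim above follows directly from the duty-cycle figures the article gives, as a quick back-of-the-envelope calculation shows (idle power is assumed negligible for simplicity):

```python
# Check of the article's duty-cycle arithmetic: a node that lasts about
# 2.5 days when always transmitting should last years at a 0.1 percent
# duty cycle, assuming negligible power draw while idle.
always_on_days = 2.5   # battery life at 100% duty cycle (from the article)
duty_cycle = 0.001     # 0.1 percent: radio active ~86 seconds per day

battery_life_days = always_on_days / duty_cycle   # 2500 days
battery_life_years = battery_life_days / 365.25

print(f"{battery_life_years:.1f} years")
```

    The result, roughly 6.8 years, is consistent with the article's "up to six years without recharge."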
    Click Here to View Full Article

  • "Putting Rules Engines to Work"
    InfoWorld (06/28/04) Vol. 26, No. 26, P. 34; Owen, James

    A business rules management system (BRMS) enables business analysts to translate business rules into a simple programming language similar to English, giving them direct control over the rules that determine enterprise application behavior and granting them the power to change business rules themselves rather than rely on IT. Gartner estimates that IT operating costs can be reduced by 10 percent to 15 percent with a properly deployed BRMS, which also raises confidence that business rules are being implemented properly. A BRMS enables analysts to "code" business rules into declarative IF-THEN-ELSE statements that can interact with other rules, and the system arrives at a solution by thoroughly "chewing over" the rules. The similarity of BRMS statements to English means that any rule can be altered, added, or removed, passed on to the IT department for integration and testing, and rapidly shuttled into production. Analysts must consider the type of application involved and the nature of the existing infrastructure before they decide to use a BRMS, which is particularly well-suited for proprietary large enterprise business applications, especially those in which the business logic and the rest of the application are kept separate in the original design. Integrating a BRMS into an existing system is a more challenging proposition, requiring the examination of each existing business logic component. The application of BRMS technology in the financial sector has been wildly successful, while other frequent BRMS users include telecom companies, which employ the systems for a bevy of applications ranging from message routing to maintenance-related downtime scheduling to customer relationship management.
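    The core separation a BRMS provides can be sketched in a few lines: rules live as declarative condition/action pairs that analysts maintain, while a generic engine evaluates them. The rule names and order fields below are hypothetical examples, not the syntax of any real BRMS product.

```python
# Minimal sketch of the rules-engine idea: business rules kept as
# declarative condition/action pairs, evaluated by a generic engine
# instead of being hard-coded in application logic. All names and
# fields here are hypothetical illustrations.
rules = [
    # IF order total exceeds 1000 THEN flag it for manual review
    {"name": "large-order-review",
     "if": lambda o: o["total"] > 1000,
     "then": lambda o: o.update(review=True)},
    # IF the customer is long-standing THEN apply a 5 percent discount
    {"name": "loyalty-discount",
     "if": lambda o: o["customer_years"] >= 3,
     "then": lambda o: o.update(total=o["total"] * 0.95)},
]

def run_rules(order):
    """Fire every rule whose condition matches. Analysts can add, alter,
    or remove entries in `rules` without touching this engine code."""
    fired = []
    for rule in rules:
        if rule["if"](order):
            rule["then"](order)
            fired.append(rule["name"])
    return fired

order = {"total": 1200.0, "customer_years": 5}
fired = run_rules(order)
print(fired, order)
```

    A production BRMS adds far more (conflict resolution, rule chaining, authoring tools), but the division of labor is the same: the rule set changes frequently and belongs to the business side, while the engine stays stable.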
    Click Here to View Full Article

  • "Under Attack"
    InformationWeek (07/05/04) No. 996, P. 54; Hulme, George V.

    The growing threat of malware means more expensive downtime for businesses, as demonstrated by the results of the InformationWeek Research 2004 Global Information Security Survey. The percentage of companies reporting between four and eight hours of downtime and between eight and 24 hours of downtime due to cyberattacks rose 4 percent to total 22 percent this year, while companies reporting downtime of one to three days thanks to virus or worm infections more than doubled between 2003 and 2004, to total 16 percent. Hackers and malware authors have a wider spectrum of exploits to choose from as more businesses equip their increasingly distributed workforces with mobile devices and remote system access, and implement peer-to-peer networks, instant messaging, wireless local area networks, and extended supply chains. About 60 percent of survey respondents are planning to increase security spending, which currently averages 12 percent of IT budgets. Among the biggest problems mentioned by respondents are poorly designed security tools and flawed software applications; about one-third of those polled want software vendors to be held legally and financially accountable for insecure products, although 47 percent of U.S. respondents believe vendors should not be held liable if they have proven secure development practices. Many information-security managers complain that security tools fail to supply the right kind of information about security threats that would enable them to concentrate on shoring up only the most vital systems. Strategies poll respondents are employing include the deployment of technology to detect anomalous network behavior and lock down applications, and insisting that contracts with software vendors stipulate that vendors must comply with best security practices and promise that their products have been adequately tested. Most security professionals opine that a considerable amount of time will pass before applications are more secure, a view borne out by the fact that this year's respondents are prioritizing the augmentation of application security.
    Click Here to View Full Article

  • "The State of Social Tools"
    Darwin (06/04); Boyd, Stowe

    Corante Research managing director and software researcher Stowe Boyd predicts that social software tools designed to influence culture will undergo considerable convergence because they share similar core benefits. "The tools that we will use to make sense of the world must be far more socialized than today's solutions, and only those tools that make that transition will be on tomorrow's desktop," he posits. Boyd notes that past discussions of social tools have focused on their four key categories, or "Co"s: Communication tools such as instant messaging, email, and Web conferencing; coordination tools such as calendaring, task and product management, and contact management solutions; collaboration offerings such as file and application sharing, wikis, and blogs; and community tools represented by social networking, group decision, swarmth, etc. Boyd says these product categories have started to overlap toward a point where distinct market niche definitions will no longer apply, which will spur "a collision of many sorts of products, with widely varying starting points and orientations, and [that] could lead to a wholesale recasting of product categories that we almost take for granted." Boyd cites major enterprise instant messaging solutions, which are experiencing reconfiguration into a blended communication/collaboration tool that supports a wide gamut of features spanning from one-to-one text messaging to many-to-many real-time collaboration, as an example of an enabling technology for socializing almost all nonsocial software. Software-based project coordination could undergo a sea change toward a less precise, more diffuse approach, and Boyd forecasts that there will be a switch to blog networks that will be used by collaborating networks to determine the next series of tasks to be executed via emergent decision making.
    Click Here to View Full Article