Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 838:  Wednesday, September 7, 2005

  • "FSF Looking to Raise $500,000 for GPL 3 Evangelizing"
    IDG News Service (09/07/05); Martens, China

    Free Software Foundation (FSF) executive director Peter Brown said yesterday that his organization has received its first funding to raise awareness of and stimulate interest in the next version of the General Public License, GPL Version 3. Brown noted in an interview that FSF expects to raise $500,000 by year's end, which will go toward funding the GPL Version 3 Development and Publicity Project, an outreach effort that targets communities worldwide, particularly artistic groups. The foundation plans to apportion funding to its European, Indian, and Latin American branches, as well as other free software organizations, in the hopes they will establish their own GPL 3 publicity campaigns. Brown said FSF is putting together the infrastructure to accommodate the many comments expected following the circulation of a draft GPL 3.0 in either late 2005 or early 2006. He said a key consideration in drafting the new GPL iteration is addressing incompatibilities with other licenses without sacrificing user freedoms. Technological changes such as the advent of Web services must also be taken into account, Brown explained. He had no definite answer as to whether GPL 3 will include a patent retaliation clause stipulating that anyone who patents software would lose the right to use free software, which FSF Europe President Georg Greve mentioned as a possibility in a Sept. 6 Reuters story. Still, Brown restated FSF's opposition to software patents. He said, "We believe that software patents have no place. DRM is all about restricting individuals' freedoms."
    Click Here to View Full Article

  • "Report Attacks Europe's Poor Record in Information Technology"
    Computer Business Review (09/06/05)

    A new report from Indepen Consultants in Britain attributes the last 10 years' economic erosion in Europe to policies that constrain the productive employment of information and communications technology. The report finds that ICT contributed nearly three times as much to annual growth in U.S. labor productivity as to European labor productivity between 1996 and 2000. In the previous 50 years Europe exhibited fast economic and social growth, whereas now it is lagging behind other world economies. Europe's smaller ICT investments and lower productivity relative to other economies are puzzling, in light of the fact that ICT is generally available to everyone, and this leads the report's authors to conclude that once-effective policies are no longer so. A European Commission study revealed a general productivity slowdown for Europe in the mid-1990s, owing to the performance of France, Italy, Germany, and Spain. The Indepen report says using ICT productively and profitably requires "creative destruction," in which companies rise and fall, workers are hired and fired, and general-purpose skills become increasingly necessary. But making more investments in ICT capital and skills is not enough to ensure solid returns in the current European climate. Policymakers must take care that regulation does not hinder creative destruction and innovation, and the report recommends the removal of barriers to the reallocation of labor, the creation and destruction of firms, and market integration.
    Click Here to View Full Article

  • "'Six Degrees of Separation' Theory Explained in New Algorithm by UMass Amherst Researchers"
    University of Massachusetts Amherst (09/06/05); Ehrenberg, Rachel

    University of Massachusetts Amherst researchers have developed an algorithm that helps explain the sociological underpinnings of the "six degrees of separation" theory, and which could be applied to ad-hoc wireless networks, peer-to-peer file sharing networks, the World Wide Web, and other decentralized networks. UMass Amherst doctoral student Ozgur Simsek says the expected-value navigation algorithm explains the properties of the social network outlined by Travers and Milgram, who revealed through an experiment with message-passing that people are separated from each other by an average of six acquaintances. Efficiently navigating such networks is the purpose of the algorithm, which exploits portions of the network's underlying architecture, according to Simsek and computer science professor David Jensen. The searching algorithm incorporates homophily--the tendency of like to associate with like--and the predisposition of more gregarious individuals or nodes to act as hubs, so that short routes can be found with little knowledge of the network's overall structure. The algorithm essentially sends messages toward the target by passing them to sociable individuals with the greatest similarity to the target. Simsek and Jensen presented the algorithm at the 19th International Joint Conference on Artificial Intelligence.
    Click Here to View Full Article
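
    The greedy forwarding rule summarized above can be sketched in a few lines. This is an illustrative reconstruction based only on the summary (scoring each acquaintance by similarity to the target weighted by degree); it is not the researchers' published algorithm, and the sample graph, attribute encoding, and function names are invented for the example.

```python
def similarity(a, b):
    """Homophily score: fraction of attributes two people share."""
    return sum(x == y for x, y in zip(a, b)) / len(b)

def navigate(graph, attrs, start, target, max_hops=50):
    """Greedy message passing: each holder forwards the message to the
    unvisited acquaintance that best combines similarity to the target
    (homophily) with a large acquaintance count (degree)."""
    current, hops, visited = start, 0, {start}
    while current != target and hops < max_hops:
        candidates = [n for n in graph[current] if n not in visited]
        if not candidates:
            return None                       # the message is stuck
        current = max(candidates, key=lambda n:
                      similarity(attrs[n], attrs[target]) * len(graph[n]))
        visited.add(current)
        hops += 1
    return hops if current == target else None

# A four-person network with two binary attributes per person.
graph = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a", "d"], "d": ["b", "c"]}
attrs = {"a": (0, 0), "b": (0, 1), "c": (1, 0), "d": (1, 1)}
```

    No node needs a global map: each forwarding decision uses only local information about neighbors, which is the point of the UMass approach.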

  • "Pushing Girls Toward Science"
    Edwardsville Intelligencer (IL) (09/05/05); Malone, Zhanda

    A report from the National Science Foundation estimates that in 2001, 35% of the students enrolled in undergraduate physics, computer science, and math classes and 16% of those enrolled in undergraduate engineering classes were female. Meanwhile, women comprised less than 10% of students enrolled in graduate physics and engineering classes. A team of researchers at Southern Illinois University Edwardsville (SIUE) recently received a $360,000 grant designed to boost the participation of women in engineering and the sciences through efforts such as a high school robotics competition coordinated by professor Jerry Weinberg with the SIUE School of Engineering's Computer Science faculty. The professor says the program starts with teams of six to 10 students who will use robot kits to design, construct, and program a group of small mobile devices. "Participants will learn to comprehend how the tools of math and science are used in creative projects, and to learn about their application in the everyday world," Weinberg says. Weinberg says the participants will be studied in detail to acquire a better understanding of how such programs influence the way girls perceive their skill in science, technology, engineering, and math (STEM). In addition, Weinberg says the study will hopefully reveal how this perception affects girls' long-term study and career tracks.
    Click Here to View Full Article

  • "Bug Hunters, Software Firms in Uneasy Alliance"
    CNet (09/06/05); Reardon, Marguerite

    The "responsible disclosure" of security flaws can be a contentious issue between software firms and security researchers. Researchers who do not comply with Microsoft's disclosure guidelines and publicly expose a bug in detail before it is fixed can get into trouble, but independent security researcher Tom Ferris argues that Microsoft takes so long to release patches that full disclosure is warranted; critics also say full disclosure puts pressure on software makers to improve the security of their products faster. IDefense Labs director Michael Sutton says relationships between security researchers and software makers have generally improved over the last several years, and Microsoft, for one, is attempting to get into hackers' good graces through "Blue Hat" conferences and other outreach efforts. Cisco and Oracle, on the other hand, have earned researchers' enmity by failing to expeditiously fix bugs after researchers report them, as well as by not updating researchers on their progress, in keeping with responsible disclosure guidelines. Alexander Kornbrust, director of Germany's Red Database Security, publicly revealed a half-dozen security vulnerabilities in Oracle software when the software maker failed to issue fixes some two years after he first reported them; he says Oracle only gave him feedback immediately after he alerted the company to the bugs' existence. Former White House cybersecurity adviser Howard Schmidt says responsible disclosure of software bugs is critical, given America's reliance on IT systems. He suggests that technology companies' lack of responsiveness to security researchers' warnings could be addressed through an intermediate government agency, namely the U.S. Computer Emergency Readiness Team.
    Click Here to View Full Article

  • "Tiny Sensors Run Forever (Almost)"
    Wired News (09/06/05); Glasner, Joanna

    Several technologies for transmitting information wirelessly over unlicensed radio spectrum are targeting consumers, but supporters of the ZigBee specification say the technology is better than rivals such as radio frequency identification (RFID) and Wi-Fi in certain instances thanks to its low power consumption. One such instance is the deployment of large sensor networks: Upcoming ZigBee products in this vein include Lusora's system for monitoring elderly people, in which sensors peppered throughout the residence can notify family members when problems or interruptions in daily routine are detected. Meanwhile, Eaton Electrical's Home Heartbeat sensor network is designed to watch parts of a home (electrical devices, water pipes, and so forth) and take remedial action if problems crop up, as well as notify owners if a device is unintentionally left on. ZigBee sensors boast a battery life of three to five years. ZigBee Alliance Chairman Bob Heile cites power efficiency and ease of mesh networking as the technology's chief selling points, while analyst Joyce Putscher says ZigBee networks are more advantageous than their wired counterparts in their simplicity of construction and expandability. Putscher projects that ZigBee product shipments will surpass 150 million units in two years, in a best-case scenario. Analyst Erik Michielsen says putting ZigBee products in an affordable, easy-to-install package will be the key factor in the technology's commercial success.
    Click Here to View Full Article

  • "Unions Step Up Organizing of IT Workers, Outsourcing Fight"
    eWeek (09/05/05); Koprowski, Gene J.

    The growing trend of offshoring and a generally gloomy perception of the IT labor market have unions increasingly aiming their organizing efforts at technology professionals. A union-supported survey identified a "growing pessimism" in the IT landscape, citing the export of jobs overseas and the influx of foreign workers crowding the marketplace as factors contributing to the low mark of 54% of workers who predict increased demand in their industry. Union leaders are reaching out to white-collar workers, having identified common threats to their job security that defy classification based on profession or education level. A trend illustrating the need for the unionization of IT workers is the increasingly held view of them as commodities, rather than as coveted and talented individuals. After a split in the AFL-CIO in which many large unions sought to free themselves from the bureaucracy of the umbrella group, there is now a host of independent unions seeking to organize traditionally non-unionized workers, and these groups are capitalizing on the recent emergence of overseas outsourcing to pursue and protect workers with measures such as PR campaigns, boycotts, and lawsuits. IT tops the list of departments vulnerable to offshoring efforts, with India being the principal beneficiary, having claimed $20 billion in contracts last year; China came in second with $600 million. One obstacle to the unionization of tech workers is that they are often scattered around the country, working from remote locations and collaborating with each other through the Internet, though that diffusion has unions using the familiar tools of email and Web communities to try to bring them together.
    Click Here to View Full Article

  • "Man Against Machine"
    National Science Foundation (09/01/05)

    University of Texas at Austin researchers Uli Grasemann and Risto Miikkulainen applied a genetic algorithm to the development of a program that can digitally enhance images of fingerprints better than the FBI's WSQ fingerprint image compression program, according to the National Science Foundation. The researchers' work was supported by the NSF's Computer and Information Science and Engineering directorate. The WSQ program compresses fingerprint images to roughly one-fifteenth their original byte size, but the genetic algorithm Grasemann and Miikkulainen used took the basic programming instructions for compressing graphic images and from them "evolved" a much better program after 50 generations. This evolution proceeded by having effective solutions generate offspring as ineffective solutions died out, so that individual solutions improved over time until a final, optimal solution emerged. "There is definitely tremendous potential to increase the quality of work in many areas of science and engineering using genetic algorithms," Grasemann says. The FBI has almost 50 million sets of criminals' fingerprints on file and adds approximately 5,000 daily, while up to 60,000 digital fingerprint image transactions are processed by the bureau each day. Grasemann and Miikkulainen were honored at the 7th Annual Genetic and Evolutionary Computation Conference, and their work is detailed in ACM's Proceedings of GECCO 2005.
    Click Here to View Full Article
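
    The loop described above--effective solutions generating offspring while ineffective ones die out over 50 generations--is the standard genetic algorithm, sketched below on a toy problem. The bitstring encoding, population size, and stand-in fitness function are illustrative assumptions; the actual GECCO system evolved image-compression programs and scored them on fingerprint image quality.

```python
import random

def evolve(fitness, length=20, pop_size=60, generations=50, seed=1):
    """Minimal genetic algorithm: fit solutions survive and produce
    offspring via crossover and mutation; unfit ones die out."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]           # the effective solutions
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)     # two fit parents
            cut = rng.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(length)] ^= 1     # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy stand-in fitness: count of 1-bits (the real work scored image quality).
best = evolve(fitness=sum)
```

    Because the top half of each generation survives intact, the best solution found can only improve over time, which is why the population converges toward an optimum.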

  • "PCs Could Make In-Dash Splash"
    USA Today (09/07/05) P. 5B; Maney, Kevin

    Touch-screen computers are expected to be incorporated into next-generation cars as PC enthusiasts penetrate the market. Damien Stolarz, author of "Car PC Hacks," says interest in car PCs is growing thanks to the affordability of computers. He adds that "the market is educated enough" because of the migration of satellite radio, DVD players, GPS navigation, and other technologies to automobiles. The PCs would be installed in car dashboards and control information, communication, and entertainment, while also facilitating previously undreamed-of operations such as the downloading and reading of email. Stolarz envisions "personalized telemetry," in which drivers keep tabs on their friends and vice-versa through GPS, wireless links, and computerized maps. Cameras could capture accidents as they occur, or be used in conjunction with software to determine the direction of a motorist's gaze. A warning could then be triggered if the driver is distracted or nodding off. Concerns about the safety of car PCs abound, even though hackers and companies claim writing touch-screen software for car PCs has entailed a great deal of effort and research.
    Click Here to View Full Article

  • "Sending Out an SOS: HPCC Rescue Coming"
    HPC Wire (09/12/05) Vol. 14, No. 35; Lazou, Christopher

    The annual SOS Forum is designed to encourage multinational collaboration on investigations into new avenues of high-performance cluster computing (HPCC). The most recent SOS Forum focused on how supercomputers will nurture future scientific breakthroughs, with particular emphasis on the computational properties necessary for this development, whether this transition requires a new facility model or can be handled by traditional supercomputer centers, the enhancement of science via existing and emerging supercomputer architectures, and software and programming models that enable scientists to fully exploit supercomputers more easily. SOS participants aim to establish alliances between supercomputer centers and computer vendors in addition to international joint ventures between centers of excellence. Tapping computing's experimental potential to facilitate scientific innovations requires a concentrated effort, and the necessary infrastructure includes capability platforms with ultra-scale hardware, software and libraries, hardware and software engineers, and funding for seamless access by teams of researchers exploring Grand Challenge problems. National Leadership Computing Facilities being set up at Oak Ridge National Laboratory, CSCS, and elsewhere to extend the boundaries of large-scale scientific computing were highlighted at the forum. Such facilities will act as distance-learning centers whose offerings include incubator suites, joint facility offices, conference areas, and student and post-doctoral programs. The facilities will sponsor educational outreach, industrial outreach, and international collaboration.
    Click Here to View Full Article

  • "Chaos to Rule Internet in 2010"
    Computerworld Australia (09/02/05); Crawford, Michael

    Swinburne University of Technology professor Trevor Barr presented the Smart Internet 2010 report on Sept. 1, which predicted that in five years the Internet will be thrown into anarchy by the unchecked proliferation of malware, spam, and fraudulent email. The report, which came out of interviews and consultations with approximately 35 global IT experts and 28 secondary sources, outlines four issues believed to be of central importance to the Internet: the adaptive user environment, the "smart Internet," rich media, and chaos rules. Barr said the chaos-rules scenario holds that the Internet's functionality is breaking down due to an excess of applications and problems, and to the absence of trust in online transactions. The adaptive user environment school of thought points to a profound lack of a killer application from vendors or carriers over the last 10 years, demonstrating that technological change is being driven by users. This led to the prediction that successful products will stem from user-centric design in 2010. Barr argued that providing basic Internet service, rather than a smart Internet, should be a priority, as only 700 million people out of 6 billion currently have Net access. "We are really only dealing with questions of access and affordability...rich media is the territory of a lot of technological innovation but the smart designers will keep the users in mind," reasoned Barr.
    Click Here to View Full Article

  • "A World of IT Opportunities"
    Des Moines Business Record (09/04/05); Morain, Erin

    After big declines in the technology industry early in the decade, the demand for IT professionals is on the rise again. The Bureau of Labor Statistics reports a lower rate of unemployment among IT workers than the rate for the general economy, while a Robert Half report has found that 14% of CIOs intend to add full-time workers to their staffs in the third quarter, and 38% identify business expansion as the engine for hiring initiatives. As new platforms have emerged, companies must add IT personnel to create and maintain those systems. The area of technology support has enjoyed the most growth as Web applications and their attendant security concerns have reshaped the way companies conduct business; universities are picking up on this trend, offering programs of study concentrating on security to prepare graduates for a job market that is responding to increased government regulations. The increased demand has made the search for talent more competitive, as companies are now having to step up their efforts to retain quality workers. Companies are also enacting option-to-hire contracts, where employers enjoy a chance to test out prospective hires and evaluate their contribution to the business as a whole, rather than on a narrow, technical level. The demand for well-rounded employees poses an added challenge to the academic community to produce workers who bring business savvy in addition to technical skills. Due to the characterization of the IT market as volatile and uncertain, colleges are seeing fewer students pursuing technology-related fields, and the prospect of a worker shortage could lead to greater employee turnover and increased offshoring initiatives.
    Click Here to View Full Article

  • "Tech 'Computer Boot Camp' May Be First of Its Kind"
    El Defensor Chieftain (08/31/05); Topliff, Mike

    The New Mexico Institute of Mining and Technology is launching a novel program for its freshmen to teach them how to build a PC from scratch. The program is modeled after a graduate course astrophysics professor David Westpfahl has been teaching since 1997, and the components for constructing the PC are donated from corporations and state agencies. The idea originated from a surplus of obsolete computers in the university's Tech Computer Center (TCC) and the recognition that every student should have access to a computer. The university's Scholarship for Service program is also on board, having received funding and training in return for a commitment from its students to federal service after graduation. The Microsoft Academic Alliance program has enabled the issuing of individual licenses to students under the overarching TCC license, provided that each student is enrolled in the class for credit. There are currently 29 freshmen enrolled in the program, 19 of whom arrived on campus without a computer. Computer science professor Lorie Liebrock noted that the students have progressed at a faster rate than expected, and have challenged course leaders with questions about security and Linux, despite mild frustration among a few. The program has successfully dispelled students' fears of computers, and has even moved some to consider computer science as a major.
    Click Here to View Full Article

  • "Fiber-Optic Research Net Gains Steam"
    Network World (09/05/05) Vol. 22, No. 35, P. 10; Garretson, Cara

    National LambdaRail (NLR) is a nationwide fiber-optic network oriented toward advanced network technology research, one that offers more dedicated capacity than other networks; this is accomplished through the use of dense wavelength division multiplexing that delivers as many as 40 simultaneous light wavelengths, also known as lambdas, each boasting a transmission speed of 10 Gbps. NLR executive director Tom West says the network's infrastructure is managed by users rather than a vendor, and NLR has mandated that over half its capacity be devoted to network technology research. Users receive full use of one of the lambdas for a fixed amount of time by paying a usage fee, which allows researchers to experiment with new technologies without fear of interfering with others or being interfered with. "It's like a highway system; I have my own lane, and no one else can get in," explains West. The first research organization to use NLR was the Pittsburgh Supercomputing Center, which employed the network as part of its TeraGrid distributed research architecture. The center's Gwendolyn Huntoon says the flexibility and power of NLR's network are critical for advanced IP research, while NLR's pricing is more flexible than that of some commercial services. Tony Conto with the Mid-Atlantic Crossroads (MAX) consortium says his organization intends to partner with a similar organization and become an NLR member by early 2006. Conto says NLR will be used by MAX members for telemedicine and physics research in addition to advanced network research.
    Click Here to View Full Article
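
    The capacity figures above work out as follows; the 1 TB dataset used for the transfer-time estimate is an illustrative assumption, not a figure from the article.

```python
lambdas = 40                 # wavelengths per fiber, per the article
per_lambda_gbps = 10         # each lambda carries 10 Gbps
aggregate_gbps = lambdas * per_lambda_gbps        # 400 Gbps total capacity

# A researcher with a dedicated lambda moving a hypothetical 1 TB dataset:
dataset_bits = 1e12 * 8                           # one terabyte in bits
seconds = dataset_bits / (per_lambda_gbps * 1e9)  # 800 s, about 13 minutes
```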

  • "United States Facing Cyber Security Crisis, Experts Tell Capitol Hill Briefing"
    Today's Engineer (08/05); Reppert, Barton

    Experts such as MIT professor F. Thomson Leighton and Purdue University professor Eugene Spafford painted a bleak picture of U.S. cybersecurity at the July 26 Forum on Cybersecurity on Capitol Hill. Leighton, a senior member of the recently disbanded President's Information Technology Advisory Committee (PITAC), called for a dramatic increase in funding for basic research and development in civilian cybersecurity. He said the country's IT infrastructure still faces grave and immediate threats despite positive steps such as Congressional approval of the Cyber Security Research and Development Act of 2002 and the establishment of a new position of assistant secretary for cybersecurity at the Homeland Security Department. Leighton said the private sector plays an important role in securing IT infrastructure, but the federal government must sponsor the discovery and development of cybersecurity technologies underlying private-sector security products and services. He also said the federal cybersecurity effort has shifted toward classified military rather than civilian R&D, concurrent with a move favoring short-term over long-term research across all sectors. Spafford warned that public support for better cybersecurity measures will not be spurred until U.S. IT infrastructure suffers "a very large and significant failure." PITAC issued a report in February recommending a $90 million a year increase in the National Science Foundation's budget for civilian cybersecurity research; more DHS and DARPA-directed investment in civilian cybersecurity R&D; and a stronger effort to promote recruitment and retention of university cybersecurity researchers and students.
    Click Here to View Full Article

  • "On the Frontier of Search"
    Time (09/05/05) Vol. 166, No. 10, P. 52; McCarthy, Terry

    The search engines of the future will be modeled more on sophisticated patterns of human thought than on impersonal statistics and algorithms; smarter, more customized searches will produce results based on an individual's preferences and interests, rendering obsolete today's methods that yield endless strings of hits sorted by generic relevance. The number of searches performed in the U.S. jumped 22% from July 2003 to July 2004, according to comScore Media Metrix. Google accounts for 36.5% of the searches, followed by Yahoo! at 30.5% and MSN with 15.5%. More sophisticated search engines that capitalize on metadata are bringing vast repositories of pictures and video within reach: Viisage has developed a search technique that analyzes unique facial features, such as cheekbone structure and the tip of a nose, to produce matches from databases that can help law enforcement personnel identify criminals. Internet usage is also becoming more common on cell phones, which has led a host of companies to focus their efforts on creating local searches that would help a user on the street locate a restaurant or learn about the landmark in front of them. Semantic searches that match results based on an understanding of the meaning of query terms are also emerging. Users are also changing the way searches are conducted by providing descriptive feedback about a site's content that becomes part of its metadata. A more controversial search method analyzes a user's history of activity, known as a clickstream, and customizes results based on anticipated interests. Satellite and auditory-recognition programs are also offering new approaches to previously inaccessible data.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)

  • "Not the Internet You Know"
    Chronicle of Higher Education (09/09/05) Vol. 52, No. 3, P. A31; Kiernan, Vincent

    To address the growing need for fast networks capable of transmitting vast amounts of information in the scientific community, networking researchers are working to create a faster offshoot of the existing Internet2 limited to academic applications. The project, Hybrid Optical and Packet Infrastructure Testbed (HOPI), could eventually result in faster consumer connections and possibly lead to the development of the next Internet. To meet the continually growing requirements for bandwidth, network developers are using circuit switching to reserve a unique conduit for a transaction that allows as much information as necessary to flow without interruption, just as a telephone conversation is conducted on its own, private circuit; because it ties up a circuit even when no data is being transmitted, circuit switching can be inefficient when applied to a network. Another technique, packet switching, segments a body of information into packets and sends them individually across the Internet, often taking different paths as assigned by a router. This method can reliably transmit information relatively quickly, though it has difficulty handling the volume of data the scientific community is sending. HOPI amalgamates the two, sending the bulk of information through packet switching, but reserving circuits for high-volume users as necessary. The National Science Foundation and the Energy Department have launched similar endeavors, known respectively as UltraLight and UltraScienceNet, which will both be linked to HOPI. With Abilene, Internet2's high-speed network, ending its scheduled life in 2007, its successors will have to address integration and ever-increasing requests for bandwidth.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)
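
    The packet-switching step described above--segmenting a body of information, sending the packets over possibly different paths, and rebuilding the message on arrival--can be sketched as follows. The packet size and the (sequence number, payload) header format are invented for illustration; real IP packets carry much richer headers.

```python
import random

def packetize(message: bytes, size: int = 8):
    """Segment a message into sequence-numbered packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Reorder by sequence number and rebuild the message, regardless
    of which paths the packets took or the order in which they arrived."""
    return b"".join(payload for _, payload in sorted(packets))

packets = packetize(b"a body of information segmented into packets")
random.shuffle(packets)   # independently routed packets arrive out of order
```

    Circuit switching needs no such reassembly, since the whole stream follows one reserved path; the price, as the article notes, is that the circuit stays occupied even when idle.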

  • "GIS Unshackled"
    GeoWorld (08/05) Vol. 18, No. 8, P. 32; George, Randy

    The emergence of open source GIS tools and public Web Mapping Service (WMS)/Web Feature Service (WFS) data resources will provide a strong foundation upon which flexible, specialized Web interfaces to GIS functionality can be built. Open source database projects particularly suitable for spatial applications include PostgreSQL, whose server is "spatially enabled" by PostGIS. Many tools exist for capturing, creating, and editing data stored in spatial tables, such as the JUMP Unified Mapping Platform, a graphical user interface (GUI)-based spatial data viewing/processing application. The emergence of Web GIS in the open source community is unsurprising, given that open source technology and the Internet have matured concurrently. The ground was cleared for flexible, custom GIS Web apps by a number of factors, including the advent of W3C XML standards and Scalable Vector Graphics (SVG), and efforts by the Open Geospatial Consortium. WMS specifications provide standardized URL-encoded models for the access of geospatial databases via Web interfaces; overall, Web apps' core elements are browser-based user clients, HTTP server middleware, and geospatial database servers. Public WMSes can be accessed through diverse, easily developed custom SVG Web interfaces, and additional data resources are expected to emerge as public WMSes grow more advanced. The means to create custom browser-based interfaces is provided by SVG thanks to a sophisticated array of animation features, event listeners, and image filter effects.
    Click Here to View Full Article
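
    The "standardized URL-encoded models" mentioned above are concrete: a WMS GetMap call is simply a query string built from well-known parameters defined in the OGC WMS 1.1.1 specification. The sketch below assembles such a URL; the endpoint and layer name are placeholders, not a real server.

```python
from urllib.parse import urlencode

def getmap_url(base, layers, bbox, width=600, height=400):
    """Build an OGC WMS 1.1.1 GetMap request URL from its
    standard query parameters."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",                            # server-default styling
        "SRS": "EPSG:4326",                      # plain lat/lon coordinates
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Placeholder endpoint and layer; any public WMS accepts the same parameters.
url = getmap_url("http://example.org/wms", ["rivers"], (-105.0, 35.0, -104.0, 36.0))
```

    Because every conforming server honors the same parameter set, a custom SVG or browser-based client can swap endpoints freely, which is the interoperability the article credits to the Open Geospatial Consortium.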

  • "Describing the Elephant: The Different Faces of IT as Service"
    Queue (08/05) Vol. 3, No. 6, P. 26; Foster, Ian; Tuecke, Steven

    The enterprise IT environment's transition to distributed, low-cost, and frequently heterogeneous collections of servers has fragmented the architecture into segregated silos, and reintegration must take place so that the new environment can support the advantages of vertical decoupling and horizontal integration. Many popular terms are floating around and breeding confusion, but they all relate to the move from vertically integrated silos to horizontally integrated, service-oriented systems. Grid is an umbrella term for solutions that relate to the flexible use of distributed resources for various applications, while grid infrastructure speaks of a layer for horizontal infrastructure integration. The meanings of utility computing and on-demand computing coincide with that of grid in that they refer to IT as service; data center automation seeks to automate operations on applications that are usually not modified for distributed execution. Service-oriented architectures (SOAs) are collections of services for interface separation and deployment necessary to facilitate compatibility, location transparency, and loose service/client coupling, while Web services are a set of technologies needed to bring SOAs into being. In a horizontally integrated, service-oriented enterprise IT architecture, applications use workload managers to coordinate access to physical resources through a common grid infrastructure layer that supports resource modeling, monitoring and notification, allocation, provisioning, life-cycle management, decommissioning, and accounting and auditing. The grid infrastructure must deploy management capabilities consistently across heterogeneous resources while avoiding vendor lock-in through standardization and open source software, and efforts in these areas are proceeding quickly.
    Click Here to View Full Article