
ACM TechNews sponsored by Parasoft

Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 723:  Wednesday, November 24, 2004

  • "E-voting Faces New Scrutiny"
    CNet (11/24/04); Lemos, Robert; Shim, Richard; Hansen, Evan

    The Government Accountability Office (GAO) will launch an investigation into e-voting irregularities in the recent presidential election, according to five Democratic representatives who asked for the review. Experts have credited e-voting systems with performing relatively well and said that most of the problems on Nov. 2 could be attributed to poorly trained election workers and other non-technical factors. Electionline.org director Doug Chapin hopes the GAO will spur Congress to get more involved in the voting reform effort, perhaps by ordering national voting standards to be established or by requiring paper receipts so that e-votes can be audited. The Help America Vote Act passed in 2002 was bolstered in October when Congress agreed to increase funding for new voting systems from $500 million to $1.5 billion, though experts have complained that much of the money has been spent on systems that have not been independently tested. The House Judiciary Committee has received reports of about 57,000 incidents that will be investigated by the GAO, said the legislators who called for the examination: Some voters said they had chosen different candidates than the ones shown in their vote confirmation, and machines in some polling places crashed and rebooted without providing evidence that cast votes had been counted. The Ohio Secretary of State admitted that President Bush inexplicably received nearly 4,000 phantom votes in preliminary results, while a House of Representatives group urging an investigation singled out counties in Florida, Ohio, New Mexico, and Pennsylvania for special consideration, saying those areas suffered the worst reported problems. A recent paper from the University of California at Berkeley lent some substance to claims that Florida vote counts were influenced by the type of voting machine used: voters who used electronic machines tended to favor President Bush in proportion to the number of registered Democrats in those areas.
    Click Here to View Full Article

    For information on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "Congress Ups H-1B Visa Cap by 20,000"
    Computerworld (11/22/04); Thibodeau, Patrick

    Among the provisions contained in an omnibus spending bill passed by Congress this past weekend is a 20,000 increase in the H-1B visa cap, which applies only to foreign nationals holding master's and Ph.D. degrees from American universities. IT worker organizations such as the National Society of Professional Engineers wanted Congress to retain the original 65,000 cap, backing their argument with claims that employees allowed into the country on H-1Bs are competing for U.S. jobs and insisting that no serious shortfall exists in the national engineering and high-tech workforce. Also opposed to an increase is the American branch of the IEEE, which contends that high-tech unemployment levels have dipped since the cap was lowered, citing U.S. Bureau of Labor Statistics figures. According to the bureau's findings, the number of out-of-work IT professionals fell by 92,000 between the first and third quarters of 2004, with the population of unemployed programmers shrinking from 37,000 to 25,000 and unemployed computer scientists and systems analysts declining from 48,000 to 17,000. "Because U.S. industry has been more restricted in its ability to bring overseas guest workers into the country, it has had to hire more U.S. citizens to fill open positions," explains IEEE-USA President John Steadman. On the other hand, Joanna Smith Bers with DB Marketing Technologies sees a shortage of U.S. graduates trained in math, statistics, marketing, and business, which partly explains the attraction of H-1B holders. High-tech business and industry organizations pressuring Capitol Hill for an increase in the H-1B cap complain that the congressionally authorized raise falls short of the mark: AEA VP John Palafoutas says about 50,000 additional visas are needed.
    Click Here to View Full Article

  • "Silicon Valley's No Place to Find a Job"
    Investor's Business Daily (11/23/04) P. A4; Turner, Nick

    Four years after the dot-com meltdown, many Silicon Valley tech workers have yet to swallow a bitter pill: An unwillingness or inability to relocate or make a career change will most likely mean long-term unemployment. Challenger, Gray & Christmas estimates that nearly 55,000 tech industry positions were eliminated in the third quarter of this year, a 60 percent increase over second-quarter losses and the highest number since the 82,328 layoffs recorded in fourth-quarter 2003. Furthermore, CEO John Challenger doubts that the current resurgence in hiring will balance out the losses, while Silicon Valley's slump could continue for several years. Onetime San Jose high school principal Art Darin says the Valley's slump tells students that a college technical degree no longer guarantees a good job, while Challenger says that this factor could discourage students from studying technology and lead to a shortage of much-needed tech workers if the economy bounces back. He also says tech workers share a certain degree of responsibility for their plight by waiting too long for the job market to improve and refusing to consider relocation or lower-paying jobs out of pride. But Challenger criticizes employers for not trying hard enough to place out-of-work employees in new positions. Associate dean of National University's academic center Charlene Ashton reports that many high-tech workers are reconsidering their careers in the wake of the slump, and some are deciding to become teachers through initiatives such as her school's Tech to Teach program. She also notes that between 100 and 150 former tech employees are pursuing teaching degrees outside the program. Ashton acknowledges, however, that some of these people are independently wealthy.

  • "Europe Delays Vote to Unify Patent Rules"
    New York Times (11/23/04) P. C12; Meller, Paul

    Poland's announcement that it is considering withdrawing support for a proposed law that defines the patentability of technological innovations has caused European governments to postpone voting on the legislation. Wlodzimierz Marcinksi with Poland's ministry of science and information technology said the Polish government does not fully endorse the text the European Union's 25 members agreed on in May because legal experts think it leaves a door open for software patents. At the same time, Polish officials consider the proposed law to be important to the protection of real inventions, as opposed to incremental modifications of computer programs. The Foundation for a Free Information Infrastructure, an open-source advocacy organization, calls Poland's withdrawal a sensible move, arguing that the text agreed to in May would ease the patenting of "pure" software in Europe. The directive, seen as a way to unify patent rules throughout Europe, was amended by the European Parliament a year ago to include a ban on data-processing innovation patents and stricter rules for the registration of computer-related inventions, a measure that tech companies object to on the grounds that it would nullify the patenting of valuable technologies. "We want a law that would prevent software patents but would allow the patenting of inventions that exist in the real world," noted Mark MacGann with the European Information, Communications, and Consumer Electronics Technology Industry Associations. James Heald with the Foundation for a Free Information Infrastructure said a compromise can be reached between supporters of Parliament's version and proponents of the version agreed to by EU members, with additional consultation on both sides. A vote on the directive has been rescheduled for the first half of December.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Group Proposes System to "Connect the Dots" About Terrorist Attacks"
    GovExec.com (11/18/04); Harris, Shane

    The Atypical Signal Analysis and Processing (ASAP) computer system proposed by the RAND Corporation is designed to filter massive amounts of information through multiple networks and databases in order to help U.S. intelligence agencies identify and analyze the most relevant indications of imminent terrorist attacks. "An information search that could take dozens of intelligence analysts days to complete could be carried out within hours" with ASAP, according to principal researcher John Hollywood. Comparisons to the Pentagon's defunct Total Information Awareness (TIA) initiative are inevitable, and former TIA director John Poindexter warns that Capitol Hill could strongly oppose ASAP unless privacy issues are adequately addressed. Hollywood argues that the need to "anonymize" data is not as pressing with ASAP, as the system would concentrate on information already obtained by the government or within public records. RAND's official position is that "the ASAP network would work with a small and restricted data set consisting solely of intelligence and homeland security information," and would search that data only if there is a high enough level of suspicion to justify a subpoena under existing U.S. law. The RAND proposal also suggests that ASAP could be used to analyze information concerning infrastructure, commerce, and industry that has a direct impact on national security and America's economy. ASAP's development could span several years, and Hollywood advises that in the meantime officials should set up online bulletin boards where suspicious activities or incidents could be posted by counterterrorism agencies. He also encourages the use of search engines to correlate queries about activities with already posted information.
    Click Here to View Full Article

  • "Software Detects the True Artist"
    Wired News (11/22/04); Shachtman, Noah

    A team of Dartmouth College researchers has developed a set of computer algorithms that can be used to verify or refute the authenticity of artwork via statistical analysis, although statisticians and museum curators note that testing has only just started. Dartmouth computer science professor and project leader Hany Farid was in a unique position to devise such techniques, having digitized paintings and Egyptian tombs and combined them into 3D images, as well as created computer programs that can automatically spot doctored high-resolution photos. Farid and Dartmouth mathematics professor Daniel Rockmore employed similar methods to analyze a painting attributed to Pietro Perugino, photographing the artwork and digitizing it into an enormous 16,852 by 18,204-pixel image. The picture's six faces were split into several hundred 256 by 256-pixel blocks that were filtered and run through the algorithms, which generated a set of numbers that, when graphically plotted, indicated that at least four different artists worked on the canvas (an illustrative sketch of this block-and-statistics approach follows below). The algorithms were also used to confirm art historians' suspicions that five out of 13 drawings credited to Flemish artist Pieter Bruegel were imitations, according to a paper in this week's Proceedings of the National Academy of Sciences. Stanford University statistics professor David Donoho says the Dartmouth researchers' method "will be much more persuasive when they can make a prediction of fraudulence." Farid, however, envisions the software as just one more analytic tool in the art-authentication arsenal, alongside forensic conservation and examination by connoisseurs. But he notes that the algorithms would fit nicely into the niche application of determining how many people worked on a single piece of art.
    Click Here to View Full Article
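
    The block-and-statistics procedure described above lends itself to a rough illustration. The following Python sketch is not the Dartmouth code: the published technique uses a wavelet-style decomposition and higher-order coefficient statistics, whereas this example substitutes simple Sobel edge responses and assumes the painting has already been loaded as a grayscale array; the block size and summary statistics are illustrative choices only.

    # Illustrative sketch only: tile a digitized painting into fixed-size blocks,
    # summarize each block with simple filter-response statistics, and return
    # feature vectors that could be plotted or clustered to look for different hands.
    import numpy as np
    from scipy.ndimage import sobel

    def block_features(gray, block=256):
        """Return one feature vector per non-overlapping block of a grayscale image."""
        feats, positions = [], []
        h, w = gray.shape
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                tile = gray[y:y + block, x:x + block].astype(float)
                # Crude stand-in for the wavelet decomposition used in the paper.
                gx, gy = sobel(tile, axis=1), sobel(tile, axis=0)
                stats = []
                for band in (gx, gy):
                    centered = band - band.mean()
                    stats += [band.mean(), band.std(),
                              float((centered ** 3).mean()),   # unnormalized skewness
                              float((centered ** 4).mean())]   # unnormalized kurtosis
                feats.append(stats)
                positions.append((y, x))
        return np.array(feats), positions

    Plotting the resulting vectors (for instance after reducing them to two dimensions) mirrors the graphical comparison the article describes: blocks whose statistics cluster apart suggest more than one contributor.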

  • "Consortium Sheds Light on Dark Fiber's Potential"
    EE Times (11/22/04); Mokhoff, Nicolas

    The National LambdaRail (NLR) project takes advantage of unused underground fiber-optic cables, or "dark fiber," left over from a 1990s buildout to establish a national infrastructure for networking research and R&D into next-generation protocols, services, technologies, and applications for science, medicine, engineering, and other fields, according to NLR President Tom West. The privately funded NLR consortium has raised $115 million to implement a five-year plan to set up 40 U.S. networks, and West says these networks will exist side by side in the same fiber-optic cable pair because each will be supported by its own individual lightwave or lambda. At the recent SC2004 conference, West sent out a call for network "plumbers" to maintain the NLR infrastructure so that a national-scale network can be developed. NLR board member John Silvester reports that NLR is a national-level deployment similar to the Corporation for Education Network Initiatives in California project, which involved the implementation of a multilayer advanced network supported by leased dark fiber. West says NLR and similar initiatives are expected to help cultivate new technologies and markets and boost the country's economic health. Research agencies already control and operate certain 10 Gbit lambdas such as the OptiPuter, an IP-based backbone that employs optical networking rather than individual computers as the central processing component. The "supernetworks" supported by the infrastructure allow scientists to interactively visualize, examine, and correlate massive amounts of data from multiple optical net-linked storage sites. OptiPuter principal investigator Larry Smarr is in charge of a National Science Foundation-funded effort to establish high-performance science environments or "collaboratories" that employ lambdas on multiple levels--campus-wide, state-wide, nationally, and internationally.
    Click Here to View Full Article

  • "Rules of the Collaboratory Game"
    Technology Review (11/23/04); Bender, Eric

    Scientists are increasingly relying on collaborative, networked technologies to share computing and informational resources. The National Institutes of Health's Biomedical Informatics Research Network (BIRN), for example, helps schizophrenia researchers share brain scans, instrumentation, software tools, and other resources, and BIRN coordinating center director Mark Ellisman says the trend toward collaborative research tools cannot be stopped. In fact, more than 200 collaboratories are now under way, says Gary Olson, a professor of human-computer interaction at the University of Michigan and a partner in the Science of Collaboratories project, funded by the National Science Foundation. Olson defines a collaboratory as "an organizational entity that spans distance, supports rich and recurring human interaction oriented to a common research area, and provides access to data sources, artifacts and tools required to accomplish research tasks." The cost and complexity of research are driving scientists to take a collaborative approach, but success requires careful planning and an understanding of the dynamics behind such an endeavor, such as the need for high-capacity bandwidth and computing resources. Ellisman says recruiting computer scientists to work on the cutting-edge infrastructure can be challenging as well, while laboratory scientists must be convinced of the value of releasing their experimental data after they have published their hypothesis, in order for others to build on their study or take it in new directions. This is important when tackling big problems such as genome sequencing or HIV/AIDS, says Ellisman. Often the tools and data each lab works with do not automatically sync, so collaborators have to agree on common data formats or conversions; in addition, the tools need to be easy to use, otherwise user adoption will remain low.
    Click Here to View Full Article

  • "Study Tracks African Americans in IT Programs"
    NewsFactor Network (11/22/04); Martin, Mike

    A National Science Foundation grant will allow Virginia Tech researchers to investigate the reasons why African Americans enter the IT field. Undergraduate IT enrollment among African Americans rose dramatically during the 1990s, but has fallen precipitously in the last four years. The purpose of the $617,000 study is to find out what factors encourage African Americans to pursue IT when the market for technology jobs is not as good as it was in the late 1990s, says Virginia Tech Center for Global E-Commerce director France Belanger. In 2000, the last year for which hard numbers are available, 3,330 undergraduate computer science degrees were awarded to African Americans, up 67 percent from 1991, when just 1,997 African American students earned degrees, according to Virginia Tech's Wanda Smith, the study's director and an associate professor of management. The Virginia Tech group will also look into the characteristics of current African American IT professionals, such as learning styles, visual-spatial intelligence, and resilience of personality. Mentoring and internships will also be considered as factors that could help recruit more African Americans into academic and professional IT careers. The researchers are working from a model that shows how students choose their course of study, perform academic IT work, and transition into the work force; they plan to give three sets of surveys to undergraduates, first-year graduates, and first-year IT employees to help determine motivations and inform future training programs. Belanger notes that few other career options offer as rewarding a return on investment as the IT career path.
    Click Here to View Full Article

  • "More Funding Needed for Security R&D, Committee Says"
    Government Computer News (11/19/04); Jackson, William

    The U.S. government received poor marks from the chairman of the President's IT Advisory Committee's (PITAC) cybersecurity subcommittee for its failure to provide sufficient research and development funding to improve IT security. MIT faculty member and Akamai Technologies chief scientist F. Thomas Leighton made his assessment during a presentation of draft findings and recommendations from a subcommittee study at a Nov. 19 PITAC conference. The committee also found that more and more government research projects are becoming classified and short-term results-oriented, and the panel called for a reversal of these trends and the institution of a central authority to assess research requirements and coordinate government funding. These conclusions were based on an analysis of funding for basic research by the Defense Advanced Research Projects Agency (DARPA), National Science Foundation (NSF), Homeland Security Department, National Institute of Standards and Technology, and National Security Agency (NSA). Agencies such as the NSA and DARPA receive the lion's share of the R&D funding for military and intelligence programs; most civilian security research funding comes from the NSF's $30 million CyberTrust program, which the PITAC subcommittee thinks should be increased by at least $90 million yearly. The subcommittee urged more funding for radical, long-term R&D, as well as a certain degree of acceptance that some projects may not pan out. Future areas of concentration identified by the subcommittee include end-to-end system security; computer authentication strategies; cyberforensics; monitoring and detection; secure software engineering; mitigation and recovery methodologies; fundamental networking protocol security; new technology modeling and test beds; and metrics, benchmarks, and best practices for assessing and deploying security products.
    Click Here to View Full Article

  • "ICANN Pitches the Internet's Future"
    Register (UK) (11/20/04); McCarthy, Kieren

    In a bid to secure its role as the primary administrator of the Internet, ICANN recently published its Strategic Plan for the next three years. On Sept. 30, 2006, ICANN's memorandum of understanding with the U.S. government expires, and control of the Internet will effectively be up for grabs. One alternative to ICANN could be the International Telecommunications Union, which governs most of the international communications networks besides the Internet and is itself run by government entities. In its Strategic Plan, ICANN appears to address many of the complaints that have been lodged against it since it was formed in 1998. Up until the appointment of Australian diplomat Paul Twomey as CEO last year, ICANN had largely been accused of being too secretive, elitist, unresponsive, and U.S.-centric. With the release of its new Strategic Plan, the organization lists several ways it plans to address these criticisms. For example, ICANN is pledging to revamp its complaint procedures, regulate its own decision-making process, be more open with the public, and establish more offices around the world. The proposed changes are designed to position ICANN as the best choice for the administration of Internet-related affairs when the 2006 deadline arrives. The author argues that ICANN as structured in the new strategic plan is a better choice for running the Internet than the ITU; whereas with the ITU governments are all-powerful, ICANN is still steered by the technologists who created the Internet. Nevertheless, the author says ICANN must make good on its promises of creating an Ombudsman, establishing the Independent Review Panel, and generally bringing greater openness to its proceedings.
    Click Here to View Full Article

  • "Storing and Managing Valuable Employee Experience"
    IST Results (11/19/04)

    The PELLUCID project has produced a software platform that can be used to develop systems for managing the experiences of employees in the public sector. By providing tools and methods for recording, storing, and distributing the experience and knowledge of workers, public sector organizations would have an easier time bringing new employees up to speed, as they replace workers who have moved on to other departments. Project coordinator Simon Lambert of Rutherford Laboratories near Oxford says the representation of working context is what makes the experience management system special. "We have very rich representation of working context that includes position in the work process as well as the attributes of the case in hand," says Lambert. "We use domain-specific reasoning about the similarity of cases, and can match the advice offered to the context accordingly." The PELLUCID system provides "Active Hints" to users as they attempt to solve a problem, and users can rank the usefulness of hints to aid the system in future assistance efforts. The system has been used as a pilot application for call center support for the Consejeria de la Presidencia in the regional government of Andalusia, and users say their work has been made substantially easier. Lambert plans to work with the public sector to make the technology more widely available.
    Click Here to View Full Article
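
    The context matching and user-ranked hints Lambert describes can be pictured with a small sketch. This is not the PELLUCID platform's design: it assumes a working context can be flattened into attribute-value pairs, uses plain attribute overlap in place of the project's domain-specific similarity reasoning, and the weights and field names are invented for illustration.

    # Minimal sketch of context-based hint matching in the spirit of "Active Hints"
    # (not the project's actual design). Contexts are flat attribute dictionaries,
    # similarity is simple attribute overlap, and past user ratings nudge the ranking.

    def similarity(ctx_a, ctx_b):
        """Fraction of shared attribute values between two working contexts."""
        keys = set(ctx_a) | set(ctx_b)
        if not keys:
            return 0.0
        return sum(1 for k in keys if ctx_a.get(k) == ctx_b.get(k)) / len(keys)

    def rank_hints(current_ctx, stored_cases):
        """Return hints from past cases, ranked by context similarity and user rating.

        stored_cases: list of dicts with keys 'context', 'hint', 'avg_rating' (0-1).
        """
        scored = []
        for case in stored_cases:
            score = 0.7 * similarity(current_ctx, case["context"]) + 0.3 * case["avg_rating"]
            scored.append((score, case["hint"]))
        return [hint for _, hint in sorted(scored, reverse=True)]

    # Example: a new call-center case is matched against two archived cases.
    cases = [
        {"context": {"process_step": "triage", "case_type": "permit"},
         "hint": "Check the applicant's prior permit history first.", "avg_rating": 0.9},
        {"context": {"process_step": "closing", "case_type": "complaint"},
         "hint": "Attach the resolution summary before closing.", "avg_rating": 0.6},
    ]
    print(rank_hints({"process_step": "triage", "case_type": "permit"}, cases))

    In a system like the one described, the stored ratings would be updated as workers rank the usefulness of the hints they receive, gradually improving the ordering.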

  • "Government Uses Color Laser Printer Technology to Track Documents"
    Medill News Service (11/22/04); Tuohey, Jason

    For decades, color laser printers from Xerox and other companies have been modified to embed their serial number and manufacturing code on each document the machines produce, for the purpose of tracking down forgers. Xerox research fellow Peter Crean says his company's printers code these numbers in millimeter-sized yellow dots that are invisible to the naked eye, but are revealed under close examination when a blue LED light is shined on the printout. "It's a trail back to you, like a license plate," he notes. Crean says his company originated the technology about 20 years ago to address concerns that Xerox color printers could be easily employed to forge bills. He says the encoding is performed by a chip positioned deep within the printer's innards, close to the laser, that cannot be disabled by "standard mischief." Crean says that law enforcement agencies in the United States and other countries use the technology to trace counterfeiters. The embedded serial numbers could also be used to track down any person or business that printed a document, but Lorelei Pagano with the U.S. Secret Service insists that the government only authorizes the use of the technology for information-gathering in the case of criminal acts. This has done nothing to assuage the fears of lawyer John Morris with The Center for Democracy and Technology, who argues that consumers must at the very least be notified of the technology's presence in the machines.
    Click Here to View Full Article
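
    Xerox has not published its dot layout, so the following is only a toy illustration of the general mechanism the article describes: a serial number encoded as a sparse pattern of tiny dots that can later be read back. The 8x8 grid, bit ordering, and parity check are invented for the example and bear no relation to any vendor's actual scheme.

    # Toy illustration (not Xerox's actual encoding): map a serial number's bits
    # onto an 8x8 grid, where a 1-bit becomes a faint dot at that cell. Real
    # tracking codes also encode date/time and use undisclosed layouts.

    def serial_to_dot_grid(serial, rows=8, cols=8):
        """Return (row, col) positions of dots encoding the serial number in binary."""
        bits = format(serial, "0{}b".format(rows * cols - 1))   # reserve one cell for parity
        bits += str(bits.count("1") % 2)                        # simple parity bit
        return [(i // cols, i % cols) for i, b in enumerate(bits) if b == "1"]

    def dot_grid_to_serial(dots, rows=8, cols=8):
        """Invert the toy encoding, checking the parity bit."""
        bits = ["0"] * (rows * cols)
        for r, c in dots:
            bits[r * cols + c] = "1"
        payload, parity = bits[:-1], bits[-1]
        assert str(payload.count("1") % 2) == parity, "parity mismatch"
        return int("".join(payload), 2)

    dots = serial_to_dot_grid(8675309)          # hypothetical printer serial number
    assert dot_grid_to_serial(dots) == 8675309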

  • "Mapmaking Turns to the Terrain of the Face"
    San Francisco Chronicle (11/23/04) P. G1; Fost, Dan

    A joint project among three Canadian learning institutions will use Silicon Graphics technology to create interactive facial simulations that can be manipulated in real time, with potential applications in video games, the film industry, and crime or terrorism investigation. SGI's role in the project will take the form of a $200,000 Onyx4 UltimateVision system, which uses proprietary chips and software. The venture, which involves the participation of the University of Toronto, Queen's University, and the Sheridan Institute of Technology and Advanced Learning, is part of a $10 million initiative primarily underwritten by the Canadian government. The psychologists enlisted for the project plan to precisely isolate facial patterns with the models, which will be taken from 3D scans of human faces. Possible experiments could include how people react to or interpret the simulations' expressions and how such reactions are affected by slight alterations in the expressions. Avrim Katzman of the Sheridan Institute's Visualization Design Institute says the effort will build on UC San Francisco researcher Dr. Paul Ekman's work identifying the many facial muscles used in expression. He explains that facial communication is based on the interaction of multiple parameters, such as the words spoken, their vocal quality, and the expressions employed. The SGI system will allow researchers to "change those parameters at will," Katzman says.
    Click Here to View Full Article

  • "Pointillist Protection"
    Computerworld (11/22/04) Vol. 32, No. 47, P. 30; Hamblen, Matt

    Carnegie Mellon University's CyLab is working on new computer security methodologies and other breakthrough technologies with federal and private-sector contributions. A $6.4 million National Science Foundation grant will fund CyLab's Security Through Interaction Modeling, a project that will model interactions between people, their computers, and external attacks, and incorporate the models into computer safeguards. Another CyLab project is designed to monitor deviant behavior that may result from buffer overflows and other bugs: The method contrasts a precomputed model of a system's expected performance against the observed interactions of applications with the operating system (a simplified sketch of this baseline-versus-observed comparison follows below). The project is named Seurat after the French pointillist painter because there are numerous points, layers, or places where one could gauge a system's activities to uncover evidence of an attack. CyLab co-director and dean of the Carnegie Mellon College of Engineering Pradeep Khosla notes that several CyLab projects are focused on the design of self-healing or autonomic computer systems. One such project is the Self*Storage System, which operates on the principle that systems possess more than one point of failure; this means that a system can rapidly ascertain whether a piece of data has been tainted and automatically revert to its original state. CyLab's Coral project, meanwhile, is devoted to developing network defense components that repel malware. Carnegie Mellon researcher Chenxi Wang explains that Coral aims to make a network capable of detecting an attack early and responding to it in real time by identifying the handful of critical nodes that are chiefly used to spread the infection. Khosla says CyLab has a broad research mission, in order to help achieve "a world where we can push measurable, sustainable, secure, trustworthy and available data."
    Click Here to View Full Article
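
    The baseline-versus-observed comparison attributed to the Seurat project can be sketched in very broad strokes. This is not CyLab's implementation: it assumes the precomputed model of expected behavior has already been reduced to a frequency profile of operating-system interactions, and it flags windows of activity whose event distribution drifts too far from that profile; the event names and threshold logic are made up for the example.

    # Hedged sketch of baseline-vs-observed anomaly detection, not the Seurat code:
    # precompute a normalized frequency profile of expected system interactions,
    # then score live windows of events by how far their distribution drifts.
    from collections import Counter

    def profile(events):
        """Normalized frequency distribution over event names."""
        counts = Counter(events)
        total = sum(counts.values())
        return {name: n / total for name, n in counts.items()}

    def drift(baseline, window):
        """Total absolute difference between baseline and observed event frequencies."""
        observed = profile(window)
        names = set(baseline) | set(observed)
        return sum(abs(baseline.get(n, 0.0) - observed.get(n, 0.0)) for n in names)

    # Baseline built offline from known-good runs (hypothetical event trace).
    baseline = profile(["open", "read", "read", "write", "close"] * 100)

    # A live window dominated by unexpected calls scores high and would be flagged
    # once the drift exceeds a tuned threshold.
    suspicious = ["exec", "connect", "write", "exec", "connect", "exec"]
    print("drift score:", round(drift(baseline, suspicious), 3))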

  • "Defining Tomorrow's Database"
    eWeek (11/15/04) Vol. 21, No. 46, P. D1; Coffee, Peter

    Tomorrow's database must contend with increasingly sophisticated intruders, corporate governance requirements, and public awareness of database security risk. It must offer higher levels of analysis and rich-media flexibility without jettisoning expensive investments in servers, software, and developer skills, while simultaneously fulfilling critical roles in more regions, rapidly delivering services on shorter notice, and better shielding itself against both intentional threats such as deliberate misuse and unintentional threats such as inadvertent data disclosure. The database platform and applications developer must therefore adjust to a fluid mission and environment by combining speed, flexibility, and survivability in new ways, blending real-time knowledge with business logic to spawn competitively advantageous applications. Pressures on database size resulting from the wider deployment of radio-frequency identification tagging, for example, will be characterized by intermittent surges in activity rather than continuous upward trends. With such developments unfolding, it is more critical than ever to design security, whenever practical, directly into the database instead of depending on each application that uses the database to participate effectively in security; in fact, Gartner recently calculated that halving application vulnerabilities before implementation would reduce enterprise configuration management and incident-response costs by 75 percent. The growing power and availability of database programming capabilities gives designers more opportunities to develop and enforce information access policies instead of leaving them to be relayed to, understood by, and deployed in the code of numerous disparate application developers (a simplified illustration of such a policy-in-the-database approach follows below). Developers should not succumb to the temptation to base database design and operations on the need to keep risks at a minimum in order to comply with corporate or legislative mandates, but should instead increase their participation in business unit discussions of risk and reward.
    Click Here to View Full Article
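
    One concrete, deliberately simplified reading of "design security directly into the database" is to make the access policy a database object that every application goes through, rather than trusting each application to reimplement the same filter. The sketch below uses SQLite via Python purely for illustration; the schema, the region-scoped view, and the column choices are invented, and a production system would rely on the engine's own roles, row-level security, or stored procedures.

    # Minimal sketch: encode an access policy inside the database as a view, so
    # every application that queries the view gets the same restriction. Schema
    # and policy here are illustrative, not a recommendation for any product.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (
            id INTEGER PRIMARY KEY,
            name TEXT,
            ssn TEXT,            -- sensitive: should never reach ordinary apps
            region TEXT
        );
        INSERT INTO customers VALUES
            (1, 'Ada',  '111-22-3333', 'WEST'),
            (2, 'Grace','444-55-6666', 'EAST');

        -- The policy lives in the database: ordinary applications see only
        -- non-sensitive columns, and only rows for their assigned region.
        CREATE VIEW customers_west_app AS
            SELECT id, name, region FROM customers WHERE region = 'WEST';
    """)

    # Applications query the policy view instead of the base table.
    print(conn.execute("SELECT * FROM customers_west_app").fetchall())

    Because the restriction lives in the view definition, changing the policy means changing one database object instead of auditing every application that touches the table.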

  • "Coming Out of Cold Storage"
    Software Development (11/04) Vol. 12, No. 11, P. 30; Morales, Alexandra Weber

    Results of the 2004 Software Development salary survey indicate a shift in priorities among employed software developers over the last five years from job perks and flexible schedules to job stability, as well as slight salary growth to just below $80,000--a significant development after two years of zero growth at $77,000. Job-seeking has basically remained level: 31 percent of respondents report that they are somewhat seeking employment while 7 percent say they are actively seeking employment; reasons for seeking employment, in ascending order, are offshoring, mergers, firings, stock options, job market opportunities, and job stability. Satisfaction levels are more or less the same as last year, with less than 3 percent reporting extreme dissatisfaction with their compensation package, 14 percent reporting general dissatisfaction, 21 percent neutral on the subject, 47 percent reporting general satisfaction, and 15 percent reporting extreme satisfaction. Average bonuses have fallen from $5,000 for staff and $10,000 for managers last year to $4,000 for staff and $8,000 for managers this year. The highest-paid developers are those who work with Java Messaging, J2ME, CORBA, J2EE, and SOAP, while the lowest-paid developers are those who work with Delphi, Cobol, Fortran, EDI, .NET, and Oracle; interestingly, the mean salary for people who work on open source/free software is $88,000 for staff and $98,000 for managers. Respondents report having been with their companies for seven years, on average, and expecting to retain their current occupation for two to five years. Web services account for 31 percent of respondents' applications this year compared to 23 percent in 2002, while application service providers grew only 5 percent, to 22 percent, between 2000 and 2004. The portion of female respondents declined 2 percent to 11 percent in 2004, and the average female developer earns $79,000 while the average male earns $87,000.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Building Minds, Not Widgets: Technology for the Business of Learning"
    IT Professional (10/04) Vol. 6, No. 5, P. 12; Post, William

    California State University (CSU) Chico CIO William Post writes that IT will never fulfill its potential unless academic CIOs can successfully extend IT's role from a records management tool to a cost-effective learning-enhancement technology. CSU has launched its Academic Technology Program with this goal in mind. At the heart of the program are eight initiatives: Support for students' academic excellence and success by improving their interaction with the school's administrative and academic systems; provision of online modules and associated support services for foundation skills; promotion of interdisciplinary collaborative identification, development, and exchange of digital learning materials; support for the sharing of academic technology effectiveness research results and the creation of new knowledge on educating with technology; cultivation of effective technology use skills among teachers, staff, and administrators; the organization of course development teams proficient in instructional and interface design, media production, project management, and programming; the development of a standards-based academic technology shared-services environment; and the creation of a CSU digital marketplace. Post cautions that CIOs should not undertake a learning-process redesign without first reviewing previous deployments of transformational learning technology, the lesson being that "despite process redesign, incompatible or mismatched technology cannot 'fix' a university's learning processes." He explains that CIOs who are accustomed to being the big decision-makers in enterprise technology solutions must come to terms with playing a more understated collaborative role in the learning culture. The academic CIO can assist in the development of common cost definitions and learning effectiveness recognition, but stakeholder acceptance hinges on stakeholders (faculty, administrators, etc.) determining how to assess their own work.
    Click Here to View Full Article


 