
ACM TechNews sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 730:  Monday, December 13, 2004

  • "Justices Agree to Hear Case on File Sharing"
    New York Times (12/11/04) P. B1; Greenhouse, Linda

    The Supreme Court has agreed to examine the case of Metro-Goldwyn-Mayer Studios v. Grokster, and its ruling could determine whether online file-sharing services are complicit in acts of copyright infringement. Four years ago, the entertainment industry filed a lawsuit that successfully led to the shutdown of the Napster file-sharing network, and then proceeded to file suits against individuals who exchanged copyrighted content over the Internet, although this tactic proved to be ineffective. The current case, which was filed against Grokster and StreamCast Networks, was previously dismissed by the San Francisco federal district court and the U.S. Court of Appeals for the Ninth Circuit, which ruled that Grokster and StreamCast's technology differed enough from Napster's so as to protect the networks from liability for aiding digital piracy. However, the plaintiffs maintain the ruling "has immunized Grokster and StreamCast for the millions of acts of copyright infringement that occur on their services every day." The defendants submitted a brief citing former MPAA President Jack Valenti's 1983 warning that VCR technology would have a disastrous effect on the film industry's bottom line, a warning that never panned out; home video instead became an enormous revenue stream for the studios. The plaintiffs counter that the immunity ruling granted to VCRs 20 years ago should not apply in this case because, unlike VCRs, file-sharing networks are primarily used for copyright infringement. Attorneys general of 40 states, 130 recording artists, and 15 national organizations that called themselves reliant on "meaningful copyright protection" filed friend-of-the-court briefs on behalf of the entertainment industry. Friend-of-the-court briefs on behalf of the defendants were filed by the Computer and Communications Industry Association and the nonprofit Library Archive.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Cyber-Security Office Calls for More Clout"
    eWeek (12/10/04); Rash, Wayne

    The Cyber-Security Industry Alliance (CSIA) recently issued recommendations calling for the reorganization of the Department of Homeland Security's (DHS) Directorate of Information Analysis and Infrastructure Protection, with particular emphasis on elevating the rank of the National Cybersecurity Division's director to an assistant secretary's position. Proponents believe this would add clout to the federal cybersecurity initiative and secure much-needed additional funding; CSIA executive director Paul Kurtz describes current cybersecurity funding levels as "minuscule." Although DHS cybersecurity division deputy director Lawrence Hale cheers the CSIA recommendations as a positive step toward raising the profile of cybersecurity, he maintains that cybersecurity and physical security are inseparable, and should therefore be treated equally. Hale notes that both the government and the private sector must cooperate if real progress in infrastructure security is to be made. "We need to press on the priorities we already have set, which include working with the telecom and IT industries to prevent major Internet disruption and protect critical infrastructure of the United States," he declares. Hale says the DHS is collaborating with the National Communications System to identify and patch security flaws in the IT and telecom infrastructure, or at least find ways to contend with disruptions. He claims that the cybersecurity division's goals are expanding, as evidenced by the launch of new software assurance and control system security programs. Hale says that critical infrastructure is growing increasingly vulnerable as a result of its connection to the Internet, and therefore new industries must be made to recognize the dangers as well as the need to secure the infrastructure.
    Click Here to View Full Article

  • "Mind Over Matter: Brain Waves Guide a Cursor's Path"
    Washington Post (11/13/04) P. A8; Weiss, Rick

    Scientists at the New York State health department's Wadsworth Center have developed a noninvasive brain-computer interface that can translate mental impulses into cursor movements. Lead researcher Jonathan Wolpaw's wearable "thinking cap" is studded with sensors that record electroencephalogram signals from the user's scalp, and transmit those signals to a computer through an external wire. In an online edition of the Proceedings of the National Academy of Sciences, Wolpaw and co-worker Dennis McFarland explain that the technology allows people to modulate signal intensities in specific regions of the brain and direct the two-dimensional movements of an onscreen cursor via a software program that "learns" their neural impulse patterns in much the same way that voice-recognition programs become familiar with users' verbal eccentricities. The report describes the progress of four people who were trained on Wolpaw's interface and learned how to move a cursor to zap onscreen targets, suggesting that the technology may one day enable paralytics to be more independent. Duke University researcher Miguel Nicolelis, whose work involves training primates to move robotic limbs with their thoughts through implanted electrode arrays, argues that devices such as Wolpaw's will not restore motor function, a view echoed by Case Western Reserve University's Dawn Taylor. She acknowledges, however, that a lot of people may prefer techniques such as Wolpaw's to implanted brain interfaces. Taylor is also skeptical that mind-reading technology is on the horizon, given the difficulty in analyzing transient thoughts.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
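
    The adaptive decoding described above can be illustrated with a minimal sketch: band-power features extracted from a few EEG channels are mapped to a 2D cursor velocity by a linear model whose weights are adapted online. The feature layout, learning rate, and least-mean-squares update below are illustrative assumptions for a generic adaptive decoder, not details of the Wadsworth system.

      import numpy as np

      class LinearEEGCursorDecoder:
          """Toy online decoder: EEG band-power features -> 2D cursor velocity.

          An illustrative sketch of adaptive linear decoding, not the Wadsworth
          Center's actual algorithm; the LMS update rule and learning rate are
          assumptions for demonstration.
          """

          def __init__(self, n_features, learning_rate=0.01):
              self.W = np.zeros((2, n_features))   # maps features to (vx, vy)
              self.lr = learning_rate

          def decode(self, features):
              """Return a 2D cursor velocity for one time step."""
              return self.W @ features

          def adapt(self, features, intended_velocity):
              """Least-mean-squares update: nudge the weights so the decoded
              velocity moves toward the velocity the user intended."""
              error = intended_velocity - self.decode(features)
              self.W += self.lr * np.outer(error, features)

      # Usage: simulate calibration trials in which the intended direction is known.
      rng = np.random.default_rng(0)
      decoder = LinearEEGCursorDecoder(n_features=4)
      for _ in range(1000):
          feats = rng.normal(size=4)                 # stand-in for mu/beta band power
          intended = np.array([feats[0], feats[1]])  # pretend two bands carry intent
          decoder.adapt(feats, intended)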

  • "The Image Masters"
    Jerusalem Post (12/12/04); Siegel-Itzkovich, Judy

    Hebrew University computer scientists have developed a number of technologies that advance computer displays and imaging, such as an algorithm that automatically colorizes black-and-white movies with only cursory guidance and a technique for producing 3D photographs using just a single digital camera. Hebrew University's computer graphics lab developed the colorization technology to help produce more life-like color for old prints and movies; with human artists assigning scribbles of approximate colors in selected regions, the computer software is then able to fill in the image with suitable gradients of color. Another algorithm developed at Hebrew University helps meld photographic images taken at different light exposures so that they look like what is seen with people's eyes: Photographs of a lobby with a garden beyond glass doors are merged by the software so that the inside and outside regions are depicted as they would appear in real life instead of too dark or washed in light; the resulting image can also be displayed on a computer monitor. The 3D photography technology developed by Hebrew University researchers has already led to HumanEyes Technologies, which is currently in talks with printer companies Epson and Hewlett-Packard about building new printers that would easily produce the 3D photos. Currently, the 3D images can only be produced by pasting paper print-outs onto lenticular plastic. HumanEyes software compiles images taken with a normal digital camera that is panned across a scene taking photos in continuous mode. The software uses the different images to uncover objects that would be hidden in any single frame, and can also be used to create full panoramic images that reflect 360-degree views. Professor Shmuel Peleg believes the technology will lead to 3D photos becoming the industry norm.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
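
    The exposure-merging technique mentioned above can be sketched with a generic, simplified exposure-fusion recipe: weight each pixel of differently exposed shots by how well exposed it is (close to mid-range intensity) and take the weighted average. This is an illustration of the general idea, not the Hebrew University algorithm; the Gaussian weighting and sigma value are assumptions.

      import numpy as np

      def fuse_exposures(images, sigma=0.2):
          """Blend differently exposed photos of the same scene.

          images: list of float arrays in [0, 1] with identical shapes. Pixels
          near mid-range intensity get the most weight, so dark indoor regions
          come mainly from the long exposure and bright windows from the short
          one. A simplified, illustrative exposure-fusion scheme.
          """
          stack = np.stack(images).astype(np.float64)
          weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))   # "well-exposedness"
          weights /= weights.sum(axis=0, keepdims=True) + 1e-12
          return (weights * stack).sum(axis=0)

      # Usage with two synthetic exposures of the same (random) scene:
      rng = np.random.default_rng(1)
      scene = rng.random((4, 4))
      dark, bright = np.clip(scene * 0.4, 0, 1), np.clip(scene * 1.8, 0, 1)
      fused = fuse_exposures([dark, bright])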

  • "The Need for Transparent, Accountable, and Verifiable U.S. Elections"
    YubaNet.com (12/11/04); Alexander, Kim

    The United States should set standards for federal elections, including paper records for all votes, public verification of computerized vote counts, and rules managing the use of electronic voting machines, writes California Voter Foundation President Kim Alexander. The recent presidential election left a significant minority of American voters doubting the integrity of the election; in order to bolster voter confidence and to ensure fairness, the federal government needs to impose minimum security standards on the thousands of local jurisdictions that currently govern election procedures. Such requirements would not amount to a standardized system, but would mean disallowing Internet connections for computers running vote-counting software and modem connections between polling places and central offices. Alexander advises that voting machine vendors should disclose corporate information and source code to election officials, and paper ballots should always be on hand to substitute for electronic voting should those machines fail; the most needed rules are paper record requirements and routine public verification. Currently, only California, Maine, Ohio, and Illinois require paper receipts that can be used to spot-check electronically recorded votes. Required public verification would mean organizations that request recounts would no longer have to pay for them themselves, and it would relieve public officials of the burden of deciding whether to conduct a recount. Routine public verification of elections would also help to stop rumors and speculation. Alexander says Congress may reconsider the Help America Vote Act next year and argues that lawmakers should use this opportunity to ensure fair and secure elections nationwide.
    Click Here to View Full Article

  • "Computer Modeling Lets Scientists Make Virtual Re-creations of Ancient People, Things"
    Pittsburgh Post-Gazette (12/13/04); Spice, Byron

    Ancient objects, people, and locations are being virtually reconstructed through computer modeling, and these reconstructions are seen as a complement to museum exhibits as well as an aid to anthropological and archeological research. University of Pittsburgh Medical Center radiologist Dr. Douglas Robertson has simulated an ancient Egyptian mummy mask with CT scanning and 3D modeling. The model rendered the mask's contours as a mesh of triangles, and the virtual mask was texture-mapped by flattening the triangles into two dimensions and combining them with 2D photos of the mask taken at 45-degree intervals. Robertson and his students often employ 3D computer modeling to generate bone simulations to aid the design of implantable medical devices. The radiologist notes that "A lot of engineering software isn't designed for natural shapes," so special software and methods are often required. Using the virtual mask, which can be rotated to any view, researchers determined that a section was damaged, and St. Louis Art Museum curator Sidney Goldstein says the model simplified the rapid prototyping of a precise silicone plastic replica to be used as a guide for repairing the damaged area. Meanwhile, Carnegie Mellon University computer scientist Yang Cai and his students used Reflex 3D Systems software to digitally reconstruct the face of a Botai man from a 5,500-year-old skull found in Kazakhstan. The student team also employed computer modeling to generate a 3D panorama of an ancient Kazakh village from photos and geomagnetic maps. Cai says, "The computer pulls everything together, and we discover more than is possible with just a [two-dimensional] painting."
    Click Here to View Full Article
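
    Texture-mapping a scanned mesh, as described above, amounts to assigning each mesh vertex a coordinate in a photograph. The sketch below assumes a simple orthographic camera looking straight down the z-axis, a deliberate simplification; the actual pipeline combined calibrated photos taken at 45-degree intervals with per-triangle flattening.

      import numpy as np

      def orthographic_uv(vertices, image_width, image_height):
          """Assign texture (u, v) pixel coordinates to 3D mesh vertices.

          Illustrative only: assumes an orthographic camera looking down the
          z-axis, so a vertex's (x, y) position maps directly to a pixel in one
          photograph. Real texture mapping of a scanned mask would use several
          calibrated views and handle occlusion.
          """
          xy = np.asarray(vertices, dtype=float)[:, :2]
          lo, hi = xy.min(axis=0), xy.max(axis=0)
          norm = (xy - lo) / np.maximum(hi - lo, 1e-9)       # scale to [0, 1]
          u = norm[:, 0] * (image_width - 1)
          v = (1.0 - norm[:, 1]) * (image_height - 1)        # image rows grow downward
          return np.stack([u, v], axis=1)

      # Usage: a tiny four-vertex "mesh" mapped into a 640x480 photo.
      verts = [[0.0, 0.0, 1.0], [1.0, 0.0, 1.2], [1.0, 1.0, 0.9], [0.0, 1.0, 1.1]]
      uv = orthographic_uv(verts, 640, 480)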

  • "No End to His Imagination"
    Investor's Business Daily (12/13/04) P. A3; Brown, Ken Spencer

    The groundwork for the computing era was laid down by British mathematician, cryptographer, and logician Alan Turing, who formulated the basic theories behind what was to become the electronic computer. In 1936, Turing envisioned a device capable of solving any problem expressed as a mathematical algorithm, and this instrument, later termed a Turing Machine, represented the first attempt to describe a general-use device that could store data and instructions and be programmed for numerous math problems. These ideas were applied to Turing's "bombe" machine, which helped decipher secret German codes in World War II by using an electromechanical process of elimination to accelerate decryption. In 1946, Britain's National Physical Laboratory approved Turing's plan to construct an electronic Turing Machine, renamed the Automatic Computing Engine (ACE). ACE was designed to boast far more usability than the electronic calculators of the period, in that it could be programmed for all types of calculations; one biographer notes that Turing, in essence, "invented the art of computer programming." Turing raised the question of whether machines could truly think in his 1950 essay "Computing Machinery and Intelligence." He admitted that he could not reach a definite conclusion on the matter, but proposed that eventually computers would be able to deceive humans into thinking that they are intelligent. This notion formed the basis of the Turing Test, an experiment to see whether people could distinguish humans from machines in a typewritten conversation, which still plays a role in modern-day artificial intelligence research.

    For information on ACM's A.M. Turing Award, and its winners, visit http://www.acm.org/awards/taward.html.
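
    The Turing Machine idea sketched above, a device that reads a tape, follows a finite table of rules, and can in principle carry out any algorithm, fits in a few lines of code. The transition table below, which appends a '1' to a string of 1s (a unary increment), is purely an illustrative example.

      def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10000):
          """Simulate a one-tape Turing machine.

          rules maps (state, symbol) -> (new_state, symbol_to_write, move), where
          move is -1 (left) or +1 (right). The machine halts in state "halt".
          """
          cells = dict(enumerate(tape))
          head = 0
          for _ in range(max_steps):
              if state == "halt":
                  break
              symbol = cells.get(head, blank)
              state, write, move = rules[(state, symbol)]
              cells[head] = write
              head += move
          return "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))

      # Illustrative rule table: append one '1' to a string of 1s (unary increment).
      rules = {
          ("start", "1"): ("start", "1", +1),   # scan right over the 1s
          ("start", "_"): ("halt", "1", +1),    # write a 1 at the first blank, then halt
      }
      print(run_turing_machine("111", rules))   # -> "1111"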

  • "Tweaks to High-Tech Visas Revive NSF Scholarships"
    Science (12/10/04) Vol. 306, No. 5703; Bhattacharjee, Yudhijit

    The new omnibus spending bill for 2005 includes a provision that reforms the process for allowing foreign workers to hold high-tech jobs, and revives the National Science Foundation's Computer Science, Engineering, and Mathematics Scholarships (CSEMS) program. The NSF introduced the CSEMS program in 1999 after Congress imposed a $1,000 application fee for H-1B visas and redirected some of the revenue to the foundation. However, the authority to collect the fee expired last year, which meant that the CSEMS scholarships handed out earlier in the year would have been the last round. The omnibus spending bill that Congress recently passed reinstates an H-1B fee, increases it to $1,500, and pushes the overall H-1B visa cap from 65,000 to 85,000. The NSF could receive as much as $38.3 million a year, starting in 2006, when its first round of new awards is likely to take place, according to the foundation's Duncan McBride. Meanwhile, the maximum amount of the two-year scholarship has tripled to $10,000, and CSEMS areas of study have been expanded to include biotechnology and other high-tech disciplines where demand for workers is strong.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)

  • "Robotic Pods Take on Car Design"
    BBC News (12/10/04); Sandhana, Lakshmi

    Toyota will spotlight prototype "personal mobility devices" as well as other assistive robotic machines and concept vehicles at the Expo 2005 next March. The prototypes exemplify the state of the art in wearable exoskeleton technology: The four-wheeled "i-unit" vehicle, which is fabricated from environment-friendly plant-based materials, can carry a single passenger along specially equipped lanes in an autopilot mode facilitated by intelligent transport system technologies, while the two-legged, joystick-controlled "i-foot" can manage a perambulating speed of approximately 0.83 mph and go up stairs. Toyota says these vehicles are designed for mass transport, but can also give independent mobility to disabled and elderly people. "They are clearly what we call concept vehicles, innovative ideas which have yet to be transformed into potential products and which are a few years away from actual production," notes Dr. David Gillingwater with Loughborough University's Transport Studies Group. Although he says the technology carries the biggest benefits for the elderly and mobility-challenged, the question remains whether such vehicles will appeal to their sensibilities, given design limitations and other factors. Dr. Erel Avineri with the University of the West of England's Center for Transport and Society says the devices need more refinement. For instance, the vehicles' interiors restrict older passengers' ability to rotate the neck and upper body, which is an important consideration when backing up. "In general, introducing a new technology requires the passenger to change behavior patterns that have served the older passenger for decades," Avineri says.
    Click Here to View Full Article

  • "Innovative Take-Off System Could Lead to Safer, Cleaner Air Travel"
    Innovations Report (12/07/04)

    Computer scientists at the University of Nottingham are developing a system that uses sophisticated computer models to assist runway controllers with the scheduling of aircraft for takeoff. Runway controllers primarily rely on their own observation and mental calculations in performing their jobs. The system would factor in aircraft size, speed, and route; cover aircraft taxiing to the airport holding point and those already waiting there; and respond quickly to changes. In addition to improving the efficiency and safety of aircraft scheduling, the system would help cut down on the time aircraft spend on the ground with running engines, which would reduce noise and fuel pollution, and save fuel. A computer-based system that reduces aircraft clearance delays by as much as 25 percent could result from the research, which is being funded by the Engineering and Physical Sciences Research Council and National Air Traffic Services, and is being run in conjunction with Heathrow Airport Air Traffic Control.
    Click Here to View Full Article
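
    A rough flavor of the scheduling problem described above can be conveyed with a toy greedy scheduler: each departing aircraft is given the earliest time that respects both its own readiness and a minimum separation behind the previous departure, with larger separations behind heavier aircraft. The weight classes and separation values below are illustrative assumptions, not the Nottingham model, which also re-orders aircraft and accounts for routes and speeds.

      # Minimum seconds a following aircraft must wait behind a leading aircraft,
      # keyed by (leader_weight_class, follower_weight_class). Illustrative values.
      SEPARATION = {
          ("heavy", "heavy"): 90,  ("heavy", "medium"): 120, ("heavy", "light"): 180,
          ("medium", "heavy"): 60, ("medium", "medium"): 60, ("medium", "light"): 120,
          ("light", "heavy"): 60,  ("light", "medium"): 60,  ("light", "light"): 60,
      }

      def schedule_departures(queue):
          """Greedy runway scheduler (toy model).

          queue: list of (callsign, weight_class, ready_time_seconds) for aircraft
          approaching the holding point. Returns (callsign, takeoff_time) pairs in
          first-ready-first-served order with separation constraints applied.
          """
          schedule = []
          prev_class, prev_time = None, None
          for callsign, weight, ready in sorted(queue, key=lambda a: a[2]):
              earliest = ready
              if prev_class is not None:
                  earliest = max(earliest, prev_time + SEPARATION[(prev_class, weight)])
              schedule.append((callsign, earliest))
              prev_class, prev_time = weight, earliest
          return schedule

      # Usage: three aircraft reach the holding point within a minute of each other.
      print(schedule_departures([("BAW12", "heavy", 0), ("EZY34", "medium", 20), ("RYR56", "medium", 50)]))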

  • "Intelligence Platform at AI Conference"
    Scoop (NZ) (12/10/04)

    Next week will mark the launch of the NeuroComputing Environment for Evolving Intelligence (NeuCom2004) software platform for commercial and educational use at the Neurocomputing and Evolving Intelligence conference. NeuCom is the brainchild of the Auckland University of Technology's Knowledge Engineering and Discovery Research Institute (KEDRI), and KEDRI director Nik Kasabov classifies the platform as an evolving intelligence (EI) system based on his team's hypotheses and techniques of evolving connectionist systems (ECOS). Kasabov explains that EI systems "learn and improve incrementally starting with little knowledge and develop over time." NeuCom incorporates over 60 data analysis, data visualization, data mining, classification, forecasting, optimization, modeling and rule discovery, decision support, image processing, and information integration methods; the system runs on all Windows- or Linux-based computer platforms, and its application has been mapped out for education and for constructing intelligent systems in bioinformatics, business data analysis, adaptive control, medical decision support, and agriculture, among other areas. NeuCom is comprised of ECOS modules, some of which are being used for projects such as SIFTWARE, a gene expression data analysis system currently being employed experimentally by Pacific Edge Biotechnology. Another project that uses ECOS is a renal function evaluation system designed to provide an explanation for impaired kidney function. Kasabov says the system "is always adaptable, trainable on new data and extracts rules that help medical professionals give an accurate, personalized prognosis."
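
    The "evolving" idea, a model that starts with little knowledge and grows as data arrives, can be illustrated with a toy incremental classifier that adds a new prototype node whenever an example falls outside the radius of every existing node of its class. This is a generic sketch loosely inspired by evolving-clustering ideas, not KEDRI's actual ECOS or NeuCom implementation; the radius and drift parameters are assumptions.

      import numpy as np

      class EvolvingPrototypeClassifier:
          """Toy incremental learner: grows new prototype nodes as data arrives.

          If a labelled example is within `radius` of an existing prototype of the
          same label, that prototype drifts toward it; otherwise a new prototype
          is created. A loose, illustrative take on evolving connectionist ideas.
          """

          def __init__(self, radius=1.0, drift=0.1):
              self.radius, self.drift = radius, drift
              self.prototypes, self.labels = [], []

          def learn_one(self, x, label):
              x = np.asarray(x, dtype=float)
              for i, (p, lab) in enumerate(zip(self.prototypes, self.labels)):
                  if lab == label and np.linalg.norm(x - p) <= self.radius:
                      self.prototypes[i] = p + self.drift * (x - p)   # adapt existing node
                      return
              self.prototypes.append(x)                               # grow a new node
              self.labels.append(label)

          def predict(self, x):
              x = np.asarray(x, dtype=float)
              dists = [np.linalg.norm(x - p) for p in self.prototypes]
              return self.labels[int(np.argmin(dists))]

      # Usage: the model starts empty and accumulates structure from a data stream.
      clf = EvolvingPrototypeClassifier(radius=0.5)
      for x, y in [([0.0, 0.0], "a"), ([0.1, 0.1], "a"), ([2.0, 2.0], "b")]:
          clf.learn_one(x, y)
      print(clf.predict([1.9, 2.1]))   # -> "b"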

  • "Adapt or Die"
    ITWeb (South Africa) (12/09/04); Humphries, Fay

    Speakers at the Enterprise Architecture: Blueprinting the Future conference, which was held in Johannesburg in November, sounded a theme of "adapt or die." Meta Group's Willie Appel described an adaptive organization as one that "simultaneously evolved with the demand and structure of its markets in ways that ensured both survival and success," and said that such a business needs an adaptive IT organization behind it. The key requisites for an adaptive business, said Meta Group's Brian Burke, include a speedy recognition of changes in market demand and structure, as well as an organizational readiness for change. This means that "a successful adaptive strategy is based on a killer business case, not a killer technology," he said. Keynote speaker Jan de Klerk, of Sanlam, said the key disciplines of enterprise architecture, IT governance, and information and risk management need to be integrated and interrelated in order to build an adaptive organization. Some challenges to creating an enterprise architecture, according to Absa Group's John Hayden, are the difficulty of explaining the enterprise architecture concept to business role players, and the need for enterprise architecture to deliver tangible value in terms of delivery of business projects. CSC Computer Sciences' Peter Deering said that about half of all IT projects are abandoned, delivered late, or fail to deliver, noting that the way around this is to balance business and IT by making certain the enterprise architecture has business processes at heart. Craig Martin of SiloFX said that new ways of thinking are necessary to bridge the gap from planning to implementation, which will require a mapping between human understanding and technology; he advocated mapping human knowledge by using the branch of semantics known as ontologies.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)

  • "Ex-CIA Chief Gates Warns on Cyberterror"
    Associated Press (12/05/04); Easton, Pam

    Cyberterrorism is the most potent weapon for terrorists, said former CIA director and current Texas A&M University President Robert Gates, speaking at a two-day, Texas-based terrorism conference. Gates pointed out recent incidents in which relatively simple causes inflicted massive harm: in 2000, a teenager in the Philippines caused $10 billion in damage to the U.S. economy with a single "love bug" virus, and the recent Northeast electricity blackout was triggered by a single system failure in Ohio. The CIA and National Security Agency assigned 50 computer specialists to figure out how to disable the nation's electrical grid in 1998 and found it took only two days for the group to devise a plausible plan. If terrorists organize themselves to target U.S. computer systems, the results could be devastating. Gates admits that terrorism was easier to fight when terrorists were sponsored by governments, but now the terrorism movement is motivated by religion, is more revolutionary, and is much harder to gauge. Gates says, "Terrorism is a global challenge that will take many forms and many years to defeat or contain," adding that it is almost certain terrorists will again target America.
    Click Here to View Full Article

  • "Opening Up the Code"
    InfoWorld (12/06/04) Vol. 26, No. 49, P. 42; McAllister, Neil

    Major software vendors are offering open-source products and services, although not every effort is consistent in terms of approach or motive. Microsoft allows select customers to access the source code of certain products under a "look but don't touch" scheme; IBM has made its Eclipse development environment and Cloudscape embedded Java database open source in the hopes of encouraging the Java development community and reaping profits; and Computer Associates has open-sourced its Ingres database management system to serve as the basis for more sophisticated solutions that customers can put together without buying additional software licenses from rival vendors. Open source software movement founder Bruce Perens explains that customers, not software vendors, are the real drivers of vendors' adoption of open source, which indicates that there is no more room for vendor complacency. "It's up to us to adjust our business models to incorporate the reality [of open source projects] to celebrate their success, and to leverage them, and to continue to innovate at levels of the stack that our customers deem as being valuable," contends Doug Heintzman of IBM's software group. Perens estimates that only about 10 percent of the software that companies develop internally or by contract is differentiating, which directly impacts the bottom line; the remainder is nondifferentiating software that only consumes money. He says open source development lowers the toll of nondifferentiator cost centers, allowing companies to channel some of their software budget to differentiators. "With open source...[w]e distribute the risk simply in the act of distributing cost, because we do the distribution of cost up front rather than when people buy a finished product," Perens notes.
    Click Here to View Full Article

  • "A New Metrics System for IT"
    CIO (12/01/04) Vol. 18, No. 5, P. 89; Bowen, Ted Smalley

    Improving IT productivity--and by extension business value--by measuring IT's "true" merit is a formidable challenge for CIOs, given that there are no standard IT performance metrics, although CIOs expect such standards to emerge within the next five years or so. A cross-departmental council at Cisco Systems devised metrics last year to rate the efficiency of the company's online order processing, and discovered that fewer than 30 percent of orders were being automatically sent to manufacturing because of the manual input necessitated by high error rates; Cisco CIO Brad Boston reports that the council traced the process, outlined metrics, and optimized the process so that the percentage of automatically routed orders was doubled in a matter of months. Productivity metrics have started to make inroads in the business process design and analysis arena, and CIOs see these value metrics as critical to improving operational management and communications and coordination with line-of-business and senior management. Traditional high-level productivity metrics include IT budget as a portion of total revenue, usually contrasted with peer firms in the same industry, and the proportion of IT staff to the total workforce; British Telecom CIO Sinclair Stockman says optimizing IT system benefits relies on "the efficiency and effectiveness of the process change, the training process, and the ruthlessness with which you actually go after the business benefits that were delivered as a result of the systems being deployed." Intel CIO Doug Busch says there is no good way to calculate IT productivity either for measurement or forecasting, and doubts that evaluations of knowledge workers can rely on productivity metrics. Stockman points out that a lack of repeatability in IT projects and processes complicates the quantification of IT's business role. Similarly, the difficulty of measuring an internal IT staff's business benefits compared to the relative ease of assessing those of offshore software coders can bias managers toward offshoring.
    Click Here to View Full Article
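
    The coarse ratios mentioned above lend themselves to a very small worked example. The figures below are made-up inputs chosen only to show the arithmetic: IT spend as a share of revenue, IT headcount as a share of the workforce, and an order-automation rate of the kind the Cisco council tracked.

      def it_ratios(it_budget, revenue, it_staff, total_staff, auto_orders, total_orders):
          """Return a few coarse IT productivity ratios as percentages."""
          return {
              "it_budget_pct_of_revenue": 100.0 * it_budget / revenue,
              "it_staff_pct_of_workforce": 100.0 * it_staff / total_staff,
              "order_automation_rate_pct": 100.0 * auto_orders / total_orders,
          }

      # Hypothetical numbers: $40M IT budget on $1B revenue, 300 IT staff out of
      # 8,000 employees, and 2,900 of 10,000 orders routed to manufacturing untouched.
      print(it_ratios(40e6, 1e9, 300, 8000, 2900, 10000))
      # -> roughly 4.0% of revenue, 3.75% of workforce, 29% automation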

  • "'DETER' Fills IT Security Testing Void"
    Government Security News (11/04) Vol. 2, No. 9, P. 16; DePompa, Barbara

    The federal government devised the Cyber Defense Technology Experimental Research (DETER) Network to address a shortage of independent testing and assessment of security technologies as they move from academic labs and small businesses to market. Defense Advanced Research Projects Agency (DARPA)-funded research into network security raised the need for a testbed to model the Internet's heterogeneous makeup over five years ago, which led to the development of DETER through a $10.8 million grant from the Homeland Security Department and the National Science Foundation, although a $20 million grant was originally recommended. The value of government testbeds stems from the need for independent results and cross-product integration; given the absence of a joint testbed to nurture next-generation integrated security solutions and the security tech market's lack of maturity, federal agencies must take the testing reins from private firms, which usually position tests "to optimize results, and meet short term, 'time to market' goals," notes Cyber Defense Agency CEO Sami Saydjari. The DETER testbed is managed by researchers at UC Berkeley and USC's Information Sciences Institute, and housed in three permanent hardware nodes in Berkeley, Los Angeles, and Arlington, Va. DETER exists apart from the Internet so that malware infections and other attack scenarios can be tested without endangering the Net. DETER operates around the clock and averages about 400 experiments daily, and Terry Benzel of USC-ISI reports that starting in 2005, DETER will be opened up to experimentation by government, academic, and industry researchers using malicious rather than benign code. Benzel envisions the testbed being accessible to experimenters anywhere thanks to a digital "dashboard" application, while DETER's wider acceptance among other federally sponsored network testbeds may constitute an even greater challenge.
    Click Here to View Full Article

  • "Are Web Services Finally Ready to Deliver?"
    Computer (11/04) Vol. 37, No. 11, P. 14; Leavitt, Neal

    Standards organizations and industry consortia are working on Web services specifications, but without the presence of an all-encompassing authority, developers are unsure of what standards they will support in the long term, according to Evans Data analyst Joe McKendrick. Groups include the World Wide Web Consortium (W3C), whose early Web services specs often concentrated on low-level, core functionality; the Organization for the Advancement of Structured Information Standards (OASIS), whose focus has been on security, authentication, registries, business process execution, and reliable messaging; the Liberty Alliance, whose mission is to develop an open standard for federated network identity that complies with all existing and emergent network devices; and the Web Services Interoperability Organization (WS-I), which releases guidelines and tools to help developers create software enabled for existing Web services specs (the WS-I Basic Security Profile and WS-Federation, for example), and has been working to address interoperability problems by encouraging collaboration among Web services vendors. OASIS is examining BEA, SAP, and IBM's Business Process Execution Language for Web services spec as a possible business process automation standard, while a subgroup of the W3C is developing Web Services Choreography Description Language 1.0 as a standard set of rules governing the interaction of different components and their sequential arrangement. Of the two rival reliable messaging specs, WS Reliable Messaging and WS-Reliability, only the latter has been sent to a standards body, and Hitachi's Eisaku Nishiyama reports that proponents of both specs are attempting to arrive at a compromise. A survey from Evans Data indicates that developers are split nearly 50-50 on whether multiple competing standards could hinder Web services deployment, though W3C's Philippe Le Hegaret doubts this will stifle the adoption of Web services.
    Click Here to View Full Article

  • "Creating a More Intelligent Future"
    Futurist (12/04) Vol. 38, No. 6, P. 48; Wagner, Cynthia G.

    The theme of last summer's 2004 World Future Society conference, which covered subjects ranging from brain/mind studies to machine intelligence, was that foresight is essential to mankind's future, and is an attainable goal. Trends projected by futurologists at the conference included a reverse engineering of the human brain, which could yield such breakthroughs as an "intimate merger" of machines and human intelligence by 2029 that may lead to computer-enhanced experiences, new forms of art, and direct linkage between human minds. One of the more sinister predictions from British futurologist Ian Pearson is the creation within six to 11 years' time of an artificial life-form with its own belief system, one that is not subordinate to humans. This pessimistic forecast was countered by inventor Ray Kurzweil's hope for a symbiotic partnership between people and artificial intelligence: "The main solution to the perils of strong AI is supporting our values of liberty, openness, democracy, respect for diversity, and knowledge," he explained. Steven Bankes of the RAND Frederick S. Pardee Center for Longer Range Global Policy and the Future Human Condition described such a partnership when he talked about the potential creation of "robust strategies" using scenario-generating computer applications complemented by human plausibility studies. Work in the field of assistive machine intelligence was also detailed by Arlington Institute President John Petersen, who described the Digital Analysis Environment as a tool that will abstract clusters of terms employed in articles for particular projects. Intel's David Tennenhouse foresaw the advent of proactive computing, concluding that future computers "will anticipate our needs."
    Click Here to View Full Article

  • "IT Jobs Show Promise for Workers with Disabilities"
    Monster.com (12/04) Hoffman, Allan

    Workers with disabilities face a number of challenges in finding work, especially because many employers do not know how to accommodate an employee's disability. But according to disability experts, the IT industry holds promise for workers with disabilities because of the variety of technologies being used to help people cope with disabilities both at home and in the workplace. IT workers, along with IT departments and employers, may be among those most willing to push for technologically advanced solutions, say advocates for people with disabilities. Some employers are learning about these technologies as older workers develop disabilities, says Vicki Hanson, co-chair of ACM's Special Interest Group on Accessible Computing (SIGAccess). "It's really the aging workforce that has people concerned these days," she says, referring to the growing number of older workers with visual impairments, hearing loss, and other disabilities. "They want to keep those workers on the job, and that opens the employer's eyes." Often, the worker informs the employer about accommodations. "Employers on the whole aren't as aware of what they can or should do," says Hanson, who also is manager of the accessibility research group at IBM's T.J. Watson Research Center. "Often they rely on the employee." Efforts are under way to bring more workers with disabilities into IT and increase awareness of how employees with disabilities can be accommodated with the use of assistive technologies.
    Click Here to View Full Article


 