CFP 2006 Explores Computer Freedom and Privacy
Issues
Association for Computing Machinery (03/17/06)
Amid debates over government surveillance of citizens, the amassing of
personal information databases, and Internet censorship, ACM's Computers,
Freedom and Privacy 2006 conference will explore the technologies at the
epicenter of these issues. The CFP 2006 conference, titled "Life, Liberty
and Digital Rights," will be held May 2-6, at the L'Enfant Plaza Hotel in
Washington, D.C. Panels, discussions, workshops, technical demonstrations
and keynote speakers will tackle many information technology security and
privacy issues culled from today's headlines. Participants include
internationally acclaimed experts on electronic voting, government
surveillance, Internet governance, digital rights management, adware and
spyware, and federal privacy laws, among others. The conference also
features tutorials, plenary and concurrent sessions, and provocative
Birds-of-a-Feather sessions. Highlights of the conference include the Big
Brother Awards and the EFF's Pioneer Awards, which recognize milestones as
well as dubious distinctions in the online world.
For more information about CFP, or to register for the conference, visit
http://www.cfp2006.org
Association for Computing Machinery Honors International
Authority in Logic Programming, Artificial Intelligence
AScribe Newswire (03/16/06)
ACM has honored Jack Minker with the ACM/AAAI Allen Newell Award for his
advancements in logic-based computer science methods and his contribution
to scientific discourse. Minker has tirelessly advocated freedom and human
rights for scientists working in countries with oppressive regimes
throughout his career. Viewed as a founding father of deductive database
programming, Minker has edited and co-edited numerous books on that
subject, as well as logic programming and the potential application of
logic in artificial intelligence. Minker also led the fight to liberate
scientists Anatoly Shcharansky and Aleksandr Lerner from the Soviet Union,
and campaigned to deliver medical aid to Andrei Sakharov and his wife while
they were exiled in Gorky. Minker has served as vice chairman of ACM's
Committee on Scientific Freedom and Human Rights, published extensively in
Communications of the ACM, and received ACM's Outstanding Contribution
Award in 1985 for his contribution to human rights. Minker, an ACM Fellow,
has chaired the NSF's Computer Science Advisory Board and served as a
member of the NASA Study Group for Machine Intelligence and Robotics.
Minker will receive the Allen Newell Award at the annual ACM Awards Banquet
on May 20 in San Francisco.
For more information, visit
http://campus.acm.org/public/pressroom/press_releases/3_2006/newell.cfm
Tech Industry Asks for Government Help
eWeek (03/15/06) Carlson, Caron
Citing fears that the United States is in danger of falling behind emerging
countries such as China and India, Intel Chairman Craig Barrett and IBM's
John Kelly appealed to lawmakers at a March 15 hearing of the Senate
Committee on Commerce, Science, and Transportation to initiate legislation
that will bolster science and math education, improve broadband deployment,
and elevate the budgets of the NSF and other research groups. Sen. John
Ensign (R-Nev.) said that 90 percent of the scientists and engineers in the
world will live in Asia by 2010. Ensign and Sen. Joe Lieberman (D-Conn.)
are sponsoring legislation to address the industry's concerns, and Sen. Max
Baucus (D-Mont.) is expected to introduce his own legislation this year
dealing with education, energy reform, and health reform. Norman
Augustine, retired CEO of Lockheed Martin, told the panel that the current
condition was a long time in the making, and that it would not be resolved
overnight. Augustine highlighted the industry's declining focus on
research as companies are motivated increasingly by short-term profits and
are funneling more money into product development than basic research.
"Industry is abandoning slowly the R part of R&D," Augustine said, calling
for government to pick up the slack.
Internet Panel Mulls Defenses Against New, Potent
Attacks
Associated Press (03/16/06) Bridis, Ted
A new form of cyberattack, dubbed by some a "distributed reflector
denial of service," that targets the computers that help direct Internet
traffic worldwide will be a focus of ICANN's security committee at its
upcoming meeting in New Zealand. The attacks, though similar in nature to
typical denial-of-service attacks, are far more potent: they require
fewer hacked computers to launch and are much easier to amplify.
Researchers have detected around 1,500 such attacks since they were
first launched late last year, some of which briefly
shuttered commercial Web sites, large ISPs, and leading Internet
infrastructure firms. VeriSign chief security officer Ken Silva said that
attacks earlier this year used just 6 percent of the Internet's more than a
million name servers to flood networks but that the attacks in some
instances outpaced 8 gigabits per second, a mega-assault by typical
standards. ICANN security committee head Steve Crocker says, "It's like
they built a better bomb by having it enriched." Columbia University
Internet researcher Steven M. Bellovin says, "A lot of this stuff will take
a while to clean up.'' Possible fixes to vulnerabilities include filters
that block out forged data traffic and new limits on specialized name
server computers.
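The amplification mechanic the article describes can be illustrated with some back-of-the-envelope arithmetic. The packet sizes below are hypothetical, not taken from the attacks VeriSign measured:

```python
# Illustrative arithmetic: why a reflection attack amplifies traffic.
# The attacker sends a small query with the victim's forged source
# address; the name server "reflects" a much larger response at the
# victim. The byte counts here are invented examples.

QUERY_BYTES = 60        # small spoofed request sent by the attacker
RESPONSE_BYTES = 4000   # large record returned by the name server

def amplification_factor(query_bytes: int, response_bytes: int) -> float:
    """Bandwidth multiplier the reflector gives the attacker."""
    return response_bytes / query_bytes

def attack_bandwidth_gbps(attacker_gbps: float, factor: float) -> float:
    """Traffic arriving at the victim for a given attacker uplink."""
    return attacker_gbps * factor

factor = amplification_factor(QUERY_BYTES, RESPONSE_BYTES)
print(f"amplification: {factor:.0f}x")
# With roughly 67x amplification, about 0.12 Gbps of spoofed queries is
# already enough to exceed the 8 Gbps floods the article describes.
print(f"victim sees: {attack_bandwidth_gbps(0.12, factor):.1f} Gbps")
```

This is why relatively few compromised machines suffice: the reflectors, not the bots, supply most of the flood's bandwidth.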
Google Prevails in Copyright Fight
Wall Street Journal (03/17/06) P. B4; Delaney, Kevin J.
A lawsuit accusing Google of copyright infringement, defamation, and other
instances of wrongful conduct was dismissed by federal Judge R. Barclay
Surrick on March 10. Internet publisher Gordon Roy Parker filed the
lawsuit in U.S. District Court in Philadelphia, alleging that Google's
archiving of copyright material Parker posted on the Usenet community of
electronic bulletin boards constituted a breach of copyright, and that its
inclusion of excerpts from his site in its search results was an act of
copyright infringement. Surrick maintained in his ruling that those
activities, along with Google's caching of Web pages, were not infringement,
and he cited a January decision in Nevada District Court supporting
Google's practice of making copies of cached Web pages accessible to users
via its search results. Thelen Reid & Priest attorney William Patry noted
that the Parker decision dismissed the claims of copyright infringement
without raising the issue of "fair use." However, some legal experts noted
that the Parker ruling does not establish binding precedent, and they
disagreed on whether the decision reflects any trend in judicial
analysis when weighed against other recent court opinions in lawsuits
filed against Google. Last month, a Los Angeles federal judge concluded
that Google's image-search service probably infringed on the copyrights of
the Perfect 10 adult-entertainment company by displaying thumbnails of its
images, but he dismissed Google's liability when users clicked on the
images and gained access to third-party sites showing pictures purloined
from Perfect 10. Google litigation counsel Michael Kwun declared that
Surrick's decision establishes the consistency between Google's service and
copyright law principles, but Parker, who plans to appeal the ruling,
claimed that Google's "entire business model is based on freeloading on
other people's content."
Model-Driven Development, AJAX Shortcomings Aired
InfoWorld (03/15/06) Krill, Paul
While acknowledging the popularity of AJAX and model-driven software
development, speakers at the SD West 2006 conference agreed that both
technologies are far from mature. Microsoft's Jack Greenfield said that
model-driven development needs to integrate more fully with patterns and
frameworks. "I don't believe in high-level models where I push a big red
button and it generates a lot of stuff that I'm expected to live with,"
said Greenfield. Modeling can also suffer from a knowledge gap if only one
or two people on a development team are familiar with the technology.
Greenfield argues for the adoption of best practices to enable modeling to
fulfill its promise of creating systems without reinventing existing
technologies. Greenfield argued that OMG Model Driven Architecture does
not actually have an architecture, relying instead on the Unified Modeling
Language. The speakers and attendees noted that metadata management and
hiring practices are critical to the successful deployment of model-driven
development. When asked about using Eclipse technology in modeling,
Compuware's Joe Kern said the MetaObject Facility has more power and
enables more functions than Eclipse. In a panel discussion of AJAX, an
exchange between an audience member and a presenter focused on the
shortcomings of JavaScript. The presenter, author Christian Gross, agreed
that the current version of JavaScript suffers from limited extensibility,
maintainability, and enforcement, but noted that an unreleased JavaScript
2.0 that resolves many of those problems exists, though he could not
explain why it has not yet been released. Gross also noted that the AJAX
vendors are out of touch with the community of users.
H-1B Visa Cap Hike Sought in Immigration Bill
Computerworld (03/15/06) Thibodeau, Patrick
The Comprehensive Immigration Reform Act of 2006 includes a provision to
increase the H-1B visa cap from 65,000 to 115,000 and make it easier for
foreign nationals with advanced degrees to gain permanent residency.
Although the bill would eliminate the 20,000 H-1B visas for advanced degree
holders, there are provisions that would increase the 115,000 cap once it
has been reached. The U.S. Senate Judiciary Committee is debating the
bill, which could make its way to the full Senate as early as the end of
March. However, the fate of the bill is uncertain because of the wide
range of issues it covers, including immigration policy and security, which
makes it controversial. Advocates of increasing the number of H-1B visas
say they will look to another bill if the legislation fails in the Senate.
The Institute of Electrical and Electronics Engineers opposes an increase
in the cap, but believes the permanent residency process should be eased
for foreign workers. The U.S. Bureau of Citizenship and Immigration
Services will start accepting H-1B applications for the fiscal year 2007 on
April 1, and the cap is expected to be reached at the record-setting pace
of a year ago--in five months--or faster.
Let Me Hear Your Body Talk: UH Scientists Mine Biomedical
Data
University of Houston News (03/15/06)
A team of five researchers from the University of Houston is attempting to
train computers to obtain health information from their users in an
NSF-funded study. Through computer-powered non-invasive imaging
applications, the researchers are studying brain activity, human learning,
and cognitive impairment, as well as facial-expression analysis and
biometric security. "The project will involve a hybrid software system
designed to acquire, analyze, integrate, securely store, and visualize
large volumes of data obtained from a human subject in real time," said
George Zouridakis, associate professor of computer science and project
leader. Zouridakis and his team are building on existing information
technology practices to develop software tools for practical application in
biomedicine. Each of the five researchers has a different area of
specialization and works in a different lab. The grant is designed to
bring their diverse perspectives together, such as Zouridakis' work with
dense-array scanners to analyze the electrical, magnetic, and infrared
features of brain activity and computer science professor Marc Garbey's
work in high-performance computing and computational life sciences.
Associate professor Ioannis Kakadiaris is the founder and director of the
Computational Biomedicine Lab, home to pioneering research in
cardiovascular informatics and multispectral biometrics. Associate
professor Ioannis Pavlidis directs the Computational Physiology Lab, and
has developed a computer system to conduct touchless physiological
monitoring. Assistant professor Ricardo Vilalta's research has focused on
massive data analysis in the hopes of extracting meaningful patterns. The
researchers will collect data from test subjects with sophisticated sensing
systems such as thermal cameras, multimodality brain activity scanners, and
3D geometry video cameras.
High-Def's Got Nothing on This Machine
Collegiate Times (03/15/06) Berger, Michael
Researchers at Virginia Polytechnic Institute's Center for Human Computer
Interaction are trying to create a monitor system containing as many pixels
as possible in an attempt to optimize the amount of information displayed
on a set amount of space. Assistant professor Christopher North leads the
Gigapixel Lab, which has already developed a display that contains more
than 30 million pixels and spans across 24 reconfigurable monitors, as well
as a rear-projection system that includes 18 monitors. North says the lab
already has the equipment to build a display of 50 touch screens. North and
his team have developed new hardware and programs to interact with the
displays, such as mice that can quickly navigate across screens and a 3D
tracking system that uses overhead cameras and gloves to direct the cursor
across the screen in accordance with the user's movements. Andrew Sabri, a
senior computer science major, is adapting open-source code to run on the
displays. So far, he has modified a version of Warcraft and made Quake run
on the 24-monitor display.
Coding Tool Is a Text Adventure
Wired News (03/15/06) Norton, Quinn
During the O'Reilly Emerging Technology Conference in San Diego last week,
developer Matt Webb introduced a new software tool that enables programmers
to contribute to the development of code in a collaborative environment.
Webb says the difficulty of solving programming problems when he often does
not work in the same physical location as his partner Jack Schulze prompted
the development of the collaborative programming environment. The tool,
called playsh, is based on the popular multi-user domains (MUDs) of the
early 1990s, and is similar to old text games like "Zork" in that
commands such as "north" must be typed to move north, and "look" must be
typed to examine an object. Users do not interact with
graphics, but rather the program code, data, and hardware devices. "It
treats the Web and APIs as just more objects and places, and is a platform
for writing and sharing your own code to manipulate those objects and
places," says Webb. The tool resembles the customizable form of a MUD
known as a MOO (MUD object-oriented), which allows participants to program
objects into a virtual world to create them and develop a game as they go
along. Written in Python, playsh provides a basic description of rooms and
a Python interpreter that users in rooms can access, as they contribute to
the code and interact with objects and each other.
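The command style described above can be sketched as a minimal Python loop. The room layout and function names here are invented for illustration; this is not playsh's actual code or API:

```python
# A minimal MUD-style command handler in the spirit of playsh (an
# illustrative sketch, not playsh's real implementation). Rooms are
# plain dicts; exit names like "north" move between rooms, and "look"
# prints the current room's description.

ROOMS = {
    "lobby": {"desc": "A bare lobby. Exits: north.", "north": "lab"},
    "lab":   {"desc": "A cluttered lab. Exits: south.", "south": "lobby"},
}

def handle(command: str, here: str) -> tuple[str, str]:
    """Return (output text, new room) for one typed command."""
    room = ROOMS[here]
    if command == "look":
        return room["desc"], here
    if command in room:          # movement commands are exit names
        return f"You go {command}.", room[command]
    return "You can't do that here.", here

# Example session: look around, then walk north.
out, here = handle("look", "lobby")
print(out)
out, here = handle("north", here)
print(out, "->", here)
```

In playsh, the objects behind such rooms are live Python, so users can open an interpreter and rewrite the world from inside it.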
In Search of a True Multimedia Experience
IST Results (03/14/06)
In an effort to manage the competing demands of power consumption and
picture quality when streaming data, the IST-funded BETSY project is
developing a methodology and implementation framework to optimize the
trade-offs. Set to conclude in February 2007, the project intends to
stream multimedia content to wireless mobile devices built to conform to
network conditions and the availability of terminal power, reducing energy
consumption by as much as 20 percent. "Today, the quality of audio-video
and gaming in prototypes of wireless networked embedded devices is not
comparable to the high quality that people are used to from their
traditional TV and audio sets," said Harmke de Groot of Philips Research,
who also serves as project coordinator. "The project's results will enable
users to enjoy multimedia experiences with freedom of movement in a
networked home or hotspot." Multimedia streaming technology is in
increasing demand as homes are becoming networked environments and a mobile
consumer base is increasingly dependent on wireless hotspots. BETSY
evaluated scenarios that mirror typical networking occurrences, analyzing
such factors as energy restrictions, shared bandwidth, and wireless network
links. De Groot says the project seeks to improve the overall quality of
the streaming framework through the use of the video model. The
researchers are exploring how to control "breezes," groups of units
that manage data streams, potentially spanning different formats and
devices, which do not need to be synchronized with streaming. BETSY has the
potential to modify the configuration of the functional components of
access points and base stations, alter breezes to react to external
distortions, and oversee the load distribution and prioritization in the
networked home. Project participants are also designing a software control
framework to facilitate run-time trade-offs.
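The kind of run-time trade-off BETSY automates can be sketched as a tiny selection policy. The profiles, quality scores, and power numbers below are invented; the project's actual framework is far richer:

```python
# Toy version of an energy/quality trade-off (illustrative only, not
# BETSY's actual policy): pick the highest-quality stream profile whose
# power draw still fits the terminal's current power budget.

PROFILES = [  # (name, quality score, power draw in milliwatts)
    ("high", 10, 900),
    ("medium", 7, 650),
    ("low", 4, 400),
]

def pick_profile(power_budget_mw: int):
    """Best-quality profile that fits the budget, or None if none fit."""
    feasible = [p for p in PROFILES if p[2] <= power_budget_mw]
    return max(feasible, key=lambda p: p[1]) if feasible else None

# As the battery drains, the budget shrinks and the policy steps down.
print(pick_profile(1000))  # full battery: highest quality
print(pick_profile(700))   # constrained: middle profile
```

A real framework would also weigh network conditions, but the shape of the decision, constrained maximization at run time, is the same.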
Researchers: Impact of Censorship Significant on Google,
Other Search Engine Results
Network World (03/15/06)
Country-specific search engines that have free-speech restrictions often
produce different results for searches, according to researchers from
Indiana University. Filippo Menczer, associate professor of informatics
and computer science, and Mark Meiss, a computer science doctoral student,
are behind the CenSEARCHip project, which comes at a time when Google,
Yahoo!, and MSN are developing different versions of their search engines
for specific countries. Menczer and Meiss have set up a Web site that
details the differences in the query results generated by such search
engines, and provides side-by-side query results. Meiss says conducting a
search on political topics such as human rights and democracy will lead to
different results. In response to a query on Tiananmen Square, for
example, the U.S. search site would provide text references and images
of the Chinese government crackdown on protestors, while the Chinese
site would deliver results primarily for hotel and tourist information. "We
wanted to explore the results returned by major search engines and in doing
so to foster an informed debate on the impact of search censorship on
information access throughout the world," says Menczer.
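One simple way to quantify how far two localized result sets diverge is set overlap. The Jaccard metric below is an illustrative choice, not necessarily what CenSEARCHip computes, and the result URLs are hypothetical:

```python
# Quantifying divergence between two engines' result lists with the
# Jaccard overlap of returned URLs (1.0 = identical sets, 0.0 = fully
# disjoint). The metric and the example results are illustrative.

def jaccard(results_a: list[str], results_b: list[str]) -> float:
    """Fraction of shared results: |A intersect B| / |A union B|."""
    a, b = set(results_a), set(results_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical top results for the same query on two localized engines.
us_results = ["protest-history.org", "news-archive.com", "hotels.example"]
cn_results = ["hotels.example", "tourism.example", "maps.example"]

print(f"overlap: {jaccard(us_results, cn_results):.2f}")
# A low overlap on politically sensitive queries is the signal the
# researchers' side-by-side comparisons make visible.
```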
Mobiles May Beam Cheap Broadband to Bush
Australian Broadcasting Corp. News (03/13/06) Salleh, Anna
New-generation personal computers and mobile phones have the potential to
bring more affordable wireless broadband services to areas with low
population numbers, according to Dr. Mehran Abolhasan of the University of
Wollongong. Abolhasan, a telecommunications and computer engineering
specialist, is the head of a project that will attempt to use an ad-hoc
network to bring cheap broadband access to an indigenous community in
Western Australia during the second half of the year. The ad-hoc network
will make use of small portable computer devices, such as PCs and mobile
phones, to transmit and receive microwave signals, and act as nodes for a
communications network. The use of inexpensive technology to communicate
wirelessly between units and free open-source software would keep the cost
down for broadband in remote communities, according to Abolhasan. He also
likes the decentralized approach because the devices would still be able to
communicate with each other and re-route messages if a unit breaks down.
Abolhasan believes the ad-hoc network would be able to extend existing
infrastructure such as a satellite network to individual homes, and
facilitate communications locally such as the operation of a broadband
television station from a community center. Such networks need little
power and can run on solar energy. "If it works in the outback, it should
work just about anywhere," Abolhasan says of technology he believes could
be available within three to five years.
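The self-healing behavior Abolhasan describes, nodes re-routing messages around a failed unit, can be sketched with a breadth-first search over a made-up topology. The node names and links below are hypothetical:

```python
from collections import deque

# Sketch of ad-hoc mesh re-routing (illustrative topology, not the
# Western Australia deployment): each device relays for its neighbors,
# and if one node fails, BFS still finds a path around it.

MESH = {  # adjacency list: which devices can hear each other
    "home_a": ["home_c", "tower"],
    "home_b": ["home_c", "tower"],
    "home_c": ["home_b", "home_a"],
    "tower":  ["home_a", "home_b"],
}

def route(src: str, dst: str, failed: frozenset = frozenset()) -> list:
    """Shortest hop path from src to dst avoiding failed nodes ([] if none)."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in MESH[path[-1]]:
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return []

print(route("home_c", "tower"))                        # via home_b
print(route("home_c", "tower", frozenset({"home_b"}))) # reroutes via home_a
```

The decentralization pays off exactly here: no single relay is a hard dependency as long as an alternative path exists.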
Calit2 Researchers Deploy Disaster Communications Network
at San Diego Mardi Gras Festivities
UCSD News (03/13/06) Curran, Maureen C.; Ramsey, Doug
Researchers from the California Institute for Telecommunications and
Information Technology (Calit2) teamed up with San Diego law enforcement to
create a wireless mesh network, stringing together wireless boxes, cameras,
laptops, cell phones, and a satellite dish to provide real-time information
to first responders during Mardi Gras festivities. The tests of the
network during Mardi Gras proved that it could be used to disseminate video
feeds and other information during an emergency or disaster. "This was the
real world converging with research, prototyping, developing, and improving
tools," said Calit2 UCSD director Ramesh Rao, a professor of electrical and
computer engineering in the Jacobs School of Engineering. To simulate
disaster conditions, the researchers acted as if the communication network
in a 24-block area of downtown San Diego was already down when setting up
the network. Each camera installed contained a networking box to link back
to the police command center, and police could monitor the video feeds on
their cell phones. The small screen made it difficult to see in great
detail, but the camera feeds to the police command posts were of high
quality. Police also tested a wireless system for tracking the locations
of fellow officers and equipment. While inclement weather forced the
researchers to outfit the equipment with hastily assembled rain gear, the
system met their expectations and all the devices functioned
satisfactorily.
Beyond Benchmarking
HPC Wire (03/17/06) Vol. 15, No. 11; Feldman, Michael
The Performance and Architecture Lab (PAL) at Los Alamos National
Laboratory (LANL) is exploring new methods to analyze and predict the
performance of supercomputers. Predicting and calibrating performance can
help inform budget and procurement decisions at LANL, as well as its
government sponsors, such as DARPA and NNSA. PAL team leader Adolfy Hoisie
notes that benchmarking is no longer sufficient to deal with the
complexities of performance modeling that must now address application
workload, hardware architecture, and the operating system. PAL can model a
broad array of clustered systems containing various processors,
interconnects, and other pieces of hardware. Hoisie says that his
researchers discovered that LANL's ASCI Q supercomputer was only
operating at half its potential capacity and located the specific sources
of performance degradation. The model is so thorough because it tracks the
performance of complete applications to produce a model of optimal
performance. LANL focuses particularly on computational biology,
astrophysics, and global climate modeling. LANL researchers have
specifically been looking at using hardware accelerators to make systems
more heterogeneous. With its multicore architecture, IBM's Cell Broadband
Engine was an early example of an integrated heterogeneous system. Hoisie
predicts that the next crop of petaflop machines will continue to comprise
clustered nodes of steadily increasing processor counts and new high-speed
interconnect fabric. Working closely with IBM, PAL also assessed the
future of the Blue Gene architecture. While the researchers continue to
center their attention on Linux, they are considering a future where
today's Linux operating model is impractical for systems containing
thousands of processors.
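An analytic model in the spirit of PAL's approach can be sketched in a few lines. The latency-plus-bandwidth form is a standard modeling idiom; the model and all numbers below are illustrative, not LANL's:

```python
# Toy analytic performance model (illustrative, not PAL's actual model):
# time per iteration is a compute term plus a network term, where the
# network term uses the classic latency + size/bandwidth form.

def predicted_time(flops: float, flop_rate: float,
                   msg_bytes: float, latency_s: float,
                   bandwidth_bps: float) -> float:
    """Seconds per iteration: compute term plus one message exchange."""
    compute = flops / flop_rate
    network = latency_s + msg_bytes * 8 / bandwidth_bps
    return compute + network

def efficiency(measured_s: float, modeled_s: float) -> float:
    """How close a run comes to its modeled optimum (1.0 = at model)."""
    return modeled_s / measured_s

# Hypothetical node: 1 GFLOP of work at 2 GFLOP/s, one 1 MB message over
# a 10 Gb/s link with 5 microseconds of latency.
t = predicted_time(1e9, 2e9, 1e6, 5e-6, 10e9)
print(f"modeled: {t * 1e3:.3f} ms")
# A run twice as slow as its model, like ASCI Q at half capacity, shows
# up as 50% efficiency and points the analyst at a degradation source.
print(f"efficiency: {efficiency(2 * t, t):.0%}")
```

Comparing measured runs against such a model, rather than against other machines' benchmark scores, is what lets a lab locate where performance is being lost.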
One Language to Bind Them All
Software Development Times (03/01/06) No. 145, P. 27; O'Brien, Larry
A battle for a new form of programming--.NET programming--will be fought,
using the C# programming language as the battleground. C# is expected to
be the language that is most tightly coupled to the underlying Common
Language Infrastructure (CLI) platform, allowing the language to continue
its domination of .NET programming. Language Integrated Query (LINQ) will
largely reign over the next iteration of C# and the transition to .NET
programming. But even prior to that, Microsoft's WinFX, the
next-generation application programming interface (API) for Windows
operating systems, will spotlight the CLI and C#. The integration of the
CLI and SQL Server 2005 functioned as the testbed of C# and the CLI's
ability to be employed in the toughest environments, while sources within
Microsoft say the needs of the SQL server team helped the CLI and Base
Class Library teams address quality and performance issues and create
enhancements. The integration demonstrates that C# and managed code
subsystems can be blended into massive codebases with challenging
performance requirements. Managed code will be the platform of choice for
the bulk of Windows development with the emergence of WinFX.
Hollywood: The Revenge
New Scientist (03/11/06) Vol. 189, No. 2542, P. 42; Fox, Barry
Accompanying the release of new blue-laser discs that can record and store
high-definition movies is renewable copy protection, a means for Hollywood
studios to update copy safeguards on consumer disc players with no need for
a phone line or Internet link. Playing a disc triggers the duplication of
encrypted data onto the player and the upgrade of anti-piracy software, but
people warn that unforeseen consequences could provoke a consumer backlash.
The possibility that the technology may unintentionally render players
incapable of playing new or favorite discs is one fear, but such problems
will not become evident until the technology is widely used. The
Electronic Frontier Foundation's Seth Schoen reports that two issues should
be of concern to customers: The risk of accidental equipment breakage, and
the risk that the entertainment companies will elect to remove
functionality that the product has at the time of purchase. Cryptography
Research offers Self-Protecting Digital Content (SPDC) as an alternative to
renewable copy protection: SPDC conceals a computer program on every disc
that interrogates the player's hardware and software before the disc is
played, and temporarily halts playback and displays movie studio contact
details on the screen if anything anomalous is detected.
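The disc-side check SPDC performs can be caricatured in a few lines. The state keys and values below are invented; real SPDC code is proprietary and far more sophisticated:

```python
# Caricature of the SPDC idea as described in the article (illustrative
# only): each disc carries a check routine that interrogates the
# player's state before playback and halts if anything looks anomalous.

EXPECTED = {"firmware": "1.2", "region": "A", "hdcp": True}  # hypothetical

def disc_check(player_state: dict) -> str:
    """Return 'play' if the player matches expectations, else 'halt'."""
    for key, value in EXPECTED.items():
        if player_state.get(key) != value:
            return "halt"   # playback pauses; studio contact info shown
    return "play"

print(disc_check({"firmware": "1.2", "region": "A", "hdcp": True}))
print(disc_check({"firmware": "1.2", "region": "A", "hdcp": False}))
```

The consumer-backlash worry maps directly onto this sketch: any legitimate player whose state drifts from `EXPECTED`, say after a firmware update, fails the check too.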
The Elusive Goal of Machine Translation
Scientific American (03/06) Vol. 294, No. 3, P. 92; Stix, Gary
Software developers contend that machine translation (MT) is starting to
approach human-level performance thanks to brute-force computing
techniques. Slow progress in this area since the first MT experiments in
the 1950s led to a scarcity of funding and enthusiasm, while Systran, the
largest MT company currently in existence, saw only $13 million in annual
revenue for 2004 because of the shortcomings of its rules-based system.
Such systems require language specialists and linguists in specific
dialects to arduously produce large lexicons and rules relating to
semantics, grammar, and syntax. Statistical MT uses brute-force
calculation to crunch through existing translated documents to ascertain
the probability that a word or phrase in one language corresponds to
another. Using statistics to gauge how frequently and where words occur in
a given phrase in both languages provides a word reordering template for
the translation model. A language model uses its own statistical analysis
of English-only texts to predict the most likely word and phrase ordering
for the already-translated text; thus, the probability that a phrase is
correct directly reflects how often it occurs in the text. The differences
between statistical MT and rules-based MT are fading slightly as
statistical MT researchers have begun to employ methods that account for
syntax, and that eliminate the intercession of linguists. Nevertheless,
"The use of statistical techniques, coupled with fast processors and large,
fast memory, will certainly mean we will see better and better translation
systems that work tolerably well in many situations, but fluent
translation, as a human expert can do, is...not achievable," says Keith
Devlin of Stanford University's Center for the Study of Language and
Information.
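The statistical pipeline described above, a translation model weighted by a language model, can be shown in miniature. All probabilities below are toy numbers, not drawn from any real system:

```python
# Noisy-channel statistical MT in miniature (toy numbers): the
# translation model scores how likely a foreign phrase maps to an
# English one, the language model scores how fluent the English is,
# and the decoder picks the candidate maximizing their product.

# P(english | foreign), as if estimated from parallel texts.
translation_model = {
    ("maison bleue", "blue house"): 0.6,
    ("maison bleue", "house blue"): 0.4,
}
# P(english), as if estimated from English-only texts.
language_model = {
    "blue house": 0.05,
    "house blue": 0.001,
}

def best_translation(foreign: str, candidates: list) -> str:
    """Pick the candidate maximizing P(f -> e) * P(e)."""
    def score(english: str) -> float:
        return (translation_model.get((foreign, english), 0.0)
                * language_model.get(english, 0.0))
    return max(candidates, key=score)

print(best_translation("maison bleue", ["blue house", "house blue"]))
# The language model's word-ordering statistics overrule the wrong
# ordering, even though its translation probability is not negligible.
```

Scaling the same product over billions of phrase pairs is exactly the brute-force computation the article credits for statistical MT's recent gains.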