Casting Ballot From Abroad Is No Sure Bet
New York Times (06/13/07) P. A1; Urbina, Ian
The Defense Department has spent more than $30 million over the past six
years trying to find a way to help the 5 million American soldiers and
civilians living abroad vote securely and efficiently, but no clear
solution has been found so far. The Pentagon's current Web-based system is
slow, confusing, and riddled with security and privacy problems, according
to security experts and congressional auditors, who say the system is
vulnerable to undetectable hacking and vote tampering. Only 63 voters used
it in the 2006 election to request and return ballots via the Internet,
according to the Defense Department. Although the department is
responsible for helping all overseas voters, civilians are not allowed
access to the system. The traditional paper ballots sent to overseas
voters have also caused problems, as voters often wait until the last
moment to return the ballot, get confused because rules and deadlines vary
state to state, and ballots are often lost or delayed in the mail. The end
result is that anywhere from a quarter to half of all overseas voters fail
in their attempt to vote, according to voting experts at the National
Defense Committee and the Overseas Vote Foundation. Military officials say
their voting system is merely a means of expanding options beyond the use
of regular mail, and that voting assistance officers have been placed in
military units worldwide. Department of Defense spokesman Stewart Upton
did not answer questions about the military's system, except to say that
the Pentagon has asked for suggestions from the private sector on how to
improve the system before the next election.
House Committee Begins Examination of Offshore
Outsourcing's Impact
Computerworld (06/12/07) Thibodeau, Patrick
At a congressional committee hearing focusing on the impact of IT
offshoring, U.S. Rep. Bart Gordon (D-Tenn.) warned that soon the best jobs
may all be located overseas. "If current trends continue, for the first
time in our nation's history, our children may grow up with a lower
standard of living than their parents," said Gordon, chairman of the House
Committee on Science and Technology. The hearing included testimony from a
four-member panel of policy analysts and industry representatives, and was
the first in a series of "fact-finding explorations" on the impact
outsourcing is having on U.S. workers and the domestic economy. Martin
Baily, chairman of the Council of Economic Advisers during the Clinton
administration, said the United States is benefiting from the influx of
talented scientists and technologists who want to study and work in the
United States, but that such foreigners are not treated well by the U.S.
government. Baily said the United States needs to do more to strengthen
its educational systems and encourage students to major in science and
technology. Princeton University economics professor Alan Blinder testified
that the outsourcing of IT services jobs is still in its infancy, but that
it is affecting a demographic (white-collar workers) unaccustomed to
competing with low-cost overseas labor. "This strikes me as a potentially
potent political group," Blinder said. He added that the U.S. government
needs to adopt policies to ease the transition and minimize the negative
impact of outsourcing, including improved unemployment pay, health
insurance, and job training for displaced workers.
Bailey Organizes Grant Program for Faculty of High
Minority Enrollment Institutions
Hamilton College (06/12/07)
Various ACM special interest groups have matched a grant that Mark Bailey,
an associate professor of computer science at Hamilton College, has secured
from the National Science Foundation to help faculty from colleges with
large numbers of minority students attend this week's Federated Computing
Research Conference in San Diego. Bailey manages the program, which
provides travel support so that the faculty members can attend one of the
17 research conferences. The hope is that attending will serve as a
catalyst for the development of research programs at the faculty members'
schools and help them learn how to attract more students from
traditionally underrepresented groups to computer science.
Purdue Creates Scientifically Based Animation of 9/11
Attack
Purdue University News (06/12/07) Tally, Steve
Purdue University researchers have created an animated simulation of the
attacks that toppled the towers of the World Trade Center on Sept. 11,
2001, so that structural engineers can study the buildings' collapse and
help prevent future disasters. "Scientific simulations
restrict us to showing the things that are absolutely essential to the
engineer," explains Rosen Center for Advanced Computing director Christoph
Hoffmann. "This gives us a simulation that doesn't deliver much visual
information to a layperson. Our animation takes that scientific model and
adds back the visual information required to make it a more effective
communication tool." The new animated visualization owes a lot to computer
science professor Voicu Popescu, who devised a translator application that
establishes a connection between computer simulations and computer
visualization systems to automatically render simulation information as a
three-dimensional animated scene. The animation clearly represents
elements, such as fire and smoke, that were not included in the scientific
simulation, imbuing the computer model with a previously absent level of
realism, according to Popescu. The visualization shows that most of the
damage to the towers was caused by the weight of the fuel carried by the
aircraft that slammed into the buildings, and not the aircraft themselves.
The National Science Foundation partially funded the Purdue research.
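Popescu's translator is specific to the Purdue toolchain, but the general
pattern it embodies, walking a simulation's per-timestep element states and
emitting keyframes a visualization system can play back, can be sketched
briefly. In the Python sketch below, the CSV columns and keyframe fields
are hypothetical stand-ins, not the formats the actual tool uses:

```python
import csv

def simulation_to_keyframes(states_csv: str) -> list[dict]:
    """Convert per-timestep element states into animation keyframes.

    Assumes hypothetical columns t, element_id, x, y, z, failed; a real
    translator would read the simulation code's native output format.
    """
    frames: list[dict] = []
    with open(states_csv, newline="") as f:
        for row in csv.DictReader(f):
            frames.append({
                "time": float(row["t"]),
                "element": row["element_id"],
                "position": (float(row["x"]), float(row["y"]),
                             float(row["z"])),
                "failed": row["failed"] == "1",  # drives fire/smoke cues
            })
    return frames
```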
Tech Companies Set Goals for Energy Efficiency
IDG News Service (06/12/07) Gohring, Nancy
The Climate Savers Computing Initiative is a coalition of some of the
largest technology companies committed to saving energy by improving the
power efficiency of the equipment they make and use. Companies committed
to the program include Google, Microsoft, Intel, Hewlett-Packard, Dell, and
Sun Microsystems. The group plans to improve power efficiency for
computers and servers and to encourage end users to apply underused power
management techniques. Google's Urs Holzle said only about 50 percent of
the power that leaves the outlet reaches a PC because energy is lost in
inefficient power supplies. Climate Savers has established a series of
standards for power efficiency in servers and PCs that members are
encouraged to adopt by July 2010. A more efficient power supply for a PC
would cost about $20 more, and a power-efficient server would cost an
additional $30, according to Intel's Pat Gelsinger. Gelsinger said that
over time the cost premium will
drop as volume production increases, and that end users will save on energy
bills, also helping to offset the cost. Climate Savers also will work to
educate and encourage end users to take advantage of power management
mechanisms built into PCs. "Ninety percent of PCs are capable but aren't
utilizing power management techniques," Gelsinger said. Climate Savers
standards for improving power supply efficiency and power use management
techniques would reduce global carbon emissions by 54 million tons per
year, and would save a projected 62 billion kilowatt hours of energy in
2010, worth about $5.5 billion in energy costs, according to the group.
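As a sanity check, the group's numbers can be reproduced with simple
arithmetic. The short Python sketch below derives the electricity price
implied by the projections and illustrates how supply efficiency
translates into wall-outlet draw; the 80 percent target efficiency and
100 W load are illustrative assumptions, not figures from the group.

```python
# Back-of-envelope check of the Climate Savers figures quoted above.
energy_saved_kwh_2010 = 62e9          # projected 2010 savings, kWh
dollar_savings_2010 = 5.5e9           # projected 2010 savings, USD

implied_price = dollar_savings_2010 / energy_saved_kwh_2010
print(f"Implied electricity price: ${implied_price:.3f}/kWh")  # ~$0.089

# If only ~50% of wall power reaches the PC today, raising supply
# efficiency cuts wall draw by the ratio of the efficiencies.  The 80%
# target and 100 W load are assumptions for illustration only.
pc_load_watts = 100
wall_now = pc_load_watts / 0.50       # 200 W drawn from the outlet today
wall_better = pc_load_watts / 0.80    # 125 W at 80% efficiency
print(f"Wall draw falls from {wall_now:.0f} W to {wall_better:.0f} W")
```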
ACM GIS 2007 CFP Extended
O'Reilly Radar (06/12/07) Forrest, Brady
The 2007 ACM International Symposium on Advances in Geographic Information
Systems is scheduled for Nov. 7-9, 2007, in Seattle. ACM says the series
of symposia and workshops will foster interdisciplinary discussion and
research among researchers, developers, users, and practitioners involved
with novel systems based on geospatial data and knowledge. ACM GIS offers
a forum for original research contributions involving all conceptual,
design, and implementation aspects of GIS, extending from applications,
user interface considerations, and visualization to storage management and
indexing issues. The event has been held since 1993, but for the first
time it will be held apart from its long-time host conference, in part to
give it more name recognition in the GIS community. Judging from last
year's program, the event may appear academic, but it covers a wide and
fascinating range of topics, and Microsoft and Google are sponsors.
Topics include image databases and 3D spatial modeling under the
category of modeling and querying, computational geometry and spatial data
mining under systems and implementation, and earth observation and
geosensor networks under applications. Reaching out to blogs might
generate some real-world discussion. For more information about ACM GIS
2007, visit
http://www.cise.ufl.edu/dept/acmgis2007/
International Team Rebuilds Ancient Rome Digitally
UVA Today (University of Virginia) (06/11/07) Ford, Jane
An international team of archaeologists, architects, and computer
specialists from Italy, the United States, Britain, and Germany has
finished the largest, most complete simulation of a historical city ever
created. The simulation, called "Rome Reborn 1.0," is the result of a
10-year project, based at the University of Virginia and started at the
University of California, Los Angeles. The simulation was created using
the same high-tech tools that are used for creating simulations of modern
cities, such as laser scanners and virtual reality. Rome Reborn 1.0 is a
3D model of Rome in 320 A.D. that runs in real time, allowing users to
navigate with complete freedom, even entering important buildings such as
the Roman Senate House, the Colosseum, or the Temple of Venus and Rome, the
largest place of worship in Rome. Rome Reborn 1.0 can be updated to
include new information as discoveries are made, and future releases of
the program will cover other phases in the evolution of the city, from the
late Bronze Age in the 10th century B.C. to the Gothic Wars in the 6th
century A.D. Virtual modeling has allowed historians, archaeologists, and
scientists to recreate buildings and monuments that no longer exist or to
digitally restore sites that have been damaged. The models can be used to
test new theories or to take students on virtual tours of historical sites.
Rome Reborn project director Bernard Frischer, director of the Institute
for Advanced Technology in the Humanities at the University of Virginia,
says, "This is just the first step in the creation of a virtual time
machine, which our children and grandchildren will use to study the history
of Rome and many other great cities around the world."
New Games Blur Reality, Fantasy
Associated Press (06/12/07) Bluestein, Greg
The latest advances in virtual reality gaming feature sleeker, more
mobile, and more environmentally interactive experiences based on augmented
reality (AR). For example, University of South Australia researchers have
created a version of the popular shooting video game Quake that allows
users with a wraparound visor and backpack to walk around streets and fight
superimposed computer objects. The National University of Singapore has
created a human Pac-Man game that places virtual yellow dots along city
streets, letting users play as either Pac-Man or one of the ghosts. Mark
Billinghurst has created an animated children's book that turns into a 3D
popup, changing with each page when viewed through head-mounted goggles.
Billinghurst has also created an AR tennis game that allows players to use
their cell phones as rackets on a virtual court superimposed on a regular
table. "Within five years people will be able to easily experience
augmented reality applications on their mobile phones, in their homes,
schools, hospitals, workplace and cars," Billinghurst says. "One of the
most exciting things is that the current generation of mobile phones have
the processing power, display resolution, and camera quality necessary to
provide compelling AR experiences." Georgia Tech researchers are
developing AR Facade, an AR game that simulates an argument between a
married couple. The player can choose to do nothing, try to defuse the
situation, or intensify the argument. As the player talks, a researcher
types the words into a computer behind the set, which uses complex
algorithms to determine the virtual character's responses. Georgia Tech
human-centered computing Ph.D. student Steven Dow says the purpose is to
gain a better understanding of how humans and computers interact.
Luring the Other 68 Percent Through the IT Door
eWeek (06/08/07) Perelman, Deborah
Penn State researchers report that women are underrepresented in the IT
industry, and that increasing the number of women in the field would
expand the overall IT workforce and promote diversity.
According to the U.S. Bureau of Labor Statistics, in 2004 women accounted
for almost 60 percent of the U.S. labor force, but only 32 percent of the
IT workforce. A 2005 Information Technology Association of America study
also found that women who leave the IT industry are less likely to return
than their male counterparts. The new research paper, "What Do Women
Want?: An Investigation of Career Anchors Among Women in the IT Work
Force," found that a recruiter's typical sales pitch emphasizing job
promotion and security is not as attractive to women as it is to men.
Professor of information sciences and technology Eileen Trauth says,
"Human-resources personnel need to recognize that women have diverse values
and motivations throughout their careers and tailor hiring and retention
practices to fit those needs." Although researchers found that women's
career anchors were fairly constant throughout their careers, some were
subject to variation. For example, those that valued technical competence
early in their careers generally placed value on it later in their careers,
while lifestyle factors, such as the desire to balance life and work, were
important to women with young children but less so as the children aged.
"Addressing women's under-representation not only will help tackle the
anticipated IT worker shortage but will help foster a diverse work force, a
cornerstone of both innovation and economic development," Trauth says.
Hardware Designed to Protect Data From Theft By
Hackers
Chicago Tribune (06/11/07) Van, Jon
In an effort to make computers more secure and reliable, University of
Illinois at Urbana-Champaign researchers have been working for more than a
year on the Trusted ILLIAC project, an effort to develop hardware that is
capable of configuring itself to give each application a unique signature.
The hardware cannot be reprogrammed by hackers and creates a barrier to
protect sensitive data. "Hackers cannot reprogram it, and even insiders
cannot access this data," says Ravi Iyer, chief scientist of the
university's Information Trust Institute. "If they try to access it, they
crash the application. They cannot corrupt it or even touch it." The
National Science Foundation provided funding for the project, and
university researchers also worked with researchers from Motorola, IBM,
Hewlett-Packard, and Intel. Iyer says prototypes of the hardware could be
packaged as cards that plug into computers, but incorporating the
technology directly into processors is the more likely application.
That Price Tag Should Make Them Think Twice
Washington Technology (06/08/07) Lipowicz, Alice
During a congressional hearing before the House Subcommittee on Social
Security, computer expert Peter G. Neumann suggested the federal
government should be cautious about implementing the Electronic Employment
Verification System (EEVS) on a larger scale. Speaking on behalf of ACM's
U.S. Public Policy Committee, Neumann testified that vulnerability to a
breach should be a concern, and that the consequence of identity theft
would be devastating considering the verification system would contain all
the main personal identifiers that are used in the United States. "Any
compromise, leak, theft, destruction or alteration of the data would have
severe consequences to the individuals involved, including, but not limited
to, identity theft and impersonation," said Neumann, principal scientist in
the computer science laboratory at SRI International. Security risks
related to transmission of information, accountability for access to
information, scalability to handle the increase in user volume, and
accuracy of data are potential vulnerabilities, he explained. EEVS is an
immigration-control measure backed by President Bush under which employers
would use the database to check workers' Social Security numbers. Bush
wants to expand the verification system and make it mandatory for all 5.9
million U.S. employers. The verification system could cost $370 million to
$470 million
a year, reported Richard Stana of the Government Accountability Office.
For more information about the hearing, visit
http://www.acm.org/usacm
IU Data Capacitor Enables Collaboration for Faster,
Better Science
Indiana University (06/12/07)
Indiana University's Data Capacitor demonstrated a single client transfer
rate of 977 MB per second across the TeraGrid network. The data was copied
from a single computer equipped with a 10 Gigabit Ethernet card at Oak
Ridge National Laboratory to the Data Capacitor at IU's Bloomington campus.
The demonstrated transfer represents nearly 80 percent of the 10 Gigabit
network's theoretical capacity, according to Stephen Simms, who gave a
presentation on the Data Capacitor. "This technology has the potential to
significantly change how scientists collaborate across distance," Simms
says. Since its launch in April, the Data Capacitor file system has
supported several high-profile projects, including the Linked Environment
for Atmospheric Discovery Science Gateway, which provides meteorological
data, forecast models, and analysis tools, as well as the international
federation of crystallography labs under the Common Instrument Middleware
Project. Data Capacitor principal investigator Craig Stewart, the
associate dean of research technologies and chief operating officer of the
Pervasive Technology Labs at IU, says, "The wide-area capabilities we have
demonstrated for the Data Capacitor and the TeraGrid will enable IU to
better support scientific workflows--the end to end transformation of data
into knowledge through use of advanced cyberinfrastructure."
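The "nearly 80 percent" utilization figure checks out arithmetically, as
the following Python snippet shows (treating MB as 10^6 bytes and
comparing against the raw 10 Gb/s line rate, which ignores protocol
overhead):

```python
# Check the "nearly 80 percent of theoretical capacity" claim.
link_bits_per_sec = 10e9              # 10 Gigabit Ethernet raw line rate
transfer_bytes_per_sec = 977e6        # demonstrated rate: 977 MB/s

utilization = (transfer_bytes_per_sec * 8) / link_bits_per_sec
print(f"Link utilization: {utilization:.1%}")  # -> 78.2%, nearly 80%
```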
DAC Panelists Call for IP Reuse Standards
EE Times (06/08/07) Mokhoff, Nicolas
Standards for silicon intellectual property (IP) were the focus of a panel
discussion at the Design Automation Conference on Thursday. Most
participants said IP reuse standards were needed, with Sungjoo Yoo, senior
research manager at Samsung's SoC R&D Center, noting that a common approach
to transferring IP between provider and receiver would boost design
reusability and lower development costs, especially as the industry moves
to design complex SoCs at process technologies of 45 nm and below.
However, the panelists did not offer much detail for specific steps to take
for reaching such a level of data exchange. John Goodenough, director of
design technology at ARM, said IP interoperability standards have been
beneficial at the architectural level, and added that representation of IP
use within design flows needs to be consistent. Though ARM supports
IP-XACT, Laurent Lestringand, IC design manager at NXP Semiconductors,
called the specification a good start but said it ultimately may not offer
the level of alignment IP providers and EDA vendors need to maximize the
benefits. "The
challenge is to enable a level of abstraction that allows system
architecture tradeoffs to be done together with the software and hardware
engineers where the software and future application is still being
developed or is speculative, and the IP is also in flight," said Loic Le
Tournelin, manager at Texas Instruments.
University of Portsmouth Scientists Reinvent the
Wheel
University of Portsmouth (06/08/07)
University of Portsmouth scientists are developing an artificial
intelligence (AI) system for car wheels that uses microcomputers to learn
how the car is being driven and to make adjustments according to traveling
speed and road conditions. The microcomputers perform 4,000
calculations per second, communicate with each other, and use AI to create
a safer and smoother drive. The driver remains in control of the car, but
the tires automatically adjust to changing conditions and speeds. These
intelligent tires mark the first time AI has replaced fundamental mechanics
within a motor vehicle. Artificial intelligence is used to control the
suspension, steering, and braking system, and is used to adapt to bends in
the road, potholes, and other potential hazards by adjusting the car's
reactions. Information gathered is retained in the computer's memory and
used the next time the car encounters similar road conditions. Dr. David
Brown of the University of Portsmouth's Institute of Industrial Research
says electronic traction control and suspension will counterbalance the
drop and drag effect of manual driving, "but the driver won't even know
it's there. It means a faster car but a safer one."
Laws Threaten Security Researchers
Dark Reading (06/08/07)
The Computer Security Institute (CSI) recently formed a working group of
Web researchers, computer crime law experts, and U.S. Department of Justice
agents to discuss the possible effects laws might have on Web 2.0
vulnerability research. The group's first report highlights the fact that
some Web researchers said that if they accidentally find a bug on a site,
they may not inform the Web site's owner for fear of prosecution. While
security researchers are freely able to find bugs in operating systems,
device drivers, and other applications on their own machines, Web
researchers trying to find bugs on Web servers are dangerously close to
violating laws designed to prevent hackers from tampering with Web servers'
machines, and are afraid of the repercussions. The report analyzes several
methods of Web research, including off-site information gathering about a
Web site, testing for cross-site scripting by sending HTML mail from the
site to the researcher's Web-mail account, intentionally causing errors on
the site, and conducting port scans and vulnerability scans. A Justice
Department representative said that use of only one of these methods might
not provide enough evidence to build a case against a hacker; a
prosecution would typically require evidence of several of these
techniques, along with evidence of attempts to hide such activity. The
CSI working group's
next objective is to explore disclosure policy guidelines and mirrored-site
guidelines for Web site owners. The group is also creating a list of
research methods so lawmakers and law enforcement can have a better
understanding of Web research methods.
File System, Power and Instrumentation: Can Linux Close
Its Technical Gaps?
LinuxWorld (06/06/07) Marti, Don
A major development push is necessary to close three technical gaps that
Linux developers perceive in the open source kernel project: the file
system, power management, and instrumentation, gaps that developer Andrew
Morton cited last month in a "State of the Kernel" talk at Google. Morton
said a new file system is in order, while file-system developer Val Henson
projected 16 percent growth in disk capacities, 5 percent growth in
bandwidth, and 1.2 percent growth in seek time by 2013; because capacity
is outpacing the speed at which disks can be scanned, this translates into
longer and longer running times for the file-system-checking utility
(fsck). Henson added that fsck will need to run more often because I/O
errors are increasing. As for power management, Morton
complained that Linux cannot support anything other than on and off, and
even managing these two states is problematic. Director of Intel's Open
Source Technology Center Imad Sousou noted that some of his developers are
focusing on the power problem, and one effort in this vein is power-saving
functionality for multicore processors. Morton said the initiative to
close the instrumentation gap is probably the most intensive, and he listed
per-task I/O accounting and per-process footprint monitoring as good signs
of progress. Linux founder Linus Torvalds concurred that the file system
and power management gaps must be addressed, while the instrumentation can
already sufficiently address real-world performance problems. He sees
"development flow" as another gap in need of resolution, explaining that
"We've always had issues with how certain subsystems end up having
development problems due to some infrastructure issue, or just personality
clashes, or some methodology being broken. Sometimes it's not
'subsystems,' but hits at an even higher level, and we end up having to
change how we do things in general."
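Henson's fsck argument can be made concrete with a toy model: if a full
check is scan-bound, its running time scales with capacity divided by
bandwidth, so the check slows down whenever capacity outgrows bandwidth.
The Python sketch below is a deliberate simplification using the growth
figures quoted above, not Henson's actual analysis:

```python
# Toy model: a full fsck must read metadata spread across the whole disk,
# so a scan-bound check's running time scales with capacity / bandwidth.

def relative_fsck_time(capacity_growth: float,
                       bandwidth_growth: float) -> float:
    """Factor by which a scan-bound fsck slows after both quantities grow."""
    return capacity_growth / bandwidth_growth

# Growth figures cited in the article (16% capacity, 5% bandwidth by 2013):
print(f"fsck runs {relative_fsck_time(1.16, 1.05):.2f}x longer")  # ~1.10x
```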
Exploring the Deep Web
Government Computer News (06/04/07) Vol. 26, No. 13, Robb, Drew
Conventional search engines cannot plumb the depths of the deep Web, where
about 94 percent of the Internet's content resides, according to Deep Web
Technologies President Abe Lederman. Professional researchers need the
information found in the deep Web, and they cannot afford the information
overload symptomatic of public search engines. Federated
search engines can probe the deep Web by searching multiple databases at
the same time, and they can eliminate information overload by only mining
those databases required by a specific kind of information customer. The
Energy Department's Office of Scientific and Technical Information hosts
federated search sites such as E-Print Network, Science.gov, and
EnergyFiles to accelerate research. The Defense Technical Information
Center has installed the Science and Technical Information Network
Federated Search site that the Defense Department community can use to find
research information located in databases from sources that include other
federal agencies and periodicals from the Air University Library and Joint
Forces Staff College. Lederman cautions that deploying a federated search
system is no simple matter, and the first step is determining precisely
what kinds of searches users will carry out and in what databases the
desired data is located. Furthermore, the user interface must be
intuitive, easy to use, and have sufficient detail so that users can
pinpoint the exact source of relevant data. Additional challenges include
setting up connections to data sources and keeping them updated, as well as
guaranteeing the relevance and comprehensiveness of the returned data.
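The core federated-search pattern, querying several databases in parallel
and merging the results, can be sketched in a few lines of Python.
Everything below is illustrative: the endpoint URLs and JSON result shape
are hypothetical, and a production system would need a per-source
connector that speaks each database's own query syntax.

```python
import json
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlencode
from urllib.request import urlopen

SOURCES = [  # hypothetical search endpoints
    "https://example-agency-a.gov/api/search",
    "https://example-agency-b.gov/api/search",
]

def query_source(base_url: str, terms: str) -> list[dict]:
    """Query one source; a failure just drops that source from the merge."""
    try:
        url = f"{base_url}?{urlencode({'q': terms})}"
        with urlopen(url, timeout=10) as resp:
            return json.load(resp).get("results", [])
    except OSError:
        return []

def federated_search(terms: str) -> list[dict]:
    # Fan the query out to every source concurrently.
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        batches = pool.map(lambda url: query_source(url, terms), SOURCES)
    # Merge, de-duplicating records that appear in multiple databases.
    seen, merged = set(), []
    for record in (r for batch in batches for r in batch):
        key = record.get("doi") or record.get("title")
        if key not in seen:
            seen.add(key)
            merged.append(record)
    return merged
```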
Automatic and Versatile Publications Ranking for Research
Institutions and Scholars
Communications of the ACM (06/07) Vol. 50, No. 6, P. 81; Ren, Jie;
Taylor, Richard N.
Rankings are the usual methodology for evaluating academic and industrial
research institutions and their scholars, and among the recommendations for
improving assessment is the adoption of more quantitative measures such as
research and student output. Publications are one such measure of research
output, but existing publication-based rankings are limited in scope by
their reliance on manual input, and their reported rankings are restricted
to specific fields. Google
software engineer Jie Ren and director of the University of California,
Irvine's Institute for Software Research Richard N. Taylor have devised a
framework for automatic and versatile publication-based ranking that taps
electronic bibliographic data to process a wider spectrum of journals and
conferences that extend over longer periods of time, and that can adapt to
various policy choices. These choices include what fields and entities to
rank, the importance of journals and conferences in the field, the weight
to assign papers from different conferences or journals, the number of
years of publications that should be included, and how the score should be
distributed among contributors for a paper by multiple authors. Ren and
Taylor's framework supports the choice of field or subfield; the inclusion
of proceedings of conferences and workshops; unrestricted conference and
journal selection; the latitude for evaluators to assign equal or divergent
weights to papers from different publications; ranking of scholars and
institutions, both academic and industrial, and from inside and outside the
United States; the use of any preferred year range; and either equal or
variable score distribution among multiple authors. The researchers
selected INSPEC, which consistently supplies author affiliation
information, as the data source during the framework's design and
experimentation phase. Ren and Taylor used the framework to rank U.S.
computing graduate programs and the software engineering field.
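The policy choices the framework exposes map naturally onto a small
scoring routine. The Python sketch below is illustrative only; the venue
weights, year range, and record format are invented stand-ins for the
evaluator-supplied policies and the INSPEC data the authors actually use.

```python
from collections import defaultdict

VENUE_WEIGHTS = {"ICSE": 1.0, "FSE": 1.0, "TSE": 0.8}  # evaluator's choice
YEAR_RANGE = range(1997, 2007)                          # evaluator's choice

def author_shares(n_authors: int, equal: bool = True) -> list[float]:
    """Equal split, or a decreasing split favoring earlier authors."""
    if equal:
        return [1.0 / n_authors] * n_authors
    raw = [1.0 / (i + 1) for i in range(n_authors)]
    return [r / sum(raw) for r in raw]

def rank(papers: list[dict]) -> dict[str, float]:
    scores: dict[str, float] = defaultdict(float)
    for p in papers:
        # Skip papers outside the chosen venues or year range.
        if p["venue"] not in VENUE_WEIGHTS or p["year"] not in YEAR_RANGE:
            continue
        for author, share in zip(p["authors"],
                                 author_shares(len(p["authors"]))):
            scores[author] += VENUE_WEIGHTS[p["venue"]] * share
    return dict(sorted(scores.items(), key=lambda kv: -kv[1]))

# Example: one ICSE paper with two authors gives each author 0.5 points.
print(rank([{"venue": "ICSE", "year": 2005, "authors": ["A", "B"]}]))
```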