Study Finds US Bias Against Women in Science
Reuters (09/18/06) Fox, Maggie
A new report from the National Academies suggests that a deep-rooted bias
in American culture is preventing women from reaching the highest levels in
science, math, and engineering. As a result, the nation is losing some of
its most promising leaders and researchers in these fields, a loss it
cannot afford if the country intends to compete globally in the years to
come. In
addition to cultural changes, more opportunities at research universities
need to be available to women, according to the report. The National
Academy of Science, the National Academy of Engineering, the National
Research Council, and the Institute of Medicine all participated in the
study, which examined the many arguments for why women are not excelling in
science, from biological differences, hormonal factors, and child-rearing
demands to ambition and performance. University of Washington-Seattle
executive vice provost Ana Mari Cauce says "the committee found no sound
evidence to support these myths and often good evidence to the contrary,"
such as the high performance of females in mathematics at the high school
level. Women do not earn as much and do not advance as quickly as men, and
the situation is even worse for female minorities, says the report. A
focus on recruiting and promoting women needs to come from trustees,
university presidents, and provosts. "It is not a lack of talent but an
unintended bias...that is locking women out," says University of Miami
President Donna Shalala, who headed the committee. For information
regarding ACM's Committee on Women in Computing, visit
http://women.acm.org
New Search Engine Can Be Used for Creative
Discovery
Newswise (09/18/06)
Virginia Tech's System X supercomputer is being used by researchers to
test a new search program called "Storytelling" that can find connections
between seemingly dissimilar information, unearthing a sequence of
relationships or events to build a chain of concepts between specific start
and end points. "The stories are pieced together by analyzing large
volumes of text or other data," explains Virginia Tech computer science
professor Naren Ramakrishnan. "Every day, there are new research results
reported in the [scientific] literature and there are discoveries waiting
to be made by exploring connections." Large-scale search engines such as
Google serve as the template for the storytelling algorithm. Each
supercomputer "node" is tasked with indexing a piece of the biological
literature, and the nodes share data to help concretize links and establish
connections. "In future work, we aim to investigate other ways to
construct stories that mimic or complement how biologists make connections
between concepts," reports Ramakrishnan. "Our eventual goal is a product
that is an important tool for reasoning with data and domain theories."
Virginia Tech biochemistry professors Richard Helm and Malcolm Potts used
Storytelling to explore connections between research papers on yeast and
its ability to enter into and exit from a state of reduced metabolic
activity. The researchers had Storytelling compare two PubMed articles
against the abstracts of 140,000 publications about yeast. Their work led
to the article, "Algorithms for Storytelling," by graduate student Deept
Kumar, Ramakrishnan, Helm, and Potts, that was published in the Proceedings
of the Twelfth ACM SIGKDD International Conference on Knowledge Discovery
and Data Mining (KDD'2006) in August 2006.
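The article does not reproduce Kumar et al.'s actual algorithm, but the chain-building idea can be sketched as a graph search. In this hypothetical illustration, each document is reduced to a set of terms, two documents are linked when they share a minimum number of terms, and a breadth-first search finds a shortest chain from the start document to the end document (document names and the overlap threshold are invented):

```python
from collections import deque

def story_chain(docs, start, end, min_overlap=2):
    # docs maps a document id to its set of terms. Two documents are
    # linked when they share at least `min_overlap` terms; a "story"
    # is a shortest chain of linked documents from start to end.
    paths = {start: [start]}
    queue = deque([start])
    while queue:
        d = queue.popleft()
        if d == end:
            return paths[d]
        for other in docs:
            if other not in paths and len(docs[other] & docs[d]) >= min_overlap:
                paths[other] = paths[d] + [other]
                queue.append(other)
    return None  # no chain of sufficiently related documents exists
```

Raising the overlap threshold demands stronger links, so chains become rarer but each step in the story is better supported.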
Boosting Software Developers' Productivity
IST Results (09/20/06)
An IST-sponsored research initiative has built a platform to ensure that
European software designers stay ahead of the technology curve by helping
them bring new applications to market faster. Drawing on the latest
techniques of model-driven development (MDD), the MODELWARE project
streamlines the process of incorporating cutting-edge basic research into
marketable software. "MDD improves developers' productivity by automating
production of most software artifacts, such as tests, documentation, and
code," said project coordinator Philippe Millot. Software development has
been significantly impeded by the multiple languages, country-specific
standards, and hardware systems that designers had to address, requiring
the painstaking reengineering of low-level code, Millot said. "MODELWARE
takes the low-level code of an environment and draws a model that employs a
much higher level of specification and design abstractions. In other
words, the developer can work in the domain language he knows best," Millot
said. The machine-readable and executable models enable developers to use
sophisticated simulation tools early in the design process to test their
ideas, simply pushing a button to convert the model into system code. The
MODELWARE developers also designed a method for organizations to manage
important technological changes as they emerge, assessing the needs of the
organization and defining the steps required to implement and optimize the
change. The MODELWARE partners are working to ensure that industry adopts
the project's methods and tools. Many of the MODELWARE components, which
are already being deployed in areas such as telecom and air-traffic
management, are open source and freely available on the project's Web
site.
CMU Computer Science Professor Wins $500,000 Genius
Award
Pittsburgh Post-Gazette (09/19/06) Templeton, David
Carnegie Mellon University computer science professor Luis von Ahn is one
of this year's 25 recipients of the MacArthur "genius" fellowship, which
comes with a $500,000 award "to reflect, explore, and create." The
MacArthur Fellows Program describes von Ahn as "a young computer scientist
working at the intersection of cryptography, artificial intelligence, and
natural intelligence to address problems of profound theoretical and
practical importance," adding that he is "tackling ever more challenging
questions at the frontiers of computer science." The award comes just one
week after the 28-year-old Guatemala native was named by Popular Science
magazine as one of its "Brilliant 10." Von Ahn is credited with having
invented the area of computer science known as human computation, which
harnesses the computational abilities of humans to tackle large-scale
problems still beyond the reach of computers alone. He also helped invent
"Captchas," the security technique that prompts a user to enter the numbers
and letters displayed in an obscured script when seeking access to a secure
Web site. Other areas of von Ahn's research include gaming and computer
image-recognition. His "ESP Game," which has been licensed by Google,
creates image labels by pairing two random players who type descriptive
words about a photograph or other posted image until they hit on the same
term, ending the segment. With a couple billion images posted on Google,
von Ahn claims that it would take 5,000 participants two months of
continuous play to label all the images. Already, the amount of time
participants have spent playing ESP has exceeded the 7 million human hours
it took to build the Empire State Building, he says. "It took 80,000
people to build the Panama Canal and 50,000 people to send people to the
moon," he said. "Now we can have a billion people working together."
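Von Ahn's matching rule is simple to sketch. The toy simulation below is invented for illustration (the deployed ESP Game also bans "taboo" words that earlier pairs have already agreed on for an image): both players' guess streams are scanned in order, and the round ends at the first word both players have typed:

```python
from itertools import zip_longest

def esp_round(guesses_a, guesses_b, taboo=frozenset()):
    # Scan both players' guess streams in lockstep; the first word that
    # both players have typed (and that is not taboo) becomes the label.
    seen_a, seen_b = set(), set()
    for a, b in zip_longest(guesses_a, guesses_b):
        if a is not None and a not in taboo:
            if a in seen_b:
                return a
            seen_a.add(a)
        if b is not None and b not in taboo:
            if b in seen_a:
                return b
            seen_b.add(b)
    return None  # streams exhausted with no agreement
```

Because a label is only produced when two independent players converge on it, agreed-on words tend to be accurate descriptions of the image.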
UI Prof's Research Shows Games More Than Just Play
News-Gazette (09/18/06) Kline, Greg
University of Illinois speech communications professor Dmitri Williams
thinks massively multiplayer online games may actually cultivate
sociability rather than social isolation. Williams argues, "You get people
from a pretty wide range of backgrounds. The games drive people together.
They are levelers. It doesn't matter what you are outside." Williams and
Constance Steinkuehler of the University of Wisconsin co-authored a study
presented in August in the Journal of Computer-Mediated Communication
concluding that massively multiplayer games seem to be especially conducive
to social "bridging" in which people of differing backgrounds become
familiar with each other and broaden their horizons. Williams maintains
that live interaction is still far superior to online interaction. But the
UI professor believes the games fulfill a need in a country where people
are increasingly home-bound by choice, calling the games a virtual
equivalent to "third places" where people gather for informal social
interaction. Williams admits that the games do not seem to be particularly
conducive to social "bonding" that close friends enjoy, at least in the
short term. However, UI computer science student Shane Castle, president
of the game builders group in the ACM chapter on campus, says the
interaction with others is a big part of playing the games. He says, "You
group with people to accomplish tasks. You find people you relate to,
people you find interesting, people who have the same goals as you in the
game."
UA Awarded $3.3 Million to Increase Participation by
Women in Science and Engineering
University of Arizona (09/18/06)
The University of Arizona will seek to boost the participation of women in
science and engineering careers under a five-year, $3.3 million grant
awarded under the National Science Foundation ADVANCE program that aims to
diversify the science, technology, engineering, and mathematics (STEM)
workforce. One of just five universities to receive full funding under the
program, UA's "Eradicating Subtle Discrimination in the Academy" initiative
will pursue its goal through a software project that promotes more
equitable recruiting and negotiating with faculty as well as
interdisciplinary diversity grants, mentoring, and workshops for young
researchers. "Far too few women pursue careers and succeed in science and
engineering," says Leslie Tolbert, UA vice president for research and
principal investigator in the effort. "We are going to address a key
underlying problem, subtle discrimination, which thwarts many efforts to
recruit and retain women in these fields." For information regarding ACM's
Committee on Women in Computing, visit
http://women.acm.org
Berners-Lee: Intelligent Web Requires Co-Operation
ZDNet UK (09/19/06) Bennett, Jonathan
The key to creating the Semantic Web will be to present existing databases
in standard formats, Web inventor Tim Berners-Lee told participants at the
Ordnance Survey's Terra Future conference yesterday in Southampton,
England. His vision of the Semantic Web, which aims to add meaning and
intelligence to the mass of largely unstructured data scattered across the
Web, will draw its power from the fusion of multiple sources of data. The
conference's subject, Terra Future, is particularly relevant to the
next generation of the Web, given the proliferation of location-based data
that is being added to sites, he added. "Geospatial information is being
seen to be exciting by the Web 2.0 crowd, with things like geotagging, and
Google Maps," he said. By combining the date and time stamps from a picture
taken with a digital camera with information from his calendar, a computer
could infer where the photo was taken and add that to the image's metadata,
Berners-Lee said. That kind of integration will not be possible, however,
if industries and interest groups do not adopt shared standards, such as
Resource Description Framework, and develop a common vocabulary.
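The photo example amounts to a simple join across two personal data sources. A minimal sketch, with data formats invented for illustration, matches a photo's timestamp against calendar entries to infer where it was taken:

```python
from datetime import datetime

def infer_photo_location(photo_time, calendar):
    # calendar: list of (start, end, place) entries; return the place
    # whose time range contains the photo's timestamp, if any.
    for start, end, place in calendar:
        if start <= photo_time <= end:
            return place
    return None

# Hypothetical usage: a photo taken during a calendar appointment.
calendar = [(datetime(2006, 9, 19, 9, 0),
             datetime(2006, 9, 19, 17, 0),
             "Southampton")]
print(infer_photo_location(datetime(2006, 9, 19, 11, 30), calendar))
```

The fusion only works because both sources expose comparable timestamps, which is exactly the role shared standards such as RDF play in the Semantic Web vision.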
IU Informatics-Based Institute Will Benefit Tech
Industries, Scientists
Indiana University (09/19/06)
To accelerate the delivery of new data-search and usage techniques to
industry, the NSF has awarded the Indiana University School of Informatics
a grant to establish the Data and Search Institute (DSI). "Our mission is
to partner with industry to increase innovation and competitiveness in the
United States," said DSI director Beth Plale. "The future will be
dominated by those who can most effectively search for data, use it--and
create value from it." Florida International University is partnering with
Indiana in the project. The center's facilities are expected to be
available to industry executives in the region as well as to students, who
will be prepared for the real-world challenges that await them through
access to the latest equipment. The institute's nationally recognized
staff includes researchers with expertise in communication protocols,
service architectures, databases, artificial intelligence, human-computer
interaction, bioinformatics, and social informatics.
NSF Taps CRA to Form Computing Community
Consortium
CRA Bulletin (09/18/06) Harsha, Peter
The Computing Research Association has reached a three-year, $6 million
agreement with the National Science Foundation to establish a Computing
Community Consortium of computing experts that will provide leadership and
vision for the NSF's Global Environment for Networking Innovations project.
Under the agreement, the council will attempt to galvanize the research
community behind large-scale computing research efforts launched by the
NSF. "We're pleased that NSF has charged our organization with
establishing the CCC," says Dan Reed, chair of the Computing Research
Association and director of the Renaissance Computing Institute in North
Carolina. "Computing research continues to fuel the innovations that drive
economic productivity. We see the CCC as a mechanism that will enable
continued innovation by enhancing our community's ability to envision and
pursue long-term, audacious computing research goals." The CCC will be
led by a council of between nine and 15 diverse members, all of whom will
be leaders of the computing research community.
AI Invades Go Territory
Wired News (09/19/06) Borrell, Brendan
Efforts to improve computer programs' ability to play the ancient Chinese
game of Go are making progress, and a revolutionary new way to approach the
challenge was authored by Universite de Lille computer scientist Remi
Coulom, who created a program called Crazy Stone that garnered a gold medal
at the 2006 Computer Olympiad. Coulom says the difficulty of programming
for Go lies in the fact that, unlike chess pieces, Go stones do not move
once they are placed.
Crazy Stone taps Monte Carlo methods, in which potential moves are assessed
by simulating thousands of random games, and the computer scientist
describes this strategy as very easy to parallelize, which dovetails well
with new processors' multi-core architecture. The Monte Carlo algorithm
can fail to find optimal moves because it is impossible to sample each
possible random game, but Crazy Stone can circumvent this drawback; Coulom
notes that the program is smart enough to practice a sequence of moves that
looks more promising than others more frequently in the random games. It
is Coulom's contention that being a skilled Go player is not a prerequisite
for being a skilled programmer. In response to complaints that games
played by Monte Carlo strategies are boring, Coulom argues, "Monte Carlo
programs maximize the probability of winning, not the margin that they win
by. When they're very far ahead of the opponent, then they'll always play
a safe move, which might look boring compared to more aggressive
alternatives. It may be boring to watch, but it's more efficient in
winning games."
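The flat Monte Carlo idea can be illustrated on a far simpler game than Go. The sketch below is a toy, not Crazy Stone's actual code: it plays the take-away game Nim, rating each legal move by the fraction of uniformly random playouts it goes on to win, then picking the best-rated move:

```python
import random

def legal_moves(stones):
    # In this toy Nim variant a player may take 1, 2, or 3 stones.
    return [t for t in (1, 2, 3) if t <= stones]

def random_playout(stones, to_move):
    # Play uniformly random moves to the end; whoever takes the last
    # stone wins. Returns the winning player (0 or 1).
    while stones > 0:
        stones -= random.choice(legal_moves(stones))
        to_move = 1 - to_move
    return 1 - to_move  # the player who just moved took the last stone

def monte_carlo_move(stones, player, playouts=2000):
    # Flat Monte Carlo: rate each legal move by the fraction of random
    # playouts it wins, then pick the best-rated move.
    best_move, best_rate = None, -1.0
    for move in legal_moves(stones):
        wins = sum(random_playout(stones - move, 1 - player) == player
                   for _ in range(playouts))
        rate = wins / playouts
        if rate > best_rate:
            best_move, best_rate = move, rate
    return best_move
```

Because playouts are independent, the per-move loop parallelizes trivially, which is the property Coulom says dovetails with multi-core processors; his refinement of sampling promising move sequences more often is not shown here.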
Technology Lobbyist Named Top U.S. Cyber-Security
Official
Washington Post (09/19/06) P. A6; Krebs, Brian
Greg Garcia has been named assistant secretary for cyber security and
telecommunications by the Department of Homeland Security, which finally
filled the vacancy after 14 months. Garcia, who works at the Information
Technology Association of America, will monitor DHS' cybersecurity plans
for keeping critical information networks secure. DHS chose Garcia after
previous candidates for the job were criticized for lack of experience and
not having enough power in Washington. President Bush and his
administration have been criticized for their slow response to attacks and
for not being prepared. Part of Garcia's job includes creating a response
plan in the event of a major cyber attack and developing a blueprint for
protecting the country's critical information networks, including water and
power systems, transportation, and telecommunications.
VeriSign, Critics Prepare for ICANN Senate Hearing
IDG News Service (09/18/06) Gross, Grant
A report released last week by Network Solutions criticizes ICANN for its
failure to audit security practices of domain operators. The report
focused on a six-year contract renewal with VeriSign that gives the
company the right to increase prices by 7 percent in four of those years,
along with what critics termed a right to near-automatic renewal
afterwards. "We're facing a
contract that provides for...permanent monopoly, for fee increases without
justification and now without adequate security protections," said Jonathan
Nevett, Network Solutions vice president and chief policy counsel. "It's
mind-boggling that the contract has gotten this far." But VeriSign
responded that price hikes will be needed to safeguard the domain in light
of ever more sophisticated attacks. As an example, VeriSign CSO Ken Silva noted an
attack that utilized upwards of 30,000 bot-net computers in January. "The
technology we're talking about to keep up with this kind of load is not
something you just buy off the shelf," Silva said. "Let's not lose sight
of the fact that security continues to be the biggest threat to the growth
of the Internet. Supporting that security and stability is going to
involve a lot of technology, a lot of talent and a lot of equipment."
Under the terms of the agreement, the contract can be changed through
consensus of ICANN's two security committees and can be canceled if it is
deemed that VeriSign is not performing adequately. Meanwhile, a Senate
committee has scheduled a hearing for Wednesday to discuss ICANN's future
once its memorandum of understanding with the Commerce Department
expires.
Eye-Controlled Computer Operation
Fraunhofer-Gesellschaft (09/06)
Researchers at the Fraunhofer Institute for Industrial Engineering
collaborated with industry partners to develop a system that enables a
computer user to control a mouse through eye movements. The Eye-Controlled
Interaction (EYCIN) system is designed to provide paraplegics with greater
access to PCs, and enable maintenance technicians to click through menus
while their hands remain free. A camera captures a user's pupil movements,
which a software program then relays to the computer quickly enough so that
the movements of the mouse pointer are fluid. The motion is fairly easy to
calculate; the principal challenge is clicking the mouse. The researchers
created sensitive areas on the display that users can activate by fixing
their gaze on them for a certain period of time. A button on the screen
changes color twice before it clicks, indicating to the user whether the
command has registered. The small jerks of eye motion, or microcascades,
presented a major challenge to the researchers, requiring them to develop a
filtering system to prevent them from being relayed to the computer so the
mouse pointer would not flit erratically around the screen.
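A minimal sketch of the dwell-click mechanism, with all parameter values invented for illustration (the article gives no numbers): raw gaze samples pass through a short moving-average filter so small jitter does not restart the timer, and a click fires once the smoothed gaze has stayed within a small radius for long enough:

```python
from collections import deque

class DwellClicker:
    # Smooths raw gaze samples with a short moving average and fires a
    # "click" once the smoothed gaze has stayed within `radius` pixels
    # of where the dwell began for `dwell_samples` consecutive samples.
    def __init__(self, window=5, radius=30.0, dwell_samples=20):
        self.samples = deque(maxlen=window)  # smoothing filter
        self.anchor = None                   # where the current dwell began
        self.held = 0
        self.radius = radius
        self.dwell_samples = dwell_samples

    def _smoothed(self):
        n = len(self.samples)
        return (sum(x for x, _ in self.samples) / n,
                sum(y for _, y in self.samples) / n)

    def feed(self, x, y):
        # Feed one raw gaze sample; return True when a click fires.
        self.samples.append((x, y))
        sx, sy = self._smoothed()
        if (self.anchor is None or
                (sx - self.anchor[0]) ** 2 + (sy - self.anchor[1]) ** 2
                > self.radius ** 2):
            self.anchor = (sx, sy)  # gaze moved: restart the dwell timer
            self.held = 0
            return False
        self.held += 1
        if self.held >= self.dwell_samples:
            self.held, self.anchor = 0, None  # reset after the click
            return True
        return False
```

The two-stage color change the researchers describe would map naturally onto intermediate values of the dwell counter, giving the user feedback before the click commits.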
Avoiding the Most Common Software Development
Goofs
Dr. Dobb's Journal (09/17/06) Chelf, Ben
Static source code analysis is a viable defense against the most common
software development mistakes, argues Coverity founder and CTO Ben Chelf.
He writes that the organizational cost of software defects is reflective of
the distribution of defects across the software development lifecycle. The
strategy Chelf recommends is to uncover more defects earlier in the
development process. The Coverity founder lists a number of reasons why
developers commit errors, such as ignorance of the systems being developed,
the stress developers experience when struggling to accommodate deadlines,
the tedium of the coding process, and human beings' difficulty in
repeating the precise same operation over and over without variation.
Chelf details a variety of common software development goofs, and writes
that static source code analysis technology can easily detect such errors.
"Compared with testing tools (e.g., Purify), static source code analysis
has the benefit of analyzing all of the paths through a given code base and
is not tied to the particular test suite of the application," he explains.
"Compared with manual code audits or developer debugging, static source
code analysis technology isn't hindered by...human frailties."
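A toy illustration of the principle, using Python's standard ast module rather than Coverity's technology: walking the syntax tree examines every statement in the program, independent of any test suite. This hypothetical checker flags `is`/`is not` comparisons against literal constants, a classic goof:

```python
import ast

def find_is_literal_comparisons(source):
    # Walk the whole syntax tree and flag `is`/`is not` comparisons
    # against literal constants (identity compared instead of value).
    # `x is None` is idiomatic, so None is deliberately exempt.
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Compare):
            for op, comp in zip(node.ops, node.comparators):
                if (isinstance(op, (ast.Is, ast.IsNot))
                        and isinstance(comp, ast.Constant)
                        and comp.value is not None):
                    findings.append(node.lineno)
    return findings
```

Nothing in the checked program ever runs, which is the point Chelf makes: the analysis covers code paths a test suite may never reach.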
Learning Through Technology-Enhanced Collaboration
IST Results (09/19/06)
IST is funding two European research projects that are designed to improve
the knowledge-sharing capabilities of new technology. The COOPER
initiative is focused on developing an Internet-based platform that will
enable a group of users to work together on a project, using tools such as
chat, Internet telephony, and a document repository, explains Xuan Zhou,
who manages the initiative at the L3S Research Center in Hannover, Germany.
The two-year project is scheduled to launch the online network at two
universities and an industrial partner next year. "What we are aiming to
achieve is the creation of a collaborative learning environment that lets
people communicate, work together, and share knowledge whenever they want
no matter where they are," says Zhou. The TENCompetence initiative also
got underway in December 2005, but the focus is more on making e-learning
networks more interactive so that users are actively engaged. Researchers
involved in this four-year project plan to develop an advanced, open-source
and standards-based technical and organizational infrastructure.
Collaborative e-learning networks will be able to take advantage of the
models, methods, and technologies for creating, storing, and exchanging
knowledge resources; tools for developing new content and learning
activities; and methods for testing users on how they are picking up the
new competencies. TENCompetence trials will begin in Europe next year.
Antisocial Robots Go to Finishing School
New Scientist (09/19/06) Vol. 191, No. 2569, P. 28; Marks, Paul
For a robot to behave with sensitivity toward its human owners, it must
be imbued with the ability to understand people's
moods, explained director of Waseda University's humanoid robotics center
Shuji Hashimoto at a conference on socially intelligent robots at England's
University of Hertfordshire. "Emotion is one of the most crucial factors
influencing the success or failure of communication between humans," he
notes. "Robots are going to need similar emotional capabilities if they
are to cooperate smoothly and flexibly with humans in our residential
environments." Hashimoto envisions owners wearing sensors that the robots
can use to identify stress signals, while the best form of response would
be formulated by neural networks. Still, he acknowledges that such robots
will only have the appearance of emotional sensitivity, and this approach
has disadvantages: The more responsive to people's emotions a robot
becomes, the more its software complexity increases, which ultimately makes
programming a major headache. Hashimoto thinks this shortcoming could be
avoided if robots are allowed to learn from their environment and build
their own sets of rules. "We have to design environments where human and
robot learn together," he argues.
Scholarship for Service
IEEE Distributed Systems Online (09/06) Vol. 7, No. 9; Liebrock, Lorie M.
Lorie Liebrock, an assistant professor of computer science and information
technology at the New Mexico Institute of Mining and Technology, is the
principal investigator for the school's Scholarship For Service Program.
SFS is one of five Federal Cyber Service Training and Education Initiatives
outlined by the National Plan for Information Systems Protection, and its
purpose is to address the dwindling number of information assurance
professionals employed by the U.S. government. Liebrock writes that
students who wish to participate in the institute's SFS program must
demonstrate "desire and ability to work in information assurance in federal
civil service." Qualifications for enrollment include a 3.0 GPA for
undergraduates and a 3.5 GPA for graduates. The SFS program pays students
to complete their degrees, providing yearly stipends of $8,000 for
undergraduates and $12,000 for graduates, as well as tuition, board, lab
fees, and an Internet connection. Once they finish their degrees the
students work for the government, one year for every academic year that was
covered. Liebrock serves as advisor for all SFS program participants, and
teaches a professional development course each semester where students are
taught how to present technical material in a professional manner, as well
as proper government standards and regulations. Students are taken from
many disciplines and their research spans a wide assortment of subjects.
Liebrock says she collaborates with recruiters in agencies to place
students in federal civil service jobs that match their skills.
Personalization in Privacy-Aware Highly Dynamic
Systems
Communications of the ACM (09/06) Vol. 49, No. 9, P. 32; Sackmann,
Stefan; Strucker, Jens; Accorsi, Rafael
Retailers can personalize their relationship with customers via highly
dynamic information systems (HDS), but this can come at the cost of
customers' anonymity, write Stefan Sackmann, Jens Strucker, and Rafael
Accorsi of the University of Freiburg's Department of Telematics. Users'
desire to control personal data is undercut by the exploitation of
technologies such as sensor networks, radio frequency identification
(RFID), localization technology, and automatic video surveillance in HDS.
More and more in HDS, data is being accumulated without any indication to
the individuals concerned, and such collection occurs without any
predefined purpose.
falling cost of data storage means that data remains persistent and
undeleted once collected, while customers can be recognized and identified
by integrating simultaneous recordings of an event by different devices
from multiple perspectives. Furthermore, multiple events are registered
concurrently by recording devices. Modern privacy-enhancing technologies
are thwarted by the inherent data collection in HDS because of their
reliance on obscurity, or the concealment of data, the authors maintain.
Sackmann, Strucker, and Accorsi present a proposal for a system in which
the transparency of privacy is supported by the creation of evidence, which
relies on policies as reference for a compliant utilization of data and log
views that cover all data concerning an individual contained in an
information system. The proposal ensures the genuineness of log data by
securing the logs with standard cryptographic methods; views on the logged
data are also a necessity, albeit one that has not yet been provided but
perhaps could be through the intercession of regulatory institutions.
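The summary does not detail the authors' logging scheme, but tamper-evident logging with standard cryptographic methods is commonly built as a hash chain, sketched hypothetically here: each record's digest incorporates the previous record's digest, so altering any logged entry invalidates every later link:

```python
import hashlib

GENESIS = hashlib.sha256(b"log-genesis").hexdigest()

def append_entry(log, entry):
    # Each record stores the entry plus a hash chaining it to the
    # previous record, so later tampering breaks the chain.
    prev = log[-1][1] if log else GENESIS
    digest = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append((entry, digest))

def verify_log(log):
    # Recompute the chain from the genesis value; any mismatch means
    # an entry was altered, inserted, or removed.
    prev = GENESIS
    for entry, digest in log:
        if hashlib.sha256((prev + entry).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True
```

A verifiable chain like this is what makes log views trustworthy as evidence: an individual (or a regulator) can check that the record of data accesses has not been quietly rewritten.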