Computer Science Takes Steps to Bring Women to the Fold
New York Times (04/17/07) P. D1; Dean, Cornelia
Over the past few decades women have been playing a larger role in science
and engineering, exceeding enrollment parity in mathematics, biology, and
other fields, but in computer science women's participation is static or
even shrinking relative to men's. In 1985, women received about 38 percent
of the computer science bachelor's degrees awarded in the United States, a
figure that fell to about 28 percent in 2003, according to the National
Science Foundation. At universities with graduate programs, only 17
percent of bachelor's degrees went to women during the 2003-2004 academic
year, according to the Taulbee Survey, conducted annually by a computer
science research organization. Many believe the percentage has fallen
further in recent years, making computer science the only science or
technology field in which women are consistently losing ground. Carnegie
Mellon University computer scientist Lenore Blum believes women are
essentially "canaries in the coal mine": factors such as the dot-com bust,
the outsourcing of high-tech jobs, and negative stereotypes about computing
careers that are driving women away from computer science will eventually
drive men away as well. Experts say these fears about the industry are
blown out of proportion, as there are more computer science jobs today than
at the height of the dot-com boom, and the Bureau of Labor Statistics
predicts demand for computer scientists in the United States will increase
in the coming years. Some programs, such as the one at Carnegie Mellon,
have shifted their admissions focus away from programming proficiency
toward overall achievement and breadth of interests in an effort to attract
more women applicants, but these changes have brought accusations of
lowered standards. To learn about ACM's Committee on Women in Computing,
visit
http://women.acm.org
Open-Source Project Aims to Erase E-Voting Fog
IDG News Service (04/16/07) Kirk, Jeremy
University College Dublin computer science lecturer Joseph Kiniry believes
e-voting is risky and that available software systems are substandard,
saying governments feel obliged to adopt e-voting to appear modern despite
computer security experts' warnings that it is insecure. Knowing that
governments will forge ahead with e-voting implementations anyway, Kiniry
and a team of researchers designed an open source e-voting software system
that he hopes will create a secure foundation for the technology. In
Kiniry's system, voters register at a government office and receive a PIN.
Later, the voter receives a unique ballot in the mail, and on election day
enters a voter ID code and the PIN on the Web site. To select a candidate,
the voter enters the number printed next to that candidate on the ballot;
these numbers differ for every voter, a form of pre-encryption that ensures
each number can be used only once and is useless if intercepted (a toy
sketch of the idea appears below). The system also gives the voter a
receipt number for confirming that the vote was counted. Recounts remain a
problem because there are no physical ballots; as with other systems, a
recount would entail running the same software again, which Kiniry says is
not an acceptable solution. He believes a possible fix would be to let a
third party develop independent software for recounts, though because
elections have ambiguities even without computer technology, any electronic
voting system is likely to have ambiguities as well. For information about
ACM's e-voting activities, visit
http://www.acm.org/usacm
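The following minimal sketch illustrates how per-voter candidate codes and
receipt numbers might work under the scheme described above. It is a
hypothetical illustration under stated assumptions, not Kiniry's actual
implementation; all names and data structures are invented.

    # Hypothetical sketch of per-voter "pre-encrypted" candidate codes.
    # Not Kiniry's actual system; structures and names are invented.
    import secrets

    CANDIDATES = ["Alice", "Bob", "Carol"]

    def issue_ballot(voter_id, ballots):
        """Assign a unique random code to each candidate for this voter."""
        codes = {secrets.token_hex(4): name for name in CANDIDATES}
        ballots[voter_id] = {"codes": codes, "used": False}
        return codes  # these codes would be printed on the mailed ballot

    def cast_vote(voter_id, code, ballots, tally, receipts):
        """Resolve a submitted code to a candidate; each ballot counts once."""
        ballot = ballots[voter_id]
        if ballot["used"] or code not in ballot["codes"]:
            raise ValueError("invalid or already-used code")
        ballot["used"] = True
        candidate = ballot["codes"][code]
        tally[candidate] = tally.get(candidate, 0) + 1
        receipt = secrets.token_hex(8)  # lets the voter verify the count
        receipts[receipt] = voter_id
        return receipt

Because a code is meaningful only in the issuing server's per-voter table
and is invalidated after one use, intercepting it reveals nothing and
replaying it fails.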
Patently at Odds
Washington Post (04/18/07) P. D1; Sipress, Alan; Drezen, Richard
The computer technology and drug industries are fighting each other over
Congressional efforts to revamp the patent system for the first time since
the 1950s. The computer industry would like to see more flexibility in the
patent system for its fast moving and developing companies, while the drug
industry wants the strongest possible protection measures for its patents.
The technology industry cites a recent patent case involving Microsoft as
an example of why patent law needs to change. For infringing two patents on
MP3 technology owned by Alcatel-Lucent, Microsoft was ordered to pay $1.52
billion, the largest patent penalty ever and widely considered excessive
even if Microsoft were liable. Large tech companies are
more likely to infringe upon existing patents because software develops so
quickly, so the computer industry wants more flexibility to challenge
patents while being protected from excessive damages, particularly for
unintended infringements. Drug companies, however, want strong regulations
to defeat challenges and ensure violators pay hefty penalties for copying a
patented product, which can cost billions of dollars to develop and patent.
Emery Simon, counsel to the Business Software Alliance, argues that patent
law, originally intended to protect inventors, now discourages individual
innovation, and that drug companies wield it as a weapon against generic
drug makers and other competitors, using it to impose heavy penalties and
dominate the market. Patent reform bills are expected to be introduced
today in both the House and Senate on an issue that also affects
universities, small businesses, banks, and financial services firms.
Shape-Shifting 'Smart Dust' May Explore Alien Worlds
New Scientist (04/18/07) Knight, Will
Several research groups are developing minuscule wireless sensors, or
"smart dust," that could explore other planets or deep space, or be
sprinkled in a building to sense chemicals or vibrations. While these
smart dust devices are currently theoretical and have only been tested in
computer simulations, the possibility of exploring inaccessible
environments is promising. The tiny devices are only a few cubic
millimeters in volume and can perform simple sensing tasks and relay
information to other devices less than a meter away, but a group of them
could synchronize their radio signals to transmit messages over many
kilometers. University of Glasgow electronics researcher John Barker
created an electronic experiment simulating the turbulent winds of Mars to
test whether minute changes in the dust particles' shape, shifting between
smooth and dimpled, would allow the dust to navigate. When the virtual
motes were programmed with a simple set of rules, Barker found that about
70 percent of them successfully navigated a 20-kilometer track (a toy
version of such a rule is sketched at the end of this item). The
technology to create these changes in shape does not currently
exist, but progress in electronic manufacturing could help miniaturize
smart dust with electro-active polymers that could change shape with
minimal energy consumption. Michael Sailor, who works on smart dust at the
University of California, points out that distributing a task across many
small devices means that if one device fails, others can fill its role,
though smaller devices leave less room for high-fidelity, high-sensitivity
sensors.
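Below is a toy, one-dimensional version of the kind of rule Barker's
simulated motes might follow. The physics is an assumption made for
illustration (here, a smooth shape rides the wind and a dimpled shape
resists it); it is not Barker's actual model, and all numbers are invented.

    # Toy 1-D shape-shifting mote: ride favorable gusts with a high-drag
    # (smooth) shape, duck unfavorable ones with a low-drag (dimpled)
    # shape. Purely illustrative; not Barker's simulation.
    import random

    def mote_finishes(track_km=20.0, steps=400, seed=0):
        rng = random.Random(seed)
        position = 0.0
        for _ in range(steps):
            wind = rng.uniform(-0.5, 0.5)    # km a smooth mote would drift
            drag = 1.0 if wind > 0 else 0.1  # the shape-shifting rule
            position += wind * drag          # with drag fixed at 1.0, the
            if position >= track_km:         # mote would just drift randomly
                return True
        return False

    finished = sum(mote_finishes(seed=s) for s in range(100))
    print(f"{finished} of 100 motes completed the track")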
I.T.'s Top 81 R&D Spenders
Baseline (04/17/07) Hertzberg, Robert
The biggest technology companies have been spending a lot of time and
money on product development, with the majority of research and development
spending focused on consumer products such as games and Internet services.
A Baseline survey of 81 U.S. companies, more than half in the software
industry, shows that R&D spending increased 17 percent last year. The top
10 companies spent $32.5 billion on product development during their 2006
fiscal years, almost 70 percent of all R&D spending among the surveyed
companies. Massachusetts Institute of Technology professor Arvind said
that once a technology company pulls ahead of the competition, it is forced
to devote resources to research to ensure it stays at the top of the field.
Fast-growing companies are frequently forced to invest development dollars
in fine-tuning products customers are already using, such as the iPod at
Apple and printers at Hewlett-Packard, to protect a major source of
revenue. Even if companies wanted to spend more on breakthrough products
or enhancements to their most popular products, R&D executives say it would
be impossible because they could not hire enough engineers and programmers.
Nevertheless, some companies have developed
innovative methods for enhancing their R&D efforts. At Adobe, for example,
an "idea mentor" is responsible for encouraging "the engineers no matter
what, and to make sure they get heard," says Adobe director of technology
programs Leslie Bixel. "As the company has grown in size, that's one of
the biggest challenges."
Docs Point to E-Voting Bug in Contested Race
Wired News (04/17/07) Zetter, Kim
Incident reports from a controversial election in Sarasota County, Fla.,
last November show that poll workers from at least 19 precincts contacted
technicians and election officials to report touch-screen sensitivity
problems, symptoms associated with a known software flaw in the iVotronic
voting machine made by Election Systems & Software (ES&S). County
officials claim the election results were not altered by the bug, but
activists are arguing that the flaw might have contributed to the high
number of lost or uncast votes in what is now a highly contested and
controversial election. Voters complained of having to press the touch
screen harder and multiple times to register a vote, a symptom similar to
one caused by a bug that the machine's maker disclosed before the election
but that the county ignored. The incident reports also cited problems
during the primary election two months earlier, contradicting a statement
by Sarasota supervisor of elections Kathy Dent, who said no such problems
occurred during the primary. ES&S sent a sign to be posted in polling
places instructing voters to hold a finger on the screen until their
selection was highlighted, which, according to the sign, could take several
seconds. Dent instead posted a different sign listing the steps for
casting a ballot and encouraging voters to check the review screen at the
end of the ballot for accuracy, but the replacement sign made no mention
that voters might need to hold a finger to the screen for several seconds.
In the election, Republican Vern Buchanan defeated Democrat Christine
Jennings by fewer than 400 votes, and more than 18,000 ballots recorded no
vote in the race. Jennings is contesting the results in court, and the
House Administration Committee has formed a special task force to
investigate the election.
Is the Web Ready for HTML 5?
InternetNews.com (04/16/07) Kerner, Sean Michael
Mozilla, Opera, and Apple's Safari team have joined together to propose
specifications for HTML 5 to the W3C, the standards body responsible for
HTML. The proposal includes the Web Apps 1.0 and Web Forms 2.0
specifications and backward compatibility with HTML 4. Opera spokesperson
Anne van Kesteren said preserving information is the purpose of HTML 5.
"By remaining backward and forwards compatible, we hope to ensure that
people will be able to interpret HTML for decades if not centuries to
come," van Kesteren said. The last major upgrade was version 4.0, released
in 1997, and the HTML 4.01 recommendation was published in 1999. Mozilla
CTO
Brendan Eich said HTML 5 will allow for better cross-browser compatibility,
better support for Web 2.0 applications, and better multimedia support.
Web Apps 1.0 would form a core part of HTML 5, and provide features that
make it easier to create Web-based applications, including context menus,
direct-mode graphics canvases, in-line popup windows, and server-sent
events. Web Forms 2.0 specifications include new attributes, DOM
interfaces, and events for validation and dependency tracking, and XML form
submission and initialization. Although HTML 5 is not yet a formal
standard, Mozilla and Opera have already built HTML 5 technologies into
their browsers. Microsoft has not signaled whether it plans to participate
in the HTML 5 effort.
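As one concrete example of a Web Apps 1.0 feature, the sketch below shows
the server side of server-sent events: a long-lived HTTP response that
streams "data:" lines in the text/event-stream format, which a browser's
EventSource object would surface as events. The endpoint, port, and
payload are illustrative assumptions, not anything from the proposal text.

    # Minimal server-sent events endpoint: streams text/event-stream
    # blocks. Endpoint and payload are invented for illustration.
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class EventStream(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/event-stream")
            self.send_header("Cache-Control", "no-cache")
            self.end_headers()
            for i in range(5):
                # each "data: ...\n\n" block is delivered as one event
                self.wfile.write(f"data: tick {i}\n\n".encode("utf-8"))
                self.wfile.flush()
                time.sleep(1)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), EventStream).serve_forever()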
Profit Slows Tech Innovation, Report Says
News & Observer (04/13/07) Simmons, Tim
The desire to chase big profits instead of less lucrative yet more
practical innovations is preventing technology from being transferred from
university labs to the U.S. marketplace, according to a report by the Ewing
Marion Kauffman Foundation, a nonpartisan group focused on the advancement
of entrepreneurship. The report applauds the work of university
researchers, but says a "home run" philosophy among school officials could
hinder the development of new technologies. The report calls technology
transfer offices "the maximizer of revenue streams, rather than the grease
in the gears that allowed the system to flourish." Most technology
transfer offices were created after 1980 on the theory that most scientists
are not business executives; the office applies for patents and negotiates
license agreements, with the university retaining the legal rights to sell
new technology and splitting the profits with the faculty members who
created it. The Kauffman Foundation reports that
instead of increasing efficiency, the centralized process has created
bottlenecks that choke the transfer of new ideas. Officials from North
Carolina State University and University of North Carolina-Chapel Hill
dismissed the findings of the Kauffman group. Mark Crowell, the associate
vice chancellor responsible for the oversight of technology transfer and
economic development at UNC-CH, called parts of the report "silly" and
"naive." Dave Winwood, an associate vice chancellor who oversees
technology development at NCSU, said "home-run licenses" are not a focus of
daily operations and, "The opinions of the Kauffman group could hardly be
more inaccurate as regards to the technology-transfer structure, mission
and mechanisms in place at N.C. State."
Dartmouth Gets Award for Cyber Security Studies
Dartmouth News (04/13/07) Burnham, Laurie
Dartmouth is set to receive more funding from the U.S. Department of
Homeland Security that will enable its Cyber Security Collaboration and
Information Sharing Project to further study cyber security. The Institute
for Information Infrastructure Protection (I3P) will receive $8.7 million
to conduct new studies on insider threats, privacy protection, and the
economics of cyber security, and the Institute for Security Technology
Studies (ISTS) will receive $3 million and continue its research into
security and privacy matters. "ISTS is excited to initiate several
research projects that will develop cutting-edge technologies, including
sensor networking, autonomic computing, video forensics, and public-key
infrastructure," says ISTS executive director David Kotz. "Addressing
real-world problems related to cyber security and infrastructure requires a
multidisciplinary approach," says I3P Chair Martin Wybourne. "The unique
character of the consortium enables faculty and students from many
disciplines to join forces to further our understanding of the issues."
Both institutes will also put some of the funds toward educational
programs, seminars, and workshops for students.
Security Remains a Challenge for Browser Developers
eWeek (04/17/07) Galli, Peter
At the Web 2.0 conference this week in San Francisco, leading players in
the Web browser industry, ranging from open-source communities to software
powerhouse Microsoft, addressed the arrival of Web 2.0 and its effect on
Web browsers, and all agreed that security is one of the biggest challenges
facing the industry. Charles McCathieNevile, the
chief standards officer at Opera, said security models on the Web are
immature, and that Web browser developers are committed to interoperability
and what users want, not starting another browser war. Microsoft's Chris
Wilson, platform architect for Internet Explorer, admitted that Web
browsers still have a long way to go. "They are all missing some of the
client-side features, but have certainly become far more robust over time,"
Wilson said. When asked what spurred the development of Web 2.0
applications, Wilson said social networking and mashups were widely
responsible, but Mozilla's chief technology officer Brendan Eich said
development tools were helpful. Eich said current efforts are focused on
making memory use more linear, but this type of development takes time.
Complexity Is Killing IT, Say Analysts
IDG News Service (04/13/07) Krill, Paul
It is becoming increasingly difficult to handle IT system complexity, IT
experts said during IBM's Navigating Complexity conference. Tata
Consultancy Services vice president Harrick Vin noted that there are so
many kinds of problems, from security compliance and root cause analysis to
overlapping functions. "Essentially, what happens is we only have a
silo-based understanding of what is going on," said Vin. Changes to
operating systems, applications, workload types, and volumes are the source
of the rising complexity, according to Vin, who described the response as
reactive firefighting. Peter Neumann of SRI International's computer
science laboratory said old mistakes, such as buffer overflows, are often
repeated, and that available developer tools are not being used. There
needs to be "some sort of approach to complexity that talks about sound
requirements" and good software practice, Neumann said.
Feds: Accuracy of Face Recognition Software Skyrockets
LiveScience (04/13/07) Wood, Lamont
Face recognition software is 20 times better than it was five years ago,
according to the National Institute of Standards and Technology. In NIST's
latest results from its Face Recognition Vendor Test, the best face
recognition algorithms had a false rejection rate of 1 percent, compared
with 20 percent in 2002. Speed was not a factor in the test; the
algorithms used a single-comparison approach rather than comparing every
face in a database to every other face. "We fed the algorithms lots of
data to get a
statistically meaningful answer," explains Jonathon Phillips, an electrical
engineer who directed the test. "Our goal was to encourage improvement in
the technology, and provide decision makers with numbers that would let
them make an educated assessment of the technology itself." With random
lighting on each face, the rejection rate was about 12 percent, compared
with 20 percent five years ago.
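The metric behind these numbers is simple to state: the false rejection
rate is the share of genuine (same-person) comparisons whose similarity
score falls below the match threshold. The sketch below computes it over
made-up scores; the values and threshold are illustrative assumptions only.

    # False rejection rate: fraction of genuine comparisons scored below
    # the match threshold. Scores and threshold are invented examples.
    def false_rejection_rate(genuine_scores, threshold):
        rejected = sum(1 for s in genuine_scores if s < threshold)
        return rejected / len(genuine_scores)

    genuine = [0.91, 0.87, 0.95, 0.42, 0.88, 0.93, 0.90, 0.89, 0.97, 0.85]
    print(f"FRR at threshold 0.5: {false_rejection_rate(genuine, 0.5):.0%}")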
Intel R&D on Slow Boat to China
CNet (04/16/07) Krazit, Tom
Intel is showing more of a commitment to China by announcing plans to set
up a plant in Dalian to build chips, but the company plans to move more
slowly when it comes to research and development. In an attempt to bring
greater attention to its growing operation in China, Intel is holding its
semiannual Intel Developer Forum in the nation this week. Intel has
packaging facilities in Shanghai and Chengdu, and research labs in Beijing
and Shanghai, where much of the work involves software development.
However, Intel's next processor design is unlikely to come from China
because the chipmaker still has concerns about the local educational
system, which does not provide engineering students with the same level of
technical know-how that Americans receive from U.S. schools. Export
controls, which keep the latest chipmaking technology from being shipped to
China, are also a problem. "The fab announcement in Dalian is
certainly an indication that we're willing to do more in China, but we're
trying to pace ourselves," says Intel CTO and chief of labs Justin
Rattner.
Parsing the Truths About Visas for Tech Workers
New York Times (04/15/07) P. BU4; Lohr, Steve; Giridharadas, Anand
Testifying before the Senate last month, Microsoft Chairman Bill Gates
said more H-1B visas for skilled workers would help make the United States
more competitive, arguing these workers are "uniquely talented" and highly
paid. Ronil Hira, a Rochester Institute of Technology assistant professor
of public policy, argues that H-1B visas are not attracting skilled workers
to the United States but are actually fast-tracking the outsourcing of
computing jobs: companies hire a limited number of skilled workers to
collect a client's requirements and specifications and carry that
information to India, where most of the software coding is done.
Traditionally, about half of all H-1B holders eventually get green cards,
but major outsourcing companies apply for thousands of H-1B visas but ask
for relatively few green cards, according to government figures. The
commerce minister of India, Kamal Nath, has called the H-1B the
"outsourcing visa," but many economists believe outsourcing is creating a
more efficient global trade in technology services that will ultimately
make the U.S. economy more competitive, with more job opportunities.
Technology lobbying groups argue that the demand for H-1B visas,
applications for which exceeded the yearly allotment in a single day, is
proof of a skills shortage in the United States, but others disagree.
George Mason
University associate professor of public policy David M. Hart says, "There
is no labor market test, using technically sound criteria, to determine
whether or not there is a shortage." Hart suggests an accurate measurement
would include recent wage trends and unemployment rates in specific
professions. Hira said Microsoft may be using the H-1B visa to bring in
the best and brightest, but, "it's definitely not representative of how the
H-1B program is being used today."
Technology Can Improve Lives
Government Technology (04/16/07) Scott, Gina M.
Technology can make life easier for those willing to take advantage of it,
but the approximately 51 million people with disabilities in the United
States can have difficulty gaining access to that technology. A
recent United Nations report found that only 3 percent of Web sites are
accessible to persons with disabilities. In 2004, the Federal
Communications Commission held hearings on the effectiveness of the
Emergency Alert System, including whether emergency information could reach
people with disabilities. The Rehabilitation Engineering
Research Center for Wireless Technologies recommended the FCC expand its
rules to include new digital technologies and devices capable of reaching
people with disabilities in case of an emergency. Voting technology has
also been improved to be more accessible to the disabled. The November
2006 mid-term election was the first federal election to use voting system
updates mandated by the Help America Vote Act, which provides funding to
replace punch-card voting systems and set aside funding for local
governments to ensure access for individuals with disabilities. Electronic
voting machines are believed to be more accessible for voters with
disabilities, but some controversy has developed over their reliability,
security, and accuracy.
Software 'Fix' Responsible for Loss of Mars Probe
New Scientist (04/13/07) McKee, Maggie
A preliminary report from a NASA investigation board cites commands sent to
the wrong memory address on the onboard computer in June 2006 as the
primary cause of the loss of NASA's Mars Global Surveyor spacecraft. The
Mars Global Surveyor had been in space for almost 10
years, the longest period for any spacecraft studying Mars, before NASA
lost contact on Nov. 2, 2006. The incorrect uploads corrupted two
independent parameters, one of which caused a solar array to try to move
beyond its limit. The over-rotated solar array caused the spacecraft to
reorient itself in space, directly exposing one of its batteries to the
Sun; the spacecraft mistakenly concluded the battery was overcharged and
shut down the charging process. The other battery could not provide enough
power to keep the spacecraft running, and both batteries ran down in about
12 hours. The second corrupted parameter caused the spacecraft's high-gain
antenna to point away from the Earth, making communication impossible and
leaving ground controllers unable to detect the unsafe thermal and power
conditions. The June 2006 upload was intended to correct a problem from
September 2005 that was caused when two engineers created two slightly
different updates to two redundant control systems for the high-gain
antenna, which led to a discrepancy between the computer's two memories. The
report also found that existing procedures were not thorough enough to
catch the resulting problems, and reductions in budgets and staff may have
contributed to the loss of the spacecraft.
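To make the failure mode concrete, the sketch below shows how an update
written to the wrong offset in a flat parameter table can silently corrupt
an unrelated value, here a solar array soft limit, so that a later bounds
check no longer trips. The memory layout and values are invented for
illustration and have nothing to do with the actual flight software.

    # Invented illustration of a wrong-address parameter upload.
    # Not flight software; layout and values are hypothetical.
    import struct

    params = bytearray(16)                  # toy parameter memory
    SOLAR_LIMIT_ADDR, ANTENNA_ADDR = 0, 8   # hypothetical layout

    struct.pack_into("d", params, SOLAR_LIMIT_ADDR, 30.0)  # correct limit

    # An antenna parameter fix is written to the wrong address:
    struct.pack_into("d", params, SOLAR_LIMIT_ADDR, 1.0e6)  # meant for
                                                            # ANTENNA_ADDR
    limit = struct.unpack_from("d", params, SOLAR_LIMIT_ADDR)[0]
    commanded_angle = 45.0                  # past the real hard stop
    if commanded_angle <= limit:            # corrupted limit never trips
        print("array commanded beyond its physical stop")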
The Grand Challenge for Text Mining
Intelligent Enterprise (04/13/07) Grimes, Seth
ClearForest co-founder and computer science professor Ronen Feldman threw
down the gauntlet last year with his text mining grand challenge to create
"systems that will be able to pass standard reading comprehension tests
such as SAT, GRE, GMAT etc," but Seth Grimes writes that Feldman's proposed
test is incomplete. He asserts that a far lower f-score than Feldman
suggests would be sufficient to enable a machine to choose the best of five
answers on a multiple-choice reading comprehension test, and
speculates that "moderately sophisticated pattern-matching software" might
do the job. Grimes also points out that a reading-comprehension test such
as Feldman envisions would fail to mine real-world materials (call-center
notes, email, survey responses, etc.) that may contain ungrammatical or
fractured syntax, irregular or abbreviated spelling, or externalities or
references that are not addressed by studying a single source document. A
third flaw Grimes finds in Feldman's methodology is that the test may yield
correct answers that may not be factually true, given the debatable
accuracy of much of the information on the Web. The author reasons that
this problem could be resolved by creating a mechanism for evaluating and
scoring the correctness of identified responses so that a single best
response can be selected. Grimes goes on to offer another
suggestion, which he terms "the synthesis of responses in accordance with a
contextually determined model." The author concludes that passing the
Turing Test--which posits that a machine is intelligent if a person cannot
distinguish between it and a human being in conversation--remains the
optimal grand challenge for text mining.
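The toy sketch below gives the flavor of the "moderately sophisticated
pattern-matching" baseline Grimes speculates about: score each answer
choice by its word overlap with the passage and question, and pick the
highest. It is an invented illustration, with no claim that it would pass
any real test.

    # Toy multiple-choice baseline: pick the answer whose words overlap
    # most with the passage and question. Invented illustration only.
    import re

    def tokens(text):
        return set(re.findall(r"[a-z0-9]+", text.lower()))

    def pick_answer(passage, question, choices):
        context = tokens(passage) | tokens(question)
        return max(choices, key=lambda c: len(tokens(c) & context))

    passage = "The canal, finished in 1914, joined two oceans."
    question = "When was the canal completed?"
    choices = ["in 1850", "in 1914, when it joined two oceans",
               "never", "in 2000", "last year"]
    print(pick_answer(passage, question, choices))  # -> the 1914 choice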
Inconsistencies and Disconnects
Communications of the ACM (04/07) Vol. 50, No. 4, P. 76; Stone, Jeffrey A.; Madigan, Elinor
Despite the growing emphasis that institutions of higher learning place on
information and communication technology (ICT) for student learning,
incoming freshmen's ICT skills are frequently inadequate, and a workplace
shift toward effective technology and information consumption is
exacerbating the situation. An absence of fundamental ICT know-how makes
the performance of valid research, information analysis, and communication
difficult, and also hinders coordination, knowledge exchange,
collaboration, and other essential knowledge work skills. Many states have
tried to bridge the gap in students' ICT competence by implementing
statewide curriculum standards, and Pennsylvania State University (PSU)
researchers carried out a content analysis of such standards in 10 states
three years ago. They concluded that more and more states are
acknowledging ICT literacy as a basic skill, but that the emphasis some
states place on technology education is countered by a lack of solid ICT
standards in others; in many instances the language of the standards was
too vague to allow specific skills to be identified as learning outcomes.
Further research revealed an overall disconnect between
participating university students' perceived ICT skill levels and their
actual skill levels, and a correlation between their lack of ICT prowess
and the weakness of state technology curriculum standards. "There must be
a consensus among educators and business and government leaders on the
definition of ICT literacy and a generally accepted road map of how to
produce an ICT-literate individual," the PSU researchers argue. Most of
the 10 state standards examined by the researchers failed to cover the
basic skills needed to do research, communicate, and navigate Web sites,
and basic computing skills such as installing and removing programs or
working with applications were also overlooked.