WGBH and ACM to Launch Initiative to Reshape Image for
Computing
AScribe Newswire (05/01/08)
ACM and the WGBH Educational Foundation will use a National Science
Foundation grant to create messages about the field of computing that will
attract more college-bound high school students from underrepresented
groups. The two-year New Image for Computing project, which launches in
June, is an effort to draw students from all segments of society to
computing careers. "We will mobilize thousands of computer professionals
to help deliver messages that illuminate the rich diversity of work in the
computing field--not just in technology companies but in the many
industries that rely on computing technology," says ACM executive director
John R. White. "Our target is high school students, who in most cases have
only a vague notion of what computer science majors actually do, even
though many have grown up with computers." Latina girls and
African-American boys will be a focus of the messages. The services of
marketing professionals will be used to accurately portray the rewards and
benefits of computing careers. ACM and WGBH plan to roll out the messages
nationally via channels that are popular with teens.
H.P. Reports Big Advance in Memory Chip Design
New York Times (05/01/08) P. C4; Markoff, John
Hewlett-Packard scientists have developed a memristor, an electrical
resistor with memory properties that could be used to build very dense
computer memory chips that require far less power than DRAM chips.
The memristor could also be used to create field-programmable gate arrays.
Meanwhile, memristors' ability to store and retrieve a variety of
intermediate values, not just the binary 1s and 0s used in conventional
chips, could enable them to function like biological synapses, which would
make them ideal for artificial intelligence applications such as machine
vision and understanding speech. Independent researchers say the memristor
could quickly be applied to computer memory, but other applications could
be more challenging. Hewlett-Packard's quantum science research group
director R. Stanley Williams says the technology should be commercialized
fairly quickly. The memristor was first predicted in 1971 by University of
California, Berkeley electrical engineer Leon Chua, who says he had not
worked on the idea for several decades and was surprised when
Hewlett-Packard contacted him a few months ago. The researchers have
successfully created working circuits based on memristors that are as small
as 15 nanometers, and Williams says it will eventually be possible to make
memristors as small as about four nanometers. In comparison, the smallest
components in today's semiconductors are 45 nanometers, and the industry
does not see a way of shrinking silicon-based chips below about 20
nanometers.
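To make the memory behavior concrete, the following sketch simulates the
linear ion-drift memristor model that HP's researchers described; the
parameter values below are illustrative assumptions, not HP's device
figures. The key property it demonstrates is that the device's resistance
depends on the history of charge passed through it, and that this state
persists when the drive voltage is removed.

    # Minimal sketch of the linear ion-drift memristor model; all
    # parameter values are illustrative assumptions, not HP's figures.
    import math

    R_ON, R_OFF = 100.0, 16e3   # ohms: fully doped / undoped resistance
    D = 10e-9                   # device thickness in meters (assumed)
    MU_V = 1e-14                # dopant mobility (assumed)

    w = D / 2                   # state variable: width of doped region
    dt = 1e-6
    for step in range(200_000):
        v = math.sin(2 * math.pi * 50 * step * dt)    # applied voltage
        m = R_ON * (w / D) + R_OFF * (1 - w / D)      # memristance
        i = v / m                                     # current
        w += MU_V * (R_ON / D) * i * dt               # dopant drift
        w = min(max(w, 0.0), D)                       # physical bounds

    # With the voltage removed, w -- and hence the resistance -- stays
    # put, which is why a memristor can hold intermediate values rather
    # than just binary 1s and 0s.
    print(f"final memristance: {R_ON*(w/D) + R_OFF*(1 - w/D):.1f} ohms")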
Stanford Kicks Off Parallel Programming Effort
EE Times (04/30/08) Merritt, Rick
Stanford University's new Pervasive Parallelism Lab will receive a total
of $6 million from six companies to get the lab up and running. The lab
will consist of about nine faculty members and as many as 30 graduate
students. Advanced Micro Devices, Hewlett-Packard, IBM, Intel, NVIDIA, and
Sun Microsystems are contributing to the effort, largely because of
increasing concerns that software is unable to keep up with the evolution
of multicore processors. "People are starting to build multicore hardware
without really knowing how they can productively program it and that's
becoming a huge problem," says Stanford computer science department chair
Bill Dally. "The current use of threads and locks is error prone and hard
to maintain, so if we don't find a new approach soon we will be saddled
with some very bad legacy software." Dally says the last seven or eight
years have been something of a Dark Age for parallelism research after
DARPA pulled its funding for the field. Microsoft and Intel recently
announced a plan to spend a total of $20 million over five years to fund
parallel computing labs at the University of California, Berkeley, and the
University of Illinois at Urbana-Champaign. The Berkeley and Stanford labs
are expected to take similar approaches to solving the problem. Both labs
will assign some researchers to develop next-generation applications using
domain-specific languages, and both will develop new runtime environments
to help automate the job of scheduling and synchronizing multiple processes
running in parallel. Both labs will also research new hardware structures
that could ease the parallel programming problem.
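A toy example helps illustrate the pitfall Dally describes. The sketch
below (in Python, purely for illustration; it is not from the article)
shows how unsynchronized threads silently lose updates, and how the
conventional lock-based fix carries exactly the maintenance burden the new
labs hope to design away.

    # Toy illustration of the threads-and-locks pitfall: four threads
    # increment a shared counter, with and without synchronization.
    import threading

    counter = 0
    lock = threading.Lock()

    def increment(n, use_lock):
        global counter
        for _ in range(n):
            if use_lock:
                with lock:
                    counter += 1   # safe: one thread at a time
            else:
                counter += 1       # racy: read-modify-write interleaves

    def run(use_lock):
        global counter
        counter = 0
        threads = [threading.Thread(target=increment,
                                    args=(100_000, use_lock))
                   for _ in range(4)]
        for t in threads: t.start()
        for t in threads: t.join()
        return counter

    print(run(False))  # may fall short of 400000: lost updates
    print(run(True))   # always 400000, but locks do not compose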
SIGGRAPH New Technology: From Enhancing Facial
Attractiveness to Virtual Maps
Business Wire (04/30/08)
The SIGGRAPH 2008 Technical Papers Program will feature 90 paper
presentations, which will be included this year in the journal ACM
Transactions on Graphics. Selected from an all-time high of 518
submissions, the papers come from computer graphics industry professionals
from all over the world. There will be presentations on modeling,
animation, rendering, and imaging, but a number of papers also focus on
topics such as scientific visualization, information visualization,
computer-aided design, human-computer interaction, computer vision,
robotics, film special effects, and computer games. Highlights include a
presentation on a technique for enhancing facial attractiveness in
photographs by researchers in Israel, and a paper on an automated system
for designing tourist maps by researchers at the University of California,
Berkeley. "These presentations give us a glimpse into a future with highly
realistic computer games, stunning feature film special effects,
intelligent cameras, and rich photo manipulation tools," says Greg Turk,
SIGGRAPH 2008 Technical Papers Chair from the Georgia Institute of
Technology. SIGGRAPH 2008 is scheduled for Aug. 11-15 at the Los Angeles
Convention Center.
Experts Struggle With Cybersecurity Agenda
Government Computer News (04/28/08) Jackson, William
The Commission on Cyber Security for the 44th Presidency, established in
November by the Center for Strategic and International Studies (CSIS),
recently held the second of five scheduled public meetings to field
recommendations on issues surrounding information security, identity theft,
and government leadership. CSIS established the commission to create
recommendations for a comprehensive strategy to improve the cybersecurity
of federal systems and critical infrastructure. The objective is to have a set of
recommendations ready for the next president by November. Panelists at the
meeting said leadership is needed from the government and industry to
forge a public/private partnership capable of providing adequate security. Although
they were not in complete agreement on cybersecurity priorities, they did
agree that a single national data breach notification law is needed to
replace the patchwork of more than 40 state laws. Other topics included
creating a zero-tolerance policy for identity theft and requiring
verification for online transactions with consumers, requiring the Social
Security Administration to create a database linking Social Security
numbers with dates of birth to prevent the misuse of Social Security
numbers, and establishing an International Data Classification Standard to
help identify and assess value and risk to data.
Can CUDA Language Open Up Parallel Processing?
EE Times Europe (04/30/08) Holland, Colin
More universities ought to offer courses on massively parallel
programming, and more graphics processor (GPU) providers should consider
facilitating use of NVIDIA's Compute Unified Device Architecture (CUDA)
programming language on their devices, said NVIDIA chief scientist David
Kirk in a lecture at Imperial College, London. "Massively parallel
computing is an enormous change and it will create drastic reductions in
time-to-discovery in science because computational experimentation is a
third paradigm of research--alongside theory and traditional
experimentation," he said, adding that massively parallel computing also
has the potential to effect a democratization of supercomputing. There
must be an emphasis on massively parallel computing in education for every
scientific practitioner and not just computer scientists and electrical
engineers, Kirk stressed. NVIDIA created CUDA to operate on its own GPUs,
and the language has gained a lot of momentum in applying those GPUs to
applications in finance and other multivariate analyses. Some observers
say CUDA is beginning to shape academic debates over parallel processing
languages and hardware frameworks. Commercial applications associated with
computational finance, oil and gas exploration, and other computational
modeling efforts have employed CUDA, along with projects to enable swifter
and more powerful hybrid rendering within graphics. Kirk said that CUDA
parallelization scales more nearly linearly than hand-coding for
multicore architectures. "I expect that you will see products that will
allow CUDA to run on multicores as well as enabling load balancing across
the two types of processor because what you really want to do is use all
the processors in your system," he said.
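CUDA's core abstraction is the kernel: one small function executed by
thousands of lightweight threads, each handling a single data element. The
sketch below emulates that structure sequentially in Python purely to
illustrate the programming model; real CUDA kernels are written in CUDA C
and launched in parallel on NVIDIA GPUs.

    # Sequential emulation of CUDA's kernel-per-element style,
    # for illustration of the model only.
    def saxpy_kernel(thread_id, a, x, y, out):
        # In CUDA, thread_id would be derived from blockIdx/threadIdx;
        # each GPU thread computes exactly one output element.
        out[thread_id] = a * x[thread_id] + y[thread_id]

    n = 8
    x = [float(i) for i in range(n)]
    y = [1.0] * n
    out = [0.0] * n
    for tid in range(n):       # a GPU would launch these in parallel
        saxpy_kernel(tid, 2.0, x, y, out)
    print(out)                 # [1.0, 3.0, 5.0, ..., 15.0]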
Long Wait for Scarce Visas
Baltimore Sun (05/02/08) P. 1A; Brewington, Kelly
U.S. Citizenship and Immigration Services received 163,000 H-1B visa
applications during the first five days of the application window this
year, vastly outnumbering the 65,000 visas available. In fact, demand for
H-1B visas has significantly outstripped supply for the past five years.
Employers say too few American workers have the skills needed to fill
high-tech jobs, forcing them to look for foreign talent. The massive
number of H-1B visa applications means USCIS will use a lottery to select
who receives the available visas. Immigrant advocates and high-tech
companies say America must import talent to stay competitive globally.
Critics argue that companies are using the visa program to displace
American workers and keep wages low, and that technology companies'
reliance on foreign workers has created a disincentive for American
students to study math and engineering to pursue high-tech professions.
Responding to the technology boom in the 1990s, the government increased
the visa cap to 115,000 in 1999, and again to 195,000 in 2001, only to
reduce it to 65,000 in 2004. "It's troubling because our economy now is
much more dynamic, much more diverse, and much more highly skilled than
during the tech boom of the 1990s," says Oracle's Robert Hoffman. "We are
operating under a 1990s immigration system, and that's absurd."
Digital Deception
Washington Post (05/01/08) P. D1; Whoriskey, Peter
Human-mimicking computers are becoming increasingly successful at solving
CAPTCHA online tests intended to separate humans from computers. In April,
Hotmail CAPTCHAs were broken by a computer. The computer then created
numerous free Hotmail email accounts and sent out waves of spam, Websense
says. Similar attacks occurred this year at Microsoft's Live Mail and
Google's Gmail and Blogger. "What we're noticing over the last year is
that these tests meant to tell the difference between a human and a
computer are being targeted by more and more malicious groups," says
Websense's Stephan Chenette. "And they are getting better at it." Solving
CAPTCHAs with computers allows spammers to quickly create new email
accounts to send spam, which Ferris Research estimates could cost the U.S.
economy $42 billion annually. In addition to computers breaking CAPTCHAs,
low-wage workers overseas are being paid to solve them. In fact, Google
says it believes humans were involved in solving its CAPTCHAs. Microsoft
and other Web companies say they are interested in developing human
verification tests that are more difficult for computers to crack, but
making the tests harder for a computer could make them harder for humans as
well.
Green Computing
Berkeley Lab Research News (04/14/08)
Kathy Yelick, director of Lawrence Berkeley National Laboratory's National
Energy Research Scientific Computing Center (NERSC), says the most
pressing problem in modern-day computing is power, and the computer
industry has attempted to address this challenge with the development of
multicore chips. NERSC researchers and the lab's Computational Research
Division are engaged in a coordinated series of projects to enhance the
energy efficiency of scientific computations by investigating subjects in
computer architecture, algorithms, and mass-storage-system designs. The
projects will concentrate on performing more science while using less
energy and enabling next-generation exascale computing systems.
Researchers are exploring the possibility of combining a very large number
of simple cores on each chip, which can allow them to lower clock rates and
save power while still squeezing out high performance; the researchers also
think the kind of low-power processors built for battery-operated devices
could vastly improve power efficiency and effective performance. Researcher
Erich Strohmaier is
tackling the algorithmic challenges of gaining energy
efficiency via massive parallelism: he is developing a testbed that
benchmarks critical algorithms across a broad range of scientific
applications, with the goal of creating a method for assessing system
performance algorithmically. Another project aims to study a wide spectrum
of competing multicore computer architectures and measure how efficient
they are in performing sophisticated scientific computations. Project
leader Jonathan Carter says the goal is "to identify candidate algorithms
that map well to multicore technologies and document the steps needed to
re-engineer programs, to take advantage of these architectures," and to
perhaps help design an improved high-performance system by influencing
design components in multicore chips.
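A back-of-the-envelope calculation, using assumed figures rather than
anything from the Berkeley projects, shows why many simple cores can win:
CMOS dynamic power scales roughly as capacitance times voltage squared
times frequency, and supply voltage can usually drop along with clock rate.

    # Back-of-the-envelope sketch; all figures are assumptions.
    def dynamic_power(c, v, f):
        return c * v**2 * f       # classic CMOS dynamic-power estimate

    fast = dynamic_power(c=1.0, v=1.2, f=3.0e9)   # one 3 GHz core
    slow = dynamic_power(c=1.0, v=0.8, f=1.0e9)   # one 1 GHz core

    # On a perfectly parallel workload, three slow cores match the fast
    # core's aggregate throughput at well under half the power:
    print(f"one fast core:    {fast:.2e}")      # ~4.3e9, arbitrary units
    print(f"three slow cores: {3 * slow:.2e}")  # ~1.9e9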
Malicious Hardware May Be Next Hacker Tool
New Scientist (05/01/08) Inman, Mason
Hackers could soon start using a new tactic in which they gain control of
a computer by adding malicious circuits to its processor, say University of
Illinois at Urbana-Champaign researchers. The malicious circuits would be
able to avoid detection because they could manipulate computers at a deeper
level than a virus. A University of Illinois at Urbana-Champaign research
team led by professor Samuel King used a field programmable gate array
(FPGA) to create a replica of an existing open source processor with about
1.7 million circuits. The team added about 1,000 malicious circuits not
present in the processor. The malicious circuits allowed the team to
bypass security controls on the processor, much as a virus gives
control to a hacker, but without requiring a software flaw. Attaching the
FPGA to another computer allowed them to steal passwords stored in its
memory and install malicious software that would give them remote control
of the computer's operating system. Putting malicious hardware on a chip
is not as easy as installing a virus: the hacker must either have access
to a chip during its design or manufacture, be able to build and sell
such chips to a computer manufacturer, or sneak doctored chips into computers
during assembly. However, as chips and their design processes become more
complex, it becomes easier for a hacker to infiltrate them.
Beating the Codebreakers With Quantum Cryptography
ICT Results (04/28/08)
Cryptography has been an arms race, with codemakers and hackers constantly
updating their arsenals, but quantum cryptography could theoretically give
codemakers the upper hand. Even the best classical encryption schemes, such
as 128-bit ciphers and RSA, can in principle be cracked with enough
brute-force computing power.
However, quantum cryptography could make possible uncrackable code using
quantum key distribution (QKD). Modern cryptography relies on the use of
digital keys to encrypt data before sending it over a network so it can be
decrypted by the recipient. QKD promises a theoretically uncrackable code,
one that can be easily distributed and still be transparent. Additionally,
the nature of quantum mechanics makes it so that if an eavesdropper tries
to intercept or spy on the transmission, both the sender and the receiver
will know. Any attempt to read the transmission will alert the sender and
the receiver, allowing them to generate a new key to send securely. QKD
had its first real-world application in Geneva, where quantum cryptography
was used in the electronic voting system. Not only did QKD guarantee that
the poll was secure, but it also ensured that no votes were lost in
transmission, because the uncertainty principle established that there were
no changes in the transmitted data. The SECOQC project, which did the work
for the voting system, says the goal is to establish network-wide quantum
encryption that can work over longer distances between multiple parties.
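The article does not name a specific protocol, but BB84 is the textbook
QKD scheme, and a purely classical simulation of it (sketched below in
Python as an illustration, not SECOQC's method) shows the detection
mechanism at work: a qubit measured in the wrong basis comes out random,
so an intercept-and-resend eavesdropper imprints roughly a 25 percent
error rate on the sifted key.

    # Classical simulation of BB84-style eavesdropper detection;
    # BB84 is an illustrative stand-in, not named in the article.
    import random

    def measure(bit, send_basis, measure_basis):
        # Matching bases preserve the bit; a mismatch randomizes it.
        return bit if send_basis == measure_basis else random.randint(0, 1)

    n = 2000
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]
    bob_bases   = [random.choice("+x") for _ in range(n)]

    bob_results = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        e_basis = random.choice("+x")            # Eve intercepts...
        bit = measure(bit, a_basis, e_basis)     # ...measures...
        bob_results.append(measure(bit, e_basis, b_basis))  # ...resends

    # "Sifting": keep positions where Alice's and Bob's bases matched,
    # then compare a sample; ~25% errors betray the eavesdropper
    # (without Eve, the sifted error rate would be ~0%).
    sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_results,
              alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted) / len(sifted)
    print(f"error rate on sifted key: {errors:.1%}")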
Phantom Obama Vote Appears on NJ Voting Machine
Wired News (04/30/08) Zetter, Kim
Officials from New Jersey's Pennsauken District 6 report that 279 votes
were cast during the Feb. 5, 2008, Democratic primary, but Princeton
University computer scientist Ed Felten has learned that a phantom vote was
cast for Barack Obama. The county clerk's report is based on information
taken digitally from the memory cards inside three Sequoia voting machines.
However, Felten says the summary tapes printed from the machines show that
there were 280 votes, and Obama received 95 rather than 94 votes. Felten
has a better chance of solving the mystery surrounding the Sequoia voting
machines after a judge ruled last week that independent experts could gain
access to voting machines in order to test their software and firmware.
The ruling stems from a lawsuit filed in 2004 over the legality of using
touch-screen voting machines in the state. Sequoia had threatened to sue
the state if it allowed researchers such as Felten to review its
machines.
Engineers Harness Cell Phone Technology for Use in
Medical Imaging
University of California, Berkeley (04/29/08) Yang, Sarah
University of California, Berkeley engineers are developing a system that
could one day use cell phones to make medical imaging accessible to
billions of people around the world. "Diagnosis and treatment of an
estimated 20 percent of diseases would benefit from medical imaging, yet
this advancement has been out of reach for millions of people in the world
because the equipment is too costly to maintain," says UC Berkeley
professor of bioengineering and mechanical engineering Boris Rubinsky, head
of the cell phone application development team. "Our system would make
imaging technology inexpensive and accessible for these underserved
populations." Most medical imaging devices consist of three
components--data acquisition hardware connected to the patient, image
processing software, and a monitor to display the image. When the
components are combined in one unit, there are often redundancies that
increase the cost of the device. Rubinsky's team created a way of
physically separating the components so the most complicated element, the
processing software, can be kept at an offsite location. The cell phones
would be used to relay raw data from the acquisition device to the central
server, which would process the data and relay the image back to the cell
phone. "This design significantly lowers the cost of medical imaging
because the apparatus at the patient site is greatly simplified, and there
is no need for personnel highly trained in imaging processing," says
researcher Antoni Ivorra.
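A minimal sketch of that three-way split, with hypothetical function names
and a trivial stand-in for the reconstruction step, shows how little logic
needs to live at the patient site or on the phone.

    # Sketch of the split architecture; every name and the toy
    # "reconstruction" below are hypothetical stand-ins.
    def acquire_raw_data():
        # Stand-in for the acquisition hardware at the patient site.
        return [[(r + c) % 7 for c in range(8)] for r in range(8)]

    def server_reconstruct(raw):
        # Stand-in for the offsite processing software, where the
        # costly image-reconstruction algorithms would actually run.
        peak = max(max(row) for row in raw)
        return [[round(255 * v / peak) for v in row] for row in raw]

    def phone_relay(raw):
        # The phone just moves bytes and displays the returned image;
        # in reality this would be an upload and a reply over the
        # cellular network.
        return server_reconstruct(raw)

    for row in phone_relay(acquire_raw_data()):
        print(" ".join(f"{v:3d}" for v in row))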
Open-Source, Multitouch Display
Technology Review (05/01/08) Greene, Kate
Eyebeam engineers have created Cubit, a scaled-down version of Microsoft's
Surface multitouch table that is being released as an
open-source technology. Eyebeam's Addie Wagenknecht and Stefan
Hechenberger say they designed Cubit in an effort to "demystify multitouch"
and to prove that anyone could build a multitouch table with the right
guidance and hardware. In addition to making the software available
online, Wagenknecht is selling a variety of do-it-yourself kits that
include parts and instructions for people with a range of engineering
skills. Building a multitouch table can cost between $500 and $1,000
depending on the hardware used. Multitouch technology has been available
for a while, but only recently has it become more affordable due to the
falling costs of many touch-screen components. Cubit is a box-like table
with a clear surface. The device uses a Webcam inside the table with an
added infrared filter, along with a small image projector that costs about
$300. Wagenknecht says a user only needs to plug the Webcam into a
computer, install software available on Cubit's project site, and plug in
the projector to have a multitouch interface. The kit includes a tabletop
with a special coating to make it easier for the camera to track objects,
and strips of infrared LEDs that shine on the back of the screen.
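The software side of such a table comes down to spotting bright infrared
blobs where fingers touch the surface. The sketch below uses OpenCV to
show that core step on a synthetic frame; it illustrates the general
camera-based multitouch technique, not Cubit's actual code.

    # Blob-style touch detection on a synthetic infrared frame;
    # illustrative of camera-based multitouch, not Cubit's own code.
    # (Uses the OpenCV 4.x findContours return signature.)
    import cv2
    import numpy as np

    def find_touches(frame_gray, min_area=30):
        # Fingertips touching the surface reflect the infrared LED
        # light and appear as bright spots to the filtered webcam.
        _, mask = cv2.threshold(frame_gray, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        touches = []
        for c in contours:
            if cv2.contourArea(c) >= min_area:
                m = cv2.moments(c)
                touches.append((m["m10"] / m["m00"],
                                m["m01"] / m["m00"]))
        return touches

    # Synthetic test frame with two bright "fingertips":
    frame = np.zeros((240, 320), dtype=np.uint8)
    cv2.circle(frame, (80, 120), 8, 255, -1)
    cv2.circle(frame, (200, 60), 8, 255, -1)
    print(find_touches(frame))    # two (x, y) touch coordinates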
Rice, Methodist and TFA Receive $1.5 Million NSF Grant to
Study In-Home Health Management and Next Generation Wireless
Networks
Rice University (04/24/08) Boyd, Jade; Fairchild, Erin
Rice University, the Methodist Hospital Research Institute, and the
nonprofit Technology for All organization have received a $1.5 million
National Science Foundation (NSF) grant to explore ways of providing
low-cost, personalized health monitoring to people with chronic diseases.
The researchers will study how patients with chronic illness can use
inexpensive handheld wireless monitoring devices called Blue Boxes to
become more active in their own medical treatment. NSF will pay for the
development and testing of the Blue Boxes, and for the wireless broadband
network that will connect the boxes to a central hub for analysis.
Methodist Hospital researcher Clifford Dacso says the Blue Box makes
personalized health care much more accessible to patients with chronic
illness. He says that combining the Blue Box technology with an existing
wireless network will enable people with chronic illness to fine-tune their
health, preventing deterioration that may result in emergency care.
Patients will use the Blue Boxes to monitor several key aspects of their
health status and send that information back to a central database for
analysis. Rice professor Ed Knightly, the project's principal
investigator, says the network is a "first-of-its-kind" research
platform: a fully programmable wireless network connecting
4,000 users.
Social Networking Applications Can Pose Security
Risks
Associated Press (04/28/08) Irvine, Martha
The thousands of mini-programs designed by third-party developers for use
on social networking sites such as Facebook and MySpace could pose a
security risk for users, in part because anyone with a
little technical know-how can create one that gathers
information about the users who download it. Among those who have
created a social networking application is Adrienne Felt, a Facebook user
and a computer science student at the University of Virginia who wanted to
research how such applications work. As part of her research, Felt polled
developers of such applications and found that they did not use or need the
information they gleaned from users who downloaded their applications,
including demographic information such as gender and age. Developers who
did use the information said they used it only to display targeted ads to
users while they ran the application. However, Felt found that there
was nothing stopping developers from matching the information they gathered
with public records. But even more worrisome for social networking users
is the prospect that the information gathered by developers could be sold
or stolen, which could in turn lead to identity theft. Applications are
not the only threat to social networking users. Last year, researchers
from Indiana University found that they were able to "scrape" information
from students' social networking sites. Given these threats, social
networking users should limit the information they post on their pages,
said Tom Jagatic, one of the researchers involved in the Indiana University
study.
Developer 2.0: Gung-Ho or Ho Hum?
eWeek (04/28/08) Taft, Darryl K.
Software developers have long employed collaboration technology, and their
attitude toward Web 2.0 technology appears to be a mix of excitement and
disinterest. Programming maven Ted Neward says the discussion of
developers and Web 2.0 tends to travel along two distinct avenues:
features embedded within applications that developers are creating for
customers, and features developers use to build software themselves.
In the first scenario, developers are enthusiastic about anything new and
novel, such as Web 2.0; in the second, anything that improves communication
between developers is generally viewed as positive, but that attitude tends
to consign Web 2.0 to the same level of interest as email and shared
desktops. "Developers will always vary in their
opinions, but my impression is that they have been kind of 'ho hum' about
[Web 2.0] as a general thing," says developer Chuck Esterbrook. "When they
do get excited, it's about a specific site they're building and what they
envision it doing down the road." Ringside Networks co-founder Bob Bickel
contends that developers are generally very enthusiastic about Web 2.0
because it allows them to get a great deal of functionality with a minimum
of sweat. Iona Technologies CTO Eric Newcomer says he has observed
developers adopting Web 2.0 technology, while Cohesive Flexible Technologies CTO
Patrick Kerpan says that "use of social and collaborative features will
further enhance the capabilities of good developers by increasing the
richness of the data stream they live in."
Pay Crunch
InformationWeek (04/28/08) No. 1183, P. 28; McGee, Marianne Kolbasuk
The first drop in IT professional salaries since the dot-com meltdown has
been recorded by InformationWeek's annual U.S. IT Salary Survey, which
found that staffers' median base pay declined from $74,000 last year to
$73,000 this year while managers' median base pay dipped from $97,000 to
$96,000. Among the factors that Rochester Institute of Technology
professor Ron Hira thinks may be behind this trend are the economic
downturn, competition with cheaper overseas talent, the replacement of
retiring baby boomers by less expensive younger workers, and perhaps a
skills/job title mismatch throughout the industry. Though 55 percent of IT
professionals polled in the survey said outsourcing reduced job
availability, just 22 percent said outsourcing caused IT salaries to fall.
Moreover, two-thirds of respondents said outsourcing has not affected them
personally. The stagnation of IT wages is surprising partly because of the
strong growth IT employment appears to have exhibited in the last 12
months, but much of that growth is concentrated in lower-paying positions
such as computer support. In comparison to last year's survey, when 63
percent of staff and 71 percent of managers expressed satisfaction with
their compensation, this year's poll found those numbers reduced to 56
percent of staffers and 63 percent of managers. The survey notes an 8 percent
year-on-year increase in respondents who place a premium on job security,
although 90 percent of IT pros feel their jobs are strongly or somewhat
secure, the same percentage recorded last year. There are worries
that entry-level IT jobs are becoming harder to secure, and the number of
employers who sponsor continuing education, certification programs, and
additional training so IT pros can keep their skills up to date is
relatively low.