Rice Computing Pioneer Ken Kennedy Dead at 61
Rice University Press Release (02/07/07)
Computing pioneer Ken Kennedy, the founder of Rice University's computer
science program and an ACM Fellow, died Feb. 7, after a long bout with
cancer. In a 36-year career, Kennedy is credited with making Rice one of
the country's leading centers for computational research and education. He
founded Rice University's Department of Computer Science in 1984, its
cross-disciplinary Computer and Information Technology Institute
(CITI) in 1987, its Center for Research on Parallel Computation (CRPC)
in 1989, and its Center for High Performance Software Research (HiPerSoft)
in 2000. "Ken Kennedy early on realized the power of computers to address
real problems that confront people and the Earth," said Rice President
David Leebron. In 1997, Kennedy was asked to be co-chair of the
President's Information Technology Advisory Committee (PITAC), which urged
government leaders to boost computing spending by over $1 billion, and
provided the spark for increased research support from many government
agencies. Kennedy earned "enormous respect [from] his colleagues around
the world ... for his abilities, his professional accomplishments, and his
humanity," said Rice physicist Neal Lane, who served as NSF director and
then White House science advisor during Kennedy's PITAC tenure. Aside from
taking part in countless panels at Rice, he was known for his willingness
to work alongside students, specifically in the effort to pull the coaxial
cable for the campus's first LAN. Kennedy received the ACM Lifetime
Achievement Award in 1999, which was "particularly significant to him
because it was an award from the community where he got his start," said
Keith Cooper, Rice Department of Computer Science chair, and a former Ph.D.
student of Kennedy's. In 1999, when ACM SIGPLAN listed the 50 most
influential papers of the previous 20 years, Kennedy authored five of them,
and three of his former students each had at least two. "It is fair to say that no one in
the last 35 years has had as much influence on the field of
programming-language implementation as Ken," said CITI Director Moshe
Vardi. He is also remembered for his work in making supercomputers more
accessible for scientists and engineers.
Feinstein Will Pursue Paper Record at Polls
San Francisco Chronicle (02/08/07) P. A4; Coile, Zachary
Sen. Dianne Feinstein (D-Calif.) later this month will introduce a bill
that establishes national standards for e-voting security, including a
voter-verified paper record. "I believe the time has come for Congress to
help ensure that we have such a record in all
federal elections," said Feinstein, chairwoman of the Senate Rules and
Administration Committee. Rep. Rush Holt (D-N.J.), who already introduced
a similar bill in the House, calls the recent Florida election problems
"exhibit A" as to why national e-voting mandates are needed before the
upcoming presidential election. His bill requires a paper trail, random
manual audits of paper ballots in a small portion of each precinct, and an
assurance that e-voting software will be open to regular inspection. Rice
University computer science professor Dan Wallach has criticized e-voting
machine manufacturers for not allowing their software to be tested
independently, saying they "shouldn't need to hide behind a veil of
secrecy." Opponents of e-voting legislation claim that printers will only
add to the problems at the polls, as they are prone to jamming, and that
e-voting machines have proved to be more accurate at
vote-counting than other systems. However, Feinstein says current e-voting
systems lack the safeguards necessary to prevent fraud. She says, "I'm not
sure that the most technologically modern machines necessarily yield the
best results. I'm from the school that likes to see their mark [on the
ballot]."
Feds Defend Oversight of E-Voting Testing
CNet (02/09/07) Broache, Anne
The Election Assistance Commission (EAC) held a public meeting on Monday
to clear up accusations that it was not being open about its review
processes for the independent labs that test e-voting machines. A New York
Times article revealing that the EAC had barred Ciber, the largest such
testing lab, from conducting further tests concerned many who felt the
commission should be more forthcoming with such information. Questions were also
raised as to the reliability of voting systems tested by Ciber for use in
past elections. EAC Chairwoman Donetta Davidson said it is standard
practice for labs to "be given a period of time in which they can correct
those non-conformities, and that may go on for some time." The National
Institute of Standards and Technology has released a good deal of
information on the lab review process, such as the complete reports from
onsite assessments of the lab, the lab's response, and the names of labs
that have applied for the review, according to NIST's David Alderman. He
noted that such information is not typically made public because labs tend
to use it to promote themselves or to disparage rival labs.
Ciber now has until March 5 to hand in paperwork that EAC will use to
decide whether or not to grant the lab "interim" accreditation. Also on
this date, the commission plans to stop accepting applications for
"interim" status, which has less stringent requirements. A new federal
system requires a two-step process for lab accreditation. First, the lab
must prove itself in a NIST technical review. If the lab passes this
review, the matter is passed along to the EAC, which checks for
non-technical concerns such as conflict of interest, organizational
structure, and record-keeping protocols. For information regarding ACM
e-voting activities visit
http://www.acm.org/usacm
U.S. Cyber Counterattack: Bomb 'Em One Way or the
Other
Network World (02/08/07) Messmer, Ellen
The National Cyber Response Coordination Group (NCRCG) has been formed to
draw up a national response should a cyber attack occur that impairs the
United States' critical information infrastructure. In such an event, a
cyber counterattack or actual bombing of the source of the attack could be
carried out, according to the three NCRCG co-chairs from the US-CERT
computer readiness team, the Department of Justice, and the Department of
Defense (DoD), although the preferred method would be to warn the source to
shut down before being attacked. Given this week's attempted massive
denial-of-service attack on the Internet's root DNS servers, "We have to be
able to respond," says Mark Hall, the DoD's co-chair of the NCRCG. "We need
to be in a coordinated response." Bringing together elements of the private and
public sectors for information-gathering efforts is quite a challenge even
without considerable disruption to Internet or voice communications.
"We're working with key vendors to bring the right talent together for a
mitigation strategy," says Jerry Dixon, US-CERT's co-chair of the NCRCG. The
group plans to speak with 50 other countries that are also monitoring for
large-scale cyber attacks. The Air Force has already established a new
Cyber Command that would be ready for "network warfare," says Air Force
Information Operations Center R&D engineer Jim Collins. "Where we had
pilots before, we'll have fighters in cyberspace." Any NCRCG
recommendations would be subject to approval by the President.
Wizardry at Harvard: Physicists Move Light
New York Times (02/08/07) P. A11; Craig, Kenneth
A Harvard study has demonstrated a technique for capturing, moving, and
releasing a light pulse, which one day could allow computers to process
information stored in light pulses. Study leader Lene Vestergaard Hau has
had previous success with slowing down light, and even stopping it, in what
is known as a Bose-Einstein condensate, a state of matter created by cooling
a cloud of sodium atoms to a temperature near absolute zero. When a control
laser is shined on the cloud, a second light pulse entering it slows as if
moving through molasses. Hau's latest work went a step further: After being trapped in a
Bose-Einstein condensate "cloud," the light pulse was transferred to
another cloud and regenerated there. When the initial pulse hit the cloud,
tens of thousands of sodium atoms were sent spinning in a clump that slowly
moved forward; this clump had identical characteristics to the light pulse,
even though it consisted only of sodium atoms. After this clump had
embedded itself within another Bose-Einstein condensate cloud, a laser was
shined on this cloud, and a new pulse of light was produced, identical to
the original one. This work shows the ability to "put [information] on the
shelf," according to Hau, by making a light pulse into a clump of atoms.
Atomic clumps would be far easier for a computer to work with compared to
fast-moving light pulses. While these results are far from any practical
application, they do provide a "missing link," Hau says.
Today, optical signals must be changed into electrical ones to be processed
and then transformed back into light, but all-optical devices could lower
costs and power usage.
Researchers Invent System to Control and Quarantine Worms
Attacking Computer Networks
Penn State Live (02/08/07)
Penn State researchers have developed anti-worm technology that detects
and stops worms much faster than conventional systems, and is also able to
release any information stopped as the result of a false positive. Rather
than using signature or pattern identification, Proactive Worm Containment
(PWC) "looks for anomalies in the rate and diversity of connection requests
going out of hosts ... [since] a lot of worms need to spread quickly in
order to do the most damage," says lead PWC researcher Peng Liu.
Signature-based systems can take a few minutes to recognize a worm and
create a new signature to stop it from spreading, and when these systems
decrease the time needed to generate a signature, they can miss worms that
are able to automatically mutate. Liu estimates that a worm could only
send out a few dozen packets before being quarantined by PWC, compared with
the 4,000 packets sent out every second by a worm that recently attacked
Microsoft SQL Server. To verify whether a suspected host is infected,
PWC uses vulnerability-window and relaxation analyses that can undo a
potential denial-of-service resulting from a false positive. PWC can be
seamlessly added to existing signature-based worm filtering systems. Liu
admits that his system would not be able to spot slow-spreading worms, but
notes that those can already be stopped by current technologies.
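The rate-and-diversity heuristic Liu describes can be sketched as a toy per-host monitor. The thresholds, window length, and class names below are illustrative assumptions, not values taken from PWC itself:

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds -- the article gives no numbers, so these
# values are for illustration only.
RATE_LIMIT = 20        # max outgoing connection attempts per window
DIVERSITY_LIMIT = 15   # max distinct destination addresses per window
WINDOW = 1.0           # sliding-window length in seconds

class ContainmentMonitor:
    """Toy model of rate/diversity anomaly detection per host."""

    def __init__(self):
        self.events = defaultdict(deque)   # host -> deque of (timestamp, dest)
        self.quarantined = set()

    def allow(self, host, dest, now=None):
        """Record an outgoing connection attempt; return False if the
        host is (or becomes) quarantined."""
        if host in self.quarantined:
            return False
        now = time.monotonic() if now is None else now
        q = self.events[host]
        q.append((now, dest))
        # Drop events that fell out of the sliding window.
        while q and now - q[0][0] > WINDOW:
            q.popleft()
        rate = len(q)
        diversity = len({d for _, d in q})
        if rate > RATE_LIMIT or diversity > DIVERSITY_LIMIT:
            self.quarantined.add(host)   # block further traffic
            return False
        return True
```

A host making occasional connections stays below both thresholds, while a fast-scanning worm trips the diversity limit within its first few dozen packets, which matches the containment behavior the article describes.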
Low Interest in CS and CE Among Incoming Freshmen
CRA Bulletin (02/06/07) Vegso, Jay
A new survey from the Higher Education Research Institute at the
University of California at Los Angeles (HERI/UCLA) reveals that incoming
freshmen at all undergraduate institutions continue to show little interest
in majoring in computer science and computer engineering. Only 1.1 percent
of incoming freshmen said computer science was likely to be their major in
the fall of 2006. The figure was unchanged from 2005, having already fallen
70 percent between 2000 and 2005. Meanwhile, only 1 percent of incoming
freshmen said computer engineering would probably be their major.
HERI/UCLA started tracking computer engineering interest in 2002, and the
percentage of students saying they were likely to pursue the subject as a
major has fallen every year. Meanwhile, the Taulbee Survey of PhD-granting
computer science departments will be released March 1, 2007, and
double-digit declines in undergraduate enrollment and granting of
bachelor's degrees are expected again.
New Energy Star Rating for PCs on the Way
CNet (02/08/07) Krazit, Tom
In July, the Energy Star program will release new specifications meant to
define the top 25 percent of PCs according to energy efficiency. Over 90
percent of computers on the market meet the current Energy Star
requirements, established in 1992. To meet the new requirements, a PC's
power supply must convert at least 80 percent of incoming electricity for
use by the machine; the
average power supply is currently about 70 percent efficient. The new
specification will include a requirement for idle mode: Basic desktop PCs
must use less than 50 watts of power in idle mode, while multicore PCs and
those with advanced graphics processors will be allowed to use more;
notebooks must use less than 14 watts in idle mode and those with a
graphics chip must use less than 22 watts. The lack of an accepted metric
for power consumption has delayed the release of new Energy Star
specifications, says Natural Resources Defense Council scientist Noah
Horowitz. He says that computer manufacturers have not been able to agree
on what normal power consumption is when a computer is "on." Computer
makers tend to set up machines for maximum performance when testing them
against competitors, and Horowitz asks, "What are you going to make the
computer do during that test, and how do you make sure it's not gamed?"
Efforts to issue specifications for servers have met similar problems, and
Congress has recently passed a resolution asking for greater energy
efficiency for servers. Energy Star is also planning a new specification
for TVs; the last one was issued for black-and-white models.
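As a rough illustration, the idle-power ceilings and the 80 percent supply-efficiency floor described above can be combined into a simple compliance check. The category names are invented here, and the allowances for multicore machines and advanced graphics are simplified away, so this is a sketch of the draft limits, not the actual specification:

```python
# Idle-power ceilings in watts, as summarized in the article. The real
# specification is more detailed (e.g., higher allowances for multicore
# PCs and advanced graphics), so these categories are simplifications.
IDLE_LIMITS = {
    "desktop_basic": 50,
    "notebook": 14,
    "notebook_discrete_graphics": 22,
}

MIN_PSU_EFFICIENCY = 0.80  # power supply must convert 80% of input power

def meets_spec(category, idle_watts, psu_efficiency):
    """Return True if a machine meets these simplified draft limits."""
    limit = IDLE_LIMITS.get(category)
    if limit is None:
        raise ValueError("unknown category: %s" % category)
    return idle_watts < limit and psu_efficiency >= MIN_PSU_EFFICIENCY
```

For example, a basic desktop idling at 45 watts with an 82-percent-efficient supply would pass, while today's typical 70-percent-efficient supply would fail on efficiency alone.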
Miklau Awarded CAREER Grant to Study Privacy,
Accountability
University of Massachusetts Amherst (02/05/07)
University of Massachusetts computer science researcher Gerome A. Miklau
plans to build a computer database system that improves the management of
digital devices' history of past operations and data. The development of
the database system will be made possible by a five-year, $500,000 grant
from the National Science Foundation's Faculty Early Career (CAREER) grant
program. Computer systems preserve the history of their activity as a way
to offer some accountability, and this is helpful for detecting breaches,
maintaining data quality, and auditing security compliance. "In some
settings, however, retaining a history of past data or operations poses a
serious threat to privacy," says Miklau. "The fact is, privacy and
accountability are both important goals, and system designers need to
carefully manage the balance between them." Miklau plans to give the
computer database system "memory-less" and accountability-support
functions. A prototype database system will be made publicly available.
High Security for $100 Laptop
Wired News (02/07/07) Singel, Ryan
The One Laptop Per Child (OLPC) project has impressed computer experts
with the unique design of its machine, the XO, and is now receiving
attention for its approach to security. The XO's security rests on running
every program in its own virtual machine, strictly limiting each program's
permission to access others. Although the idea of limiting programs'
permissions is nearly half a century old, it has never caught on because it
placed too much of a security burden on programmers, says Ivan Krstic,
head of security for the XO. XO's security system, known as the BitFrost
platform, has no security prompts, firewalls, or antivirus software.
"Applications can no longer run rampant," says Krstic, as opposed to
Windows XP where even Solitaire can access the Web. Only software verified
by OLPC or by a participating country can request permissions. The idea is
to undermine malware by eliminating hackers' economic incentive. Krstic
does acknowledge that interaction between applications will be severely
limited, but he says that "99 percent don't need" such interaction. The XO
will also have a system by which it checks in with a country-specific
server every day to see if it has been reported stolen; if it has been, it
shuts down completely, and if not, its cryptography-secured "lease" is
extended a few more weeks. Krstic sees flaws in every traditional security
architecture used in today's computers, including the new Microsoft
Internet Explorer's virtual sandbox, which he says "is trying to impale
sandboxing on something that doesn't exist."
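The daily check-in and lease-extension scheme can be modeled in a few lines. The class and method names and the three-week extension are illustrative assumptions; the article says only that the lease is extended "a few more weeks" and does not describe BitFrost's actual interfaces:

```python
from datetime import datetime, timedelta

# Hypothetical model of the XO's anti-theft lease; none of these names
# come from the BitFrost code itself.
LEASE_EXTENSION = timedelta(weeks=3)   # "a few more weeks" (assumed value)

class Laptop:
    def __init__(self, serial, lease_expiry):
        self.serial = serial
        self.lease_expiry = lease_expiry
        self.disabled = False

    def daily_checkin(self, stolen_serials, now=None):
        """Contact the country-specific server: shut down if reported
        stolen, otherwise extend the cryptographically secured lease."""
        now = now or datetime.utcnow()
        if self.serial in stolen_serials:
            self.disabled = True           # machine shuts down completely
        else:
            self.lease_expiry = now + LEASE_EXTENSION
        return not self.disabled
```

The design means a stolen machine stops working after at most one missed-then-flagged check-in, which is what undercuts the economic incentive for theft.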
For Computer Scientists Exploring Face Recognition, the
Question Is 'Who?'
PhysOrg.com (02/07/07) Zyga, Lisa
The human brain is able to recognize a face in 50 milliseconds, and while
scientists hope to learn from the way the brain works in order to create
computer programs that can do the same, they are eager to find out if
computers could perhaps surpass humans in this ability. "It would be a
waste not to learn from [the brain], especially since there are no other
computer strategies so far that come close to the kind of face recognition
performance the human brain exhibits," says Harvard scientist Richard
Russell. Humans have the ability to recognize faces in very low-resolution
images. Even when given an ideal view of a face, the human visual
system "doesn't seem to bother storing perfect models of the objects we
see," says MIT's Benjamin Balas, who worked alongside Russell on the
paper, "Face Recognition by Humans." People seem to use only certain
features, such as eyebrows, to identify each other. Faces are processed by
adults as holistic images, unlike many other objects. In designing face
recognition computers that could be used to find people in a crowd or for
the creation of "smart" environments, scientists pay attention to the human
visual system's techniques, but remain open to the possibility that a
computer could do better. "Too much generalization can be a shortcoming,"
especially when people are in disguise, says MIT's Pawan Sinha, another
contributor to the "Face Recognition" paper. "A detail-oriented scheme,
say examining the precise pattern of irises or the exact distances between
facial features, might be more appropriate, despite being implausible as
human strategies."
Flow of Tiny Bubbles Mimics Computer Circuitry
MIT News (02/08/07) Trafton, Anne
MIT researchers have found a way to use tiny bubbles to replicate the
operations of a computer. The bubbles can carry on-chip process control
information while undergoing chemical reactions. "Bubble logic merges
chemistry with computation, allowing a digital bit to carry a chemical
payload," said MIT Center for Bits and Atoms director Neil Gershenfeld.
"Until now, there was a clear distinction between the materials in a
reaction and the mechanisms to control them." The field, known as
microfluidics, allows processes on a chip to be controlled by tiny bubbles
flowing through microchannels, without the need for any external controls. "Now
you can program what's happening inside the lab on a chip, by designing
bubble logic circuits that function just like their electronic
counterparts," said graduate student Manu Prakash. A future step would be
the creation of large-scale microfluidic systems like chemical memories,
which can hold thousands of reagents on a chip, a technique similar to data
storage. The chips developed by Gershenfeld and Prakash used the presence
or absence of a bubble, rather than high or low voltage, to represent a bit
of data. In their paper, they demonstrate the components needed for a new
logic family, such as gates, memories, amplifiers, and oscillators.
Currently, their chip is about 1,000 times slower than today's electronic
microprocessors, but 100 times faster than the external valves and control
processes used in today's microfluidic chips.
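Treating a bit as the presence (True) or absence (False) of a bubble, as the chips do, a single junction can compute AND and OR at its two outlets. The sketch below is a Boolean abstraction of that idea, not a fluid-dynamics model of the actual device:

```python
def and_or_gate(a, b):
    """Toy model of a microfluidic AND/OR junction. Bits are the
    presence (True) or absence (False) of a bubble. When two bubbles
    arrive together, mutual interference diverts one into a second
    outlet, so one outlet carries a bubble only when both inputs do
    (AND) and the other carries a bubble when either input does (OR)."""
    return (a and b, a or b)   # (AND outlet, OR outlet)
```

Cascading such junctions, together with memories and amplifiers, is what gives the authors a complete logic family in which the "signal" is also a chemical payload.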
Two NYU Stern Professors Awarded $1 Million in National
Science Foundation Career Grants
NYU Stern (02/07/07)
Two NYU Stern professors have received National Science Foundation Faculty
Early Career Development Awards (CAREER) to pursue research for improving
understanding of the increase in information on the Internet, including its
economic value. The CAREER grants will provide Anindya Ghose and
Panagiotis (Panos) Ipeirotis, assistant professors of information,
operations, and management sciences (IOMS), with $500,000 each over the
next five years. Ghose will focus on the economic impact of new
information online, and the research will offer recommendations for making
better electronic markets and social networks. Ipeirotis will concentrate
on Internet searches, and his efforts could help businesses benefit from
efficient processing and understand its economic impact. "Panos' research
improves the way we find information by enabling users to convert
unstructured information on Web pages into structured form and by estimating
the economic value of each piece of extracted information," says Vasant
Dhar, deputy chair of the Information Systems Group in the IOMS department.
"Anindya's research quantifies the economic value of information on the
Internet with the objective of improving the design and profitability of
electronic markets, online retailers, and social networks." Ghose received
the 2005 ACM SIGMIS Doctoral Dissertation Award, and Ipeirotis received the
best paper award at the 2006 ACM SIGMOD conference.
The Mind Chip
New Scientist (02/03/07) Vol. 193, No. 2589, P. 28; Fox, Douglas
A notable achievement in computer vision has been made by researcher
Kwabena Boahen and colleagues at the University of Pennsylvania in
Philadelphia, who constructed a device that can see via chips that
physically imitate the electrical activity of neurons in the primary visual
cortex. "I want to figure out how the brain works in a very nuts-and-bolts
way," explains Boahen. "I want to figure it out such that I can build it."
Boahen aims to top his accomplishment of building an artificial retina
with the creation of an artificial cerebral cortex through the
generalization of the chip's function; such a breakthrough may be an
important step in helping restore neural function to people impaired by
disease or injury. The concept of the artificial neuron as a technology
for enabling brain-like computing in real time was first suggested in the
late 1980s by California Institute of Technology scientist Carver Mead, who
discovered he could build such circuits by operating transistors in their
analog amplification region instead of using them as on/off digital
switches. Mounted on the surface of Boahen's artificial retina are
photosensitive transistors that translate incoming light into analog
voltages whose values are determined by the light's intensity and that
persist for as long as the light is beamed onto the transistors; these
signals are routed to the artificial retina's neurons, which detect motion
and regions of contrast, signaling the edges of objects in the
image. Processing information about edges and movement in the visual scene
is carried out by the low-power visual cortex chips, which build object
outlines out of the signals. A successful cortical implant will have to be
able to mimic the plasticity of the brain's neural network, in which
connections between neurons are created and adapted on the fly.
Super Saver
Government Computer News (02/05/07) Vol. 26, No. 3
A new generation of file systems is being developed to handle the data
management needs of ultrafast supercomputers such as Roadrunner, a
petascale machine with upwards of 32,000 processors being developed by Los
Alamos National Laboratory's High Performance Computing Systems Integration
Group. A key impetus for the creation of global or parallel file systems
comes from Energy Department labs. In addition to specifying what
operations can be performed on a file over a network, a global file system
must keep tabs on data that extends across multiple storage arrays. "You
definitely want all [data] shared across all the nodes in a cluster, so all
the nodes see the same data, can read the data and write the data to common
places," says Gary Grider with the group developing Roadrunner. Concurrent
with that must be the aggregation of all data pointers into a single pool
to avoid hindering speed of access. Grider points out that Energy labs
generally employ a combination of major global file systems, and Los Alamos
elected to use the Panasas ActiveScale File System running on Panasas'
ActiveScale Storage Cluster for Roadrunner. Choosing the best parallel
file systems for a supercomputer involves designers rating the advantages
and disadvantages of currently available products and solutions. "None of
[these solutions] has yet distinguished itself as the answer to all I/O
issues in supercomputing," notes Paul Buerger with the Ohio Supercomputer
Center.
The Trouble With Software Quality Control
Dr. Dobb's Journal (02/05/07) Armitage, Colin
The Original Software Group founder Colin Armitage points out the
disadvantages of relying on manpower alone to spot software bugs when
methodologies that structure the work and measure its effectiveness are far
more efficient.
"Having end users report problems to an overloaded help desk is not the
best way to achieve the goal of releasing defect-free code," he maintains.
"Unsurprisingly, a reactive development/QA environment can exacerbate
damage when application performance plummets, or when programs fail
altogether." The government is pressuring organizations to implement
stronger software quality assurance with the institution of federal
compliance legislation such as Sarbanes-Oxley, but top-level IT managers
report a disconnect from end users, who frequently have trouble with new
systems. Armitage recommends the Software Testing Maturity Model (S-TMM)
as a useful testing methodology; its advantages include ease of adoption by
IT organizations, the provision of a self-evaluation strategy, and guidance
for IT departments through a set of phases that gradually evolves the
development process. The S-TMM identifies five maturity levels--initial,
phase definition, integration, management and measurement, and
optimization/defect prevention and quality control--against which IT
organizations can be scored as a starting point for improvement. Armitage makes the
case for embracing S-TMM with his contention that "The difference between
where you believe you are in terms of testing maturity and where you really
are can be best clarified with S-TMM." In addition, upon the establishment
of a maturity baseline, IT organizations can employ S-TMM as an improvement
roadmap.
She's Got Their Number
Fast Company (02/07) No. 112, P. 100; Salter, Chuck
The math sciences department at IBM's Thomas J. Watson Research Center is
run by Brenda Dietrich, and she believes it is crucial for researchers to
leave the hermetic environment of the lab and venture outside so that the
math problems they solve have real-world applications. Dietrich's
marketing skills and political savvy have helped her command respect and
resources in IBM's massive organization--no small feat. Head of IBM
Research Paul Horn observes that the company's migration from hardware to
software and services is partly responsible for the incorporation of
mathematicians into virtually all operations. Another phenomenon that the
growing stature of Dietrich's department reflects is organizations'
increased reliance on mathematicians to measure almost every process, given
the importance of the data they produce in improving efficiencies or
exploiting opportunities that impact the bottom line. Dietrich helped
start a class designed to increase researchers' understanding of
consulting's processes and cultural aspects in order to make mathematicians
more accommodating of clients' needs and mindsets. Among the projects
Dietrich and her department are engaged in is an effort to develop more
effective strategies for fighting forest fires through the creation of a
model based on seven years' worth of data that quantifies the extent, cost,
and consequences of past resource utilization. Identifying promising sales
leads and assembling a project team out of far-flung consultants are other
challenges the department is taking on, while Dietrich is especially
interested in the problem of forecasting future labor shortages through
analysis of population trends, employee demographics and skills, and demand
for specific technologies. "We want to push the frontiers of what's
solvable," notes Dietrich. "Otherwise, what's the point?"
I, Microsoft
Industry Week (02/07) Vol. 256, No. 2, P. 17; Teresko, John
Expanding the potential of robotics is the goal of Microsoft Robotics
Studio, a software development kit designed to provide a common platform
for the generation of robotic applications. Georgia Tech College of
Computing professor Tucker Balch says the kit will not only increase the
presence of robots in industry, but also cultivate the application of
robotics technology to consumer products and other economic domains.
General manager of the Microsoft Robotics Group Tandy Trower says his
unit's objective is to offer an affordable, open platform that reduces
robot developers' difficulty in melding hardware and software in their
designs. The kit is freely available to students, academics, and robot
enthusiasts, while robot developers seeking profits can license the kit at
a starting price of $399. Trower says there are three software areas
contained in the development kit: a runtime architecture that is
universally applicable to all robot types; a toolset that simplifies the
programming and debugging of robots; and over 30 tutorials and samples of
source code that serve as a jumping-off point for creators of robotics
applications. Trower says the kit was created in response to requests from
academics, hobbyists, and commercial robotics developers, and he reports
that today's robotics industry is fraught with the same type of
fragmentation that the computer industry suffered from in the 1970s.
"Robots use different operating systems, different hardware and different
processors," Trower notes. "So programming these things is very hard, the
toolset is limited and reusability of code is very limited."