I.B.M. Near Supercomputer Contract
New York Times (08/06/07) Markoff, John
Documents accidentally posted for a short time on a federal government Web
site show that the National Science Foundation plans to award a contract to
build the world's fastest supercomputer to I.B.M. The supercomputer is
expected to be built at the National Center for Supercomputing Applications
at the University of Illinois at Urbana-Champaign, and will cost $200
million to build and may cost more than $400 million during its five-year
lifetime. The supercomputer will be the first machine capable of handling
one thousand trillion mathematical operations a second, also known as a
petaflop. Unlike most of the nation's academic research supercomputers,
which serve a large community of users, the petaflop supercomputer will be
reserved for handling a limited number of Grand Challenge science projects,
like simulating the impact of global warming. The computer represents a
significant shift in the balance of computing power between military and
scientific computer centers. For most of the last two decades, the fastest
computers in the United States were located at either the national
laboratories at Los Alamos, N.M., or Livermore, Calif., and were primarily
used for tasks related to the design and preservation of nuclear weapons
and other classified applications. The documents have caused quite a bit
of controversy, as several government supercomputing scientists say they
are concerned that the decision might raise questions about impartiality
and political influence. A second award listed in the documents shows that
the NSF is also planning to install a Cray supercomputer at the Oak Ridge
National Laboratory at the University of Tennessee, which essentially would
supply the Department of Energy with another supercomputer, because
although the award was given to the university, the operation would be run
by the Department of Energy. The I.B.M. supercomputer may not be the
world's fastest computer for long, however, as Japanese researchers are
designing a machine that they believe will reach a computing threshold of
10 petaflops in 2011.
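For a sense of scale, the petaflop figure above can be put in perspective with a quick back-of-the-envelope calculation; the 1-gigaflop desktop rate below is an assumed comparison point for illustration, not a figure from the article:

```python
# One petaflop is 10**15 mathematical operations per second, as noted above.
PETAFLOP = 10 ** 15
GIGAFLOP = 10 ** 9  # assumed throughput of an ordinary 2007-era desktop

# How long the 1-petaflop machine needs to match a full year of work by
# the hypothetical 1-gigaflop desktop:
seconds_per_year = 365 * 24 * 3600
desktop_year_ops = GIGAFLOP * seconds_per_year
print(desktop_year_ops / PETAFLOP, "seconds")  # about 31.5 seconds
```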
37 Percent of Caltech's Incoming Class Are Women
San Luis Obispo Tribune (CA) (08/06/07)
This year, women will account for more than one-third, 37 percent, of
Caltech's incoming freshman class, the highest such percentage since
Caltech started admitting undergraduate women in 1970. Six years ago,
women accounted for 36 percent of Caltech's freshman class, but that
percentage dropped and reached a low of 28.5 percent in 2006. Caltech
officials say the increased enrollment of women at the college indicates
that progress is being made in getting more women interested in technology
and science training. "The more women we have on this campus, the better
it is for everybody," says Caltech's assistant vice president for student
affairs Erica O'Neal. "It is better for women to not feel so isolated. And
it is better for the guys to learn how not to be awkward with the opposite
sex." Female enrollment at Caltech is still lower than MIT's expected 46.1
percent for this year's incoming class, and the 42.6 percent at science and
math school Harvey Mudd College. Each of the technology schools falls
dramatically short of the current 57 percent female enrollment at colleges
nationwide. The National Science Foundation reports that women outnumber
men in full-time graduate-level study in many biological sciences, but are
underrepresented by a two-to-one ratio in physical sciences, like chemistry
and physics, and by a three-to-one ratio in computer science. Caltech
attributes the increased percentage of women entering the school to a
stepped-up effort to actively recruit women, which included efforts to
ensure women were aware that they could major in a hard science but still
be able to study other interests, like music and literature. Director of
undergraduate admissions Rick Bischoff says that does not mean men are less
interested in such subjects, but that the ability to have a diverse field
of study particularly resonated with women.
Analyze This
HPC Wire (08/02/07)
The National Science Foundation (NSF) recently approved plans to deploy
Integrated Performance Monitoring (IPM) on all major NSF supercomputers.
IPM, developed by NERSC's David Skinner in 2005, analyzes the performance
of HPC applications and identifies load balance and communication problems
that could prevent a system from running smoothly and reaching its highest
level of performance. IPM is a particularly good tool for supercomputing
because it is easy to deploy and use in systems with thousands or tens of
thousands of processors. IPM has received approval from several other
supercomputing centers, including the San Diego Supercomputer Center, the
Center for Computation and Technology at Louisiana State University, the
Swiss National Supercomputing Center, and the Department of Defense's Army
Research Laboratory. IPM is easy to use because it has a low overhead and
requires no source code modifications, and running IPM will not interfere
with applications being profiled because it uses a fixed memory footprint.
"Some means of doing performance analyses are quite invasive and disturb
the application one is trying to study; others are more lightweight but
don't provide adequate information to researchers to improve their codes.
Some require all users of a system to actively participate in the profiling
activities; others are more passive, operating in the background. Some
scale to thousands of tasks and some do not," Skinner says. The NSF has
awarded $1.58 million to the IPM deployment project, which will
also focus on expanding IPM's capabilities, including broadening the scope
of what is profiled and improving data analysis, according to Skinner. The
deployment of IPM is scheduled to start in August 2007 and will run for
three years.
The High-Tech Future for the Army
CNet (08/02/07) Skillings, Jonathan
The U.S. Army is planning a major technological overhaul, the heart of
which is the Future Combat Systems (FCS) program that seeks to deploy a
full spectrum of networked equipment running the gamut from unmanned aerial
drones to more autonomous robots to battle command software to
next-generation communications systems and more. U.S. Army chief scientist
Thomas Killion says concepts and pieces of the FCS technology are being
introduced to soldiers now rather than expecting them to understand it and
learn how to use it once it's fully complete. Short-term visions for
robots and drones include intelligence, surveillance, and reconnaissance,
while Killion projects an increase in the systems' autonomy over the next
four or five years. Further out he sees the employment of very small
robots for nonintrusive surveillance and intelligence, which entails the
provision of more compact sensors and additional power efficiency, among
other innovations. Other technologies U.S. forces will field include a
truck-mounted high-energy laser being developed by Boeing, and non-lethal
"heat rays" for crowd dispersal. Killion characterizes directed energy as
"the next-generation capability particularly for
counter-rocket/artillery/mortar capability to defeat inbound projectiles."
However, he concedes that the directed energy initiative constitutes a
major technical challenge.
Internet Project Gears Up at Stanford
Stanford Daily (08/02/07) Trotter, Emma
Stanford University's Clean Slate Initiative, in partnership with Cisco
Systems, Japan's DoCoMo, and Germany's Deutsche Telekom, is researching how
the Internet would be designed if it were started from scratch today. Stanford
computer science professor Nick McKeown says the initiative gives Stanford
an opportunity to have a hand in the future direction of the Internet.
"With the breadth of world-class expertise here on campus, and the
proximity to the center of the networking industry, Stanford is well-placed
to do that," McKeown says. Stanford's Clean Slate team views security and
mobility as key areas for improvement. In plans outlined in July, the team
cited network architecture, heterogeneous applications, heterogeneous
physical layer technologies, security, and economics and policy as five
important areas for research. Internet surveillance is an interest of
governments worldwide, but political issues may have to be addressed.
Action Plan to Beat Cybercrime
Information Today (08/07) Vol. 24, No. 7, P. 24; Ashling, Jim
The International Telecommunication Union (ITU) recently announced the
Global Cybersecurity Agenda, a two-year program to improve users' trust in
the security of online transactions. ITU secretary general Hamadoun Toure
said the agenda would focus on finding technical solutions for every
environment, developing interoperable legislative frameworks, building
capacity in all relevant areas, establishing appropriate organizational
structures, and adopting effective international cooperative measures. The
agenda says that because cybercrime is a global problem, the solution needs
to include a coordinated global response from invested parties, including
governments, inter-governmental organizations, the private sector, and
civil society. The few existing legal frameworks are enforceable only
within national or regional boundaries, which allows
criminals to exploit loopholes with impunity as they establish operations
in countries without appropriate or enforceable laws. Initially, the
objectives of the Global Cybersecurity Agenda appear to be overly
ambitious, but because the ITU consists of 191 member countries and more
than 700 nongovernmental members, the organization has the reach to cover a
full spectrum of interests. The first action will be to establish a
High-Level Experts Group (HLEG) to refine the goals, identify emerging
threats, and develop solutions. The HLEG will produce legislation for
interested countries, security and accreditation criteria for software
developers, and numerous strategies to assist global cooperation.
USU Lab Researching Cyberterrorism
The Herald Journal (Utah) (08/01/07) Burgess, Kim
Utah State University researchers in the Space Dynamics Lab's (SDL)
Cyberconflict Research Consortium have been researching computer attacks
for the past year and a half in an effort to prevent attacks on the United
States' technological infrastructure from causing major disruptions. "We
would want to avoid the cyber equivalent of Pearl Harbor; that is,
something that would catch us unprepared," says SDL program manager Jim
Marshall. The researchers are working with four other institutions on the
project, including the University of Nevada, Reno; Miami University in
Ohio; Norwich University; and the Potomac Institute for Policy Studies
think tank. Each institution is addressing a different aspect of cyber
security. The lab's main objective is representing cyber terrorism data in
a visual format using computer graphics. "With a
large-scale cyber attack, you have a lot of information, gigabytes and
terabytes of information," says USU computer scientist and research
assistant Robert Erbacher. SDL has experience in visualization because the
information collected from telescopes and sensor systems is best
represented visually. The USU cyber terrorism researchers will work on
representing cyber terrorism data in a visual manner that is easy for
military and homeland security officers to understand. "Any country can
try to attack the U.S. over the network," Erbacher says. "We need to be
prepared to defend against them."
Online Underworld
San Francisco Chronicle (07/30/07) P. C1; Abate, Tom
Over the past few years, international criminals have begun employing
computer automation to transform unprotected PCs into law-breaking robots,
or "bots." Owners of infected PCs remain unaware of the crime. The
trend's scope is alarming, as up to 18,000 bands of infected PCs, or "bot
nets," are in existence at any given time, according to Andre Di Mino of
Shadowserver Foundation. Di Mino says security professionals believe there
are roughly 8 million to 10 million compromised systems being controlled by
"bot-herders." Intent on making money, these criminals are stealthier than
hackers of the past. To carry out attacks on Web sites, steal from bank
accounts on a large scale, and automate identity theft, the criminals have
specialized in various skills. While some criminals create malware, others
employ the malware in contaminating PCs, and still others negotiate
bot-herding deals. Firewalls can provide PCs with a certain amount of
protection, but phishing attacks can fool email recipients into opening
tainted files that seem to come from a trusted source. Microsoft has
worked to enhance the security in Windows, but bot herders are now
instigating their attacks from Web 2.0 platforms. By dodging legitimate
Web sites' security features, bot herders infect the sites with malware.
Experts wonder whether small Web 2.0 startups will be able to meet high
security standards, such as those used by Google.
Divorce Software Designed to Handle Negotiations
LiveScience (07/31/07) Wenner, Melinda
Emilia Bellucci and John Zeleznikow, researchers at Victoria University in
Australia, have developed software designed to help couples settle divorce
disputes. The program, called "Family Mediator," combines artificial
intelligence and game theory and uses an electronic or human mediator to
help couples going through a divorce settle problems as fairly as possible.
Family Mediator is based on "Family Winner," a program developed in 2004
that gave each person 100 points to assign to objects or issues in order of
importance. The program would then choose a winner for each category,
starting with the easiest one, or the one with the largest point
difference. The loser of that category would be given extra points, and
the process would continue down the list of items. The problem with Family
Winner was that it did not account for third parties such as children. To
compensate for this problem, Family Mediator uses either a family law
practitioner or an electronic decision support system to ensure that all
decisions are in the best interest of all parties, including any children.
Currently, the programs exist only as research prototypes, but the
developers hope they will soon be commercialized. "We have applied for
a university grant, which if successful will lead as a by-product to a
commercially viable mediation program," says Bellucci, who notes the
software could be used by social workers.
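The Family Winner procedure described above can be sketched in a few lines. This is a rough illustration only: the compensation rule used here (spreading the loser's forfeited points evenly over the remaining items) is an assumption, since the article does not give the program's actual formula.

```python
def allocate(points_a, points_b):
    """points_a, points_b: dicts mapping item -> points (each party
    distributes 100 points across the disputed items)."""
    awards = {}
    items = set(points_a)
    while items:
        # The "easiest" decision first: the item with the widest gap
        # between the two parties' valuations.
        item = max(items, key=lambda i: abs(points_a[i] - points_b[i]))
        a_wins = points_a[item] >= points_b[item]
        awards[item] = "A" if a_wins else "B"
        items.remove(item)
        # Compensate the loser: spread the forfeited points evenly over
        # the remaining items (an illustrative rule, not the real formula).
        loser_pts = points_b if a_wins else points_a
        share = loser_pts[item] / len(items) if items else 0
        for i in items:
            loser_pts[i] += share
    return awards

a = {"house": 60, "car": 10, "savings": 30}
b = {"house": 30, "car": 20, "savings": 50}
print(allocate(a, b))
```

On this toy dispute, the item with the widest valuation gap (the house) is awarded first, and the compensated party then becomes more competitive in later rounds.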
'High-Performance Computing Boot Camp' to Educate
Faculty, Researchers on Capabilities of Information Technology
UVA Today (University of Virginia) (07/30/07) Arco, Andrea
The University of Virginia's High-Performance Computing Boot Camp will
teach faculty, graduate students, and research professionals from a variety
of fields about the basics of high-performance parallel computing, the
national cyber-infrastructure, and how advanced computing could be useful
to their research efforts. "Computational science is one of the most
important technical fields of the 21st century, because it provides a
unique window through which researchers can investigate problems that are
otherwise impossible to address--problems ranging from biochemical
processes to weather patterns," says James H. Aylor, dean of the University
of Virginia's School of Engineering and Applied Science. Participants
will learn the differences between sequential and parallel computing
systems, how to optimize sequential applications, how to locate and access
high-performance computing resources nationwide, and will gain an overall
understanding of the opportunities and challenges presented by data
visualization tools and display technologies. The workshop is in response to the
federal President's Information Technology Advisory Committee's report
"Computational Science: Ensuring America's Competitiveness." The report
highlights the necessity for a comprehensive understanding and
dissemination of technology to maintain scientific leadership, economic
competitiveness, and national security. In 2006, Aylor commissioned a
computational science initiative and task force charged with the objective
of producing a set of recommendations to improve the culture of computation
at the university. In October 2006, the National Science Foundation
awarded the University of Virginia's School of Engineering a two-year
$250,000 grant to develop undergraduate and graduate courses in
computational science.
Taking the Lead
Red Herring (07/30/07) P. 21; Taylor, Marisa
Women in the IT industry--only 18 percent of corporate officers at Fortune
500 information technology companies in 2006 were women, according to
research firm Catalyst--often have difficulty asserting themselves and
voicing their opinions without being considered prickly or unfriendly. To
help women in IT assert themselves in a positive and effective manner, two
Harvard University professors, Lee Warren, a
professor at the Center for Teaching and Learning, and Nancy Houfek, a
theater professor, have run a workshop called "Strong Women/Strategic
Performance," designed to improve women's communication in the work place
and to teach them to get their point across effectively. The idea of a
workshop to help women improve in an area they are supposedly already
skilled in may seem unnecessary, but Warren and Houfek say that women are
so outnumbered that they need to learn more advanced strategies for coping
and getting ahead. The workshop uses theater training, specifically Method
Acting, to help women gain a leg up on their male competition. First,
Houfek and Warren demonstrate physical techniques to improve posture and
relieve tension. Participants also learn vocal exercises used by actors
and singers, and practice inflection, exploring how emphasizing different
words in different tones of voice can change the meaning of a statement. Then
participants act out different scenarios that might happen in the work
place, acting through each one multiple times to find the most efficient
strategy. The women are encouraged to know exactly what they want to
accomplish and to develop a strategy before they enter meetings.
Participants are also taught power techniques like where to situate
themselves in a room.
Biology Proves a Natural for Robotic Design
Bend Weekly (07/27/07) LaFee, Scott
Designers of robotics technology are being inspired by biology, basing
machines and their functions on "fundamental physical principles," says
Vassar College professor John Long. Under development at Carnegie Mellon
University is the HeartLander, a minuscule medical robot designed to
perform delicate heart operations--measurement readings, drug delivery,
device installation, etc.--via remote control while moving like an inchworm
on suction cups, obviating the need for invasive surgery. Another
biologically inspired machine is Clemson University's OCTOR (sOft robotiC
manipulaTORs), a robot with a flexible tubular appendage that mimics the
grasping abilities of an elephant's trunk to manipulate objects; the
appendage is driven by compressed air and outfitted with sensors and a
camera. The Defense Advanced Research Projects Agency, which is funding
OCTOR, is also interested in BigDog, a quadrupedal, semi-autonomous robot
that has potential as a tool for carrying supplies for troops. Vassar
researchers have developed Madeleine, a robot that swims using
remote-controlled polyurethane flippers modeled after those of a marine
reptile. The robot, which is also equipped with sonar, cameras, an
accelerometer, and an altimeter, has been used in experiments to determine
whether two-flipper or four-flipper locomotion is more efficient. Other
robots patterned after organisms include arthropod-inspired six-legged
machines that can run, leap over obstacles, negotiate stairs, and scale
walls and trees. Meanwhile, University of Southern California researchers are
working on a system of modular robots that can link up like hive insects
into cooperative machines capable of standing, crawling, wiggling,
climbing, rolling, and flying.
Satellite Multimedia for Mobile Phones
European Space Agency (07/26/07)
The European Space Agency's Telecommunication Department is encouraging
the development of technology that is needed to use satellite systems to
send digital multimedia such as video, television programs, radio, and data
to mobile phones and vehicle-based receivers. The ability to use
satellites to send content to mobile phones and similar devices will give
content providers an alternative to terrestrial-based networks and will
provide universal coverage and broadcasting. High-powered satellites in
geostationary orbit could be used to broadcast data, and when combined with
Earth-based repeaters, the system could ensure global coverage. Modern
mobile telephones and vehicle-mounted receivers could be easily and
inexpensively adapted to receive the satellite signals. The ESA is
partially funding the development and qualification of key components and
subsystems by European industry and satellite operators. SES Global and
Eutelsat Communications are the first to work
collaboratively toward establishing the infrastructure for S-band
broadcasting of video, radio, and data to mobile devices. Eutelsat has
commissioned the W2A satellite from Thales Alenia Space, to be launched in
early 2009, which will transmit in the S-band, spanning 2 to 4 GHz. The
S-band is a new frequency range for both SES and Eutelsat and is well
suited to the wireless broadcasting of multimedia content.
Academics Seek UAVs That Think for Themselves
Defense News (07/16/07) Vol. 22, No. 28, P. 42; Kington, Tom
Researchers in Europe and Israel are working on creating unmanned aerial
vehicles (UAVs) that use artificial intelligence to "think" independently
without being controlled by humans on the ground. For example, researchers
at the Technion Israel Institute of Technology are using "genetic
algorithms" to develop UAVs that can communicate and coordinate with one
another while in the air. Under this model, a group of three UAVs would be
able to constantly track a suspicious or enemy vehicle driving through a
city, even if the vehicle disappears behind a tall building. "Each UAV
will know the city map, and if one calculates it is about to lose sight of
the target, it will position another UAV to maintain sight while it is
blocked," explains Technion researcher Tal Shima. Similarly, a team of
researchers at U.K.-based Cranfield University is developing a system in
which a UAV flying 500 feet above a town can spot suspicious vehicles or
gunmen. When the UAV spots such a target, it commands a smaller UAV
hovering at rooftop level to swoop closer to the suspicious target; the
detailed information from the smaller UAV is then sent to an unmanned
ground vehicle in the town below, which navigates its way to the target.
The Israeli and U.K. teams of researchers will participate along with 21
other teams in the U.K. Ministry of Defense's 2008 Grand Challenge contest
for autonomous, unmanned vehicles.
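The handoff scheme Shima describes can be caricatured in a few lines of code. The `about_to_lose` flag below is an illustrative stand-in for the real occlusion calculation against the city map, and the fleet structure and UAV names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class UAV:
    name: str
    tracking: bool = False

def handoff(fleet, about_to_lose):
    """If the current tracker predicts it will lose line of sight,
    reposition a free teammate to maintain sight of the target."""
    tracker = next(u for u in fleet if u.tracking)
    if not about_to_lose:
        return tracker.name
    backup = next(u for u in fleet if not u.tracking)
    tracker.tracking, backup.tracking = False, True
    return backup.name

fleet = [UAV("uav1", tracking=True), UAV("uav2"), UAV("uav3")]
print(handoff(fleet, about_to_lose=True))  # uav2 takes over while uav1 is blocked
```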
The Future Is in the Process
SD Times (07/15/07) No. 178, P. 5; Handy, Alex
The future of computing will almost certainly see software put to new
uses as developers create new languages and systems, predicted software
luminaries at the recent Supernova conference in San Francisco. Irving
Wladawsky-Berger, chairman emeritus of the IBM Academy of Technology and a
visiting professor of engineering systems at MIT, believes that although
software development is still in its infancy when compared to physical
engineering systems and practices, people are the biggest challenge facing
businesses today. "If you're going to apply technology to a process,
processes that are deterministic are much easier," he says. "Processes
involving people are far, far more complicated." Other presenters at the
conference demonstrated some of the capabilities of next-generation
software. Google's vice president of engineering Udi Manber displayed some
of the context-sensitive search capabilities of Google's software. "If you
search for types of dogs, we'll give you good results, but in the middle of
the page we'll insert suggestions for 'breed of dogs,' which will give you
better results," Manber says. "Our attempt is to understand queries, and
suggest different formulations of queries that will give better results."
Manber says that Google's contextual awareness is one of the reasons
unskilled users are attracted to Google and are able to improve their
understanding of the Internet. Wladawsky-Berger believes that every
business will need to be able to provide such services in the future. "Any
business you run is constantly in a state of shift, so the architectures
for this very complex system have to be flexible and adaptable," he says.
"That's not where we are today."
Despite Energy Efficient Hardware, Power Usage Rising in
Data Centers
InformationWeek (07/03/07) Gonsalves, Antone
A new white paper from Uptime Institute reveals that power consumption per
computational unit is down 80 percent over the past six years, but
consumption at the plug is still up by a factor of 3.4. Part of the reason
power usage is rising in data centers is that processor manufacturers are
packing more power-hungry chips into the same-size
hardware, which leads to more heat and the subsequent need for more
cooling. Consolidating more server software in a single box can only do so
much. "After virtualization has taken some of the slack out of
underemployed IT hardware, the trend in power growth will resume," Uptime
says in "The Invisible Crisis in the Data Center: The Economic Meltdown of
Moore's Law." Organizations should pay special attention to the total cost
of ownership as they embark on the purchase of new data center servers,
considering that higher power consumption, cooling costs, and other factors
will add $6.54 million to every $1 million spent on servers by 2012.
Organizations could see hardware that reflects a decline in power
consumption in real terms within the next 10 years. Still, using server
virtualization, enabling server power-save features, turning off servers
when no longer in use, trimming bloated software, and improving site
infrastructure energy efficiency ratios can help reduce energy consumption
by as much as 50 percent.
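The two Uptime figures above can be reconciled with simple arithmetic; note that the 17x unit-growth number below is an inference from the cited percentages, not a figure stated in the white paper:

```python
# Per-unit power is down 80 percent (each unit now draws 0.2x), yet total
# at-the-plug draw is up 3.4x, implying roughly 3.4 / 0.2 = 17x as many
# computational units deployed over the six years.
implied_unit_growth = 3.4 / 0.2
print(f"Implied growth in computational units: {implied_unit_growth:.0f}x")

# The cited TCO impact: $6.54M of power, cooling, and site cost added on
# top of every $1M of server spend by 2012.
server_spend = 1_000_000
total_tco = server_spend * (1 + 6.54)
print(f"2012 TCO per $1M of servers: ${total_tco:,.0f}")
```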