ISI Leads $13.8 Million in E-Science Effort to Tame
Terabyte Torrents
USC Viterbi School of Engineering (04/08/07)
The University of Southern California Information Sciences Institute's
Windward project aims to achieve "Scalable Knowledge Discovery through
Grid Workflows," allowing researchers to analyze data as quickly as it is
gathered. Windward will incorporate ideas from industrial engineering,
bringing together machinery and raw materials to create automated
workflows. "Significant scientific advances today are achieved
through complex distributed scientific computations," explains project
leader Yolanda Gil. "These computations, often represented as workflows of
executable jobs and their associated dataflow, may be composed of thousands
of steps that integrate diverse models and data sources." Gil and her
research team plan to achieve Windward's goals by integrating artificial
intelligence and grid computing. ISI project leader Paul Cohen has
extensive experience in using AI systems for complex data analysis and in
developing the Semantic Web, which will be incorporated into the Windward
system. Grid computing will allow the construction of the regional,
national, or even intercontinental AI structures needed for workflow
science. ISI collaborator Ewa Deelman has created a workflow system called
Pegasus, which maps large numbers of computations to distributed resources
while ensuring optimal performance of the application. AI and grid
collaboration has already proved successful in earthquake science at ISI.
Researchers will now create new workflow techniques to represent complex
algorithms and their differences so they can be autonomously selected and
arranged to meet the needs of applications.
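To make the workflow idea concrete, here is a minimal sketch, in Python,
of a workflow represented as a DAG of executable jobs linked by dataflow,
topologically ordered and greedily assigned to grid sites. The Job class,
the greedy placement, and all names are illustrative assumptions of mine,
not Pegasus's actual API.

    # Minimal sketch of a scientific workflow as a DAG of executable jobs
    # (illustrative names only, not Pegasus's actual API).
    from collections import defaultdict

    class Job:
        def __init__(self, name, inputs=(), outputs=()):
            self.name, self.inputs, self.outputs = name, set(inputs), set(outputs)

    def topological_order(jobs):
        """Order jobs so every producer of a file runs before its consumers."""
        producer = {f: j for j in jobs for f in j.outputs}
        deps = {j: {producer[f] for f in j.inputs if f in producer} for j in jobs}
        done, order = set(), []
        while len(order) < len(jobs):
            ready = [j for j in jobs if j not in done and deps[j] <= done]
            order.extend(ready)
            done.update(ready)
        return order

    def map_to_sites(jobs, sites):
        """Greedy placement: send each job to the least-loaded grid site."""
        load, plan = defaultdict(int), {}
        for job in topological_order(jobs):
            site = min(sites, key=lambda s: load[s])
            plan[job.name] = site
            load[site] += 1
        return plan

    jobs = [Job("extract", outputs=["raw"]),
            Job("model_a", ["raw"], ["a.out"]),
            Job("model_b", ["raw"], ["b.out"]),
            Job("merge", ["a.out", "b.out"], ["result"])]
    print(map_to_sites(jobs, ["site1", "site2"]))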
Strategy Consultant at ACM SIGCHI to Address Role of
Mobile Phones in Driving Emerging Markets Development
AScribe Newswire (04/10/07)
The ACM Special Interest Group on Computer-Human Interaction's CHI 2007
conference will conclude with a keynote address from strategy consultant
Niti Bhan, who will discuss the importance of the mobile phone in emerging
markets. Bhan's speech, "The mobile phone as a post-industrial platform
for socio-economic development," will discuss the device's ability to
provide services such as health care, finance, early warning, and disaster
communications for millions of underprivileged people worldwide. In 2006,
the mobile phone became the first communications device to be used by more
people in the developing world than the developed world. Bhan will provide
examples of mobile phones providing remote expert medical consultation and
transactional banking to people who would otherwise have no such access.
She will explain her beliefs that communications technology offers the
ability to close the digital divide between the "haves" and "have nots" of
the world, creating a higher standard of living for everyone. The theme
for CHI 2007, which takes place April 28 to May 3 at the San Jose
Convention Center, is "Reach Beyond," which both celebrates the past and
welcomes the future. More information can be found at
www.chi2007.org.
Carnegie Mellon P2P System Could Speed Movie, Music
Downloads
Carnegie Mellon News (04/10/07) Spice, Byron; Watzman, Anne
Carnegie Mellon University researchers have developed a way to speed up
P2P downloads by using not only identical files, but similar files as well.
Having more possible sources to download from could decrease download
times significantly. Using a process known as handprinting--taken from
techniques used in clustering search results or identifying spam--to
identify files similar to the one being downloaded,
Similarity-Enhanced Transfer (SET) has shown its ability to accelerate the
downloading of MP3s by 71 percent. And SET downloaded a movie trailer 30
percent faster by using files that were only 47 percent similar. "This is
a technique that I would like people to steal," says CMU computer science
professor David G. Andersen. "In some sense, the promise of P2P has been
greater than the reality," as a result of both Internet service providers
limiting the amount of bandwidth available for uploading and users who
throttle their computers' uploading capability to speed their own downloads.
Analysis shows evidence that the files most commonly shared on P2P networks
probably contain many of the same elements. Music files could be identical
but have different artist-and-title headers, for example. Theoretically, a
user downloading a movie translated into German could be downloading the
video portion from the English version and the audio from the German. SET
works the same way as BitTorrent, by breaking a source file into many
smaller pieces that are simultaneously downloaded from sources with the
identical file, but unlike BitTorrent SET keeps looking for similar files
and downloads matching pieces.
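The handprinting step lends itself to a short sketch. The construction
below (fixed-size chunks, with the k smallest chunk hashes kept as the
"handprint") is one plausible reading of the technique; the chunk size,
k, and function names are my assumptions, not SET's actual code.

    # Illustrative handprinting sketch (assumed construction, not SET's):
    # hash fixed-size chunks and keep the k smallest hashes as a compact
    # similarity key; files whose handprints intersect likely share chunks.
    import hashlib

    CHUNK = 16 * 1024   # assumed chunk size in bytes
    K = 8               # assumed handprint size

    def chunk_hashes(data):
        return [hashlib.sha1(data[i:i + CHUNK]).hexdigest()
                for i in range(0, len(data), CHUNK)]

    def handprint(data, k=K):
        return set(sorted(chunk_hashes(data))[:k])

    def likely_similar(a, b):
        """Candidate sources are files whose handprints overlap at all."""
        return bool(handprint(a) & handprint(b))

    a = b"example media file contents " * 2000
    b = a[:20000] + b"retagged header" + a[20000:]
    print(likely_similar(a, b))   # shared chunks -> overlapping handprints

A downloader would then compare the full chunk-hash lists of matched
candidates and fetch only the chunks identical to its target file's.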
Efficient Hardware Repair
Technology Review (04/10/07) Ross, Rachel
University of Illinois at Urbana-Champaign computer science professor Josep
Torrellas is developing a system dubbed Phoenix designed to fix defective
computer chips using downloadable patches. Phoenix is based on field
programmable gate array technology that sits on the chip and can identify
defects and provide solutions. Torrellas' system works like an antivirus
program, and if it finds a defect the manufacturer could instantly transmit
the necessary patch to all affected machines. Included in the patch is a
defect signature that identifies the cause of the problem. After being
installed, the patch reprograms Phoenix to look for the defect signature
and prevent a crash. Phoenix technology could allow manufacturers to
produce chips faster, knowing that any problems could be fixed with
patches. Although Torrellas is not the first to design a hardware patch
system, he claims that his is the most efficient and can address more
problems than other systems. The system is not able to remedy all hardware
problems, but it can address those bugs that would result in a crash.
Torrellas' team studied past problems with major manufacturers' chips and
decided which problem areas Phoenix would focus on, such as memory
subsystems. Despite the technology's ability to detect and fix bugs,
vendors may find it too time-consuming and costly to implement. However,
Torrellas believes that Phoenix will prove its worth,
since "there is more scope for miscommunication" as "bigger teams are
designing the processors."
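The defect-signature mechanism can be pictured with a small sketch. The
signal names and matching logic below are hypothetical, invented for
illustration; Phoenix itself implements this in on-chip programmable
logic rather than software.

    # Hypothetical sketch of defect-signature matching: a downloaded patch
    # lists the hardware conditions that together trigger a known defect;
    # the monitor watches for that combination and fires a workaround.
    DEFECT_SIGNATURES = {
        "errata_042": {"cache_writeback", "tlb_miss", "low_voltage_mode"},
    }

    def matched_defects(active_signals):
        """Return every defect whose full signature is currently present."""
        return [name for name, conditions in DEFECT_SIGNATURES.items()
                if conditions <= active_signals]

    for defect in matched_defects({"cache_writeback", "tlb_miss",
                                   "low_voltage_mode"}):
        print(f"signature {defect} matched: applying workaround before a crash")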
Game Developers Adapt to Multicore World
CNet (04/11/07) Krazit, Tom
After years of using patches to make software written for single core
processors take advantage of multicore architectures, video game developers
have begun taking multicore into consideration from the beginning of
development. Intel has recently announced the release of software
development tools to help developers make the switch to multicore. When
parallel processing first appeared in consumer machines in 2005, the games
being released had been in development for several years. At that point
the video game industry was at a "D-minus" in programming for multicore,
said Intel gaming director Randy Stude. "I'd say we're at a 'C-plus' right
now." Intel and AMD both made significant efforts to promote simple ways
of making games use parallel computing. "It won't give the same kind of
performance, but it's going to help, and it's better than nothing,"
explains Jon Peddie Research's Ted Pollak. There are currently about 25
games on the market that were built with multicore in mind. "We feel it's
a choice you have to make from the outset," said THQ representative Ben
Collier. However, Intel believes the move to multicore is inevitable as
all PCs are expected to be at least dual-core in the near future, while
quad-core PC chips are on the way. "The learning curve is becoming less
and less to get threading work done," says Intel's Stude.
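As a rough illustration of what that "threading work" can look like, the
sketch below runs independent per-frame subsystems on a thread pool
instead of one serial loop. The subsystem functions are placeholders of
my own; real engines partition work far more carefully.

    # Illustrative task-level threading for a game frame (placeholder
    # subsystems, not any real engine's code): independent jobs such as
    # physics, AI, and audio run in parallel, then join before rendering.
    from concurrent.futures import ThreadPoolExecutor

    def update_physics(dt): return f"physics advanced by {dt}s"
    def update_ai(dt):      return f"ai advanced by {dt}s"
    def mix_audio(dt):      return f"audio advanced by {dt}s"

    def frame(pool, dt):
        jobs = [pool.submit(task, dt)
                for task in (update_physics, update_ai, mix_audio)]
        return [job.result() for job in jobs]   # join before render

    with ThreadPoolExecutor(max_workers=4) as pool:
        print(frame(pool, dt=0.016))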
Assistive Robot Adapts to People, New Places
MIT News (04/09/07) Trafton, Anne
MIT researchers have developed Domo, a robot that can interact with humans
as well as pick up unfamiliar objects and place them on a shelf. Domo
represents the kind of technology that could one day assist the elderly or
work in fields such as agriculture or space travel. "The real potential of
robots in the future is going to be realized when they can do many types of
manual tasks," says Domo contributor Aaron Edsinger. Unlike assembly
machine robots, intelligent robots would not have to be placed in a
controlled environment. "We want the robot to adapt to the world, not the
world to adapt to the robot," Edsinger says. Domo's cameras relay
information to 12 computers that analyze what is seen and choose what to
focus on, such as unexpected movements or a human face. If the robot is
told to place an object on a shelf, it uses one hand to feel for the shelf
and the other to reach for the object. Once Domo has hold of the object,
it finds the tip and wiggles it slightly to gauge the object's size and
the best way to transfer it to the other hand or place it on the shelf.
To make it safe for human interaction,
Edsinger's team put springs in Domo's arms, hands, and neck that let it
feel pressure when a person touches it. The researchers believe that
robots and humans working together could do things that neither could do
separately. "If you can offload some parts of the process and let the
robot handle the manual skills, that is a nice synergistic relationship,"
he says. "The key is that it has to be more useful or valuable than the
effort put into it." Rather than having a single robot housekeeper, the
home of the future is expected to have many specialized robots.
More Power to Google
eWeek (04/06/07) Taft, Darryl K.
In a talk entitled "Watts, faults, and other fascinating 'dirty' words
computer architects can no longer afford to ignore," Google engineer Luiz
Barroso explained the company's desire to achieve optimal data efficiency
for its data centers. "Power/energy efficiency and fault-tolerance are
central to the design of large-scale computing systems today," said
Barroso. "And technology trends are likely to make them even more relevant
in the future, increasingly affecting smaller-scale systems." Building a
data center costs more than powering it for 10 years, so the goal is to
build them to use as few watts as possible, since price is directly related
to watts. In its research of its own data centers, Google found that warm
temperature was not a significant factor in failures. In its efforts to
reach optimal usage, Google has accepted that single-thread performance
"crashes into a power wall," and that distributed shared memory systems can
be crippled by "a single fault," Barroso said. He imagined a day when
power companies gave away servers for free so long as the customer signed
an energy contract. Google is currently working with partners, including
Intel and AMD, "to create open standards for higher-efficiency power supply
units," Barroso said. The applicability of multicore processors and
growing parallelism is subject to the ability of programmers to develop
efficient and concurrent programs. However, this could be easier for
Google, since it is working with an extremely large amount of data. Along
with fault-tolerant software, Google uses System Health Infrastructure to
provide additional monitoring of servers. This technology may be
open-sourced, but "some of this is infrastructure and we build it so
intertwined with other software we have that it's hard to pull things
apart," Barroso said.
Social Computing Study to Track College Students'
Networking Habits
Rochester Institute of Technology (04/04/07)
Rochester Institute of Technology professor Susan Barnes is developing an
undergraduate online course in social computing, and it will also serve as
a case study of technology and social networking. Barnes, a professor of
communication who is the co-director of the Laboratory for Social
Computing, will head a team of researchers from RIT's College of Computing
and Information Sciences and College of Liberal Arts in preparing the
course for spring 2008 and about 90 students. An emerging discipline,
social computing (or social media) involves the use of software for social
and organizational collaboration; its tools include email, instant
messaging, the interactive Web, and blogs. For the course, students will be
required to complete social computing assignments in text-based myCourses,
visuals-heavy Second Life, and a third environment that is open source and
can be modified, and the researchers will analyze how the students network
and solve problems in the various settings. "How students interact in
different environments will tell us a lot about what's going on in online
education," says Barnes. The project is made possible by a two-year grant
from the National Science Foundation.
Uncle Sam Asks Kids to Be Inventors
EE Times (04/10/07) Merritt, Rick
A Department of Commerce initiative is targeting children ages 8 to 11 in
hopes of staving off an impending decrease in U.S. global competitiveness.
Several Commerce advertisements will lead children to
www.InventNow.org, which aims to stir up interest in innovation among
children by showing them videos of other children inventing devices to
solve problems. "In an innovation-driven economy, the key to our future
success and competitiveness lies in making sure we are sharing America's
culture of innovation with our young people," said Commerce Secretary
Carlos M. Gutierrez. The U.S. Patent and Trademark Office is optimistic
that the ads will inspire children who see them to become "more inventive;
explore math, science, and other creative fields," according to USPTO
director John Dudas. Given the recent fall in undergraduates pursuing
computer science, "Jobs will go begging in the next few years because we
don't have the people willing to take on the field," says Microsoft Research VP
Rick Rashid. Rashid says the number of U.S. computing PhD candidates could
be 50 percent lower by 2010. "We are at a low point of interest in computer
science," Rashid says. The advertising effort is a product of USPTO's work
with the National Inventors Hall of Fame Foundation. The two will also
collaborate on a summer camp and club for young inventors.
Protecting Electronic Information From Theft and Abuse Is
the Goal of Virginia Tech CAREER Research
Virginia Tech News (04/09/07)
Virginia Tech researcher Patrick Schaumont has been awarded a prestigious
NSF grant to fund his efforts to improve information security in computing
devices. The NSF Faculty Early Career Development Program (CAREER) Award
is the top honor given to promising young researchers and includes a
five-year, $400,000 grant. Schaumont says that as more and more
information is being stored on portable computers such as the electronic key
fob used to unlock a car door or the image of a signature in an electronic
passport, encryption technology has not kept pace in protecting the data stored on
portable devices. "Computers of all sizes can be stolen," says Schaumont.
"The way we use computers everyday is changing, so we need to rethink how
to safely store information." He intends for his CAREER project to produce
a methodology by which secure embedded systems can be designed. Such
innovation would allow protection of information in cell phones, RFIDs, and
copyrighted materials, such as audio files on portable devices. For the
mandatory educational element of his CAREER project, Schaumont plans to
expose students to hardware-software co-design--the joint development of
hardware and software in an embedded system.
IT Jobs: Tapping Teens to Fill the Gap
SearchCIO.com (04/05/07) Tucci, Linda
A Memphis summer program has had great success in stirring up interest in
technology among students. After realizing that college was too late to
get students interested in a career in computing, Society for Information
Management (SIM) founder John Oglesby collaborated with a Memphis library
to create Teen Tech Week. "We put together a program with SIM helping
guide the curriculum, and the library doing all the heavy lifting," said
Oglesby. Students ages 12-15 can apply for the program, which kicked off
with an orientation for students and parents that explained the
opportunities available due to the current shortage of IT workers. Each
program day began with a presentation by a SIM member and then introduced
the "bright shiny object" of the day, new technology intended to attract
attention and interest. The culmination of the project was a webcast for
the library's "Teen Web Page." Three years later, the program is heavily
codified and has spread to other cities. Meanwhile, public schools in
Naperville, Ill., started their own IT curriculum and certification program
almost 10 years ago, but within five years realized that enrollments had fallen
from 100 students to eight students, and that only 1 percent of the
students were passing the Cisco certification test administered by the
program. Organizers realized that they had done the students "a
disservice," says Naperville school technology specialist Brett Thompson,
who then wrote a graduate thesis inspiring schools nationwide to change the
way they teach technology. The Naperville program then began using
materials from the Computing Technology Industry Association (CompTIA), and
enrollment has increased by a factor of five in the networking class alone.
Students who have received certification can now make money repairing
computers and in summer jobs. "Even if the kids don't pursue an IT career,
at least they are smarter consumers," says Thompson.
The Future of Learning
Duke University News & Communications (04/06/07) Hicks, Sally
Duke University will host the Future of Learning Conference, an event that
seeks to close the gap between the digital world and the traditional
classroom setting. The April 19-21 conference, which hopes to spark
dialogue by bringing together educators, public officials, and
intellectuals, will address the fact that while technology has impacted
nearly every aspect of our lives, the classroom remains unchanged.
"There's an incredibly energetic, rich way of learning at home, and then
kids go to school and it's standardized," says event coordinator Cathy
Davidson. "But the education in formal learning environments can be as
exciting as what they're learning at home." The conference's keynote
address will be given by former Xerox chief scientist John Seely Brown, a
supporter of collaborative education and learning tools that engage
students rather than treating them as passive observers. "With every new
piece of technology, to make this technology work, you have to change your
teaching practices," says Seely Brown. "Part of it is (thinking about) how
to go from sage on the stage to being a real mentor." On the conference's
final day, several "digital visionaries" will convene to discuss topics
including universal access and intellectual property. "People who are
learning in an Internet age are learning in different ways," says Davidson.
"If you were born after 1991, you don't know there was another way of doing
things." The event is part of an international conference on the
humanistic aspects of technology put together by Duke's Humanities, Arts,
Science, and Technology Advanced Collaboratory (HASTAC).
Compilers and More--What to Do With All Those
Cores?
HPC Wire (04/06/07) Vol. 16, No. 14, Wolfe, Michael
The Portland Group compiler engineer Michael Wolfe recalls that at the
Principles and Practice of Parallel Programming conference, Purdue
University professor Rudolf Eigenmann cast a baleful eye on the parallel
programming community, lamenting the lack of a widely accepted technique to
generate parallel applications despite three decades of research. His
views were countered by keynote speaker Dr. Andrew Chien of Intel, who
argued that massively parallel systems and the applications that run on
them are a testament to the research community's success with parallel
programming. When Wolfe pointed to the need for breakthrough innovations,
Chien explained that parallel programming now faces an entirely different
target environment, class of programmer, and set of expectations.
One solution offered for the problem of synchronizing parallel threads or
activities is transactional memory, which entails entering a transaction,
executing updates, and committing the changes; this model requires the
implementation to buffer the modifications until the commit and then apply
them all at once, but the manner of implementation has yet to be locked
down. This and other
challenges can be tackled in managed software environments, but there is
uncertainty as to how much time must pass before transactions can move into
high-performance computing. There are also a lot of unresolved matters
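A toy version of that buffer-until-commit model might look like the
following; this is an illustrative software sketch guarded by a single
lock, not any production transactional memory system.

    # Toy buffer-until-commit transaction (illustrative only): updates made
    # inside the transaction stay private until commit, when they are
    # published atomically under a lock.
    import threading

    class Transaction:
        def __init__(self, shared, lock):
            self.shared, self.lock, self.buffer = shared, lock, {}

        def write(self, key, value):
            self.buffer[key] = value              # staged, invisible to others

        def read(self, key):
            return self.buffer.get(key, self.shared.get(key))

        def commit(self):
            with self.lock:                       # all updates land at once
                self.shared.update(self.buffer)
            self.buffer.clear()

    shared, lock = {"x": 0, "y": 0}, threading.Lock()
    txn = Transaction(shared, lock)
    txn.write("x", 1)
    txn.write("y", 2)
    txn.commit()                                  # both become visible together
    print(shared)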
concerning multicore processing architectures, with designs such as an
array of minuscule, low-power cores on the chip or one or two large,
power-guzzling cores enclosed by smaller, lower-power cores being bandied
about. "My summary of all the hype for [general purpose graphics
processing units] is that processors or coprocessors unconstrained by
compatibility requirements, with the freedom to redesign to the latest
technology, can deliver higher performance than general purpose CPUs,"
writes Wolfe. He explains that massively parallel systems could show up in
workstations or laptops before 2020, so there is no time like the present
to start brainstorming productive uses for such systems.
Boffins Say Thin Clients Emit Less CO2
Techworld (04/05/07) Betts, Bryan
A new report from the Fraunhofer Institute in Germany indicates that the
use of thin clients could save British businesses 78 million pounds in
electricity and reduce CO2 emissions by 485,000 tons a year. The research
assumes the replacement of 10 million desktop PCs in the United Kingdom,
and it takes into consideration the extra energy costs of servers needed
for thin-client computing. Dr. Hartmut Pflaum says a thin client, including
its share of the server, consumes about 40W to 50W, compared with about 85W
for a PC.
The Fraunhofer Institute considers the research timely because of the
growing focus on climate change and the need to lower CO2
emissions. "Gartner says PCs contribute half a percent to global CO2
emissions so if we remove half of that it is very significant," says
Stephen Yeo of Igel Technology, the maker of the thin clients used in the
study.
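The report's headline numbers can be roughly reconstructed. In the sketch
below the usage hours, grid carbon intensity, and electricity price are my
assumptions, not figures from the Fraunhofer study; only the seat count
and wattages come from the article.

    # Rough reconstruction of the savings arithmetic (assumed inputs).
    SEATS = 10_000_000        # desktop PCs replaced (from the report)
    PC_W, THIN_W = 85, 45     # per-seat draw, thin client incl. server share
    HOURS = 2500              # assumed annual usage hours per seat
    KG_CO2_PER_KWH = 0.43     # assumed UK grid carbon intensity
    GBP_PER_KWH = 0.09        # assumed electricity price

    saved_kwh = SEATS * (PC_W - THIN_W) / 1000 * HOURS
    print(f"energy saved: {saved_kwh / 1e9:.1f} billion kWh/year")
    print(f"CO2 avoided:  {saved_kwh * KG_CO2_PER_KWH / 1000:,.0f} tons/year")
    print(f"cost avoided: {saved_kwh * GBP_PER_KWH / 1e6:.0f} million pounds/year")
    # With these inputs the results land near the report's 485,000 tons of
    # CO2 and 78 million pounds per year.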
For Him, Scrabble Is a Science
Boston Globe (04/09/07) Baker, Billy
Jason Katz-Brown, a 20-year-old junior studying computer science at MIT,
has developed Quackle, an artificial intelligence program that plays
Scrabble competitively the way he does, though it often loses to
humans. Katz-Brown, who has emerged as a top-flight Scrabble player over the
past three years, and Quackle created a stir last November when the
computer program defeated a former world champion in the finals of a human
versus computer tournament in Toronto. Like highly competitive Scrabble
players, Katz-Brown has memorized every word in the Scrabble dictionary, and
says within a few seconds of looking at his rack he is able to find
potential seven-letter words. Quackle has the mathematical prowess to
respond to the game as new words are played, but luck is more of a factor
in Scrabble, giving it a third dimension that chess lacks. According to
many experts, Quackle is unable to match top human players in the ability
to gauge future moves based on unseen tiles, the score, and the layout of
the board. If such "look-ahead" analysis is performed too early in the
game, "you're wasting your time because there's too much randomness ahead
of you," says Joel Sherman, who says he beats Quackle about half the time.
Katz-Brown plans to study the way Quackle plays and apply his findings to
"maximize the luck" of the computer program.
Biggest Threat to Internet Could Be a Massive Virtual
Blackout
National Journal's Technology Daily (04/05/07) Noyes, Andrew
A distributed denial of service attack presents the biggest danger to the
Internet in the 21st century, according to ICANN's Susan Crawford.
Speaking at a Hudson Institute briefing, Crawford said the Feb. 6 zombie
attack on six root-zone servers called attention to the fact that such
servers have little or no oversight. To reduce the risk of DDoS attacks,
the number of zombie computers must be reduced, but "people are turning
millions of PCs into weapons ... and we don't have a lot of data about what
is happening," said Crawford. "Researchers are often operating in the
dark." DHS has shown an inability to address this danger, she added.
"They're trying, but many of their efforts lack timeframes for completion."
Crawford does not believe legislation could prevent DDoS attacks, because
Congress' reach "is too local for the networked age." The best solution
would be to focus money and attention on potential global educational
initiatives, perhaps through the founding of a multi-stakeholder body with
a "new, friendly-acronym," she said. ICANN's power is overly based on
contracts and is not wide enough to have the necessary impact, and the
Internet Governance Forum is "highly political" and "not necessarily the
best forum for a technical discussion of best practices," claimed Crawford.
She named routing security as an important future consideration, because
the ability of hackers to place false paths in a routing system to obtain
packets or spur a DDoS attack increases as "routing tables" grow in size to
meet the needs of IPv6.
The H-1B Limit
InformationWeek (04/09/07) No. 1133, P. 29; McGee, Marianne Kolbasuk
With the H-1B visa cap met in a single day and the attention-hogging
presidential election only one year away, the time seems to be ripe for new
H-1B legislation. A bill introduced by Sens. Chuck Grassley (R-Iowa) and
Dick Durbin (D-Ill.) would prohibit outsourcing of H-1B or L-1 workers to
other companies; require that H-1B employers make "good faith" efforts to
hire U.S. workers; and require that jobs and H-1B applications be posted on
the Department of Labor Web site. Opponents of an increase in the visa cap
insist that companies are abusing the visas by bringing workers to the
United States for training, only to ship them overseas to work, so many
software vendors who do not use the visas for this purpose are trying to
differentiate themselves from those who do. Over the last four quarters,
the U.S. IT unemployment rate has been 2.3 percent, according to the
Bureau of Labor Statistics, compared with a 2.2 percent rate across all
management and professional jobs. The number of IT jobs is up only 1
percent from 2001 and 5 percent from 2003. Thanks to exemptions for those
working at universities and other nonprofits, about 120,000 new H-1B visa
holders enter the United States each year, but the 10 biggest H-1B
applicants in 2006 were India-based IT companies, says Rochester Institute
of Technology's Ron Hira. Hira supports the Grassley-Durbin bill, since it
should eliminate the need for a cap increase, and believes the way to
improve the country's ability to retain skilled workers is through green
cards, although the current system is slow and subject to quotas. With
politicians very likely to turn their attention to presidential and
reelection campaigns next year, H-1B reformers understand the importance of
getting legislation passed soon.
Mixed Feelings
Wired (04/07) Vol. 15, No. 4, P. 152; Bains, Sunny
Expanding our range of sensory input by tapping the neuroplasticity of the
human brain is the focus of research in labs around the world. The key
lies in determining the manner in which the sensory data should be changed
into a form that the brain is already programmed to receive, such as taste,
sound, touch, and visual imagery. Tackling the problem of spatial
disorientation was the motive behind the development of the Tactile
Situation Awareness System, a garment equipped with vibration elements
that tell the wearer which way is down. A more advanced model, the Spatial
Orientation Enhancement System, can make flight intuitive by triggering
vibrations on specific parts of the body in response to a change in
direction or orientation. Neuroscientists at Wicab, founded by the late
researcher Paul Bach-y-Rita, have developed a "tactile display" consisting
of an electrode-studded mouthpiece connected to a pulse generator that
triggers electric current against the tongue. Testing has demonstrated
that the device can restore balance to people with inner-ear disorders by
generating a tactile square that moves in relation to the user's movements.
The usability of sensory prosthetics ultimately depends on achieving a
greater understanding of the brain's information processing
capabilities.
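One way to picture such a mapping: choose which vibration element to fire,
and how strongly, from an accelerometer's gravity vector. The motor count
and formulas below are an assumed design for illustration, not the actual
firmware of the systems described above.

    # Assumed orientation-to-vibration mapping (illustrative only): fire
    # the torso motor nearest the direction of tilt, harder for larger
    # departures from upright.
    import math

    N_MOTORS = 8   # vibration elements spaced evenly around the torso

    def motor_for_tilt(ax, ay):
        """Map the horizontal component of gravity to a motor index."""
        angle = math.atan2(ay, ax) % (2 * math.pi)
        return round(angle / (2 * math.pi) * N_MOTORS) % N_MOTORS

    def intensity(ax, ay, az):
        """Vibrate harder the further the wearer leans from vertical."""
        tilt = math.atan2(math.hypot(ax, ay), abs(az))
        return min(1.0, tilt / (math.pi / 2))

    ax, ay, az = 0.5, 0.0, 0.87   # e.g. a moderate lean in one direction
    print(motor_for_tilt(ax, ay), f"{intensity(ax, ay, az):.2f}")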