Wall Street's Collapse May Be Computer Science's
Gain
Computerworld (09/26/08) Thibodeau, Patrick; Weiss, Todd R.
The recent collapse on Wall Street may make a career in computer science
or IT more attractive to students, who largely left those fields following
the dot-com bust of 2001. Stanford University computer science department
chairman William Dally says students are returning to computer science
because they like the field and not necessarily because it can make them
rich. Boston College professor John Gallaugher says he has already seen a
change in student interest, with many students contacting him to express
interest in switching from finance. Following the dot-com
bust, computer science enrollment declined until it reached a low of 8,021
last year, down from 14,185 in 2003-2004, according to the Computing
Research Association (CRA). Meanwhile, offshore outsourcing also scared
students into avoiding technology careers. Now, companies are suffering
from a shortage of technology professionals, and the looming baby boomer
retirements will only add to the problem. CRA analyst Jay Vegso says
economic conditions appear to influence students' choice of major, and
students currently choosing majors may be looking for
safer alternatives. Stevens Institute of Technology's Howe School of
Technology Management associate dean Jerry Luftman says the major
difference between today and the late 1990s is the type of student that
businesses need. While technical skills are important, Luftman says
companies also want students with management and industry training, strong
communication abilities, and marketing and negotiation skills. The U.S.
Bureau of Labor Statistics reports that IT jobs are among the fastest
growing; openings for network systems and data communications analysts are
expected to reach 402,000 by 2016, up from 262,000 in 2006.
Keeping Computing Compatible
ICT Results (09/25/08)
The European Union-funded Semantic Interfaces for Mobile Services (SIMS)
project is designing a development toolkit for creating software for widely
distributed and highly interactive devices. The researchers say the new
tools will hasten the design and validation of software and services
guaranteed to interact smoothly as distributed computing becomes
universally prevalent. SIMS project coordinator Richard Sanders says when
SIMS-inspired services are widespread, devices will interact seamlessly,
update themselves automatically, and provide users with the ability to
implement new services that are guaranteed to work immediately. "If you
have communicating software and the communication is important, you want to
make sure it works when it interacts with other software," Sanders says.
"SIMS provides the tools to check those scenarios and actually guarantees
compatibility." To create a fully integrated system of distribution and
connectivity, the researchers developed a model that uses semantic interfaces
to specify what goals need to be realized and how components of the system
need to behave and interact to accomplish those goals. Semantic interfaces
detail what kinds of connections, exchanges, and results are meaningful and
useful within a particular domain in a highly structured way. Developers
can use this information to create computer code to run devices directly,
ensuring that the code will work with all the components of a system. The
SIMS researchers believe that using their approach and tools will prevent
most of the interaction errors that plague and frustrate users today.
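The article gives no notation for these semantic interfaces, but the
underlying idea — declaring which interactions are valid and which goal
states a component can reach, so that incompatible combinations are rejected
before deployment — can be sketched with ordinary types. The following is a
hypothetical Haskell illustration; all names and states are invented for
this example and are not taken from SIMS:

    {-# LANGUAGE EmptyDataDecls #-}

    -- Protocol states of a service session; reaching GoalReached is
    -- the "goal" the interface promises.
    data Idle
    data Connected
    data GoalReached

    -- A session tagged with its current protocol state. Hiding the
    -- constructor in a real library stops clients forging states.
    newtype Session s = Session String

    -- Each operation declares the state it requires and the state it
    -- guarantees, so an invalid ordering of interactions (for example,
    -- calling 'exchange' before 'connect') fails to compile.
    connect :: Session Idle -> Session Connected
    connect (Session peer) = Session peer

    exchange :: String -> Session Connected -> Session Connected
    exchange _msg s = s

    finish :: Session Connected -> Session GoalReached
    finish (Session peer) = Session peer

    -- A well-typed client is, by construction, a compatible scenario.
    scenario :: Session Idle -> Session GoalReached
    scenario = finish . exchange "hello" . connect

Checking compatibility before anything runs, as in this sketch, is one
plausible reading of the guarantees the SIMS tools are said to provide.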
Open Source Could Fix E-Voting Flaws, California
Secretary of State Says
Network World (09/25/08) Brodkin, Jon
California Secretary of State Debra Bowen says open source software could
help fix some of the flaws in electronic-voting systems. Bowen says
e-voting software would benefit from greater scrutiny, noting that
privileged information about voting software flaws is not easily examined
by the public or even the county workers tasked with purchasing voting
machines. She says that in many cases the people purchasing the machines
cannot verify their reliability. "We're basically asking a county IT
professional, who may or may not have any experience in crypto-security, to
purchase a system," Bowen says. "In most cases, the person who does the
purchase has no legal right to review the software, even if they knew what
they were reviewing." Bowen says open source software could help design
more effective ballots. A review of California's voting technology found
security flaws in every voting system, including touch-screen machines and
machines that scan paper ballots. Bowen wants to move away from
direct-recording e-voting machines, which typically require voters to use
touch screens to vote, because they lack a means to independently verify
results. Instead, Bowen favors optical-scanning machines with paper
ballots, which can be hand counted if necessary.
Craig Mundie's Cloud Vision
Technology Review (09/25/08) Naone, Erica
Microsoft chief research and strategy officer Craig Mundie says cloud
computing will transform personal computing by shifting computer processing
and storage away from desktop computers and onto Internet-based distributed
computers. Mundie believes the next big platform shift will be the
composite platform, in which the Internet platform is united with the
evolved client platform. He says this composite platform will create a
uniform programming architecture that spans the client and the cloud. Many of the basic
tenets of how people write programs will no longer work, as traditional
procedural programming languages tend to mask or eliminate the inherent
parallelism in many problems as a byproduct of the language's structure.
Mundie says better tools are needed for debugging and program proof, or
verifying that a program's algorithms function as intended, to put new
applications together. These applications will unite all of the
intelligent clients in consumers' lives. Many consumers own cell phones,
laptops, cars, TVs, game consoles, and other devices, but it is largely
left up to the user to get these devices to work together and make the same
content appear on all of them. Consumers want to be able to access their
music or other content no matter what device they happen to pick up or be
in front of. Mundie says a cloud platform that complements the evolving
client or multiclient platform would provide common data services and
orchestration processes.
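Mundie's point about procedural languages masking parallelism is commonly
illustrated by contrast with functional code, where a pure map over
independent elements can be evaluated in parallel without changing the
program's meaning. A minimal Haskell sketch (not from the article; it
assumes the standard parallel package and GHC's threaded runtime):

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- A pure, independent per-element computation: no side effects,
    -- so elements may be evaluated in any order, or all at once.
    expensive :: Int -> Int
    expensive n = sum [1 .. n * 100000]

    -- Sequential version: a left-to-right loop in disguise, which is
    -- how a procedural language would typically express it.
    sequentialTotal :: [Int] -> Int
    sequentialTotal = sum . map expensive

    -- Parallel version: identical meaning, but 'parMap' lets the
    -- runtime evaluate the elements on separate cores.
    parallelTotal :: [Int] -> Int
    parallelTotal = sum . parMap rdeepseq expensive

    main :: IO ()
    main = print (parallelTotal [1 .. 8])

Because 'expensive' is pure, swapping 'map' for 'parMap' cannot change the
result, which is exactly the latent parallelism a procedural loop hides.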
Grids: More Than Just an Infrastructure
AlphaGalileo (09/26/08)
Researchers from 50 countries are demonstrating how the Enabling Grids for
E-sciencE (EGEE) infrastructure is being used in new ways at this week's
5th annual EGEE conference in Istanbul. Scientific research from particle
physics to geology has dominated grid resources over the past four years,
but this year the Italian project ArchaeoGRID has used the grid to study
issues involving social science. The team studied the rise and fall of
societies through history, the factors that contributed to global change,
and the impact of humans on the environment. Climate change research has
benefited from grid resources such as EGEE's Earth Sciences Cluster and the
ArchaeoGRID. The Worldwide LHC Computing Grid (WLCG), which will run up to
300,000 programs, or jobs, per day for the Large Hadron Collider, has been
key to the grid's development. The
bioinformatics community is using the EGEE infrastructure for its WISDOM
project, which is focused on developing new drugs to fight malaria and
avian flu. Meanwhile, the medical community plans to use the technology to
access medical data while patients remain anonymous.
Art and Science, Virtual and Real, Under One Big
Roof
New York Times (09/23/08) P. D2; Overbye, Dennis
The Rensselaer Polytechnic Institute (RPI) has unveiled the Experimental
Media and Performing Arts Center (Empac), a $200 million facility that
features 220,000 square feet of theaters, studios, and work spaces all
connected to a supercomputer. Empac's virtual reality technology will
enable scientists to immerse themselves in data and experience sensations
such as diving through a breaking wave or closely inspecting twists in a
DNA molecule. Some scientists say the new center could eventually be used
to create a version of the Star Trek holodeck where people can interact
with life-sized virtual people and creatures. Other scientists plan to
teach surgery through virtual procedures, or take doctors on tours through
models of actual hearts and circulatory systems. "Nothing can be compared
to this," says RPI president Shirley Ann Jackson. "To our knowledge, there
is nothing else like it." Computer scientist Jaron Lanier, a visiting
scholar at the University of California, Berkeley, says that while the idea
of a virtual reality theater has existed since the 1990s, Empac represents
a major leap in commitment and ambition. Empac will be linked to RPI's
Computational Center for Nanotechnology Innovations, which has an IBM Blue
Gene supercomputer powered by 32,768 parallel processors.
New Technologies for Better Network Management
Cellular-News (09/25/08)
The EUREKA-funded CELTIC MADEIRA project has successfully demonstrated new
ways of managing large telecommunications networks through a Web-based
interface. The project focused on developing a platform that enables
advanced network management both from a central computer and, cooperatively,
from distributed elements such as individual computers. Ericsson Ireland's
Liam Fallon says the MADEIRA project resulted in two significant
achievements. "The first achievement was that the project gave the
participants the opportunity to try out ways of applying distributed secure
network management to real telecommunication scenarios," Fallon says. "The
second achievement was that project participants actually set up and worked
with the management system, with running applications and a Web-based
interface to communicate with the outside world." The project built a
prototype management system that uses peer-to-peer technology to enable the
network to configure itself. The prototype was distributed over labs in
Ireland, Austria, Spain, and Sweden. Project work packages addressed
network architecture, platform technology, self-aware management, and data
modeling and management. The project also worked on developing prototype
applications to test and demonstrate the envisaged concepts, resulting in a
system that supports adaptable services and the management of network
elements of increasing scale, heterogeneity, and transience.
Robot Assistant Gives Surgeons a Cutting Look
New Scientist (09/24/08) No. 2674, P. 21
Researchers from the Hamlyn Center for Robotic Surgery at Imperial College
London have integrated eye-tracking technology into a da Vinci surgical
robot in an effort to provide surgeons with additional assistance when
positioning instruments such as endoscopes or lasers. Using the
technology, surgeons would be able to control instruments with their gaze.
The device shines an infrared LED on each eye, uses cameras to track the
movement of the pupil, and determines where the surgeon is looking based on
the "glint" of reflected light on the cornea. The data is calculated to
move instruments to different positions on the patient. Surgeons would
activate the device with a foot pedal. The team plans to improve on the
eye-tracking technology's current accuracy of within 3 millimeters, and
may present its results at the IROS 2008 conference in
Nice, France, at the end of September. "It could be useful in
cardiovascular or gastro-intestinal surgery, which requires lots of complex
maneuvers," says researcher Guang-Zhong Yang.
10 Future Shocks for the Next 10 Years
IDG News Service (09/23/08)
The next 10 years promise many advances in computer technology. As the
cost of power and space continues to rise, cloud computing will play an
increasingly large part in enterprise computing, with companies looking for
inexpensive places to store their data. Computing
will become increasingly ubiquitous as consumers start wearing eyeglasses
that superimpose a machine-enhanced view of the world and as technology is
built into clothing and objects. Keyboards and traditional interfaces will
become virtual, with keyboards being projected on surfaces or in the air.
Computers will turn on instantly and run without delays or errors.
Interfaces will be intuitive and sleek, and adapt to users based on what
they are doing so they can easily access relevant features. Automation
will continue to spread throughout industry, essentially eliminating the
need for human-run manufacturing. Image recognition will improve to the
point where a picture can be submitted to a search engine and the engine
will be able to return relevant results based on the image. Smart phones
will evolve into the preferred instrument for constant connectivity, with
voice interaction, facial recognition, location awareness, constant video
and sound input, and multitouch screens. Devices will always be connected,
providing a constant stream of data on friends' activities, sports scores,
and other topics without interrupting the user's current activities.
Surveillance technology will improve to the point where it will be possible
to track every human being, possibly through LoJack-style implants for
personal safety, or through trackers in drivers' licenses and automobiles.
Finally, technology will help us remember and strengthen social
connections, recording every interaction to help people remember who they
met and what they did.
Technology Doesn't Dumb Us Down. It Frees Our
Minds.
New York Times (09/21/08) P. BU4; Darlin, Damon
Damon Darlin takes issue with an article by Nicholas Carr presenting the
argument that Google is adversely affecting our thinking ability. He
writes that Google "has largely liberated us from the time-wasting
activities associated with finding information." Darlin recalls that some
engineering professors prohibited the use of Hewlett-Packard's HP-35
handheld calculator in classrooms upon its introduction in 1972 out of
fears that engineers would employ the device as a crutch, yet the amazing
engineering progress of the nearly four decades since then
demonstrates that such fears were groundless. "It freed engineers from
wasting time on mundane tasks so they could spend more time creating," he
contends. Darlin cautions that while a lot of new technologies boost our
productivity, there are others that can waste time. "In a knowledge-based
society in which knowledge is free, attention becomes the valued
commodity," he observes. Nevertheless, Darlin concludes that throughout
history time-saving technologies have generally improved the ease of
thinking and communication.
Iowa State Researchers Part of $208 Million Supercomputer
Project
Iowa State University News Service (09/18/08)
Iowa State University is participating in the National Science
Foundation-backed $208 million program, led by the University of Illinois
at Urbana-Champaign's (UIUC's) National Center for Supercomputing
Applications, to develop the world's most powerful supercomputer. The
IBM-built machine, called Blue Waters, will be based at UIUC and is
expected to go online in 2011. UIUC will work with the Great Lakes
Consortium for Petascale Computation, a coalition of industry and academic
partners, including Iowa State, established to tackle the challenges of
petascale computing. "Iowa State University is pleased to be part of the
project to develop the world's most powerful supercomputer," says Iowa
State president Gregory Geoffroy. "This is truly an indication of the
strengths and expertise Iowa State has developed in high performance
computing applications, virtual reality, and human computer interaction."
Iowa State professor Srinivas Aluru has used supercomputers to help with
the recently concluded effort to sequence the corn genome by developing
software that uses thousands of processors to build genome assemblies in
days instead of months. Aluru says he is looking forward to using
petascale computing to solve large-scale problems in comparative genomics,
systems biology, plant sciences, and biorenewables research.
The A-Z of Programming Languages: Haskell
Computerworld Australia (09/19/08) Hamilton, Naomi
Microsoft researcher Simon Peyton-Jones says the Haskell programming
language was created as an open standard for purely functional programming
languages in the sense that the project began "as a group of people each
wanting to use a common language, rather than having their own languages
that were different in minor ways." Peyton-Jones says the 1998 version of
Haskell was frozen to provide people with an iteration that they could cite
and teach, and that developers such as himself would be able to keep
maintaining. Peyton-Jones appreciates the variant languages that have
branched off from Haskell, noting that they fulfill the language's specific
purpose to inspire diversity. He says Haskell in particular, and purely
functional programming in general, are blazing a trail for distributed
programming on platforms such as multi-core CPUs and clusters.
Peyton-Jones refers to Haskell as a "lazy" language, in the sense that
expressions are evaluated only when their value is actually necessary,
which facilitates greater consistency in maintaining the language's purity.
"The fact Haskell hasn't become a real mainstream programming language,
used by millions of developers, has allowed us to become much more nimble,
and from a research point of view, that's great," he says. Of all the
programs written with Haskell, Peyton-Jones finds the most interesting to
be Functional Reactive Animation, which enabled the description of
graphical animation.
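The laziness Peyton-Jones describes is easy to demonstrate in a few lines
of Haskell; the snippet below is an illustrative sketch, not code from the
interview:

    -- An infinite list is harmless: no element is computed until a
    -- consumer actually demands it.
    naturals :: [Integer]
    naturals = [0 ..]

    -- 'take' forces only the first five elements; the rest of the
    -- infinite list is never evaluated.
    firstFive :: [Integer]
    firstFive = take 5 naturals      -- [0,1,2,3,4]

    -- Laziness also lets plain functions behave like control
    -- structures: the branch that isn't chosen is never evaluated.
    myIf :: Bool -> a -> a -> a
    myIf True  t _ = t
    myIf False _ e = e

    main :: IO ()
    main = do
      print firstFive
      putStrLn (myIf True "cheap" (error "never evaluated"))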
Wikipedia Depends on Collaboration for Success
Daily Trojan (09/18/08) Huang, Annie; Little, Anita
At the Annenberg Program on Online Communities at the Annenberg School for
Communication, Carnegie Mellon University professor Robert E. Kraut
discussed the steps to success of online communities and his research on
the coordination techniques of Wikipedia. Kraut said that success
in online communities is multidimensional and can be assessed at three
levels. The first is the transaction, where conversations happen,
questions are answered, and resources are exchanged. The second is the
individual, who obtains resources and develops a commitment to the
community. The third is the group: how persistent it is and how it grows
over time to produce a product. To be successful, online communities must
overcome challenges such as losing members, posts that receive no
responses, member recruitment, and the socialization of newcomers, Kraut
said. Kraut also
presented his empirical analysis of the coordination techniques that result
in high-quality articles on Wikipedia. A 2007 study found that articles
with more editors were of higher quality. "Wikipedia articles require
an awful lot of substantial coordination, such as communication to plan the
article, which involves content and organization, developing a neutral
point of view, resolving conflicts, and integrating elements into a
cohesive whole," Kraut said. Featured articles have more editors and edits
than non-featured articles. Kraut found that adding editors to Wikipedia
is helpful when the editors use appropriate coordination techniques.
"These are the sorts of work that lead to the article itself; people write
the article but they also talk to each other about it," he said. "Less
than 50 percent of edits go to the articles and much of Wikipedia work is
done on other pages like coordination talk and conflict resolution."
Gestures Will Force the Mouse Into Retirement
Financial Times Digital Business (09/17/08) P. 5; Twentyman, Jessica
A growing number of human-computer interaction (HCI) specialists say the
mouse is on the way out as an input device because it limits the way people
can interact with computers. "In many ways, our continued reliance on the
computer mouse reduces us to little more than cavemen, running around
pointing at symbols and 'grunting' with each click," says Bruce Tognazzini,
who joined Apple Computer in 1978 and founded the company's Human Interface
Group. "A revolution is long overdue, because we need more sophisticated
tools that will allow us to increase our vocabulary way beyond that caveman
grunt." The latest HCI research involves enabling computers to read and
understand users' movements and gestures. Real-time video interpretation
and inertial sensors are already being used to recognize facial expressions
and physical movements in a variety of consumer technology devices, says
Gartner analyst Steven Prentice. Prentice traces the roots of this change
to the launch of the Nintendo Wii in 2006 and the release of the Apple
iPhone in 2007. The widespread acceptance of these devices has led to a
wave of other electronics manufacturers releasing motion-controlled
products. Panasonic has created video displays that can recognize a user's
face and present content based on individual preferences, and Accenture
Technology Labs is building multi-touch, interactive display walls.
Advanced Micro Devices' Richard Huddy says there will be a "strong and
unstoppable" shift toward technology based on simple human gestures and
away from indirect manipulation through physical objects such as the
mouse.
Eugene Spafford: Protecting the Internet From the
Criminal Element
Science News (09/13/08) Vol. 174, No. 6, P. 32; Gaidos, Susan
The nature of computer security incidents has changed dramatically in the
past three decades, says Eugene Spafford, executive director of Purdue
University's Center for Education and Research in Information Assurance and
Security and chair of ACM's U.S. Public Policy Committee. He recalls that
while most incidents in the 1990s were perpetrated by people who either
were adjusting to the unfamiliarity of the Internet or were "classic
hackers" out for bragging rights or to demonstrate their skills to others,
today's hackers are more sophisticated and committed to true criminal
enterprises, including credit card fraud and the theft of information and
intellectual property. The network is now global in scope, which increases
its exposure to individuals with a wide spectrum of motives and ideologies,
Spafford says. He says the Internet is devoid of an effective policing
framework, given its lack of physical boundaries. Improving the state
of computer security requires society to become more willing to pay for
good security and less tolerant of flaws and security incidents, Spafford
argues. "We have to find ways to increase accountability, authenticity,
and attribution without doing away with some of the freedom of expression
that is part of the benefit of having the Internet," he says. "The
probable direction we're going to have to go in is to build very robust,
highly protected enclaves, or protected systems of computers."
Information of the World, Unite!
Scientific American (09/08) Vol. 299, No. 3, P. 82; Garfinkel, Simson L.
Privacy advocates are concerned about the potential ramifications of data
fusion, in which databases are linked together, writes Naval Postgraduate
School computer scientist Simson L. Garfinkel. However, this integration
is a more challenging proposition than many people assume, and appears to
be restricted to specific contexts due to a number of factors, including
the high incidence of errors and meaningless coincidences in databases.
Distinguishing worthless from valuable information is a formidable problem
for data fusers, as is correctly identifying people and things when names
are shared by multiple individuals or objects. One industry that has
fueled a great deal of innovation in identity-resolution systems is Las
Vegas gambling, which is striving to exclude cheaters and self-declared
problem gamblers from its gaming establishments. Casinos have invested in
the development of the nonobvious relationship analysis (NORA) method,
which combines identity resolution with databases of
credit companies, public records, and hotel stays. A NORA system is
designed to construct hypotheses based on the data, and then update these
hypotheses as new data becomes available. Garfinkel speculates that
society may be placing unreasonable demands on data fusion, and the failure
of data-fusion systems could just as easily stem from flaws in their
algorithms as from a lack of data. He adds that a dearth of public
information about data-fusion systems in actual use is also a source of
frustration to scientists.
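Garfinkel's sketch of NORA — maintain hypotheses about which records belong
to the same identity and revise them as data arrives — is essentially
incremental record linkage. A toy Haskell illustration follows; the fields,
weights, and threshold are invented for this example, and real systems use
far richer evidence:

    import Data.List (maximumBy)
    import Data.Ord (comparing)

    data Record = Record
      { name    :: String
      , address :: String
      , phone   :: String
      } deriving (Eq, Show)

    -- Crude evidence score. A shared name alone is weak (many people
    -- share names), so corroborating attributes carry more weight.
    score :: Record -> Record -> Int
    score a b = (if name a    == name b    then 1 else 0)
              + (if address a == address b then 2 else 0)
              + (if phone a   == phone b   then 2 else 0)

    -- A hypothesis: a cluster of records believed to be one identity.
    type Identity = [Record]

    -- Incremental update: attach a new record to the best-matching
    -- cluster if the evidence clears the threshold, else hypothesize
    -- a new identity. Later records can still tip the balance.
    absorb :: Int -> [Identity] -> Record -> [Identity]
    absorb threshold idents r
      | null idents || best < threshold = [r] : idents
      | otherwise = map attach idents
      where
        clusterScore i = maximum (map (score r) i)
        bestIdent      = maximumBy (comparing clusterScore) idents
        best           = clusterScore bestIdent
        attach i       = if i == bestIdent then r : i else i

    -- Fold a stream of records into identity hypotheses.
    resolve :: [Record] -> [Identity]
    resolve = foldl (absorb 3) []

With a threshold of 3, a shared name alone (score 1) starts a new
hypothesis, while a shared name plus a shared address (score 3) merges the
records — a small instance of the ambiguity the article says data fusers
must resolve.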