Defending Laptops from Zombie Attacks
Technology Review (03/21/08) Greene, Kate
Intel researchers have developed laptop-based security software that adapts to
how an individual uses the Internet, making the detection of malicious
activity more dynamic and personalized. The software targets corporations
that distribute laptops and mobile devices to workers, since IT departments
typically install identical security software on all their hardware, which
partly explains why security breaches are so common, according to Intel
Research Berkeley researcher Nina Taft. Most IT departments deploy security
software with a component that analyzes the stream of Internet traffic
flowing into and out of a computer and flags possible infection when traffic
exceeds a preset limit. However, this method can incorrectly target people
who habitually send out large volumes of information, while ignoring
below-threshold traffic that may harbor malevolent activity without the
sender's knowledge. Intel
researchers have devised algorithms capable of more subtle evaluations,
including one that creates individualized traffic thresholds by monitoring
a person's Internet use through standard statistical and machine-learning
techniques, and another that assesses how people's Internet usage changes
throughout the day. Another set of algorithms uses the same behavioral
principles to study communication between laptops and other devices on the
Internet to detect the presence of botnets. "I think the basic takeaway
is, if you can be really precise in capturing user behavior, you can make
the work of the attackers much harder," notes Taft. Georgia Institute of
Technology professor Nick Feamster attributes the lack of application of
the behavioral security strategy to laptops to the absence of an automated
way to develop personalized rules.
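The article does not describe Intel's algorithms in detail, but a
personalized traffic threshold of the kind described can be sketched with
standard statistics: learn each user's typical outbound volume and flag
deviations from it. The function names, sample data, and the three-sigma
rule below are illustrative assumptions, not Intel's implementation.

```python
from statistics import mean, stdev

def learn_threshold(samples, k=3.0):
    """Build a per-user outbound-traffic threshold from historical
    per-interval byte counts: mean plus k standard deviations."""
    return mean(samples) + k * stdev(samples)

def is_anomalous(observed, threshold):
    """Flag an interval whose outbound volume exceeds the user's
    personal threshold."""
    return observed > threshold

# A heavy user's history: a fixed limit tuned for light users would
# misfire on this person, but their own learned threshold adapts.
heavy_user = [900, 1100, 1000, 950, 1050, 1000]
light_user = [90, 110, 100, 95, 105, 100]

t_heavy = learn_threshold(heavy_user)
t_light = learn_threshold(light_user)

# 1,200 bytes per interval is within the heavy user's normal range...
print(is_anomalous(1200, t_heavy))   # False for this data
# ...but the same volume from the light user suggests infection.
print(is_anomalous(1200, t_light))   # True
```

A one-size-fits-all threshold would have to be set above the heavy user's
normal traffic, which is exactly the blind spot the article describes.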
Design Automation Conference Announces First 'Best of
DAC' Awards
Business Wire (03/17/08)
The Best of DAC Awards is a new event at this year's Design Automation
Conference (DAC). The event is designed to give DAC the opportunity to
recognize exhibitors for their innovation. Attendees of the 45th DAC will
vote on the best exhibitors. "The new Best of DAC Awards competition will
allow exhibitors another way to measure their impact on attendees," says
Limor Fix, general chair, 45th DAC Executive Committee. "We look forward
to an added element of friendly competition on the exhibit floor and what
we hope will become a new favorite element of the DAC experience."
Categories for the inaugural Best of DAC Awards include best overall new
product, best demonstration on exhibit floor, most interesting first-time
exhibitor, most interesting veteran exhibitor, best booth, and best booth
giveaway. DAC exhibitors will have until May 30 to enter their eligible
products and booth product demonstrations for the awards. ACM's Special
Interest Group on Design Automation (ACM/SIGDA) is a sponsor of the
conference, which takes place June 8-13 at the Anaheim Convention Center in
Anaheim, Calif. For more information, or to register, visit
http://www.dac.com/45th/index.aspx
Multicore Boom Needs New Developer Skills
IDG News Service (03/20/08) Kanaracus, Chris
Microsoft research scientist Dan Reed points to a worldwide shortage of
experienced parallel and multicore programmers, and Microsoft and Intel have
announced that they will donate $20 million to several American universities
to promote advances in multicore programming research. "To
gain performance from quad-core processors and prepare for the denser
multicore CPUs that will follow, application developers need to write code
that can automatically fork multiple simultaneous threads of execution
(multithreading) as well as manage thread assignments, synchronize parallel
work, and manage shared data to prevent concurrency issues associated with
multithreaded code," wrote the authors of a recent Forrester Research
study. The report notes that major operating systems and the bulk of
middleware products are already prepared for multithreaded operation and
for "near term" multicore processors, and that corporate development shops
may turn to independent software vendors to address the problem via
development tools and platforms that can better accommodate
multicore-related chores. However, Reed is convinced that multithreading
over time will become "part of the skill set of every professional software
developer." Meanwhile, major software vendors and chip makers have been
attempting to boost awareness of the challenges and potential of multicore
programming. For example, TopCoder and AMD just started a series of
contests that emphasize multithreading.
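The skills the Forrester authors list -- forking simultaneous threads,
synchronizing parallel work, and managing shared data -- can be illustrated
in a few lines. This is a generic sketch, not taken from the report; Python's
`threading` module stands in for whatever threading API a shop actually uses.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    """Increment a shared counter n times; the lock prevents the
    lost-update race that unsynchronized threads could hit."""
    global counter
    for _ in range(n):
        with lock:          # synchronize access to shared data
            counter += 1

# Fork four simultaneous threads of execution...
threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
# ...and join them to manage thread completion.
for t in threads:
    t.join()

print(counter)  # 40000: correct because every update was synchronized
```

Without the lock, two threads can read the same old value of `counter` and
write back conflicting increments -- the kind of concurrency issue the
report says multicore developers must learn to prevent.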
Intel Researchers Stretch Wi-Fi to Cover 60 Miles
Network World (03/20/08) Cox, John
Intel recently demonstrated an 802.11 radio link with a data rate of
approximately 6 Mbps and a range exceeding 60 miles. Intel facilitated
this rural connectivity platform (RCP) with off-the-shelf hardware and
modified the underlying 802.11 media-access-control layer to boost the
signal's efficiency. This involved adding time division multiple access
(TDMA), a method currently used in GSM cellular networks that splits the
channel into time slots and keeps the sending and receiving radios in sync,
effectively eliminating the wait for acknowledgments and the resending of
data. The TDMA technique extends the
range by minimizing the wireless overhead and opening up more bandwidth for
data transmission. The RCP units can function as endpoints that bookend
each link or as relay stations to effect signal-hopping. The RCP software
uses an operating system based on the SnapGear embedded Linux distribution.
Pilot RCP deployments have been established in India, Vietnam, Panama, and
South Africa. RCP, which Intel Research and Intel's Emerging Markets
Platform Group have been working on for around two years, is one of several
efforts to extend the Internet into rural regions, especially in developing
nations, through the employment of low-cost, low-power Wi-Fi radios.
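TDMA's core idea -- pre-assigned time slots instead of 802.11's
listen-and-acknowledge contention -- can be sketched as a simple scheduling
function. This is an illustrative model, not Intel's RCP implementation; the
station names and 10 ms slot length are assumptions.

```python
def slot_owner(t, stations, slot_len_ms=10):
    """Return which station may transmit at time t (in milliseconds).
    The channel is divided into fixed-length slots assigned round-robin,
    so both radios know the schedule in advance and never wait for a
    per-frame acknowledgment before sending the next frame."""
    slot = t // slot_len_ms
    return stations[slot % len(stations)]

# The two radios bookending one long-distance link.
link = ["endpoint_A", "endpoint_B"]

# Over a 60-mile link, the propagation delay makes an ack-per-frame
# handshake ruinously slow; with TDMA each side simply transmits in
# its own slots.
print(slot_owner(0, link))    # endpoint_A (slot 0)
print(slot_owner(10, link))   # endpoint_B (slot 1)
print(slot_owner(25, link))   # endpoint_A (slot 2)
```

Because neither radio ever contends for the channel or idles waiting for
acknowledgments, more of the airtime carries data, which is how the
technique recovers bandwidth over very long distances.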
Can We Fix the Web?
InternetNews.com (03/20/08) Kerner, Sean Michael
During a keynote speech at the AjaxWorld conference, Douglas Crockford,
creator of JavaScript Object Notation and a senior JavaScript architect at
Yahoo, said the Web is in serious trouble, and the question is no longer
whether we should fix it but whether we can. Crockford said browsers were not designed
to do "all of this Ajax stuff," and Ajax only works because people have
found ways to make Ajax work despite its limitations. "The number one
problem with the Web is security," Crockford said. "The browser is not a
safe programming environment. It is inherently insecure." Part of the
problem is what Crockford called the "Turducken problem," or that people
are trying to stuff the turkey with the duck. Crockford said the many
programming languages on the Web can be built inside of each other, which
can lead to problems. Crockford argued that these are not Web 2.0
problems, but were present in Netscape 2.0 in 1995. The security problems
are based on three core items, Crockford said: JavaScript, DOM (document
object model), and cookies. Crockford says JavaScript's global object is
the root cause of all cross-site scripting attacks, while DOM is
problematic because all nodes are linked to all other nodes, creating an
insecure model, and cookies can be misused as tokens for
authority. Crockford also blamed browser vendors for introducing new
insecure JavaScript features, and said ultimately that JavaScript needs to
be replaced with a secure language.
Institute for Advanced Architectures Prepares for
'Exascale' Computing
Azom.com (03/21/08)
The new Institute for Advanced Architectures, launched by Sandia and Oak
Ridge national laboratories, was established to close the gaps between
actual performance and the theoretical peak performance of current
supercomputers, says Sandia project leader Sudip Dosanjh. "We believe this
can be done by developing novel and innovative computer architectures," he
says. One purpose of the institute, Dosanjh says, is to reduce or
eliminate the growing mismatch between data movement and processing speeds.
Sandia computer architect Doug Doerfler says that a key to scalability is
making sure that all processors have something to work on at all times.
The ability for designers to split processors into multiple cores on a
single die further compounds the problem. Jeff Nichols, who heads the Oak
Ridge branch of the institute, says continuing to make progress in running
scientific applications at such large scales will require maintaining a
balance between hardware and software, and there are huge software
programming challenges to solve. The institute is also tasked with
reducing the amount of power needed to run a future exascale computer.
Dosanjh says an exascale computer using modern technology would consume
"many tens of megawatts," which would occupy a significant portion of a
power plant.
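Dosanjh's "many tens of megawatts" follows directly from energy-per-operation
arithmetic. The efficiency figures below are illustrative assumptions (the
article gives none): roughly 0.4 Gflops per watt for 2008-era petascale
hardware, and an optimistic 20 Gflops per watt for a future machine.

```python
def power_megawatts(flops, flops_per_watt):
    """Power draw, in megawatts, of a machine sustaining `flops`
    operations per second at a given efficiency (flops per watt)."""
    return flops / flops_per_watt / 1e6

EXA = 1e18  # one exaflop/s

# Even at an assumed 20 Gflops/W -- far beyond 2008 hardware -- an
# exascale machine draws 50 MW, i.e. "many tens of megawatts."
print(power_megawatts(EXA, 20e9))   # 50.0

# At roughly 2008 petascale efficiency (~0.4 Gflops/W, assumed),
# the same machine would need its own power plant.
print(power_megawatts(EXA, 0.4e9))  # 2500.0
```

The gap between those two numbers is the power-reduction problem the
institute is tasked with solving.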
Jim Hendler Shares AI's Lessons for the Semantic
Web
ZDNet (03/20/08) Miller, Paul
Rensselaer Polytechnic Institute professor James A. Hendler has been
closely involved with artificial intelligence research for many years, and
is one of the progenitors of the semantic Web ideal. Hendler promotes the
idea of weakening the "tethers" that bind us to computers, Web sites, and
applications, highlighting the transition that we are already making with
the expanding capabilities of mobile devices. Hendler also emphasizes the
importance of metadata and structured information in sustaining the virtual
connections between resources that will allow us to break the physical
bonds that tie us to our computers and their applications. Hendler
believes that there are currently several shared visions of what the
semantic Web will be, but it can essentially be broken down into two main
areas of utility. The first is heavy duty reasoning, or the artificial
intelligence version of the semantic Web that is based on an extremely
detailed and highly expressive model of a subject domain, which is used to
analyze large bodies of data. The second vision is the data-driven
semantic Web, which is more lightweight and geared toward the application
of a less-structured world view. Hendler suggests that the first is more
Semantic-oriented while the second is more Web-based, but he stresses that
both are important and valid.
Women in Canadian IT: How the Best Get Ahead
Computerworld Canada (03/19/08) Smith, Briony
A recent forum on women in technology focused on the lack of women in the
IT field and how the problem starts in school, with teachers who are
unaware of the skills required or the opportunities available in the field.
Microsoft Canada's Elizabeth Carson says the lack of women in IT is an
industry-wide challenge, and finding women with strong, technically deep
experience is hard. "The candidate pool is getting smaller, so having that
diversity is not just a rights issue, but a competitive advantage--they can
offer a different perspective," Carson says. She says the deeply technical
positions tend to be dominated by men, while the women in the field tend to
work in less-technical positions such as project managers and business
analysts. Bell Canada's Vanda Vicars says that a major barrier preventing
more women from entering the field is that they do not network as much as
men do, which could be counteracted by implementing a structured mentoring
system that would help women navigate the workplace. Vicars also suggests
ongoing networking meetings and groups where women in the company's IT
department, or IT in general, can interact. Joanne Stanley, the managing
director of the Ottawa-based CATA Women in IT Forum, says women should be
made aware of IT jobs that might be more interesting to them, such as jobs
in human resource technology, online management and collaboration, IT
security, IT architecture, and business and system integration.
Q&A: Experimental P2P Technology Eases User, ISP
Pain
Network World (03/18/08) Reed, Brad
Many of the problems with peer-to-peer systems that ISPs have
traditionally had to contend with could be eliminated with experimental
technology created by Yale PhD candidate Haiyong Xie, which was recently
tested over the Verizon network. Xie says in an interview that "P4P"
technology supplies an iTracker, a server that uses the information in a
network topology map to examine traffic patterns and deliver suggestions
that help peers within the network become network-savvy.
Verizon contributed the network topology map to the P4P trial, while Pando
Networks offered the use of its P2P software and servers and deployed an
appTracker server to communicate with the Pando network and the iTracker.
Xie notes that the field test involved a video file that all clients tried
to download and share, and the iTracker enabled most of the traffic to
become localized. "Now that we know the network information, we can make
better decisions and thus dramatically reduce number of hubs used in the
transfer," he says. Xie adds that another advantage the technology offers
is a substantial reduction of the amount of traffic streaming into and out
of the network. He says some ISPs are unprepared to support the P4P
technology because their infrastructure differs from Verizon's, which means
the iTracker will need to be refined to accommodate them. "P2P in the long
run can be a very good complementary solution to the current Internet for
delivering commercial products--people are adding more features to P2P tech
and are adding more quality of service protocols into P2P," Xie says.
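The iTracker's role -- steering peer selection with topology knowledge the
ISP provides -- can be sketched as a preference function over candidate
peers. The data structures, field names, and ISP labels here are
hypothetical; the article does not describe P4P's actual interfaces.

```python
def rank_peers(requester, candidates):
    """Order candidate peers so that peers in the requester's own point
    of presence (PoP) come first, then peers elsewhere in the same ISP,
    then external peers -- localizing P2P traffic the way the iTracker's
    topology hints are meant to."""
    def locality(peer):
        if peer["pop"] == requester["pop"]:
            return 0      # same PoP: traffic stays local
        if peer["isp"] == requester["isp"]:
            return 1      # same ISP: avoids costly transit links
        return 2          # external: most expensive path
    return sorted(candidates, key=locality)

requester = {"isp": "VerizonNet", "pop": "newark"}   # hypothetical labels
peers = [
    {"id": "p1", "isp": "OtherNet",   "pop": "chicago"},
    {"id": "p2", "isp": "VerizonNet", "pop": "newark"},
    {"id": "p3", "isp": "VerizonNet", "pop": "boston"},
]

ranked = rank_peers(requester, peers)
print([p["id"] for p in ranked])   # ['p2', 'p3', 'p1']
```

Ordinary P2P clients pick peers with no knowledge of the underlying
network; feeding them a ranking like this is what localized most of the
traffic in the Verizon field test.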
Yahoo Empowering Semantic Web Programmers
eWeek (03/13/08)
Yahoo announced that it will soon provide APIs to its Search platform to
allow third-party developers to alter search results with structured data
to make them more useful for Web users. The program will enable developers
to overlay their own algorithms to determine how the Yahoo Search index is
used. Yahoo is also supporting several semantic Web standards, including
RDF and microformats, to make programming on Yahoo's search platform
easier, says Yahoo's Amit Kumar. Programmers have been slow to support
standards and write software for the semantic Web, in part because it lacks
a killer application, Kumar says. He says Web search is the missing killer
app. Instead of independently developed semantic silos scattered across
the Web, Yahoo aims to bring all the semantic information together once it
is available. For example, Kumar says that marking up profile pages with
microformats will allow Yahoo Search to better understand the semantic
content and the relationships of its site's components. "If I can put an
algorithm on top of search that says here are all of the things I want the
algorithm to prioritize and here's all of the things I want it to exclude
that's really powerful," says IDC analyst Rachel Happe.
Cyberscholarship: High Performance Computing Meets
Digital Libraries
Journal of Electronic Publishing (Quarter 1, 2008) Vol. 11, No. 1, Arms,
William Y.
The convergence of high-performance computing and digital libraries is
giving birth to new forms of research that are classified as
"cyberscholarship," which is only possible when there are digital libraries
with extensive collections in a wide-ranging domain, writes Cornell
University computer science professor William Y. Arms. There must be new
strategies for organizing and using library collections in order to exploit
cyberscholarship opportunities. For example, the National Virtual
Observatory is designed to integrate previously disconnected astronomical
datasets through the provision of coordinated access to distributed data
since the datasets are archived at many locations. There is limited
experience to guide the development of cyberscholarship applications;
among the lessons such applications have yielded is the need for market
research and incremental development of cyberscholarship requirements,
which in turn requires flexible organization and close cooperation between
researchers, publishers, and librarians. The existence
and accessibility of data in machine-readable formats is critical to
cyberscholarship, while issues of policy should not be discounted and
custodianship involves the preservation of the content as well as the
identifying metadata. Tools and services that cyberscholarship needs
include APIs that enable direct interaction between computer programs and
collections, and instruments that can identify and download
sub-collections, which require digital libraries to be designed so that
programs can extract large portions. High-performance computing is an
essential ingredient of very big collections, and the use of
high-performance computing systems by researchers who are not computing
specialists remains a major challenge.
Fly, Robot Fly
IEEE Spectrum (03/01/08) Vol. 45, No. 3, P. 25; Wood, Robert
Researchers at Harvard Microrobotics Laboratory are working to create
insect-like flying robotic vehicles that can "perform rescue and
reconnaissance operations with equal ease," writes the laboratory's Robert
Wood. This could lead to a whole new paradigm in approaching difficult
situations; for example, after a natural disaster, rather than have
personnel search out survivors themselves they could deploy thousands of
tiny flying robots that could "detect signs of life, perhaps by sniffing
the carbon dioxide of survivors' breath or detecting the warmth of their
bodies," Wood says. Insects use a complex combination of wing motions to
handle aerodynamics at their tiny scale, which is very different from the
larger-scale aerodynamics of airplanes, even model airplanes. One
important motivation for creating these tiny flying robots is to bring
unmanned flight within the price range of law enforcement and emergency
rescue services. "We placed a great deal of importance on our choice of
materials, which ultimately had to be cheap and fairly easy to work with,"
Wood says. "Durability was less important, because we envisioned a robot
that could be replaced for less than $10." The researchers have focused on
duplicating the flight of the two-winged insects of the order Diptera, and
over hundreds of iterations their design has become ever closer to a real
fly's shape. The small scale involved makes the materials and fabrication
science behind the robots highly novel. "Just because we designed the
robot didn't mean we knew how to make it, and mechanical components with
features of one micrometer are well below the resolution of standard
manufacturing techniques," Wood says.
Terror on the Internet: A Complex Issue, and Getting
Harder
IEEE Distributed Systems Online (03/08) Vol. 9, No. 3, Goth, Greg
Attempts to crack down on online terror face the challenge of doing so
without restricting free speech and access to information, and politicians
the world over regularly call for the removal of terrorist sites from their
hosts' site servers or for the blockage of access to such sites by search
engines. "Those who think that we can stop online terrorism by removal of
Web sites are either naive or ignorant about cyberspace and its limitations
for interference," says Haifa University professor Gabriel Weimann. "As a
short answer, there is a need for strategy and not tactics, there is a need
for a multi-measured approach, and not just 'Let's kill those Web sites.'"
Weimann says multilateral agreement on fighting Web terror is lacking
because the issue is riddled with legal ambiguities, such as who ultimately
has authority over the determination of terrorist sites. In addition,
there is great disagreement over the degree to which content--such as
instructions on an arborist's site for making explosives to blow up tree
stumps--could be defined as terror-inducing material. Meanwhile, ISPs'
efforts to develop filtering and blocking technologies for Web sites owned
by a wide range of malevolent parties are being met by jihadists'
improvement of work-around strategies. Government-directed
anti-cyberterror initiatives include collaboration with independent groups
that collect and examine global terror site content, and the development of
deep analytic technologies such as Web spiders that can study links between
jihadi sites, messages, and forum postings to create white-hat viruses and
malware designed to hamstring or compromise jihadi sites.
The Future of Web 2.0
Campus Technology (03/01/08) Vol. 21, No. 7, P. 20; Grush, Mary
Gary Brown, director of Washington State University's Center for Teaching,
Learning, and Technology, says in an interview that the advent of Web 2.0
requires the migration to next-generation online learning, but adds that
"so far, instead of transforming the traditional classroom with online
learning, we've merely transposed it to what is now the traditional course
management system [CMS] or collaboration and learning environment [CLE]."
Brown predicts that the percentage of the college population that attends
community college and must work will increase, while about half of the
population consists of students who take courses from multiple
institutions. EPortfolios may be more effective measures of student
learning than standardized tests, but Brown says they are currently limited
because they are institution-specific, while Web 2.0 offers students an
opportunity to combine numerous applications that they would control
themselves and can share with anyone. "To that end, we should start
thinking not so much in terms of an ePortfolio but, instead, in terms of a
personal learning environment [PLE]," he says, adding that the technology
WSU is generally tapping to promote this transition is already out in the
world. These "worldware" applications exist in the hundreds, and Brown
says it is critical to expect that these technologies will continue to
change, mature, and explode so that proper accommodations can be made. WSU
is collaborating with Microsoft on a "harvesting" gradebook that an
instructor would use to link students to all of the work they must perform,
and which Brown expects to be a killer app. Brown is also an advocate of
inviting outside employers to participate in the online learning process,
pointing out that employers are less interested in the degrees students
earn or the courses they complete than in their ability to perform the
desired work.
E-Voting Vendor's Web Site Hacked
IDG News Service (03/20/08) Montalbano, Elizabeth; McMillan, Robert
Sequoia Voting Systems' e-voting Web site has been hacked, stirring an
uproar among New Jersey officials who used the Ballot Blog in a February
presidential primary. Princeton University computer science professor
Edward Felten reported the breach, following an inquiry from a state county
clerks coalition to investigate the e-voting system. Evidence of the
infiltration was apparent because the hacker had inserted a message with a
cyber tag name. The system was temporarily suspended and users were
redirected to a hosting-provider page, but Sequoia later brought the blog
back online. "My guess is that they took the site down temporarily while
they were clearing out the stuff left behind by the intruder," Felten says.
The county clerks have asked New Jersey attorney general Anne Milgram to
probe Sequoia Voting Systems AVC Advantage e-voting machines, due to
discrepancies in vote counts during the primary. Sequoia says the
differing vote totals were due to poll worker mistakes, and it has warned
Felten against investigating further.