Voting Glitch Said to Be 'Disastrous'
Inside Bay Area (CA) (05/10/06) Hoffman, Ian
A recently discovered vulnerability in Diebold's touch-screen voting
machines has election officials scrambling to understand and contain the
risk. A hacker with minimal specialized knowledge of Diebold's system and
an off-the-shelf component could load software onto the machine to disable
it or alter vote counts in a matter of minutes. "This one is worse than
any of the others I've seen. It's more fundamental," said Douglas Jones, a
University of Iowa computer scientist. "In the other ones, we've been
arguing about the security of the locks on the front door," he said. "Now
we find there's no back door. This is the kind of thing where if the
states don't get out in front of the hackers, there's a real threat."
Finnish computer expert Harri Hursti discovered the flaw while working with
Black Box Voting in March, and quietly spread word of the glitch to several
prominent computer scientists who advise states on voting machines.
Pennsylvania, California, and Iowa have directed their election officials
to seal the machines with tamper-proof tape until election day, though
California advised its counties that intend to use only Diebold machines in
their upcoming elections that the threat is low, and that tampering would
be easily detected by voters from the paper read-out and by officials once
they recount 1 percent of their precincts' paper ballots. California
Assistant Secretary of State for elections Susan Lapsley downplayed the
risk, arguing that "it assumes access and control for a lengthy period of
time." Scientists disagree, noting that hackers could work out plans ahead
of time, and that it only takes a minute to install the software, a hole
that apparently originated from Diebold's efforts to make it as easy as
possible to update the software inside its systems.
ACM's U.S. Public Policy Committee has released a report on Statewide
Databases of Registered Voters. To review, visit
http://www.acm.org/usacm/VRD
CFP 2006: Life, Liberty and Digital Rights
TidBITS (05/08/06) Porten, Jeff
Participants at ACM's recent Computers, Freedom, and Privacy (CFP)
conference met in Washington, D.C., to discuss the effect of technology on
society, focusing especially on the ways that governments can use
information against their citizens. CFP has historically facilitated an
honest discussion between the hacker, security, privacy advocate, and law
enforcement communities. A panel debating the different approaches to
privacy law in the United States, Canada, and the European Union noted the
difficulty that Europeans have faced in implementing centralized privacy
laws, while the United States still maintains a patchwork of national and
local laws. Participants debated privacy in the context of a world where
information flows freely and governments maintain massive databases of
personal information, often over the objections, or without the knowledge,
of their citizens. A panel discussing the Bush administration's domestic
surveillance program questioned the legality of both the government agents
and telecommunications companies involved, noting that under the Foreign
Intelligence Surveillance Act, employees of private companies found to be
complicit in illegal surveillance are subject to criminal prosecution, a
provision that has raised questions about AT&T's collaboration with the NSA
to deliver vast troves of telecommunications traffic. In another
discussion, Apple drew severe criticism for its implementation of DRM. In
the closing keynote address, science fiction author Vernor Vinge contrasted
two historical visions of a technology-driven future: the dark, Orwellian
world where privacy has completely succumbed to ubiquitous government
intrusion, and the cyberpunk world controlled by the anarchist hacker.
Vinge suggested that liberty has gradually eroded as humans have fallen
prey to a melange of technologies and laws that are steadily (and at times,
inadvertently) waging war on the last vestiges of privacy.
USC Hacker Case Pivotal to Future Web Security
InformationWeek (05/09/06) Greenemeier, Larry
The trial of Eric McCarty, the 25-year-old San Diego resident who claims
that he hacked into the University of Southern California computer system
only to call attention to its vulnerabilities, could become a referendum on
acceptable practices of security research, especially if he is convicted
and sentenced to the maximum of 10 years in prison. Everyone agrees that
McCarty violated the law, though the ethical legitimacy of his actions is
being hotly debated, and many security researchers believe the maximum
penalty is extreme, particularly since McCarty has been cooperating with
the FBI. McCarty hacked into a SQL database that contained the Social
Security numbers, birth dates, and other identifying information for more
than 275,000 USC applicants dating to 1997. McCarty initiated a SQL
injection after he found a vulnerability in the login system of USC's
application Web site. The university then took the site down for two weeks
to fix the flaw. Security professionals have mixed feelings about
McCarty's actions. "McCarty was trying to prove a point," said Digital
Defense's Rick Fleming. "Part of me commends him for saying, 'Hello, wake
up.' But he crossed an ethical boundary because he didn't have permission
to test that system, and he broke the law." The online document called
RFPolicy informally lays out the basic protocols for researchers to
communicate with vendors and developers to address vulnerabilities.
RFPolicy has no legal authority, however, and it does not provide a method
for legally entering someone else's IT environment and testing Web
applications. Security experts worry that if McCarty is sentenced to jail,
many white-hat researchers will either stop looking for flaws or stop
reporting them for fear of legal reprisal. "If the good guys aren't going
to do this research, that's a bad thing because the bad guys certainly
won't stop," says WhiteHat Security founder Jeremiah Grossman.
Cause for Concern? Americans Are Scarce in Top Tech
Contest
Wall Street Journal (05/10/06) P. B1; Gomes, Lee
Only four of the top 48 contestants in the recent TopCoder global finals
in Las Vegas came from the United States, renewing concerns that Americans
are falling behind their international counterparts in computer science.
The United States dominated the competitions when they began in 2001, but
this year Russia claimed eight of the top spots and Poland took 11, while
Norway and China both took four. The Polish contestants viewed their
strong showing less as a sign of superior training and education than as a
testament to the significance of such a competition for a country that has
so few opportunities to compete on the global stage. Programming has
captured the popular imagination in Poland largely due to Tomasz Czajka,
who has won more than $100,000 as a repeat TopCoder champion. Each
90-minute round of the contest had three problems, described as easy,
medium, and hard. No American made the final cut of the two-day
competition. The top prize of $20,000 went to the Russian Petr Mitrichev.
Ken Vogel, a former TopCoder contestant, noted that the competition is not
the best measure of a programmer's worth in the job market. The
competition tests the ability to solve a series of contrived problems
quickly, while in the real world, companies are looking for employees who
can work in team settings, see the big picture, and anticipate the desires
of end users. The Americans' poor showing in the contest nonetheless
raises troubling questions about the anti-intellectual strain in the United
States. Po-Shen Loh, one of the four American-born contestants to finish
in the top 48, was disheartened to see a cartoon aimed at toddlers in
which the characters made fun of the stereotypically smart one: "If this
is what American kids are watching before they know any
better, it can't help but affect them later on."
Engineered by Women, for Girls
Connect for Kids (05/08/06) Rafferty, Heidi Russell
Since 1981, the New Jersey Institute of Technology (NJIT) has run the
FEMME program, a summer learning experience designed to encourage
inner-city girls to pursue careers in engineering, science, and IT. With
men accounting for 80 percent of today's engineers, and the affluent far
more likely to succeed in technical fields, FEMME director Suzanne Heyman
says the program's goal is to help close the gap for underprivileged girls.
While girls typically abandon high-level math and science classes in the
ninth grade, Heyman and her fellow instructors, most of whom are women,
hope to immerse their students in sophisticated math and science concepts
at a young age so that the concepts will be second nature by the time the
girls reach college. Instructors discuss workplace gender issues and try
to facilitate an honest dialogue about the pros and cons of various
careers. The multi-year program is divided into two groups: the FEMME
program, a day camp for fourth through eighth grade girls, and the FEMME
Academy, a three-week residential program for ninth graders. NJIT then
offers a co-ed college preparatory program for engineering for 10th through
12th graders. Each grade level dives into a different topic in depth:
fourth graders study environmental engineering; fifth graders study
aeronautical engineering; and sixth graders concentrate on mechanical
engineering. The mechanical engineering course is a favorite among the
students because they take a trip to an amusement park to observe the
roller coasters and see the principles they have learned in motion. The
seventh graders study chemical engineering, where they watch how hair dye
is made at the L'Oreal factory. Eighth graders study biomedical
engineering, one of the most popular engineering disciplines among females
because it offers the clearest societal impact, and ninth graders study
electrical and computer engineering.
For information on ACM's Committee on Women in Computing, visit
http://women.acm.org
The Most Realistic Virtual Reality Room in the
World
Iowa State University News Service (05/08/06) Krapfl, Mike
Iowa State University is spending more than $4 million upgrading C6, a
hexagonal virtual reality room that will project 100 million pixels, twice
the number of pixels illuminating any other virtual reality room in the
world. Iowa State opened C6 in 2000 as the first six-sided room in the
country that provides an immersive auditory and visual experience, though
the equipment has not been updated since. The new equipment will feature a
Hewlett-Packard computer with 96 graphics processing units, 24 Sony digital
projectors, and an ultrasonic motion tracking application. Supported by a
Defense Department appropriation through the Air Force Office of Scientific
Research, the upgrade began this spring, and is expected to be unveiled in
the fall, with a grand opening celebration planned for spring 2007. Iowa
State architecture professor Chiu-Shui Chan has used C6 to generate 3D
models of buildings and cities, and is exploring the effect that virtual
reality could have on urban planning and workplace efficiency testing. The
upgrade to the system will help Chan deliver more realistic and interactive
models that convey a stronger sense of place than he can with the existing
C6 technology. Other researchers are using C6 for visualizations of genes,
cell biology, and engineering tools. James Oliver, a professor of
mechanical engineering and the director of Iowa State's Virtual Reality
Applications Center, is leading a project to develop a virtual reality
control room for unmanned aerial military vehicles. Under Oliver's system,
a lone operator could control many vehicles by monitoring their surrounding
airspace, the terrain over which they are flying, and information taken
from instruments, cameras, and radar and weapons systems in the virtual
environment. "The idea is to get the right information to the right person
at the right time," he said. "We think this kind of large-scale, immersive
interface is the only way to develop sophisticated controls."
Democratic Senator Wants Net Neutrality
Regulations
CNet (05/09/06) Broache, Anne
At a broadband policy summit on Tuesday, an aide to Sen. Daniel Inouye
(D-Hawaii) advocated legislation calling for the regulation of broadband
providers. "I don't view this as new regulation of the Internet," declared
Senate Commerce Committee Democratic counsel James Assey Jr. "In fact, I
view it as reaffirming what has been a very old principle...that network
operators with an ability and incentive to discriminate be prevented from
doing so." Network operators claim they should be allowed to charge
premium fees to heavy bandwidth users so as to recoup their infrastructure
investments and to guarantee product security and quality; Internet
companies and consumer groups argue that the operators' proposed business
model would force them to pay more money in addition to the already vast
sums they currently pay broadband providers to deliver content. On April
26, House Republicans killed a Democratic-led bid to turn strict Net
neutrality rules into law. "It would be unthinkable for the government to
insert fees into the way the Internet is now, but yet there are a number of
people who would be fine with private entities doing so and being able to
selectively pick and choose and treat others differently for any reason
they see fit," said House Energy and Commerce Committee Democratic counsel
Johanna Shelton. Senate Commerce Committee Chairman Sen. Ted Stevens
(R-Alaska) endorsed a hands-off approach to broadband providers in a draft
bill released last week despite his concerns about discrimination, after
hearing from Wall Street interests who warned that federal intercession
would "chill investments," said committee staff director Lisa Sutherland.
She noted that lawmakers have received an "unprecedented" amount of input
from lobbyists, constituents, and virtually anyone with something to say
about the Net neutrality issue.
Vanderbilt Engineers to Help Air Force Use Global
Information Grid
Vanderbilt News Service (05/08/06)
The U.S. Air Force wants to tame the ad hoc manner in which the Global
Information Grid (GIG) has grown over the years. The military service's
research laboratory has awarded a $1.2 million grant to a group of U.S.
researchers to develop software that will allow military personnel to use
the disparate resources of the GIG more effectively. The communications
technologies of the GIG range from the Internet and landlines to cell
phones and satellites. However, the GIG has not been immune to cell phone
dead spots,
busy signals, email spam, voice mail loops, and other problems associated
with technology. "The software we are creating not only will broaden
communications capabilities by utilizing the GIG to augment Air Force
communications technology such as warfighters' radio, landline and
satellite communications, but also will ensure that all communications are
delivered according to commander priorities and are protected from
interception and disruption," says Douglas C. Schmidt, a professor of
computer science at Vanderbilt University. Engineers at Vanderbilt will
work with researchers from Carnegie Mellon University on the software,
which must provide an efficient interface, reliability, and military-level
security. "It's helpful to think of the GIG as presenting a similar, but
actually even more complex, challenge in terms of integrating technologies
sufficiently for them to work together," notes Schmidt.
Cell-Phone Tracking: Laws Needed
Wired News (05/08/06) Singel, Ryan
The cell phone industry and privacy advocates are urging Congress to adopt
clear, standardized rules regarding the use of mobile phones to track
suspects. At ACM's recent Computers, Freedom, and Privacy Conference, a
panel agreed that Congress should write rules governing what level of
suspicion police need to have before tracking people through their cell
phones. Law enforcement is currently allowed to track suspects using their
cell phones without probable cause, a practice the Justice Department says
is sanctioned by a combination of wiretap laws governing stored
communications plus a law that lets law enforcement learn the phone numbers
people dial. However, eight out of the 10 judges who have published
decisions since August have rejected the DOJ's legal arguments. "We've
seen an avalanche of...decisions rejecting the government's hybrid theory,"
said Kevin Bankston, a lawyer with the Electronic Frontier Foundation,
during the panel discussion. "For several years, the DOJ has been
successfully pulling the wool over the eyes of magistrates." Bankston
added that some of the legal uncertainty may be resolved soon, since the
DOJ has filed an objection in at least one case. Other members of the
panel, including Catholic University of America law professor Clifford
Fishman, did not understand the fuss over law enforcement tracking cell
phones without probable cause. "The government has legitimate reasons to
follow people," he said. "This is the technology law enforcement needs to
get the probable cause to search you, arrest you, and throw you in
jail."
Q&A: IBM's Dharmendra Modha
Red Herring (05/06/06)
In a recent interview, IBM's Dharmendra Modha, chair of the Almaden
Institute, discussed his research in cognitive computing and his belief
that scientists must develop a better understanding of the human mind in
order to endow computers with intelligence. Later this week, the institute
will hold a cognitive computing conference where Modha will challenge the
scientists in attendance to advance the understanding of neuroscience and
psychology as the basis for intelligent computers. Modha prefers the term
cognitive computing to artificial intelligence because it more precisely
conveys the idea of the brain as biological hardware, equipped with all the
cognitive processes of perception, language, memory, intelligence, and
consciousness. Rather than developing algorithms to measure thoughts and
feelings, Modha's goal is to reverse engineer the brain and crack the
algorithm that mirrors the brain's processes through a mathematical and
computational approach to neuroscience and psychology. While there are too
many variables and hurdles to clear to project an accurate timetable for
reverse engineering the brain, Modha believes there is an abundance of data
that have been collected about the brain and that recent improvements in
supercomputing performance give today's researchers the ability to make
revolutionary strides in cognitive computing. Ultimately, Modha hopes for
commercial applications from even the simplest form of artificial mind.
Real-Time Maps Could Help Make Cities More Livable
Technology Review (05/10/06) Bourzac, Katherine
Researchers at MIT's SENSEable City Laboratory can track the movements of
anyone using MIT's wireless network by simply monitoring the access points
to which their devices are connected. The lab is led by Carlo Ratti, an
Italian architect who in a recent interview discussed his research aimed at
developing real-time maps from location data that provide insight into the
movement of people and the flow of traffic through cities. As companies
such as Microsoft and Google continue their push into real-time mapping and
municipal Wi-Fi projects, however, Ratti is troubled by privacy concerns,
and advocates a collaboration between city planners and technology and
telecommunications companies to develop infrastructures that will safeguard
individual privacy. Access to dynamic, real-time city maps could
streamline transportation, as individuals could tailor their movements
according to the overall traffic flow in the city. Ratti is developing a
project called Rome in Real-Time for the Venice Biennale architecture
exhibition that will overlay all the real-time information that his team
can obtain on a city map, including cell phone data and bus and taxi
positions. Architects and city planners can use the maps to make better
use of space by designing areas in accordance with real movement patterns.
Ratti says the Katrina relief debacle would have been avoided with a
real-time positioning system that tracks cell phones, though he believes
that the privacy implications are too grave to ignore. His basic solution
is to give people the choice to not have their data monitored; at MIT,
students and faculty will be able to decide on an individual basis who can
monitor their location. Companies such as Google, which is donating Wi-Fi
equipment to build a mesh network in San Francisco, are hoping eventually
to cash in on the data they collect from municipal Wi-Fi projects.
Ratti's lab is putting together a consortium to discuss the future of the
city with planners, telecom companies, and hardware and city-infrastructure
providers.
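The tracking technique the article opens with is conceptually simple: a
device's position is approximated by the access point it is associated
with at each moment. A minimal Python sketch, with made-up access-point
coordinates and association logs standing in for the campus data:

  # Hypothetical access-point positions on a campus map (meters).
  AP_COORDS = {
      "ap-library": (120.0, 340.0),
      "ap-dorm":    (410.0, 95.0),
      "ap-cafe":    (260.0, 180.0),
  }

  # Association log entries: (timestamp, device_id, access_point).
  LOG = [
      (1000, "dev42", "ap-dorm"),
      (1600, "dev42", "ap-cafe"),
      (2300, "dev42", "ap-library"),
  ]

  def track(device_id, log, ap_coords):
      """Approximate a device's path as the sequence of AP positions."""
      return [(t, ap_coords[ap]) for t, dev, ap in sorted(log)
              if dev == device_id]

  for t, (x, y) in track("dev42", LOG, AP_COORDS):
      print(f"t={t}s position approx ({x:.0f} m, {y:.0f} m)")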
Report: Computing Poised to Change the Way Science Is
Done
Johns Hopkins University News Releases (05/04/06) De Nike, Lisa
Scientists have collected more data in the past year than in all previous
years of scientific practice, amassing mountains of data that require
sophisticated new analytical tools if researchers are to extract
meaningful discoveries, according to Alexander Szalay, professor
astronomy and computer science at Johns Hopkins University. "Computer
science has the potential to drastically change the way we do science and
the science that we do," Szalay said. "It will play a critical role in
tackling the largest challenges facing our world, from medicine and health
to energy and the environment." Petabyte-scale datasets will require major
advances in computing that will have a transformative impact on scientific
practices over the next 15 years. Szalay is a member of the 2020 Science
Group, and has been working with Microsoft's Jim Gray for almost 10 years
on projects related to the impact of computing on scientific conduct.
Szalay has worked on numerous collaborations, including projects at Johns
Hopkins to simulate turbulence, develop wireless sensors to monitor the
environment, and create the multi-terabyte archive for the Sloan Digital
Sky Survey. Szalay and the 2020 group note that while the data explosion
has been mostly confined to the physical sciences, it will soon have a
major impact on the life sciences. Scientists can develop far more
accurate models of complex systems that will enable them to map epidemics
such as avian influenza and malaria and improve response time in the event
of an outbreak. With scientists increasingly looking to databases to make
connections between different sets of information, computer science could
soon become as integral to their job as mathematics. A report issued by
the 2020 group calls for elevating scientific innovation as a national
priority and improving public awareness about the value of research.
Science, Tech, Math Degrees Dropping
Durham Herald-Sun (NC) (05/06/06) Smith, Gerry
The percentage of college students graduating with math, science,
engineering, technology, and technology-related degrees has declined from
32 percent of all graduates in 1995 to 27 percent today, according to a new
report from the Government Accountability Office (GAO). The report raises
concerns about the future competitive advantage of the United States in
technology, as well as questions about whether the high-tech industry will
face a shortage of workers in the years to come. The number of foreign
workers in
the industry is also starting to fall, with the report showing a decline in
visa approvals for computer system analysis and programming jobs from
around 163,000 in 2001 to roughly 56,000 in 2002. "There's a general
perception out there that there are no jobs for students with computer
science degrees because all the jobs are going to China and India," says
Marcia Harris, director of Career Services at the University of North
Carolina. "But we're hearing from major employers like IBM and Microsoft
who are very concerned about where they're going to find talent in this
country." College students and officials see the quality of teaching and
high school preparation as having a negative impact on the number of
students enrolled in math, science, and tech programs. Universities would
do well to target women and minorities in their outreach efforts, added the
GAO.
Microsoft Scientists Pushing Keyboard Into the
Past
CNet (05/03/06) Kanellos, Michael
The technologies and prototypes on display at Microsoft Research's road
show indicate that the software giant's lab is focused on making it easier
for cell phone and handheld users to input data or navigate the Web.
Microsoft is showing off a prototype of a program that will enable users of
the devices to conduct searches by using abbreviations and truncated
versions of words. The Wild Thing application is designed to make use of
the telenumeric pad on a cell phone, in which each number stands for three
sequential letters (i.e., A, B, or C for 2) and punctuation marks determine
word spaces or other grammatical rules. Fewer letters would be required for
more popular topics, and the application groups search results. A user who
types "2*#7423" for Condoleezza Rice would see results for Condi Rice
first, but would also receive search results for brown rice, Anne Rice, and
Cellular Shades. Microsoft is holding
discussions with device makers and carriers, and believes the application
could find its way into cell phones within a year. The company is also at
work on a software interface that would enable users to input letters using
sweeping motions and gestures; the approach is similar to the Shark
prototype of IBM and the application Hewlett-Packard has developed for the
Indian market. Microsoft is also displaying its Pinpoint application for
monitoring the location of someone using GPS or triangulation with Wi-Fi or
cellular data.
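The telenumeric encoding behind Wild Thing is just the standard telephone
keypad; a small Python sketch of how a query such as "c rice" collapses
onto digits (the "*" word separator below is an assumption, since the
article does not spell out the application's exact syntax):

  # Standard telephone keypad: each digit stands for a run of letters.
  KEYPAD = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
            "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
  LETTER_TO_DIGIT = {ch: d for d, run in KEYPAD.items() for ch in run}

  def to_digits(phrase, sep="*"):
      """Encode a phrase as keypad digits; sep marks word breaks (assumed)."""
      return sep.join("".join(LETTER_TO_DIGIT[c] for c in w.lower())
                      for w in phrase.split())

  print(to_digits("c rice"))      # 2*7423 (cf. the article's "2*#7423")
  print(to_digits("condi rice"))  # 26634*7423 -- the full name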
DNS Security: Most Vulnerable and Valuable Assets
IT Observer (05/08/06)
A survey conducted by Cornell University's Computer Science Department
mined public data to determine the most vulnerable assets of the Domain
Name System (DNS), the servers most likely to be attacked because they
control the biggest chunk of the namespace, and the existence of servers
with known vulnerabilities and the domain names they affect. The survey
found that attackers can gain a tremendous advantage by exploiting the
architecture of the legacy DNS, which creates many non-obvious dependencies
between names and nameservers. The higher the number of nameservers on
which a domain name depends, the bigger the trusted computing base, which
leads to a larger number of dependencies, a bigger attack profile, and
greater susceptibility to attack. According to the survey, a routine DNS
name depends on 46 nameservers on average, while the most vulnerable top
level domain names are ranked .ua, .by, .al, .sm, .mt, .va, .pl, and .it,
from highest to lowest; the bulk of country code TLDs average more than 100
dependencies per name. The survey ascertained the most valuable DNS assets
by evaluating how important a role a DNS nameserver plays in name
resolution, and found that a nameserver is involved in the resolution of
166 externally visible names, on average. Furthermore, 67 hostnames
appearing in Yahoo!+DMOZ depend on the nameserver ranked 5000, 29 publicly
visible Web sites rely on the nameserver ranked 10000, and the median
number of externally visible names served is four. In addition,
institutions that may be ill-equipped or unwilling to assume DNS
functionality operate many important servers. Information about the most
vulnerable and most valuable DNS assets was then combined with data about
established bugs in servers to infer that one in three Internet names can
be hijacked by well-known, scripted exploits; among them are www.fbi.gov
and every other name residing in the fbi.gov domain.
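The dependency counts reflect the fact that resolving a name trusts not
just the name's own nameservers but those of every ancestor zone and,
transitively, the nameservers of the domains those servers themselves live
in. A rough sketch of that walk using the dnspython library (the starting
name is arbitrary; a real survey would also follow glue records and handle
timeouts more carefully):

  import dns.exception
  import dns.resolver  # pip install dnspython

  def nameserver_dependencies(name, seen=None):
      """Collect the nameservers a name transitively depends on."""
      seen = set() if seen is None else seen
      labels = name.rstrip(".").split(".")
      # Query NS records for the name and every ancestor zone.
      for i in range(len(labels)):
          zone = ".".join(labels[i:])
          try:
              answers = dns.resolver.resolve(zone, "NS")
          except dns.exception.DNSException:
              continue  # not a zone cut, or the lookup failed
          for rr in answers:
              ns = rr.target.to_text().rstrip(".")
              if ns not in seen:
                  seen.add(ns)
                  # The nameserver's own domain drags in more servers.
                  parent = ".".join(ns.split(".")[1:])
                  if parent:
                      nameserver_dependencies(parent, seen)
      return seen

  print(len(nameserver_dependencies("www.example.com")), "dependencies")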
Smalltalk: Requiem or Resurgence?
Dr. Dobb's Journal (05/06/06) Chan, Jeremy
Jonah Group principal consultant Jeremy Chan cannot verify a resurgence in
the use of the Smalltalk object-oriented language as indicated by Georg
Heeg at the recent Smalltalk Solutions Conference, noting that none of his
company's customer requests exhibit a desire for Smalltalk. Chan writes
that Smalltalk's status as an OO language does not really provide an
explanation of why it might be superior to other OO languages, and why,
given such alleged superiority, it has been displaced by the likes of Java
and C# as the most preferred language for enterprises in the last decade.
"This is the essence of the Smalltalk Paradox," Chan says. The author
reasons that developers may have difficulty relating Smalltalk's
programming concepts, presented by defining the language's five principal
vocabulary terms (object, message, class, instance, and method), to
something else they already know. The vocabulary defines four rules of the
language: All things are objects; all objects represent instances of some
class; objects perform tasks by sending messages; and messages are deployed
via methods. Chan attributes Smalltalk's lack of popularity to several
factors, including the defensive posture Smalltalk developers assume when
the language's superiority is questioned, and the absence of a major
industry backer with the marketing muscle to facilitate Smalltalk's
mainstream penetration. The author believes the Smalltalk community would
receive a significant boost by attracting outsiders and discussing
collaboration.
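Smalltalk code itself never appears in the column, but the four rules Chan
lists describe a message-passing object model that can be approximated in
any OO language. A loose Python analogue, with Python method calls standing
in for Smalltalk message sends:

  # Rule 2: every object is an instance of some class.
  class Account:
      def __init__(self, balance):
          self.balance = balance

      # Rule 4: messages are implemented ("deployed") via methods.
      def deposit(self, amount):
          self.balance += amount
          return self

  acct = Account(100)

  # Rule 3: objects perform tasks by being sent messages. Python writes
  # the send as acct.deposit(50); Smalltalk would write: acct deposit: 50.
  acct.deposit(50)

  # Rule 1: all things are objects -- even integers and classes themselves.
  print(isinstance(3, object), isinstance(Account, object))  # True True
  print(acct.balance)  # 150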
SPARQL Will Make the Web Shine
eWeek (05/01/06) Vol. 23, No. 18, P. 50; Rapoza, Jim
While the Semantic Web holds the considerable potential to streamline the
organization and development of online content, many technically savvy
people are still completely unfamiliar with the term. The constellation of
technologies that have emerged as signposts of the development of Web
2.0--blogs, wikis, social networking sites--is much better known, and the
popularity of those applications has indeed brought Web creator Tim
Berners-Lee's vision of the Web as a space where users can both create and
search for content with similar ease closer to a reality. Neither the
Semantic Web nor Web 2.0 has an adequate querying and search technology
specifically designed for it, though that could soon change if the W3C
recommendation to make the SQL-like SPARQL the standardized query language
for the Semantic Web is accepted. SPARQL is rooted in the Resource
Description Framework (RDF), though it also employs numerous Web services
standards, including Web Services Description Language. The basic
components of SPARQL are a normal query language, a data model (essentially
RDF), and a data access protocol. By enabling users to query precise and
relevant information from large databases, SPARQL has the vast potential to
transform search on the Web, pulling data from a comprehensive search of
RSS feeds, image sites, Google applications, and numerous other sources.
If SPARQL is accepted as a standard, it will only require minimal changes
in the way that users create content, thanks to the emergence of Web 2.0
technologies.
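For a taste of what a SPARQL query looks like in practice, here is a
sketch using the rdflib Python library over a made-up scrap of FOAF data
(the graph contents and URIs are purely illustrative):

  from rdflib import Graph  # pip install rdflib

  # A tiny RDF graph in Turtle syntax (made-up FOAF data).
  turtle = """
  @prefix foaf: <http://xmlns.com/foaf/0.1/> .
  <http://example.org/alice> foaf:name "Alice" ;
                             foaf:knows <http://example.org/bob> .
  <http://example.org/bob>   foaf:name "Bob" .
  """

  g = Graph()
  g.parse(data=turtle, format="turtle")

  # SPARQL reads like SQL but matches graph patterns instead of rows.
  query = """
  PREFIX foaf: <http://xmlns.com/foaf/0.1/>
  SELECT ?name ?friendName WHERE {
      ?person foaf:name ?name .
      ?person foaf:knows ?friend .
      ?friend foaf:name ?friendName .
  }
  """

  for row in g.query(query):
      print(f"{row.name} knows {row.friendName}")  # Alice knows Bob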
Defining Trust
SC Magazine (04/06) P. 26; Kaplan, Dan
Klocwork CTO Djenana Campara co-chairs the Object Management Group's (OMG)
Architecture-Driven Modernization Special Interest Group, which seeks to
prevent terrorists from funding their malicious activities through the
exploitation of insecure U.S. networks. The group is building a framework
that would evaluate risk and detail the characteristics and elements that
make up trustworthy software. This effort is supported by the federal
government, whose operations depend on the security and trustworthiness of
software.
Campara says tool vendors and software manufacturers must shoulder the
burden of following best practices, and she thinks the framework
would establish standardized design criteria and automated procedures for
tool vendors and software makers to adopt to make sure their products are
reliable and trustworthy. "Tool vendors will be building tools based on
this framework because they will know that there is a market for them,
while software suppliers will use those tools to improve and clean up
software products," says Campara. James Madison University computer
science professor Samuel Redwine reasons that a universal software
assurance framework would carry benefits for buyers as well as sellers.
"It would separate out the people who have convincing arguments and
evidence of why you should have confidence in software from those that
don't, in a rather clear way," he notes. OMG expects to release a request
for proposal (RFP) for the model, which will be open to all OMG members, by
November or December; a standardized code analysis process does not
currently exist, and creating one will require a concentration on following
authoritative coding processes, says Anthony Nadalin of IBM Software.
Nancy Mead with the Software Engineering Institute expects some vendors
will initially resist the idea of a universal framework, especially if
conforming to it requires a substantial amount of labor.