Right Questions Key to Data Mining
Chicago Tribune (05/12/06) Van, Jon
Homeland security agents are hoping that sophisticated computer programs
will help them cull through the records of millions of phone calls that
Americans have made to each other as they search for information about
terrorist plots. Some researchers question whether even the most advanced
computers will ever be able to perform the necessary link mining
operations, and congressional leaders are insisting that the Bush
administration answer questions about whether the domestic surveillance
program violates an individual's right to privacy. Others argue that the
necessary data mining operations will be difficult, but possible. "It's a
massive data problem, but you can do it," said Karl Hammond, professor of
electrical engineering and computer science at Northwestern University.
"If it were impossible to get specific answers to specific questions from a
huge database, Google couldn't exist." Hammond believes the key will be
asking targeted questions, such as limiting a query to the mobile phones in
Washington, D.C., that made calls to Tehran over a given period, and
whether those phones made calls to San Francisco in another period. "If
you approach the data without specific questions and just look for
patterns, you can find hundreds of millions of patterns," Hammond said.
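Hammond's kind of targeted question can be sketched as a database query. The schema, phone numbers, and date ranges below are invented purely for illustration; nothing is publicly known about how the actual call-record database is organized:

```python
import sqlite3

# Hypothetical call-record schema, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE calls (
    caller TEXT,      -- originating phone number
    dest_city TEXT,   -- destination city of the call
    day INTEGER       -- day the call was placed
)""")
conn.executemany("INSERT INTO calls VALUES (?, ?, ?)", [
    ("202-555-0101", "Tehran", 3),
    ("202-555-0101", "San Francisco", 40),
    ("202-555-0199", "Tehran", 5),
    ("202-555-0150", "San Francisco", 41),
])

# The targeted question: which phones called Tehran during one period
# (days 1-30) AND San Francisco during another (days 31-60)?
rows = conn.execute("""
    SELECT DISTINCT a.caller
    FROM calls a JOIN calls b ON a.caller = b.caller
    WHERE a.dest_city = 'Tehran'        AND a.day BETWEEN 1  AND 30
      AND b.dest_city = 'San Francisco' AND b.day BETWEEN 31 AND 60
""").fetchall()
print([r[0] for r in rows])  # only 202-555-0101 matches both criteria
```

A query this specific returns a handful of rows; an unconstrained hunt for "patterns" over the same table would, as Hammond notes, return essentially everything.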
Government agents may not realize that computers, unlike human detectives,
cannot make inferences and instantly change their assumptions, according to
Yali Amit, professor of statistics and computer science at the University
of Chicago. Amit also warns that it may be impossible to extract
meaningful data about such a small subset--suspected terrorists--from the
vast repository of phone records. Computer researchers must also consider
social networking phenomena such as instant messaging and buddy
lists as they develop new data mining techniques.
Click Here to View Full Article
to the top
Reversing Course on Electronic Voting
Wall Street Journal (05/12/06) P. A4; Cummings, Jeanne
Citing the spate of demonstrated vulnerabilities in e-voting machines,
some supporters of the 2002 Help America Vote Act have grown concerned that
the law intended to improve the voting process could have made things much
worse, and have begun filing lawsuits to block the compliance efforts of
some state election officials. The law, passed to ensure that the
confusion surrounding the 2000 presidential election is not repeated,
requires states to upgrade their voting systems to electronic machines,
which at the time were considered more reliable than the archaic paper
ballots being used in many states. Arizona was sued last week over the
e-voting machines that it purchased with federal money authorized by the
act, and a suit is likely to be filed against Colorado election officials
next week. The Arizona lawsuit charges that the e-voting machines are
unreliable, susceptible to fraud, and that electronic ballots are more
difficult to recount than paper ones. The Help America Vote Act "has been
turned on its head and it's causing more problems than solutions at this
point," said Lowell Finley, co-founder of Voter Action. E-voting vendor
Diebold argues that its equipment is secure and that it runs on technology
that has been
in use for at least a decade. Several states returned to paper ballots
after experiencing glitches in electronic machines in the 2004 election.
In addition to the charge that they are unreliable, critics of touch-screen
systems claim that the sophisticated technology gives too much control over
the election process to equipment makers. Investigations into glitches in
e-voting systems have uncovered both technical flaws and cases of user
error. Although there has not yet been a proven instance of anyone
electronically manipulating votes in an actual election, computer
scientists say it's possible. A 2005 report from the Commission on Federal
Election Reform warned that "Software can be modified maliciously before
being installed into individual voting machines. There is no reason to
trust insiders in the election industry any more than in other industries."
To view a report entitled "Statewide Databases of Registered Voters," by
ACM's U.S. Public Policy Committee, visit
http://www.acm.org/usacm/VRD
This Is Your Brain on a Microchip
CNet (05/11/06) Olsen, Stefanie
The parallels between the current interest in cognitive computing and the
preliminary emergence of mobile computing are not lost on Jeff Hawkins,
co-founder of Palm Computing. Endowing a computer with the ability to
process information like a human brain--the essence of cognitive
computing--will either occur "'not in our lifetime' or 'any moment now,'"
Hawkins wryly observed to a crowd at this week's cognitive computing
conference. "We've been trying to do this for 50 to 60 years. Artificial
intelligence, fuzzy logic, neural networks, the Fifth Generation
project--they've all had big moments in the sun." The researchers at the
conference agreed that in spite of the many disappointments and failed
projects over the years, some cognitive computing initiatives are actually
beginning to pay off. Hawkins himself launched a company called Numenta in
March 2005 that is developing an open-source computer memory platform
modeled after the human brain that will allow programmers to develop
applications for artificial intelligence, computer vision, and machine
learning. James Albus of the National Institute of Standards and
Technology called for a national program to formulate a scientific theory
of the mind, proclaiming that cognitive computing is approaching a
watershed. The technology required to conduct conclusive experiments is
rapidly emerging and intelligent systems are becoming commercially viable
in areas such as the automotive and entertainment industries, Albus noted,
adding that government and industry will invest billions of dollars in
cognitive computing research over the next 10 years. Scientists are
increasingly focusing their research on the neocortex, the area which
accounts for around 80 percent of the brain and controls sophisticated
thought. Several projects are underway to create complex simulations of
different aspects of the neocortex.
French Digital Music Copyright Bill Advances
New York Times (05/12/06) P. C3; Crampton, Thomas
French lawmakers have moved closer toward passing a copyright law that
could reshape the landscape of digital music. Bowing to pressure from
Apple, the Senate amended the bill to modify the provisions that would have
required Apple to make all the music sold at its iTunes store playable on
devices other than the iPod. The Senate version of the bill also only
allows companies to appeal to the courts to force Apple to open its music,
while the version in the National Assembly permits such requests from
consumers. The material effect of the legislation on companies such as
Apple and Sony will only be determined by committee sessions, but the issue
reflects the broader debate playing out around the world over patents and
copyrights in the age of Internet distribution. "France has adopted an
entirely new and unique approach to managing digital music and films that
could be a model for other countries to follow," said Ovum's Jonathan
Arber. "Everyone will be watching the impact six months down the line to
see whether consumers or companies have benefited." The penalties for
piracy are reduced to the level of a traffic infraction and software makers
must disclose details of their programs to the government in both versions
of the bill. Apple, Vivendi, and Time Warner are aggressively lobbying
against the bill, claiming it is tantamount to sanctioning piracy, though
the French government argues that it will encourage innovation.
As Tech Advances, Privacy Laws Lag
Los Angeles Times (05/12/06) P. A1; Menn, Joseph; Granelli, James S.
Privacy laws are struggling to keep up with rapid advancements in
data-tracking technology, a fact that was underscored by Thursday's
revelation that three of the top telephone companies in the country
provided customer calling records to the National Security Agency (NSA).
The advent of powerful database tracking programs has made Americans'
personal data easier to collect and distribute than ever before. A wide
range of parties, including credit card companies, online retailers,
curious neighbors, and county law enforcement, now have the capability to
collect this personal data. And companies that collect this type of data
can suddenly find themselves at the center of a privacy controversy when
their customers' privacy expectations collide with the U.S. government's
national security needs, which is what happened when AT&T, Verizon, and
BellSouth complied with the NSA's request for customer calling records.
Online retailers such as Amazon.com use powerful software to make
recommendations to customers, and credit card companies also tailor their
offers to consumers by tracking consumer purchases. "You have to think
about how that information could be misused or used too zealously," says
Martin Flaherty, a law professor at Fordham Law School.
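The purchase-tracking approach the article attributes to online retailers can be illustrated with a toy co-occurrence recommender. Real retail systems are far more sophisticated; the item data and function below are invented for illustration:

```python
from collections import Counter
from itertools import combinations

# Each inner list is one customer's purchase history (toy data).
histories = [
    ["camera", "memory card", "tripod"],
    ["camera", "memory card"],
    ["camera", "tripod"],
    ["novel", "bookmark"],
]

# Count how often each pair of items appears in the same customer's history.
co_bought = Counter()
for items in histories:
    for a, b in combinations(sorted(set(items)), 2):
        co_bought[(a, b)] += 1
        co_bought[(b, a)] += 1

def recommend(item, k=2):
    """Return the items most often bought alongside `item`."""
    scores = Counter({b: n for (a, b), n in co_bought.items() if a == item})
    return [b for b, _ in scores.most_common(k)]

print(recommend("camera"))  # camera buyers also bought memory cards, tripods
```

The same co-occurrence counts that drive helpful recommendations are also exactly the kind of linkage data that raises the privacy concerns the article describes.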
3 States Mandate More Security for Diebold E-Voting
Machines
Associated Press (05/11/06) Goodin, Dan
Diebold is developing a permanent solution for a flaw in its electronic
voting machines that some observers believe could be used to conduct
unauthorized functions, and even sabotage an election. Researchers with
Black Box Voting, a nonpartisan, not-for-profit organization, discovered
the feature that could theoretically allow a hacker to load unauthorized
software on Diebold Election Systems e-voting machines, and the Oakland
Tribune reported the vulnerability this week. Black Box Voting also plans
to release a report on its finding this week. "It's a deliberate feature
that was added by Diebold that we all believe is unwise," says Carnegie
Mellon University computer science professor Michael Shamos, who has been
briefed on the flaw. Diebold maintains that there has been no evidence
that any voting on its machines has been compromised, adding that following
its existing security procedures will make it difficult for anyone to take
advantage of the vulnerability. Although Pennsylvania officials say
someone would need to have physical access to the memory card slot while
the system booted up in order to exploit the vulnerability, they have
ordered local officials to reinstall the authorized software just before
testing Diebold machines and certifying them for use. California and Iowa
have mandated similar policies for Diebold computerized machines until the
company delivers a permanent solution.
For information on ACM's e-voting activities, visit
http://www.acm.org/usacm
AJAX Experts Tackle Security, Other Issues
eWeek (05/11/06) Taft, Darryl K.
A group of experts met at this week's AJAX Experience conference to discuss
the major issues concerning AJAX, such as tooling, security, support, and
the stance of Microsoft. Members of the audience were most concerned
about security, and panelist Alex Russell, co-founder of The Dojo Toolkit,
noted that the basic security issues have not changed over the past five
years, and that trust is still at the center of computing security,
irrespective of the introduction of AJAX. There have been some recent
developments that could optimize the browser capabilities and improve the
cross-domain problem, said Brent Ashley, consultant and scripting expert
who specializes in AJAX. "There are JSON [JavaScript Object Notation]
requests that don't exchange cookies during the request. And [Adobe] Flex
and ActionScript have a cross-domain file that says, 'These sites are
allowed to cross-domain with me.' That gives some control back to the
server side. So while there are issues now, here's a new set of
constraints." Some panelists expressed frustration at the lack of
compatibility between AJAX and Microsoft's Internet Explorer. Russell also
noted that the numerous AJAX frameworks that have emerged generally have a
high level of interoperability. When asked about mobile AJAX, Sun's Greg
Murray said that his company is looking into developing an AJAX platform to
support portable devices.
U. of I. Goal: To Revamp Computers
Chicago Tribune (05/11/06) Van, Jon
Researchers at the University of Illinois have launched an ambitious
project to overhaul large-scale computing, improving both reliability and
security, and have built a prototype called Trusted ILLIAC that will soon
connect some 500 processors to form a new supercomputer. The scientists
have partnered with government and industry leaders to develop a technique
for predicting a system's reliability and security that will ultimately
lead to an essential test-bed for private industry. "We expect within two
or three years that our industrial partners will be demonstrating this
technology," said Ravi Iyer, head scientist at the university's Information
Trust Institute. By developing a system equipped with hardware and
software to recognize the applications running on a computer, the
scientists are taking a more basic approach to security than the current
system of ad hoc patches. The system can work through software bugs and
reconfigure itself as new security threats emerge, bringing IBM's
longstanding goal of autonomic computing a step closer to reality. Though
it is designed to improve the large-scale computing systems at major
corporations such as Microsoft and Hewlett-Packard (which are both partners
in the project), the technology could improve the experience of all
Internet users. William Sanders, director of the Information Trust
Institute, says, "It will have a big impact on pervasive computing and the
handheld devices like the PDAs and BlackBerrys that people use."
From Geek to Chic: The Changing Face of Computing
Florida State University (05/11/06) Elish, Jill
Professors from 10 universities have formed the Students and Technology in
Academia, Research, and Service (STARS) Alliance to promote diversity in
IT. The consortium recently received a $2 million grant from the NSF to
recruit a diverse body of students to pursue college degrees in IT,
computer science, and other fields related to computing. "We want to
encourage more people--particularly women, underrepresented minorities, and
people with disabilities--to pursue careers in computer science and
information technology," said Larry Dennis, dean of the College of
Information at Florida State University. Other FSU professors agree that
nurturing the IT workforce is a matter of vital national importance, and
that diversity is essential for the future of IT in the United States.
With fewer highly skilled foreigners coming to the United States and the
demographic trend of declining white male representation in the workforce,
women and other groups have an unprecedented opportunity to claim their
share of the 1.5 million IT and computing jobs projected to be created by
2012, according to FSU research associate Anthony Chow. The perception
that computing is the sole province of white, male nerds is a serious
obstacle to recruiting a diverse group of students, and the STARS Alliance
is trying to give the field an image makeover. The consortium will hold up
role models in industry to report market trends and debunk the myths that
plague computing, such as the assumption that men are superior at solving
technical problems. The STARS Alliance will create and maintain a Web site
promoting, among other things, the Student Leadership Corps, which will
support a variety of initiatives, including peer mentoring, research
opportunities, community involvement, and professional development.
Students & Turtles Mesh
Unstrung.com (05/10/06) Martin, Richard
Computer scientists at the University of Massachusetts have developed a
Wi-Fi network based on the theory of discontinuous networks. The
UMassDieselNet project is a large-scale network spread across Amherst's
150-square-mile bus system that provides riders with real-time information
on the location and arrival times of individual buses. The system creates
what associate computer science professor and project head Brian Levine
calls "unpartitioned networks," which are absent in many Wi-Fi-enabled
environments. "There are lots of places that have partitions," he says.
"For instance, a region like New Orleans that's been hit with a natural
disaster, when all the infrastructure has gone down. The power's out, cell
towers are down, how can you maintain a network? Or areas where no
infrastructure exists in the first place--like India in particular." The
researchers attempted to develop a model that would tolerate disruptions in
service, such as a bus traveling in and out of hotspots as it follows its
route. Each bus is outfitted with a Linux-based computer with an onboard
Wi-Fi access point for passengers and an additional 802.11b USB card that
continually scans for access points while the bus is en route. Constantly
pulling data from its surroundings, the system is intrinsically imperfect,
though it is built to work in less-than-perfect settings. Levine's
colleague Mark Corner has applied the idea of discontinuous networks to
tracking an endangered population of wild turtles near Amherst, equipping
each turtle with a package containing a small computer, a GPS tracker, a
solar cell and battery, and a wireless transmitter that makes it a node in
the network.
New Supercomputing Center to Advance the Science of
Nanotechnology
Rensselaer News (05/10/06)
Nanotechnology research will be the focus of a new supercomputing center
on the campus of Rensselaer Polytechnic Institute that is expected to be
up-and-running by the end of the year. Scientists who work at the
Computational Center for Nanotechnology Innovations (CCNI) will pay close
attention to the time and cost of shrinking the dimensions of materials,
devices, and systems, as well as the industries that stand to benefit from
nanotechnology, including semiconductor makers. "The CCNI will bring
together university and industry researchers under one roof to conduct a
broad range of computational simulations, from the interactions of atoms
and molecules up to the behavior of the complete device," explains Omkaram
Nalamasu of Rensselaer. The creation of CCNI is the result of a $100
million partnership between Rensselaer, IBM, Cadence Design Systems, and
AMD. CCNI will be the most powerful university-based supercomputer center
in the world, with a system comprised of massively parallel Blue Gene
supercomputers, POWER-based Linux clusters, and AMD Opteron processor-based
clusters, which will yield a speed of 70 teraflops. Such computing muscle
will make CCNI one of the top 10 supercomputing centers in the world. In
addition to modeling, simulating, and optimizing nanoelectronic devices and
circuitry, CCNI will be used for other research projects on campus, such as
for biocomputation research.
Right-Brained Programming
Dr. Dobb's Journal (05/08/06) Murphy, Niall
Thinking outside the box may help programmers find better solutions to
problems than they would by using routine algorithms, writes Niall Murphy
of Embedded Systems Design. "It's sometimes necessary to throw away the
current way of doing things in order to see that alternatives are
possible," he explains. Murphy recalls how in his university days lateral
thinking was used to build a rubber band-driven miniature car, since the
mechanism to force the rubber bands to unwind more slowly so that the
wheels had sufficient traction to avoid spinning was hard to design or
implement. The solution was to replace the wheels with spikes, allowing
the car to slide over the surface by means of a catapult. Murphy cites the
traditional strategy for picking the timeout period for a watchdog timer in
an embedded system--approximating the time a system takes to traverse its
main loop and multiplying by some factor to accommodate loop time
variation--as an approach to avoid in favor of considering the outcome of a
lengthy timeout. "It's better to choose the timeout based on the external
impact of the system, rather than starting with the system's internal
performance," he advises. Murphy also takes note of new email spam filters
based on algorithms that spot genuine messages instead of following rules
to detect spam, and points out that such a strategy might be optimal
considering the percentage of online email traffic that is currently
spam.
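Murphy's contrast between the two ways of picking a watchdog timeout can be sketched as follows. The function names and the motor-controller numbers are illustrative, not taken from his article:

```python
def timeout_from_loop(loop_time_ms, safety_factor):
    """The traditional approach Murphy warns against: estimate the main
    loop's traversal time and multiply by a fudge factor to absorb
    variation. The result reflects only the system's internals."""
    return loop_time_ms * safety_factor

def timeout_from_impact(max_tolerable_outage_ms, reset_time_ms):
    """The outcome-driven approach: the system may be unresponsive for at
    most max_tolerable_outage_ms, and recovering from a reset itself costs
    reset_time_ms, so the watchdog must fire within the difference."""
    return max_tolerable_outage_ms - reset_time_ms

# A hypothetical motor controller whose main loop runs in 10 ms but whose
# motor can safely coast for 2 s, of which a reset consumes 500 ms:
print(timeout_from_loop(10, 5))        # 50 ms, derived from internals
print(timeout_from_impact(2000, 500))  # 1500 ms, derived from external impact
```

The two numbers differ by a factor of 30 here, which is Murphy's point: the internal estimate says nothing about what the system can actually tolerate.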
Sense of Speed
The Engineer Online (05/09/06)
Computer science researchers in the United Kingdom believe wireless
sensing technology can be used to improve the performance of sprinters who
have dreams of competing in the 2012 Olympics. Over the next four years,
Dr. Stephen Hailes from the Computer Science department of University
College London will head a project to outfit sprinters with lightweight
sensors that are able to relay real-time data on limb position,
orientation, muscular function, and physiological status. "We want
something that an athlete can wear without being aware of it," says Hailes.
He says current systems use components that are too big and do not provide
an easy interface for wireless networking, while video motion capture
involves equipment that can be disruptive to training and produces data
that is not easy to interpret. The researchers face challenges in attaching sensors to
athletes in a manner that will facilitate the production of accurate data,
and developing a mechanism for quickly and effectively interpreting data
upon arrival. "We will have to extract meaning from that data in a way
coaches and the athletes can use, which is difficult as it is often noisy
and imprecise," says Hailes. As a way to provide real-time feedback to
coaches and sprinters, the researchers are considering having the wireless
sensing system produce a certain sound to indicate when an athlete's foot
should strike the track, and overlaying data on video imagery so
information can be easily understood.
Bush Broadband Goal Fading
InternetNews.com (05/09/06) Mark, Roy
Enthusiasm toward the president's goal to establish "universal,
affordable" broadband access is waning, as indicated by a Government
Accountability Office (GAO) study estimating that just 28 percent of
Americans had broadband connectivity last year, while 30 percent of polled
households accessed the Internet via dial-up and 41 percent of the country
lacked an Internet connection of any kind. Rural Americans are less likely
than urban Americans to subscribe to broadband, and broadband has
penetrated a mere 17 percent of rural U.S. homes. Factors impeding
broadband take-up include price, availability of broadband applications and
services, and technical issues, according to the GAO. The study points out
that many households cannot acquire DSL given the limited range of copper
DSL connections, while wrangling over rights-of-way, pole attachments, and
wireless tower sites can also hinder broadband implementation. "The
disparity of broadband deployment between rural and urban America cited in
the GAO report raises serious concerns," stated Senate Commerce Committee
Chairman Sen. Ted Stevens (R-Alaska). "High-speed Internet access is
absolutely essential to all Americans, whether you live in Manhattan or a
remote village in Alaska." Stevens recently issued a draft bill calling
for numerous measures designed to boost broadband deployment, including
national video franchises, municipal broadband services, wireless use of
white space spectrum, emergency network interoperability, and the
application of Universal Service Fund (USF) rules to "communications
providers" that include VoIP and broadband suppliers. The new USF fees
would be committed to broadband deployment in rural and high-cost U.S.
regions.
System-Level Design Language Arrives
EE Times (05/08/06)No. 1422, P. 1; Goering, Richard
Proponents of the System Modeling Language (SysML) believe that their
vision of a modeling language that represents all features on an electronic
system--from hardware to software--will soon be a reality. SysML can be
used to specify, analyze, and design hardware, data, personnel, procedures,
and facilities in elaborate systems, though the language is just beginning
to realize its potential in the area of hardware design, and advocates
believe that it could soon be used for system-on-chip (SoC) design. "Our
emphasis is to look at how we can integrate with electrical design," said
Lockheed-Martin's Sanford Friedenthal, chair of the SysML team, adding that
the language should be compatible with implementation languages such as
VHDL. "I believe it's totally feasible, and I believe we have a lot of
constructs that are very natural for supporting integration with electrical
design." The SysML 1.0 specification received unanimous approval from two
committees of the Object Management Group in April, and final approval is
expected in February, though after the April vote standardization is
considered all but a formality. The origins of SysML date to the
mid-1990s, when software developers commonly used the Unified Modeling Language
(UML) to support different kinds of structure, behavior, and interaction
diagrams. SysML, which grew out of a request for proposals issued in March
2003, builds a stronger bridge between software and hardware design than
UML 2.0 with its block component that abstracts away from UML's
software-specific features. Other key features of SysML include
requirements modeling and the ability to support parametric models that
describe the properties and relationships within systems. "SysML is more
tailored to the entire system," said Alberto Rosti of STMicroelectronics,
"whereas UML is for modeling the software artifacts."
Accessibility Issue Comes to a Head
Computerworld (05/08/06) Sliwa, Carol
Bruce Sexton Jr. has joined a lawsuit filed by the National Federation of
the Blind as a plaintiff alleging that the Target Web site violates the
Americans With Disabilities Act (ADA), the California Unruh Civil Rights
Act, and the Disabled Persons Act. Sexton, who is legally blind, claims
that certain information on Target's site cannot be read by his
screen-reader software, and that the site requires a mouse to navigate.
The lawsuit is shaping up to be a landmark referendum on Internet
accessibility, since Target is only one of many sites that could be accused
of inhibiting access for the disabled, according to the plaintiffs'
attorneys. The problem has been exacerbated by the shift from text-based
to visually oriented content that is only likely to continue with the
emergence of Web 2.0 technology, which could update information without
refreshing the entire screen through the use of Asynchronous JavaScript and
XML (AJAX) and Dynamic HTML. Assistive technology such as screen readers and
magnifiers would have no way of knowing what information has been updated
unless developers take steps to ensure that the updates are readable.
Working within the W3C, IBM is heading up a dynamic accessible Web project
calling for such measures as a development syntax that would relay
information about a site's accessibility to assistive technology
applications so they would know which parts of a Web page have been
changed, though the proposals are still in draft form. The Mozilla
Foundation included support for the technology in its Firefox 1.5 browser,
but the forthcoming 7.0 version of Internet Explorer will not support it.
Gartner's Ray Valdes claims that most Fortune 500 companies are largely
unaware of how accessible their public Web sites are, and that cost is a
prohibitive factor in improving their accessibility. The lawsuit could
finally clarify the question of whether the ADA, enacted in 1990, applies
to Web sites.
One Qubit at a Time
The Economist (05/04/06) Vol. 379, No. 8476, P. 79
As the miniaturization of computing components approaches the atomic
scale, physical limitations will halt the process, forcing scientists to
use alternative methods to improve performance. Scientists are looking to
quantum computing as one possible solution, harnessing the quirky
properties of quantum physics to perform vast numbers of calculations in
parallel. While quantum computers hold the potential to
solve problems that stymie existing computers, scientists have only been
able to make very basic models that often work only in tightly defined
conditions. The superposition of qubits, the quantum bits such machines
would use, is only preserved if they are isolated from their environment,
though researchers are working to resolve this problem. Andrew Briggs, a
nanomaterials researcher at Oxford University,
leads a team of scientists that successfully used the electrons of a caged
nitrogen atom as a qubit, managing to keep it in superposition for 500
nanoseconds, longer than any other molecule previously studied, but not
nearly long enough to perform a calculation. Another approach, advocated
by Hitachi's David Williams, calls for using existing silicon chips to
power quantum computers, manufacturing quantum dots on the chips' surface
that would function as qubits. Other groups are developing techniques that
use oscillating electromagnetic fields to trap ions and employ them as qubits. A
fourth approach involves a recently discovered form of matter known as a
Bose-Einstein condensate, in which atoms cooled to near absolute zero
collapse into their lowest quantum state, raising the possibility that they
could function as qubits in an environment where such temperatures can be
maintained.
AI Gets a Brain
Queue (05/06) Vol. 4, No. 4, P. 24; Barr, Jeff; Cabrera, Luis Felipe
Enabling software developers and enterprises to simply and efficiently
harness human intelligence as a core constituent of their applications and
businesses is the purpose of Amazon Mechanical Turk, a program that would
free up people to innovate by giving them the ability to instill human
intelligence within software. The Turk is characterized as an "artificial
artificial intelligence" that conceals the presence of human processing
power and intelligence supplied by a global, Internet-scale workforce that
completes tasks submitted by developers to the Amazon Mechanical Turk Web
site. Software applications request that flesh-and-blood people execute
tasks best suited to human intelligence through Amazon Mechanical Turk's
Web service interface, and the requesting application is alerted upon
completion of the tasks and the availability of the results. Among the
internal tasks Amazon initially classified as aligning particularly well
with the kind of processing facilitated by the Turk were data improvement,
Japanese text orientation, and image selection, which have in common high
volume and business value, human centricity, varied demand, and a
self-contained nature. Key issues that had to be addressed in building the
Amazon Mechanical Turk system included scalability, reputation tracking,
accountability, flexibility, and quality control, and the resulting system
is capable of managing task submission, assignment, and completion,
matching qualified people to jobs that call for specific skills, providing
feedback to encourage quality work, and storing task details and results.
The point of interaction between the Turk's five central concepts--human
intelligence tasks (HITs), workers, qualifications, assignments, and
requesters--is the Turk Web site, where the requester identifies the task
to be done, designs the appropriate HIT and qualifications, and funds the
workers through an Amazon account deposit. The qualifications and HITs are
loaded into the Turk via Web service calls made by the requester's
application, and workers regularly visit the site to look for jobs. The
requester approves assignments after polling for reviewable HITs and
acquiring enough data to make final checks and tweaks, and the
corresponding payments are then released to the workers.
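The submit/work/review cycle described above can be modeled with a small in-memory sketch covering the five concepts (HITs, workers, qualifications, assignments, requesters). The class and method names below are invented for illustration and are not Amazon's actual Web-service API:

```python
# Toy model of the Mechanical Turk workflow; names are illustrative only.
class TurkSite:
    def __init__(self):
        self.hits = []          # open human intelligence tasks (HITs)
        self.assignments = []   # completed work awaiting requester review

    def load_hit(self, question, required_qual, reward):
        """Requester loads a HIT via a (simulated) Web-service call."""
        self.hits.append({"question": question, "qual": required_qual,
                          "reward": reward})

    def work(self, worker, quals, answer_fn):
        """A worker picks up every open HIT whose qualification they hold."""
        remaining = []
        for hit in self.hits:
            if hit["qual"] in quals:
                self.assignments.append({"worker": worker, "hit": hit,
                                         "answer": answer_fn(hit["question"])})
            else:
                remaining.append(hit)
        self.hits = remaining

    def review(self):
        """Requester approves reviewable assignments and pays the workers."""
        payouts = {}
        for a in self.assignments:
            payouts[a["worker"]] = (payouts.get(a["worker"], 0)
                                    + a["hit"]["reward"])
        self.assignments = []
        return payouts

site = TurkSite()
site.load_hit("Which image best shows a storefront?", "image-judging", 3)
site.work("alice", {"image-judging"}, lambda q: "image #2")
payouts = site.review()
print(payouts)  # {'alice': 3}
```

The real service adds the pieces the article lists on top of this skeleton: reputation tracking, quality control, and matching at Internet scale.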