Primary Voting-Machine Troubles Raise Concerns for
General Election
USA Today (03/28/06) P. 1A; Drinkard, Jim
Voting-machine difficulties in Texas and Illinois have revived concerns
that this year's election will be fraught with glitches. Since the Help
America Vote Act required states to modernize their voting equipment, it is
estimated that in this year's election more than 30 million voters will be
using unfamiliar machines. Concerns about the reliability and security of
new e-voting systems have reverberated throughout the country, and early
problems in primary elections have already materialized in two Illinois
jurisdictions--Chicago and Cook County--where precinct judges were
untrained, and paper jams and misplaced equipment caused long delays in
tallying the ballots. In Texas, state Supreme Court candidate Steve Smith
is contesting the March 7 primary due to count irregularities. An initial
ballot tally in Fort Worth had 150,000 votes recorded, though there are
only one-third that number of voters. State spokesman Scott Haywood says
the irregularities were the result of human error, and the problems have
been fixed. In May, 10 states will hold primaries, including Pennsylvania,
which is "a disaster waiting to happen," according to John Gideon, director
of VotersUnite.org. The new systems will be up to the task, however,
retorts Michelle Shafer of Sequoia Voting Systems, which provides voting
machines to Pennsylvania and 19 other states.
ACM's U.S. Public Policy Committee recently issued an in-depth report on
the accuracy, privacy, usability, security, and reliability issues of
statewide databases of registered voters. To review the report, visit
http://www.acm.org/usacm/VRD
Click Here to View Full Article
UW Leads National Effort to Bring People With
Disabilities Into Computing
UW News (03/27/06)
The University of Washington is launching a new program to improve
computing accessibility for people with disabilities. Funded by a $2
million NSF grant, the AccessComputing Alliance will attempt to create a
nationwide network through partnerships with universities and corporations
to provide specialized training for disabled students. "The shortage of
qualified professionals in computing fields is due in part to the
under-representation of specific groups of Americans, including women,
racial and ethnic minorities, and people with disabilities," said Richard
Ladner, a professor of computer science and engineering. Facilities are
frequently inaccessible, there is a shortage of role models and qualified
educators, and equipment is rarely designed with the needs of the disabled
in mind, according to Sheryl Burgstahler, director of Washington's
Disabilities, Opportunities, Internetworking, and Technology program, or
DO-IT. The alliance will attempt to grow the numbers of disabled students
in university computing programs through a system of summer academies,
internships, and mentoring programs. It will also develop analytics to
measure accessibility at different departments, and work with college
faculty to better equip them to instruct the disabled. Finally, the
alliance will build a searchable database consisting of case studies, best
practices, and training and education materials.
Click Here to View Full Article
PC, Leave Me Alone -- I'm Busy
Financial Times: Special Report (03/29/06) P. 2; Bradbury, Danny
In an effort to make technology less intrusive, Microsoft Research's Eric
Horvitz has been developing context-sensitive systems that would know when
a user is indisposed or concentrating on a task and wait to deliver a
telephone call or an email notification. Using artificial intelligence,
Horvitz's system, which Microsoft is currently testing at its headquarters,
guesses when a user is busy and informs callers when he might again be
contactable, though training a computer on a user's particular work habits
is not a simple task. Many questions arise, such as how the computer is to
handle an emergency. Should each caller be given the same level of
information? In the user's absence to whom should calls be routed, and
will every call be routed to the same person? Users must clearly see the
benefit before they will customize a user interface, warns Kara Pernice of
the Nielsen Norman Group. "The profiles must be generic enough to create
them easily, but when they're too generic they don't work properly,"
Pernice said. Beyond the user settings, sophisticated context-detection
systems would be able to infer their users' habits from data that they
collect. Such a system might notice that a user's calls are only a few
seconds long at the same time each week, suggesting that he might be in an
important standing meeting and that calls should always be rerouted during
that time. Accenture is developing systems that can even sense emotion in
a phone conversation. Researchers at MIT's Media Lab analyzing mobile
phone data found that they could predict the movements of the test subjects
with a 90 percent rate of accuracy. British Telecom CTO Paul Garner
believes that the context-sensing abilities of data mining are limited, and
that the real breakthrough will come from physical sensors. Advanced
context-sensing systems could have a major impact on home health care,
though they will require new modes of communicating.
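The standing-meeting inference described in the article can be sketched as a simple heuristic. Everything below — the data layout, the thresholds, the function name — is an illustrative assumption, not a description of Microsoft's or Accenture's actual systems:

```python
from collections import defaultdict

def infer_meeting_slots(call_log, max_duration=10, min_weeks=3):
    """Flag weekly (weekday, hour) slots where every observed call was
    cut short, hinting at a recurring meeting during which calls
    should be rerouted. Purely illustrative thresholds."""
    slots = defaultdict(list)
    for weekday, hour, duration_seconds in call_log:
        slots[(weekday, hour)].append(duration_seconds)
    return {
        slot for slot, durations in slots.items()
        if len(durations) >= min_weeks
        and all(d <= max_duration for d in durations)
    }

# Calls on three consecutive Mondays at 10:00 lasted only seconds;
# a Wednesday call ran long, so that slot is not flagged.
log = [(0, 10, 4), (0, 10, 6), (0, 10, 3), (2, 14, 120)]
print(infer_meeting_slots(log))  # {(0, 10)}
```

A deployed system would of course learn such patterns statistically rather than by a fixed cutoff, but the shape of the inference — short calls recurring at the same weekly slot imply "reroute during this time" — is the one the article describes.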
Click Here to View Full Article
- Web Link to Publication Homepage
Internet Firms Want FCC to Enforce Net Neutrality
Washington Post (03/29/06) P. D4; Mohammed, Arshad
A half-dozen Internet companies are protesting a bill supported by House
Commerce Committee Chairman Rep. Joe Barton (R-Texas) that permits the FCC
to decide Web access disputes only on a case-by-case basis and prevents the
commission from establishing detailed regulations. Opponents of the bill
argue that it would essentially de-claw the FCC's ability to enforce
network neutrality, and give cable and phone companies free rein to
discriminate against competitors by blocking access to Web sites. The six
companies--IAC/InterActiveCorp, eBay, Yahoo!, Google, Amazon.com, and
Microsoft--sent a letter to Barton and Rep. Fred Upton (R-Mich.) expressing
their concern that the legislation "would deny consumers from unfettered
[Internet] access," adding that the companies "have urged Congress to adopt
network neutrality requirements that are meaningful and enforceable. The
provisions in the committee bill achieve neither goal." Some of the major
phone companies, which are implementing high-speed and video lines,
maintain that they should be allowed to charge Internet firms more money to
use their networks. Barton's bill was issued after a compromise between
committee Republicans and Reps. John Dingell (D-Mich.) and Edward Markey
(D-Mass.) failed to materialize. Markey says the bill's "network
neutrality" section "removes FCC authority to establish any future rules
needed to ensure that consumers and competitors can avail themselves of the
Internet experience they enjoy today." The committee's Larry Neal counters
that the legislation invests the FCC with plenty of enforcement powers,
including the authority to probe cases and fine violators as much as
$25,000 a day. "What it doesn't have is a blank check for bureaucrats to
write so many regulations that they'll choke off brand-new services even
before consumers try them out," he says.
Click Here to View Full Article
Holograms Break Storage Record
Technology Review (03/29/06) Greene, Kate
InPhase Technologies has reportedly broken the record for storage density
with a square inch of disk space that holds 64.3 GB of data, inviting the
possibility of a holographic disk that could contain more than 100 movies
of DVD quality. Magnetic storage is fast approaching its physical
limitation of around 37.5 GB per square inch, said InPhase CTO Kevin
Curtis, adding that holographic storage should continue to advance for
another five or six years without significant technological changes. While
the idea of holographic storage has been around since the 1960s, the
necessary optical technology has only been developed in the last five
years. Lasers have also become smaller and less expensive. InPhase is
scheduled to release a holographic disk drive and a 300 GB disk at the end
of this year, followed by a disk that can hold 800 GB in 2008, and 1.6 TB
in 2010. CDs, DVDs, and other optical storage disks only have information
on their surfaces, while holographic devices hold data in three dimensions,
which is critical to their large capacity. Holographic disks could help in
large data archives, which mostly still use magnetic devices that are
susceptible to moisture. To write data, a blue laser is split into a
signal beam and a reference beam, and the two beams are crossed to
produce holograms. The signal beam stores the ones and zeros as light and
dark pixels, which appear as a hologram when crossed with the reference
beam. This array, known as a page of data, contains about 1.3 million bits
of data in InPhase's disks. Piecing together multiple pages makes a book,
Curtis explains; there were 320 data pages in each of the three books in
InPhase's record-breaking device. The company will next create books with
more pages and further reduce the angle between each one.
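As a back-of-envelope check on those figures — treating the article's rounded numbers (1.3 million bits per page, 320 pages per book, three books) as exact:

```python
# Capacity of InPhase's record-breaking demonstration, computed only
# from the figures quoted in the article.
bits_per_page = 1_300_000   # ~1.3 million bits in one "page" of data
pages_per_book = 320        # pages multiplexed into a single book
books = 3                   # books written in the demonstration

total_bits = bits_per_page * pages_per_book * books
total_mb = total_bits / 8 / 1_000_000
print(f"{total_mb:.0f} MB written in the demo")  # 156 MB

# At the quoted density of 64.3 GB per square inch, that much data
# occupies only a few thousandths of a square inch of the medium.
area_sq_in = total_mb / (64.3 * 1000)
print(f"~{area_sq_in:.4f} square inches used")
```

The tiny footprint is the point of the record: the 64.3 GB figure is a density, so a demonstration of roughly 156 MB is enough to establish it over a small patch of the disk.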
Click Here to View Full Article
Professor to Try to Hack Voting Machines
Pittsburgh Post-Gazette (03/27/06) Sherman, Jerome L.
After promising to pay $10,000 to anyone who can hack into a touch-screen
voting machine without being detected, Carnegie Mellon computer science
professor Michael Shamos is going to try himself. With thousands of
computer scientists having raised doubts about the security of voting
machines, Shamos will travel to Harrisburg to test the Sequoia AVC
Advantage machine that Allegheny County intends to purchase. He has
conducted more than 100 tests on voting machines in five states, and feels
that he is better qualified than most to assess the vulnerability of
e-voting machines. To meet the requirements for federal aid under the Help
America Vote Act, Pennsylvania must have updated equipment in all of its
counties. "If the system meets the requirements of Pennsylvania law, I'll
recommend it," Shamos said. "If it doesn't, I'll have no hesitation in
recommending against certification, even though it would throw elections in
this county into a tizzy." Shamos has been certifying voting machines in
Pennsylvania since 1980, and had been ready to quit the business when the
2000 election fiasco occurred, prompting a new level of concern about
voting machine reliability. Shamos has tested the Advantage machine
before, and this time he will spend up to nine hours searching for flaws in
the machine's security, reliability, or usability. Voting rights advocates
in Allegheny County have raised concerns similar to those of the Verified Voting
Foundation, the California-based organization that has led the call for
equipping machines with a mechanism to produce a paper trail for voters to
confirm the accuracy of their ballot. David Dill, the organization's
founder and a former student of Shamos', favors optical scan devices, but
Shamos says those systems can fall prey to human error as well, and no
evidence of fraud has yet appeared. Shamos has never approved the
addition of a paper trail to any system.
A report entitled "Statewide Databases of Registered Voters--Study of
Accuracy, Privacy, Usability, Security, and Reliability Issues" by ACM's
U.S. Public Policy Committee is available at
http://www.acm.org/usacm/VRD
Click Here to View Full Article
Java Is RIFE With Open-Source Development
Frameworks
eWeek (03/28/06) Taft, Darryl K.
Geert Bevin began developing the open-source framework RIFE in 2002 to
bring the utility of PHP and Perl to the Java Web development space. Bevin
is CTO at Belgium's Uwyn, a venture specializing in dynamic Web
applications, AJAX, and Web 2.0 whose name is an acronym for "Use what you
need." "We needed to provide the metadata approach. The existing
standards are there, such as Enterprise JavaBeans [and others], but they
are not applicable to every model," Bevin said. With RIFE, a full-stack
component environment that includes life-cycle management, metadata, and
component management capabilities, developers can quickly and consistently
produce and maintain Web applications in Java. Through external
interfaces, RIFE also supports Web services, content syndication, and
services such as authentication resource abstraction. Another feature
supported by RIFE is metaprogramming, or writing applications that create
or control other programs. Metaprogramming enables developers to work
within a domain-specific API or language, saving time and eliminating
complexity by working with larger blocks. The RIFE/Crud feature
automatically produces the administration mechanism for routine create,
read, update, delete (CRUD) tasks. The next step for RIFE will be to
support development features with Java 5 annotations. RIFE also recently
added Direct Web Remoting to support developers writing Web sites with
AJAX. "It's good because you can package functionality that has AJAX
components and you can use it anywhere in your applications," Bevin said.
RIFE also contains HTML templates without logic, a central component model,
and support for flexible declaration and configuration in both plain Java
and XML.
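To make the CRUD idea concrete, here is a generic sketch of the scaffolding such a generator automates — written in Python for brevity, and not RIFE's actual API:

```python
class CrudStore:
    """In-memory create/read/update/delete for one entity type.
    Illustrative only; RIFE/Crud generates equivalent plumbing (plus
    an administration UI) for Java domain objects automatically."""

    def __init__(self):
        self._rows = {}
        self._next_id = 1

    def create(self, **fields):
        row_id, self._next_id = self._next_id, self._next_id + 1
        self._rows[row_id] = dict(fields)
        return row_id

    def read(self, row_id):
        return self._rows[row_id]

    def update(self, row_id, **fields):
        self._rows[row_id].update(fields)

    def delete(self, row_id):
        del self._rows[row_id]

store = CrudStore()
uid = store.create(name="Ada")
store.update(uid, name="Ada Lovelace")
print(store.read(uid))  # {'name': 'Ada Lovelace'}
```

The value of generating this per entity, as the article notes, is that routine administration code stops being hand-written for every domain object.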
Click Here to View Full Article
Digging Deep to Unlock the Grid
IST Results (03/28/06)
The IST-funded DataMiningGrid project is attempting to realize the data
mining capabilities of the Grid by optimizing the tools and infrastructure
that support it. By cohesively marshalling the diverse resources that
comprise the Grid, including CPUs, software, and bandwidth, the researchers
want to make the Grid operate as seamlessly as a local system. Still,
"trust, security, data privacy, and reliability in Grid computing is still
a largely unresolved problem," warns Werner Dubitzky, co-coordinator of the
DataMiningGrid project. "These issues are particularly important when
commercial computing jobs are distributed across sites not belonging to the
company that issued the jobs." Data mining is integral to advancing the
fields of astronomy, finance, and biology. The project is developing
information pre-processing, analysis, and post-processing techniques to
coalesce the data drawn from diffuse resources and improve the reliability
and accessibility of the Grid. The DataMiningGrid project draws on
existing tools for workflow management, scheduling, and integration to
extract data from the Grid, rather than reinvent the wheel with an entirely
new infrastructure. While the project itself is not intended to produce a
commercial product, Dubitzky says the researchers are still focused on
solving practical problems and bringing the technology to the market.
Among the challenges facing the researchers is the difficulty of satisfying
the widely disparate specifications of data mining applications in
different environments. Also, due to privacy concerns or other factors,
data must stay with the original source in many data mining
applications.
Click Here to View Full Article
Chip Ramps Up Neuron-to-Computer Communication
New Scientist (03/27/06) Simonite, Tom
The study of interconnected brain cells has the potential to lead to the
development of computers that use live neurons for memory. European
scientists hope to gain a better understanding of interconnected brain
cells by using a special microchip that is able to communicate with
thousands of individual brain cells. The computer chip can connect to far
more individual neurons than previous neuron-computer interfaces. The
device, which has 16,384 transistors and hundreds of capacitors on a
1-square-millimeter experimental microchip, can receive signals from more than 16,000
mammalian brain cells and send messages back to several hundred cells.
When surrounded by neurons, transistors receive signals from the cells, and
capacitors send signals to cells. Stefano Vassanelli, a molecular
biologist with the University of Padua in Italy, sees new applications
emerging from the ability to control thousands of connected neurons. "It
would definitely improve our ability to experiment and understand the
workings of neurons, and this development could also provide a whole new
way to store computer memory, using live neurons," Vassanelli says.
Click Here to View Full Article
Bluetooth Adopts New Radio Technology
Associated Press (03/28/06) Svensson, Peter
The Bluetooth Special Interest Group on Tuesday announced that it would
incorporate ultra-wideband (UWB) radio technology to boost the speed of the
Bluetooth wireless standard and enable it to transmit data as fast as a USB
cable transfer if within 10 feet of a wireless transmitter. The first
UWB-enabled devices should hit the market by 2008, said the industry
group's executive director Michael Foley. UWB was developed by the WiMedia
Alliance, whose members include Intel, Microsoft, and Hewlett-Packard.
This push to create large-scale transmission capacity will help
interconnect all types of devices, from televisions to computers to cell
phones, so they can transmit a range of media. "There's a convergence
between three major sectors: Personal computing, consumer electronics, and
cell phones," noted WiMedia Alliance President Stephen Wood. These devices
must be able to send large files, he added.
Click Here to View Full Article
Professor Contends Car-Pool Lanes Aggravate
Congestion
North County Times (CA) (03/26/06) Downey, Dave
Transportation agencies should convert all car-pool lanes to general
purpose lanes to allow more vehicles on freeways, and use sophisticated
ramp-meter systems to control capacity, says UC Berkeley computer science
and engineering professor Pravin Varaiya. Varaiya says such strategies
would improve the efficiency of freeways with car-pool lanes by increasing
speeds in all lanes and moving more commuters. Last fall, Varaiya embarked
on a study of the efficiency of car-pool lanes by tallying and analyzing
traffic volumes at 30-second intervals, using 26,000 sensors underneath
California freeways. "We've been surprised to discover that some HOV
[high-occupancy vehicle] lanes may have the perverse effect of actually
adding to congestion," he says. Varaiya says car-pool lanes are less
efficient at 60 mph free-flow conditions; the data shows that such lanes
moved up to 1,600 vehicles an hour, compared with adjacent general purpose
lanes that pushed through 2,000. During rush hours, car-pool lanes
accommodated between 1,400 and 1,500 cars in 30 mph stop-and-go traffic,
compared with 1,200 to 1,300 cars for general purpose lanes. He notes that
slow drivers particularly reduce the efficiency of car-pool lanes,
especially on freeways that have single car-pool lanes. He favors the
managed-lanes concept of San Diego County's Interstate 15, which has two
lanes, and separates car pools from other general purpose traffic with
thick concrete walls, essentially creating a freeway within a freeway.
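The comparison in Varaiya's numbers can be laid out directly; the rush-hour figures below are the midpoints of the ranges quoted in the article:

```python
# Per-lane vehicle throughput (vehicles/hour) from the study's figures;
# rush-hour values are midpoints of the quoted ranges (1,400-1,500 and
# 1,200-1,300), an assumption made for this sketch.
throughput = {
    "free flow (60 mph)": {"car-pool": 1600, "general": 2000},
    "rush hour (30 mph)": {"car-pool": 1450, "general": 1250},
}

for condition, lanes in throughput.items():
    gap = lanes["car-pool"] - lanes["general"]
    print(f"{condition}: car-pool lane moves {gap:+d} vehicles/hour "
          f"relative to a general lane")
```

The sign flip is the heart of the finding: under free-flow conditions a car-pool lane carries 400 fewer vehicles per hour than the lane beside it, even though it does somewhat better in stop-and-go traffic.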
Click Here to View Full Article
Mix Masters Try to Crack Code for Construction
Baltimore Sun (03/24/06) P. 1C; O'Brien, Dennis
The National Institute of Standards and Technology in Gaithersburg, Md.,
is using the fourth fastest supercomputer in the United States to gain a
better understanding of the strength and durability of concrete and how
long it takes to harden. A typical batch of concrete can vary in
billions of ways: concrete is a combination of sand, gravel, and cement
(a heated mixture of limestone, clay, and other materials); the sand and
gravel are quarried locally; and the size and shape of each particle
affect the durability of the finished concrete. "Concrete
can be different every time you make it, depending on what you're making it
from," explains William George, a computer scientist at NIST who is heading
the initiative. NIST was one of four organizations to win 1 million hours
of processor time on NASA's Columbia supercomputer, a $120 million computer
with 10,240 processors, covering 15,000 square feet at the Ames Research
Center at Moffett Field, Calif. George says the project will take about a
couple of months; processing time is tallied in processor-hours, so
running 1,000 processors for one hour consumes 1,000 hours of the
allocation. Columbia will
enable NIST researchers to scale up simulations of concrete mixtures,
modeling blocks 10 times larger than before, and for the first time see
how the flow and durability of concrete are affected by the size,
distribution, and shape of the particles. The research could lead to concrete that is more
durable and easier to pour and pump, and could also lower the cost of
repairs for construction projects.
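The processor-hour accounting works out as follows (the 1,000-processor figure is the one George cites; actual runs may have varied):

```python
# Processor-hour accounting described in the article: running 1,000
# processors for one wall-clock hour consumes 1,000 hours of the award.
award_processor_hours = 1_000_000
processors_used = 1_000  # the figure George cites for NIST's runs

wall_clock_hours = award_processor_hours / processors_used
print(f"{wall_clock_hours:.0f} wall-clock hours "
      f"≈ {wall_clock_hours / 24:.0f} days of continuous computing")
# 1,000 hours is about 42 days, consistent with "about a couple of
# months" once queue waits and analysis time are included.
```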
Click Here to View Full Article
- Web Link May Require Free Registration
Council to Draw Up Cyberattack Response
Washington Technology (03/27/06) Lipowicz, Alice
The IT Sector Coordinating Council is in talks to set up a national IT
disaster response system as it prepares to draft a sector-specific plan for
protecting the nation's computer networks against a terrorist attack or
other disasters, says Guy Copeland, the group's chairman and Computer
Sciences vice president. The council is asking for ideas from the IT
industry and the Homeland Security Department as it starts work on the
sector-specific critical infrastructure protection plan at its April 4
meeting, Copeland says. The council expects the plan to be complete by
September. One of the main goals during the drafting of the plan is to
involve government officials very early on in the process since IT
companies have complained in the past that they have not been asked for
their input on infrastructure protection by federal agencies until the last
minute, says Copeland. Some issues affecting the IT council include whether and
how IT companies should share sensitive data about their cyber
vulnerabilities with the government, how that information will be protected
and used, protocols for sharing information with other sectors, and how to
assess the vulnerability of IT assets. The council consists of 33 members
and was organized back in November 2005 as one of 17 sector councils
representing water, energy, financial services, food, and other areas.
Click Here to View Full Article
If You Think You Understand, Then You Don't
The McGill Daily (03/27/06) Vol. 95, No. 45; Watts, John; Wachsmuth, Jeff
Researchers at the University of Illinois have developed a quantum
computer capable of solving a problem without turning on, through a
technique known as counterfactual interrogation. The researchers, led by
Paul Kwiat, sent a photon through an array of interferometers containing
the computer with the algorithm, which is essentially a cluster of logical
gates that convey the truth of an answer by turning opaque or remaining
transparent. So far, the principal applications of quantum computing have
been in data mining and encryption. The power of quantum computing could
simplify cracking existing RSA encryption. "Theoretically, you would be
able to solve [an] RSA encryption faster than it may be created," said
McGill computer science professor Claude Crepeau. In his experiment, Kwiat
searched a database for the answer to an algorithm that was always between
one and four without actually running the algorithm. Kwiat credits
graduate student Onur Hosten with a major breakthrough when he described
the quantum zeno effect, a phenomenon that projects the photon to a given
state while taking a quantum measurement. Passing through the mirror
puts the photon into a superposition, where it takes both paths without
being duplicated; the repeated measurements then periodically collapse
that superposition, even though the algorithm is pre-programmed
essentially never to run. In the end,
the quantum zeno effect enabled the researchers to produce accurate results
without having to run the algorithm.
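The quantum zeno effect has a standard textbook signature that helps make sense of this result (the formula below is general background, not taken from the article): if the photon's state is nudged by a small rotation of angle π/2N and then measured, N times in succession, the probability that it survives every measurement in its initial state is

```latex
P_{\text{survive}}(N) = \left[\cos^{2}\!\left(\frac{\pi}{2N}\right)\right]^{N} \xrightarrow{\;N \to \infty\;} 1
```

so sufficiently frequent measurement "freezes" the photon, which is what lets the interferometer array interrogate the algorithm while, with high probability, never actually running it.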
Click Here to View Full Article
ICANN's 3-Year Plan Under Scrutiny at Meeting
IDG News Service (03/27/06) Perez, Juan Carlos
ICANN CEO Paul Twomey says ICANN is using its meeting in Wellington, New
Zealand, to hone its Strategic Plan and that ICANN is expected to decide at
the meeting whether it will approve the plan. The Strategic Plan provides
a vision for ICANN through mid 2009 and defines the top 10 challenges that
ICANN faces. These challenges include the increasing number of Internet
security threats, adapting to the demands of global Internet users, and
stability concerns associated with the numerous types of devices that are
now used to access the Internet. Other issues ICANN is addressing in
Wellington include the proposed .xxx domain, internationalized domain names
(IDNs), and ways to prevent distributed denial of service attacks against
the Domain Name System. ICANN intends to address any issues associated
with IDNs and the approval of new top-level domain names in a timely
manner, though not so hastily that the stability of the Internet is
jeopardized, says Twomey. The Canadian Internet Registration Authority
(CIRA) recently cut its ties with ICANN, claiming the organization is not
transparent, an assertion that Twomey strongly denies.
Click Here to View Full Article
Accelerating Data Transport Over Hybrid Networks
HPC Wire (03/24/06) Vol. 15, No. 12,
The emerging crop of data-intensive, multinational projects slated to go
online over the next few years will require new methods of data retrieval
and transmission. Computer scientists at the Electronic Visualization
Laboratory (EVL) at the University of Illinois at Chicago have been
developing user-friendly applications for moving large sets of data.
Current shared systems use numerous routers and data paths and share
bandwidth equitably, though large data flows are handled clumsily. Today's
routers are too expensive to build a dedicated network infrastructure on,
though a switch-based system could affordably guarantee bandwidth, latency,
and scheduling. EVL's protocol research at the application level is aimed
at exploiting the maximum available bandwidth, regardless of whether it is
routed, switched, or a hybrid of both, while making sure not to disrupt
existing traffic. "For the past five years, we have been designing
specialized transport protocols for data-intensive interactive
visualization and streaming media applications," said EVL co-director Jason
Leigh. EVL researchers began experimenting with moving large files over
hybrid networks with its UDP transport protocol LambdaStream in January.
They tested the system by moving data from memory to memory, disk to
memory, and from disk to disk, comparing transmissions over the routed
TeraGrid network with the switched CAVEware network and finding their
speeds to be comparable. Partnering with the University of California, San
Diego, in the NSF's OptIPuter project, EVL researchers are demonstrating
that having the appropriate endpoint technologies enables economical
point-to-point connections.
Click Here to View Full Article
Next-Generation Vehicles: Drivers Optional
EE Times (03/27/06) No. 1416, P. 1; Johnson, R. Colin
The team of Stanford researchers that designed Stanley, the autonomous
vehicle that won the 132-mile DARPA Grand Challenge in 2005, is now working
on a self-driving car built for the real highway. "Our goal at Stanford is
to be able, within the next two years, to drive from downtown San Francisco
to downtown Los Angeles with 100 percent autonomy," said Sebastian Thrun,
director of Stanford's Artificial Intelligence Laboratory. Thrun expects
the trip to take seven hours, challenging the vehicle to negotiate urban
traffic, congested freeways, and long stretches of interstates. Stanford's
success at the 2005 Grand Challenge convinced Thrun that it was only a
matter of time before self-driven cars made it onto the road, particularly
as sensors and semiconductors continue to advance. Thrun admits that the
dream of the fully-automated car integrated into the daily traffic flow is
still some 30 years away, though in the meantime numerous milestones will
appear, such as automated military convoys and a bevy of safety and
convenience features that will signal the transition to complete
automation. Collision avoidance technology is a critical goal for
automotive semiconductor makers, a goal Schulmeyer believes the industry
will meet as radar is gradually tested on adaptive cruise control systems,
then on safety systems such as the emergency brake, and finally on systems
wholly dedicated to collision avoidance. Several auto makers have already
demonstrated fully autonomous parking capabilities, Thrun says, adding that
while many consumers will still want to drive themselves, a car that can be
switched over to a fully-autonomous mode could allow drivers to check their
email or take a nap on a long trip, and enable the elderly to remain
independent longer.
Click Here to View Full Article
There's a Chip in My Pants
Discover (03/06) Vol. 27, No. 3, P. 26; Johnson, Steven
As the price of digital processors continues to drop and researchers
develop materials that can transmit digital signals, the reality of smart
clothing appears to be at hand. Adidas is at the forefront of this
development with its athletic shoe designed to sense environmental
conditions and adjust its cushioning level accordingly. A microprocessor
receives 1,000 reports a second of compression level data from magnetic
sensors, which it then relays to a motor that either tightens or loosens
the shoe's cushioning support. Adidas is developing a new model for
basketball that will adjust in response to the player's movements of
jumping, running, and cutting and generate a profile based on the player's
patterns of motion. Other smart clothing products can look inside the
wearer, monitoring heart rate, respiration rate, and body temperature.
VivoMetrics has developed a shirt to monitor the state of sleep apnea
sufferers, a technology that could also be used to prevent sudden infant
death syndrome. The MEMSwear device is a miniature silicon-based sensor
that can be embedded in a shirt that conveys an alert to a cell phone or a
computer through the wireless Bluetooth standard if the wearer falls.
Though many of the potential applications of smart clothing may seem
farfetched for the average consumer, the rapidly declining cost of hardware
could lead to their widespread use anyway. Looking forward, smart clothing
could interface with navigation services to provide walking directions
based on the wearer's current position.
Click Here to View Full Article
Going With the Flow
Queue (03/06) Vol. 4, No. 2, P. 24; De Jong, Peter
Workflow models integrate an organization's data flow, organizational
charts, and flowcharts in order to capture both the structured and
unstructured components of the organization by extending its rational,
computerized elements to its natural and open elements, writes Microsoft's
Peter De Jong. Workflow systems must accommodate any type of message that
comes into the organization; the work items that coordinate the receipt of
the message with the organizational worker whose role the message
specifies; the business rules that automate the decision process employed
in assigning and carrying out a work item; and flowcharts that
particularize the organizational plan for how work streams throughout the
organization, and which comprise the main modeling structure in workflow
systems. The workflow is preferably modeled by the organizational employee
who is most knowledgeable about the process being modeled, rather than by
the most proficient programmer. The runtime executes on the organizational
design specified by the workflow models, and the resulting organizational
information is divided into live data from executing workflow instances and
historical data from already completed instances. Insight into the
organization's performance is derived from the live data, which is used for
workflow monitoring and management. The modification of workflow models to
yield greater organizational efficiency is facilitated through historical
data, which is used in reports of the organization's operations over time
in order to identify operational trends. A workflow system can be used to
explicitly model difficulties such as organizational events, work-item
subsystems, parallelism and synchronization, and exception handling. The
most effective workflow systems are equipped with prebuilt workflows that
outline procedures that occur regularly across multiple organizations, and
are easily tailored to fulfill the needs of a specific organization.
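A minimal sketch of the ingredients enumerated above — messages, roles, work items, and a business rule that automates assignment. The names and routing rules here are illustrative assumptions, not De Jong's design:

```python
from dataclasses import dataclass

@dataclass
class Message:
    kind: str   # e.g. "invoice" or "complaint"
    body: str

@dataclass
class WorkItem:
    message: Message
    role: str   # the organizational role the message specifies
    done: bool = False

# Business rule: map each incoming message kind to the role that
# handles it; unknown kinds fall back to a triage role.
ROUTING_RULES = {"invoice": "accounts-payable", "complaint": "support"}

def receive(message: Message) -> WorkItem:
    """Coordinate an incoming message with the right worker role."""
    return WorkItem(message, ROUTING_RULES.get(message.kind, "triage"))

item = receive(Message("complaint", "Order arrived damaged"))
print(item.role)  # support
```

A full workflow system layers flowcharts over such work items to route them between roles, and records each instance so that the live and historical data the article describes can be mined.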
Click Here to View Full Article