UT Prof Gets Computer Science Award
Daily Texan (06/30/06) Powell, Matt
ACM's Special Interest Group on Computer Architecture (SIGARCH) has named
University of Texas computer science associate professor Doug Burger as its
2006 Maurice Wilkes Award winner. Burger won the annual award, which began
in 1998, for his work in spatially distributed processors and memory-system
architecture. Burger's work is designed to push message processing beyond
the current physical limitations of processors. The award recognizes
contributions to computer architecture made by individuals who have worked
in the field for less than 20 years. "There are so many quality people
working in this field. I
really wasn't expecting it at all," says Burger, the co-leader of UT's
"Tera-op, Reliable, Intelligently Adaptive Processing System" project,
which is developing new microprocessors in conjunction with IBM that it
says could "revolutionize computing." Burger, 37, is the youngest person
ever to win this award. For more information about the Maurice Wilkes
Award, visit
http://www.acm.org/sigs/sigarch/wilkes/wilkes.html
A Search Engine That's Becoming an Inventor
New York Times (07/03/06) P. C1; Hansell, Saul; Markoff, John
Though it has vaulted from the confines of a Silicon Valley garage into the
ranks of the Fortune 500 with $9 billion in cash, Google
stubbornly clings to the culture of innovation that began with founders
Larry Page and Sergey Brin building their own networked computers out of
cheap PC parts. As graduate students, money was tight, but Page and Brin
also felt they could create a more efficient search tool with their own
networked computers than with commercial offerings. That climate of
innovation persists, as Google still primarily relies on custom-made
servers. Google is also using its own sophisticated software tools to power
the computers in its growing string of data centers around
the world, such as the vast new site in The Dalles, Ore., where
technologies have been customized for energy efficiency. There are some
indications that Google is even preparing to develop its own microchips.
Google might be the world's fourth-largest manufacturer of computer
servers, behind Dell, Hewlett-Packard, and IBM, by the estimates of
Gartner's Martin Reynolds. Microsoft and Yahoo!, Google's chief rivals,
both develop much of their own infrastructure software, though they buy the
bulk of their equipment from established manufacturers. "At some point you
have to ask yourself what is your core business," said Yahoo!'s Kevin
Timmons. "Are you going to design your own router, or are you going to
build the world's most popular Web site? It is very difficult to do both."
Google is trying to do both, and claims that, as a result, its computing
costs are well below those of its rivals. Microsoft Chairman Bill Gates
acknowledges Google's pool of talented computer scientists, though he
downplays the efficacy of home-brewed hardware. One of Google's patents
covers a series of software applications for massive parallel computing.
Part of Google's innovation mitigates the problem of hardware
unreliability, as its software redirects tasks to other components when one
device fails.
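The article does not detail Google's software, but the failure-handling
idea it describes can be sketched simply: a scheduler farms tasks out to a
pool of workers and reassigns a task to another worker when one fails. The
following Python sketch is purely illustrative (the scheduler, worker
functions, and retry policy are invented for the example, not taken from
Google's systems):

    # Hypothetical sketch of fault-tolerant task scheduling: when a
    # "machine" (here, a worker function) fails, the task is requeued
    # and retried on another worker.
    import random

    def run_with_retries(tasks, workers, max_attempts=3):
        results = {}
        queue = [(task, 0) for task in tasks]
        while queue:
            task, attempts = queue.pop(0)
            worker = random.choice(workers)
            try:
                results[task] = worker(task)
            except RuntimeError:
                if attempts + 1 < max_attempts:
                    queue.append((task, attempts + 1))  # redirect the task
                else:
                    results[task] = None  # give up after repeated failures
        return results

    def flaky(task):
        if random.random() < 0.5:
            raise RuntimeError("device failed")
        return task * task

    def stable(task):
        return task * task

    print(run_with_retries([1, 2, 3, 4], [flaky, stable]))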
Getting Machines to Think Like Us
CNet (07/03/06) Skillings, Jonathan
With a conference commemorating the 50th anniversary of a Dartmouth
College conference on artificial intelligence scheduled later this month,
Stanford University professor John McCarthy, who is credited with
originally coining the term "artificial intelligence" in advance of the
1956 conference, shared his thoughts in a recent interview on the origins
and development of the field. Fifty years ago, the notion that computers
would be the primary instrument to conduct artificial intelligence was a
minority opinion, McCarthy said. Computers are much more skillful at
playing chess than at playing Go, according to McCarthy, who notes that
comparable effort has been put into programming them for each task; the
disparity indicates that computers still have difficulty evaluating
positions and situations and identifying their relevant parts. One of the
major advances in artificial
intelligence has been computers' ability to reason nonmonotonically,
inferring a conclusion from a qualified statement. McCarthy also notes the
success of the team of Stanford researchers who won last year's DARPA Grand
Challenge, sending an unmanned robotic car on a race 131.6 miles across the
Mojave Desert. In the future, McCarthy looks for machines to exhibit more
commonsense reasoning and knowledge, and to have a sense of originality
programmed into them. Machine capability is no longer the impediment to
artificial intelligence, McCarthy says, claiming that the continued
evolution of the field will of necessity be a product of basic ideas.
McCarthy is dismissive of futurist Ray Kurzweil's notion of the
"singularity"--the literal convergence of man and machine. Describing the
singularity as "nonsense," McCarthy predicts that the next major advance in
machine intelligence will come from a younger generation of scientists.
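Nonmonotonic reasoning means that a conclusion drawn by default can be
withdrawn when new information arrives, which ordinary deductive logic does
not allow. The classic textbook example can be sketched in a few lines of
Python (an illustration only, not McCarthy's circumscription formalism):

    # Default rule: birds fly unless we learn otherwise. Adding a fact
    # retracts an earlier conclusion, which is what makes the reasoning
    # nonmonotonic.
    def can_fly(animal, facts):
        if ("bird", animal) not in facts:
            return False
        return ("flightless", animal) not in facts  # default: birds fly

    facts = {("bird", "tweety")}
    print(can_fly("tweety", facts))      # True, by default

    facts.add(("flightless", "tweety"))  # learn that Tweety is a penguin
    print(can_fly("tweety", facts))      # False: the conclusion is withdrawn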
Hacker Attacks Hitting Pentagon
Baltimore Sun (07/02/06) P. A1; Gorman, Siobhan
While the reported number of attempts to breach the Pentagon's computer
networks has spiked from fewer than 800 in 1996 to more than 160,000 last
year, the government's efforts to shore up its cybersecurity defenses have
stalled. An initiative undertaken by the National Security Agency to
encrypt sensitive information and restrict access at the Defense Department
and other government bodies is seven years behind schedule, due partially
to fundamental differences between the two agencies. According to an
internal NSA report, 30 percent of the agency's security equipment provides
insufficient protection, and 46 percent of the equipment is nearing that
status. "Much of the existing cryptographic equipment is based on ...
technologies that are 20-30-plus years old," according to the report, which
also noted the sharp increase in the sophistication of cyber criminals. A
security team spent weeks addressing a recent incident where Chinese
hackers gained access to a computer system that serves the Joint Chiefs of
Staff, according to two sources close to the incident. "This stuff is
enormously important," said John Stenbit, who served as the Pentagon's CIO
until 2004. "If the keys get into the wrong hands, all kinds of bad things
happen. You don't want to just let a hacker grab the key as it's going
through the Internet." The Pentagon reports that attacks against its
computers have increased 200-fold in the past decade, citing growing
threats from individuals, terrorist groups, and foreign states. In a
recent court case, two men were charged in Miami with hacking into
government computers and sending military secrets to China. Iran has also
emerged as a major threat to the government's aging computer systems. The
NSA is developing the Key Management Infrastructure program to strengthen
the government's defenses, though it has been impeded by high costs and
poor management.
Securing Europe's Future Information Society
IST Results (07/05/06)
In an effort to combat the mounting security risks associated with
Internet services and commerce, the EU has launched an effort to shore up
the reliability of its networked systems. Drawing on Europe's leading
security experts, the SecurIST project is formulating a roadmap for the
continued improvement of network dependability and security throughout the
continent. Security, which includes confidentiality, integrity, and
availability of information, is closely related to dependability, which
encompasses reliability, safety, and maintainability in the face of
intentional and accidental threats. "The project should provide Europe
with a clear European-level view of the strategic opportunities, strengths,
weakness(es), and threats in the area of security and dependability," said
Jim Clarke, coordinator of the SecurIST program. "It will identify
priorities for Europe and mechanisms to effectively focus efforts on those
priorities, identifying instruments for delivering on those priorities and
a coherent time frame for delivery." The program organized its security
taskforce into linked focus areas such as security policy, application
security, identity and privacy, and biometrics. An advisory board helped
the more than 200 SecurIST
researchers develop a series of recommendations, including enhancing the
centralized control mechanisms and empowering individual users to guard
against identity theft and other cyberthreats. The researchers have
identified service-oriented architecture as a key priority for secure
software design, as well as the development of the broad disciplines that
enable security: cryptology and trusted computing.
Researchers Claim Great Firewall Workaround
IDG News Service (07/05/06) Lemon, Sumner; Gohring, Nancy
Researchers from the University of Cambridge have reported the discovery
of a method for working around the Chinese government's complex Internet
filtering system, though some question how much of a breakthrough their
research really is. The filtering system uses routers and
intrusion-detection applications to screen for keywords within packets of
Internet transmissions. A request for Web sites that include prohibited
words such as "falun" is blocked by sending reset (RST) packets to the
client computer and the Web server, severing the connection between the
two. The Great Firewall keeps the connection broken for a period of time
that averages around 20 minutes, but can last up to an hour, a finding that
some researchers say was already common knowledge to those familiar with
the system. "There's nothing in there I didn't know two years ago," said
Michael Robinson, an IT expert in Beijing. "The connection reset system
described in the paper is only one layer of a much larger multilayer
content control system. Using encrypted proxy servers is the only way
around all of them." The researchers suggested special software that could
ignore the RST packets as a potential workaround to the Great Firewall,
though Robinson counters that creating a proxy connection involves the same
amount of work and provides a more complete solution. Richard Clayton, one
of the Cambridge researchers, responds that the link to proxies is
generally unencrypted so that Internet traffic is still subject to
censorship. Clayton hopes that software companies such as Microsoft will
begin creating TCP/IP stacks that ignore the resets to increase
security.
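The countermeasure the researchers propose amounts to treating unsolicited
TCP resets as suspect. As a rough, assumption-laden sketch (not the
Cambridge team's actual tool), a monitor built on the third-party scapy
packet library can at least flag RST packets on the wire; actually ignoring
them, as the paper suggests, would require kernel- or stack-level
filtering:

    # Log TCP RST packets seen on the wire (requires scapy and root
    # privileges). Discarding resets outright, e.g. with a firewall rule
    # such as "iptables -A INPUT -p tcp --tcp-flags RST RST -j DROP",
    # would also break legitimate TCP error handling, which is why the
    # researchers argue for smarter handling inside the TCP/IP stack.
    from scapy.all import sniff, IP, TCP

    def report_rst(pkt):
        if pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:  # RST bit set
            print("RST %s:%d -> %s:%d" % (pkt[IP].src, pkt[TCP].sport,
                                          pkt[IP].dst, pkt[TCP].dport))

    sniff(filter="tcp", prn=report_rst, store=False)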
A Petaflop Before Its Time?
HPC Wire (06/28/06)
In an interview at this week's ISC2006 conference in Dresden, Germany,
Horst Simon, director of the National Energy Research Scientific Computing
Center, discussed the challenges posed by petascale computing. Of the
several varieties of petaflop performance, Simon believes that a peak
petaflop and a Linpack petaflop could be attained as early as 2008, but
true petascale computing is most likely eight to 10 years down the road.
Simon is concerned that the HPC development community, which is currently
enjoying ample government funding, could settle for "a petaflop before its
time" by setting its sights on what he describes as easy goals, such as
peak or Linpack performance. Simon argues that the real work lies beyond
those benchmarks, and that researchers will have to devote themselves to
developing a significant computing infrastructure before achieving
sustained petaflop performance. Simon is also concerned about power
consumption. For today's architectures to scale to the petaflop level,
they would require an inordinate amount of power that few centers will be
able to afford, posing a potential barrier to widespread adoption,
particularly among universities. Simon estimates that by 2011, a sustained
petaflop system could require around 20 megawatts, which, even by today's
electricity prices, would cost around $12 million a year to power. New
approaches to processor design could help resolve the power problem, though
Simon notes that such alternative processor designs are still a long way
from being commercialized. Programming software to run on parallel
architectures is
also a challenge, which DARPA is working to overcome with new languages.
Simon says the HPC community must adapt the system software, applications,
and hardware in chorus so that supercomputing evolves as an ecosystem.
Petaflop computing will be especially useful for climate modeling, though
Simon is concerned about the viability of today's algorithms at such a
large scale.
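Simon's cost figure is easy to sanity-check with back-of-the-envelope
arithmetic; the electricity rate below is an assumption chosen to be
plausible for the period, not a number from the article:

    # Rough check of the "$12 million a year for 20 MW" estimate.
    megawatts = 20
    hours_per_year = 24 * 365                         # 8,760 hours
    kwh_per_year = megawatts * 1000 * hours_per_year  # 175,200,000 kWh
    rate_usd_per_kwh = 0.07                           # assumed 2006-era rate
    cost = kwh_per_year * rate_usd_per_kwh
    print("Annual power cost: $%.1f million" % (cost / 1e6))  # ~$12.3 million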
Cool Light Leads to Greener Chips
BBC News (06/30/06)
A team of researchers from University College London has developed a
technique that could result in inexpensive, environmentally friendly chips.
Using cool ultraviolet lamps to fabricate silicon dioxide instead of the
heat-intensive furnaces currently in use, the technique requires much less
power. "This means that the industry's energy, and subsequent cost
savings, could reduce the prices of electronic devices for consumers and,
of course, create a positive environmental impact," said UCL professor Ian
Boyd. As the number of transistors steadily increases in accordance with
Moore's Law, chip manufacturers have been exploring techniques to keep
temperatures down to prevent warping and distortion. Also, the heat that
is created by denser chip designs can cause features to become fluid and
bleed into each other. The new technique uses an ultraviolet lamp filled
with argon gas that is charged with a high voltage. It emits light that
breaks oxygen molecules down into individual atoms, a dissociation that
produces one atom with a lot of energy and one with very little. The
energetic atoms can oxidize silicon at room temperature, Boyd said. The
most significant roadblock to the adoption of the technique in the
commercial setting could be that it does not create a pure material,
according to Douglas Paul of the University of Cambridge. "There have been
many people who have shown similar results but all these techniques cannot
be used for electronics because the defect densities are far too high," he
said. Boyd acknowledges that sustained exposure to UV wavelengths causes
defects, but he notes that his technique employs a light wavelength that
has never before been used. Boyd says the technique could be used for
applying circuits to materials such as cloth, paper, or plastic, in
addition to microchips.
Wariness of U.S. Tech Lag on the Rise
United Press International (06/26/06) Darm, Alecia
Integrating technology into every public institution is critical to the
United States' competitiveness in the 21st century, according to experts
speaking at a conference hosted by the New America Foundation on Capitol
Hill. Technology education has the capacity to broaden access to
information at virtually every national institution. "Acquiring the best
technology for learning is not a problem but a challenge; it is an
opportunity to excel," said Dexter Fletcher of the Institute for Defense
Analyses. Technology education is particularly valuable because the
computing experience is highly personal and interactive, according to Henry
Kelly, president of the Federation of American Scientists. Speakers also
emphasized the value that technology education has for children, especially
since the average child spends six hours each day using electronic media,
according to Michael Calabrese, director of the Wireless Future program at
the New America Foundation. Lawmakers are working to pass the Digital
Opportunity Investment Trust (DO-IT), legislation that would support
technology education through programs such as Immune Attack, an educational
video game about human immunology. Digital Promise's Rayne Guilford
likened the scope of the legislation to the GI Bill and the Northwest
Ordinance. "Once a century, Congress makes a major investment in
transforming training and education," Guilford said. "The Digital Promise
is the 21st century equivalent of the GI Bill." The absence of a
commercial market is a central impediment to the DO-IT initiative, though
some private corporations are realizing that technology education is
critical to preserving the United States' climate of innovation.
IBM: Free the Net
ZDNet India (06/28/06) Yeo, Vivian
In Singapore for a ministerial forum on next-generation networks co-hosted
with the Infocomm Media Business Exchange 2006, IBM director of Internet
technology and strategy Michael Nelson warned that government regulation
could stifle the Web's evolution, which he sees headed in three primary
directions: ubiquitous high-speed access, faster networks reaching 100
Mbps, and grid computing. "The Internet is much more like the computer
industry, which has been very lightly regulated, than the telephone
industry, which has been very heavily regulated," said Nelson. "Because it
has been very lightly regulated, innovators have been able to do new things
to move in many different directions--directions which politicians and
regulators couldn't possibly have anticipated." He notes that various
organizations, including ICANN, the IETF, and the W3C, contribute to but don't
dictate the Internet's direction, and argues that VoIP should not be
"regulated the same way as traditional telephony." As to the future of the
Internet, Nelson said, "we'll have networks that tie together much more
powerful machines, and are able to do many different things." He envisions
an Internet where all people, devices, and "even their dogs" are
networked.
Breaking Barriers and Bridging Divides
Explore Qatar (06/26/06)
Despite 40 years of predictions, technology has still had no substantive
impact on education, according to Raj Reddy, the Mozah Bint Nasser
University Professor of Computer Science in the School of Computer Science
at Carnegie Mellon University. Reddy argues that technology's impact is
still
impeded by bandwidth and memory constraints, and that much of the existing
technology is priced out of the reach of developing countries. The digital
divide is "widening because technologies and solutions are not tailored to
be used in villages or rural areas, or by different language groups. There
is no attention being paid to their needs," Reddy said. Particularly in
the Arab world, technological progress is slowed by a shortage of
researchers, and the community approach to technology that characterizes
the United States is also an important missing ingredient. Reddy is
working on the Million Book Project, an endeavor led by Carnegie Mellon
that aims to scan 1 million books by 2007. Reddy is also an advocate for
the uneducated and illiterate, claiming that the least educated people in
the world should have the most bandwidth, so that they could send voice
files instead of emails, for instance. For the developing world, Reddy is
promoting a device called the PCtvt that would combine the functions of a
PC, TV, personal video recorder, and IP phone behind a simple,
user-friendly interface. Reddy also notes the disproportionately low
involvement of women in computer science, describing an initiative underway
at Carnegie Mellon to place greater emphasis on the problem-solving
concepts of the discipline rather than the programming side. Raj Reddy is
the co-recipient (with Edward Feigenbaum) of ACM's 1994 A.M. Turing Award;
http://www.acm.org/awards/citations/reddy.html
Wider Authority Urged for IT Managers
Federal Times (06/26/06) Vol. 42, No. 21, P. 8; Curl, Aimee
Security experts told lawmakers that Congress should give federal CIOs and
chief information security officers (CISOs) more authority to prevent
security breaches such as the recent ones at the Veterans Affairs and
Agriculture departments. They also said information officers at government
agencies need more power to ensure compliance with data security
guidelines. Eugene Spafford of Purdue University's Center for Education and
Research in Information Assurance and Security said CIOs and CISOs need
sufficient funding and a trained staff to carry out an effective security
plan. A laptop containing the personal information of 26.5 million veterans
was stolen from a VA employee's home last month, and the Agriculture
Department's computer systems, which stored information on 26,000 current
and former employees, were recently hacked. Spafford, along with former VA
CISO Bruce Brody, told the committee that someone besides the VA needs to
enforce privacy policies. Brody testified before the House Veterans Affairs
Committee with other experts on June 22.
"We've found that individual directors often feel they can override policy
when it gets in their way," said Spafford. "Unfortunately, the people
making these decisions don't have the training to understand the
consequences." Eugene Spafford is chair of ACM's U.S. Public Policy
Committee;
http://www.acm.org/usacm
New Research Center to Combat Identity Theft
TechNewsWorld (06/29/06) Morphy, Erika
A new group called the Center for Identity Management and Information
Protection has been formed to fight identity theft and data losses that
have occurred at colleges, in the private sector, and in government. The
group will be based at Utica College in New York and will focus on how to
prevent and detect identity fraud and theft, how to spot cybercriminals,
how to improve identity authentication systems, and how technology can
protect and share information. Experts agree that establishing the Center
is a step in the right direction. "Most of the incidents of identity theft
lately, at least anecdotally, have been cases of employees taking home
laptops with sensitive information on them that were subsequently stolen,"
says Ron O'Brien, senior security consultant at Sophos. The Center will be
led by Gary R. Gordon, a professor at Utica College, while the board of
advisors of the college's Economic Crime Institute will oversee the
Center's research. In addition to Utica College, the Center was founded by
LexisNexis, IBM, the United States Secret Service, the FBI, Carnegie Mellon
University Software Engineering Institute's CERT/CC, Indiana University's
Center for Applied Cybersecurity Research, and Syracuse University's CASE
Center. The Bureau of Justice Assistance, Office of Justice Programs, and
Utica College's CIMIP will team up for the Center's first project, which
will examine existing and emerging criminal identity theft groups.
A Culture of 'No'
InformationWeek (06/26/06) No. 1095, P. 23; Claburn, Thomas; Whiting,
Rick; Malykhina, Elena
IT professionals must take a paternal, security-minded view toward
employees' take-up of popular, often free consumer-oriented Web tools, but
not at the expense of innovation, which is essential to companies' ability
to rapidly adapt to change. There is a growing paranoia about security
among business technologists in light of much-publicized system intrusions,
vulnerabilities, and advisories about shady employee conduct. However, "if
[IT teams] put policies in place and make it so that people go around them,
they end up opening up bigger security holes," warns Gartner VP David
Smith. One strategy to help ensure continued innovation while giving
employees sufficient maneuverability is for companies to collaborate with
vendors to make popular applications secure. Cox New England's Brad Shipp
recommends that companies try to understand the value of rogue apps, which
obviously fulfill some need beyond the capabilities of IT; "They're all red
flags, but they're also opportunities for doing something better," he
maintains. Google Enterprise general manager Dave Girouard said at the
recent MIT Sloan CIO Symposium that companies must broaden their menu of
options in order to maximize the productivity of innovative workers.
ProBusiness Services network engineer Bob Pierce does not endorse
unconditional employee usage of rogue apps, suggesting that imported items
should be subjected to security checks, while any output from the
unauthorized software must exhibit compatibility with corporate software
standards. "Does IT Matter?" author Nicholas Carr believes workplace-based
consumer technology will trigger an increase in internal IT needs in the
short term because data control and integration across Web applications
requires on-staff expertise, but he predicts that most corporate IT
positions will be phased out over the next 10 years.
E-Paper Chase Nears the Finish Line
Electronic Design (06/22/06) Vol. 54, No. 13, P. 47; Allan, Roger
Electronic ink and paper are approaching readiness for mainstream
penetration, as indicated by prototypes, demos, and new products. E-ink
displays usually involve microcapsules of black and white particles
suspended in a transparent fluid; the particles line up in different
orientations and patterns in response to electrical charges, forming
on-screen text. Forthcoming
products include the Sony Reader electronic book, a lightweight display
that uses electronic ink from E-Ink, and has 64 MB of internal flash memory
in which to store hundreds of books. E-paper can be applied to more than
just e-books: iSuppli expects magnetic cards and electronic shelf labels to
be the No. 1 and No. 2 applications, respectively, for flexible displays by
2013. Actual products on the market are being outnumbered by public
demonstrations of e-ink and e-paper technologies, such as Toppan Printing's
wall-sized electronic newspaper of nearly 300 individual e-paper tiles,
which constitutes the world's biggest newspaper display. Also generating
interest are collaborative projects to develop e-ink, e-paper, and flexible
displays, such as the joint E-Ink and Ambient Devices venture to develop
the Weather Wizard, a household device that wirelessly displays five-day
local weather forecasts in real time. Nano-engineered materials are
currently being researched to meet future flexible-display requirements.
Robots Are Our Friends
New Scientist (06/24/06) Vol. 190, No. 2557, P. 56; Richardson, Kathleen
The trend toward creating robots that humans can relate to as companions or
caregivers is gaining momentum as industrial and academic roboticists
strive to build humanoid machines, writes University of Cambridge social
anthropologist Kathleen Richardson. She notes that advances in humanoid
robot technology are accompanied by changing perceptions of what defines a
human being as well as the similarities and differences between machines
and people. "It seems the meaning of human-robot encounters has less to do
with what the robot can do and more to do with what the human is doing,
prepared to do or prepared to imagine is occurring in the encounter," notes
Richardson. This in turn is feeding into the growing emotional attachment
some people feel for robotic toys such as Sony's Aibo dog or the Tamagotchi
virtual pet. Richardson observes "a growing misanthropic attitude" in
human culture that encourages anthropomorphism of machines and increases
the possibility that humans might start having deep relationships with
robots. A further indication that people think a human-like robot-human
relationship is feasible is the emerging interest in robot ethics as
machines assume more human qualities.
Moore's Law Meets Its Match
IEEE Spectrum (06/06) Vol. 43, No. 6, P. 44; Tummala, Rao R.
The Georgia Institute of Technology's Microsystems Packaging Research
Center is focused on devising system-on-package (SOP) solutions that could
yield small megaelectronic devices by eliminating the 90 percent of passive
components comprising current systems. Rao R. Tummala, the research
center's founding director, writes that SOP technology can miniaturize
components and make circuit boards almost disappear, effecting an increase
in system functions that is proportional to the expected doubling of
component density every year or so, and that far overtakes the increase in
transistor density dictated by Moore's Law. In an SOP approach, passive
components are integrated into micrometers-thick thin-film elements that
are incorporated into a multilayered system package. An SOP consumes much
less power because its smaller size enables faster chip-to-chip
transmissions at lower currents and voltages, while design, fabrication,
and time-to-market cycles are shortened because there is no reliance on a
single technology and separate chips are employed for different functions.
For SOP to achieve mainstream success, several developments must occur:
Tools for the concurrent design of digital, analog, and optical circuits
must be devised along with the package; the manufacturing approach must
transition to an integrated design/fabrication/packaging model; and
processing technologies must be able to combine cheap, large-area board
processes with thin-film processes and clean-room processes to produce SOP
substrates. "At Georgia Tech we believe that the market for
multifunctional products and the advantages of designing chips and system
packages concurrently are so compelling that companies will just have to
design and fabricate everything together," comments Tummala. "And as the
SOP concept takes off, design-tool and fabrication houses will turn their
attention to developing powerful programs for concurrent design and
advanced manufacturing."
Bring Order to Your Project With Scrum
Software Test & Performance (06/06) Vol. 3, No. 6, P. 12; Galen, Bob
Software team productivity can be enhanced with the Scrum agile
methodology for project management, writes RGCG founder Bob Galen. He
explains that Scrum concentrates on an agile project's planning, execution,
monitoring, and collaborative facets rather than suggesting specific
development practices. The methodology starts with a product backlog of
prioritized features to deploy within a project; from this a "sprint"
backlog is extracted for the team to focus on during a 30-day period, or
sprint, for which the team agrees on a focused theme and the work to be
completed to fulfill the goal. The entire team meets daily, with each
member answering basic status and issue-tracking questions; real-time
feedback then comes from the team leader and from burndown charts that show
ongoing progress from multiple viewpoints. At
the end of each sprint is a sprint review period where demonstrable
artifacts are presented and where practice and process adjustments can be
recommended for the next sprint. Galen explains that Scrum teams initially
feature a tester who helps to plan the testing effort for each backlog
component and as each sprint backlog is devised; "Whether fair or not,
testers need to integrate in Scrum teams as a development partner first and
show how they can impact the sprint deliverables and contribute to the
overall goals of each sprint," the author writes. Galen has wrapped Scrum
around traditional testing teams and initiatives, and observes that the
methodology can be leveraged with only a few key tweaks: The testing
backlog emphasizes testing activities that are prioritized according to
time and the volume of testing that will result, while sprint duration must
be modified to calibrate with testing type as well as cycle time. Daily
Scrum planning and tracking meetings continue to have relevance in this
setup, while feedback always yields adjustments and lessons to be applied
to the next sprint.
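The burndown chart Galen mentions is straightforward to compute: plot the
estimated work remaining in the sprint backlog against each day of the
sprint. A minimal Python sketch with hypothetical backlog items and daily
progress figures (none of these numbers come from the article):

    # Sprint burndown: hours of estimated work remaining after each day.
    sprint_backlog = {"login form": 16, "search API": 24, "test harness": 12}
    remaining = sum(sprint_backlog.values())  # 52 hours at sprint start

    completed_per_day = [6, 8, 4, 10, 7]      # hours finished each day
    for day, done in enumerate(completed_per_day, start=1):
        remaining = max(remaining - done, 0)
        print("Day %2d: %3d hours remaining" % (day, remaining))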