Most See Visa Program as Severely Flawed
Washington Post (03/31/06) P. D1; Kalita, S. Mitra
Buried in the debate over immigration and illegal workers is the
discussion of the H-1B visa program that allows skilled foreign workers to
stay in the country for up to six years before either obtaining a green
card or returning home. The Senate Judiciary Committee voted to increase
the H-1B cap from 65,000 to 115,000 and permit some students to sidestep
the program and become sponsored for a green card immediately upon entering
the country. Opponents of the increase claim that foreign labor brings
down the wages of native-born workers, while advocates of the cap increase
say that foreign workers are healthy for the economy and prevent U.S.
companies from having to export jobs overseas. H-1B supporters claim that
the process must be simplified, and that it takes too long for guest
workers to obtain green card sponsorship. "What you want is a system where
people can get hired directly on green cards in 30 to 60 days," Stuart
Anderson, executive director of the National Foundation for American
Policy, told a House committee yesterday. Economists disagree about the
impact of skilled immigrants on the position of U.S. workers, with some
contending that foreign workers keep wages depressed and take jobs from
their U.S. counterparts. "Those who are here on H-1B visas are being
worked as indentured servants. They are being paid $13,000 less in the
engineering and science worlds," said Ralph Wyndrum, president of the
Institute of Electrical and Electronics Engineers, which advocates
green-card-based immigration only for outstanding candidates. Recruiting
and hiring managers counter that with current green-card caps on workers
from India and China reached, the H-1B visa program is the best way to hire
the most qualified candidates.
U.S. Tech Jobs Back on Track
Investor's Business Daily (03/30/06) P. A6; Howell, Donna
The aftermath of the tech bust and the phenomenon of offshoring caused
many U.S. IT jobs to disappear, though in recent years the situation has
brightened for U.S. workers. With several studies identifying sustained job
growth, and with interest in computer science among college students falling
off, the industry could actually experience a worker shortage. A recent Robert
Half survey found that 12 percent of companies plan to increase their IT
workforces in the second quarter of this year, compared to just 4 percent
that plan cuts. "Companies have hired in finance, information technology,
administration, sales--you name it--over the last six months to a year,"
leading to a greater demand for IT workers, said Robert Half's Jeff
Markham. That demand is particularly strong in the retail sector, where
companies are investing, or, in some cases, re-investing, in technology for
customer relationship management and business intelligence to enhance
customer contact and efficiency and to better understand buying patterns.
Security, enterprise resource planning, and health care are also hot fields
for IT workers, according to the placement company Yoh. ACM reports that
the approximately 2 percent to 3 percent job loss rate in IT due to
offshoring is much smaller than the rate of job loss and creation within
the United States, and that tech jobs should stay strong in the areas where
they have historically been healthy. The telecom bust, driven by massive
over-extension leading up to Y2K, combined with the dot-com collapse to
constrict the supply of jobs, according to Moshe Vardi, co-chair of the ACM
study and a computer science professor at Rice University. "We essentially
had the perfect storm in terms of jobs -- and then there were job losses,"
he said. "We found IT turned around by late 2002." Still, Vardi said the
proportion of college freshmen planning to study computer science dropped
from about 4 percent in 2000 to 1.5 percent in 2004, because the tech crash
caused so many to lose faith in the stability of IT.
The ACM Globalization and Offshoring of Software Report is available at
www.acm.org/globalizationreport/
Science Agency Chiefs Laud Bush's Budget Request
National Journal's Technology Daily (03/29/06) Belopotosky, Danielle
The federal government's role in funding basic research was the subject of
a Senate Commerce Technology, Innovation, and Competitiveness Subcommittee
hearing on Wednesday. The private sector sought to remind policymakers
that the federal government should share some of the burden of funding
basic research, which has an impact on U.S. competitiveness. Chairman Sen.
John Ensign (R-Nev.) acknowledged that the United States must continue to
provide money for basic research, even when there is little room in the
budget for such spending. President Bush wants to cut the deficit in half,
but his fiscal 2007 funding request still includes a 2 percent increase in
non-defense research and development spending, an additional $1.1 billion
that brings the total to $59 billion, according to John Marburger, director
of the White House's Office of Science and Technology Policy. He described
the budget as targeted with investment in key initiatives at the National
Science Foundation, the Energy Department's science office, and the
National Institute of Standards and Technology. Biotechnology,
nanotechnology, information technology, manufacturing, energy sources, and
biometrics are areas targeted by the budget.
ACM President David Patterson's statement on the importance of increased
funding for technology education and R&D investment is posted at
http://campus.acm.org/public/pressroom/press_releases/2_2006/sotu.cfm
Touch-Screen Voting Isn't the Answer
Baltimore Sun (03/31/06) P. 11A; Schneider, John
In framing the electronic voting-machine debate in Maryland around
security, many experts are missing the point, writes John Schneider, an
Internet and data security consultant. Because any system can technically
be rigged or manipulated, security is a relative term, generally a function
of the effort and risk of breaking into a system weighed against the
rewards of doing so. Without a sufficient recovery plan, voters will have
to take on faith from a small group of technologists that their votes have
been counted and recorded accurately. Most involved in the debate agree
that some kind of paper recording mechanism is in order so voters can
confirm their choices. Paper ballots also enable officials to conduct a
hand recount if the machines experience problems. One type of paper trail
would feed a roll under glass for voters to lean forward and read, while
another would have the voter create an individual ballot to be read by an
optical scanner. Schneider writes that the critical difference between
optical scanners and touch-screen systems is that with an optical-scan
system voters prepare a ballot by hand, so the ballot itself cannot be hacked
and remains available should a recount be necessary. Given the value of
Maryland's inventory of touch-screen
systems, an optical-scan voting system could be deployed for a net cost of
around zero while offering the invaluable benefit of restoring confidence
in the state's voting process, Schneider concludes.
ACM's statement on the importance of verified voting procedures is at
www.acm.org/announcement/acm_evoting_recommendation.9-27-2004.html. The ACM
news release on voter registration guidelines to assure privacy and accuracy
is at
http://campus.acm.org/public/pressroom/press_releases/2_2006/vrdfindings.cfm
Artificial Intelligence: The Edge of Research and
Beyond
ZDNet UK (03/29/06) Goodwins, Rupert
Despite failing to live up to its early expectations, artificial
intelligence still commands the attention of researchers. IBM and the
Ecole Polytechnique Federale de Lausanne are partnering on Blue Brain, a
project using a Blue Gene supercomputer to model the 10,000 neurons in a
rat's neocortical column (NCC), a building block that is close to its
counterpart in the human brain. The computer currently has 8,096
processors, each capable of modeling up to 10 neurons, and a peak capacity
of 22.8 trillion floating-point operations per second. Once the
scientists successfully model an NCC, they will replicate it to build a
complete neocortex consisting of about 20 billion neurons. A full model of
every component of the brain would contain about 100 billion neurons, and a
computer powerful enough to create such a simulation could appear within 10
years; in the meantime, scientists expect to gain important insights into
neurological diseases, the functions of memory and sensation, and how the
brain processes data. With 1 billion PCs on the Internet,
each containing 1 billion transistors, there is already enough hardware to
model a human brain, and some experts have argued that the Internet's
processes are not too far behind the mechanisms of the human mind. "Both
the brain and the Web have hundreds of billions of neurons (or Web pages),"
said futurist Kevin Kelly. "Each biological neuron sprouts synaptic links
to thousands of other neurons, while each Web page branches into dozens of
hyperlinks." Scientists are also looking to quantum computing to advance
neural simulation, taking advantage of the similarities between parallel
processing architectures and the neural links in animal brains. Artificial
intelligence enthusiasts look ahead to the singularity, the stage at which
the capabilities of artificial intelligence surpass human intelligence,
though even the most advanced system today still cannot match the
intelligence of a primitive mammal.
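The scaling argument above can be made concrete with a little arithmetic on
the figures quoted in the article; the Python sketch below is purely
illustrative and extrapolates only from those quoted numbers, not from any
Blue Brain project plan.

    # Back-of-the-envelope scaling estimate using only figures quoted above;
    # the extrapolation is illustrative, not part of the project's roadmap.
    neurons_per_ncc = 10_000               # neurons in one rat neocortical column
    processors = 8_096                     # processors in the Blue Gene system
    neurons_per_processor = 10             # upper bound quoted per processor
    neocortex_neurons = 20_000_000_000     # full neocortex model
    whole_brain_neurons = 100_000_000_000  # full brain model

    capacity = processors * neurons_per_processor
    print(f"Current machine: ~{capacity:,} neurons "
          f"({capacity / neurons_per_ncc:.1f}x one NCC)")
    print(f"Neocortex model: ~{neocortex_neurons / neurons_per_ncc:,.0f} columns, "
          f"~{neocortex_neurons / capacity:,.0f}x today's capacity")
    print(f"Whole-brain model: ~{whole_brain_neurons / capacity:,.0f}x today's capacity")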
Women Underrepresented in IT
Minnesota Daily (03/30/06) Aquino, Jeannine
The University of Minnesota's Institute of Technology has stepped up its
efforts to attract more women to its science programs, according to Peter
Hudleston, the institute's associate dean for student affairs. A new hire
will work with admissions and also focus on recruiting women, says
Hudleston. The Institute of Technology Center for Education already offers
the "Exploring Careers in Engineering and Physical Science" program, which
gives high school girls the opportunity to meet science majors at the
university over the course of a week. The efforts come at a time when the
number of women studying engineering at the Institute of Technology is on
the decline. "The percentage here [in the Institute of Technology] and
nationally was 20 percent five years ago," says Hudleston, adding that the
number had fallen to 17 percent in the fall of 2004 and to 15.3 percent
this year. Only 7 percent of women at the Institute of Technology study
electrical engineering; among them is junior Sara Nasiri-Amini, who had
concerns about the low number of women in science programs. "I
thought maybe I'm picking the wrong major because no other girls were
here," says Nasiri-Amini, who ultimately decided that she would continue
studying what she enjoyed.
For information on the activities of ACM's Committee on Women in Computing,
go to
http://acm.women.org
Brain on a Chip May Be Closer to Reality
Photonics.com (03/29/2006)
Stanford University associate professor of bioengineering Kwabena Boahen
is leading a team of researchers trying to imitate the functions of the
brain's neural system with silicon chips. Boahen says that neuromorphic
processors could eventually serve as small computers and replace damaged
neural tissue or restore vision with silicon retinas. Boahen believes a
better understanding of the brain's functions could also lead to more
efficient computers. "When I tried to figure out how computers worked, I
was disgusted," he said. "I thought it was totally brute force. I felt
there had to be a more elegant way to do this." He found it while studying
adaptive computational models at Johns Hopkins University. After an
unsuccessful project where he tried to develop an associative memory chip,
Boahen moved on to study neural circuitry at the California Institute of
Technology. While a professor at the University of Pennsylvania, Boahen
developed a silicon retina with image processing capabilities comparable to
a living retina. Now at Stanford, he is exploring learning and memory in
the human brain as he tries to build a chip with 100,000 neurons, allowing
the researchers to model the activities and interactions of different
cortical areas. Ultimately, Boahen wants to model the different cortical
regions, which control functions such as language, image processing, and
hearing, on an artificial network to study how the brain works. Figuring
out how neurons organize themselves will be critical to making a computer
that can match the performance efficiency of the human brain, and could also
help those who suffer from neurological conditions.
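The article does not describe Boahen's circuit designs, but the kind of
behavior a neuromorphic chip reproduces can be illustrated in software; the
sketch below is a minimal leaky integrate-and-fire neuron with illustrative
parameters, not a description of the Stanford chip.

    # Minimal leaky integrate-and-fire neuron simulated in discrete time steps.
    # Parameters are illustrative only; this is not Boahen's silicon design.
    def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=-65.0,
                     v_threshold=-50.0, v_reset=-70.0):
        """Return the time steps at which the model neuron spikes."""
        v = v_rest
        spikes = []
        for step, i_in in enumerate(input_current):
            # Membrane potential leaks toward rest and integrates the input.
            v += (dt / tau) * (v_rest - v) + i_in
            if v >= v_threshold:        # threshold crossed: spike, then reset
                spikes.append(step)
                v = v_reset
        return spikes

    # Stronger constant drive produces more frequent spiking.
    print(simulate_lif([0.8] * 100))
    print(simulate_lif([1.6] * 100))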
High Court Considers eBay Case on Patent
Washington Post (03/30/06) P. D1; Noguchi, Yuki; Lane, Charles
The Supreme Court heard oral arguments yesterday in the eBay patent
infringement case, which could rewrite the rules governing the use of
intellectual property as the court revisits a century-old precedent holding
that court-issued injunctions are in order once patent infringement has been
proven. The case has drawn interest from a diverse group of parties, with
tech firms such as AOL and Yahoo! rallying behind eBay and arguing that
current laws give patent holders too much power.
MercExchange, which says that it developed the idea for eBay's "Buy It Now"
feature, claims that it is standing up against large companies that are
unwilling to pay for the use of intellectual property. Supported by the
pharmaceutical industry, MercExchange claims that after negotiations for eBay
to purchase the invention, which allows users to buy an item from an online
auction at a fixed price, broke down, eBay stole the invention and used it on
its Web site. Coming on the heels of the dispute between
Research In Motion and NTP, the eBay/MercExchange case is the latest to
highlight the weak points in the troubled patent system, and it could spark
action in Congress. Rep. Lamar Smith (R-Texas) has already introduced a
bill that would restrict the use of permanent injunctions to cases where
patent holders could prove that they would be harmed irreparably without
one, though the bill stalled, due largely to the lobbying efforts of the
pharmaceutical industry. "I think the biggest issue this is going to
result in is a more urgent push for patent reform" in Congress, said patent
attorney Brian Ferguson. EBay has argued that the current patent system is
too rigid to adapt to the fast-moving environment of software technologies.
The court is expected to reach a decision by July.
CMU Uses Game Maker's Characters to Interest Girls in
Computer Programming
Pittsburgh Post-Gazette (03/29/06) Roth, Mark
Electronic Arts has agreed to allow Carnegie Mellon University to use the
animation for characters in "The Sims" to teach computer programming in a
more appealing and less technical fashion. In the "Alice" course, students
will be able to manipulate animated figures on a computer rather than
simply writing code, which Carnegie Mellon says should increase interest in
computer science among women and others who might not otherwise consider
the field. Computer science professor Randy Pausch admits that the
characters were a little primitive when Carnegie Mellon first developed the
Alice program a decade ago, but he said they were "the best we could make
with our own hands. Our characters are a little robotic, while 'The Sims'
are literally state of the art." Programs such as Alice are intended to
curb the overall decline in interest in computer science among students,
and particularly among women. The number of freshmen interested in
majoring in computer science has dropped more than 60 percent in four
years, and the proportion of computer science degrees awarded to women has
fallen below 30 percent. Caitlin Kelleher, a Ph.D. student at Carnegie
Mellon, has modified the Alice program, which is currently used in 113
colleges and at least as many high schools, to appeal to the storytelling
instincts of girls. Kelleher identifies middle school as the crucial time
when many girls form their opinions about math and science, and says that
couching programming in the context of telling a story is a more palatable
way to introduce girls to computer science. Kelleher has created new
motions for the Alice characters to prompt students to develop new story
lines, and the addition of the Sims animations will greatly enhance that
process. Steve Seabolt of Electronic Arts says the company undertook the
program out of "enlightened self-interest," as it hopes to see more
qualified women and minorities in the video game field.
For information on the activities of ACM's Committee on Women in Computing,
go to
http://acm.women.org
A Better Way to Cool Computer Chips Receives
Support
UCR News (03/28/2006)
Researchers at the University of California, Riverside, are exploring new
cooling techniques for microprocessors in high-performance computers. The
performance of VLSI microprocessors is compromised by heat and high power
consumption, which in turn undermines their reliability and shortens their
life cycle, according to Jun Yang, assistant professor of computer science
and engineering. "We are developing a software-based thermal sensing
system that is more accurate at monitoring heat changes during run time,"
said Yang. "Usually, these chips have only one thermal sensor that cannot
get accurate readings for the range of temperatures found throughout the
chip." The effectiveness of existing temperature sensors is limited by
signal noise caused by electronic interference, as well as their ability to
only read temperature at one point on the chip. Yang's solution calls for
a thermal sensor that can quickly and accurately read temperatures from
across the microprocessor, as well as a heat-control technique that
communicates with the software-based thermal sensor to anticipate where heat
will build up and then rapidly cools those areas. The research is
backed by a three-year NSF grant.
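The article describes the goal of software-based thermal sensing but not its
algorithm; as a rough illustration of the anticipatory idea, the sketch below
projects per-block temperature trends forward and flags blocks expected to
overheat. All block names and thresholds are hypothetical.

    # Hypothetical sketch of anticipatory thermal monitoring for chip blocks;
    # the article gives no implementation detail, so everything here is
    # invented for illustration.
    def predict_hotspots(samples, limit_c=85.0, horizon=3):
        """samples maps block name -> recent temperature readings (degrees C).
        Flags blocks whose linear trend crosses limit_c within horizon steps."""
        hotspots = []
        for block, temps in samples.items():
            if len(temps) < 2:
                continue
            rate = temps[-1] - temps[-2]           # degrees per sampling interval
            projected = temps[-1] + rate * horizon
            if projected >= limit_c:
                hotspots.append((block, projected))
        return hotspots

    readings = {
        "integer_alu": [70.0, 74.0, 79.0],  # heating quickly: should be flagged
        "l2_cache":    [55.0, 55.5, 56.0],  # nearly steady: should not be
    }
    for block, projected in predict_hotspots(readings):
        print(f"{block}: projected {projected:.1f} C, trigger cooling")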
Business Skills in Demand for IT Workers
Network World (03/29/06) Dubie, Denise
IT employers will be placing more emphasis on business-related skills in
the years to come, according to a new survey of 100 companies by members of
the Society of Information Management (SIM). Kate Kaiser, a charter member
of the Wisconsin chapter of SIM and an associate professor at Marquette
University, says there has been a need for IT professionals to pick up
business skills for some time, but employers now want them to have business
and industry knowledge much earlier in their careers. "Computer science is
very technical by design, but two of the more popular areas in demand are
systems analysis and systems design, both of which are customer-facing
positions that require user interaction and communications skills," says
Kaiser. Companies cited business-related capabilities as five of the 10
skills they most need to retain. They also said there are not enough project
managers with skills in project planning, leadership, and risk management,
adding that entry-level employees often lack communication skills. The
report says employment numbers for the IT industry, including in-house,
independent contractors, and third-party provider full-time equivalents,
will remain largely the same from 2005 through 2008. And outsourcing will
have little impact on employees in the United States.
The ACM Globalization and Offshoring of Software Report is available at
www.acm.org/globalizationreport/
Researcher: DRM Technology Fails in Practice
IDG News Service (03/27/06) Kirk, Jeremy
The music and film industries should focus more on their business models
than digital rights management (DRM) technology, computer security
researcher Ian Brown said during the Changing Media Summit in London.
Brown said DRM remains a flawed encryption technology because the
algorithms used for watermarks--the invisible instructions that determine
the usage of the file--are not sophisticated. Brown, a senior research
manager at the Cambridge-MIT Institute in England, believes DRM
technologies would work better for time-sensitive events, such as a broadcast
of a live sporting event, because for less perishable content a determined
hacker will eventually be able to remove the watermark and post the file to a
peer-to-peer network.
Entertainment companies view DRM technology as a way to place limits on
what consumers can do with music and movies, and protect against piracy.
"Fundamentally, it's an anti-user technology," said Brown. "It's a
technology that allows content owners to provide data to their customers
with restrictions on how they can use it that aren't justified by copyright
law."
Device Warns You if You're Boring or Irritating
New Scientist (03/29/06) Biever, Celeste
At next week's Body Sensor Network conference at the Massachusetts Institute
of Technology, researchers are scheduled to present a device that informs
people with autism when they are boring or annoying the person they are
talking to. The "emotional social intelligence prosthetic" device is an
improvement over previous computer programs that detect the basic emotional
states of happiness, sadness, anger, fear, surprise, and disgust because it
focuses on the more complex states of agreement, disagreement,
concentration, thinking, uncertainty, and interest, which appear more
frequently in conversation. Built by Rana El Kaliouby of MIT's Media Lab,
her colleagues Rosalind Picard and Alea Teeters, and Peter Robinson of the
University of Cambridge, the device consists of a camera (small enough to
be attached to eyeglasses) connected to a handheld computer that uses image
recognition software, and software that can read the emotions of the
images. The software makes the handheld vibrate when its wearer does not
engage the listener. The device, which gets emotions right 64 percent and
90 percent of the time when presented with video footage of ordinary people
and actors, respectively, is based on a machine-learning algorithm that was
trained by showing it more than 100 eight-second video clips of actors
expressing different emotions. The researchers say they still need to
reduce the device's computing demands so it can run on a standard handheld,
find a high-resolution digital camera that is easy to wear, and train
autistic people to use it. Teachers, in addition to autistic people, could
also benefit from the device.
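The article describes the processing pipeline only at a high level, so the
sketch below mocks up the camera-to-vibration loop with a placeholder
classifier; the function names and thresholds are hypothetical and do not
reflect the actual MIT/Cambridge software.

    # Hypothetical sketch of the camera -> emotion classifier -> vibration loop.
    # classify_expression() is a stand-in for the image-recognition and
    # emotion-reading software described in the article.
    import random

    STATES = ["agreement", "disagreement", "concentration",
              "thinking", "uncertainty", "interest"]
    DISENGAGED = {"disagreement", "uncertainty"}  # illustrative "warn" states

    def classify_expression(frame):
        return random.choice(STATES)              # placeholder classifier

    def feedback_loop(frames, window=5, threshold=3):
        """Vibrate when most of the recent frames suggest disengagement."""
        recent = []
        for frame in frames:
            recent.append(classify_expression(frame) in DISENGAGED)
            recent = recent[-window:]
            if sum(recent) >= threshold:
                print("vibrate: listener appears disengaged")
                recent.clear()                    # avoid repeated alerts

    feedback_loop(range(30))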
Researchers Cooperate to Create Better Ways of Finding
Reliable Information Online
Chronicle of Higher Education (03/29/06) Kiernan, Vincent
R. David Lankes, an associate professor of information studies at Syracuse
University, and Michael Eisenberg, a professor in the University of
Washington's Information School, want to make it easier for Internet users
to find credible information online. The two researchers have received a
two-year, $250,000 grant from the John D. and Catherine T. MacArthur
Foundation to build a Web site, Credibility Commons, which will offer
computer programs to help Web users assess the credibility of information
they find online. Lankes says librarians, college instructors, and other
information specialists continue to note that the quality of information
online varies tremendously, and that users are more likely to trust
information if a site has a professional appearance. Lankes adds that
users are more likely to believe information if it is in line with their
own thinking. As co-directors of the project, Lankes and Eisenberg are
considering developing a search engine that would direct users to the Web
sites used by skilled searchers, such as reference librarians. Although
software developed by Credibility Commons would be available for free at
the project's Web site, anyone who creates software based on the work of
Lankes and Eisenberg would have to share their new application as well.
"If you use it, you've got to share what you used it for," says Lankes.
Everything, Everywhere
Nature (03/23/06) Vol. 440, No. 7083, P. 402; Butler, Declan
Tomorrow's computers could be networks of minute, low-cost sensor nodes
with built-in data processing and transmission capabilities; by constantly
monitoring environments, buildings, and even the human body, these networks
could usher in a transformation in the field of science. "We will be
getting real-time data from the physical world for the first time on a
large scale," says University of Washington computer scientist Gaetano
Borriello, noting that this will facilitate a paradigm shift in which
theories can be generated and tested much more rapidly. Robert Detrick
with the National Science Foundation's Ocean Observatories Initiative (OOI)
explains that sensor webs will allow researchers to integrate inputs from
diverse sensors interactively and build "virtual observatories."
Programming a sensor web for a specific scientific application is currently
a formidable challenge, given the amount of customization required. Center for
Embedded Networked Sensing director Deborah Estrin says scientific
fieldworkers must contend with major sensor-web shortcomings: The price of
sensors currently precludes the node densities researchers frequently need
to conduct detailed field tests, while not all monitoring requirements can
be fulfilled by sensor webs alone. Estrin projects that sensor webs will
often function as just one tier in a stack of data collecting systems, and
machine-to-machine communication will be needed on a grand scale to manage
these stacks, thus necessitating the development of new operating systems
and standards. Sensor-web tools will have to become more user-friendly if
they are to break out of niche applications, according to Dust Networks
founder Kris Pister.
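Estrin's picture of sensor webs as one tier in a stack of data-collecting
systems can be sketched as follows; the node names, fields, and thresholds
are invented for illustration and do not come from any OOI or CENS system.

    # Illustrative two-tier sensor stack: low-power nodes summarize their own
    # readings, and a gateway tier aggregates the summaries before forwarding
    # them upstream (e.g., to a server assembling a "virtual observatory").
    from statistics import mean

    def node_report(node_id, readings):
        # A node condenses a window of samples to save radio bandwidth.
        return {"node": node_id, "mean": mean(readings),
                "min": min(readings), "max": max(readings)}

    def gateway_aggregate(reports, alert_above=30.0):
        # The gateway merges node summaries and flags anomalies for the next tier.
        return {"site_mean": mean(r["mean"] for r in reports),
                "alerts": [r["node"] for r in reports if r["max"] > alert_above]}

    reports = [node_report("soil-1", [18.2, 18.4, 18.1]),
               node_report("soil-2", [19.0, 31.5, 19.2])]
    print(gateway_aggregate(reports))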
Future Shock
Network World (03/27/06) Vol. 23, No. 12, P. 54
A quintet of futurists--Burrus Research Associates CEO Daniel Burrus,
Institute for Global Futures CEO James Canton, Technology Futures' David
Smith, Foresight Nanotech Institute President Marc Lurie, and AT&T Labs'
Sid Ahuja--project upcoming breakthroughs in networking. Burrus foresees a
lifestyle revolution through super-intelligent agents that use neural
networks to learn people's habits and preferences and eventually anticipate
their needs; he expects quantum computing advances to make such agents
feasible, while access to agents will be made possible through Web-based
services. Canton anticipates the emergence of collaborative, intuitive,
personalized, predictive, self-reflective, and self-repairing networks, as
well as embedded devices that augment human productivity and/or
intelligence, from the convergence of nanotechnology, biotechnology, and
IT. Smith envisions strong demand for IPTV and the extension of electronic
games beyond consoles and PCs thanks to peer-to-peer networks that support
do-it-yourself, individual content generation, which will piggyback on
increasing computing power, storage expansion, the penetration of
broadband, and the axiom that large networks scale exponentially with the
size of the network. Ahuja is looking forward to the day when employees
can do their jobs from anywhere and stay connected to the network and the
data they need via broadband video, and he expects such services will be
outsourced to mobile virtual network operators, while internal networks and
corporate information databases will still be controlled by IT executives.
Finally, Lurie believes Moore's Law will eventually be trumped by nanotech,
paving the way for innovations such as network systems fashioned from human
ribosomes, and a roughly 1,000-fold gain in network computing power and
performance.
Louisiana Invests in Immersive Technology
Federal Computer Week (03/27/06) Vol. 20, No. 8, P. 44; Sarkar, Dibya
Officials in Louisiana are finalizing the construction of the Louisiana
Immersive Technologies Enterprise (LITE), a $27.5 million 3D visualization
complex powered by a supercomputer that they expect will become the center
of a new Silicon Valley in the Bayou. LITE, which could be operational as
early as the end of April, will be immediately used for the development
projects of gas and oil companies, though if successful, it could draw
companies engaged in medical, biotechnology, environmental, and other
activities, as well as help the Gulf Coast recover from hurricanes
Katrina and Rita. "We're not trying to be industry-specific, but we are
trying to be science-specific," said Gregg Gothreaux, president and CEO of
the Lafayette Economic Development Authority, which is also partnering with
the state government and the University of Louisiana to develop the LITE
facility at the university's Lafayette campus. The 70,000-square-foot site
will house 22 SGI servers and have a data storage capacity of 8 TB. One of
the center's four rooms is an immersive space, where visitors can surround
themselves with data represented visually by six projectors beaming images
onto the ceiling, floor, and walls. This technology could help gas and oil
companies search for resources in the Gulf of Mexico and aid in medical
procedures, notes Ramesh Kolluru, director of the university's Center for
Business and Information Technologies. Other companies expect to use the
technology to help with hurricane recovery efforts. Louisiana's state-wide
computing network, the Louisiana Optical Network Initiative, links to the
National LambdaRail, providing the researchers with added computing
capacity if the resources at the center are exhausted.
A Tiered Test Approach to Validating Software
Software Test & Performance (03/06) Vol. 3, No. 3, P. 33; Dada, Aditya
Aditya Dada with the Sun Java System Application Server Quality
Engineering group recommends the use of a tiered testing model to catch
defects as early in the software development process as possible, because
such a model embeds checks within every phase. To minimize the number of
bugs at each stage of product development, the product build was released to
the test and development engineers in several forms and increments, so that
each increment could be exercised by a progressively more comprehensive test
suite. Dada's group devised a tiered set of builds with matching sets of
tests: Tinderbox builds that track engineers' product
changes through continuous compilation, nightly builds, more thorough
weekly builds, and milestone builds that typically take place after a
system freeze. Quick-look tests were created to confirm the integrity of
engineers' changes, while smoke tests were designed to help the release
engineering group spot problems in weekly builds. There are also basic
acceptance tests that run on any platform and that deeply analyze all of
the most-supported features and configurations; the full test base that the
quality engineering team runs on all supported configurations; developer
unit tests; and the compatibility test suite. The tiered product build
testing approach employs a strategy to catch bugs on a per-change,
per-night, per-week, and per-milestone basis. Per-change testing can
combine the tinderbox build with quick-look tests, per-night testing
executes the quick-look tests, and per-week testing involves the execution
of quick-look, smoke, compatibility, and sometimes basic acceptance tests.
Per-milestone testing involves running the entire test base on the
milestone build.
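The mapping of build tiers to test suites described above can be summarized
in a small table-driven sketch; the suite names follow the article, but the
data structure and driver are illustrative rather than Sun's actual harness.

    # Illustrative mapping of build tiers to the test suites described above;
    # suite names follow the article, the harness itself is a hypothetical sketch.
    TIERS = {
        "per-change (tinderbox build)":    ["quick-look"],
        "per-night (nightly build)":       ["quick-look"],
        "per-week (weekly build)":         ["quick-look", "smoke", "compatibility",
                                            "basic acceptance (sometimes)"],
        "per-milestone (milestone build)": ["full test base"],
    }

    def run_suite(name):
        print(f"    running {name}")       # placeholder for launching a real suite

    for tier, suites in TIERS.items():
        print(f"{tier}:")
        for suite in suites:
            run_suite(suite)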