Dealing with Exponential Explosion – All Pairs and PICT tool
As the number of inputs that a software system accepts grows, the number of different combinations of these inputs grows exponentially. It does not take a lot of input parameters and values to have so many combinations that it is not practical to cover all the possible combinations by tests. One technique to deal with this problem is the All-Pairs test technique.
This lecture explains the All-Pairs test technique and demonstrates how to apply it using the PICT tool (freeware from Microsoft).
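To give a flavor of the idea: the parameter model and greedy algorithm below are illustrative only (PICT uses its own, smarter heuristics), but they show why all-pairs coverage shrinks the test set so dramatically compared with exhaustive combination.

```python
from itertools import combinations, product

# Hypothetical parameter model, in the spirit of a PICT input file.
params = {
    "OS": ["Windows", "Linux", "macOS"],
    "Browser": ["Chrome", "Firefox", "Edge"],
    "Locale": ["en", "he", "de"],
    "Network": ["wifi", "ethernet"],
}
names = list(params)

# Exhaustive testing: the product of all value counts (3*3*3*2 = 54).
full = 1
for vals in params.values():
    full *= len(vals)

# All-pairs goal: every pair of values from two different parameters
# must appear together in at least one test.
uncovered = set()
for n1, n2 in combinations(names, 2):
    for v1 in params[n1]:
        for v2 in params[n2]:
            uncovered.add(((n1, v1), (n2, v2)))

def pairs_of(test):
    # The set of parameter-value pairs a single test covers.
    return {((n1, test[n1]), (n2, test[n2])) for n1, n2 in combinations(names, 2)}

# Naive greedy selection: repeatedly pick the candidate test that
# covers the most still-uncovered pairs, until every pair is covered.
candidates = [dict(zip(names, combo)) for combo in product(*params.values())]
tests = []
while uncovered:
    best = max(candidates, key=lambda t: len(pairs_of(t) & uncovered))
    tests.append(best)
    uncovered -= pairs_of(best)

print(f"exhaustive: {full} tests, all-pairs: {len(tests)} tests")
```

Even on this toy model, the greedy pass covers every pair with roughly a fifth of the exhaustive test count, and the gap widens exponentially as parameters are added.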
This paper was presented at a SIGiST event in Tel Aviv, Israel, in 2009.
Right and Wrong in Writing Test Cases
Test case design is a common and well-covered topic in software testing publications. On the other hand, very little is said about the correct writing style to use when writing test cases.
Using test-case examples, this presentation helps readers understand how a better writing style can simplify test cases and improve their clarity. It also covers common mistakes in test-case writing and organization, along with practical proposals for improvement.
This paper was first presented at SIGiST 2009, Tel Aviv, Israel, and later at QA&Test 2010, Bilbao, Spain.
Patterns in Test Automation project failure – and Recovery
When a test automation project fails, it frequently follows one of a number of “failure patterns”. Test managers can mitigate or avoid these failures by being aware of the patterns, looking for the “alert signals” that tell when a pattern is emerging, and deploying a set of “countermeasures” against these failure risks.
This paper was created with Alon Linetzki (www.best-testing.com).
This paper was first presented at SIGiST 2006, Tel Aviv, Israel, and later at QA&Test 2011, Bilbao, Spain.
This paper is the predecessor of the more detailed “The Pathologies of Failed Test Automation Projects” (paper & presentation – see links below)
The Pathologies of failed Test Automation Projects
Most test automation projects never die — they just become a mess and are redone. Initial solutions that start well and are full of promise often end up as brittle and unmaintainable monsters consuming more effort than they save. Political feuds can flourish as different automation solutions compete for attention and dominance. Tests become inefficient in both execution time and resource usage. Disillusionment ensues, projects are redefined, and the cycle begins again. Surely we can learn how to avoid such trouble on the next project. In this presentation, I describe failure patterns I have encountered during my years in software testing, and give suggestions on how to detect them early and how to avoid or mitigate them.
This paper first appeared in the format of the “Patterns in Test Automation project failure – and Recovery” paper, created with Alon Linetzki (www.best-testing.com). I later added other failure patterns and wrote this one.
This paper was first presented at STAREAST 2013, Orlando, FL, USA.
Roadblocks to Bug Reporting
Testers sometimes find a bug, yet do not report it. The reasons vary, ranging from the technical to the psychological. This paper describes a list of such situations and proposes strategies for overcoming them. Testers who are aware of these cases can recognize them when they take place and apply the proposed solutions to avoid them.
This paper was first presented at SIGiST 2008, Tel Aviv, Israel, and later at Conquest 2008, Potsdam, Germany.
The Home-grown Tools Syndrome
Test management is a generic process, yet much effort goes into developing in-house tools to do this work. Learn the reasons for this phenomenon and get suggestions for avoiding it.
Exit Criteria, Software Quality, and Gut Feelings
Bug counts and trends don’t cover all the quality aspects of a product. A good exit-criteria list provides an orderly set of attributes that research and experience have shown to impact product quality, so you can monitor product quality at any given time and forecast the expected status at release. That’s how you improve your product.
This article was first published on http://www.testingworld.co.il (in Hebrew) and then on www.stickyminds.com on November 14, 2016 (https://www.stickyminds.com/article/exit-criteria-software-quality-and-gut-feelings).
Less is More: Picking Your Test Automation Language
It’s a classic dispute: Two test automation engineers can’t agree on which programming language to use. In some contexts, the strong points of a certain language definitively make it the right choice, but what do you do when either language could work well for a project? That’s when it becomes a managerial decision.
This article was first published on http://www.testingworld.co.il (in Hebrew) and then on www.stickyminds.com on December 28, 2015 (https://www.stickyminds.com/article/less-more-picking-your-test-automation-language).
The Secret to Change Management: Creating a New Tradition
When we try to implement new processes, there is often resistance from the team. People get so used to their typical habits that it doesn’t occur to them that there could be a better way to do things. To get buy-in from everyone, you need to understand the current traditions, then think about how you can set an example to start making the new processes a new tradition.
This article was first published on http://www.testingworld.co.il (in Hebrew) and then on www.stickyminds.com on June 6, 2016 (https://www.stickyminds.com/article/secret-change-management-creating-new-tradition).
Who is to Blame for Software Testing’s Low Esteem?
Download Article (Hebrew)
Software testing does not enjoy the high prestige that software development enjoys. In fact, it is seen as a second-level job in the development hierarchy. Why is this so? Is this a lost cause, or can we do something to change this stigma?
This article was published on http://www.testingworld.co.il (in Hebrew)
Download Article (Hebrew)
When we speak about test automation, our first association is of a major integrated system: an execution controller, scripts, monitors, logs, reports, a user interface and a database that supports the whole thing. Something big and complex.
But if we use James Bach’s definition of test automation (“any use of tools to aid testing”), we quickly realize that we write many such tools every year. Since tool writing is a major part of our daily work, it makes sense to think about the correct management of this activity.
This article was published on http://www.testingworld.co.il (in Hebrew)
More Important than Techniques
When discussing test design, much of the focus is on test techniques: how to select an effective set of tests out of the practically infinite list of possible tests.
However, even before test techniques, it is critical that testers are familiar with a number of principles and rules about testing; principles that deal with the question of “how does one test” rather than “how to design the tests”.
In this paper, I will discuss the very basic principles of the testing profession. Knowing these principles will help testers create better tests and avoid tests that don’t really test anything or are insensitive to some problems.
This paper was first presented at SIGiST 2012, Tel Aviv, Israel.
Effective Bug Management
Effective bug management is a critical activity in a software project. When bug management is ineffective, the project as a whole suffers: time, effort and energy are spent not on fixing bugs, but on arguments and delay tactics.
The parameters involved in effective bug management span the range from purely technical issues to human behavior and organizational politics. In many cases, these parameters conflict – satisfying one will result in neglecting another. Finding the best solution for the conflicts is not easy. Components of the process, such as the bug triage, can be improved when the process owner learns to work with or around the technical, organizational and personal aspects.
In this paper I share experience collected over many years of involvement in bug management activities. The talk includes a description of the challenges that are part of the process, and the solutions used by my organization to deal with them.
This paper was first presented at EuroSTAR 2009, Stockholm, Sweden.
The Fab Experience:
How I Stopped Whining and Started to Appreciate Process
Many software teams struggle to implement and comply with quality processes. Process adoption and strict adherence are seen as stifling, innovation-blocking and redundant.
Contrast this with semiconductor fabrication plants (fabs) where adherence to process is the everyday norm and no fab engineer feels compliance with quality processes is optional.
Studying the top reasons why the fab world follows process so diligently reveals some underlying ideas that can be used in the software development world. Applying these ideas, software companies can improve the ability to implement and adhere to quality processes.
This paper was first presented at PNSQC, Portland, OR, USA (October 2014).
Is What You See What You Get?
Computer vision is gaining a foothold in everyday commercial systems. Face recognition is now part of Windows 10; virtual reality games and hand-gesture control of the PC are just some examples of the coming flood of applications.
Intel is one of the companies leading this trend with its depth-camera offerings and the RealSense SDK.
This is great… but how do you test these applications? It turns out that many of the standard approaches don’t work well, and testers need to constantly invent new ways to test. This presentation describes the very difficult challenges facing testers of computer vision applications and some of the directions taken.
Body of Evidence: The Challenge of Measuring People
Body models play an increasing role in the virtual world. The market offers a plethora of full-body scanners (including RealSense-based), but is struggling to define the quality metrics for these models. How do you validate a body-scanning system? Learn about this difficult problem, the emerging solutions and the unresolved challenges in this presentation.
Are you wasting your organization’s resources with Innovation?
Are you sure your Innovation effort is not misdirected, irrelevant and redundant?
Over-focus on Innovation is a mistake that is repeated frequently in the software test profession, where it is promoted for the wrong reasons. While effective in the short term, such innovation may yield frustration and disillusionment more than innovative solutions.
Using examples from various disciplines – from art to engineering – the lecture discusses the problem of over-focus on innovation and presents a balanced approach of encouraging innovation while using existing solutions.
Writing Good Requirements
This is a workshop that I conduct internally at Intel, but have also taught externally. It is based on material first written by Erik Simmons (Intel), with many changes, modifications and additions that I made over the years of teaching it.
This workshop is targeted at engineers who read, review, write or use functional and non-functional requirements at the product level and at the feature level.
The workshop covers:
– Why requirements are important
– What makes a good requirement
– Reviews: why reviews are important; types of reviews; how to review a document
– Pitfalls in writing requirements
– Techniques for writing good requirements
The workshop is a mix of theoretical material and hands-on sessions where students write and review requirements.
Upon completion of this one-day workshop, students are able to analyze functional and non-functional requirements, write good requirements, and identify requirements that are not well written.
There was an attempt to videotape this course, but it was stopped for various reasons before completion. The videotaped version was planned to be a significantly shorter version of the workshop. You can see a short excerpt here: