

A Guide to Testing Web Applications - Remote Talk

A micro case study with macro lessons learned and examples of Exploring and Testing a Web Application.

The full version of this talk is available in Evil Tester Talks


Dev Fest Bishkek 2019

A remote talk held on 14th December 2019 for The Bishkek Dev Fest.

I tested an application for about a day, making notes, and then generalised these into lessons learned and categories of Test Sessions to help build a better model of how I approach testing.

This talk is available as an expanded (31 video) course on Patreon

A Guide to Testing Web Applications

Have you ever wondered how other people test applications? Not in theory, but in practice? What thought processes are used? How do they model the application? What tools are used? How do they track the testing? That's what this talk is all about.

This talk is based on a short Case Study of testing an open source web application. Why open source? Because then there is no commercial confidentiality around the process, tools, or thought processes.

Alan will explain his thought processes, coverage, approaches, tools used, risks identified, and results found, and generalise from these into reusable models and principles that can be applied to your testing. This covers the What and the Why of practical exploratory web testing.

Notes and Slides

  • The notes that I made during my testing sessions have been collated into a PDF
    • you can see the detail of the notes I take
    • hopefully they are understandable, but remember they were designed to support me, not you
    • if this were a real project and you needed to understand them, we would debrief and I would explain them

The slides are available on Slideshare.

TL;DR: I split my testing into small chunks, each with a high-level aim and a generic classification. I make notes. I emphasise evidence more than upfront planning.

One Thing

One thing I’d like to see more of is actual output from the testing that people perform.

The issue is that most testing happens on a project and is therefore confidential.

The only way we can ‘see’ the testing people do is when they perform it on an open source application.

I have previously created examples of how I test on YouTube, but these have the disadvantage that I’m often trying to ‘explain what I do’ as I do it. This generates more cognitive load and changes the way that I test.


For the Bishkek talk, I wanted to:

  • test an application
  • as closely as possible to how I test when on a project
  • make the same level of notes
  • demonstrate how I track sessions
  • explain how I come up with test ideas

To do that, I decided to use a debrief format, where I do the testing and then debrief it to analyse what I did and why I did it.

Session Types

Over the course of my testing I classified my Session Types as:

  • Install
  • Health Check
  • Planning
  • Recon / Modelling
  • Debrief
  • Coverage
  • Exploratory
  • Admin

This is not a complete list of session types I use when testing. This is the list of session types I used when performing this testing.

You do not have to use the same names as these; own your testing, and name them as appropriate for you and your environment.
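As an illustration (my own sketch, not tooling from the talk), session classifications like these can be tracked as simple tags on a session log, which makes it easy to see afterwards how testing time was spent. The type names are the ones listed above; everything else is assumed.

```python
from collections import Counter
from dataclasses import dataclass, field

# Session types used in this testing exercise; rename to suit your context.
SESSION_TYPES = {"Install", "Health Check", "Planning", "Recon / Modelling",
                 "Debrief", "Coverage", "Exploratory", "Admin"}

@dataclass
class Session:
    session_type: str
    aim: str                                   # high-level aim for this chunk of testing
    notes: list = field(default_factory=list)  # linear notes made during the session

    def __post_init__(self):
        # Guard against typos in the classification.
        if self.session_type not in SESSION_TYPES:
            raise ValueError(f"Unknown session type: {self.session_type}")

# Hypothetical session log following the rough Planning -> work -> Debrief order.
sessions = [
    Session("Planning", "decide what to test next"),
    Session("Exploratory", "investigate login error handling"),
    Session("Debrief", "review what the exploratory session found"),
]

# Summarise how the sessions were classified.
print(Counter(s.session_type for s in sessions))
```

Reverse-engineering coverage from notes, as discussed below, becomes simpler when every session carries a classification like this.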

My sessions usually started with a Planning session so I would know what I was doing, and were followed by a Debrief session so I knew what I had done, giving the following rough order.


When I make notes, I:

  • use a linear format
  • keep simple text notes
  • add timestamps to my notes
  • record data and actual observations
  • write down questions
  • identify todos
  • describe defects
  • etc.
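The note-taking practices above can be sketched in a few lines of code. This is a minimal illustration of linear, timestamped text notes with tags for questions, todos, and defects; it is my own assumption of a workable shape, not the author's actual tooling.

```python
from datetime import datetime, timezone

def note(log, text, tag=""):
    """Append a timestamped line to a linear, plain-text session log.

    Tags like QUESTION, TODO, and DEFECT mark entries for later debriefing.
    """
    stamp = datetime.now(timezone.utc).strftime("%H:%M:%S")
    prefix = f"{tag}: " if tag else ""
    log.append(f"{stamp} {prefix}{text}")

# Hypothetical session notes, in the order they were made.
log = []
note(log, "Started Health Check session on the admin UI")
note(log, "Does the search handle empty input?", tag="QUESTION")
note(log, "Retest file upload with a 0-byte file", tag="TODO")
note(log, "500 error when saving a user with a blank name", tag="DEFECT")

print("\n".join(log))
```

Because the notes are plain text in strict chronological order, they can later be grepped by tag to pull out the questions, todos, and defects for a debrief.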

Lessons Learned

I found this a useful exercise because it caused me to ‘name’ my session types, which I had only ever done informally before.

We explore at all points in our testing.

I plan in short bursts before I test, to the level I need to help me prioritise and perform the next action. And I make planning decisions as I test.

  • The more we are focused on coverage,
    • the more we constrain our exploration to that coverage.
  • The more we are focused on exploration,
    • the more our coverage has to be reverse engineered from our logs, evidence, and notes.

There are more lessons learned in the slides.


Mind map of the talk


Below are some of the animated gifs I use in the talk to showcase some of my notes.

Overview of Health Check Session Notes

Issues found in initial Health Check Session

Overview of Recon Session Notes

Overview of Debrief Session Notes

Overview of Coverage Session Notes

Overview of Exploratory Session Notes

Additional Resources

I expanded some of these as Patreon posts as I was testing.


This talk is also available, with bonuses (e.g. transcript, extra videos, exercises, and resources), in Evil Tester Talks: Technical Testing.