Automated Testing for EDU: It's Never Too Late


Two weeks ago at WPCampus Online I presented Automated Testing for EDU. This session was a high-level overview of different types of automated testing and when to use them. I structured the presentation to show how automated testing can be added at different phases of a project’s life cycle, starting with old sites where motivation to add new processes may be low. I ended with the planning stage, when tests can help define the project scope and goals. Along the way, I tried to dispel myths and excuses that can stop a team from using automated tests. There is almost always a way for automated testing to help you move faster or increase safety (or both).

Visual Regression Testing for Zombie Projects

The main myth I wanted to dispel in my presentation was that automated testing has to be started at the beginning of a project. I think the best counterargument is the value of adding visual regression tests to a zombie project.

A zombie project is that website no one on the team wants to update. It was built years ago by long-gone coworkers. Changing anything about the site might break it. No one even has a solid definition of what is “working” and what would be “breaking.” Zombie sites tend to be many versions behind on their security updates because no one wants to be responsible for a regression caused by a change.

This lack of confidence and familiarity means a long, anxious manual QA process. This is exactly the kind of situation where visual regression testing can offer a big benefit. Visual regression testing means taking screenshots before and after applying updates (often between staging and live environments) and having a computer do the screenshot comparison.
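The comparison step is the part worth automating. Real tools (BackstopJS, for example) capture the screenshots for you; the sketch below, where a "screenshot" is just a 2D list of pixel values and the function names are my own, is only meant to show how mechanical the diffing itself is:

```python
# Toy sketch of the comparison step in visual regression testing.
# A "screenshot" here is a 2D list of pixel values; real tools capture
# and diff actual images, but the logic is the same shape as this.

def diff_ratio(before, after):
    """Return the fraction of pixels that changed between two screenshots."""
    total = 0
    changed = 0
    for row_before, row_after in zip(before, after):
        for px_before, px_after in zip(row_before, row_after):
            total += 1
            if px_before != px_after:
                changed += 1
    return changed / total if total else 0.0

def flag_regressions(pages, threshold=0.01):
    """Flag pages whose before/after screenshots differ by more than threshold."""
    return [name for name, (before, after) in pages.items()
            if diff_ratio(before, after) > threshold]
```

A human would have to eyeball every page; this flags only the pages that actually changed, which is exactly how the duplicated-title bug described below gets caught.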

The first time I tried this process I found that one template configuration out of many had titles printing twice. There is a really good chance that had I done manual QA, my eyes would have glossed right over the duplication as I opened a bunch of tabs and vaguely looked for something broken.

Automated tests are most helpful when they take a slow, error-prone, but easily understood process, and do it really fast and very accurately. For this reason, visual regression tests are almost always the best place to start for any team looking to automate their QA.

Code Sniffing for the Workflow Overhaul Stage 

Often, a team moving their sites to Pantheon is doing so as part of a larger effort to level up their developer operations (sometimes I like to un-short DevOps to remind myself that it’s not a mystical concept). In that situation, there is a good chance the team wants to get to a state where every code change is accompanied by a series of green check marks (or red Xs) confirming (or denying) that things still work. One trap here is a chicken-and-egg situation: do you write automated tests even if you don’t have a process to run them? Do you change your process to run automated tests even if you don’t have any? The answer is simple! Burn down that bikeshed by using an automated test that someone else wrote!

Code sniffing is a way to automatically check whether code is in compliance with a set of guidelines like the Drupal or WordPress coding standards. These standards tell you how many characters you can use in a single line comment or where line breaks go. The people using your website won’t care whether you used tabs or spaces in your PHP files (and the server won’t either) but your coworkers sure will.
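In practice you would run PHP_CodeSniffer with the Drupal or WordPress ruleset rather than write your own. But to show how mechanical these rules are, here is a toy sniffer (the function name, rule set, and 80-character limit are my own illustrative choices, not any particular standard):

```python
# Toy illustration of the kind of checks a code sniffer performs.
# Real projects use PHP_CodeSniffer with an established ruleset; this
# sketch just shows that the rules are mechanical: line length,
# trailing whitespace, and indentation are trivial for a computer.

MAX_LINE_LENGTH = 80  # limits vary by standard; 80 is a common choice

def sniff(source, indent_char=" "):
    """Return a list of (line_number, message) style violations."""
    violations = []
    for num, line in enumerate(source.splitlines(), start=1):
        if len(line) > MAX_LINE_LENGTH:
            violations.append((num, f"line exceeds {MAX_LINE_LENGTH} characters"))
        if line != line.rstrip():
            violations.append((num, "trailing whitespace"))
        stripped = line.lstrip()
        indent = line[: len(line) - len(stripped)]
        if indent and set(indent) != {indent_char}:
            violations.append((num, "unexpected indentation character"))
    return violations
```

Checks like these run in milliseconds on every commit, which is why they make a good first green check mark.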

I used to fall into the trap of emphasizing coding standards without automating their checking. In that scenario, I would often notice coding standards errors at the beginning of a code review, and those nitpicks were the first feedback my coworkers would hear. That’s a recipe for friction, frustration, and failure: the reviewer can convince themselves they did a meaningful code review when they never really evaluated the substance.

Like visual regression tests, code sniffing is a task done much better and faster by a computer. It is also far less likely than visual regression testing to produce a false positive. If what you care about most is adding a green checkbox to your pull request process, then start with code sniffing. Just don’t take a false sense of security from that passing test: a style check verifies nothing about the behavior of the site.

An Automated Test for Every Bug Fix After Launch

Here is a pattern that leads a team to never write tests:

  • Before launch: we don’t have time to write tests! We have too many features to build.

  • After launch: I don’t have time to write tests! I have too many bugs to fix.

That pattern can be cut off at either stage. Let’s take the second half. A simple definition of a bug is a mismatch between expectations and reality. If your stakeholders expect their site to behave in a certain way, they will let you know when it does not.

Automated tests can help you in this situation because of the three ingredients in any test:

  1. A way of defining the expected behavior of a piece of software

  2. A way of measuring the actual behavior of that software

  3. A way of comparing the two
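In code, those three ingredients map directly onto the anatomy of a unit test. A minimal sketch (the `slugify` function is a made-up example, not anything from the talk):

```python
# The three ingredients of any automated test, at their simplest.
# slugify() is a stand-in piece of software under test.

def slugify(title):
    """Turn a page title into a URL slug."""
    return title.lower().strip().replace(" ", "-")

def test_slugify():
    expected = "automated-testing-for-edu"          # 1. expected behavior
    actual = slugify("Automated Testing for EDU")   # 2. measured behavior
    assert actual == expected                       # 3. the comparison
```

Everything else in a test suite (runners, fixtures, CI integration) is scaffolding around these three lines.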

That may sound simple but many teams get scared off automated testing by the time investment necessary to bring those three things together. That excuse is especially common at the beginning of a project when the list of features seems huge and you don’t know which areas are risks for breaking or regressing.

Uncertainty around what might break goes away after launch when a bug is filed. A bug coming in after launch means that either:

  • The thing never worked as expected because of confusion somewhere. Writing the automated test is an exercise in clarification.

  • Something broke and you should write an automated test as regression protection. Having your client point out that something broke is bad. Having them point out that something broke again is much much worse.

If you’re working on a team resistant to automated testing, I highly recommend the tests-for-bugs-first strategy.
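To make the strategy concrete, suppose the bug report was the duplicated page title mentioned earlier. The regression test pins down the expectation so the bug can never quietly return. This is a sketch: `render_page` is a hypothetical stand-in for however your site actually builds its markup.

```python
# Sketch of the tests-for-bugs-first strategy. render_page() is a
# hypothetical stand-in for the real template rendering; the point is
# that the test encodes the expectation the bug report revealed.

def render_page(title, body):
    return f"<h1>{title}</h1><p>{body}</p>"

def test_title_prints_once():
    html = render_page("About Us", "Hello")
    # The bug was the title appearing twice in the rendered output.
    assert html.count("<h1>About Us</h1>") == 1
```

Each bug fixed this way leaves behind one more test, so the suite grows exactly where the project has proven itself fragile.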

Closing with Conventional Wisdom

I closed the session at WPCampus with more conventional recommendations. Performance and accessibility audits really should be added as early as possible. Don’t count on the week before launch for a sprint of performance and accessibility fixes. Similarly, tools meant for behavior-driven development, like Behat, lose their differentiating value if they are shoehorned in only after features have been built. As an intermediate step before BDD, I recommend trying Demo Driven Development.

Whatever stage your project is in, there is almost certainly a way you can save time or make your site less likely to break by offloading some of your quality assurance to a computer. I hope that one of these methods helps you get started on your own journey!
