Jonathon Wright is joined by Paul Gerrard, a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach, and publisher. He has been working in the test automation arena for nearly 30 years. Listen to learn how to use machine learning and AI to improve your test automation.
Interview Highlights:
- Paul Gerrard has been working in the test automation arena for nearly 30 years. [0:40]
- The three modes of development which are popular are structured (waterfall), agile, and continuous delivery. [5:27]
- Waterfall is very well understood where it’s appropriate. Agile, nowadays, is very well understood as a philosophy rather than a process or a method. [6:01]
We’re never going to be in a perfect place because we don’t know what’s coming, but we can be better prepared than we are at the moment.
Paul Gerrard
- On some occasions it’s best to document things in triplicate. [11:14]
- The challenge we’ve had over the past 30 years is we’ve tended to think one approach fits all. That was waterfall in the day. [11:37]
- Even a hundred lines of code can be immensely complicated. If you’ve ever written code, once you get over sort of 30 or 40 lines, you have to think really carefully about changing it. It’s as simple as that. When you’ve got 30 or 40 million lines of code in a banking system, it’s overwhelming. [22:19]
We have to understand how we make use of our brains and then settle into some kind of logistical approach.
Paul Gerrard
- We need that ability to be versatile and to arm ourselves with the cognitive and digital skills that allow us to contribute to whatever kind of challenge we face next. [24:10]
- As testers, we have a simpler overall process to go through, but we have very high levels of complexity to deal with. [27:03]
We should be thinking about the human factor much more than the technology and even the business.
Paul Gerrard
Guest Bio:
Paul Gerrard is a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach and publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia, South Africa and occasionally won awards for them.
He was the founding chair of the British Computer Society Information Systems Examination Board (BCS ISEB) Software Testing certification board.
Educated at the University of Oxford and Imperial College London, he is a Principal of Gerrard Consulting Limited and is the host of the Technology Leadership Forum.
He wrote “Risk-Based E-Business Testing” with Neil Thompson in 2002, “The Tester’s Pocketbook” in 2009, and the “Business Story Pocketbook” with Susan Windsor in 2012. He has also written a book on the Python programming language, “Lean Python” (2013), and “Digital Assurance” (2016).
Paul won the EuroSTAR European Testing Excellence award in 2011 and The European Software Testing Awards (TESTA) award for Lifetime Achievement in 2013.
He was the Programme Chair for the EuroSTAR 2014 conference in Dublin.
If you can imagine something, whether it’s a tiny piece of code or a hundred-million-line operating system, we need our brains to be able to make good choices of tools from the set of tools that exist.
Paul Gerrard
Related Links:
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Check out Gerrard Consulting
- Connect with Paul on LinkedIn
- Follow Paul on Twitter
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Jonathon Wright In the digital reality, evolution over revolution prevails. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathon Wright's mission is to help you save the future from bad software.
Hey, welcome to theqalead.com today.
I'm joined by a good friend, Paul Gerrard, who's won numerous lifetime achievement awards in testing. So I'm going to introduce him. Tell us a little bit more, Paul.
Well, I think for nearly 30 years I've been working in the test automation arena, back in the good old days of green screens and character-based terminals up to today. In some ways the principles haven't changed, but the technology has moved on dramatically. I think our thinking about automation is still a little bit stuck in the past, and I'm just going to do my usual thing and try to provide a vision of how I see using machine learning and AI to its best advantage as part of our goal of improving what testers do, not to replace testers, but to support what we do.
My focus in my area is to say, well, look, if machine learning and AI are to help us in any way at all, it's going to be by supporting our thinking. So what I call the New Model for Testing is a model of the thought processes a tester goes through.
So I'm using that as the basis for saying, look, where do tools fit in terms of supporting our thinking against that model? I'm just trying to explain the use of machine learning and AI in a simpler way, so that your middle-of-the-road testers can say, ah, I see where we're headed.
So a plea to the vendors, who I want to listen to us today, to be honest: look, what we really need is not more and more efficient test execution automation. We need exploration, test design, test analysis, results analysis, defect analysis. We need stuff in those areas to help us make better decisions and potentially to take some of those decisions automatically.
So the talk was really all about setting the scene, or a vision if you like, of where I would hope tools will be going in the next few years. Whether that turns out to be successful or correct, I don't know.
I think that's a great introduction, and I loved how, when you kicked off today's conference, you broke things down into different methodologies.
I don't know if we're going to get onto this kind of question next, but I actually did a conference last night for an RPA event in Melbourne, and I was taking what you talked about, what David also mentioned around Agile turning 20, and what was also mentioned about "Agile is dead, long live agility", and I was also aware of the Cynefin framework coming into some of your thought processes. And last night I said, well, actually, it's not a one-size-fits-all.
In one way, Waterfall is still very applicable for many projects, and I relate Waterfall to the core IT landscape, the 60% of what businesses do to keep the lights on. They are doing engineering practices, what we would normally see as requirements engineering, all the great stuff that's been there for the last hundred years. And then we've got this 30% where, in the same way, we've moved into adaptive IT, where Gartner and everyone sees core IT and adaptive IT.
That's where you've got people practicing Agile, or, as in the conversation we had yesterday around manifestos, bringing some of the techniques from manufacturing, whether that be lean or the Toyota model, and applying that to what has now turned into this Agile approach. If we're looking at the Cynefin stuff, the obvious, the known knowns, sit in core IT, and that moves to the complicated, the known unknowns, for Agile. And then we've got this final one, which is where I think what you've just mentioned makes so much sense, and which is what we've called cognitive engineering.
That's fluid IT, which is where things are fail fast, learn fast, rapid experiments using machine learning and artificial intelligence, and I think that is maybe 10% of what we're seeing people doing with intelligent process automation and all these other great techniques. So during the conference, what I wanted to get to, and what you kicked off with: Dave yesterday had two big statements, and he said Agile is dead, and I know you addressed that this morning, and he was asking, well, what is the next evolution of Agile? In the same way you talked about DevOps being a separate discipline and continuous integration and deployment being a separate discipline, are we looking at that final frontier, the chaotic kind of landscape we're seeing with the virus and how we're having to change the way we do business completely, as a whole new type of methodology which may be born on
February 17, 2021, when we sign the Agile Manifesto again, or re-sign the digital manifesto?
Paul Gerrard It's not a question of either/or, I think. Let me step back from the firefight that might evolve from saying each of these is dead. There are three modes of development which are popular at the moment: Waterfall or structured, Agile, and continuous delivery, which I see as not Agile.
It's a very different discipline from Agile, and it's got more in common with Waterfall than it has with Agile, I would say. So there are three modes of working, well, actually, no, there's an infinite number, because nobody apart from some really focused organizations does pure Waterfall, pure Agile, or pure continuous delivery.
Everybody else does a hybrid, a mix of perhaps all three in different proportions, and so everyone does it differently. So what kind of an industry are we in? Everybody behaves and works differently. It's a shambles, and I don't think we're going to get to the bottom of that anytime soon, because it suits some organizations, and a lot of people, to keep it that way, because it's good business. So anyway, let's not go down that one. But what I think is happening is that Waterfall is very well understood where it's appropriate. Agile is nowadays very well understood as a philosophy rather than a process or a method, and it's been commoditized, so if you like there are off-the-shelf versions of Agile which we can pick up and use with no problem at all. But for the rest of us, we've got to deal with this issue of everybody working differently.
And so I think we should move away from what you might call linear processes, which is what Waterfall is. Agile exists to eliminate processes in that respect, but we got Scrum instead, which again is kind of iterative, but it's still linear in that you follow a cyclic process. And continuous delivery is often presented as this infinite loop, but if you unwrap that infinite loop, it's Waterfall. It's a staged process. So I think we need to stop thinking in terms of staged processes and do what I would call an event-driven process, where we don't know what's coming next.
We don't know whether there's going to be an anomaly in production, or a bug found in development, or a piece of infrastructure failing in our testing. We don't know what's coming, or a new requirement that's imposed because COVID just landed on our desk. So if we don't know what's coming, we've got to be super agile, and the only thing that is flexible enough to deal with that kind of stuff is our brain. We have to understand how we think and how we can think through a solution to a problem; the logistics come second.
So do we need to write lots of stuff down? Do we need to follow a very well-defined process? Do we get a crisis management team together, like Drew's example yesterday of his house flooding? I mean, we never know what's coming, so we need processes that can deal with the unexpected. We're never going to be in a perfect place because we don't know what's coming, but we can be better prepared than we are at the moment, and I don't think there's enough thinking along those lines. That's why I talk endlessly about thinking and testers, how they think; for developers, I think 80% of it is exactly the same. I think that's the direction we should go in: look at much more dynamic, event-driven processes rather than these linear, flowchart and swim-lane kind of diagrams that we have to follow. Does that make sense?
Jonathon Wright No, that's perfect, and I've skipped forwards a few questions; I'll move backwards and forwards as questions come through as well. But we've talked about a methodology that you came up with around "humble", right? The idea is exactly what you're talking about with this human aspect of it, and I think on day one Theo said that this is a really interesting area: how we evolve those cognitive skills and those "ologies", which I know Dave also brought up. Part of it is how we do that thinking, and we've talked about in the past things like solution thinking, systems thinking, and maybe exploring how things like CBT, cognitive behavioural therapy, play into some of those ways that humans interact with each other. Maybe that's the skills gap that we're potentially missing.
So, going back to that, do you want to talk through these steps in a little bit of detail, starting with the thinking?
Paul Gerrard Well, that's not my slide, so I'm going to pass on that, but what I would say, what I mean by that, is that there are modes of thinking.
There are ways of thinking, and of communicating, which comes a very close second to thinking, right? That will help us have a better understanding of our world, and we need to understand what I would call the definition of the software, the systems that we need to build. It could be a requirement in all its glory, or it could be a user story, or it could be something written in a formal language, or a piece of maths; or, who knows, the old system and its users can tell us what the new system should do in certain areas. So we need this kind of critical faculty to understand the sources of information, and by and large we usually have multiple sources of information. Because of that, there are people involved who have opinions and prejudices and biases and preferences, and we need to somehow analyse all that to come up with a shared vision for what the system should do, which is shared, clearly, with the developers too; they're going to build the damn thing. And with that shared vision comes a model, because I think you could substitute the word model for understanding either way.
It's the same: how we understand things is basically how we build mental models that simplify our view of the world, if you like. So what I'm trying to suggest with all this is that everything is logistics except our thinking, and our logistics are the choices we make to best deal with the situation at hand.
So on some occasions it's best to document things in triplicate, and in other situations, I'd say, it's good enough to have a conversation in a corridor, a nod and a wink, a human agreement to work a certain way for the next hour. We have to understand how we make use of our brains and then settle into some kind of logistical approach, and I think the challenge we've had over the past 30 years is that we've tended to look at one-approach-fits-all. That was Waterfall in the day; right now everyone wants to go Agile; and then we've got advocates of continuous delivery, DevOps and a lot of the good stuff coming up on the right. I think, well, wait a minute.
I mean, in five years' time we will have another way, and then another way. So I think we need to step back from having these one-size-fits-all processes and have this kind of event-driven, more dynamic resolution. It's a bit like crisis management, and maybe Drew's analogy yesterday, with his house
flooding, is a good one. Really, we need a crisis management approach. It's not a crisis, it's just the way we work, but we need to be ready for anything, and I think that's the only way we're going to make progress. I don't think in 20 years' time, if we're all alive, that we'll be much further ahead.
Right? We'll just have, oh, Agile 5.0 or whatever, and I don't think it will be much more effective at building systems. And I mean the picture you've put on the table, because we did this, it was a very informal thing, and you came up with this brilliant picture, "humble" and all that stuff.
It's the stuff we have to understand, and some of it is logistical and some of it is incredibly complex, and we just need a better way of understanding complexity. Cynefin, maybe, is one way in, and Dave's work in that area is fascinating. I think we need to understand how we think much better, and I think we need to understand how we communicate complexity through models. I'm sorry, I'm rambling a bit, but I guess this is what our podcast is all about. We need a better understanding of how to gather information, to explore in the broadest sense, not just software, but texts, conversations, all that kind of good stuff.
We need to understand how to make rational choices. We need to understand how to be critical of our sources, so we can see through fallacious thinking or information, and we need to model a lot better. If machine learning and AI can help us to build better models, that will be a great help and assistance, and if machine learning can give us visualizations of the complexity of our operational systems and tell us what's going on, then I think that's a fantastic input to our thinking, as requirements definers, as users trying to build a business requirement if you like, but also to our developers, who will understand where the volumes are, where the hotspots are, what goes wrong, why it goes wrong.
And the patterns of failure that seem to be there, because we're capturing a lot of data now, we're logging everything. We've got databases and tools to manipulate that data and so on. So there's a load of stuff that I think we need to get our hands around, and it's not a process thing.
It's: how can we leverage our brains to think through stuff that takes experienced people 20 years to put together? It takes 20 years to get that experience. Like Richard was talking today from the perspective of Lloyds Banking Group; I mean, there'll be people in Lloyds who've worked for Lloyds for 35 years on the same system.
Only those people have the understanding, and are trusted and credible enough, to say, you know what, we don't need to run those regression tests this week, we can take that shortcut. And that's where machine learning can help us tremendously.
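To make that last point concrete in a small way, here is a minimal sketch of the kind of tooling Paul is gesturing at: a model trained on historical regression runs that scores tonight's candidate tests by failure risk, so a team has some evidence for taking that shortcut. The CSV layout, the scikit-learn classifier and the 0.3 risk threshold are illustrative assumptions, not anything prescribed in the conversation.

```python
# Rough sketch: prioritise regression tests using historical change/failure data.
# Assumes a CSV of past test runs with columns: files_changed, lines_changed,
# module, test_name, failed (0/1). All names are illustrative, not a real dataset.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

history = pd.read_csv("test_run_history.csv")

# Encode the categorical module column and keep simple numeric change features.
features = pd.get_dummies(history[["files_changed", "lines_changed", "module"]],
                          columns=["module"])
labels = history["failed"]

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Hold-out accuracy: {model.score(X_test, y_test):.2f}")

# For tonight's change, score every candidate test and run only the riskiest ones.
tonight = pd.read_csv("tonights_candidates.csv")   # same feature columns as above
candidates = pd.get_dummies(tonight[["files_changed", "lines_changed", "module"]],
                            columns=["module"]).reindex(columns=features.columns,
                                                        fill_value=0)
tonight["failure_risk"] = model.predict_proba(candidates)[:, 1]
to_run = tonight[tonight["failure_risk"] > 0.3]     # threshold is a guess, tune it
print(to_run.sort_values("failure_risk", ascending=False)[["test_name", "failure_risk"]])
```

In practice the interesting work is in the features: which modules changed, who changed them, and how often similar changes have broken similar tests before.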
Jonathon Wright Yeah, I completely agree, and part of it, again, is this idea of domain- and context-specific knowledge, systems of knowledge. Like you've mentioned, you could spend your entire career in the derivatives landscape or in the healthcare landscape and only scratch the surface. And what I really liked, going back to what Dave said yesterday as well, another David I suppose, was that he was saying, well, the IT landscape is a shambles; the people who are working in it just don't have that formalized structure, and it's something that's maybe not taught at universities.
I guess that was what Theo was saying, with the example we were mentioning about killing creativity, Ken Robinson's TED talk about how we go for the wrong kind of skills, encouraging children to code and not to do performing arts and so forth. Part of it is what structure comes out of that. And a massive thanks to the professional sponsors today as well; I know we have the Institute of Testing board and we also had the BCS committee, and I'm lucky to be on the BCS committee, and we've been looking again at, well, how do we bring in
qualifications and specialisms and all these kinds of supporting mechanisms to really help professional development. Part of that, and I know you've been speaking to Debbie today, is that people will have different viewpoints on whether you go down the qualifications route or you go down
some kind of apprenticeship route, or a mixture of both. And I sent this over today, well, this morning, at one o'clock in the morning; I'd just finished my conference, and doing five conferences in 24 hours was maybe too much. My brain was just continuously spinning about the stuff we'd talked about, and maybe this kind of question of
how do we move the industry forwards? I sent this across this morning just as a bit of a placeholder, with the idea of an institution of business technology and leadership, and I know you're incredibly passionate about technology and leadership; you've even got the Technology Leadership Forum, which you host. But I think
part of where we've bridged the gap, and I guess what we've talked about over the last couple of days, is that maybe that final frontier is less focused on IT as an industry and actually more about closing the gap with the business, right? We're talking about product engineering, bringing product and engineering together, and, as was talked about today, bringing your customer into
the conversation. The business has been giving money out to fund IT projects, and again, Gartner this year talked about the fact that you shouldn't be funding projects, you should be funding products. But product then means you go hand in hand with the business.
Exactly what you said with Lloyds: you need the business as well as the technology. So when we're talking about skills, we're not talking about being just fantastic at automation; we're talking about what skills, or what plethora of different skills, we need to arm ourselves with
so we understand the business context, the technology context, and can also become leaders in this new landscape. So I think we've got a long way to go, and I think we're taking those baby steps towards maybe something, like you said, Agile X.0. But, you know, what's going to be needed is to stop looking at IT as manufacturing.
Not just applying manufacturing processes, but thinking about business and technology and how that changes pace; like you said, multimodal IT goes between core IT and fluid IT and backwards and forwards, sometimes talking about requirements, sometimes talking about stories, sometimes talking about capabilities.
I think we all need to come together, and we've talked about this kind of fusion landscape of everyone getting together: the developers, the testers, the project managers, the UX designers, the business specialists. Everybody needs to be in there, and I think there's a lot to be learned.
So again, coming back to this landscape: what do we need to arm this next generation with? I'm not going to call them testers; I'm not going to call them anything apart from engineers or specialists. What kind of skills do they need to be thinking about?
Paul Gerrard Well,
I think, and I don't know if I'm going to repeat myself a little bit, but if you like, the software business is a bit like civil engineering, let's say, although I don't think software is an engineering discipline, but whatever. We build garden sheds, we build houses, we build tower blocks.
We build 70-storey skyscrapers, we build bridges, whatever, and with software, as with civil engineering, we've got this huge range of stuff. The techniques and approaches and tools and technologies we use vary depending on the project. With software it's much worse than that, because we can build anything that has a function, any piece of functionality, using code.
We can imagine anything of almost any scale. And so we need to stop thinking that there's one solution to this problem of building software systems. Waterfall was never the solution to all our ills; Agile certainly isn't the solution to all our ills; continuous delivery offers some prospects, but again, it's a very focused tool.
We simply haven't got them; we need about another 20 tools. If we had those 20 tools, imagine you had everything from a spade, a trowel, a bucket, some water, some cement and sand, up to the kind of tools you need to move mountains, literally, to build dams. What we need is the ability to make good choices of our tools, and our tools are basically logistics. What I'm trying to pitch is this: if you can imagine something, whether it's a tiny piece of code or a hundred-million-line operating system, our brains need to be able to make good choices of tools from the set of tools that exist. And the problem is we still don't have tools that help us to model systems, to model complexity, to the degree that
they help designers and developers to build stuff and testers to test stuff. So I think essentially the tools we really need are tools that support our thinking and do some of this logistical stuff that can be done by software.
Like moving mountains of data from here to there, scanning all our bug reports and all our source code changes and all our change requests and looking for patterns of failure and success and so on. These are the things we need, and we don't have them yet. We've got some point solutions for managing work: if you like, a Waterfall structured process and Scrum are ways of managing work, Kanban is a way of managing work, and that's absolutely fine, but it helps us not one bit to deal with complexity, and complexity is the problem we're trying to solve.
Even a hundred lines of code can be immensely complicated. If you've ever written code, once you get over sort of 30 or 40 lines, you have to think really carefully about changing it. It's as simple as that. When you've got 30 or 40 million lines of code in a banking system, it's overwhelming, and we rely on people who have these kind of mystical capabilities, who can say, ah, you know what, this will be a safe change to do tonight.
This one might not be. And we don't know; we can't capture that knowledge in any way except by having people work on the same system for dozens of years, perhaps. So I think it's a young industry. We all like to think that we're super sophisticated, but, well, it depends when you think software started; let's say it started in the fifties, so we've been going for 70 years. Well, hold on, we're still learning about civil engineering, and we started building stuff in
4,500 BC at scale, and we're not there yet. So we need to be ambitious, but I think we've got to be realistic that we're still going to have these struggles, certainly beyond my lifetime. I mean, you'll live longer, well, I hope you do, and you'll see more progress, but I think we'll be having these conversations in
25 years' time easily, unless something remarkable happens.
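Paul's earlier point about scanning bug reports and source code changes for patterns of failure can also be sketched very simply: mine the version-control history for bug-fix commits and rank files by how often fixes touch them. The commit-message keyword heuristic, the repository path and the output format below are assumptions made for illustration.

```python
# Sketch: rank files by how often they appear in bug-fix commits (defect hotspots).
# The fix-detection heuristic (commit message keywords) is a deliberate simplification.
import subprocess
from collections import Counter

FIX_KEYWORDS = ("fix", "bug", "defect", "hotfix")   # assumed team convention

def fix_commit_files(repo_path="."):
    """Yield file paths touched by commits whose messages look like bug fixes."""
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--name-only", "--pretty=format:__COMMIT__%s"],
        capture_output=True, text=True, check=True).stdout
    for block in log.split("__COMMIT__")[1:]:
        lines = [line for line in block.splitlines() if line.strip()]
        if not lines:
            continue
        message, files = lines[0].lower(), lines[1:]
        if any(word in message for word in FIX_KEYWORDS):
            yield from files

hotspots = Counter(fix_commit_files())
for path, fixes in hotspots.most_common(10):
    print(f"{fixes:4d} fixes  {path}")
```

A hotspot list like this is only a prompt for the kind of experienced judgement Paul describes, not a substitute for it.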
Jonathon Wright I remember I always put the same quote into most of the books that I did, including the book that I did with Dorothy Graham, saying that the tools and technologies that worked yesterday may not work tomorrow.
And I think that is exactly what you're saying there: we've got to arm ourselves with the capability to pivot, learn new skills and take on new challenges. And again, Theo talked about those organizational structures; you go to one organization and it's going to work in a completely different way to another.
I think part of it is that we need to have that ability to be versatile, and to arm ourselves with these cognitive and digital skills that actually allow us to contribute to whatever kind of challenge we face next. And I think there are a lot of challenges coming along, and all the great speakers during this event have talked about various different methodologies and approaches and experiences.
And like you said, we're just scratching the surface, but at the same time, I think part of the idea of these events is to challenge those norms, to ask questions and say the approach we were using before can be improved; to go back in and look at a new way or a new opportunity to improve it. And I think your New Model for Testing helped push the industry into saying, yes, you've got to examine those things, you've got to survey things, you've got to really start approaching this in a way where you're not just told to do something the same way and expected to do it like that.
Part of it is to ask questions, and I know the audience has been fantastic about asking questions and challenging some of those basic principles. I think we are on the brink of a new era, and I know we're going to try and get back together on February 17th to work towards the re-signing of the digital manifesto, to start taking some of those lessons we've learned with DevOps and with the cognitive technologies that are coming through with
AI, machine learning and RPA, and start sharing some of those ideas about what it means today to be a technologist and also somebody who works for an organization that adds value to its customers. So I know we're nearly at the top of the hour and we've got some more giveaways to give away, but yeah.
Do you want to give us a summary, your final thoughts, about where the industry is going and what you think we need to be thinking about?
Paul Gerrard Well, okay. So the thing in my head right now, just off the top of my head, is this: when you ask a developer, anyone who writes code, how they get from an idea or a requirement to delivering a piece of software,
very few developers can explain that thought process, because they haven't thought it through. It's almost like, if you ask an artist how they paint a picture, they can tell you: well, I do the broad brush, I whitewash the page, I sketch out the structure of the painting, I work on the trees and the background first, and then I work on the detail.
They'll give you a method. Developers can't do that. We haven't thought it through; we haven't thought software development through to the degree that we can explain how we do this piece of magic. Now, what's interesting, from my time as a developer, is that I can't explain how I write a piece of code in a way that would let you understand how I thought my way through the problem.
I don't think I'm unusual in that at all. The thing is, I think where we're lucky as testers is that we have a kind of simpler overall process to go through, but we have very high levels of complexity to deal with, and I think we deal with that by modelling, test modelling let's call it. So in some respects testers are a bit further behind in the game than developers, and if we can solve this problem of how we test complex stuff, I think that could help
our developers to understand how they build the damn thing in a more systematic and reliable way, rather than relying on very mechanical tools like source control, which is important and keeps us out of trouble, and endless regression testing and test-driven development. I mean, we rely on elephant-gun kinds of approaches to deal with things that really should never happen, and the whole configuration management challenge that we always have. We have incredibly complex tools to deal with stuff that really ought to be very simple, like saying: no one touch this code, or this stone, because it might cause the whole cathedral to collapse into a pile of rubble.
We don't have these kinds of safety nets that we can take for granted. We're still adopting very technical, elephant-gun approaches to problems, some of which are about discipline and some of which are human problems, communication. But at the end of the day, Weinberg said very astutely that all problems are human problems, and we're still a technology industry trying to solve everything with technology, whether it's a process or a tool or a development approach or whatever. We still obsess about technical solutions to human problems, and that's where we're going wrong. I think Dave's stuff endlessly points that out: we should be thinking about the human factor much more than the technology and even the business.
I mean, I think he said as much. It touched upon business people having the analytical skills to define requirements; it's unreasonable for us to think they can do that.
Jonathon Wright Thanks, Paul. That was an absolutely great podcast. It was great to have you on the show.