In this inaugural episode, you will be introduced to the host of The QA Lead podcast as we embark on our exciting podcast journey. What is The QA Lead? What are we all about? What can you expect? And why should you join our community? Keep listening (or reading) to find out.
Related Links:
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Join the waitlist for The QA Lead membership forum
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Intro: In the digital reality, evolution over revolution prevails. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathon Wright's mission is to help you save the future from bad software.
Ben Aston: You're listening to the first-ever podcast of The QA Lead. Welcome to the show. The QA Lead is a community for anyone with a passion for quality, for making things work properly, and therefore for breaking them too. We're on a mission to build a QA community exploring the latest and best of automation and testing, including quality in agile, quality at scale, enterprise DevOps of course, and quality in digital, which means we'll be talking about web, mobile, database, web services, cross-browser, big data, and the always painful regression testing.
So whether you're already leading a QA team or just starting out as a QA tester, this is a place for you to connect with your QA tribe, a place to be inspired, to be equipped, and to connect with those who are shaping the future of quality assurance. And we want to invite you, dear listener, to join our QA conversation, to discover with us how to lead QA teams and build better software. So keep listening to The QA Lead podcast to find the tips, tricks, and tools you need to become a better QA, to lead teams better, and of course, banish the bugs. Now, wouldn't that be nice? And while you're listening to the show, please subscribe and join our mailing list on theqalead.com to stay up to date with all that's going on.
So let me introduce myself and what this podcast is all about. By way of introduction, I'm Ben Aston. I'm the founder of Black & White Zebra. We are an independent media company based in Vancouver, Canada, and we make serious, and sometimes boring, things understandable and a lot more fun. Our mission is to help people and organizations succeed. So that's you, that's your organization. And we do that by creating platforms like theqalead.com that help people get competent, get connected, and get confident in their role. Next up, I want to introduce you to the future host of our podcast, Jonathon Wright. Jonathon is a published author, a TED speaker, and a co-founder and CTO of Digital Assured. He's also the President of Vivit Worldwide, the Micro Focus community, among a whole load of other things, which I think we'll touch on a bit later.
Now, Jonathon has got stacks of experience building things and making sure they work properly, with nearly 20 years at companies including Hitachi, George Bank, Lehman Brothers (remember them?), Thomson Reuters, Siemens, and Xerox. So Jonathon's going to take on our first-ever residency on The QA Lead podcast. So a massive welcome, Jonathon.
Jonathon Wright: Hello, listeners. It's great to be here.
Ben Aston: Good stuff. So we're going to be talking today about the new community we're starting, called The QA Lead. We're going to talk to Jonathon about his QA story, and then do a bit of a deep dive to get some of his QA insights. But Jonathon, I want to talk to you about the QA Lead, the community, for a minute. What's your idea about what we're creating, and why do you think it matters?
Jonathon Wright: Well, I think this is going to be the first of its kind for a podcast series, one that actually focuses on quality. It's very easy for people to get confused between what, say, testing is and what quality is all about. So that's really why we're focusing on this. It's untouched, undiscovered, and it's the most important thing. We interact with quality every single day, and we all have an opinion about it. So there are going to be some great conversations.
Ben Aston: A dark and mysterious world. Let's hope we find the light. So I want to dive into some QA basics, maybe for people who are new to QA, new to quality. Let's talk about quality for a non-QA person. How would you describe QA, and how would you describe quality?
Jonathon Wright: Well, there's definitely no one description that covers everything. So I think it's probably worth starting with the idea of, what is quality in your everyday life? So, you go to a supermarket, you go and buy a product, and you'll notice sometimes you get this little badge to certify that the quality of that product is good. And sometimes this might be something like ISO, the International Organization for Standardization, who have literally said, "Here are some great practices or methodologies that you should really adhere to, to make sure that the product is up to a certain level of quality." And I think that's what's really interesting about this: quality completely changes depending on the person, the price point, and really, what your expectations of quality are.
Ben Aston: Right. And so, why is that interesting or important to you? Obviously you've got lots of technical experience implementing systems and building things. Why is the quality aspect, to you, even interesting?
Jonathon Wright: I actually think it's a lost art. It's literally something that's been put away for so many years while we've been talking about doing things faster and getting them to market quicker. And I think we've lost some of that engineering capability, where we really put time and care and love into what we're doing. And I think there are a lot of people out there who still agree that quality should be at the front of everything we do. I want to talk to those people. I want to interview and have conversations about what quality means to you, the listener.
Ben Aston: Yeah. Yeah. No. I think it's fascinating. I think there's been a trend towards more disposability and planned obsolescence, where we know the things that we buy, the things that we use, will break. Whereas, maybe a couple of generations ago, things wouldn't break. You'd still have furniture that's handed down through the generations that is quality, and it does last, and therefore it's much more valuable. So thinking about that within the agile mindset that people tend to be excited about, iterating on things, slowly making things better, increasing value over time, it's asking, "Okay, well where does quality fit into that?" Is quality the final iteration, or is it something that's embedded within those iterations as a product develops over time?
Jonathon Wright: Absolutely. And I think we talk about things like quality being everyone's responsibility, or quality being a culture, or even quality being at the center of everything that we do. And I think this is interesting because it shows that quality is no one person's responsibility. However, we're talking about QA Leads. We're talking about quality assurance people whose career is making things better. So where do they fit in? How do they impact the way that quality, as a culture within an organization, is actually delivered? And I think this is where I really wanted to start, with a bit of a journey through how I've seen quality change over time in my personal experience.
Ben Aston: Yeah. Well, tell us, how did you... What is your QA story? How did you end up here?
Jonathon Wright: So I think this is really interesting. This is where my passion was. Back in the '90s, I was studying at university, doing high-performance computing. I found myself in the lab one evening, and one of my lecturers came over to talk to me and said, "Would you be interested in being a tester over the summer holidays?" And I thought, "Oh, okay. So what does a tester do?" And he said, "Well, these guys, they make phones, they do stuff like that. I'm guessing you're going to be testing it." So my initial thought was that it was some kind of production line: once the phones had finished being made, I was going to sit there, take a few out of the box, turn them on, and make sure they worked. So that was my initial idea of what testing was. And that was the first step for me; apart from all the theory that you got in the academic world, this was my first ever hands-on "Well, is this quality good, or is this quality bad?" scenario.
Ben Aston: And was it that? Was it just turning on a phone?
Jonathon Wright: It wasn't, indeed. But I was really lucky. I started off in the '90s, and I had a summer internship, which then led to being sponsored through my final year and my dissertation, which was all around automation and helping people. And that summer internship taught me so much. So I turned up the first day and they said, "Here's a whole stack of test scripts," which was a big manual, as chunky as you've ever seen, a bit like a phonebook used to be. And they said, "There's 1,000 tests. You've got the next six weeks to test all of these on this phone exchange."
Now, it was picking up phones: picking up one phone, calling another, hanging up, transferring, putting it on the switchboard, trying all these different combinations to try and break the phones. Now, if anyone's used a desk phone, you expect that it just works, right? You pick it up, you dial a number, no problems. And these were the early days of voice over IP, where instead of making a phone call down some copper lines, the call was going over the internet. So yes, it did break a little bit more. But those six weeks taught me a whole stack of really hard lessons. I was the first one in, I was excited, I was enthusiastic, and I tested every single one of those, all 1,000, 2,000 scenarios. Do you want to know how many bugs I found?
Ben Aston: Spoilers.
Jonathon Wright: Zero. So I was the most upset I've ever been in my entire life. Right? And, tail between my legs, I went over to my boss and I said, "Look, I'm not joking, there's a tick on every single page. I have tested the one where you have 10 people on a call, one person hangs up and then reconnects. And I've tried every combination." And he said, "I know you have. We've been running these for the last 10, 20 years, and they've never failed, because we build products that will last. And it's a PABX, a switchboard. It's expected to work, right? We engineer products that will work and last forever." So exactly what you said stands true. They build stuff that's solid.
You go and pick up your desk phone in your office, and it works, right? It doesn't crash on you. It doesn't have a blue screen. People have got to the point where they understand that certain things are going to be incredibly reliable; they're bombproof. But for me, just starting out and spending those long days, to then be told that in another six weeks, and I think you mentioned it in the intro, I was going to have to do another regression when they patched the system again, that was the most demoralizing thing I'd ever done in my life. And thankfully, this became my first flavor of doing automation. And I say automation because this was actually a machine. So they pulled in a machine called an Abacus, all ones and zeros, and I programmed the machine to make phone calls, directly interfacing with the PABX, so that phone A calls phone B, hangs up, calls phone C. And there was no poor student who had to do it again.
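For a flavor of what that kind of call-flow automation looks like as code, here's a minimal sketch in Python. The `Exchange` and `Call` classes are invented stand-ins for whatever interface a real PABX or soft-switch exposes (the Abacus hardware Jonathon describes worked very differently), so treat this purely as an illustration of scripting a repetitive call scenario instead of running it by hand:

```python
# Illustrative only: `Exchange` and `Call` are hypothetical stand-ins for a
# real PABX interface, faked in-process here so the test actually runs.
from dataclasses import dataclass

@dataclass
class Call:
    caller: str
    callee: str
    connected: bool = True

class Exchange:
    """Minimal fake switchboard to script call scenarios against."""
    def __init__(self) -> None:
        self.active: list[Call] = []

    def dial(self, caller: str, callee: str) -> Call:
        call = Call(caller, callee)
        self.active.append(call)
        return call

    def hang_up(self, call: Call) -> None:
        call.connected = False
        self.active.remove(call)

def test_dial_and_redial() -> None:
    exchange = Exchange()
    first = exchange.dial("phone-A", "phone-B")   # A calls B
    assert first.connected, "call should connect"
    exchange.hang_up(first)                        # A hangs up
    second = exchange.dial("phone-A", "phone-C")   # A calls C
    exchange.hang_up(second)
    assert not exchange.active, "exchange should be idle at the end"

test_dial_and_redial()
print("scenario passed; now imagine 1,000 of these running overnight")
```

The point isn't the toy code; it's that once a scenario is expressed this way, re-running the whole regression pack after a patch costs minutes, not six weeks of a student's summer.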
So I can see why going faster and automating things, especially manual, repetitive tasks which don't really add much value, makes a lot of sense. But at the same time, it's essential that the automation can at least gauge the level of quality. And as far as Siemens went, back in those days, they were one of those brands... Like a German car manufacturer, you associate them with quality. You don't know why you associate quality with it, because unless you own one, or you've had experiences with one, you've just got this feeling that it's German, so it's going to be better engineered, and the quality is going to be higher. So, where does that come from? How do we, in our psyche, decide that a product's better engineered because it's German?
Ben Aston: Yeah. Yeah. I know, that was my opinion about German things until I bought a Mercedes, and it always breaks.
Jonathon Wright: Absolutely. So that's the whole thing. It's the perception, isn't it? It's the perception that the product you're going to own is going to be of high quality. And you're also expecting that when you take it in for a service, it's going to be expensive, right? Because quality is expensive. That is the perception you've got. You're not going to go in and say, "Wow, that was the cheapest servicing I've ever had on any car ever." There's a balance between fast, quality, and cost, right? And I think that's where the balance is. And you make a choice of whether or not to buy a premium brand over another brand because you assume there's a level of quality.
The same goes if you're buying beauty products, right? A product may have the same ingredients but not have the advertising; it might not have the logo or a famous celebrity endorsing it, but it's the same thing. So why do we still buy those really expensive products? It's because it's quality. It's the packaging. It's everything. It's what Apple does so well. When the concept behind Apple came to us, it was not just the experience of owning the phone, and the phone being a music device and everything else; it was opening the box, it was everything looking and feeling so well-built and well-designed. And I think this is where we've lost some of that art, because we're looking at getting things out cheaper and faster. And that's what I want to rekindle.
Ben Aston: Yeah. And I think what you're touching on is really interesting: quality being something that you feel, an emotional response, rather than quality being something that doesn't break. In your example about the phone system, the quality was that it didn't break. Whereas the quality you're talking about with Apple is... well, we all know that the battery life degrades, and they slow down the processor when they're doing upgrades and things like that. But we still associate it with quality because it's Apple, because it's expensive, we paid a lot for it, it was packaged nicely, and we're told that this is the best phone you can get. It's a conundrum.
Jonathon Wright: It is. And I think this is where I want to start helping people understand, "What are the steps you can take to understand some of these methodologies and toolchains, where you can look at how you create better quality?" And I think the easiest way to start, especially if you listen to The Digital Project Manager as well, is that we talk about things like Waterfall. For those of you who don't know what Waterfall is, it's a sequential design process. So this is the idea where you start with an idea. It could be Steve Jobs saying, "I want to make the best mobile phone ever." He doesn't care what processor it has or how much memory it's got. He's just got a vision, right? And that vision has a set of requirements that then need to be developed, built, tested, and then delivered to the end customer. Right?
And I worked for Apple in Silicon Valley, and I remember going to Stanford to do a guest lecture, and I went past the old Hewlett-Packard garage. Now, this was a great experience because Hewlett and Packard started off doing hardware. So, a bit like what we talked about with the telephone exchange, they used to do scientific calculators. Now, if you ever hear the terms Mk1, Mk2, Mk3, the idea was they would iterate. Hewlett-Packard's vision was build and test, build and test, build and test. Iteration one, two, three. Mk3. They'd release a new calculator every couple of months.
Now, we could call this agile because it's iterative, it's improving on the design. However, it's also engineering. And it may have been sequential design. It may have been, "Well, actually there's something missing from the last one. We'll feed that back and create a better product." It's a bit like the iPhone generations: iPhone 4, 5, 6, 7, 8, 9, SE now, I suppose. And the idea is that each iteration becomes a better product, better quality. The form factor: the phones all look very similar, they do the same kind of thing, and there's a perceived quality to the actual product. But going back to that idea they had at the start, it was really a powerful message about how they were going to change the world. And I think there was a certain quality they were looking for, and part of it was building something that would change their lives, but also would be part of everyone's everyday life. And I think this is where we've got this new challenge: "How do we deliver quality products to market? Not every two years, not every year, but actually extremely quickly?" And I think this is a challenge we're going to address during the series.
Ben Aston: Yeah. Sounds fun. So let's go back to you, in the phone factory, doing your testing. You've created some automated tests. Now, you were lucky to inherit all those testing scripts, which gave you some guidance to allow you, I guess, to build some automation into the testing. And that was a great starting point for you. But for someone who's maybe new to QA, who's thinking, "Hey, this sounds fun," making stuff work better, breaking things, finding out what's wrong with them, what type of certification, books, or training would you recommend to people new to the industry, or in the industry and just wanting to get better?
Jonathon Wright: Sure. So back in the '90s, I had what I called the Bible for automation. It was a book called Software Test Automation, written by a lady called Dorothy Graham. Now, Dorothy is well known in the industry. She's a good friend of mine, and she's just retired after doing her hundredth keynote on quality and testing. So she's definitely a thought leader in this space. And this book, which she'd first released, was talking about automation and what she'd been doing for the previous 20 years. She's known as the grandmother of automation, and she worked for Bell Labs, another communications company, back at the end of the '60s and into the '70s. And she was known as the first person to ever write any automation against systems. And reading that book, obviously, there was a wealth of knowledge that I applied to my role.
But one of the things I was lucky enough to do was reach out to Dorothy and say, "I've got questions. Who would you recommend?" The best people to talk to. She was based in the US at that time, and she came back with, "There are a couple of good people in the UK, and I know a couple of people in Germany. Why don't you reach out to these guys?" So I reached out to them and I found myself a mentor, who was actually in the [inaudible 00:22:15] in the States, and who also worked at Siemens. And I was lucky enough to learn from the experiences and lessons he'd gained in automation over the previous 10 years.
And so he'd come up with this idea of having graduates like myself, at the time, writing these test scripts during the day. And then, because of the time difference, he would fix what they called the framework in those days, the engine that could get things running, and then he'd execute them overnight. So we had this 24/7 automated testing across three different global locations. And that was amazing at the time. And it was interesting, because when he came over to do the training with the team, he said to me, "There's going to be one word that I'm going to say to you, and that one word is going to change your life." And yeah, I don't want to build it up too much, but the word was, "Win." Right? And the idea was, you have to want to make it a success.
And back then, it was very hard. There were no online resources. There were very few books. There was no online training. You either learned it from reading dusty old books, or from papers people had published on different sites. It was boldly going where nobody had gone before. And he said to me, "If you ever get to a point where you're stuck, instead of waiting, it doesn't matter if I'm in a meeting with the CEO, come in, grab me, and let's work on it together." And I think part of that mentality, the community side of things, was: how do we build a community in which we can rely on each other to solve these kinds of problems? Obviously things have moved on, and as they've moved on, we've got a whole wealth of good books and training around it. It's very easy. You can jump onto something like Udemy and learn a course on automation in a few hours.
There are books out there. I've done a couple of books myself. My first book was a cookbook. The idea behind it was that it was all reusable recipes of different automation code that I'd written over the years, and you could just apply that pattern or recipe to what you were trying to do. But that was all about the idea of, how do we share and reuse? And this is what we're trying to do with The QA Lead: how do we point people in the right direction so that they can, A, contribute, but also get something back out of it? I'm actually looking at the ISO website right now, because I was on the shadow committee for the new ISO standard for software engineering, which I'm not going to bore anybody with, but that was the 29110.
I'm actually just reviewing it at the moment, and they've just done part 11. So that's another addition, and it now covers testing AI systems. There's a section here on DevOps and how you test DevOps systems. Now, anyone can look at these kinds of standards, be part of them, and help contribute to them. But you can also reach out to that whole community of people who are happy to exchange their views and experiences around the quality engineering space.
Ben Aston: Yeah, definitely. And you mentioned earlier The Digital Project Manager, which is another one of the platforms that we've developed: thedigitalprojectmanager.com. And it's a similar story in terms of what you're talking about, the power of community. And I think that's why I get so excited about launching a new platform and building a new community, because there is incredible power in connecting people. Through that connection, people can get more competent, and then through that, they become more confident, and that's what helps people and organizations succeed. So I love this idea of bringing people together. And it's about writing a playbook together, rather than necessarily always going back to the textbook. It's about sharing with one another, and seeing this playbook that you can use, those recipes, evolve over time as we have new requirements to test things in different ways, through different technologies.
But I'm curious: how has your perspective on quality, and to that degree, testing, changed from your days testing phones to now, talking about enterprise DevOps? How has your perspective changed or evolved over time?
Jonathon Wright: A great question. So, I think everyone's got their own journey through QA and testing, and that's what I want to hear. So anyone listening to this podcast: get in touch, reach out, let's have a conversation. I'm the easiest person to find on LinkedIn because I'm literally linkedin.com/in/automation. You can send me a message, get involved, reach out, and share your experience. But we talked about starting off in the '90s, starting off with this Waterfall approach. It was all around quality engineering, and it was about building the best product we possibly could. And I think I mentioned Lehman Brothers in the intro there, right? Things had slightly moved on by then, when we saw the damage that doing things wrong can actually do. Every person who is listening has probably been impacted by that.
So I was at Lehman's back in 2008. I was there on the day of the crash, walking out of the building with my box. And my job there was automation. So I had a framework. I had this playbook you're talking about, with my recipes and my patterns. And the idea was, we were able to scale things. In this case I was boarding loans, surprise, surprise, from one system to another system. It could be called RPA now, robotic process automation. But as you're doing that, you need to make sure you don't miss a zero off, because if you miss a zero off, you could potentially cause a massive financial problem. Not that I was responsible for that day. But what I'm trying to say is, by that point the Agile movement had moved on.
So, the Agile Manifesto, back in 2001. I guess that's why we're talking about a quality manifesto here, for the people who are listening. Back in 2001, they created a few items which they believed would help guide people to the Agile movement. And things have moved on, but one of the things that hasn't changed is the manifesto they mapped out. Those guys who all met up skiing in 2001, they were the best guys in the industry, putting their thoughts into something that would help everybody else. We're trying to do exactly the same here. We're trying to say, "How do we bring some of that quality back? How do we, the future of quality assurance, help create a quality manifesto that's lightweight enough, but allows people to actually put some rigor around it, so you don't have another Lehman Brothers problem?"
But this is the thing. I must've been a glutton for punishment, because a few years later, I'd moved to New Zealand and then Australia, working with Microsoft, and then I came back and started working for another bank, a large bank called Deutsche Bank. And they were again trying to do this, but at great scale. So, kind of what you were talking about there with scaled Agile. They wanted quality as a culture. And it's a great German bank, so there were loads of great standards, loads of good practice out there. And part of it was, how do we do this, but at a huge scale?
So back then, there were 15,000 different products. It was a massive landscape, covering anything from derivatives platforms onwards, and therefore we were having to coordinate 7,000 teams across lots of different locations, and try to get them all to build better, higher-quality software. And of course, it's a bank. The last thing you want is for it to go down. Right? So how do you do that? How do you actually understand which products need more investment when you're looking at those large landscapes? And I think this was my first opportunity to understand what business assurance was. We've talked about quality assurance, but how do we assure that the business can operate so it doesn't become another Lehman's? Right?
And that means you need to understand the impact. I remember seeing this horrible spreadsheet which literally asked, "What would 20 minutes of downtime cost for this FinTech platform in this country? How many products, or how much revenue, would we lose?" You couldn't do sequential development, because you had to be able to release fixes quickly. So we were having to think about DevOps, and that's a huge challenge. And I think most of the listeners out there may have had experience in large organizations, or small organizations, your unicorns or your startups. They may have gone to medium-sized businesses which have a certain level of maturity around quality. And everywhere they go, there's a different viewpoint on quality. And I think this is why it's so exciting to be doing this, because this is somewhere that no one's really gone before. And we've got next-generation technologies coming through. I think you also mentioned the work that I'd done with Hitachi.
So this was the next generation of, "Well, how do you test things like smart cities? How do you deal with testing things like Car-to-X?" People are used to autonomous cars, right? But the idea behind an autonomous car is that it uses its internal cameras and internal systems to make decisions. So yes, it's using computer vision to recognize if a pedestrian's stepping out. But it's autonomous. It makes its decisions on its own. Whereas Car-to-X is a standard for how cars communicate so that they can increase the flow of traffic; a car can know if you're about to crash into it and move out of the way. It interacts with the infrastructure. So in Germany already, if you've got an Audi, you can actually go up to a traffic light and the car's cameras will look at the traffic light, wait for it to go from red to green, and then allow you to go. So you could keep your foot flat down if you really wanted to.
But that's not infrastructure-to-X. Smart cities will allow all the cars to talk to the traffic lights and make a decision about when things go green and it's safe to go. These are really complex systems. And how do you assure quality in something that could potentially affect people's lives? Everyone always talks about the most difficult question: "Well, what happens if there's an autonomous car, and on the left-hand side there's a motorcyclist not wearing a helmet, and on the right-hand side is an SUV with a child-on-board sticker? Which way does the car go?"
Now, if I'm testing that scenario, well, what do I test? And the answer is, the autonomous system at the moment isn't clever enough to make any decision about whether it goes right or left. It will make a decision based on the amount of space that it's got, where it thinks the least amount of damage can be caused. So it doesn't have the ability to make those decisions. We do, as humans. And we build these systems in our image. So how do we ensure that it makes the right decisions, but also that the quality is high, that we don't have accidents, we don't have fatalities? We move things forward for the greater good. And we'll work together to make that happen. And I think that is what's going to drive us, and it's going to be a whole new world of products in the future. We need to be involved in that process from day one, all the way through to when that product's live and driving itself.
Ben Aston: Yeah. I want to... We've touched on this a bit before, but I want to deep dive for a second into your perspectives on quality in Agile. Because I think the topics you're raising, quality in smart cities, quality in infrastructure, these kinds of critical systems, we have that on the one hand. And then, on the other hand, we have this excitement and appetite for Agile: doing things iteratively, and accepting the fact that they're not perfect but evolving over time. Some of these things oppose one another. You can't have a smart city that's iterating. Some level of iteration is acceptable, but if it's a critical system, a banking system, there just isn't that tolerance for mistakes, or for bugs that are going to cause millions of dollars to just disappear or be transferred to the wrong place. So tell me, what's your view on quality in Agile? You talked about having this quality manifesto, but what is the future of quality and Agile, in your opinion?
Jonathon Wright: So, yeah, I think this is something we're obviously going to explore in much more detail. And I think what you're saying about smart cities is a great example. The smart city project I worked on was for Copenhagen. The idea was that they wanted to become carbon neutral by 2020. They wanted to create a city-wide data exchange where people could put information coming from cameras, coming from [inaudible 00:37:36], coming from weather or traffic or whatever else it may be. So, could we have launched an application on day one, and then iterated on that product? The answer there is probably no. So there's a need for new patterns. The approaches and methodologies that may have worked for, say, web development, or creating a website, may not work in the future.
So one of the challenges we had at that point in time was, we had to create a smart app. And that app would recommend to people the most carbon-neutral way to get to work. Right? It was a really good idea because it was community-driven. The citizens of Copenhagen wanted to adopt a cleaner way of living. They wanted to reduce the traffic. They wanted to make a real difference. So we created a platform which, a bit like what Uber or Google Maps would do, would suggest, "Take the bike, take this bus, take this tram, take this underground. This is the best way to do it and reduce your footprint."
But we also wanted to bring in gamification, right? We wanted to encourage people to use it. So the city of Copenhagen came up with a great idea: businesses would all come together, a bit like the company that you run, and your team would compete against other companies, in Vancouver for instance, to get a better carbon footprint. And then you'd get tax relief, which would allow you to buy your employees bikes, and give them tram passes and all sorts of stuff like that. But how do we possibly do that, and launch the application from day one?
So I remember seeing the requirements, which said, "Day one, 38,000 users is what we're expecting." How can we make sure the system works? We had to be able to at least test, or prove, the quality of the product before it ever went into the wild. So how do you get historical information to enable the gamification? How do you get historical information from Google Maps, or wherever, for the different journeys through the application? What we did was reach out to Copenhagen University, and we said, "Guys, can you install this GPX logger," which tracks where you go around the city, "and we'll exchange it by giving you free beer." So the idea was, they would just turn on the tracker. Yeah, it used to eat their battery, but they would upload the information to us.
And so we had all these historical journeys, right? And we used those as the deltas to then generate billions of possibilities: people walking around, going on the tube, not having any signal for a bit and coming out the other end, walking at two miles an hour, walking at three miles an hour, going in different directions. All this historical information meant we could prove the quality of the product, and also that it would be able to scale, both in terms of the number of users it needed to support and the recommendations it was making, the gamification.
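To make that delta idea concrete, here's a rough sketch assuming the recorded journeys have been parsed into (latitude, longitude, seconds-from-start) points. The coordinates and jitter values below are invented for illustration, not taken from the actual Copenhagen project:

```python
import random

# One recorded journey as (latitude, longitude, seconds_from_start) points.
# Values are made up; real input would come from the students' GPX logs.
recorded = [(55.676, 12.568, 0), (55.678, 12.571, 120), (55.681, 12.575, 300)]

def synthesize(journey, n_variants=1000, pos_jitter=0.0005, speed_jitter=0.1):
    """Create plausible variants of one real journey by nudging each
    position slightly and stretching or compressing the travel time."""
    variants = []
    for _ in range(n_variants):
        speed = 1 + random.uniform(-speed_jitter, speed_jitter)
        variants.append([
            (lat + random.uniform(-pos_jitter, pos_jitter),
             lon + random.uniform(-pos_jitter, pos_jitter),
             int(t * speed))
            for lat, lon, t in journey
        ])
    return variants

synthetic = synthesize(recorded)
print(f"{len(synthetic)} synthetic journeys generated from 1 recorded track")
```

Scale that across thousands of recorded tracks and you get the "billions of possibilities" needed to exercise both the recommendations and the load profile before launch.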
And so, this required feedback from emulating the real world. And, to talk a little bit about my TED Talk for a second, this is where I think the Agile Manifesto is missing something. We talked originally about the Hewlett-Packard approach of build and test, build and test, build and test. In Agile, you start off with maybe some ceremonies, so you do your sprint planning. And then at the end, you might do some retrospectives. And you may be doing something like scaled Agile, so something like SAFe, or DAD, Disciplined Agile Delivery, or LeSS, a scrum of scrums. Whatever it is, you've got that feedback loop to say, "Well, these are the things that went really well. These are the things that didn't go well."
But those are sometimes team-based. They're sometimes PI-based, or a level above, at features or capabilities. And quality is a really difficult thing to deal with when there are different teams with different skills, different methodologies, all using different tooling. It's that cultural aspect of being able to get people to understand, "Well, what was the real quality?" Because there could be 50 teams all building something that comes together to represent, for instance, the smart city app. So the idea with the TED Talk was to say, "Okay, let's move this paradigm a little bit. Let's talk about thinking as one activity." So that's the stage before, which is, "What are we trying to do? What are we trying to build?"
In the old days, that might have been called requirements engineering. But now, that could be a story, an executable specification, a vision, or some kind of charter of what you want to do as an organization. Whatever it is, that's the thinking aspect, where you're bringing people together to make it happen, right? It's not one person, say, from marketing, saying, "We're going to launch a new type of pizza because it'll increase sales." It's literally something that's done collaboratively. And then again, that evolution-over-revolution approach: how do we then evolve the idea? What humans are great at doing is this whole build and test, build and test, iterative approach, which the Agile Manifesto really supports.
And then I think there's the third section of the paradigm, which is the learning aspect. This is where we start applying machine learning, artificial intelligence, the cognitive engineering that I talk about quite a lot: how do we learn from what's actually happening in the real world? Take the example of the mobile app. Once it's gone live, how do we understand how the actual users are using the application? How do we know when they're having problems? How do we improve it? How do we take what we know about the behavior of the users in the real world, and feed that back into the thinking process, so that we continuously improve the quality of that product going forward? And I think that's the whole point. It's not build, deploy, then maintain production and fix the bugs. And it's not just the ability of DevOps to do that release 25 times a day. It's about the feedback that comes from those 25 releases per day, to understand how that should change or pivot the design of the system.
And that's this next generation of challenges and opportunities. But it's also about quality, because the people using your product right now have a perception of its quality based on what's happening in the real world. And we need to get back closer to the real world, to understand, "Well, what are the actual users doing?" And I think this is the exciting shift: understanding the real world is where we're going to understand quality.
Ben Aston: Yeah. Yeah. It reminds me of conversion optimization, when we look to see what people do and what they interact with. I'm using marketing terms now, but we start doing A/B tests. We start using heat maps to see where people go and what they click on. And then we use that intelligence to optimize the experience. And thinking through that lens of quality: how does that impact the way we actually design things? How can we iterate in a way that makes more sense, using quality as a cornerstone, so that we can continue to develop things responsibly and effectively, rather than just being blind about it? Using data, I think, is really the future of how we can develop within this kind of Agile context. So I think that's an exciting thing to think about. But I'm curious, Jonathon, what are the kinds of projects you're working on right now, or some of the challenges that you're dealing with?
Jonathon Wright: Well, that was beautifully said. And actually, it leads nicely into what I'm doing at the moment; I just finished for the day. I've been helping one of the fastest-growing fashion retailers in the UK and Europe. And it's exactly what you said. All of the stuff you're talking about is exactly what we're looking at doing now. So let's just take the thinking aspect for a second, right? We've got a very passionate CEO, and she's got a vision for what this new mobile app is going to be. And they wanted to try something completely different. So first of all, it was a bit like a dating app. You could swipe if you loved it or liked it. And then we'd use that to make recommendations about the styles and the products that you want to buy.
But actually, we've come up with this killer app idea, and we need to validate it, right? So we've got choice A, which is, we go into some kind of development lifecycle and we build the product. But we've decided that actually there are better ways to do it, right? So exactly what you said: let's quickly get the vision up there and work out, what would this app look like? There are loads of tools out there. I'm going to give a shout-out to InVision, which is this free tool where you can upload screenshots and it turns them into an app. The actual end user feels like they're using the mobile app, but all of it is just images, right? This is wireframes. These are sketches. This is literally nothing more than a UX designer putting some concepts up, right?
And we use a product called Maze, which again is free, which allows the internal influencers... So we're in the fashion industry, and she's got friends in the fashion industry, so they can literally pass it out to them, and to them it's like a working mobile app. They can go through it, they can order stuff, they can buy something, return something, and there's been no code written. And at the end of each experience, they're given a mission: "Go and buy something," or, "Go and look for a product that you like." And at the end of it, we get that feedback, a bit like enterprise crowd testing. We're getting feedback like, "Yeah, I like that, but I didn't really like how it felt when I clicked the like button. I didn't like the fact that I had to fill in all this extra detail. I just want a one-click-to-pay solution." But we're getting quick feedback.
So this is the design phase. We haven't even passed this to a developer, and we're already getting feedback on the different journeys you can take through the application: what the end users are feeling, what the influencers say, what the internal fashion experts are saying. We can go to the crowd and actually get people in the right demographics, based on what your Google Analytics is telling you. You've got your gender split; you can even get down to whether they've got family, dogs, cats. You can get down to that level without ever releasing a product, or even writing a line of code.
So, to me, that's exciting. And we can actually automate that. We are actually using image-based testing capabilities and visual testing techniques to automate the flows. So we can drive it with different scenarios, with different combinations, and test it. So there's a level of quality to a product that hasn't had a single line of code written. Now, that is extremely exciting. And then on the opposite side of it, let's say we did launch this application into the wild. We did put it onto the App Store, and you downloaded it, right? On the right-hand side, we're doing something called digital experience analytics. This is looking at every journey that you take on your phone, and it's creating a DNA, a kind of footprint, of what your activity is: where you're going, how long you're spending on a particular screen, how long you're looking at a particular banner or an image or a product. What is your user behavior?
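As a taste of what image-based checking means in practice, here's a minimal visual-diff sketch using the Pillow imaging library: compare a new screenshot against an approved baseline and fail if too many pixels changed. Real visual-testing tools are far more sophisticated (layout-aware, tolerant of anti-aliasing), and the file names here are purely hypothetical:

```python
# Minimal visual regression check with Pillow (pip install Pillow).
from PIL import Image, ImageChops

def screens_match(baseline_path: str, candidate_path: str,
                  tolerance: float = 0.01) -> bool:
    """Return True if at most `tolerance` of the pixels differ."""
    baseline = Image.open(baseline_path).convert("RGB")
    candidate = Image.open(candidate_path).convert("RGB")
    if baseline.size != candidate.size:
        return False  # layout shifted entirely; flag for human review
    diff = ImageChops.difference(baseline, candidate)
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    return changed / (diff.width * diff.height) <= tolerance

# Example usage (file names are invented for illustration):
# print(screens_match("checkout_baseline.png", "checkout_latest.png"))
```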
But not only that, we're looking at every single person's user experience, or digital experience, across all of our million customers, right? So every day, we get new views of the behavior, like you said, of where people are going, where they're dropping off, where they're bouncing. And yes, none of this is new, but we're introducing things like... a term I've started to coin: "dark canary." The idea is it's dark launching, which is the ability to launch a feature hidden from most users, so that it appears only to a certain number of users. So it's the A/B approach, but with that canary rollout aspect where you can start with 10% of people, then ramp up. Or you can even target them. You can say, "Oh, well actually, these early adopters with these buying styles, we'll get them to validate this idea or concept."
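Mechanically, that kind of dark launch usually comes down to deterministic bucketing: hash each user into a stable bucket, then reveal the hidden feature only to the configured percentage (or a targeted segment). Here's a minimal sketch of the idea; the function name, feature key, and thresholds are illustrative, not any particular feature-flag product:

```python
import hashlib

def in_canary(user_id: str, feature: str, rollout_percent: float) -> bool:
    """Deterministically bucket a user into 0..99: the same user always
    gets the same answer, so the feature doesn't flicker between visits."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100
    return bucket < rollout_percent

# Start dark at 10% of users; ramping up is just raising the percentage.
for user in ["alice", "bob", "carol", "dave"]:
    print(user, in_canary(user, "one_click_checkout", 10))
```

Targeted rollouts ("early adopters with these buying styles") simply layer a segment check on top of, or instead of, the percentage bucket.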
So there's so much more we can do today. And quality comes in at every single point, not only in the initial designs of something, but in the actual operational state of the app, and how you can improve its quality by learning what those digital experiences are and looking at how you can remove the drop-offs. So these are all exciting things that I know we're going to talk about. We've got some really exciting, well-known industry experts lined up for you, to talk about this kind of stuff. And I think it's going to really inspire you, but it'll also change your view: "How do I question this? How do I actually put this into practice at work? How do I get access to Google Analytics so I can see what my users are doing? How do I ask the design team how I can get in earlier, to look at the wireframes and the mock-ups of the product before anyone starts building it?"
It will change your behaviors, which will help you drive quality across your organization, and help you to get involved and learn new techniques, new patterns that will hopefully change the world.
Ben Aston: That sounds good. And I think, for me, that's what is really exciting about this podcast. What we want to do is go beyond the theory, go beyond the things that people are already doing, and actually explore the bleeding edge of quality and what that looks like. Hopefully we'll lift the lid on new techniques, new approaches, new tools, different ways to use tools, and really build this community around developing this quality manifesto, this playbook for delivering quality over the next however many years, as we see the industry evolving, as we see new technologies emerging and requiring a different approach to quality and testing in this world of enterprise DevOps, of quality at scale, quality within an Agile context. I think it's going to be super exciting.
But just as we close, as we think about how this community is going to be different from your perspective, what's your view on this? What is going to be unique about the QA Lead?
Jonathon Wright: Sure. There are loads of testing communities out there. I could list 10, a few of which I'm involved with, right? And I think they're all very much focused on providing valuable resources for testing. This is where we take a step up. We're looking at quality, and this is something that has never been done before, because it is so difficult to define. It's difficult to find materials around it. And that's why I think we can really help. We can provide blueprints, these playbooks, some templates of things you can think about, resources, training that you can go off and do, stuff that you can learn to help you get better. Books, references. And that's why I'm going to play a clip in a second, which gets at this whole...
I think it's a lost art, quality assurance. And there's no need for you to listen to this podcast if you can completely define what quality means for you, for all of your customers, and for your organization. Then, yeah, you've probably nailed it. But for the rest of us, including myself, I think it's similar to the Matrix: nobody knows what it is. But we know that it's everywhere, and we need to be able to understand it better. I'm going to play this clip, and hopefully people will think, "Yeah, this is an unknown, and I want to learn more." And that's why you should subscribe.
The Matrix Audio Clip: Let me tell you why you're here. You're here because you know something. What you know, you can't explain, but you feel it. You felt it your entire life, that there's something wrong with the world. You don't know what it is, but it's there.
Ben Aston: Awesome. Thank you, Jonathon, so much, for joining us. It's been great having you with us.
Jonathon Wright: It's a pleasure. And I look forward to some of the great content we've got lined up for you.
Ben Aston: Yeah. So we have got some episodes that are going to be rolling out over the next few weeks, so come back and make sure that you subscribe. Tell us what you think about today's podcast, and come and join our gang. Comment on the post, and head to theqalead.com to register for our soon-to-be-launched forum. That way, you'll find all kinds of interesting conversations going on about all things quality and quality assurance. But until next time, thank you so much for listening.