Jonathon chats with Mike Lyles about service virtualization, crowd testing, and the future importance of accessibility testing. Listen to the episode here!
- Join the waitlist for The QA Lead online community membership forum
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Check out Bridgetree
- Check out mikelyles.me
- Connect with Mike on LinkedIn
- Subscribe to Mike on YouTube
- Follow Mike on Twitter
- Follow Mike on Facebook
- Follow Mike on Instagram
- Follow Mike on Medium
Other articles and podcasts:
- About The QA Lead podcast
- Help Test COVID Safe Paths MIT Project (with Todd DeCapua from Splunk)
- Automation Testing Pros & Cons (+ Why Manual Still Matters)
- 6 Hacks For Great Quality Engineering In Remote Dev Teams
- QA Tester Jobs Guide 2020 (Salaries, Careers, and Education)
- The QA’s Ultimate Guide To Database Testing
- Top QA Experts & Influencers To Get Inspired By In 2020
- 6 Key Steps to Creating A Quality Assurance Plan
- 12 Key Quality Assurance Skills & Competencies
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
In the digital reality, evolution over revolution prevails. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathon Wright's mission is to help you save the future from bad software.
Jonathon Wright This podcast is brought to you by Eggplant, who help businesses to test, monitor, and analyze their end-to-end customer experience and continuously improve their business outcomes.
Hi, Mike. It's great to have you on the show. And, you know, I just want to say you're absolutely right. Today I was in the drive-through queue for 20 minutes because of the COVID outbreak. So your book is ahead of its time.
Mike Lyles Yeah, I've been thinking about that theory. I've spent a lot of time in drive-throughs.
So that's definitely one place where I've learned a lot and seen a lot — people would, you know, walk past me, go in, get their food, and leave, when I'd thought the drive-through was going to be much faster. So I formed the theory, and I stopped going through the drive-through most of the time, and I realized I could get in and out much faster. And then I started applying it to how the world works for me. I mean, in everything you do in life, whether it's work or your personal life, there's usually another way, a better way, to do things — not just the conventional way. So I'm enjoying the theory. It actually wasn't the title of my book when I started, but it became the title after I started seeing that it resonated through most of my chapters.
Jonathon Wright So, like you say — and I've heard this a lot, especially when it comes to quality — there's this concept of going slower to get faster.
And I actually got to download the book and read it through, because I'm fascinated with that concept — it's kind of the motivational side of things as well, based on a lot of experiences that you've had. And, I guess, does that mean you could potentially have a follow-up book in the pipeline?
Mike Lyles Yeah, absolutely. The thing I love about the book is that I started putting together ideas in 2002. I said I wanted to write a book, but I had no idea what I was going to write about. I knew I wanted it to be self-help, motivational. And then I read a book by a guy named Jeffrey Fox called How to Become CEO. That book is something like 25 to 30 chapters, and they're all different stories, so you can just pick up his book, go to chapter 10, and read it — it doesn't have to be in any order. And that's really how I've laid out my book. My book is 30 chapters.
Each of them is a different thought process, a different story. You can pick it up and read each chapter on its own — you don't have to read it in order. And through the years, from 2002 through to publishing the book, I just kept adding new chapters — "wouldn't this be a great chapter," you know. I'd be sitting in traffic — and as you'll find out in the book, I learn a lot from sitting in my car — and I'd see the signs. A lot of signs in the US say "slower traffic keep right," and you've got people blocking you from speeding up and getting through the lane.
And that became another chapter — just several different thoughts that turned into really catchy chapter titles. So I put the titles together over the years; the hard part was writing the chapters to fit those titles afterward. But it was a lot of fun, and it's been a whole lot of fun getting it out there and promoting it.
Jonathon Wright So maybe an audiobook version? You could get someone like Stephen Fry to narrate each one of the stories, because you're really big on storytelling, you know — especially in some of your workshops on metrics and this idea of telling a story.
I think it's a lost art among the skills that QA engineers and testers come through with — storytelling. Unless you've got the experience of going through and having those stories to tell, it's really difficult to have the confidence to express or explain something without using numbers.
Mike Lyles Absolutely. And the metrics-tell-a-story idea really came to me when I was working with a very large Fortune 50 company, and they really focused on numbers: what's the percent done, what's the percent complete, pass-fail percentages — all kinds of metrics that tell me nothing about the product, nothing about whether we're in good shape or whether the product is doing what it should. Michael Bolton oftentimes talks about this — he says calling it a "metrics program" is like saying "words" instead of "books."
You know, it's a measurement program. And he got me thinking along those lines in about 2011. I really liked the theory, because what I understood right away is that if you can figure out what your stakeholders want to know, then you can tell the story to them in a way they can get and understand. Pass-fail percentages might tell me about progress — you know, 80 percent done with my testing tells me progress — but it doesn't tell me anything about quality, or whether we have high confidence
that we're going to go live. And I looked at the dashboard James Bach created years ago — people have used it, and I use it now in my company. The dashboard really doesn't have a number on it. It shows the features and it shows confidence: a big column in that report is, what is the tester's confidence in this product right now, and do we feel like this product is nearing completion and ready for production? And what I find is our company is really starting to talk to the tester about "do you feel good about this?" It's no longer "how far are we from done? What's the percent complete? How many tests did you run?" It's more "how do you feel about the product?" And that's really a paradigm shift for our organization — or any organization that does that.
Jonathon Wright Yeah, I think that's fascinating. I had a really interesting podcast just a couple of episodes back with Neil, and Neil was talking about time and quality.
And again, it was one of those things where I'd never really thought about the idea: when does it get to a point where the product is kind of acceptable, or comes within tolerance, or reaches a certain confidence level? And how does that amount of time compare to the original plan? It's so difficult to plan things in an estimation way; as humans we're just not very good at it. But, you know, I know you had 20 years at a Fortune 50 company. And I've actually been doing work with someone over in Europe at the moment, in a similar kind of space. And obviously they're shifting because, with COVID-19, we can't open our doors.
We can't sell our products. So how do we get our products out through the digital channels? And obviously I know you've got extensive experience in performance and service virtualization at the back end there as well. And there are lots of companies — the Best Buys of the world; I'm trying to think of the big company that went down during COVID — lots of businesses that have made the digital switch. And I guess, as they're pivoting quickly to be able to start selling through a different channel, there's potentially a confidence trade-off in getting something out quickly. You know, "we're not used to delivering paving slabs, but we're going to deliver them to you the next day."
Well, what does that actually mean now? And I know you do really great comparisons against companies like Apple, right? You've got this kind of trust in companies like Best Buy — you feel like you're getting the best deal, they're looking after you, you can go back with the product and they'll swap or exchange it and try to help you. And not just that — you can take it to someone like an Apple Genius, who's got the title and is going to take it into the back and follow the little manual telling him, you know, unplug this, replace the screen, send it off to HQ.
But I find it fascinating, because an organization like that, which has worked one way for 20 years, is now in this new paradigm where they have to switch to selling through online channels. If you were in that position now and someone was asking you, "OK, we're going to sell on this new channel — let's call it Instagram for a second," how confident would you be that they could get all their ducks in a row and actually get the product to market, instead of those long release cycles? Do you see organizations that really don't get the digital world, and aren't able to form that idea of what quality means if they release right now?
Mike Lyles Well, I think speed is definitely a push during the COVID situation that we're going through.
You know, a lot of people have had to close the doors of their brick-and-mortar stores, and so they've had to go online and do more digitally.
And I think what I find is that many companies weren't prepared for that. You see it now from a performance standpoint: suddenly my Website that used to get a million hits now gets 10 million hits, and the performance of that Website is really suffering, because we never expected that everybody in the United States — or the world — would go online and place their orders.
Fulfillment is a big deal — being able to look at areas that were a small part of your business but are now a large part of it. I think that's a big change. I work with a marketing services company, and when we do a marketing program, a lot of our programs involve mailing things to people, whether email or physical mail, saying, "take this to the store and you'll get this discount," or "use this coupon" — something to push them to go to the store. Now we've had to help those clients change and think about things more in a digital way. And it's kind of eye-opening, because it's probably something we should have been easing into before.
But COVID kind of said, OK, you're going to do it now. I think security is a big thing with companies, too. Companies had tight security, but they didn't have it super tight. You can run penetration tests and vulnerability tests and really find out your company's not that solid right now — and a lot of companies are doing that right now. But we'd been saying for decades, you know, that security is a big deal. Security was really starting to stand out at testing conferences and events, and there was really strong interest in it. And luckily for our company,
I had someone on our team who really wanted to look into a new area, you know, to build a different skill set, and I said, let's start working on security within the company. So we were prepared. But I think a lot of companies weren't prepared in that space — security and performance really hit them hard. And speed to market really calls for service virtualization, because if you're trying to get quick sprints out the door but you're waiting, and the whole product is not ready for integration testing or end-to-end testing, service virtualization is amazing.
I did a lot of work with that from 2014 through '16, where we would build those simulated APIs so that we could do the full end-to-end and find things before we did the final end-to-end. And I think a lot of companies don't realize how powerful it is to have that in place — to be able to plug it in, and then, once the whole thing's there, put in the right pieces at the end. You've already been testing every day, so there's less testing at the very end if you can do it along the way.
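[Editor's note] The simulated-API idea Mike describes can be sketched in a few lines. Here a hypothetical downstream pricing service is virtualized with Python's built-in HTTP server, so an integration path can be exercised before the real service exists — the endpoint, SKU, and payload are illustrative, not from the episode:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Canned responses standing in for a downstream service that isn't built yet.
VIRTUAL_RESPONSES = {
    "/price/sku-123": {"sku": "sku-123", "price": 19.99, "currency": "USD"},
}

class VirtualService(BaseHTTPRequestHandler):
    def do_GET(self):
        body = VIRTUAL_RESPONSES.get(self.path)
        self.send_response(200 if body else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        if body:
            self.wfile.write(json.dumps(body).encode())

    def log_message(self, *args):  # keep test output quiet
        pass

# Port 0 asks the OS for any free port; run the server on a background thread.
server = HTTPServer(("127.0.0.1", 0), VirtualService)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The system under test can now hit its "dependency" end-to-end.
url = f"http://127.0.0.1:{server.server_port}/price/sku-123"
quote = json.loads(urlopen(url).read())
server.shutdown()
```

Dedicated tools (WireMock and the like) add record-and-replay, latency simulation, and fault injection on top of this same canned-response idea.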
Jonathon Wright Absolutely. And I think there's this idea of contract testing, where, you know, you're delivering something which you can't test end to end, and it may all be through APIs — whether that be fulfillment systems or payment systems or the stock system, there are all these different APIs, aren't there?
You know, I felt like service virtualization was a slow starter. In the UK — I started in '93 as a developer — shims and stubs and fakes were all part of your unit tests, and you just got into the habit, right? As testers we were kind of slow with things like WireMock if you were getting into more of an automation level, or, if you were using something in the cloud like MockLab, you could quickly put something together. It's interesting — one of my friends, Alon, who was the founder of BlazeMeter, has just started a company called UP9.
And part of what it does is it looks at APIs and models out all the different types of flows. So if you've got a Swagger spec or some kind of definition document, you can actually generate all the permutations. It brings in this aspect of model-based testing, testing APIs, negative tests, and all sorts of other things. There's a lot of complexity around moving to this kind of API-first — let's call it microservices — world. I know they're talking about nanoservices now, which are even smaller. But you get into this place where it becomes a hugely complex, vast ecosystem with dependencies on data and data privacy. And, you know, as you said, security is really important. I just don't know how many organizations really know how well they stack up against that. There's this quote that says the only reason you've not been hacked is that you're not interesting enough to hack yet. But the idea is that if there's that kind of availability of data — if you're Yahoo or somebody like that, where the data could be really useful and have some value — you could potentially be a target.
And it really means you've got to be smarter about how you do it. Like you said, you're holding that sacred MailChimp list, and then someone saves it to a local file, puts it on a USB drive, it's not encrypted, someone makes some changes to it — you just don't know, do you? And at the same time, if someone can access that data, that could also be bad for your customers, right? Because then they could look at those emails and all sorts of other things, and it breaks your brand. So do you see the data lifecycle being an interesting aspect for testers — the ability to get unbiased test data, which is in the right state? You can synthesize it, you can test against service virtualization or an endpoint. And then also, how should people treat live data as test data?
Mike Lyles Really, I think data is — we've said it for years, but it's kind of similar to the security and performance thing: I think a lot of people have not really taken it seriously. In the US, we have the California Consumer Privacy Act, the CCPA.
I guess it's our first version of the GDPR, and it went into effect on January 1st of this year. I was part of the committee within our company, because we're all about data — that's the core of our function, with very large customer databases full of data. And really our focus has been on the fact that California was first, but they're not the last.
You know, other states are already talking about it; within the next couple of years, the whole country is going to be talking about data privacy. It's already on every Web site, everywhere you go. So when we start looking at how we test this — and we're testing functions that are really focused on data — what I've found in the past is that a lot of teams will say, "let's just get data in there," and their test data is not good. Then their test doesn't work — or it does work, unfortunately, when it shouldn't — because the data's not production data. Or we have situations where people are using production data.
Now you're starting to compromise on the data privacy issues. So I think there's a lot of work to be done around making sure that we find a way to mask that data, to clean that data, and to be able to build data up without having to go get production data — but so that it looks like production data. So it may be your name, my address, and someone else's phone number: it's realistic data, it's just not really production data. And I think a big thing a lot of companies are facing right now is: do I have the right data? Are the issues I'm finding due to my data not being correct, or is it because the test actually fell over and the functionality is not working?
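[Editor's note] The "your name, my address, someone else's phone number" approach is essentially field shuffling: each column stays realistic on its own, but no output row matches a real person. A minimal sketch — the records and field names are invented examples:

```python
import random

# Production-shaped records (values here are invented, not real people).
records = [
    {"name": "Ann Lee",  "address": "12 Oak St", "phone": "555-0101"},
    {"name": "Bob Rae",  "address": "9 Elm Ave", "phone": "555-0102"},
    {"name": "Cara Fox", "address": "3 Pine Rd", "phone": "555-0103"},
]

def shuffle_mask(rows, seed=42):
    """Shuffle each field independently so values stay realistic
    but no row corresponds to one real person."""
    rng = random.Random(seed)  # seeded so test data is reproducible
    columns = {key: [row[key] for row in rows] for key in rows[0]}
    for values in columns.values():
        rng.shuffle(values)
    # Recombine the shuffled columns back into row dictionaries.
    return [dict(zip(columns, combo)) for combo in zip(*columns.values())]

masked = shuffle_mask(records)
```

Worth noting: shuffling alone is not true anonymization — rare values (a one-off job title, a unique town) can still identify someone, which is why real masking pipelines also substitute or generalize sensitive fields.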
Jonathon Wright So, yeah, one of the things I've been talking to a lot of customers about is test data coverage — this idea that, as you would do with your automation scripts, you data-drive it.
You know, you've got different states — account open, account closed — all the states the business cares about. And the same with performance testing: you're churning through a huge amount of data, and you're potentially changing those states — closing accounts, emptying balances. You have to synthetically generate input and also have the right data to exhaustively work through all the different test permutations. So if you've got ETL running — all those extracts, transforms, and loads — there's so much complexity.
And obviously, bias is an interesting one as well — how you generate that data and what it looks like. It's such an interesting area. And I was fascinated that you did a workshop on visual testing — I see this as a big trend coming through. Joe talks about it a lot, and Angie's Test Automation University, which I've been going through, has some fantastic resources.
And I'll make sure I get that linked in the show notes. There's this move towards looking at the visual front end. When you look at the visual front end, you're looking at the layout and not as much at the data — you know, there could be a "drop tables" string in there and it could pass as a valid middle name, right? But part of it is: do you see visual testing helping with some of those hard checking activities?
Mike Lyles Yeah, I'm glad you brought up visual testing — it's one of my favorite workshops. I've done many workshops over the last eight years, and that one I've done, I think, in every country I've spoken in, across many countries and many conferences. The beauty of that workshop is we bring things up and I share them on the screen and say, "what do you see?" They tell me what they see. They're sure of what they see. And it's not what they see. I've got examples where half the letters are covered up.
And so your eyes see "Mike Lyles is jumping to conclusions," and they say that's what it says. Then I take the covering off, and the bottom half of the letters makes the words not "Mike Lyles is jumping to conclusions" at all. Your brain fills those things in, and we get caught up in that from a visual testing standpoint — we don't catch certain things. There are great tools out there — I won't name them — but there are some really good tools that catch it for you: the really small, one-pixel changes that you don't catch.
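[Editor's note] The "one-pixel change" detection those tools do boils down to diffing a baseline screenshot against the current one. A toy version over raw RGB grids — real visual-testing tools add perceptual thresholds, anti-aliasing tolerance, and region masking on top of this:

```python
def pixel_diff(baseline, current, tolerance=0):
    """Count pixels whose RGB channels differ by more than `tolerance`.

    `baseline` and `current` are equal-sized 2D grids of (r, g, b)
    tuples, e.g. decoded screenshots.
    """
    if len(baseline) != len(current) or len(baseline[0]) != len(current[0]):
        raise ValueError("screenshots must be the same size")
    changed = 0
    for row_a, row_b in zip(baseline, current):
        for px_a, px_b in zip(row_a, row_b):
            # A pixel counts as changed if any channel moves past tolerance.
            if any(abs(a - b) > tolerance for a, b in zip(px_a, px_b)):
                changed += 1
    return changed

# Two 2x2 "screenshots" differing in one pixel — the kind of change eyes miss.
before = [[(255, 255, 255), (255, 255, 255)],
          [(255, 255, 255), (0, 0, 0)]]
after  = [[(255, 255, 255), (255, 254, 255)],
          [(255, 255, 255), (0, 0, 0)]]
```

The `tolerance` parameter is what separates "flag every rendering wobble" from "flag only real layout changes" — set it to zero and anti-aliasing differences alone will fail every run.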
But I think to be a great tester, you have to be someone who catches things in detail and knows, just by looking at it, "this is a problem and I'm not going to skip it." One of the things we do in the visual testing course is I have them do things repetitively, to the point where they start forgetting: "well, I've already tested that, I've already looked at this, I'm not going to pay attention to that." And that's when you get caught — usually the time you miss the one big defect or the one problem is when you're so used to it. You know, it's like driving home from work or driving to work.
A lot of times you'll get to work and realize, "I don't remember that drive at all." I drove, and I was functional: I didn't wreck, I didn't pull out in front of anyone, no one hit me. But I didn't really think about my driving and I didn't pay attention — I was maybe listening to an audiobook or the radio, or talking on the phone. When I get there, I've done the job, but I've not really paid attention to a thing around me.
And I think that's one of the things we try to get across in that workshop. It's a big thing — testers don't always pay attention, especially with regression. They get caught up; they don't pay attention because it's like, "well, that's worked a thousand times, it's never going to fail." And then it fails the one time they don't pay attention.
Jonathon Wright Yeah, I completely agree. It's really interesting how you get into that kind of autopilot mode — everything kind of shuts down and you don't really realize you've just driven, you know, a thousand miles.
And I know you were a psychology major, so applying some of those cognitive thought processes and patterns is really useful. You know, I work with a large manufacturing company across different countries — Detroit, Australia, and others — all doing things slightly differently. And when they go into regression, this was something we kept on seeing: the tester would fire up their favorite browser, which was always Firefox,
with all these little plugins which he or she loved. And they'd go through their favorite path — they'd use the dummy credit card and the dummy account — and the path was actually quite limited. Then when you saw something weird on Safari, or something weird on Edge — now that the Edge browser's turned into Chromium, right — part of it is completely missed, and then it doesn't work, and you see that issue come out.
And, you know, I find this stuff fascinating. A lot of my focus, though, has been on the shift-right aspect — the information you can get. Like, I was talking about gamification with Joe, and we talked about SessionCam, which is something you can put into your Web site, and it'll give you a heat map of where people are going and clicking. I remember a test company in Australia, when I worked out there — we used eye-tracking headsets with the testers to see what they were looking at on the screen.
You get a beautiful heat map, and typically it'll be on the menu system and not on the icons, or on other areas which get less focus. But then overlay that with the code complexity of areas — like Amex has a four-digit validation code compared to Visa or other cards, or PayPal versus Apple Pay — all these different permutations. It just seems like you need some kind of assistant — input from a bot or something — to say to you, "you know, only 12 percent of users use Chromium, and that means X."
Or "maybe you should try it this time," or "a lot of people seem to be clustered towards this menu or this particular item or this text, which is red" — because red is that kind of danger color: you look at it and go, "must read that, it must be important." To me, there are so many different angles. I mean, accessibility — I think that's one of the things, like you were saying about your audience and age, these personas around different ages.
You know, they're used to a pamphlet, or they'll go in on a tablet with the font size set to the maximum and the DPI rate up to the maximum so they can see it. That then causes problems with icons not being accessible or buttons being too large. It feels like there are so many different applications for visual testing, and we're probably going to see some big developments in that space. But what else do you see changing in the QA landscape?
Mike Lyles Well, a lot of people I've talked to in the testing community say that skills in design and user interface and user experience used to be good to have. Now you've got to have them — you need them as part of your testing role. Cross-browser testing was good, and some people used it.
But one of the companies I worked for was a very large retailer with millions of people using their site and buying a lot of things online. When we started doing research with our marketing team, we found that there were fifty-three different browser types and versions out there. So we went out and got Sauce Labs, started using their product, and we would run one test, automated or manual, against 60 browsers at a time.
And that quickly cut your testing time — divided it by 60 — because you didn't have to run each browser one at a time, but you really got to test all of them. You know, there were a couple of browsers where some person in the middle of nowhere is using a very, very old version, and the chances of them hitting that one thing and having problems are probably slim. But if you've got half a million people using Safari, or half a million people using the latest version of Chrome, then you have to make sure you're hitting those. So we ordered our browser versions by usage and really started hammering those.
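[Editor's note] Ordering browser versions by usage, as Mike describes, is a simple cumulative-coverage calculation: sort browsers by traffic share and keep adding them until you cover the share you care about. A sketch — the usage numbers are invented, not from the episode:

```python
def coverage_set(usage, target=0.95):
    """Pick the smallest prefix of browsers (by descending traffic share)
    that covers at least `target` share of traffic."""
    chosen, covered = [], 0.0
    for browser, share in sorted(usage.items(), key=lambda kv: -kv[1]):
        chosen.append(browser)
        covered += share
        if covered >= target:
            break
    return chosen, covered

# Hypothetical traffic shares pulled from analytics.
usage = {"Chrome": 0.55, "Safari": 0.25, "Edge": 0.10,
         "Firefox": 0.07, "Opera": 0.03}
browsers, share = coverage_set(usage, target=0.95)
```

With these made-up numbers, four browsers cover 97% of traffic and the long tail can be demoted to occasional smoke tests — which is the trade-off Mike is describing.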
And with my current company, we do a lot of work with different clients that have large customer bases and a large online presence. So we've started to offer that as a service to our clients and say, look, we're not just going to test on the one browser you like. Even if your whole office is using Chrome, you have to understand that there are people out there using Firefox, Safari, and other new browsers that are coming into play. So I think that's a big thing — really, the user interface. And I think accessibility — to me, accessibility is going to be the next "data."
You know, data is the big thing right now — data, security, and performance — but accessibility is right on its heels, because you have more and more people with disabilities who need to be able to use the Internet. I mean, who would have thought years ago that my parents would be on their phones more than me — or as much as me? But now they are. Everybody's got a device, everybody's using devices. The Internet is not just for the middle-aged and the youth anymore; it's for everybody. And as you see that grow, and more people using the products,
we're going to have a bigger need for accessibility. So that's my challenge to companies when I talk to them. I'm really hoping to give a talk someday soon about accessibility, but I don't feel I'm in that space yet — to pair up with some of the people who are doing a lot of great talks on the subject; I'm nowhere near where they are at this point. But I believe everybody needs to be aware: if you're not already preparing for accessibility — preparing to handle disabilities and people who need to use your systems but don't use them the same way as you — then you're going to find yourself both losing customers and possibly in lawsuits down the road. So, I mean, it needs to work, and I think companies need to be aware of that.
Jonathon Wright Yeah, I really think you should do that talk when the conferences come back, virtually or whatever, next time around.
You taught me a valuable lesson today. I was Googling you — as you do — kind of following the interesting leads through some of the books and publications you've done, your WordPress. I flipped between your WordPress and your trip reviews and your food reviews — both of which were very good. But actually, when I queried your name, this banner came up and it said, you know, "author." I'd never seen it before — it's what they call a knowledge panel.
And you can claim that knowledge panel, like you would claim a location on Google Maps, right? It was interesting because obviously the data was in there, and some of it obviously was right. And if it wasn't right — like Michael Bolton the tester getting linked to the singer Michael Bolton — well, it didn't have you linked like that, so you came off okay on that. You know, part of it is you start building these relationships up, and I can then ask Alexa, "Alexa, who is Mike Lyles?" and it'll pull up that little banner of data and read it out, because that's how the virtual personal assistants work.
They've kind of taken what was Wikipedia and repackaged it into that format. But if that information is not correct, or not accessible, suddenly all these things start falling to pieces. So you're thinking of all those different browsers and configurations, but you're also looking at what type of device it is — is it an old tablet? It's hugely complex today. And while I was doing that — it's worth checking out Lighthouse, if you've not checked out Google Lighthouse, for accessibility.
There's an accessibility score, and some recommendations come straight off there. It does all the page speed stuff as well, but the accessibility side is pretty good. And I was doing that on a company that's a home improvement merchant. It's interesting because they'd just given this really good talk at a conference around their move to this new GraphQL knowledge graph they'd rolled out. And I'd spidered their website using some security tools — the worst kind of security tools — and Swagger.
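For anyone who hasn't tried it, Lighthouse can emit its audit as JSON, which makes the accessibility score easy to pull out programmatically. Here's a minimal sketch in Python; the trimmed `sample` report and its numbers are made up for illustration, but the `categories.accessibility.score` and `audits` layout follows Lighthouse's JSON report format.

```python
def summarize_accessibility(report: dict):
    """Pull the accessibility score and failing audits out of a
    parsed Lighthouse JSON report (e.g. lighthouse <url> --output=json)."""
    category = report["categories"]["accessibility"]
    score = round(category["score"] * 100)  # Lighthouse scores are 0..1
    failing = [
        audit_id
        for audit_id, audit in report["audits"].items()
        if audit.get("score") == 0
    ]
    return score, sorted(failing)

# A trimmed, hypothetical report for illustration:
sample = {
    "categories": {"accessibility": {"score": 0.82}},
    "audits": {
        "image-alt": {"score": 0},
        "color-contrast": {"score": 1},
        "link-name": {"score": 0},
    },
}

score, failing = summarize_accessibility(sample)
print(score, failing)  # → 82 ['image-alt', 'link-name']
```

Running this over each page of a crawl gives you the per-page score and the specific recommendations Jonathon mentions, rather than eyeballing the report in a browser.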
And I kept on finding these errors — "product not found." You'd click on a product, and the product page, which was supposed to appear, just didn't appear. Right. And I thought, well, they listed that they've got 87,000 products online, so I'll try all 87,000 products. And I sent a message to the guy who'd given that really good conference presentation. I said, out of those eighty-seven thousand, I could only find twenty-six thousand products, and I got 13,000 products that just came up with this error. You know, you think — we never used to have that capability.
I don't have to sit there and type ahead, you know, search after search, one product followed by another — we never had that capacity. That was kind of what automation was promising us, that kind of capability. But, you know, GraphQL is really just more of an API now. So if you fire at the API with the same product names as the search — using the query language like you would have done a SQL database query back in the day — those seemed to disappear as well. You'd get this failed response, right? So suddenly you're thinking to yourself, what on earth should they be testing? And have they tested with the right data?
Have they gone through and checked why, when I type in, you know, scissors, it doesn't find scissors? It seems like such a basic thing for people to get wrong. But, you know, I talk about this kind of confidence area — the confidence of, what if they just, like you said on a previous podcast, generate 10,000 test scripts that are all not very good and not very useful? They have to be kind of focused — on the visual testing, on the API testing. Do you see more types of levels coming through, like accessibility testing, where there's actually more breadth of people doing it, not just focusing on automated UI testing, for instance?
Mike Lyles Yeah, I mean, I think one of the big things I'm looking for is when I talk to the testers, when I interview them. I love the interview process, and sometimes I let it take a little longer, because I like to meet different types of testers with different styles of testing. I've worked in the big companies that followed the certifications and the process.
And you do step one, two, three, and four. And as a side note there — I've worked with those people, and I used to be one of them. And now I've moved so far to the other side that I'm almost against it. I heard someone say at a conference — I forget who it was now — that whether you agree with it or not, the certification process, the ISTQB and so on, they've got some great documentation around having standards and formats. It doesn't mean you have to do it that way. I compare it to getting a driver's license. My son turned 16 a year and a half ago — he's 17 now.
And when he turned 16, he got his driver's license. That did not make him a better driver. It didn't make him a driver who's been doing it for 30 years like me, who's going to know that you need to take that extra look to the right before you pull out, or that you need to watch this person because history has shown me this person is going to pull out in front of me — I can see their car rolling. He's not going to know those things. He's going to know the signs, the roads, what side of the road to be on, when to stop, what lights to look at. He understands the core things, which I think are needed.
But where I'm going with this topic is, I've learned from a lot of people in the testing community. It's really helped me to grow by listening to other people talk about what they do. Of all the things I learned in school, all the things I learned in my job, all the things I learned talking to just a couple of friends — I didn't learn anything like I've learned in this testing community, by meeting people at events and hearing their stories and hearing what they do to do things differently.
And I got hooked on this context-driven testing with James Bach and Michael Bolton. And I take their part, I take other people's ideas, and I kind of blend them into the way I do things. And I think a lot of people are so hung up, to your point, on — you know, we're expected to write the scripts and then run the scripts. The problem is, I think a lot of people got hooked on: OK, now I've done these 10 steps, they all passed, we're good to go.
And it's like, OK, I've done everything in my driver's manual, but I'm not good to go, because I didn't realize someone was going to come running down the street right in front of me, and I'd have to make that decision. So I ask testers when I interview them: if I give you a project and there's no requirements document, what do you do? And many of them who haven't worked with me will say, I have to have a requirements document or I can't do any work. And I'm like, really? Is that true? And they're like, absolutely, I have to have a requirements document. And so I show them a picture of a TV remote, and I say, tell me how you would test this remote. And they go: I would press the power button, I would make sure there are batteries, I would do the up/down volume buttons, the channel buttons, the play button on the DVR, the record button — all these different things.
And when they get done, I just calmly say: now, did I give you a requirements document for this controller? And they're like, no, no, no. But I had one person recently in an interview with me say, yes, but I know that controller — I know what a controller is, so it's easy for me to do it. And I said, that's exactly what I want testers in my organization to do. I want them to know the product. You can't have confidence in something if you don't know what that something does. If you hand me a controller, I can tell you pretty quickly whether I have confidence that that controller is going to work on my TV—
—because I know what a controller is supposed to do and how it should work. So I think so many testers get caught up in: I've got to have a system, I've got to have steps. And that's what I love about the exploratory side of testing. It's like the old Mad Libs books we had when we were younger — you read your story and you get to a point: what do you want to do? Take a left? OK, go to this page and read that story. And when you go there, it takes you to another page.
To me, that's what testing is. It's exploring, and it's information gathering. Another question — and I know I'm running over here — is, I ask people, what is the role of testing? Many people will say: to find defects, to find bugs. And I say, no, that's not the role of testing — any more than the reason for going to the gym is to sweat. It's something that happens. I go to the gym to work out and to be healthy; sweat just happens. Defects just happen. The role of testers is to explore and observe the application and, as Michael Bolton says many times, to find out: is there a problem here? Combined with: is the stakeholder getting what they asked for? So I think the big thing we're trying to accomplish with testing really should be around—
—have we done everything we can possibly do with this product? And if I don't feel confident, I need to start asking questions. I do a big exercise on "why." I talk about how kids, when they're young, will say, Dad, why is the grass green? And you say, well, it's because of this. And then they'll say, why? And then it's a cycle. And I tell the testers at the conferences where I speak: continue to ask why until they're almost like a parent who says, I've had enough — I'm telling you everything I know.
But I think if you don't feel confident enough to test it, then you need to continue asking questions to get that confidence, and then be able to test it and observe whether or not you really are confident in what the product is supposed to do. So always ask questions — continue to ask questions to the point where you feel like you may be asking a couple too many. I think it helps you as a tester to continue to grow that way.
Jonathon Wright No, I really like that — an interesting analogy. I've got some kittens at the moment, and, you know, the kittens are obviously learning and doing litter training at the moment.
But their mum doesn't use a litter tray — she just goes outside. So there's no reference point. They can't see where the food is and where the litter tray is; that's kind of where they are at the moment. There's no requirements doc for them, and they're not as good at Googling things as I am, or asking Alexa. But it's interesting what you say about the remote control thing. So, actually, on the fitness front, I got a Ring Fit for the Switch through the post today — you remember the Wii Fit, where you stood on the board? Same idea, but it's like a big round ring.
Right. But there are so many sensors in that — it's a bit like IoT. You can push it and pull it, and I can ask, like you would do with the remote control, how would you test it? But it's also proximity and gyros and all that kind of complex area. And if someone said, well, how would you go about testing that? — straight away, if we go back to your VHS days or your Betamax days, you knew that if you did some fast-forwarding and rewinding, fast-forwarding and rewinding, reliability might be an issue.
The tape might rip, right? So part of it is this kind of site reliability thinking — chaos engineering. One of the guys who's been on the show, Kolton Andrus, helped spearhead chaos engineering at Netflix and Amazon. There's this kind of idea of, well, how do I actually cause the errors in the machine? And he's now the founder of a company called Gremlin. And what they do is they go, let's purposely make it so the number 3 button sticks down. Normally it's not until you've spilled something on your remote control that you get that sticky 3 button, but you can emulate it, right?
It's a bit like service virtualization. You can purposely create a fault — have the service virtualization return a response that makes you have to go through and ask, what happens if...? I think that's a really interesting aspect, because I don't think we've ever had the tools for it. Yes, we've had the "if you put in this dummy account number, it will say there's no money in the account, or rejected, or some other error code." But with the interconnectivity of so many different systems, having the flexibility through a kind of open testing API that allows you to say, OK, bring this system down while I'm processing the data, let's see what happens—
Infrastructure is really interesting, because we're always thinking about it in this kind of happy-path way, where everything is working — not in the "OK, systemic failures are starting to happen" way. What happens when systems go down? My internet goes — what happens on your phone? Your phone stops working, and that bothers the website guys. And why do we even have a cache when we can't carry on looking at the page — it just says "whoops" or "Aw, Snap!" or something? It feels like the resilience of products is also really important, because that's the kind of Twitter social media brand damage where people are saying, I can't get to Best Buy because your website's down. It feels like reliability might be one of those things, along with this idea of chaos engineering. Do you feel that comes into what you were saying — that the ability to understand the mechanics of how websites are built, the architecture and everything else, is now important as well?
Mike Lyles Yeah, absolutely. And one of the things I liked in your point: sometimes it's responding fast and sometimes it's not. My first service virtualization tool was Parasoft's. And what I liked about their product is they had an environment management tool where you could see visually — it's kind of like a Visio diagram — how the systems interact, with the lines between them. And you could go to a system or an API or application, click it, and it would give you a drop-down and say, I want to use the live system, or I want to use a positive response always, or a negative response always. And I liked that, because you could quickly change it to make this thing reply with a negative result. There were two things I saw as part of virtualization back in 2014—
It was not as necessary then as it is now, but I think it's more needed now. And it was this: the ability to respond randomly. So, you know, the first couple of hits I reply within seconds; the next couple of hits I wait three minutes; and then the next hit I reply in a minute and a half. So it's not "I've got to handle this at the peak performance or the lowest performance" — I'm going to handle it all over the place. How does your system cope? And we saw systems that would not handle that. They just didn't like the schizophrenia of the responses. And then another thing that really came up — I think someone in performance said it to me while we were doing virtualization, and it never hit me until we started doing it this way—
Some apps — we always think of performance as taking it to the load, you know, to the peak, and then watching it break. That's our focus point: we've got a baseline, we run it up to the highest peak, and we say, OK, it broke at this level. But we never think of it the other way. What if your application is not prepared for a fast response? If I request something and I expect it to take one or two seconds to respond, but the service virtualization responds in a split second — a millisecond — and now you've got thousands or millions of transactions responding in milliseconds, that application wasn't ready for that.
It's like the app is saying, I'm not ready for that. And we saw a lot of situations where you were actually able to — what I called it was reverse-breaking the performance side of things, because it's like, you're too fast for me. I don't think many companies out there are saying that's a big problem right now, but I think it could be. If you're running hits against the application — you're using the mobile app or the web app — and you expect things to be right, and things fail because of that, then, psychology-wise — and we talked about psychology here — I did some research and saw that a lot of people will delete an app right away if it doesn't perform well.
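Both behaviours described here — erratic latency, and "reverse-breaking" a caller with suspiciously fast replies — amount to a virtual service with a configurable latency profile. A minimal sketch, with made-up numbers (this is not any particular vendor's API):

```python
import random

class VirtualService:
    """Toy stand-in for a virtualized back end: a canned payload plus a
    configurable latency profile, so callers can be exercised against
    erratic (or suspiciously fast) response times."""

    def __init__(self, payload, latency_choices, seed=None):
        self.payload = payload
        self.latency_choices = latency_choices  # candidate delays, seconds
        self.rng = random.Random(seed)          # seedable for repeatability

    def call(self):
        delay = self.rng.choice(self.latency_choices)
        # A real stub would time.sleep(delay) before answering; we just
        # report it so the example stays fast and testable.
        return {"delay_s": delay, "body": self.payload}

# Erratic profile: sub-second, then three minutes, then ninety seconds.
erratic = VirtualService({"status": "OK"}, [0.2, 180.0, 90.0], seed=42)
# "Reverse-breaking" profile: every reply lands in a millisecond.
too_fast = VirtualService({"status": "OK"}, [0.001], seed=42)

print(too_fast.call()["delay_s"])  # → 0.001
```

Pointing the system under test at `erratic` checks whether it tolerates wildly varying response times; pointing it at `too_fast` checks the opposite failure mode Mike describes.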
You know, there are a lot of apps I put on my phone and I'm like, OK, I'll give you one chance. Once you start flaking out on me, I'm not coming back — and you've got to do a lot to get me back if I do come back to you. So I think companies have to deal with that now, because performance, having the right responses, and accurate data are a big deal. Take the lottery — I play the lottery a lot and enjoy the scratch-offs. I'm not a big million-dollar winner, but every now and then I win a few hundred dollars, and I use this phone app — and the phone app will never stay connected. Now, I need that app, so I don't go away from it.
But it's one of the few where I'm like, I like to use the app, but that thing flakes out every single time. And I'm thinking, you're probably one of the richest organizations in the whole state, in the whole country, and you can't make your app high-performing enough to be dependable for me. But I have to have it, and I think that's why they don't care, you know? And then you get companies that do that, where they're like, well, they need me — you know, Facebook, Twitter, Instagram. People want that, so if we have a few issues, it's OK. So I think you see companies doing that too.
Jonathon Wright Yeah, I used to work for the Lottery Commission in New Zealand.
And I'm actually doing a book which I think you might have contributed to as well — a LeanPub book called Around the World in 80 Tests, or something like that; I'll send you the details. Michael Bolton and a few other people did it, and I said I'd do a chapter. And I remembered the chapter I wrote for Dorothy Graham's Experiences of Test Automation book — which I know you've got; there's that league table in there. When I did that book with Dot, the chapter talked about working out all the possible combinations using model-based testing.
And one of the big things we were doing — and this was about 10 years ago — was using production: we'd shift right by looking at the production logs and turning on the logging, which was Flex in those days, Flex data services. Every request-response pair was an XML message, so we could read the message and say, they played numbers eight, fourteen, twelve, whatever. Also, between each request-response pair, we'd see the user took 40 seconds before they submitted the next screen, or they played on the draw and waited for the balls to come up on the show.
You'd see all of that. But we'd also see, through the day, that in the morning people were buying tickets, and at one o'clock, after the draw, people were checking tickets. The behaviors were changing, and we tried to model that like-for-like in the product, which was really interesting. And they had this weird lottery system which used the air temperature as an input to work out a random number, so it could never be, you know, deterministic. But it was really interesting, because when we did the biggest jackpot — which was some 60 million New Zealand dollars or something — we did the same kind of realistic load profile, which was huge. It brought down the system, and we ended up having to flip down to a version which had fewer CGI calls.
It had less functionality, purely so that it could actually scale — those were just the challenges back then. But you're actually right: we used to find people in the South Island on a 56K modem connection, on some old IE 5.5 browser, which we'd have liked to just let go, because it didn't have Flash and so on — it was just causing too many problems. And obviously, you mentioned visual testing — different browsers will render the same text differently. You know, until you see how it renders, how good the rendering is is going to change what it looks like, right?
So compare one setup to another setup. And our problem with Flex was we used to have this thing called the bullseye, where this guy would jump off a roof and land on the ball, you know — which was great, but it ran really fast on a fast PC and really slowly on a slow PC. You'd have to wait for this guy to jump off the thing frame by frame, and people were just like, can we just play? But it was so interesting, because for performance issues we had a like-for-like production system, which we were really lucky to have. But this idea of testing things locally without that kind of service virtualization or network virtualization to give you packet loss, jitter, all that poor-connection behavior in the South Island, where they're going through 50 hops before they get to you—
—that level of complexity changes things. And one of the things I was fascinated by when I first heard Todd DeCapua talking about service virtualization — he came from Shunra — was that he said mobile traffic adds a 40 percent load on your back end. And I was like, that's just nonsense. What he was trying to say was that a six-hundred-millisecond round trip holds resources in production for that amount of time, because it's waiting either to time out or to get a response. Whereas you've got a 43-millisecond desktop round trip — which nobody has any more, with all those hops through badly configured routers so the house can get full connectivity, and everyone's at home at the moment, and then you've got a local ISP before you finally get to the internet.
By the time you've got through all that, you've added all sorts of extra hops. And then everyone goes back out again, and now 5G is coming. The idea with 5G is more devices, more bandwidth, more speed — and also a reduced millisecond round trip. So that means things are actually going to hit harder. Sixty percent of traffic is coming through mobile, and people go, well, we're ready for mobile. Well, we were ready for slow mobile on 3G and 4G. Now we've got 5G, and it's liable to take us back to desktop-like behavior, where requests hold resources for a lot less time. And then — I just finished doing a course on AWS, which I'd always been putting off because it's painful — there's the load balancing and the auto-scaling.
Yeah, it's going to spin up a container — it can spin up 20 containers, right, based on a threshold violation on, say, CPU. So that could potentially start impacting response times. But auto-scaling is going to take time: time for that container to spin up, to get configured, to be accessible, to be added to the round-robin or however it's been configured to load-balance. And people go, well, we've got cloud now. I know you've had Wilson and a couple of other people from the performance landscape on, and I've seen the debate of, well, we've got the cloud, we don't need performance testing any more. But it's quite the opposite.
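The spin-up lag being described shows up clearly in a toy model of reactive auto-scaling. Everything here — the threshold, the capacities, the tick-based timing — is illustrative, not any cloud provider's actual algorithm:

```python
def simulate_scaling(load_per_tick, capacity_per_container,
                     spinup_ticks, cpu_threshold=0.8):
    """Toy model of reactive auto-scaling: a new container is ordered
    whenever utilization crosses the threshold, but it only starts
    serving requests after `spinup_ticks` have elapsed."""
    containers, pending, dropped = 1, [], 0
    for load in load_per_tick:
        # Pending containers come online once their spin-up delay elapses.
        pending = [t - 1 for t in pending]
        containers += sum(1 for t in pending if t == 0)
        pending = [t for t in pending if t > 0]

        capacity = containers * capacity_per_container
        if load > capacity:
            dropped += load - capacity  # demand the fleet couldn't serve
        if load / capacity > cpu_threshold and not pending:
            pending.append(spinup_ticks)  # order one more container
    return containers, dropped

# A sudden spike: demand doubles while the new container is still booting.
containers, dropped = simulate_scaling(
    load_per_tick=[80, 200, 200, 200, 200],
    capacity_per_container=100,
    spinup_ticks=2,
)
print(containers, dropped)  # → 2 200
```

Even though scaling eventually catches up, the two ticks of boot time cost 200 units of unserved demand — which is exactly why "we're in the cloud" doesn't remove the need for load testing the scaling behavior itself.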
You kind of mentioned it with things like serverless architectures, where the idea is you're sending off a task. So yes, your response might be 40 milliseconds, but it's actually then going off to do some compute — which could be uploading a video, processing it — and that might take 30 minutes, which is holding resources. Now, yes, that just fires off and uses Amazon's compute power, and it's fine. But for those people who haven't got that kind of infrastructure, those kinds of capabilities mean that noise is going to be happening in the background, using processing power while things are processed in batches or however it's done. And I just don't think people look at it like that.
They look at the load, like you said — they kind of go, well, how hard can we hit the API server until it goes down? Well, that's just proving the framework of whatever you've used to implement it, or whoever provides you the API gateway. But we've never got that kind of realism of going, well, we've got this amount of traffic, processing all these orders going through the system at the moment, plus we're going to, like you said, abnormally or sporadically add something that nobody would expect — everyone's coming off the tube, or everyone's come out of a football ground after the NFL final, and decides to hit it at once.
Those edge cases feel like they're ripe for happening, and for getting the publicity of sites going down and systems failing. When they're already in the cloud, why should they fail? Disney+, for instance — it's scary. You'd think Disney doesn't lack the infrastructure, but they went down, and that's brand damage. Brand is a big thing in companies now. With certain technology — like you said, Facebook, Instagram — we're kind of, yeah, we get it; if we can't get to it, it's not the end of the world. We won't just rage-quit and delete our account or move to another provider, like people did with MySpace.
Mike Lyles I agree with you. I think people do hold onto things, and companies know it.
You know, you've got companies who have huge applications or websites or mobile apps, and they know people are not going to leave. If TikTok — which everybody, like my kids, uses — started having issues and ground to a halt, kids would still use it, because that's the avenue people use to communicate with the world right now.
And there's so much going on there. So I think TikTok knows they've kind of got that. I don't know how their performance is, but tools like that — Snapchat, for example — the tools being used by the next generation of people in this world are really having to scale and modify and move with the changes. But I think you raise a great point: it is about brand.
It depends on whether or not you've established yourself. If you're well-known — it's like being a struggling actor versus an established actor. If you're established, you might get another role; people are going to follow you. If you're struggling, you may only get one show and then you're done. So I think companies have to look at that and say, yeah, I'm established, but I also need to continue to keep my brand in place — because I see a lot of people complaining about products and services out there. And it does get to a point, no matter how big you are, where people will say, we're going to take our business elsewhere.
You know, I think Facebook got hit by Snapchat — Snapchat surprised them. And that's why they did what they did with Instagram, when Instagram kind of took off. So you see that: you either buy them or you lose to them in that situation. So yeah, I think the world's moving so fast. And that's the thing I tell my testers — I have a talk called "Testing Is Not a Nine-to-Five Job," which causes a lot of havoc online when people see the title, because they're like, I'm not going to work overtime. And I say it has nothing to do with working overtime.
It's: do you go to work, and that's the only time you think about your testing process? Or do you think about testing all the time? Because I'm testing when I drive a car — the dealer hates riding with me, because I test everything in the car. I'm testing on eBay, I'm testing my phone, I'm testing my refrigerator. And so I challenge folks to just continue to grow. Stephen Covey said one time, "Nothing fails like success." And what he meant by that was: you're successful today because you're able to meet today's needs, but when the world changes and continues to grow above you, you're down here now — that success has become a failure. So I think people have to be able to evolve and grow with that.
Jonathon Wright Yeah, I think you made some great points there, and it comes back to a kind of empathy — you know, the TikTok
example is a great one, right? The reason why TikTok exists is because of Gen Alpha. People keep referring to millennials — I missed being a millennial by about a month — but there's such a large gap there that it actually misses Gen Z. So we have Gen X, millennials, Gen Z, and now Gen Alpha, who are about 12 or 13 — that kind of age where, if you can't deliver a message to them in eight seconds, which is kind of TikTok's limit, their attention span disappears. Now, performance — you know the old myth around—
—well, after three and a half or four and a half seconds, people navigate away. I've been doing some site audits, which we run every year, and it doesn't matter what website I look at — whether it's CNN or Google or Facebook — it's 30, 40 seconds by the time it becomes document complete. Yes, you might get all your content within three seconds, but actually, by the time it's rendered in the browser and it's accessible and interactable—
—it's too slow. But they created that product for a generation with that tolerance — kind of accepting that once we're over a certain amount of time, well, we'll try it more than once; we'll reload the page more than once. We've got that tolerance. But the younger generations don't. They'd just be like, well, no — I'm not waiting that long. I think that's interesting, because our testing mentality is similar, isn't it? Think about putting yourself in a different persona — a teenage girl, which I'm obviously not, so I come at it from kind of a biased perspective.
You're thinking about all those different personas and what they're looking for from the experience compared to other ones. And we've also got architecture which gets out of date quite quickly. You go from service-oriented architecture, client-server, three-tier, two-tier, to now microservices. They build it and the product grows. So PayPal — born through the merger with X.com, Elon Musk's company — there's a certain date where the technology and the mainframes all kind of coexist, and then they stack stuff on top of it, trying to create disruptive products for new generations. But they have all that heritage.
And, you know, the legacy is your legacy, kind of thing — whereas actually, maybe that platform isn't suitable for them. I was advising a company just yesterday to actually continue doing paper versions of their magazine over e-books, because the average age of their readers is over 70. So yes, they may be using a phone, but they're used to print: if 50 percent of your traffic comes from people looking at the magazine and typing in the address, then that's still your most effective method of communicating with those people. We assume we should just make everything digital, because that's what everybody wants.
But if the people involved are over 70 — a bit like COVID, where there's a certain age distinction: contact tracing is less important for a 14-year-old than for somebody who's vulnerable, right? So I think that kind of generation gap means not everything should be digital. We have to put ourselves in that place: what is the market, who is our actual customer? It's so difficult, and I think a lot of organizations are stuck in that way of saying, we know who our customer is — we're a Home Depot or something, we know this is our age group, these are the people who buy from us. And then you show them what it actually is, and they go, oops.
I had no idea that we'd suddenly got all these young 20-year-olds buying our stuff. And now we actually need an AR platform where you hover over a corner of the room and it shows where the product would be, what would happen if I added some more stuff, whether I could build this more lavish kitchen. You know, they need these tools to interact with different generations. And I think that's a really interesting one, because, like you said, there are different skills that people have as testers. So telling them to test like a 14-year-old might be a really difficult challenge.
Mike Lyles Yes. Yeah, I agree. And the other thing I see is, I think it's really knowing your market and knowing who's out there and who your potential leads are as well. Because all it takes is that one YouTube video by some teenager saying, this app is awesome, and then your app is everywhere. By the way, I'm looking for that one teenager, that one YouTuber who's got millions of followers, to say, this book by Mike Lyles is amazing, because then I wouldn't have to work anymore. I've never met that person yet, but I'm trying to find them.
But I do think it's really about the viral side of this, you know, being able to figure out what is drawing people to certain apps or certain products, and how do you market to that? And if you do market to that, have you tested for what you're going to get when you do it? Be careful what you ask for, kind of thing. The last thing I'll tell you is I've signed up with a lot of crowdsourced testing companies, like uTest. I've been signed up with them for ages, and every now and then I'll get a request from them. The other big one was for a donut shop here in the US, a Dunkin' or Krispy Kreme kind of place.
And they wanted me to test the app, and I'm like, I'll be glad to do that, no problem. You had to order online, go to the store, get the donuts, and then see if the whole thing worked for you. And they were going to give you the donuts for free for doing it. That was my pay. Good pay. But now, during the COVID situation, I think a lot of companies are starting to realize, maybe they've laid a lot of people off and now they need to get staff back in place as things pick back up. And I see a ton of emails now that I didn't see in years before.
Over the past couple of years I'd get the occasional request, but now they're asking me to be part of crowd testing: what do you know about cloud services or environment services, what do you know about this service or that one? I've had one just this week asking me to volunteer to do testing for them. So I think you're about to see something new, and maybe I've called it out and we'll see how it goes, where companies are going to start using crowdsourced testing more to supplement the fact that they don't have the staff they had a year ago. So it's going to be an interesting age, because now you've got people where you sort of know how they test, but you don't really know this army of strangers. I've never met you, and you're going to test my application? What does that do? It'll be an interesting thing to watch over the next couple of months.
Jonathon Wright I completely agree. I had a EuroSTAR talk lined up before EuroSTAR became virtual and moved to Zoom.
For that talk I picked the topic of crowd testing in the wild. I'm a very big advocate of crowd testing, and I'd used crowd testing for a fashion app, because I'm not a 14-year-old teenage girl and I wanted the right demographic. Right? And part of it was a really interesting experience, to see what level of context you actually get: somebody who's got pets, somebody who enjoys golfing, getting that kind of context-driven aspect of the crowd testers testing it in a different way. I think it's actually fascinating.
I think you're actually right. You're going to see this move where the resources that were there, a center of enablement, a center of excellence, whatever it was, have disappeared. You're going to have to think, well, how do I make that capability available? Crowd testing. And I think crowd testing has been a massively missed opportunity so far, because it gives you that ability to flex up and down. And the part I'm really interested in is this kind of testing in production. I'm doing it at the moment with MIT, working with the guys at MIT on the Safe Paths contact tracing app. So at the moment we're doing crowd testing in Boston.
Before that, we'd been testing in Haiti, and Haiti was a bit more difficult. Boston was really easy, because I just rang a colleague and said, can I borrow your Google Takeout data? He said yeah, so that gave me several years of him walking around Boston, where he lives. So I had all this data which I could then use to start building my tests, feeding in the GPX locations. Oh, he's gone to another Starbucks, and at the end of it the phone is in the same locations, so I've got some really useful data. But obviously we're testing that at the moment on a live system, and people have always had issues with that. There probably isn't time to go through it all.
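The replay Jonathon describes, feeding recorded locations back into a system under test, could be sketched like this. It's a minimal illustration: the GPX fragment is hand-made stand-in data, and `send_location` is a hypothetical hook, not the actual Safe Paths interface.

```python
import xml.etree.ElementTree as ET

# Tiny hand-made GPX fragment standing in for a real location export
# (hypothetical data, not an actual Google Takeout or Safe Paths feed).
GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1">
  <trk><trkseg>
    <trkpt lat="42.3601" lon="-71.0589"><time>2020-05-01T09:00:00Z</time></trkpt>
    <trkpt lat="42.3611" lon="-71.0571"><time>2020-05-01T09:05:00Z</time></trkpt>
  </trkseg></trk>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def load_track(gpx_text):
    """Extract (lat, lon, timestamp) tuples from a GPX track."""
    root = ET.fromstring(gpx_text)
    points = []
    for pt in root.findall(".//gpx:trkpt", NS):
        when = pt.find("gpx:time", NS).text
        points.append((float(pt.get("lat")), float(pt.get("lon")), when))
    return points

def replay(points, send_location):
    """Feed each recorded point into the system under test, in order."""
    for lat, lon, when in points:
        send_location(lat, lon, when)

# Stand-in for the app's location-report endpoint.
received = []
replay(load_track(GPX), lambda lat, lon, when: received.append((lat, lon, when)))
print(len(received))  # 2
```

The same loop would work against years of real Takeout data; the value is that the replayed journey is deterministic, so a test run is repeatable.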
But part of it is, obviously, we flag to say that this is a test account, and the test is using the same infrastructure but actually testing in small amounts. And we're measuring that using what you might call experience analytics, Splunk and all these kinds of tools, to say, well, let's have a look at the crowd testers. What are they doing? What journeys are they completing? What do their sessions look like? They're obviously on different versions of the application, because we use TestFlight and the Google Play beta, so we can see whether they're using new features in live, and we're seeing how they interact.
It's really fascinating to watch, because we're all volunteers working with MIT, and we've got two thousand five hundred testers, and we coordinate those guys on everything from localization to basic journeys and exploratory testing. It's fascinating, because I think that's what's going to happen, and the donut example you gave is a great one. Think how cheap that can be: give them a donut. It's like the old days when Amazon started and they'd give you a pound voucher. It was brilliant. You introduced everybody you knew to Amazon, because you'd get a free book out of it. But that doesn't cost the organization anything.
And I always feel there's a missed opportunity with the social analytics data, things like what's coming from Twitter, YouTube, wherever it is, where people go, I don't like this app, or this happened. Why can't I trace that back and go, why did that error ever occur? Why have I got a timeout on this screen? You start pulling out those kinds of weird errors that you see in production. And I was doing exactly that, pulling them out using sentiment analysis with MIT libraries, just checking whether a post on our Twitter feed is a negative response, and then we drop it into Slack. And that's really interesting, because straight away you go: negative, negative, negative.
And obviously a lot of people say negative things. But straight away I'm going, OK, I don't know what that error is, I don't know how that error happens. And then we start trying to tie it back to the real product. All these people are using your product, but are you monitoring them? Are the issues they're finding being raised, or do your customer service staff just go, no, I'm really sorry about the experience, here's a free download? I think that's a missed opportunity as well. Tying that in to the opportunity with crowd testing, I think, is going to really help flex the resources as people come back.
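The pipeline Jonathon outlines, scoring social posts and dropping the negative ones into Slack, might look roughly like this. It's a sketch only: the word lists and threshold are illustrative stand-ins for a real sentiment library, and the Slack delivery is shown as payload construction rather than an actual webhook call.

```python
# Minimal lexicon-based sentiment triage. The word lists are illustrative
# stand-ins for the sentiment libraries mentioned above.
NEGATIVE = {"hate", "broken", "crash", "timeout", "worst", "error"}
POSITIVE = {"love", "great", "awesome", "fast"}

def score(text):
    """Crude polarity score: positive word hits minus negative word hits."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

def triage(tweets):
    """Build Slack-style payloads for tweets that look negative.
    In production these would be POSTed to a Slack incoming webhook."""
    return [
        {"text": f":rotating_light: negative feedback: {t}"}
        for t in tweets
        if score(t) < 0
    ]

tweets = [
    "I love this app, so fast",
    "App keeps showing a timeout error, worst update ever",
]
alerts = triage(tweets)
print(len(alerts))  # 1
```

Even a filter this crude surfaces the "timeout" and "error" posts worth tracing back to a production defect; a real deployment would swap in a proper model and post each payload to the team's webhook URL.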
Mike Lyles Absolutely. Yeah, I think you really hit the nail on the head. And I think we have to decide whether or not people can handle constant change. You remember back in the early Facebook days, everybody complained every time Facebook changed the color of the app. Or, oh, I hate this new status, I hate these new "I feel happy or sad" options, why don't you have something more? And everybody would complain for a couple of weeks any time a new change to the app came out, or a new change to the layout of the application.
I just don't hear that much anymore. People are busier worrying about other things now, and using Facebook to tell you what they're worried about, than complaining about Facebook. And it hit me: I think some companies' brands are so strong that eventually it becomes, you can do anything to me and I'm just going to accept it. I'm going to stop complaining that you change things every day or every week, and I'm just going to deal with it. You know, I noticed on the Facebook app there used to be a search, and my search is gone. I don't know where it is.
You have to do creative things to be able to search. If you want to find a friend, you can't search anymore. It's just not there. I don't know if it's just my app. I noticed it just this past week, and I'm like, how many people are going to complain about this? But I've not heard one person say anything. We're all just like, I'll figure out another way, you know?
Jonathon Wright And it's interesting, because, you know, it's the same with dark launching.
You know, the idea is, they may have launched a version, a canary build, for your particular phone, because they targeted you and your particular profile and offered you a new interface, or a new lack of search, hoping you'll go and look for it. They're A/B testing you, then dialing the functionality back in to iterate on it. It's really interesting, because the gamification group meeting we had yesterday, they were saying the same thing.
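The targeting behind a dark launch or canary build usually comes down to deterministic bucketing: hash the user and feature together so the same person always lands in the same variant. A minimal sketch, with a hypothetical "no_search_bar" feature flag standing in for whatever Facebook actually ships:

```python
import hashlib

def in_canary(user_id, feature, rollout_pct):
    """Deterministically bucket a user into a canary rollout.
    The same user + feature always hashes to the same bucket, so the
    experience stays stable between sessions."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100
    return bucket < rollout_pct

# Expose a hypothetical "no_search_bar" variant to ~10% of 1000 users.
exposed = [u for u in range(1000) if in_canary(u, "no_search_bar", 10)]
print(len(exposed))  # roughly 100
```

Because the bucketing is keyed on the feature name as well as the user, different experiments carve the population into independent slices, which is what lets a product A/B test you on search while someone else gets the widget experiment.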
They said, oh well, it's a widget, we're calling this a widget. And we're like, well, don't call it a widget, because people won't understand it. Use Facebook terminology, because it's so much easier, more accessible, and accepted by a whole stack of different age groups. Saying "a group" is much better than inventing a name; it's something that's readily understandable. But, you know, there's a lot of transience, like you said, going from Faceparty and MySpace to Facebook, through what was FarmVille and a whole stack of apps along the way, to what it looks like today. We've accepted that.
But where people are trying to build different experiences, and generic experiences for everybody, on their mobile app and their web page, I think there's going to come a point where you go, well, does it really matter? As long as it's in a familiar format, it's easy to find your way around, and it supports that age range. That's where people should be looking, not at what cool new React Native framework we can use or what cool functionality we can add. Maybe it's just about getting the functionality right: stable, reliable, and matching your brand expectation. I think this is all really interesting stuff that hopefully we'll see coming through in all the testing work. And, you know, I've actually got a family member called Moggs who's a YouTuber with about five or six million views, and he's basically built his house through YouTube. It's so strange to see influencers talk about a product and then watch that spike the downloads or the interest, even the advertising that gets signed with him, the sponsorships and partnerships. It's such an interesting world, where the influencer might not be the Gartner technology analyst we're used to. It could be just a teenager operating out of his mom's basement. And this is a completely different challenge, because it changes how people download the app. Like we've been trying to do with MIT: how do you build awareness to get people to install it? Is it better to have a generic thing you can share on Facebook and say, come and join, and encourage more people to join organically that way? I think there are lots of different mechanics now that we're not used to and we're not always exploring.
And like you said, testing is more than that now. If you're Facebook, marketing's a great example: the Facebook app loads its own in-app browser, which has reduced functionality compared to Safari and doesn't render certain pages. Right? So you might be using an app referral coming from the Facebook app. And especially now, a colleague was pinging me to say, yeah, but we've got dual screens on the new Android 11, so the app could go into the foreground, it could go to another app, or two apps could be contending for the same camera resource.
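To test that reduced-functionality path separately, you first have to recognize it. Facebook's in-app WebView identifies itself with "FBAN"/"FBAV" tokens in the User-Agent string; a sketch of the check (the token list is a simplified sample, not exhaustive):

```python
# Tokens that commonly mark in-app WebView browsers in the User-Agent.
# This list is illustrative, not complete.
IN_APP_TOKENS = ("FBAN", "FBAV", "Instagram", "Line/")

def is_in_app_browser(user_agent):
    """Heuristic: does this User-Agent look like an in-app WebView?"""
    return any(token in user_agent for token in IN_APP_TOKENS)

fb_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 13_5 like Mac OS X) "
         "AppleWebKit/605.1.15 [FBAN/FBIOS;FBAV/270.0.0.48.119]")
safari_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 13_5 like Mac OS X) "
             "AppleWebKit/605.1.15 Version/13.1 Mobile/15E148 Safari/604.1")

print(is_in_app_browser(fb_ua), is_in_app_browser(safari_ua))  # True False
```

A test suite can use the same heuristic to route Facebook-referral sessions into a separate bucket and verify the pages that the in-app browser is known to render badly.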
So you've got all these kinds of new challenges: new devices, 5G, and the behaviors changing in how people interact with systems. There's just so much to think about. And I love where you've pointed this discussion; we'll have to think of a nice crowd testing title for this episode. So, what's the best way for people to get in touch with you and find out more? Your website?
Mike Lyles Yeah. So if they go to mikelyles.me, it's a very simple page, an about-me page that serves as my home page. But mikelyles.me will actually show all my sites in the bottom corner there.
It's an easy way for them to go and say, okay, I'd like to connect with him on LinkedIn, or Instagram, or Pinterest, which I don't use anymore, or Facebook, or Twitter, or wherever else.
So all of my sites are there, and I try to keep that updated. So if someone says, you know, I'd rather connect with you on Twitter or some other platform, just jump on mikelyles.me. That will also give you a link to my blog site, my book site, and all my other places as well.
Jonathon Wright I'll make sure I get Moggs to do a shout-out for the book.
I'll just tell him it makes the McDonald's drive-through a post-COVID thing, and he'll be like, that's a great idea. The drive-through actually takes longer now, because with COVID people are queueing so they don't have to get out of their cars and put on a mask.
So there's that. But yeah, it's been absolutely fantastic talking to you. We'll have to get you back to talk a little bit more about crowd testing in the future.
Mike Lyles Definitely. Definitely. Thank you so much.