Jonathon chats with Adrian O’Leary of NIIT Technologies about quality engineering, the importance of using data, and how software testing has changed.
Related Links:
- Join the waitlist for The QA Lead online community membership forum
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Check out NIIT Technologies Limited
- Find Adrian on LinkedIn
- Send Adrian an email: Adrian.oleary@coforgetech.com
Other articles and podcasts:
- About The QA Lead podcast
- 5 Types of Performance Testing & Top Tools
- Top 5 Quality Assurance Certifications
- 6 Hacks For Great Quality Engineering In Remote Dev Teams
- Automation Testing Pros & Cons (+ Why Manual Still Matters)
- 11 QA Automation Tools You Should Be Using
- Choosing The Right Quality Approaches & Boosting Engineering Efficiency (with Bernardo Guerreiro from Auth0)
- Your Data Quality Sucks (with Huw Price from Curiosity Software)
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
In the digital reality, evolution over revolution prevailed. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathon Wright's mission is to help you save the future from bad software.
Jonathon Wright This podcast is brought to you by Eggplant. Eggplant helps you test, monitor, and analyze the end-to-end customer experience and continuously improve business outcomes. Welcome to the show. Today, I'm joined by Adrian, all the way from Arizona. He's a black belt in Six Sigma and Lean.
He's also the head of the QA practice, and this guy's got so much experience and so much to share. So welcome to the show.
Adrian O'Leary Absolutely delighted to be here, Jonathon.
Jonathon Wright So you've got to tell me about, kind of give the background, because you've managed teams from tens up to a thousand people. How did you start this journey?
Adrian O'Leary Actually, you know, it's an interesting story. A long time ago, I emigrated from Ireland to the US in the early 90s. I started out as a financial ERP consultant, and I used to implement Dun & Bradstreet's SmartStream financial applications, you know, JBA or PeopleSoft. Then one fine day, I had just finished a project, and Dun & Bradstreet had sent out their latest application maintenance pack, and it crashed and burned when it went out.
As I was a bit of an expert, they asked me to come in-house for a few months to test it. And there was a box sitting next to my desk with SQA Robot in it. This is interesting, I wonder what this does. I installed it, and all of a sudden I'm writing test code. And I said, I like this. So I stopped what I was doing as far as the ERP implementations and really dived into the quality control side of things back then.
Right. Learning how to test using methodologies. I started with a group called Interim Technology, and they had about 5,000 consultants worldwide back in the 90s, which was unusual: a dedicated quality control and quality assurance organization with a different consulting perspective.
So I learned the ropes there, graduated into different types of automated testing, graduated into performance testing, and then started to roll out of that into the quality assurance space, looking at Lean and Six Sigma, looking at ways to help companies optimize the way they do things, trying to get rid of non-value activities and waste in everything they do.
You know, I've run testing organizations for United Airlines. From there, I went back into consulting with Cognizant, and I ran their automation and performance Centers of Excellence in the Americas. Then I got seconded to the UK, where I initially led an effort with Cognizant on the integration of Lloyds and HBOS, the two banks coming together. That morphed into leading the testing practice for the UK and Ireland. From there I was lucky enough to come back to the US and join Comcast.
And Comcast is a huge cable company in the US. Not only do they do video, voice, data, security, and mobile, but they also bought NBCUniversal, so Universal movies, Universal theme parks, and now they actually own Sky as well. I was in charge of that group, and we had about nineteen hundred quality engineers in six different countries supporting the back office, the customer service agents, the field techs that were doing the installs, as well as all the video, voice, and data capabilities. I left there about two and a half months ago and came back into the consulting arena, leading the global quality engineering practice for NIIT Technologies. So it's been varied, you know, around the world a couple of times. I lived in India for two years with Comcast, my last two years there setting up an IT and engineering captive. So that's something interesting I can put on my resume. Dealing with the Indian government and the ins and outs of doing that had its ups and downs, but it was a joy. I loved the culture in India, loved the people. So it was a pleasure to spend a lot of time there.
Jonathon Wright Yeah, I must admit, whenever I go out to India, to any of the tech centers, I'm blown away by just how far they are, how advanced they've got. You know, I'm a big, big fan of Cognizant, and they'd be doing some groundbreaking tech when it comes to automation, a lot of AI stuff.
They've literally realized that if the tech's not out there, they'll go build it themselves, and they've got some super-smart people that go off and do it. I know you do the speaking side of things as well. Obviously, you've got the automation background: you presented at STARWEST in 2008 around the pitfalls of automation and the lessons learned from there.
And then obviously you've moved into building testing centers of excellence. And you've got this new challenge which you've just taken on. Do you see it now as less of a testing center of excellence and more of a testing center of enablement, as a kind of cultural awakening?
Adrian O'Leary Yeah, absolutely. Things have changed so much when you think of automation. I was actually talking to Jim Hazen, a very good friend of mine, recently, and we were talking about automation 20 years ago and the challenges associated with it. They're the same challenges you have today, maintenance, etc., but testing, or quality engineering, or quality assurance, whatever you want to call the teams, has changed.
Right. Everybody is on this DevOps and agile road, and they want the structure to suit. The scrum teams want the autonomy to be able to do what they need to do and collaborate, and it's a culture change. Absolutely. But at the same time, you still need a federated quality group. Right. One of the major challenges that I've seen over the last number of years relates to companies that are heading down the agile route.
Right. And in some cases, they end up being only partly agile because they're not quite there yet. They still have third parties that they're dealing with, so they can't be truly agile. Where I've seen them have challenges, and seen some of them fail miserably, is not having those guardrails in place. When you think about it, they think they have the autonomy, so they're creating their own frameworks.
They're using their own metrics. They're using their own tools. And what you want is fungible resources that can move from project to project. It becomes very hard for testers to move from one team to another on a different product when they have to relearn an entire way of doing things, different frameworks, different tools. So we've been talking to our customers about this, and I've seen it at other clients.
And what I saw at Comcast be very successful is having that federated testing center of excellence, where they have the tools, they have the frameworks, they have the measures to be able to understand risk, to be able to understand change. And not only that, to be able to look at those measures when a project is finished, to go back into a retrospective and say, how can we do better next time? So the advice we're giving is:
OK, you're heading this route. Please ensure that you have this federated group where you have a bucket for frameworks, a bucket for tools, a bucket for methodologies, a bucket for measures. And no matter what project you're working on, everybody's pulling from an approved bucket. So it's like a factory. When they do it that way, you can move people around from project to project, from technology to technology, and they can easily come up to speed because they already know the tools and techniques and methodologies that they're going to use.
Jonathon Wright And I think you're starting to see this with scaled agile, when people are going through those kinds of pain points of scaling agile. And I know you've got a really good article on quality at CIO Review, which was around moving from quality control to quality engineering. So do you believe that's about building up that enablement capability within an organization to make it work?
Adrian O'Leary Yeah, I do. I mean, that was an interesting move, because when I came into Comcast, the president was saying to me, listen, I keep hearing testing is a bottleneck, testing costs too much. And when you start to peel back the onion, you start to gather data and look at data to have a conversation with a third party around how you're affecting productivity and cost overruns and release dates.
That was the start of enterprise business intelligence for us: using data to have a conversation with third parties about incoming quality, but more importantly, about how we shift right as well. How do we use the data and the tools and the technologies and the knowledge that we have, and look 360 degrees around us at how we can help others?
And it really became changing the tester's mindset from being a tester and a quality engineer to being a user advocate. Right. I'm going to stand in the shoes of my customer. I'm going to sit in the chair of my development and business partners and see what I can do to help them do their job better.
And I'm going to reach out to them and say, listen, I have this data here that shows me that if you could do your job a little better, I can do my job better, and we can all achieve the goal of doing things better, faster, and with a lower total cost of ownership. So we called that shifting right: actually taking quality beyond the software development side, going out and seeing how our customers are using the technology, seeing how the customer care agents are using the technology that we're releasing.
And then we started to say, OK, how can we tie the benefits of the quality initiatives that we've got going on within the SDLC to customer experience? Right. And more importantly, to the employee experience, the employees that are serving customers. You want to make sure that they have the right technology and tools and data to be able to do their job, to be able to serve the customer. So that was really taking it beyond the software development lifecycle and saying, OK, we are now user advocates, we're not testers.
Jonathon Wright And I know you did a talk at Mobile World Congress as well, around total cost of ownership leading into this kind of innovation side of things. Do you think innovation is also really important when it comes to establishing capabilities?
Adrian O'Leary Absolutely. You know, again, it's about doing those retrospectives, right? Whether it's within the quality organization or outside of it. The way we looked at it is we had all these silos of data. And I use the analogy of going for a health checkup: I go to the heart doctor, I go to the kidney specialist, I go to the liver specialist, and I go to my GP, my overall doctor, just to say, am I OK?
But all these reports are coming out in a siloed manner, and they're not giving you that holistic view of how you're actually doing. The doctors aren't working together to give you a health plan. So we started to look at the total cost of ownership and how all these silos could work together to drive efficiencies. The whole thing came out of a labor review, actually, and I can't go into the details on that.
But why are we spending so much time here when we should be spending it over here? I came out of the meeting and thought, that's interesting. Why don't we look at it from ideation all the way through the retirement of a product, and see where there are opportunities for efficiency by tying all that data together? And it actually led to process innovation.
It led to identifying tools and automation opportunities from ideation all the way through to retirement, where we could gather data to help us ideate better and collaborate better across organizational and geographic boundaries. But also, we started looking at data from an innovation perspective, at how our customers are using the products. Right.
So when you gather all this data, and you have devices that have bidirectional communication between what the user is doing and what the network is doing, you're able to see other opportunities to do things better and faster for the customer, to make it easier for them at the end of the day. So absolutely, data leads to innovation if you look at it in a holistic manner.
Jonathon Wright And you can apply that black belt Lean lens to actually identify the waste, the duplication of effort, and the friction points across that entire lifecycle.
And I think that's really interesting, because a lot of companies are great when they're investing in a new product. But when a product comes to the end of its life, it gets moved into kind of support mode. There's a lot less innovation, and maybe they're not capturing that data in the same kind of way, because it's not the primary focus; the focus is to keep the actual products running.
Do you find this bimodal kind of split, the fast, agile, fluid IT versus the core, safe-and-secure IT, is a challenge that all organizations have, and that they need to look at it holistically, not just the product development side of things, but the actual operational capabilities as well?
Adrian O'Leary Absolutely. You know, I can use an example right now. I used this in one of the articles I wrote recently: from a quality perspective, I can have the best people with the best skill sets, the best tools, automation at every corner, but I'm still a product of what happens before me, whether it's old technology or new technology. Right. So it's garbage in, and testing becomes a bottleneck. Right. And I was talking to a client yesterday that has a big, fat, thick client application. They want to move away from it, but it sells like hotcakes, so why get rid of it? Right.
It's going to stay around for a long, long time. Even though they want to go to the new leading-edge technology, their customers still like the old stuff. So they're figuring out how they can work agile into that, and it's kind of hard because there are multiple third parties. But they asked me a question: how can we increase the speed of testing? We looked at it and I said, you've got a problem.
The problem is that you have so much incoming bad quality that you're having to have an army of testers make sure that when it goes out the door, it's actually clean. And then they say, my testing costs too much. Well, the problem is actually to the left of you. So we did this exercise where we started looking at where defects are getting injected versus where they're getting found,
and what's the opportunity cost avoidance if you did it right the first time. And what we identified, just for this one project that's about six months old, is that there are almost two million dollars of savings if they'd done it right the first time. Right. So let's say I'm in my UAT phase and I've found 100 defects there.
Twenty-five of those are related to requirements. Well, guess what? If you had done the requirements right the first time, you wouldn't have all this cost down in your UAT phase. And the same with design, and the same with code-related defects. So it's really about gathering that data again and using the Lean and Six Sigma statistical techniques to be able to identify where there are opportunities.
And we really started to talk about the cost of quality. There are two buckets: you've got the cost of good quality and the cost of poor quality. When you look at poor quality, there are two buckets again: internal failure costs and external failure costs. Internal failure costs are under your control; those are things you can go fix. For the external ones, how do I work with a third party to get them to do their things right the first time? But you have to use data again to be able to have those conversations.
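The injected-versus-found exercise Adrian describes boils down to a simple sum. A sketch in Python; the per-phase fix costs and defect counts below are hypothetical, with only the shape of the example ("100 defects found in UAT, 25 injected in requirements") taken from the conversation:

```python
# Rule of thumb behind cost-of-quality analysis: a defect gets more
# expensive the later in the lifecycle it is found. (Figures invented.)
COST_TO_FIX = {"requirements": 100, "design": 500, "code": 1_500, "uat": 10_000}

# Defects found in UAT, grouped by the phase where they were injected.
uat_defects_by_injection_phase = {"requirements": 25, "design": 40, "code": 35}

def cost_avoidance(defects, costs, found_in="uat"):
    """Savings if each defect had been caught in the phase that injected it."""
    actual = sum(n * costs[found_in] for n in defects.values())
    ideal = sum(n * costs[phase] for phase, n in defects.items())
    return actual - ideal

print(cost_avoidance(uat_defects_by_injection_phase, COST_TO_FIX))  # 925000
```

With these invented figures, catching each defect at its source would have avoided $925,000 of the $1,000,000 actually spent fixing them in UAT, which is the kind of "do it right the first time" conversation the data enables.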
Jonathon Wright I find that.
Adrian O'Leary Did that answer your question?
Jonathon Wright No, you did. I find it fascinating, because it kind of reverses things. You mentioned Lloyds, coming over and doing Lloyds and the integration with HBOS. It's really interesting, because one of the challenges that I've personally seen working with things like solution integrators is that we're all talking different types of metrics. Right. You may be talking to me about defects.
You might be talking to me about the test cases created. And there are different ways of getting that information. What I found is that the PMO, the project management office, is collating all this data, which is very reactive; maybe it's a week late before you get that information. And it's a challenge getting the data to give you that insight, that kind of steer on which projects need help, you know.
Did you find that with somewhere like Comcast, which is just a huge organization, there are challenges between the silos of acquisition companies, like NBCUniversal and their way of working, versus how you'd already established the capability within Comcast?
Adrian O'Leary Yeah, I mean, it's about taking people on a journey. Right. No matter what organization I've been into, there's a term called watermelon project management. Right. A watermelon is a big green fruit on the outside, but it's really red on the inside. And when you look at the software lifecycle, the PMs report green, green, green, green, green, and then when it gets to test, it's a big glaring red.
But again, they're not looking at all the activities. When you go back and look at it, it's been red, yellow, red, yellow, red all the way along. So yet again, it's about bringing people together and communicating across those organizational and geographical boundaries about the greater good. What are you trying to do? There's a term in IT, which is building quality in versus testing quality out.
Right. So, again, how can I get my peers in other organizations to really think about quality from the outset? And it's tough. But when you get to DevSecOps and it's really working well, and each of the team members has empathy, they understand the roles that everybody has to perform, they start to get into that mindset of how do I build quality in right the first time, to be able to deliver in short sprints, and to be able to do automation in short sprints.
And that's another thing: automation within DevSecOps and DevOps and agile. How do you not build up that technical debt? By doing in-sprint automation. But again, if you get it right, it's just awesome to see the collaboration that can happen between groups that understand each other's roles.
Jonathon Wright I think it's great, you know, because I've been to some organizations that are able to do that, and I've seen other organizations that have tried to mimic that success but never got there. And part of it is the culture, like you said, taking them on that journey. On the last podcast, I had a guy who I served with on the British Computer Society and the ISO committee. He's part of a team at Geckoboard, which does these kinds of executive dashboards. Right. Which I've always seen as the holy grail of where your organization wants to be: the CEO can pull out his phone,
and he's got a 3,000-application landscape, and he can see it's all good, or he can turn the dial and make a real difference, whether it be risk appetite, whether it be brand damage. And he, or she, has got all that information coming up through cascading KPIs. Did you get to a point where you'd looked at that all the way up, something that you can talk to the C-suite about, all the way down to the technical level, the software developers and testers?
Adrian O'Leary Oh, absolutely. I mean, at Comcast, my boss, his name was Tony Morton, probably one of the greatest technologists I've ever worked for. And he had dashboards that were just real-time as far as what was happening on the network. Right. What was happening in the projects that he was looking at. And those were rolled up and down throughout the organization.
So he could call a test lead into the room and ask them what was going on, and he already knew what was going on. So absolutely. I mean, many years ago I was doing a metrics assessment for an organization, and I came in and they showed me their wall of metrics, and it was one hundred metrics. And when I analyzed them, only four made any sense.
So the organization had four people spending 40 hours a week gathering all the data, and only four out of the 100 metrics could they actually act on to tell them where they're at and where they're going. So it's funny how people can get lost in those metrics. But again, the metrics have got to tell you where you're at today, where you were yesterday, and how you can action both of those for the future, and be able to react in real time as well.
Jonathon Wright So kind of proactive insight. What you're really wanting is to say, if we carry on here, this is what it's going to look like and this is what the impacts are going to be.
Adrian O'Leary Exactly. Exactly. You know, everybody talks about failing fast. Right. I want to be able to fail as fast as possible. When I deploy a feature into production, it's got to be configurable, where I can turn it on and turn it off as quickly as possible in case it's going to, you know, take down the entire network. And you're going to hear about it on social media pretty quickly as well.
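The configurable on/off switch Adrian describes is commonly implemented as a feature flag: ship the code dark, toggle it at runtime, and kill it without a redeploy. A minimal sketch; the flag and function names here are invented for illustration:

```python
# Minimal feature-flag sketch: runtime toggle between a known-good path
# and a new, riskier one.
flags = {"new_checkout_flow": False}   # hypothetical flag, shipped "off"

def legacy_checkout(cart):
    return f"legacy:{len(cart)}"       # the known-good path

def new_checkout(cart):
    return f"new:{len(cart)}"          # the new path under trial

def checkout(cart):
    if flags.get("new_checkout_flow"):
        return new_checkout(cart)
    return legacy_checkout(cart)

flags["new_checkout_flow"] = True      # roll the feature on...
assert checkout(["a", "b"]) == "new:2"
flags["new_checkout_flow"] = False     # ...and back off, no redeploy needed
assert checkout(["a", "b"]) == "legacy:2"
```

In production the flag store would be an external config service rather than an in-process dict, so operations can flip it the moment social media starts lighting up.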
Jonathon Wright Yeah, I think the Disney+ example in the US is a really good one. Building up to that launch, there must have been pressure from marketing and commitments on the launch day, and then what happened when the service went down? As you said, it's holistic; there are so many integral parts to it.
And we know Disney+ don't own all the infrastructure; as far as we know, it sits with cloud vendors, so it's probably not purely their technology, but there are so many challenges.
And everyone was watching, all excited about this new service. At the top of that, you've got Disney and the Walt Disney vision, the pillars of what the company is about. And part of it is you have to bring that vision down to a level where it can be delivered.
Did you find that with somewhere like Comcast, that vision changed as they brought on new services, as they brought on heritage, legacy kinds of platforms as well? What were the pillars, and how did you bring the culture in?
Adrian O'Leary Well, I mean, the organization was looking five years ahead, like everybody is. And in some cases, from a technology perspective, they're looking 10 years ahead. So they're working on things today that you may not see for 10 years. Right. You know, I wrote about a lovely example of where we are today, in the world of staying at home and communicating via Zoom or Teams or Skype. And this was, my God, this must be eight years ago.
Adrian O'Leary We were piloting Skype on the set-top box, and I had just installed it at home. I came upstairs to tell my eldest daughter that, you know, once your friends have Skype on the set-top box, you'll be able to do video conferencing with them. And then I looked at her laptop, and she's doing a four-way video conference with her school friends using something called ooVoo. Oh, my God.
I'd never even heard of this thing. So, you know, technology just moves so fast today, and the millennials move so fast with it. So you always have the Comcasts and the Apples and the Microsofts and the Facebooks and the Amazons, you know, they've always got something going on in the background to be able to move legacy stuff out, to bring in new technology. And they're way out ahead of us as far as what's coming next.
Adrian O'Leary And you just can't tell, you know, where they're going to go. If you look at the MSOs, the cable providers, they're trying to own the home. So if they're your Internet provider, they're trying to figure out ways to consume more of the revenue that you've got in the household, whether it's health care, whether it's energy, whether it's shopping. Right.
So using the technology they've got and the base they've got, they can say, OK, they can't do any of that stuff without me. They can't get on their Xbox or their PlayStation. They can't watch Netflix on their devices. They can't run that household without me being their Internet provider. So how do I utilize that to innovate, as well as to look for other areas where I can either partner with companies or do things myself?
Jonathon Wright And we had a guest on the show quite recently who talked about OpenSignal and that kind of independent benchmarking of things like your core infrastructure. Right. So your access to fiber and how much that can support, and obviously 5G coming out and things like that, you know, very critical to that. And like you said, things are moving so fast.
You know, technology is changing how they're adapting, and even the larger organizations, like you mentioned, maybe they've got this five-year technology plan, but they also have an R&D team. Right. And I don't know,
do you believe that might be something that's missing from this kind of DevSecOps: the research aspect of proving a hypothesis before you enter it, you know, so your ideation phase is really focused on a concept?
Adrian O'Leary Yeah. And I think a lot of them do it. I mean, when we started to bring agile techniques into Comcast, way back when, you would pick a project, you would train the staff up, you would identify the tools, and then learn. There would be touchpoints along the way where we'd kind of step back and say, OK, how can I do it better next time? So absolutely, there are proofs of concept happening.
Not long ago, when we used to deploy a new guide on a set-top box, we wouldn't deploy it all at once; we would roll it out into one town or one area, deploy there, and get the feedback. We also used crowd testing as an opportunity, where we would identify professional testers within the network, and we would deploy new technology to them to see how they would interact with it and give us the feedback.
So there are all these proofs of concept always happening from a testing perspective, getting that feedback. We would go into the stores and look to see how customers are interacting. We would ride out on a truck and watch how a technician installs the products, and see what we could do to help them do that better. We would sit in call centers and listen to calls coming in from customers, as far as what are they calling in about. Right.
It may be small things. I mean, I was sitting in on one call, and a little lady called in and said, you're charging me for a month for my telephone and video bill, but I've been on vacation for three weeks; how can you charge me for the full month? These are the types of calls that are coming in.
You say, OK, how can I use AI and ML to answer questions so they don't have to call in, so they can solve problems themselves? So there's a lot of looking at data and saying, OK, how can I use robotics and RPA to help the customer, and analyze that data?
Jonathon Wright I think that's a really interesting point. I know back in 2015 you did an IoT panel discussion, so you were quite far ahead in that, and you talked about Comcast with security, home security in the home, which obviously is a big thing now, and things like smart contracts. Right. So in the case of the lady who rang up: it detects that she's out, detects that there's no broadband usage, and then puts her on a different tariff automatically. She comes back, it changes.
You know, there's going to be a lot more of that, and it's going to increase the complexity. And it sounds like you've looked at the pre- and post-experience aspect as well, because we're very good at focusing on the user experience, but not how we get there in the first place: the buying of the service, the purchasing experience of signing up to Comcast, checking out, and then the technician turning up for the install and how they interact with the end user.
And I think that's fascinating, because you were identifying opportunities for improvement beyond the software development lifecycle, into the whole digital experience across the digital lifecycle, really.
Adrian O'Leary Yeah, you know, that's a great point. You don't want customers to be calling you, so what can you do on the preventative side? You know, if you take your iPhone or your Android phone and put it in the sun, it's going to tell you, I'm hot. Right. So what the cable operators are doing is building the technology and this bidirectional communication to be able to tell the customer something's going on.
Sending them a text to say, listen, I've just noticed that your set-top box or your modem or router has gone offline, can you check the connection? Or knowing that there's a power outage in your subdivision. They want to do all this proactive communication to let the customer know: hey, we're here, we've noticed something wrong, we're working on it. And it's just like automation.
Right. When you're doing your scripts, you want to have scripts that are self-healing. Right. And that's one of the things they're working on in IoT, you know, with their automation methodologies: how do we get scripts to auto-heal themselves? And look at that zero-touch aspect of testing, from grabbing a build and deploying it, running your sanity tests, reporting on the run, reporting defects, so there's as little human interaction as possible in the testing process.
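The self-healing scripts Adrian mentions are often implemented as a fallback locator strategy: try the recorded selector, and if the UI has changed, heal from a list of alternates. The sketch below assumes made-up selectors and a stand-in `find()` function; a real framework (Selenium plus a healing layer, for instance) would wire this into the driver and persist the healed selector.

```python
# Minimal sketch of "self-healing" element lookup for UI automation.

def find(selector):
    """Stand-in for a real driver lookup; returns None when not found."""
    current_dom = {"#submit-v2", "button[type=submit]"}  # pretend page state
    return selector if selector in current_dom else None

def self_healing_find(primary, fallbacks):
    """Try the recorded selector first, then heal from fallbacks."""
    element = find(primary)
    if element:
        return element, primary
    for candidate in fallbacks:
        element = find(candidate)
        if element:
            # A real framework would log the heal and update the script.
            return element, candidate
    raise LookupError("No selector matched; manual repair needed")

element, used = self_healing_find(
    "#submit",                              # stale recorded selector
    ["#submit-v2", "button[type=submit]"],  # alternate strategies
)
print(used)  # → #submit-v2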
Jonathon Wright That's amazing. It's like self-healing support, first-line support that's just proactively looking after you and keeping an eye on everything for you. You know, I think that's exceptionally interesting.
We're probably really low on time, but for the listeners, would you give them some final tips? Maybe some stuff that you've looked at in the past and learned from, online or, you know, books, and also how best to contact you.
Adrian O'Leary Absolutely. So let's start with the contact. My email address is Adrian.oleary@coforgetech.com. And tips: you know, I am always looking at websites out there. LinkedIn is a great place, a centralized repository that leads to so many different places to look at artifacts. And just having conversations with the likes of yourself, the likes of Jim Hazen, and other professionals out there.
Everybody has something to teach; every day I learn something. I mean, I've been in this industry for twenty-five years now, but there is always someone that can teach us something. Right. And that's the way I look at it: every day is a learning opportunity. So the more you read the white papers, the more you ask questions of the professionals out there, and the more you look at different industries especially. What I've learned over time is there's not one methodology that suits every situation.
It may be a blend of all those methodologies that you're looking at, the different frameworks for automation, the different approaches to automation. But more importantly, it's about using data. And that's the tip I'd leave with everybody: make sure that you're looking at the data to see how you can improve, because if you gather the right data, it's not going to lie. It's going to tell you a lot of truths.
And in order to have a conversation with someone, you need to have the right data to have that conversation. It's amazing, from a quality perspective, if your organization is viewed as the bottleneck in the company, the very different conversations that you can have with teams that are dumping code or data feeds in, to be able to say: this is how you're affecting my productivity. And just one last point.
There is a metric that we gathered around mean time to detect. We did a retrospective on a project and we said, OK, you're going to give us six code drops, and we put that in a chart. Then we said, these are all the defects that we actually found during the project, and when you plot that, it's a lovely bell curve, and the mean time to detect was fifteen days. Now, we took the actual data, and instead of six code drops we got 28 code drops. And instead of a bell curve, you have an EKG chart, with spikes where I'm finding defects.
My productivity is spent finding defects. And the issue became, as you got closer to the release date, you got more code drops and the spikes got higher, because you're finding more critical defects. And we used that data to be able to go back and have a conversation, saying, you're killing us, you're absolutely killing us, right? We're having to work nights and weekends to be able to get this product out the door.
And you're just dumping the stuff in. But once they understood that, because they didn't know how they were affecting our productivity, then we had a conversation and we got back to, OK, how do we fix it together? And then you get back to the bell curve. So just gather data and have a conversation.
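The mean-time-to-detect retrospective Adrian describes is just averaging the lag between each code drop and the defects found against it. A small sketch of that calculation, where the drop dates and defect dates are invented to illustrate the technique, not the project's real data:

```python
from datetime import date

# Hypothetical data: when each code drop landed, and when each defect
# attributed to that drop was actually found by the test team.
code_drops = {
    "drop-1": date(2020, 1, 6),
    "drop-2": date(2020, 2, 3),
}
defects_found = {
    "drop-1": [date(2020, 1, 16), date(2020, 1, 26)],  # 10 and 20 days later
    "drop-2": [date(2020, 2, 18)],                      # 15 days later
}

def mean_time_to_detect(drops, defects):
    """Average days from a code drop to each defect found against it."""
    lags = [
        (found - drops[drop]).days
        for drop, dates in defects.items()
        for found in dates
    ]
    return sum(lags) / len(lags)

print(mean_time_to_detect(code_drops, defects_found))  # → 15.0
```

Plotting defects found per day then gives exactly the shape Adrian describes: a smooth bell curve when drops are few and planned, spikes like an EKG trace when 28 drops land unannounced.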
Jonathon Wright Well, I've learned something new, Adrian, and I love your "digital heartbeat of the organization" idea today. That is a fantastic way of putting it: if something goes wrong, that heart rate is going to change. And looking at your data, what you're capturing and why, and what that actually means. You know, I think a lot of people out there don't know where to start when it comes to metrics, or are maybe a bit scared. And I'll definitely make sure that they reach out to you.
They can drop you an email. And, you know, we'll have to get you back on the show in a couple of months; I'm sure by then you'll have some more stories to tell.
Adrian O'Leary Absolutely. I'd be delighted to. Thank you.
Jonathon Wright Wonderful. All right. Well, thank you so much.