Jonathon Wright is joined by Wayne Ariola and Thomas Pryce from Curiosity Software. Five or eight years ago, they met and had a conversation about the idea of nodes, connected nodes, and the interconnectedness of everything that needs to be in place in order for an actual test to execute. Listen to learn how to curate information to make testers more productive.
- Jonathon and Wayne met five or eight years ago in Orlando, talking about software, the evolution of DevOps, and software quality. [1:14]
The challenge from a tester’s perspective is the complexity that they manage is significantly greater than the complexity in which a developer engages with on a daily basis. — Wayne Ariola
- Thomas started off about five or six years ago at Curiosity Software. Back then they were very much focused on test data and on test creation, and in both instances took model-based approaches. [2:32]
- The tool which Tom started off with at Grid-Tools was called Agile Requirements Designer. [6:19]
- Concept number one: you need a reference diagram that sits in the middle, that you can collaborate around, which is reflective more of an operational model than it is solely the bottom-up requirements. [10:25]
It is remarkable to me what information the organization has available versus what kind of information the testers have access to. — Wayne Ariola
- 2021 is the era of data optimization for the tester. Meaning we are going to learn how to curate information to make the tester more productive. [12:12]
- The concept of software test automation really revolves around only one core theme, which is the automation of a script. [12:39]
- The idea of “act” takes the curated information that started from “inform” and creates the action in a workstream. [13:43]
- The open-source testing tools have become extraordinarily productive. The proprietary tools are even more productive than the open-source tools, but the open-source tools are good enough. [14:40]
- The most popular question asked of testers today is “Are you done yet?” But the problem is that the complexity of the complete task required to test is indescribable to the business. [20:45]
Test coverage is always going to matter, but knowing where you can target your tests and where you should be maximizing your coverage, and doing that continuously, changes the dial. — Thomas Pryce
- Testing splits into two areas. One group of testers moves into the idea of becoming design experts, helping the team create better applications off the bat. Then there’s going to be an OpsDev focus of testing, which means they protect the business in an automated fashion, which allows us to act upon vectors. [36:07]
- Wayne is going to focus on one thing in 2021: leveraging the richness of data available today to make sure that the “inform” stream is active within an organization. [39:18]
A system that assists me to curate data for testers is imperative for success. — Wayne Ariola
- A system like Curiosity’s provides the ability not only to curate the data for “inform” but also the opportunity to act upon that central concept of a model. [40:26]
With a customer-centric approach and fostering partnerships with industry leaders, Wayne Ariola has created and marketed products that have evolved to support the dynamic software development, test, and delivery landscape. He has contributed to the design of many innovative technologies and has been awarded several patents for his inventions.
A recognized leader on topics such as Service Virtualization, SOA and API quality, software quality governance, and application security, he is a frequent contributor to industry publications and the author of the book Continuous Testing for IT Leaders. Wayne has been a contributor to the software testing space for 15 years and in the software industry for over 20 years. He holds a BA in Business Economics and an MBA in Finance and Business Strategy.
So I think in 2021, what we need to do as quality experts is to make sure that we are opening up these streams of information to be able to understand the impacts of change much clearer. — Wayne Ariola
Tom Pryce is a technologically hands-on Communication Manager with Curiosity Software Ireland. His interests include Model-Based Testing, test data management, and Robotic Process Automation.
It’s too complex to test everything, but if we can start leveraging the data, harnessing this “inform” idea from as many sources as possible, and channeling it into a single source of truth, the model, we can really start looking to align our test creation. — Thomas Pryce
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Check out Curiosity Software
- Connect with Thomas Pryce on LinkedIn
- Connect with Wayne Ariola on LinkedIn
- Follow Thomas Pryce on Twitter
- Follow Wayne Ariola on Twitter
Other articles and podcasts:
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Jonathon Wright In the digital reality, evolution over revolution prevails. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathon Wright's mission is to help you save the future from bad software.
Hey, welcome to this very special edition with not one but two guests.
My good friend, Wayne Ariola, as well as Tom Pryce from Curiosity Software. So welcome Wayne, all the way from LA.
Wayne Ariola Hey guys. Tom, Jonathon, I'm so happy to be here and well, Hey, by the way, Tom, great move from Oxford to Cambridge. Moving up to the big city. I think that's.
Thomas Pryce Well, it is too much.
Jonathon Wright I think that kind of gives us a great context of how we all met. Right. I've just finished the series of How I Met Your Mother. So, let's talk about how we met each other. So Wayne, to recall, was it Vegas, or was it LA where we first met?
Wayne Ariola I, it was either Vegas, but let me throw in another location.
It might've been Orlando. I mean, I hate to say it. If you look at the United States, there's two armpits: there's Las Vegas and there's Orlando. It was one of the two; sweltering hot. A lot of analogies associated with armpits for Orlando and Las Vegas. But the one commonality among the two is obviously that those are the hubs for talking about software, the evolution of DevOps, and software quality as well.
So we'll bring it back into more of a professional context there. That was ages ago. It seems like so long ago, when we were, by the way, still solving the same problems.
Jonathon Wright Absolutely. And the thing was, yeah, it probably was five, eight years ago.
There was a whole stack of us. I think Alex was there, and Huw Price was there, and they were all people who'd created software tools for one purpose: to help test. Right. But we had this challenge back then, and we were all brainstorming thinking, how do we move the needle?
Right. And I think that also links into the work that I've done with you, Tom, and your career: how you started at an Oxbridge kind of startup and evolved your thinking around what it means to test.
Thomas Pryce Yeah. So just to expand a little bit on what you're getting at there, Jonathon, for people who don't know my background: I started off about five or six years ago at Grid-Tools, set up by the managing director of Curiosity, that's Huw Price. There, we were very much focused on test data and on test creation, and in both instances taking model-based approaches. But there have been seismic shifts even in the five years that I've been doing it: first automation, and the shift in the skills that were required for that, and also the other processes it introduces, and what we are trying to automate, and the type of artifacts we were trying to create. But then obviously, with automation, we could do a lot more at the same time, and we can produce an awful lot more data. And that's where, I think, Wayne talks a lot about this inflection point we've reached now.
So there were hurdles, there were new processes to automate, there were new things to optimize. But now that we've started really tackling those problems, you've got a new opportunity. You know, we had a new shiny thing, automation; it was pitched as going to solve all the problems. Of course, it didn't solve all the problems; it introduced a whole load of problems with it.
So we've not only started getting around some of those bottlenecks, but now that we're starting to do it quite well, we've got a chance to really optimize that. And so, Wayne, I mean, you can talk about this.
Wayne Ariola No, I agree. It actually ties back into, and I think there was alcohol involved in, that initial conversation with Jonathon, where we had this great conversation about it all being about a node.
Do you remember that, Jonathon?
Jonathon Wright I do, indeed.
Wayne Ariola And we described the node as obviously a reflection of the running system. And, by the way, in testing we have the luxury to abstract above the pure complexity of the production-based system. But I think if you talk about this idea of nodes and connected nodes, it's the interconnectedness of everything that needs to be in place in order for an actual test to execute, right.
We have a really complex infrastructure that we still manage, way back when, and I think Tom you'll appreciate this, especially in the Grid-Tools days. I mean, the biggest disruptor at that point in time was APIs, right? And at that particular point in time, mobile was obviously hot, but still, it was not necessarily as functional as it is today. But now we have an integrated set of pretty complex interactions that stream across multiple devices, endpoints, and components, and the complexity of it has actually blown up. Jonathon and I had a conversation about this just the other day, which was: the interesting thing about the challenge from a tester's perspective is that the complexity they manage is significantly greater than the complexity a developer engages with on a daily basis.
Right? The granular aspect of what they're working with within a codebase associated with a particular application, right, in a dev environment, is much different than the breadth of the environment in which a tester needs to operate. Right. And it's a pretty amazing concept, because when you look at the evolution of what is now required to be tested, validated, and completed, the onus associated with managing complexity sits much, much more on the tester's side.
Than it does on the developer side, and I don't think that's ever changed. Jonathon, I'd love your opinion on it, but I don't think that's ever been a concept that's changed; it's a concept that I think we're beginning to appreciate significantly more in 2020, 2021.
Jonathon Wright Absolutely, and I think what you've just explained with the kind of everything-is-a-node idea was where the conversation went. I recall the conversation we had, and some of you might not know this, but the tool which Tom started off with at Grid-Tools was called Agile Requirements Designer.
They started off with this idea that it was really important that we look at the requirements and understand that requirements change. Now we know Agile is turning 20 on February the 12th this year. Things have changed: iterative software development, we're moving much faster, DevOps, all those great things which we've talked about, which give your business agility. But I think let's take it back to exactly what the business wants, right? The business wants and needs, which used to be the user needs, but I think are now more abstract. The business may need a capability, and that capability starts off with a requirement of some description, and that might be a user story or a feature, or it could be some kind of initial concept.
If we look at something like lean UX or lean startup, that idea is then translated by a developer into code. Now, as soon as a single line of code is written, that's a node. That node is a testable object that we can look at at various different levels. That might be abstracted all the way up to a user interface, which testers have always been very familiar with, or down to a component, a single component of the system.
Like you just mentioned, APIs at the service layer. Now, if we take those building bricks and apply what we've been talking about for the last five years with model-based testing, and Tom's helping at the moment with the new ISO 29119 part for model-based testing.
Think about it as a model. You put into a model what the business wants. So this kind of capability is going to have a number of components in it which need to be developed. There could be different teams, there could be different organizations, but they need to be testable, and I think that is the big challenge. If we've got a node that we want to test, how do we test it? And how do we get to this reflection point where we realize that what we developed originally needs to change, and therefore there's got to be some kind of reaction, which now needs to test it in a slightly different way.
So, Wayne, you are the godfather when it comes to this kind of technology, with service virtualization and the amazing work you did with Parasoft. If we think about testing a node in isolation, that node may not need to exist. It could be service virtualization.
It could be a stub or a shim if you want to test that, which is what we sometimes call component testing or unit testing, at that level. You've got to understand what the data is, what you're trying to test, what the flow is, what the model is. Talk us through where your head's been over the last 15 years.
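Service virtualization, as described here, replaces a node with a stand-in that returns canned responses, so the component under test doesn't need the real dependency to exist. A minimal sketch of the idea, with invented endpoint names and no relation to Parasoft's actual API:

```python
# A minimal "virtual service": canned responses keyed by
# (method, path), standing in for a dependency that may not
# exist yet or is too costly to provision for every test run.
class VirtualService:
    def __init__(self):
        self._responses = {}

    def stub(self, method, path, status, body):
        """Register a canned (status, body) response for an endpoint."""
        self._responses[(method, path)] = (status, body)

    def handle(self, method, path):
        # Unstubbed endpoints behave like a missing dependency.
        return self._responses.get(
            (method, path), (501, {"error": "not virtualized"})
        )

# The component under test only needs the node's *behavior*,
# not the node itself.
payments = VirtualService()
payments.stub("POST", "/charge", 201, {"id": "ch_1", "status": "approved"})

status, body = payments.handle("POST", "/charge")
print(status, body["status"])
```

The same pattern scales up to real tools that virtualize services over the network, but the core idea is exactly this: test the node against a controlled reflection of its neighbors.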
Wayne Ariola Well, since you just put me in the godfather category, I guess the maturity of my thought has coalesced into this new idea, so I'm going to play on that for just a little bit. I believe it comes down to a few extraordinarily core concepts. Okay. The first thing is you need something to act upon, and distributing your business logic into test scripts doesn't make any sense to me.
It makes sense to me when you need to automate something, but that's a disposable instance. Okay. However, I need some sort of abstraction. Let's call it a model, because I haven't thought of a better term for it yet. I've been scratching my head, Tom's been scratching his head, but we haven't really come up with a better idea than the word model. But you do need something to collaborate around, and this centerpiece to collaborate around is actually unique, because it isn't necessarily requirements-based. Right. It isn't necessarily 100% user-based, it isn't necessarily 100% developer-prioritized, but it kind of coalesces these concepts and morphs with the priority of the business.
And Jonathon, you'll appreciate this is kind of more of an ops-dev concept than it is a dev-ops concept. So concept number one: you need a reference diagram. Ooh, that's not a bad word. You need a reference diagram that sits in the middle, that you can collaborate around, which is reflective more of an operational model than it is solely the bottom-up requirement.
Right? So today, you could say that our testing efforts overcompensate by being bottom-up. Right. Requirements-driven, bottom-up testing has to happen; I'm not denying that, and you can't bail out on it yet. What needs to happen is some level of collaboration around this core, in which we come up with the ideas on how we're actually going to protect the business.
So I think there are three concepts. Once you have this idea of something central in the middle which is worth protecting, and there are influences into that model which are automatically updated, there are a couple of core concepts. One is "inform," and what I mean by inform is this: think of every single time I've gone into a project in which we're trying to evolve the concept or the practice of software testing.
It is remarkable to me what information the organization has available versus what kind of information the testers have access to. There's a gap there. So the whole idea at first is curating the data which is available within the organization: what becomes impactful for the tester? This is the first idea of "inform," and I think 2021 is the era of data optimization for the tester.
Meaning we are going to learn how to curate information to make the tester more productive. Now that's only the first step. The second step is this idea of "act." Meaning that once this information is curated for us, we're going to get a clearer picture of the holes that don't allow us to truly achieve automation.
So, I mean, I'm sure everyone on this podcast has probably listened to or read one of the analyst reports out there. Pick and choose what you want to read, but software test automation, when you think of the concept of it, really revolves around only one core theme, which is the automation of a script.
Whether it's no-code or code, who cares; script or not, it doesn't matter. But the idea of automation really just sits around that one key concept, and depending on what report you read, it ranges from 18 to 35% automation within an organization. Now, if you look at the actual automation rates of those test runs, it's about 16%.
So there's a gap between the actual automation of the test, or creating the scripting for the test, and how it's actually automated. And the interesting thing there is that gap really reflects everything around the test itself: whether it's access to the environment, access to the test data, or access to understanding how those validations are going to occur and which are meaningful for the particular time horizon of the test itself.
The idea of "act" takes this curated information that started from "inform" and creates the action in a workstream. So, I like to call it this way: you have a hub of automation, which could be the test, but then you have the spokes of automation, which are the nodes, Jonathon, that you and I talked about, which we now need to incrementally automate around the core. This idea of acting on information is going to allow us to round that out today. Okay, and I think the final concept is connectivity and openness, right? Today we're all coming off the era of the large platforms, and not to let the cat out of the bag here, but these large platforms were not built for openness; they were built for quite the opposite. They were there to add value, obviously, to the concept of testing, but quite honestly, vendor lock-in is real, and we can't have that anymore. And we can't have it for a number of reasons. First of all, the first thing you've got to realize is that the open-source testing tools have become extraordinarily productive.
I think the proprietary tools are even more productive than the open-source tools, but the open-source tools are good enough. What you need to do now is actually find a way to unlock the value of that, and this is where this concept of an open testing platform comes in. An open testing platform should have those three core anchor points: inform, act, and then finally automate.
Jonathon Wright Yeah, I absolutely agree. And I think, going back to that node concept of testing a node, and back to when we first met at STARWEST, part of it was: okay, there are two really difficult challenges here, and you've kind of addressed it. We have to move faster, we've got to automate more, but it's actually quite reactive.
Something changes, something breaks, we update everything. But there is no kind of openness within this community. When we look at the developers, they've got a lot of standards, and they've brought things forward to make things more open so that people can interconnect better with different technologies, whether it be cloud technologies or service-layer technologies; there is a standard that people work against, and this has never really happened in the test world.
Yes, we got the first W3C standard, WebDriver, for Selenium, and that was great. That gave us this ability to test browsers, should we say, for a second. But that left all the enterprise organizations still having to deal with either building their own or proprietary tooling, whether that be testing something like Kafka, using the tools that come with Kafka to do that, or retrofitting another tool to make it do what they need to do. And these are really difficult challenges, which all testers have at different levels, depending on what is valuable to the organization and, again, what they're trying to do from a quality perspective. So I'm going to challenge you, Wayne, and draw on your experience.
With the RPA landscape, you see this top-down approach and the bottom-up approach. You've got this "I'm going to create an organizational blueprint," which, like you said, describes how everything fits together. People would love that, but in most organizations it's not real. And then down at the bottom level, organizations may have the odd diagram, which people have talked about, of how everything connects, but the data is so sparse and, like you said, so restricted that actually having that entire organizational blueprint is a pipe dream.
So from a top-down perspective, the organizations are talking about things like improving efficiency, improving quality, improving time to market, giving the customer a better experience, and those filter down to all these different programs of work. We've got to be more secure.
We've got to be more performant. We've got to have a better-quality product with fewer outages. All those things are being driven from the business and then passed over to IT as projects. Now we heard Gartner say 2021 is going to be the end of the era of IT projects. It's now moving from commissioning a project to a product, and that product is evolving within your organization, if you're an organization that relies on that product. That product could be a mobile app, it could be your e-commerce platform, it could be your backend business intelligence. It's a product that makes you different. It has all your domain knowledge, all the business rules in there.
It's the special sauce that makes up your organization. So I love this concept you've got with "inform." And do you think it is a two-way thing? Do you think there are information streams that should be coming down from the organization?
Wayne Ariola Absolutely. There is no doubt about it, but Jonathon, we've got to also understand that this needs to be selective. Right. Going after everything is impossible, but going after the priorities of what is important is required. So this is why I think that RPA top-down, or process-up, kind of concept, versus the requirements or more discrete requirement concept, also aligns with this value stream management type approach of moving from a project orientation to a product orientation.
There are a lot of good things that come out of that, and I think the first thing that comes out of that is business priority. Right? And I know we all think we know it; as testers, I think we all think we understand the priorities, but I can guarantee you that the organization is not necessarily keeping up with those business priorities.
Now, let's say a business priority does change, Jonathon or Tom. Right. How are you going to reflect that if your logic is distributed among different test tools, across teams? There is no way today. Well, there is a way today, but there is not typically a way that people actually use this kind of central collaboration feature to go protect the business. And I think as these ideas of product-based governance start to evolve, ideas like value stream management start to roll in, right? We're going to have to find a way to have that reflected in what we are going to be responsible for as testers. And I think this is the other idea of using information, because you can think of information as flowing from infrastructure to people.
So people are aware, and then it also flows from infrastructure to model, so the model is updated. Right. So first of all, the people need to be informed, and that's a huge gap from a software testing perspective. And in fact, Jonathon, this is what you and I talked about recently as well: one of the biggest factors that we always face as testers is that we're always kind of seen as, "Hey, testing's not done yet."
Right. It's back to that same adage, which is the most popular question to testers today: "Are you done yet?" Right. That's the most popular question. But the problem is that the complexity of how to achieve the complete task that is required to test is indescribable to the business. Right. They don't want to hear the fact that, oh, I didn't get environment access.
Oh, someone else was basically using the environment; I had to restructure the database; it never got cleaned; I never got a clean environment; I never got it built to actually execute the scope of tests required. Our test data? We're still waiting for it to be procured or provisioned.
All of these nuances associated with actually getting a test to run are almost indescribable to the business. We need a way to actually understand it visually and present that visually back to the organization, so they understand where we are, what we're doing, and which areas are potentially creating the latency in the process right now.
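The "indescribable" blockers listed above (environment access, unclean databases, unprovisioned test data) become describable once they are captured per test and aggregated into a report. A toy sketch of that idea; the test names and blocker categories are made up for illustration:

```python
# Aggregate per-test blockers into a readiness summary the
# business can actually read, instead of a wall of excuses.
from collections import Counter

# (test name, blocker) pairs; None means the test is ready to run.
BLOCKERS = [
    ("checkout_regression", "no environment access"),
    ("payroll_run", "test data not provisioned"),
    ("report_export", "no environment access"),
    ("profile_edit", None),
]

def readiness_report(blockers):
    """Count ready tests and group blocked tests by root cause."""
    blocked = Counter(reason for _, reason in blockers if reason)
    ready = sum(1 for _, reason in blockers if reason is None)
    return {"ready": ready, "blocked": dict(blocked)}

print(readiness_report(BLOCKERS))
```

Even this crude rollup answers "where is the latency?" at a glance: here, environment access is the dominant bottleneck, not the testers.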
I think the most important part is that if we have those two streams, informing people of what's going on in terms of change, and informing the model of what's going on with change, we're able to better represent our tasks back to the business, which, back to your point, Jonathon, bridges that gap between the bottom-up motion of change and the top-down expectations of the business.
Jonathon Wright I love it. And it just makes so much sense. I'm going to take a step back for one second and look at what industry we're in. Right. The idea Gartner mentioned, with this idea of funding an IT project, and IT being the business.
Right. You think about any company in the world, whether it makes trainers or is a famous fashion label: they're an IT organization, but that's not what they are anymore. The idea is that they're a business. So consider for a second this idea of passing over a requirement from the business to an IT division, which is siloed in its own sense.
It breaks into two. First, information, which we've never been very good at passing through the organization. I'm working as an infrastructure engineer, I'm working as a security engineer, I'm working as a data engineer: how do I pass that information across? Is it some kind of Confluence page? Is it some kind of spec? What is it that I'm passing between those divisions to give them information?
So that's the first bit. And then there's the technology, which we've focused on way too much. It's all been about the latest technology and how we can utilize that to move the needle on our business. And this is where we've taken it. I know I've been chatting with Paul Gerrard around removing this divide, exactly what you've said there, Wayne, which is bringing business and technology together: business technology. The idea is that they join up in the same way that product engineering does; Google would talk about product engineering, bringing engineering and product together.
Actually, let's bring the business, which is the top-down, together with the technology, which is the bottom-up, and let's have no more silos. Let's not have separate divisions. Let's have the business involved and let's drive it. And I think what's interesting now, and to me this is the Pepsi challenge, is why everything that you've been talking about
I absolutely love, and we're going to talk a little bit about what the products do and why people should go off and try them. But I'm going to bring back in Tom for a second. If I sat any tester there and I said to them, what's the value of that test? Right. They would give me an answer. And if I said, well, what's the value of that automation?
They'd give me an answer. And then if I said to them, okay, well, what kind of test coverage have you got? They'd give me an answer. Right? And then if I said to them, okay, where are we? Are we there yet? How long have we got to go? They would really struggle to say how confident they are and how far they've got to go. Taking Wayne's idea, and this is something that you've done before with the model-based testing approaches, if you can prioritize certain critical parts of the business, like payroll or being able to generate a report, and then you're able to focus and provide more coverage using different types of coverage techniques, then you can literally say to the business:
I can show you the value, and I can show you in a way that you would understand. Do you think that is really it: bridging the gap, so the business is able to understand and share a model which they can both see the value of and then prioritize where the focus is, based on potential change and where that could be more risky?
Thomas Pryce Yeah, for sure. For sure. I think the keyword there is value, right? So the definition of done has been a very hard question for testing for a long time. The big measure has been functional test coverage, rooted in system logic: you can say, if we maximize this as much as possible, if we're testing as much of the system as possible, we increase our likelihood of finding defects before they hit production, and we're not going to have a big scare caused by a defect we should have caught. But I think the question to ask yourself is: okay, you can test this much of the system, but to Wayne's point, business priority changes. What does it matter if I find ten bugs in a part of the code base that users are actually unlikely to touch next month?
Right. And this notion of, yes, functional test coverage is always going to matter, but knowing where you can target your tests and where you should be maximizing your coverage, and doing that continuously, changes the dial. I think you have to ask yourself: where does our testing deliver the most value to the business?
Not how many defects can we find, so...
Wayne Ariola Tom, not to interrupt you, but this is not thinking that is currently subscribed to, as well. I think we've got to get to that point where people understand that the business priority question shifts this whole hyper-focus towards something that is much, much different.
It's more like what the APM system is doing: it's monitoring for effectiveness, not necessarily trying to guarantee effectiveness. Right. But I think you hit on a great point there; not to interrupt you, but I thought it was worth compounding.
Thomas Pryce Absolutely. Yeah, for sure. And Jonathon, you mentioned that idea of shift left, shift right, shift everywhere, quantum teleportation; you've coined it, right? It's a similar idea in different terms: we now have the potential to pull in information from more sources than ever before. Paul Gerrard speaks very well on this idea of implicit and explicit models.
So five years ago with models, and this also goes back to Wayne's idea of having an abstraction layer, it was all about this: we're all building models in our heads as testers, and if you could get those models into a formalized format, it's going to save us a lot of time. We can automate a lot of the manual processes, we can share knowledge. But it was always focused on the system logic itself. Now that we've started building that single source of truth for the system logic, if we can start plugging in these broader systems and informing our models with data from across the whole business, not just the development pipeline, we can start targeting and testing what we think aligns most with business value.
I think that's a key next step, because testing everything is never going to be possible; things change too quickly today. We've mentioned APIs; you can also mention containerization and microservices at scale, where bits of the system are replaced all the time. It's too complex to test everything. But if we can start leveraging data, harnessing this inform stream from as many sources as possible and channeling it into a single source of truth, the model, we can really start aligning our test creation, which should be automated already, with business value, testing what matters most before the next release.
Jonathon Wright Yeah. Let me ask this question to Wayne now. I think Tom has defined the problem statement of the decade, right?
If I'm a business and maybe I outsource my testing, or I have a test team, or I use crowd testing, whatever they're doing to introduce some quality into the organization, and I ask, "Where are we?", one team says, "We've got 4,000 tests," another team says, "We've got four tests," or they've found 80 big defects, or they've found 10,000 defects.
What am I measuring? I have no idea. I can literally give metrics which don't really mean anything. This is the idea of really challenging: well, what is a test, and what's the value of it? And I think Tom's point about business value hit it right on the head.
We look at what's important to the organization and we test that first, right? And that's informed by lots of different systems. Wayne, you picked out APM, so I'm going to run with that. A great example: say I'm running a mobile app and we're trying to sell PS5s, right?
We launch the PS5 and everyone wants to download the app, but unfortunately we're having problems. It could be that there are too many users, it could be a performance issue, it could be a security issue; it could be whatever is important to the business at that moment. And there are customers on their phones saying, "Oh, it's freezing."
We're getting all this data coming from the APMs, and from the instrumentation on the actual phones, from Firebase, saying: we're getting all these stack overflows, we're having all these issues on devices in different countries. It could be a language-specific problem where people are getting stuck. But we're not using that information to inform our testing, and I think that's the big kicker. It comes from two directions, exactly as Tom said: we're shifting left and we're shifting right. By shifting right we pull in information from what's happening in the real world, and by shifting left we create a model sooner, one that represents what's important to the business and gives us an understanding of the functionality that is explainable to the business. Both those directions are incredibly important, and this new capability we're seeing with operations and AIOps means we've got more data about what's happening all the way down the stack: the user interface, what's happening on the mobile device, how much CPU it's using, what's going on at the service layer, all the way down to the individual database layer. Yet we've segmented our testers into performance testers, security testers, the database team, and a specific team dealing with APIs.
I think the time has come to ask: how can we quickly generate a model from what the business is doing today, and show how a change in that business process model will actually affect what needs to be developed, whether we stub that out for the time being using service virtualization, while also testing issues that are coming from production and prioritizing what's important?
Wayne Ariola It's a great example, and by the way, you have to do all this in a highly distributed environment. Given COVID, I've had to shift rooms three times because my kids are essentially on Zoom calls for school, and they're speaking to me as I'm trying to talk right now.
So we've got to do this in a highly collaborative fashion. But here's a great example of that: I was on the phone with the guys from AppDynamics yesterday, and they said, "Gosh, we have this great data," and they gave almost the same example as you did, Jonathon. They had detected that a particular mobile device was having issues in Brazil.
They weren't finding it in any other region, but guess what: the team that owned the specific application, the next version of which would go onto phones in Brazil, had no idea about the issue. Why couldn't this be injected as data into the test framework early?
Then you're not having to go refactor what's brand new, right? The data's there, the information's there, and acting upon it takes a very small amount of knowledge; people just need to be informed. So imagine you're a tester on this mobile app project, or you're responsible for this particular node of the model, and boom, in your channel, Slack, HipChat, whatever you use, you get a notification that there is some aberration in how a particular Android app is operating in Brazil under these conditions.
Wow, that's interesting. Now we can take that one step farther, to act, right within your interface: the one holding the model, which accommodates the environments it can run in. Could we reproduce those particular conditions and run against the mobile app today? With the environment flexibility that's out there today, yes, you can. It's just a parameter and a config; it's pretty easy to set up. It's just a knowledge gap that stops us doing it. So I think this is the era of connectivity for testers, and testers need, and I'm in love with this phrase, curated data. There's a ton of data out there, but what's required is an interface built to curate the data for the tester and make it actionable. Once we understand what those points are, guess what: we're going to get smarter. We're going to increment on what we know, bring more data points in and correlate them, then run machine learning algorithms against them, and it's going to evolve into more of an automated process. It's going to make us better. It's not going to threaten jobs, by the way; it's going to provide more information to let us do things smarter, rather than, in the case you just mentioned, Jonathon, having a 5,000-test regression suite which is throwing false positives you can't even trace, causing ten times more work ferreting out what's good and what's bad. And what happens at the end of that journey? We've all been there: let's wipe out the tests and start again, right? So this idea of getting smarter about what we need to do, and when we need to do it, is technically feasible today.
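Wayne's inform-then-act flow could be sketched roughly as follows. This is a minimal illustration, not any vendor's actual API: the alert fields, parameter names, and message format are all hypothetical stand-ins for what an APM feed and a chat notification might carry.

```python
from dataclasses import dataclass

@dataclass
class ApmAlert:
    """An aberration reported by an APM tool (fields are illustrative)."""
    app: str
    platform: str
    os_version: str
    region: str
    symptom: str

def to_test_conditions(alert: ApmAlert) -> dict:
    """Curate the alert into parameters a test environment can consume."""
    return {
        "device_platform": alert.platform,
        "os_version": alert.os_version,
        "locale_region": alert.region,
    }

def to_channel_message(alert: ApmAlert) -> str:
    """Format the curated alert as a notification for the owning team."""
    return (
        f"Aberration in {alert.app}: {alert.symptom} on "
        f"{alert.platform} {alert.os_version} in {alert.region}. "
        f"Suggested test conditions: {to_test_conditions(alert)}"
    )

# The Brazil example from the conversation, as a curated notification.
alert = ApmAlert(
    app="ps5-store",
    platform="Android",
    os_version="11",
    region="BR",
    symptom="elevated crash rate",
)
print(to_channel_message(alert))
```

The point is the shape of the pipeline, not the code: production telemetry is reduced to a small set of reproducible conditions, and those conditions arrive where the tester already works.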
And it's what's going to allow testers to evolve into true quality engineers. I know some people hate that term; I might be one of them, by the way. One more point, and this is a little contentious as well: if you look at the activities I see testers take on today, especially core testers, the funny part is that most of what they're doing is actually exposing poor design. So I think testing splits into two areas. One: testers move into becoming design experts, helping the team create better applications off the bat.
Your skills are fantastic, but they're better applied to design and design optimization. Then there's going to be an ops-dev focus of testing, which means: I'm going to protect the business in an automated fashion, which allows me to act upon vectors I'm ingesting from the IT ops environment.
I'm not reactive anymore; I'm proactively taking those data points and making them actionable within my quality efforts. Some big statements there; I'm sorry.
Jonathon Wright No, I actually love that, and let me just tell you how the Curiosity guys can help you today.
Let's take that example Wayne just gave. Say a ticket comes into ServiceNow, and because it's not your ALM product or your CI product, you don't have visibility of it; maybe you can't see that ticket in your Jira tools, even though you should go off and investigate it.
What the Curiosity guys have done with this open test platform is connect into those systems. Using VIP, their visual integration product, you can take feeds from different tools and feed them into something like Slack to give you that notification.
So it's giving you a translation of what is, in essence, more information, which makes you better at what you do: a curated process. Number two: if I'm a tester and I want to be better at testing, yes, we could go and speak to Dick Bender and learn about model-based testing, or...
We could just go to testmodeller.io and create a model ourselves. Model anything you want. Maybe you're doing a project and building a new interface, or you're trying to do some API testing and you want to understand which negative paths and which positive paths you should cover, and you want to create a whole stack of REST Assured or Postman scripts out of the back of it.
Great: go and use testmodeller.io and model out the flow. Maybe you've got a Swagger spec or some other cool starting point, but you can model it out and generate tests in whatever language you want.
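The model-to-tests idea can be sketched in miniature. This is not TestModeller's actual mechanism, just an illustrative generator under simple assumptions: a toy model lists valid and invalid example values per parameter (in a real tool these would come from a Swagger/OpenAPI spec), and we derive one happy path plus one negative case per invalid value.

```python
# A toy "model" of one API operation. Parameter names and values are
# made up for illustration; a real model would be derived from a spec.
model = {
    "quantity": {"valid": [1], "invalid": [0, -1]},
    "currency": {"valid": ["GBP"], "invalid": ["XYZ"]},
}

def generate_cases(model):
    """Yield (params, expected_status) pairs: one all-valid happy path,
    then one negative case per invalid value, keeping the rest valid."""
    happy = {p: v["valid"][0] for p, v in model.items()}
    yield happy, 200  # positive path
    for param, values in model.items():
        for bad in values["invalid"]:
            case = dict(happy)
            case[param] = bad
            yield case, 400  # negative path: expect a client error

for params, expected in generate_cases(model):
    print(params, "->", expected)
```

Each generated pair could then be rendered into whatever target framework you like, which is exactly the vendor-neutral output point made above.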
So you don't have to worry about proprietary vendor lock-in with whatever you're using today: you can model it and you can output it. Selenium, Appium, whatever; it doesn't matter, it'll integrate with it. So I guess the big question we're going to throw over to you, Wayne, is: what are the big bets for 2021 for our listeners?
Wayne Ariola Everyone's been asking a lot about 2021, Jonathon: what is the mantra? I honestly believe that an organization must focus on one thing, evolve it, and then move on. If I were going to focus on one thing in 2021, I would leverage the richness of data available today to make sure that the inform stream is active within the organization.
Which means, and I've said this word so many times, but I love it, that a system that assists me in curating data for testers is imperative for success. That's step one. Once that information stream starts to become second nature, you're going to realize there are patterns to act on, and they're going to be distinct patterns. You'll go, "Hey, if I understand that there's an outage on this Android OS for mobile phones in Brazil,
I should probably be testing for that today and integrate it into my current plan." So you're going to see these patterns of act start coming up, and again, a system like Curiosity's gives you the ability not only to curate the data for inform, but also to act upon that central concept of a model.
So I think in 2021, what we need to do as quality experts is make sure we're opening up these streams of information to understand the impacts of change much, much more clearly. If I had to pick a second thing to focus on, once we felt we were getting some leverage there, it would be turning those priority pieces of information into action.
Not a report, which you then have to go do something else with, but actually having the action show up in a central system of record, which gives you either the notification or the enablement to put a practice into action. Once we see those patterns start to evolve, that's where we really start to improve.
Jonathon Wright Oh, that sounds great. What I'm going to finish with is this: Tom, what's the best way to get started with Curiosity? Are there videos to watch? How do you sign up?
Thomas Pryce Yeah, absolutely. I think you hit the nail on the head when you said head over to testmodeller.io. As you discussed, we do also have the test data automation piece at testdataautomation.io, but let's start with TestModeller.
You can sign up for a free trial and you'll get access to the tool for two weeks. Over the course of those two weeks you'll get emails with useful resources: we have libraries of demo videos, online tutorials, and structured video series with written documentation.
Of course, you can also get in touch with us any time; we're more than happy to jump on a web meeting. And if you're more interested in the test data automation piece than the modelling and model-based test automation piece, your day-to-day test data, that's an area where within the team we've got people who've been doing it for 35 years.
Go to testdataautomation.io, hit the "Book a Consultation" button, and get us on a web meeting, because test data is as diverse as anything, so it's always good to start talking right off the bat. That's the best starting point: the TestModeller trial or test data automation. Get in touch; we'd be more than happy to talk about it today.
Jonathon Wright Yeah, and there's nothing to stop everybody who's listening now from going and trying it for free, right? Have a little play, because once you learn and get some kind of action out of the back of it, that's value, right? So go and try it, learn a new skill, rethink the way you're testing in 2021. And with that, I just want to say a massive thank you: it's the first time ever we've had two guests on the show. All the way from LA, massive thanks, Wayne; we're going to have to get you back on the show to follow up on some of your insights for 2021. And Tom in Cambridge,
massive thanks again. I love it. Reach out to these guys; they're easy to find on LinkedIn if you need any help. And thanks to the godfather and my good friend Tom for an awesome podcast. Have a great one, guys.
Thomas Pryce Always a pleasure to chat, Jonathon.