Jonathon chats with Royi Haddad of 21 Labs about autonomous testing, functional UI testing, personalizing tests to devices, and more.
- Join the waitlist for The QA Lead online community membership forum
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Check out 21 Labs
- Connect with Royi on LinkedIn
Other articles and podcasts:
- About The QA Lead podcast
- Unit Testing: Advantages & Disadvantages
- Help Test COVID Safe Paths MIT Project (with Todd DeCapua from Splunk)
- Developing The Next Generation Of Testers: The DevOps Girls Bootcamp (with Theresa Neate from DevOps Girls)
- QA Tester Jobs Guide 2020 (Salaries, Careers, and Education)
- What Is ‘QMetry Automation Studio’? Detailed QAS Overview & Explanation Of QAS Features
- What is Quality Assurance? The Essential Guide to QA
- How To Keep Up With Change In The Testing World (with Joel Montvelisky from PractiTest)
- The Digital Quality Handbook (with Eran Kinsbruner from Perforce Software)
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Jonathon Wright Hey, and welcome to TheQALead.com. Today I have a special guest, Royi from 21 Labs. He's been doing some amazing stuff around AI-guided authoring, using next-generation AI technologies for mobile. And he's going to tell us all about his story, coming from Israel over to California and the startup of his dreams.
Intro In the digital reality, evolution over revolution prevails. The QA Lead approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED Speaker Jonathon Wright's mission is to help you save the future from bad software.
Jonathon Wright This podcast is brought to you by Eggplant. Eggplant helps businesses to test, monitor, and analyze their end-to-end customer experience and continuously improve their business outcomes.
Hey, how's it going? How are you doing?
Royi Haddad Good, good yourself?
Jonathon Wright Yeah, really good. Sorry for the slight delay. I've been back to back on calls today and literally I've just got off the phone with the guys in California.
So everything's running behind, like a doctor's surgery.
Royi Haddad No worries I can understand that. You're based in England?
Jonathon Wright I am, yeah. So the time zones are very interesting at the moment. I had a couple of calls with Tel Aviv this morning, and they're two hours ahead, so they kind of woke me up at seven o'clock for the first meeting, and it's just gone that way. How about yourself, whereabouts are you based?
Royi Haddad We're based in Campbell, California, San Jose, San Francisco area.
Jonathon Wright Nice. Yeah.
Royi Haddad What time is it now? It's almost 10 p.m.
Jonathon Wright No, no, it's six o'clock. Six p.m. still, absolutely perfect. But yeah, I was based in Santa Clara when I was working for CA, so you know. I do miss Southern California. It is a very different landscape. It's thunder and lightning outside at the moment.
Royi Haddad Oh really. Yeah, I think it's going to be like one hundred degrees today. When you get used to living here, it's hard to leave.
Jonathon Wright Absolutely.
Royi Haddad I've been here 15 years now and I just don't see myself going back home.
Jonathon Wright I don't know. I just. Where are you from originally?
Royi Haddad I'm originally from Israel. So, yeah.
Jonathon Wright All the best, all the best companies are from Israel. It's amazing just how many companies and startups have appeared out of Israel.
Royi Haddad Yeah, you know what? I read a book about this once. It seems like there are a lot of good companies. But one big difference between the companies that come out of Israel and the companies that, for example, you see coming out of the United States, or even Germany and England for that matter, is that a lot of these Israeli companies get sold very, very early on. Right. It's not necessarily a mindset of let's build the next Apple, or let's build the next big company, as it is with other European startups or American startups. It's just interesting to see that different mindset. I don't know, maybe it's that dream of an exit that keeps pulling startups that way, you know?
Jonathon Wright Yeah, absolutely. I was talking to, let me just think, it was yesterday, Joel, from the Mercury days. He's at PractiTest now. And prior to that, I was talking to a lady from a new startup called Lightrun, which is a production debugging tool. And on top of that, Alon, who I worked with quite closely and who was the founder of BlazeMeter, he's just started his new stealth company, UP9, which is up9.com, an API microservice testing platform using AI, which is really interesting, something we were working on when I was in Santa Clara. We launched something called Blaze API, which was the idea to do kind of BlazeMeter for APIs, and it just didn't quite work. I don't think the industry was ready for it. And we also started down the Swagger route, and nobody had Swagger specs, so it was just like, never mind. So yeah, I'll tell you a little bit about the podcast to give you an idea. It'll obviously be edited and transcribed by a team based out in Canada, and you can edit things and remove stuff. It's typically about thirty minutes long. It'd be great to get an intro about you and what you're doing. I know you want to talk a little bit about the React, um, sorry, the Appium challenges. And we've had Eran on the show, who's also another Israeli superstar, with the Perfecto guys. And then, if you've got a blog or something you want to put on The QA Lead, we can add that in when it launches as well.
Royi Haddad Definitely. Yeah, that would be great. I think it would be great to tell you a little bit about our story: why we started, what it looked like when we first started, what it looks like now, and a little bit about the challenges. Because one thing that I've learned in the last year is that mobile QA is not what you think it is. I mean, when we first started, we had a completely different mindset and we thought that we could do things one way. And I think that's the life of a startup. But yeah, you tell me how you want to start it. Maybe I'll tell a little bit about 21 and what we do, how we started, where we are today. And then we can talk a little bit about the challenges, not necessarily of Appium, but the challenges of doing QA and how people perceive it. And, you know, as a company, how do you find the companies or the customers that are interested in doing QA? Because not everybody is interested in doing QA, right? So maybe we can start out by talking about 21 and what we are, and go from there. Does that sound OK?
Jonathon Wright Yeah, perfect. Well, I've got the recording on, so we can literally start, and I'll do the intros and your bio and stuff and prepackage that later to mix it all in. So yeah, it'd be good just to welcome you to the show. Tell us a little bit about you and let's go from there.
Royi Haddad Sounds good. So I've been in the mobile and machine learning industry for, I want to say, almost ten years now. I had my own mobile gaming company. We were in the market for several years, with over one hundred thousand monthly users. Before I started 21, I was mostly engaged in machine learning and AI: reinforcement learning, which I got from the games world, natural language processing, and computer vision. So we started 21 in late 2018, or early 2019. I joined Shani Shoham, who comes from the QA world. My idea of QA before that was: let's write a bunch of unit tests to get it over with, release the product, and that's it. But Shani came from that world, which obviously is a much more organized world than a lot of developers think. Right. There's much more to QA than just writing a bunch of unit tests, or going in and checking a few scenarios on the screen, and functional testing. So we joined hands in 2019. We got introduced by a mutual friend and we immediately got started working. He had a vision of how QA should work, because since he came from the industry, as a former employee of Perfecto and some others, he really understood what the challenges with QA are. Right. You've got a lot of companies that are spending a ton of money on buying infrastructure services and mobile devices and whatnot. But still, the majority of QA was limited to: hey, let's either do it manually, or write a bunch of Appium scripts or Selenium scripts and run them. So again, he had a different vision of what that process should look like, what real automation looks like. And we started working. We started building an MVP product and showing people. We set out to build a truly autonomous QA system, just to realize that that's not what most companies wanted. They want to check their own scenarios.
They want to put the check mark, the check box, on their own scenarios. And autonomous testing doesn't allow you to get to the point where you can create very complicated scenarios for testing. If you've got, for example, a QA system with an autonomous crawler, or whatever you want to call it, whether it's web or mobile, yes, you can go and click on some buttons, go from one screen to the other, capture a screen, and maybe do some basic assertions on generic text. But every company has very specific scenarios. I want to check, for example, that when the user enters his login information on the login screen, I can actually validate that it exists on the profile screen, right? His name, his e-mail, his phone number. For an autonomous system, it's very difficult to infer such scenarios. So again, we started with an autonomous crawler. We built a crawler that was able to crawl applications on its own and do quite amazing things, just to realize that customers are not really interested in that. Right. They want to tick the box on what they really want to test. So we made a one-hundred-and-eighty-degree change halfway through the process and essentially built a system where companies can really create a test in a matter of minutes, a test that reflects the things they want to test. They're able to run that test in the cloud, without any need for devices, in a matter of seconds. And they can do parallel executions: they can take that test and run it on five or ten different devices at the same time. When we first started building the system, I would say twenty-five percent of our test executions used to pass, and it would take long minutes to run a test. Now we're at a place where we're running about six thousand executions a month.
The success rate for most of the customers we have right now, leaving aside bugs, is close to 100 percent. And it takes literally minutes, or less than minutes, to run these executions. I'll give you an example. One of our customers has a test with fifty-seven screens, and on each of these screens there are three to four different actions they're performing: insert text, do an assertion, and whatnot. And that test passes time after time. It takes about, I would say, an average of four minutes to execute that test on any device, which is incredible if you think about it. So we're in a much different place right now. And, you know, to put in context what we're trying to do at the end of the day: our vision is to create a QA platform where you enrich the QA process with production data, meaning that you're creating tests for the scenarios that really matter. Because with a lot of the QA done today, there's hardly any visibility into what the users are really doing. Let's build a QA process, a suite of tests, that is really based on that. One of our visions is to bridge that gap: we know exactly what users are doing in production, so let's build a set of suites, let's create the QA, based on that. Obviously, test maintenance is the hardest part of any QA process, and we'll talk about that in a bit. Part of our vision is also to reduce test maintenance and bring it to a minimum. And I think the final piece would be to bridge the gap between developers and functional UI testing. What does that mean? If you're a developer right now and you're building the front-end or back-end piece, you've got your unit tests. And whenever you push your branch or merge to development or whatever, you know that your unit tests run in the CI/CD.
There isn't really such a thing for functional UI testing. What happens is, as a developer, you release a piece of code, you know that your unit tests pass, but you're not really sure how that piece of code behaves in the real world. Right. How does it behave in the hands of the user? We want to bridge that gap. We want to bring it to a place where you, as a developer, write a piece of code, and you know that piece of code affects the login screen, for example. You'll be able to run not only your unit tests but a set of functional tests to check the UI login flow, to check everything that revolves around the work you've done, in a matter of minutes. So instead of having to wait for a QA person to do it, you can do it in a matter of minutes, get feedback, and essentially develop better code, develop faster, and reduce the time you have to spend going back to fix bugs and check things. So this is the vision of where we're trying to take 21. It's more than just a QA system; there's a lot more to it than that. Whether we're going to reach a day where it's 100 percent autonomous will be determined by how fast machine learning and AI advance. But in the meanwhile, I think there's a lot that can be done to make it better, faster, more accurate, more aligned with the real world and what users are doing. So maybe I'll take a pause here and pass the ball back to you for some questions. And we can touch on test maintenance and Appium and all of that stuff.
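[Editor's note: the developer workflow Royi describes, where a code change triggers only the functional UI tests around the affected screens, could look something like this sketch. The file paths, area names, and `select_tests` helper are illustrative assumptions, not 21 Labs' actual API.]

```python
# Hypothetical sketch: pick which functional UI test suites to run in CI
# based on which source files a commit touched. The mappings below are
# made up; a real system would derive them from coverage or app metadata.

AREA_TESTS = {
    "login": ["login_flow", "password_reset"],
    "profile": ["profile_edit", "profile_view"],
    "checkout": ["checkout_happy_path", "checkout_declined_card"],
}

# Which app area each source path belongs to (assumed repo convention).
PATH_AREAS = {
    "src/auth/": "login",
    "src/profile/": "profile",
    "src/payments/": "checkout",
}

def select_tests(changed_files):
    """Return the functional test suites affected by a set of changed files."""
    selected = set()
    for path in changed_files:
        for prefix, area in PATH_AREAS.items():
            if path.startswith(prefix):
                selected.update(AREA_TESTS[area])
    return sorted(selected)

if __name__ == "__main__":
    # A commit that only touches auth code runs only the login-related suites.
    print(select_tests(["src/auth/session.py", "README.md"]))
```

The point of the sketch is the feedback loop Royi describes: a developer pushing auth code gets the login flow tests back in minutes, instead of waiting for a full suite or a QA hand-off.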
Jonathon Wright No, that's really good. I really like the vision, and it's got a lot of synergy with what my own thought processes are around it as well. I'm doing a book with Eran at the moment, which is a follow-up, around Shift Right. It's this concept of joining production, what the users are really doing, with what's happening on the left-hand side. And as we were just talking about beforehand, when I was out in Silicon Valley, one of my big pushes was showing the analyst community how we can take what's on the right-hand side and pull it back to create more realistic scenarios. I've seen a lot of ideas about this, and obviously it's maturing. Actually, next week I'm doing a discussion panel with some people within the QA industry, which you're more than welcome to join in on, around testing in the wild and testing in production. And the reason this comes up is there's a lot of talk in the industry around crowd testing being potentially a resolution for COVID, and the fact that a lot of teams have been ramped down. How do you ramp back up? Now, your product to me seems perfectly positioned to augment that, to really work with the crowd testing companies, to actually provide them with that device and also that mechanism to capture what real users are doing. And this is one of my personal interests; on the last podcast we were talking about it. I've just finished doing some crowd testing for a fashion retailer. Now, as you can tell by what I wear, unfortunately I don't have much fashion sense, and I'm also not the demographic that they're aiming for. So it's great to have an understanding of the demographic and also how they would use it.
So I'm in a gamification team, where we meet every Friday and share ideas and tips around gamification. And it was really interesting, because we talked about the fact that, you know, the Gen Zs of the world, and now actually we're on Gen Alpha and moving into Gen Beta. In the first book I did with Eran, The Digital Quality Handbook, I talked about digital experiences: how my digital experiences changed between my 20s, my 30s, and my 40s, and the fact that technology has changed and some of the complexities have been reduced. But at the same time, masked behind that is the fact that, when you submit an image, say you want to do an image search for a product, so you upload a picture of a shirt that you see your friend wearing and want to match it, we know there's computer vision, we know there's a whole stack, there might even be some fabulous architecture kicking off to do that at the back end. Some quite clever technology that has abstracted the complexity away from the end user, instead of using filters in Amazon to say, I want a t-shirt which is formal, with these kinds of colors, and all that kind of stuff. And I think technology keeps on changing, and so does the interaction with mobile devices and connected devices. So I got a Segway this week, not a proper Segway, but just a Segway scooter. And the first thing I did was connect it to my mobile phone to lock it. And then I couldn't do anything else until it updated the firmware, the over-the-air firmware update. And of course, now it's completely linked into what I do. It's also gamifying me, in the sense that it wants me to talk to the rest of the community.
So I've already been talking to people, giving them advice about what happened to my last one and whether they could help, and it also encourages me to use the device and keep on using the app consecutively over a number of days. So behaviors are changing, and so is how we come back into our devices: push notifications, your Firebase messaging pushes, and also in-app notifications. There's this digital connection between people using mobile and the rest of the world. And I know you guys have been doing some work on the MIT COVID Safe Paths project and did some amazing work there. It's been groundbreaking, that capability to go off and learn through the past and actually start creating scripts to allow us to quickly test, going from zero to ten really quickly. And obviously, as things go down the line, there are different types of testing that seem to be emerging, like this visual testing concept, where the big vendors are really focused on, well, what's the difference if the logo is slightly out, or something like that, which we don't concern ourselves that much with. You know, when we're talking about users of React Native, we think, we're building an app for iOS, we're building an Android app, and we're all good. But when I was doing some of the tests, I realized that even things like the fonts need to be licensed onto those products, so it might look slightly different on different types of devices. So there is a chance, and also when you layer on top things like accessibility, someone coming in and changing their device settings and their font sizes, which could potentially break the form factor.
You know, I've got behind me my iPhone SE, my 8, different form factors, because they all seem to work slightly differently, and I need those for my Appium tests locally. But if you don't want to manage your own mobile estate, something like 21 is a perfect solution for someone to actually get started with mobile testing and automation, and also to get them to that next level of maturity. So do you see that's where your customers are and what their use cases are at the moment?
Royi Haddad So, thanks for sharing. You said a lot, and you touched a very interesting point with that personalization. You can really compare it to customer service at service companies, because tech companies don't have customer service; they have the QA process, essentially. But if you're looking at service companies, they have customer service. That's the analogy I give for QA. Now, how do you test an app, or a website for that matter, that is personalized? And it's more true of mobile applications, because they're much more personalized than web applications. I don't think I've seen that many web applications where users actually go and try to personalize the experience: changing the font, changing the way the website loads, and whatnot. It's more prevalent in mobile applications because it's more accessible. And I think the experience itself, if I have an application on my phone, is much more personal to me than going to a website and looking at the content. Now, how do you test for that? Take an example: a company that has one hundred thousand users, and you've got a QA team that has built a set of suites, a set of QA tests, and they're running them. But at the end of the day, to truly create a QA process where you're testing the things that your users are doing, if you think about the personalization part, your QA tests can branch out to dozens, if not hundreds, of different branches that contain all of that personalization information. Right. The fonts look different, the screen sizes are a little bit bigger, the colors change, they might be able to move some icons left and right, and whatnot. I've seen applications where users can actually change the layout of the bottom bar, the navigation.
And at the end of the day, you want to test that. Now, that being said, you don't want to create a QA environment where you're overloading yourself. You don't want to test every possible scenario, because then it just becomes impossible, number one, to maintain, and number two, to execute. To execute a set of suites that contains every possible permutation of what users are doing, you're going to need hundreds of emulators and days of execution time. The whole idea is to say: we're developing a piece of software, this is an agile world, how quickly can we test what we're developing, both in the unit testing world and the functional world, so we can release it? And that's what we're trying to do. We're trying to bring it to a place where, as a user, whether I'd otherwise be writing a script or doing manual testing, I can go in a matter of seconds and upload the application I just released, whether it's through the CI or through the platform, and choose what I want to run, assuming I already have tests. If I don't have tests, then, at least until the production piece kicks in for us, it takes a matter of minutes: you can actually create yourself a test by just clicking and dragging and dropping on the screen. Our interface is a very intuitive user experience, which allows you to create things very quickly. So, bringing it down to focus: at the end of the day, whether you're a big company or a small company, QA is part of your process, whether you think you're doing it right or not.
And the whole idea is to bring it to a place where it's agile enough that not only developers can make use of it, which they are not at the moment, but also your product managers, your team, everybody can get involved. QA shouldn't be a piece that only QA engineers deal with, because it also involves the developers and it also involves the product manager. And the way we created our application is very visual. To give an example, one of our customers has an application that's a questionnaire, and there are dozens of permutations for that questionnaire, because whatever you choose on a screen can lead you to five different screens in the next step. So from a product manager's standpoint, they would really want to understand: OK, I've got this application that branches out in multiple directions, I want to see what it looks like, what the user experience looks like. When the user goes to a screen that's got multiple options, what does that look like? And as part of our system, you can actually see a graph of the application and how everything plays on top of one another, which allows you to understand: this is the scenario, this is our test, this is what we're testing, does this align with what I envision for the product? That being said, for the developers in our platform, we've got some engineers that are running their tests through the CI/CD, and some of them are running through the platform. But the feedback is very quick. When you're running it through the CI/CD, for example, you're getting the feedback immediately; you're getting the results of your tests instantly. And I think, with all due respect to trying to personalize everything, it's going to be impossible to test every possible scenario.
You want to get as close as possible to what the majority of your users are doing. You don't want to leave people out, but at the end of the day, you've got to be realistic. And if you think about it, it's also something that can help you develop a much better product, because if you see that the majority of your users are using a very specific path, that they live within the same sphere of personalization, it helps you focus and shift your product development into that world. Because I don't think, in terms of product, that having a product that tries to satisfy everyone is good: you're losing focus, and you're losing your comfort zone, the place where users feel most comfortable in your application. But I don't think that right now there is a solution out there that provides the ability to check and test for different personalizations, unless somebody created a test for that, and I find it very difficult for QA teams to create a test for every personalization flow that users have created.
Jonathon Wright Yeah, and I guess it's diminishing returns as well. If you look at how many surplus scenarios you need.
Royi Haddad That's true. I mean, at the end of the day, overloading the QA team with scenarios that test a fraction, or even part, of a flow that's been personalized by some users, you don't want to have too big a base to test. At the end of the day, the more focused you are on what users are doing, on the flows that matter, the better your QA process looks. And that's, I think, where teams should be aiming to go. Once we put our production piece in place, something that's in development right now, we will give the teams, the product, the engineering, much more visibility as to what really matters: the screens users are found on, the actions they take, when they take those actions, and allow them to really focus their efforts on building tests that matter. Because we see customers that have a suite of 30, 40, 50 tests, but to be honest, a lot of them are either redundant or they check flows that I doubt many users are actually using. But they're there. And I don't know if it's the right thing just to feel comfortable and say, I have mapped one hundred percent of my application, which you might not have to. Right? If you compare this with customer service, if you're trying to provide customer service that satisfies everyone, you can't, right? There's always going to be somebody that may not have the experience they're expecting. And I guess that's part of business. Take AT&T, or even Apple, for example. I love Apple products, but even I have issues with their software; every time they release some software, something breaks. Right. Disney, you know, may be great for kids; I don't know, maybe grandpas are not going to like it so much.
And in the QA world, it's OK if you don't test for everything, as long as you test for the things that really matter for the majority. And by the majority, I'd say over 90 percent of your users. Then you're in a good spot.
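[Editor's note: Royi's "test for the 90 percent" idea, driving test selection from production analytics rather than trying to cover every personalization permutation, can be sketched roughly like this. The flow names and usage shares are invented for illustration.]

```python
# Hypothetical sketch of "test what the majority actually does": given
# production analytics (flow name mapped to its share of user sessions),
# pick the most-used flows until a target fraction of traffic is covered.

def flows_to_cover(usage, target=0.90):
    """Pick flows in descending order of usage until `target` is covered."""
    covered, chosen = 0.0, []
    for flow, share in sorted(usage.items(), key=lambda kv: -kv[1]):
        if covered >= target:
            break
        chosen.append(flow)
        covered += share
    return chosen, covered

# Made-up session shares for six user flows in some mobile app.
usage = {
    "browse_and_search": 0.40,
    "login_to_profile": 0.25,
    "checkout": 0.20,
    "edit_settings": 0.08,
    "change_font_size": 0.05,
    "rearrange_nav_bar": 0.02,
}

chosen, covered = flows_to_cover(usage)
print(chosen, round(covered, 2))
```

With these made-up numbers, four of the six flows cover over 90 percent of sessions, and the two rare personalization flows (font size, rearranged nav bar) drop out of the suite, which is exactly the diminishing-returns trade-off discussed above.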
Jonathon Wright Yeah, I completely agree. And I think what's really nice about what you're talking about here is just how easy it is to actually get using the product. Because, you know, I found personally, even with my kind of background, that it was really challenging just to understand the estate and what you need to get all set up. You get Appium, you have to use an Apple product to be able to access the iPhone. It felt quite restrictive for somebody who wanted to get from having no tests, from scratch, to actually having something which is valuable and which they can rerun and start building on. And I love Appium, but part of it is that it's not as easy. Like you said, making it more visual, making it very easy, drag and drop, those kinds of basic capabilities, getting people to that position is really useful. You know, the signing certificates on apps, and just all the general steps you have to do to be ready to actually take step one of recording something or automating something. And then the whole complexity around the equivalent of an object repository within a mobile landscape. It can be quite hard, and also hard to maintain, which I think was one of the things you were talking about: how do we reduce that maintenance burden on the developers, the testers, and the product team?
Royi Haddad Yes, it's a good point. I remember when we first started and we were trying to create that initial infrastructure to support the product. It took me days, literally days, just to get everything set up for Android. I'm not even talking about iOS, which is a completely different animal. Getting Appium set up, and having your Appium talk to your Node, to your Python script, to your Java. And you've got to connect to Sauce Labs, you've got to connect to Perfecto. And this device doesn't work and that device works, but this configuration doesn't. We brought it to a place where all you have to do is upload your APK, and the same goes for iOS. There are essentially two ways to create in our system. One, you can go to a recorder. We have a recorder that streams live to your browser; you don't have to install anything on an emulator like the recorders that are out there. You don't even need a device. You can start the recorder and start creating the test, and you're getting live feedback as to what's going on in the application; you can interact with it. Or, if you don't want to use the recorder and you know exactly what you want to click on or what you want to do, you can very intuitively just start adding some blank screens and add some actions on them. When you execute your test, you will see those screens updated with real data. So the next time you go to your test, you'll see the screens there, you'll see the image, and you'll be able to hover on the screen and see all the elements highlighted. So we brought it from a place where it takes specialty, it takes knowledge, it takes time just to get a script running, not even talking about connecting to a device, to a place where the user is completely abstracted from that. A lot of our users don't even know what Appium means. They don't know what a script is.
They know that they go in and everything is visual. They can click on an element, they can insert text in that element, they can even create custom code steps, and they click on a run button and it executes. And to bring it to a place where you can do this both for iOS and Android in less than a year — I don't want to sound arrogant, but it's pretty amazing. And then we're talking about the maintenance, the maintenance piece, which is, I think, where you were getting at. It's hard enough — getting started is hard enough. But now that you've started, you've got 40, 50 tests. With some of our clients, their application changes on a daily basis. And now you've got a bunch of new screens in your application, you've got a bunch of new flows, and on existing screens the buttons have shifted. Right? And what do you do? Your tests are not going to pass now, or a lot of these tests may fail. Not only that, if you're looking at the whole world of applications, every company builds applications in different frameworks. You've got your ReactJS, you've got your Cordova, your HTML5, your native Java and Objective-C and Swift. And we have applications where you're looking at the signal that's coming from the device, the DOM, and it is completely empty. You just have the widgets. You don't have any IDs, you don't have any text, you don't have anything. How do you take that now and give a user the ability to still select the elements, to still click on a button? So we brought it to a place where we can take essentially any application, regardless of the framework it was built in, and still give the user the experience of: hey, I can click on that button, I know where this is, I can insert text, and I can do an assertion, without the need to really understand what it's doing. You don't need to have an ID.
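One way to picture the problem Royi describes — an element tree with no IDs and no text, common with Cordova or HTML5 wrappers — is a tool that degrades gracefully through locator strategies so the user still gets something clickable. This is a toy sketch of that idea, not 21 Labs' actual implementation; the element dictionaries and strategy names are made up for illustration:

```python
def pick_locator(element):
    """Choose the most stable available way to find this element again."""
    if element.get("resource_id"):
        return ("id", element["resource_id"])    # best: a developer-assigned ID
    if element.get("text"):
        return ("text", element["text"])         # next best: visible text
    if element.get("class") and "index" in element:
        # weaker: "the 3rd android.widget.Button on this screen"
        return ("class_index", (element["class"], element["index"]))
    # last resort: the element's on-screen bounds from the rendered frame
    return ("bounds", element["bounds"])

# A button from an HTML5-wrapped app: no resource ID, no text.
login_button = {"class": "android.widget.Button", "index": 2,
                "bounds": (40, 880, 680, 960)}
print(pick_locator(login_button))  # -> ('class_index', ('android.widget.Button', 2))
```

The point of the ladder is that a test keeps a usable handle on the element even when the "good" attributes are missing, which is what lets a visual tool hide the framework differences from the user.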
You don't need to talk to your developers. And what we did is we model the application in a really visual way, where we have screens and we have edges, and essentially the edges are just the connections between screens, to allow us to really make the maintenance of these tests much easier. There is one company that completely changed their application and changed their business model. I think it took about 70 minutes to update their entire test suite — 70 minutes to go and update the tests. Now, if you were an automation engineer, you'd have to go through each script, script by script, and test it and run it and change it. And this is just the beginning of our maintenance vision. Right now in our system, if you want to create or update your tests and you notice something has changed, all you have to do is go to one test, make the changes, and all of your other tests that are affected will have the changes applied. And that reduces your maintenance time substantially. And at the end of the day, what we're aiming for is a system where you don't even have to update one of your tests. Whenever you upload a new build, the system will be able to automatically identify what changes exist between the previous application your tests ran on and the current application, and automatically apply all the changes that were detected to your tests. Now, bearing in mind that such a process can obviously take time, it's not something that the user will be doing — it's something that is going to be done automatically. But I think it's also important, when you're building a QA product — and I see this with all the other products — that feedback to the user is very important, especially for people that don't really understand what's happening underneath.
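The screens-and-edges model Royi sketches is what makes the "fix one test, fix them all" claim plausible: if tests are stored as paths through a shared screen graph rather than as independent scripts, a screen-level fix propagates to every test that traverses that screen. A toy sketch of the idea (not 21 Labs' data model — the screens, tests, and element names are invented):

```python
# Screens hold the element details; tests are just paths of screen names.
screens = {
    "login":    {"elements": {"user", "pass", "submit"}},
    "home":     {"elements": {"search", "profile"}},
    "checkout": {"elements": {"pay"}},
}
tests = {
    "smoke":    ["login", "home"],
    "purchase": ["login", "home", "checkout"],
}

def update_screen(name, new_elements):
    """Apply one fix at the screen level; every test using it is now current."""
    screens[name]["elements"] = set(new_elements)
    return [t for t, path in tests.items() if name in path]  # affected tests

# The login screen changed ("submit" renamed to "sign_in"):
# one edit, and both tests that pass through it are up to date.
affected = update_screen("login", {"user", "pass", "sign_in"})
print(sorted(affected))  # -> ['purchase', 'smoke']
```

Contrast this with script-per-test automation, where the same rename would mean opening and editing every script that touches the login screen — which is the "script by script" slog Royi mentions.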
What I mean is how long it takes to run a script, how long it takes to spin up an emulator in the cloud, for example. And I think creating a platform, a user experience, where you're constantly giving feedback to users that are not familiar with what is going on, is very, very important to creating an experience that allows them to, number one, maximize what they're trying to do, but that is also very engaging. Because you can go to our platform and click on a button; whereas, for example, if you're trying to use a recorder in some other web product, it can take you an hour and a half just to create one script, one test. A user that doesn't know what automation is will probably fall off after the first five, ten minutes. This is something that, again, we haven't perfected yet, but we are constantly working on trying to create that experience where you understand — although you don't have to know scripts, you don't have to care about the software underneath — you still get the feeling of what is going on. A lengthy process, or some process that requires waiting and updating, requires something to essentially relay the message to the user: hey, there's a lot of heavy lifting happening right now, but at the end, you will get something really good. In our platform right now, the process is very seamless. Even just when you run a test, for example — you build a test and you run it — the feedback is immediate, and it takes less than a minute for the test to start running. And you're constantly seeing images streaming in to your web browser from the test that is running. You don't necessarily have to wait for the test to be completed in order to understand what is going on. You can see everything live.
And this is just one part of the process that we're taking in order to create a user experience that matches every other application in the world today, although the underlying technology — happy-path Appium and Selenium testing — is a heavy lifting process.
Jonathon Wright Yeah, no, I think it's a really great idea and concept. Because when you look at the likes of Sauce Labs, right, they've got some really cool functionality, like the fuzzy mode which goes through and tries to discover pages. But then you always get stopped by the login or whatever else it is — it doesn't really know what your app is doing. Right? You kind of go, well, I can upload my own Appium script, I can let it do a simple task, whatever that means. And you don't really feel like you're involved with doing the quality assurance task that you've been assigned. Right? And I think that's what's really nice about what you're talking about here with 21 Labs: you're actually involved with it, you're building something and seeing the results of what you're doing, and you've got that instant feedback, which is really valuable, because I think that's missing from all of these tools. They are cool, they'll do the job, but they'll do a job, like you said, with seven days' worth of time being burned into it, when really you want to get from naught to running very quickly, and then start saying: OK, now I'm fixing the asset when it breaks, I can put it into my pipeline, I've got all this value. And I think that's the key: that value-driven delivery approach of where should I be spending my time, and what value am I going to get out the other side? And I think it is very hard. Mobile testing is probably understated in the industry — just how difficult it is. It's very easy for someone to pick up Postman and do API testing very quickly. It was very easy for someone in the old days with Selenium to build a test. But these days, just hitting a record button and starting to do something has been a challenge, and it's been a kind of: yes, we're nearly there.
But we're not completely there — the vision is not completely done. And of course, yes, spinning up an emulator with your Android Studio or your Xcode is great if you're a developer and you're signing your own certificates and you've got an Apple developer account, and it all seems to kind of come together. But there's still a lot of learning that has to happen. Whereas if you've been assigned a task which is, you know, build some tests, or even execute some tests — run some tests on a device — then you want to be able to get there fast and start seeing value. So for those people who are listening who want to get to know your product, what's the best way for them to get up and running and actually start doing some testing?
Royi Haddad So obviously, go and sign up to the platform. We actually implemented a really interesting onboarding process, because these products are complicated. When you try to do something complicated and give it to somebody that may not have an idea how it works, I think a solid onboarding process is really important. Right now, you can go to 21labs.io and sign up, you'll get an invitation, you log into the platform, and you'll be taken through a tutorial. In that tutorial session, we will have you create a test on a sample project in a matter of less than a minute, and you'll be able to actually go through the process of running that test and seeing the results. And that kind of gives you an idea of how easy it is to create a test and how easy it is to run a test. From there, all you have to do is just upload your application and use the recorder, or just add a bunch of blank screens — again, it's very intuitive — and you'll be able to get your application started in, I would say, less than ten minutes. You can upload a build, create a very basic test with fifteen screens, and run it. We are obviously watching the process and helping the initial users to really understand it, because, again, we're still a fairly new product, and really nailing that onboarding process — where somebody that doesn't understand anything about test automation can come in, sign up, and use it — will take a little bit more time. You know, some of the people that sign up for the platform are VPs of engineering or CTOs or product managers trying to test this tool. And again, you're trying to take something that is super complicated and bring it to a place where a user can sign in and just start using the product. And it's not an autonomous crawler, right?
If this were an autonomous crawler, like Google Firebase and whatnot, you'd just have to upload a file, you'd see your application get stuck on the login screen, and you're done. But because this is a much more robust system, we're trying to bring it to that place, but with a lot more value — and your application doesn't get stuck on the login screen. But again, right now, the best way is to go to our website, 21labs.io, and sign up. You'll get invited to a project and you'll see a sample project. You'll be able to play with the system, to really go through the process of creating a test and running that test. And then from there, you can just continue on your own. Uploading builds is very easy: you can just upload to the platform, whether it's iOS or Android. We actually test your build as you're uploading it — we're testing your build to ensure that it can actually run on an emulator or real device, and if there's a problem with it, we'll obviously let the user know. But that would be the easiest way to start.
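The build validation Royi mentions can start with a very cheap check, because an `.apk` is just a zip archive that must contain an `AndroidManifest.xml`. This sketch does only that surface-level validation — a real upload pipeline would go much further (installing on an emulator, launching the app), and nothing here reflects 21 Labs' actual checks:

```python
import io
import zipfile

def looks_like_valid_apk(data: bytes) -> bool:
    """Return True if the bytes are a zip archive containing AndroidManifest.xml."""
    buf = io.BytesIO(data)
    if not zipfile.is_zipfile(buf):
        return False
    with zipfile.ZipFile(buf) as zf:
        return "AndroidManifest.xml" in zf.namelist()

# Build a fake "APK" in memory to exercise the check.
fake = io.BytesIO()
with zipfile.ZipFile(fake, "w") as zf:
    zf.writestr("AndroidManifest.xml", b"<manifest/>")
print(looks_like_valid_apk(fake.getvalue()))    # -> True
print(looks_like_valid_apk(b"not an archive"))  # -> False
```

Failing fast on a corrupt or mislabeled upload like this is what lets the platform tell the user about a problem before any emulator time is spent.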
Jonathon Wright Sounds really good. I love the idea of a dummy app to get you going and show you how quickly it can be done. And, you know, I keep on learning. I was just lucky because Paul Grossman — the dark arts wizard in the Selenium world — showed me CandyMapper, which I had never come across before, which is surprising in the automation landscape. It's a whole website — I think you're supposed to find candy in a Google Maps landscape — but it changes every day. So the buttons will break, certain text can't be accessed, and they're doing it purposely to give you a chance to try something difficult and see how you'd fix it. And I like the idea of having an app where you can get it up and running, get it building, and then prove that you're familiar with it, and also then get you through what to do with maintenance, what to do with all these other steps. So it sounds really great. And so what is the best way to contact you as well? Is it LinkedIn, Twitter? What's the best way for listeners to reach out?
Royi Haddad Yeah, definitely. So first of all, I'll share my email on the blog itself, but find me on LinkedIn. I don't use Twitter, and I deleted my Facebook account about three years ago. But LinkedIn would be the best way, and I'll share my email.
Jonathon Wright Wonderful. Well, it's been absolutely fantastic having you on the show, and I'm really excited to see where 21Labs.io is going to go, because it sounds like you've got exciting things. You've got some really clever people there doing some really clever stuff. So thanks so much for taking the time out.
Royi Haddad I appreciate it. Thank you so much. All right. Take care for now.