Jonathon chats with Senior Test Consultant Laveena Ramchandani about data science, big data, and finding opportunities to learn from your peers.
- Subscribe To The QA Lead Newsletter to get our latest articles and podcasts
- Connect with Laveena on LinkedIn
Other articles and podcasts:
- About The QA Lead podcast
- Unit Testing: Advantages & Disadvantages
- Help Test COVID Safe Paths MIT Project (with Todd DeCapua from Splunk)
- Transitioning Into QualityOps (with Parveen Khan from Square Marble Technology)
- How QAs Can Work Effectively On Agile Teams (with Tatiana Zinchenko from AVEVA)
- 4 QA Job Descriptions: Tester, Engineer, Manager & Technician
- Automation Testing Pros & Cons (+ Why Manual Still Matters)
- Introduction To The QA Lead (With Ben Aston & Jonathon Wright)
- Developing The Next Generation Of Testers: The DevOps Girls Bootcamp (with Theresa Neate from DevOps Girls)
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Jonathon Wright Hey, welcome to The QALead.com. Today, I'm joined by a fellow speaker, Laveena. She's a senior consultant and she's just started doing data science testing. So it's been a really exciting journey from business intelligence through to machine learning and AI.
Intro In the digital reality, evolution over revolution prevails. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathon Wright's mission is to help you save the future from bad software.
Laveena Ramchandani A little bit about myself. I did a computer science degree and graduated back in 2012, which feels very far away now. But I fell into testing. When you're in your final year, you've got that urge to apply to, like, two thousand graduate jobs, but you don't really know what the job actually involves or how you'll feel in that job. And it was that motivation: everyone's doing it, so I should do it, too.
After applying to so many jobs, you get a graduate role and you're excited about it. And then the first thing you come to know is, oh, we've got a project in testing. And me thinking to myself: testing? The only things I knew were black box and white box; we never actually did anything about it at university. When I got on the job, it wasn't a graduate role anymore. At the last moment they decided to cancel the graduate program and said it was just going to be learning on the job now. So I was like, wow, I just fell into testing. I look back at it now and I feel like it was a very good fall, because it's a good mix for me. It's got a lot of business awareness, and it's got a lot of technical awareness.
There are a lot of areas that you can grow into. If you're interested in something like, say, accessibility, then you can join up with the UX team and work on that side of things. If you're more interested in the automation side, you can learn that from the front-end developers and see how you can do the front-end side of things, or even work with the back-end developers and see how you can get into that kind of back-end testing. So there are a lot of new areas that you can learn, and I don't feel a lot of pressure in this job. I feel happy. I feel very content, and there are a lot of new things that I learn.
So I quite like it. I was going to say "in a nutshell", but I've said quite a lot.
Jonathon Wright That's a fantastic description. It's so, so refreshing to hear that kind of journey, coming out of university and going straight into testing. And what I really love about that story as well is that it was quite specific: you were doing business intelligence. Back in 2012, business intelligence, BI, and then big data were really emerging areas, and you could see the growth happening. Obviously, from your uni days you got these foundations, the technical skills, and I guess also the idea of the business. But have you found, as you've grown within your role and with the different clients you're dealing with, that you've come to understand more about the business side of things and the importance of what that BI, that intelligence, is all about and how it helps businesses?
Laveena Ramchandani Yes. So I think I was very, very confused, if I'm honest with you. At the very start, I just thought, I don't know what I'm doing. What are these reports useful for? Who's going to use them? Then they said there's this particular client, and they're going to use it for certain things to boost their business knowledge and get a better idea of how their business is doing. So then I was like, I see, fair enough. But it was still quite a grey area for me, because it was my first ever role. Then obviously I went to my second job, which was more on the analytics side of things; it was more hands-on rather than pure analysis. And in an agile team I've definitely understood more. In my first role I was in a waterfall team, so it was very difficult to understand what I was doing until the very last minute, when you're testing it.
So I quite enjoy getting better feedback loops. In Agile, there are a lot of planning sessions and grooming sessions, and they share a lot of knowledge with you before they actually build the features. So yeah, you learn quite a bit, and as I said earlier, it's a good balance of business skills and technical skills, so I quite enjoy that. I didn't want to be fully technical only.
So I think that's how I looked at it. But definitely at the very start I was very confused: what am I really doing? Slowly, slowly, I understood how it could be useful to the major company I was working for back then.
Jonathon Wright Oh yeah. And that's the great thing about the kind of exposure you've had with a large organization: it gives you big projects, small projects, every size of project. And it's really interesting what you just said, because as part of The QA Lead we've been looking at setting up a community, and we've been toying with ideas of what that community would look like. I sat down with Paul Gerrard and Giles Lindsay the other day, and we came up with this concept of having one bucket for the tester as well as one for the testing; that's how Paul looks at it. And one of the things we followed up on was the business story pocketbook, which, again, harmonizes both of those kinds of topics. What you talked about is that you come in with a testing context, understanding the importance of testing, like you said, in the back-end, in the waterfall model, and then you move into doing something more agile, which is where the business story came in, with this idea of a lot more upfront, a lot more information, a lot more visibility about what it is that you're building and its importance and business value to the organization. And I think that's fantastic. One of the things we've been saying is that when you start your career, there's a disconnect between what you come in and do in an analyst role and what the leadership team is doing, and above that, what the C-suite, the CEO or CIO, is trying to manage on a day-to-day basis. You understand the different challenges, and you understand there's a goal or a vision for the organization, but linking them to your day-to-day tasks in your small team is really hard. And it's inspiring seeing how you've grown in your role and become a senior consultant.
Part of it is starting to connect the dots between what you're doing on a day-to-day basis and what that means for the business, that you're actually enabling that capability. And I think that's a really fascinating story, because what you've done is harmonize both of those: understanding the importance of the business and understanding the importance of what you're doing in testing. And I think that opens up this kind of leadership role. Leadership can seem a bit of a fuzzy, warm word across the organization, but part of that leadership, really, is understanding the business, bridging the gap between the business and those activities that we do on a day-to-day basis. Like when we started chatting about a release going out: what does that mean? What does that mean for the business? We all understand revenue and all that, but that's not really what the company is about. The company is about providing some kind of product, and that product then touches users and customers who get to enjoy the features you release, and the importance of that is completely different; it might be different to what the business is looking at. Do you find that? And I like what you said about being able to go and talk to the UX team and look at improving the user experience; you can go off and understand the value of doing these other activities. Are you starting to see, as a senior consultant, the wider landscape of testing, not just the micro level, and the value of both?
I know when we first met, we were on a panel discussion about why to automate. And again, it wasn't really about why to automate; it was about the benefit, the value, of doing that activity, whether that be automation, accessibility testing, or another phase or type of testing. So is that something you're starting to join the dots with? Can you start to understand where those roots came from?
Laveena Ramchandani Yes, a hundred percent, because the more you collaborate with other teams, the more you share your knowledge and listen to their knowledge as well, the more you build that relationship. And when it comes to testing, you understand things better as well. So I would definitely suggest: collaborate a lot, pair up a lot, and learn from that, because it gives you a better idea of what's coming up and how you can help approach it and add value to it. At the end of the day, it's not just our responsibility to look at quality; we should involve everyone in our team. Just talking to the wider team, even with the ops people, or sharing some QA knowledge outside your testing team, helps you broaden your mindset and get better value. If there's something you've showcased in your team and you share it with another team, it might be helpful to them; or by going to a meeting in another team with a different QA, maybe they've shared a nugget of information that could be useful to you when testing. So definitely, I think going around to different teams and speaking to them about what you think quality is and where you think the quality of the product is going is vital. And sometimes it feels like QA and testing teams love to take two steps further ahead than they can, and then we get pushed back. I feel like at that point we shouldn't feel demotivated; the priorities are just different at that point. But at least you've put the seed in there. At least someone must have heard about it. So you let the plant grow whenever it grows. It's super important. Sometimes it's so difficult to bring in a new kind of testing, whether it's functional or non-functional, and get it signed off. So try bringing it in a different way: maybe just mention it at a stand-up and make people think about it, or even present it to the team.
It just helps. So definitely, wherever you can add value, speak up; it doesn't have to be only your team members, you can speak to other people as well. So I definitely agree with that: the more we talk, the better knowledge we get.
Jonathon Wright Yeah, I love that, and I love the premise of being a quality ambassador and talking to people outside of your team. Part of these new times is that you don't always have that visibility, right? You don't always have that chance; maybe it's a team-building exercise or you're doing something that falls outside where the rest of the business is involved. How does that communication work? Like you said, it's the idea that then seeds and grows and becomes something wider, this idea of understanding the relationship between what we're doing and what it actually means, and the premise behind that. And I think it's really fascinating, because if you took your message that quality is everyone's responsibility, and you sat down and had a conversation with somebody from the business, whoever they are, part of it is explaining to them some of the fundamentals, like you mentioned, the difference between functional and non-functional. They may not understand them; they may not have the background to. But they may understand the importance of security and security testing, or accessibility, based on the types of users they want to be able to interact with, or the importance of cross-browser, cross-device testing. Part of it is that they can understand the value of that, but sometimes they may or may not have that visibility. And that's what I find really fascinating about your kind of journey: business intelligence was originally about cascading KPIs, things that were of value, which meant something to somebody. Right? So if you had 15 defects, is that good? Is that bad? If you ran a hundred tests, what does that mean: is it good or is it bad? There's no right answer and there's no single result.
If I were a chief financial officer looking at it going, well, we're finding fewer bugs, spending less money, and doing it with less staff, that seems like a really good outcome. But maybe that's the moment for a conversation to say: we're not finding any issues, we're not adding value, we'd like to look at doing more exploratory testing, or we're looking to invest in automation, or we want to expand the reach of what we do. And I think that's a really interesting point, because it's not that that conversation never happens. The numbers kind of tell a story of themselves, along with some of those milestones which are being synthetically generated: product increment X must be released by day Y, and that may be market-driven. But how do all the quadrants come together? This is what we were trying to say: this kind of three amigos aspect, to leave the BI for a second, maybe has to expand. Maybe, like you said, we need business and operations in there; part of it is expanding those amigos so they're not just three people starting from the same group or the same background. Part of it is that we need that rotation, and, like, innovation has to happen there.
Jonathon Wright Part of it is that it needs to be organic. It's not something that can be forced; you can't just put these people together in a big room once a year, once per product increment, or every quarter. It can't be forced; it needs to be organic. And I really like the messaging you've got there: it's about taking the time to plant the right seeds and understand where you can grow quality, and I think that's really valuable for your customers. I know you've done some really exciting projects, one of them being for TfL, the Transport for London project. On that journey, what kind of different challenges did you find, and what worked for you? A transportation landscape must be very different to what you're doing today.
Laveena Ramchandani Yeah, it was very different. I think that's where I learned about automation testing, which was a completely new world for me. I was a bit horrified for a second, because I was like, oh dear, do I have to code? I used to always run away from coding, and I was like, oh dear, it's calling me back. I took it as a challenge, because I figured the more unwilling I was to do it, the more it was going to come to me. So I was like, let's just put my head down and get on with it. For them, automation was super important, so I did start there as a manual tester, but within two or three months we had to completely scrap manual and focus on automation. So it was a very different journey. It was the agile world as well for me; from waterfall to agile, it was a very, very different place. I quite enjoyed it. There was more involvement from the very start of features coming in, and the aspect of adding automation helped me learn a new area of testing. I was working with really interesting testers as well, so there was a lot to learn from my seniors. I learned a new coding language back then, and we were writing BDD-style tests, so they were quite easy to understand as well. It felt very rewarding; I felt proud of myself that I did something for once. But as I said in the panel session as well: does automation always provide value? Do we really need it? And if we do need it, then where should we add it? Maybe in the high-risk areas, for example. The thing is, we can't automate everything. What about the look and feel? What about the psychology side of things, of the user?
So that was an aspect I left behind after a few months of automation. I don't know if they looked into that side of things, but it would have been super interesting for me to find out whether they did, whether they shifted a bit left or stayed on the right. I had a nice experience there; it was something different, with lots and lots to learn from. And that's where I decided that testing is really interesting, that I want to open up and spread wider so that I never feel annoyed or afraid of coding. I'm quite open now, and I actually miss it.
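For readers less familiar with the BDD style Laveena mentions, such tests read as Given/When/Then steps. Here is a minimal sketch in plain Python; the fastest_route function and the station names are purely hypothetical stand-ins, not TfL's actual journey planner:

```python
# Minimal BDD-style test sketch. fastest_route is a toy stand-in
# for the real system under test.

def fastest_route(origin, destination):
    # Hard-coded route table standing in for a real journey planner.
    routes = {
        ("Victoria", "Oxford Circus"): ["Victoria", "Green Park", "Oxford Circus"],
    }
    return routes[(origin, destination)]

def test_fastest_route_between_two_stations():
    # Given two stations on the network
    origin, destination = "Victoria", "Oxford Circus"
    # When the journey planner is asked for the fastest route
    route = fastest_route(origin, destination)
    # Then the route starts and ends at the requested stations
    assert route[0] == origin
    assert route[-1] == destination

test_fastest_route_between_two_stations()
```

In real BDD frameworks such as Cucumber or behave, the Given/When/Then text lives in a separate feature file, which is what makes the tests readable to non-coders, as Laveena describes.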
Jonathon Wright No, I think what's lovely about the TfL story is that you get to see the direct impact, right? You understand what they do: they run the Tube system. And you can directly see, well, I made an impact, whether that's just an API that another traffic management tool can connect to, to say whether the Tube is okay or what's the best route to get from A to B, or something like that. But it's bigger than that, in the sense of exactly what you said: it's about safety as well. Transportation is such an interesting area, and it goes into your modeling side of things. I remember working with TfL, probably around the same time you were there, and we were modeling the underground as a proof of concept, looking at the air ducts and how the air blows its way through. We visualized it so that we could understand the airflow and how it was all connected, and it came into this 3D-model kind of landscape. Again, we were trying to prove a hypothesis: that if we did this, we could predict issues that were going to happen on the line and avoid them. You've got that kind of cause and effect. What we were trying to solve, from the hypothesis point of view, was the goal of implementing said system, and said system is typically an IT-based solution, not a human solution, even though humans are involved in the process. And I find that fascinating, because what you talked about in the test session, which was brilliant, was testing a data science model: your journey of how you started out, what you thought about, and how you applied those models.
It'd be fascinating for the listeners, as it's such a hot topic, to understand how you got involved with that side of things.
Laveena Ramchandani So obviously, I had no idea that I was going to go into a data science project.
I'd just heard that there's data science, AI/ML. I only ever used to think, oh, this is such an interesting area, but I don't know if I'll ever get involved. And the next thing you see, you're in it. Basically, it was really interesting because I've come through different backgrounds, working for a consultancy, then a boutique consultancy, where I managed to work with a bank and then with TfL, two completely different industries. And now this is a completely different industry as well. So I just thought it's a good area.
I think more testers need to be involved here, actually, because when I ask testers around me, they say they've heard about it but aren't directly involved. So shout out to all the testers out there: please get involved. We definitely need more insight here. I found it quite interesting because it was a very math-heavy style of project; well, not the project itself, but there's a lot of math going on due to the algorithms. So, again, I was thinking to myself, will I have to do all of these calculations, and getting a bit worried. But you don't have to be a math pro here, as long as you're pairing up with a data scientist and trying to understand what the algorithm has achieved and how it's providing value to the customer. That was the most important part. So what I tried to do was think from the business side first and then from the technical point of view, and see whether I was testing accurately enough. If I were the user, would I understand this configuration or not? And then there's a lot of data involved here, obviously. So I learned about anonymizing data sets as well, because previously I had worked with golden data sets which were ready-made, so you just had to make sure they flowed. But here we anonymized the data, and I a hundred percent agree with that: keep your client's data secure, and that's the way forward. And yeah, I think data science is super interesting. I would definitely suggest researching a little bit around it and understanding what it's actually doing. Some data science models are there to predict; some are there to give you ideas of what the future could look like, depending on the data you're pushing through the model. So it's super, super important to understand what business value it's providing, and don't just look at the technical side of things. If you've got a front end built on it, also look at it from the user's perspective: when they're using it, how would they use it? Are they going to find it very complex, or easy to use?
So get hands-on with both the technical and the non-technical side here.
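As an aside on the anonymization Laveena describes: one common approach is to replace identifying fields with salted hashes, so records stay joinable while the raw values never reach the model. A minimal sketch, with illustrative field names and a made-up salt:

```python
import hashlib

def anonymize(records, fields, salt):
    """Replace the named fields with truncated salted SHA-256 digests,
    keeping the data joinable without exposing the raw values."""
    out = []
    for rec in records:
        clean = dict(rec)
        for f in fields:
            digest = hashlib.sha256((salt + str(rec[f])).encode()).hexdigest()
            clean[f] = digest[:12]  # short, stable pseudonym
        out.append(clean)
    return out

customers = [{"email": "ana@example.com", "spend": 120.0}]
anon = anonymize(customers, fields=["email"], salt="per-project-secret")
assert anon[0]["email"] != "ana@example.com"  # raw identifier gone
assert anon[0]["spend"] == 120.0              # model features intact
```

Note that deterministic hashing is strictly pseudonymization; depending on the data and the regulations involved, a real project may need stronger techniques such as aggregation or differential privacy.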
Jonathon Wright I think that's great, and like you said, a big shout out for that. It's interesting, because when we talked, part of it was this avoidance of the kind of "data engineer in test" landscape, the avoidance of the same kind of issue we had with "software developer in test". I love data engineering. I always talk about data engineering over data science, in the sense of understanding the data. What we've all been getting to is that data is everything, right? Test data is maybe the last frontier, one of the most important aspects of everything we do in this industry, and understanding the knowledge associated with it is one of the least explored areas. Coming from a TDM company, Oxfords Data, which was math-based, heavily applied math and physics from Oxbridge, you know, guys trying to understand the sheer complexity around how you do synthetic generation, or obfuscation, or one of the many different approaches to get the right type of data quality: it is hugely complex, but I think it's hugely important as well, if not the most important thing. Especially around modeling, which again is difficult, because not all testers are modelers; it's a really difficult challenge. Actually, I said to the guys today, I'm part of the ISO committee for the new standard, part six, on model-based testing. So a shout out to people like Dick Bender, Richard, for all his work on model-based testing, and to the rest of the guys who are going to contribute to the standard. It's such an important part of moving towards this, because we've got this new challenge, right? And I think you pointed it out, with the kind of AI and ML revolution.
And, you know, I think I mentioned I was on a call with Jason and Tariq from Test.ai, and we were having that same kind of conversation, which was exactly what you said: how do you understand what the outcome should be?
Jonathon Wright And this was the question: is it a testable state? And I think this is absolutely fascinating, in the sense that if it can't be explained, it isn't explainable AI, and if it can't be explained, then in theory it can't be tested or assessed, right? And this week I was listening to Wozniak, who did a keynote for Dynatrace Go, and it was really interesting to hear his viewpoint from Apple, being obviously a founder.
And he was saying how we've come to rely on AI. His viewpoint is that he's very much in the pro-AI camp, and he was saying that Apple has a capable internal team that will go through and interview the developers who have created these models and say, can you explain that to me? And if they can't explain how it works to somebody from a non-AI background, then as far as they're concerned, they don't actually see it as a feature or a capability, because it's not explainable. And I think he was coming to the point of saying, well, we've kind of gone too far, like you mentioned, with the ethical side of things, but also: what is the end emotional state you expect from somebody? We've gone too far down this rabbit hole now with what the Internet was. And if you've watched The Social Dilemma on Netflix, building on The Great Hack, and obviously what happened with the election data from 2016 and the AI and machine learning used by Cambridge Analytica, with the ten thousand reference points it had for each one of those voters.
And then of course, politically exposed people: it's the same kind of thing, in the sense that the Internet has gone a bit too far down and now has quite a lot of negative themes. And I know Jason mentioned that what they used to have at Google was the anti-evil team, which actually got decommissioned in the end. And it comes to this kind of conclusion: at what point do you become a whistleblower in your own organization? Which is going the opposite way: at what point is it your responsibility to say, I shouldn't be using production data, or I shouldn't be using this data set to train this algorithm, because it's got a bias in it? Because I've generated it synthetically, and therefore the bias is towards my understanding of the subject matter area, not a true representation of the diversity of what the data should represent. So certainly, yes, there's this ethical transparency and a difficult challenge around bias, which we've had for many years now. But actually, what Wozniak was saying is that the primary objective of something is that it makes things easier for the human, right? It makes it easier for that reaction to be positive. Now, think of the number of times you interact with your virtual personal assistant and it doesn't give you the response you're expecting, and some of those tasks are actually really straightforward. What Wozniak was saying is that before Siri became Siri, those standard things, like "ring my friend" or "what time is it?", were all coded into the phone, right?
So they weren't going off to some natural language processor, understanding the context, and then coming back and saying, okay, now I can tell you the time, with that round trip being three and a half seconds, with this big dramatic pause between each kind of VPA response. It was straight away responsive: I understand you, I understand the phone, I've got your language, I can answer this very quickly. So maybe we should be pushing AI at things that are more complex, and maybe there's an easier way to do the rest through automation or intelligent automation; it isn't necessarily AI for everything. And I think this is where we're suddenly going with the trend: there is a lot of power, but, like big data, like BI when you started your career, the failure of big data was that nobody had the questions they wanted to answer from the data, which was the new oil. They didn't really know what it was they needed to ask. Now we've got data scientists who are discovering anomalies in the data and saying, actually, this could mean X or Y, but we've still not got the right questions. And I think that's maybe the most important statement for this century: when do we know what kind of people should be in quality and in testing who will ask those questions, who will ask the hard questions of what does this mean? Should we be doing this in the first place? Is this the right use of technology? Is this humane? Is this the right thing to be doing, not from a bottom-line revenue-generation perspective, but ethically? I think this certainly means that the QA's and the tester's responsibility has gone from something which is maybe measurable, something that we can understand, to potentially a social responsibility, and what that means to the end customer.
So do you feel that this, and it sounds a bit profound, is what the future of QA and testing is: to ask the hard questions and be responsible for asking why? Why should this work in this particular way, and what's the value to the person who's using it at the end?
Laveena Ramchandani Yes, exactly. I think questions are super important. Ask as many as you want; be the annoying person in the room. It doesn't matter. The more you ask, the more you get. There's no harm in asking, because you won't get if you don't ask. So definitely go for that. And I've seen teams do the five whys: they ask the first why, they get an answer, then they ask a few more whys, and then they get to the bottom line. So definitely, the five whys would be really good. This could be with anything. Say there was a production issue: go through the five whys. Why did it happen? Why didn't we look at this before? What kind of things did we miss? How can we connect the dots and make sure we improve next time? So try to become, what shall I call it, the detective in the team, right? Try to understand what's happening and ask a lot, because again, your questions are adding value. They're making people think. Maybe some people are shy to ask; maybe you're asking other people's questions for them and helping them understand.
So ask a lot. That I would definitely suggest.
Jonathon Wright Now, I think that's brilliant. And again, it resonates with what Tariq was saying. His vision of the final frontier of what testing looks like was that tests shouldn't exist only on the left-hand side, in the traditional testing landscape. They should happen on the right-hand side of the ship, too. I talk a lot about that as well. Exactly as you said: an issue happens in production. Why did it happen in production? How could it have been prevented? Yes, we understand the value of doing retrospectives and such within a sprint when developing products. But what about operational products that just sit there? Part of what Tariq was trying to say is that testing should be built into the living product, so it's testing itself as it makes decisions, as it executes. There's the idea, the concept, of self-healing: the system understands its state, recognises it could be in a position where it's going wrong, so it falls back to a last known good configuration, or falls back to a previous build. It might even do that from an A/B perspective, on a smaller subset of the users experiencing it. It could be related to device, to region, or to a particular type of account that day. But each one of those dimensions is a testable state, which happens in the real world. Right? And this is a really interesting idea, because one of the big areas we've missed is this concept of, yes, we've got synthetic tests running happy-path scenarios to see if APIs are up in the real world, and we've got the landscape of APM tools and the next generation of intelligent operations, or AIOps, tools.
We've got this huge amount of multidimensional transactional data which we could use to do root cause analysis, to pinpoint failure in real time, or even preventively, so the systems don't go down. Because we've got, like you said, a QA inspector, or detective, who is actually going in there and maintaining it from a kind of chaos engineering perspective. Somebody is actually solving it in the real world.
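The self-healing idea described here, falling back to a last known good configuration when a health check fails, can be sketched very roughly in code. This is a minimal illustration only; all the class and function names are hypothetical, not from any specific tool.

```python
# Minimal sketch of "fall back to last known good configuration".
# All names here are hypothetical, for illustration only.

class SelfHealingConfig:
    def __init__(self, initial_config):
        self.current = initial_config
        self.last_known_good = initial_config

    def apply(self, new_config, health_check):
        """Try a new config; roll back if the health check fails."""
        self.current = new_config
        if health_check(self.current):
            # Promote the new config to known-good.
            self.last_known_good = new_config
            return True
        # Self-heal: roll back to the last known good configuration.
        self.current = self.last_known_good
        return False

# A trivial health check: config must declare a timeout under 5 seconds.
def healthy(config):
    return config.get("timeout_s", 99) < 5

cfg = SelfHealingConfig({"timeout_s": 2})
assert cfg.apply({"timeout_s": 3}, healthy) is True    # accepted
assert cfg.apply({"timeout_s": 30}, healthy) is False  # rolled back
assert cfg.current == {"timeout_s": 3}
```

A real system would run the health check continuously against production signals rather than once at deploy time, but the rollback-to-known-good loop is the core of the idea.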
Laveena Ramchandani Yeah. Safety net. Yeah.
Jonathon Wright So I think we've invented, or you've probably invented, this new kind of QA role: a detective in that landscape, doing the forensic data analysis of what's going wrong with issues and systems, and making decisions on how to prevent or even avoid those situations. And I think this is a really interesting landscape, because we put so much pressure on the SDLC as the start and the end of everything, and there's very little focus taken operationally. The software isn't alive, but until it gets a new release, it has to live out its journey of usefulness. There are ways you can help provide a better experience, or even understand the experience that people are having. And TfL is a great example, because you can understand that people are happy they're not stuck waiting for the next tube, which is 40 minutes late, because your system sorted out the correct signals. I don't know how the signals go down so many times; I think that's an excuse for something else. But part of it is that there's a whole ecosystem, and I love the TfL example because I use Citymapper a lot. Part of that is that it's sensitive to its context. It understands the weather's bad, so it suggests a weather-safe route through the system. It knows to say, oh, don't get out here, because there's a good chance this is going to be crowded; exit here, or enter there. All of this is designed to help humans navigate, as basic functionality. And I think that's what Tariq was saying: from a pure perspective, what is it that we're doing for humans? But also, one of the things he said, which I thought was really interesting, was this concept of offline.
And I think, partly, as a connected landscape, we're so reliant on connectivity, when actually apps shouldn't need connectivity. They should still be able to make decisions; they should still be smart in a non-connected state. And I think this is where we should be thinking about states of connectivity and how we deal with testing them, because as soon as you get on the tube, there is no signal. Yes, there's Wi-Fi calling now, but before, there was no signal. The change of state affects the data and all the functionality, or the capability, within apps, and so many apps just don't work if they can't find Internet connectivity. I think we've got to move past this. The solution isn't building another app. It's like the TfL view: it's going to be open, developers can build on top of it, and it will interoperate, or just become part of your experience, whether that's your watch or the new Google glasses coming out. Part of it is that you shouldn't need apps to talk to each other. You should just leverage data, right? The data, the knowledge of systems and solutions, and how they interact with each other.
Laveena Ramchandani Exactly. Yeah, definitely. That sounds really, really interesting when you mention the Wi-Fi.
I forgot to tell you: I tested the Wi-Fi system in the Underground.
It was a really exciting time, because I used to turn on my Wi-Fi at all the tube stations I had to pass through to get to work. Then I'd go back to my computer and check: did it pick up the right tube stations? Could I see where I was sitting in the tube as well? Because it was picking up which carriage you were sitting in, too.
So I think that was super, super interesting and I feel really happy whenever I see the Wi-Fi system.
Jonathon Wright It's amazing. It's very funny, because when we did the sparsity project in Copenhagen, we did something similar with a GPX logger. We were logging all the GPX data, but the only way we could get that was by connecting to the network. And part of that basic understanding of handshaking and connectivity was that actually the worst place in Copenhagen for Wi-Fi was the free Internet from the airport. It's because you got on the train and it was all free, but the connection was slow, and obviously it goes through a lot of tunnels, so people were getting so much packet loss, and it was just such a bad experience. But part of it is expectations. You get on a plane now, you connect to the Wi-Fi, and you automatically assume that because you're ten thousand miles up in the sky, your connectivity is going to be slow, which of course it is. Whereas when you go on a tube, you don't expect any Wi-Fi connection, and therefore you're pleasantly surprised by the concept of Wi-Fi. Right. And it's really interesting, because I did something similar for Gatwick, and some of the other big airports have the same challenge with Wi-Fi. There are so many people connecting to the free Wi-Fi that people go, oh, this Wi-Fi is really slow. But part of it is that they've got a broader spectrum: there are more devices, there's more concurrency, and therefore it's a lot more challenging. They also have to support so many more devices, because there are all these different device and language variations they've got to support, so the service is fairly narrow. And it's such a challenge, because we take things like Wi-Fi for granted; it becomes a basic necessity. And I have tried Wi-Fi calling on the tube, and it has worked. But I do assume that when I'm moving, it isn't going to be able to jump between Wi-Fi locations, so I'm going to lose signal. Right?
But I think we're getting into that reliance on so much connectivity. We need to think about what is important, what the connected state should be, and what is necessary. And I always give the same example: say you want to go and buy some clothes, or tomorrow is Amazon Prime Day. You go down into the tube and you lose connectivity, but you should still be able to continue your order. The app knows the balance in your account. It knows how much stock there was before you went down, so it's got an estimate of whether or not the item is going to be sold out, at the rate it's selling, by the time it can handshake and approve the process on the other end. So why shouldn't you be able to make that purchase? We know that if you load up Amazon and lose connectivity, you get a little picture of the happy dog, and we know your connectivity is gone. Right? But we don't build apps to deal with things like that: predicting the stock, predicting that you can make that simple transaction. And I think this is the next generation, and it's more than just buying stuff. It's about how we live our lives. We're now reliant on Internet connectivity to do remote working. What happens when that goes down? How do you still work? How do you still connect people?
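The offline-first pattern Jonathon describes, where the app keeps accepting actions without connectivity and syncs them once a connection returns, can be sketched as a simple queue. This is an illustrative sketch only; the class and method names are hypothetical, not from any real shopping app.

```python
# Sketch of an "offline-first" action queue: the app keeps working
# without connectivity and flushes queued actions once it's back online.
# All names are hypothetical, for illustration only.

from collections import deque

class OfflineQueue:
    def __init__(self):
        self.pending = deque()  # actions waiting for connectivity
        self.sent = []          # actions delivered to the server

    def submit(self, action, online):
        """Send immediately if online; otherwise queue for later."""
        if online:
            self.sent.append(action)
        else:
            self.pending.append(action)

    def sync(self):
        """Flush queued actions in order once connectivity returns."""
        while self.pending:
            self.sent.append(self.pending.popleft())

q = OfflineQueue()
q.submit("buy:item-123", online=False)  # on the tube, no signal
q.submit("buy:item-456", online=False)
q.sync()                                # connectivity restored
assert q.sent == ["buy:item-123", "buy:item-456"]
```

A production version would also need conflict handling (for example, stock sold out while offline), which is exactly the kind of state a tester would want to probe.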
Laveena Ramchandani I think that is a journey, basically.
Jonathon Wright Absolutely. And that leads on to: well, what does the future look like? What does it look like for a tester who's starting out the same way you did? Obviously you've done lots of different things, but what kind of resources have you found really useful? What kind of places have you gone to get more information, or to find out more about testing, or agile, or any of the methodologies or languages you've been learning, like C#?
Laveena Ramchandani Yeah, I think online courses were quite handy, whether that's on YouTube or Udemy. Obviously you've got Applitools, which I found quite useful for doing some Cypress-style tests. Then the Ministry of Testing has a lot of information and really, really good detail, and there might be people going through similar situations as you, so you can connect and socialise more there. And then there's just your team, actually. It's like a challenge, isn't it? If you don't know something, you panic a little, but look at it positively, because you're going to end up learning something new out of it. So push for it. For example, if I want to be involved in something, I just go and nudge that individual on that team and tell them, I'm interested in this kind of area; if you ever have time, I'd like to shadow you, or just pair up with you and see how I could help. So I did quite a lot of that. And then just reading blogs, even writing blogs. You can get a lot of information out there; it's just about what you're looking for, and whether it's worth learning it on the team or doing a little bit of research off the team.
Jonathon Wright You know, and James Whittaker always says, "Fake it before you make it".
And, you know, I think what he was trying to say is there's a balance, right? There's a balance between going off and doing some basic background work, understanding the absolute basics, and then going up to somebody I've targeted because they're the right person to speak to and saying, I'd love to learn more about this. And I think that bravery, what you're saying there about being brave and just going to somebody and saying, I'd love to learn more about this, is maybe what holds us back. We have this tendency to go off and try to learn it all ourselves. Now, you can do a course, fake it before you make it, learn some absolute fundamentals, and then learn the rest as you go along. But the bravery model of going to somebody and saying, I know a little bit, I know some foundations, I don't want to waste your time, but I really want to understand how data science works, or how you do the software pipeline, or what these different tools do, I think that's a new way of interacting with people, above the scheduled standard meetings you have with the team. You should build in that time. And I saw Teams has got this new feature which does something similar, where it reaches out and says, let's have some one-to-one time with a peer or a mentor, somebody you can schedule some recurring time with, to have that kind of conversation and build on some soft skills as well as personal development. And I think that's really important, and maybe it's something that has been lost during COVID: the bravery aspect of taking a little bit of time, going and asking somebody for a little bit of help, and seeing if they can help you on that journey. And I think that peer aspect is the way things are going to go.
And hopefully there'll be a lot of listeners who see the value of your experiences and the journey that you've taken. So for those people who are listening in, what's the best way to reach out to you? What's the best way to communicate? On LinkedIn, or Twitter, or your blogs? What's the best way to get hold of you?
Laveena Ramchandani On LinkedIn. I'm working on creating my Twitter, because loads of testers have told me I should have Twitter, because there's a big testing community there. So I'll be there soon. But you can reach out to me on LinkedIn and, yeah, we can have coffee catch-ups, whatever I can help you with, I'd be very happy to. And yeah, I just wanted to point out one more thing.
Basically, if you pair with someone, you're sharing your knowledge of what you know and learning from them, and you might even give them some knowledge of what they probably don't know. So it works hand in hand. So, yeah, definitely go for those kinds of pairing sessions.
Jonathon Wright It's like a digital bartering system. You kind of go, well, you can teach me a little bit about how this works, and in exchange I'll give you a little bit of an overview, or some soft skills, some creative thinking, or some of the tools of the day. I think that's a really nice way of doing it, because everybody wants to learn and develop new skills, and as humans we're very good at that interaction. But we also never have enough time. And this echoes the point that you can't spend all your time working; you need that little bit of space for development. Right? And that's more important than ever, because of the pressures of being available 24/7, the always-on connectivity, being able to reach out to people in different time zones who've all got different things going on in their lives. You need to be able to support that, because I think it's so difficult now, and everyone needs that time to be able to speak to people, learn new stuff, and not lose that human touch.
So I think fantastic suggestions and it's been amazing to have you on. And we'll have to get you back.
Laveena Ramchandani Thanks for having me. A great platform to be on.