Test Leadership Congress
June 27, 2019, New York, USA
Karen Holliday - 3 Steps to Success from a VP of Quality Assurance

About speaker

Karen Holliday
VP, Quality and Customer Care at InGenius Software

As a motivated leader with a track record of developing quality products, Karen is passionate about unraveling complexity to solve business problems. Over her 20+ year career she has led small, focused teams as well as large, multi-function organizations spanning Development, QA, Services, and Support. Karen is passionate about building high-performing teams and having fun while doing it! Outside of work, Karen loves to spend time with her family, run, bike, and ski.


About the talk

Are you a quality leader with a quality problem? Every organization struggles with quality at some point in its product life cycle. Knowing what to measure, and how to build a culture of quality with specific and actionable methods, is key.

Join Karen as she goes through the 3 steps you can take to ensure success:

Karen begins by guiding you through your analysis of the issues, including product, customer, and organizational factors. Second, Karen shows you the five key quality differentiators to implement within your development process. Finally, she focuses on how to create your own Customer Quality Metric for your organization.

Come and see how you can transform your organization into the quality team you know it can be!

00:12 Introduction

01:40 Companies we are working with

03:35 How is customer data involved with other departments?

08:55 Why should you connect with other leaders?

09:42 The filtration rate and why it's important

14:18 About the roles and responsibilities in the QA team

16:58 Unit tests. How do we run them?

19:16 The automated tests. Why are they effective?

20:19 About the automation framework we have created

23:00 How to influence your team

26:57 The importance of diversity in the QA team. How do we involve other departments?

30:05 The result data and how to measure improvement


00:04 Well, we might as well get started with our chat. Alright, so my name is Karen Holliday. I come all the way from Ottawa, Canada: an hour and a half on the flight and three hours to get from the airport to here, so I'm not sure how that really worked out. But anyway, I'm here and I'm really excited to be here. I've been in tech for

00:27 20 years, doing a variety of roles. I've done everything from optical planning, QA, test automation, test development, and marketing to customer care and customer support, so I've kind of run the gamut. But my love always comes back to QA and just making processes better. In my personal time I run races, marathons and triathlons, and I like to do anything outside with my family.

00:57 So that's me in a nutshell. Probably the most important thing about me today is that I'm the person standing between you and lunch, so I will try to get you to lunch. Throughout my career I've been asked to come into different companies: we have a legacy codebase, we've got quality problems, we're having trouble articulating what those are.

01:18 Releases are slow, we don't really know what customers are using, we don't really know what's going on with the install base, we don't even know what releases our customers are running. We have a quality problem. I've been asked to do this at Nortel, Avaya, and Halogen, among others, and currently I work at InGenius Software.

01:40 It's probably worth telling you a little bit about InGenius Software, because some of my references are from there. InGenius Software is a computer telephony integrator: we integrate between a CRM like Salesforce or Microsoft CRM and a telephony system like a Cisco or an Avaya or a web-based one. We basically put a single pane of glass on the CRM so that you can control your phone, like a remote control,

02:01 from your CRM, so agents in call centers don't have to jump around. If you call your favorite credit card company or your favorite travel company, which are customers of ours, they give you a really good experience because they know who you are: they get the pop, and they'll do some good manipulation within the CRM. So that's where we play, and obviously it's all software based: JavaScript and TypeScript with a .NET back end.

02:20 Throughout my career it doesn't seem to matter what kind of software it is. I've worked on-prem, I've worked in SaaS companies, and right now we're even going to a serverless model, which is totally new for me: there's no server, and that's even cooler. So how do I test this stuff?

02:42 And how do I make sure that my company knows that quality is actually improving? I think that's always the biggest challenge: I'm doing all these things, so how do I actually show my leadership that yes, I think the product is getting better? Because there are always a variety of factors. So there are three things I like to do when I start at an organization, and they can all go in parallel.

03:02 You don't have to do them one at a time, and they may go at different speeds. The first thing is to review everything. You've got to be data-driven in QA: you really have to know what is driving the business and what's driving the issues that are coming in, so look at that. Then there are some tactical, easy improvements.

03:21 There are some basics, and if we're not doing these five basic things then we're going to have trouble even getting out of the gate, so we have to at least get the table stakes in place. And then the third thing is creating a Customer Quality Metric. That's what it's been called over the years; it started as a science project back at Nortel and Avaya 20 years ago and

03:41 kind of evolved into a real statistical way we can show how our releases are performing in the field, so I'll share with you how I do that. A few years ago a company had a quality problem and couldn't figure out why customers were reporting different things at different times. As we explored it, we found that with SaaS-based companies you'd think they'd all be on the same release, but they're not:

04:04 they were all in different places, running different custom code on different custom databases. As soon as we standardized that, it cut out about 20% of our incoming issues, just by standardizing customers. So there's some real basic stuff you can do. You're lucky if you're running a web app where you can push updates to customers often:

04:22 keep doing that. Keep them all updated, force them to update, keep them all at the same level, because as a QA organization you just can't popcorn around 100 different releases. So let's jump in here. What features are customers using, and what are they actually doing? What are the common configurations? What releases are they running? What's the attrition data? Attrition data is why customers are leaving us.

04:45 You should know why customers are leaving you. Are they leaving because of product issues? You should know that, because it will factor into your exploratory testing especially, and some of the edge cases that you do. I have an example from quite a ways back. We were building a router, just an edge router, in the BellSouth network, and we had nothing but problems: tough, tough meetings with the customer.

05:09 We were doing everything: automating tests, setting up labs, we did everything. And then we realized we needed to get some telemetry, and telemetry is key. If you're running a SaaS business, you should have telemetry on what features customers are using, because it's the 80/20 rule: 80% of your customers are using 20% of your features, and you just need to know what those 20% are and make sure those

05:31 are 1000% solid; the other 80% you can fill in a little less rigorously. In this case we went through and realized they were only using about 10% of our code base, so we just pumped up the testing on that 10%, did another release, and they were extremely happy. We had five-nines reliability, but really the product as a whole probably wasn't that much more advanced; we had just narrowed in on what the customers were looking for.
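
To make the 80/20 idea concrete, here is a minimal sketch in TypeScript of ranking features by usage telemetry; the feature names and counts are invented for illustration, not InGenius data.

```typescript
// Hypothetical telemetry export: feature name -> how many sessions used it.
const usage: Record<string, number> = {
  clickToDial: 9200,
  screenPop: 8700,
  callLogging: 4100,
  warmTransfer: 600,
  reporting: 250,
};

// Rank features by usage and keep the smallest set that covers ~80% of activity.
// Those are the features to make "1000% solid"; the rest get lighter coverage.
function topFeatures(data: Record<string, number>, threshold = 0.8): string[] {
  const total = Object.values(data).reduce((sum, n) => sum + n, 0);
  const ranked = Object.entries(data).sort(([, a], [, b]) => b - a);
  const picked: string[] = [];
  let covered = 0;
  for (const [feature, count] of ranked) {
    picked.push(feature);
    covered += count;
    if (covered / total >= threshold) break;
  }
  return picked;
}

console.log(topFeatures(usage)); // ["clickToDial", "screenPop", "callLogging"]
```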

05:54 That was an interesting lesson from way back. And really the other thing is: what does customer support think? Customer support tends to be very opinionated, and as QA we're the first customers of our product, but customer support is second, because they've got to support whatever we give them. In my current role I actually

06:19 run QA, customer support, professional services, and sales engineering, so whatever I get out of QA, I have to deal with on the back end with the customer. So I definitely like customer support and QA to be really tied together. This is anecdotal and it may not guide you too much, but it really should be something that you do. You've got to be

06:42 the collaborator. QA people have to be the glue that sticks the organization together. If you're not a good collaborator it's going to be tough to lead your QA team, because we're influencers, we're collaborators. We don't often get to make a lot of decisions: we don't decide what we're going to build, product does that; we don't decide how to build it, dev does that; we can decide how to test it.

07:00 We don't get to decide how to support it, customer support does that. So we really have to influence all these entities to make sure that it's supportable, it's operational, it feels good, the UI is good. We can only influence; it's difficult to own. I think that's the hardest part about being a QA leader: having to influence all those around you.

07:21 And then we're really responsible for measuring. I took this example from a business I worked in a few years ago. I said, OK, after every build goes out, let's take a look. The dark red ones are bugs that came in from customers, so those are field bugs against that release, and the red ones are the ones my QA team found while we were doing patches. Each one of these releases might have had a couple of patches in it,

07:48 or maybe ten, who knows. You can see here that we put a UI refresh out and it blew up, because UI is very finicky and there can be problems with it. You can see where we actually had to put some investment in internally to drive those numbers down. So it gives your leadership an idea: as a whole, are you driving in the right direction?

08:11 You're going to have a release that comes out a bit bumpy; it does happen. But you have to be able to show that you are narrowing down where your issues are. The second part of examining everything is to look at the product itself. Sit down with your product management. Ask them where the product is going, ask them what their biggest fears are, ask them

08:37 what customers are telling them, because they tend to get to go see customers more than we do in QA. Ask where we're going, because we've got lab build-outs, maybe scalability concerns, things coming up. If you wait for the dev team to actually be coding it up, it's going to be too late for you to react.

08:55 You really need to have constant touch points with the other leaders. Then, when you're looking at the product, measure performance. Filtration rate is huge. Filtration rate is basically: of the bugs, stories, and issues you get from the field, anything the customer doesn't like, whatever has to filter through to development to be

09:17 fixed by code, that proportion is your filtration rate. I've worked at companies where, worst in class, 15% of issues from the field actually had to go to the development team, and we figured out they had just turned off logging and turned off a bunch of stuff, so the product, even though it was a SaaS-based product, was unsupportable. And I've worked at best-in-class companies where it's 0.2%, so only 2 out of every 1,000 issues we got from the field

09:42 actually had to go to development. That's really what you want: you want that filtration rate to be really low, because you don't want to distract development or your own QA team with having to repro customer issues, get them tested, get them fixed. You've got to drive down that filtration rate so that you can keep your development organization and yourselves focused on moving forward and building new features.
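
As a rough illustration of the filtration-rate arithmetic described above (field issues that needed a code fix divided by all field issues), here is a minimal sketch; the example counts echo the worst-in-class and best-in-class figures from the talk.

```typescript
// Filtration rate: of everything reported from the field, the fraction that
// had to filter through to development for a code fix.
function filtrationRate(fieldIssues: number, issuesNeedingCodeFix: number): number {
  return fieldIssues === 0 ? 0 : issuesNeedingCodeFix / fieldIssues;
}

console.log(filtrationRate(1000, 150)); // 0.15  -> worst in class: 15% reached dev
console.log(filtrationRate(1000, 2));   // 0.002 -> best in class: 2 per 1,000
```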

10:02 Nobody likes to go back and fix their old, junky code. Then, from a product review perspective, you really have to do all the normal QA things too: how much is automated, how much is manual testing, is the pyramid looking more like an hourglass, is something happening that's a bit funny and is that OK, do we have API test capabilities,

10:23 all of those things. When you're going into a new role on a new product, you've got to figure out what they have now and what the quality trends are. Again, it's a lot about interviewing your QA team, because they know. Sometimes they won't raise a bug because they think nobody's going to address it, or nobody thinks it's a big

10:43 deal, or "I just don't want to get in a fight with them again about raising this issue, it's not worth it." But you as the leader can interview them, make notes, and then in your leadership meetings you can push the organization gently to make sure those things get raised and fixed. Some organizations are really good and you can raise everything,

11:02 and if you work in one of those, lucky you. But if not, then you have to figure out a way to influence to get everything addressed that you think should be. So this is a company I worked with a few years ago; actually this is 2017 and 2018. We had some field problems, and the numbers were really quite static, going up a little bit. These are Jira issues, so this is the filtration:

11:31 this is what the filtration rate was, this is what was actually raised with dev. What we realized is that in 2017 we were mostly raising orange, which was bugs. We were mostly raising bugs, and that's fine; you can raise bugs. But we noticed that coming into the end of 2018 we were getting more and more items raised as stories, which are feature requests. So we realized we were actually entering a new market

11:54 with our product, and we had gaps we couldn't fill with the product. This year, in 2019, we're working to address that. But it was becoming hidden, because lots of times these stories get turned into some feature-request backlog that goes off until the end of time. Your product manager probably has a thousand things in their feature-request backlog, but from a customer perspective

12:16 it was a bug, even if from a product perspective it's a story or a feature. If you can highlight those and say it doesn't really matter what you call it, you still have to fix it or address it, I think it keeps the customer view front and center. QA should always be worried about those things coming in from the field that are maybe feature gaps

12:38 or supportability issues we didn't think of, because we didn't know that's where the product was going to go. The third thing in this first area is to review the organization. What does your QA team think their job is? Do they think they're testers, or do they think they're responsible for the quality of the organization and the product? In some organizations

13:00 we can put testers in the tester role, but that's not really what we want. We want people who are collaborators, people who will influence, people who will participate in design reviews and give good feedback. We don't want testers, really; we want quality assurance, or whatever you want to call it, QE, quality engineering.

13:18 We want people to be empowered to have those conversations, and that often requires you to influence your peers to make sure it's known that that's actually what their job is: to influence. This is one example of what the QA role was at Halogen when I worked there. I created this because

13:45 we went into great detail on each of these levels of tester and all those things, but really it was just to give the greater organization a view into what we thought QA should be doing for the organization, that we're not just testing. That's really key. I think the other thing

14:09 you have to be really clear on, and I've heard others mention it since I've been here, is that when you're setting up your own team you have to think about roles and responsibilities. I agree with the speaker this morning that you can't have manual testers who automate on the side, because it will never, ever happen. Even in my team I have

14:32 automation framework developers who don't actually write test cases, they just develop the framework for the automation; then I have automation developers who develop the test cases; and then I have manual testers, and the lead manual tester is actually the product owner for automation, so they get to decide what gets automated. That's the way I set it up, so that

14:51 the things that are costing the manual testers the most, the most repeatable, most boring, "I don't want to execute this every release" tests, get prioritized in more quickly. The automation developers often want to go off and automate some cool area of code, and that's not really helping me from a business perspective. So keep that loop, and make sure your team is set up so that

15:13 people can talk. It can be a really small team; my team right now is only six or seven QA testers, quite small, but they have distinct roles. They can float between them, but for the most part we keep them pretty static. The manual testers are definitely embedded in the scrum teams, and when I have bigger teams, and I still try to do this now, I embed

15:31 the automation developers in the scrum teams too, so that they know what's coming down the pipe: if it's something we can automate, let's get it done. It's really important to me that people know what position they're playing on the soccer field. Stop swarming the ball: you're the automation person, you're the manual tester, let's do it, and we can cross-pollinate at a later time.

15:54 So that's the first area I really dig into whenever I join an organization. The second thing I do: I've put "five key differentiators," but they're really not differentiators, they're pretty basic, simple things that most organizations probably have and may be doing with some degree of success or not. But if you're not doing them, then you're missing the table stakes. So we'll get into those.

16:24 They are code reviews: you may not be writing the code and you may not be in development, but that code should be getting reviewed, and reviewed well. Code coverage: there are some really cool new things coming out with code coverage that I'll talk about in a bit. Automated regression. Defect escape analysis, which is really key. And exploratory testing. All of these things are simple and quick to,

16:49 sorry, quick to implement, but they really give you short-term payoffs very quickly. So if we get into code review: code quality, the structural code, conformance to whatever coding standards are at your company, and the unit tests are generally a dev responsibility; that's where they play. Further up is more where QA picks up,

17:14 system tests, automated tests. But the whole scrum team generally owns everything up to the functional level, and sometimes the scrum team owns all of it; that's fine too. You have to remember that QA is part of that scrum team, so we're part of owning the code quality, and a lot of my QA team does participate in the code reviews. I think the real thing here is that it's easy to go through the motions of code review,

17:45 but it actually has to be what I call a real code review. You don't just throw it to your buddy across the aisle and say, "did you review that?" "Yeah, it's good." That's not really a code review. The other thing we include in code reviews is unit tests. We actually do code review on the unit tests, because right now we have, I think,

18:03 in most of the new areas of code, close to 100% coverage in unit tests, but we can still see breakages in the product because a unit test was basically useless. Sometimes it's implemented by someone who doesn't really understand what they're doing, or hasn't done them before, or is junior, or whatever. So I encourage the QA team to get involved in reviewing the unit tests, and that also helps us know:

18:28 OK, we know this area has 100% unit test coverage and it's pretty good, and some of those are actually getting really close to functional coverage because the breadth of the unit tests was so good, so we can narrow down what we need to do from a full-stack or API test. API tests are pretty straightforward, you can test everything, but for full stack you can narrow it down based on the unit plus the API coverage:

18:52 what do I really need at full stack, because those are expensive, they're less reliable, and you don't want to spend too much time on them. The second thing is code coverage. Code coverage is kind of an old-school thing; we used to use code coverage tools to check our C code and make sure we had good coverage. Now there are some really interesting new tools. One I'm evaluating right now, called SeaLights, actually does full stack:

19:16 it'll look at what you tested manually, what you tested with full-stack automation, API and unit, and come up with how much of your code you're actually covering, which I find really interesting. You can find the hot spots that you are or aren't covering. This kind of went away as a thing, but it's coming back, and I think we should be aware of it in QA:

19:37 we should know what our code coverage is. For our new product, obviously, we're targeting close to 100%; it's difficult to get to 100%, and it doesn't even always make sense. But it's pretty important that you know where you're covered, and then you can marry that up with the analysis you did at the beginning of which features your customers are using, and that's where you can target your investment.
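
As a sketch of what "marrying up" coverage sources might look like (she mentions SeaLights as a tool that does this for real), here is a toy merge of covered lines reported by different kinds of test runs; the module names and line sets are invented.

```typescript
// Covered lines per module, as reported by one kind of test run.
type Coverage = Map<string, Set<number>>;

// Union the coverage from unit, API, full-stack, and manual runs.
function mergeCoverage(sources: Coverage[]): Coverage {
  const merged: Coverage = new Map();
  for (const source of sources) {
    for (const [module, lines] of source) {
      const bucket = merged.get(module) ?? new Set<number>();
      for (const line of lines) bucket.add(line);
      merged.set(module, bucket);
    }
  }
  return merged;
}

// Toy data: the merged picture shows callControl is well covered, reporting is not.
// Modules customers actually use but with thin merged coverage are the hot spots.
const unit: Coverage = new Map([["callControl", new Set([1, 2, 3])], ["reporting", new Set([1])]]);
const api: Coverage = new Map([["callControl", new Set([3, 4, 5])]]);
console.log(mergeCoverage([unit, api]));
```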

20:04 The third thing is test automation. Just saying "I'm going to write a bunch of scripts and automate" hasn't really worked well for me in the past, so at the last couple of places I've been we've actually created an automation framework ourselves. You can use frameworks that are available, but some areas are very customized. Right now

20:28 the product that I'm testing, from an automation perspective, has to have softphones, so we have six different varieties of softphones, we've got browsers, we've got different CRMs, and if it's not built on a solid framework that has parallelization, retry on failure, and some other really key items in a finite state machine, it just would not run,

20:50 because sometimes the phones go down, sometimes things happen. So in our automation framework, what we do is store everything in TestRail. We reach out to a repo and grab all of our automated tests for today, the ones that are there, so if one got put in this morning we grab that one too. It automatically gets populated into TestRail. We use the TestRail API, which is really cool, and we say,

21:13 "give me a test that hasn't run yet." It gives me a test and says run this one, so VM number one takes it. Jenkins is our scheduler, so it's scheduling us. We'll run test number one on VM number one, or wherever it ends up landing, and at the same time I've got however many VMs I have going that day. We run the tests round-robin, so they'll just use a random VM, so that

21:38 if we have a VM that tends to go down, or the softphone went south, because softphones tend not to be as solid as we'd like them to be, that test will retry up to three times on three different VMs. So we have statistics: we know which environments are performing better than others, which tests are performing better than others, and we can put tests that continually fail into maintenance, things like that.
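
The retry-and-round-robin behaviour she describes could be sketched like this; `runOnVm` is a stand-in for the real plumbing (TestRail queries, Jenkins scheduling, softphone and CRM setup), so treat it as an assumption, not her framework's actual API.

```typescript
interface TestCase { id: number; name: string; }
interface Attempt { vm: string; passed: boolean; }

// Stand-in: in the real framework this would drive the softphone/browser/CRM
// on that VM and report the result back to TestRail.
async function runOnVm(test: TestCase, vm: string): Promise<boolean> {
  return Math.random() > 0.1; // pretend ~90% of attempts pass
}

// Retry a failing test up to maxAttempts times, each attempt on a different,
// randomly chosen VM, and keep per-VM results for flakiness statistics.
async function runWithRetries(test: TestCase, vmPool: string[], maxAttempts = 3): Promise<Attempt[]> {
  const vms = [...vmPool].sort(() => Math.random() - 0.5); // shuffle the pool
  const attempts: Attempt[] = [];
  for (const vm of vms.slice(0, maxAttempts)) {
    const passed = await runOnVm(test, vm);
    attempts.push({ vm, passed });
    if (passed) break; // only retry on failure
  }
  return attempts;
}

runWithRetries({ id: 1, name: "inbound call screen pop" }, ["vm1", "vm2", "vm3", "vm4"])
  .then(attempts => console.log(attempts));
```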

22:02 It's a good way to automate your tests in a really reliable way, so that your QA folks aren't spending their whole day reviewing failed tests. That's the death of automation: if they come in and they've got 500 tests that didn't work last night, that's what they're going to do all week, and they're not going to develop anything new. So you really have to have a

22:25 framework that's going to pass. We strive for 100%; we're at about 98.9%, somewhere in there, depending on the day. We don't want to review more than 5 or 10 failures, and if a test flakes out more than three days in a row, I put it in maintenance and the teams get to look at it. That's how we manage those.

22:47 The other thing you should do with your automation is make sure it's abstracted. Wrap everything so that you can switch out your browser, switch out your tools at any time; we have everything wrapped. And I think I have a sample of the build testing stages. You should be influencing your team and setting this up so that,

23:13 once the software is committed, you have static code analysis, unit tests, database and service tests, which are kind of like the API tests, and then you have a smoke test. As a QA organization, if you're doing a beta, that's great, but you shouldn't push to anything real until after the smoke is good: you make sure that the thing actually installs and is alive on your web page. We used to do that here,

23:39 and then we had a problem where Chrome broke, so now we do it here. Sometimes, if we're feeling really confident, we'll push the beta right away. But this is the build that we take into automation.
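
The ordering she describes (static analysis, unit, service and API tests, then smoke, and only then any beta push) could be gated with something as simple as this sketch; the stage implementations are placeholders.

```typescript
type Stage = { name: string; run: () => Promise<boolean> };

// Run stages in order and stop the line at the first failure:
// nothing gets pushed anywhere real until smoke is green.
async function runPipeline(stages: Stage[]): Promise<boolean> {
  for (const stage of stages) {
    const ok = await stage.run();
    console.log(`${stage.name}: ${ok ? "passed" : "FAILED"}`);
    if (!ok) return false;
  }
  return true;
}

const stages: Stage[] = [
  { name: "static code analysis", run: async () => true },
  { name: "unit tests", run: async () => true },
  { name: "database / service / API tests", run: async () => true },
  { name: "smoke: installs and is alive on the page", run: async () => true },
];

runPipeline(stages).then(ok => ok && console.log("safe to push the beta"));
```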

23:52 We target about 70% full-stack automation of our manual backlog, just because of the industry I'm in right now; I'd love it to be 90%, but because it's telephony, a lot of these complex telephony platforms don't lend themselves well to automation. Which ones haven't I covered? Load and capacity, sorry, yes. That's where we load up Salesforce with 500,000 users, or 100,000 users, and see if our licensing system will license them all or unlicense them all, things like that. And then we do a nightly run; we try to keep it under eight hours.

24:33 The tests are mostly self-contained; well, they're all self-contained. We target the two-minute timeline as well, but some of them often end up being four or five minutes, and that's just a factor of the way those run. We do new feature tests and operational tests. Before we started this framework it was a four-to-six-week regression cycle, if you can believe it, at the end of every

24:55 major release, and we're down to one to two weeks now; it's been about 18 months. It's slower than you think, and when you're implementing a new automation system, I think trying to get a return within a year is a good time frame. If your management thinks, "oh, I hired an automation person and three months from now everything's going to be fine and I'm going to release faster," it's not going to happen.

25:17 So I think setting those expectations matters: creating a new automation framework, or even bringing in an off-the-shelf one and integrating it into your product, is not a three-month endeavor. It always takes six months to get off the ground, and then once it's off the ground you actually have to automate some tests. In terms of ROI, your leadership should expect to get some ROI within a year, but

25:40 setting those expectations up front is a really good idea, so that you're not under the gun after three months: "how come my world is not better yet?" It takes time. That's my big learning there; I definitely felt that pain. I think this is number four: defect escape analysis. Every time you have a customer issue coming from the field and it makes it to you and your development team,

26:03 you should analyze it. I use a TRAP method, but it doesn't matter what kind of root cause analysis you do. TRAP is Trigger, Recovery, Architecture/Analysis, and Prevention. You can basically ask your QA team, for each one of those bugs that got fixed, to work with the developer who fixed it and come up with

26:22 why this happened, and, the really key thing, recommended process improvements. The real question is process: did you have to add a test case? Did you have to take one away? If your defect escape analysis always ends up being "oh yeah, we had a test for that," you're not doing a good job, because you probably didn't modify the test, or you didn't prioritize those tests properly.

26:43 There's usually a lot more at play, and once you get a few months' worth of data you can start to see where you have gaps, so going through this process is really helpful for driving quality. I have another slide as well, just for you to take home, with a little bit more information.
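
One way to make the TRAP analysis add up over a few months is to capture each escape as structured data; the field names below are my own guesses at a recording format, not a prescribed template.

```typescript
// One record per field issue that escaped to development.
interface DefectEscape {
  issueId: string;
  trigger: string;        // T: what exposed it (customer action, config, load, ...)
  recovery: string;       // R: how the customer or support recovered
  analysis: string;       // A: architecture/analysis of the root cause
  prevention: string;     // P: recommended process improvement
  testChanged: boolean;   // the key question: did a test have to be added or modified?
}

// After a few months of records, see where the prevention actions cluster:
// that is where the process gaps are.
function gapSummary(records: DefectEscape[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of records) {
    counts.set(r.prevention, (counts.get(r.prevention) ?? 0) + 1);
  }
  return counts;
}
```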

26:57 I think this is the last area in this section: exploratory testing. I'm sure you all do it; everybody loves exploratory testing. We've made it into kind of an event at my current job at InGenius, so we actually plan a day, or sometimes two days. We bring in customer support, we bring in solution developers, we bring in QA, which obviously leads it, and sometimes sales or sales engineering; we bring in a cross-cut of the company.

27:25 We give people personas: you're an agent at a travel company and you have to take inbound calls, you have to make outbound calls; you're level 1 support, you're having trouble and escalating to level 2, all of these things. We give them all personas and then we sit the cross-functional group in a room together

27:48 with a plan and tasks, a set subset of which browsers they're going to use, so we get coverage across the board. We actually have them sit down in a room together for a day and bash at it and pretend they're customers. In our last few regression cycles we've raised 40% of our bugs in those two days, versus the whole week or two that QA was testing.

28:13 It's a really great way to kick off your regression cycle. We still do exploratory testing throughout, I'm not saying we don't, but making it an event is really fun for the company too. It gives the company an idea of what QA is doing, and it gives you an idea of what customer care sees, what the customer support team

28:29 sees every day. It also builds their confidence in us: "wow, these folks know what they're doing, they know their testing." It just builds a lot of collaboration, so you end up not having that us-and-them thing between the support team and yourselves. And then it's really important, and we do this once a month in my teams: retro it. Retro everything. If there's one tenet

28:55 I love of Agile software development, it's retros, because that's where you get a chance to learn, and as a leader you get a chance to collect the actions and really drive improvement. I remember one action was that we should have had ice cream on this one, and maybe we could do better next time. We set people up with time slots, then we book

29:15 rooms; we make it a really official day. This has been one of the most fun things we've implemented at InGenius, and we're really happy with it. We're going to do one in just a couple of weeks. So, the last part of my presentation: now I've done a bunch of analysis, I've got my QA team situated, they know what their job is, everyone else in the company knows what we're trying to accomplish,

29:37 and I know where my product's going. I've implemented some pretty simple but key things to improve quality; the biggest non-simple thing is that I'm automating the regression, and it's solid. So now how do I go back to my CEO and tell her, yes, I think we're getting better? It can't just be anecdotal, because sometimes it only takes one customer getting her on a call to say, "yeah, this isn't working," and all that effort you spent goes down the tube.

30:05 So you want to make sure that you have the data to prove that you are getting better. Basically, we're going to measure the problems customers give us, along with our filtration rate, and we're going to plot that and see how each release is doing. Often we still document things that aren't problems, that aren't software issues.

30:34 We do add usability: anything the customer reports to you that they think is a defect. It doesn't matter if dev says "working as designed"; if your customer thinks it's not usable, or it doesn't work, or it has a big gap, that will all factor into your quality metric, because that's the overall perceived quality.

30:57 We're not just calling out defects, we're calling out quality and the experience the customer has. These are the three tenets of that: real product defects, customer-found issues that may or may not be defects, and then their happiness. Your company probably runs an NPS survey, or maybe case surveys after support closes a case. We do NPS, and we track it separately in the company

31:27 I'm in now, so I haven't really added that in here. For now I decided to focus on internal, beta, and customer-found defects, and then these are customer defects and non-defects, so usability and whatever else. If I look at some of the releases we had, and actually these are fictionalized numbers,

31:49 I'm not giving you our real numbers, but if you look at, say, the last three releases, these were the defects raised by the field. Those are the total number of cases that went into the customer support team; this is how many Jiras went to the dev team, so this was my filtration rate. And then the real key is how many issues were raised per customer, because as you try to scale your business,

32:12 if you have 100 customers and you're only getting 0.5 issues per customer, that's fine, you're getting 50 real problems. But if you have 5,000 or 10,000 customers at the same rate, you're going to be overrun and you're never going to get anywhere. That's why driving this number down per customer is actually how you can show you're improving.

32:32 I've done it where you measure the Customer Quality Metric at maybe one month post-release, three months post-release, six months post-release. It's hard, because depending on your business, unless you're SaaS-based and pushing updates out, you won't know how many customers are actually running your release. So it's really dependent. When I'm at a SaaS-based

32:52 business and I have licensing-months on it, then if I know I have a user base of 10,000 and the release has been out for three months, I know I've got 30,000 licensing-months on it, and that's easier to manage. So what your denominator is going to be really depends on what your business is. You have to figure that out for yourself, depending on how much data you have.
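
The per-customer and per-licensing-month arithmetic she walks through reduces to something like this; the release figures are illustrative, not her (already fictionalized) data.

```typescript
interface ReleaseQuality {
  release: string;
  customerReportedIssues: number; // defects plus customer-perceived non-defects
  customers: number;
  licensingMonths: number;        // e.g. 10,000 users live for 3 months = 30,000
}

// Issues per customer: shows whether you can scale the customer base without being overrun.
const perCustomer = (r: ReleaseQuality) => r.customerReportedIssues / r.customers;

// Issues per licensing-month: a denominator that keeps releases of different
// ages and install bases comparable.
const perLicensingMonth = (r: ReleaseQuality) => r.customerReportedIssues / r.licensingMonths;

const r5: ReleaseQuality = { release: "R5", customerReportedIssues: 50, customers: 100, licensingMonths: 30000 };
console.log(perCustomer(r5));       // 0.5 issues per customer
console.log(perLicensingMonth(r5)); // ~0.0017 issues per licensing-month
```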

33:15 Sometimes it's licensing-months; sometimes it's just, OK, we put it out on this date and we know we have 12% of our customers running on it at month three, so we'll track that. Just try to make it repeatable. So here we'll come up with an example. This is about three years' worth of data, the major releases, with the patches kind of combined in there. You can see that back in releases 2 and 3

33:49 we were struggling. Customer-reported defects are here, and the number of licensing-months is here, so you can see that these are older releases: they've been around longer, so they've got more licensing-months, but their defect rate is really quite high. So we realized we've got to do something, and it really came to a head when this release came back. It had just been installed, brand new,

34:16 with a bad defect rate. We can't maintain that. That's when we decided, OK, we'd better start some automation tests, we'd better do better exploratory testing, we've got to do some defect escape analysis and figure out what's happening. Then we were able to drive it a little bit better, and then a lot better in release 5. Release 6 is only just out, and we're hovering down here, so we're feeling pretty good about that,

34:41 but it's got a ways to go before it has some more legs on it to be sure. Generally, in the first one to three months after install you will see 80-90% of the real issues that are going to come in; that's why we use this metric. It also helps guide investment: it helps you push for investment, because each one of these is just a dot on a chart,

35:06 but you have all the reasons why that dot is there on the chart, or why it lights up here. So you can actually push for investment in different areas, and it helps you prioritize, because we only have so much, so you have to prioritize where you invest. That is how I've been driving things over the last few years, in terms of measuring whether what I've implemented is actually helping my business.

35:33 And it does resonate in the boardroom. So that's it; that's what I've learned in the last probably ten years of trying to lead QA teams and get them going. I'd be happy to take any questions, or just let you go to lunch.
