Customer Stories

Achieved scalable QA with Autify to deploy 10+ times per day

Company
GoBankingRates Inc
https://www.gobankingrates.com/
Industry
Digital Marketing Media
Publish Date
March 2, 2023

We recently spoke with the QA team at GoBankingRates, a Digital Marketing Media company that serves clients across several verticals, including credit and banking. One of their most significant challenges was finding a simpler test automation tool for their software stack. After trying Autify, the team found it shifted the landscape of their testing capabilities.

GoBankingRates successfully migrated the majority of its test automation from Selenium to Autify. This allowed them to diversify their team: senior developers could focus on higher-priority projects, as they were no longer required to update Selenium scripts. Furthermore, any failure in the Selenium tests made the setup expensive to maintain.

By transforming their QA process, they could restructure their team, reducing bottlenecks by hiring junior testers and even interns, thanks to the ease of use of Autify. The time to train new team members using Autify was far faster than with Selenium. Plus, their team scaled to include offshore developers. This allowed them to expand their coverage hours beyond the typical eight business hours in the U.S. Discover how Autify helped this QA team scale and deploy more than ten times per day!

We interviewed Trincy Thomas, Mitul Gohel, and Anvitha Munagala about their QA challenges and test automation efforts.

– First, would you like to tell us what GoBankingRates does?

Trincy: GoBankingRates is a Digital Marketing Media company. We create products that help generate traffic or conversions for clients. We support multiple verticals; we have credit-vertical-based clients and banking-based clients. We partner with other companies as well to generate these conversions. So, there are multiple applications: GoBankingRates itself, which is a WordPress site, a few web applications to support our credit client base, and a few internal applications. The QA team validates development work for these products.

– What do you do at GoBankingRates?

Trincy: I am the QA Director. We have 4-5 teams where each QA manager is responsible for the QA activities in that particular team. We have two of them here today.

Anvitha: I’m the QA Manager of the Credit Vertical. As Trincy said, we support different domains, so Credit is one of them. I oversee all the QA activities for the various products within this vertical.

Mitul: I’m in the Banking Vertical, and I oversee all the QA activities on the banking side.

– How is your Team Structure and your Development Cycle?

Trincy: Initially, we were one big, flat team with a bunch of Senior QA Engineers, with one member taking the lead role. We then divided our organization into smaller teams based on the verticals. Each team has its own QA, DevOps, and Product Manager as a full set, and they all follow agile ceremonies. We have two-week sprints. Sprints allow us to determine the team’s velocity, and when the sprint begins, we start working story by story. Each user story gets its own environment; we call them runways. We have multiple deployments throughout the day and throughout the sprint. A sprint simply marks that we’ve finished all the tasks within it.

Built a scalable offshore QA team and now deploy more than 10 times daily.

– Even during the sprints, will you make the automatic deployment in that sprint?

Trincy: Correct, and in that sprint, each user story will be deployed separately. A user story is first deployed to a runway, where it is functionally certified, and then we run the regression. If it all works, it gets deployed to a staging environment, and we run regression on that staging environment. If that looks good, we deploy it to a preview environment and run a smoke test there. If that looks good, we promote it to production and run a smoke test on the production environment. And then it’s live. So, some squads can have up to 10 releases per day, and some squads have one every 2-3 days.
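The promotion flow described above (runway, then staging, then preview, then production, each gated by a regression or smoke run) can be sketched as a small orchestration loop. This is an illustrative sketch only, not GoBankingRates’ actual pipeline: the stage names come from the interview, while `promote` and `run_suite` are hypothetical names standing in for whatever deploys the build and triggers the test runs.

```python
# Hypothetical sketch of the environment-promotion flow described above.
# run_suite is a stand-in for whatever actually triggers a test run
# (e.g. a CI job or a test-automation API call).

from typing import Callable, Optional

# Each environment is gated by a test suite that must pass before
# the build is promoted to the next stage.
PIPELINE = [
    ("runway",     "regression"),
    ("staging",    "regression"),
    ("preview",    "smoke"),
    ("production", "smoke"),
]

def promote(build: str,
            run_suite: Callable[[str, str, str], bool]) -> Optional[str]:
    """Deploy `build` through each environment in order.

    Returns the last environment the build reached, or None if it
    never passed the first gate; stops at the first environment
    whose gating suite fails.
    """
    reached = None
    for env, suite in PIPELINE:
        # Deploy to `env`, then run the gating suite against it.
        if not run_suite(build, env, suite):
            break  # failure: do not promote further
        reached = env
    return reached
```

With a gate that always passes, the build reaches production; a failure at any stage halts the promotion at the previous environment.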

– That’s really impressive. In total it may be more than 20 if you combine all squads, since, as you mentioned, one squad may deploy more than 10 times a day, right?

Trincy: Yes. We don’t have a separate Autify setup per environment, and each ticket goes through 3 to 4 environments, which adds to the number of times we utilize Autify.

– That’s very impressive and that is why I would like to dive deep into how you make that possible. How are you guys doing that?

Trincy: When we were on Selenium, we tried to divide this into multiple parallel instances, changing the instance count based on the environment. Let’s say on a runway we would run up to 10 instances; on staging, maybe 20, since staging was used by every department. The production environment has its own domain, and that would still be around 10.
We used to handle each environment differently. Now that we are using Autify, it’s the same number of parallel instances regardless of the environment. So what we did was hire offshore members so we could distribute usage throughout the day. That way we are not always fighting for resources, which is very beneficial. Once we moved to Autify, there were a few peak hours when everybody needed Autify, and that caused a bottleneck of people waiting. This has been resolved by having team members who work in different time zones.

All of our automation is part of the CI/CD.

Each of our applications has its own regression suite, and based on the components, we have separate, smaller test suites. All of our automation is part of our CI/CD. Based on the components, we tag the test plan, and when a deployment carries that tag, it kicks off the relevant regression suite.

Previously, we weren’t able to automate this, but with Autify’s APIs we are now able to determine the success or failure of the regression and, if it succeeds, automatically move to the next environment and kick off the next regression or smoke test.
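The gate between environments boils down to polling a run’s status and promoting only on success. Below is a minimal, hedged sketch of such a poller: `fetch_status` is a hypothetical stand-in for the HTTP call to the test platform’s results endpoint (for the exact Autify API paths and status values, consult Autify’s API reference; the status strings used here are assumptions).

```python
# Hedged sketch: polling a test-run status to gate promotion in CI.
# `fetch_status` stands in for an HTTP call to the test platform's
# results endpoint; the status strings are illustrative assumptions.

import time
from typing import Callable

def wait_for_result(fetch_status: Callable[[], str],
                    timeout_s: float = 3600,
                    poll_interval_s: float = 30) -> bool:
    """Poll until the run finishes; return True only if it passed."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = fetch_status()
        if status == "passed":
            return True
        if status in ("failed", "canceled"):
            return False
        time.sleep(poll_interval_s)  # still queued or running
    return False  # timed out: treat as failure, do not promote
```

A CI step would call `wait_for_result` after triggering the suite and promote to the next environment only when it returns `True`, which is the “no human saying go ahead” behavior the team describes.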

– What kind of people are on your team? How many of them are on the QA team?

Trincy: This varies per application, product, or team. Some teams support public-facing applications. All of our teams have to support a few domains, and each of those applications has frontend and backend components, so they require more QAs. There, the Dev-to-QA ratio is almost 3 to 2 or 2 to 1. For the data team, we have a 10-to-1 ratio; they all perform backend work, so it doesn’t require as high a QA ratio.

When it comes to the front end (like UI-based applications) it requires cross-browser and cross-device testing. Therefore, the Dev: QA ratio is different for each product. I would say we have approximately 20 QAs right now.

– Are you not doing any manual tests?

Trincy: We do manual tests. We spend a lot of time making sure the user stories are very clear so that a QA Team member can manually verify the acceptance criteria. While they are manually certifying, they are also building automated scripts for that user story. So, each user story is certified with new/updated automated test scripts.
When the code is deployed to the next environment, the changes in the test suite are merged to the regression test suite. So, it always has full test coverage.

With Autify, QA is no longer a bottleneck

– I’d like to ask you about the challenges you had before introducing Autify into your operation. What were those challenges?

Trincy: Our execution time for regression tests became a bottleneck to our development process. It took around 5 hours to complete regression.

To address this, we started adding parallel instances and were able to bring it down to one hour. However, if anything failed, we had to re-run the regression, which took an additional hour. Every minute seemed expensive when we had everything else working, and we wanted to resolve it. However, since we had started writing complex functions in Selenium, we required a highly talented person to update them quickly. The expectation was that, regardless of the QA assignee on a ticket, any QA should be able to update any test case quickly. That didn’t always happen, and we could not sustain hiring the best people because that is expensive.

We decided to change our whole structure. We had Leads or Managers, and then we started to hire Mid-level to Junior QAs. We replaced the complexity of Selenium test scripts with the support of AI by Autify.

We did evaluate other products before we agreed on Autify. Our goal was to reduce complexity so we could hire people at the Junior to Mid level. We also needed to run our current regression reliably. In the past, when we were using Selenium, we were also using spot instances, and sometimes mid-run we would lose instances. Sometimes an instance was lost before it generated a report, and we would have to rerun the entire thing. That was a lot of wasted effort just to get reliable test runs. Therefore, reducing some of our complexity was our initial goal.

Autify helped reduce our complexity. It is very easy to record steps and create any number of test cases as needed. We now follow a set of processes that help ensure we’re not clogging the system with too many test cases while still building the right number of them. As long as we followed those processes, it seemed to work very well. We can rely on self-healing to some extent. We did run into some challenges, but in the end, we saw the same number and kind of test cases running faster in Autify. I don’t know why; I assume the logic is the same, right? Whether we build in Selenium or in Autify, both use a cloud service. Yet we are able to execute faster using Autify.

60-day onboarding became 2 weeks with Autify

– Onboarding new members used to take a long time. How was that resolved?

Trincy: It wasn’t just Selenium. We had Selenium, BDD, and we were using BrowserStack. We were using a lot of separate products to achieve our goal, and when we onboarded somebody, we had to give them account access. The first week was more of an introduction. The second week was training, and it took a while for them to get acclimated and stop making mistakes.

Now, we just have Autify. The onboarding is a light introduction to Autify, and that’s it. They can start with very simple test cases, and very little mentoring is required. We did try with interns, and we had the same level of success with them.

– That’s impressive. How long did it take before the interns were onboarded?

Trincy: Initially it took between 30 to 60 days to be effective. Now it could take 1 to 2 weeks to achieve the same level.

In 3 months all the test cases were migrated to Autify without increasing resources

– How did you introduce Autify into your QA process in the first place? As you mentioned, you were using Selenium.

Trincy: We introduced Autify to validate one of our simplest products. It was our main GoBankingRates website. As soon as we discovered Autify’s potential, we started automating the application.

We started to develop new processes. Previously, we did not have much structure, so we added a more organized solution, and people required some guidance. We developed a plan to migrate every test suite; I recall it took 3 months. By the end of those 3 months, all test cases were migrated to Autify. During that time, we ran both tools. It was a huge task and sounds farfetched, but we achieved it without increasing resources.

– That’s impressive! Are you still using the Selenium scripts, perhaps for the areas that Autify doesn’t really cover?

Trincy: We do. Some of the backend-heavy work couldn’t be migrated to Autify, so we have something called “Package Validation” which still resides in Selenium. Initially, we wanted to run our tests, post the output for the next release to pick up as input, and continue. We found ways to handle it. It is working, but only for those tests that reside in Selenium.

GoBankingRates achieved 85% coverage with Autify.

– What would you say is the coverage ratio? How much is in Autify and how much is in Selenium?

Trincy: I would say 15 to 20 percent in Selenium. So, Autify covers 80 to 85 percent of the test coverage.

– So 80 to 85 percent in Autify. I believe you don’t need to migrate everything to Autify. Yet the fact that you can cover more than 80% with Autify and shorten the onboarding effort… that’s huge!

Trincy: For the most part, it is an independent application. This is important, as our QAs need to understand both technologies, which boils down to skillset. With the number of Senior QAs we have, we can still handle that load. That only leaves our “Package Validation” test scripts in Selenium. The majority of test scripts live in Autify, so we can hire Mid-level and Junior QAs.

– Your senior folks can focus on the complicated, backend-heavy automation, which leaves everyone else to work in Autify and automate the easier paths.

Trincy: Yes.

Autify offers full functional test coverage

– You mentioned you had internal applications such as WordPress. Can you give us some of your most common use cases for Autify?

I would also like to know more about end-to-end testing for clients. Maybe you have automation for conversions on client websites or conversions on your own website. Do you have use cases for E-commerce or Marketing that could help those departments besides deployment?

Trincy: Yes, we do. It has full functional test coverage. For whatever we are configuring on WordPress, there are test cases for that. Based on the configuration, if they are displaying correctly, we validate that. Then we have feeds to our partner’s website, and we validate the content of that feed. These test cases cover end-to-end for the credit and banking verticals.

We have test suites for our internal configuration system, which is a PHP-based application. It has all kinds of complex logical situations. In those cases, we have to inject JavaScript to validate the conditions, because it’s not just about recording the screen. After you configure the campaigns and launch them, they launch on different domains with different UIs, and we have UI-based test suites for that.

They all generate clicks, conversions, and more. Different scenarios generate data related to our “Package Validation.” What we can validate on the UI, we validate there. However, we cannot validate some of the information because it could be network data or it could be on the client’s site; it comes back to our system later. In the end, we validate all of the information that comes in XML or JSON format, extract the information from the package, and validate it. After that, whatever is validated goes into our Data Warehouse. We have scripts that validate that the data we extract from those sources matches what went into our Data Warehouse.

– You mentioned that you don’t have access to your client’s website, so you cannot do an end-to-end test, but do you at least validate a conversion? For example, a classic demo conversion, request of a quote, or signup conversion on the client’s website.

Trincy: For us, conversion means different things in different scenarios. Sometimes a conversion means we send information about a user who left our site to the client’s site; perhaps that counts as one conversion. Other conversions happen when the user lands on the client’s site and signs up. In the end, it’s all data, and it comes in different formats based on where the conversion happens.

All these scenarios must be validated, and based on where the conversion is happening, the validation happens accordingly.

In some cases, we spoof where it is going to land and validate it accordingly. In other cases, it is just firing that conversion event, and we validate our system fires the event. That is where we end our test.

We do receive a file from the client saying these are the conversions or the clicks, or the impressions we received on this day. We validate that content and match it with the data we generate.

With Autify, we were able to quickly expand our QA team internationally to be faster and cover more time zones

– You had a lot of challenges: maintaining scenarios and environments, onboarding new members, expanding your team size, and so on. What has changed from then to now?

Trincy: Let’s start with the cons, because there is only one. Some people love to code; their heart is in coding, and their main goal is getting to the next level. That is a huge motivating factor for many in the QA department, and we take that away with Autify. They don’t need to code as much, if at all. We anticipated losing some people when we moved to Autify.

Eventually, that did happen, though not immediately; it took a while for us to lose some people. Since we knew it would happen, we replaced them with junior members. We tried interns, and that was very successful. We were confident we could hire someone, build their career, and help them learn whatever automation we are doing. That sets them up for a good career path.

We ended up hiring more people. I think we had between 7 to 8 people while we were migrating to Autify. Now we have approximately 20.

We are able to hire more people and even take a little risk by putting less stress on existing skills and more on their capacity to grow. We can quickly train them and get them where we want them to be. We are hiring in Mexico and India, so we can cover different time zones now.

That offered a lot of benefits and expanded the number of hours we can certify tickets. For example, if someone in the US certified a ticket alone, it took 2 days. Now that person can hand over the same ticket to someone offshore, and we can deliver it in less than 24 hours.

It doesn’t matter whether someone offshore or nearshore completed it; either way, we had a turnaround within a day. So, we were able to speed up our process, and hiring became much easier.

– That’s a big change. Can you hire offshore members across all the time zones, up to 24-hour coverage?

Trincy: Yes. Initially, we hired mostly in California. Now, we are hiring throughout the U.S. and all over the globe. The team’s work hours are no longer restricted to PST business hours; instead, we have 24-hour coverage because the team is geographically distributed. It’s very beneficial because now people are deploying at different hours.

Future plans for speeding up delivery

– What would be the next step for you? It doesn’t necessarily have to be QA testing, it could be software development. Any future plans?

Trincy: Definitely, speed is the next thing. We are fast, but the world is going very fast, so we need to deliver even faster.

One thing we noticed was that the human element between environments or between work statuses is the major cause of slow deliveries. You may have a regression run that is ready to be deployed to the next environment, but if no one kicks off the deployment, the regression doesn’t start, and the next step doesn’t move ahead.

We are now trying to automate that; we have wanted this for a while, and that’s why we have been working with Autify’s APIs. With that, we can promote between environments without any human interaction. If it fails, we can go back; if it succeeds, we don’t need a human saying go ahead, and we can move to the next environment because we have full coverage. The results have been reliable, so we can move ahead and deploy without anyone looking at those tickets.

That’s one of the things that allowed us to move from our staging environment to production with the current API availability.

One thing we are still waiting for from Autify is the ability to create test plans dynamically. If we had that, we could automate a lot more than we do right now on our runways. We could create a test plan dynamically, move component-based test cases into it, and then the QA could get a functionally certified ticket. If it doesn’t pass functional testing, it could go back to development so they can fix it and deliver a quality ticket.

We could make improvements there, which is the focus now, to reduce any human interaction between tickets. We know we cannot eliminate that functional testing part, but we think everything else could be automated.

– That’s impressive! I’ll make sure to work on test plan creation by API. I think it is on the list; one of us is taking the API ticket. I’m sure it will be delivered soon.

Anvitha: I just want to say that I’m very excited to see all the features Autify is implementing. Given the speed at which you have implemented them so far, we are very excited to see how fast new ones will be available. That helps us so much, so thank you.

– Great to hear, thank you so much!

Mitul: I don’t have anything else to add. The only thing is that Autify is improving. Every time a new feature comes, we are excited to try it, and that’s good!

– Cool! We are improving based on your feedback; it is precious. Feel free to reach out to me and the team about any feedback you have. We are here to improve your creation process. I’m very excited to help you.

Trincy: Thank you! The JavaScript snippet and shared steps in the middle of a test were very powerful features. They made a huge impact on how we organize. It makes our lives much easier.

– Thank you guys for your time. It was great to meet you. Hopefully, we can keep in touch. If you need anything, I’m also here anytime you need my help.

Trincy: Thank you so much!