Performance testing is a type of non-functional testing that is critical to the success of a mobile or web application. It tests qualities of an app such as speed, stability, and scalability. Without it, users may encounter poor performance and usability. Here, we break down what automated performance testing is, why you should shift it left, and how to do it.
Table of Contents
What Is Automated Performance Testing?
What Are the Types of Performance Testing?
Website Performance Testing
Mobile App Performance Testing
Why Shift Performance Testing Left?
Key Automated Performance Testing Tools
How to Automate Performance Testing
Do Automated Performance Testing With Perfecto & BlazeMeter
Automated performance testing checks the speed, response time, reliability, resource usage, and scalability of software under an expected workload by leveraging automation.
The goal of performance testing is to eliminate performance bottlenecks in the software.
Leveraging automated performance testing can help you check speed, scalability, stability, and reliability faster. That’s why it’s time to shift performance testing left. Here are several types of performance testing:
Load testing measures performance by increasing the load until it reaches a threshold.
Stress testing measures performance and stability when hardware resources are not sufficient. Common types of stress tests include soak testing and spike testing.
Soak testing measures how an app performs over an extended period of time.
Scalability testing measures an app’s performance at handling an increased workload.
Spike testing measures how an app performs with sudden jumps in workload.
Volume testing measures an app’s performance with large amounts of data.
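The ramp-up idea behind load testing can be sketched in a few lines of Python. This is only an illustration of the concept: the target here is a dummy in-process function, and the numbers are made up. It is not a substitute for tools like BlazeMeter.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request() -> float:
    """Stand-in for the system under test; returns observed latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate 10 ms of server-side work
    return time.perf_counter() - start

def load_test(workers: int, requests_per_worker: int) -> dict:
    """Fire workers * requests_per_worker requests concurrently and summarize latency."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(handle_request)
                   for _ in range(workers * requests_per_worker)]
        latencies = [f.result() for f in futures]
    return {
        "requests": len(latencies),
        "avg_s": statistics.mean(latencies),
        "p95_s": sorted(latencies)[int(0.95 * len(latencies)) - 1],
    }

# Step the load up and watch the latency trend: the ramp-up idea behind load testing.
for workers in (1, 5, 25):
    print(workers, load_test(workers, requests_per_worker=4))
```

In a real load test, the worker counts ramp until average or 95th-percentile latency crosses the threshold you set, which is the bottleneck you are looking for.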
Performance tests should mirror how your users actually use the service. Building the right test coverage metrics is key, and your testing strategy should include recommendations on when to test.
You need to know which platforms and operating systems to test, no matter where in the world your users are. This ensures that your coverage is aligned and the user experience is optimized for performance.
One thing that can help with this is referencing a test coverage index.
For website performance testing, you need to test page load time across browsers, refresh rates, varying screen sizes, and resolutions across different browser vendors. All of these factors contribute to the performance of web apps and, when handled well, help deliver top-notch experiences.
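As a minimal illustration of timing page load, the sketch below spins up a throwaway local HTTP server and measures a full fetch using only Python's standard library. Real website performance testing would also account for rendering time, browser differences, and screen sizes, which a simple fetch cannot capture.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Page(BaseHTTPRequestHandler):
    """Serves one trivial HTML page so the timing example is self-contained."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>hello</body></html>")
    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Page)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def page_load_time(url: str) -> float:
    """Time a full fetch of the page (network + transfer, not rendering)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

elapsed = page_load_time(url)
print(f"page loaded in {elapsed * 1000:.1f} ms")
server.shutdown()
```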
Ready to level up your automated performance testing? Get started with BlazeMeter today!
Mobile app performance testing must cover real-world user conditions such as varying network quality (poor 4G, 3G, etc.), apps running in the background, and the latency of sensors like location and camera. These factors need to be tested across devices and OS versions in order to provide the best end-user experience on mobile apps.
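One way to see why network conditions matter is to inject artificial latency into a request path, as in the sketch below. The profile numbers here are invented for illustration; a device cloud like Perfecto applies real network virtualization on actual devices instead.

```python
import random
import time

def with_network_profile(fn, latency_s: float, jitter_s: float):
    """Wrap a call with a simulated network delay (base latency plus random jitter)."""
    def wrapped(*args, **kwargs):
        time.sleep(latency_s + random.uniform(0, jitter_s))
        return fn(*args, **kwargs)
    return wrapped

def fetch_profile():
    """Stand-in for an app's backend call."""
    return {"status": 200}

# Hypothetical latency profiles in seconds (base, jitter) for illustration only.
profiles = {"wifi": (0.01, 0.005), "poor_4g": (0.15, 0.1), "3g": (0.3, 0.2)}

for name, (lat, jit) in profiles.items():
    slow_fetch = with_network_profile(fetch_profile, lat, jit)
    start = time.perf_counter()
    result = slow_fetch()
    print(name, result["status"], round(time.perf_counter() - start, 3))
```

A screen that feels instant on Wi-Fi can take noticeably longer on a 3G profile, which is exactly the kind of regression these tests are meant to surface before users do.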
It’s important to shift testing left and include all types of testing, performance testing included, in a single pipeline. By shifting performance testing left, you’ll identify potential performance bottlenecks early so you can move faster in today's digital reality.
Here’s why this is important.
In Waterfall development, performance testing typically happens just before deployment. But if development and functional testing take longer than expected, there’s less time for performance testing. (And this happens often.) So, you might discover performance problems too late. Performance issues can be tricky to resolve, too. If issues are related to code or architecture, you don’t have time to react.
Moving to Agile and shifting performance testing left helps you resolve these issues earlier in the development cycle, when you actually have time to fix them.
For example, a microservices architecture is made up of small components. One of the advantages is that when you have major loads or peaks, you don’t have to redeploy the entire stack anymore. You can just deploy more nodes.
But this can create performance issues. If your application consumes resources heavily, you’ll keep spinning up more nodes. You may still deliver acceptable response times and user experience, but the cost of the infrastructure increases significantly.
You can use performance testing to simulate this scenario before it happens.
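The cost risk described above can be made concrete with a back-of-the-envelope autoscaling model. All numbers below are hypothetical, chosen only to show how per-node efficiency drives infrastructure cost.

```python
import math

def nodes_needed(rps: float, rps_per_node: float) -> int:
    """Autoscaling sketch: nodes required to serve `rps` requests per second."""
    return math.ceil(rps / rps_per_node)

def hourly_cost(rps: float, rps_per_node: float, node_cost: float) -> float:
    """Infrastructure cost per hour at a given load."""
    return nodes_needed(rps, rps_per_node) * node_cost

# Hypothetical: an efficient service handles 500 rps per node,
# a resource-hungry one only 100 rps per node, at $0.10 per node-hour.
peak_rps = 2000
print("efficient:", hourly_cost(peak_rps, 500, 0.10))  # 4 nodes
print("hungry:   ", hourly_cost(peak_rps, 100, 0.10))  # 20 nodes
```

Response times look the same to users in both cases; only a performance test that watches resource consumption under load reveals the five-fold cost difference.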
To get automated performance testing right, you need the right tools.
Key automated performance testing tools include:
Perfecto for automated testing on real devices and browsers in the cloud.
BlazeMeter for large-scale load and performance testing.
NeoLoad for protocol-level load testing.
Jenkins for triggering tests and reporting in your CI/CD pipeline.
Here’s how to automate performance testing by leveraging Perfecto and NeoLoad. In this example, we use an e-commerce site.
Keep reading for a recap of how to automate performance testing. Or watch the webinar below to see how it’s done. Jump to 48:27 for the demo.
First, you’ll need to retrieve the latest version of the code and test it from your version control system, such as GitHub.
Then you’ll generate the test case and trigger the test over Perfecto. Once the test case passes, you can move on to performance testing.
Now it’s time to go into Jenkins to generate a JUnit report.
You can do this by going into your IDE (here, we use IntelliJ) and then opening the Perfecto app class.
Here, you’ll define the driver:
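As a sketch, the driver definition boils down to a set of capabilities and a remote hub URL. The cloud name, token, and device selector below are placeholders, and the exact capability keys depend on your Perfecto account and client library, so treat this as an outline rather than copy-paste configuration.

```python
# Placeholder values: replace with your own Perfecto cloud name and token.
CLOUD_NAME = "<your-cloud>"       # e.g. the subdomain of your Perfecto cloud
SECURITY_TOKEN = "<your-token>"   # generated in the Perfecto UI

# Illustrative capabilities: platform plus a device selector for the test run.
capabilities = {
    "platformName": "Android",
    "securityToken": SECURITY_TOKEN,
    "model": "Galaxy S.*",        # regex-style device selection (assumption)
}

remote_url = (
    f"https://{CLOUD_NAME}.perfectomobile.com"
    "/nexperience/perfectomobile/wd/hub"
)

# With the selenium package installed, the driver would be created roughly as:
# from selenium import webdriver
# driver = webdriver.Remote(command_executor=remote_url, desired_capabilities=capabilities)
print(remote_url)
```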
You’ll also select which devices to test.
Then, you can go into the Perfecto dashboard to see the live stream of the testing.
Then, you’ll deploy the NeoLoad Load Generator for load testing.
The NeoLoad project also uses Jenkins.
When you go into NeoLoad, you’ll have a test case with two types of users: protocol-level virtual users, which NeoLoad simulates at scale, and real users on actual devices, which Perfecto drives.
This helps you measure user experience. In the NeoLoad dashboard, you’ll see the tests that have been executed so far, and you can compare the results.
Next, you’ll run a load test in NeoLoad with the Perfecto integration to test the user experience.
NeoLoad reports back on the protocol level; Perfecto reports back on the user experience.
Automated performance testing is easier with Perfecto and BlazeMeter.
With both of these testing platforms under the Perforce umbrella, Perfecto and BlazeMeter work together to offer fully comprehensive performance testing. For example, BlazeMeter performance testing can be used in conjunction with your mobile tests for UX testing.
Using both tools together, you can simulate realistic mobile traffic patterns to test both your mobile user experience and your backend under load in the cloud. Plus, you can scale up to two million virtual users.
See for yourself how you can leverage Perfecto and BlazeMeter for automated performance testing today.
DevOps Chief Evangelist & Sr. Director at Perforce Software, Perfecto
Eran Kinsbruner is a person overflowing with ideas and inspiration; beyond that, he makes them happen. He is a best-selling author, continuous-testing and DevOps thought leader, patent-holding inventor (test exclusion automated mechanisms for mobile J2ME testing), international speaker, and blogger.
With a background of over 20 years of experience in development and testing, Eran empowers clients to create products that their customers love, igniting real results for their companies.