Software testing involves working on all kinds of projects, each with its own set of requirements.
It’s amazing how each project that I’ve worked on over the years has been different.
It’s honestly not a cliché when they say “no two software testing projects are the same”.
In this article, I’m going to give you an overview of NFRs, followed by a checklist covering the various parts of your organisation.
What are non-functional requirements?
Non-functional requirements (NFRs) are the qualities or attributes that describe the performance, usability, security and other characteristics of a software system, rather than its specific functions.
NFRs can be defined as system-wide characteristics that impact the user experience.
In other words, non-functional requirements are all the “behind the scenes” qualities that you don’t test “functionally” but that are incredibly important.
Why is it important to test non-functional requirements?
Having a functional system is great. However, if the platform and infrastructure aren’t tested thoroughly, you could run into serious issues further down the line.
Below are a number of reasons why it’s important to test NFRs.
- User satisfaction: NFRs such as usability, stability and reliability all have an impact on the end user’s satisfaction with the software system. If the application doesn’t meet these NFRs, it could lead to negative reviews and feedback.
- Compliance: A number of industries, such as banking, have very strict regulatory requirements. To be compliant, they may need to adhere to data privacy and security NFRs to avoid fines and penalties.
- Cost savings: Identifying any major issues with NFRs early on in the software development life cycle can save time and money. Just imagine the cost of re-architecting a live system because performance or security was only considered after go-live.
- System stability: A system that can handle heavy traffic during peak times without crashing is one of the many reasons why having a reliable and available system is important. Can you imagine the impact of a website crash on launch day?
Which teams are involved in non-functional requirements?
Depending on the size of your organisation, multiple teams are typically involved in non-functional requirement testing.
In smaller organisations, it might be a single team or a handful that have accountability.
Below is an example of the teams that may be involved.
Business Analysts
BAs will work with business stakeholders to capture business, functional and non-functional requirements in specification documents.
Architecture Team
They will define the NFRs around the overall architecture of the system. In my experience, this team liaises with the various other teams to come up with the list of NFRs.
Development Team
The dev team will be primarily responsible for ensuring the system is built in compliance with the non-functional requirements.
Testing Team
Responsible for proving the non-functional requirements from a testing perspective. Depending on its capability, the test team may conduct all the testing itself or coordinate testing carried out by other stakeholders, whether in-house teams or external vendors.
Operations / End User Team
Responsible for deploying and running the system. As the end users, their User Acceptance Testing will be paramount in proving the system is available and reliable.
IT Security Team
Responsible for ensuring the system meets all security requirements.
User Experience Team
Responsible for ensuring the system is user friendly and meets all usability requirements.
Business Stakeholders
They work with the BAs to define and prioritise the requirements, and are ultimately the ones who sign off that the delivered system meets them.
Overall, non-functional requirements are a shared responsibility across multiple teams in an organisation, as they impact the overall success of the software system.
The test team will generally be responsible for facilitating the testing between teams and vendors.
Real-Life Examples of Non-Functional Tests
It’s all well and good giving you lots of info, but I find it even more helpful when you have some examples.
NFR Type | Description of NFR | Example of Test
--- | --- | ---
Performance testing | Assesses how a software system performs under various conditions, such as high user loads, heavy data volume, and peak traffic. The goal of performance testing is to ensure that the software system performs optimally and meets the specified performance requirements. | Measuring the response time of a website or application when a large number of users are simultaneously accessing it. |
Security testing | Assesses the software system’s security features to ensure that it is protected against unauthorised access, data breaches, and cyber-attacks. | Performing penetration testing to simulate attacks on the system to identify vulnerabilities that can be exploited by hackers. |
Usability testing | Assesses how easy and intuitive the software system is to use for the end-users. | Conducting user surveys or user interviews to collect feedback on the software system’s user interface, navigation, and other usability aspects. |
Compatibility testing | Assesses how well the software system performs across different hardware, operating systems, and web browsers. | Testing a website or application on multiple browsers, such as Google Chrome, Mozilla Firefox, and Microsoft Edge, to ensure that it works correctly on all of them. |
Reliability testing | Assesses how well the software system performs under prolonged use and stress. | Running the software system for a prolonged period to ensure that it does not crash, hang, or exhibit other unexpected behaviours. |
Your Non-Functional Requirements Checklist
Below are some examples of non-functional tests you might want to consider for your system. Feel free to print and amend as required; this page will be periodically updated.
This list is not exhaustive and you should always address your own system’s requirements.
Performance Testing Checklist
NFR ID | Test Description (What to Test) | How to Test
--- | --- | ---
PT-001 | Test the software system with different numbers of simultaneous users (load testing) to determine the maximum number of users that the system can handle without degrading performance. | Record the response times for different numbers of users and compare them to determine the system’s ability to handle peak traffic (see the sketch after this checklist). |
PT-002 | Test the software system under high stress by increasing the number of simultaneous users beyond the maximum capacity to determine how the system behaves when overloaded. | Measure the response times and the number of errors or crashes that occur and compare them to the system’s capacity to handle stress. |
PT-003 | Test the software system’s ability to operate continuously for an extended period under heavy load (endurance testing). | Monitor system resources such as memory, CPU usage, and network bandwidth over an extended period to identify any resource leaks or memory leaks that could affect performance. |
PT-004 | Test the software system’s ability to handle large volumes of data (volume testing). | Test the system with data that exceeds the typical range of usage to verify the system’s ability to scale and handle increased data loads without performance degradation. |
PT-005 | Test the software system’s response time under varying user loads to determine if it meets the required performance criteria. | Run tests with small, medium, and large user loads and measure the response time for each test. |
PT-006 | Test the software system’s ability to handle multiple concurrent transactions. | Simulate multiple users performing transactions simultaneously and measure the response time for each transaction to verify that the system is capable of handling concurrent transactions without slowing down. |
PT-007 | Test the software system’s ability to recover from failures. | Introduce failure scenarios (such as database crashes or network outages) and measure the system’s ability to recover and resume normal operation. |
PT-008 | Test the software system’s ability to scale and handle increased user loads. | Increase the user load incrementally and measure the response time to verify that the system can handle increased loads without performance degradation. |
PT-009 | Test the software system’s ability to handle heavy loads for extended periods of time. | Run the system under heavy loads for an extended period (24 hours or more) to verify that the system is capable of handling sustained high loads without degradation. |
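To make checks like PT-001 and PT-005 concrete, here is a minimal load-test sketch in Python. Everything in it is an assumption to be replaced with your own values: the target URL, the user counts and the two-second response-time threshold are placeholders, and for anything beyond a quick smoke test you would normally reach for a dedicated tool such as JMeter, Gatling or k6.

```python
# Minimal load-test sketch for PT-001 / PT-005.
# Assumptions: the target URL, the user counts and the 2-second threshold
# are placeholders -- swap in the values from your own NFRs.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

TARGET_URL = "https://example.com/api/health"  # hypothetical endpoint
RESPONSE_TIME_THRESHOLD_S = 2.0                # assumed NFR threshold

def timed_request(_):
    """Issue one GET request and return its response time in seconds."""
    start = time.perf_counter()
    requests.get(TARGET_URL, timeout=10)
    return time.perf_counter() - start

def run_load(concurrent_users):
    """Fire `concurrent_users` simultaneous requests and summarise the timings."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        timings = sorted(pool.map(timed_request, range(concurrent_users)))
    return {
        "users": concurrent_users,
        "avg_s": statistics.mean(timings),
        "p95_s": timings[max(0, int(len(timings) * 0.95) - 1)],
        "max_s": timings[-1],
    }

if __name__ == "__main__":
    for users in (10, 50, 100):  # small, medium and large loads (PT-005)
        result = run_load(users)
        verdict = "PASS" if result["p95_s"] <= RESPONSE_TIME_THRESHOLD_S else "FAIL"
        print(f"{result['users']:>4} users: avg={result['avg_s']:.3f}s "
              f"p95={result['p95_s']:.3f}s max={result['max_s']:.3f}s -> {verdict}")
```

The useful part is the pattern: step the concurrency up, capture percentile response times rather than just an average, and compare each run against the threshold written into the NFR.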
IT Security Non-Functional Checklist
NFR ID | Test Description
--- | ---
ST001 | Perform a penetration test to determine whether unauthorised access can be gained through known vulnerabilities. |
ST002 | Test for vulnerabilities in the system by performing a vulnerability scan to identify and eliminate weaknesses. |
ST003 | Perform a compliance test to ensure that the system complies with regulatory requirements and industry standards. For example a SOC 2 report. |
ST004 | Check if the system is protected against common types of attacks such as SQL injection, cross-site scripting, and cross-site request forgery. |
ST005 | Test the authentication and authorisation mechanisms in the system to ensure that only authorised users can access sensitive data. |
ST006 | Verify that the data stored in the system is encrypted and can only be accessed by authorised users. |
ST007 | Test the system’s ability to detect and prevent security breaches in real-time by simulating various cyber-attack scenarios. |
ST008 | Verify that the system logs and audits all security-related events and generates alerts when necessary. |
ST009 | Test the system’s ability to handle different types of encryption algorithms, such as symmetric and asymmetric encryption. |
ST010 | Verify that the system has implemented secure coding standards to prevent common vulnerabilities, such as buffer overflows, injection attacks, and race conditions. |
ST011 | Verify that the system has implemented proper access control mechanisms, such as role-based access control (RBAC), to prevent unauthorised access to sensitive data. |
ST012 | Test the system’s ability to handle denial-of-service (DoS) attacks, which aim to overload the system’s resources and make it unavailable. |
ST013 | Verify that the system has implemented proper encryption, hashing, and salting techniques to protect sensitive data, such as passwords and credit card numbers (see the sketch after this checklist). |
ST014 | Perform a social engineering test to assess the system’s vulnerability to attacks that exploit human behaviour, such as phishing and baiting. |
ST015 | Test the system’s ability to recover from security incidents, such as data breaches, and ensure that it has proper backup and restore mechanisms in place. |
ST016 | Verify that the system has implemented proper authentication mechanisms, such as multi-factor authentication, to ensure that only authorised users can access the system. |
ST017 | Test the system’s ability to handle security incidents, such as intrusion attempts and security breaches, and ensure that it has proper alerting and notification mechanisms in place. |
ST018 | Verify that the system has implemented proper data privacy and protection mechanisms, such as data masking and anonymisation, to protect sensitive data. |
ST019 | Test the system’s ability to handle network attacks, such as man-in-the-middle attacks and packet sniffing, and ensure that it has proper encryption and authentication mechanisms in place to secure network communications. |
ST020 | Verify that the system has implemented proper logging and auditing mechanisms to track user activities and system events, and ensure that it complies with regulatory requirements, such as GDPR and HIPAA. |
ST021 | Verify that the system has implemented proper access control mechanisms to restrict user access based on their roles and privileges, and ensure that it has proper authorisation mechanisms in place to enforce security policies. |
ST022 | Test the system’s ability to handle denial-of-service (DoS) attacks, such as flooding and resource exhaustion attacks, and ensure that it has proper mitigation mechanisms in place to prevent service disruptions. |
ST023 | Verify that the system has implemented proper encryption mechanisms to protect data at rest and in transit, and ensure that it complies with industry standards, such as AES for data at rest and TLS for data in transit. |
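As an illustration of ST013, here is a minimal Python sketch of the kind of check a tester might script against a password store: the same password hashed twice should never produce the same stored value, which is a quick signal that salting is in place. The PBKDF2 work factor is an illustrative assumption, not a statement of what your system actually uses.

```python
# Minimal sketch for ST013: confirm passwords are salted and hashed,
# never stored in plain text. The PBKDF2 work factor below is an
# illustrative assumption -- align it with your own security standard.
import hashlib
import hmac
import os

ITERATIONS = 600_000  # assumed PBKDF2 work factor

def hash_password(password: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) using PBKDF2-HMAC-SHA256."""
    salt = salt or os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key

def verify_password(password: str, salt: bytes, expected_key: bytes) -> bool:
    """Re-derive the key with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected_key)

if __name__ == "__main__":
    salt_a, key_a = hash_password("correct horse battery staple")
    salt_b, key_b = hash_password("correct horse battery staple")

    # Identical stored values for the same password would suggest a
    # missing or reused salt -- exactly what ST013 is trying to catch.
    assert key_a != key_b, "salting appears to be missing or reused"
    assert verify_password("correct horse battery staple", salt_a, key_a)
    assert not verify_password("wrong password", salt_a, key_a)
    print("Salted hashing checks passed")
```

In a real engagement the same idea is applied to whatever the system exposes, such as a database column, an export file or an API response, with the tester only ever expecting to see salted hashes, never the raw secret.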
Compatibility Testing Checklist
NFR ID | Test Description | Test Steps
--- | --- | ---
COMPAT-001 | Cross-browser testing to verify compatibility (see the sketch after this checklist) | 1. Install and configure the software system on a test environment. 2. Test the system on the latest version of popular web browsers such as Google Chrome, Mozilla Firefox, Microsoft Edge, and Safari. 3. Verify that the system functions as expected on each web browser as per your test case design. 4. Record any issues or bugs encountered during testing and report them to the development team for resolution. |
COMPAT-002 | Cross-platform testing to verify compatibility | 1. Install and configure the software system on a test environment. 2. Test the system on the latest versions of popular operating systems such as Windows, MacOS, and Linux. 3. Verify that the system functions as expected on each operating system. 4. Verify that the user interface, layout, and features of the software are consistent across all platforms. 5. Record any issues or bugs encountered during testing and report them to the development team for resolution. |
COMPAT-003 | Device compatibility testing to verify compatibility | 1. Install and configure the software system on a test environment. 2. Test the system on the latest versions of popular mobile devices such as iPhone, iPad, and Android smartphones and tablets. 3. Verify that the system functions as expected on each device. 4. Record any issues or bugs encountered during testing and report them to the development team for resolution. |
COMPAT-004 | Internationalisation testing to verify compatibility | 1. Install and configure the software system on a test environment. 2. Test the system with different languages and character sets such as Chinese, Arabic, and Russian. 3. Verify that the system displays the correct characters and formatting for each language. 4. Record any issues or bugs encountered during testing and report them to the development team for resolution. |
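For COMPAT-001, cross-browser checks are usually automated rather than repeated by hand. Below is a minimal sketch using pytest and Selenium; it assumes Selenium 4 (which manages browser drivers itself), that the browsers are installed locally, and a hypothetical URL and page title that you would replace with your own.

```python
# Minimal cross-browser sketch for COMPAT-001 using pytest and Selenium.
# Assumptions: Selenium 4+ (with its built-in driver manager), the browsers
# installed locally, and a hypothetical URL and page title.
import pytest
from selenium import webdriver

APP_URL = "https://example.com/login"  # hypothetical page under test
EXPECTED_TITLE = "Example Login"       # assumed expected page title

@pytest.fixture(params=["chrome", "firefox", "edge"])
def browser(request):
    """Start one browser per parameter and quit it after the test."""
    drivers = {
        "chrome": webdriver.Chrome,
        "firefox": webdriver.Firefox,
        "edge": webdriver.Edge,
    }
    driver = drivers[request.param]()
    yield driver
    driver.quit()

def test_login_page_loads(browser):
    """The same basic check runs once per browser in the fixture params."""
    browser.get(APP_URL)
    assert browser.title == EXPECTED_TITLE
```

Running pytest executes the same check once per browser, and the same fixture idea extends to COMPAT-002 and COMPAT-003 if you point it at a remote grid or device cloud instead of local drivers.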
Reliability Testing Non-Functional Requirements Checklist
NFR ID | Test Type | How to Test
--- | --- | ---
REL0001 | Load Testing | Test the system under various loads (e.g., light, moderate, heavy) to determine its response times, throughput, and resource utilisation. Increase the workload and measure the system’s response time and resource utilisation over a prolonged period. Record performance metrics such as CPU usage, memory usage, and network usage. Analyse the system’s behaviour under different loads and identify any bottlenecks. |
REL0002 | Stress Testing | Push the system beyond its design limits to measure its response times and stability under extreme conditions. Create a load that exceeds the system’s capacity and observe how it handles it. Analyse the system’s behaviour under stress and identify any bottlenecks or failures. |
REL0003 | Endurance Testing | Execute a continuous workload for a prolonged period (e.g., 24 hours) to verify that the system can maintain performance and reliability. Run the system under normal load for an extended period and monitor its performance. Analyse the system’s behaviour over time and identify any degradation that arises (see the sketch after this checklist). |
REL0004 | Recovery Testing | Intentionally disrupt the system (e.g., by simulating a power outage) and measure the time it takes to recover and resume normal operation. Introduce a software fault or error, observe the system’s behaviour, and verify that it can recover and continue to function correctly. |
REL0005 | Failover testing | Simulate a failure of a primary system component and measure the time it takes for the secondary component to take over and resume normal operation. |
REL0006 | Scalability testing | Increase the number of concurrent users, transactions, or data volume to determine the system’s ability to scale up or down without performance degradation. |
REL0007 | Latency testing | Measure the time it takes for the system to respond to a user request and ensure it meets the required response time thresholds. |
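REL0003 is often the easiest of these to script. The sketch below keeps a light, steady load on a hypothetical health endpoint for a fixed soak window and compares the first and second half of the run, since creeping response times or a rising error count are the classic symptoms of a resource leak. The URL, duration and polling interval are all assumptions.

```python
# Minimal endurance-test sketch for REL0003: hold a steady load for a fixed
# window and watch for errors or response-time drift. The URL, duration and
# interval are illustrative assumptions.
import time

import requests

TARGET_URL = "https://example.com/api/health"  # hypothetical endpoint
DURATION_S = 60 * 60                           # assumed 1-hour soak window
INTERVAL_S = 5                                 # assumed pause between requests

def soak():
    """Poll the endpoint until the deadline and report errors and timings."""
    timings, errors = [], 0
    deadline = time.monotonic() + DURATION_S
    while time.monotonic() < deadline:
        start = time.perf_counter()
        try:
            response = requests.get(TARGET_URL, timeout=10)
            response.raise_for_status()
            timings.append(time.perf_counter() - start)
        except requests.RequestException:
            errors += 1
        time.sleep(INTERVAL_S)

    print(f"requests={len(timings) + errors} errors={errors}")
    first, second = timings[: len(timings) // 2], timings[len(timings) // 2 :]
    if first and second:
        # A noticeably slower second half hints at a leak or gradual degradation.
        print(f"avg first half={sum(first) / len(first):.3f}s "
              f"avg second half={sum(second) / len(second):.3f}s")

if __name__ == "__main__":
    soak()
```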
Availability Testing Checklist
NFR ID | Test Type | How to Test
--- | --- | ---
AV-001 | Fault tolerance testing | Simulate a fault in the system and ensure that the system can continue to operate without downtime. |
AV-002 | Failover testing | Test the ability of the system to switch to a backup or redundant system in the event of a failure. |
AV-003 | Disaster recovery testing | Simulate a disaster scenario such as power outage, hardware failure, or natural disaster, and ensure that the system can recover and resume normal operation as quickly as possible. |
AV-004 | High availability testing | Test the system’s ability to remain available and responsive under high load or traffic conditions. |
AV-005 | Redundancy testing | Test the redundancy of critical components such as servers, networks, or databases, to ensure that a failure in one component does not cause a system outage. |
AV-006 | Scalability testing | Test the system’s ability to scale up or down based on demand, ensuring that it can maintain uptime and performance levels during times of high traffic or usage. |
AV-007 | Capacity testing | Test the system’s capacity to handle a large number of users or transactions, ensuring that it does not become unavailable or experience performance degradation. |
AV-008 | Patch testing | Test the system’s ability to handle updates or patches to the software or underlying infrastructure without causing downtime or loss of data. |
AV-009 | Configuration testing | Test the system’s ability to handle changes to its configuration, such as adding or removing servers or changing network settings, without causing downtime or loss of data. |
AV-010 | Security testing | Test the system’s ability to maintain availability in the face of security threats such as DDoS attacks, malware infections, or unauthorised access attempts. |
AV-011 | Data backup and recovery testing | Test the system’s ability to backup and recover data in the event of data loss or corruption, ensuring that data remains available and recoverable. |
AV-012 | Network testing | Test the system’s ability to handle network issues such as packet loss, latency, or bandwidth limitations without causing downtime or loss of data. |
AV-013 | System maintenance testing | Test the system’s ability to remain available and responsive during routine maintenance tasks such as software updates or hardware repairs. |
AV-014 | Load balancing testing | Test the system’s ability to distribute traffic across multiple servers or resources to ensure high availability and performance. |
AV-015 | Service-level agreement (SLA) testing | Test the system’s ability to meet or exceed agreed-upon SLAs for availability and response time, ensuring that it meets customer expectations (see the sketch below). |
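Finally, AV-015 largely comes down to arithmetic: measured uptime against the agreed target. The sketch below polls a hypothetical health endpoint at a fixed interval and reports availability as a percentage; the URL, sample count, interval and the 99.9% target are all assumptions, and a production SLA would normally be measured from your monitoring platform rather than a script.

```python
# Minimal sketch for AV-015: probe a health endpoint at a fixed interval and
# compare measured availability against an SLA target. The URL, interval,
# sample count and 99.9% target are illustrative assumptions.
import time

import requests

HEALTH_URL = "https://example.com/health"  # hypothetical health endpoint
SLA_TARGET_PCT = 99.9                      # assumed availability target
CHECKS = 720                               # e.g. one hour at 5-second intervals
INTERVAL_S = 5

def measure_availability() -> float:
    """Count successful probes and return availability as a percentage."""
    successes = 0
    for _ in range(CHECKS):
        try:
            if requests.get(HEALTH_URL, timeout=5).status_code == 200:
                successes += 1
        except requests.RequestException:
            pass  # unreachable counts as downtime
        time.sleep(INTERVAL_S)
    return 100.0 * successes / CHECKS

if __name__ == "__main__":
    availability = measure_availability()
    verdict = "meets" if availability >= SLA_TARGET_PCT else "breaches"
    print(f"Measured availability {availability:.2f}% {verdict} "
          f"the {SLA_TARGET_PCT}% target")
```

As a sense check, a 99.9% target allows roughly 43 minutes of downtime per month, so even a handful of failed probes in a one-hour window is worth investigating.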
In Summary…
There you have it.
Hopefully the information in this article has given you a good starting point when it comes to non-functional requirements.
I’m hoping that, in addition to the background information, you’re now able to gauge which types of NFR tests to include in your test plan.