

Performance Testing - Microsoft Dynamics 365
Finance and Operations
(Sales order creation web service)
by deploying BlazeMeter and JMeter tools
Andrii Chernenko1 and Shahid Ali2
1 Department of Information Technology, AGI Institute, Auckland, New Zealand
18491003@student.agi.ac.nz
2 Department of Information Technology, AGI Institute, Auckland, New Zealand
Shahid@agi.ac.nz

ABSTRACT
Microsoft Dynamics 365 is a cloud-based product that supports various users with different production
capacities. To ensure the reliable and fast operation of Microsoft Dynamics 365, and thereby maintain the
product's leading position in this segment of the software industry, conducting performance tests is a vital
measure. Performance testing is an important and necessary test process to ensure that the product does not
suffer from performance problems such as long load times, poor response times and poor scalability. In
addition, these testing activities help to identify bottlenecks in the hardware-software platform of the product
that can lead to significant performance delays and even to the failure or crash of the resource as a whole.
Apache JMeter and BlazeMeter are sufficient automation tools to accelerate and carry out the performance-testing
process at this stage. Based on the results of the various performance tests, software and hardware engineers
can develop and implement a set of measures to improve the performance of both individual modules and the
resource as a whole. After the implementation of fixes and updates, as well as after scheduled and restoration
service work in the hardware-software environment of the product, a regression test must be conducted to
validate the performance of Microsoft Dynamics 365.

KEYWORDS
Microsoft Dynamics 365, Performance Testing (Smoke, Scalability, Load, Volume, Stress, Spike), Automation
Testing Tools (Apache JMeter, BlazeMeter)
1 Executive Summary
This report intends to give an overview of the work done for the performance testing of Microsoft
Dynamics 365 for Finance and Operations order creation web service.
Based on requirements for the performance testing of Microsoft Dynamics 365 for Finance and
Operations order creation web service, the following work was carried out:
• Materials were collected and reviewed on the work performed in the project and related areas
for further use in the implementation of the project.
• The functional features of the Microsoft Dynamics 365 system, the order creation web service
function and the data exchange protocol were studied.
• In accordance with the Agile approach, a project implementation schedule was developed,
and a communication plan was drawn up.
• Based on the standards and best practices of software testing, the following were identified:
collection metrics, test deliverables, Pass/Fail criteria, suspension and resumption criteria and
the sign-off process as well as a defect report and the tracking process.
• Based on the analysis of the system and the test environment, the risks were identified, and
the test environment and project structure were described.
• A market analysis of performance testing tools was also conducted, and BlazeMeter and
JMeter tools for the implementation of the project were selected.
• Based on the business requirements and the Microsoft Dynamics 365 for Finance and
Operations order creation web service analysis, a Traceability Matrix was described, and
manual and automation test cases were prepared.
• Using the selected tools, scripts were written for automated performance testing, and both
manual and automated testing was performed. In the process of preparing the automated tests,
it was found that the JSON protocol must be used for testing.
• The collected results were systematised, analysed and discussed with subsequent findings
and recommendations.
• The test results revealed a pattern: the speed of writing a new sales order decreases as the
number of lines increases. A restriction was also identified that makes it impossible to create
an order containing 150 or more lines, errors occurred while writing orders containing 70
lines or more, and the service occasionally failed when overloaded. No system crashes were
detected.
• It is recommended that attention be given to the detected issues, with further testing if
necessary. An action plan to further monitor and improve system performance is also
recommended.
2 Introduction
IASI specialises in software testing and works as a Microsoft subcontractor. The aim of the project
is performance testing of Microsoft Dynamics 365, intended to accelerate and facilitate
decision-making based on data analysis and recommendations. Microsoft Dynamics 365 is one of the
products of Microsoft Corporation. According to the company's annual
report, in the fiscal year 2018 Microsoft Corporation delivered $110.4 billion in revenue and $35.1
billion in operating income [20]. Dynamics 365 represents the next generation of modular business
applications in the cloud. The system combines enterprise resource planning (ERP) with advanced
customer relationship management (CRM) capabilities. CRM functions provide users with the ability
to unify relationship data both inside and outside the company. ERP functionality includes tools of
artificial intelligence (AI), machine learning and mixed reality (a hybrid of reality and virtual reality).
Through automation and optimisation of manual tasks and through the introduction of intelligent
workflows, system users can work more efficiently [21].

2.1 Project Rationale


Microsoft Dynamics 365 is a modular system, its users have an opportunity to begin its integration
into their business with the basic and necessary modules and, if necessary, add new application
modules as their integration deepens and their business grows. System users have the opportunity to
access their data in the system and use it on various types of MS Windows devices, and a mobile
application is available for iOS and Android. Microsoft Dynamics 365 is a comprehensive
cloud solution used in various fields and at different scales of business all over the world. The system
offers: the predictive sales guide, auto fraud protection, virtual sales agents, product visualisation,
holographic guides, and remote assistance in mixed reality. In addition, users gain the same
experience and level of protection as the corporate level, which ensures the protection of the data of
their business and their customers.
Obviously, the system has a very complex and flexible architecture, changes to which can lead to
delays or performance bugs. That is why it is necessary to carry out regular performance test activities
and to conduct evaluative monitoring of the resource’s performance to provide acceptable system
response for its users and, as a result, their satisfaction.

2.2 Scope and Objective


The aim of the project is to test the performance of the Microsoft Dynamics 365 for Finance and
Operations sales order creation web service, namely: creating orders of 10 lines, 20 lines and so on,
as well as sending 100 orders to the service, then 200 orders, and so on.
During the implementation of the project, the following test activities should be carried out:
• Familiarisation with the functionality of the service.
• Test plan preparation.
• Selection of tools for recording and executing test scripts.
• Creation and execution of manual test cases.
• Recording, parameterising and executing automated test scenarios.
• Analysis, drawing conclusions and issuing recommendations.
3 Literature Review
This is a new study that primarily focuses on performance testing of Microsoft Dynamics 365. There
are no available online studies for literature reviewing on this topic. The area of performance testing
Microsoft Dynamics 365 is highly specialised, and all discussions on it are held in professional
forums, for example: “How to run performance test for dynamics 365?” [9] or “Load Testing
Dynamics CRM/365 with LoadRunner” [17], without going beyond practical advice. That is why, for
the implementation of the project, it is necessary to study publications both in Microsoft Dynamics
365 and in the field of performance testing.
Software testing is a part of the project management process. Determining the goals and quality of
testing in the early stages of software development [16] improves and accelerates the process of
project implementation and, as a result, product quality.
The goal of performance testing is to identify the state in which applications unexpectedly show degraded
performance when certain combinations of input values are entered [7]. From the researchers' point of view,
the main goal of performance testing is to select a controlled subset of input data to automate the search
for performance problems in applications [7]. Thanks to their innovation, problems in system
performance were automatically identified and then validated by experienced testers and developers.
In a comprehensive study of 110 real performance bugs randomly selected from software packages
(Apache, Chrome, GCC, Mozilla and MySQL), performance rules were first identified, and 25 corrections
were then extracted from them and used to identify errors. The study found that minor corrections in the
developer code can significantly improve the performance of a system [12].
Modern research clearly shows that user satisfaction depends on both key performance indicators of
the system and the overall success of implementing an ERP system [3], and makes it possible to
measure the effectiveness and attributes of customer relationship management by introducing a new
CRM performance indexing approach [2].
Over the past decades, active work has been carried out in the field of optimisation, automation and
systematisation of performance testing processes, such as developing mechanisms for automatically
identifying problems in system performance [7]. Unfortunately, it is impossible to cover all
performance topics in one small study. However, it is necessary to explain many factors in the process
of preparing and conducting testing activities that must be considered, such as:
• Conducting data processing, performing batch analysis, and using multithreading in data
processing [26].
• Taking into consideration methods for automatic testing of performance in a dynamic
production environment [14].
• Stimulating the load on the system, generating the load in real time, and measuring
various performance indicators for understanding how various types of users interact
with the system [1].
• Measuring the system performance based on several characteristics or metrics, and then
combining all the characteristics into one metric [24].
• Using performance regression testing to identify problems in system performance [10].
• Evaluating the actual performance of a cloud service by testing reference applications
hosted in the cloud under load conditions [5].
Microsoft Dynamics 365 is the system that provides CRM and ERP solutions that support large and
medium-sized organisations with a complete integrated solution for business management [18]. For
effective system performance testing, it is necessary to study and understand Microsoft Dynamics
365 from both sides: namely, from the user’s side, how to use the system [18], and from the
administrator’s side, how to administer, configure and maintain the system as a cloud service [4].
For testing multi-level and multi-purpose cloud-based modular systems with a contingently unlimited
number of users such as Microsoft Dynamics 365, first of all it is necessary to identify the testing
methods and then properly organise all the test process activities in accordance with the requirements
[8].
To summarise the review of the above materials on the topic "Performance Testing Dynamics 365
for Finance and Operations (Sales order creation web service)", it can be concluded that performance
testing of the system has undoubtedly been, and continues to be, carried out, but unfortunately without
any specific written reports on the topic.
In addition, as the materials given above show, performance testing is a critical part of introducing
a system into a business and is vital for identifying performance bugs in the system and fixing them,
ensuring high performance and reliability and, as a result, satisfaction among the users of the system.
In addition, there is no up-to-date information on the Internet about using JMeter for performance
testing of the Microsoft Dynamics 365 system. However, there are some studies in which JMeter was used
with web applications with an integrated unified authentication platform [15] and for identifying
performance-improving factors for a web application [22]. That is why it will be useful in this study
to use this free tool for automated performance testing of Microsoft Dynamics 365.
Apache JMeter is a tool that allows users to create and run tests to improve the performance of web
pages, applications and databases [6]. Integrating JMeter with Agile and DevOps processes improves
tool utilisation [23].
The modern software development industry requires flexible approaches that quickly adapt to market
conditions. That is why following Agile approaches, such as Scrum, Kanban, XP and so on, is an
important and integral part of preparing and conducting testing processes [19].
4 Project Execution
4.1 Automation Test Phase
This phase identifies the most important parts of the system under test and its critical components
affecting the performance testing process.
Microsoft Dynamics 365 for Finance and Operations is the system under test. In accordance with
the business requirements (BR) of the project, it is necessary to conduct performance testing of the
sales order creation web service, namely: "We will be needed to test orders of 10 lines, 20 lines and
so on, also sending 100 orders to the service, 200 orders and so on".
Table 1 shows Epic and user stories describing user behaviour according to the received task.
Epic and user stories

Epic #: EPIC1. New sales order creation
User stories:
• US1: As a user I want to log in to the system.
• US2: As an authorised user I want to create a new "sales order", add 10 items and save the order,
so that I can verify correct "sales order" creation, adding items and saving the order.
For testing it is necessary to:
• Obtain authorised access to the service.
• Study the structure and function algorithm of the service.
• Prepare data for testing.

Figure 1. Start page of Microsoft dynamics 365


Figure 2. A new sales order creation for Microsoft dynamics 365

Figure 3. New lines and items adding Microsoft dynamics 365

Figures 1, 2 and 3 show the structure of the service, namely: access the page of Microsoft dynamics
365, the page with the menu for creating a new sales order, the page for adding new lines and items,
and saving the order.
The following is the architecture of the testing environment in which the testing process is conducted:
the test execution hardware sits on a local network behind a local router and connects through the
BBA-SPARK global gateway to the ax.dynamics.com cloud service.
Figure 4. Architecture of the testing environment

During preparation of the project it is also necessary to take into account that the service is
unavailable to unauthorised users.

Figure 5. Ping the service

Since the system under test may be in a state of peak loads or even failure during the performance
testing process, the service testing schedule is coordinated and flexibly adapted to meet the company's
production requirements.

4.2 Methodology
The software development lifecycle in the modern IT industry is so fast and flexible now that it is
probably impossible to imagine it without using the Agile approach.
Even though this is a short-term project for educational purposes, the Scrum [19] approach will be
followed during its implementation.
The project implementation is divided into five sprints, covering the following tasks:
Sprints for the Project

• Setting and initial familiarisation with the task
• Studying materials on established topics
• Preparing and writing the Proposal
• Proposals presentation
• The study of the system function block under the test
• Preparation of the Test Execution Environment
• Preparation and writing of Test cases
• Preparation and debugging of Automated Performance Scripts
• Performance Test Execution
• Processing and analysis of test execution results
• Test Closure activity
• Report writing
• Preparation of the finalised presentation
• Finalised presentation

4.3 Communication Plan


The Agile approach will be used during the execution of the project. All tasks, plans, their
implementation, and results will be discussed with responsible people in the process and as training
and production needs arise. Table 3 shows the communication plan that will be used during the
implementation of all the project stages.
Communication Plan

• Introduction. Objective: paper task introduction. Channel: face-to-face. Frequency: once.
Audience: Program leader, Team. Owner: Program leader Dr. Shahid Ali. Deliverable: Agenda.
• Supervisor Meeting, Introduction. Objective: project task introduction. Channel: face-to-face.
Frequency: once. Audience: Industry Supervisor, Team. Owner: Industry Supervisor Ajay Khanna.
Deliverable: Agenda.
• SCRUM Meetings. Objective: project status review. Channel: face-to-face. Frequency: twice a week.
Audience: Industry Supervisor, Team. Owner: Industry Supervisor Ajay Khanna. Deliverable: Issue Log
updating.
• Emergency Meetings. Objective: issue resolution. Channels: face-to-face, email, phone. Frequency:
daily if necessary. Audience: Program leader, Industry Supervisor, Team. Owners: Program leader
Dr. Shahid Ali, Industry Supervisor Ajay Khanna. Deliverables: Meeting Minutes, Issue Log updating.
• Sprint (Project) Review. Objective: review of the completed Sprint (Project). Channel: face-to-face.
Frequency: in accordance with the Education Plan. Audience: Program leader, Team. Owners: Program
leader Dr. Shahid Ali, Industry Supervisor Ajay Khanna. Deliverable: Meeting Minutes.

4.4 Schedule
Conventionally, the project implementation process can be divided into three parts:
• The initialisation part includes familiarisation with the task and the development of proposals
for the project implementation.
• The implementation of the practical part of the project and report-writing.
• Finalisation of the project and discussion of its results.
Schedule for the Project

# Task name Start date End date


The initialisation
1) Setting and initial familiarisation with the task 28.10.2019 29.10.2019
2) Studying materials on established topics 30.10.2019 4.11.2019
3) Preparing and writing the Proposal 4.11.2019 11.11.2019
4) Proposals presentation 12.11.2019 12.11.2019
The implementation
5) The study of the system function block under the test 12.11.2019 13.11.2019
6) Preparation of the Test Execution Environment 13.11.2019 13.11.2019
7) Preparation and writing of Test cases 14.11.2019 14.11.2019
8) Preparation and debugging of Automated Performance 15.11.2019 18.11.2019
Scripts
9) Performance Test Execution 19.11.2019 19.11.2019
10) Processing and analysis of test execution results 20.11.2019 21.11.2019
11) Test Closure activity 21.11.2019 22.11.2019
12) Report writing 12.11.2019 25.11.2019
Finalisation
13) Preparation of the finalised presentation 25.11.2019 28.11.2019
14) Finalised presentation 29.11.2019 29.11.2019

Table 4 presents a detailed schedule for the project. The table shows the three stages of the project
with their expected timeframes, namely: initialisation, implementation and finalisation.

4.5 Gantt Chart


Table 5 presents the Gantt Chart that in a graphic form reflects all the stages of the project
implementation from the setting and initial familiarisation with the task to the finalised presentation
and facilitates the perception of labour costs and time control.
Gantt Chart

The chart spans 28 October to 29 November 2019 and covers the tasks from the setting and initial
familiarisation with the task through to the finalised presentation.
4.6 Collection Metrics
To monitor the test execution process and further analyse its results and issue recommendations, the
following metrics are collected:
Metrics Collection

• Threads activity over test time: Active Threads Over Time Graph; Results Tree
• Response time: Aggregate Graph/Table; Response Time Graph; Results Tree; Graph Results;
Transactions per Second Graph
• Throughput: Bytes Throughput Over Time Graph; Transaction Throughput vs Threads Graph;
Results Tree; Graph Results
• Resource utilisation: CPU Graph; Memory Graph; TCP Graph; Network I/O Graph
• Assertion results: Assertion Results
• Errors: Table

All test results are also collected in the View Results in Table listener.

4.7 Test deliverables


The table below lists the test artefacts, the technical products that will be presented at the end of the
project.
Test deliverables

• Test Plan. Author: Tester. Reviewer: Program leader / Industry Supervisor.
• Test Cases. Author: Tester. Reviewer: Program leader / Industry Supervisor.
• Test results report. Author: Tester. Reviewer: Program leader / Industry Supervisor.
• Test Closure report. Author: Tester. Reviewer: Program leader / Industry Supervisor.
• Conclusion and recommendations. Author: Tester. Reviewer: Program leader / Industry Supervisor.

4.8 Risks
The table below provides an analysis of the risks that may arise during testing.
Risks identification

• Crash of the resource under test. Expectation: Low. Impact: CRITICAL. Action/Mitigation: stop all
testing activities, resource recovery. Assigned to: Supervisor.
• Network failure. Expectation: Low. Impact: CRITICAL. Action/Mitigation: stop all testing activities,
network recovery. Assigned to: Tester.
• Resource overload. Expectation: Normal. Impact: HIGH. Action/Mitigation: reduce test load.
Assigned to: Tester.
• Network overload. Expectation: Low. Impact: HIGH. Action/Mitigation: reduce test load.
Assigned to: Tester.
• Overload of test execution software/hardware. Expectation: High. Impact: LOW. Action/Mitigation:
upgrade test execution software/hardware. Assigned to: Tester.
• Failure of test execution software/hardware. Expectation: Normal. Impact: NORMAL. Action/Mitigation:
recovery of test execution software/hardware. Assigned to: Tester, Supervisor.
• Crash of test execution software/hardware. Expectation: Low. Impact: HIGH. Action/Mitigation:
recovery/change of test execution software/hardware. Assigned to: Test Analyst.

4.9 Pass/Fail criteria

4.9.1 Pass criteria


• No critical defects exist after 100% of test execution
• All requirements are covered in the testing process.

4.9.2 Fail criteria


• The system crashes after error or failure
• Existence of at least one critical defect
• Requirement traceability matrix does not have 100% coverage.
4.10 Suspension and resumption criteria

4.10.1 Suspension criteria


• Lack of test data
• Environment is not set up properly

4.10.2 Resumption criteria


• Fully fledged test data is ready
• Environment is ready for testing
• After getting approval from industry supervisor.

4.11 Sign-off process


• All the artefacts should be approved
• All the performance testing scripts should be committed to version control
• Test supervisor has to check all the documents for the acceptance criteria.
4.12 Defect report and tracking

When the tester finds a defect, it is raised with status New. If the defect has already been raised it
is marked Duplicate; if it is out of scope, a team discussion marks it Deferred or Rejected. A valid,
in-scope defect is assigned to a developer (status Assigned), who starts fixing it (status Open). Once
the developer has fixed the defect (status Fixed), the tester re-tests it: if the test passes, the defect
is Closed; if not, it is Re-Opened and returned to the developer.
Figure 6. Defect reporting and tracking


4.13 Architecture of Automation Test Plan
The project includes measures to test the performance of Microsoft Dynamics 365, a new sales order
creation service, by conducting Load, Volume, Spike and Stress tests.

4.13.1 Testing Environment


The table below lists the components of the test execution environment:
Testing Environment

• Web service: ax.dynamics.com (public IP address: 104.215.95.87)
• Test execution hardware: laptop, Intel i7, RAM 12 GB, HDD 256 GB / 512 GB
(public IP address: ***.**.***.170)
• Test execution operating system: Windows 10 Home
• Network: ***.**.***.170 (BBA-SPARK) <=> 104.215.95.187 (cloud service)
• Execution environment language: Java (jre1.8.0_201 and jdk1.8.0_221 packages)
• Browser: Google Chrome Version 78.0.3904.97 (Official Build) (64-bit)
• Script recording tool: BlazeMeter 4.7.1
• Coding script environment tool: Apache JMeter Version 5.1.1
• Execution script tool: Apache JMeter Version 5.1.1

4.13.2 Structure of the project


Groups of users are used in the project to create a realistic load on the service.
Figure 7 shows this structure: JMeter simulates groups of users who log in, create new sales orders,
add items and save the orders against ax.dynamics.com, and collects the resulting reports.
Figure 7. Structure of the performance test project

4.14 Test automation and tools


To implement the project, a review and analysis of various tools was carried out. The results of the
comparative analysis are presented in the table.
Competitive analysis of the performance test tools
Apache JMeter is most suitable for the project implementation for a number of reasons, namely:

Figure 8. The advantages of JMeter (GDST 707 Report)

The advantages and disadvantages of BlazeMeter should also be discussed.


BlazeMeter is easy to use for recording scripts, and it allows the tester to convert the recorded script
to JMeter format and download the .jmx file for continued use and improvement in Apache JMeter. The free
version of BlazeMeter is convenient for writing a basic script, but it is very limited in its ability to
prepare a test script and run a test. It is possible to prepare the script in JMeter and execute it in
BlazeMeter, but all the information in the script file is then uploaded to the Internet, which is
unacceptable from the point of view of system security. Undoubtedly, the commercial version of this
product is a worthy competitor to JMeter, but one of the main factors in choosing a tool is low cost,
and here JMeter, as described earlier in this section, has no competitors.
Figure 9. BlazeMeter test execution

Accordingly, BlazeMeter will be used to record test scripts and Apache JMeter will be used for
performance testing to implement part of the project related to recording, parameterisation and
execution of automated test scenarios.
Selected Tools

Tool Usage
BlazeMeter Test scripts recording
Apache JMeter with plugins Performance, Load, Volume, Spike and Stress
testing; Metrics tracking

New tools, plugins and methodologies will be implemented if required.

4.15 Traceability Matrix


The Traceability Matrix displays the relationship between requirements, user stories and test cases for
each item and facilitates the defect tracking process.
Traceability Matrix 1

Business requirement B1 is covered by Epic1 and user stories US1 and US2, which are verified by test
cases TC1, TC2, TCA1 and TCA2. For each test case the matrix records the test case description, the
expected result, the actual result and the PASS/FAIL status.

Traceability Matrix 2

Req ID: B1. Requirement description: New sales order creation. Test scenario ID: TS01. Test scenario
description: verify sufficient performance of a new order creation web service. Test cases: TC1/TCA1,
validate login to the system; TC2/TCA2, validate the performance of a new order creation web service.

4.16 Test Cases


During the project implementation, both manual and automation testing of the Microsoft Dynamics
365 system is carried out. Test cases are given below.
Manual test cases

(User story ID; Test case ID; Test case description)
US01. TC01. Enter into the Microsoft Dynamics 365 system with valid parameters.
US02. TC02. Create a new sales order, add 10 items and save the order.

Automation test cases

(User story ID; Test case ID; Test case description)
US01. TCA01. Enter into the Microsoft Dynamics 365 system with valid parameters.
US02. TCA02. Create new sales orders 100, 200, 300 and so on, add 10, 20, 30 and so on items, and
save the orders.
4.17 Screenshot of Scripts
The screenshots below show the body of the project and explain the idea of its architecture, scripting
and request generation.
Thread Group contains the base script library
Transaction controller
Simple controller
Authorisation of http request
Authorisation of parameters configurator
Extractor for Json format
Authorisation of assertion “200”
Simple controller
New sales order creation http request x1 line
Parameterisation generator for WEBORDID
Parameterisation generator for lines in Json
Assertion “200”
Recorder of response body
Recorder of request body
Configurator of order http request parameters
New sales order creation http request x10 lines
Parameterisation generator for WEBORDID
Parameterisation generator for lines in Json
Assertion “200”
Recorder of response body
Recorder of request body
Order http request parameters configurator
Items parameters configurator (Items.csv)

Figure 10. Elements of the base script
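To make the role of the "Extractor for Json format" and the authorisation assertion clearer, the
following is a minimal JSR223 (JavaScript) sketch of how an authorisation token could be pulled from
the login response and kept for later requests. It is illustrative only: the field name access_token
and the variable name AUTH_TOKEN are assumptions, not the names used in the recorded script.

// JSR223 PostProcessor (language: javascript) attached to the authorisation request.
// "prev" is the previous SampleResult and "vars" is the JMeter variables object.
// The field name "access_token" and the variable "AUTH_TOKEN" are illustrative assumptions.
var body = prev.getResponseDataAsString();   // raw JSON returned by the login call
var json = JSON.parse(body);
if (json.access_token) {
    // make the token available to later HTTP requests as ${AUTH_TOKEN}
    vars.put("AUTH_TOKEN", json.access_token);
} else {
    // mirror the "Assertion 200" idea: mark the sample failed if no token came back
    prev.setSuccessful(false);
    prev.setResponseMessage("No access_token field in the authorisation response");
}

In the real test plan this role is played by the standard JSON Extractor and assertion elements shown
in Figure 10; the script form is given only to make the data flow explicit.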


One example of the parameterisation of Thread groups:

Figure 11. Parameterisation of Thread Groups


The overall structure of the project is shown in the figure below.

Figure 12. Structure of the project

Using JavaScript random number generator to parameterise the request:

Figure 13. Random number generator for line multiplier
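A minimal sketch of such a generator, as it could be placed in a JSR223 PreProcessor, is given below;
the variable name LINE_MULTIPLIER and the 1 to 14 range (mirroring the 10 to 140 line scaling used
later) are assumptions for illustration.

// JSR223 PreProcessor (language: javascript) run before the order-creation request.
// Picks a random multiplier so that each iteration writes an order with a different
// number of lines. The variable name and the 1..14 range are assumptions.
var min = 1;
var max = 14;                                            // 14 x 10 = 140 lines maximum
var multiplier = Math.floor(Math.random() * (max - min + 1)) + min;
vars.put("LINE_MULTIPLIER", String(multiplier));         // referenced later as ${LINE_MULTIPLIER}
log.info("Line multiplier for this iteration: " + multiplier);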

Using JavaScript to write response results to a file:

Figure 14. Script for saving the response from service to file
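A hedged sketch of such a script is shown below as a JSR223 PostProcessor; the output file name
responses.txt is an assumption, and the fields recorded in the actual project may differ.

// JSR223 PostProcessor (language: javascript) attached to the order-creation request.
// Appends each service response to a text file for later analysis.
// The file path "responses.txt" is an illustrative assumption.
var FileWriter = Java.type("java.io.FileWriter");
var writer = new FileWriter("responses.txt", true);      // true = append mode
try {
    writer.write(prev.getSampleLabel() + " | "
               + prev.getResponseCode() + " | "
               + prev.getResponseDataAsString() + "\n");
} finally {
    writer.close();                                       // always release the file handle
}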
Using Logic controllers on the example of the implementation of one of the groups:

Figure 15. Module Controller architecture


Parameterisation of the request using the built-in parameters in the JMeter and generated by
JavaScript:

Figure 16. Body of the parameterised request data

Figure 17. Request generation for a new order


Figure 18. Request generator for NxLines
Figure 19. Request generation for 10 lines

Figure 20. Request generation for 20 lines


5 Results
5.1 Manual test cases execution
Manual testing was completed and no defects were found during execution. The test results are presented
in the table below:
Manual test cases execution

TC01. Enter into the Microsoft Dynamics 365 system with valid parameters. Expected result: the user
should be able to see the application start page. Actual result: as expected. Status: PASS.
TC02. Create a new sales order, add 10 items and save the order. Expected result: the user should be
able to create a new sales order, add 10 items and save the order. Actual result: as expected.
Status: PASS.

5.2 Automation test cases execution


In the automation test cases execution phase the following test activities were performed: 1) a Smoke
test to verify the workability of the basic performance test script; 2) a Smoke test of the scalability
of the system response, to determine how the system responds as the order size grows; 3) a Load test to
determine the system performance limit; 4) a Volume test to check the stable operation of the system
under load; 5) a Spike and Stress test to check stress resistance and the system response to a sharp
increase in load.

5.2.1 Smoke test


To test the functionality of the base test script and the correct operation of metrics-collecting plug-
ins, Smoke test was performed.

Figure 21. Smoke test. Result Tree

Figure 22. Smoke test. Active Threads Over Time


Figure 23. Smoke test. Aggregate Graph

Figure 24. Smoke test. PerfMon Metrics Collector CPU

Figure 25. Smoke test. PerfMon Metrics Collector Memory


Figure 26. Smoke test. PerfMon Metrics Collector Network I/O

Figure 27. Smoke test. PerfMon Metrics Collector TCP

The smoke test execution passed smoothly and validated the workability of the basic script with basic
parameters.

5.2.2 Smoke test: the scalability of the system response


During manual testing, a large system delay was noted during the operation of writing a new order.
To determine the response of the system, a Smoke test of the scalability of the system response was
conducted, creating new records in steps of 10 lines from 10 to 50 lines, and also for 100 and 140
lines. The creation of a 150-line record proved impossible because the system does not accept this
number of lines.
Figure 28. Smoke test. The scalability of the system response for 10 lines Aggregate Graph

Figure 29. Smoke test. The scalability of the system response for 20 lines Aggregate Graph

Figure 30. Smoke test. The scalability of the system response for 30 lines Aggregate Graph

Figure 31. Smoke test. The scalability of the system response for 40 lines Aggregate Graph
Figure 32. Smoke test. The scalability of the system response for 50 lines Aggregate Graph

Figure 33. Smoke test. The scalability of the system response for 100 lines Aggregate Graph

Figure 34. Smoke test. The scalability of the system response for 140 lines Aggregate Graph

A comparative analysis of the time taken by the system to process the recording of a new order with
100 and 140 lines revealed a "CLRError" defect in the record. This defect is described in detail in
section "6 Discussion". To identify the moment at which this defect occurs, additional testing of the
system was carried out in the range from 10 lines to 140 lines in steps of 10 lines.

Figure 35. Smoke test. The scalability of the system response for 10 to 140 lines Aggregate Graph

Figure 36. Smoke test. The scalability of the system response for 10 to 140 lines Aggregate Report

Figure 37. Smoke test. The scalability of the system response for 10 to 140 lines Bytes Throughput Over Time

Figure 38. Smoke test. The scalability of the system response for 10 to 140 lines PerfMon Metrics Collector CPU

Overloads of the test load generating stand that could have affected the test results were not observed
during test execution.

Figure 39. Smoke test. The scalability of the system response for 10 to 140 lines Response Time Graph

Figure 40. Defect "CLRError". Smoke test. The scalability of the system response for 10 to 140 lines View Results Tree

According to the test results, it was possible to establish the turning point at which the defect begins
to appear, namely in an order record with 70 lines. Further discussion of the defect continues in
section "6 Discussion". For clarity, a number of metrics are shown to demonstrate that the
load-generating stand did not influence the test results.

5.2.3 Load test


Based on previous test results, the Load test scenario was scaled. Due to a sharp drop in performance
as the load rises for writing a new order, and the previously detected defect, the test was carried out
in the range from 10 to 50 lines in steps of 10.
Figure 41. Load test. Scenario for Load test Thread Group New Order Stepping

Figure 42. Load test. Active Threads Over Time

Figure 43. Load test. Aggregate Graph

The Load test Aggregate Graph provides a visual indication of a significant decrease in the
performance of the data recording service as the load increases, even to the point where errors appear.
Figure 44. Load test. Bytes Throughput Over Time

Figure 45. Load test. PerfMon Metrics Collector CPU

Overloads of the test load generating stand that could have affected the test results were not observed
during test execution.

Figure 46. Load test. View Results in Table

Figure 47. Load test. Response Time Graph


Figure 48. Load test. Transaction Throughput vs Threads

For clarity, a number of metrics are shown to demonstrate that the load-generating stand did not
influence the test results.

5.2.4 Volume test


For the Volume test, in comparison with the Load test, the number of lines in the orders was reduced
from 50 to 30, but the number of samples was increased from 181 to 504.

Figure 49. Volume test. Active Threads Over Time

Figure 50. Volume test. Aggregate Graph


Figure 51. Volume test. Response Time Graph

Figure 52. Volume test. PerfMon Metrics Collector Memory

Figure 53. Volume test. Bytes Throughput Over Time

Figure 54. Volume test. Graph Results


Figure 55. Volume test. Transaction Throughput vs Threads

Figure 56. Volume test. PerfMon Metrics Collector CPU

Figure 57. Volume test. PerfMon Metrics Collector Network IO

Figure 58. Volume test. PerfMon Metrics Collector TCP


Overloads of the test load generating stand that could have affected the test results were not observed
during test execution.

Figure 59. Volume test. View Results in Table

Figure 60. Volume test. View Results Tree


The above screenshot displays a critical error during test execution.

Figure 61. Volume test. Assertion Results


Despite the reduced number of lines per order in this test, errors were still observed. The processing
time of the order-writing request also increased, which is to be expected given the higher density of
requests.

5.2.5 Spike and Stress test


In this test, Spike and Stress tests are combined. The base load is generated by one Thread Group and
the Spike load by a second Thread Group.

Figure 62. Spike and Stress test. Thread Group New Order LOAD

Figure 63. Spike and Stress test. Thread Group New Order STRESS

Figure 64. Spike and Stress test. Active Threads Over Time

This graph displays the activity of two Thread Groups during testing. It was planned to generate two
peak activities, but the test was interrupted due to critical recording errors.
Figure 65. Spike and Stress test. Summary Aggregate Graph

Figure 66. Spike and Stress test. Base load Aggregate Graph

Figure 67. Spike and Stress test. Spike load Aggregate Graph
The above graphs clearly reflect the increase in the delay time of saving a new order during peak
loads.

Figure 68. Spike and Stress test. Response Time Graph

Figure 69. Spike and Stress test. View Results Tree

The above screenshot displays a critical error during test execution.


Figure 70. Spike and Stress test. View Results in Table

Figure 71. Spike and Stress test. Assertion Results

The results of the Spike and Stress test confirm the results obtained in the tests above.
Overloads of the test load generating stand that could have affected the test results were not observed
during test execution.
6 Discussion
In this section the project implementation stages, the issues encountered during the implementation
of the project and the defects found are discussed.
Performance testing of Microsoft Dynamics CRM 365 for the order creation web service has been carried
out. Based on the methodologies and standards of the software testing industry, a test plan was
developed, automated performance scripts were prepared and executed, and metrics were collected and
analysed, followed by conclusions and future recommendations.
As a result of using BlazeMeter for script recording, an authorisation problem was discovered. The need
to use the JSON protocol for testing was also identified. The actual value of using BlazeMeter was
reduced to familiarisation with the format of the data packets, so the script was developed manually.
The request in JSON format was processed and prepared for use with a Logic Controller system.
The Logic Controller framework was used in the creation of Thread Groups for different types of
performance testing. The following performance testing types were conducted:
• Smoke test.
• Smoke test: the scalability of the system response.
• Load test.
• Volume test.
• Spike and Stress test.
In the smoke test, an examination of the base script was carried out to validate its performance. In the
smoke test for scalability of the response of the system, the dependence of the recording speed of the
sales order on the number of lines was checked. In the Load test, the reaction of the system to an
increase in load was studied, and the approximate allowable load range was determined for further
testing of the system without significant failures. In the volume test, the stability of the system under
load was checked. Due to the high dependence of the recording speed of orders on the number of
lines and the rather frequent occurrence of errors, it was decided not to overload the system and to
combine the two tests into one. In the process of conducting this test, it was planned to simulate the
basic component of the load and create two peaks, adding additional load, and to interrupt the test in
case of an error. One error occurred and the test was interrupted. All the above types of performance
testing were carried out many times because of the need to scale the load on the system.
The first issue was getting access to the Microsoft Dynamics CRM 365. The BlazeMeter recorded
script was not able to perform testing because of authentication issues. The system has a high level
of security and requires special privileges to run tests. To solve this problem, I turned to my supervisor
and received all the necessary information to complete the task.
The second issue was the sales order saving process. Even during manual testing, it was revealed that
the process of saving sales orders was slow. Based on this observation, it was decided to conduct a
scalability test of the system response, as a result of which a pattern was revealed: a sharp increase
in the time required to save an order as the number of lines grows. A bug or restriction was also found
when creating an order of 150 or more lines, as was the occurrence of a recording error when there are
70 or more lines in the order.
As a result of the work done on the performance testing of Microsoft Dynamics CRM 365 for the order
creation web service function, the following defects were discovered: 1) inability to save an order
containing 150 or more lines; 2) service outages when saving an order with 70 or more lines; 3) failure
of the service under heavy loads.
The low performance of this service when saving a new order in the database should also be
emphasised.
The error "Inability to save an order containing 150 or more lines" (Figure 72, Figure 73) can be
caused by a bug in the code or by restrictions in the service.

Figure 72. Inability to save an order containing 150 or more lines

Figure 73. Error. Response code:500

Service outages for saving an order with 70 or more lines is undoubtedly a bug and needs to be fixed.

Figure 74. Error. “CLRError”

The occurrence of the "Failure of the service under heavy loads" defect can be caused by a software
error, by incorrect configuration of one of the services that support the saving operation, by
insufficiently powerful hardware, or by the combined influence of all of the above. As for the low
performance of the service when saving a new order in the database, it can be assumed with high
probability that it is caused by a lack of resources allocated to the database, although it is also
possible that a security protocol algorithm or other configuration problems have influenced the
processing of the package. To study the causes of these defects, access to all components of the
system must be provided.
7 Conclusion
Well-designed and prepared project documentation is an important part of the initialisation of the
project, its implementation, analysis of the results, drawing conclusions about the work done and
drawing up plans for future improvements.
At the implementation-planning stage, the principles, methods, tools and means necessary for the
implementation are determined and laid down. Also at this stage, the project is divided into sub-tasks
and milestones are defined that help to plan correctly and subsequently control all stages of
implementation. With a properly designed implementation plan, production time, resources and financial
costs are reduced. For example: firstly, correctly selected tools reduce the time spent on recording,
parameterising and executing scripts, as well as on collecting the metrics needed for issuing
recommendations; secondly, a properly designed Communication Plan reduces the likelihood of conflicts
and misunderstandings during the implementation process. It is well known that a well-prepared
implementation plan for the project gives a much greater chance of success than failure.
Moreover, at the end of the project, the Implementation plan is used to compare the results obtained
with the expected ones and this knowledge can be further used to optimise a similar project or for
evaluative analysis when developing a proposal on a new topic. That is why all the effort spent on
the Implementation plan will pay off in the future.
Undoubtedly, to automate a process, it is necessary first to "touch" it. The tester must know and
understand the system under test and correctly describe the necessary actions for future use in test
cases and script writing. Therefore, manual testing is an integral part of any testing automation
process.
The modern tester must know and understand both the architecture of the hardware and network
technologies, as well as various operating and software systems, and cloud web services, the various
parts of which can be in different places on the planet.
Using modern specialised software allows the tester to significantly speed up the testing process;
quickly record, parameterise, find errors in automation scripts, and debug them, as well as collect the
necessary metrics for analysis. A clear confirmation of the above statement is the JMeter tool and its
various features and plug-ins, such as: Thread Groups, Config elements, Listeners, Pre and Post
Processors, Assertions, Controllers and so on, that were used in this project. Apache JMeter is a
convenient and sufficient tool not only for the performance testing of Microsoft Dynamics CRM 365 for
the order creation web service, but also for further support and development of system performance
testing.
Using parameterisation in script writing can significantly expand the range of data changes, thereby
increasing the coverage of tests. Collecting various types of metrics in a run-time test allows for
qualitative data analysis.
In the process of preparing the project, materials were used that were studied both in the school
curriculum and from external sources. Practical application of this knowledge brings valuable
experience necessary for future work in the industry.
To summarise briefly, this project confirms the significance of performance testing.
Obviously, performance testing helps to identify performance weaknesses in the functioning of both
software and hardware environments and to develop a plan for their elimination and performance
improvement.
8 Future recommendations
The defects identified during this research work can be re-tested if necessary, and regression testing
will be conducted once the defects have been clarified and fixed.
Since Microsoft Dynamics CRM 365 comes in 24 languages, it is necessary to conduct localisation testing
of the system, preparing more data and performing deeper request parameterisation, to rule out a drop
in performance when different regional settings are used.
To validate stable performance of the Microsoft Dynamics 365 system and to extend performance
test coverage it is also necessary to:
• Conduct separate performance testing of all system services and, based on the results, issue
recommendations for increasing their performance.
• Develop a comprehensive plan for testing overall system performance.
• Design a regression performance test for the system as a whole and conduct both regular
scheduled testing and system testing after detecting errors or making changes.
• Conduct web-application security testing.
For further work on improving performance testing processes, automating them, identifying potential
problems and reducing costs, it is important to draw on modern research ideas such as:
• Reducing the cost of testing performance and identifying potential problems with service
performance through the implementation of new inventions [27].
• Developing special testing mechanisms to evaluate the correctness and good performance of
the system [25].
• Using a modern methodology for testing web application performance, which allows testers
to increase the performance of all testing processes, from the development of test cases and
the generation of test scripts to the test case execution [13] and for facilitating and simplifying
this process [11].
9 References
[1] Abbors, F., Ahmad, T., Truscan, D., & Porres, I. (2013, April). Model-based performance testing
in the cloud using the mbpet tool. In ICPE (pp. 423-424).
[2] Baksi, A. K. (2013). Exploring nomological link between automated service quality, customer
satisfaction and behavioural intentions with CRM performance indexing approach: Empirical evidence
from Indian banking industry. Management Science Letters, 3(1), 1-22.
[3] Batada, I., & Rahman, A. (2012). Measuring system performance & user satisfaction after
implementation of ERP. In Proceedings of Informing Science & IT Education Conference (InSITE)
(pp. 603-611).
[4] Beckner, M. (2017). Administering, Configuring, and Maintaining Microsoft Dynamics 365 in the
Cloud. Berlin, Germany: Walter de Gruyter GmbH & Co KG.
[5] Chhetri, M. B., Chichin, S., Quoc Bao Vo, & Kowalczyk, R. (2013, June). Smart CloudBench--
Automated Performance Benchmarking of the Cloud. In 2013 IEEE Sixth International Conference on
Cloud Computing (pp. 414-421). IEEE.
[6] Erinle, B. (2017). Performance Testing with JMeter 3. Birmingham, England: Packt Publishing.
[7] Grechanik, M., Fu, C., & Xie, Q. (2012, June). Automatically finding performance problems with
feedback-directed learning software testing. In 2012 34th International Conference on Software
Engineering (ICSE) (pp. 156-166). IEEE.
[8] Hooda, I., & Singh Chhillar, R. (2015). Software test process, testing types and techniques.
International Journal of Computer Applications, 111(13).
[9] How to run performance test for dynamics 365? (2018). Retrieved from
https://community.dynamics.com/365/financeandoperations/b/howtodynamics365/posts/how-to-run-
performance-test-for-dynamics-365
[10] Huang, P., Ma, X., Shen, D., & Zhou, Y. (2014, May). Performance regression testing target
prioritization via performance risk analysis. In Proceedings of the 36th International Conference on
Software Engineering (pp. 60-71). ACM.
[11] Jayasinghe, D., Swint, G., Malkowski, S., Li, J., Wang, Q., Park, J., & Pu, C. (2012, June).
Expertus: A generator approach to automate performance testing in iaas clouds. In 2012 IEEE Fifth
International Conference on Cloud Computing (pp. 115-122). IEEE.
[12] Jin, G., Song, L., Shi, X., Scherpelz, J., & Lu, S. (2012). Understanding and detecting real-world
performance bugs. ACM SIGPLAN Notices, 47(6), 77-88.
[13] Kao, C. H., Lin, C. C., & Chen, J. (2013, July). Performance testing framework for rest-based web
applications. In 2013 13th International Conference on Quality Software (pp. 349-354). IEEE.
[14] Khanapurkar, A., Parab, O., & Malan, S. (2014). U.S. Patent No. 8,756,586. Washington, DC: U.S.
Patent and Trademark Office.
[15] Kiran, S., Mohapatra, A., & Swamy, R. (2015, August). Experiences in performance testing of web
applications with Unified Authentication platform using JMeter. In 2015 international symposium on
technology management and emerging technologies (ISTMET) (pp. 74-78). IEEE.
[16] Lewis, W. E. (2017). Software testing and continuous quality improvement. Auerbach publications.
[17] Load Testing Dynamics CRM / 365 with LoadRunner. (2018). Retrieved from
https://community.dynamics.com/crm/b/crmperformancetesting/posts/dynamics-crm-365-
performance-testing-with-loadrunner
[18] Luszczak, A. (2018). Using Microsoft Dynamics 365 for Finance and Operations: Learn and
understand the functionality of Microsoft's enterprise solution. Springer.
[19] Maximini, D. (2015). The Scrum Culture. Switzerland: Springer International Publishing.
[20] Microsoft Annual Report 2018. (2018). Retrieved from https://www.microsoft.com/en-
us/annualreports/ar2018/annualreport
[21] Microsoft Dynamics 365 Software. (n.d.). Retrieved from
https://www.softwareadvice.com/nz/crm/dynamics-365-profile/
[22] Patil, S. S., & Joshi, S. D. (2012). Identification of Performance Improving Factors for Web
Application by Performance Testing. Int. J. Emerg. Technol. Adv. Eng., 2(8), 433-436.
[23] Rodrigues, A. G., Demion, B., & Mouawad, P. (2019). Master Apache JMeter - From Load Testing
to DevOps: Master performance testing with JMeter. Birmingham, England: Packt Publishing.
[24] Sadiq, M., Iqbal, M. S., Malip, A., & Othman, W. M. (2015). A Survey of Most Common Referred
Automated Performance Testing Tools. ARPN Journal of Science and Technology, 5(11), 525-536.
[25] Segura, S., Galindo, J. A., Benavides, D., Parejo, J. A., & Ruiz-Cortés, A. (2012, January). BeTTy:
benchmarking and testing on the automated analysis of feature models. In Proceedings of the Sixth
International Workshop on Variability Modeling of Software-Intensive Systems (pp. 63-71). ACM.
[26] Varela-González, M., González-Jorge, H., Riveiro, B., & Arias, P. (2013). Performance testing of
LiDAR exploitation software. Computers & Geosciences, 54, 122-129.
[27] Zhou, J., Zhou, B., & Li, S. (2014, July). Automated model-based performance testing for PaaS cloud
services. In 2014 IEEE 38th International Computer Software and Applications Conference
Workshops (pp. 644-649). IEEE.

