One-dimensional linear regression of rasters

The trend analysis uses one-dimensional (univariate) linear regression, which fits a trend for each raster pixel over time; the per-pixel results are then combined to reflect the spatial and temporal changes of the whole region. The slope is calculated by the following formula:

$$\mathrm{Slope}=\frac{n\times\sum_{i=1}^{n} i\times M_{r,i}-\sum_{i=1}^{n} i\times\sum_{i=1}^{n} M_{r,i}}{n\times\sum_{i=1}^{n} i^{2}-\left(\sum_{i=1}^{n} i\right)^{2}}$$

where Slope is the slope of the regression equation for pixel r, i is the time-series index running from 1 to n, n is the length of the time series, and M_{r,i} is the maximum value of r in period i.
If Slope > 0, r shows an upward trend over time, and the larger the Slope value, the more pronounced the upward trend; conversely, if Slope < 0, r shows a downward trend over time.
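To make the formula concrete outside of ArcGIS, the same per-pixel slope can be computed with NumPy on a stacked time series. This is a minimal sketch, assuming the rasters have already been read into a single array of shape (n, rows, cols) in chronological order; the function name and the demo data are illustrative only.

import numpy as np

def slope_per_pixel(stack):
    # stack: array of shape (n, rows, cols), one layer per period,
    # assumed to be in chronological order.
    n = stack.shape[0]
    i = np.arange(1, n + 1, dtype=float)           # time index 1..n
    sum_i_m = np.tensordot(i, stack, axes=(0, 0))  # sum of i * M(r, i)
    sum_m = stack.sum(axis=0)                      # sum of M(r, i)
    sum_i = i.sum()                                # sum of i
    sum_i2 = (i * i).sum()                         # sum of i^2
    return (n * sum_i_m - sum_i * sum_m) / (n * sum_i2 - sum_i ** 2)

# Quick check with random data: 10 periods of 4 x 5 "rasters"
demo = np.random.rand(10, 4, 5)
print(slope_per_pixel(demo).shape)  # (4, 5)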

# -*- coding: utf-8 -*-
import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

def rasterSlope(rasterFolder, resultSlopeSavePath):
    # Point the workspace at the folder holding the raster time series
    folderin = rasterFolder
    arcpy.env.workspace = folderin
    rlist = arcpy.ListRasters()  # assumes the listing order matches the time order
    N = len(rlist)
    i = 0
    sum1 = 0  # sum of i * M(r, i)
    sum2 = 0  # sum of M(r, i)
    sum3 = 0  # sum of i^2
    sum4 = 0  # sum of i
    for r in rlist:
        i += 1
        print(i)
        sum1 += i * Raster(r)
        sum2 += Raster(r)
        sum3 += i * i
        sum4 += i
        print(r)
    # Per-pixel least-squares slope; note that (N + 1) * N / 2 equals sum4
    result = (N * sum1 - ((N + 1) * N / 2) * sum2) / (N * sum3 - sum4 * sum4)
    result.save(resultSlopeSavePath)
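
A minimal usage sketch follows; the input folder and output path are placeholder assumptions, and the rasters in the folder are assumed to be named so that they list in chronological order.

# Example call (paths are placeholders)
rasterSlope(r"C:\data\ndvi_yearly", r"C:\data\results\slope.tif")
arcpy.CheckInExtension("Spatial")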