This article describes a Python-based method for taking a column of data from an Excel-style table file and calculating the interval maximum of that column over every specified number of rows (for example, the maximum within every 4 rows).
Suppose we have an existing .csv table file containing a column of data for which we want to calculate interval maximums; that is, over the data section of this column (i.e., excluding the header row), we want the maximum of rows 1 to 4, then the maximum of rows 5 to 8, then the maximum of rows 9 to 12, and so on, calculating one maximum for every group of 4 rows. Moreover, if the number of values in this column is not divisible by 4, the rows left over at the end are simply maximized on their own.
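To make the grouping rule concrete, here is a small illustrative sketch with made-up numbers: 10 values grouped every 4 rows, so the last group contains only 2 values.

# Made-up data: 10 values, grouped every 4 rows.
data = [3, 7, 2, 5, 9, 1, 4, 8, 6, 0]

# Groups: [3, 7, 2, 5] -> 7, [9, 1, 4, 8] -> 9, leftover [6, 0] -> 6
maxima = [max(data[i:i + 4]) for i in range(0, len(data), 4)]
print(maxima)  # [7, 9, 6]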
Once the requirements are clear, we can start writing the code; this is shown below.
# -*- coding: utf-8 -*-
"""
Created on Wed Jul 26 12:24:58 2023

@author: fkxxgis
"""

import pandas as pd


def calculate_max_every_eight_rows(excel_file, column_name):
    # Read the .csv file and take the column to be processed.
    df = pd.read_csv(excel_file)
    column_data = df[column_name]

    # Walk through the column in steps of 4 rows and keep each group's maximum.
    max_values = []
    for i in range(0, len(column_data), 4):
        max_values.append(column_data[i:i + 4].max())
    return max_values


excel_file = r"C:\Users\15922\Desktop\data_table_1.csv"
column_name = 'NDVI'

result = calculate_max_every_eight_rows(excel_file, column_name)

# Wrap the list of maxima in a DataFrame and write it to a new .csv file
# (the output file name below is only an example; use any path you like).
rdf = pd.DataFrame(result, columns=["Max"])
output_file = r"C:\Users\15922\Desktop\data_table_1_max.csv"
rdf.to_csv(output_file, index=False)
Here, we define a function named calculate_max_every_eight_rows (at first I intended to compute the interval maximum every 8 rows of data, hence the eight in the name; the grouping size was later changed to 4, so please bear with the function name). It accepts two parameters: excel_file, the path to the input file, and column_name, the name of the column whose interval maximum is to be calculated.
Inside the function, we first read the file and store the data in df. Next, we take the specified column column_name and create an empty list max_values, which will hold the maximum value of each group. Then, using the range function, we generate a sequence of indices starting at 0 with a step of 4, so that every 4 rows form one group; you can modify the step here to match your actual needs. Within each group, we slice the corresponding 4 rows of data from column_data, calculate the maximum value of that group, and append it to the max_values list. Finally, the function returns max_values, the list holding the maximum value of each group.
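As a side note, the same grouping can also be written more compactly with pandas itself. The following is a minimal sketch, assuming the same file path and column name as in the code above, that groups rows by their integer position divided by 4 and takes each group's maximum; it should produce the same values as max_values.

import pandas as pd

# Group rows by their position // 4 (rows 0-3, 4-7, 8-11, ...) and take each group's maximum.
df = pd.read_csv(r"C:\Users\15922\Desktop\data_table_1.csv")
max_series = df["NDVI"].groupby(df.index // 4).max()
print(max_series.tolist())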
Next, we specify the path to the input file through excel_file and the name of the column to be processed through column_name, then call the calculate_max_every_eight_rows function and save the returned result in the result variable; the result is a list containing the maximum value of each group.
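If you want to confirm that the leftover rows at the end were handled, a quick optional check (assuming the script above has already run, so excel_file, column_name, and result are defined) is that the number of returned maxima equals the number of data rows divided by 4, rounded up.

import math
import pandas as pd

# Number of groups should be ceil(number of rows / 4).
n_rows = len(pd.read_csv(excel_file)[column_name])
assert len(result) == math.ceil(n_rows / 4)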
Subsequently, to save these maximums, we convert the result list into a new DataFrame named rdf and set its column name to Max. Finally, we call rdf.to_csv() to save rdf as a new .csv file, passing index=False so that the index column is not written.
After executing the above code, we obtain the result file. As shown in the figure below, to make the comparison easier, the result column has been copied into the original file for viewing. You can see that the 1st number in the result column is the maximum of rows 1 to 4 of the original column, the 3rd number in the result column is the maximum of rows 9 to 12 of the original column, and so on.
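In the figure the copying was done by hand; if you would rather do it in code, the following is a minimal sketch (the file names and the GroupMax column name are only examples, and the result file is assumed to be the one written above) that repeats each group's maximum 4 times, trims it to the original length, and writes it next to the original data.

import pandas as pd

df = pd.read_csv(r"C:\Users\15922\Desktop\data_table_1.csv")
rdf = pd.read_csv(r"C:\Users\15922\Desktop\data_table_1_max.csv")  # result file from above (example name)

# Repeat each group maximum 4 times, trim to the original row count, and attach it as a new column.
df["GroupMax"] = rdf["Max"].repeat(4).reset_index(drop=True)[:len(df)].values
df.to_csv(r"C:\Users\15922\Desktop\data_table_1_compare.csv", index=False)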
At this point, the job is done.
The above is a detailed look at using Python to find the maximum value over ranges of rows in a table file; for more on finding maximum values in files with Python, please see my other related articles!