Preface
I found that there are not only a lot of Keras models out there, but also a lot of PyTorch models, so it seemed worth learning PyTorch, and along the way I'd like to understand what a tensor actually is.
Important Basic Functions in PyTorch
1. Neural network construction with a Net class
Building a neural network in PyTorch is not the same as in TensorFlow: it is done with a class (later you can also build one with a Sequential container similar to Keras -- a short sketch of that follows the class example below -- but the class-based approach remains the foundation). This class needs to inherit from PyTorch's neural network module, torch.nn.Module. The specific construction is as follows:
# Inherit from torch.nn.Module
class Net(torch.nn.Module):
    # Overloaded initialization function (I forget whether this is really called overloading)
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        # Applies a linear transformation to the incoming data: y = xA^T + b
        # i.e. a fully-connected layer
        # Build two fully-connected layers (i.e. one hidden layer) during initialization
        self.hidden = torch.nn.Linear(n_feature, n_hidden)
        self.predict = torch.nn.Linear(n_hidden, n_output)

    # The forward function builds the forward pass
    def forward(self, x):
        # Output of the hidden layer
        hidden_layer = torch.nn.functional.relu(self.hidden(x))
        # Actual output
        output_layer = self.predict(hidden_layer)
        return output_layer
This builds a neural network containing one hidden layer with n_hidden neurons.
After creating the class above, the network can be instantiated as follows:
net = Net(n_feature=1, n_hidden=10, n_output=1)
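As mentioned above, the same network can also be stacked with a Sequential container, Keras-style. Here is a minimal sketch, assuming the same 1-10-1 layer sizes as the Net example; it is only an illustrative equivalent, not part of the original script:

import torch

# Same architecture as Net, built with the Sequential container
sequential_net = torch.nn.Sequential(
    torch.nn.Linear(1, 10),   # n_feature -> n_hidden
    torch.nn.ReLU(),          # same activation used in forward()
    torch.nn.Linear(10, 1),   # n_hidden -> n_output
)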
2. optimizer: defining the optimizer
The optimizer here has the same meaning as an optimizer in TensorFlow; PyTorch's optimizers live in the torch.optim module. The optimizer needs to be passed the parameters of the net network.
It is used as follows:
# torch.optim is the optimizer module
# Adam can be changed to other optimizers such as SGD, RMSprop etc.
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
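If you want to try a different optimizer, only this one line changes. A quick sketch swapping in plain SGD (the learning rate here is just an illustrative value):

# Swap Adam for stochastic gradient descent
optimizer = torch.optim.SGD(net.parameters(), lr=1e-2)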
3. loss: defining the loss function
loss_func defines the loss function for training the neural network; the commonly used loss functions are the mean squared error loss (regression) and the cross-entropy loss (classification).
It is used as follows:
# Mean squared error loss
loss_func = torch.nn.MSELoss()
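For the classification case mentioned above, the loss is created the same way with the cross-entropy loss. A minimal sketch with made-up shapes, just to show the expected inputs (raw scores plus integer class labels):

import torch

# Cross-entropy loss for classification
loss_func = torch.nn.CrossEntropyLoss()

# Example: raw scores for 4 samples over 3 classes, and integer class labels
logits = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
loss = loss_func(logits, labels)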
4. Training process
The training process is divided into three steps:
1. Utilize the network to predict results.
prediction = net(x)
2. Compare the prediction with the ground truth to compute the loss.
loss = loss_func(prediction, y)
3. Perform the backward pass (this itself has three steps).
# Backward pass steps
# 1. Zero the gradients
optimizer.zero_grad()
# 2. Compute the gradients
loss.backward()
# 3. Apply the optimizer update
optimizer.step()
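Putting the three steps together, one training loop looks like the sketch below (it assumes the net, optimizer, loss_func, x and y defined earlier; the epoch count of 1000 simply matches the full script):

for t in range(1000):
    # 1. Forward pass: predict with the network
    prediction = net(x)
    # 2. Compare the prediction with the ground truth to get the loss
    loss = loss_func(prediction, y)
    # 3. Backward pass: zero gradients, compute gradients, update weights
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()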
Full Code
This is a simple regression prediction model.
import torch
from torch.autograd import Variable
import torch.nn.functional as functional
import matplotlib.pyplot as plt
import numpy as np

# The shape of x is (100, 1)
x = torch.from_numpy(np.linspace(-1, 1, 100).reshape([100, 1])).type(torch.FloatTensor)
# The shape of y is (100, 1)
y = torch.sin(x) + 0.2 * torch.rand(x.size())

class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        # Applies a linear transformation to the incoming data: y = xA^T + b
        # i.e. fully-connected layers
        self.hidden = torch.nn.Linear(n_feature, n_hidden)
        self.predict = torch.nn.Linear(n_hidden, n_output)

    def forward(self, x):
        # Output of the hidden layer
        hidden_layer = functional.relu(self.hidden(x))
        output_layer = self.predict(hidden_layer)
        return output_layer

# Instantiate the network
net = Net(n_feature=1, n_hidden=10, n_output=1)
# torch.optim is the optimizer module
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
# Mean squared error loss
loss_func = torch.nn.MSELoss()

for t in range(1000):
    prediction = net(x)
    loss = loss_func(prediction, y)
    # Backward pass steps
    # 1. Zero the gradients
    optimizer.zero_grad()
    # 2. Compute the gradients
    loss.backward()
    # 3. Apply the optimizer update
    optimizer.step()
    if t % 50 == 0:
        print("The loss is", loss.data.numpy())
The results of the run are:
The loss is 0.27913737
The loss is 0.2773982
The loss is 0.27224126
…………
The loss is 0.0035993527
The loss is 0.0035974088
The loss is 0.0035967692
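matplotlib is imported in the full script but never used; if you want to see the fit, a minimal sketch like the following could be appended after the training loop (it assumes the x, y and net variables from the script above):

# Visualize the fit: scatter the noisy data and overlay the network's prediction
plt.scatter(x.numpy(), y.numpy(), s=10, label="data")
plt.plot(x.numpy(), net(x).data.numpy(), color="red", label="prediction")
plt.legend()
plt.show()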
The above is the detailed walkthrough of regression with a PyTorch neural network in Python; for more on PyTorch regression, see my other related articles!