
Plotting multiple activation function curves in Python, in detail

Using numpy, matplotlib, and sympy to plot the sigmoid, tanh, ReLU, leaky ReLU, and softMax functions.

Background: While studying deep learning, the teacher assigned homework to plot the activation functions and their derivatives. It took me quite a while, so I am recording the learning process here.

Preparation: install numpy, matplotlib, sympy

pip install numpy matplotlib sympy

Find the documentation for the corresponding libraries:

numpy documentation, matplotlib documentation, sympy documentation

While writing the code I found that VS Code would not format my Python. After looking into it, I had to install flake8 (a code style checker) and yapf (a formatter), and then configure them:

".flake8Enabled": true, // Specification checking tool
"": "yapf", // Formatting tools
".flake8Args": ["--max-line-length=248"], // Set the maximum character limit for a single line
"": false, // Close the pylint tool

With the setup done, let's look at how to write the code.

Step 1 Create a new py file

First, write out the expression of each activation function. There are two ways to do this: if you only need to compute values, numpy is enough; but if you also want to take derivatives, you need sympy to write the function as a symbolic expression.

sympy expresses functions in this way:

from sympy import symbols, diff
# First define the independent variables; this depends on your needs.
# The example from the docs has two variables.
x, y = symbols('x y')
# Then write the function expression
expr = x + 2*y
# Print it to see what it looks like
expr  # x + 2*y
# Now evaluate the expression we defined
expr.evalf(subs={x: 10, y: 20})  # 50.0000000000000
# And take the derivative of our function
diff(expr, x, 1)  # differentiating with respect to x gives 1, which is also a sympy expression

diff is sympy's differentiation function, with the signature:

diff(f, *symbols, **kwargs)
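The positional argument after the variable is the order of the derivative. A quick example:

from sympy import symbols, diff

x = symbols('x')
diff(x**3, x, 2)  # second derivative of x**3, which is 6*x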

Next we define the expressions for the activation functions

import numpy as np
from sympy import symbols, exp, Piecewise


def sigmoid():
    """
    Define the sigmoid function
    """
    x = symbols('x')
    return 1. / (1 + exp(-x))


def tanh():
    """
    Define the tanh function
    """
    x = symbols('x')
    return (exp(x) - exp(-x)) / (exp(x) + exp(-x))


def relu():
    """
    Define the ReLU function
    """
    x = symbols('x')
    return Piecewise((0, x < 0), (x, x >= 0))


def leakyRelu():
    """
    Define the Leaky ReLU function
    """
    x = symbols('x')
    return Piecewise((0.1 * x, x < 0), (x, x >= 0))


def softMax(x: np.ndarray):
    """
    Define the softMax function
    """
    exp_x = np.exp(x)
    print(exp_x, np.sum(exp_x))
    return exp_x / np.sum(exp_x)


def softmax_derivative(x):
    """
    Define the softMax derivative function
    x - input x vector
    """
    s = softMax(x)
    return s * (1 - s)
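As a quick sanity check (my own snippet, assuming the definitions above are in scope), you can evaluate the symbolic expressions at a point and run softMax on a small vector:

import numpy as np
from sympy import symbols

x = symbols('x')
print(sigmoid().evalf(subs={x: 0}))        # 0.5
print(tanh().evalf(subs={x: 0}))           # 0
print(softMax(np.array([1.0, 2.0, 3.0])))  # three probabilities that sum to 1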

Then define a derivative function

def derivate(formula, len, variate):
    """
    Define function derivation
      formula: the formula of a function
      len: number of derivations
      variate: independent variable
    """
    return diff(formula, variate, len)
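For example (again my own quick check, using the sigmoid and derivate defined above), differentiating the sigmoid expression once with respect to x gives:

from sympy import symbols

d = derivate(sigmoid(), 1, 'x')
print(d)                                # roughly exp(-x)/(1 + exp(-x))**2
print(d.evalf(subs={symbols('x'): 0}))  # 0.25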

One question here: why does every other function need only a single definition, while softMax needs two, one for the softMax function itself and one for its derivative?

Let's look at what the softMax function looks like: softMax(x_i) = exp(x_i) / (exp(x_1) + exp(x_2) + ... + exp(x_n)).

The denominator of softMax needs a summation. If you write it with np.sum, sympy cannot differentiate it (apparently some people can; I don't know why, perhaps they use a different approach, feel free to share if you know). And sympy's Sum can only accumulate from i to n in steps of 1.

For example, given the expression m**x (m to the x-th power), Sum(m**x, (x, 0, 100)) expands to m**100 + m**99 + m**98 + ... + m + 1, but the ndarray I defined is np.arange(-10, 10, 0.05), which doesn't satisfy that unit-step requirement, so the derivative can't be taken this way.

So I simply wrote two functions, one defining the original softMax and one defining its derivative. As mentioned before, if all you need are numeric values, numpy alone is actually enough.
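To see that unit-step behaviour concretely, here is a small illustration with sympy's Sum (shorter bounds than above, purely for readability):

from sympy import symbols, Sum

m, x = symbols('m x')
print(Sum(m**x, (x, 0, 3)).doit())  # m**3 + m**2 + m + 1 -- the index only moves in steps of 1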

At this point, all the functions and their derivatives have been defined.

Step 2 Plot the curve using matplotlib

First, we need to know what building blocks matplotlib provides.

Matplotlib mainly has Figure, Axes, Axis, and Artist. My understanding: the Figure is the canvas, which we prepare before drawing anything; Axes and Axis both translate to "axes" in Chinese, but Axes is the plotting area, while Axis is a single coordinate axis; Artist covers the other elements that can be added to the figure.

To draw a simple diagram you can do this

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2, 100)  # Sample data.

# Note that even in the OO-style, we use `plt.subplots()` to create the Figure.
fig, ax = plt.subplots(figsize=(5, 2.7), layout='constrained')
ax.plot(x, x, label='linear')  # Plot some data on the axes.
ax.plot(x, x**2, label='quadratic')  # Plot more data on the axes...
ax.plot(x, x**3, label='cubic')  # ... and some more.
ax.set_xlabel('x label')  # Add an x-label to the axes.
ax.set_ylabel('y label')  # Add a y-label to the axes.
ax.set_title("Simple Plot")  # Add a title to the axes.
ax.legend()  # Add a legend.

Then we're ready to plot our function curve.

plt.xlabel('x label')  # Two ways to add a label: ax.set_xlabel (object-oriented) and this one (pyplot, function style)
plt.ylabel('y label')

After adding the labels, I considered two ways of plotting. The first is to plot all the curves in one figure, but split across different axes.

Using the subplot function, the figure can be divided into 2 rows and 2 columns of axes:

plt.subplot(2, 2, 1, adjustable='box')  # row 1, column 1
plt.subplot(2, 2, 2, adjustable='box')  # row 1, column 2
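A minimal sketch of this first approach, assuming the num array np.arange(-10, 10, 0.05) and the sympy-based functions and derivate helper defined earlier (the dict and loop here are my own scaffolding, not the original code):

import numpy as np
import matplotlib.pyplot as plt
from sympy import symbols

num = np.arange(-10, 10, 0.05)  # sample points, as defined earlier
x = symbols('x')
# One axes per function: plot the curve and its derivative together
funcs = {'sigmoid': sigmoid(), 'tanh': tanh(), 'relu': relu(), 'leakyRelu': leakyRelu()}
for i, (name, expr) in enumerate(funcs.items(), start=1):
    d = derivate(expr, 1, 'x')  # symbolic derivative
    plt.subplot(2, 2, i, adjustable='box')
    plt.title(name)
    plt.plot(num, [expr.evalf(subs={x: v}) for v in num], label=name)
    plt.plot(num, [d.evalf(subs={x: v}) for v in num], label=name + ' derivative')
    plt.legend()
plt.tight_layout()
plt.show()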

The second approach draws whichever function the user names at the prompt.

do = input('input the function you want to draw (sigmoid, tanh, relu, leakyRelu, softMax)\n')

After getting the input

try:
    plt.xlabel('x label')
    plt.ylabel('y label')
    plt.title(do)
    if do == 'softMax':
        # num is the sample array from earlier: np.arange(-10, 10, 0.05)
        plt.plot(num, softMax(num), label='Softmax')
        plt.plot(num, softmax_derivative(num), label='Softmax Derivative')
    else:
        plt.plot(
            num,
            [eval(f'{do}()').evalf(subs={symbols("x"): i}) for i in num])
        plt.plot(num, [
            derivate(eval(f'{do}()'), 1, 'x').evalf(subs={symbols('x'): i})
            for i in num
        ])

    plt.tight_layout()
    plt.show()
except (TypeError, NameError):
    print(
        'input function expression is wrong or the function is not configured'
    )

That's the end of it. Here's a picture of the final result.

The above covers plotting various activation function curves in Python in detail. For more on drawing activation function curves with Python, please check out my other related articles!