
How Python supports concurrency, explained in detail

Because of the GIL (Global Interpreter Lock), a Python process can only use one CPU core at a time, which corresponds to a single operating-system kernel thread. So for a Python web program: if one request consists entirely of long, CPU-heavy computation (which occupies that core), can the program still handle other requests after accepting the first one? Suppose the handler for a request looks like this:

def handle_request(request):
    while True:
        pass

Reading the code this way, Python has only one real thread of execution. Once the code enters the while True loop, it occupies the only CPU core it has, so it seems there would be no chance to handle any other task.

To simulate two such requests, start two threads that each run a while True loop and see whether both of them get to execute:

import time, threading

def f1(name):
    while True:
        print(name)
        time.sleep(1)

threading.Thread(target=f1, args=('f1',)).start()
threading.Thread(target=f1, args=('f2',)).start()

Output:

f1
f2
f2f1

f2
f1
...

Testing this with Django (a Python web framework) shows the same thing: even if one request executes code like while True, the server can still handle other requests, i.e. there is concurrency support.

To explain why both while True loops get to execute: it again comes down to the GIL. The first while True thread has to acquire this lock before it can run; after it executes a print(name), it releases the lock and pauses. Then the second while True thread acquires the GIL and starts executing. The two threads take turns around the GIL, and that is how Python's multithreading works.
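Beyond releasing the GIL around IO calls such as print, CPython also forces a switch periodically. A minimal sketch, assuming CPython 3, where sys.getswitchinterval() reports the interval (0.005 s by default) after which the running thread is asked to hand the GIL over, so even two purely computational loops still interleave:

import sys
import threading

print(sys.getswitchinterval())   # default switch interval, typically 0.005 s

counters = {'a': 0, 'b': 0}

def spin(key):
    # Pure computation: no print inside the loop, no sleep, no IO.
    for _ in range(5_000_000):
        counters[key] += 1

t1 = threading.Thread(target=spin, args=('a',))
t2 = threading.Thread(target=spin, args=('b',))
t1.start(); t2.start()
t1.join(); t2.join()

# Both counters finish at 5_000_000: neither thread starved the other,
# because the GIL was handed back and forth between them.
print(counters)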

To summarize:

Even a while True loop cannot hold the CPU the whole time; it runs for a while, then rests for a while, which gives other threads a chance. There are two key points here:

  • how a thread acquires this lock
  • how a thread releases the lock

Acquiring the lock works by queueing up: the lock keeps a queue, and a thread that wants to execute joins that queue and waits its turn.
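A minimal sketch of this "queue up for the lock" idea using an ordinary threading.Lock (Python does not guarantee strict FIFO order here, so the queue is a simplification):

import threading, time

lock = threading.Lock()

def worker(name):
    with lock:                  # wait in line; run once the lock is free
        print(name, 'got the lock')
        time.sleep(0.1)         # hold the lock briefly, then release it on exit

for i in range(3):
    threading.Thread(target=worker, args=('thread-%d' % i,)).start()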

Releasing the lock is somewhat similar to process scheduling; a thread gives it up when one of the following happens:

  • Time slicing (each thread runs for the same length of time)
  • Instruction counting (each thread executes the same number of instructions)
  • An IO operation is encountered (passive waiting)
  • Active waiting (wait/join/sleep)

When an IO operation is encountered, the thread has to wait for the IO device to finish before it can continue executing; during that time it does not need the CPU, so it releases the lock first. Active waiting, typically sleep, means the thread voluntarily gives up the lock and only resumes at some later moment.
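A minimal sketch of the last two cases: time.sleep() gives up the GIL, so a sleeping thread does not block a computing thread, and the total elapsed time is close to the longer of the two rather than their sum:

import threading, time

def sleeper():
    time.sleep(1)               # waits without using the CPU; the GIL is released

def worker():
    total = 0
    for i in range(2_000_000):  # CPU-bound work running while the other thread sleeps
        total += i

start = time.time()
threads = [threading.Thread(target=sleeper), threading.Thread(target=worker)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Elapsed time is close to max(1 s, computation time), not their sum,
# because the sleeping thread gave up the lock while it waited.
print('elapsed: %.2fs' % (time.time() - start))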

The analysis above shows that Python does support concurrency, but because it cannot take advantage of multiple cores, computation-intensive, high-concurrency applications are not a good fit for Python.
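A minimal sketch of that limitation: running the same CPU-bound computation in two threads takes roughly as long as running it twice in one thread, because only one thread holds the GIL at any moment (the exact figures depend on your machine):

import threading, time

def count(n):
    while n > 0:
        n -= 1

N = 10_000_000

start = time.time()
count(N); count(N)              # run the work twice, sequentially
print('sequential : %.2fs' % (time.time() - start))

start = time.time()
t1 = threading.Thread(target=count, args=(N,))
t2 = threading.Thread(target=count, args=(N,))
t1.start(); t2.start()
t1.join(); t2.join()            # two threads, still roughly the same total time
print('two threads: %.2fs' % (time.time() - start))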

This concludes the article on how Python supports concurrency. For more on Python concurrency, please search my earlier articles or browse the related articles below. I hope you will support me in the future!