SoFunction
Updated on 2024-11-15

Analyzing the concepts behind Python coroutines

This post grew out of a reader's notes from learning about Python coroutines; here is what it covers:

The history of coroutines is a long story, and it starts with generators.

If you have read my previous post python odyssey: iterators and generators, the concept of generators should already be familiar: generators save memory by producing each result only when it is requested.

 

# Generator expression
a = (x*x for x in range(10))
# next() produces the next value
next(a) # Output: 0
next(a) # Output: 1
next(a) # Output: 4
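The same sequence can also be produced by a generator function, which uses yield instead of an expression; a minimal sketch (the function name `squares` is illustrative):

```python
def squares(n):
    """Yield x*x for each x in range(n), one value at a time."""
    for x in range(n):
        yield x * x

g = squares(10)
print(next(g))  # 0
print(next(g))  # 1
print(next(g))  # 4
```

Either form is lazy: no square is computed until next() asks for it.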

Unlike generators, which only produce data, coroutines can receive data as well as produce it, specifically by placing yield on the right-hand side of an expression. We can use .send() to send data into a coroutine.

def writer():
  print('-> coroutine started')
  for i in range(8):
    w = yield
    print(i+w)

w = writer()
# It's still a generator at heart
>>> w
<generator object writer at 0x000002595BC57468>
# First, next() the coroutine once to activate it
>>> next(w)
-> coroutine started
# Send data
>>> w.send(1)
1
# After the eighth send, an exception is thrown,
# because the coroutine has finished
---------------------------------------------------------------------------
StopIteration               Traceback (most recent call last)

The first step must be to activate the coroutine with next(); only then can data be sent to it with .send().
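To see why priming matters, here is a small sketch reusing the writer from above: sending to a generator that has not been started raises a TypeError.

```python
def writer():
    print('-> coroutine started')
    for i in range(8):
        w = yield
        print(i + w)

w = writer()
try:
    w.send(1)            # fails: can't send to a just-started generator
except TypeError as err:
    print('must prime first:', err)

w = writer()
next(w)                  # prime: run up to the first yield
w.send(1)                # prints 1 (i=0, w=1)
```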

As you can see, after the eighth send a StopIteration exception is raised because the coroutine has run to completion, which is perfectly normal; just handle the exception. But what if you need to pass data between two coroutines?
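Handling that end-of-coroutine exception looks like this; the sketch below is a variant of the writer above that records its results in a list so the flow is easy to check (note the eighth send delivers its value and then raises):

```python
def writer(out):
    for i in range(8):
        w = yield
        out.append(i + w)

results = []
w = writer(results)
next(w)                     # prime the coroutine
stopped_at = None
for n in range(10):
    try:
        w.send(n)
    except StopIteration:   # the 8th send processes its value, then raises
        stopped_at = n
        break
print(results)              # [0, 2, 4, 6, 8, 10, 12, 14]
print(stopped_at)           # 7
```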

def writer():
  while True:
    w = yield
    print('>>', w)

def writer_wrapper(coro):
  # Activate
  next(coro)
  while True:
    # Exception handling
    try:
      x = yield
      # Send data to writer
      coro.send(x)
    except StopIteration:
      pass

w = writer()
wrap = writer_wrapper(w)
# Activate
next(wrap)
for i in range(4):
  wrap.send(i)
# Output
>> 0
>> 1
>> 2
>> 3

In the code above, the data is first sent to writer_wrapper, which then forwards it to writer.

data——>writer_wrapper——>writer

You could write it that way, but having to pre-activate the coroutine and handle the exception yourself is a bit of a pain. yield from came along to solve exactly this problem, again for passing data:

def writer():
  while True:
    w = yield
    print('>>', w)

def writer_wrapper2(coro):
  yield from coro

One line of code solves the problem.

In short, yield from provides a channel through which data can flow between coroutines. When writer_wrapper2 executes yield from coro, coro gains control and writer_wrapper2 is blocked; each time we .send() data, writer prints the result.
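Putting the pieces together, the yield from version behaves exactly like the manual wrapper; the sketch below uses a variant of writer that also records the values it receives, so the flow is easy to verify:

```python
def writer(out):
    while True:
        w = yield
        out.append(w)
        print('>>', w)

def writer_wrapper2(coro):
    yield from coro          # forwards priming and .send() to coro

received = []
wrap = writer_wrapper2(writer(received))
next(wrap)                   # a single next() activates wrapper and writer
for i in range(4):
    wrap.send(i)
print(received)              # [0, 1, 2, 3]
```

No manual pre-activation of the inner coroutine and no try/except: yield from does both for us.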

At this stage, a coroutine is still essentially a generator.

Even with yield from simplifying the process, the overlap between coroutines and generators remained confusing, and yield from caused many problems in asynchronous programming (asyncio used to be built on it). So in Python 3.5, yield from was dropped for this purpose and two new keywords were added, async and await; a coroutine defined this way is a native coroutine type rather than a generator type.
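The difference between the two kinds of coroutine is visible at runtime; a minimal sketch (function names are illustrative):

```python
import inspect

def gen_based():          # generator function: uses yield
    yield

async def native():       # native coroutine function: uses async def
    pass

g = gen_based()
c = native()
print(inspect.isgenerator(g))   # True
print(inspect.iscoroutine(g))   # False
print(inspect.iscoroutine(c))   # True
c.close()   # close it so Python doesn't warn "coroutine was never awaited"
```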

Now we define a coroutine like this:

async def func():
  await some_awaitable()  # await any awaitable object here
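A complete runnable sketch of a native coroutine driven by asyncio (the names `fetch_value` and `main` are illustrative):

```python
import asyncio

async def fetch_value():
    await asyncio.sleep(0.01)   # stands in for real asynchronous work
    return 42

async def main():
    result = await fetch_value()
    print(result)               # 42

asyncio.run(main())             # requires Python 3.7+
```

Here await hands control back to the event loop while fetch_value waits, much as yield from handed control to the inner generator.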

What to do with a coroutine that is not used for asynchronous work, I don't yet know. So, that's the end of this introduction to coroutines. Thank you for your support.