1. multiprocessing (multi-process)
1. Module introduction
- Purpose: create multiple independently running processes (each process has its own memory space)
- Typical scenarios: CPU-intensive tasks such as mathematical computation and image processing
- Core principle: bypasses Python's GIL (Global Interpreter Lock) and truly utilizes multiple CPU cores
2. Worked example: parallel calculation of sums of squares
```python
import multiprocessing
import time

# Task function: sum of squares below `number`
def calculate_square(number):
    total = 0
    for n in range(number):
        total += n ** 2
    print(f"Calculation result: {total}")

if __name__ == "__main__":  # Required, otherwise Windows raises an error
    # Create 4 processes
    processes = []
    numbers = [10_000_000, 10_000_000, 10_000_000, 10_000_000]  # Four big numbers

    # Record the start time
    start_time = time.time()

    # Create and start the processes
    for num in numbers:
        p = multiprocessing.Process(target=calculate_square, args=(num,))
        processes.append(p)
        p.start()  # Start the process (returns immediately, does not wait for completion)

    # Wait for all processes to complete
    for p in processes:
        p.join()  # Block the main process until the child process ends

    # Report the total time
    print(f"Total time: {time.time() - start_time:.2f} seconds")
```
3. Implementation logic
Main process (Boss)
│
├─ Subprocess 1 (employee 1) → Independent calculation
├─ Subprocess 2 (employee 2) → Independent calculation
├─ Subprocess 3 (employee 3) → Independent calculation
└─ Subprocess 4 (employee 4) → Independent calculation
4. Things to note
- Processes cannot share variables directly; use `Queue` or `Pipe` for communication
- Each process consumes more memory (independent memory space)
- Best suited to independent tasks (such as processing multiple files at the same time)
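The `Queue`-based communication mentioned above can be sketched as follows; the `worker` function and the values passed through the queue are illustrative, not part of the original example:

```python
import multiprocessing

# Child process task: compute a square and send it back through the queue
def worker(n, queue):
    queue.put((n, n * n))

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(n, queue))
             for n in range(4)]
    for p in procs:
        p.start()
    # Collect one result per worker; Queue.get() blocks until data arrives
    results = dict(queue.get() for _ in procs)
    for p in procs:
        p.join()
    print(results)
```

Because each process has its own memory, writing to a plain global variable in the child would be invisible to the parent; the queue is the channel that crosses the process boundary.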
2. threading (multi-threading)
1. Module introduction
- Purpose: create multiple threads that share the same process memory
- Typical scenarios: I/O-bound tasks such as file reading/writing and network requests
- Core limitation: because of the GIL, only one thread can execute Python bytecode at a time
2. Worked example: downloading a file while showing a progress bar
```python
import threading
import time
import requests

# Global variable (shared between threads)
download_complete = False

def download_file(url):
    global download_complete
    print("Starting file download...")
    response = requests.get(url)
    with open("downloaded_file", "wb") as f:  # placeholder filename
        f.write(response.content)
    download_complete = True
    print("\nDownload complete!")

def show_progress():
    while not download_complete:
        print(".", end="", flush=True)  # Print progress dots without line breaks
        time.sleep(0.5)

if __name__ == "__main__":
    # Create two threads (the URL here is a placeholder)
    download_thread = threading.Thread(
        target=download_file,
        args=("https://example.com/",)
    )
    progress_thread = threading.Thread(target=show_progress)

    # Start the threads
    download_thread.start()
    progress_thread.start()

    # Wait for the download thread to complete; the progress thread then
    # exits on its own because download_complete has become True
    download_thread.join()
    progress_thread.join()
```
3. Implementation logic
Main thread
│
├─ Download thread → Execute download (GIL is released when encountering network waiting)
└─ Progress bar thread → Print progress point
4. Things to note
- Shared variables need a `Lock` to avoid data races
- Threads suit scenarios with frequent data sharing (such as GUI programs)
- Don't use multi-threading for math calculations (it will be slower)
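A minimal sketch of the `Lock` point above: several threads increment a shared counter, and the lock makes each read-modify-write step atomic (the function name and counts are illustrative):

```python
import threading

counter = 0
lock = threading.Lock()

def add_many(times):
    global counter
    for _ in range(times):
        # Without the lock, the read-modify-write below could interleave
        # between threads and lose updates
        with lock:
            counter += 1

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000: every increment was protected by the lock
```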
3. asyncio (coroutines)
1. Module introduction
- Purpose: achieve high concurrency by switching tasks within a single thread
- Typical scenarios: web servers, high-frequency I/O operations (such as crawlers)
- Core mechanism: an event loop drives coroutine switching
2. Worked example: fetching web pages in batches asynchronously
```python
import asyncio
import aiohttp  # Needs installing: pip install aiohttp

async def fetch_page(url):
    async with aiohttp.ClientSession() as session:  # Create a session
        async with session.get(url) as response:  # Make a request
            return await response.text()  # Asynchronously wait for the response body

async def main():
    urls = [
        "https://www.baidu.com",
        "https://www.taobao.com",
        "https://www.jd.com",
    ]
    # Create a task list
    tasks = [fetch_page(url) for url in urls]
    # Run all tasks concurrently
    pages = await asyncio.gather(*tasks)  # Key point: gather the tasks
    # Output the results
    for url, content in zip(urls, pages):
        print(f"{url} → length: {len(content)}")

# Start the event loop (Python 3.7+)
asyncio.run(main())
```
3. Implementation logic
Event loop (general dispatcher)
│
├─ Task 1: Request Baidu → Encounter waiting → Hang
├─ Task 2: Request Taobao → Encounter waiting → Hang
└─ Task 3: Request JD → Encounter Wait → Hang
When a request returns, resume corresponding task execution
4. Things to note
- Coroutine functions must be defined with `async def`
- Blocking operations must be awaited with `await` (otherwise they block the entire event loop)
- Must be paired with asynchronous libraries (e.g. `aiohttp` instead of `requests`)
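When only a blocking library is available, one common workaround (not part of the original example) is `asyncio.to_thread` (Python 3.9+), which runs the blocking call in a worker thread so the event loop is not stalled. A sketch using `time.sleep` as a stand-in for a blocking call like `requests.get`:

```python
import asyncio
import time

# Stand-in for a blocking call such as requests.get
def blocking_io(seconds):
    time.sleep(seconds)
    return seconds

async def main():
    start = time.perf_counter()
    # Three blocking calls run in parallel worker threads instead of serially
    results = await asyncio.gather(
        asyncio.to_thread(blocking_io, 0.2),
        asyncio.to_thread(blocking_io, 0.2),
        asyncio.to_thread(blocking_io, 0.2),
    )
    elapsed = time.perf_counter() - start
    print(results, f"{elapsed:.2f}s")
    return elapsed

elapsed = asyncio.run(main())
```

Calling `blocking_io` directly inside `main` would take about 0.6 s and freeze the loop for the whole time; offloading the three calls brings the total close to 0.2 s.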
Summary of the core differences between the three
| Characteristic | multiprocessing | threading | asyncio |
|---|---|---|---|
| Parallelism | True multi-core parallelism | Pseudo-parallel (restricted by the GIL) | Single-thread concurrency |
| Memory usage | High (independent memory spaces) | Low (shared memory) | Lowest |
| Typical scenarios | CPU-intensive tasks | I/O-intensive tasks | Very high-concurrency I/O tasks |
| Code complexity | Medium (process communication needed) | Low (but locks must be handled) | High (requires understanding async syntax) |
How to choose?
- Need to speed up mathematical calculations → choose `multiprocessing`
- Simple I/O operations (such as file reading and writing) → choose `threading`
- High-performance network requests (such as crawlers) → choose `asyncio`
- Hybrid tasks → combine them (such as multiple processes + coroutines)
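One way to sketch the "multiple processes + coroutines" combination (this pattern is an assumption, not the article's own code): the event loop handles I/O while CPU-bound work is dispatched to a process pool via `loop.run_in_executor`.

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

# CPU-bound work runs in child processes; the event loop stays free for I/O
def sum_of_squares(n):
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # Each task runs in its own worker process, bypassing the GIL
        tasks = [loop.run_in_executor(pool, sum_of_squares, 100_000)
                 for _ in range(4)]
        results = await asyncio.gather(*tasks)
    print(results)
    return results

if __name__ == "__main__":
    asyncio.run(main())
```

This keeps each tool where it is strongest: coroutines for concurrency, processes for raw computation.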
These three cases make the distinction concrete: multiple processes are like several independent factories, multiple threads are like several workers collaborating inside one factory, and coroutines are like one person with an ultra-efficient time-management method. Once this core difference is clear, you can choose the right tool for the job at hand.