Many people learn the basics of the Python language, move on to applying it, and afterwards seldom follow the language's own changes to update their knowledge; some even lack an understanding of the new features of Python 3.6, which has been out for several years.
This article lists the new features of Python versions 3.6, 3.7, and 3.8, and learning them will help you improve your knowledge of Python and keep up with the latest trends.
I. Python 3.6 new features
1. New way of formatting strings
The new way of formatting strings, i.e. adding an f or F prefix to an ordinary string literal, has an effect similar to str.format(). For example:
```python
name = "red"
print(f"He said his name is {name}.")  # 'He said his name is red.'
```
Equivalent to:

```python
print("He said his name is {name}.".format(**locals()))
```
In addition, this feature supports nested fields such as:
```python
import decimal

width = 10
precision = 4
value = decimal.Decimal("12.34567")
print(f"result: {value:{width}.{precision}}")  # 'result:      12.35'
```
2. Variable declaration syntax
You can declare a variable and specify the type as below:
```python
from typing import List, Dict

primes: List[int] = []

captain: str  # no initial value at this point

class Starship:
    stats: Dict[str, int] = {}
```
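As a quick sketch (reusing the class from the example above), these annotations are recorded at runtime and can be inspected through the __annotations__ mapping — note that they do not enforce the type at runtime:

```python
from typing import Dict, List

primes: List[int] = []

class Starship:
    stats: Dict[str, int] = {}

# Class-level annotations are stored in the __annotations__ mapping
print(Starship.__annotations__)  # {'stats': typing.Dict[str, int]}
```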
3. Underline writing of numbers
Allows the use of underscores in numbers to improve the readability of multi-digit numbers.
```python
a = 1_000_000_000_000_000  # 1000000000000000
b = 0x_FF_FF_FF_FF         # 4294967295
```
In addition to this, string formatting also supports the _ option to print more readable strings of numbers:
```python
'{:_}'.format(1000000)      # '1_000_000'
'{:_x}'.format(0xFFFFFFFF)  # 'ffff_ffff'
```
4. Asynchronous generators

In Python 3.5, the new async and await syntax was introduced to implement coroutines. However, there was a restriction: you could not use yield and await in the same function body. In Python 3.6 this restriction was relaxed, allowing the definition of asynchronous generators:

```python
import asyncio

async def ticker(delay, to):
    """Yield numbers from 0 to *to* every *delay* seconds."""
    for i in range(to):
        yield i
        await asyncio.sleep(delay)
```
5. Asynchronous comprehensions

Allows the use of async for or await syntax in list, set, and dict comprehensions:

```python
result = [i async for i in aiter() if i % 2]
result = [await fun() for fun in funcs if await condition()]
```
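To make this concrete, here is a minimal runnable sketch (the aiter generator and the values are illustrative; asyncio.run() from Python 3.7 is used for brevity):

```python
import asyncio

async def aiter():
    # A small asynchronous generator yielding 0..4
    for i in range(5):
        await asyncio.sleep(0)
        yield i

async def main():
    # Asynchronous list comprehension: keep the odd values
    return [i async for i in aiter() if i % 2]

print(asyncio.run(main()))  # [1, 3]
```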
6. Newly added modules
A new module has been added to the standard library: secrets, which generates cryptographically strong random numbers for managing passwords, account authentication, security tokens, and related secret data.
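A brief sketch of typical secrets usage (the token length and alphabet here are arbitrary choices):

```python
import secrets
import string

# A URL-safe token, e.g. for a password-reset link
token = secrets.token_urlsafe(16)

# A random 8-character alphanumeric password
alphabet = string.ascii_letters + string.digits
password = ''.join(secrets.choice(alphabet) for _ in range(8))

# Constant-time comparison, appropriate for secrets
print(secrets.compare_digest(token, token))  # True
```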
7. Other new features
- The new PYTHONMALLOC environment variable lets developers set the memory allocator, register debug hooks, and more.
- The asyncio module is more stable and more efficient, and it is no longer provisional; its APIs are now all stable.
- The typing module has also been improved and is no longer provisional.
- datetime.strftime() and datetime.strptime() began supporting the ISO 8601 directives %G, %u, and %V.
- The hashlib and ssl modules now support OpenSSL 1.1.0.
- The hashlib module now supports new hash algorithms such as BLAKE2, SHA-3, and SHAKE.
- The default encoding of the filesystem and console on Windows has been changed to UTF-8.
- The json.load() and json.loads() functions in the json module now support binary input.
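The new hash algorithms from the list above can be sketched as follows (the input bytes and digest sizes are arbitrary):

```python
import hashlib

data = b"hello"

# BLAKE2 with a configurable digest size
h1 = hashlib.blake2b(data, digest_size=16)

# SHA-3 with a fixed-length digest
h2 = hashlib.sha3_256(data)

# SHAKE is an extendable-output function: the caller picks the length
h3 = hashlib.shake_128(data)

print(h1.hexdigest(), h2.hexdigest(), h3.hexdigest(8))
```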
Refer to the official documentation for more: What's New In Python 3.6
II. Python 3.7 new features
Python 3.7 was released on June 27, 2018, and includes many new features and optimizations, adding numerous new classes for data processing, optimizations for script compilation and garbage collection, and faster asynchronous I/O, notably as follows:
- Data classes that reduce boilerplate when working with data in classes.
- A potentially backward-incompatible change involving the handling of exceptions in generators.
- An interpreter-oriented "development mode".
- Time objects with nanosecond resolution.
- A UTF-8 mode that defaults to UTF-8 encoding in the environment.
- A new built-in function that triggers the debugger.
1. New built-in function breakpoint()

Calling this built-in function is equivalent to setting a breakpoint in code, which automatically drops you into pdb debugging mode.
Setting PYTHONBREAKPOINT=0 in the environment variable ignores this function. Also, pdb is only one of many debuggers available, and you can configure which debugger you want to use by setting the new PYTHONBREAKPOINT environment variable.
Here is a simple example where the user needs to enter a number and determine if it is the same as the target number:
```python
"""Guess the Number Game"""

def guess(target):
    user_guess = input("Please enter the number you guessed >>>")
    if user_guess == target:
        return "You bet!"
    else:
        return "Wrong guess."

if __name__ == '__main__':
    a = 100
    print(guess(a))
```
Unfortunately, even when the guessed number is the same as the target number, the printout is "Wrong guess.", with no exception or error message.
To figure out what's happening, we can insert a breakpoint to debug it. In the past, this was usually done through the print method or the IDE's debugging tools, but now we can use breakpoint().
```python
"""Guess the Number Game"""

def guess(target):
    user_guess = input("Please enter the number you guessed >>>")
    breakpoint()  # add this line
    if user_guess == target:
        return "You bet!"
    else:
        return "Wrong guess."

if __name__ == '__main__':
    a = 100
    print(guess(a))
```
At the pdb prompt, we can call locals() to see all the variables in the current local scope. (pdb has a large number of commands, and you can also run normal Python statements in it.)
```
Please enter the number you guessed >>>100
> d:\work\for_test\py3_test\(7)guess()
-> if user_guess == target:
(Pdb) locals()
{'target': 100, 'user_guess': '100'}
(Pdb) type(user_guess)
<class 'str'>
```
Now we can see that target is an integer while user_guess is a string, so the two can never compare equal.
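With the bug identified, one possible fix (a sketch: convert the input string to an int before comparing) looks like this:

```python
"""Guess the Number Game, with the type bug fixed"""

def guess(target):
    user_guess = input("Please enter the number you guessed >>>")
    # input() always returns a string, so convert it before comparing
    if int(user_guess) == target:
        return "You bet!"
    else:
        return "Wrong guess."

# Entering 100 for a target of 100 now returns "You bet!"
```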
2. Types and annotations
Type annotations have become increasingly popular since Python 3.5. For those unfamiliar with type hints, it's a completely optional way to annotate code to specify the type of a variable.
What are annotations? They are syntactic support for associating metadata with variables; they can be arbitrary expressions that are computed at definition time but otherwise ignored by Python at runtime. Annotations can be any valid Python expression.
Here's an example of a comparison:
```python
# without type annotations
def foo(bar, baz):
    ...

# with type annotations
def foo(bar: 'Describe the bar', baz: print('random')) -> 'return thingy':
    ...
```
This is really Python's attempt to strengthen its dynamically typed nature, in the hope of gaining a degree of type reliability and robustness, moving closer to statically typed languages like Java.
The syntax for annotations was standardized in Python 3.5, and since then the Python community has made extensive use of annotated type hints.
However, annotations are simply a development aid that can be checked with an IDE such as PyCharm or a third-party tool such as Mypy; they are not enforced at the syntax level.
Our previous number-guessing program would have looked like this if we had added the type annotation:
```python
"""Guess the Number Game"""

def guess(target: str):
    user_guess: str = input("Please enter the number you guessed >>>")
    breakpoint()
    if user_guess == target:
        return "You bet!"
    else:
        return "Wrong guess."

if __name__ == '__main__':
    a: int = 100
    print(guess(a))
```
PyCharm will give us gray alerts for specification errors, but not red alerts for syntax errors.
There are two main issues when using annotations as type hints: startup performance and forward references.
- Computing a large number of arbitrary expressions at definition time is quite bad for startup performance, and the typing module is very slow to import.
- You can't annotate with a type that hasn't been declared yet!
Part of the reason the typing module has been so slow is that the original design goal was to implement it without modifying the core CPython interpreter. As type hinting grew more popular, this restriction was removed, meaning there is now core support for typing.
And for forward references, look at the following example:
```python
class User:
    def __init__(self, name: str, prev_user: User) -> None:
        pass
```

The problem is that the User type has not been fully defined at the point where the annotation is evaluated, so prev_user cannot be annotated with the User type.
To address this issue, Python 3.7 postponed the evaluation of annotations. Because this change is backwards incompatible, for now it must be enabled explicitly by importing annotations from __future__, and it was planned to become the default behavior only in Python 4.0.

```python
from __future__ import annotations

class User:
    def __init__(self, name: str, prev_user: User) -> None:
        pass
```
Or as in the example below:
```python
class C:
    def validate_b(self, obj: B) -> bool:
        ...

class B:
    ...
```
3. New dataclasses module
This feature is probably among the most commonly used additions since Python 3.7. What does it do?
Suppose we need to write a class like the following:
```python
from datetime import datetime
import dateutil.parser

class Article(object):
    def __init__(self, _id, author_id, title, text, tags=None,
                 created=datetime.now(), edited=datetime.now()):
        self._id = _id
        self.author_id = author_id
        self.title = title
        self.text = text
        self.tags = list() if tags is None else tags
        self.created = created
        self.edited = edited
        if type(self.created) is str:
            self.created = dateutil.parser.parse(self.created)
        if type(self.edited) is str:
            self.edited = dateutil.parser.parse(self.edited)

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            return NotImplemented
        return (self._id, self.author_id) == (other._id, other.author_id)

    def __lt__(self, other):
        if not isinstance(other, self.__class__):
            return NotImplemented
        return (self._id, self.author_id) < (other._id, other.author_id)

    def __repr__(self):
        return '{}(id={}, author_id={}, title={})'.format(
            self.__class__.__name__, self._id, self.author_id, self.title)
```

A large number of initialization attributes have to be assigned (some with default values), and a bunch of magic methods may need to be written by hand to support printing, comparing, sorting, and de-duplicating class instances.
If transformed using dataclasses, it could be written like this:
```python
from dataclasses import dataclass, field
from typing import List
from datetime import datetime
import dateutil.parser

@dataclass(order=True)  # note the order=True argument
class Article(object):
    _id: int
    author_id: int
    title: str = field(compare=False)
    text: str = field(repr=False, compare=False)
    tags: List[str] = field(default_factory=list, repr=False, compare=False)
    created: datetime = field(default=datetime.now(), repr=False, compare=False)
    edited: datetime = field(default=datetime.now(), repr=False, compare=False)

    def __post_init__(self):
        if type(self.created) is str:
            self.created = dateutil.parser.parse(self.created)
        if type(self.edited) is str:
            self.edited = dateutil.parser.parse(self.edited)
```
This makes the class not only easy to set up, but it also automatically generates beautiful strings when we create an instance and print it out. It also behaves appropriately when comparing it to other class instances. This is because dataclasses automatically generates the __init__ method for us, in addition to some other special methods such as repr, eq, and hash.
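A minimal, self-contained sketch (with a smaller, illustrative class) of what @dataclass(order=True) generates:

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Article:
    _id: int
    author_id: int
    title: str = field(compare=False)  # excluded from comparisons

a = Article(1, 7, "first")
b = Article(2, 7, "second")

print(a)       # generated __repr__: Article(_id=1, author_id=7, title='first')
print(a == b)  # False: generated __eq__ compares the participating fields
print(a < b)   # True: order=True also generates __lt__, __le__, ...
```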
Dataclasses use field() to provide default values; constructing a field() explicitly gives access to other options that change the default behavior. For example, here default_factory is set to a lambda that prompts the user for a name:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str = field(default_factory=lambda: input("enter name"))
```
4. Generator exception handling
In Python 3.7, when a StopIteration exception is raised inside a generator, it is converted to a RuntimeError exception, so that it does not silently propagate up through the application's stack frames. This means that some programs that are careless about generator behavior will now raise a RuntimeError in Python 3.7. In Python 3.6 this behavior produced a deprecation warning; in Python 3.7 it is a full-blown error.

An easy fix is to catch the StopIteration with a try/except block outside the code it propagates from. A better solution is to rethink how the generator is constructed, for example using a return statement to terminate the generator instead of manually raising StopIteration.
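A small sketch of the difference (the function names are illustrative):

```python
def bad_countdown(n):
    # Raising StopIteration manually inside a generator:
    # since Python 3.7 (PEP 479) this is converted to a RuntimeError.
    while True:
        if n == 0:
            raise StopIteration
        yield n
        n -= 1

def good_countdown(n):
    # The recommended way: simply return to end the generator.
    while n > 0:
        yield n
        n -= 1

try:
    list(bad_countdown(3))
except RuntimeError as exc:
    print("RuntimeError:", exc)

print(list(good_countdown(3)))  # [3, 2, 1]
```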
5. Development model
The Python interpreter's -X command line switch gained a new dev option that lets developers enable a "development mode" with many additional low-level runtime checks.
This runtime checking mechanism usually has a significant impact on performance, but is useful for developers during debugging.
Options activated by -X dev include:
- Debug mode for asyncio modules. This provides more detailed logging and exception handling for asynchronous operations, which can be difficult to debug or reason about.
- Memory allocator oriented debugging hooks. This is useful for those writing CPython extensions. It enables more explicit runtime checks on how CPython allocates memory and frees memory internally.
- Enable the faulthandler module, that way the traceback is always dumped out after a crash.
6. High-precision time functions

Python 3.7 adds a new family of time functions that return time values with nanosecond precision. Although Python is an interpreted language, core Python developer Victor Stinner advocated reporting time at nanosecond precision. The main reason is to avoid losing precision when converting time values recorded by other programs, such as databases.
The new time functions use the suffix _ns. For example, the nanosecond version of time.process_time() is time.process_time_ns(). Note that not all time functions have corresponding nanosecond versions.
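For example, a quick sketch comparing the two variants:

```python
import time

t_ns = time.time_ns()  # int: nanoseconds since the epoch
t_s = time.time()      # float: seconds since the epoch

print(t_ns, t_s)

# The two clocks agree to within ordinary float precision
print(abs(t_ns / 1e9 - t_s) < 1.0)  # True
```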
7. Other new features
- Dictionaries now preserve insertion order. This was an implementation detail in 3.6 but is now part of the official language specification; in many cases a plain dict can now replace collections.OrderedDict.
- .pyc files are deterministic and support repeatable builds -- that is, always produce the same byte-for-byte output for the same input file.
- New contextvars module to provide context variables for asynchronous tasks.
- DeprecationWarnings triggered by code running directly in __main__ are displayed by default again.
- Added UTF-8 mode. On Linux/Unix systems, the system locale will be ignored and UTF-8 will be used as the default encoding. On non-Linux/Unix systems, UTF-8 mode needs to be enabled with the -X utf8 option.
- Modules may define __getattr__ and __dir__ functions, which facilitates deprecation warnings, lazy importing of submodules, and so on.
- New thread-local storage C API.
- Update Unicode data to 11.0.
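The first point above (guaranteed dictionary insertion order) can be checked directly; the keys here are illustrative:

```python
d = {}
d["banana"] = 3
d["apple"] = 1
d["cherry"] = 2

# Iteration follows insertion order, by language guarantee since 3.7
print(list(d))  # ['banana', 'apple', 'cherry']
```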
III. Python 3.8 new features

Python 3.8 was released on October 14, 2019. Here are the new features Python 3.8 adds compared to 3.7.
1. Walrus assignment expressions
A new syntax, :=, assigns values to variables in a larger expression. It is affectionately known as the walrus operator because it looks like a walrus's eyes and tusks.
The "walrus operator" can make your code neater at certain times, for example:
In the following example, the assignment expression avoids having to call len () twice.
```python
if (n := len(a)) > 10:
    print(f"List is too long ({n} elements, expected <= 10)")
```
A similar benefit appears in regular expression matching, where the match object is needed twice: once to test whether a match occurred, and again to extract a subgroup:

```python
import re

discount = 0.0
if (mo := re.search(r'(\d+)% discount', advertisement)):
    discount = float(mo.group(1)) / 100.0
```
This operator can also be used with a while loop to compute a value, test whether the loop should terminate, and then use that same value again inside the loop body:

```python
# Loop over fixed length blocks
while (block := file.read(256)) != '':
    process(block)
```
Or in list comprehensions, where a value computed in the filter condition also needs to be used in the expression:

```python
[clean_name.title() for name in names
 if (clean_name := normalize('NFC', name)) in allowed_names]
```
Please try to limit the use of walrus operators to clear situations to reduce complexity and improve readability.
2. Positional-only parameters

A new function parameter syntax, /, indicates that some parameters must be supplied positionally and cannot be passed as keyword arguments.

This notation is the same as that shown by help() for C functions annotated with Larry Hastings' Argument Clinic tool.

In the following example, parameters a and b are positional-only, c and d can be either positional or keyword, and e and f must be keyword arguments:
```python
def f(a, b, /, c, d, *, e, f):
    print(a, b, c, d, e, f)
```
The following call is legal:

```python
f(10, 20, 30, d=40, e=50, f=60)
```

However, the following calls are all illegal:

```python
f(10, b=20, c=30, d=40, e=50, f=60)  # b may not be a keyword argument
f(10, 20, 30, 40, 50, f=60)          # e must be a keyword argument
```
One use case for this notation is that it allows pure Python functions to fully emulate the behavior of existing functions written in C. For example, the built-in pow() function does not accept keyword arguments:

```python
def pow(x, y, z=None, /):
    "Emulate the built-in pow() function"
    r = x ** y
    return r if z is None else r % z
```
Another use case is to preclude keyword arguments when the parameter name is not helpful. For example, the built-in len() function has the signature len(obj, /). This rules out awkward calls such as:

```python
len(obj='hello')  # The "obj" keyword argument impairs readability
```
Another benefit is that marking a parameter as positional-only allows its name to be changed in the future without breaking client code. For example, in the statistics module, the parameter name dist may be changed in the future. This makes the following function definition possible:

```python
def quantiles(dist, /, *, n=4, method='exclusive'):
    ...
```
Since the parameters to the left of / are not exposed as usable keywords, the same names remain available for use in **kwargs:

```
>>> def f(a, b, /, **kwargs):
...     print(a, b, kwargs)
...
>>> f(10, 20, a=1, b=2, c=3)  # a and b are used in two ways
10 20 {'a': 1, 'b': 2, 'c': 3}
```
This greatly simplifies the implementation of functions and methods that need to accept arbitrary keyword arguments. For example, here is an excerpt of code from the collections module:
```python
class Counter(dict):
    def __init__(self, iterable=None, /, **kwds):
        # Note "iterable" is a possible keyword argument
        ...
```
3. f-strings support =

Python 3.8 adds the = specifier to f-strings. An f-string of the form f'{expr=}' expands to the text of the expression, an equals sign, and the repr of the evaluated expression. For example:

```
>>> user = 'eric_idle'
>>> member_since = date(1975, 7, 31)
>>> f'{user=} {member_since=}'
"user='eric_idle' member_since=datetime.date(1975, 7, 31)"
```

f-string format specifiers allow finer control over how the result of the expression is displayed:

```
>>> delta = date.today() - member_since
>>> f'{user=!s}  {delta.days=:,d}'
'user=eric_idle  delta.days=16,075'
```

The = specifier outputs the whole expression, so the calculation can be shown in detail:

```
>>> print(f'{theta=}  {cos(radians(theta))=:.3f}')
theta=30  cos(radians(theta))=0.866
```
4. Improvements to the typing module
Python is a dynamically typed language, but type hints can be added via the typing module to allow third-party tools to validate Python code.Python 3.8 adds some new elements to typing so it can support more robust checking:
- The @final decorator and the Final type annotation indicate that the decorated or annotated object should not be overridden, subclassed, or reassigned.
- The Literal type limits an expression to a specific value or list of values (not necessarily values of the same type).
- TypedDict can be used to create dictionaries whose values for specific keys are restricted to particular types. Note that these restrictions are only checked by static type checkers; they are not enforced at run time.
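A brief sketch of the three additions (the names and values are illustrative; the restrictions are only visible to static checkers such as Mypy):

```python
from typing import Final, Literal, TypedDict

MAX_SIZE: Final = 9000  # a checker flags any later reassignment

def set_status(status: Literal["open", "closed"]) -> str:
    # A type checker only accepts the two literal strings here
    return f"status is {status}"

class Point(TypedDict):
    x: int
    y: int

p: Point = {"x": 1, "y": 2}
print(set_status("open"), p)
```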
5. Multi-process shared memory
The multiprocessing module has a new SharedMemory class that creates a shared memory region usable by different Python processes.
In older versions of Python, sharing data between processes could only be done by writing to a file, sending it over a network socket, or serializing it using Python's pickle module. Shared memory provides a faster way to pass data between processes, thus making multi-processor and multi-core programming in Python more efficient.
Shared memory segments can be allocated as plain byte regions, or as ShareableList objects: fixed-length list-like objects that can hold a limited range of Python objects such as numeric types, strings, bytes objects, and None.
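A minimal single-process sketch of the API (in practice the second handle would be opened in another process, attaching by name):

```python
from multiprocessing import shared_memory

# Create a shared memory block and write some bytes into it
shm = shared_memory.SharedMemory(create=True, size=16)
shm.buf[:5] = b"hello"

# A second handle attaches to the same block by name
shm2 = shared_memory.SharedMemory(name=shm.name)
print(bytes(shm2.buf[:5]))  # b'hello'

shm2.close()
shm.close()
shm.unlink()  # free the block when done
```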
6. New version of the pickle protocol
Python's pickle module provides a way to serialize and deserialize Python data structures or instances, for example saving a dictionary so it can be read back as-is later. Different Python versions support different pickle protocols, and 3.8 brings broader, more powerful, and more efficient serialization support.

The version 5 pickle protocol introduced in Python 3.8 makes it possible to pickle objects that implement Python's buffer protocol, such as bytes, memoryview, or NumPy arrays, in a way that avoids many of the memory-copy operations that previously occurred when pickling these objects.

External libraries such as NumPy and Apache Arrow support the new pickle protocol in their respective Python bindings. The new pickle is also available as a backport for Python 3.6 and 3.7, which can be installed from PyPI.
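A sketch of out-of-band pickling with protocol 5 (the buffer contents here are arbitrary):

```python
import pickle

# A large writable buffer we want to serialize without copying
blob = bytearray(b"x" * 1_000_000)

buffers = []
payload = pickle.dumps(pickle.PickleBuffer(blob), protocol=5,
                       buffer_callback=buffers.append)

# The payload itself stays tiny; the data travels out-of-band in `buffers`
print(len(payload))

restored = pickle.loads(payload, buffers=buffers)
print(bytes(restored) == bytes(blob))  # True
```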
7. Performance improvement
- Many built-in methods and functions are 20% to 50% faster, as many of them previously performed unnecessary argument conversions.
- A new opcode cache can improve the speed of specific instructions in the interpreter. However, the only speed improvement currently realized is the LOAD_GLOBAL opcode, which is 40% faster. Similar optimizations will be made in future releases.
- File copy operations such as shutil.copyfile() and shutil.copytree() now use platform-specific calls and other optimizations to improve speed.
- Newly created lists are now on average 12% smaller than before, thanks to list constructors that can be optimized if the list length is known in advance.
- Writes to class variables of new-style classes, such as class A(object), have become faster in Python 3.8, and several other common operations have also been optimized for speed.
For more detailed features, refer to the official documentation: What's New In Python 3.8.
This concludes the article's summary of the new features in Python 3.6, 3.7, and 3.8. For more on the new features of these Python versions, please search my previous posts or continue browsing the related articles below, and I hope you will support me in the future!