Intro#

Some packages come preinstalled with the Python distribution or are so common that almost every Python project uses them, directly or indirectly. We will call these packages the standard library, and they are covered in this section.

Logging#

The logging module is a built-in Python library for organizing logs. It lets you create different Logger objects, each used in a specific part of the program, so you keep control over your program's output.

For more details, check the corresponding page.
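
As a quick sketch of the "one logger per part of the program" idea, the following cell creates two named loggers; the names and messages here are made up only for this illustration.

import logging

logging.basicConfig(format="%(name)s - %(levelname)s - %(message)s")

# each part of the program gets its own named logger
db_logger = logging.getLogger("app.db")
api_logger = logging.getLogger("app.api")

db_logger.warning("connection pool is almost full")
api_logger.error("request handling failed")

Each record carries the name of the logger that produced it, so it is always clear which part of the program generated the message.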


The following example demonstrates how to create a logger, show_logger, and attach different handlers to it. Handlers define the destination of the output, and each handler has its own formatter: an object that defines the format of the records produced by that handler.

import logging

show_logger = logging.getLogger("show logger")

# first handler: timestamp, level and message separated by dashes
handler1 = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
handler1.setFormatter(formatter)

# second handler: the same fields separated by pipes, message first
handler2 = logging.StreamHandler()
formatter = logging.Formatter('%(message)s|%(asctime)s|%(levelname)s')
handler2.setFormatter(formatter)

show_logger.addHandler(handler1)
show_logger.addHandler(handler2)

show_logger.critical("This is my message")
2024-08-30 11:58:58,516 - CRITICAL - This is my message
This is my message|2024-08-30 11:58:58,516|CRITICAL

As a result, the same message is printed twice, once per handler, each time formatted according to that handler's formatter.
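
In addition to handlers and formatters, each logger has a level that filters records before they reach the handlers. The following sketch is independent of the example above; the logger name and messages are illustrative.

import logging

level_logger = logging.getLogger("level demo")
level_logger.addHandler(logging.StreamHandler())
level_logger.setLevel(logging.INFO)

level_logger.debug("this record is filtered out")  # below INFO, never reaches the handler
level_logger.info("this record is emitted")

Only the second call produces output, because the first record does not pass the logger's level check.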

Functools#

The functools module provides higher-order functions: functions that act on or return other functions. Check the corresponding documentation.
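
For example, functools.lru_cache takes a function and returns a version of it that caches its results; this is only a quick illustration, the rest of this section focuses on partial.

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # each fib(n) is computed only once and then served from the cache
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))
print(fib.cache_info())

fib.cache_info() reports how many calls were answered from the cache and how many values are currently stored.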

Partial#

With functools.partial you can create a function that wraps another function and fixes default values for some of the wrapped function's arguments. This is really useful when you need to pass the same set of arguments many times, for example in unit tests.


The following cell defines a function with three parameters and demonstrates how to call it.

from functools import partial

def some_fun(a, b, c):
    print(f"{a}, {b}, {c}")

some_fun(1, 2, 3)
1, 2, 3

The following cell creates the default_some_fun function, which is the same as some_fun but uses a=3 and c=10 as default values.

default_some_fun = partial(some_fun, a=3, c=10)

The following cell calls default_some_fun with b set to "hello".

default_some_fun(b="hello")
3, hello, 10

This is equivalent to calling some_fun(3, "hello", 10).
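
Note that partial can also bind arguments positionally. The following sketch fixes a positionally, so the remaining positional arguments in the call fill b and c.

positional_some_fun = partial(some_fun, 3)
positional_some_fun("hello", 10)
3, hello, 10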

Note: It is possible to override values specified in partial. The following code calls default_some_fun with a new value for the a parameter.

default_some_fun(a="three", b="hello")
three, hello, 10

As a result, instead of the 3 fixed by partial, we get the "three" specified in the call.

Note: With partial you can wrap any callable, not only functions; for example, you can wrap a class so that its __init__ always receives the same argument. The following cell creates a callable that behaves just like dict but always includes a default hello: 10 key-value pair.

value = partial(dict, hello=10)
value(), value(new=3)
({'hello': 10}, {'hello': 10, 'new': 3})
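
The same trick works for user-defined classes. The Point class below is a hypothetical example created only for this illustration; partial fixes its x argument so every call constructs a point on the y-axis.

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __repr__(self):
        return f"Point(x={self.x}, y={self.y})"

y_axis_point = partial(Point, x=0)
y_axis_point(y=5)
Point(x=0, y=5)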

Tracemalloc#

The tracemalloc package allows you to monitor memory allocations in Python. For more information, check the tracemalloc - Trace memory allocations page of the official documentation.

Check more details on the Tracemalloc page.


To exclude allocations associated with the Jupyter notebook itself, the example is written to a separate script.

%%writefile /tmp/my_file.py
import tracemalloc

# Start monitoring
tracemalloc.start()

value = 10

# Taking information from the calls up to the present moment
snapshot = tracemalloc.take_snapshot()
print(snapshot.statistics('lineno'))
Overwriting /tmp/my_file.py

The following cell runs the script.

!python3 /tmp/my_file.py
[]

snapshot.statistics('lineno') returns a list of memory allocation statistics grouped by line number. In this example the list is empty: value = 10 only binds a name to an already cached small-integer object, so no new allocation is recorded after tracemalloc.start().
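
For comparison, the following sketch allocates a larger object so that the snapshot actually records something; the file name /tmp/my_file_alloc.py and the size of the list are arbitrary choices for this illustration.

%%writefile /tmp/my_file_alloc.py
import tracemalloc

tracemalloc.start()

# a list of 10,000 integers is large enough to show up in the snapshot
data = [i for i in range(10_000)]

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics('lineno'):
    print(stat)

Running it with python3 /tmp/my_file_alloc.py should print a statistic for the line with the list comprehension, including the total size and number of the allocated blocks.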