Sequential#

torch.nn.Sequential allows you to connect layers into a single chain.

import torch
from random import randint
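
A Sequential applies its layers to the input one after another. Here is a minimal sketch (the layer sizes are arbitrary, chosen just for illustration):

model = torch.nn.Sequential(
    torch.nn.Linear(4, 8),  # 4 input features -> 8
    torch.nn.ReLU(),
    torch.nn.Linear(8, 2)   # 8 -> 2 output features
)

model(torch.randn(3, 4)).shape
torch.Size([3, 2])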

Display#

You can check what your Sequential contains by viewing it in a terminal or a Jupyter cell.


As an example, consider the chain of linear layers defined in the following cell:

layers_sizes = [randint(4, 10) for i in range(5)]  # five random layer sizes

layer = torch.nn.Sequential(*[
    # each Linear maps one size in the list to the next
    torch.nn.Linear(layers_sizes[i], layers_sizes[i + 1])
    for i in range(len(layers_sizes) - 1)
])
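
As a quick sanity check (a sketch; the sizes are random on each run), an input whose feature dimension matches layers_sizes[0] flows through the whole chain and comes out with layers_sizes[-1] features:

layer(torch.randn(2, layers_sizes[0])).shape  # torch.Size([2, layers_sizes[-1]])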

You can convert it into a string.

str(layer)
'Sequential(\n  (0): Linear(in_features=5, out_features=9, bias=True)\n  (1): Linear(in_features=9, out_features=5, bias=True)\n  (2): Linear(in_features=5, out_features=10, bias=True)\n  (3): Linear(in_features=10, out_features=5, bias=True)\n)'

As a result, you can apply print to the Sequential or use it as the output of a Jupyter cell.

print(layer)
layer
Sequential(
  (0): Linear(in_features=5, out_features=9, bias=True)
  (1): Linear(in_features=9, out_features=5, bias=True)
  (2): Linear(in_features=5, out_features=10, bias=True)
  (3): Linear(in_features=10, out_features=5, bias=True)
)
Sequential(
  (0): Linear(in_features=5, out_features=9, bias=True)
  (1): Linear(in_features=9, out_features=5, bias=True)
  (2): Linear(in_features=5, out_features=10, bias=True)
  (3): Linear(in_features=10, out_features=5, bias=True)
)

Defining layer names#

By defining the Sequential with a collections.OrderedDict, you can set names for the layers.


As an example, consider the following sequence of randomly named ReLU layers, with a “my_special_layer” sigmoid at the end.

from random_word import RandomWords
from collections import OrderedDict

random_words = RandomWords()

example_sequential = torch.nn.Sequential(
    OrderedDict(
        [
            # five ReLU layers keyed by random words
            (random_words.get_random_word(), torch.nn.ReLU())
            for i in range(5)
        ] +
        # a sigmoid with a fixed, meaningful name at the end
        [("my_special_layer", torch.nn.Sigmoid())]
    )
)

example_sequential
Sequential(
  (anatriptic): ReLU()
  (hookeys): ReLU()
  (splenical): ReLU()
  (starvers): ReLU()
  (harlot): ReLU()
  (my_special_layer): Sigmoid()
)

The layers are now displayed with their specific names in parentheses.
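
You can also iterate over the named layers programmatically; a short sketch using named_children():

for name, module in example_sequential.named_children():
    print(name, module)  # e.g. anatriptic ReLU() ... my_special_layer Sigmoid()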

You can now access the layers using the . operator. Let's try to access my_special_layer, which we defined for this purpose.

example_sequential.my_special_layer
Sigmoid()
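
Positional indexing still works alongside named access, so the same layer can also be retrieved by its position in the chain:

example_sequential[-1]
Sigmoid()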