ikkuna.models
Module contents
class ikkuna.models.AlexNetMini(input_shape, num_classes=1000, exporter=None)[source]
Bases: torch.nn.modules.module.Module
Reduced AlexNet (essentially just a few conv layers with ReLU and max-pooling) which attempts to adapt to arbitrary input sizes, provided they are large enough to survive the strides and convolution cutoffs.
features
Convolutional module extracting features from the input
Type: torch.nn.Module

classifier
Classifier with ReLU and dropout
Type: torch.nn.Module
__init__(input_shape, num_classes=1000, exporter=None)[source]
Parameters: input_shape (tuple) – (H, W, C); beware: this channel order differs from the (N, C, H, W) layout the model is called with
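The channel-order caveat can be made concrete. A minimal sketch, assuming the usual (N, C, H, W) layout of torch conv layers for the actual call; the concrete shape values are illustrative only:

```python
# Hypothetical values for illustration only
input_shape = (32, 32, 3)   # (H, W, C) as passed to the constructor
batch_size = 8

h, w, c = input_shape
# Tensors fed to the model use (N, C, H, W), so the channel axis moves to the front
batch_shape = (batch_size, c, h, w)
```

A batch built this way has shape (8, 3, 32, 32), even though the constructor received (32, 32, 3).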
forward(x)[source]
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
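The "large enough to survive the strides and convolution cutoffs" condition is plain arithmetic over the layer stack. A minimal sketch, assuming PyTorch's floor convention for conv/pool output sizes; the kernel/stride/padding values below are illustrative and not read from the ikkuna source:

```python
import math

def conv2d_out(size, kernel, stride=1, padding=0):
    # Spatial output size of a conv or pooling layer (PyTorch floor convention)
    return math.floor((size + 2 * padding - kernel) / stride) + 1

def survives(size, layers):
    # layers: sequence of (kernel, stride, padding) applied in order
    for kernel, stride, padding in layers:
        size = conv2d_out(size, kernel, stride, padding)
        if size < 1:
            return False   # input too small: a layer produced an empty map
    return True

# Illustrative AlexNet-like stack: conv 5/s2, pool 3/s2, conv 3/p1, pool 3/s2
stack = [(5, 2, 0), (3, 2, 0), (3, 1, 1), (3, 2, 0)]
```

For example, a 64-pixel input survives this stack (ending at a 6x6 map), while an 8-pixel input collapses to an empty map at the first pooling layer.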
class ikkuna.models.DenseNet(input_shape, growth_rate=12, block_config=(16, 16, 16), compression=0.5, num_init_features=24, bn_size=4, drop_rate=0, num_classes=10, small_inputs=True, efficient=False, exporter=None)[source]
Bases: torch.nn.modules.module.Module
DenseNet-BC model class, based on "Densely Connected Convolutional Networks" <https://arxiv.org/pdf/1608.06993.pdf>
Parameters:
- growth_rate (int) – how many filters to add each layer (k in the paper)
- block_config (list(int)) – 3 or 4 ints, how many layers in each pooling block
- num_init_features (int) – the number of filters to learn in the first convolution layer
- bn_size (int) – multiplicative factor for the number of bottleneck layers (i.e. bn_size * k features in the bottleneck layer)
- drop_rate (float) – dropout rate after each dense layer
- num_classes (int) – number of classification classes
- small_inputs (bool) – set to True if images are 32x32 (e.g. CIFAR); otherwise assumes images are larger
- efficient (bool) – set to True to use checkpointing; much more memory-efficient, but slower
- exporter (ikkuna.export.Exporter) – optional exporter to use for monitoring
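Together, num_init_features, block_config, growth_rate, and compression fix the number of feature maps reaching the classifier. A sketch of that arithmetic, assuming (as in the DenseNet-BC paper) a compressing transition after every dense block except the last:

```python
import math

def densenet_features(num_init_features, block_config, growth_rate, compression):
    # Feature maps entering the classifier of a DenseNet-BC, assuming a
    # transition (with compression) after every block except the last
    n = num_init_features
    for i, layers in enumerate(block_config):
        n += layers * growth_rate          # each layer adds growth_rate maps
        if i != len(block_config) - 1:
            n = math.floor(n * compression)  # transition halves (by default) the maps
    return n
```

With the defaults above (24 initial features, three blocks of 16 layers, growth rate 12, compression 0.5) this yields 342 feature maps.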
forward(x)[source]
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
class ikkuna.models.ResNet(BlockType, num_blocks, num_classes=10, exporter=None)[source]
Bases: torch.nn.modules.module.Module
forward(x)[source]
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
class ikkuna.models.VGG(input_shape, num_classes=1000, exporter=None)[source]
Bases: torch.nn.modules.module.Module
forward(x)[source]
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
class ikkuna.models.FullyConnectedModel(input_shape, num_classes=1000, exporter=None)[source]
Bases: torch.nn.modules.module.Module
forward(x)[source]
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
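The recurring note — call the instance, not forward() — comes down to torch.nn.Module.__call__ running registered hooks around forward. A minimal pure-Python stand-in for that mechanism (a sketch, not torch's actual implementation):

```python
class Module:
    """Minimal stand-in mimicking why torch calls hooks in __call__, not forward."""
    def __init__(self):
        self._forward_hooks = []

    def register_forward_hook(self, hook):
        self._forward_hooks.append(hook)

    def forward(self, x):
        raise NotImplementedError

    def __call__(self, x):
        out = self.forward(x)
        for hook in self._forward_hooks:
            hook(self, x, out)   # hooks run only on instance calls
        return out

class Doubler(Module):
    def forward(self, x):
        return 2 * x

seen = []
m = Doubler()
m.register_forward_hook(lambda mod, inp, out: seen.append(out))
m(3)          # runs hooks: seen == [6]
m.forward(3)  # bypasses hooks: seen unchanged
```

This is exactly why ikkuna's monitoring (which relies on registered hooks) would silently observe nothing if forward() were called directly.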