ikkuna.utils

Module contents

class ikkuna.utils.DatasetMeta[source]

Bases: tuple

Class that encapsulates a dataset and makes information about it more easily accessible.

dataset

Alias for field number 0

num_classes

Alias for field number 1

shape

Alias for field number 2

size

Number of examples in the dataset

Type:int
__getnewargs__()

Return self as a plain tuple. Used by copy and pickle.

static __new__(_cls, dataset: object, num_classes: int, shape: tuple)

Create new instance of DatasetMeta(dataset, num_classes, shape)

__repr__()

Return a nicely formatted representation string

_asdict()

Return a new OrderedDict which maps field names to their values.

classmethod _make(iterable, new=<built-in method __new__ of type object>, len=<built-in function len>)

Make a new DatasetMeta object from a sequence or iterable

_replace(**kwds)

Return a new DatasetMeta object replacing specified fields with new values
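
Since DatasetMeta is a namedtuple, it can be constructed and unpacked like any tuple. A minimal sketch, assuming a torchvision MNIST dataset and a (channels, height, width) shape convention (the exact convention is an assumption; load_dataset() below builds these objects for you):

    from torchvision import transforms
    from torchvision.datasets import MNIST
    from ikkuna.utils import DatasetMeta

    dataset = MNIST('/tmp/mnist', train=True, download=True,
                    transform=transforms.ToTensor())
    meta = DatasetMeta(dataset=dataset, num_classes=10, shape=(1, 28, 28))

    print(meta.num_classes)   # 10
    print(meta.shape)         # (1, 28, 28)
    print(meta.size)          # number of training examples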

class ikkuna.utils.ModuleTree(module, name=None, drop_name=True, recursive=True)[source]

Bases: object

_module

The wrapped module

Type:torch.nn.Module
_name

Hierarchical name for this module

Type:str
_children

Children of this module. Can be empty.

Type:list(ModuleTree)
_type_counter

Dict for keeping track of the number of child modules of each class. Used for disambiguating e.g. different successive conv layers which are children of the same sequential module.

Type:dict(int)
__init__(module, name=None, drop_name=True, recursive=True)[source]
Parameters:
  • module (torch.nn.Module) –
  • name (str or None) – If no name is given, one will be generated. If drop_name == True, this parameter is ignored.
  • drop_name (bool) – Ignore the given name and set it to ''. Useful for dropping the root name (e.g. alexnet) lest it appear in every child name.
  • recursive (bool) – Add all torch.nn.Module.named_children() as children to this tree.
preorder(depth=-1)[source]

Traverse the tree in preorder.

Parameters:depth (int) – Depth to which to traverse the tree. -1 means infinite depth.
Yields:ikkuna.utils.NamedModule – Named tuple of the module and its hierarchical name
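
A minimal traversal sketch, assuming preorder() yields NamedModule instances as documented; the generated child names depend on the disambiguation scheme described above, so the printed names are illustrative only:

    import torch.nn as nn
    from ikkuna.utils import ModuleTree

    model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
    tree = ModuleTree(model)

    for named_module in tree.preorder():
        module, name = named_module            # NamedModule unpacks like a tuple
        print(name, module.__class__.__name__)
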
class ikkuna.utils.NamedModule(module, name)[source]

Bases: tuple

module

Alias for field number 0

name

Alias for field number 1

__getnewargs__()

Return self as a plain tuple. Used by copy and pickle.

static __new__(_cls, module: torch.nn.modules.module.Module, name: str)

Create new instance of NamedModule(module, name)

__repr__()

Return a nicely formatted representation string

_asdict()

Return a new OrderedDict which maps field names to their values.

classmethod _make(iterable, new=<built-in method __new__ of type object>, len=<built-in function len>)

Make a new NamedModule object from a sequence or iterable

_replace(**kwds)

Return a new NamedModule object replacing specified fields with new values

ikkuna.utils.make_fill_polygons(xs, ys, zs)[source]

Make a set of polygons to fill the space below a line in 3D.

Returns:A collection of polygons covering the area below the line
Return type:mpl_toolkits.mplot3d.art3d.Poly3DCollection
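
A minimal usage sketch, assuming the returned Poly3DCollection can be added directly to a 3D axes; the sine curve and the constant-y plane are arbitrary example data:

    import numpy as np
    import matplotlib.pyplot as plt
    from ikkuna.utils import make_fill_polygons

    xs = np.linspace(0, 10, 50)
    ys = np.zeros_like(xs)        # keep the line in a single x-z plane
    zs = np.sin(xs) + 1.5

    fig = plt.figure()
    ax = fig.add_subplot(111, projection='3d')
    ax.plot(xs, ys, zs)
    ax.add_collection3d(make_fill_polygons(xs, ys, zs))   # shade below the line
    plt.show()
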
ikkuna.utils.available_optimizers()[source]

List names of all available torch optimizers from torch.optim.

Returns:A list of all optimizer names.
Return type:list(str)
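
A quick sketch of listing the available names (the exact contents depend on the installed torch version):

    from ikkuna.utils import available_optimizers

    print(available_optimizers())   # e.g. ['ASGD', 'Adam', 'SGD', ...]
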
ikkuna.utils.create_optimizer(model, name, **kwargs)[source]

Create an optimizer for a model's parameters. All parameters with requires_grad == False are disregarded.

Parameters:
  • model (torch.nn.Module) –
  • name (str) – Name of the optimizer
  • kwargs (dict) – All arguments which should be passed to the optimizer.
Raises:

ValueError – If superfluous kwargs are passed.
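
A minimal sketch; the hyperparameters are illustrative and must be keyword arguments the chosen optimizer accepts, since superfluous kwargs raise a ValueError:

    import torch.nn as nn
    from ikkuna.utils import create_optimizer

    model = nn.Linear(10, 2)
    # 'SGD' must be one of the names returned by available_optimizers()
    optimizer = create_optimizer(model, 'SGD', lr=0.01, momentum=0.9)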

ikkuna.utils.initialize_model(module, bias_val=0.01)[source]

Perform weight initialization on module. This is somewhat hacky since it assumes the presence of weight and/or bias fields on the module; modules without these fields are skipped.

Parameters:
  • module (torch.nn.Module) – The model
  • bias_val (float) – Constant for biases (should be small and positive)
Raises:

ValueError – If module is not one of the known models (currently AlexNetMini and DenseNet)
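
A hedged sketch; whether a plain module like the one below is accepted depends on the architectures the function knows about, so the ValueError is caught defensively:

    import torch.nn as nn
    from ikkuna.utils import initialize_model

    model = nn.Linear(10, 2)
    try:
        initialize_model(model, bias_val=0.01)
    except ValueError:
        print('architecture not supported by initialize_model')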

ikkuna.utils.load_dataset(name, train_transforms=None, test_transforms=None, **kwargs)[source]

Retrieve a dataset and determine the number of classes. The class count is estimated from the number of distinct values in the training labels.

Parameters:
  • name (str) – Name of the dataset to load
  • train_transforms – Transforms to apply to the training data
  • test_transforms – Transforms to apply to the test data
Keyword Arguments:
  • root (str) – Root directory for dataset folders. Defaults to /home/share/software/data/<name>/Images
  • formats (list) – List of file extensions for dataset folders. Defaults to ['jpg', 'png']
Returns:

Two DatasetMeta objects are returned, one for the train set and one for the test set

Return type:

tuple
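
A minimal sketch, assuming 'MNIST' is one of the dataset names the loader recognizes:

    from ikkuna.utils import load_dataset

    train_meta, test_meta = load_dataset('MNIST')
    print(train_meta.num_classes, train_meta.shape, train_meta.size)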

Submodules

ikkuna.utils.numba

This module contains functionality for numba-torch interoperability. This isn’t used in the library but may be useful in the future. Documentation is spotty.

ikkuna.utils.numba.compute_bin[source]

Compute the bins for histogram computation

ikkuna.utils.numba.dtype_min_max(dtype)[source]

Get the min and max value for a numeric dtype

ikkuna.utils.numba.numba_gpu_histogram(a_gpu, bins)[source]

Compute a histogram on a numba device array.

ikkuna.utils.numba.typestr(tensor)[source]
ikkuna.utils.numba.tensor_to_numba(tensor)[source]

Convert a tensor to a numba device array. The tensor will share the underlying CUDA storage. This works by setting the __cuda_array_interface__ attribute on the tensor.

Parameters:tensor (torch.Tensor) –
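
A hedged end-to-end sketch, assuming tensor_to_numba() returns the device array and numba_gpu_histogram() accepts an integer bin count; it requires a CUDA device and a numba build with CUDA support:

    import torch
    from ikkuna.utils.numba import tensor_to_numba, numba_gpu_histogram

    tensor = torch.rand(100000, device='cuda')
    device_array = tensor_to_numba(tensor)          # shares the tensor's CUDA storage
    histogram = numba_gpu_histogram(device_array, 100)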