limap.features.models package
Submodules
limap.features.models.base_model module
Base class for models. See mnist_net.py for an example of a model.
- class limap.features.models.base_model.BaseModel(conf={})
Bases: Module
- What the child model is expected to declare:
  - default_conf: dictionary of the default configuration of the model. It overwrites base_default_conf in BaseModel, and it is overwritten by the user-provided configuration passed to __init__. Configurations can be nested.
  - required_data_keys: list of expected keys in the input data dictionary.
  - strict_conf (optional): boolean. If false, BaseModel does not raise an error when the user provides an unknown configuration entry.
  - _init(self, conf): initialization method, where conf is the final configuration object (also accessible with self.conf). Accessing unknown configuration entries will raise an error.
  - _forward(self, data): method that returns a dictionary of batched prediction tensors based on a dictionary of batched input data tensors.
  - loss(self, pred, data): method that returns a dictionary of losses, computed from model predictions and input data. Each loss is a batch of scalars, i.e. a torch.Tensor of shape (B,). The total loss to be optimized has the key ‘total’.
  - metrics(self, pred, data): method that returns a dictionary of metrics, each as a batch of scalars.
  A minimal sketch of such a child model is given at the end of this class entry.
- base_default_conf = {'freeze_batch_normalization': False, 'name': None, 'trainable': False}
- default_conf = {}
- forward(data)
Check the data and call the _forward method of the child model.
- abstract loss(pred, data)
To be implemented by the child class.
- abstract metrics(pred, data)
To be implemented by the child class.
- required_data_keys = []
- strict_conf = True
- train(mode=True)
Sets the module in training mode.
This has any effect only on certain modules. See documentations of particular modules for details of their behaviors in training/evaluation mode, if they are affected, e.g. Dropout, BatchNorm, etc.
- Parameters:
mode (bool) – whether to set training mode (True) or evaluation mode (False). Default: True.
- Returns:
self
- Return type:
Module
- training: bool
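As a reference, here is a minimal, hypothetical child model written against the interface described above. It is a sketch, not part of the package: the class name TinyNet, the "image" data key, the single convolution, and the OmegaConf-style attribute access on conf are illustrative assumptions; only the overridden members mirror the documented BaseModel contract.

    # Minimal, hypothetical child model; TinyNet and the "image" key are
    # illustrative assumptions, not part of limap.
    import torch
    import torch.nn as nn

    from limap.features.models.base_model import BaseModel

    class TinyNet(BaseModel):
        # Merged over base_default_conf and overridden by the user conf
        # passed to __init__.
        default_conf = {"output_dim": 8}
        required_data_keys = ["image"]

        def _init(self, conf):
            # conf is the final merged configuration (also self.conf);
            # attribute access assumes an OmegaConf-style object.
            self.conv = nn.Conv2d(3, conf.output_dim, kernel_size=3, padding=1)

        def _forward(self, data):
            # Returns a dictionary of batched prediction tensors.
            return {"features": self.conv(data["image"])}

        def loss(self, pred, data):
            # Each loss is a batch of scalars of shape (B,);
            # the key "total" is the one that gets optimized.
            total = pred["features"].abs().mean(dim=(1, 2, 3))
            return {"total": total}

        def metrics(self, pred, data):
            return {"mean_activation": pred["features"].mean(dim=(1, 2, 3))}

    # Usage: the user conf overrides default_conf, and forward() checks the
    # data before dispatching to _forward().
    model = TinyNet({"output_dim": 16})
    pred = model({"image": torch.zeros(2, 3, 32, 32)})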
limap.features.models.s2dnet module
- class limap.features.models.s2dnet.AdapLayers(hypercolumn_layers: List[str], output_dim: int = 128)
Bases: Module
Small adaptation layers.
- forward(features: List[tensor])
Apply adaptation layers.
- training: bool
- class limap.features.models.s2dnet.S2DNet(conf={})
Bases: BaseModel
- default_conf = {'checkpointing': None, 'hypercolumn_layers': ['conv1_2'], 'output_dim': 128, 'pretrained': 's2dnet'}
- download_s2dnet_model(path)
- loss(pred, data)
To be implemented by the child class.
- mean = [0.485, 0.456, 0.406]
- metrics(pred, data)
To be implemented by the child class.
- std = [0.229, 0.224, 0.225]
- training: bool
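A hedged usage sketch for S2DNet follows. The "image" input key and the structure of the returned dictionary are assumptions inferred from the interface above, not guarantees of the implementation; with the default pretrained='s2dnet' entry, a checkpoint is fetched on first construction (see download_s2dnet_model).

    # Hedged sketch: instantiating S2DNet with a custom configuration.
    import torch

    from limap.features.models.s2dnet import S2DNet

    # The user conf is merged over default_conf; here two hypercolumn layers
    # are requested instead of the default ["conv1_2"]. Keeping the default
    # pretrained="s2dnet" triggers a checkpoint download on first use.
    conf = {"hypercolumn_layers": ["conv1_2", "conv3_3"], "output_dim": 128}
    model = S2DNet(conf).eval()

    # RGB batch in [0, 1]; S2DNet.mean / S2DNet.std are the ImageNet
    # normalization statistics associated with the backbone.
    image = torch.rand(1, 3, 480, 640)

    with torch.no_grad():
        pred = model({"image": image})
    # pred is a dictionary of batched tensors, presumably one adapted
    # feature map per requested hypercolumn layer.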
- limap.features.models.s2dnet.print_gpu_memory()
limap.features.models.vggnet module
- class limap.features.models.vggnet.VGGNet(conf={})
Bases: BaseModel
- default_conf = {'checkpointing': None, 'hypercolumn_layers': ['conv1_2', 'conv3_3'], 'output_dim': 128, 'pretrained': 'imagenet'}
- loss(pred, data)
To be implemented by the child class.
- mean = [0.485, 0.456, 0.406]
- metrics(pred, data)
To be implemented by the child class.
- std = [0.229, 0.224, 0.225]
- training: bool
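VGGNet exposes the same BaseModel interface; only the defaults differ (pretrained='imagenet' and the two hypercolumn layers conv1_2 and conv3_3). The sketch below is analogous to the S2DNet one and carries the same assumptions; in particular, the "image" key is not confirmed by these docs.

    # Hedged sketch: VGGNet with a reduced hypercolumn selection.
    import torch

    from limap.features.models.vggnet import VGGNet

    # Overrides only hypercolumn_layers; pretrained="imagenet" (the default)
    # presumably loads ImageNet backbone weights.
    model = VGGNet({"hypercolumn_layers": ["conv3_3"]}).eval()

    with torch.no_grad():
        pred = model({"image": torch.rand(1, 3, 256, 256)})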