py_config_runner.config_utils¶
This module contains helper methods to minimally check the input configuration inside a running script.
- class py_config_runner.config_utils.BaseConfigSchema(*, seed, debug=False)[source]¶
Base configuration schema.
- Schema defines required parameters:
seed (int)
debug (bool), default False
Create a new model by parsing and validating input data from keyword arguments. Raises pydantic.ValidationError if the input data cannot be validated to form a valid model.
__init__ uses __pydantic_self__ instead of the more common self for its first argument, to allow self to be used as a field name.
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}¶
Configuration for the model; should be a dictionary conforming to pydantic.ConfigDict.
- model_fields: ClassVar[dict[str, FieldInfo]] = {'debug': FieldInfo(annotation=bool, required=False, default=False), 'seed': FieldInfo(annotation=int, required=True)}¶
Metadata about the fields defined on the model: a mapping of field names to pydantic.fields.FieldInfo.
This replaces Model.__fields__ from Pydantic V1.
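For illustration, a minimal pure-Python stand-in for the check that BaseConfigSchema performs (the real class is a pydantic model; check_base_config here is a hypothetical helper, not part of the library):

```python
def check_base_config(config: dict) -> dict:
    """Hypothetical stand-in for BaseConfigSchema's validation:
    requires an int ``seed``; ``debug`` is optional and defaults to False."""
    if "seed" not in config:
        raise ValueError("field 'seed' is required")
    if not isinstance(config["seed"], int):
        raise TypeError("field 'seed' must be an int")
    return {"seed": config["seed"], "debug": bool(config.get("debug", False))}

print(check_base_config({"seed": 12}))
# {'seed': 12, 'debug': False}
```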
- class py_config_runner.config_utils.InferenceConfigSchema(*, seed, debug=False, device='cuda', model, data_loader, weights_path)[source]¶
Inference configuration schema with a PyTorch model. Derived from py_config_runner.config_utils.TorchModelConfigSchema. This schema is available only if torch is installed.
- Schema defines required parameters:
data_loader (torch DataLoader or Iterable)
weights_path (str)
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}¶
Configuration for the model; should be a dictionary conforming to pydantic.ConfigDict.
- model_fields: ClassVar[dict[str, FieldInfo]] = {'data_loader': FieldInfo(annotation=Union[DataLoader, Iterable], required=True), 'debug': FieldInfo(annotation=bool, required=False, default=False), 'device': FieldInfo(annotation=str, required=False, default='cuda'), 'model': FieldInfo(annotation=Module, required=True), 'seed': FieldInfo(annotation=int, required=True), 'weights_path': FieldInfo(annotation=str, required=True)}¶
Metadata about the fields defined on the model: a mapping of field names to pydantic.fields.FieldInfo.
This replaces Model.__fields__ from Pydantic V1.
- class py_config_runner.config_utils.Schema[source]¶
Base class for all custom configuration schemas.
Example:
from typing import Any, Iterable, Union

import torch
from torch.utils.data import DataLoader

from py_config_runner import ConfigObject, Schema


class TrainingConfigSchema(Schema):
    seed: int
    debug: bool = False
    device: str = "cuda"

    train_loader: Union[DataLoader, Iterable]
    num_epochs: int

    model: torch.nn.Module
    optimizer: Any
    criterion: torch.nn.Module


config = ConfigObject("/path/to/config.py")

# Check the config
TrainingConfigSchema.validate(config)
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}¶
Configuration for the model; should be a dictionary conforming to pydantic.ConfigDict.
- class py_config_runner.config_utils.TorchModelConfigSchema(*, seed, debug=False, device='cuda', model)[source]¶
Base configuration schema with a PyTorch model. Derived from py_config_runner.config_utils.BaseConfigSchema. This schema is available only if torch is installed.
- Schema defines required parameters:
device (str), default “cuda”
model (torch.nn.Module)
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}¶
Configuration for the model; should be a dictionary conforming to pydantic.ConfigDict.
- model_fields: ClassVar[dict[str, FieldInfo]] = {'debug': FieldInfo(annotation=bool, required=False, default=False), 'device': FieldInfo(annotation=str, required=False, default='cuda'), 'model': FieldInfo(annotation=Module, required=True), 'seed': FieldInfo(annotation=int, required=True)}¶
Metadata about the fields defined on the model: a mapping of field names to pydantic.fields.FieldInfo.
This replaces Model.__fields__ from Pydantic V1.
- class py_config_runner.config_utils.TrainConfigSchema(*, seed, debug=False, device='cuda', model, train_loader, num_epochs, criterion, optimizer)[source]¶
Training configuration schema with a PyTorch model. Derived from py_config_runner.config_utils.TorchModelConfigSchema. This schema is available only if torch is installed.
- Schema defines required parameters:
train_loader (torch DataLoader or Iterable)
num_epochs (int)
criterion (torch.nn.Module)
optimizer (Any)
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}¶
Configuration for the model; should be a dictionary conforming to pydantic.ConfigDict.
- model_fields: ClassVar[dict[str, FieldInfo]] = {'criterion': FieldInfo(annotation=Module, required=True), 'debug': FieldInfo(annotation=bool, required=False, default=False), 'device': FieldInfo(annotation=str, required=False, default='cuda'), 'model': FieldInfo(annotation=Module, required=True), 'num_epochs': FieldInfo(annotation=int, required=True), 'optimizer': FieldInfo(annotation=Any, required=True), 'seed': FieldInfo(annotation=int, required=True), 'train_loader': FieldInfo(annotation=Union[DataLoader, Iterable], required=True)}¶
Metadata about the fields defined on the model: a mapping of field names to pydantic.fields.FieldInfo.
This replaces Model.__fields__ from Pydantic V1.
- class py_config_runner.config_utils.TrainvalConfigSchema(*, seed, debug=False, device='cuda', model, train_loader, num_epochs, criterion, optimizer, train_eval_loader, val_loader, lr_scheduler)[source]¶
Training/validation configuration schema with a PyTorch model. Derived from py_config_runner.config_utils.TrainConfigSchema. This schema is available only if torch is installed.
- Schema defines required parameters:
train_eval_loader (torch DataLoader or Iterable)
val_loader (torch DataLoader or Iterable)
lr_scheduler (Any)
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}¶
Configuration for the model; should be a dictionary conforming to pydantic.ConfigDict.
- model_fields: ClassVar[dict[str, FieldInfo]] = {'criterion': FieldInfo(annotation=Module, required=True), 'debug': FieldInfo(annotation=bool, required=False, default=False), 'device': FieldInfo(annotation=str, required=False, default='cuda'), 'lr_scheduler': FieldInfo(annotation=Any, required=True), 'model': FieldInfo(annotation=Module, required=True), 'num_epochs': FieldInfo(annotation=int, required=True), 'optimizer': FieldInfo(annotation=Any, required=True), 'seed': FieldInfo(annotation=int, required=True), 'train_eval_loader': FieldInfo(annotation=Union[DataLoader, Iterable, NoneType], required=True), 'train_loader': FieldInfo(annotation=Union[DataLoader, Iterable], required=True), 'val_loader': FieldInfo(annotation=Union[DataLoader, Iterable], required=True)}¶
Metadata about the fields defined on the model: a mapping of field names to pydantic.fields.FieldInfo.
This replaces Model.__fields__ from Pydantic V1.
- py_config_runner.config_utils.get_params(config, required_fields)[source]¶
Helper to convert a configuration into a dictionary of the parameters listed in required_fields.
- Parameters:
config (ConfigObject) – configuration object
required_fields (Type[Schema] or Sequence of (str, type)) – Required attributes that should exist in the configuration. Accepts either a Schema class or a sequence of (name, types) pairs, e.g. (("a", (int, str)), ("b", str)).
- Returns:
a dictionary of the required parameters
- Return type:
dict
Example:
from typing import Any, Iterable, Union

import torch
from torch.utils.data import DataLoader

from py_config_runner import ConfigObject, Schema
from py_config_runner.config_utils import get_params


class TrainingConfigSchema(Schema):
    seed: int
    debug: bool = False
    device: str = "cuda"

    train_loader: Union[DataLoader, Iterable]
    num_epochs: int

    model: torch.nn.Module
    optimizer: Any
    criterion: torch.nn.Module


config = ConfigObject("/path/to/config.py")

# Get config required parameters
print(get_params(config, TrainingConfigSchema))
# > {"seed": 12, "debug": False, "device": "cuda", ...}
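The sequence-of-pairs form of required_fields can be sketched as follows. get_params_sketch is a hypothetical re-implementation for illustration only (it takes a plain dict rather than a ConfigObject), not the library's actual code:

```python
def get_params_sketch(config: dict, required_fields) -> dict:
    """Hypothetical sketch of get_params' sequence-of-pairs mode:
    each (name, type_or_types) pair is checked against the config,
    and only the listed fields are copied into the result."""
    params = {}
    for name, types in required_fields:
        if name not in config:
            raise ValueError(f"required field '{name}' is missing")
        if not isinstance(config[name], types):
            raise TypeError(f"field '{name}' must be of type {types}")
        params[name] = config[name]
    return params

cfg = {"a": 5, "b": "resnet18", "unused": None}
print(get_params_sketch(cfg, (("a", (int, str)), ("b", str))))
# {'a': 5, 'b': 'resnet18'}
```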