phlower.settings.PhlowerTrainerSetting¶

class phlower.settings.PhlowerTrainerSetting(*, loss_setting=<factory>, optimizer_setting=<factory>, scheduler_settings=<factory>, handler_settings=<factory>, n_epoch=10, random_seed=0, batch_size=1, num_workers=0, device='cpu', evaluation_for_training=True, log_every_n_epoch=1, initializer_setting=<factory>, lazy_load=True, non_blocking=False)[source]¶

Bases: BaseModel

Methods

get_early_stopping_patience()

Attributes

model_computed_fields

model_config

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

model_extra

Get extra fields set during validation.

model_fields

model_fields_set

Returns the set of fields that have been explicitly set on this model instance.

loss_setting

Setting for the loss function.

optimizer_setting

Setting for the optimizer.

scheduler_settings

Settings for schedulers.

handler_settings

Settings for handlers.

n_epoch

The number of epochs.

random_seed

Random seed.

batch_size

Batch size.

num_workers

The number of CPU cores (workers) used for data loading.

device

Device name.

evaluation_for_training

If True, evaluation on the training dataset is performed.

log_every_n_epoch

Dump log items every nth epoch.

initializer_setting

Setting for the trainer initializer.

lazy_load

If True, data is loaded lazily.

non_blocking

Parameters:
  • loss_setting (LossSetting)

  • optimizer_setting (OptimizerSetting)

  • scheduler_settings (list[SchedulerSetting])

  • handler_settings (list[EarlyStoppingSetting | UserDefinedHandlerSetting])

  • n_epoch (int)

  • random_seed (int)

  • batch_size (int)

  • num_workers (int)

  • device (str)

  • evaluation_for_training (bool)

  • log_every_n_epoch (int)

  • initializer_setting (TrainerInitializerSetting)

  • lazy_load (bool)

  • non_blocking (bool)
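
The scalar fields shown above all have defaults, so a setting can be constructed directly with keyword arguments. A minimal sketch (the values below are arbitrary examples, not recommendations):

from phlower.settings import PhlowerTrainerSetting

# Every keyword argument is optional; omitted fields fall back to the
# defaults shown in the signature above (n_epoch=10, batch_size=1, ...).
setting = PhlowerTrainerSetting(
    n_epoch=100,
    batch_size=4,
    num_workers=2,
    device="cuda:0",
    random_seed=42,
)
print(setting.n_epoch)  # 100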

batch_size: int¶

Batch size. Defaults to 1.

device: str¶

Device name. Defaults to "cpu".

evaluation_for_training: bool¶

If True, evaluation on the training dataset is performed. Defaults to True.

handler_settings: list[HandlerSettingType]¶

Settings for handlers.

initializer_setting: TrainerInitializerSetting¶

Setting for the trainer initializer.

lazy_load: bool¶

If True, data is loaded lazily. If False, all data is loaded at once. Defaults to True.

log_every_n_epoch: int¶

Dump log items every nth epoch. Defaults to 1.

loss_setting: LossSetting¶

Setting for the loss function.

model_config: ClassVar[ConfigDict] = {'extra': 'forbid', 'frozen': True}¶

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.
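
Because extra is set to 'forbid' and frozen to True, unknown keys are rejected at validation time and instances are immutable once created. A small sketch of the resulting pydantic behaviour:

from pydantic import ValidationError
from phlower.settings import PhlowerTrainerSetting

setting = PhlowerTrainerSetting(n_epoch=20)

# frozen=True: assigning to a field raises a ValidationError.
try:
    setting.n_epoch = 50
except ValidationError as e:
    print(e.errors()[0]["type"])  # 'frozen_instance'

# extra='forbid': unknown keyword arguments are rejected.
try:
    PhlowerTrainerSetting(unknown_option=1)
except ValidationError as e:
    print(e.errors()[0]["type"])  # 'extra_forbidden'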

n_epoch: int¶

The number of epochs. Defaults to 10.

num_workers: int¶

The number of CPU cores (workers) used for data loading. Defaults to 0.

optimizer_setting: OptimizerSetting¶

Setting for the optimizer.

random_seed: int¶

Random seed. Defaults to 0.

scheduler_settings: list[SchedulerSetting]¶

Settings for schedulers.
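
In practice a trainer setting is usually built from a parsed configuration dictionary (for example, the trainer block of a YAML settings file) rather than written by hand; that workflow is an assumption here, but the entry point below is standard pydantic. A sketch:

from phlower.settings import PhlowerTrainerSetting

# Hypothetical dictionary as it might look after parsing a YAML config;
# nested settings (loss, optimizer, ...) are omitted so their factory
# defaults are used.
config = {
    "n_epoch": 50,
    "batch_size": 8,
    "device": "cpu",
    "lazy_load": False,
}
setting = PhlowerTrainerSetting.model_validate(config)
print(setting.model_fields_set)  # e.g. {'n_epoch', 'batch_size', 'device', 'lazy_load'}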