Module futureexpert.checkin
Contains the models with the configuration for CHECK-IN.
Classes
class BaseConfig (**data: Any)

```python
class BaseConfig(BaseModel):
    """Basic Configuration for all models."""

    model_config = ConfigDict(extra='forbid')
```

Basic Configuration for all models.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- pydantic.main.BaseModel
Subclasses

- Column
- DataDefinition
- FileSpecification
- FilterSettings
- NewValue
- TsCreationConfig
Class variables
var model_config
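
Because `model_config = ConfigDict(extra='forbid')`, every model derived from `BaseConfig` rejects unknown fields instead of silently ignoring them. A minimal sketch (the misspelled keyword `delimeter` is deliberate):

```python
from pydantic import ValidationError

from futureexpert.checkin import FileSpecification

try:
    FileSpecification(delimeter=';')  # typo: the declared field is 'delimiter'
except ValidationError as err:
    print(err)  # the unknown field 'delimeter' is rejected because extra='forbid'
```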
class CheckInResult (**data: Any)

```python
class CheckInResult(BaseModel):
    """Result of the CHECK-IN.

    Parameters
    ----------
    time_series: builtins.list[futureexpert.shared_models.TimeSeries]
        Time series values.
    version_id: builtins.str
        Id of the time series version. Used to identify the time series.
    """
    time_series: list[TimeSeries]
    version_id: str
```

Result of the CHECK-IN.
Parameters
time_series : builtins.list[TimeSeries]
    Time series values.
version_id : builtins.str
    Id of the time series version. Used to identify the time series.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- pydantic.main.BaseModel
Class variables
var model_config
var time_series : list[TimeSeries]
var version_id : str
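
A `CheckInResult` is normally returned by a CHECK-IN call rather than constructed by hand; the sketch below only shows how such a result might be read, assuming `result` is an existing instance:

```python
# Hypothetical usage, assuming `result` is a CheckInResult returned by CHECK-IN.
print(result.version_id)        # id used later to reference this time series version
print(len(result.time_series))  # number of time series that were created
```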
class Column (**data: Any)

```python
class Column(BaseConfig):
    """Base model for the different column models (`DateColumn`, `ValueColumn` and `GroupColumn`).

    Parameters
    ----------
    name: builtins.str
        The original name of the column.
    name_new: typing.Optional[builtins.str]
        The new name of the column.
    """
    name: str
    name_new: Optional[str] = None
```

Base model for the different column models (`DateColumn`, `ValueColumn` and `GroupColumn`).

Parameters
name : builtins.str
    The original name of the column.
name_new : typing.Optional[builtins.str]
    The new name of the column.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- BaseConfig
- pydantic.main.BaseModel
Subclasses

- DateColumn
- GroupColumn
- ValueColumn
Class variables
var model_config
var name : str
var name_new : str | None
class DataDefinition (**data: Any)

```python
class DataDefinition(BaseConfig):
    """Model for the input parameter needed for the first CHECK-IN step.

    Every single column in your data must be accounted for. Each column must either be assigned
    a type (`date_column`, `value_columns`, `group_columns`) or be explicitly marked for removal
    in `remove_columns`.

    Parameters
    ----------
    date_column: futureexpert.checkin.DateColumn
        Definition of the date column. Must be a single column that contains the complete date information.
    value_columns: builtins.list[futureexpert.checkin.ValueColumn]
        Definitions of the value columns. Not all columns defined here must be used for time series
        creation; selecting a subset or combining is possible in a later step.
    group_columns: builtins.list[futureexpert.checkin.GroupColumn]
        Definitions of the group columns. Not all columns defined here must be used for time series
        creation; selecting a subset is possible in a later step. Grouping information can also be
        used to create hierarchical levels.
    remove_rows: typing.Optional[builtins.list[builtins.int]]
        Indexes of the rows to be removed before validation. Note: if the raw data was committed
        as a pandas data frame, the header is the first row (row index 0).
    remove_columns: typing.Optional[builtins.list[builtins.int]]
        Indexes of the columns to be removed before validation. Any column that is not assigned
        a type must be listed here.
    """
    date_column: DateColumn
    value_columns: list[ValueColumn]
    group_columns: list[GroupColumn] = []
    remove_rows: Optional[list[int]] = []
    remove_columns: Optional[list[int]] = []
```

Model for the input parameter needed for the first CHECK-IN step. Every single column in your data must be accounted for. Each column must either be assigned a type (`date_column`, `value_columns`, `group_columns`) or be explicitly marked for removal in `remove_columns`.

Parameters
date_column : DateColumn
    Definition of the date column. Must be a single column that contains the complete date information.
value_columns : builtins.list[ValueColumn]
    Definitions of the value columns. Not all columns defined here must be used for time series creation; selecting a subset or combining is possible in a later step.
group_columns : builtins.list[GroupColumn]
    Definitions of the group columns. Not all columns defined here must be used for time series creation; selecting a subset is possible in a later step. Grouping information can also be used to create hierarchical levels.
remove_rows : typing.Optional[builtins.list[builtins.int]]
    Indexes of the rows to be removed before validation. Note: if the raw data was committed as a pandas data frame, the header is the first row (row index 0).
remove_columns : typing.Optional[builtins.list[builtins.int]]
    Indexes of the columns to be removed before validation. Any column that is not assigned a type must be listed here.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- BaseConfig
- pydantic.main.BaseModel
Class variables
var date_column : DateColumn
var group_columns : list[GroupColumn]
var model_config
var remove_columns : list[int] | None
var remove_rows : list[int] | None
var value_columns : list[ValueColumn]
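
A minimal sketch of a `DataDefinition` for a table with a date column, one value column, one group column and one unused column. The column names, index and the strftime-style date format are illustrative assumptions, not part of the module:

```python
from futureexpert.checkin import DataDefinition, DateColumn, GroupColumn, ValueColumn

data_definition = DataDefinition(
    date_column=DateColumn(name='Date', format='%Y-%m-%d'),
    value_columns=[ValueColumn(name='Sales', name_new='sales')],
    group_columns=[GroupColumn(name='Region')],
    remove_columns=[3],  # every column without an assigned type must be removed explicitly
)
```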
class DateColumn (**data: Any)

```python
class DateColumn(Column):
    """Model for the date columns.

    Parameters
    ----------
    format: builtins.str
        The format of the date.
    name: builtins.str
    name_new: typing.Optional[builtins.str]
    """
    format: str
```

Model for the date columns.
Parameters
format : builtins.str
    The format of the date.
name : builtins.str
name_new : typing.Optional[builtins.str]
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- Column
- BaseConfig
- pydantic.main.BaseModel
Class variables
var format : str
var model_config
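
A short sketch of a `DateColumn` that also renames the column via the inherited `name_new` field; the column name and the strftime-style format code are assumptions for illustration:

```python
from futureexpert.checkin import DateColumn

# Rename the raw column 'order_date' to 'date' and declare its date format.
date_column = DateColumn(name='order_date', name_new='date', format='%d.%m.%Y')
```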
class FileSpecification (**data: Any)

```python
class FileSpecification(BaseConfig):
    """Specify the format of the CSV file.

    Parameters
    ----------
    delimiter: typing.Optional[builtins.str]
        The delimiter used to separate values.
    decimal: typing.Optional[builtins.str]
        The decimal character used in decimal numbers.
    thousands: typing.Optional[builtins.str]
        The thousands separator used in numbers.
    """
    delimiter: Optional[str] = ','
    decimal: Optional[str] = '.'
    thousands: Optional[str] = None
```

Specify the format of the CSV file.
Parameters
delimiter : typing.Optional[builtins.str]
    The delimiter used to separate values.
decimal : typing.Optional[builtins.str]
    The decimal character used in decimal numbers.
thousands : typing.Optional[builtins.str]
    The thousands separator used in numbers.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- BaseConfig
- pydantic.main.BaseModel
Class variables
var decimal : str | None
var delimiter : str | None
var model_config
var thousands : str | None
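
A sketch of a `FileSpecification` for a German-style CSV export; the concrete separators are just an example (the defaults are comma delimiter and dot decimal):

```python
from futureexpert.checkin import FileSpecification

# Semicolon-separated values, comma as decimal mark, dot as thousands
# separator (e.g. "1.234,56").
file_spec = FileSpecification(delimiter=';', decimal=',', thousands='.')
```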
class FilterSettings (**data: Any)

```python
class FilterSettings(BaseConfig):
    """Model for the filters.

    Parameters
    ----------
    type: typing.Literal['exclusion', 'inclusion']
        The type of filter: `exclusion` or `inclusion`.
    variable: builtins.str
        The column name to be used for filtering.
    items: builtins.list[builtins.str]
        The list of values to be used for filtering.
    """
    type: Literal['exclusion', 'inclusion']
    variable: str
    items: list[str]
```

Model for the filters.
Parameters
type : typing.Literal['exclusion', 'inclusion']
    The type of filter: `exclusion` or `inclusion`.
variable : builtins.str
    The column name to be used for filtering.
items : builtins.list[builtins.str]
    The list of values to be used for filtering.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- BaseConfig
- pydantic.main.BaseModel
Class variables
var items : list[str]
var model_config
var type : Literal['exclusion', 'inclusion']
var variable : str
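
A sketch of the two filter types; the column names and values are hypothetical:

```python
from futureexpert.checkin import FilterSettings

# Keep only rows whose 'Region' column is 'North' or 'South' ...
include_regions = FilterSettings(type='inclusion', variable='Region', items=['North', 'South'])

# ... and drop rows whose 'Product' column is 'Sample'.
exclude_samples = FilterSettings(type='exclusion', variable='Product', items=['Sample'])
```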
class GroupColumn (**data: Any)

```python
class GroupColumn(Column):
    """Model for the group columns.

    Parameters
    ----------
    dtype_str: typing.Optional[typing.Literal['Character']]
        The data type of the column.
    name: builtins.str
    name_new: typing.Optional[builtins.str]
    """
    dtype_str: Optional[Literal['Character']] = None
```

Model for the group columns.
Parameters
dtype_str : typing.Optional[typing.Literal['Character']]
    The data type of the column.
name : builtins.str
name_new : typing.Optional[builtins.str]
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- Column
- BaseConfig
- pydantic.main.BaseModel
Class variables
var dtype_str : Literal['Character'] | None
var model_config
class NewValue (**data: Any)

```python
class NewValue(BaseConfig):
    """Model for the value data.

    Parameters
    ----------
    first_variable: builtins.str
        The first variable name.
    operator: typing.Literal['x', '+', '-']
        The operator used for the arithmetic operation between the first and second variable.
    second_variable: builtins.str
        The second variable name.
    new_variable: builtins.str
        The new variable name.
    unit: typing.Optional[builtins.str]
        The unit.
    """
    first_variable: str
    operator: Literal['x', '+', '-']
    second_variable: str
    new_variable: str
    unit: Optional[str] = None
```

Model for the value data.
Parameters
first_variable : builtins.str
    The first variable name.
operator : typing.Literal['x', '+', '-']
    The operator used for the arithmetic operation between the first and second variable.
second_variable : builtins.str
    The second variable name.
new_variable : builtins.str
    The new variable name.
unit : typing.Optional[builtins.str]
    The unit.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- BaseConfig
- pydantic.main.BaseModel
Class variables
var first_variable : str
var model_config
var new_variable : str
var operator : Literal['x', '+', '-']
var second_variable : str
var unit : str | None
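
A sketch of a `NewValue` that derives a revenue column from two hypothetical value columns:

```python
from futureexpert.checkin import NewValue

# Derive 'Revenue' by multiplying the 'Price' and 'Quantity' value columns.
revenue = NewValue(
    first_variable='Price',
    operator='x',  # 'x' multiplies; '+' and '-' add or subtract
    second_variable='Quantity',
    new_variable='Revenue',
    unit='EUR',
)
```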
class TimeSeriesVersion (**data: Any)

```python
class TimeSeriesVersion(BaseModel):
    """Time series version created in CHECK-IN.

    Parameters
    ----------
    version_id: builtins.str
        Id of the time series version. Used to identify the time series.
    description: typing.Optional[builtins.str]
        Description of the time series version.
    creation_time_utc: datetime.datetime
        Time of the creation.
    keep_until_utc: datetime.datetime
        Last day on which the data is stored before it is deleted.
    """
    version_id: str
    description: Optional[str]
    creation_time_utc: datetime
    keep_until_utc: datetime

    @pydantic.model_validator(mode="after")
    def fix_time_stamps(self) -> Self:
        last_day = self.keep_until_utc.date()
        self.creation_time_utc = self.creation_time_utc.replace(microsecond=0)
        self.keep_until_utc = datetime.combine(last_day, datetime.min.time())
        return self
```

Time series version created in CHECK-IN.
Parameters
version_id : builtins.str
    Id of the time series version. Used to identify the time series.
description : typing.Optional[builtins.str]
    Description of the time series version.
creation_time_utc : datetime.datetime
    Time of the creation.
keep_until_utc : datetime.datetime
    Last day on which the data is stored before it is deleted.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- pydantic.main.BaseModel
Class variables
var creation_time_utc : datetime.datetime
var description : str | None
var keep_until_utc : datetime.datetime
var model_config
var version_id : str
Methods
def fix_time_stamps(self) -> Self

```python
@pydantic.model_validator(mode="after")
def fix_time_stamps(self) -> Self:
    last_day = self.keep_until_utc.date()
    self.creation_time_utc = self.creation_time_utc.replace(microsecond=0)
    self.keep_until_utc = datetime.combine(last_day, datetime.min.time())
    return self
```
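
The validator normalizes both timestamps after parsing: microseconds are stripped from `creation_time_utc`, and `keep_until_utc` is truncated to midnight of its date. A small sketch of that effect with made-up values:

```python
from datetime import datetime

from futureexpert.checkin import TimeSeriesVersion

version = TimeSeriesVersion(
    version_id='example-version-id',  # hypothetical id
    description=None,
    creation_time_utc=datetime(2024, 5, 17, 9, 30, 12, 654321),
    keep_until_utc=datetime(2024, 8, 17, 9, 30, 12),
)
print(version.creation_time_utc)  # 2024-05-17 09:30:12 (microseconds removed)
print(version.keep_until_utc)     # 2024-08-17 00:00:00 (truncated to midnight)
```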
class TsCreationConfig (**data: Any)

```python
class TsCreationConfig(BaseConfig):
    """Configuration for the creation of time series.

    Parameters
    ----------
    value_columns_to_save: builtins.list[builtins.str]
        Value columns that should be saved.
    time_granularity: typing.Literal['yearly', 'quarterly', 'monthly', 'weekly', 'daily', 'hourly', 'halfhourly']
        Target granularity of the time series.
    description: typing.Optional[builtins.str]
        A short description of the time series.
    start_date: typing.Optional[builtins.str]
        Dates before this date are excluded.
    end_date: typing.Optional[builtins.str]
        Dates after this date are excluded.
    grouping_level: builtins.list[builtins.str]
        Names of group columns that should be used as the grouping level.
    save_hierarchy: builtins.bool
        If true, interprets the given grouping levels as levels of a hierarchy and saves all
        hierarchy levels. Otherwise, no hierarchy levels are implied and only the single level
        with the given grouping is saved. E.g. if grouping_level is ['A', 'B', 'C'], time series
        for the groupings 'A', 'AB' and 'ABC' are saved. For later filtering use
        {'grouping.A': {'$exists': True}}.
    filter: builtins.list[futureexpert.checkin.FilterSettings]
        Settings for including or excluding values during time series creation.
    new_variables: builtins.list[futureexpert.checkin.NewValue]
        New value column that is a combination of two other value columns.
    missing_value_handler: typing.Literal['keepNaN', 'setToZero']
        Strategy for handling missing values during time series creation.
    """
    value_columns_to_save: list[str]
    time_granularity: Literal['yearly', 'quarterly', 'monthly', 'weekly', 'daily', 'hourly', 'halfhourly']
    description: Optional[str] = None
    grouping_level: list[str] = []
    start_date: Optional[str] = None
    end_date: Optional[str] = None
    save_hierarchy: bool = False
    filter: list[FilterSettings] = []
    new_variables: list[NewValue] = []
    missing_value_handler: Literal['keepNaN', 'setToZero'] = 'keepNaN'
```

Configuration for the creation of time series.
Parameters
value_columns_to_save : builtins.list[builtins.str]
    Value columns that should be saved.
time_granularity : typing.Literal['yearly', 'quarterly', 'monthly', 'weekly', 'daily', 'hourly', 'halfhourly']
    Target granularity of the time series.
description : typing.Optional[builtins.str]
    A short description of the time series.
start_date : typing.Optional[builtins.str]
    Dates before this date are excluded.
end_date : typing.Optional[builtins.str]
    Dates after this date are excluded.
grouping_level : builtins.list[builtins.str]
    Names of group columns that should be used as the grouping level.
save_hierarchy : builtins.bool
    If true, interprets the given grouping levels as levels of a hierarchy and saves all hierarchy levels. Otherwise, no hierarchy levels are implied and only the single level with the given grouping is saved. E.g. if grouping_level is ['A', 'B', 'C'], time series for the groupings 'A', 'AB' and 'ABC' are saved. For later filtering use {'grouping.A': {'$exists': True}}.
filter : builtins.list[FilterSettings]
    Settings for including or excluding values during time series creation.
new_variables : builtins.list[NewValue]
    New value column that is a combination of two other value columns.
missing_value_handler : typing.Literal['keepNaN', 'setToZero']
    Strategy for handling missing values during time series creation.
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- BaseConfig
- pydantic.main.BaseModel
Class variables
var description : str | None
var end_date : str | None
var filter : list[FilterSettings]
var grouping_level : list[str]
var missing_value_handler : Literal['keepNaN', 'setToZero']
var model_config
var new_variables : list[NewValue]
var save_hierarchy : bool
var start_date : str | None
var time_granularity : Literal['yearly', 'quarterly', 'monthly', 'weekly', 'daily', 'hourly', 'halfhourly']
var value_columns_to_save : list[str]
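
A sketch of a `TsCreationConfig` that builds monthly series per region from a hypothetical 'Sales' column; the column names and the date-string format are assumptions:

```python
from futureexpert.checkin import FilterSettings, TsCreationConfig

ts_config = TsCreationConfig(
    value_columns_to_save=['Sales'],
    time_granularity='monthly',
    description='Monthly sales per region',
    grouping_level=['Region'],
    start_date='2020-01-01',  # assumed date-string format
    filter=[FilterSettings(type='exclusion', variable='Product', items=['Sample'])],
    missing_value_handler='setToZero',
)
```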
class ValueColumn (**data: Any)

```python
class ValueColumn(Column):
    """Model for the value columns.

    Parameters
    ----------
    min: typing.Optional[builtins.int]
        The set minimum value of the column.
    max: typing.Optional[builtins.int]
        The set maximum value of the column.
    dtype_str: typing.Optional[typing.Literal['Numeric', 'Integer']]
        The data type of the column.
    unit: typing.Optional[builtins.str]
        The unit of the column.
    name: builtins.str
    name_new: typing.Optional[builtins.str]
    """
    min: Optional[int] = None
    max: Optional[int] = None
    dtype_str: Optional[Literal['Numeric', 'Integer']] = None
    unit: Optional[str] = None
```

Model for the value columns.
Parameters
min : typing.Optional[builtins.int]
    The set minimum value of the column.
max : typing.Optional[builtins.int]
    The set maximum value of the column.
dtype_str : typing.Optional[typing.Literal['Numeric', 'Integer']]
    The data type of the column.
unit : typing.Optional[builtins.str]
    The unit of the column.
name : builtins.str
name_new : typing.Optional[builtins.str]
Create a new model by parsing and validating input data from keyword arguments.
Raises `ValidationError` (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.

`self` is explicitly positional-only to allow `self` as a field name.

Ancestors
- Column
- BaseConfig
- pydantic.main.BaseModel
Class variables
var dtype_str : Literal['Numeric', 'Integer'] | None
var max : int | None
var min : int | None
var model_config
var unit : str | None
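
A sketch of a `ValueColumn` with bounds, data type and unit; the concrete values are illustrative:

```python
from futureexpert.checkin import ValueColumn

# Declare 'Sales' as a non-negative integer column measured in pieces.
sales = ValueColumn(name='Sales', dtype_str='Integer', min=0, unit='pieces')
```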