• plug_model() adds a parsnip model to the tidyflow.

  • drop_model() removes the model specification as well as any fitted model object. Any extra formula supplied through plug_model() is also removed. It does not remove any steps from the pre stage.

  • replace_model() first removes the model then adds the new specification to the tidyflow.
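
For intuition, and assuming `wf` is a tidyflow that already contains a model and `new_spec` is a parsnip specification (both placeholders), replace_model() behaves like dropping and then plugging:

# Sketch of the equivalence (`wf` and `new_spec` are placeholders):
wf_a <- replace_model(wf, new_spec)
wf_b <- plug_model(drop_model(wf), new_spec)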

plug_model(x, spec, formula = NULL)

drop_model(x)

replace_model(x, spec, formula = NULL)

Arguments

x

A tidyflow.

spec

A parsnip model specification.

formula

An optional formula override to specify the terms of the model. Typically, the terms are extracted from the formula or recipe preprocessing methods. However, some models (such as survival and Bayesian models) use the formula not for preprocessing but to specify the structure of the model. In those cases, a formula specifying the model structure must be passed unchanged into the model call itself. This argument is used for those purposes (see the sketch below).
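
A hedged sketch of this override, assuming parsnip's surv_reg() and the survival package are available; the spec, data, and formulas are illustrative, and the tidyflow is only constructed, not fitted:

library(parsnip)
library(survival)

# The model-structure formula (with Surv() and strata()) is passed
# unchanged to the engine, while plug_formula() handles preprocessing:
surv_spec <- set_engine(surv_reg(), "survival")

surv_wf <-
  lung %>%
  tidyflow() %>%
  plug_formula(time + status ~ age + sex) %>%
  plug_model(surv_spec, formula = Surv(time, status) ~ age + strata(sex))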

Value

x, updated with either a new or removed model.

Details

plug_model() is a required step to construct a minimal tidyflow.

Examples

library(parsnip)

# Define two competing models:
lm_model <- set_engine(linear_reg(), "lm")
regularized_model <- set_engine(lm_model, "glmnet")

# Define a minimal tidyflow: data + formula + model
wf <-
  mtcars %>%
  tidyflow() %>%
  plug_formula(mpg ~ .) %>% 
  plug_model(lm_model)

wf
#> ══ Tidyflow ════════════════════════════════════════════════════════════════════
#> Data: 32 rows x 11 columns
#> Split: None
#> Formula: mpg ~ .
#> Resample: None
#> Grid: None
#> Model:
#> Linear Regression Model Specification (regression)
#> 
#> Computational engine: lm 
#> 

# We can drop the model at any time; the remaining steps
# are left intact
drop_model(wf)
#> ══ Tidyflow ════════════════════════════════════════════════════════════════════
#> Data: 32 rows x 11 columns
#> Split: None
#> Formula: mpg ~ .
#> Resample: None
#> Grid: None
#> Model: None

# We can fit the model with `fit`:
fitted <- fit(wf)

# Extract the model if needed:
fitted %>%
  pull_tflow_fit()
#> parsnip model object
#> 
#> 
#> Call:
#> stats::lm(formula = ..y ~ ., data = data)
#> 
#> Coefficients:
#> (Intercept)          cyl         disp           hp         drat           wt  
#>    12.30337     -0.11144      0.01334     -0.02148      0.78711     -3.71530  
#>        qsec           vs           am         gear         carb  
#>     0.82104      0.31776      2.52023      0.65541     -0.19942  
#> 

# If we remove the model from the fitted `tidyflow`,
# the fit is dropped:
drop_model(fitted)
#> ══ Tidyflow ════════════════════════════════════════════════════════════════════
#> Data: 32 rows x 11 columns
#> Split: None
#> Formula: mpg ~ .
#> Resample: None
#> Grid: None
#> Model: None

# We can replace the model in the initial tidyflow with
# the regularized model using `replace_model()`:

## TODO: when https://github.com/cimentadaj/tidyflow/issues/4 is fixed
## replace wf with fitted here.

reg_fitted <-
  wf %>%
  replace_model(regularized_model) %>%
  fit()
#> Error in .check_glmnet_penalty_fit(x): For the glmnet engine, `penalty` must be a single number (or a value of `tune()`).
#>  There are 0 values for `penalty`.
#>  To try multiple values for total regularization, use the tune package.
#>  To predict multiple penalties, use `multi_predict()`

reg_fitted %>%
  pull_tflow_fit()
#> Error in is_tidyflow(x): object 'reg_fitted' not found
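
# The error above says the glmnet engine needs a single `penalty`
# value. A hedged sketch of how the replacement could then be fitted
# (assuming the glmnet package is installed; the penalty value is
# arbitrary and for illustration only):

penalized_model <- set_engine(linear_reg(penalty = 0.1), "glmnet")

wf %>%
  replace_model(penalized_model) %>%
  fit()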