tuneandtrainRobustTuneCBoost {RobustPrediction}    R Documentation
Tune and Train RobustTuneC Boosting
Description
This function tunes and trains a boosting classifier using mboost::glmboost and the "RobustTuneC" method. The function performs K-fold cross-validation on the training dataset and evaluates a sequence of boosting iterations (mstop) based on the Area Under the Curve (AUC).
Usage
tuneandtrainRobustTuneCBoost(
data,
dataext,
K = 5,
mstop_seq = seq(5, 1000, by = 5),
nu = 0.1
)
Arguments
data: Training data as a data frame. The first column should be the response variable (see the layout example after this list).
dataext: External validation data as a data frame. The first column should be the response variable.
K: Number of folds to use in cross-validation. Default is 5.
mstop_seq: A sequence of boosting iterations to consider. Default is a sequence starting at 5 and increasing by 5 each time, up to 1000.
nu: Learning rate for the boosting algorithm. Default is 0.1.
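The layout below is a hypothetical illustration only (the column names y, x1, x2 and the sample sizes are made up): both data and dataext are expected to carry a two-level response in the first column, with predictors in the remaining columns.

# Hypothetical layout: binary response first, predictors in the remaining columns
set.seed(1)
train_df <- data.frame(
  y  = factor(sample(c(0, 1), 100, replace = TRUE)),  # response (first column)
  x1 = rnorm(100),
  x2 = rnorm(100)
)
extern_df <- data.frame(
  y  = factor(sample(c(0, 1), 60, replace = TRUE)),    # response (first column)
  x1 = rnorm(60),
  x2 = rnorm(60)
)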
Details
After cross-validation, the best mstop value is selected based on the AUC, and the final boosting model is trained using this optimal mstop. The external validation dataset is then used to calculate the final AUC and assess the model's performance.
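For intuition only, the following sketch shows a plain K-fold tuning loop over mstop using mboost::glmboost and pROC::auc. It is not the package's internal implementation: it omits the RobustTuneC adjustment, i.e. the role of the external data and the chosen c value, and simply assumes a two-level factor response in the first column.

library(mboost)
library(pROC)

# Sketch of a plain K-fold tuning loop over mstop (illustration only)
cv_tune_mstop <- function(data, mstop_seq, K = 5, nu = 0.1) {
  # assumes the first column of 'data' is a two-level factor response
  fml <- as.formula(paste(names(data)[1], "~ ."))
  folds <- sample(rep(seq_len(K), length.out = nrow(data)))
  cv_auc <- sapply(mstop_seq, function(m) {
    mean(sapply(seq_len(K), function(k) {
      fit <- glmboost(fml, data = data[folds != k, ], family = Binomial(),
                      control = boost_control(mstop = m, nu = nu))
      prob <- predict(fit, newdata = data[folds == k, ], type = "response")
      as.numeric(auc(data[folds == k, 1], as.numeric(prob), quiet = TRUE))
    }))
  })
  best_mstop <- mstop_seq[which.max(cv_auc)]
  # refit on the full training data with the selected mstop
  final_fit <- glmboost(fml, data = data, family = Binomial(),
                        control = boost_control(mstop = best_mstop, nu = nu))
  list(best_mstop = best_mstop, best_model = final_fit)
}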
Value
A list containing the best number of boosting iterations ('best_mstop'), the final trained model ('best_model'), and the chosen c value ('best_c').
Examples
# Load the sample data
data(sample_data_train)
data(sample_data_extern)
# Example usage with the sample data
mstop_seq <- seq(50, 500, by = 50)
result <- tuneandtrainRobustTuneCBoost(sample_data_train, sample_data_extern, mstop_seq = mstop_seq)
result$best_mstop
result$best_model
result$best_c
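# A possible follow-up (sketch): assuming 'best_model' is an mboost::glmboost
# fit with a binomial family, predict() with type = "response" returns class
# probabilities for the external data.
pred_probs <- predict(result$best_model, newdata = sample_data_extern, type = "response")
head(pred_probs)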