DNN_model {DNNSIM}    R Documentation

Define and train the DNN-SIM model

Description

Define and train the DNN-SIM model

Usage

DNN_model(
  formula,
  data,
  model,
  num_epochs,
  verbatim = TRUE,
  CV = FALSE,
  CV_K = 10,
  bootstrap = FALSE,
  bootstrap_B = 1000,
  bootstrap_num_epochs = 100,
  U_new = FALSE,
  U_min = -4,
  U_max = 4,
  random_state = 100
)

Arguments

formula

an object of class "formula" (or one that can be coerced to that class): a symbolic description of the model to be fitted.

data

a data frame.

model

the model type. It must be one of "N-GX-D", "SN-GX-D", "ST-GX-D", "N-GX-B", "SN-GX-B", "ST-GX-B", "N-FX", "SN-FX", "ST-FX".

num_epochs

an integer. The number of complete passes through the training dataset.

verbatim

TRUE/FALSE. If verbatim is TRUE, log information from training the DNN-SIM model is printed.

CV

TRUE/FALSE. Whether to use cross-validation to measure prediction accuracy.

CV_K

an integer. The number of folds K for K-fold cross-validation.

bootstrap

TRUE/FALSE. Whether to use the bootstrap method to quantify uncertainty. The bootstrap option ONLY works for the "ST-GX-D" model.

bootstrap_B

an integer. The number of bootstrap iterations.

bootstrap_num_epochs

an integer. The number of complete passes through the training dataset in the bootstrap procedure.

U_new

TRUE/FALSE. Whether to use a user-defined grid of U values for estimating the single index function g(U) (see the sketch after this argument list).

U_min

a numeric value. The minimum of the user-defined U grid.

U_max

a numeric value. The maximum of the user-defined U grid.

random_state

an integer. The random seed for initializing the neural network.
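
The sketch below illustrates how U_new, U_min and U_max are passed together to request the estimated single index function on a user-defined grid. The model type, number of epochs, and grid limits are arbitrary choices for this illustration, and data_simulation() is used only to create a toy data set as in the Examples below.

# Illustrative only: request the g(U) estimate on a user-defined grid [-2, 2].
# The model type, num_epochs, and grid limits are arbitrary choices here.
df1      <- data_simulation(n = 100, beta = c(1, 1, 1), w = 0.3,
                            sigma = 0.1, delta = 10.0, seed = 100)
fit_grid <- DNN_model(y ~ X1 + X2 + X3 - 1, data = df1,
                      model = "N-FX", num_epochs = 5,
                      U_new = TRUE, U_min = -2, U_max = 2)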

Details

The DNNSIM model is defined as:

Y = g(\mathbf{X} \boldsymbol{\beta}) + e.

The residuals e follow a skewed t distribution, skew-normal distribution, or normal distribution. The single index function g is assumed to be monotonically increasing.
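
For intuition, the base-R sketch below simulates data of this form with an arbitrarily chosen monotone g (the standard normal CDF) and skew-normal-type residuals built from Azzalini's construction. It only illustrates the model structure and is not the package's own data_simulation().

set.seed(1)
n     <- 200
beta  <- c(1, 1, 1) / sqrt(3)        # unit-norm index coefficients (illustrative)
X     <- matrix(rnorm(n * 3), n, 3)
u     <- drop(X %*% beta)            # single index X beta
g     <- function(v) pnorm(v)        # an arbitrary monotone increasing g
delta <- 0.8                         # skewness parameter (illustrative)
e     <- 0.1 * (delta * abs(rnorm(n)) + sqrt(1 - delta^2) * rnorm(n))
y     <- g(u) + e                    # Y = g(X beta) + e
df    <- data.frame(y = y, X1 = X[, 1], X2 = X[, 2], X3 = X[, 3])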

Value

A list containing the point estimates, the estimated g function (optional), cross-validation results (optional), and bootstrap results (optional).
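
The exact component names of this list are not documented here, so the sketch below only inspects the structure of a fitted object (fit_grid from the sketch in the Arguments section) rather than assuming specific element names.

# Inspect the returned list without assuming specific element names.
str(fit_grid, max.level = 1)
names(fit_grid)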

References

Liu Q, Huang X, Bai R (2024). “Bayesian Modal Regression Based on Mixture Distributions.” Computational Statistics & Data Analysis, 108012. doi:10.1016/j.csda.2024.108012.

Examples



# check python module dependencies
if (reticulate::py_module_available("torch") &&
    reticulate::py_module_available("numpy") &&
    reticulate::py_module_available("sklearn") &&
    reticulate::py_module_available("scipy")) {

  # set the random seed
  set.seed(100)

  # simulate some data
  df1 <- data_simulation(n=100,beta=c(1,1,1),w=0.3,
                         sigma=0.1,delta=10.0,seed=100)

  # the cross-validation and bootstrap steps take a long time
  DNN_model_output <- DNN_model(y ~ X1 + X2 + X3 - 1,
                                data = df1,
                                model = "ST-GX-D",
                                num_epochs = 5,
                                verbatim = FALSE,
                                CV = TRUE,
                                CV_K = 2,
                                bootstrap = TRUE,
                                bootstrap_B = 2,
                                bootstrap_num_epochs = 5,
                                U_new = TRUE,
                                U_min = -4.0,
                                U_max = 4.0)
  print(DNN_model_output)
}



