GMMN_trained {gnn}    R Documentation
Description

Trained generative moment matching networks (GMMNs); see also the demos
GMMN_QMC_paper and GMMN_MTS_paper.
data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_C_tau_0.25") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_C_tau_0.5") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_C_tau_0.75") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_G_tau_0.25") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_G_tau_0.5") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_G_tau_0.75") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_C_tau_0.5_rot90_t4_tau_0.5") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_G_tau_0.5_rot90_t4_tau_0.5") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_MO_0.75_0.6_rot90_t4_tau_0.5") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_MO_0.75_0.6") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.25") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.5") data("GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.75") data("GMMN_dim_3_300_3_ntrn_60000_nbat_5000_nepo_300_NC21_tau_0.25_0.5") data("GMMN_dim_3_300_3_ntrn_60000_nbat_5000_nepo_300_NG21_tau_0.25_0.5") data("GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_C_tau_0.5") data("GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_G_tau_0.5") data("GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_NC23_tau_0.25_0.5_0.75") data("GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_NG23_tau_0.25_0.5_0.75") data("GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.5") data("GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_C_tau_0.5") data("GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_G_tau_0.5") data("GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_NC55_tau_0.25_0.5_0.75") data("GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_NG55_tau_0.25_0.5_0.75") data("GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.5") data("GMMN_dim_3_100_3_ntrn_4996_nbat_4996_nepo_1000_PCA_ZCB_USD") data("GMMN_dim_3_300_3_ntrn_4996_nbat_4996_nepo_1000_PCA_ZCB_USD") data("GMMN_dim_3_600_3_ntrn_4996_nbat_4996_nepo_1000_PCA_ZCB_USD") data("GMMN_dim_4_100_4_ntrn_4947_nbat_4947_nepo_1000_PCA_ZCB_CAD") data("GMMN_dim_4_300_4_ntrn_4947_nbat_4947_nepo_1000_PCA_ZCB_CAD") data("GMMN_dim_4_600_4_ntrn_4947_nbat_4947_nepo_1000_PCA_ZCB_CAD") data("GMMN_dim_5_100_5_ntrn_5478_nbat_5478_nepo_1000_FX_USD") data("GMMN_dim_5_300_5_ntrn_5478_nbat_5478_nepo_1000_FX_USD") data("GMMN_dim_5_600_5_ntrn_5478_nbat_5478_nepo_1000_FX_USD") data("GMMN_dim_6_100_6_ntrn_5478_nbat_5478_nepo_1000_FX_GBP") data("GMMN_dim_6_300_6_ntrn_5478_nbat_5478_nepo_1000_FX_GBP") data("GMMN_dim_6_600_6_ntrn_5478_nbat_5478_nepo_1000_FX_GBP")
Format

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_C_tau_0.25

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
Clayton copula (with parameter chosen such that Kendall's tau equals 0.25).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_C_tau_0.5

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
Clayton copula (with parameter chosen such that Kendall's tau equals 0.5).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_C_tau_0.75

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
Clayton copula (with parameter chosen such that Kendall's tau equals 0.75).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_G_tau_0.25

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
Gumbel copula (with parameter chosen such that Kendall's tau equals 0.25).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_G_tau_0.5

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
Gumbel copula (with parameter chosen such that Kendall's tau equals 0.5).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_G_tau_0.75

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
Gumbel copula (with parameter chosen such that Kendall's tau equals 0.75).
GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_C_tau_0.5_rot90_t4_tau_0.5

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
half-half mixture of a Clayton copula (with parameter chosen such that
Kendall's tau equals 0.5) and a t copula rotated by 90 degrees (with 4
degrees of freedom and correlation parameter chosen such that Kendall's tau
equals 0.5).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_G_tau_0.5_rot90_t4_tau_0.5

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
half-half mixture of a Gumbel copula (with parameter chosen such that
Kendall's tau equals 0.5) and a t copula rotated by 90 degrees (with 4
degrees of freedom and correlation parameter chosen such that Kendall's tau
equals 0.5).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_MO_0.75_0.6_rot90_t4_tau_0.5

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a bivariate
half-half mixture of a Marshall–Olkin copula (with alpha_1 = 0.75 and
alpha_2 = 0.60) and a t copula rotated by 90 degrees (with 4 degrees of
freedom and correlation parameter chosen such that Kendall's tau equals 0.5).
GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_MO_0.75_0.6

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
Marshall–Olkin copula (with alpha_1 = 0.75 and alpha_2 = 0.60).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.25

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
two-dimensional t copula (with 4 degrees of freedom and equi-correlation
parameter chosen such that Kendall's tau equals 0.25).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.5

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
two-dimensional t copula (with 4 degrees of freedom and equi-correlation
parameter chosen such that Kendall's tau equals 0.5).

GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.75

A raw R object representing a GMMN (input and output layers are
two-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
two-dimensional t copula (with 4 degrees of freedom and equi-correlation
parameter chosen such that Kendall's tau equals 0.75).
GMMN_dim_3_300_3_ntrn_60000_nbat_5000_nepo_300_NC21_tau_0.25_0.5

A raw R object representing a GMMN (input and output layers are
three-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
three-dimensional nested Clayton copula (with sector dimensions 2 and 1,
Kendall's tau 0.5 within the first sector and Kendall's tau 0.25 between
the two sectors).

GMMN_dim_3_300_3_ntrn_60000_nbat_5000_nepo_300_NG21_tau_0.25_0.5

A raw R object representing a GMMN (input and output layers are
three-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
three-dimensional nested Gumbel copula (with sector dimensions 2 and 1,
Kendall's tau 0.5 within the first sector and Kendall's tau 0.25 between
the two sectors).
GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_C_tau_0.5

A raw R object representing a GMMN (input and output layers are
five-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
five-dimensional Clayton copula (with parameter chosen such that Kendall's
tau equals 0.5).

GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_G_tau_0.5

A raw R object representing a GMMN (input and output layers are
five-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
five-dimensional Gumbel copula (with parameter chosen such that Kendall's
tau equals 0.5).

GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_NC23_tau_0.25_0.5_0.75

A raw R object representing a GMMN (input and output layers are
five-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
five-dimensional nested Clayton copula (with sector dimensions 2 and 3,
Kendall's tau 0.5 and 0.75 within the two sectors, respectively, and
Kendall's tau 0.25 between them).

GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_NG23_tau_0.25_0.5_0.75

A raw R object representing a GMMN (input and output layers are
five-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
five-dimensional nested Gumbel copula (with sector dimensions 2 and 3,
Kendall's tau 0.5 and 0.75 within the two sectors, respectively, and
Kendall's tau 0.25 between them).

GMMN_dim_5_300_5_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.5

A raw R object representing a GMMN (input and output layers are
five-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
five-dimensional t copula (with 4 degrees of freedom and equi-correlation
parameter chosen such that Kendall's tau equals 0.5).
GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_C_tau_0.5

A raw R object representing a GMMN (input and output layers are
10-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
10-dimensional Clayton copula (with parameter chosen such that Kendall's
tau equals 0.5).

GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_G_tau_0.5

A raw R object representing a GMMN (input and output layers are
10-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
10-dimensional Gumbel copula (with parameter chosen such that Kendall's
tau equals 0.5).

GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_NC55_tau_0.25_0.5_0.75

A raw R object representing a GMMN (input and output layers are
10-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
10-dimensional nested Clayton copula (with sector dimensions 5 and 5,
Kendall's tau 0.5 and 0.75 within the two sectors, respectively, and
Kendall's tau 0.25 between them).

GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_NG55_tau_0.25_0.5_0.75

A raw R object representing a GMMN (input and output layers are
10-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
10-dimensional nested Gumbel copula (with sector dimensions 5 and 5,
Kendall's tau 0.5 and 0.75 within the two sectors, respectively, and
Kendall's tau 0.25 between them).

GMMN_dim_10_300_10_ntrn_60000_nbat_5000_nepo_300_t4_tau_0.5

A raw R object representing a GMMN (input and output layers are
10-dimensional, the single hidden layer is 300-dimensional) trained on
60000 pseudo-samples (with batch size 5000 and 300 epochs) from a
10-dimensional t copula (with 4 degrees of freedom and equi-correlation
parameter chosen such that Kendall's tau equals 0.5).
GMMN_dim_3_100_3_ntrn_4996_nbat_4996_nepo_1000_PCA_ZCB_USD

A raw R object representing a GMMN (input and output layers are
3-dimensional, the single hidden layer is 100-dimensional) trained on 4996
pseudo-observations (with batch size 4996 and 1000 epochs) constructed
using the first 3 principal components of the standardized residuals after
de-ARMA(1,1)-GARCH(1,1)-ing a 30-dimensional US zero-coupon bond (ZCB)
yield curve series from 1995-01-01 to 2014-12-31; a rough sketch of this
construction is given after this list.

GMMN_dim_3_300_3_ntrn_4996_nbat_4996_nepo_1000_PCA_ZCB_USD

A raw R object representing a GMMN (input and output layers are
3-dimensional, the single hidden layer is 300-dimensional) trained on 4996
pseudo-observations (with batch size 4996 and 1000 epochs) constructed
using the first 3 principal components of the standardized residuals after
de-ARMA(1,1)-GARCH(1,1)-ing a 30-dimensional US ZCB yield curve series from
1995-01-01 to 2014-12-31.

GMMN_dim_3_600_3_ntrn_4996_nbat_4996_nepo_1000_PCA_ZCB_USD

A raw R object representing a GMMN (input and output layers are
3-dimensional, the single hidden layer is 600-dimensional) trained on 4996
pseudo-observations (with batch size 4996 and 1000 epochs) constructed
using the first 3 principal components of the standardized residuals after
de-ARMA(1,1)-GARCH(1,1)-ing a 30-dimensional US ZCB yield curve series from
1995-01-01 to 2014-12-31.
GMMN_dim_4_100_4_ntrn_4947_nbat_4947_nepo_1000_PCA_ZCB_CAD

A raw R object representing a GMMN (input and output layers are
4-dimensional, the single hidden layer is 100-dimensional) trained on 4947
pseudo-observations (with batch size 4947 and 1000 epochs) constructed
using the first 4 principal components of the standardized residuals after
de-ARMA(1,1)-GARCH(1,1)-ing a 120-dimensional CAD ZCB yield curve series
from 1995-01-01 to 2014-12-31.

GMMN_dim_4_300_4_ntrn_4947_nbat_4947_nepo_1000_PCA_ZCB_CAD

A raw R object representing a GMMN (input and output layers are
4-dimensional, the single hidden layer is 300-dimensional) trained on 4947
pseudo-observations (with batch size 4947 and 1000 epochs) constructed
using the first 4 principal components of the standardized residuals after
de-ARMA(1,1)-GARCH(1,1)-ing a 120-dimensional CAD ZCB yield curve series
from 1995-01-01 to 2014-12-31.

GMMN_dim_4_600_4_ntrn_4947_nbat_4947_nepo_1000_PCA_ZCB_CAD

A raw R object representing a GMMN (input and output layers are
4-dimensional, the single hidden layer is 600-dimensional) trained on 4947
pseudo-observations (with batch size 4947 and 1000 epochs) constructed
using the first 4 principal components of the standardized residuals after
de-ARMA(1,1)-GARCH(1,1)-ing a 120-dimensional CAD ZCB yield curve series
from 1995-01-01 to 2014-12-31.
GMMN_dim_5_100_5_ntrn_5478_nbat_5478_nepo_1000_FX_USD

A raw R object representing a GMMN (input and output layers are
5-dimensional, the single hidden layer is 100-dimensional) trained on 5478
pseudo-observations (with batch size 5478 and 1000 epochs) constructed
using the standardized residuals after de-ARMA(1,1)-GARCH(1,1)-ing a
5-dimensional USD foreign exchange (FX) return series from 2000-01-01 to
2014-12-31.

GMMN_dim_5_300_5_ntrn_5478_nbat_5478_nepo_1000_FX_USD

A raw R object representing a GMMN (input and output layers are
5-dimensional, the single hidden layer is 300-dimensional) trained on 5478
pseudo-observations (with batch size 5478 and 1000 epochs) constructed
using the standardized residuals after de-ARMA(1,1)-GARCH(1,1)-ing a
5-dimensional USD FX return series from 2000-01-01 to 2014-12-31.

GMMN_dim_5_600_5_ntrn_5478_nbat_5478_nepo_1000_FX_USD

A raw R object representing a GMMN (input and output layers are
5-dimensional, the single hidden layer is 600-dimensional) trained on 5478
pseudo-observations (with batch size 5478 and 1000 epochs) constructed
using the standardized residuals after de-ARMA(1,1)-GARCH(1,1)-ing a
5-dimensional USD FX return series from 2000-01-01 to 2014-12-31.
GMMN_dim_6_100_6_ntrn_5478_nbat_5478_nepo_1000_FX_GBP

A raw R object representing a GMMN (input and output layers are
6-dimensional, the single hidden layer is 100-dimensional) trained on 5478
pseudo-observations (with batch size 5478 and 1000 epochs) constructed
using the standardized residuals after de-ARMA(1,1)-GARCH(1,1)-ing a
6-dimensional GBP FX return series from 2000-01-01 to 2014-12-31.

GMMN_dim_6_300_6_ntrn_5478_nbat_5478_nepo_1000_FX_GBP

A raw R object representing a GMMN (input and output layers are
6-dimensional, the single hidden layer is 300-dimensional) trained on 5478
pseudo-observations (with batch size 5478 and 1000 epochs) constructed
using the standardized residuals after de-ARMA(1,1)-GARCH(1,1)-ing a
6-dimensional GBP FX return series from 2000-01-01 to 2014-12-31.

GMMN_dim_6_600_6_ntrn_5478_nbat_5478_nepo_1000_FX_GBP

A raw R object representing a GMMN (input and output layers are
6-dimensional, the single hidden layer is 600-dimensional) trained on 5478
pseudo-observations (with batch size 5478 and 1000 epochs) constructed
using the standardized residuals after de-ARMA(1,1)-GARCH(1,1)-ing a
6-dimensional GBP FX return series from 2000-01-01 to 2014-12-31.
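The following is a minimal sketch, not the code used to build the ZCB and FX
objects above, of how training pseudo-observations of the kind just described
could be constructed. It assumes the rugarch and copula packages and a
hypothetical numeric matrix Y of daily changes (rows are days, one column per
component series); the innovation distribution, PCA centering/scaling and other
details may differ from those used for the shipped objects and in the demos.

library(rugarch) # marginal ARMA(1,1)-GARCH(1,1) filtering (assumed package choice)
library(copula)  # pobs() for pseudo-observations

## Marginal ARMA(1,1)-GARCH(1,1) specification (innovation distribution is an assumption)
spec <- ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
                   mean.model = list(armaOrder = c(1, 1)),
                   distribution.model = "std")

## "De-ARMA(1,1)-GARCH(1,1)-ing": componentwise standardized residuals of Y (hypothetical data)
Z <- sapply(seq_len(ncol(Y)), function(j)
    as.numeric(residuals(ugarchfit(spec, data = Y[, j]), standardize = TRUE)))

## First k principal components of the standardized residuals
## (k = 3 for the USD ZCB objects, k = 4 for the CAD ZCB objects;
##  the FX objects use the standardized residuals directly)
k <- 3
PC <- prcomp(Z)$x[, seq_len(k)]

## Pseudo-observations of the (principal components of the) standardized
## residuals; data of this form are what the GMMNs above were trained on
U <- pobs(PC)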
Author(s)

Marius Hofert and Avinash Prasad
Source

The networks were trained on a GPU server with NVIDIA Tesla P100 GPUs.
References

Hofert, M., Prasad, A. and Zhu, M. (2018). Quasi-Monte Carlo for multivariate
distributions via generative neural networks. See
https://arxiv.org/abs/1811.00683 for an early version.

Hofert, M., Prasad, A. and Zhu, M. (2019). Multivariate time-series modeling
with generative neural networks. See arXiv preprint arXiv:2002.10645 for an
early version.
See Also

GMMN_model(), to_callable()
Examples

# to avoid win-builder error "Error: Installation of TensorFlow not found"

## Load a trained GMMN (see train_once())
NNname <- "GMMN_dim_2_300_2_ntrn_60000_nbat_5000_nepo_300_eqmix_C_tau_0.5_rot90_t4_tau_0.5"
NN <- read_rda(NNname, package = "gnn")
GMMN1 <- to_callable(NN)
str(GMMN1)

## Alternative
NNnm <- data(list = NNname)
GMMN2 <- to_callable(get(NNnm))
str(GMMN2)

## Check (the check-able components)
stopifnot(identical(GMMN1[names(GMMN1) != "model"],
                    GMMN2[names(GMMN2) != "model"]))

## Evaluate
set.seed(271)
N.prior <- matrix(rnorm(2000 * 2), ncol = 2)
X <- predict(GMMN1[["model"]], x = N.prior)
plot(X, xlab = expression(X[1]), ylab = expression(X[2]))
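As a follow-up (not part of the package's own examples), the GMMN output X
from above can be compared with a sample from the copula the network was
trained on, i.e. the half-half mixture of a Clayton copula (Kendall's tau 0.5)
and a t copula with 4 degrees of freedom rotated by 90 degrees (Kendall's tau
0.5 before rotation). This sketch assumes the copula package; the mixture is
sampled by hand, and which coordinate is flipped for the "rotation by 90
degrees" is a labelling convention.

library(copula)
cop.C   <- claytonCopula(iTau(claytonCopula(), 0.5))   # Clayton copula with Kendall's tau 0.5
cop.t   <- tCopula(iTau(tCopula(df = 4), 0.5), df = 4) # t_4 copula with Kendall's tau 0.5
cop.t90 <- rotCopula(cop.t, flip = c(TRUE, FALSE))     # flip the first coordinate ("90-degree rotation")
n <- nrow(X)                                           # X from the example above
ind <- runif(n) < 0.5                                  # component indicator of the half-half mixture
U <- rbind(rCopula(sum(ind),  cop.C),
           rCopula(sum(!ind), cop.t90))
opar <- par(mfrow = c(1, 2), pty = "s")
plot(U, xlab = expression(U[1]), ylab = expression(U[2]), main = "Mixture copula sample")
plot(X, xlab = expression(X[1]), ylab = expression(X[2]), main = "GMMN sample")
par(opar)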