crps_weights.Rd
Given true values and predictive samples from different models, `crps_weights` returns the stacking weights which produce the ensemble that minimises the Continuous Ranked Probability Score (CRPS).
crps_weights(data, lambda = NULL, gamma = NULL, dirichlet_alpha = 1.001)
a data.frame with the following entries (see the sketch after this list):
observed, the true observed values
predicted, predicted values corresponding to the true values in observed
model, the name of the model used to generate the corresponding predictions
geography (optional), the regions for which predictions are generated. If geography is missing, it is assumed there are no geographical differences to take into account. Internally, regions are ordered alphabetically
date, the date of the corresponding prediction / true value. Also works with numbers to indicate timesteps
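A minimal sketch of the expected input layout (the values, model names, and region name below are purely illustrative):

# hypothetical toy input: two models, two dates, a single region
toy_data <- data.frame(
  observed  = c(10, 12, 10, 12),
  predicted = c(9, 13, 11, 12),
  model     = c("model_a", "model_a", "model_b", "model_b"),
  geography = "region_1",
  date      = as.Date(c("2020-03-01", "2020-03-02", "2020-03-01", "2020-03-02"))
)
# in practice each model / date combination would contain many predictive samples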
weights given to timepoints. If lambda is NULL, the default gives more weight to recent time points with lambda[t] = 2 - (1 - t / T)^2. Note that the elements of lambda need not sum to one, as the Stan model automatically constrains the final weights to sum to one irrespective of lambda. lambda = "equal" uses equal weights.
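To illustrate the default time weighting, this short sketch (not taken from the package internals) evaluates lambda[t] = 2 - (1 - t / T)^2 for T = 10 time points:

# T in the formula corresponds to n_timepoints here
n_timepoints <- 10
lambda_default <- 2 - (1 - seq_len(n_timepoints) / n_timepoints)^2
round(lambda_default, 2)
# 1.19 1.36 1.51 1.64 1.75 1.84 1.91 1.96 1.99 2.00
# the most recent time point gets roughly 1.7 times the weight of the earliest one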
weights given to regions. If gamma is NULL, the default is equal weights for the regions. Weights are mapped to regions alphabetically, so make sure that the weights correspond to the regions in alphabetical order.
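A hedged sketch of supplying explicit region weights (the region names here are hypothetical):

# suppose the data contain the regions "north", "south" and "west";
# ordered alphabetically they are "north", "south", "west", so
# gamma[1] applies to "north", gamma[2] to "south", gamma[3] to "west"
gamma <- c(2, 1, 1)  # give twice the weight to "north"
# weights <- crps_weights(data, gamma = gamma)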
alpha parameter of the Dirichlet prior on the weights. The default is 1.001
Returns a vector with the model weights.
Gneiting, Tilmann and Raftery, Adrian E. (2007). Strictly Proper Scoring Rules, Prediction, and Estimation. Journal of the American Statistical Association, Volume 102, Issue 477.
Yao, Yuling, Vehtari, Aki, Simpson, Daniel, and Gelman, Andrew (2018). Using Stacking to Average Bayesian Predictive Distributions. Bayesian Analysis, 13(3), pp. 917–1003.
if (FALSE) { # \dontrun{
library("data.table")

# split the example data into a training and a test period
splitdate <- as.Date("2020-03-28")
data <- setDT(example_data)
traindata <- data[date <= splitdate]
testdata <- data[date > splitdate]

# estimate stacking weights on the training period
weights <- crps_weights(traindata)
} # }