require(IgGeneUsage)
require(rstan)
require(knitr)
require(ggplot2)
require(ggforce)
require(ggrepel)
require(reshape2)
require(patchwork)
Decoding the properties of immune receptor repertoires (IRRs) is key to understanding how our adaptive immune system responds to challenges, such as viral infection or cancer. One important quantitative property of IRRs is their immunoglobulin (Ig) gene usage, i.e. how often the different Ig genes that make up the immune receptors are used in a given IRR. Furthermore, we may ask: is there differential gene usage (DGU) between IRRs from different biological conditions (e.g. healthy vs. tumor)?
Both of these questions can be answered quantitatively by IgGeneUsage.
The main input of IgGeneUsage is a data.frame that has the following columns:

- individual_id: name of the repertoire (e.g. a patient or sample label)
- condition: name of the biological condition to which the repertoire belongs (e.g. healthy or tumor)
- gene_name: gene identifier
- gene_usage_count: usage count of the gene in the repertoire
IgGeneUsage transforms the input data as follows.
First, given \(R\) repertoires with \(G\) genes each, IgGeneUsage generates a gene usage matrix \(Y^{R \times G}\). Row sums in \(Y\) define the total usage (\(N\)) in each repertoire.
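To make the expected input and this transformation concrete, here is a minimal sketch using a small, hypothetical data.frame (IgGeneUsage performs this step internally, so it is not required from the user):

# sketch: long-format input and the corresponding R x G usage matrix Y
ud <- data.frame(individual_id = rep(c("I_1", "I_2"), each = 3),
                 condition = rep(c("C_1", "C_2"), each = 3),
                 gene_name = rep(c("G_1", "G_2", "G_3"), times = 2),
                 gene_usage_count = c(10, 0, 5, 7, 2, 0))
Y <- reshape2::acast(data = ud,
                     formula = individual_id ~ gene_name,
                     value.var = "gene_usage_count",
                     fill = 0)
N <- rowSums(Y) # total usage per repertoire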
Second, for the analysis of DGU between biological conditions, we use a Bayesian model (\(M\)) for zero-inflated beta-binomial regression. Empirically, we know that Ig gene usage data can be noisy and also not exhaustive, i.e. some Ig genes that are systematically rearranged at low probability might not be sampled, and certain Ig genes are not encoded (or are dysfunctional) in some individuals. \(M\) can fit over-dispersed and zero-inflated Ig gene usage data.
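To make this distributional assumption concrete, here is a minimal simulation sketch of zero-inflated beta-binomial counts for a single gene (hypothetical parameter values and a simplified parameterization, not necessarily the one used internally by \(M\)):

# sketch: simulate zero-inflated beta-binomial usage counts of one gene
set.seed(1)
n_rep <- 10   # number of repertoires
n_tot <- 1000 # total usage (N) per repertoire
mu    <- 0.05 # mean usage probability of the gene
phi   <- 50   # precision: smaller phi -> stronger over-dispersion
kappa <- 0.1  # zero-inflation probability
p <- rbeta(n = n_rep, shape1 = mu * phi, shape2 = (1 - mu) * phi)
y <- rbinom(n = n_rep, size = n_tot, prob = p) # beta-binomial counts
y[runif(n_rep) < kappa] <- 0                   # structural zeros
y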
In the output of IgGeneUsage, we report the mean effect size (es or \(\gamma\)) and its 95% highest density interval (HDI). Genes with \(\gamma \neq 0\) (e.g. if 95% HDI of \(\gamma\) excludes 0) are most likely to experience differential usage. Additionally, we report the probability of differential gene usage (\(\pi\)): \[\begin{align} \pi = 2 \cdot \max\left(\int_{\gamma = -\infty}^{0} p(\gamma)\mathrm{d}\gamma, \int_{\gamma = 0}^{\infty} p(\gamma)\mathrm{d}\gamma\right) - 1 \end{align}\] with \(\pi = 1\) for genes with strong differential usage, and \(\pi = 0\) for genes with negligible differential gene usage. Both metrics are computed based on the posterior distribution of \(\gamma\), and are thus related.
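Given posterior draws of \(\gamma\) for one gene, \(\pi\) can be approximated by Monte Carlo integration; here is a minimal sketch with a hypothetical vector of draws (IgGeneUsage reports \(\pi\) directly, so this is only for illustration):

# sketch: approximate pi from posterior draws of gamma
gamma_draws <- rnorm(n = 4000, mean = 0.4, sd = 0.2) # hypothetical posterior draws
pi_hat <- 2 * max(mean(gamma_draws < 0), mean(gamma_draws > 0)) - 1
pi_hat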
IgGeneUsage has several built-in Ig gene usage datasets. Some were obtained from published studies and others were simulated.

Let's look at the simulated dataset d_zibb_3. This dataset was generated by a zero-inflated beta-binomial (ZIBB) model, and IgGeneUsage was designed to fit ZIBB-distributed data.
data("d_zibb_3", package = "IgGeneUsage")
knitr::kable(head(d_zibb_3))
individual_id | gene_name | gene_usage_count | condition |
---|---|---|---|
I_1 | G_1 | 29 | C_1 |
I_1 | G_2 | 135 | C_1 |
I_1 | G_3 | 6 | C_1 |
I_1 | G_4 | 52 | C_1 |
I_1 | G_5 | 68 | C_1 |
I_1 | G_6 | 41 | C_1 |
We can also visualize d_zibb_3 with ggplot:
ggplot(data = d_zibb_3)+
geom_point(aes(x = gene_name, y = gene_usage_count, col = condition),
position = position_dodge(width = .7), shape = 21)+
theme_bw(base_size = 11)+
ylab(label = "Gene usage [count]")+
xlab(label = '')+
theme(legend.position = "top")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
As its main input, IgGeneUsage uses a data.frame formatted like d_zibb_3. Other input parameters allow you to configure specific settings of the rstan sampler.

In this example, we analyze d_zibb_3 with 3 MCMC chains of 1,500 iterations each, including 300 warm-ups, using a single CPU core (hint: for parallel chain execution set the parameter mcmc_cores = 3). We report for each model parameter its mean and 95% highest density interval (HDI).
Important remark: you should run DGU analyses using the default IgGeneUsage parameters. If warnings or errors are reported with regard to the MCMC sampling, please consult the Stan manual (https://mc-stan.org/misc/warnings.html) and adjust the inputs accordingly. If the warnings persist, please submit an issue with a reproducible script at the Bioconductor support site or on GitHub (https://github.com/snaketron/IgGeneUsage/issues).
M <- DGU(ud = d_zibb_3,        # input data
         mcmc_warmup = 300,    # how many MCMC warm-ups per chain (default: 500)
         mcmc_steps = 1500,    # how many MCMC steps per chain (default: 1,500)
         mcmc_chains = 3,      # how many MCMC chains to run (default: 4)
         mcmc_cores = 1,       # how many CPU cores to use (for parallel chains)
         hdi_lvl = 0.95,       # highest density interval level (default: 0.95)
         adapt_delta = 0.8,    # MCMC target acceptance rate (default: 0.95)
         max_treedepth = 10)   # tree depth evaluated at each step (default: 12)
[rstan sampling progress for model 'dgu': 3 chains, 1500 iterations each (300 warm-up); elapsed times of about 7, 16 and 10 seconds per chain]
In the output of DGU, we provide the following objects:

- dgu and dgu_prob (main results of IgGeneUsage): quantitative DGU summary on a log- and probability-scale, respectively
- gu: condition-specific relative gene usage (GU) of each gene
- theta: probabilities of gene usage in each sample
- ppc: posterior predictive checks data (see section 'Model checking')
- ud: processed Ig gene usage data
- fit: rstan ('stanfit') object of the fitted model \(\rightarrow\) used for model checks (see section 'Model checking')

summary(M)
         Length Class      Mode
dgu       9     data.frame list
dgu_prob  9     data.frame list
gu        8     data.frame list
theta    12     data.frame list
ppc       2     -none-     list
ud       24     -none-     list
fit       1     stanfit    S4
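The individual elements can be accessed from the returned list with $, for instance the two main DGU summaries (a brief illustration):

# inspect the main DGU summaries
head(M$dgu)      # DGU summary on a log-scale
head(M$dgu_prob) # DGU summary on a probability-scale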
Check your model fit. For this, you can use the object fit.
rstan::check_hmc_diagnostics(M$fit)
Divergences:

Tree depth:

Energy:
rstan::stan_rhat(object = M$fit)|rstan::stan_ess(object = M$fit)
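In addition to these visual diagnostics, the convergence summaries can be extracted numerically from the stanfit object; a minimal sketch using rstan's summary method:

# sketch: numerical convergence diagnostics from the stanfit object
fit_summary <- rstan::summary(M$fit)$summary
range(fit_summary[, "Rhat"], na.rm = TRUE)  # Rhat should be close to 1
min(fit_summary[, "n_eff"], na.rm = TRUE)   # smallest effective sample size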
The model used by IgGeneUsage is generative, i.e. with the model we can generate the usage of each Ig gene in a given repertoire (y-axis). Error bars show the 95% HDI of the mean posterior prediction. The predictions can be compared with the observed data (x-axis). Points near the diagonal \(\rightarrow\) accurate prediction.
ggplot(data = M$ppc$ppc_rep)+
facet_wrap(facets = ~individual_id, ncol = 5)+
geom_abline(intercept = 0, slope = 1, linetype = "dashed", col = "darkgray")+
geom_errorbar(aes(x = observed_count, y = ppc_mean_count,
ymin = ppc_L_count, ymax = ppc_H_count), col = "darkgray")+
geom_point(aes(x = observed_count, y = ppc_mean_count), size = 1)+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
xlab(label = "Observed usage [counts]")+
ylab(label = "PPC usage [counts]")
Prediction of generalized gene usage within a biological condition is also possible. We show the predictions (y-axis) of the model, and compare them against the observed mean usage (x-axis). If the points are near the diagonal \(\rightarrow\) accurate prediction. Errors are 95% HDIs of the mean.
ggplot(data = M$ppc$ppc_condition)+
geom_errorbar(aes(x = gene_name, ymin = ppc_L_prop*100,
ymax = ppc_H_prop*100, col = condition),
position = position_dodge(width = 0.65), width = 0.1)+
geom_point(aes(x = gene_name, y = ppc_mean_prop*100,col = condition),
position = position_dodge(width = 0.65))+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
xlab(label = "Observed usage [%]")+
ylab(label = "PPC usage [%]")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
Each row of dgu summarizes the degree of DGU observed for a specific Ig gene. Two metrics are reported:

- es (also referred to as \(\gamma\)): effect size on DGU, where contrast gives the direction of the effect (e.g. tumor - healthy or healthy - tumor)
- pmax (also referred to as \(\pi\)): probability of DGU (parameter \(\pi\) from model \(M\))

For es we also report the mean, median, standard error (se), standard deviation (sd), L (low bound of the 95% HDI) and H (high bound of the 95% HDI).
kable(x = head(M$dgu), row.names = FALSE, digits = 2)
es_mean | es_mean_se | es_sd | es_median | es_L | es_H | contrast | gene_name | pmax |
---|---|---|---|---|---|---|---|---|
0.18 | 0.01 | 0.29 | 0.11 | -0.23 | 0.94 | C_1-vs-C_2 | G_1 | 0.47 |
-0.01 | 0.00 | 0.20 | -0.01 | -0.44 | 0.41 | C_1-vs-C_2 | G_4 | 0.03 |
-0.08 | 0.00 | 0.23 | -0.05 | -0.63 | 0.33 | C_1-vs-C_2 | G_3 | 0.25 |
-0.06 | 0.00 | 0.17 | -0.04 | -0.44 | 0.26 | C_1-vs-C_2 | G_2 | 0.26 |
0.05 | 0.00 | 0.18 | 0.04 | -0.29 | 0.44 | C_1-vs-C_2 | G_5 | 0.25 |
-0.03 | 0.00 | 0.19 | -0.02 | -0.46 | 0.33 | C_1-vs-C_2 | G_8 | 0.12 |
We know that the values of \(\gamma\) and \(\pi\) are related to each other. Let's visualize them for all genes (each shown as a point). Names are shown for genes associated with \(\pi \geq 0.95\). The dashed horizontal line represents the null-effect (\(\gamma = 0\)).

Notice that the gene with \(\pi \approx 1\) also has an effect size whose 95% HDI (error bar) does not overlap the null-effect. Genes with a high degree of differential usage are easy to detect in this figure.
# format data
stats <- M$dgu
stats <- stats[order(abs(stats$es_mean), decreasing = FALSE), ]
stats$gene_fac <- factor(x = stats$gene_name, levels = unique(stats$gene_name))
ggplot(data = stats)+
geom_hline(yintercept = 0, linetype = "dashed", col = "gray")+
geom_errorbar(aes(x = pmax, y = es_mean, ymin = es_L, ymax = es_H),
col = "darkgray")+
geom_point(aes(x = pmax, y = es_mean, col = contrast))+
geom_text_repel(data = stats[stats$pmax >= 0.95, ],
aes(x = pmax, y = es_mean, label = gene_fac),
min.segment.length = 0, size = 2.75)+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
xlab(label = expression(pi))+
xlim(c(0, 1))+
ylab(expression(gamma))
Let's visualize the observed data of the genes with a high probability of differential gene usage (\(\pi \geq 0.95\)). Here we show the gene usage in %.
promising_genes <- stats$gene_name[stats$pmax >= 0.95]
ppc_gene <- M$ppc$ppc_condition
ppc_gene <- ppc_gene[ppc_gene$gene_name %in% promising_genes, ]
ppc_rep <- M$ppc$ppc_rep
ppc_rep <- ppc_rep[ppc_rep$gene_name %in% promising_genes, ]
ggplot()+
geom_point(data = ppc_rep,
aes(x = gene_name, y = observed_prop*100, col = condition),
size = 1, fill = "black",
position = position_jitterdodge(jitter.width = 0.1,
jitter.height = 0,
dodge.width = 0.35))+
geom_errorbar(data = ppc_gene,
aes(x = gene_name, ymin = ppc_L_prop*100,
ymax = ppc_H_prop*100, group = condition),
position = position_dodge(width = 0.35), width = 0.15)+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))+
ylab(label = "PPC usage [%]")+
xlab(label = '')
Let's also visualize the predicted gene usage counts in each repertoire.
ggplot()+
geom_point(data = ppc_rep,
aes(x = gene_name, y = observed_count, col = condition),
size = 1, fill = "black",
position = position_jitterdodge(jitter.width = 0.1,
jitter.height = 0,
dodge.width = 0.5))+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
ylab(label = "PPC usage [count]")+
xlab(label = '')+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
IgGeneUsage also reports the inferred gene usage (GU) probability of individual genes in each condition. For a given gene we report its mean GU (prob_mean) and its HDI (prob_L and prob_H; 95% by default, set via hdi_lvl).
ggplot(data = M$gu)+
geom_errorbar(aes(x = gene_name, y = prob_mean, ymin = prob_L,
ymax = prob_H, col = condition),
width = 0.1, position = position_dodge(width = 0.4))+
geom_point(aes(x = gene_name, y = prob_mean, col = condition), size = 1,
position = position_dodge(width = 0.4))+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
ylab(label = "GU [probability]")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
To assess the robustness of the probability of DGU (\(\pi\)) and the effect size (\(\gamma\)), IgGeneUsage has a built-in procedure for fully Bayesian leave-one-out (LOO) analysis.

In each LOO step, we discard the data of one of the \(R\) repertoires and use the remaining data to analyze DGU. In each step we record \(\pi\) and \(\gamma\) for all genes, including the mean and 95% HDI of \(\gamma\). We then assess the robustness of \(\pi\) and \(\gamma\) quantitatively by evaluating their variability across LOO steps for a specific gene.

This analysis can be computationally demanding.
L <- LOO(ud = d_zibb_3,        # input data
         mcmc_warmup = 500,    # how many MCMC warm-ups per chain (default: 500)
         mcmc_steps = 1000,    # how many MCMC steps per chain (default: 1,500)
         mcmc_chains = 1,      # how many MCMC chains to run (default: 4)
         mcmc_cores = 1,       # how many CPU cores to use (for parallel chains)
         hdi_lvl = 0.95,       # highest density interval level (default: 0.95)
         adapt_delta = 0.8,    # MCMC target acceptance rate (default: 0.95)
         max_treedepth = 10)   # tree depth evaluated at each step (default: 12)
[rstan sampling progress for model 'dgu': one single-chain LOO run per left-out repertoire, 1000 iterations each (500 warm-up); each run took roughly 4-6 seconds]
Next, we collect the results (GU and DGU) from each LOO iteration:
L_gu <- do.call(rbind, lapply(X = L, FUN = function(x){return(x$gu)}))
L_dgu <- do.call(rbind, lapply(X = L, FUN = function(x){return(x$dgu)}))
… and plot them:
ggplot(data = L_dgu)+
facet_wrap(facets = ~contrast, ncol = 1)+
geom_hline(yintercept = 0, linetype = "dashed", col = "gray")+
geom_errorbar(aes(x = gene_name, y = es_mean, ymin = es_L,
ymax = es_H, col = contrast, group = loo_id),
width = 0.1, position = position_dodge(width = 0.75))+
geom_point(aes(x = gene_name, y = es_mean, col = contrast,
group = loo_id), size = 1,
position = position_dodge(width = 0.75))+
theme_bw(base_size = 11)+
theme(legend.position = "none")+
ylab(expression(gamma))
ggplot(data = L_dgu)+
facet_wrap(facets = ~contrast, ncol = 1)+
geom_point(aes(x = gene_name, y = pmax, col = contrast,
group = loo_id), size = 1,
position = position_dodge(width = 0.5))+
theme_bw(base_size = 11)+
theme(legend.position = "none")+
ylab(expression(pi))
ggplot(data = L_gu)+
geom_hline(yintercept = 0, linetype = "dashed", col = "gray")+
geom_errorbar(aes(x = gene_name, y = prob_mean, ymin = prob_L,
ymax = prob_H, col = condition,
group = interaction(loo_id, condition)),
width = 0.1, position = position_dodge(width = 1))+
geom_point(aes(x = gene_name, y = prob_mean, col = condition,
group = interaction(loo_id, condition)), size = 1,
position = position_dodge(width = 1))+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
ylab("GU [probability]")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
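The plots above can be complemented with a simple numerical summary of the variability across LOO steps, e.g. the per-gene range of es_mean and pmax; a minimal sketch based on the L_dgu object assembled above:

# sketch: per-gene spread of the LOO estimates (larger range = less robust)
loo_spread <- aggregate(cbind(es_mean, pmax) ~ gene_name + contrast,
                        data = L_dgu,
                        FUN = function(x) diff(range(x)))
head(loo_spread[order(loo_spread$es_mean, decreasing = TRUE), ])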
data("d_zibb_4", package = "IgGeneUsage")
knitr::kable(head(d_zibb_4))
individual_id | condition | gene_name | replicate | gene_usage_count |
---|---|---|---|---|
I_1 | C_1 | G_1 | R_1 | 29 |
I_1 | C_1 | G_2 | R_1 | 66 |
I_1 | C_1 | G_3 | R_1 | 285 |
I_1 | C_1 | G_4 | R_1 | 20 |
I_1 | C_1 | G_5 | R_1 | 38 |
I_1 | C_1 | G_6 | R_1 | 709 |
We can also visualize d_zibb_4 with ggplot:
ggplot(data = d_zibb_4)+
geom_point(aes(x = gene_name, y = gene_usage_count, col = condition,
shape = replicate), position = position_dodge(width = 0.8))+
theme_bw(base_size = 11)+
ylab(label = "Gene usage [count]")+
xlab(label = '')+
theme(legend.position = "top")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
M <- DGU(ud = d_zibb_4,        # input data
         mcmc_warmup = 500,    # how many MCMC warm-ups per chain (default: 500)
         mcmc_steps = 1500,    # how many MCMC steps per chain (default: 1,500)
         mcmc_chains = 2,      # how many MCMC chains to run (default: 4)
         mcmc_cores = 1,       # how many CPU cores to use (for parallel chains)
         hdi_lvl = 0.95,       # highest density interval level (default: 0.95)
         adapt_delta = 0.8,    # MCMC target acceptance rate (default: 0.95)
         max_treedepth = 10)   # tree depth evaluated at each step (default: 12)
[rstan sampling progress for model 'dgu_rep': 2 chains, 1500 iterations each (500 warm-up); elapsed times of about 93 and 117 seconds]
ggplot(data = M$ppc$ppc_rep)+
facet_wrap(facets = ~individual_id, ncol = 3)+
geom_abline(intercept = 0, slope = 1, linetype = "dashed", col = "darkgray")+
geom_errorbar(aes(x = observed_count, y = ppc_mean_count,
ymin = ppc_L_count, ymax = ppc_H_count), col = "darkgray")+
geom_point(aes(x = observed_count, y = ppc_mean_count), size = 1)+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
xlab(label = "Observed usage [counts]")+
ylab(label = "PPC usage [counts]")
The top panel shows the average gene usage (GU) in different biological conditions. The bottom panel shows the differential gene usage (DGU) between pairs of biological conditions.
g1 <- ggplot(data = M$gu)+
geom_errorbar(aes(x = gene_name, y = prob_mean, ymin = prob_L,
ymax = prob_H, col = condition),
width = 0.1, position = position_dodge(width = 0.4))+
geom_point(aes(x = gene_name, y = prob_mean, col = condition), size = 1,
position = position_dodge(width = 0.4))+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
ylab(label = "GU [probability]")+
theme(axis.text.x = element_text(angle = 90, hjust = 1, vjust = 0.4))
stats <- M$dgu
stats <- stats[order(abs(stats$es_mean), decreasing = FALSE), ]
stats$gene_fac <- factor(x = stats$gene_name, levels = unique(stats$gene_name))
g2 <- ggplot(data = stats)+
facet_wrap(facets = ~contrast)+
geom_hline(yintercept = 0, linetype = "dashed", col = "gray")+
geom_errorbar(aes(x = pmax, y = es_mean, ymin = es_L, ymax = es_H),
col = "darkgray")+
geom_point(aes(x = pmax, y = es_mean, col = contrast))+
geom_text_repel(data = stats[stats$pmax >= 0.95, ],
aes(x = pmax, y = es_mean, label = gene_fac),
min.segment.length = 0, size = 2.75)+
theme_bw(base_size = 11)+
theme(legend.position = "top")+
xlab(label = expression(pi))+
xlim(c(0, 1))+
ylab(expression(gamma))
(g1/g2)
sessionInfo()
R version 4.5.0 Patched (2025-04-21 r88169)
Platform: x86_64-apple-darwin20
Running under: macOS Monterey 12.7.6

Matrix products: default
BLAS:   /Library/Frameworks/R.framework/Versions/4.5-x86_64/Resources/lib/libRblas.0.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/4.5-x86_64/Resources/lib/libRlapack.dylib; LAPACK version 3.12.1

locale:
[1] C/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

time zone: America/New_York
tzcode source: internal

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
 [1] patchwork_1.3.0     reshape2_1.4.4      ggrepel_0.9.6
 [4] ggforce_0.4.2       ggplot2_3.5.2       knitr_1.50
 [7] rstan_2.32.7        StanHeaders_2.32.10 IgGeneUsage_1.23.0
[10] BiocStyle_2.37.0

loaded via a namespace (and not attached):
 [1] tidyselect_1.2.1            dplyr_1.1.4
 [3] farver_2.1.2                loo_2.8.0
 [5] fastmap_1.2.0               tweenr_2.0.3
 [7] digest_0.6.37               lifecycle_1.0.4
 [9] magrittr_2.0.3              compiler_4.5.0
[11] rlang_1.1.6                 sass_0.4.10
[13] tools_4.5.0                 yaml_2.3.10
[15] S4Arrays_1.9.0              labeling_0.4.3
[17] pkgbuild_1.4.7              curl_6.2.2
[19] DelayedArray_0.35.1         plyr_1.8.9
[21] abind_1.4-8                 withr_3.0.2
[23] purrr_1.0.4                 BiocGenerics_0.55.0
[25] grid_4.5.0                  polyclip_1.10-7
[27] stats4_4.5.0                colorspace_2.1-1
[29] inline_0.3.21               scales_1.3.0
[31] MASS_7.3-65                 tinytex_0.57
[33] SummarizedExperiment_1.39.0 cli_3.6.4
[35] rmarkdown_2.29              crayon_1.5.3
[37] generics_0.1.3              RcppParallel_5.1.10
[39] httr_1.4.7                  cachem_1.1.0
[41] stringr_1.5.1               parallel_4.5.0
[43] BiocManager_1.30.25         XVector_0.49.0
[45] matrixStats_1.5.0           vctrs_0.6.5
[47] V8_6.0.3                    Matrix_1.7-3
[49] jsonlite_2.0.0              bookdown_0.43
[51] IRanges_2.43.0              S4Vectors_0.47.0
[53] magick_2.8.6                jquerylib_0.1.4
[55] tidyr_1.3.1                 glue_1.8.0
[57] codetools_0.2-20            stringi_1.8.7
[59] gtable_0.3.6                GenomeInfoDb_1.45.0
[61] QuickJSR_1.7.0              GenomicRanges_1.61.0
[63] UCSC.utils_1.5.0            munsell_0.5.1
[65] tibble_3.2.1                pillar_1.10.2
[67] htmltools_0.5.8.1           GenomeInfoDbData_1.2.14
[69] R6_2.6.1                    evaluate_1.0.3
[71] lattice_0.22-7              Biobase_2.69.0
[73] bslib_0.9.0                 rstantools_2.4.0
[75] Rcpp_1.0.14                 gridExtra_2.3
[77] SparseArray_1.9.0           xfun_0.52
[79] MatrixGenerics_1.21.0       pkgconfig_2.0.3