[graph-tool] Model Selection across Distributions
kicasta at gmail.com
Mon Nov 30 22:39:06 CET 2020
Yes, I mean edge covariates. In the example you referenced, you compare
state.entropy() for two distributions, i.e. exponential and
log-normal, where for the log-normal model the covariates were scaled,
which is handled by subtracting log(g.ep.weight.a).sum().
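To make that correction concrete, here is a minimal pure-Python sketch of the change-of-variables term; the weights list is hypothetical, standing in for g.ep.weight.a:

```python
import math

# Hypothetical positive edge weights (stand-in for g.ep.weight.a)
weights = [0.5, 1.2, 2.3, 0.8, 3.1]

# A model fitted to y = log(w) reports a log-density in y-space. To put
# it on the same scale as a model fitted to w directly, apply the
# Jacobian of the transform: log f_W(w) = log f_Y(log w) - log w,
# i.e. subtract log(w).sum() over all edges.
jacobian_correction = sum(math.log(w) for w in weights)

def lognormal_logdensity(w, mu, sigma):
    # Log-density of w under a log-normal(mu, sigma): the normal
    # log-density of y = log(w), plus the per-edge -log(w) Jacobian term.
    y = math.log(w)
    return (-0.5 * ((y - mu) / sigma) ** 2
            - math.log(sigma * math.sqrt(2 * math.pi))
            - y)
```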
Suppose I simply want to compare two models with unscaled discrete
covariates: one using a geometric distribution and one using a
binomial distribution. Can I perform model selection by directly
comparing their state.entropy() values?
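For illustration, a minimal sketch of the comparison I have in mind, with hypothetical integer weights and hand-rolled log-pmfs (in graph-tool itself this would presumably correspond to fitting with the "discrete-geometric" and "discrete-binomial" rec_types and comparing the resulting entropies):

```python
import math

# Hypothetical discrete edge weights (stand-in for g.ep.weight.a)
weights = [1, 2, 2, 3, 5, 1, 4, 2]

def geometric_logpmf(k, p):
    # P(k) = (1 - p)**(k - 1) * p, support k = 1, 2, ...
    return (k - 1) * math.log(1 - p) + math.log(p)

def binomial_logpmf(k, n, p):
    # P(k) = C(n, k) * p**k * (1 - p)**(n - k), support k = 0, ..., n
    return (math.log(math.comb(n, k))
            + k * math.log(p)
            + (n - k) * math.log(1 - p))

# Both are log-PROBABILITIES of the same discrete data, so their totals
# are directly comparable; no Jacobian term is needed, unlike for
# log-transformed continuous covariates. (Parameter choices below are
# illustrative, not fitted by graph-tool.)
mean = sum(weights) / len(weights)
ll_geom = sum(geometric_logpmf(k, 1 / mean) for k in weights)
ll_binom = sum(binomial_logpmf(k, 10, mean / 10) for k in weights)
better = "geometric" if ll_geom > ll_binom else "binomial"
```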
On Mon, Nov 30, 2020 at 1:45 PM Tiago de Paula Peixoto
(<tiago at skewed.de>) wrote:
> On 30.11.20 at 10:29, kicasta wrote:
> > Hi all,
> > I'd have a question regarding model selection with different distributions.
> > When we want to decide which partition best describes the data for a
> > given distribution, we go with the one that gives the smallest entropy.
> > However, say we want to compare two different distributions d1 and d2,
> > where the best fit for d1 gives an entropy value of e1 and for d2 gives
> > e2. If e1 < e2, can we say that d1 describes our data better than d2?
> Could you be more specific about to which "distributions" you are
> referring? Are you talking about edge covariates?
> If so, model selection is explained here:
> In this case, the entropy* itself is not enough; you also have to
> consider the derivative terms, as explained in the above.
> (The term "entropy" is actually misleading in this context, since the
> value refers to a log-density rather than a log-probability.)
> Tiago de Paula Peixoto <tiago at skewed.de>