./math/py-pymc3, Bayesian modeling and probabilistic machine learning



Branch: CURRENT, Version: 3.7, Package name: py37-pymc3-3.7, Maintainer: minskim

PyMC3 is a Python package for Bayesian statistical modeling and
Probabilistic Machine Learning focusing on advanced Markov chain Monte
Carlo (MCMC) and variational inference (VI) algorithms. Its
flexibility and extensibility make it applicable to a large suite of
problems.


Required to run:
[graphics/py-matplotlib] [devel/py-setuptools] [math/py-scipy] [math/py-numpy] [math/py-pandas] [devel/py-h5py] [math/py-patsy] [misc/py-tqdm] [lang/python37] [math/py-Theano]

Required to build:
[pkgtools/cwrappers]

Master sites:

SHA1: 6faee9eaee99f0d1386a772bf053d0ad977a8dce
RMD160: 3286dbb723ef2025af308e7fad503f774b68be78
Filesize: 30058.912 KB

Version history:


CVS history:


   2019-06-16 21:17:23 by Adam Ciarcinski | Files touched by this commit (3) | Package updated
Log message:
py-pymc3: updated to 3.7

PyMC3 3.7 (May 29 2019)

New features

- Add a data container class (Data) that wraps the theano SharedVariable
  class and lets the model be aware of its inputs and outputs.
- Add the function set_data to update variables defined as Data.
- Mixture now supports mixtures of multidimensional probability
  distributions, not just lists of 1D distributions.
- GLM.from_formula and LinearComponent.from_formula can extract variables
  from the calling scope, customizable via the new eval_env argument.
- Added the distributions.shape_utils module with functions used to help
  broadcast samples drawn from distributions using the size keyword
  argument.
- Used numpy.vectorize in distributions.distribution._compile_theano_function.
  This enables sample_prior_predictive and sample_posterior_predictive to
  ask for tuples of samples instead of just integers.
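The tuple-size behavior can be seen with plain NumPy, no PyMC3 required. This is an illustrative sketch: `rng.normal` and the toy `scalar_logp` stand in for a distribution's random method and a compiled theano function, respectively.

```python
import numpy as np

rng = np.random.default_rng(0)

# An integer size yields a flat batch of draws...
flat = rng.normal(size=4)
assert flat.shape == (4,)

# ...while a tuple size yields a multidimensional batch, which is what
# sample_prior_predictive / sample_posterior_predictive can now request.
batch = rng.normal(size=(2, 3))
assert batch.shape == (2, 3)

# numpy.vectorize lifts a scalar-only function so it broadcasts over
# such batches, which is the trick the changelog entry describes:
def scalar_logp(x):
    # toy scalar function, standing in for a compiled log-density
    return -0.5 * x * x

vec_logp = np.vectorize(scalar_logp)
assert vec_logp(batch).shape == (2, 3)
```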

Maintenance

- All occurrences of sd as a parameter name have been renamed to sigma.
  sd will continue to function for backwards compatibility.
- HamiltonianMC was ignoring certain arguments like target_accept, and
  not using the custom step size jitter function with expectation 1.
- Made BrokenPipeError for parallel sampling more verbose on Windows.
- Added the broadcast_distribution_samples function that helps broadcast
  arrays of drawn samples, taking into account the requested size and the
  inferred distribution shape. This is sometimes needed by distributions
  that call several rvs separately within their random method, such as
  the ZeroInflatedPoisson.
- The random method of the Wald, Kumaraswamy, LogNormal, Pareto, Cauchy,
  HalfCauchy, Weibull, and ExGaussian distributions used a hidden _random
  function that was written with scalars in mind. This could potentially
  lead to artificial correlations between random draws. Added shape
  guards and broadcasting of the distribution samples to prevent this.
- Added a fix to allow the imputation of single missing values of
  observed data, which previously would fail.
- The draw_values function was too permissive with what could be grabbed
  from inside point, which led to an error when sampling posterior
  predictives of variables that depended on shared variables that had
  changed their shape after pm.sample() had been called.
- draw_values now adds the theano graph descendants of TensorConstant or
  SharedVariable to the named relationship nodes stack, but only if these
  descendants are ObservedRV or MultiObservedRV instances.
- Fixed a bug in broadcast_distribution_samples, which did not correctly
  handle cases in which some samples did not have the size tuple
  prepended.
- Changed MvNormal.random's usage of tensordot for Cholesky-encoded
  covariances. This led to wrong axis broadcasting and seemed to be the
  cause of issue 3343.
- Fixed a defect in Mixture.random when multidimensional mixtures were
  involved. The mixture component was not preserved across all the
  elements of the dimensions of the mixture, meaning that the
  correlations across elements within a given draw of the mixture were
  partly broken.
- Restructured Mixture.random to allow better use of vectorized calls to
  comp_dists.random.
- Added tests for mixtures of multidimensional distributions to the test
  suite.
- Fixed incorrect usage of broadcast_distribution_samples in
  DiscreteWeibull.
- Mixture's default dtype is now determined by theano.config.floatX.
- dist_math.random_choice now handles nd-arrays of category
  probabilities, and also handles sizes that are not None. Also removed
  the unused k kwarg from dist_math.random_choice.
- Changed Categorical.mode to preserve all the dimensions of p except the
  last one, which encodes each category's probability.
- Changed the initialization of Categorical.p. p is now normalized to sum
  to 1 inside logp and random, but not during initialization. This could
  hide negative values supplied to p, as mentioned in issue 2082.
- Categorical now accepts elements of p equal to 0. logp will return -inf
  if there are values that index to the zero-probability categories.
- Add sigma, tau, and sd to the signature of NormalMixture.
- Set default lower and upper values of -inf and inf for
  pm.distributions.continuous.TruncatedNormal. This avoids errors caused
  by their previous values of None.
- Converted all calls to pm.distributions.bound._ContinuousBounded and
  pm.distributions.bound._DiscreteBounded to use only and all positional
  arguments.
- Restructured distributions.distribution.generate_samples to use the
  shape_utils module. This solves issues 3421 and 3147 by using the
  size-aware broadcasting functions in shape_utils.
- Fixed the Multinomial.random and Multinomial.random_ methods to make
  them compatible with the new generate_samples function. In the process,
  a bug in Multinomial.random_'s shape handling was discovered and fixed.
- Fixed a defect in Bound.random where the point dictionary was passed to
  generate_samples as an arg instead of in not_broadcast_kwargs.
- Fixed a defect in Bound.random_ where total_size could end up as a
  float64 instead of an integer if given size=tuple().
- Fixed an issue in model_graph that caused construction of the graph of
  the model for rendering to hang: replaced a search over the powerset of
  the nodes with a breadth-first search over the nodes.
- Removed variable annotations from model_graph but left type hints,
  which means python>=3.5.4 is supported.
- The default target_accept for HamiltonianMC is now 0.65, as suggested
  in Beskos et al. 2010 and Neal 2001.
- Fixed a bug in draw_values that led to intermittent errors in
  python3.5. This happened with some deterministic nodes that were drawn
  but not added to givens.
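The size-prefix corner case behind the broadcast_distribution_samples fix can be seen with plain NumPy. This sketch only illustrates the shape logic; it is not PyMC3's actual helper:

```python
import numpy as np

size = (5,)                  # requested number of draws
mu = np.zeros(size + (3,))   # draws with the size tuple prepended: (5, 3)
sigma = np.ones((3,))        # drawn without the size prefix: (3,)

# NumPy right-aligns shapes when broadcasting, so sigma lines up with
# the trailing (distribution) dimension of mu, not the leading size
# dimension. A size-aware helper must account for both layouts.
broadcast_mu, broadcast_sigma = np.broadcast_arrays(mu, sigma)
assert broadcast_mu.shape == (5, 3)
assert broadcast_sigma.shape == (5, 3)
```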
Deprecations

- nuts_kwargs and step_kwargs have been deprecated in favor of using the
  standard kwargs to pass optional step method arguments.
- SGFS and CSG have been removed; they have been moved to
  pymc3-experimental.
- References to live_plot and corresponding notebooks have been removed.
- Function approx_hessian was removed, due to numdifftools becoming
  incompatible with current scipy. The function was already optional,
  only available to a user who installed numdifftools separately, and not
  hit on any common codepaths.
- Deprecated the vars parameters of sample_posterior_predictive and
  sample_prior_predictive in favor of var_names. At least for the latter,
  this is more accurate, since the vars parameter actually took names.
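The vars-to-var_names rename follows a common keyword-deprecation pattern. The sketch below is generic Python with a hypothetical sample_predictive function, not PyMC3's actual code:

```python
import warnings

def sample_predictive(model, var_names=None, vars=None):
    # Hypothetical signature illustrating the shim: accept the old
    # `vars` keyword for backwards compatibility, warn, and steer
    # callers toward `var_names`.
    if vars is not None:
        warnings.warn("`vars` is deprecated, use `var_names` instead",
                      DeprecationWarning)
        if var_names is None:
            var_names = vars
    return var_names

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = sample_predictive(object(), vars=["mu"])

assert out == ["mu"]
assert caught and issubclass(caught[0].category, DeprecationWarning)
```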
   2018-07-23 03:37:54 by Min Sik Kim | Files touched by this commit (3) | Package updated
Log message:
math/py-pymc3: Update to 3.5

New features:

- Add documentation section on survival analysis and censored data
  models
- Add check_test_point method to pm.Model
- Add Ordered Transformation and OrderedLogistic distribution
- Add Chain transformation
- Improve the error message "Mass matrix contains zeros on the
  diagonal. Some derivatives might always be zero" during tuning of
  pm.sample
- Improve the error message "NaN occurred in optimization." during ADVI
- Save and load traces without pickle using pm.save_trace and
  pm.load_trace
- Add Kumaraswamy distribution
- Add TruncatedNormal distribution
- Rewrite parallel sampling of multiple chains on py3. This resolves
  long-standing issues when transferring large traces to the main
  process, avoids pickling issues on UNIX, and allows us to show a
  progress bar for all chains. If parallel sampling is interrupted, we
  now return partial results.
- Add sample_prior_predictive which allows for efficient sampling from
  the unconditioned model.
- SMC: remove experimental warning, allow sampling using sample,
  reduce autocorrelation from final trace.
- Add model_to_graphviz (which uses the optional dependency graphviz)
  to plot a directed graph of a PyMC3 model using plate notation.
- Add beta-ELBO variational inference as in beta-VAE model
  (Christopher P. Burgess et al. NIPS, 2017)
- Add __dir__ to SingleGroupApproximation to improve autocompletion in
  interactive environments
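The Ordered transformation added in this release maps an unconstrained vector to a strictly increasing one. A common construction (first element free, subsequent increments exponentiated) can be sketched in NumPy; this illustrates the idea only and is not PyMC3's exact implementation:

```python
import numpy as np

def ordered_forward(x):
    # Unconstrained R^n -> strictly increasing vector:
    # y[0] = x[0]; y[i] = y[i-1] + exp(x[i]) for i >= 1.
    # exp(.) > 0 guarantees each step strictly increases.
    y = np.empty_like(x, dtype=float)
    y[0] = x[0]
    y[1:] = x[0] + np.cumsum(np.exp(x[1:]))
    return y

y = ordered_forward(np.array([0.5, -1.0, 0.0]))
assert np.all(np.diff(y) > 0)  # strictly increasing
```

An ordering constraint like this is what lets a sampler explore, e.g., the cutpoints of the new OrderedLogistic distribution without proposing invalid (unsorted) values.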
   2018-07-06 05:46:44 by Min Sik Kim | Files touched by this commit (4)
Log message:
math/py-pymc3: Import version 3.4.1

PyMC3 is a Python package for Bayesian statistical modeling and
Probabilistic Machine Learning focusing on advanced Markov chain Monte
Carlo (MCMC) and variational inference (VI) algorithms. Its
flexibility and extensibility make it applicable to a large suite of
problems.