
CVS Commit History:


   2023-08-02 01:20:57 by Thomas Klausner | Files touched by this commit (158)
Log message:
*: remove more references to Python 3.7
   2023-07-01 10:37:47 by Thomas Klausner | Files touched by this commit (105) | Package updated
Log message:
*: restrict py-numpy users to 3.9+ in preparation for update
   2022-04-10 02:57:15 by David H. Gutteridge | Files touched by this commit (18)
Log message:
Fix build breakage from py-scipy now being Python >= 3.8
   2022-01-04 21:55:40 by Thomas Klausner | Files touched by this commit (1595)
Log message:
*: bump PKGREVISION for egg.mk users

They now have a tool dependency on py-setuptools instead of a DEPENDS
   2021-12-30 14:05:42 by Adam Ciarcinski | Files touched by this commit (125)
Log message:
Forget about Python 3.6
   2021-10-26 12:56:13 by Nia Alarie | Files touched by this commit (458)
Log message:
math: Replace RMD160 checksums with BLAKE2s checksums

All checksums have been double-checked against existing RMD160 and
SHA512 hashes
   2021-10-07 16:28:36 by Nia Alarie | Files touched by this commit (458)
Log message:
math: Remove SHA1 hashes for distfiles
   2021-04-09 16:41:35 by Tobias Nygren | Files touched by this commit (14)
Log message:
revert wrong fix for py-scipy python 3.6 deprecation, fix properly
   2019-06-16 21:17:23 by Adam Ciarcinski | Files touched by this commit (3) | Package updated
Log message:
py-pymc3: updated to 3.7

PyMC3 3.7 (May 29 2019)

New features

- Add data container class (Data) that wraps the theano SharedVariable class and lets the model be aware of its inputs and outputs (see the sketch after this list).
- Add function set_data to update variables defined as Data.
- Mixture now supports mixtures of multidimensional probability distributions, not just lists of 1D distributions.
- GLM.from_formula and LinearComponent.from_formula can extract variables from the calling scope. Customizable via the new eval_env argument.
- Added the distributions.shape_utils module with functions used to help broadcast samples drawn from distributions using the size keyword argument.
- Used numpy.vectorize in distributions.distribution._compile_theano_function. This enables sample_prior_predictive and sample_posterior_predictive to ask for tuples of samples instead of just integers.
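
A minimal sketch (not part of the changelog) of the new Data container and set_data described in the first item above, assuming PyMC3 >= 3.7; the toy data, variable names, and sampler settings are illustrative:

    import numpy as np
    import pymc3 as pm

    # Illustrative toy regression data, not from the changelog.
    x_obs = np.random.normal(size=100)
    y_obs = 2.0 * x_obs + np.random.normal(scale=0.5, size=100)

    with pm.Model() as model:
        x = pm.Data("x", x_obs)            # wraps a theano SharedVariable
        y = pm.Data("y", y_obs)
        beta = pm.Normal("beta", mu=0.0, sigma=10.0)
        noise = pm.HalfNormal("noise", sigma=1.0)
        pm.Normal("obs", mu=beta * x, sigma=noise, observed=y)
        trace = pm.sample(1000, tune=1000)

    # Swap in new inputs without rebuilding the model, then predict.
    with model:
        pm.set_data({"x": np.linspace(-3.0, 3.0, 50), "y": np.zeros(50)})
        post_pred = pm.sample_posterior_predictive(trace)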

Maintenance

- All occurrences of sd as a parameter name have been renamed to sigma. sd will continue to function for backwards compatibility (see the sketch after this list).
- HamiltonianMC was ignoring certain arguments like target_accept, and not using the custom step size jitter function with expectation 1.
- Made BrokenPipeError for parallel sampling more verbose on Windows.
- Added the broadcast_distribution_samples function that helps broadcasting arrays of drawn samples, taking into account the requested size and the inferred distribution shape. This is sometimes needed by distributions that call several rvs separately within their random method, such as ZeroInflatedPoisson.
- The random method of the Wald, Kumaraswamy, LogNormal, Pareto, Cauchy, HalfCauchy, Weibull and ExGaussian distributions used a hidden _random function that was written with scalars in mind. This could potentially lead to artificial correlations between random draws. Added shape guards and broadcasting of the distribution samples to prevent this.
- Added a fix to allow the imputation of single missing values of observed data, which previously would fail.
- The draw_values function was too permissive with what could be grabbed from inside point, which led to an error when sampling posterior predictives of variables that depended on shared variables that had changed their shape after pm.sample() had been called.
- draw_values now adds the theano graph descendants of TensorConstant or SharedVariable nodes to the named relationship nodes stack, but only if these descendants are ObservedRV or MultiObservedRV instances.
- Fixed bug in broadcast_distribution_samples, which did not correctly handle cases in which some samples did not have the size tuple prepended.
- Changed MvNormal.random's usage of tensordot for Cholesky-encoded covariances. This led to wrong axis broadcasting and seemed to be the cause of issue 3343.
- Fixed defect in Mixture.random when multidimensional mixtures were involved. The mixture component was not preserved across all the elements of the dimensions of the mixture, which meant that the correlations across elements within a given draw of the mixture were partly broken.
- Restructured Mixture.random to allow better use of vectorized calls to comp_dists.random.
- Added tests for mixtures of multidimensional distributions to the test suite.
- Fixed incorrect usage of broadcast_distribution_samples in DiscreteWeibull.
- Mixture's default dtype is now determined by theano.config.floatX.
- dist_math.random_choice now handles nd-arrays of category probabilities, and also handles sizes that are not None. Also removed the unused k kwarg from dist_math.random_choice.
- Changed Categorical.mode to preserve all the dimensions of p except the last one, which encodes each category's probability.
- Changed initialization of Categorical.p. p is now normalized to sum to 1 inside logp and random, but not during initialization. This could hide negative values supplied to p, as mentioned in issue 2082.
- Categorical now accepts elements of p equal to 0. logp will return -inf if there are values that index to the zero-probability categories.
- Add sigma, tau, and sd to the signature of NormalMixture.
- Set default lower and upper values of -inf and inf for pm.distributions.continuous.TruncatedNormal. This avoids errors caused by their previous values of None.
- Converted all calls to pm.distributions.bound._ContinuousBounded and pm.distributions.bound._DiscreteBounded to use only and all positional arguments.
- Restructured distributions.distribution.generate_samples to use the shape_utils module. This solves issues 3421 and 3147 by using the size-aware broadcasting functions in shape_utils.
- Fixed the Multinomial.random and Multinomial.random_ methods to make them compatible with the new generate_samples function. In the process, a bug in the shape handling of Multinomial.random_ was discovered and fixed.
- Fixed a defect found in Bound.random where the point dictionary was passed to generate_samples as an arg instead of in not_broadcast_kwargs.
- Fixed a defect found in Bound.random_ where total_size could end up as a float64 instead of an integer if given size=tuple().
- Fixed an issue in model_graph that caused construction of the graph of the model for rendering to hang: replaced a search over the powerset of the nodes with a breadth-first search over the nodes.
- Removed variable annotations from model_graph but left type hints. This means that we support python>=3.5.4.
- Default target_accept for HamiltonianMC is now 0.65, as suggested in Beskos et al. 2010 and Neal 2001.
- Fixed bug in draw_values that led to intermittent errors in python3.5. This happened with some deterministic nodes that were drawn but not added to givens.
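
A minimal sketch (not part of the changelog) of the sd -> sigma rename noted in the first Maintenance item, assuming PyMC3 3.7; the variable names are illustrative:

    import pymc3 as pm

    with pm.Model():
        a = pm.Normal("a", mu=0.0, sigma=1.0)  # new spelling introduced in 3.7
        b = pm.Normal("b", mu=0.0, sd=1.0)     # old spelling, still accepted
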
Deprecations

- nuts_kwargs and step_kwargs have been deprecated in favor of using the standard kwargs to pass optional step method arguments (see the sketch after this list).
- SGFS and CSG have been removed; they have been moved to pymc3-experimental.
- References to live_plot and the corresponding notebooks have been removed.
- Function approx_hessian was removed because numdifftools became incompatible with current scipy. The function was already optional, only available to users who installed numdifftools separately, and not hit on any common code paths.
- Deprecated the vars parameters of sample_posterior_predictive and sample_prior_predictive in favor of var_names. At least for the latter this is more accurate, since the vars parameter actually took names.
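
A minimal sketch (not part of the changelog) of the deprecated-argument changes above, assuming PyMC3 3.7; the toy model and settings are illustrative:

    import pymc3 as pm

    with pm.Model():
        x = pm.Normal("x", mu=0.0, sigma=1.0)
        # Previously: pm.sample(1000, nuts_kwargs={"target_accept": 0.9})
        trace = pm.sample(1000, target_accept=0.9)
        # Previously: pm.sample_prior_predictive(vars=["x"])
        prior = pm.sample_prior_predictive(var_names=["x"])
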
   2018-07-23 03:37:54 by Min Sik Kim | Files touched by this commit (3)
Log message:
math/py-pymc3: Update to 3.5

New features:

- Add documentation section on survival analysis and censored data
  models
- Add check_test_point method to pm.Model
- Add Ordered Transformation and OrderedLogistic distribution
- Add Chain transformation
- Improve the error message "Mass matrix contains zeros on the
  diagonal. Some derivatives might always be zero" during tuning of
  pm.sample
- Improve the error message "NaN occurred in optimization." during ADVI
- Save and load traces without pickle using pm.save_trace and
  pm.load_trace (see the sketch after this list)
- Add Kumaraswamy distribution
- Add TruncatedNormal distribution
- Rewrite parallel sampling of multiple chains on py3. This resolves
  long-standing issues when transferring large traces to the main
  process, avoids pickling issues on UNIX, and allows us to show a
  progress bar for all chains. If parallel sampling is interrupted, we
  now return partial results.
- Add sample_prior_predictive which allows for efficient sampling from
  the unconditioned model.
- SMC: remove experimental warning, allow sampling using sample,
  reduce autocorrelation from final trace.
- Add model_to_graphviz (which uses the optional dependency graphviz)
  to plot a directed graph of a PyMC3 model using plate notation.
- Add beta-ELBO variational inference as in beta-VAE model
  (Christopher P. Burgess et al. NIPS, 2017)
- Add __dir__ to SingleGroupApproximation to improve autocompletion in
  interactive environments
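
A minimal sketch (not part of the changelog) of the pickle-free trace persistence noted in the save/load item above, assuming PyMC3 >= 3.5; the toy model and directory name are illustrative:

    import pymc3 as pm

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sd=1.0)   # 3.5 still uses sd
        trace = pm.sample(1000)
        pm.save_trace(trace, directory="linear_trace", overwrite=True)

    # Later, with the same model definition in scope, reload the trace.
    with model:
        trace = pm.load_trace("linear_trace")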
