PINT should have a derivative-aware MCMC method #1303
Comments
The NUTS sampler: https://arxiv.org/abs/1111.4246
Parallel tempering to get more parallelism: http://auai.org/uai2017/proceedings/papers/289.pdf
@abhisrkckl: this may be relevant to you
I think it is better to have an API that provides `lnlikelihood`, `lnlikelihood_gradient`, `lnprior`, and `prior_transform` than to have sampling methods baked into PINT itself. This would let users pick the sampler of their choice rather than being tied to whatever PINT ships. I am working on an interface like this; see the sketch below. I have not implemented `lnlikelihood_gradient` yet, but it should be straightforward to do using the design matrix.
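A minimal sketch of what such a sampler-agnostic interface might look like, using a toy linear model in place of a real PINT `TimingModel` (the `ToyPosterior` class and all names in it are illustrative assumptions, not the actual PINT API; the matrix `M` plays the role of PINT's design matrix):

```python
import numpy as np

class ToyPosterior:
    """Illustrative sampler-agnostic interface for a linear model
    r(theta) = y - M @ theta with Gaussian errors. Not the PINT API."""

    def __init__(self, M, y, sigma, lo, hi):
        self.M, self.y, self.sigma = M, y, sigma
        self.lo, self.hi = lo, hi  # uniform prior bounds per parameter

    def lnlikelihood(self, theta):
        r = self.y - self.M @ theta
        return -0.5 * np.sum((r / self.sigma) ** 2)

    def lnlikelihood_gradient(self, theta):
        # d(lnL)/d(theta) = M^T (r / sigma^2), since dr/dtheta = -M.
        # This is where the design matrix makes the gradient cheap.
        r = self.y - self.M @ theta
        return self.M.T @ (r / self.sigma**2)

    def lnprior(self, theta):
        inside = np.all((theta >= self.lo) & (theta <= self.hi))
        return 0.0 if inside else -np.inf

    def prior_transform(self, u):
        # Map unit-cube samples to the uniform prior box (nested samplers).
        return self.lo + u * (self.hi - self.lo)
```

With this separation, the same object can feed emcee (`lnlikelihood` + `lnprior`), a nested sampler (`prior_transform`), or a gradient-based sampler (`lnlikelihood_gradient`), without PINT committing to any one of them.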
Related discussion in issue #1310 |
When derivatives of the objective function are available, MCMC methods that use them, such as Hamiltonian Monte Carlo, can be much faster than methods that don't. PINT has gone to great lengths to make derivatives available, so we should take advantage of this. PyMC3 includes the supposedly tuning-free NUTS sampler, which could probably be used here, although some plumbing work would likely be needed. This could probably work for `event_optimize` as well, where faster runtimes would be extremely welcome. I'm not sure how well it parallelizes.
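As a concrete illustration of why the gradient is the key ingredient, here is a minimal, self-contained Hamiltonian Monte Carlo step (plain leapfrog integration; the `lnpost` and `lnpost_grad` callables are assumed to come from an interface like the one sketched above, and the step size and trajectory length are hand-tuned here rather than adapted automatically as NUTS would do):

```python
import numpy as np

def hmc_step(theta, lnpost, lnpost_grad, rng, step=0.1, n_leapfrog=20):
    """One HMC update. lnpost/lnpost_grad are assumed callables returning
    the log-posterior and its gradient; not part of any real PINT API."""
    p = rng.standard_normal(theta.shape)  # sample an auxiliary momentum
    theta_new, p_new = theta.copy(), p.copy()

    # Leapfrog integration of the Hamiltonian dynamics.
    p_new += 0.5 * step * lnpost_grad(theta_new)
    for _ in range(n_leapfrog - 1):
        theta_new += step * p_new
        p_new += step * lnpost_grad(theta_new)
    theta_new += step * p_new
    p_new += 0.5 * step * lnpost_grad(theta_new)

    # Metropolis accept/reject on the total "energy" H = -lnpost + |p|^2/2.
    h_old = -lnpost(theta) + 0.5 * p @ p
    h_new = -lnpost(theta_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return theta_new, True
    return theta, False
```

In practice one would use a mature implementation (PyMC3's NUTS, say) rather than a hand-rolled sampler like this; the point is just that the gradient call is the only extra ingredient PINT needs to supply.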