- Refactored Scaler, Potential, and Trainer
- Optimized the Kalman filter JIT kernels
- Separated scaler and model parameters from the potential
- Bug fixes
- Added an optimized implementation of the ACSF gradient calculation (~100x faster)
- Adjusted the base code and example notebooks to work with recent versions of JAX
- Added an autodiff-based gradient method to the Lennard-Jones (LJ) force calculation kernel
- Changed package name to Pantea
- Refactored the package structure, including design, documentation, and performance (JIT kernels)
- Refactored RuNNer dataset
- Implemented a molecular dynamics (MD) simulator
- Implemented a Monte Carlo (MC) simulator
- Implemented a Kalman filter trainer
- Applied extensive refactoring
- Replaced PyTorch with JAX as the main dependency
- First release on PyPI.
- Applied JAX optimizations to the ACSF descriptor
- Applied some optimizations using `torch.jit.script`
- Initial implementation and validation
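
The autodiff-based LJ force mentioned in the changelog can be sketched with `jax.grad`; the potential form and parameter values below are illustrative placeholders, not Pantea's actual API:

```python
import jax
import jax.numpy as jnp


def lj_potential(r, epsilon=1.0, sigma=1.0):
    # Lennard-Jones pair potential; epsilon and sigma are placeholder values.
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 * s6 - s6)


# The pair force is the negative derivative of the potential with respect
# to the separation distance, obtained here by automatic differentiation
# and compiled with jax.jit.
lj_force = jax.jit(jax.grad(lambda r: -lj_potential(r)))
```

A quick sanity check is to compare the autodiff force against a central finite difference of the potential; the two should agree to within the finite-difference error.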
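
The JIT-kernel optimizations (e.g. for the ACSF descriptor) follow the usual JAX pattern of composing `jax.vmap` with `jax.jit`; this pairwise-distance kernel is a hypothetical stand-in for such a kernel, not code from the package:

```python
import jax
import jax.numpy as jnp


@jax.jit
def pairwise_distances(positions):
    # positions: (n_atoms, 3) array of Cartesian coordinates.
    # vmap over rows yields the full (n_atoms, n_atoms) distance matrix
    # as a single compiled kernel instead of a Python double loop.
    def dist_to_all(p):
        return jnp.linalg.norm(positions - p, axis=1)

    return jax.vmap(dist_to_all)(positions)
```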
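
The MC simulator entry presumably refers to Metropolis sampling; a minimal single-configuration Metropolis step in JAX might look like the following, where the energy function, step size, and inverse temperature are placeholders:

```python
import jax
import jax.numpy as jnp


def metropolis_step(key, x, energy_fn, step_size=0.1, beta=1.0):
    # Propose a random Gaussian displacement and accept it with the
    # Metropolis criterion min(1, exp(-beta * dE)).
    key_move, key_accept = jax.random.split(key)
    proposal = x + step_size * jax.random.normal(key_move, x.shape)
    d_energy = energy_fn(proposal) - energy_fn(x)
    accept = jax.random.uniform(key_accept) < jnp.exp(-beta * d_energy)
    return jnp.where(accept, proposal, x)
```

In practice such a step would be wrapped in `jax.lax.scan` or `jax.jit` to run many sweeps without Python-loop overhead.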