Releases: secondmind-labs/trieste
Release 1.0.0
Note: this is the first 1.x release, but it remains compatible with 0.13.3. Future releases will aim to conform to semantic versioning; since trieste is a research-led toolbox, this may mean reasonably frequent major version increments.
New features
NARGP multifidelity model (#665)
BO-specific inducing point allocators (#683)
Improvements
Support for explicit constraints in ExpectedImprovement (#664)
Support broadcasting in search space contains method (#677); see the sketch below
Improved logging for GPflow models (#680)
Faster sampler for deep ensembles (#682)
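The contains broadcasting change means a whole batch of candidate points can be membership-checked in one call. A minimal sketch (the dtypes here are illustrative):

```python
import tensorflow as tf
from trieste.space import Box

space = Box([0.0, 0.0], [1.0, 1.0])

# contains now broadcasts over leading dimensions, returning one boolean per
# point instead of requiring a separate call for each point.
points = tf.constant([[0.5, 0.5], [1.5, 0.2], [0.1, 0.9]], dtype=tf.float64)
print(space.contains(points))  # [True, False, True]
```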
Build changes
.gitignore additions (#679)
Upgrade to GPflow 2.7.0 (#684)
Workaround for slowtest OOM crash (#685)
Full Changelog: v0.13.3...v1.0.0
Release 0.13.3
This release reintroduces ProbabilityOfImprovement (#638) and Pareto diverse sample method (#643), which were temporarily removed in 0.13.2.
Note that the minimum supported TensorFlow version has been raised to 2.5.
New features
Batch Expected Improvement (#641, #653)
Portfolio method for Batch BO (#651, #659, #663)
Multifidelity modeling (#621, #654)
Explicit constraints (#656, #660); sketched below
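The explicit constraints work adds constraint classes that can be attached to a search space. A minimal sketch, assuming the scipy-style LinearConstraint / NonlinearConstraint signatures and the constraints keyword on Box used in the trieste documentation:

```python
import tensorflow as tf
from trieste.space import Box, LinearConstraint, NonlinearConstraint

# One linear constraint (lb <= A @ x <= ub) and one nonlinear constraint.
constraints = [
    LinearConstraint(
        A=tf.constant([[-1.0, 1.0], [1.0, 0.0]]),
        lb=tf.constant([-0.4, 0.15]),
        ub=tf.constant([0.5, 0.9]),
    ),
    NonlinearConstraint(lambda x: x[..., 0] - tf.sin(x[..., 1]), -1.0, 0.0),
]

# The constrained space is then used wherever an unconstrained Box would be.
search_space = Box([0.0, 0.0], [1.0, 1.0], constraints=constraints)
```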
Improvements
Allow scipy optimizer to be changed (#655)
Allow arbitrary dataset/model tags (#668) — note that this may break type checking for existing code
Slight improvements to synthetic objective functions (#671)
Build changes
Support latest gpflow and gpflux (#649)
Support more recent tox versions (#669)
Full Changelog: v0.13.1...v0.13.3
Release 0.13.2
This release fixes the 0.13.1 release by temporarily removing the two new features (#638 and #643) as they were preventing trieste from being used with the latest release of GPflow. A future release will reintroduce them.
The fix to handle constant priors in randomize_hyperparameters (#646) remains in the release.
Full Changelog: v0.13.0...v0.13.2
Release 0.13.1
New features
ProbabilityOfImprovement acquisition function (#638); see the sketch below
Pareto diverse sample method (#643)
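ProbabilityOfImprovement is a single-model acquisition builder, so it slots into the standard optimization rule in the same way as ExpectedImprovement. A minimal sketch, assuming it is exported alongside the other builders:

```python
from trieste.acquisition.function import ProbabilityOfImprovement
from trieste.acquisition.rule import EfficientGlobalOptimization

# Rank candidate points by their probability of improving on the current best;
# pass the rule to BayesianOptimizer.optimize(..., acquisition_rule=rule).
rule = EfficientGlobalOptimization(builder=ProbabilityOfImprovement())
```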
Improvements
Handle constant priors in randomize_hyperparameters (#646)
Full Changelog: v0.13.0...v0.13.1
Release 0.13.0
New features
Trajectory sampler support for multiple outputs (#582)
Acquisition functions are now pickleable (#607); see the sketch below
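A quick sketch of the pickling change, round-tripping an EfficientGlobalOptimization rule built around ExpectedImprovement (the set of pickleable objects is broader than shown here):

```python
import pickle

from trieste.acquisition.function import ExpectedImprovement
from trieste.acquisition.rule import EfficientGlobalOptimization

# Acquisition builders and the rules that wrap them can now be pickled,
# e.g. as part of saving and restoring optimization state.
rule = EfficientGlobalOptimization(builder=ExpectedImprovement())
restored = pickle.loads(pickle.dumps(rule))
```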
Improvements
TensorBoard logging improvements (#593, #600, #605, #612, #620, #630)
Scalable Pareto front calculation (#633)
Models with keras callbacks now pickleable (#604, #615)
Keras backend fix for Deep GPs (#591)
Fix issue with compiled tf.debugging.Asserts (#622, #623)
Fix trajectory samplers for deep ensemble (#611, #625)
Allow zero kernel samples in GPR hyperparameter init (#635)
Deprecated features
Remove model config support (#626)
Build changes
Code overview tutorial (#592)
Expose trieste version (#613, #618)
Cleaner API for single and multi objective test functions (#628)
AskTell model setter (#595)
Full Changelog: v0.12.0...v0.13.0
Release 0.12.0
New features
Early stop callback (#579); see the sketch below
Experimental plotting utilities package (#584)
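The early stop callback lets a run terminate before num_steps is exhausted. The sketch below is illustrative only: the toy objective is made up, and the callback signature (tagged datasets, models and acquisition state in, a boolean out) and the early_stop_callback keyword are assumptions based on this release note:

```python
import tensorflow as tf

from trieste.bayesian_optimizer import BayesianOptimizer
from trieste.models.gpflow import GaussianProcessRegression, build_gpr
from trieste.objectives.utils import mk_observer
from trieste.observer import OBJECTIVE
from trieste.space import Box

# Toy objective: a quadratic bowl over the unit square.
def quadratic(x):
    return tf.reduce_sum((x - 0.5) ** 2, axis=-1, keepdims=True)

search_space = Box([0.0, 0.0], [1.0, 1.0])
observer = mk_observer(quadratic)
initial_data = observer(search_space.sample(5))
model = GaussianProcessRegression(build_gpr(initial_data, search_space))

# Hypothetical stopping rule: stop once an observation is close enough to the optimum.
def good_enough(datasets, models, acquisition_state):
    return bool(tf.reduce_min(datasets[OBJECTIVE].observations) < 1e-3)

bo = BayesianOptimizer(observer, search_space)
result = bo.optimize(20, initial_data, model, early_stop_callback=good_enough)
```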
Improvements
Scipy BFGS optimizer tensorboard logging (#577)
Support empty tagged product spaces (#578)
Build changes
Versioned documentation (#581, #585, #586, #590)
Detect unreachable code (#576)
Fix docstring parameter typos (#587)
Full Changelog: v0.11.3...v0.12.0
Release 0.11.3
This point release restores support for GPflux 0.2.7, which was broken when support for GPflux 0.3.0 was added in the previous release.
Full Changelog: v0.11.2...v0.11.3
Release 0.11.2
This point release fixes another bug affecting the copying and saving of deep ensemble models, and also adds support for saving deep GP models.
New features
Copying and saving deep GP models (#569)
Fixes
Fix copying deep ensemble model with no optimizer callbacks (#569)
Build changes
Test against gpflux 0.3.0 and tensorflow 2.8.0 (#571)
Full Changelog: v0.11.1...v0.11.2
Release 0.11.1
This point release fixes a bug affecting the copying and saving of deep ensemble models.
Fixes
Fix copying of deep ensemble models (#567)
Full Changelog: v0.11.0...v0.11.1
Release 0.11.0
New features
SparseGaussianProcessRegression model wrapper (#531)
MonteCarloExpectedImprovement acquisition function (#393, #554)
DecoupledTrajectorySampler (#504, #556, #559)
DeepGaussianProcessDecoupledTrajectorySampler (#393, #549)
Fixed reference points for multi-objective optimization (#385)
Inducing point selectors (#511, #564)
HIPPO: HIghly Parallelizable Pareto Optimization (#519)
Saving optimization history to disk (#535)
Copying and saving deep ensemble models (#540)
MakePositive acquisition function transformation (#516)
try_get_optimal_point result helper method (#517); see the sketch below
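try_get_optimal_point reads the best point straight off an optimization result. A minimal end-to-end sketch on a toy objective; the three-way unpacking (best query point, its observation, and its index in the final dataset) is assumed here:

```python
import tensorflow as tf

from trieste.bayesian_optimizer import BayesianOptimizer
from trieste.models.gpflow import GaussianProcessRegression, build_gpr
from trieste.objectives.utils import mk_observer
from trieste.space import Box

# Toy objective: minimise a shifted quadratic over the unit square.
def objective(x):
    return tf.reduce_sum((x - 0.3) ** 2, axis=-1, keepdims=True)

search_space = Box([0.0, 0.0], [1.0, 1.0])
observer = mk_observer(objective)
initial_data = observer(search_space.sample(5, seed=42))  # seeded sampling, also new in this release

model = GaussianProcessRegression(build_gpr(initial_data, search_space))
result = BayesianOptimizer(observer, search_space).optimize(10, initial_data, model)

# Extract the best point found without digging through the final dataset by hand.
query_point, observation, arg_min_idx = result.try_get_optimal_point()
print(query_point, observation)
```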
Improvements
TensorBoard monitoring overhaul (#514, #515, #536, #544, #551)
Use gpflow posterior objects to speed up acquisition (#523, #532)
DeepEnsembleTrajectorySampler improvements (#541)
Avoid recompiling training loss closures (#553)
Default to compiled model optimizer for gpflow (#528)
Multiplication of mixed search spaces (#518); see the sketch below
Seed argument in search space sample method (#561)
Support ask-tell with uncopyable models (#545)
Make gpflux builders consistent with gpflow builders (#478)
Type checking improvements (#506, #507, #513, #521)
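The product operator now also composes mixed (tagged product) spaces. A small sketch, assuming TaggedProductSearchSpace and DiscreteSearchSpace compose via * as shown:

```python
import tensorflow as tf
from trieste.space import Box, DiscreteSearchSpace, TaggedProductSearchSpace

continuous = Box([0.0], [1.0])
discrete = DiscreteSearchSpace(tf.constant([[0.0], [1.0], [2.0]], dtype=tf.float64))

# A mixed continuous/discrete space, which can itself be multiplied by further
# spaces and sampled like any other search space.
mixed = TaggedProductSearchSpace([continuous, discrete])
bigger = mixed * Box([0.0], [1.0])
print(bigger.sample(3))
```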
Build changes
Update to gpflow 2.5.2 (#522, #550, #555)
Name integration tests (#508)
Make flake8 ignore build directory (#557)
Thompson Sampling notebook (#527)
Documentation fixes (#502, #525, #542)
Full Changelog: v0.10.0...v0.11.0