diff --git a/CHANGELOG.md b/CHANGELOG.md
index c83fd7bb..f540ac9d 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,15 +1,6 @@
 # Changelog
 
-## HEAD
-
-- Add support for monotonic constraints for gradient boosted trees.
-
-## Fix
-
-- Fix Window compilation with Visual Studio 2019
-- Improved error messages for invalid training configuration
-
-## Breaking change
+## Breaking changes
 
 - The dependency to the distributed gradient boosted trees learner is
   renamed from
@@ -22,16 +13,19 @@
 - The training configuration must contain a label. A missing label is no longer
   interpreted as the label being the input feature "".
 
-## 1.6.0 - rc0 2023-08-22
-
 ### Feature
+
+- Add support for monotonic constraints for gradient boosted trees.
 
 - Improve speed of dataset reading and writing.
 
 ### Fix
 
 - Proper error message when using distributed training on more than 2^31
   (i.e., ~2B) examples while compiling YDF with 32-bits example index.
+- Fix Windows compilation with Visual Studio 2019.
+- Improve error messages for invalid training configurations.
+- Replace outdated dependencies.
 
 ## 1.5.0 - 2023-07-03
diff --git a/README.md b/README.md
index fde85380..fb0e7f2a 100644
--- a/README.md
+++ b/README.md
@@ -166,6 +166,11 @@ the following paper:
 Yggdrasil Decision Forests: A Fast and Extensible Decision Forests Library,
 Guillame-Bert et al., KDD 2023: 4068-4077. doi:10.1145/3580305.3599933
 
+## Contact
+
+You can contact the core development team at
+[decision-forests-contact@google.com](mailto:decision-forests-contact@google.com).
+
 ## Credits
 
 Yggdrasil Decision Forests and TensorFlow Decision Forests are developed by:
diff --git a/documentation/developer_manual.md b/documentation/developer_manual.md
index 35384935..8c2296d8 100644
--- a/documentation/developer_manual.md
+++ b/documentation/developer_manual.md
@@ -4,13 +4,7 @@
 
-* [Developer Manual](#developer-manual)
-    * [Table of Contents](#table-of-contents)
-    * [Design principles](#design-principles)
-    * [Where to change the code](#where-to-change-the-code)
-    * [Integration with existing code](#integration-with-existing-code)
-    * [How to test the code](#how-to-test-the-code)
-    * [Models and Learners](#models-and-learners)
+
 
diff --git a/documentation/rtd/hyper_parameters.md b/documentation/rtd/hyper_parameters.md
index 4e2780fa..9ce23b98 100644
--- a/documentation/rtd/hyper_parameters.md
+++ b/documentation/rtd/hyper_parameters.md
@@ -869,7 +869,7 @@ reasonable time.
 
 A CART (Classification and Regression Trees) a decision tree. The non-leaf nodes
-contains conditions (also known as splits) while the leaf nodes contains
+contain conditions (also known as splits) while the leaf nodes contain
 prediction values. The training dataset is divided in two parts. The first is
 used to grow the tree while the second is used to prune the tree.
 
@@ -1121,8 +1121,8 @@ learner hyper-parameters.
 
 - **Type:** Real **Default:** 0.1 **Possible values:** min:0 max:1
 
-- Ratio of the training dataset used to create the validation dataset used to
-  prune the tree. If set to 0, the entire dataset is used for training, and
+- Ratio of the training dataset used to create the validation dataset for
+  pruning the tree. If set to 0, the entire dataset is used for training, and
   the tree is not pruned.