Continue training in CLI if one iteration produces a single-leaf tree #5699

Open · wants to merge 1 commit into base: master
9 changes: 4 additions & 5 deletions src/boosting/gbdt.cpp
@@ -231,10 +231,8 @@ void GBDT::Train(int snapshot_freq, const std::string& model_output_path) {
   bool is_finished = false;
   auto start_time = std::chrono::steady_clock::now();
   for (int iter = 0; iter < config_->num_iterations && !is_finished; ++iter) {
-    is_finished = TrainOneIter(nullptr, nullptr);
-    if (!is_finished) {
-      is_finished = EvalAndCheckEarlyStopping();
-    }
+    TrainOneIter(nullptr, nullptr);
+    is_finished = EvalAndCheckEarlyStopping();
     auto end_time = std::chrono::steady_clock::now();
     // output used time per iteration
     Log::Info("%f seconds elapsed, finished iteration %d", std::chrono::duration<double,
@@ -421,6 +419,8 @@ bool GBDT::TrainOneIter(const score_t* gradients, const score_t* hessians) {
     models_.push_back(std::move(new_tree));
   }
 
+  ++iter_;
+
   if (!should_continue) {
Collaborator: Will the one-leaf trees be popped?

Author: I tested the result, and stopping or using it as-is works the same (assuming there is no other built tree down the line). Incrementing the iter_ variable is needed for the boosting to take it as a finished tree (even if the tree is of no use); otherwise it will continue forever.

Collaborator: Sorry for the late response. I think we should make the behavior the same as the python-package. I remember that the one-leaf tree is kept in the python-package, but the current implementation will pop the one-leaf trees.

Collaborator: @Samsagax what do you think about my comment?

     Log::Warning("Stopped training because there are no more leaves that meet the split requirements");
     if (models_.size() > static_cast<size_t>(num_tree_per_iteration_)) {
@@ -431,7 +431,6 @@ bool GBDT::TrainOneIter(const score_t* gradients, const score_t* hessians) {
     return true;
   }
 
-  ++iter_;
   return false;
 }
