In ESMM, there seems to be a bug in the AUC computation. When a batch contains no positive examples, the whole batch is skipped when computing AUC. Such batches cannot simply be dropped from the computation, and this behavior differs from the AUC calculation in TensorFlow.

In my opinion, AUC should be computed globally over all examples, instead of averaged per batch as x-deeplearning does. An all-negative batch still changes the global false-positive counts, so it cannot be ignored.

This bug makes the reported performance of ESMM look much higher than it really is.
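A minimal sketch (not x-deeplearning's actual code) of why the two approaches diverge: a per-batch average must skip all-negative batches because AUC is undefined there, while a global AUC still counts the false positives those batches contribute. The `auc` helper and the toy batches below are illustrative assumptions, not taken from the repository.

```python
def auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney U) formulation.

    Returns None when labels are all one class, since AUC is
    undefined without both positives and negatives.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    if pos == 0 or neg == 0:
        return None
    # Rank all scores ascending, averaging ranks over ties.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    rank_sum_pos = sum(r for r, y in zip(ranks, labels) if y == 1)
    return (rank_sum_pos - pos * (pos + 1) / 2) / (pos * neg)

# Two toy batches: the second is all-negative but contains
# high-scoring examples (i.e., false positives at most thresholds).
batches = [
    ([1, 0, 0], [0.9, 0.8, 0.1]),   # positive ranks first within its batch
    ([0, 0, 0], [0.95, 0.7, 0.6]),  # all-negative batch, skipped per-batch
]

# Per-batch averaging: drop batches where AUC is undefined.
per_batch = [auc(y, s) for y, s in batches]
valid = [a for a in per_batch if a is not None]
batch_avg = sum(valid) / len(valid)

# Global AUC over the pooled labels and scores.
all_y = [y for ys, _ in batches for y in ys]
all_s = [s for _, ss in batches for s in ss]
global_auc = auc(all_y, all_s)

print(batch_avg, global_auc)  # 1.0 vs 0.8: the batch average overstates AUC
```

Here the per-batch average is 1.0 because the only positive tops its own batch, but globally a negative from the skipped batch outscores it, pulling the true AUC down to 0.8.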