Could we train Implicit Sequence Models in a multi-label scenario? #156
Are you trying to model a scenario where a user buys two or more items at the same time? You are right that this isn't supported here in a straightforward way.
I'm not sure of the intent of the original question, but this is something I am also attempting to do. I think about my sequences like this. The evaluation is easy enough: the default set-overlap metrics work just fine — precision = len(intersect) / len(predictions), recall = len(intersect) / len(targets), plus F1 — just use T1 ... T1+i to predict Tn. You have to modify the evaluation function to separate by transaction instead of masking the last k products, and deal with potential repeat purchases within a transaction. As far as the model goes, however, I'm not certain the current implementation works for semi-sequential data. Do you have any thoughts about a way to extend the current model to handle a scenario like this? Thank you for all the hard work on this amazing package.
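For concreteness, the per-transaction evaluation described above could be sketched like this. The function name and tuple return are illustrative, not Spotlight's API:

```python
def transaction_metrics(predicted, actual):
    """Precision, recall, and F1 over the item sets of one transaction.

    predicted: iterable of recommended item ids
    actual: iterable of item ids actually bought in the target transaction
    Duplicates are collapsed, which also handles repeat purchases
    within a transaction by counting each item once.
    """
    predicted, actual = set(predicted), set(actual)
    hits = len(predicted & actual)
    precision = hits / len(predicted) if predicted else 0.0
    recall = hits / len(actual) if actual else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Grouping predictions and targets by transaction id before calling this, rather than masking the last k items, matches the evaluation change described above.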
Come to think of it, it may be possible to do some form of sub-sorting within transactions. My idea is to impose an order in the training set. My initial thought is to sort within a transaction by the value of that product within the transaction (utility), so that value is always ascending within a transaction. Then you could manually assign custom timestamps to impose this order. This should work because the order of my prediction sequence isn't important.
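A minimal sketch of that sub-sorting idea, assuming records arrive as `(transaction_id, item_id, utility)` tuples (the field layout is an assumption, not Spotlight's input schema):

```python
def impose_order(records):
    """Sort items within each transaction by ascending utility and
    assign synthetic, strictly increasing timestamps encoding that order.

    records: list of (transaction_id, item_id, utility) tuples,
             with transaction ids already in chronological order.
    Returns a list of (item_id, timestamp) pairs.
    """
    # Primary key: transaction id (keeps transactions in sequence);
    # secondary key: utility (imposes the within-transaction order).
    ordered = sorted(records, key=lambda r: (r[0], r[2]))
    return [(item, ts) for ts, (_, item, _) in enumerate(ordered)]
```

The synthetic timestamps could then be fed to the sequence model in place of real ones, turning the unordered baskets into a fully ordered sequence.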
@maciejkula yeah, that's it. Each time step, instead of having only one single label (item) per position, would have many instead. I don't know what losses we could use.