Add support for 3D and 2D grouped convolutions #33
Conversation
a31f546 to 0bdd955
@@ -33,30 +35,36 @@ class ConvConfig:
    Q: int
    F: int
    S: int
    is_grouped_conv: bool
    G: int  # group count
    is_3D_conv: bool
Is it possible to make another class for conv3d instead? That way, every time we try to add a problem, we don't need the False, -1, -1, -1 part.
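The separate-class suggestion above could look like the sketch below. This is a minimal, hypothetical illustration (the real ConvConfig has many more fields than the three shown in the diff): a base class keeps the existing 2D fields, and a conv3d subclass carries the grouped/3D fields, so 2D problems never need to pass placeholder values.

```python
from dataclasses import dataclass


@dataclass
class ConvConfig:
    # Only the three fields visible in the diff are shown here;
    # the real class has more.
    Q: int
    F: int
    S: int


@dataclass
class Conv3DConfig(ConvConfig):
    # Extra fields only needed for grouped/3D convolutions.
    is_grouped_conv: bool = False
    G: int = -1  # group count
    is_3D_conv: bool = True
```

With this split, existing 2D configs like `ConvConfig(Q=..., F=..., S=...)` stay unchanged, while 3D problems construct `Conv3DConfig` and override only the fields they need.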
I'll suggest another option, which would be to give these parameters default values and put them at the end of the __init__ function. Then we don't need an extra class, and the conv2d configs don't need to add these fields.
It's probably a good idea to get rid of these classes in favor of Python bindings at some point anyway, so I see the config classes as a temporary implementation. Linalg ops are relatively stable, so this has not caused us any maintenance issues yet, but the bindings are much more resilient to changes in IR assembly format. We are making the same transition on the tuner side right now, because it is very dependent on codegen dialects (which are very much in flux at the moment).
Eventually we can replace these separate config classes with a single config class used across all kernels (gemm, attention, conv, etc.) that has functions to build the desired kernel types and tracks things like peak flops and arithmetic intensity. Then we can get rid of all these classes and move everything to a shared benchmarking implementation.
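The default-values alternative mentioned above can be sketched as follows. This is a hypothetical illustration, assuming ConvConfig is a dataclass and showing only the three fields from the diff; the point is that defaulted fields go last, so every existing 2D config site compiles unchanged.

```python
from dataclasses import dataclass


@dataclass
class ConvConfig:
    # Existing 2D fields (only those visible in the diff are shown).
    Q: int
    F: int
    S: int
    # New fields with defaults, placed at the end so existing
    # conv2d configs need no changes.
    is_grouped_conv: bool = False
    G: int = -1  # group count
    is_3D_conv: bool = False
```

A 2D problem stays `ConvConfig(Q=..., F=..., S=...)`, while a grouped 3D problem passes the extra keywords explicitly, e.g. `ConvConfig(Q=..., F=..., S=..., is_grouped_conv=True, G=8, is_3D_conv=True)`. Note that dataclasses require defaulted fields to follow non-defaulted ones, which is exactly the ordering suggested here.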
Could you add some 3D and grouped convs to the conv problems so it can be tested on the CI?