SNOW-612394: Support for Python >= 3.9 #377
Comments
Hey @dingobar, thanks for your feedback! We use Python 3.8 to maintain consistency between the client and the server-side infrastructure (mainly for Python UDFs). We plan to support newer versions of Python on the server side, and client support will be updated accordingly.
Hi @sfc-gh-jdu, thank you for getting back to me. What's the ballpark estimate for when that could come to fruition?
@dingobar Unfortunately, we do not have an ETA yet but it is under active discussion within the team.
That's fair enough. I guess it also makes sense to sync the versions between the client and the server, as the code will actually be mirrored and run server-side. In any case, I find it a bit of a headache, as this dependency cannot be installed into the curated Python environments where our data scientists interact with Snowflake: that environment runs Python 3.10, and I am hesitant to downgrade it. Any chance we could release the 3.8.* pin and specify the server-side supported version in the docs? We could even log a warning when creating a session on an unsupported version? As I mentioned earlier, everything works just fine on Python 3.10 at a glance, and I find it unlikely that we would run into issues. Edit: To be clear, I don't mind implementing a solution for this if need be.
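For illustration, the "warn instead of pin" idea above might look roughly like the following; the version set and helper name are hypothetical, not part of the snowflake-snowpark-python package:

```python
import sys
import warnings

# Hypothetical sketch of the proposal: instead of a hard pin, warn when the
# interpreter is not a version validated against the server-side runtime.
# SUPPORTED_VERSIONS and warn_if_unsupported_python() are illustrative names.
SUPPORTED_VERSIONS = {(3, 8)}

def warn_if_unsupported_python() -> None:
    current = sys.version_info[:2]
    if current not in SUPPORTED_VERSIONS:
        warnings.warn(
            f"Python {current[0]}.{current[1]} has not been validated against "
            "the Snowflake server-side runtime; UDF behavior may differ.",
            UserWarning,
        )

warn_if_unsupported_python()  # e.g. something a Session constructor could call
```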
Also running into this issue, what's the plan here?
@hanakawamomo there is a perfectly harmless workaround of just installing Python 3.8 and working in that environment. Are you having trouble with that?
I wouldn't consider downgrading the Python minor version universally harmless, though.
@sfc-gh-jdu I'd also like to point out that Snowflake's recommended solution to this issue for ARM macOS users is "update to Homebrew Python 3.9". So the Snowflake connector doesn't work with Python < 3.9 on ARM Macs, and Snowpark doesn't work on Python > 3.8.
@badge seconded. That's kind of a showstopper for my team. We'd be fine with updating to 3.9 or fine running on 3.8, but we sure can't do both simultaneously.
Hi @sfc-gh-jdu, is there any update on the ETA?
+1 to @badge's comment, we'd love to test Snowpark locally as M1 macOS users.
Is there any update on this issue? Wondering when support for Python 3.9 is coming.
@amirhessam88 We are looking at 3.9 support as a high-priority item with a ballpark ETA of the first half of next year. We will keep you posted on a more concrete date as the work progresses.
I'm running into this issue as well. We run a web app on anvil.works, which supports Python 3.7 and Python 3.10.
I am in the same boat. I can't believe the "ballpark" ETA is the first half of next year...
For the libffi issues on M1, please follow this workaround. We will continue working towards Python 3.9 in the meantime.
Thanks @sfc-gh-achandrasekaran! Working so far. One minor thing: I did need to change the order of the conda create arguments.
3.9 is nice, but is there any reason why it cannot just be any Python version? You could package the Python binary as a conda package and install it server-side as needed. If you only go for Python 3.9 support, this will become a forever-recurring issue 😐
We need to work on server-side support for other Python versions mainly because of UDFs. When you register a runtime Python function as a UDF, in order to make it work in Snowflake exactly as it does in your local environment, we have to make sure the Python minor version is the same, and that needs some non-trivial engineering effort on the server side. We also briefly explain how UDFs work between the client and the server here, if you're interested.
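As a rough illustration of why the minor versions have to match (a toy sketch, not Snowpark's actual internals): a UDF-style round trip serializes the function's code on the client and loads it on the server, and that serialized form is tied to the Python version that produced it. The example assumes cloudpickle is installed:

```python
import pickle
import sys

import cloudpickle  # the style of serializer used to ship function code

# Toy illustration of the client/server round trip: the client serializes a
# function's code, the server deserializes and runs it. The serialized form
# depends on the Python minor version that produced it, which is why the
# client and the server-side runtime need to match.

def add_one(x: int) -> int:
    return x + 1

payload = cloudpickle.dumps(add_one)  # what a client would ship to Snowflake
restored = pickle.loads(payload)      # what the server-side runtime would load
assert restored(41) == 42
print("serialized on Python", sys.version_info[:2])
```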
It has been half a year since the issue was opened, and now another six months 😥.
@vanducng Supporting 3.9 requires a bit of work on our server side, which we are actively doing, with a target date in the first half of 2023.
@vanducng The package you created is causing some confusion among other customers. Could you please update the PyPI description on https://pypi.org/project/snowflake-snowpark-python3/ to indicate that this is not an official Snowflake-supported package? Please also remove the release history associated with that page. Thanks
To be safe, I have removed the package from PyPI.
Wondering if "supporting 3.9 requires a bit of work" refers to 3.9 only or includes 3.10 as well? And what would the ETA for Python 3.10 support be?
I don't really get why the workaround fork was an issue. It was what allowed me to continue using your product at all, since not everyone can work on 3.8, for a myriad of reasons. @sfc-gh-achandrasekaran
If you're comfortable with it, would you post a GitHub link so that people who need to use it can install it locally?
We have client security requirements to be on up-to-date versions of our software. Python 3.8 is rapidly approaching EOL, and we don't want to build against Snowpark if it is not going to support modern Python.
I was told in a Snowflake web session/seminar last week that 3.9 and 3.10 support would be in some kind of public preview "in the next few months"; is that still the plan?
@etdr yep, that is still the plan.
Please support 3.10 🙏
Hello, do you have any update on Python 3.10 support?
I've been following this issue for the best part of a year; Python 3.9 is over 2.5 years old at this point and we've had another major release since then. As far as I can tell, there are two use cases for the snowpark-python package: (1) writing Python code (UDFs and stored procedures) that is deployed to and executed inside the Snowflake warehouse, and (2) using the DataFrame API on your own compute purely to generate SQL that is pushed to Snowflake.
In the case of (1), it's understandable that more time and effort is required to make sure that the Snowflake warehouse environment works with different Python versions (Lord knows it took AWS long enough with Lambdas), but in the case of (2), all that gets sent down the line to the Snowflake warehouse is Snowflake SQL, and the version of Python used to create it is immaterial.

We enthusiastically embraced Snowpark when it became available and have a fair amount of code running on our compute for pulling and pushing data from and to Snowflake (i.e. use case (2)), in addition to deploying some code to Snowflake itself. (To the extent that I wrote the Go code for the Terraform provider to support Python procedures, which was released earlier this month.) But given the status of this issue we've now implemented a blanket ban on any new Snowpark code, and we're removing it whenever we work on existing code.

Even if this issue gets resolved in the next n < ~3 months, we don't have a) clarity on whether Python 3.10+ will be supported, or b) any confidence that, if they are, we won't find ourselves in a similar situation in another 12-18 months' time. The downsides for us are more work to remove existing code and a less appealing developer experience; the upsides are no longer being beholden to Snowflake's opaque development priorities for the Snowpark package. In addition, if we're working with bare SQL, it is significantly less work to move to another data warehouse in the future.

Of course I have no idea how many other people are using Snowpark for use case (2), but if it's significant, perhaps it makes sense to spin off a separate package (e.g. snowpark-python-core) which contains the bulk of the existing Snowpark code but none of the procedure/UDF deployment code, which would stay in the existing package. That way it would be trivial to support new Python versions for people running Snowpark 'locally' in the core package, while ensuring that code which gets deployed to a Snowflake warehouse isn't going to cause problems. Naturally this would require some up-front work on Snowflake's part, but if it stops users giving up on Snowpark, and increases customer stickiness, it could be worth it.

(Yes, I will be having this discussion with our AE soon, but I thought it worthwhile posting here too.)
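To make the distinction concrete, here is a minimal sketch of use case (2); the connection parameters, table and column names are placeholders, and nothing in it ships Python code to the warehouse:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

# Placeholder credentials; fill in real values to run this.
connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# The DataFrame below is composed lazily on the client; only the generated
# SQL runs inside Snowflake when show()/collect() is called, and no Python
# code is deployed to the warehouse.
big_orders_by_region = (
    session.table("ORDERS")       # hypothetical table
    .filter(col("AMOUNT") > 100)  # hypothetical column
    .group_by("REGION")
    .count()
)
big_orders_by_region.show()
```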
@badge: Our goal is to have a public preview of Python 3.9 at the end of May, and Python 3.10 at the end of June. This includes the Python API, UDFs, UDTFs and stored procedures. I agree with you that we could have made a better decision to spin off the core SQL generation as a separate package, independent from UDF Python versions. That being said, we could provide you with a wheel file that supports multiple runtime versions. Let me know if this is something that could unblock you.
A wheel file would be great for unblocking us.
Unfortunately I'm not able to share a wheel file via GitHub :( You should be able to get the wheel file built by this GitHub action (end of the page): https://github.com/snowflakedb/snowpark-python/actions/runs/4320547133. This was built on top of 1.2.0 and supports Python 3.8-3.10. Note that this is for testing only, so please don't use it in production; also, UDF-related operations don't work yet. We would love to hear your feedback :)
Is there a schedule for a fully supported release of the library for Python 3.10 or newer?
@jonnio: We are working towards providing predictable SLAs on Python runtime upgrades & deprecation. Stay tuned!
Hey team, end of May has passed! How are we looking for the 3.9 & 3.10 releases?
@sfc-gh-sfan Would it be possible to add wheels for 3.11 to the build pipeline and regenerate them, assuming there are no breaking changes between 3.10 and 3.11? As a broader point, could a workaround for the UDF and SQL-generation packages being intertwined be to offer the UDF functionality as an optional extra install?
Hello! 1.5.0 was released yesterday, which supports 3.9, and we plan to make a micro version release soon that will support 3.10. As for 3.11, it's on our radar, but we need to have UDF support before bumping the client's Python version. Re: having the UDF functionality as an extra install: looking back, it would have been a better idea to separate UDFs from the Snowpark core DataFrames from the start. We'll evaluate this option, but it might not be the top priority for the team.
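For what it's worth, a rough sketch of the "UDF as an optional extra" packaging idea using setuptools extras; every name and dependency split below is hypothetical and does not reflect an actual Snowflake plan:

```python
# setup.py sketch of a split core package; names are illustrative only.
from setuptools import find_packages, setup

setup(
    name="example-snowpark-core",
    version="0.1.0",
    packages=find_packages(),
    # The SQL-generation core has no reason to pin an upper Python bound.
    python_requires=">=3.8",
    install_requires=["snowflake-connector-python"],
    extras_require={
        # Only UDF/stored-procedure deployment is tied to the server-side
        # Python runtime, so it sits behind an optional extra.
        "udf": ["cloudpickle"],
    },
)
```

With a split like this, SQL-only users would install just the core package, while `pip install "example-snowpark-core[udf]"` would pull in the runtime-sensitive pieces.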
Thanks @sfc-gh-sfan for this. Looking forward to the micro version too.
1.5.1 has been released on PyPI and supports 3.10!
When will 1.5.1 roll out on the Snowflake Anaconda repo? We still have 1.4.0.
@dwelden: It typically takes 3-4 weeks for the Anaconda channel to get mirrored into Snowflake.
Thanks team, this is great news!
Can you please also update the conda-forge recipe? https://github.com/conda-forge/snowflake-snowpark-python-feedstock
Updated in conda-forge/snowflake-snowpark-python-feedstock#15 :)
Any plans for supporting Python 3.11?
That was back on June 15th; any rough estimate of when 3.11 will be supported? I believe 3.12 will also be released soon.
We are currently targeting end of October for 3.11 support. You can already see PRs in this repo heading in that direction :) cc @sfc-gh-aling
Hello,
3.11 was released as part of 1.9.0 this week. Check https://github.com/snowflakedb/snowpark-python/blob/main/CHANGELOG.md#new-features 😃
I'll close this issue then :)
What is the current behavior?
Install fails on Python 3.9
What is the desired behavior?
That it works on Python 3.9. Our use case is installing Snowpark in the jupyterhub/docker-stacks scipy image, which is in fact already on Python 3.10.
How would this improve snowflake-snowpark-python?
More flexibility. Is there really such a breaking change in Python 3.9 that warrants this oddly specific version pin on 3.8? I suspect just releasing the pin would probably be enough, as I'm not aware of any incompatibilities between 3.8 and >=3.9.
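For context, the pin being discussed lives in the package metadata; a hedged sketch of what releasing it might look like follows (the exact specifier in snowflake-snowpark-python's own setup.py may differ):

```python
# Metadata sketch only; not the real snowflake-snowpark-python build file.
from setuptools import setup

setup(
    name="example-package",
    version="0.1.0",
    # A strict pin behaves roughly like: python_requires=">=3.8,<3.9",
    # refusing to install on 3.9+. Releasing the pin, as suggested above:
    python_requires=">=3.8",
)
```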