
Build with Spark 3.2 failed #113

Closed
XorSum opened this issue Feb 26, 2024 · 6 comments · Fixed by #117
Labels
bug Something isn't working

Comments

XorSum commented Feb 26, 2024

Describe the bug

Comet fails to compile with Spark 3.2.

[ERROR] Failed to execute goal com.diffplug.spotless:spotless-maven-plugin:2.43.0:check (default) on project comet-parent-spark3.2_2.12: Execution default of goal com.diffplug.spotless:spotless-maven-plugin:2.43.0:check failed: Unable to load the mojo 'check' in the plugin 'com.diffplug.spotless:spotless-maven-plugin:2.43.0' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: com/diffplug/spotless/maven/SpotlessCheckMojo has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
[ERROR] spark/src/test/scala/org/apache/comet/exec/CometExecSuite.scala:1069: value offset is not a member of org.apache.spark.sql.DataFrame

Steps to reproduce

PROFILES="-Pspark-3.2" make

Expected behavior

Compilation passes

Additional context

No response

XorSum added the bug label on Feb 26, 2024
XorSum (Author) commented Feb 26, 2024

The first error is easy to fix: just add the property <spotless.version>2.29.0</spotless.version> to the spark-3.2 profile.
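
A minimal sketch of that pom.xml change, assuming the Spotless plugin version is driven by a spotless.version property that the spark-3.2 profile can override (the exact property and profile layout in the Comet pom may differ):

```xml
<!-- Hypothetical sketch: pin an older Spotless inside the spark-3.2
     profile. Per the error above, 2.43.0 is built for Java 11+
     (class file 55), so the check fails on a Java 8 toolchain. -->
<profile>
  <id>spark-3.2</id>
  <properties>
    <spotless.version>2.29.0</spotless.version>
  </properties>
</profile>
```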

The second error is caused by the Dataset.offset API, which was only introduced in Spark 3.4. Should we split the code into two parts, for pre-3.4 and post-3.4 versions? (A sketch follows the links below.)

#85
https://issues.apache.org/jira/browse/SPARK-39159
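
For illustration, a minimal sketch of how the version split could look, with a hypothetical isSpark34Plus helper and a reflective shim (neither is Comet's actual code). Note that a runtime check alone is not enough: any source file that calls Dataset.offset directly still fails to compile on Spark 3.2/3.3, so the call must either go through reflection as below or live in a 3.4+-only source directory:

```scala
import org.apache.spark.SPARK_VERSION
import org.apache.spark.sql.DataFrame

object SparkVersionShim {
  // Hypothetical helper: true when running on Spark 3.4 or later.
  def isSpark34Plus: Boolean = {
    val Array(major, minor) = SPARK_VERSION.split("\\.").take(2).map(_.toInt)
    major > 3 || (major == 3 && minor >= 4)
  }

  // Reflection keeps this file compilable against Spark 3.2/3.3, where
  // Dataset.offset does not exist; call only when isSpark34Plus is true.
  def offsetCompat(df: DataFrame, n: Int): DataFrame =
    df.getClass
      .getMethod("offset", classOf[Int])
      .invoke(df, Int.box(n))
      .asInstanceOf[DataFrame]
}
```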

viirya (Member) commented Feb 26, 2024

Internally, we skip certain tests that are only for Spark 3.4, but it seems we don't do that for Comet yet.
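
For illustration, a skip of that kind might look like the ScalaTest sketch below, reusing the hypothetical SparkVersionShim from the previous comment. This only skips execution; direct Dataset.offset calls would still need version-split sources or reflection to compile on 3.2/3.3:

```scala
import org.scalatest.funsuite.AnyFunSuite

class OffsetSuite extends AnyFunSuite {
  test("offset (Spark 3.4+ only)") {
    // Cancels the test, rather than failing it, on Spark < 3.4.
    assume(SparkVersionShim.isSpark34Plus, "Dataset.offset requires Spark 3.4+")
    // ... test body exercising offset, e.g. via offsetCompat ...
  }
}
```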

advancedxy (Contributor) commented

Maybe we should also run CI with the different Spark version profiles. However, the CI run time is already quite long, so we should only enable that after reducing it.
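
A hypothetical sketch of what such a GitHub Actions matrix could look like, reusing the reproduction command from above (the actual workflow layout in the repo will differ):

```yaml
# Hypothetical workflow sketch, not the repo's actual CI definition.
name: spark-profile-builds
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        spark: ["3.2", "3.3", "3.4"]
    steps:
      - uses: actions/checkout@v4
      - run: PROFILES="-Pspark-${{ matrix.spark }}" make
```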

sunchao (Member) commented Feb 26, 2024

I think the Spark 3.2 & 3.3 profiles still have quite a few issues right now. I agree that we should add CI pipelines for them at some point, and perhaps make them trigger at post-commit time.

advancedxy (Contributor) commented

> I agree that we should add CI pipelines for them at some point, and perhaps make them trigger at post-commit time.

On second thought, maybe we can just compile (skipTests) against Spark 3.2 and 3.3 on CI for now? The test pipelines could be added later.
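
A minimal sketch of such a compile-only step, assuming plain mvn with the profile flags works outside the Makefile. Note that -DskipTests still compiles the test sources (where the offset error surfaces) but skips running them:

```sh
# Hypothetical compile-only CI commands: build each older profile
# without executing the test suites.
mvn clean package -Pspark-3.2 -DskipTests
mvn clean package -Pspark-3.3 -DskipTests
```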

sunchao (Member) commented Feb 27, 2024

> maybe we can just compile (skipTests) against Spark 3.2 and 3.3

Yes, I think we can do that. We can also try enabling them and see if they fail.
