OPENNLP-1515: Changing onnx dependency to onnxruntime. #551
Conversation
A motivation for this change is to offer better support for ONNX models in Apache Solr (apache/solr#1999). So, I'm proposing defaulting to the CPU-only onnxruntime artifact.
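In pom terms, the proposal amounts to making the CPU artifact the default dependency of opennlp-dl. A minimal sketch, reusing the onnxruntime.version property that appears in the pom quoted later in this thread:

<dependency>
  <groupId>com.microsoft.onnxruntime</groupId>
  <artifactId>onnxruntime</artifactId>
  <version>${onnxruntime.version}</version>
</dependency>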
LGTM
Looks good to me. Does that mean GPUs won't be supported until the OSX support is added to the removed dependency? If so, I think we should just include that info in the ANNOUNCE for the next release. Thanks!
What about using Maven profiles?
I think that would only apply at build time? Unless we produced artefacts to be uploaded to Maven per OS (I think I saw that once, some different flag on the dependency in the Maven pom.xml, but I never actually used it?).
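The "different flag" is presumably a Maven classifier, which lets a single groupId/artifactId publish several platform-specific jars. A purely illustrative sketch; note that onnxruntime does not actually publish its GPU build under a classifier, it uses the separate onnxruntime_gpu artifactId:

<dependency>
  <groupId>com.microsoft.onnxruntime</groupId>
  <artifactId>onnxruntime</artifactId>
  <version>${onnxruntime.version}</version>
  <!-- hypothetical classifier, shown only to illustrate per-variant selection -->
  <classifier>gpu</classifier>
</dependency>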
Something like:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache.opennlp</groupId>
    <artifactId>opennlp</artifactId>
    <version>2.3.1-SNAPSHOT</version>
    <relativePath>../pom.xml</relativePath>
  </parent>
  <groupId>org.apache.opennlp</groupId>
  <artifactId>opennlp-dl</artifactId>
  <name>Apache OpenNLP DL</name>
  <properties>
    <onnx-runtime.artifact.name>onnxruntime_gpu</onnx-runtime.artifact.name>
  </properties>
  <profiles>
    <profile>
      <id>system-osx</id>
      <activation>
        <os>
          <family>mac</family>
        </os>
      </activation>
      <properties>
        <onnx-runtime.artifact.name>onnxruntime</onnx-runtime.artifact.name>
      </properties>
    </profile>
  </profiles>
  <dependencies>
    <dependency>
      <groupId>org.apache.opennlp</groupId>
      <artifactId>opennlp-tools</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>com.microsoft.onnxruntime</groupId>
      <artifactId>${onnx-runtime.artifact.name}</artifactId>
      <version>${onnxruntime.version}</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </dependency>
    <dependency>
      <groupId>org.junit.jupiter</groupId>
      <artifactId>junit-jupiter-api</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.junit.jupiter</groupId>
      <artifactId>junit-jupiter-engine</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-simple</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>

I think it would propagate to the transitive dependencies as well, but it would need special treatment for building the zip releases (i.e. include both libs).
Yes, I think so. Someone could manually enable GPU by replacing the onnxruntime dependency with onnxruntime_gpu.
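For a downstream project that does want GPU, that swap is an ordinary dependency override. A minimal sketch, assuming the consumer excludes the transitive CPU artifact and pins its own onnxruntime version (the version values here are placeholders):

<dependency>
  <groupId>org.apache.opennlp</groupId>
  <artifactId>opennlp-dl</artifactId>
  <version>${opennlp.version}</version>
  <exclusions>
    <exclusion>
      <groupId>com.microsoft.onnxruntime</groupId>
      <artifactId>onnxruntime</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- onnxruntime_gpu ships the same Java API plus the CUDA execution provider -->
<dependency>
  <groupId>com.microsoft.onnxruntime</groupId>
  <artifactId>onnxruntime_gpu</artifactId>
  <version>${onnxruntime.version}</version>
</dependency>

Whether the GPU is actually used also depends on the code registering a GPU execution provider in the session options at runtime.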
I agree with you @jzonthemtn - maybe we can attach a separate GPU artifact. But maybe it isn't worth the effort. No idea how many people actually consume it via Maven ;-)
I think that would be ideal. I will write a new JIRA for it.
Thank you for contributing to Apache OpenNLP.
In order to streamline the review of the contribution we ask you to ensure the following steps have been taken:
For all changes:
Is there a JIRA ticket associated with this PR? Is it referenced in the commit message?
Does your PR title start with OPENNLP-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.
Has your PR been rebased against the latest commit within the target branch (typically main)?
Is your initial contribution a single, squashed commit?
For code changes:
For documentation related changes:
Note:
Please ensure that once the PR is submitted, you check GitHub Actions for build issues and submit an update to your PR as soon as possible.