
multi-tenancy support #50

Open
zhjuncai opened this issue Dec 2, 2020 · 4 comments

Comments


zhjuncai commented Dec 2, 2020

Hello team,

I'd like to try kafka-connect-sap to set up an ETL pipeline that extracts data from SAP HANA. The transactional data in SAP HANA uses schema-based multi-tenancy, meaning each tenant has its own schema (HDI container).

Is there a way to use this library to connect to all the HDI containers? More generally, how does it support multi-tenancy in SAP infrastructure?

Best regards,
Ethan


elakito commented Dec 3, 2020

HDI's DB-schema-based multi-tenancy only lets users interact with tables residing in their corresponding DB schema, and that is transparent to the JDBC driver itself. The connection used by HDI is expected to require TLS, so the encrypt property must be set to true (see the HANA JDBC connection properties: https://help.sap.com/viewer/0eec0d68141541d1b07893a39944924e/LATEST/en-US/109397c2206a4ab2a5386d494f4cf75e.html). How the other TLS-related properties must be set likely depends on how the server side is configured.
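As a hedged sketch, a source connector pointed at one tenant's HDI container might be configured like the fragment below. The host, credentials, schema, and table names are placeholders, and the property names follow the conventions of the kafka-connect-sap README (`connection.*` and `<topic>.table.name`); only `encrypt=true` in the JDBC URL is the point being illustrated here:

```properties
# Sketch of a kafka-connect-sap source connector config (placeholder values).
name=hana-hdi-tenant1-source
connector.class=com.sap.kafka.connect.source.hana.HANASourceConnector
tasks.max=1
topics=tenant1_orders
# HDI connections require TLS, hence encrypt=true in the JDBC URL.
connection.url=jdbc:sap://<hana-host>:<port>/?encrypt=true&validateCertificate=true
connection.user=<hdi-runtime-user>
connection.password=<hdi-runtime-password>
# Table inside the tenant's HDI container schema (placeholder names).
tenant1_orders.table.name="TENANT1_SCHEMA"."ORDERS"
```

Under this schema-per-tenant model, connecting to several HDI containers would mean deploying one such connector per container, each with its own runtime credentials.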


zhjuncai commented Dec 4, 2020

Hello @elakito, thanks for your prompt response. The reason I ask is that the HDI container's username and password are changed dynamically and managed by the instance manager. If we configure the JDBC URL, username, and password in the properties file, the connector will stop working once the credentials change. Alternatively, we could use a technical user that can access the data in multiple schemas...

I would just like to know if there is a built-in solution to support this...


elakito commented Dec 4, 2020

I have to think about the options. One option may be to use a ConfigProvider to store the credentials elsewhere and to trigger a start and stop of the connector worker via the Kafka Connect API every time password provisioning happens.
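As a sketch of the ConfigProvider idea: Kafka's built-in `FileConfigProvider` can resolve credentials from an external file each time the connector (re)starts, so the connector config itself never has to change when the instance manager rotates the password. The file path and key names below are assumptions:

```properties
# In the Connect worker config: register the file config provider.
config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider

# In the connector config: reference keys kept in an external properties file.
# /etc/secrets/hana.properties (placeholder path) would contain the current
# connection.user=... and connection.password=... written by the provisioner.
connection.user=${file:/etc/secrets/hana.properties:connection.user}
connection.password=${file:/etc/secrets/hana.properties:connection.password}
```

The placeholders are resolved when the connector task starts, so restarting the task after each rotation picks up the new values.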


elakito commented Jan 29, 2021

@zhjuncai I think one way to handle the automatic credential update is to simply restart the task. When you deploy a connector foo using the Connect REST API POST /connectors, you can find its task ID using GET /connectors/foo/tasks. Later, whenever the credentials are updated, you can restart the task using POST /connectors/foo/tasks/<task-id>/restart, which will load the updated credentials and continue running the connector's task.

In other words, you can control everything using the Kafka Connect REST API. Does this answer your question?
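The restart flow above can be sketched with curl against a Connect worker. The worker URL (localhost:8083 is Connect's default REST port), the connector name foo, and task ID 0 are assumptions for illustration:

```shell
# Kafka Connect REST endpoint (assumed to be the default localhost:8083).
CONNECT_URL="http://localhost:8083"
CONNECTOR="foo"

# 1. List the connector's tasks to find the task ID:
#    curl -s "${CONNECT_URL}/connectors/${CONNECTOR}/tasks"

# 2. After the instance manager rotates the credentials, restart the task
#    so it reloads them (task ID 0 assumed here):
TASK_ID=0
RESTART_URL="${CONNECT_URL}/connectors/${CONNECTOR}/tasks/${TASK_ID}/restart"
echo "${RESTART_URL}"
#    curl -s -X POST "${RESTART_URL}"
```

A credential-provisioning hook could run step 2 automatically after each rotation.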
