
Please specify timeout for the Prometheus service HTTP POST call #550

Open
tisnik opened this issue Jul 8, 2019 · 8 comments

tisnik (Member) commented Jul 8, 2019

_session.post(url=METRICS_SERVICE_URL + "/api/v1/prometheus", json=metrics_payload)

Most requests to external servers should have a timeout attached, in case the server is not responding in a timely manner. By default, requests do not time out unless a timeout value is set explicitly. Without a timeout, your code may hang for minutes or more.
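For illustration, a minimal sketch of the same call with an explicit timeout; METRICS_SERVICE_URL, metrics_payload and _session are placeholders standing in for the real objects in the server code, and the timeout values are examples only:

import requests

METRICS_SERVICE_URL = "http://metrics-service:8080"  # placeholder for illustration
metrics_payload = {"metrics": []}                     # placeholder payload

_session = requests.Session()

try:
    # timeout=(connect, read) in seconds; without it requests can wait indefinitely
    _session.post(
        url=METRICS_SERVICE_URL + "/api/v1/prometheus",
        json=metrics_payload,
        timeout=(3.05, 10),
    )
except requests.exceptions.RequestException:
    # covers Timeout as well as connection errors; log and continue instead of hanging
    pass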

miteshvp (Contributor) commented Jul 8, 2019

It's an async future requests call; the API server does not wait for the response. Closing this.
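For context, a rough sketch of the fire-and-forget pattern being described, using requests-futures' FuturesSession; the URL and payload are placeholders, not the project's actual values:

from requests_futures.sessions import FuturesSession

METRICS_SERVICE_URL = "http://metrics-service:8080"  # placeholder
metrics_payload = {"metrics": []}                     # placeholder

_session = FuturesSession()

# post() returns a concurrent.futures.Future immediately; the HTTP request runs in a
# background worker thread, so the API handler does not block on Prometheus.
future = _session.post(url=METRICS_SERVICE_URL + "/api/v1/prometheus", json=metrics_payload)

# Nothing calls future.result() here, so the caller never waits for the response;
# the worker thread itself, however, can still hang on a slow server unless post()
# is given a timeout.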

miteshvp closed this as completed Jul 8, 2019
tisnik (Member, Author) commented Jul 8, 2019

@miteshvp isn't this session used to call workers asynchronously? I'd suggest using a second FuturesSession instance for calling Prometheus.
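For illustration, a hypothetical sketch of what a dedicated session for the Prometheus call could look like; the names, pool size and timeout are examples only, not the project's actual code:

from concurrent.futures import ThreadPoolExecutor
from requests_futures.sessions import FuturesSession

METRICS_SERVICE_URL = "http://metrics-service:8080"  # placeholder
metrics_payload = {"metrics": []}                     # placeholder

# A separate, small pool for metrics, so a slow Prometheus endpoint cannot tie up
# the threads used for the other asynchronous calls.
_metrics_session = FuturesSession(executor=ThreadPoolExecutor(max_workers=2))

_metrics_session.post(
    url=METRICS_SERVICE_URL + "/api/v1/prometheus",
    json=metrics_payload,
    timeout=5,  # example value only
)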

tisnik reopened this Jul 8, 2019
miteshvp (Contributor) commented Jul 8, 2019

Actually, it's threaded: https://github.com/fabric8-analytics/fabric8-analytics-server/blob/master/bayesian/api_v1.py#L81-L82, so we don't need another session.

tisnik (Member, Author) commented Jul 8, 2019

Yup, I meant that we might need to increase the number of workers for the FuturesSession?

miteshvp (Contributor) commented Jul 8, 2019

@tisnik - I guess 100 is a good enough number for our requests. These sessions are not used to invoke workers at the moment; workers are invoked via another async flow: https://github.com/fabric8-analytics/fabric8-analytics-server/blob/master/bayesian/api_v1.py#L278. I guess we are good here.

tisnik (Member, Author) commented Jul 8, 2019

Yup, and what timeout value should we choose? In this case it might be, let's say, one to five seconds at most. WDYT?
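For example, the one-to-five-second range could be expressed with requests' (connect, read) tuple form; the names below are placeholders for the real session and payload:

# Assuming _session, METRICS_SERVICE_URL and metrics_payload as in the snippet at the top:
future = _session.post(
    url=METRICS_SERVICE_URL + "/api/v1/prometheus",
    json=metrics_payload,
    timeout=(1, 5),  # up to 1 s to connect, up to 5 s waiting for the response
)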

miteshvp (Contributor) commented Jul 8, 2019

@tisnik - this is like a nanosecond call; a timeout doesn't make sense here.

tisnik (Member, Author) commented Jul 8, 2019

@miteshvp yes - when everything is working as expected. But in reality we'll see network issues that need to be handled by our service. We can discuss it in a chat rather than here ;)
