Slow execution time of reading a big file #128
Comments
@tomkeee Print operations are slow in every language; if you print every line of a big file, keep in mind that this will drastically slow down your program. What is the execution time (processing the full file) without printing?
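For reference, here is a minimal, framework-agnostic sketch of how the two timings could be compared. It does not reproduce the attached main.py or any scramjet-framework-py calls; it just reads the file line by line with plain Python, once silently and once with printing, to isolate the cost of stdout:

```python
import sys
import time

def process(path: str, echo: bool) -> float:
    """Read the file line by line, optionally printing each line; return elapsed seconds."""
    start = time.perf_counter()
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            if echo:
                # Printing every line forces millions of small writes to stdout.
                print(line, end="")
    return time.perf_counter() - start

if __name__ == "__main__":
    path = sys.argv[1]  # e.g. the 532 MB .csv from the report
    silent = process(path, echo=False)
    printed = process(path, echo=True)
    print(f"\nwithout print: {silent:.1f}s, with print: {printed:.1f}s", file=sys.stderr)
```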
@Tatarinho
Hi @tomkeee, we'll be looking into this next week.
Hmm... so I did some initial digging and was able to run a test on a local network; a similar program in Node runs quite fast, though not as fast as reading from disk... We need to take the network connection into account, but that wouldn't explain 25 minutes. Could you follow this guide: https://docs.scramjet.org/platform/self-hosted-installation and then, based on that, try your program with the data sent to the self-hosted instance?
I've tested how fast the platform would analyze a big .csv file (532 MB, 3,235,282 lines). The execution time of the program (code below) is about 25 minutes.
The program should just print each line together with a very simple comment.
main.py
package.json
requirements.txt
scramjet-framework-py