
Suggested Multiple Computer Configuration for Batch Rendering #91

Open
ethangale25 opened this issue Apr 14, 2022 · 10 comments
Labels
documentation Improvements or additions to documentation

Comments

@ethangale25

Hey Torrin & Crew,

This isn't so much an issue as a request for a recommendation on how to get multiple computers rendering at the same time (e.g. computer 1 rendering batch 1, computer 2 rendering batch 2, etc.).

There are options for both "Virtual Drive" and "Mirror Drive" in Google Drive...
[screenshot: Google Drive sync options]
Could you foresee potential issues with using a Virtual Drive system, where documents need to be uploaded before they can be accessed? For instance, if a batch was ended prematurely and part of a file changed so that the renderer would know where to pick up the render again, would that change have to be uploaded before reopening the .blend file?

Would you recommend a "Mirror Drive" system instead?

What have you used in the past? Box? Google Drive? Dropbox? OneDrive?

Thank you,

@ethangale25
Author

-OR-

Do you not even suggest using a cloud storage system?
Would you create the NFT data, duplicate it across several computers, render locally on each computer, and then carefully consolidate the renders onto one computer and rebatch after the fact?

Yikes.

@torrinworx
Owner

I would suggest using Mirror files; BMNFTs needs to open and modify many files, and I see a potential issue when you refactor your batches where BMNFTs might not be able to move around and rename all those files because they are located in Google Drive. I would run some small tests to see if it works before going all in.

The method we used was simply to duplicate the NFT Data folder and .blend file across all the machines we would render on, then have each machine render a unique batch. Then we would send the finalized batches to one machine to refactor. It's inefficient and prone to error, but if you are careful it will work.
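
If you want to script that duplication step, a minimal sketch like the one below would do it. All paths, machine names, and the .blend filename are placeholders; the batch each machine renders is still selected manually in the BMNFTs panel on that machine.

```python
# Sketch only: copy the NFT Data folder and .blend file to each render machine's
# mounted share, and note which batch that machine is responsible for.
import shutil
from pathlib import Path

PROJECT = Path.home() / "Desktop" / "NFT_Project"        # hypothetical project folder
MACHINES = {
    "render-01": Path("//render-01/share/NFT_Project"),  # hypothetical SMB/NFS mounts
    "render-02": Path("//render-02/share/NFT_Project"),
}
BATCH_ASSIGNMENT = {"render-01": 1, "render-02": 2}       # one unique batch per machine

for name, dest in MACHINES.items():
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copytree(PROJECT / "NFT_Data", dest / "NFT_Data", dirs_exist_ok=True)
    shutil.copy2(PROJECT / "project.blend", dest / "project.blend")
    print(f"{name}: project copied, render batch {BATCH_ASSIGNMENT[name]} on this machine")
```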

If you end up running some tests with Google Drive's Mirror Drive, let me know; I'm interested in setting a system like that up.

@torrinworx
Owner

Honestly we never thought about using a cloud storage solution because we didn't know if it was possible; seems silly now lol

I'm definitely going to try it out for our next client.

@ethangale25
Author

ethangale25 commented Apr 15, 2022

Couple questions:

  1. When you duplicate the NFT_Data folder and .blend file across other machines, what steps do you need to take to make sure that the correct files are referenced when generating additional batches?
  2. Does the .blend file know where to "look" for the NFT_Data folder in the file hierarchy?

@torrinworx
Owner

I've done some testing and found that using Google Drive with Stream Files is effective. However, the Blend_My_NFTs Output folder needs to be set to Offline Access so that the files are downloaded on each computer you use. This will ensure that the Batch.json files for each batch are accessible on each machine.

Through testing we also found that the settings in the Create Data panel must be the same on all machines. These settings include the NFT collection size, NFT Name, NFTs per batch, and, most importantly, the Save Path (the path itself might differ on each machine, but it should point to the same directory so that BMNFTs is running on the same files across all machines).

We recommend that you generate the NFT data on one machine, then wait for all your machines to sync with Google Drive so that the Blend_My_NFTs Output folder appears on each of them, and finally begin rendering each batch.

We used Google Drive to store the .blend file so we didn't have to worry about multiple versions/copies floating around; it worked surprisingly well.
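
Side note: if you want a sanity check that a machine has actually finished syncing before you start its render, something like the sketch below works. The Drive path and the batch-file naming are just guesses for illustration; adjust them to your own Blend_My_NFTs Output folder and BMNFTs version.

```python
# Sketch only: poll the synced output folder until this machine's batch data is
# present locally, then hand off to Blender/BMNFTs for rendering.
import time
from pathlib import Path

OUTPUT = Path("G:/My Drive/NFT_Project/Blend_My_NFTs Output")  # hypothetical Drive mount
MY_BATCH = 2                                                   # batch assigned to this machine

def batch_data_present() -> bool:
    # Look for any Batch .json belonging to this machine's batch under the output folder.
    return OUTPUT.exists() and any(OUTPUT.rglob(f"*Batch*{MY_BATCH}*.json"))

while not batch_data_present():
    print("Waiting for Blend_My_NFTs Output to sync from Google Drive...")
    time.sleep(30)

print(f"Batch {MY_BATCH} data is available locally; safe to start rendering.")
```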

@torrinworx
Owner

I'm also planning on doing a future tutorial on our YouTube channel about this process, but in the meantime let me know if you have any questions about the setup. @ethangale25 Thank you for bringing this topic to my attention, this is a really useful tool, especially when dealing with client work as a contractor.

@matt-159
Contributor

I've done batch rendering on multiple rented computers at once. I use vast.ai as my rental service, and my set of scripts will rent out computers using vast.ai's API; set up, install, and start rendering a single batch; and copy the finished content back to my main server.

This whole process is done headlessly. I prefer to have the .blend file hosted on either Google Drive or Dropbox. For my scripts, anything will work as long as it can be fetched with the wget command.
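
Stripped down, the fetch-and-render part of the setup looks roughly like the sketch below. The URL, file names, and the --python script are placeholders, and the vast.ai provisioning and copy-back steps are omitted.

```python
# Sketch only: download the .blend with wget, then launch Blender headless.
import subprocess

BLEND_URL = "https://example.com/project.blend"   # hypothetical direct-download link
BLEND_FILE = "project.blend"

# Any host works as long as wget can fetch a direct download from it.
subprocess.run(["wget", "-O", BLEND_FILE, BLEND_URL], check=True)

# --background and --python are standard Blender CLI flags; render_batch.py
# stands in for whatever script kicks off this machine's batch.
subprocess.run(["blender", "--background", BLEND_FILE, "--python", "render_batch.py"], check=True)
```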

Once I run batch refactoring, I'll upload everything to either an IPFS service or Google Drive, depending on whether it's a test batch or not.

Vast.ai is a great service as it's pretty affordable; the only drawback is that it's not beginner friendly, since you can only interact with your rented computers through SSH.

With the CLI compatibility addition, you should be able to run Blend My NFTs within Google Colab as long as your project doesn't require too many resources (IIRC the $10/month tier of Colab gets you about 24 GB of RAM, which wasn't sufficient for the project I was working on at the time).

@torrinworx if you'd like, we could hop on a call sometime and I could talk to you more about vast.ai

@ethangale25
Author

Thank you @torrinworx for your testing on this subject and the thoroughness of your response! I really look forward to your YouTube tutorial about this process. I will be going with Google Drive for my NFT project.

I have a few notes regarding rendering on multiple machines NOT utilizing a shared cloud storage drive on each machine (i.e. the way your team originally did it, as described in the comment above). These notes might not be useful here, but I'm writing them down for potential documentation purposes and for completeness. They may not be 100% accurate; they are just my findings from trying to get many computers working locally on the same project:

  1. All computers should be running the same version of Blender (e.g. 3.0 vs 3.1 seemed to be incompatible)

  2. All computers should be running the same version of BMNFTs Add-On

  3. The NFT Collection size, NFT Name, NFTs per batch, and Save Path should be the same

  4. If the Save Path (from above) cannot be the same due to the user name in the Windows file hierarchy (e.g. C:\users\Torrin\desktop\NFT project vs. C:\users\JohnDoe\desktop\NFT Project), then a workaround may be this:

    4a. Generate a "dummy" set of NFT data by clicking on the "Create NFT Data" button on the BMNFTs UI, in order for the .blend file to establish a new local file location for the project.

    4b. Replace the "dummy" set of NFT_Data files with the "real" set of NFT_Data files (a rough sketch of this swap follows the list)

    4c. Save the .blend file, close Blender, and reopen your .blend file. Begin rendering additional NFTs as usual. The .blend file should then reference the correct set of NFT_Data when determining how and what to render
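
A rough sketch of the 4b swap is below; the paths and folder layout are placeholders, and the dummy data is kept as a backup in case something goes wrong.

```python
# Sketch only: replace the locally generated "dummy" NFT_Data with the "real"
# NFT_Data copied over from the original machine.
import shutil
from pathlib import Path

PROJECT = Path.home() / "Desktop" / "NFT Project"   # this machine's project folder (hypothetical)
REAL_DATA = Path("D:/transfer/NFT_Data")            # the copied "real" NFT_Data (hypothetical)
dummy = PROJECT / "NFT_Data"                        # generated here by "Create NFT Data"

shutil.move(str(dummy), str(dummy) + "_dummy_backup")  # keep the dummy as a backup
shutil.copytree(REAL_DATA, dummy)                      # drop the real data into its place
print("Swapped in the real NFT_Data; save, close, and reopen the .blend, then render as usual.")
```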

@torrinworx, if you came across conflicting processes while creating your NFT collection, please let me know.

@torrinworx
Owner

@TheBrochacho Wow, that sounds awesome. We are currently looking into adding a render-farm-like service to our API, so that would be amazing. I sent a friend request to you on Discord; let me know if you have time next Tuesday for a meeting. My time zone is EDT btw.

@ethangale25 Np! I like those steps, nice work!

@torrinworx added the documentation label on May 31, 2022
@torrinworx
Owner

I will add this info to the documentation; for now this issue will remain open.
