
Write blogpost showing how to interface with an MPI application #1220

Closed
mrocklin opened this issue Jun 29, 2017 · 4 comments

Comments

@mrocklin (Member)

We've often been asked how to pass data back and forth between Dask and MPI applications. At least to start, this is likely to be handled on an application-by-application basis. It would be nice to provide an example of doing this that others can copy.

As an example application, it might be nice to use Elemental, a well-respected distributed array library.

There are probably a few ways to do this. In all cases you'll probably need to start the Dask workers within an MPI comm world, probably using mpi4py. You might then do a little pre-processing with Dask, persist a distributed dask array, and then run a function on each of the workers that takes the local numpy chunks in their .data attribute and either starts some other MPI work or MPI-sends that data to the appropriate MPI rank running some other code. You'll then either need to wait for that function to return a bunch of numpy arrays or else MPI-receive them from other ranks. Then you can arrange those blocks back into a dask.array (perhaps by putting them into a Queue and letting some other client handle the coordination). A rough sketch of this flow follows below.
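A minimal sketch of that flow, assuming the workers were launched inside the same MPI comm world as the peer application. The half-Dask/half-peer rank layout, the message tags, and the use of `map_blocks` in place of manually pulling chunks out of each worker's `.data` attribute are all illustrative assumptions, not a tested recipe:

```python
# Sketch only: assumes every Dask worker was started under mpirun inside
# the same MPI comm world as a peer MPI application, that ranks are split
# half Dask / half peer (a hypothetical layout), and that the peer sends
# back a processed chunk of the same shape on tag 1.
import numpy as np
import dask.array as da
from dask.distributed import Client
from mpi4py import MPI

def exchange_chunk(block):
    """Ship this local numpy chunk to a peer MPI rank, receive a result."""
    comm = MPI.COMM_WORLD
    peer = comm.Get_rank() + comm.Get_size() // 2  # hypothetical peer rank
    comm.Send(np.ascontiguousarray(block), dest=peer, tag=0)
    result = np.empty_like(block)
    comm.Recv(result, source=peer, tag=1)
    return result

client = Client()  # address of the scheduler for the MPI-launched workers
x = da.random.random((10000, 10000), chunks=(1000, 1000)).persist()
y = x.map_blocks(exchange_chunk)  # one send/recv per chunk, run on workers
```

In practice you would probably also want to pin tasks to particular workers (for example with the `workers=` argument to `client.submit`) so that each chunk is exchanged from a known MPI rank.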

@mrocklin (Member, Author)

Relevant issue around getting numpy arrays from Elemental data: elemental/Elemental#223

@mrocklin (Member, Author)

mrocklin commented Jul 5, 2017

I think that @jcrist is interested in this, but is busy this week and next.

@GenevieveBuckley (Contributor)

@dask/maintenance given the topic, I think this issue might be best transferred to https://github.com/dask/dask-blog/issues

Not sure if it'll get a little more love over there, but it's worth a shot.

@jsignell (Member)

Given the existence of dask-mpi, I think we can probably just go ahead and close this.
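
For reference, a minimal sketch of what dask-mpi provides, assuming the `dask_mpi` package is installed and the script is launched under MPI:

```python
# Sketch of a dask-mpi launch: run under MPI, e.g.
#   mpirun -np 4 python script.py
# dask_mpi.initialize() turns rank 0 into the scheduler, runs this client
# code on rank 1, and turns the remaining ranks into workers.
from dask_mpi import initialize
from dask.distributed import Client
import dask.array as da

initialize()       # carve the MPI comm world into scheduler + workers
client = Client()  # connects to the scheduler started by initialize()

x = da.random.random((10000, 10000), chunks=(1000, 1000))
print(x.mean().compute())
```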
