[Issue]: Omnitrace not instrumenting MPI calls (Fortran) #426
Comments
Hi @xaguilar. An internal ticket has been created to investigate your issue. Thanks!
Hi @xaguilar, can you provide a sample of the Fortran code you are trying to analyze?
Hi, the code I was originally trying to instrument is quite a big and complex CFD code (Neko), but I've managed to create a very simple example that fails as well. Omnitrace is unable to instrument the MPI calls in this simple ping-pong example:
My guess is that the problem comes from the code using the Fortran 2008 bindings for MPI, which translate into symbols within the binary such as:
and perhaps omnitrace does not recognise them, so it never triggers the MPI instrumentation. Just my guess :) Please don't hesitate to ask if you need anything else.
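For reference, a minimal ping-pong along the lines described above might look like the following. This is a hypothetical sketch, not the reporter's actual reproducer; it assumes the Fortran 2008 bindings (`use mpi_f08`), which is exactly the case the comment suspects omnitrace mishandles:

```fortran
! Hypothetical two-rank ping-pong using the Fortran 2008 MPI bindings.
! With "use mpi_f08", MPI calls resolve to module-specific symbols in the
! binary rather than the classic mpi_send_ / mpi_recv_ names.
program pingpong
   use mpi_f08
   implicit none
   integer :: rank, i, buf
   type(MPI_Status) :: status

   call MPI_Init()
   call MPI_Comm_rank(MPI_COMM_WORLD, rank)

   buf = 0
   do i = 1, 10
      if (rank == 0) then
         call MPI_Send(buf, 1, MPI_INTEGER, 1, 0, MPI_COMM_WORLD)
         call MPI_Recv(buf, 1, MPI_INTEGER, 1, 0, MPI_COMM_WORLD, status)
      else if (rank == 1) then
         call MPI_Recv(buf, 1, MPI_INTEGER, 0, 0, MPI_COMM_WORLD, status)
         buf = buf + 1
         call MPI_Send(buf, 1, MPI_INTEGER, 0, 0, MPI_COMM_WORLD)
      end if
   end do

   call MPI_Finalize()
end program pingpong
```

With the `mpi_f08` module the `ierror` argument is optional, which is why it is omitted here; switching the `use mpi_f08` line to `use mpi` would restore the legacy symbol names.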
Problem Description
Hi,
I have a Fortran code that I'm trying to analyse. I instrument the code with:
omnitrace-instrument --mpi -o program.inst -- program.x
but omnitrace does not instrument the MPI calls. The binary does contain MPI symbols, as a quick check with nm shows:
I compiled omnitrace from scratch with MPI support. If, for example, I instrument one of the OSU MPI benchmarks, it works and intercepts the MPI calls, but not with this Fortran code. Any ideas why this could be happening?
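One way to narrow this down is to compare the MPI symbol names the two binaries export. This is an illustrative sketch (the binary names and the exact mangled symbol forms are assumptions; they vary by MPI implementation and compiler), not the output that was attached to the report:

```shell
# Hypothetical check: list MPI send/recv symbols in the working C benchmark
# versus the failing Fortran binary.
nm osu_latency  | grep -i 'mpi_send'   # classic C symbols: MPI_Send / PMPI_Send
nm program.x    | grep -i 'mpi_send'   # legacy Fortran bindings: mpi_send_ / mpi_send__
                                       # Fortran 2008 bindings: module-mangled *_f08
                                       # variants (exact form depends on the MPI library)
```

If the Fortran binary only exposes the `mpi_f08`-style names, an instrumenter whose MPI function list matches only the C and legacy Fortran spellings would find nothing to wrap, which is consistent with the behaviour described above.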
Thanks a lot in advance.
Cheers,
Xavier
Operating System
SLES 15-SP5
CPU
AMD EPYC 7A53 64-Core Processor
GPU
AMD Instinct MI250X
ROCm Version
ROCm 6.0.0
ROCm Component
No response
Steps to Reproduce
No response
(Optional for Linux users) Output of /opt/rocm/bin/rocminfo --support
No response
Additional Information
No response