

Installation issues of xformers on MI250 with rocm/pytorch:rocm6.0.2_ubuntu22.04_py3.10_pytorch_2.1.2 #8

Open
zhaohm14 opened this issue Apr 26, 2024 · 6 comments


@zhaohm14

I am installing xformers with the following steps:

git clone https://github.com/ROCm/xformers
cd xformers
git checkout develop
git submodule update --init --recursive
python setup.py install

After installation, when I use xformers, this message appears:

WARNING[XFORMERS]: Need to compile C++ extensions to use all xFormers features.
    Please install xformers properly (see https://github.com/facebookresearch/xformers#installing-xformers)
  Memory-efficient attention, SwiGLU, sparse and more won't be available.
  Set XFORMERS_MORE_DETAILS=1 for more details

This is python -m xformers.info:

WARNING[XFORMERS]: Need to compile C++ extensions to use all xFormers features.
    Please install xformers properly (see https://github.com/facebookresearch/xformers#installing-xformers)
  Memory-efficient attention, SwiGLU, sparse and more won't be available.
  Set XFORMERS_MORE_DETAILS=1 for more details
Unable to find python bindings at /usr/local/dcgm/bindings/python3. No data will be captured.
xFormers 0.0.0
memory_efficient_attention.ckF:                    unavailable
memory_efficient_attention.ckB:                    unavailable
memory_efficient_attention.ck_decoderF:            unavailable
memory_efficient_attention.ck_splitKF:             unavailable
memory_efficient_attention.cutlassF:               unavailable
memory_efficient_attention.cutlassB:               unavailable
memory_efficient_attention.decoderF:               unavailable
[email protected]:         unavailable
[email protected]:         unavailable
memory_efficient_attention.smallkF:                unavailable
memory_efficient_attention.smallkB:                unavailable
memory_efficient_attention.triton_splitKF:         available
indexing.scaled_index_addF:                        available
indexing.scaled_index_addB:                        available
indexing.index_select:                             available
sequence_parallel_fused.write_values:              unavailable
sequence_parallel_fused.wait_values:               unavailable
sequence_parallel_fused.cuda_memset_32b_async:     unavailable
sp24.sparse24_sparsify_both_ways:                  unavailable
sp24.sparse24_apply:                               unavailable
sp24.sparse24_apply_dense_output:                  unavailable
sp24._sparse24_gemm:                               unavailable
[email protected]:                        available
swiglu.dual_gemm_silu:                             unavailable
swiglu.gemm_fused_operand_sum:                     unavailable
swiglu.fused.p.cpp:                                not built
is_triton_available:                               True
pytorch.version:                                   2.1.2+git98a6632
pytorch.cuda:                                      available
gpu.compute_capability:                            9.0
gpu.name:                                          AMD Instinct MI250X/MI250
dcgm_profiler:                                     unavailable
build.info:                                        none
source.privacy:                                    open source

Is there anything I'm doing wrong? Thanks!

@qianfengz
Collaborator

qianfengz commented May 5, 2024

Your setup looks correct. But make sure you are using a working PyTorch build, since xFormers is a PyTorch application. You can check whether your PyTorch works on your MI250 with:

>>> import torch
>>> query = torch.ones((16, 256), device=torch.device("cuda"))

@xinlong-yang

@zhaohm14 Hi, I would like to ask how long it takes to install xformers?

@zhaohm14
Author

@xinlong-yang It depends on the CPU. As I recall, installation on a 7763 took about one to two minutes.

@xinlong-yang

@xinlong-yang It depends on the CPU. As I recall, installation on a 7763 took about one to two minutes.

Thanks for the reply. I also installed on an MI250, but at first I forgot to use ninja, so the build ran for several hours; with ninja it takes a little over 10 minutes.

@tenpercent
Collaborator

@zhaohm14 I noticed the installation breaks with `python setup.py` subcommands; some necessary build logic happens in the pip wrapper around it. The method suggested in the README (`pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers`) should work.

Note that you can point the URL at a different repository and branch if you would like to use the ROCm fork, but the upstream version should be treated as more stable.
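For example, a sketch of that pip invocation adapted to the ROCm fork (the `develop` branch is assumed here because it is the one checked out in the original report; adjust it if your fork uses a different branch):

```shell
# Install xFormers from the ROCm fork's develop branch via pip,
# which runs the full build wrapper that a bare `python setup.py` skips.
pip install -v -U "git+https://github.com/ROCm/xformers.git@develop#egg=xformers"

# Verify that the C++/HIP extensions were actually built
python -m xformers.info
```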

Please let me know if that solves the build issue.

@qianfengz
Collaborator

@xinlong-yang It depends on the CPU. As I recall, installation on a 7763 took about one to two minutes.

Thanks for the reply. I also installed on an MI250, but at first I forgot to use ninja, so the build ran for several hours; with ninja it takes a little over 10 minutes.

Yes, run `unset MAX_JOBS` to use all your CPUs for compiling.
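Putting the two build tips from this thread together, a minimal sketch (ninja is picked up automatically by PyTorch's C++ extension build when installed, and `MAX_JOBS` is the environment variable PyTorch honors to cap build parallelism):

```shell
# Install ninja so the C++/HIP extension build is parallel and incremental
pip install ninja

# Leave MAX_JOBS unset (or clear a previous setting) so the build
# can use all CPU cores
unset MAX_JOBS

# Optionally, cap parallelism instead, e.g. on memory-constrained machines:
# MAX_JOBS=8 pip install -v -U "git+https://github.com/ROCm/xformers.git@develop#egg=xformers"
```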
