Q&A Community
vLLM is compiled with CUDA 12.1 by default. You can use the following commands to install it with a different CUDA version:
```bash
# Install vLLM with CUDA 11.8.
export VLLM_VERSION=0.2.4
export PYTHON_VERSION=39
pip install https://github.com/vllm-project/vllm/releases/download/v${VLLM_VERSION}/vllm-${VLLM_VERSION}+cu118-cp${PYTHON_VERSION}-cp${PYTHON_VERSION}-manylinux1_x86_64.whl
# Re-install PyTorch with CUDA 11.8.
pip uninstall torch -y
pip install torch --upgrade --index-url https://download.pytorch.org/whl/cu118
# Re-install xFormers with CUDA 11.8.
pip uninstall xformers -y
pip install --upgrade xformers --index-url https://download.pytorch.org/whl/cu118
```
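The commands above assemble the wheel URL from three pieces: the vLLM version, the CPython tag, and the CUDA tag (`cu118`). As a sketch, the same filename can be built in Python, using the same values as above, to see exactly which parts you would change for another CUDA or Python version:

```python
# Sketch: assemble the vLLM wheel URL from the version variables used above.
VLLM_VERSION = "0.2.4"
PYTHON_VERSION = "39"   # CPython tag, e.g. "39" for Python 3.9
CUDA_TAG = "cu118"      # must match a wheel actually published on the releases page

url = (
    "https://github.com/vllm-project/vllm/releases/download/"
    f"v{VLLM_VERSION}/vllm-{VLLM_VERSION}+{CUDA_TAG}"
    f"-cp{PYTHON_VERSION}-cp{PYTHON_VERSION}-manylinux1_x86_64.whl"
)
print(url)
```

Note that `pip install` will fail with a 404 if no wheel was published for that exact combination of tags, so check the GitHub releases page first.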
Based on the information above, you want to compile vLLM and xFormers against a different CUDA version. Here is an example showing the same procedure with CUDA 11.4:
```bash
# Install vLLM with CUDA 11.4.
export VLLM_VERSION=0.2.4
export PYTHON_VERSION=39
pip install https://github.com/vllm-project/vllm/releases/download/v${VLLM_VERSION}/vllm-${VLLM_VERSION}+cu114-cp${PYTHON_VERSION}-cp${PYTHON_VERSION}-manylinux1_x86_64.whl
# Re-install PyTorch with CUDA 11.4.
pip uninstall torch -y
pip install torch --upgrade --index-url https://download.pytorch.org/whl/cu114
# Re-install xFormers with CUDA 11.4.
pip uninstall xformers -y
pip install --upgrade xformers --index-url https://download.pytorch.org/whl/cu114
```
Note that you need to replace the `VLLM_VERSION` and `PYTHON_VERSION` variables with the versions you require, and make sure CUDA 11.4 is actually installed. You can check your CUDA version by running `nvcc --version` in a terminal; if it is not 11.4, install the required CUDA toolkit first. Also be aware that prebuilt wheels are only published for a limited set of CUDA versions (for vLLM 0.2.4, the official wheels target CUDA 12.1 and 11.8), so if no `cu114` wheel or package index exists for a given release, you will need to build vLLM from source against your local CUDA toolkit instead.
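To go from the `nvcc --version` output to the `cuXYZ` tag used in the wheel filename and the PyTorch index URL, you can parse the release number. A minimal sketch (the sample output string below is a hypothetical example of what `nvcc` prints; in practice you would capture the real output with `subprocess`):

```python
import re

# Hypothetical sample of `nvcc --version` output for CUDA 11.8.
sample_output = """nvcc: NVIDIA (R) Cuda compiler driver
Cuda compilation tools, release 11.8, V11.8.89"""

# Extract "11" and "8" from "release 11.8" and join them into the pip tag.
match = re.search(r"release (\d+)\.(\d+)", sample_output)
major, minor = match.groups()
cuda_tag = f"cu{major}{minor}"
print(cuda_tag)  # cu118
```

The resulting tag is what you substitute into the wheel URL and into `--index-url https://download.pytorch.org/whl/<tag>`, provided packages for that tag exist.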