Install K2-FSA/IceFall/Lhotse
K2-FSA Installation
It is recommended to install K2-FSA in its own virtual environment. We use the recommended pre-compiled wheels.
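For example, a dedicated environment can be created like this (the environment path and name below are just placeholders):

```bash
# Create and activate a dedicated virtual environment for K2-FSA
python3 -m venv ~/venvs/k2-fsa
source ~/venvs/k2-fsa/bin/activate
pip install --upgrade pip
```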
- K2-CPU Installation (Torch version 2.0.1)
```bash
pip install torch==2.0.1+cpu -f https://download.pytorch.org/whl/torch_stable.html
pip install k2==1.24.3.dev20230718+cpu.torch2.0.1 -f https://k2-fsa.github.io/k2/cpu.html
```
- K2-GPU Installation (CUDA version 11.7)
```bash
pip install torch==2.0.1+cu117 -f https://download.pytorch.org/whl/torch_stable.html
pip install k2==1.24.3.dev20230718+cuda11.7.torch2.0.1 -f https://k2-fsa.github.io/k2/cuda.html
```
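To confirm the wheel installed correctly, a quick sanity check (not part of the official instructions) is:

```bash
# Check that k2 imports and reports the torch build it was compiled against
python3 -m k2.version
python3 -c "import torch, k2; print(torch.__version__, k2.__version__)"
```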
IceFall Installation
Installation source: IceFall Documentation
```bash
git clone https://github.com/k2-fsa/icefall
cd icefall
pip install -r ./requirements.txt
export PYTHONPATH=$PWD:$PYTHONPATH
```
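Since IceFall is used directly from the cloned repository via PYTHONPATH (there is no pip package to install), a simple import check can confirm the path is set up:

```bash
# Confirm that the icefall package resolves from the exported PYTHONPATH
python3 -c "import icefall; print(icefall.__file__)"
```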
Lhotse Installation
Installation source: Lhotse Documentation
```bash
git clone https://github.com/lhotse-speech/lhotse.git
cd lhotse
pip install -e '.[dev]'
pre-commit install

# Optional: extra dependencies for working with Kaldi data
pip install 'lhotse[kaldi]'
```
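As a quick sanity check (assuming the editable install above succeeded), Lhotse can be verified from Python and via its command-line interface:

```bash
# Verify the lhotse installation and its CLI entry point
python3 -c "import lhotse; print(lhotse.__version__)"
lhotse --help
```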
Installing CUDA (11.7) and cuDNN (8.9)
Installation source: IceFall Documentation
- Installing CUDA
```bash
## Choose installation dir
INSTALL_PATH="/home/sagar/bin/cuda/11.7.1/"
mkdir -p ${INSTALL_PATH}

## Download cuda
wget https://developer.download.nvidia.com/compute/cuda/11.7.1/local_installers/cuda_11.7.1_515.65.01_linux.run
chmod +x cuda_11.7.1_515.65.01_linux.run

./cuda_11.7.1_515.65.01_linux.run \
  --silent \
  --toolkit \
  --installpath=${INSTALL_PATH} \
  --no-opengl-libs \
  --no-drm \
  --no-man-page
```
- Installing cuDNN
Check other available versions here: cuDNN Archive by NVIDIA
```bash
wget https://huggingface.co/csukuangfj/cudnn/resolve/main/cudnn-linux-x86_64-8.9.1.23_cuda11-archive.tar.xz
tar xvf cudnn-linux-x86_64-8.9.1.23_cuda11-archive.tar.xz --strip-components=1 -C ${INSTALL_PATH}
```
- Set environment variables
```bash
export CUDA_HOME=${INSTALL_PATH}
export PATH=$CUDA_HOME/bin:$PATH
export LD_LIBRARY_PATH=$CUDA_HOME/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=$CUDA_HOME/lib:$LD_LIBRARY_PATH
export CUDA_TOOLKIT_ROOT_DIR=$CUDA_HOME
export CUDA_TOOLKIT_ROOT=$CUDA_HOME
export CUDA_BIN_PATH=$CUDA_HOME
export CUDA_PATH=$CUDA_HOME
export CUDA_INC_PATH=$CUDA_HOME/targets/x86_64-linux
export CFLAGS=-I$CUDA_HOME/targets/x86_64-linux/include:$CFLAGS
```
- Verify installation
```bash
which nvcc
nvcc --version
```
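Beyond nvcc, it can also help to confirm that the GPU build of PyTorch picks up this CUDA/cuDNN installation; a minimal check (run inside the K2-GPU environment) is:

```bash
# Confirm PyTorch sees the GPU and reports the CUDA/cuDNN versions it links against
python3 -c "import torch; print(torch.cuda.is_available(), torch.version.cuda, torch.backends.cudnn.version())"
```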