Environment Setup¶
Prerequisites¶
Before proceeding with the Allo installation, please follow the instructions on the MLIR-AIE website to install the required Vitis and XRT environment. Stop when you reach the “Install IRON for AMD Ryzen™ AI AIE Application” section as we need a separate process to install MLIR-AIE under the Allo environment.
Install from Source¶
Please follow the general instructions in Install from Source to install the LLVM-19 project and the Allo package. In the following, we assume you have already installed the LLVM-19 project, cloned the Allo repository, and created the allo conda environment.
Below are the exact commands to set up the environment:
Step 1¶
Activate the allo conda environment:
conda activate allo
Step 2¶
We depend on the MLIR-AIE project to compile the Allo IR to AIE. Install release 1.0:
# Install IRON library and mlir-aie from a wheel
python3 -m pip install mlir_aie -f https://github.com/Xilinx/mlir-aie/releases/expanded_assets/v1.0
# Install Peano from a llvm-aie wheel
python3 -m pip install https://github.com/Xilinx/llvm-aie/releases/download/nightly/llvm_aie-19.0.0.2025041501+b2a279c1-py3-none-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
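To confirm that both wheels landed in the active environment before moving on, you can run a quick sanity check. This is a sketch: the package names come from the install commands above, and `pip show` exits non-zero when a package is not installed.

```shell
# Sketch: check that the mlir_aie and llvm-aie wheels are installed
# in the currently active Python environment.
status=""
for pkg in mlir_aie llvm-aie; do
    if python3 -m pip show "$pkg" >/dev/null 2>&1; then
        status="$status installed:$pkg"
    else
        status="$status missing:$pkg"
    fi
done
echo "$status"
```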
Warning
The mlir_aie wheel requires manylinux_2_35, and some systems (e.g., those with glibc 2.34, confirmed by ldd --version) do not meet this requirement.
This results in an installation failure such as:
ERROR: mlir_aie-0.0.1.2025042204+24208c0-cp312-cp312-manylinux_2_35_x86_64.whl is not a supported wheel on this platform.
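You can check the host glibc version up front instead of discovering the mismatch at install time. A minimal sketch, assuming the last field of the first `ldd --version` line is the glibc version (true on common Linux distributions):

```shell
# Sketch: compare the host glibc version against the manylinux_2_35
# requirement of the mlir_aie wheel.
glibc_version=$(ldd --version | head -n1 | awk '{print $NF}')
major=${glibc_version%%.*}
minor=${glibc_version#*.}
if [ "$major" -gt 2 ] || { [ "$major" -eq 2 ] && [ "$minor" -ge 35 ]; }; then
    echo "glibc $glibc_version: the mlir_aie wheel should install"
else
    echo "glibc $glibc_version: too old for the manylinux_2_35 wheel"
fi
```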
Step 3¶
Enter the Allo repository and install.
Enter the scripts directory:
cd scripts
Set up MLIR-AIE by running the setup script:
source aie-setup.sh
Note
This will clone the mlir-aie repository and check out the commit corresponding to release 1.0.
By default, the repository is cloned under Allo’s root directory. To customize the installation directory, use the --clone-dir option:
source aie-setup.sh --clone-dir /customized/path/
After running the setup script, you may see the following message in the terminal:
>>> Please note: Each time you activate your environment, you need to export the following variables:
export PATH=/path/to/your/env/lib/python3.12/site-packages/mlir_aie/bin:$PATH
export MLIR_AIE_INSTALL_DIR=/path/to/your/env/lib/python3.12/site-packages/mlir_aie
export PEANO_INSTALL_DIR=/path/to/your/env/lib/python3.12/site-packages/llvm-aie
export MLIR_AIE_EXTERNAL_KERNEL_DIR=/path/to/mlir-aie/aie_kernels/
export RUNTIME_LIB_DIR=/path/to/mlir-aie/runtime_lib/
export PYTHONPATH=/path/to/your/env/lib/python3.12/site-packages/mlir_aie/python:$PYTHONPATH
You can copy the export commands listed here into your own script (e.g., /path/to/your/env/etc/conda/activate.d/setup.sh), so that these environment variables are automatically set whenever you activate your environment.
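For example, such a hook can be created as follows. This is a sketch: the export lines are the placeholders printed by the setup script and must be replaced with your actual paths, and the temp-dir fallback exists only so the snippet runs outside an active conda environment.

```shell
# Sketch: write the exports into a conda activation hook so they are applied
# on every `conda activate allo`. Replace the /path/to/... placeholders with
# the paths printed by aie-setup.sh for your machine.
hook_dir="${CONDA_PREFIX:-$(mktemp -d)}/etc/conda/activate.d"  # temp-dir fallback for illustration only
mkdir -p "$hook_dir"
cat > "$hook_dir/setup.sh" <<'EOF'
export PATH=/path/to/your/env/lib/python3.12/site-packages/mlir_aie/bin:$PATH
export MLIR_AIE_INSTALL_DIR=/path/to/your/env/lib/python3.12/site-packages/mlir_aie
export PEANO_INSTALL_DIR=/path/to/your/env/lib/python3.12/site-packages/llvm-aie
export MLIR_AIE_EXTERNAL_KERNEL_DIR=/path/to/mlir-aie/aie_kernels/
export RUNTIME_LIB_DIR=/path/to/mlir-aie/runtime_lib/
export PYTHONPATH=/path/to/your/env/lib/python3.12/site-packages/mlir_aie/python:$PYTHONPATH
EOF
echo "wrote $hook_dir/setup.sh"
```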
To build and install Allo, you may want to set up environment variables first to use a custom CMake and LLVM build. For example:
export PATH=/opt/cmake-3.31.5-linux-x86_64/bin:/opt/llvm-project-19.x/build/bin:$PATH
export LLVM_BUILD_DIR=/opt/llvm-project-19.x/build
Next, enter Allo’s root directory and install by running the following command:
python3 -m pip install -v -e .
Note
See Internal Installation (Cornell) for Zhang Group students.
Step 4¶
Set up Vitis and XRT.
Note
See Internal Installation (Cornell) for Zhang Group students.
Lastly, you can verify the AIE backend by running the following command under Allo’s root directory:
python3 tests/dataflow/aie/test_vector.py
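If the test fails, first check that the Step 3 variables are still exported in the current shell. A small pre-flight sketch, using the variable names from the setup script output above:

```shell
# Sketch: verify that the environment variables required by the AIE backend
# are exported before running the tests.
missing=""
for var in MLIR_AIE_INSTALL_DIR PEANO_INSTALL_DIR MLIR_AIE_EXTERNAL_KERNEL_DIR RUNTIME_LIB_DIR; do
    if ! printenv "$var" >/dev/null; then
        missing="$missing $var"
    fi
done
if [ -n "$missing" ]; then
    echo "missing:$missing"
else
    echo "all AIE environment variables are set"
fi
```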
Internal Installation (Cornell)¶
For Zhang Group students, please set up environment variables in Step 3 with the following commands.
export PATH=/opt/cmake-3.31.5-linux-x86_64/bin:/opt/llvm-project-19.x/build/bin:$PATH
export LLVM_BUILD_DIR=/opt/llvm-project-19.x/build
Then set up Vitis and XRT in Step 4 by running the following commands.
source /opt/common/setupVitis.sh
source /opt/common/setupXRT.sh
Lastly, to verify the installation, you can run the following command:
python3 tests/dataflow/aie/test_vector.py
If the unit tests pass, the installation is successful. Otherwise, please contact us for help.