XLA JIT compilation fails with CUDA_ERROR_INVALID_IMAGE
Using model.compile(..., jit_compile=True) fails, even after setting
export XLA_FLAGS=--xla_gpu_cuda_data_dir=/opt/conda
beforehand, with the error message:
"Failed to load in-memory CUBIN: CUDA_ERROR_INVALID_IMAGE: device kernel image is invalid"
This is likely a version conflict: the base image ships CUDA 11.2, while the conda package cuda-nvcc is at 12.2, so the CUBIN emitted by the newer compiler is rejected by the older runtime. Switching to a newer base image (with a matching CUDA toolkit) may in turn require an NVIDIA driver update.
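As a sketch of the suspected failure mode, the check below compares CUDA major versions the way the runtime effectively does when loading a CUBIN: a kernel image built by a compiler from a different major release is not guaranteed to load. The helper names and the version strings are illustrative, taken from the numbers in this report (11.2 vs 12.2), not from any real API:

```python
def cuda_major(version: str) -> int:
    """Return the major component of a CUDA version string like '11.2'."""
    return int(version.split(".")[0])

def likely_cubin_mismatch(runtime_version: str, nvcc_version: str) -> bool:
    """Heuristic: a CUBIN built by nvcc from a different CUDA major release
    than the runtime is liable to fail with CUDA_ERROR_INVALID_IMAGE."""
    return cuda_major(runtime_version) != cuda_major(nvcc_version)

# Versions from this report: base image CUDA 11.2, conda cuda-nvcc 12.2.
print(likely_cubin_mismatch("11.2", "12.2"))  # True: major versions differ
```

If this heuristic holds, aligning the two (e.g. installing cuda-nvcc 11.x in the conda environment, or moving to a CUDA 12 base image) should resolve the error.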