Thanks for all the discussion! Here are the next steps I can see:
- create `python/requirements.txt` for the core direct dependencies, specifying version constraints as appropriate, to be used in the CI. Also create e.g. `python/requirements-tflite.txt` files (one per extra feature of TVM) in the same spirit.
- modify `python/setup.py` to use those `requirements.txt` files to derive `install_requires`.
- create `python/requirements-dev.txt`, which will not be read by `setup.py` but which should be used to install dev dependencies (i.e. lint, pytest, sphinx).
- modify the CI docker build scripts as follows:
  - update to pip 20.3 to use the new dependency resolver
  - delete all `pip install foo` commands and replace them with a single `pip install` command (see next bullet point)
- create `python/ci-constraints.txt`, which captures even more restrictive constraints for use with the CI that we would not ordinarily populate.
- modify the CI install scripts in `docker/install/*.sh`, deleting all `pip install` commands and replacing them with `pip install -e ./python -c python/ci-constraints-concatenated.txt` (and potentially `./python[tflite]` when extras are needed). `ci-constraints-concatenated.txt` is synthetically built by a shell script function that concatenates these files (we may need to de-dupe python packages, so this may become a python script):
  - `python/requirements.txt`
  - `python/requirements-$extra.txt`
  - `python/ci-constraints.txt`
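To make the `setup.py` change above concrete, here is one possible sketch. The helper names (`read_requirements`, `collect_requirements`) are purely illustrative, and it assumes the requirements files contain plain one-per-line requirement specifiers:

```python
# Hypothetical helpers for python/setup.py: derive install_requires and
# extras_require from the requirements files proposed above.
import os


def read_requirements(path):
    """Return non-comment, non-blank requirement lines from *path*."""
    with open(path) as f:
        return [
            line.strip()
            for line in f
            if line.strip() and not line.lstrip().startswith("#")
        ]


def collect_requirements(python_dir):
    """Build (install_requires, extras_require) for setup().

    Core deps come from requirements.txt; each requirements-$extra.txt
    becomes an extra of the same name, except requirements-dev.txt,
    which is deliberately not exposed via setup.py.
    """
    install_requires = read_requirements(os.path.join(python_dir, "requirements.txt"))
    extras_require = {}
    for name in sorted(os.listdir(python_dir)):
        if name.startswith("requirements-") and name.endswith(".txt"):
            extra = name[len("requirements-"):-len(".txt")]
            if extra == "dev":  # dev deps are installed separately, not an extra
                continue
            extras_require[extra] = read_requirements(os.path.join(python_dir, name))
    return install_requires, extras_require
```

`setup.py` would then pass the results straight through, e.g. `setup(..., install_requires=install_requires, extras_require=extras_require)`, so `pip install ./python[tflite]` picks up the matching extras file.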
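The concatenation step in the last bullet could look roughly like the following python sketch (the function names and the last-file-wins override rule are my assumptions, not a settled design); de-duping keys on the package name so `python/ci-constraints.txt`, listed last, gets the final say:

```python
# Hypothetical sketch of building ci-constraints-concatenated.txt:
# merge requirement lines from several files, de-duping by package name,
# with later files overriding earlier ones.
import re


def package_name(line):
    """Extract the distribution name from a line like 'numpy>=1.19'."""
    return re.split(r"[<>=!~\[; ]", line, maxsplit=1)[0].lower()


def concatenate_constraints(paths):
    """Merge requirement lines from *paths*, de-duping by package.

    Ordering the inputs as requirements.txt, requirements-$extra.txt,
    ci-constraints.txt means the CI constraints win any conflict.
    """
    merged = {}
    for path in paths:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                merged[package_name(line)] = line
    return list(merged.values())
```

The CI scripts would write the returned lines to `python/ci-constraints-concatenated.txt` before running the single `pip install -c ...` command.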
In a follow-on RFC, we'll consider the challenge of ensuring that this constraints file is uniform across all CI containers (and the question of whether that uniformity is even necessary). We'll also consider the challenge of keeping the requirements up to date, which a tool such as poetry may or may not make easier. Given the new pip dependency resolver, I think the motivation for adopting poetry or some additional tool should be weighed only after evaluating that resolver.