How to resolve the below queries: 1. When an ML Package is created in AI Fabric using a zip file containing Python files, where do the required dependencies get stored? 2. From where are those dependencies downloaded? 3. If developers have many modules and libraries created in Anaconda that are used in every AI model, is there a way to integrate AI Fabric with Anaconda, PyPI, or Conda-forge?
1. A Docker container is created in the backend, where all the dependencies listed in the requirements.txt file are installed.
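Conceptually, that build step behaves like the Dockerfile sketch below. This is only an illustration of the idea, not AI Fabric's actual internal build recipe; the base image and paths are assumptions:

```dockerfile
# Hypothetical illustration only -- not AI Fabric's actual build
FROM python:3.8-slim
# Copy the contents of the uploaded zip (Python files + requirements.txt)
COPY . /model
WORKDIR /model
# Install everything listed in requirements.txt into the container
RUN pip install -r requirements.txt
```

The resulting container image then holds both the model code and its installed dependencies.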
2. These dependencies are usually downloaded from the PyPI registry, the master registry for open-source Python packages. The ML model then uses that container for execution, referencing the installed dependencies.
3. If Conda is used, the dependency list can be fetched with conda list; those dependencies can then be added to the requirements.txt file, and they will all be installed in the Docker container for the model to run.
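Note that the raw output of conda list is not directly pip-compatible. A small helper like the hypothetical sketch below (assuming the conda list -e export format of name=version=build lines) can convert it into requirements.txt entries:

```python
def conda_export_to_requirements(conda_lines):
    """Convert 'conda list -e' lines (name=version=build) into pip-style
    'name==version' entries; comment and blank lines are skipped."""
    reqs = []
    for line in conda_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        parts = line.split("=")
        if len(parts) >= 2:
            reqs.append(parts[0] + "==" + parts[1])
    return reqs

# Example with made-up export lines:
print(conda_export_to_requirements([
    "# platform: linux-64",
    "numpy=1.21.2=py39h0",
    "pandas=1.3.3=py39h0",
]))
# -> ['numpy==1.21.2', 'pandas==1.3.3']
```

The resulting lines can be written into requirements.txt so the Docker build installs the same versions from PyPI.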
Custom packages need to be placed in the zip folder and referenced in main.py or the other scripts.
Note: Any package that pip can install can be listed in the requirements.txt file; pip can install Python packages from GitHub as well.
pip install <URL>
The requirements file can also contain URLs where Python packages are hosted; as long as those URLs are reachable, the packages will be downloaded.
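For illustration, a requirements.txt mixing plain PyPI names, a GitHub reference, and a direct URL might look like this (all package names, versions, repositories, and URLs below are made-up placeholders):

```text
# Regular PyPI packages
pandas==1.3.3
scikit-learn>=0.24
# Package installed directly from a GitHub repository (placeholder repo)
git+https://github.com/example-org/example-package.git@v1.0.0
# Package hosted at a reachable URL (placeholder URL)
https://example.com/packages/example_package-1.0.0-py3-none-any.whl
```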