
python - Avoiding dependency hell with Docker

Reposted · Author: 行者123 · Updated: 2023-12-02 20:42:12

I have built an AI application in Python that pulls in a large number of Python libraries. I now want to run the application in a Docker container so the AI app can be offered as a service.

What options do I have for handling the dependencies so that all required libraries are downloaded automatically?

As a weaker option, I tried doing this with a requirements.txt file at the same level as the Docker build file, but it didn't work.

Best Answer

You have several options. It depends largely on the use case, how many containers you will ultimately build, production versus development environments, and so on.
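For the simple case in the question, copying a requirements file into the image and installing from it at build time, a minimal Dockerfile along these lines is usually enough (a sketch; the base image tag, `main.py`, and the app layout are assumptions, and note the file is conventionally named requirements.txt, not requirement.txt):

```dockerfile
# Minimal sketch: pick a slim Python base image
FROM python:3.6-slim-buster

WORKDIR /app

# Copy only the dependency list first so Docker can cache this layer
COPY requirements.txt .

# Install all required libraries automatically at build time
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code last
COPY . .

# Hypothetical entry point for the app
CMD ["python", "main.py"]
```

Build with `docker build -t my-ai-app .` from the directory containing both the Dockerfile and requirements.txt. A common reason this "doesn't work" is a mismatched file name or running the build from a directory that is not the build context containing the file.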

Typically, if you have an AI application, you need the graphics card drivers pre-installed on the host system for model training. That means you will eventually have to come up with a way to install the drivers automatically, or write instructions for your end users. For the application itself, if the front end or the backing database lives outside the container, you may also need database drivers inside the Docker image. Below is a trimmed-down example from one of my use cases, where the requirement was to build a Docker image for a data pipeline.

# Taken from puckel/docker-airflow
# You can look up this image name to see which OS it is based on (Debian 10 "buster").
FROM python:3.6-slim-buster
LABEL maintainer="batman"

# Never prompt the user for choices on installation/configuration of packages
ENV DEBIAN_FRONTEND noninteractive
ENV TERM linux

# Set some default configuration for the data pipeline management tool Airflow
ARG AIRFLOW_VERSION=1.10.9
ARG AIRFLOW_USER_HOME=/usr/local/airflow
ARG AIRFLOW_DEPS=""
ENV AIRFLOW_HOME=${AIRFLOW_USER_HOME}

# Install the Linux dependencies required to run the pipeline.
# Use apt-get install, apt-get autoremove etc. to keep the image small.
# Download and install the SQL Server ODBC driver for Linux.
RUN set -ex \
&& buildDeps=' freetds-dev libkrb5-dev libsasl2-dev libssl-dev libffi-dev libpq-dev git' \
&& apt-get update -yqq \
&& apt-get upgrade -yqq \
&& apt-get install -yqq --no-install-recommends \
$buildDeps freetds-bin build-essential default-libmysqlclient-dev \
apt-utils curl rsync netcat locales gnupg wget \
&& useradd -ms /bin/bash -d ${AIRFLOW_USER_HOME} airflow \
&& curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - \
&& curl https://packages.microsoft.com/config/debian/10/prod.list > /etc/apt/sources.list.d/mssql-release.list \
&& apt-get update \
&& ACCEPT_EULA=Y apt-get install -y msodbcsql17 \
&& ACCEPT_EULA=Y apt-get install -y mssql-tools \
&& pip install apache-airflow[crypto,celery,postgres,hive,jdbc,mysql,ssh${AIRFLOW_DEPS:+,}${AIRFLOW_DEPS}]==${AIRFLOW_VERSION} \
&& apt-get purge --auto-remove -yqq $buildDeps \
&& apt-get autoremove -yqq --purge \
&& apt-get clean \
&& rm -rf \
/var/lib/apt/lists/* \
/tmp/* \
/var/tmp/* \
/usr/share/man \
/usr/share/doc \
/usr/share/doc-base

# Install all required Python packages from requirements.txt (I generally remove version numbers if the Python versions are the same)
ADD ./requirements.txt /config/
RUN pip install -r /config/requirements.txt


# CLEANUP
RUN apt-get autoremove -yqq --purge \
&& apt-get clean \
&& rm -rf \
/var/lib/apt/lists/* \
/tmp/* \
/var/tmp/* \
/usr/share/man \
/usr/share/doc \
/usr/share/doc-base


#CONFIGURATION
COPY script/entrypoint.sh /entrypoint.sh
COPY config/airflow.cfg ${AIRFLOW_USER_HOME}/airflow.cfg

# hand ownership of libraries to relevant user
RUN chown -R airflow: ${AIRFLOW_USER_HOME}

#expose ports to outside container for web app access
EXPOSE 8080 5555 8793

USER airflow
WORKDIR ${AIRFLOW_USER_HOME}
ENTRYPOINT ["/entrypoint.sh"]
CMD ["webserver"]
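The ENTRYPOINT/CMD pair at the end means the container runs `/entrypoint.sh webserver` by default, and any argument passed to `docker run` replaces `webserver`. The original entrypoint.sh is not shown; as a hypothetical sketch of the dispatch pattern such scripts commonly use (the real puckel/docker-airflow script also waits for its backing services; here `echo` stands in for `exec airflow "$@"` so the sketch runs anywhere):

```shell
#!/usr/bin/env bash
# Hypothetical entrypoint dispatcher sketch, not the original script.
launch() {
  case "$1" in
    webserver|scheduler|worker|flower)
      # Known subcommands would be handed to airflow;
      # echo stands in for: exec airflow "$@"
      echo "airflow $1"
      ;;
    *)
      # Anything else runs as-is, e.g. `docker run img bash`
      echo "$*"
      ;;
  esac
}

launch webserver   # -> airflow webserver
launch bash -c ls  # -> bash -c ls
```

This pattern lets one image serve as webserver, scheduler, or worker simply by changing the CMD, which is why the Dockerfile above exposes ports for all three roles.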

1) Choose an appropriate base image with the OS you need.
2) Install GPU drivers if you want to train models; they are not required if you are only serving a model.
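For point 2, GPU access is normally granted at run time rather than baked into the image, assuming the host already has the NVIDIA drivers and the NVIDIA Container Toolkit installed (an assumption; AMD/ROCm setups differ, and `my-ai-app` and `train.py` are hypothetical names):

```shell
# Give the container access to all host GPUs
# (--gpus requires the NVIDIA Container Toolkit on the host)
docker run --gpus all my-ai-app python train.py

# Or limit the container to a single GPU
docker run --gpus 1 my-ai-app python train.py
```

Keeping the driver on the host and only the CUDA user-space libraries in the image is what lets the same image run on machines with different driver versions.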

Regarding python - avoiding dependency hell with Docker, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/61476269/
