python - No logging on Azure DevOps Pipeline


Update:

Is it possible to add or change the command with which the pipeline is executed on Azure DevOps?

---

Running my program locally in Visual Studio Code, I do get output.

However, running my GitHub branch on Azure DevOps produces no output at all.

I followed this Stack Overflow answer, whose solution references this GitHub Issue.

I implemented the following, but Azure's raw logs come back blank where my Python logging output should be.

test_logging.py:

import logging

filename = "my.log"

logger = logging.getLogger()
logger.setLevel(logging.INFO)
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")

open(filename, "w").close()  # start with an empty log file
fileHandler = logging.FileHandler(filename)
fileHandler.setFormatter(formatter)
fileHandler.setLevel(logging.INFO)
logger.addHandler(fileHandler)

logger.error('TEST')

# fetch logs (read the file under a name that does not shadow the handler)
with open(filename, "r") as logFile:
    logs = [log.rstrip() for log in logFile.readlines()]
open(filename, "w").close()  # empty the log file again
print('logs = ', logs)
>>> logs = []
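
A minimal sketch, assuming the script runs as an ordinary pipeline step (this is an illustrative assumption, not part of the original question): a step's raw log only captures what is written to stdout/stderr, so a file-only handler never surfaces there, while a StreamHandler on the same logger makes the records visible:

import logging
import sys

logger = logging.getLogger()
logger.setLevel(logging.INFO)
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")

# Write records to stdout so the pipeline agent can capture them.
streamHandler = logging.StreamHandler(sys.stdout)
streamHandler.setFormatter(formatter)
streamHandler.setLevel(logging.INFO)
logger.addHandler(streamHandler)

logger.error('TEST')  # shows up in the step's raw log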

host.json:

{
  "version": "2.0",
  "logging": {
    "fileLoggingMode": "always",
    "logLevel": {
      "default": "Debug"
    }
  }
}
---

I then tried the alternative host.json from this post:

"logging": {
"fileLoggingMode": "debugOnly",
"logLevel": {
"default": "None",
"Host.Results": "Information",
"Function": "Information",
"Host.Aggregator": "Information"
},
"applicationInsights": {
"samplingSettings": {
"isEnabled": false,
"maxTelemetryItemsPerSecond": 5
}
}
}

azure-pipeline-ontology_tagger.yaml:

# ##########
# A build run against multiple Python targets
# ##########

resources:
- repo: self

variables:
  tag: '$(Build.SourceBranchName)-$(Build.BuildNumber)'
  imageName: '$(Build.Repository.Name)-ontology_tagger'
  artifactFeed: grandproject/private-sources
  repositoryUrl: private-sources
  packageDirectory: workers/ontology_tagger

trigger:
  batch: true
  branches:
    include:
    - master
    - development
    - releases/*
  paths:
    include:
    - "workers/ontology_tagger"
    exclude:
    - "workers"
    - "*.md"
pr:
  branches:
    include:
    - master
    - development
    - releases/*
  paths:
    include:
    - "workers/ontology_tagger"
    exclude:
    - "workers"
    - "*.md"

stages:
- stage: BuildWP
  displayName: Build Workers python package
  jobs:

  - job: Build
    displayName: Build Worker python image

    pool:
      name: EKS-grandproject-dev

    steps:
    - bash: env

    - task: PipAuthenticate@0
      displayName: Authenticate with artifact feed
      inputs:
        artifactFeeds: $(artifactFeed)

    - task: TwineAuthenticate@1
      displayName: Authenticate with artifact feed
      inputs:
        artifactFeed: $(artifactFeed)

    - bash: echo "##vso[task.setvariable variable=POETRY_HTTP_BASIC_AZURE_PASSWORD;isOutput=true]$(echo $PIP_EXTRA_INDEX_URL | sed -r 's|https://(.+):(.+)@.*|\2|')"
      name: "PIPAUTH"

    - task: Bash@3
      displayName: Test worker
      inputs:
        targetType: 'inline'
        workingDirectory: '$(packageDirectory)'
        script: |
          docker build . --progress plain --pull --target test \
            --build-arg POETRY_HTTP_BASIC_AZURE_PASSWORD=${PIPAUTH_POETRY_HTTP_BASIC_AZURE_PASSWORD} \
            --build-arg ATLASSIAN_TOKEN=$(ATLASSIAN_TOKEN)

    - task: Bash@3
      displayName: Build and publish package
      inputs:
        targetType: 'inline'
        workingDirectory: '$(packageDirectory)'
        script: |
          set -e
          cp $(PYPIRC_PATH) ./
          docker build . --target package --progress plain --build-arg REPO=$(repositoryUrl)

    - task: Bash@3
      displayName: Build docker image
      inputs:
        targetType: 'inline'
        workingDirectory: '$(packageDirectory)'
        script: |
          docker build . --tag '$(imageName):$(tag)' --progress plain --pull --target production \
            --build-arg POETRY_HTTP_BASIC_AZURE_PASSWORD=${PIPAUTH_POETRY_HTTP_BASIC_AZURE_PASSWORD} \
            --label com.azure.dev.image.build.sourceversion=$(Build.SourceVersion) \
            --label com.azure.dev.image.build.sourcebranchname=$(Build.SourceBranchName) \
            --label com.azure.dev.image.build.buildnumber=$(Build.BuildNumber)

    - task: ECRPushImage@1
      displayName: Push image with 'latest' tag
      condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'master'))
      inputs:
        awsCredentials: 'dev-azure-devops'
        regionName: 'eu-central-1'
        imageSource: 'imagename'
        sourceImageName: $(imageName)
        sourceImageTag: $(tag)
        repositoryName: $(imageName)
        pushTag: 'latest'
        autoCreateRepository: true

    - task: ECRPushImage@1
      displayName: Push image with branch name tag
      condition: and(succeeded(), ne(variables['Build.SourceBranchName'], 'merge'))
      inputs:
        awsCredentials: 'iotahoe-dev-azure-devops'
        regionName: 'eu-central-1'
        imageSource: 'imagename'
        sourceImageName: $(imageName)
        sourceImageTag: $(tag)
        repositoryName: $(imageName)
        pushTag: '$(Build.SourceBranchName)'
        autoCreateRepository: true

    - task: ECRPushImage@1
      displayName: Push image with uniq tag
      condition: and(succeeded(), ne(variables['Build.SourceBranchName'], 'merge'))
      inputs:
        awsCredentials: 'dev-azure-devops'
        regionName: 'eu-central-1'
        imageSource: 'imagename'
        sourceImageName: $(imageName)
        sourceImageTag: $(tag)
        repositoryName: $(imageName)
        pushTag: $(tag)
        autoCreateRepository: true
        outputVariable: 'ECR_PUSHED_IMAGE_NAME'

Please let me know if there is anything else I can provide.

Best Answer

I think you are fundamentally mixing up a couple of things here: the links you provided, and the ones below, offer guidance on setting up logging in Azure Functions. However, you appear to be talking about logging in Azure Pipelines, which is an entirely different thing. So, to be clear:

Azure Pipelines runs the build and deployment jobs that deploy the code you might have in your GitHub repository to Azure Functions. The pipeline is executed on an Azure Pipelines agent, which can be Microsoft-hosted or self-hosted. If we assume you are executing your pipeline on a Microsoft-hosted agent, you should not assume that agent has any of the capabilities Azure Functions might have (nor should it be executing any code targeting Azure Functions in the first place). If you want to execute Python code in your pipeline, you should start by checking which Python-related capabilities come pre-installed on the hosted agents and work from there: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops&tabs=yaml
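
As a minimal sketch of that first step (assuming the script is invoked from an ordinary script step on a hosted agent), printing what the agent actually provides is a quick sanity check:

import platform
import sys

# Show which interpreter and OS the hosted agent provides; nothing
# Azure Functions-specific exists in this environment.
print("Python:", sys.version)
print("Executable:", sys.executable)
print("Platform:", platform.platform())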

If you want to log information about the pipeline run itself, you should first check the "Enable system diagnostics" option when queuing the pipeline manually. To implement more logging yourself, see: https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash
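
Logging commands are plain lines written to stdout in a special format, so they can be emitted from any language, including Python. A minimal sketch (the messages and variable names below are made up for illustration):

# Each "##vso[...]" / "##[...]" line is parsed by the agent when printed to stdout.
print("##[section]Validating inputs")
print("##vso[task.logissue type=warning]Example warning in the pipeline log")
print("##vso[task.logissue type=error]Example error in the pipeline log")
print("##vso[task.setvariable variable=exampleVar]exampleValue")
print("Ordinary output also ends up in the raw log", flush=True)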

For logging in Azure Functions, you would probably want to start here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring , but that is a completely separate topic from logging in Azure Pipelines.
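
For context, logging inside a Python Azure Function looks roughly like the sketch below (the HTTP trigger and function body are illustrative assumptions); the Functions host picks up records sent to the root logger and routes them according to host.json and Application Insights settings:

import logging

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Captured by the Functions host, not by the pipeline that deployed it.
    logging.info("Python HTTP trigger function processed a request.")
    return func.HttpResponse("OK")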

As for python - No logging on Azure DevOps Pipeline, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/69552230/
