
node.js - Using pino multistream with synchronous logging


As I understand it, Pino (v7.5.1) performs synchronous logging by default. From the docs:

In Pino's standard mode of operation log messages are directly written to the output stream as the messages are generated with a blocking operation.
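
For contrast, here is a minimal sketch of that standard mode with no multistream involved (the messages are just placeholders):

// Standard mode, as in the quoted docs: with no destination argument,
// pino writes each record straight to stdout with a blocking operation,
// so the lines below always appear in call order.
const log = require('pino')()
log.info('first')
log.info('second')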

I am using pino.multistream like this:

const fs = require('fs')
const pino = require('pino')
const pretty = require('pino-pretty')
const logdir = '/Users/punkish/Projects/z/logs'

const streams = [
  {stream: fs.createWriteStream(`${logdir}/info.log`, {flags: 'a'})},
  {stream: pretty()},
  {level: 'error', stream: fs.createWriteStream(`${logdir}/error.log`, {flags: 'a'})},
  {level: 'debug', stream: fs.createWriteStream(`${logdir}/debug.log`, {flags: 'a'})},
  {level: 'fatal', stream: fs.createWriteStream(`${logdir}/fatal.log`, {flags: 'a'})}
]
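
The logger itself is then presumably built from these streams along the lines of the following minimal sketch (this part is not shown in the question; the overall 'debug' level and the log variable name are assumptions):

// Assumed wiring: pino.multistream fans each record out to every
// registered stream whose level is at or below the record's level.
const log = pino({ level: 'debug' }, pino.multistream(streams))
log.info('hello')   // goes to info.log, the pretty stdout stream, and debug.log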

Strangely, Pino is behaving asynchronously. A curl operation prints its output out of order, before earlier events logged with log.info:

log.info('1')
// .. code to do something 1

log.info('2')
// .. code to do something 2

log.info('3')
// .. code to do something 3

const execSync = require('child_process').execSync
execSync(`curl --silent --output ${local} '${remote}'`)

My console output is:

1
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 39.5M  100 39.5M    0     0   108M      0 --:--:-- --:--:-- --:--:--  113M
2
3

This is a bit annoying and confusing. Maybe it isn't Pino's fault and curl is causing the problem. However, if I replace the pino logging with console.log, the order is as expected, so the problem does seem to be Pino's asynchronous behavior. How can I get back to synchronous logging?

Best answer

The trick is to call pino.destination({...}) to create a SonicBoom output stream, Pino's own replacement for fs.createWriteStream (whose writes are buffered and flushed asynchronously, which is what breaks the ordering above). The SonicBoom options include a boolean sync property. You also need the sync option in pretty({...}).

const pino = require('pino')
const pretty = require('pino-pretty')
const logdir = '/Users/punkish/Projects/z/logs'

const createSonicBoom = (dest) =>
  pino.destination({dest: dest, append: true, sync: true})

const streams = [
  {stream: createSonicBoom(`${logdir}/info.log`)},
  {stream: pretty({
    colorize: true,
    sync: true,
  })},
  {level: 'error', stream: createSonicBoom(`${logdir}/error.log`)},
  {level: 'debug', stream: createSonicBoom(`${logdir}/debug.log`)},
  {level: 'fatal', stream: createSonicBoom(`${logdir}/fatal.log`)}
]

Test:

const log = pino({ level: 'info' }, pino.multistream(streams))

console.log('Before-Fatal')
log.fatal('Fatal')
log.error('Error')
log.warn('Warn')
console.log('After-Warn, Before-Info')
log.info('Info')
console.log('After-Info')

Output:

Before-Fatal
[1234567890123] FATAL (1234567 on host): Fatal
[1234567890127] ERROR (1234567 on host): Error
[1234567890127] WARN (1234567 on host): Warn
After-Warn, Before-Info
[1234567890128] INFO (1234567 on host): Info
After-Info

For node.js - using pino multistream with synchronous logging, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/70265186/
