
c# - Is there a way to skip the MediatR pipeline?

Reposted · Author: 行者123 · Updated: 2023-12-04 16:40:44

I want to cache some of the responses coming from my command handlers.

I have already done this with IPipelineBehavior, but only about 5% of my requests actually need caching; the other 95% should skip this pipeline behavior entirely. Is there a way to do that?

My code is below.

Thanks!

public class PipelineBehavior<TRequest, TResponse> : IPipelineBehavior<TRequest, TResponse>
    where TRequest : IRequest<TResponse>, IProvideCacheKey
{
    private readonly IMemoryCache _cache;

    public PipelineBehavior(IMemoryCache cache)
    {
        _cache = cache;
    }

    public async Task<TResponse> Handle(TRequest request, CancellationToken cancellationToken,
        RequestHandlerDelegate<TResponse> next)
    {
        // Check in cache if we already have what we're looking for
        var cacheKey = request.CacheKey;
        if (_cache.TryGetValue<TResponse>(cacheKey, out var cachedResponse))
        {
            return cachedResponse;
        }

        // If we don't, execute the rest of the pipeline, and add the result to the cache
        var response = await next();
        _cache.Set(cacheKey, response);
        return response;
    }
}
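One thing to note: as written, the behavior caches entries indefinitely. If stale responses are a concern, `IMemoryCache.Set` also accepts `MemoryCacheEntryOptions`, so the `_cache.Set` call inside `Handle` could instead be written as the following sketch (the five-minute window is an arbitrary example value, not from the original code):

```csharp
// Sketch: same Set call as above, but with an absolute expiration so
// cached responses eventually fall out of the cache.
_cache.Set(cacheKey, response, new MemoryCacheEntryOptions
{
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
});
```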





public class GetUserByEmailCommand : Command, IRequest<bool>, IProvideCacheKey
{
    public string Email { get; set; }

    public string CacheKey => $"{GetType().Name}:{Email}";

    public override bool IsValid()
    {
        ValidationResult = new GetUserByEmailCommandValidation<GetUserByEmailCommand>().Validate(this);
        return ValidationResult.IsValid;
    }
}



public interface IProvideCacheKey
{
    string CacheKey { get; }
}
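For completeness, the behavior also has to be registered with the container. A minimal sketch for Microsoft.Extensions.DependencyInjection (the `Startup` name and MediatR registration call are assumed from a typical ASP.NET Core setup; they are not shown in the original post):

```csharp
// Assumed setup, not from the original post:
services.AddMemoryCache();             // IMemoryCache used by the behavior
services.AddMediatR(typeof(Startup));  // scan this assembly for handlers

// Open-generic registration of the caching behavior. With recent versions of
// Microsoft.Extensions.DependencyInjection, the IProvideCacheKey constraint on
// TRequest means the container simply skips this behavior for request types
// that don't implement the interface, so the other 95% never enter it.
services.AddTransient(typeof(IPipelineBehavior<,>), typeof(PipelineBehavior<,>));
```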

Best answer

You can wrap the caching behavior in a check that bypasses it and lets the pipeline continue when the request is not cacheable. In your case, you could simply check whether the request implements your interface at the start of the Handle method:

if (request is IProvideCacheKey)
{
    // perform cache behavior; return the cached value and terminate the pipeline
}
// else continue the pipeline
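Put together with the question's code, that check could look like the sketch below: drop the IProvideCacheKey constraint so the behavior applies to every request, and branch at runtime instead (`CachingBehavior` is an assumed name; the body is adapted from the question's PipelineBehavior):

```csharp
using System.Threading;
using System.Threading.Tasks;
using MediatR;
using Microsoft.Extensions.Caching.Memory;

public class CachingBehavior<TRequest, TResponse> : IPipelineBehavior<TRequest, TResponse>
    where TRequest : IRequest<TResponse>   // note: no IProvideCacheKey constraint here
{
    private readonly IMemoryCache _cache;

    public CachingBehavior(IMemoryCache cache) => _cache = cache;

    public async Task<TResponse> Handle(TRequest request, CancellationToken cancellationToken,
        RequestHandlerDelegate<TResponse> next)
    {
        // Only the ~5% of requests that opt in via IProvideCacheKey are cached
        if (request is IProvideCacheKey keyed)
        {
            if (_cache.TryGetValue<TResponse>(keyed.CacheKey, out var cached))
            {
                return cached;
            }

            var response = await next();
            _cache.Set(keyed.CacheKey, response);
            return response;
        }

        // Everything else skips the caching logic and runs the pipeline as normal
        return await next();
    }
}
```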

There are a couple of good examples that cover this in more detail:

https://lurumad.github.io/cross-cutting-concerns-in-asp-net-core-with-meaditr

https://anderly.com/2019/12/12/cross-cutting-concerns-with-mediatr-pipeline-behaviors/

Regarding "c# - Is there a way to skip the MediatR pipeline?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/61400315/
