asp.net-mvc - Large file upload to ASP.NET Core 3.0 Web API fails because the request body is too large


I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I followed the instructions in the MS docs below to set up the endpoint.

https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size

When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.

My code works as expected locally.

When I push it to my production server, an Azure App Service on Linux, the code does not work and fails with this error:

Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.

As the article above recommends, I have configured Kestrel with increased limits as follows:

.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseKestrel((ctx, options) =>
{
var config = ctx.Configuration;

options.Limits.MaxRequestBodySize = 6000000000;
options.Limits.MinRequestBodyDataRate =
new MinDataRate(bytesPerSecond: 100,
gracePeriod: TimeSpan.FromSeconds(10));
options.Limits.MinResponseDataRate =
new MinDataRate(bytesPerSecond: 100,
gracePeriod: TimeSpan.FromSeconds(10));
options.Limits.RequestHeadersTimeout =
TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
});

I have also configured FormOptions to accept a multipart body of up to 6,000,000,000 bytes:

services.Configure<FormOptions>(options =>
{
options.MultipartBodyLengthLimit = 6000000000;
});

And, per the article's recommendations, set up the API controller action with the following attributes:

[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]

Finally, here is the action itself. This giant block of code is not representative of how I want the code to be written, but I merged everything into one method as part of a debugging exercise.

public async Task<IActionResult> Audio()
{
if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
{
throw new ArgumentException("The media file could not be processed.");
}

string mediaId = string.Empty;
string instructorId = string.Empty;
try
{
// process file first
KeyValueAccumulator formAccumulator = new KeyValueAccumulator();

var streamedFileContent = new byte[0];

var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(Request.ContentType),
_defaultFormOptions.MultipartBoundaryLengthLimit
);
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();

while (section != null)
{
var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);

if (hasContentDispositionHeader)
{
if (MultipartRequestHelper
.HasFileContentDisposition(contentDisposition))
{
streamedFileContent =
await FileHelpers.ProcessStreamedFile(section, contentDisposition,
_permittedExtensions, _fileSizeLimit);

}
else if (MultipartRequestHelper
.HasFormDataContentDisposition(contentDisposition))
{
var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
var encoding = FileHelpers.GetEncoding(section);

if (encoding == null)
{
return BadRequest($"The request could not be processed: Bad Encoding");
}

using (var streamReader = new StreamReader(
section.Body,
encoding,
detectEncodingFromByteOrderMarks: true,
bufferSize: 1024,
leaveOpen: true))
{
// The value length limit is enforced by
// MultipartBodyLengthLimit
var value = await streamReader.ReadToEndAsync();

if (string.Equals(value, "undefined",
StringComparison.OrdinalIgnoreCase))
{
value = string.Empty;
}

formAccumulator.Append(key, value);

if (formAccumulator.ValueCount >
_defaultFormOptions.ValueCountLimit)
{
return BadRequest($"The request could not be processed: Key Count limit exceeded.");
}
}
}
}

// Drain any remaining section body that hasn't been consumed and
// read the headers for the next section.
section = await reader.ReadNextSectionAsync();
}

var form = formAccumulator;
var file = streamedFileContent;

var results = form.GetResults();

instructorId = results["instructorId"];
string title = results["title"];
string firstName = results["firstName"];
string lastName = results["lastName"];
string durationInMinutes = results["durationInMinutes"];

//mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);

string fileExtension = "m4a";

// Generate Container Name - InstructorSpecific
string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";

string contentType = "audio/mp4";
FileType fileType = FileType.audio;

string authorName = $"{firstName} {lastName}";
string authorShortName = $"{firstName[0]}{lastName}";
string description = $"{authorShortName} - {title}";

long duration = (Convert.ToInt32(durationInMinutes) * 60000);

// Generate new filename
string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";

DateTime recordingDate = DateTime.UtcNow;
DateTime uploadDate = DateTime.UtcNow;
long blobSize = long.MinValue;
try
{
// Update file properties in storage
Dictionary<string, string> fileProperties = new Dictionary<string, string>();
fileProperties.Add("ContentType", contentType);

// update file metadata in storage
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("author", authorShortName);
metadata.Add("title", title);
metadata.Add("description", description);
metadata.Add("duration", duration.ToString());
metadata.Add("recordingDate", recordingDate.ToString());
metadata.Add("uploadDate", uploadDate.ToString());

var fileNameWExt = $"{fileName}.{fileExtension}";

var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");

try
{
// Wrap the buffered file content and upload it to the blob.
using (var fileContent = new MemoryStream(streamedFileContent))
{
    await blobContainer.UploadFromStreamAsync(fileContent);
}

}
catch (StorageException e)
{
    // 403 and any other storage failure are surfaced to the caller the same way.
    return BadRequest(e.Message);
}

try
{
foreach (var key in metadata.Keys.ToList())
{
blobContainer.Metadata.Add(key, metadata[key]);
}

await blobContainer.SetMetadataAsync();

}
catch (StorageException e)
{
return BadRequest(e.Message);
}

blobSize = await StorageUtils.GetBlobSize(blobContainer);
}
catch (StorageException e)
{
return BadRequest(e.Message);
}

Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate, ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()), StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue), Manifest.Create(new Dictionary<string, string>()));

// upload to MongoDB
if (media != null)
{
var mapper = new Mapper(_mapperConfiguration);

var dao = mapper.Map<ContentDAO>(media);

try
{
    await _db.Content.InsertOneAsync(dao);
    mediaId = dao.Id.ToString();
}
catch (Exception)
{
    // Leave mediaId empty so the failure is reported below.
    mediaId = string.Empty;
}
}
else
{
// metadata wasn't stored, remove blob
await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
return BadRequest($"An issue occurred during media upload: rolling back storage change");
}

if (string.IsNullOrEmpty(mediaId))
{
return BadRequest($"Could not add instructor media");
}

}
catch (Exception ex)
{
return BadRequest(ex.Message);
}

var result = new { MediaId = mediaId, InstructorId = instructorId };

return Ok(result);
}
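An aside on the code above (not the cause of the 413): `FileHelpers.ProcessStreamedFile` buffers the entire upload into a `byte[]` before it is re-wrapped in a `MemoryStream`, so a multi-GB upload also needs multi-GB of server memory. A minimal sketch of what the file branch could look like if the multipart section were piped straight to blob storage instead; this assumes the same `MultipartReader` loop as above and that the blob (`blobContainer` in the original code) has already been created before the loop:

// Hypothetical restructuring of the file branch inside the
// while (section != null) loop; blobContainer is assumed to be the
// CloudBlockBlob returned by _cloudStorageService.CreateBlob(...).
if (MultipartRequestHelper.HasFileContentDisposition(contentDisposition))
{
    // Stream the section body directly into blob storage rather than
    // accumulating it in memory first.
    await blobContainer.UploadFromStreamAsync(section.Body);
}

Note that, depending on the storage SDK version, uploading from a non-seekable stream may require opening a write stream on the blob (`OpenWriteAsync`) and copying into it instead; treat this as a sketch of the approach, not a drop-in replacement.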

I reiterate: all of this works fine locally. I don't run it in IIS Express; I run it as a console application.

I submit large audio files through a SPA application and through Postman, and it works perfectly.

I am deploying this code to an Azure App Service on Linux (on the Basic B1 tier).

Since the code runs in my local development environment, I am at a loss as to what to try next. I have refactored this code a few times, but I suspect the problem is environmental.

I can't find anything suggesting that the App Service plan tier is the culprit, so before I go out and spend more money, I wanted to see whether anyone here has run into this challenge and can advise.

UPDATE: I tried upgrading to a production App Service plan to see whether there was an undocumented gate on incoming traffic. The upgrade didn't help either.

Thanks in advance.

-A

Best Answer

Currently, as of November 2019, there is a limitation in Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, and it enforces a request/file size limit that does not appear to be overridden by any published Kestrel configuration. The solution was to move the Web API application to an Azure App Service for Windows, where it works as expected.
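For reference, moving to a Windows plan can be sketched with the Azure CLI; `az appservice plan create` produces a Windows plan when `--is-linux` is omitted. The resource group, plan, and app names below are placeholders:

```shell
# Placeholders: my-rg, my-windows-plan, my-api-app
# Create a Windows App Service plan (no --is-linux flag, which would make it Linux)
az appservice plan create --resource-group my-rg --name my-windows-plan --sku B1

# Create the web app on that Windows plan and deploy the API to it
az webapp create --resource-group my-rg --plan my-windows-plan --name my-api-app
```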

I'm sure there is some way around this if you know the magic combination of configuration, server settings, and CLI commands, but I needed to get on with development.

Source: this question on Stack Overflow: https://stackoverflow.com/questions/58886305/
