
Azure Table query limits: ExecuteQuerySegmentedAsync vs. ExecuteQuery


What are the limits when calling ExecuteQuery()? For example, is there a limit on the number of entities returned, or on the download size?

In other words, at what point would the method below hit those limits?

private static void ExecuteSimpleQuery(CloudTable table, string partitionKey, string startRowKey, string endRowKey)
{
    try
    {
        // Create the range query using the fluent API
        TableQuery<CustomerEntity> rangeQuery = new TableQuery<CustomerEntity>().Where(
            TableQuery.CombineFilters(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey),
                TableOperators.And,
                TableQuery.CombineFilters(
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThanOrEqual, startRowKey),
                    TableOperators.And,
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.LessThanOrEqual, endRowKey))));

        foreach (CustomerEntity entity in table.ExecuteQuery(rangeQuery))
        {
            Console.WriteLine("Customer: {0},{1}\t{2}\t{3}", entity.PartitionKey, entity.RowKey, entity.Email, entity.PhoneNumber);
        }
    }
    catch (StorageException e)
    {
        Console.WriteLine(e.Message);
        Console.ReadLine();
        throw;
    }
}

The method below uses ExecuteQuerySegmentedAsync with a TakeCount of 50, but how is 50 determined? I assume it depends on the answer to my question above.

private static async Task PartitionRangeQueryAsync(CloudTable table, string partitionKey, string startRowKey, string endRowKey)
{
    try
    {
        // Create the range query using the fluent API
        TableQuery<CustomerEntity> rangeQuery = new TableQuery<CustomerEntity>().Where(
            TableQuery.CombineFilters(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey),
                TableOperators.And,
                TableQuery.CombineFilters(
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThanOrEqual, startRowKey),
                    TableOperators.And,
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.LessThanOrEqual, endRowKey))));

        // Request 50 results at a time from the server.
        TableContinuationToken token = null;
        rangeQuery.TakeCount = 50;
        int segmentNumber = 0;
        do
        {
            // Execute the query, passing in the continuation token.
            // The first time this method is called, the continuation token is null. If there are more results,
            // the call populates the continuation token for use in the next call.
            TableQuerySegment<CustomerEntity> segment = await table.ExecuteQuerySegmentedAsync(rangeQuery, token);

            // Indicate which segment is being displayed
            if (segment.Results.Count > 0)
            {
                segmentNumber++;
                Console.WriteLine();
                Console.WriteLine("Segment {0}", segmentNumber);
            }

            // Save the continuation token for the next call to ExecuteQuerySegmentedAsync
            token = segment.ContinuationToken;

            // Write out the properties for each entity returned.
            foreach (CustomerEntity entity in segment)
            {
                Console.WriteLine("\t Customer: {0},{1}\t{2}\t{3}", entity.PartitionKey, entity.RowKey, entity.Email, entity.PhoneNumber);
            }

            Console.WriteLine();
        }
        while (token != null);
    }
    catch (StorageException e)
    {
        Console.WriteLine(e.Message);
        Console.ReadLine();
        throw;
    }
}

The samples are from: https://github.com/Azure-Samples/storage-table-dotnet-getting-started

Best Answer

For ExecuteQuerySegmentedAsync, the limit is 1,000 entities. This comes from the REST API, where a single request to the Table service can return at most 1,000 entities (reference: https://learn.microsoft.com/en-us/rest/api/storageservices/query-timeout-and-pagination).

The ExecuteQuery method will try to return all entities matching the query. Internally, it fetches at most 1,000 entities per iteration, and if the Table service's response contains a continuation token it fetches the next set of entities; a rough sketch of that loop is shown below.
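A minimal sketch of that internal paging behavior, built on ExecuteQuerySegmentedAsync and assuming the same CustomerEntity type and usings as the samples above (the method name ExecuteQueryAllPagesAsync is hypothetical):

private static async Task<List<CustomerEntity>> ExecuteQueryAllPagesAsync(CloudTable table, TableQuery<CustomerEntity> query)
{
    var results = new List<CustomerEntity>();
    TableContinuationToken token = null;
    do
    {
        // Each call returns at most 1,000 entities (the REST API page limit);
        // fewer may come back if the server-side timeout or a partition boundary is hit first.
        TableQuerySegment<CustomerEntity> segment = await table.ExecuteQuerySegmentedAsync(query, token);
        results.AddRange(segment.Results);

        // A non-null continuation token means there are more matching entities to fetch.
        token = segment.ContinuationToken;
    }
    while (token != null);

    return results;
}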

Update

If ExecuteQuery performs pagination automatically, it seems easier to use than ExecuteQuerySegmentedAsync. Why must I use ExecuteQuerySegmentedAsync? And what about download size: 1,000 entities regardless of their size?

With ExecuteQuery, you cannot break out of the loop. That becomes a problem when a table contains a large number of entities. ExecuteQuerySegmentedAsync gives you that flexibility. For example, suppose you want to download all entities from a very large table and save them locally; with ExecuteQuerySegmentedAsync you can save each segment of entities to a different file, as sketched below.
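A minimal sketch of that pattern, again assuming the samples' CustomerEntity type plus System.IO and System.Linq usings; the maxEntities cutoff and the per-segment file naming are illustrative assumptions, not part of the library:

private static async Task SaveSegmentsToFilesAsync(CloudTable table, TableQuery<CustomerEntity> query, int maxEntities)
{
    TableContinuationToken token = null;
    int segmentNumber = 0;
    int totalEntities = 0;
    do
    {
        TableQuerySegment<CustomerEntity> segment = await table.ExecuteQuerySegmentedAsync(query, token);
        token = segment.ContinuationToken;
        segmentNumber++;

        // Persist each segment to its own file instead of holding everything in memory.
        var lines = segment.Results.Select(e =>
            string.Format("{0},{1},{2},{3}", e.PartitionKey, e.RowKey, e.Email, e.PhoneNumber));
        File.WriteAllLines(string.Format("segment-{0}.csv", segmentNumber), lines);

        totalEntities += segment.Results.Count;
        if (totalEntities >= maxEntities)
        {
            // Stop paging early once enough entities have been saved.
            break;
        }
    }
    while (token != null);
}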

Regarding your comment about 1,000 entities regardless of size: yes, that is correct. Keep in mind that each entity can be at most 1 MB in size.

A similar question about Azure Table query limits with ExecuteQuerySegmentedAsync vs. ExecuteQuery can be found on Stack Overflow: https://stackoverflow.com/questions/59710382/
