
php - Elasticsearch bulk upload error with PHP - Limit of total fields [1000] in index has been exceeded


We are planning to use Elasticsearch in one of our projects, and we are currently testing Elasticsearch 5.0.1 with our data. One problem we are facing is that when we bulk upload from a MySQL table into Elasticsearch, we get the following error:

java.lang.IllegalArgumentException: Limit of total fields [1000] in index [shopfront] has been exceeded
at org.elasticsearch.index.mapper.MapperService.checkTotalFieldsLimit(MapperService.java:482) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:343) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:277) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.applyRequest(MetaDataMappingService.java:323) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.execute(MetaDataMappingService.java:241) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.cluster.service.ClusterService.runTasksForExecutor(ClusterService.java:555) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.cluster.service.ClusterService$UpdateTask.run(ClusterService.java:896) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:451) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:238) ~[elasticsearch-5.0.1.jar:5.0.1]
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:201) ~[elasticsearch-5.0.1.jar:5.0.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_111]

We are using PHP as the Elasticsearch client to bulk upload from MySQL into Elastic. After googling, I found this thread - https://discuss.elastic.co/t/es-2-3-5-x-metricbeat-index-field-limit/66821

I also read somewhere that using "index.mapping.total_fields.limit" can solve this problem, but I cannot figure out how to use it in my PHP code. Here is my PHP code.

$params = ['body' => []];

$i = 1;
foreach ($productsList as $key => $value) {

    $params['body'][] = [
        'index' => [
            '_index' => 'shopfront',
            '_type'  => 'products'
        ],
        'settings' => ['index.mapping.total_fields.limit' => 3000]
    ];

    $params['body'][] = [
        'product_displayname' => $value['product_displayname'],
        'product_price'       => $value['product_price'],
        'popularity'          => $value['popularity'],
        'lowestcomp_price'    => $value['lowestcomp_price']
    ];

    // Every 1000 documents stop and send the bulk request
    if ($i % 1000 == 0) {
        $responses = $client->bulk($params);

        // erase the old bulk request
        $params = ['body' => []];

        // unset the bulk response when you are done to save memory
        unset($responses);
    }

    $i++;
}

// Send the last batch if it exists
if (!empty($params['body'])) {
    $responses = $client->bulk($params);
}

Note - I used the same code with Elasticsearch 2.4.1 and it worked fine.

Best Answer

In ES 5, the ES folks decided to limit the number of fields a mapping type can contain in order to prevent mapping explosions. As you have noticed, that limit is set to 1000 fields per mapping, but you can raise it to whatever suits your needs, either by specifying the index.mapping.total_fields.limit setting at index creation time or by updating the index settings, like this:

curl -XPUT 'localhost:9200/shopfront/_settings' -d '
{
  "index.mapping.total_fields.limit": 3000
}'
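Since the question uses PHP, the same setting can also be applied through the official elasticsearch-php client rather than inside the bulk request body (the 'settings' entry in the question's bulk action has no effect there; bulk action metadata only accepts keys such as _index and _type). Below is a minimal sketch, assuming a default client built with ClientBuilder and the shopfront index from the question; the 3000 limit simply mirrors the curl example above.

<?php
// Minimal sketch: raising index.mapping.total_fields.limit with the
// official elasticsearch-php client (assumed setup, not the asker's code).
require 'vendor/autoload.php';

$client = Elasticsearch\ClientBuilder::create()->build();

// Option 1: set the limit when the index is created.
$client->indices()->create([
    'index' => 'shopfront',
    'body'  => [
        'settings' => [
            'index.mapping.total_fields.limit' => 3000
        ]
    ]
]);

// Option 2: update the setting on an existing index
// (equivalent to the curl -XPUT .../_settings call above).
$client->indices()->putSettings([
    'index' => 'shopfront',
    'body'  => [
        'index.mapping.total_fields.limit' => 3000
    ]
]);

Either call only needs to be made once per index, before the bulk uploads run; the bulk loop itself then stays exactly as in the question, minus the 'settings' entry.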

Note that you should also ask yourself whether having that many fields is really a good idea. Do you need all of them? Can some of them be combined? And so on.

Regarding "php - Elasticsearch bulk upload error with PHP - Limit of total fields [1000] in index has been exceeded", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40857060/
