Is there a way, via curl / JSON, to retrieve a specific source value for every ID in an index? For example, in the sample data below, I need the `_id_batch` source value for all IDs of the type ("document"). `_source_include` works at the individual-ID level, but I want the `_id_batch` value for every ID of that type in the index.
Once that works, I would also like to restrict the output, returning `_id_batch` only for records matching something like "location":"west palm beach".
Something like this (the * is intended to select all IDs):
curl -X GET "http://<IP Server>:9200/<index_name>/document/*/_source?_source_include=_id_batch" -d "
{
"from": 0,
"fields":[],
"size":50,
"sort":[]
}
" > "C:\Users\user\Downloads\output.json"
Sample data:
URL: http://IPServer:9200/index_name/document
{
"_index" : "<index_name>",
"_type" : "document",
"_id" : "11",
"_version" : 1,
"found" : true,
"_source":{"_id_batch":1001688,"_id_document":11,"name":"xxx","location":"west palm beach","tweet":"0.0"}
}
{
"_index" : "<index_name>",
"_type" : "document",
"_id" : "12",
"_version" : 1,
"found" : true,
"_source":{"_id_batch":1001689,"_id_document":12,"name":"yyy","location":"west palm beach","tweet":"0.0"}
}
Expected output:
{"_id_batch":1001688}
{"_id_batch":1001689}
Sample query (run from the Windows command line, hence the double quotes):
curl -X GET "http://<IP Server>:9200/<index_name>/document/11/_source?_source_include=_id_batch" -d "
{
"from": 0,
"fields":[],
"size":50,
"sort":[]
}
" > "C:\Users\user\Downloads\output.json"
Sample output in the JSON file:
{"_id_batch":1001688}
I tried this call -
curl -XPOST "http://<IP Server>:9200/<index_name>/document/_search" -d "{"query" : { "match_all" : {} },"size" : <LARGE_NUM>,"_source" : [ "_id_batch" ]}"
and got this error -
{"error":"SearchPhaseExecutionException[Failed to execute phase [query], all shards failed; shardFailures {[dyYdbJLMQeC5XvXE0wwA9Q][<index_name>][0]: RemoteTransportException[[<server>][inet[/<IP Server>:9300]][indices:data/read/search[phase/query]]]; nested: SearchParseException[[<index_name>][0]: query[ConstantScore(*:*)],from[-1],size[50]: Parse Failure [Failed to parse source [{query : { match_all : {} },size : 50,_source : [ _id_batch ]}]]]; nested: JsonParseException[Unrecognized token '_id_batch': was expecting ('true', 'false' or 'null')\n at [Source: UNKNOWN; line: 1, column: 61]]; }{[dyYdbJLMQeC5XvXE0wwA9Q][<index_name>][1]: RemoteTransportException[[<server>][inet[/<IP Server>:9300]][indices:data/read/search[phase/query]]]; nested: SearchParseException[[<index_name>][1]: query[ConstantScore(*:*)],from[-1],size[50]: Parse Failure [Failed to parse source [{query : { match_all : {} },size : 50,_source : [ _id_batch ]}]]]; nested: JsonParseException[Unrecognized token '_id_batch': was expecting ('true', 'false' or 'null')\n at [Source: UNKNOWN; line: 1, column: 61]]; }{[31mSI_d3SPmTf7cU1dSNlQ][<index_name>][2]: RemoteTransportException[[D1-C04-WAPP02][inet[/<IP>:9300]][indices:data/read/search[phase/query]]]; nested: SearchParseException[[<index_name>][2]: query[ConstantScore(*:*)],from[-1],size[50]: Parse Failure [Failed to parse source [{query : { match_all : {} },size : 50,_source : [ _id_batch ]}]]]; nested: JsonParseException[Unrecognized token '_id_batch': was expecting ('true', 'false' or 'null')\n at [Source: UNKNOWN; line: 1, column: 61]]; }{[q2tWPLiCQR-O2zXRZBuQYg][<index_name>][3]: SearchParseException[[<index_name>][3]: query[ConstantScore(*:*)],from[-1],size[50]: Parse Failure [Failed to parse source [{query : { match_all : {} },size : 50,_source : [ _id_batch ]}]]]; nested: JsonParseException[Unrecognized token '_id_batch': was expecting ('true', 'false' or 'null')\n at [Source: [B@4fccd33b; line: 1, column: 61]]; 
}{[q2tWPLiCQR-O2zXRZBuQYg][<index_name>][4]: SearchParseException[[<index_name>][4]: query[ConstantScore(*:*)],from[-1],size[50]: Parse Failure [Failed to parse source [{query : { match_all : {} },size : 50,_source : [ _id_batch ]}]]]; nested: JsonParseException[Unrecognized token '_id_batch': was expecting ('true', 'false' or 'null')\n at [Source: [B@4fccd33b; line: 1, column: 61]]; }]","status":400}
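The `JsonParseException[Unrecognized token '_id_batch']` above is a quoting problem, not an Elasticsearch problem: when the whole `-d "…"` payload is wrapped in double quotes on the Windows command line, the inner double quotes are consumed by the shell, so Elasticsearch receives `{query : { match_all : {} },…}` with unquoted keys, which is not valid JSON. A minimal sketch reproducing the difference (POSIX shell, using `python3 -m json.tool` only as a convenient JSON validator):

```shell
# What the server actually received when the inner quotes were stripped
# (this is the exact payload quoted in the error message above):
bad='{query : { match_all : {} },size : 50,_source : [ _id_batch ]}'

# The intended payload, with the quotes preserved:
good='{"query":{"match_all":{}},"size":50,"_source":["_id_batch"]}'

# Validate both with a JSON parser; only the properly quoted form parses.
echo "$bad"  | python3 -m json.tool >/dev/null 2>&1 && echo "bad parses"  || echo "bad fails to parse"
echo "$good" | python3 -m json.tool >/dev/null 2>&1 && echo "good parses" || echo "good fails to parse"
```

On Windows cmd the usual fix is to escape the inner quotes with backslashes, e.g. `-d "{\"query\":{\"match_all\":{}},\"size\":50,\"_source\":[\"_id_batch\"]}"`.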
Best answer
I think you want something like this:
curl -XPOST "http://<IP Server>:9200/<index_name>/document/_search" -d '{
"query" : { "match_all" : {} },
"size" : <LARGE_NUM>,
"_source" : [ "_id_batch" ]
}'
where LARGE_NUM is a number much larger than the count of all records of type 'document'.
That is, you specify the type/mapping ("document") via the endpoint you hit, and you query all Elasticsearch records under that mapping.
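For the second part of the question (limiting the output to records with "location":"west palm beach"), the `match_all` query can be swapped for a `match` query on that field, and the `_id_batch` values pulled out of `hits.hits[]._source` in the response. A minimal Python sketch; the sample response below is hand-built to mirror the two records in the question, not fetched from a live cluster:

```python
import json

# Request body: same _source filtering as the answer, but with a match
# query on "location" instead of match_all.
body = {
    "query": {"match": {"location": "west palm beach"}},
    "_source": ["_id_batch"],
    "size": 10000,  # must exceed the number of matching documents
}

# A _search response wraps each document's _source under hits.hits[];
# this sample mirrors the two records from the question's sample data.
response = {
    "hits": {
        "hits": [
            {"_id": "11", "_source": {"_id_batch": 1001688}},
            {"_id": "12", "_source": {"_id_batch": 1001689}},
        ]
    }
}

# Reduce the response to the expected output: one {"_id_batch": ...} per hit.
for hit in response["hits"]["hits"]:
    print(json.dumps(hit["_source"]))
# prints {"_id_batch": 1001688} then {"_id_batch": 1001689}
```

Sending `body` with curl (single-quoted, as in the answer above) against the `/_search` endpoint and post-processing the response this way yields exactly the expected output lines from the question.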
A similar question on this topic (json - Elasticsearch query to retrieve a specific _source value of a specific type for all IDs) can be found on Stack Overflow: https://stackoverflow.com/questions/33042070/