I am getting this error while indexing my data. After some research I found out why it happens, and I increased `max_token_length`. I did that, but I still get:

    IllegalArgumentException: TokenStream expanded to 912 finite strings. Only <= 256 finite strings are supported

Here are my analyzer settings:
"settings": {
"index": {
"analysis": {
"analyzer": {
"shingle_analyzer": {
"tokenizer": "standard",
"max_token_length": 920,
"filter": ["lowercase", "shingle_filter", "asciifolding"],
"char_filter": ["html_strip"],
"type": "custom"
},
"html_analyzer": {
"tokenizer": "standard",
"max_token_length": 920,
"filter": ["lowercase", "asciifolding"],
"char_filter": ["html_strip"],
"type": "custom"
}
},
"tokenizer": {
"standard": {
"type": "standard"
}
},
"filter": {
"shingle_filter": {
"min_shingle_size": 2,
"max_shingle_size": 5,
"type": "shingle"
}
}
}
}
}
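As an aside, one way to see why a shingle filter can expand a single input into many token combinations is to run the analyzer through the `_analyze` API and count the tokens it emits. A minimal sketch, assuming the index above is called `my_index` and an Elasticsearch version that accepts a JSON request body for `_analyze`:

```
GET /my_index/_analyze
{
  "analyzer": "shingle_analyzer",
  "text": "Abcdefghij kl Mnopqrstwx yz Abcdef g Hijklmno pq Rstwxy Zabc (DEF)"
}
```

With `min_shingle_size: 2` and `max_shingle_size: 5`, each word in the input participates in shingles of length 2 through 5, so the number of emitted tokens grows quickly with the number of words, independently of `max_token_length`.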
Here is an example of a document I want to insert:
```
POST /my_index/my_type/{id}
{
  "myField": {
    "input": "Abcdefghij kl Mnopqrstwx yz Abcdef g Hijklmno pq Rstwxy Zabc (DEF)",
    "weight": 2,
    "payload": {
      "iD": "2786129"
    }
  }
}
```
And below is the `my_type` property mapping:
"Suggestion": {
"properties": {
"id": {
"index": "not_analyzed",
"type": "integer"
},
"myField": {
"type": "completion",
"analyzer": "shingle_analyzer",
"search_analyzer": "shingle_analyzer",
"max_input_length": 150,
"payloads": true
}
}
}
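For comparison only (this is an illustration of how the analyzer is attached to the field, not a confirmed fix): completion fields are often mapped with a lightweight analyzer such as the built-in `simple` analyzer, which does not multiply each input into many finite strings the way a shingle filter does. A sketch of the same mapping with that assumption:

```
"Suggestion": {
  "properties": {
    "id": {
      "index": "not_analyzed",
      "type": "integer"
    },
    "myField": {
      "type": "completion",
      "analyzer": "simple",
      "search_analyzer": "simple",
      "max_input_length": 150,
      "payloads": true
    }
  }
}
```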
What am I missing?
I would appreciate any help or clue to solve this, thanks!
EDIT: corrected the analyzer settings (a closing brace was missing).
Note that you are missing the `'analysis': {...}` part in your index settings to wrap your custom analyzers. See [the structure of a custom analyzer](https://www.elastic.co/guide/en/elasticsearch/reference/current/analysis-custom-analyzer.html): `analyzer`, `tokenizer` and `filter` go inside the `analysis` structure. – Val
Oh, I'm sorry, I wrote it wrong in the question; I actually do have them all inside the `analysis` settings. –