I'm using an Elasticsearch solution for search-as-you-type, which requires a custom analyzer. For this to work, a custom analyzer/tokenizer has to be set when the index is created. I assumed this connector supports that, since in the config example an analyzer is defined in settings as well. However, when I try to do the same, no analyzer is created. So when I then try to create the mapping that uses this custom analyzer, it obviously fails, as the analyzer does not exist.
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | starting request { method: 'HEAD',
api_1 | castExists: true,
api_1 | path: '/superbuddy',
api_1 | query: {} }
api_1 |
api_1 |
api_1 | Elasticsearch TRACE: 2016-10-19T10:21:23Z
api_1 | -> HEAD http://search:9200/superbuddy
api_1 |
api_1 | <- 404
api_1 |
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | Request complete
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | starting request { method: 'POST', path: '/superbuddy', query: {} }
api_1 |
api_1 |
api_1 | Elasticsearch TRACE: 2016-10-19T10:21:23Z
api_1 | -> POST http://search:9200/superbuddy
api_1 |
api_1 | <- 200
api_1 | {
api_1 | "acknowledged": true
api_1 | }
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | Request complete
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | starting request { method: 'PUT',
api_1 | path: '/superbuddy/_mapping/term',
api_1 | body: { properties: { term: [Object] } },
api_1 | query: {} }
api_1 |
api_1 |
api_1 | Elasticsearch TRACE: 2016-10-19T10:21:23Z
api_1 | -> PUT http://search:9200/superbuddy/_mapping/term
api_1 | {
api_1 | "properties": {
api_1 | "term": {
api_1 | "type": "string",
api_1 | "analyzer": "autocomplete",
api_1 | "search_analyzer": "standard"
api_1 | }
api_1 | }
api_1 | }
api_1 | <- 400
api_1 | {
api_1 | "error": {
api_1 | "root_cause": [
api_1 | {
api_1 | "type": "mapper_parsing_exception",
api_1 | "reason": "analyzer [autocomplete] not found for field [term]"
api_1 | }
api_1 | ],
api_1 | "type": "mapper_parsing_exception",
api_1 | "reason": "analyzer [autocomplete] not found for field [term]"
api_1 | },
api_1 | "status": 400
api_1 | }
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | Request complete
api_1 |
api_1 | Connection fails: [mapper_parsing_exception] analyzer [autocomplete] not found for field [term] :: {"path":"/superbuddy/_mapping/term","query":{},"body":"{\"properties\":{\"term\":{\"type\":\"string\",\"analyzer\":\"autocomplete\",\"search_analyzer\":\"standard\"}}}","statusCode":400,"response":"{\"error\":{\"root_cause\":[{\"type\":\"mapper_parsing_exception\",\"reason\":\"analyzer [autocomplete] not found for field [term]\"}],\"type\":\"mapper_parsing_exception\",\"reason\":\"analyzer [autocomplete] not found for field [term]\"},\"status\":400}"}
api_1 | It will be retried for the next request.
api_1 | Unhandled rejection Error: [mapper_parsing_exception] analyzer [autocomplete] not found for field [term]
api_1 | at respond (/project/node_modules/elasticsearch/src/lib/transport.js:289:15)
api_1 | at checkRespForFailure (/project/node_modules/elasticsearch/src/lib/transport.js:248:7)
api_1 | at HttpConnector.<anonymous> (/project/node_modules/elasticsearch/src/lib/connectors/http.js:164:7)
api_1 | at IncomingMessage.wrapper (/project/node_modules/lodash/index.js:3095:19)
api_1 | at emitNone (events.js:91:20)
api_1 | at IncomingMessage.emit (events.js:185:7)
api_1 | at endReadableNT (_stream_readable.js:974:12)
api_1 | at _combinedTickCallback (internal/process/next_tick.js:74:11)
api_1 | at process._tickDomainCallback (internal/process/next_tick.js:122:9)
Naturally, those same settings work fine when I apply them manually: I push the settings in the body of the POST request that creates the index. If I do that, this connector can also create the mapping without issue, which gives me a workaround for now. But I'll be damned if I have to create the index manually when this connector could do it automatically.
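For reference, the manual workaround described above could look roughly like this. This is a sketch, not the connector's code: the host http://search:9200, the index name superbuddy, and the field definitions are taken from the log above, and the actual client call is shown in a comment since it needs a live cluster. The key point is that the analysis settings and the mapping go into the same index-creation request, so the "autocomplete" analyzer already exists when the "term" mapping that references it is parsed.

```javascript
// Sketch of the workaround: create the index with the analysis settings and
// the mapping in one request. Host/index names are taken from the log above.
const indexBody = {
  settings: {
    analysis: {
      filter: {
        autocomplete_filter: { type: 'edge_ngram', min_gram: 1, max_gram: 20 }
      },
      analyzer: {
        autocomplete: {
          type: 'custom',
          tokenizer: 'standard',
          filter: ['lowercase', 'autocomplete_filter']
        }
      }
    }
  },
  mappings: {
    term: {
      properties: {
        term: {
          type: 'string',
          analyzer: 'autocomplete',
          search_analyzer: 'standard'
        }
      }
    }
  }
};

// With the elasticsearch-js client this would be roughly:
//   const elasticsearch = require('elasticsearch');
//   const client = new elasticsearch.Client({ host: 'http://search:9200' });
//   client.indices.create({ index: 'superbuddy', body: indexBody });
console.log(JSON.stringify(indexBody, null, 2));
```

Because the analyzer is created together with the index, the subsequent PUT of the mapping (or a mapping embedded as above) no longer fails with mapper_parsing_exception.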
@Ozitiho - as you found out ... creating custom analyzers (if they do not already exist) is not handled by the code currently :(
If I, you or some other member of our community submits a PR with tests then we can get the feature merged and released.
Currently there are two other big PRs that I'm waiting to review with their respective contributors. Given how busy my day job is ... I'm fairly sure that I won't be contributing any new code (on my own) this year.
Good to know! Since I do need this feature, I'll probably implement it myself eventually. If I do, I'll make sure to open a PR for it. Thanks for clarifying. :)
Here's what I got.
datasource.json:

  "mappings": [
    {
      "name": "term",
      "properties": {
        "term": {
          "type": "string",
          "analyzer": "autocomplete",
          "search_analyzer": "standard"
        }
      }
    }
  ],
  "settings": {
    "analysis": {
      "filter": {
        "autocomplete_filter": {
          "type": "edge_ngram",
          "min_gram": 1,
          "max_gram": 20
        }
      },
      "analyzer": {
        "autocomplete": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "autocomplete_filter"]
        }
      }
    }
  },
Output: the debug/trace log shown earlier.
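To illustrate why this configuration enables search-as-you-type: at index time the edge_ngram filter stores every prefix of each lowercased token (lengths 1 through 20), while the standard search_analyzer leaves the query untouched, so a partial query like "sup" matches the indexed term "superbuddy". The following is a rough simulation of that token stream, not Elasticsearch's actual implementation; the tokenizer stand-in is deliberately crude.

```javascript
// Rough simulation of the "autocomplete" analyzer defined above:
// tokenize, lowercase, then emit edge n-grams of length 1..20.
function edgeNgrams(token, minGram, maxGram) {
  const grams = [];
  for (let n = minGram; n <= Math.min(maxGram, token.length); n++) {
    grams.push(token.slice(0, n)); // every prefix of the token
  }
  return grams;
}

function autocompleteAnalyze(text) {
  return text
    .toLowerCase()
    .split(/\W+/)        // crude stand-in for the standard tokenizer
    .filter(Boolean)
    .flatMap(t => edgeNgrams(t, 1, 20));
}

// Index time: "Superbuddy" -> ["s", "su", "sup", ..., "superbuddy"]
// Query time (standard analyzer): "sup" stays "sup" and matches a stored gram.
console.log(autocompleteAnalyze('Superbuddy'));
```

This also shows why the analyzer must exist before the mapping is applied: the mapping only names "autocomplete"; the actual prefix-generation behavior lives in the index's analysis settings.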