elasticsearch - Mapping a field with the special character # fails


I mapped a field in Elasticsearch that gets analyzed by an edge 2-gram tokenizer:

"google.title.#t": {   "type": "string",    "index_analyzer": "edge_2gram_body_analyzer",    "search_analyzer": "standard" } 

The mapping itself seems healthy. I expect this request:

POST myindex/_analyze?field=google.title.#t&text=test 

to return the tokens:

te, tes, test 

Yet it returns only "test" instead, falling back to the standard analyzer.

When I remove the # from the key (google.title.t), it works. Is there a way to escape # at mapping time? Are there other forbidden characters?

This happens because "#" in a URL marks the start of the fragment identifier, so everything after it is never sent to the server. The field name needs to be URL-encoded, with # becoming %23.
For example:

POST myindex/_analyze?field=google.title.%23t&text=test 
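If you build the request programmatically, the standard library can do the percent-encoding for you. A short sketch (the host, port, and index name are placeholders):

```python
from urllib.parse import quote

# "#" starts the URL fragment, so an unencoded field name would be
# truncated to "google.title." before the request reaches Elasticsearch.
field = "google.title.#t"

# safe="" forces every reserved character, including "#", to be encoded.
encoded = quote(field, safe="")
print(encoded)  # google.title.%23t

# Hypothetical _analyze URL built from the encoded field name:
url = f"http://localhost:9200/myindex/_analyze?field={encoded}&text=test"
print(url)
```

The same rule applies to any other URL-reserved character in a field name, such as `?`, `&`, or `%` itself.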

