Hi Qiqi, sorry for the delay. The quick answer is that we don't have a tokenizer in Rucene at this time. We rely on a data pipeline to tokenize the input before indexing instead.
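Since Rucene itself ships no tokenizer, one common approach for CJK text is to segment it in the pipeline and hand the index pre-split terms. Below is a minimal, hypothetical sketch (not part of Rucene) that emits character bigrams, a classic dictionary-free fallback for Chinese; real pipelines often use a dictionary-based segmenter such as jieba instead:

```rust
// Sketch of pre-tokenizing Chinese text in a data pipeline, outside
// Rucene. Emits character bigrams; single characters are passed
// through as-is. This is an illustrative example, not Rucene API.
fn bigram_tokenize(text: &str) -> Vec<String> {
    let chars: Vec<char> = text.chars().filter(|c| !c.is_whitespace()).collect();
    if chars.len() < 2 {
        // Too short for a bigram: fall back to single characters.
        return chars.iter().map(|c| c.to_string()).collect();
    }
    chars
        .windows(2) // overlapping pairs: 中文, 文搜, 搜索, ...
        .map(|w| w.iter().collect::<String>())
        .collect()
}

fn main() {
    let tokens = bigram_tokenize("中文搜索");
    // Joining with spaces lets a simple whitespace-splitting indexer
    // downstream treat each bigram as a separate term.
    println!("{}", tokens.join(" ")); // 中文 文搜 搜索
}
```

The bigram trade-off: no dictionary is needed and recall is good, at the cost of some noisy terms (e.g. 文搜 above) that a dictionary segmenter would avoid.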
This might be a silly question: does Rucene support Chinese character indexing and searching?
I don't see any tokenizer under https://github.com/zhihu/rucene/tree/master/src/core/analysis