Could ack speed up with a per-directory cache of some kind? #333
I've played with that in my head for quite a while, but never actually tried it. The indexing tools at https://beyondgrep.com/more-tools/ may give you ideas about what you could use. Also, if you're looking for functions and variables a lot, using …
Aside from the cache question: How many hundreds of MBs do you have, and how long are searches taking? One thing we've had trouble with over the years is that some folks have systems where ack takes far longer than we would expect it to, and we haven't been able to figure out why. I'm wondering if you might be in that situation as well. See #194 for example.
Tuning the OS file-cache reservation and/or switching from spinning iron oxide to SSD can greatly improve read speed. I have doubts about one tool having both a cached-index mode and a grep mode, and I wouldn't want to give up extemporaneous use of ack. I've installed swish-e, and I've even experimented with scanning files matched by swish-e with ack. Before I can make full use of it, I need to figure out how to capture metadata about a document along with its contents, and what my needed metadata schema is ... ugh. I should remember from 20+ years ago in the late Web 1.0 era, when I was buying a bleeding-edge indexing engine, that Information Retrieval is NOT as easy as it looks!
I hope this isn't too naive, but I couldn't find anything on it. I have a directory with hundreds of MBs of source code, and every search takes a long time. Would it be possible to keep a saved cache index for a directory and update it only for the files that changed since the last search?
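To make the idea concrete, here is a minimal sketch of one way such a cache could work; this is not anything ack actually does, and the names (`CACHE_NAME`, `search_word`, etc.) are invented for illustration. Each file is summarized as a set of words, cache entries are keyed by mtime so only changed files are re-indexed, and a search reads only files whose word set could contain the query:

```python
import os
import pickle
import re

CACHE_NAME = ".word_cache.pkl"  # hypothetical per-directory cache file
WORD = re.compile(r"\w+")

def index_file(path):
    """Return the set of words in a file, used as a coarse pre-filter."""
    try:
        with open(path, errors="replace") as f:
            return set(WORD.findall(f.read()))
    except OSError:
        return set()

def search_word(root, word):
    """Find lines containing `word`, re-indexing only files whose mtime changed."""
    cache_path = os.path.join(root, CACHE_NAME)
    try:
        with open(cache_path, "rb") as f:
            cache = pickle.load(f)
    except (OSError, pickle.PickleError):
        cache = {}  # first run, or cache unreadable: start fresh
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name == CACHE_NAME:
                continue
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            entry = cache.get(path)
            if entry is None or entry[0] != mtime:
                entry = (mtime, index_file(path))  # file is new or changed
                cache[path] = entry
            if word not in entry[1]:
                continue  # index says the word can't be here; skip the read
            with open(path, errors="replace") as f:
                for lineno, line in enumerate(f, 1):
                    if word in line:
                        hits.append((path, lineno, line.rstrip("\n")))
    with open(cache_path, "wb") as f:
        pickle.dump(cache, f)
    return hits
```

Note the limitation this sketch makes visible: a word-level index can pre-filter only literal whole-word queries, not arbitrary regexes, which is one reason bolting an index onto a grep-style tool is harder than it looks.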