Description
I'm mostly posting this because I'm hitting what looks like a performance problem on my work codebase: its ghcide process is sitting at 20.4 GB of resident memory right now. I can't share the code, but I'm interested in helping fix this performance issue and not quite sure how.
The only approach I can currently think of is to manually pare down a version of the codebase with the same number of modules, functions, and perhaps even similar function bodies. Even then, I'm not sure it would reproduce my issue.
Then I had an idea: we could have a "codebase shape" tool of some sort that reports the number of modules, constructors, and whatever other parts of the GHC AST (which I have no idea about) matter to ghcide's performance.
Then perhaps that could be extended to recreate codebases of that "shape".
That's probably a pretty complex tool, though, and my goal here is to answer: what is the simplest way to make a minimal reproduction of a large private codebase?
This would be simplest from the user perspective, and I expect commercial users with large codebases would have no problem submitting a file describing the "shape" of their codebase in order to debug performance issues.
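To make the idea a bit more concrete, here's a rough sketch of what I mean by a "shape" dump, not a proposal for the actual tool: a tiny script that walks a source tree and tallies a few crude counts using line-based heuristics. A real version would presumably walk the GHC AST inside ghcide itself, and all the names here (`Shape`, `fileShape`, etc.) are made up for the example.

```haskell
{-# LANGUAGE LambdaCase #-}
-- Naive "codebase shape" counter: walks a directory tree and tallies
-- a few crude per-file metrics with line-based heuristics. A real tool
-- would count nodes in the GHC AST instead.
module Main (main) where

import Control.Monad (forM)
import Data.Char (isSpace)
import Data.List (isPrefixOf, isSuffixOf)
import System.Directory (doesDirectoryExist, listDirectory)
import System.Environment (getArgs)
import System.FilePath ((</>))

-- Crude metrics that might matter to ghcide's performance.
data Shape = Shape
  { modules     :: Int
  , dataDecls   :: Int
  , typeClasses :: Int
  , instances   :: Int
  , totalLines  :: Int
  } deriving Show

instance Semigroup Shape where
  Shape a b c d e <> Shape a' b' c' d' e' =
    Shape (a + a') (b + b') (c + c') (d + d') (e + e')

instance Monoid Shape where
  mempty = Shape 0 0 0 0 0

-- Recursively collect every .hs file under a directory.
listHaskellFiles :: FilePath -> IO [FilePath]
listHaskellFiles dir = do
  entries <- listDirectory dir
  fmap concat . forM entries $ \entry -> do
    let path = dir </> entry
    isDir <- doesDirectoryExist path
    if isDir
      then listHaskellFiles path
      else pure [path | ".hs" `isSuffixOf` path]

-- Count declarations by how lines start; this is only a heuristic and
-- will miscount e.g. declarations mentioned inside comments.
fileShape :: FilePath -> IO Shape
fileShape path = do
  ls <- lines <$> readFile path
  let starts p = length [l | l <- ls, p `isPrefixOf` dropWhile isSpace l]
  pure Shape
    { modules     = 1
    , dataDecls   = starts "data " + starts "newtype "
    , typeClasses = starts "class "
    , instances   = starts "instance "
    , totalLines  = length ls
    }

main :: IO ()
main = do
  root <- getArgs >>= \case
    [dir] -> pure dir
    _     -> pure "."
  files <- listHaskellFiles root
  shape <- mconcat <$> mapM fileShape files
  print shape
```

Pointed at a project root, it would print a single `Shape` record, which is roughly the kind of summary I imagine people could paste into an issue without revealing any proprietary code.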
I'm curious what others with more experience with ghcide think of this idea, and whether there are simpler ways to accomplish this.
ghcide 2.0 has been awesome and super fast btw, outside of a few pathological (hopefully, time will tell) cases.