Comment by maxnk

9 hours ago

Related problem I've been exploring lately: finding which files are most worth refactoring, with complexity as one of the inputs.

I've built a small, opinionated tool for that [1]. It ranks files by a "Refactor Priority" score based on structural signals (size, callable burden, cyclomatic complexity, nesting), with churn and co-change data from local git history layered on top.
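To make the idea concrete, here is a minimal sketch of what such a score could look like: a weighted combination of capped structural signals, scaled by churn. The signal names, weights, and thresholds are hypothetical illustrations, not token-map's actual formula.

```python
from dataclasses import dataclass

@dataclass
class FileSignals:
    loc: int             # file size in lines
    callables: int       # number of functions/methods ("callable burden")
    max_complexity: int  # highest cyclomatic complexity in the file
    max_nesting: int     # deepest nesting level
    churn: int           # commits touching the file in recent history

def refactor_priority(s: FileSignals) -> float:
    # Normalize each signal against a rough "pain threshold",
    # capping at 1.0 so no single signal dominates the score.
    structural = (
        0.25 * min(s.loc / 500, 1.0)
        + 0.25 * min(s.callables / 30, 1.0)
        + 0.30 * min(s.max_complexity / 15, 1.0)
        + 0.20 * min(s.max_nesting / 6, 1.0)
    )
    # Churn layered on top: frequently-edited complex files rank highest.
    return structural * (1.0 + min(s.churn / 20, 1.0))

files = {
    "app/api.py": FileSignals(820, 45, 22, 7, 18),
    "app/utils.py": FileSignals(120, 8, 4, 2, 3),
}
ranked = sorted(files, key=lambda f: refactor_priority(files[f]), reverse=True)
print(ranked[0])  # → app/api.py: large, complex, and heavily churned
```

Capping each signal before weighting is one way to keep a single pathological metric (say, one 2000-line file) from drowning out the others in the ranking.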

It's more of an exploratory tool than a general solution, but it's been practically useful for quickly spotting painful files.

Part of why I built it: keeping coding agents in check. They tend to produce code that gets complex fast, don't notice the complexity building up, and eventually start making changes that break things. The tool helps me catch files that are getting out of hand before that happens. It can also generate a refactoring prompt explaining why a given file is problematic, as a conversation starter for the agent.

The article gave me a few more metric ideas to try, thanks.

[1] https://github.com/etechlead/token-map