Humanwashed is a one-person project. I read open-source repositories and document when the public 'I did it alone' claim doesn't hold up under forensic review. No organization, no collective. Just open research.
My review process is public and reproducible. No black-box ML, no secret formula. Three steps, fully documented.
I read READMEs, comments and commit messages. LLMs leave detectable cadences — sentence lengths, transition words, telltale phrasing.
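A toy version of that cadence check, in code. The transition list and the metrics here are illustrative stand-ins, not the full rubric; real stylometry uses much larger phrase tables and per-model baselines.

```python
import re
import statistics

# Illustrative tells only; a real list is longer and model-specific.
TRANSITIONS = ["moreover", "furthermore", "additionally", "it's worth noting"]

def cadence_report(text: str) -> dict:
    """Crude stylometry: sentence-length spread and transition-word density."""
    sentences = [s for s in re.split(r"[.!?]+\s+", text.strip()) if s]
    lengths = [len(s.split()) for s in sentences]
    words = text.lower().split()
    hits = sum(text.lower().count(t) for t in TRANSITIONS)
    return {
        "sentences": len(sentences),
        "mean_len": statistics.mean(lengths) if lengths else 0.0,
        "stdev_len": statistics.stdev(lengths) if len(lengths) > 1 else 0.0,
        "transitions_per_100_words": 100 * hits / max(len(words), 1),
    }
```

Uniformly medium-length sentences with a high transition density is one of the cadences that stands out; no single number is proof on its own.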
Git history doesn't lie. I look at commit timing, file-creation bursts, diff shapes. Hundreds of lines per minute is human-atypical.
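The timing check can be sketched as a pure function over (timestamp, lines-changed) pairs, the kind of data you can parse out of `git log --reverse --numstat`. The 200-lines-per-minute cutoff is an illustrative threshold, not a fixed rule.

```python
from datetime import datetime

def burst_commits(commits, lpm_threshold=200.0):
    """Flag commits whose implied writing rate is human-atypical.

    commits: list of (iso_timestamp, lines_changed) in chronological order.
    Returns indices of commits where lines-per-minute since the previous
    commit exceeds the threshold.
    """
    flagged = []
    times = [datetime.fromisoformat(ts) for ts, _ in commits]
    for i in range(1, len(commits)):
        minutes = (times[i] - times[i - 1]).total_seconds() / 60
        if minutes <= 0:
            minutes = 1 / 60  # same-second commits: treat as one second apart
        if commits[i][1] / minutes > lpm_threshold:
            flagged.append(i)
    return flagged
```

Bursts get weighed against context: a vendored dependency or generated lockfile can legitimately land thousands of lines in one commit, so diff shape matters as much as speed.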
Every major model has tells — preferred error messages, variable names, boilerplate. I compare against publicly known patterns.
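Structurally, that comparison is a pattern scan: a table of fingerprint families, each a set of regexes, counted per file. The families and patterns below are hypothetical placeholders; real tells are collected from published model outputs and shift with every model version.

```python
import re

# Hypothetical fingerprint table; entries here are placeholders.
FINGERPRINTS = {
    "generic-assistant": [
        r"As an AI language model",
        r"I hope this helps!",
    ],
    "boilerplate-comment": [
        r'"""\s*This function\b',
        r"# TODO: implement error handling",
    ],
}

def match_fingerprints(source: str) -> dict:
    """Count regex hits per fingerprint family in a source file."""
    return {
        family: sum(len(re.findall(p, source)) for p in patterns)
        for family, patterns in FINGERPRINTS.items()
    }
```

Hits are evidence to weigh, not a verdict: humans copy boilerplate too, which is why this step is cross-checked against the prose and timing signals above.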
When I'm wrong, the correction is louder than the claim. Raw evidence is on the table before anything is published. Maintainers can always send counter-evidence — it lands in the archive alongside the original assessment.
Send the repo link, the claim that made you suspicious, and whatever evidence you've seen. I'll look into it, and if it holds up, it goes into the archive with full notes.