Something is rotting inside GitHub. If you've browsed Trending recently, you've probably noticed: repositories with hundreds of stars that appeared overnight, glowing issue comments that read like they were written by the same person, and contributor profiles with suspiciously perfect green-square grids. Welcome to the Reputation-as-a-Service economy.
This isn't mere vanity-metric inflation. It's a coordinated effort to trick both GitHub's ranking algorithms and human developers into trusting malicious or low-quality code. And despite years of countermeasures, it's getting worse.
The Numbers: Six Million Fake Stars and Counting
In December 2024, researchers from Carnegie Mellon University, Socket, and North Carolina State University published the most comprehensive study of GitHub star fraud to date. Using a detection tool called StarScout, they analyzed GitHub event data from July 2019 to December 2024 and identified six million suspected fake stars across 15,835 repositories.
The trajectory is alarming. Fake star campaigns grew two orders of magnitude in 2024 alone. At their peak in July 2024, 16% of all repositories with star activity were associated with fake star campaigns, with 3,216 repositories and 30,779 participating bot accounts active in a single month.
GitHub responded by purging flagged accounts: roughly 91% of the identified repositories and 62% of the suspected inauthentic accounts were deleted by October 2024. But researchers found new clusters appearing faster than old ones could be removed. The purge didn't solve the problem; it just reset the scoreboard.
The Star-Farming Marketplace
GitHub stars are now a commodity. Star-selling services operate openly on Fiverr, Telegram groups, and gray-market forums. The pricing is surprisingly cheap:
- 50-100 stars: $5-10
- 500-1,000 stars: $25-64
- Premium packages with "natural" delivery over weeks: $100-200
- "Star insurance" (replacements if GitHub purges them): extra $20-50
- Aged accounts with achievements and commit history: up to $5,000 each
Some services even sell complete engagement packages: stars, forks, watchers, and issue comments bundled together. Every vanity metric on a GitHub profile has been monetized.
Why it works: High star counts push repositories up in GitHub Trending and search results. A repository that jumps from 10 to 500 stars in a week gets algorithmic amplification, landing on the front page where real developers discover it. The fake stars create a self-reinforcing cycle: bot engagement attracts real attention, which attracts real engagement. By the time anyone investigates, the repo looks legitimate.
The Bots Have Evolved
These aren't empty accounts anymore. The first generation of star-farming bots was obvious: no avatar, no bio, no repositories, accounts created the same week as the stars they cast. GitHub's detection caught most of them.
The current generation is far more sophisticated. Star-farming operations now use "aged" profiles with months of fake commit history, forked repositories, bio text, profile photos (often AI-generated), and followers. Some even have README files and links to personal websites. The accounts are "seasoned" for 60-90 days before activation, building a history that bypasses automated detection.
The StarScout researchers identified the key behavioral patterns that distinguish fake accounts: they show minimal organic activity, star repositories in coordinated bursts, and cluster around the same small set of target repos. But the operators are adapting, spacing out their actions and diversifying their targets to look more organic.
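The "coordinated bursts" pattern lends itself to a simple cohort check: bucket star events by repository and time window, and look for unusually large groups of distinct accounts landing in the same bucket. This is a minimal sketch of that idea under assumed data shapes (it is not StarScout's published pipeline, and the tuple format and thresholds are invented for illustration).

```python
from collections import defaultdict

def find_burst_cohorts(star_events, window_seconds=3600, min_cohort=20):
    """Group accounts whose stars on the same repo land in one tight window.

    star_events: iterable of (account, repo, unix_timestamp) tuples.
    Returns [(repo, sorted_accounts)] for every (repo, hour-bucket) pair
    where at least `min_cohort` distinct accounts starred together.
    (Illustrative sketch; real detectors use sliding windows and account
    features, not fixed buckets.)
    """
    buckets = defaultdict(set)
    for account, repo, ts in star_events:
        buckets[(repo, ts // window_seconds)].add(account)
    return [(repo, sorted(accounts))
            for (repo, _bucket), accounts in buckets.items()
            if len(accounts) >= min_cohort]
```

The countermeasure the researchers observed (spacing out actions, diversifying targets) attacks exactly this weakness: spread the same 20 accounts over 20 hours and the buckets never fill.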
LLM-Generated Feedback: The New Social Proof
The most dangerous evolution is fake issue comments and discussions generated by large language models. GPT-4, Claude, and open-source models are being used to create a veneer of legitimacy that is increasingly hard to distinguish from real developer feedback.
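One tell that templated LLM output leaves behind is near-duplicate phrasing across supposedly independent commenters. A cheap first-pass check is pairwise string similarity; the sketch below uses Python's standard-library `difflib` and is an assumption-laden toy, not a production detector (real systems would need embeddings, author metadata, and posting-time correlation).

```python
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicate_comments(comments, threshold=0.85):
    """Flag index pairs of comments that are suspiciously similar.

    Boilerplate LLM praise tends to reuse the same templates; a pairwise
    SequenceMatcher ratio above `threshold` is a crude but cheap signal.
    Returns a list of (i, j) index pairs. O(n^2), so only for small batches.
    """
    flagged = []
    for (i, a), (j, b) in combinations(enumerate(comments), 2):
        if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
            flagged.append((i, j))
    return flagged
```

The obvious limitation cuts both ways: operators can prompt for varied wording, and genuine "+1, works great" comments will collide, so a similarity hit is a reason to look closer, not a verdict.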
