apps/www/_blog/2026-04-28-introducing-osscar-index.mdx
Today we are launching the OSSCAR Index: the Open Source Supabase Commit Analytical Ranking, a quarterly ranking of the fastest-growing open source organizations, measured with a transparent, reproducible methodology. The site, the data, and the scoring code are all open source. The first edition covers Q1 2026 and is live now.
Open source has a ranking problem.
Most "top open source" lists rank by raw totals: stars, downloads, contributors. Those are real signals, and they accumulate for good reasons. They also tell you who was big yesterday, not who is growing today. The list of fastest-growing open source projects looks nothing like the list of largest ones, and right now there is no good way to find it.
We've been working with >commit to create the definitive ranking of open source projects.
A few observations from this quarter's data:
The index ranks GitHub organizations by the rate at which their communities are growing, across three signals:

- new GitHub stars gained over the quarter
- new contributors who showed up over the quarter
- growth in package downloads over the quarter
Each signal is normalized within a division so a 200-person team and a five-person team can be compared fairly. The three normalized scores are then combined into a single composite using an L² norm (the square root of the sum of squares). The L² norm rewards standout growth on a single signal over balanced growth across the board, and it doesn't heavily penalize projects that are missing one metric entirely, such as a library without a published package.
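As a sketch of what the within-division normalization might look like, here is min–max scaling to a 0–100 range. This is an illustration, not the index's actual method: the real pipeline is in commitvc/osscar and may normalize differently (for example, by percentile rank).

```typescript
// Hypothetical sketch: scale each raw signal (e.g. stars gained this
// quarter) to 0-100 within a division using min-max normalization.
// The project with the most growth maps to 100, the least to 0.
function normalize(values: number[]): number[] {
  const min = Math.min(...values);
  const max = Math.max(...values);
  if (max === min) return values.map(() => 0); // degenerate division
  return values.map((v) => (100 * (v - min)) / (max - min));
}

// Four Emerging-division projects' star growth this quarter.
// Each is compared only against its own division, so small teams
// are never normalized against the largest organizations.
const emergingStarGrowth = [400, 120, 50, 10];
const normalized = normalize(emergingStarGrowth);
// normalized[0] === 100 (division leader), normalized[3] === 0
```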
The OSSCAR Index focuses on growth, not size. A project with 800 stars that doubled in a quarter can outrank a project with 80,000 stars that added 5%.
Ranking a new AI agent framework against Kubernetes is not useful. So the index splits organizations into two independent leaderboards based on their star count at the start of the quarter:
Divisions lock at quarter-start. Cross the threshold mid-quarter and you still compete in Emerging that cycle. This keeps the peer group fair and prevents projects from gaming which division they sit in.
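The locking rule is simple enough to sketch in a few lines. Note that the star threshold and the "Established" division name below are placeholders for illustration; the real boundary and division names are on the methodology page.

```typescript
// Hypothetical sketch of division locking. The 10,000-star boundary
// and the "Established" label are placeholders, not the index's
// real threshold or naming.
type Division = "Emerging" | "Established";

function assignDivision(starsAtQuarterStart: number): Division {
  const THRESHOLD = 10_000; // placeholder value
  return starsAtQuarterStart < THRESHOLD ? "Emerging" : "Established";
}

// Only the quarter-start snapshot matters: a project that opens the
// quarter at 9,500 stars competes in Emerging all cycle, even if it
// crosses the boundary mid-quarter.
const division = assignDivision(9_500);
```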
The score answers a simple question: how much faster is this project growing this quarter than its peers?
For each project we look at three things over the course of the quarter: how many new GitHub stars it picked up, how many new contributors showed up, and how many more package downloads it saw. Each of those gets compared against everyone else in the same division and turned into a number from 0 to 100. Combine them with an L² norm (the square root of the sum of squares) and you get a composite out of ~173, the maximum of 100·√3 ≈ 173.2, reached only by topping the division on all three signals at once.
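The composite arithmetic is easy to check directly. A minimal sketch, assuming a missing signal simply contributes zero (which is one way to read "doesn't heavily penalize" above, not a confirmed detail of the pipeline):

```typescript
// L2 norm of the three normalized signals, each in [0, 100].
// A missing signal is treated here as 0: an assumption about how the
// index handles, say, a library with no published package.
function composite(signals: number[]): number {
  return Math.sqrt(signals.reduce((sum, s) => sum + s * s, 0));
}

composite([100, 100, 100]); // maximum: 100 * sqrt(3) ~= 173.2
composite([100, 100, 0]);   // one missing signal: ~141.4, not 2/3 of the max
composite([60, 60, 60]);    // balanced growth: ~103.9
composite([100, 30, 30]);   // one standout signal: ~108.6, beats the balanced 60s
```

The last two lines show why the L² norm favors a standout signal: a project maxing one metric outranks a project with the same total spread evenly.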
A few things worth knowing:
The full methodology is on the site. The site itself, the data pipeline, and the scoring code are all on GitHub: commitvc/osscar. If you think a weighting is wrong, a data source is missing, or a division boundary should move, propose it. Read the code, open an issue, send a pull request. We mean it.
Supabase is an open source company. We run on the open source ecosystem: Postgres, PostgREST, pgvector, Deno, and dozens more tools. We want that ecosystem to be healthy, visible, and legible to developers, customers, and investors who are trying to find what is working. A good index helps new projects get discovered. Discovery helps contributors show up. Contributors ship features. Features create users. That flywheel is how open source compounds.
If you appear in the Q1 2026 OSSCAR Index: congratulations. You can download a badge from your entry on the website to promote your ranking.
If you think you should be ranked and are not, check the methodology page first. The most common reasons:
We update quarterly. Q2 2026 data collection is already underway.
Three things on the near-term roadmap: