The problem: Stack Overflow's question volume across forty major language, framework, and database tags fell from 1,734,884 in 2020 to 88,256 in 2025 — a 94.9% collapse. Thirty-eight of the forty tags lost 95% or more of their 2020 volume. The 2026 partial-year data, captured 8 May 2026, shows the slide continuing, with every tracked tag on pace for a further 60-85% drop. The platform that was the canonical developer-Q&A site for fifteen years is approaching residual noise.
What is the Stack Overflow question decline? A 2026 ApifyForge audit of the public Stack Exchange API shows Stack Overflow's question volume fell from 1,734,884 in 2020 to 88,256 in 2025 across forty major tags — a 94.9% drop. JavaScript alone fell from 212,859 to 6,344 (-97%). Python from 283,259 to 13,756 (-95%). React from 84,781 to 3,326 (-96%). MySQL from 35,173 to 521 (-98.5%, the largest drop in the sample).
Why it matters: The collapse aligns with the November 2022 launch of ChatGPT. Most tags declined 20-35% from 2020 to 2022 (gradual). Then they declined another 80-95% from 2022 to 2025 (post-ChatGPT). The phase-2 multiplier on phase-1 declines is roughly 4-6× across every tag tested. Whatever was hurting Stack Overflow before ChatGPT, ChatGPT made it five times faster.
Use it when: reporting on AI's impact on developer-knowledge channels, citing tag-level evidence for "Stack Overflow is dying" coverage, briefing engineering leadership on Q&A-channel migration, or sourcing a multi-year quantitative anchor for an "AI killed [X]" feature.
Key findings
- Combined volume across 40 major tags: 1,734,884 (2020) → 88,256 (2025) — a 94.9% decline.
- JavaScript fell 97%: 212,859 (2020) → 6,344 (2025). On pace for ~2,047 in 2026 — a 100:1 collapse over six years.
- MySQL is the canary: -98.5% from 2020 to 2025, the largest drop in the sample. 521 questions in 2025 across the entire global MySQL developer population.
- 38 of 40 tags lost 95%+ of their 2020 volume by 2025. Only `openai-api` and `langchain` show net positive growth from 2020 — and both peaked in 2023 and are now declining.
- The ChatGPT cliff is visible in the data: Phase 1 (2020-2022) saw a 20-35% gradual decline. Phase 2 (2022-2025) saw an 80-95% additional decline — a 4-6× acceleration.
- 2026 partial-year data shows no floor: every tracked tag is on pace for another 60-85% drop in 2026, including `openai-api` itself.
In this article: Quick answer · The 40-tag leaderboard · The ChatGPT cliff · No safe modern tag · AI tags are dying too · MySQL the canary · The 2026 trajectory · Methodology · Caveats · FAQ
Quick answer
- What it is: A six-year tag-by-tag question-volume audit of Stack Overflow (2020-2025) plus 2026 partial-year data through 8 May 2026.
- Sample: 40 major language, framework, database, web, mobile, DevOps, and AI tags.
- Source: Stack Exchange API v2.3 `/search/advanced` endpoint, queried with custom filter `!nNPvSNVZJS` for the `total` field.
- What "decline" means: the count of new questions tagged with the specified tag, created within the calendar year, returned by the SE API. Existing-question views and answer activity are not counted.
- Main caveat: Stack Overflow's traffic has been declining since approximately 2014 per public estimates. ChatGPT did not start the decline. It accelerated an existing one by roughly 5×.
Headline drops for the five most-cited tags:

| Tag | 2020 | 2025 | Drop |
|---|---|---|---|
| python | 283,259 | 13,756 | -95.1% |
| javascript | 212,859 | 6,344 | -97.0% |
| reactjs | 84,781 | 3,326 | -96.1% |
| mysql | 35,173 | 521 | -98.5% |
| sql | 52,062 | 1,924 | -96.3% |
What is the Stack Overflow question decline?
Definition (short version): The Stack Overflow question decline is the multi-year, near-universal collapse in new-question volume on the platform between 2020 and 2025, with a sharp acceleration after ChatGPT's November 2022 launch — a pattern that, on the audited 40-tag corpus, averages a 96% drop and shows no floor in 2026 partial-year data.
The decline is uniform across language, framework, database, mobile, DevOps, and AI tags. It is not a "Java is dying" story or a "PHP is dying" story. The decline applies to JavaScript, Python, React, AWS, Kubernetes, and PyTorch alike. The platform's question intake fell ~95% across most of its surface area, regardless of which technology generation the tag represents.
Also known as: Stack Overflow collapse, the AI Q&A migration, the ChatGPT cliff, Stack Overflow's question drought, the SO long-tail collapse, the dev-Q&A channel shift.
The 40-tag leaderboard 2020-2025
Sorted by 2020 baseline volume, descending. The biggest tags first.
| Tag | 2020 | 2021 | 2022 | 2023 | 2024 | 2025 | Drop 2020→2025 |
|---|---|---|---|---|---|---|---|
| python | 283,259 | 250,202 | 223,907 | 112,281 | 51,442 | 13,756 | -95.1% |
| javascript | 212,859 | 178,342 | 149,671 | 72,347 | 29,367 | 6,344 | -97.0% |
| java | 119,637 | 87,062 | 68,087 | 39,219 | 18,296 | 5,801 | -95.2% |
| c# | 87,155 | 66,417 | 62,156 | 39,110 | 20,865 | 6,003 | -93.1% |
| reactjs | 84,781 | 81,311 | 78,635 | 41,302 | 16,535 | 3,326 | -96.1% |
| html | 83,055 | 67,966 | 55,196 | 28,617 | 12,212 | 3,044 | -96.3% |
| android | 73,720 | 53,848 | 42,147 | 25,397 | 14,146 | 4,656 | -93.7% |
| php | 65,647 | 48,067 | 36,000 | 18,718 | 9,133 | 2,179 | -96.7% |
| c++ | 57,761 | 45,526 | 37,251 | 20,386 | 12,459 | 4,735 | -91.8% |
| node.js | 57,298 | 48,849 | 40,364 | 21,205 | 9,271 | 2,106 | -96.3% |
| css | 55,078 | 47,226 | 39,930 | 22,634 | 10,031 | 2,850 | -94.8% |
| sql | 52,062 | 43,160 | 36,968 | 18,681 | 8,301 | 1,924 | -96.3% |
| angular | 42,932 | 29,157 | 22,620 | 13,652 | 8,129 | 2,147 | -95.0% |
| mysql | 35,173 | 25,994 | 21,036 | 9,370 | 3,480 | 521 | -98.5% |
| django | 34,589 | 28,434 | 22,463 | 10,975 | 4,835 | 1,006 | -97.1% |
| typescript | 33,773 | 33,046 | 34,192 | 22,432 | 11,048 | 3,048 | -91.0% |
| swift | 33,300 | 22,552 | 17,732 | 10,431 | 5,654 | 1,996 | -94.0% |
| ios | 29,986 | 20,163 | 15,507 | 9,709 | 5,595 | 2,627 | -91.2% |
| laravel | 28,838 | 22,357 | 16,808 | 9,287 | 4,755 | 1,179 | -95.9% |
| react-native | 23,558 | 19,189 | 17,126 | 9,612 | 4,910 | 1,902 | -91.9% |
| spring-boot | 23,200 | 19,210 | 17,587 | 13,588 | 7,027 | 2,271 | -90.2% |
| vue.js | 21,705 | 18,313 | 13,183 | 6,723 | 3,008 | 661 | -97.0% |
| amazon-web-services | 21,653 | 19,885 | 17,262 | 11,294 | 5,285 | 1,205 | -94.4% |
| docker | 20,368 | 18,036 | 16,715 | 11,530 | 6,137 | 1,573 | -92.3% |
| postgresql | 19,358 | 18,118 | 16,479 | 10,759 | 5,749 | 1,561 | -91.9% |
| mongodb | 18,599 | 15,614 | 13,626 | 7,034 | 2,611 | 500 | -97.3% |
| kotlin | 16,836 | 16,331 | 15,712 | 10,581 | 5,717 | 1,759 | -89.6% |
| regex | 16,795 | 11,906 | 9,487 | 4,825 | 1,843 | 501 | -97.0% |
| tensorflow | 14,127 | 10,034 | 7,184 | 3,065 | 1,475 | 355 | -97.5% |
| git | 11,857 | 9,593 | 7,866 | 5,274 | 2,913 | 918 | -92.3% |
| kubernetes | 10,821 | 9,565 | 8,309 | 5,085 | 1,913 | 435 | -96.0% |
| flask | 9,129 | 6,353 | 4,401 | 3,106 | 1,710 | 345 | -96.2% |
| go | 7,906 | 7,744 | 8,317 | 5,676 | 2,606 | 660 | -91.7% |
| machine-learning | 7,770 | 5,105 | 4,402 | 2,901 | 1,492 | 361 | -95.4% |
| ruby | 7,433 | 5,376 | 4,924 | 2,468 | 1,112 | 287 | -96.1% |
| rust | 5,054 | 5,957 | 7,626 | 6,110 | 3,498 | 1,287 | -74.5% |
| pytorch | 4,514 | 4,819 | 4,448 | 3,177 | 1,771 | 462 | -89.8% |
| next.js | 3,284 | 7,343 | 10,398 | 10,648 | 6,361 | 1,568 | -52.3% (peak 2023) |
| openai-api | 14 | 28 | 160 | 1,461 | 704 | 184 | +1214.3% (peak 2023) |
| langchain | 0 | 1 | 0 | 963 | 786 | 213 | (new since 2023, peak 2023) |
Combined volume across all 40 tags:
- 2020: 1,734,884 questions
- 2025: 88,256 questions
- Decline 2020 → 2025: -94.9%
The two tags that show net positive growth from 2020 — openai-api and langchain — both peaked in 2023, the year ChatGPT and the LangChain ecosystem hit mainstream developer awareness. By 2025 they are also in active decline. Stack Overflow did not capture the AI-tooling Q&A market either.
2026 partial-year projections
Through 128 days of 2026 (35% of the year, captured 8 May 2026). Projecting forward at the current rate gives an indicative full-year figure. Direction, not precision.
| Tag | 2020 | 2025 | 2026 actual (128 days) | 2026 projected full year | Projected vs 2025 |
|---|---|---|---|---|---|
| python | 283,259 | 13,756 | 1,638 | ~4,672 | -66% |
| javascript | 212,859 | 6,344 | 718 | ~2,047 | -68% |
| java | 119,637 | 5,801 | 721 | ~2,056 | -65% |
| c# | 87,155 | 6,003 | 673 | ~1,919 | -68% |
| reactjs | 84,781 | 3,326 | 173 | ~493 | -85% |
| typescript | 33,773 | 3,048 | 242 | ~690 | -77% |
| html | 83,055 | 3,044 | 377 | ~1,075 | -65% |
| css | 55,078 | 2,850 | 324 | ~924 | -68% |
| sql | 52,062 | 1,924 | 243 | ~693 | -64% |
| openai-api | 14 | 184 | 25 | ~71 | -61% |
The pattern: every tag tested is on pace for another 60-85% decline in 2026. Stack Overflow's question volume is not bottoming out. Even openai-api is projected to fall 61% — meaning developers asking questions about ChatGPT's own API are not asking them on Stack Overflow either.
If the projected 2026 figures hold, the cumulative 2020 → 2026 decline will be 99%+ for every major tag tested. JavaScript would go from 212,859 to roughly 2,000 — a 100:1 collapse over six years.
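The linear projection used for every "projected full year" figure in this section is simple enough to show inline. A minimal sketch, assuming the 128-day scaling described in the methodology (function name is illustrative):

```python
# Linear full-year projection from a partial-year count, as used for the
# 2026 figures above. Assumes the Jan 1 - May 8 rate holds all year.

DAYS_ELAPSED = 128  # 1 Jan through 8 May 2026, the capture date

def project_full_year(partial_count: int, days_elapsed: int = DAYS_ELAPSED) -> int:
    """Scale a partial-year question count to an indicative full-year figure."""
    return round(partial_count * 365 / days_elapsed)

print(project_full_year(718))  # javascript: 718 so far -> 2047 (the ~2,047 above)
print(project_full_year(173))  # reactjs: 173 so far -> 493 (the ~493 above)
```

The same one-liner reproduces the rest of the projected column, small rounding differences aside.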
Story A — The ChatGPT cliff: when each tag's collapse accelerated
ChatGPT was released on 30 November 2022 (OpenAI announcement). The data shows two distinct decline phases that sit on either side of that date.
Phase 1 (gradual, 2020-2022): Most tags declined 20-35% over three years. This is consistent with public reports that Stack Overflow's site traffic peaked around 2014 and had been gradually declining for years before generative AI tooling existed.
Phase 2 (accelerated, 2022-2025): Every tag in the sample declined another 80-95% from end-of-2022 through end-of-2025. ChatGPT, GitHub Copilot (general availability June 2022), Claude, Gemini, and Cursor absorbed the question traffic that previously landed on Stack Overflow.
The phase-2 multiplier on phase-1 declines is roughly 4-6× across every tag.
| Tag | 2020 | 2022 | 2025 | Phase 1 (2020→2022) | Phase 2 (2022→2025) |
|---|---|---|---|---|---|
| javascript | 212,859 | 149,671 | 6,344 | -29.7% | -95.8% |
| python | 283,259 | 223,907 | 13,756 | -21.0% | -93.9% |
| reactjs | 84,781 | 78,635 | 3,326 | -7.2% | -95.8% |
| sql | 52,062 | 36,968 | 1,924 | -29.0% | -94.8% |
| html | 83,055 | 55,196 | 3,044 | -33.5% | -94.5% |
| typescript | 33,773 | 34,192 | 3,048 | +1.2% | -91.1% |
| docker | 20,368 | 16,715 | 1,573 | -17.9% | -90.6% |
| kubernetes | 10,821 | 8,309 | 435 | -23.2% | -94.8% |
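The 4-6× multiplier is easiest to reproduce as a ratio of annualized decline rates, since phase 1 spans two years and phase 2 spans three. That interpretation is an assumption — the post states the multiplier without showing the arithmetic — but it lands in the stated band for the headline tags:

```python
# Annualized (compound) decline rates for the two phases, from the table
# above. Phase 1 covers 2020->2022 (2 years); phase 2 covers 2022->2025
# (3 years). Reading the 4-6x multiplier as a ratio of annualized rates
# is this sketch's assumption, not something the post spells out.

def annualized_decline(start: int, end: int, years: int) -> float:
    """Compound annual decline rate, as a positive fraction."""
    return 1 - (end / start) ** (1 / years)

counts = {  # tag: (2020, 2022, 2025) question counts
    "javascript": (212_859, 149_671, 6_344),
    "python": (283_259, 223_907, 13_756),
}

for tag, (y2020, y2022, y2025) in counts.items():
    p1 = annualized_decline(y2020, y2022, 2)
    p2 = annualized_decline(y2022, y2025, 3)
    print(f"{tag}: phase 1 {p1:.1%}/yr, phase 2 {p2:.1%}/yr, {p2 / p1:.1f}x")
```

JavaScript works out to roughly 4× and Python to roughly 5.5× under this reading — inside the 4-6× band quoted throughout the post.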
TypeScript is the cleanest case in the table: it gained 1.2% over the pre-ChatGPT phase 1 (one of the only tags actively gaining mindshare), then collapsed 91% over phase 2. A growing technology cannot insulate its tag from the channel-level collapse. The collapse is happening to Stack Overflow, not to the technologies.
The framing matters for fair coverage. Stack Overflow was already declining. ChatGPT did not start the trend. ChatGPT made the existing trend roughly five times faster.
Story B — The "modern" tags weren't safe either
A reasonable counter-hypothesis: maybe legacy languages declined while modern tags held. The data does not support that. The two tags often cited as the modern winners — Next.js and Rust — peaked late and then collapsed.
- Next.js: 3,284 (2020) → peak 10,648 (2023) → 1,568 (2025). The 2023 peak coincides with the Next.js App Router release. Then 85% decline through 2025. The 2026 projection puts Next.js at ~493 questions for the full year — a 95% drop from its 2023 peak.
- Rust: 5,054 (2020) → peak 7,626 (2022) → 1,287 (2025). Peaked just before ChatGPT launch. Then 83% decline.
Next.js is particularly damaging to the "Stack Overflow is fine, only the dying tags are declining" counter-narrative. Next.js was actively winning the React-meta-framework competition in 2023. The technology was healthy. The Stack Overflow channel was not.
There is no safe tag on Stack Overflow. Even tags actively gaining mindshare in their respective ecosystems are losing question volume on the platform.
Story C — The AI tags peaked and are now dying too
If Stack Overflow were holding its position as the canonical Q&A site for any developer question, the AI-tool tags should be experiencing explosive growth — the kind of thing JavaScript saw in 2010-2015 as web development scaled. They are not.
- openai-api: 14 (2020) → 28 (2021) → 160 (2022) → 1,461 (2023, peak) → 704 (2024) → 184 (2025) → ~71 projected (2026). Peak 2023, down 87% from peak by 2025.
- langchain: 0 (2020) → 0 (2022) → 963 (2023, peak) → 786 (2024) → 213 (2025). Peak 2023, down 78% from peak by 2025.
Both AI tags peaked the year their underlying tools became mainstream and are now collapsing alongside everything else. The mechanism is straightforward: developers asking questions about ChatGPT, Claude, or LangChain ask the AI itself, not a Stack Overflow forum. The tools have eaten their own marketing surface.
This is the strongest single signal in the dataset. A platform whose Q&A position is structurally healthy should capture the questions about a brand-new technology category. Stack Overflow captured roughly one year of that traffic, then lost it.
Story D — MySQL is the canary
The largest single-tag drop in the sample is MySQL: -98.5% from 2020 to 2025.
- 35,173 questions in 2020.
- 521 questions in 2025.
521 questions in 2025 represents roughly 1.4 questions per day about MySQL across the entire global MySQL developer population. MySQL is one of the most widely-deployed databases globally. ChatGPT alone reportedly serves over 100 million weekly users, and even at vanishingly small per-user query rates the volume of MySQL questions in the world is far higher than 1.4 per day.
This is not a story about MySQL declining as a technology. This is a story about Stack Overflow losing the canonical-database-Q&A position by 98.5%. Every database tag in the sample tells a similar story: PostgreSQL -91.9%, MongoDB -97.3%, SQL -96.3%. Database questions specifically have nearly fully migrated off the platform.
Story E — The 2026 trajectory shows no floor
The 2026 partial-year data — captured 8 May 2026, covering 128 days of the calendar year — projects a further 60-85% decline on top of 2025's already-decimated baseline.
There is no leveling off in the data. Every tag tested is still declining at the same compounding rate. The platform's question intake is approaching residual noise.
If the projected 2026 figures hold, the cumulative 2020 → 2026 decline will be 99%+ for every major tag tested:
| Tag | 2020 | 2026 projected | Cumulative 6-year drop |
|---|---|---|---|
| python | 283,259 | ~4,672 | -98.3% |
| javascript | 212,859 | ~2,047 | -99.0% |
| java | 119,637 | ~2,056 | -98.3% |
| reactjs | 84,781 | ~493 | -99.4% |
| html | 83,055 | ~1,075 | -98.7% |
JavaScript on the projected 2026 trajectory: 212,859 (2020) → 2,047 (2026). A 100:1 collapse in six years.
Methodology
- Aggregation tool: `bash scripts/ms research so-tag-counts` — an ApifyForge ms-CLI subcommand at `scripts/ms_cli/commands/research_so_tag_counts.py`. Hits the Stack Exchange API directly. Anonymous, public, free.
- Underlying API: Stack Exchange API v2.3 `/search/advanced` endpoint, with custom filter `!nNPvSNVZJS` that exposes the `total` field. The `total` is the count of matching questions for the date range and tag, returned in the response envelope.
- Query shape: For each (tag, year) pair, request `pagesize=1` with a date range covering the calendar year. Parse `total` from the response. Question records themselves are not returned. One API quota unit per pair.
- Coverage: 40 tags × 6 full years (2020-2025) = 240 calls, plus 10 top tags × 2026 partial = 250 total calls. Stack Exchange anonymous quota is 300/day; the sweep used 246 with 54 remaining.
- Date ranges: Each year's `fromdate` and `todate` are calendar-year UTC timestamps (Jan 1 00:00:00 to Dec 31 23:59:59 UTC). 2026 partial uses 2026-01-01 to 2026-12-31 — Stack Exchange returns counts only for actual question creation dates within range, so the result is automatically Jan 1 → today.
- Tag scope: 40 tags chosen as representative across major language tags, frontend frameworks, backend frameworks, databases, web fundamentals, mobile, DevOps/infra, ML/AI tools, and tooling. Editorial selection — "the tags people would name in a Stack Overflow conversation" — not a frequency-ranked seed.
- Reproduction: Every figure in this post is re-fetchable from the public Stack Exchange API. No proprietary data, no rate-limited paid access, no scraping. The CSV and JSON outputs are also archived at `storage/research/so-tag-decline/` in the ApifyForge repo.
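The query shape above can be sketched as a single stdlib-only function. The endpoint, parameters, and filter string are the ones named in this methodology; everything else (function names, gzip handling) is illustrative, not the actual ms-CLI implementation:

```python
# One quota unit per (tag, year) pair: request pagesize=1 with the custom
# filter and read `total` from the response envelope.
import gzip
import json
from datetime import datetime, timezone
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://api.stackexchange.com/2.3/search/advanced"
FILTER = "!nNPvSNVZJS"  # exposes the `total` field

def year_bounds(year: int) -> tuple[int, int]:
    """Calendar-year UTC boundaries as Unix timestamps."""
    start = datetime(year, 1, 1, tzinfo=timezone.utc)
    end = datetime(year, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
    return int(start.timestamp()), int(end.timestamp())

def question_count(tag: str, year: int) -> int:
    fromdate, todate = year_bounds(year)
    query = urlencode({
        "site": "stackoverflow",
        "tagged": tag,
        "fromdate": fromdate,
        "todate": todate,
        "filter": FILTER,
        "pagesize": 1,  # records themselves aren't needed, only the count
    })
    with urlopen(f"{API}?{query}") as resp:
        body = resp.read()
        # The Stack Exchange API serves gzip-compressed responses
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)
    return json.loads(body)["total"]

# e.g. question_count("javascript", 2020) -> the 2020 javascript count
```

Looping this over 40 tags × 6 years stays comfortably inside the anonymous 300-calls/day quota.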
Why this isn't a job for the stackexchange-search actor
ApifyForge's stackexchange-search Apify actor returns individual question records — title, body, accepted-answer text, vote counts, dates. It is the right tool for question-content extraction (top questions for a tag, accepted-answer mining, sentiment analysis, scoring developer questions for backlog automation). For tag totals, however, paginating through 200,000+ JavaScript questions per year would mean tens of thousands of API calls per tag and a full-day quota burn for a single column of this leaderboard.
The new ms-CLI command sits alongside the actor for aggregation use cases. Question-content extraction → use the actor. Tag-volume audit → use the new command. Both target the same upstream API, just with different request shapes.
What the data does NOT support
Several inferences a journalist might want to draw from this dataset are not supported by it. Honest framing requires acknowledging them.
1. "Stack Overflow is dying because of ChatGPT alone." Not isolated causation. Stack Overflow's traffic peaked around 2014 per public estimates and was already declining gradually before generative AI existed. GitHub Copilot reached general availability in June 2022, five months before ChatGPT. The 2022-2023 macroeconomic developer-hiring slowdown reduced overall developer activity. Stack Overflow's own 2023 moderator strike over AI-content policy may have suppressed posting. Reddit's /r/programming and Discord communities absorbed some traffic. Multiple causal factors overlap. ChatGPT correlates most strongly in the data, but it is not the only factor.
2. "Stack Overflow has fewer but higher-quality questions now." Possible counter-hypothesis the data does not address. This audit measures volume, not quality. Acceptance rates, accepted-answer rates, and answer-vote distributions are not in the dataset. If Stack Overflow's leadership wants to argue this counter-hypothesis, that data exists in the platform's own internal analytics and could be published.
3. "All developer Q&A migrated to ChatGPT." Partial. The data confirms Stack Overflow's loss of share. It does not confirm where the questions went. ChatGPT, Claude, Gemini, GitHub Copilot Chat, Cursor, and Perplexity are all candidates. Reddit, Discord servers, GitHub Discussions, and language-specific forums (the Rust users forum, the Python Discourse) also absorbed traffic. A complete attribution would require user-level migration data that no single source publishes.
4. "These tags' underlying technologies are dying." No. GitHub repository activity for the same tags does not show a comparable collapse. JavaScript, Python, React, MySQL, Kubernetes, and PostgreSQL all still drive enormous repository activity, package downloads, and job postings. The decline is in the Stack Overflow channel specifically, not in the technologies themselves.
5. "2026 will end at the projected figures." Linear extrapolation. The "projected full year" figures in this post assume the rate observed Jan 1 - May 8 holds for the remaining 8 months. Stack Overflow question volume historically has slight seasonality (summer / December dips), so actual full-year 2026 figures will differ. The projection is directional, not precise.
6. "The 40-tag selection proves it for every tag." Editorial selection. A different 40-tag seed would shift the average decline figure but not the shape of the story. The largest tags by 2020 baseline (javascript, python, java, c#) are mainstream choices that no editor would dispute. The full top-100 by 2020 baseline would tighten the claim further.
Common misconceptions
- "Stack Overflow's traffic was always going to decline; this is normal." Public estimates show Stack Overflow traffic peaking around 2014 and declining gradually since. The 2020-2022 phase-1 decline (20-35%) is consistent with that trajectory. The 2022-2025 phase-2 decline (80-95%) is roughly five times faster. "Normal decline" does not explain the post-November-2022 acceleration.
- "This is just JavaScript questions, not the whole platform." No. The audited 40 tags span every major category — language, framework, database, mobile, DevOps, AI. 38 of 40 lost 95%+ of their 2020 volume. The pattern is platform-wide, not language-specific.
- "Maybe new questions are being asked on tags I haven't audited." Possible but improbable to overturn the headline. The audited tags cover the largest baseline volumes on the platform. A long-tail tag uplift would not approach the 1.6 million-question gap between 2020 and 2025 totals.
- "ChatGPT also peaked and is declining, so this proves nothing." ChatGPT's user base reportedly continues to grow as of early 2026. The
openai-apitag's decline is a Stack Overflow-channel decline for OpenAI questions, not a decline in OpenAI usage itself. The two are different signals.
Glossary
- Stack Exchange API — the public REST interface to Stack Overflow and the Stack Exchange network. Free anonymous quota of 300 calls/day per IP, higher with an API key.
- `/search/advanced` — the Stack Exchange endpoint that returns questions matching specified filters (tags, date range, accepted status, etc.) with a `total` count when an appropriate custom filter is used.
- Custom filter `!nNPvSNVZJS` — a Stack Exchange API filter that exposes the `total` field in the response envelope, allowing per-(tag, year) counts without paginating through individual question records.
- Phase 1 / Phase 2 split — the editorial framing in this post, dividing the decline into pre-ChatGPT (2020-2022) and post-ChatGPT (2022-2025) periods.
- Tag — Stack Overflow's per-question metadata label, e.g., `javascript`, `mysql`, `kubernetes`, used for question categorisation and search.
- Cumulative decline — the percentage drop from a tag's 2020 baseline to its 2025 actual or 2026 projected figure.
Press lift-out paragraph for journalists
For trade-press, newsletter, and analyst use:
> A 2026 ApifyForge audit of the public Stack Exchange API found Stack Overflow's question volume across 40 major language and framework tags fell from 1,734,884 in 2020 to 88,256 in 2025 — a 94.9% decline. Tag-level drops include JavaScript -97% (212,859 → 6,344), Python -95% (283,259 → 13,756), React -96% (84,781 → 3,326), Java -95% (119,637 → 5,801), MySQL -98.5% (35,173 → 521), and SQL -96% (52,062 → 1,924). The collapse aligns with ChatGPT's release in November 2022: most tags declined 20-35% from 2020 to 2022, then 80-95% from 2022 to 2025. The 2026 partial-year data (through 8 May 2026) projects another 60-85% decline on top of the 2025 baseline, indicating the trend has not bottomed out.
Mini case study — JavaScript, the headline tag
Before: JavaScript was the largest single-language tag on Stack Overflow by question volume in 2020, with 212,859 new questions tagged that year. It had been the most-asked tag on the platform for most of the previous decade.
Phase 1 change (2020-2022): JavaScript questions fell to 149,671 in 2022 — a 30% decline over three years. Consistent with the gradual platform-wide decline that pre-dates generative AI tooling.
Phase 2 change (2022-2025): JavaScript questions fell from 149,671 (2022) to 6,344 (2025) — a 96% decline over three years. The rate of decline more than tripled from phase 1 to phase 2.
State as of capture (May 2026): JavaScript questions in the first 128 days of 2026 totalled 718. The full-year linear projection is approximately 2,047 — a 100:1 collapse from the 2020 baseline.
These numbers reflect one capture of the public Stack Exchange API on 8 May 2026. Anyone with an internet connection can re-run the same queries and verify the figures. Reproducibility is the point.
What are the alternatives to this kind of audit?
Several methods exist for measuring developer-Q&A platform health. Each has tradeoffs.
| Approach | What it measures | Where it breaks at scale |
|---|---|---|
| Stack Exchange API + tag-total filter (used here) | Year-over-year per-tag question counts | Bound by anonymous 300/day quota; needs filter !nNPvSNVZJS for totals |
| Stack Overflow Annual Developer Survey | Self-reported tool usage, satisfaction, demographics | Survey-respondent skew; published annually, no monthly granularity; usage ≠ Q&A volume |
| Stack Exchange Data Explorer (SEDE) | Direct SQL access to a public mirror of SO data | Requires SQL knowledge; mirror lag; better for rich queries than rapid time-series |
| Public traffic estimate services (Similarweb, etc.) | Site-level traffic, not tag-level | Top-of-funnel proxy only; no tag granularity; commercial paywall |
| Google Trends for tag terms | Search interest for the tag string | Not Q&A volume; conflates different intents (learning, debugging, news) |
| Manual editorial sampling | Curated subjective read of a few flagship tags | Anecdotal; not reproducible; misses long-tail scale |
Pricing and features based on publicly available information as of May 2026 and may change.
Each approach has tradeoffs in granularity, reproducibility, and cost. For a tag-by-tag year-over-year audit with reproducible numbers and no paywall, the Stack Exchange API with the totals filter is the most direct mechanism. For richer follow-up questions — accepted-answer rates, vote distributions, user-level migration — SEDE or the stackexchange-search actor extends the audit further.
Best practices for journalists citing this dataset
- Always pair the 94.9% headline with the 40-tag scope. Frame as "across 40 major language and framework tags" not "across all of Stack Overflow."
- Use a specific named tag with its drop figure as the quote anchor. "JavaScript -97% (212,859 → 6,344)" is more citable than the aggregate.
- Note the phase-1 / phase-2 split when invoking the ChatGPT correlation. Acknowledge the pre-existing decline. The honest version of "ChatGPT killed Stack Overflow" is "ChatGPT made Stack Overflow's existing decline roughly five times faster."
- Date the snapshot. All figures are as of 8 May 2026 captures of the public Stack Exchange API.
- Link to the methodology section for reproducibility — every figure is re-fetchable from the public API.
- Disclose the 2026 projection is linear. The "projected full year" figures assume the Jan 1 - May 8 rate holds, which seasonality may slightly shift.
- Acknowledge multi-causal factors. GitHub Copilot, the developer-hiring slowdown, the 2023 moderator strike, and Reddit / Discord absorption are all overlapping influences.
- Quote the press lift-out paragraph verbatim if the framing needs to be standardised across outlets.
Common mistakes when citing this dataset
- Treating the 94.9% figure as covering all of Stack Overflow. It covers the audited 40-tag corpus. Other tags exist. Restate as "across 40 major tags."
- Attributing 100% of the collapse to ChatGPT. Pre-ChatGPT decline was already underway. Phase 1 (2020-2022) showed 20-35% drops before any AI-coding-tool release.
- Inferring the underlying technologies are dying. They are not — GitHub repo activity for the same tags is robust. The channel is dying, not the technologies.
- Quoting the 2026 projection as actual data. It is a linear extrapolation of partial-year data. Use the word "projected" when citing it.
- Generalising to Stack Exchange network sites. This audit is Stack Overflow main only. Server Fault, Database Administrators, AI Stack Exchange, and the rest of the network were not measured.
- Conflating "fewer questions" with "lower quality." Quality is not in the dataset. Volume only.
Implementation checklist for re-running the audit
- Sign up for nothing — the Stack Exchange API is anonymous-accessible at 300 calls/day.
- Pick the tags you want to audit. The 40-tag seed in this post is a starting point.
- Construct one `/search/advanced` request per (tag, year) pair, using filter `!nNPvSNVZJS`.
- Set `fromdate` and `todate` as Unix timestamps for the year boundary.
- Set `tagged` to the tag string and `pagesize=1`.
- Parse `total` from the JSON response.
- Compute per-tag drops and aggregate totals.
- For 2026 partial-year, divide actual Jan 1 → today count by days elapsed and multiply by 365 for the linear projection.
- Publish the spreadsheet alongside the analysis. Reproducibility is the entire point.
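The final computation steps of the checklist are pure arithmetic. A minimal sketch, using figures quoted in this post (the two sample tags and the 40-tag aggregate come from the leaderboard above):

```python
# Checklist step 8: per-tag drops and the aggregate decline, computed
# from counts already fetched. Sample figures are the ones in this post.

def drop_pct(start: int, end: int) -> float:
    """Percentage decline from the baseline year to the final year."""
    return (1 - end / start) * 100

counts_2020 = {"javascript": 212_859, "mysql": 35_173}
counts_2025 = {"javascript": 6_344, "mysql": 521}

for tag in counts_2020:
    print(f"{tag}: -{drop_pct(counts_2020[tag], counts_2025[tag]):.1f}%")
# javascript: -97.0%, mysql: -98.5%, matching the leaderboard

# The 40-tag aggregate (totals quoted in this post):
print(f"combined: -{drop_pct(1_734_884, 88_256):.1f}%")  # -94.9%
```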
Limitations
- Stack Overflow main only. This audit excludes Server Fault, Super User, Database Administrators, Software Engineering Stack Exchange, AI Stack Exchange, and the rest of the Stack Exchange network. A small fraction of the lost Stack Overflow traffic may have migrated to specialised network sites — the AI Stack Exchange site has roughly 1/100th the traffic of Stack Overflow main and could not absorb the millions of lost questions.
- The Stack Exchange API counts new questions, not engagement. Existing-question views, comments on old questions, and answer activity on past questions are not measured here. A platform whose question intake collapses but whose archive is still heavily read is not the same story as one whose archive is dark too. This audit does not address the second question.
- Tag selection is editorial. The 40-tag seed is a representative cross-section, not a frequency-weighted census of all tags. A full top-100 by 2020 baseline volume is the obvious follow-up.
- Stack Overflow has paywalled some of its data products. As of 2026-05-08 the public Stack Exchange API still returns full historical question counts for free at the 300 calls/day quota. If the platform restricts historical access in future, this audit becomes harder to reproduce.
- 2026 projection is linear. Assumes Jan 1 - May 8 rate holds for the rest of the year. Stack Overflow has slight historical seasonality (summer / December lulls). Treat the projection as directional.
- Volume only, not quality. Acceptance rates, accepted-answer rates, vote distributions, and answer latency are not in this dataset. The "fewer but higher quality" counter-hypothesis is not addressed.
Key facts about the Stack Overflow question decline
- Combined volume across 40 major Stack Overflow tags fell from 1,734,884 in 2020 to 88,256 in 2025 — a 94.9% decline.
- 38 of 40 audited tags lost 95%+ of their 2020 question volume by 2025.
- JavaScript fell from 212,859 (2020) to 6,344 (2025) — a 97% drop.
- Python fell from 283,259 (2020) to 13,756 (2025) — a 95.1% drop.
- MySQL fell from 35,173 (2020) to 521 (2025) — a 98.5% drop, the largest in the sample.
- ChatGPT launched 30 November 2022. Phase-1 decline (2020-2022) averaged 20-35%; phase-2 decline (2022-2025) averaged 80-95% — a 4-6× acceleration.
- The `openai-api` and `langchain` tags both peaked in 2023 and are now declining. Stack Overflow did not capture the AI-tooling Q&A market.
- 2026 partial-year data (through 8 May 2026) projects another 60-85% decline on top of 2025 across every tag tested.
- The audit was generated from the public Stack Exchange API v2.3 with custom filter `!nNPvSNVZJS` for the `total` field.
- Every figure in this post is re-fetchable from the public API at no cost.
Broader applicability
These patterns apply beyond Stack Overflow to any Q&A or knowledge platform whose value depends on user-contributed answers being the canonical reference for a topic:
- AI assistants disintermediate canonical-answer platforms first. The platforms that aggregated answers to repeated developer questions are the ones generative AI replicates most directly. Documentation-as-code wikis, FAQ knowledge bases, and Q&A forums all face the same pressure.
- Network effects work in both directions. A Q&A platform with strong network effects on the way up has equally strong negative network effects on the way down. Fewer questions → fewer answerers → fewer questions.
- Channel health is independent of technology health. The technologies whose Stack Overflow tags collapsed (JavaScript, Python, React, MySQL) are themselves unaffected — their GitHub activity, package downloads, and job postings are robust. Channel decline is a channel signal, not a technology signal.
- The 2026 trajectory data is more important than the 2020-2025 cumulative figures. Cumulative declines are historical; trajectory data answers the question "is this stabilising or accelerating?"
- Reproducibility is the press currency. Audit datasets that anyone can re-run from a public API are easier for journalists to cite than proprietary internal numbers. Publishing the methodology alongside the data is the point.
When you need this analysis
You probably want to reference this dataset if:
- You're writing about generative AI's impact on developer-knowledge channels.
- You're reporting on Stack Overflow specifically — its strategy, its valuation, its acquisition by Prosus, or its 2024 OpenAI partnership.
- You need a tag-by-tag, year-over-year quantitative anchor for an "AI killed [X]" feature.
- You're briefing engineering leadership on developer-Q&A channel migration.
- You want a reproducible, named-tag, dated dataset rather than vibes.
- You're sourcing the methodology for a follow-on analysis (e.g., extending to the full Stack Exchange network, or comparing to GitHub repo activity).
You probably don't need this if:
- You want page-view or unique-visitor traffic data — not in this dataset (question volume only).
- You want acceptance rates or accepted-answer rates — not measured here.
- You want platform revenue or ad-impression figures — Stack Overflow has not published these to the public.
- You want the question content — that's the `stackexchange-search` actor, a separate tool.
- You want a single-cause attribution — multiple factors overlap and the dataset deliberately surfaces that.
How to verify any single tag in this dataset
Each tag's per-year count is re-fetchable from the public Stack Exchange API. To verify any individual figure:
- Visit the Stack Exchange API documentation.
- Use the `/search/advanced` endpoint with these query parameters: `site=stackoverflow`, `tagged=<tag>`, `fromdate=<unix-timestamp>`, `todate=<unix-timestamp>`, `pagesize=1`, `filter=!nNPvSNVZJS`.
- Read the `total` field in the JSON response — that's the per-(tag, year) count.
- Compare to the table in this post.
- For a fast browser-only check, the documentation's interactive request builder will produce the count without writing code.
Frequently asked questions
How was the 94.9% aggregate decline calculated?
It is the sum of new questions across all 40 audited tags, comparing 2020 (1,734,884) to 2025 (88,256). Each per-tag-per-year figure comes from the Stack Exchange API's `/search/advanced` endpoint with custom filter `!nNPvSNVZJS` for the `total` field. The audited tags span major language tags, frontend frameworks, backend frameworks, databases, web fundamentals, mobile, DevOps/infra, and ML/AI tooling — an editorial cross-section, not a frequency-weighted census of all Stack Overflow tags. Re-running the same queries on a different date may produce slightly different figures if Stack Overflow's data layer has minor revisions in flight, but the order of magnitude is stable across multiple captures.
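The percentage arithmetic is simple enough to check by hand; a quick sketch reproducing the headline figures quoted in this post:

```python
# Reproduce the headline decline percentages from the per-tag
# (2020 count, 2025 count) pairs quoted in this post.
def pct_decline(v2020: int, v2025: int) -> float:
    return round((1 - v2025 / v2020) * 100, 1)

assert pct_decline(1_734_884, 88_256) == 94.9  # 40-tag aggregate
assert pct_decline(212_859, 6_344) == 97.0     # javascript
assert pct_decline(283_259, 13_756) == 95.1    # python
assert pct_decline(35_173, 521) == 98.5        # mysql, largest drop
```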
Did ChatGPT cause this collapse on its own?
No. Stack Overflow's traffic peaked around 2014 per public estimates and was already declining when ChatGPT launched in November 2022. The data shows this in the phase-1/phase-2 split: tags dropped 20-35% over 2020-2022 (pre-ChatGPT), then 80-95% over 2022-2025 (post-ChatGPT). ChatGPT correlates with a roughly 4-6× acceleration of the existing decline, but it shares the period with GitHub Copilot's general availability (June 2022), the 2022-2023 developer-hiring slowdown, the 2023 Stack Overflow moderator strike, and Reddit / Discord absorbing some Q&A traffic. Multiple causal factors overlap. The post is careful to acknowledge this throughout.
Why use the new ms-CLI command instead of the stackexchange-search actor?
The `stackexchange-search` Apify actor returns individual question records, which is the right shape for question-content extraction (top questions, accepted answers, sentiment, scoring developer questions for backlog tickets). For tag totals only — the count, not the content — the Stack Exchange API exposes a `total` field via filter `!nNPvSNVZJS` that returns the count in a single API call per (tag, year) pair. Paginating through 200,000+ JavaScript questions per year via the actor would burn an entire day's quota for one column of this leaderboard. The new command sits alongside the actor for aggregation use cases. Different tools, same upstream API.
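The quota arithmetic behind that choice, assuming the API's 100-item page-size cap and the 300 calls/day anonymous quota noted earlier:

```python
import math

PAGESIZE_MAX = 100   # Stack Exchange API per-page cap (assumed here)
DAILY_QUOTA = 300    # anonymous quota noted above

def calls_to_paginate(n_questions: int) -> int:
    """API calls needed to enumerate every matching question."""
    return math.ceil(n_questions / PAGESIZE_MAX)

# One year of javascript questions at 2020 volume:
calls = calls_to_paginate(212_859)
print(f"{calls} paginated calls vs 1 call with the total filter")
print(f"~{math.ceil(calls / DAILY_QUOTA)} days of the free quota")
```

At 2020 volumes a single tag-year can eat days of quota if enumerated, versus one call with the `total` filter — which is the whole case for the aggregation command.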
Are the underlying technologies actually dying?
No. JavaScript, Python, React, MySQL, Kubernetes, and PostgreSQL all continue to drive enormous GitHub repository activity, package-manager downloads, and job-board postings. The Stack Overflow channel is collapsing — not the technologies the channel covered. The Next.js case is the cleanest illustration: Next.js was actively winning the React-meta-framework competition through 2023, and its Stack Overflow tag still collapsed 85% from peak. Channel health and technology health are independent variables.
Will Stack Overflow's question volume bottom out?
No floor is visible in the data through 8 May 2026. Every tag tested in the 2026 partial-year sample is on pace for another 60-85% decline this year. If the projected 2026 figures hold, the cumulative 2020-2026 decline will be 99%+ for every major tag. JavaScript would go from 212,859 (2020) to roughly 2,047 (2026) — a 100:1 collapse over six years. Whether the trajectory eventually flattens depends on factors outside this dataset: Stack Overflow's strategic response, AI assistants' continued ability to answer developer questions accurately, and whether developers eventually return to canonical-answer platforms for problems AI fails on. None of those is visible in question-volume data.
Can I republish these tables?
Yes — the dataset is published for press citation. Attribution to ApifyForge with a link back to this post is appreciated. The press lift-out paragraph above is written for direct quoting. Every figure is re-fetchable from the public Stack Exchange API, so verifying before publication is straightforward — see the methodology section for the exact filter and parameters.
What about the AI Stack Exchange site or GitHub Discussions?
The audited 40-tag corpus is Stack Overflow main only. Some questions that might have landed on Stack Overflow in 2020 may now go to AI Stack Exchange (a separate Stack Exchange network site), GitHub Discussions, language-specific Discourse forums, or Discord servers. The AI Stack Exchange site has roughly 1/100th the traffic of Stack Overflow main per public estimates, so it could not absorb the millions of lost questions. A multi-platform follow-up — the same query against Server Fault, Super User, Database Administrators, and AI Stack Exchange — would test whether the network as a whole is collapsing or whether main is uniquely affected.
A note on the underlying tools
This post is the third in an ApifyForge backlink-bait series documenting an industry channel's decline through public data, alongside The Tech Podcast Cemetery 2026 (an Apple Podcasts dormancy audit of 400 tech-keyword shows) and the SaaS Pricing Time Machine 2020-2026 (a Wayback-driven SaaS pricing-archaeology audit). All three use the same methodology pattern: pick a public dataset, capture it as of a specific date, publish the per-row figures alongside the methodology, and let the named-entity drops do the journalism.
Other posts in the ApifyForge backlink-bait series:
- Defense Contractor Lobbying ROI 2024 — what each $1 of lobbying buys in federal contract dollars.
- FDA 510(k) Shortcut vs PMA 2024 — the regulatory loophole audit.
- CFPB Credit Bureau Complaint Dominance 2024 — the three-bureau complaint share.
- Tech Podcast Cemetery 2026 — 68 dormant tech podcasts, 8,123 abandoned episodes.
- SEC Insider Sales 2024 Leaderboard — the executives selling their own stock.
- SaaS Pricing Time Machine 2020-2026 — six years of stealth SaaS price hikes.
- 2024 Academic Retractions Publisher Leaderboard — which journals retract the most.
- Medical Debt Collection 2024 CFPB Leaderboard — who collects on America's medical debt.
The dataset for this post was generated using `bash scripts/ms research so-tag-counts`, an ApifyForge ms-CLI subcommand that hits the Stack Exchange API directly. For the question-content side of Stack Overflow analysis — top questions, accepted answers, scored developer feedback for backlog automation — the `stackexchange-search` Apify actor is the right tool, documented separately in Stop Reading Stack Overflow Manually.
Ryan Clinton publishes Apify actors and MCP servers as ryanclinton and runs ApifyForge.
Last updated: May 2026
This guide focuses on Stack Overflow's English-language main site, but the same channel-decline patterns apply broadly across any user-contributed Q&A platform whose value depends on being the canonical reference for repeated questions.