The Stack Overflow Survey shows docs need to evolve
— 5 min read
The 2025 Stack Overflow Developer Survey is out, and as a technical writer, I am listening. As the definitive report on the state of software development, the survey is a gold mine of insights into what my audience needs. Let's break down what matters for us documentarians.
Developers have trust issues (with AI)
One stat in particular stood out to me: positive sentiment toward AI has dropped from 70% to 60% this year. I get it. I've used AI tools in my own projects, and I still actively shake my fist every time they suggest buggy code or refuse to follow a strict style guide I fed them only seconds before. I've also called my AI tools out when they've confidently given me incorrect responses, leaving me to wonder if I will be spared in the human-robot takeover one day.
It's hard to keep an upward trend of positive sentiment when you have to spend time scolding your LLM for hallucinating. The same frustration shows up in the share of developers who say they distrust AI tools... a whopping 46%. That's more developers who distrust AI than trust it.
This is a canary in a coal mine situation. We need to evolve now.
It's all a little ironic
It feels like yesterday when the general vibe among my fellow documentarians was "We're going to lose our audience to ChatGPT." Now ChatGPT is failing the very people who use it, and frustrating the rest of us in the process.
Developers are still looking for content; it's how they search for it that has changed. Developers who mostly use AI overwhelmingly use it to search for answers or to learn new concepts and techniques.
A lot of us have updated our docs sites to cater to this crowd. We have added AI search tools and llms.txt files. Some of us have embedded context-aware AI chatbots. When developers are on our docs sites, we can make sure their AI experience is accurate, verified, and safe.
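If you haven't set one up yet, an llms.txt file is essentially a curated Markdown index that gives an LLM a map of your docs. A minimal sketch might look like this (the product name and URLs are made up for illustration):

```markdown
# Widgets API

> Developer documentation for the Widgets API, a hypothetical REST service
> used here only as an example.

## Docs

- [Quickstart](https://docs.example.com/quickstart.md): create an API key and make your first request
- [Authentication](https://docs.example.com/authentication.md): API keys and OAuth 2.0
- [Errors](https://docs.example.com/errors.md): error codes and retry guidance

## Optional

- [Changelog](https://docs.example.com/changelog.md): release history
```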
The problem? When they use AI off our docs sites... that's where things get messy.
The "almost right" nightmare
Right now, AI output is often "almost right," and somehow that is worse than completely wrong. When code breaks spectacularly, you know to start over. When it's almost right, you spend hours debugging it, questioning your life choices, and wondering why you didn't just read the docs in the first place.
This is happening because AI tools are consuming our documentation in ways we never designed for. Our docs are written for humans, not for LLMs that need to extract meaning. We write "see the following table for authentication details," but AI has no idea where "following" is when someone asks it a direct question about our API.
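One fix is to write each chunk so it stands on its own. A before-and-after sketch, using a made-up Widgets API:

```markdown
Before (breaks when retrieved out of context):
See the following table for authentication details.

After (self-contained, survives being quoted as a standalone chunk):
The Widgets API supports two authentication methods: API keys, sent in the
X-API-Key header, and OAuth 2.0 bearer tokens. API keys are scoped to a
single project and expire after 90 days.
```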
What the survey really tells us
The survey reveals some fascinating patterns about developers' workflows:
- Python's usage jump reflects its expansion from web development into AI and data science. Your audience isn't just growing, it's diversifying. They're training neural networks in the morning and building REST APIs in the afternoon. They're using the same language, but for different purposes. Do your examples and use case guides acknowledge this shift?
- 84% of developers are using AI tools, but they want different things at different times. Younger developers (18-24) want chat and interactive formats; static docs aren't cutting it for them anymore.
- The tools are evolving. GitHub beat Jira as the most desired collaboration tool. Docker usage jumped 17 percentage points. VS Code maintains its dominance despite AI-powered IDEs nipping at its heels.
All of this tells me: developers are working in AI-native environments, but our documentation probably isn't meeting them where they are.
The solution is beyond docs
This heading might have you scratching your head, but hear me out. We've been focusing on improving the AI experience when developers visit our docs sites, but 35% of Stack Overflow visits are now people solving AI-related problems. They're not always coming to us first. They're asking ChatGPT, getting stuck, and only finding authoritative sources afterward.
We need to optimize for AI consumption wherever it happens. Here's what I'm experimenting with based on patterns I'm seeing:
- Structured content that AI can actually parse. llms.txt files are a good start, but AI needs more. It needs explicit boundaries, clear scope definitions, and structured relationships between concepts. Use consistent patterns and metadata that tell AI what applies where and when. When information has clear authority indicators and context boundaries, AI tools don't mix unrelated concepts as much. (See the frontmatter sketch right after this list.)
- MCP servers and direct integrations. Model Context Protocol (MCP) lets us serve documentation directly to AI tools inside developers' workflows. Instead of hoping Claude scraped our docs correctly, we can provide authoritative context on demand. (A minimal server sketch follows below.)
- Distribution-agnostic content architecture. Whether it's your docs site, an internal chatbot, or an MCP server, the same structured content should power multiple interfaces. This works when your content follows task-oriented patterns rather than feature-based organization; AI excels at workflow guidance when information is clustered around user journeys. Your authentication guide becomes reusable data that adapts to different contexts. (The last sketch below shows the idea.)
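To make the first point concrete, here's roughly what scope and authority metadata could look like as frontmatter on a docs page. There's no agreed standard for these field names; they're purely illustrative:

```markdown
---
title: Authenticate to the Widgets API        # hypothetical product
product: widgets-api
applies_to: v3 and later                      # explicit scope boundary
audience: backend developers
canonical: https://docs.example.com/widgets/authentication   # authority indicator
related: [rate-limits, error-codes]           # structured relationships
last_reviewed: 2025-07-01
---

The Widgets API supports API keys and OAuth 2.0 bearer tokens...
```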
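And here's a minimal sketch of the MCP idea, using the official Model Context Protocol Python SDK's FastMCP helper. The server name, docs paths, and search logic are placeholders, not a real implementation:

```python
# Sketch of a docs-serving MCP server (pip install mcp). Everything concrete
# here (the "widgets-docs" name, the docs/ directory, the naive search) is a
# placeholder for illustration.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("widgets-docs")   # hypothetical server name
DOCS_DIR = Path("docs")         # hypothetical local checkout of the docs


@mcp.resource("docs://authentication")
def authentication_guide() -> str:
    """Serve the canonical authentication guide as authoritative context."""
    return (DOCS_DIR / "authentication.md").read_text()


@mcp.tool()
def search_docs(query: str) -> str:
    """Naive keyword search over local Markdown files; a real server would
    hit the docs site's search index instead."""
    hits = []
    for path in sorted(DOCS_DIR.glob("**/*.md")):
        text = path.read_text()
        if query.lower() in text.lower():
            hits.append(f"## {path.name}\n{text[:500]}")
    return "\n\n".join(hits) or "No matching pages found."


if __name__ == "__main__":
    mcp.run()  # stdio transport, so AI clients can attach the server as a tool
```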
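The third point is about keeping content as structured data and rendering it per surface. A toy sketch of that separation, with made-up field names and content:

```python
# Toy sketch: one task-oriented content record rendered for two surfaces.
# The Task fields, the Widgets API, and both renderers are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Task:
    """A task-oriented docs chunk, organized around a user journey."""
    title: str
    product: str
    applies_to: str
    steps: list[str] = field(default_factory=list)


auth_task = Task(
    title="Authenticate to the Widgets API",   # hypothetical product
    product="Widgets API",
    applies_to="v3 and later",
    steps=[
        "Create an API key in the dashboard.",
        "Send the key in the X-API-Key header.",
        "Rotate keys every 90 days.",
    ],
)


def to_docs_page(task: Task) -> str:
    """Render the record as Markdown for the docs site."""
    steps = "\n".join(f"{i}. {s}" for i, s in enumerate(task.steps, start=1))
    return f"# {task.title}\n\n_Applies to {task.product} {task.applies_to}._\n\n{steps}"


def to_ai_context(task: Task) -> str:
    """Render the same record as a compact, self-contained chunk for a
    chatbot, an MCP server, or a retrieval pipeline."""
    return (
        f"TASK: {task.title} | PRODUCT: {task.product} "
        f"| SCOPE: {task.applies_to} | STEPS: {'; '.join(task.steps)}"
    )


print(to_docs_page(auth_task))
print(to_ai_context(auth_task))
```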
The opportunity
The survey suggests developers want trusted, human-verified information delivered in AI-native formats. My hypothesis: they're not abandoning docs, they're asking for docs to evolve.
This could be a huge opportunity. We can experiment with adapting our craft for how developers actually consume information in 2025, or keep doing what we've always done while AI tools continue to hallucinate about our products.
I'm still figuring this out, but the patterns seem worth exploring.
I'm actively researching these patterns and writing a book about optimizing documentation for AI consumption. If you're seeing similar trends at your company, I'd love to hear about it! Reach out on LinkedIn to chat anytime.