2026 Roadmap: Feedback requested #4204
-
Hey, Kyle. It's nice to see the project is still active. About the database analysis:
This is a great idea, but how do you intend to implement nullability analysis without parsing the queries? Prepared statements expose the data types and table references of result columns, which is enough to determine the column types, but the nullability of query results cannot be determined from that. This is a problem I'm particularly interested in, as I'm attempting a tool inspired by sqlc that leverages Postgres' database analysis, and nullability inference is the one big problem I've run into. In my case I'm opting to make all generated fields nullable (except perhaps for the simplest cases), but sqlc already has nullability inference based on query analysis (even though it isn't exhaustive and has some false positives), so it would be very painful for existing users if that were lost.
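To make the gap concrete, here's a minimal sketch (using pgx directly, not sqlc; the connection string and tables are placeholders) of what Postgres reports when you describe a prepared statement. You get type OIDs and source column references, and you can join those back to pg_attribute.attnotnull yourself, but that only describes the source column: a LEFT JOIN can still make the result column nullable, which is exactly the information the description doesn't carry.

```go
// Illustrative only: shows what Postgres reports when you describe a prepared
// statement, and why that isn't enough to infer result nullability.
package main

import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()
	conn, err := pgx.Connect(ctx, "postgres://localhost:5432/example")
	if err != nil {
		panic(err)
	}
	defer conn.Close(ctx)

	// Describe the query without executing it.
	sd, err := conn.Prepare(ctx, "q",
		"SELECT a.id, b.name FROM a LEFT JOIN b ON b.a_id = a.id")
	if err != nil {
		panic(err)
	}

	for _, f := range sd.Fields {
		// The description includes the type OID and, when the column comes
		// straight from a table, the source table OID and attribute number.
		notNull := false
		if f.TableOID != 0 {
			// attnotnull describes the *source* column. The LEFT JOIN above can
			// still produce NULLs for b.name, so this is not the nullability of
			// the result column.
			err := conn.QueryRow(ctx,
				"SELECT attnotnull FROM pg_attribute WHERE attrelid = $1 AND attnum = $2",
				f.TableOID, int16(f.TableAttributeNumber)).Scan(&notNull)
			if err != nil {
				panic(err)
			}
		}
		fmt.Printf("%s: type OID %d, source column NOT NULL: %v\n",
			f.Name, f.DataTypeOID, notNull)
	}
}
```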
-
Hey there! Sorry that I've been so quiet, but it's been a very busy time for me. I'm hoping that next year will give me a bit more time to focus on sqlc (no promises!). I wanted to get some feedback on my current plans for 2026.
I want to ship database-only analysis, a first-class Go API and Clickhouse support. These three features build on top of each other.
Roadmap
Database-only Analysis
The largest source of bugs today is sqlc's homegrown query analysis engine. For those unfamiliar with the internals of sqlc, it contains a partial implementation of a relational database engine, including a catalog for tracking tables and columns, query analysis based on a SQL AST, and a custom type-inference implementation.
These pieces have served us well, but it turns out it's difficult to faithfully replicate the behavior of PostgreSQL, MySQL and SQLite. The result is a constant stream of issues where the query analysis is incorrect due to missing features or bad assumptions.
It's not sustainable in a future where we want to add more engines. So what's the plan?
We'll keep the existing implementation around for people who don't want to run a database server. For our existing engines, there will be an option to run the analysis without the catalog or AST-based query analysis, relying instead on prepared statements and other database features.
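As a rough sketch of what "let the database do the analysis" could look like for PostgreSQL (again using pgx directly; this is not sqlc code, and the query is just an example): prepare the statement, then read parameter and result column types straight from the statement description instead of from a homegrown catalog. Preparing the query also means the server itself reports syntax errors, missing columns, and type mismatches.

```go
// Rough sketch of database-backed analysis for Postgres: the server, not a
// reimplementation of it, reports the parameter and result column types.
package main

import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()
	conn, err := pgx.Connect(ctx, "postgres://localhost:5432/example")
	if err != nil {
		panic(err)
	}
	defer conn.Close(ctx)

	query := "SELECT id, name, bio FROM authors WHERE id = $1"

	// Preparing the statement makes Postgres parse and analyze the query, so
	// errors surface here without executing anything.
	sd, err := conn.Prepare(ctx, "get_author", query)
	if err != nil {
		panic(err)
	}

	// Map type OIDs to readable names using pgx's built-in type map.
	typeName := func(oid uint32) string {
		if t, ok := conn.TypeMap().TypeForOID(oid); ok {
			return t.Name
		}
		return fmt.Sprintf("oid=%d", oid)
	}

	for i, oid := range sd.ParamOIDs {
		fmt.Printf("param $%d: %s\n", i+1, typeName(oid))
	}
	for _, f := range sd.Fields {
		fmt.Printf("column %s: %s\n", f.Name, typeName(f.DataTypeOID))
	}
}
```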
Go API
Adding a new engine currently requires understanding the sqlc codebase; there isn't a plugin system for database engines. My plan is to ship a Go API, similar to how esbuild does it. I'm going to need a lot of input from the community on how this should look, but I want to make it easy to pass an Engine struct to sqlc so that people can build their own engines. I'd refactor the internals to use these APIs, giving us a clear path for adopting engines that the community has developed. It also allows people to support lesser-used engines without having to get support added to sqlc. For example, someone has been working on YDB support. I'm not going to add that to sqlc, but having a Go API would make it easy to add.
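None of this API exists yet, so purely as a strawman for discussion, an Engine surface might look something like the following; every name here is invented to illustrate the shape, not a proposal for the actual types.

```go
// Strawman only: sqlc has no public Go API today, and every name below is
// made up purely to illustrate the shape such an API could take.
package enginesketch

import "context"

// Query is one named query pulled from the user's SQL files.
type Query struct {
	Name string
	SQL  string
}

// Column describes a query parameter or result column.
type Column struct {
	Name     string
	DataType string
	Nullable bool
}

// Analysis is everything code generation needs to know about a query.
type Analysis struct {
	Params  []Column
	Columns []Column
}

// Engine is the hypothetical plugin surface: given a query, work out its
// parameters and result shape, typically by asking a live database.
type Engine interface {
	Name() string
	Analyze(ctx context.Context, q Query) (*Analysis, error)
}
```

The appeal of the esbuild model is that the tool becomes a library you drive from a small Go program, so a hypothetical third-party YDB package could implement an interface like this and be handed to sqlc without anything YDB-specific living in the main repo.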
Clickhouse Support
To test out the Go API, I'll be using it to implement #1628. I'm only a casual Clickhouse user, so I'll be asking the community for example queries to make sure things work well.
Claude Coding
With the recent release of Opus 4.5, I actually don't need much help on the coding front. Claude is extremely good at working on sqlc, mainly because of the copious amounts of end-to-end tests. The database-only analyzer may also open up the path for fixing many issues with the existing catalog and query analyzer.
I'm excited for the next year!