feat: add python type generator #1011
base: master
Conversation
A new class is needed because TypedDict uses NotRequired for missing attributes.
Pull Request Test Coverage Report for Build 19366795807 (Coveralls)
LGTM. In terms of release, we could make a canary release from this PR and set up some client-side tests on the supabase-py repo (even if basic, just to ensure no regressions and working types e2e).
(A bit like it's done for the TS types here: https://github.com/supabase/supabase-js/blob/master/packages/core/postgrest-js/test/resource-embedding.test.ts#L133-L152)
The idea would be to use the canary image to generate types for a test database, and have some basic runtime + types testing to see if it works as expected before getting this merged to production. WDYT?
```ts
    const schema = type!.schema
    return `${formatForPyClassName(schema)}${formatForPyClassName(type.name)}`
  }
  console.log(`Unknown recognized row type ${name}`)
```
suggestion
Might be worth setting up a Sentry logger integration to track this?
Something like:

```js
Sentry.init({
  dsn: "https://77126e07f4f992432a915850b656f092@o398706.ingest.us.sentry.io/4509151809765377",
  integrations: [
    // send console.log, console.warn, and console.error calls as logs to Sentry
    Sentry.consoleLoggingIntegration({ levels: ["log", "warn", "error"] }),
  ],
  // Enable logs to be sent to Sentry
  enableLogs: true,
});
```

Or use a `.captureException` so we can proactively look up such "unknown" row types and fix them?
```ts
import { extractRequestForLogging } from '../../utils.js'
import { apply as applyPyTemplate } from '../../templates/python.js'
import { getGeneratorMetadata } from '../../../lib/generators.js'

export default async (fastify: FastifyInstance) => {
  fastify.get<{
    Headers: { pg: string }
    Querystring: {
      excluded_schemas?: string
      included_schemas?: string
    }
  }>('/', async (request, reply) => {
    const connectionString = request.headers.pg
    const excludedSchemas =
      request.query.excluded_schemas?.split(',').map((schema) => schema.trim()) ?? []
    const includedSchemas =
      request.query.included_schemas?.split(',').map((schema) => schema.trim()) ?? []

    const pgMeta: PostgresMeta = new PostgresMeta({ ...DEFAULT_POOL_CONFIG, connectionString })
```
nitpick
Would need to use the x-application-name headers and utils like the other generation routes do, to track where the query originates from.
Suggested change:

```diff
-import { extractRequestForLogging } from '../../utils.js'
+import { createConnectionConfig, extractRequestForLogging } from '../../utils.js'
 import { apply as applyPyTemplate } from '../../templates/python.js'
 import { getGeneratorMetadata } from '../../../lib/generators.js'
 export default async (fastify: FastifyInstance) => {
   fastify.get<{
-    Headers: { pg: string }
+    Headers: { pg: string; 'x-pg-application-name'?: string }
     Querystring: {
       excluded_schemas?: string
       included_schemas?: string
     }
   }>('/', async (request, reply) => {
-    const connectionString = request.headers.pg
+    const config = createConnectionConfig(request)
     const excludedSchemas =
       request.query.excluded_schemas?.split(',').map((schema) => schema.trim()) ?? []
     const includedSchemas =
       request.query.included_schemas?.split(',').map((schema) => schema.trim()) ?? []
-    const pgMeta: PostgresMeta = new PostgresMeta({ ...DEFAULT_POOL_CONFIG, connectionString })
+    const pgMeta: PostgresMeta = new PostgresMeta(config)
```
```ts
    const py_tables = tables
      .filter((table) => schemas.some((schema) => schema.name === table.schema))
      .flatMap((table) => {
        const py_class_and_methods = ctx.tableToClass(table)
        return py_class_and_methods
      })
    const composite_types = types
      .filter((type) => type.attributes.length > 0)
      .map((type) => ctx.typeToClass(type))
    const py_views = views.map((view) => ctx.viewToClass(view))
    const py_matviews = materializedViews.map((matview) => ctx.matViewToClass(matview))
```
nitpick
Tables in excluded schemas should already be excluded by introspection (I'm not sure the filter is needed there).
If it is needed, then I guess you also want to apply the same filter to views and matviews? (If they're declared in a schema not in the list, don't generate types for them.)
What is the current behavior?
Adds Python type generation support, piggybacking off of @ryanpeach's #808 pull request. The main difference from his approach is that I added a lot more structure by first creating the "AST", then serializing it through the `serialize()` method, instead of constructing strings in place everywhere.

Fixes #795.
Additional context
The idea is that the normal `Public{table}` class should be used to parse the result of select queries, while the `Insert` and `Update` objects should be passed directly into the `.insert` and `.update` postgrest methods. As such, they needed to be `TypedDict`s instead of `BaseModel`s, since that's what the `postgrest` library expects in those functions. This may change in the future, especially in a v3. Ideally, I'd love to be able to infer the type of the arguments for the function from the query builder itself, like TS already does, but I'm afraid that Python's type checker is not strong enough to do that. I think a frankenstein can be stitched together from `Literal`s and `@overload`s, but I'm not sure how scalable and user-friendly that would be. I will certainly research more about it before doing a v3 rewrite. For now, this approach suffices.