
python - sqlalchemy : This result object does not return rows. but console does - Stack Overflow


I'm fairly new to Python with SQLAlchemy and just want to insert a bunch of data and receive some values back.

My code looks something like this:

statement = text("""
    with _user as (
        insert into user (first_name, last_name)
        values (:first, :last)
        returning id, first_name, last_name
    ),
    _student as (
        insert into student (user_id, name)
        select id, concat(last_name, ' ', first_name) from _user
        returning id, name
    )
    select * from _student;
""")
rows = conn.execute(statement, [{'bar': 1, 'foo': 'foo1'}, {'bar': 2, 'foo': 'foo2'}]).mappings()
for row in rows:
    print(f'{row}')

Now I get the following error: This result object does not return rows. It has been closed automatically.

When running the same query in a console, there are results. Why does SQLAlchemy close the cursor?


asked Jan 17 at 16:47 by effeff
  • Of course the data I passed fits the named parameters in the query (sorry for messing it up here). In addition, if I do not pass any data to execute (instead using fixed values in the query), I do get results back. – effeff Commented Jan 17 at 16:53
  • I'm not too familiar with that syntax for variable substitution, but shouldn't :first and :last be :bar and :foo? – mechanical_meat Commented Jan 17 at 19:05
  • @mechanical_meat: you're totally right, it should be rows = conn.execute(statement, [{'first': 'someFirstName', 'last': 'someLastName'}, .., {'first': 'otherFirstName', 'last': 'otherLastName'}]).mappings(), but the problem then remains the same: the cursor is closed even though the query returns some rows. – effeff Commented Jan 17 at 23:42

1 Answer


Summary

I don't think this is possible for multiple parameter sets using SQLAlchemy at present. The limitation may be by design or (less likely, I suspect) a bug.

Analysis

When the statement is executed with a list of more than one parameter dictionary, SQLAlchemy uses psycopg2's executemany method to execute the statement (if you are logging statements, the postgres log will show the statement executed twice in sequence). executemany discards its result sets, so SQLAlchemy raises the ResourceClosedError observed by the OP.
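The behaviour is easy to reproduce without Postgres, since the executemany dispatch happens in SQLAlchemy itself rather than in the psycopg2 dialect. A minimal sketch against an in-memory SQLite database (table name and columns borrowed from the question; SQLAlchemy 1.4+ assumed):

```python
import sqlalchemy as sa
from sqlalchemy.exc import ResourceClosedError

engine = sa.create_engine("sqlite://")

with engine.begin() as conn:
    conn.execute(sa.text(
        "create table user (id integer primary key, first_name text, last_name text)"
    ))
    # A list with more than one parameter dict takes the executemany path,
    # so the result object reports that it has no rows to return.
    result = conn.execute(
        sa.text("insert into user (first_name, last_name) values (:first, :last)"),
        [{"first": "foo1", "last": "bar1"}, {"first": "foo2", "last": "bar2"}],
    )
    print(result.returns_rows)  # False
    try:
        result.fetchall()
    except ResourceClosedError as exc:
        print(exc)
```

Fetching from the result raises the exact error from the question: "This result object does not return rows. It has been closed automatically."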

SQLAlchemy does support psycopg2's fast execution helpers, but not for textual SQL statements. Converting the raw SQL to SQLAlchemy Core statements results in the same error, however.

I suspect this is because the insert-within-a-cte pattern is not something to which SQLAlchemy can apply its internal logic for using psycopg2's helpers or its native insertmanyvalues feature.

What is to be done?

psycopg2's execute_values helper handles this case correctly, so you could just use psycopg2 directly:

import psycopg2
from psycopg2.extras import execute_values

# sql is the CTE statement from the question, with the VALUES clause
# replaced by a single %s placeholder for execute_values to expand:
#     ... values %s returning id, first_name, last_name ...

with psycopg2.connect(dbname='so') as conn, conn.cursor() as cur:
    execute_values(
        cur,
        sql,
        [dict1, dict2, ...],
        template='(%(first)s, %(last)s)',
    )
    rows = cur.fetchall()
    for row in rows:
        print(row)

Alternatively you could perform the inserts and selects over multiple operations, at the cost of increased network traffic and time:

  1. insert users, returning ids
  2. insert students
  3. query students table

Limitation Or Bug?

I'm inclined to think that this is a limitation in the implementation of SQLAlchemy's bulk insert support rather than a bug. If you need clarification (for the Core insert() case, not for text()) you could open a discussion on GitHub.

FWIW the core version would look like this:

import sqlalchemy as sa

engine = sa.create_engine(url)

# Reflect the tables.
metadata = sa.MetaData()
metadata.reflect(engine, only=['user', 'student'])
user = metadata.tables['user']
student = metadata.tables['student']

with engine.connect() as conn:
    _user = (
        user.insert().returning(user.c.id, user.c.first_name, user.c.last_name)
    ).cte('user')
    _student = (
        student.insert()
        .from_select(
            [student.c.user_id, student.c.name],
            sa.select(_user.c.id, _user.c.first_name + ' ' + _user.c.last_name),
        ).returning(
            student.c.id, student.c.name
        )
    ).cte('student')
    rows = conn.execute(
        sa.select(_student),
        [dict1, dict2, ...]
    )
    for row in rows.mappings():
        print(row)