mirror of
https://github.com/jeffknupp/sandman2.git
synced 2026-04-25 08:35:49 +03:00
[GH-ISSUE #188] Seems like exclude_tables not working in call to get_app #71
Originally created by @thomascapote on GitHub (Nov 24, 2020).
Original GitHub issue: https://github.com/jeffknupp/sandman2/issues/188
My SQL Server database has a table name with special characters, including single-quotes and a space: ['Downstream Accounts$']
Note that the following query works in Microsoft SQL Server Management Studio.
Without any qualification, a call to `sandman2ctl` fails with the message "Incorrect syntax near 'Downstream'." The next-to-last line of the above stack trace shows the table name SQLAlchemy is choking on: `@table_name = ''Downstream Accounts$''`. (The link mentioned in the last line is useless and unrelated to escaping table names.)

My next step was to try to create a custom user model for this table, as described in #186. I have not been able to get that approach to work.

The next thing I tried was the `exclude_tables` parameter of `get_app`, to have SQLAlchemy skip the attempted introspection of this table, but this fails with the identical error!

My script (`mrm_live_sandman.py`) attempts to quote the table name every way I can think of, but it seems SQLAlchemy still ends up using the same failing form of the name.

I am at a loss for how to get `sandman2` to ignore this table or, better yet (of course), process it without complaining about the name. Thank you very much for taking the time to read this far and for trying to help!
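For concreteness, this is roughly how my script calls `get_app`. The connection string below is a placeholder (my real DSN differs); only the table name and the use of `exclude_tables` match what I described above:

```python
# Hedged sketch of the get_app call in my script (mrm_live_sandman.py).
# The DSN is a placeholder; 'Downstream Accounts$' is the real table name.
def make_app():
    # Imported lazily so the sketch can be read without a live database;
    # get_app reflects the schema as soon as it is called.
    from sandman2 import get_app

    return get_app(
        'mssql+pyodbc://user:password@SERVER/DB'
        '?driver=ODBC+Driver+17+for+SQL+Server',
        exclude_tables=['Downstream Accounts$'],
    )
```

My expectation was that a table listed in `exclude_tables` would simply never be reflected, so its name should never reach SQL Server at all.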