WIP - Pydantic v2 support (#1238)
* WIP
* WIP - make test_model_definition tests pass
* WIP - make test_model_methods pass
* WIP - make whole test suite at least run - failing 49/443 tests
* WIP - fix part of the getting pydantic tests, as types of fields are now kept in core schema and not on FieldInfo
* WIP - fix validation in update by creating individual field validators, failing 36/443
* WIP - fix __pydantic_extra__ in initializing model, fix test related to pydantic config checks, failing 32/442
* WIP - fix enum schema in model_json_schema, failing 31/442
* WIP - fix copying through model, fix setting pydantic fields on through, fix default config and inheriting from it, failing 26/442
* WIP - fix tests checking pydantic schema, fix excluding parent fields, failing 21/442
* WIP - some missed files
* WIP - fix validators inheritance and fix validators in generated pydantic, failing 17/442
* WIP - fix through models setting - only on reverse side of relation, but always on reverse side, failing 15/442
* WIP - fix through models setting - only on reverse side of relation, but always on reverse side, failing 15/442
* WIP - working on proper populating of __dict__ for relations for new schema dumping, some work on openapi docs, failing 13/442
* WIP - remove property fields as pydantic now has computed_field on its own, failing 9/442
* WIP - fixes in docs, failing 8/442
* WIP - fix tests for largebinary schema, wrapped bytes fields fail in pydantic, will be fixed in pydantic-core; remaining is circular schema for related models, failing 6/442
* WIP - fix to pk only models in schemas
* Getting test suites to pass (#1249)
* wip, fixing tests
* iteration, fixing some more tests
* iteration, fixing some more tests
* adhere to comments
* adhere to comments
* remove unnecessary dict call, re-add getattribute for testing
* todo for reverse relationship
* adhere to comments, remove prints
* solve circular refs
* all tests pass 🎉
* remove 3.7 from tests
* add lint and type check jobs
* reformat with ruff, fix jobs
* rename jobs
* fix imports
* fix evaluate in py3.8
* partially fix coverage
* fix coverage, add more tests
* fix test ids
* fix test ids
* fix lint, fix docs, make docs fully working scripts, add test docs job
* fix pyproject
* pin py ver in test docs
* change dir in test docs
* fix pydantic warning hack
* rm poetry call in test_docs
* switch to pathlib in test docs
* remove coverage req test docs
* fix type check tests, fix part of types
* fix/skip next part of types
* fix next part of types
* fix next part of types
* fix coverage
* fix coverage
* fix type (bit dirty 🤷)
* fix some code smells
* change pre-commit
* tweak workflows
* remove no root from tests
* switch to full python path by passing sys.executable
* some small refactor in new base model, one sample test, change makefile
* small refactors to reduce complexity of methods
* temp add tests for prs against pydantic_v2
* remove all references to __fields__
* remove all references to construct, deprecate the method and update model_construct to be in line with pydantic
* deprecate dict and add model_dump, todo switch to model_dump in calls
* fix tests
* change to union
* change to union
* change to model_dump and model_dump_json from deprecated dict and json methods, deprecate them in ormar too
* finish switching dict() -> model_dump()
* finish switching json() -> model_dump_json()
* remove fully pydantic_only
* switch to extra for payment card, change missed json calls
* fix coverage - no more warnings internal
* fix coverage - no more warnings internal - part 2
* split model_construct into own and pydantic parts
* split determine pydantic field type
* change to new field validators
* fix benchmarks, add codspeed instead of pytest-benchmark, add action and gh workflow
* restore pytest-benchmark
* remove codspeed
* pin pydantic version, restore codspeed
* change on push to pydantic_v2 to trigger first one
* Use lifespan function instead of event (#1259)
* check return types
* fix imports order, set warnings=False on json that passes the dict, fix unnecessary loop in one of the tests
* remove references to model's meta as it's now ormar config, rename related methods too
* filter out pydantic serializer warnings
* remove choices leftovers
* remove leftovers after property_fields, keep only enough to exclude them in initialization
* add migration guide
* fix meta references
* downgrade databases for now
* Change line numbers in documentation (#1265)
* proofread and fix the docs, part 1
* proofread and fix the docs for models
* proofread and fix the docs for fields
* proofread and fix the docs for relations
* proofread and fix rest of the docs, add release notes for 0.20
* create tables in new docs src
* cleanup old deps, uncomment docs publish on tag
* fix import reorder

--------

Co-authored-by: TouwaStar <30479449+TouwaStar@users.noreply.github.com>
Co-authored-by: Goran Mekić <meka@tilda.center>
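A recurring theme in this log is deprecating the pydantic-v1-style `dict()`/`json()` methods in favor of `model_dump()`/`model_dump_json()`. A minimal sketch of such a deprecation shim in pure Python (illustrative names, not the actual ormar implementation):

```python
import warnings


class Model:
    """Illustrative stand-in for an ormar/pydantic-style model."""

    def __init__(self, **data):
        self._data = dict(data)

    def model_dump(self) -> dict:
        # The new, preferred API (pydantic v2 naming).
        return dict(self._data)

    def dict(self) -> dict:
        # Old API kept as a thin shim that warns and delegates.
        warnings.warn(
            "dict() is deprecated, use model_dump() instead",
            DeprecationWarning,
            stacklevel=2,
        )
        return self.model_dump()


album = Model(name="Miami", play_count=0)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    legacy = album.dict()

assert legacy == album.model_dump()
assert caught and issubclass(caught[0].category, DeprecationWarning)
```

Old call sites keep working but emit a `DeprecationWarning`, which is why the log also mentions filtering serializer warnings during the transition.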
````diff
@@ -14,15 +14,15 @@ import sqlalchemy
 
 import ormar
 
-database = databases.Database("sqlite:///db.sqlite")
-metadata = sqlalchemy.MetaData()
+base_ormar_config = ormar.OrmarConfig(
+    database=databases.Database("sqlite:///db.sqlite"),
+    metadata=sqlalchemy.MetaData(),
+)
 
 
 class Album(ormar.Model):
-    class Meta:
-        tablename = "albums"
-        metadata = metadata
-        database = database
+    ormar_config = base_ormar_config.copy()
 
     id: int = ormar.Integer(primary_key=True)
     name: str = ormar.String(max_length=100)
````
````diff
@@ -34,7 +34,7 @@ You can for example define a trigger that will set `album.is_best_seller` status
 
 Import `pre_update` decorator, for list of currently available decorators/signals check below.
 
-```Python hl_lines="1"
+```Python hl_lines="7"
 --8<-- "../docs_src/signals/docs002.py"
 ```
````
````diff
@@ -54,7 +54,7 @@ for which you want to run the signal receiver.
 
 Currently there is no way to set signal for all models at once without explicitly passing them all into registration of receiver.
 
-```Python hl_lines="4-7"
+```Python hl_lines="28-31"
 --8<-- "../docs_src/signals/docs002.py"
 ```
````
````diff
@@ -65,7 +65,7 @@ Currently there is no way to set signal for all models at once without explicitl
 
 Note that our newly created function has instance and class of the instance so you can easily run database
 queries inside your receivers if you want to.
 
-```Python hl_lines="15-22"
+```Python hl_lines="41-48"
 --8<-- "../docs_src/signals/docs002.py"
 ```
````
````diff
@@ -75,15 +75,15 @@ You can define same receiver for multiple models at once by passing a list of mo
 # define a dummy debug function
 @pre_update([Album, Track])
 async def before_update(sender, instance, **kwargs):
-    print(f"{sender.get_name()}: {instance.json()}: {kwargs}")
+    print(f"{sender.get_name()}: {instance.model_dump_json()}: {kwargs}")
 ```
 
-Of course you can also create multiple functions for the same signal and model. Each of them will run at each signal.
+Of course, you can also create multiple functions for the same signal and model. Each of them will run at each signal.
 
 ```python
 @pre_update(Album)
 async def before_update(sender, instance, **kwargs):
-    print(f"{sender.get_name()}: {instance.json()}: {kwargs}")
+    print(f"{sender.get_name()}: {instance.model_dump_json()}: {kwargs}")
 
 @pre_update(Album)
 async def before_update2(sender, instance, **kwargs):
````
````diff
@@ -100,13 +100,13 @@ class AlbumAuditor:
 
     async def before_save(self, sender, instance, **kwargs):
         await AuditLog(
-            event_type=f"{self.event_type}_SAVE", event_log=instance.json()
+            event_type=f"{self.event_type}_SAVE", event_log=instance.model_dump_json()
         ).save()
 
 auditor = AlbumAuditor()
 pre_save(Album)(auditor.before_save)
 # call above has same result like the one below
-Album.Meta.signals.pre_save.connect(auditor.before_save)
+Album.ormar_config.signals.pre_save.connect(auditor.before_save)
 # signals are also exposed on instance
 album = Album(name='Miami')
 album.signals.pre_save.connect(auditor.before_save)
````
````diff
@@ -127,7 +127,7 @@ async def before_update(sender, instance, **kwargs):
     instance.is_best_seller = True
 
 # disconnect given function from signal for given Model
-Album.Meta.signals.pre_save.disconnect(before_save)
+Album.ormar_config.signals.pre_save.disconnect(before_save)
 # signals are also exposed on instance
 album = Album(name='Miami')
 album.signals.pre_save.disconnect(before_save)
````
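The connect/disconnect calls shown in the hunks above can be pictured with a minimal pure-Python async signal. This is a sketch of the pattern, not ormar's actual `Signal` class:

```python
import asyncio


class Signal:
    """Minimal async signal: keeps a list of coroutine receivers."""

    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        # Register a receiver once; repeated connects are no-ops.
        if receiver not in self._receivers:
            self._receivers.append(receiver)

    def disconnect(self, receiver):
        if receiver in self._receivers:
            self._receivers.remove(receiver)

    async def send(self, sender, **kwargs):
        # Await every registered receiver with the same arguments.
        for receiver in list(self._receivers):
            await receiver(sender=sender, **kwargs)


calls = []


async def before_save(sender, instance, **kwargs):
    calls.append((sender, instance))


pre_save = Signal()
pre_save.connect(before_save)
asyncio.run(pre_save.send(sender="Album", instance="miami"))
pre_save.disconnect(before_save)
# After disconnect, sending no longer reaches the receiver.
asyncio.run(pre_save.send(sender="Album", instance="ignored"))

assert calls == [("Album", "miami")]
```

As in the docs, a receiver only fires between `connect` and `disconnect`; receivers are plain coroutines taking `sender` plus keyword arguments.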
````diff
@@ -142,7 +142,7 @@ album.signals.pre_save.disconnect(before_save)
 * bulk operations (`QuerySet.bulk_create` and `QuerySet.bulk_update`) as they are designed for speed.
 
 * queryset table level operations (`QuerySet.update` and `QuerySet.delete`) as they run on the underlying tables
-  (more lak raw sql update/delete operations) and do not have specific instance.
+  (more like raw sql update/delete operations) and do not have specific instance.
 
 ### pre_save
````
````diff
@@ -251,23 +251,23 @@ import sqlalchemy
 
 import ormar
 
-database = databases.Database("sqlite:///db.sqlite")
-metadata = sqlalchemy.MetaData()
+base_ormar_config = ormar.OrmarConfig(
+    database=databases.Database("sqlite:///db.sqlite"),
+    metadata=sqlalchemy.MetaData(),
+)
 
 
 class Album(ormar.Model):
-    class Meta:
-        tablename = "albums"
-        metadata = metadata
-        database = database
+    ormar_config = base_ormar_config.copy()
 
     id: int = ormar.Integer(primary_key=True)
     name: str = ormar.String(max_length=100)
     is_best_seller: bool = ormar.Boolean(default=False)
     play_count: int = ormar.Integer(default=0)
 
-Album.Meta.signals.your_custom_signal = ormar.Signal()
-Album.Meta.signals.your_custom_signal.connect(your_receiver_name)
+Album.ormar_config.signals.your_custom_signal = ormar.Signal()
+Album.ormar_config.signals.your_custom_signal.connect(your_receiver_name)
 ```
 
 Actually under the hood signal is a `SignalEmitter` instance that keeps a dictionary of know signals, and allows you
````
````diff
@@ -276,13 +276,13 @@ to access them as attributes. When you try to access a signal that does not exis
 
 So example above can be simplified to. The `Signal` will be created for you.
 
 ```
-Album.Meta.signals.your_custom_signal.connect(your_receiver_name)
+Album.ormar_config.signals.your_custom_signal.connect(your_receiver_name)
 ```
 
 Now to trigger this signal you need to call send method of the Signal.
 
 ```python
-await Album.Meta.signals.your_custom_signal.send(sender=Album)
+await Album.ormar_config.signals.your_custom_signal.send(sender=Album)
 ```
 
 Note that sender is the only required parameter and it should be ormar Model class.
````
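The auto-creation behaviour the docs describe (the `SignalEmitter` keeps a dictionary of known signals, and accessing a signal that does not exist creates it) can be sketched with `__getattr__` in pure Python. Names here are illustrative, not ormar's actual implementation:

```python
class Signal:
    def __init__(self):
        self.receivers = []

    def connect(self, receiver):
        self.receivers.append(receiver)


class SignalEmitter:
    """Keeps a dict of known signals; unknown names are created on access."""

    def __init__(self):
        # Set normally; __getattr__ is only consulted when lookup fails,
        # so this attribute does not cause recursion.
        self.signals = {}

    def __getattr__(self, name):
        # Lazily create (and remember) a Signal for any unknown attribute.
        return self.signals.setdefault(name, Signal())


emitter = SignalEmitter()
emitter.your_custom_signal.connect(print)  # signal springs into existence

assert "your_custom_signal" in emitter.signals
assert emitter.your_custom_signal.receivers == [print]
```

Because `__getattr__` memoizes into the dict, repeated accesses return the same `Signal` instance, which is what makes the simplified `connect(...)` one-liner from the diff above safe.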
````diff
@@ -290,6 +290,6 @@ Note that sender is the only required parameter and it should be ormar Model cla
 
 Additional parameters have to be passed as keyword arguments.
 
 ```python
-await Album.Meta.signals.your_custom_signal.send(sender=Album, my_param=True)
+await Album.ormar_config.signals.your_custom_signal.send(sender=Album, my_param=True)
 ```
````