# Signals
Signals are a mechanism that lets you run a piece of code (a function or method) whenever a given type of event happens in ormar.
To achieve this you need to register a receiver for a given type of signal on the selected model(s).
## Defining receivers
Given a sample model like the following:
```python
import databases
import sqlalchemy

import ormar

base_ormar_config = ormar.OrmarConfig(
    database=databases.Database("sqlite:///db.sqlite"),
    metadata=sqlalchemy.MetaData(),
)


class Album(ormar.Model):
    ormar_config = base_ormar_config.copy()

    id: int = ormar.Integer(primary_key=True)
    name: str = ormar.String(max_length=100)
    is_best_seller: bool = ormar.Boolean(default=False)
    play_count: int = ormar.Integer(default=0)
```
You can, for example, define a trigger that will set the `album.is_best_seller` status once it has been played more than 50 times.
Import the `pre_update` decorator; for the list of currently available decorators/signals, see below.
```python
--8<-- "../docs_src/signals/docs002.py"
```
Define your function.
Note that each receiver function:

- has to be a callable
- has to accept the first `sender` argument that receives the class of the sending object
- has to accept a `**kwargs` argument, as the parameters sent by each `ormar.Signal` can change at any time, so your function has to handle them
- has to be `async`, because callbacks are gathered and awaited
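The contract above can be illustrated with a minimal, hypothetical stand-in for ormar's `Signal` (not the real implementation): `send` gathers and awaits every connected receiver, passing `sender` plus arbitrary keyword arguments, which is why receivers must be `async` and accept `**kwargs`.

```python
import asyncio


# Hypothetical stand-in for ormar's Signal, for illustration only:
# send() gathers and awaits every connected callback.
class MiniSignal:
    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    async def send(self, sender, **kwargs):
        # All receivers are awaited with the same sender and kwargs.
        await asyncio.gather(
            *(receiver(sender=sender, **kwargs) for receiver in self._receivers)
        )


calls = []


async def my_receiver(sender, **kwargs):
    # sender is the class; extra parameters arrive through **kwargs.
    calls.append((sender.__name__, kwargs.get("instance")))


signal = MiniSignal()
signal.connect(my_receiver)
asyncio.run(signal.send(sender=int, instance="dummy"))
print(calls)  # [('int', 'dummy')]
```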
`pre_update` currently sends only one argument apart from `sender`, and it is the `instance`.
Note how the `pre_update` decorator accepts a `senders` argument that can be a single model or a list of models
for which you want to run the signal receiver.
Currently there is no way to set a signal for all models at once without explicitly passing them all when registering the receiver.
```python
--8<-- "../docs_src/signals/docs002.py"
```
!!!note
    Note that receivers are defined on a class level -> so even if you connect/disconnect a function through an instance,
    it will run/stop running for all operations on that `ormar.Model` class.
Note that your newly created function receives both the instance and the class of the instance, so you can easily run database queries inside your receivers if you want to.
```python
--8<-- "../docs_src/signals/docs002.py"
```
You can define the same receiver for multiple models at once by passing a list of models to the signal decorator.
```python
# define a dummy debug function
@pre_update([Album, Track])
async def before_update(sender, instance, **kwargs):
    print(f"{sender.get_name()}: {instance.model_dump_json()}: {kwargs}")
```
Of course, you can also create multiple functions for the same signal and model. Each of them will run every time the signal is sent.
```python
@pre_update(Album)
async def before_update(sender, instance, **kwargs):
    print(f"{sender.get_name()}: {instance.model_dump_json()}: {kwargs}")


@pre_update(Album)
async def before_update2(sender, instance, **kwargs):
    print(f"About to update {sender.get_name()} with pk: {instance.pk}")
```
Note that the ormar decorators are syntactic sugar; you can directly connect your function or method to a given signal for a
given model. `connect` accepts only one parameter - your receiver function/method.
```python
class AlbumAuditor:
    def __init__(self):
        self.event_type = "ALBUM_INSTANCE"

    async def before_save(self, sender, instance, **kwargs):
        await AuditLog(
            event_type=f"{self.event_type}_SAVE", event_log=instance.model_dump_json()
        ).save()


auditor = AlbumAuditor()
pre_save(Album)(auditor.before_save)
# the call above has the same result as the one below
Album.ormar_config.signals.pre_save.connect(auditor.before_save)
# signals are also exposed on instances
album = Album(name="Miami")
album.signals.pre_save.connect(auditor.before_save)
```
!!!warning
    Note that signals keep a hard reference to your receiver (not a weakref), so keep that in mind to avoid circular references.
## Disconnecting the receivers
To stop a receiver from running for a given model you need to disconnect it.
```python
@pre_update(Album)
async def before_update(sender, instance, **kwargs):
    if instance.play_count > 50 and not instance.is_best_seller:
        instance.is_best_seller = True


# disconnect the given function from the signal for the given Model
Album.ormar_config.signals.pre_update.disconnect(before_update)
# signals are also exposed on instances
album = Album(name="Miami")
album.signals.pre_update.disconnect(before_update)
```
## Available signals
!!!warning
    Note that signals are not sent for:

    * bulk operations (`QuerySet.bulk_create` and `QuerySet.bulk_update`), as they are designed for speed.
    * queryset table-level operations (`QuerySet.update` and `QuerySet.delete`), as they run on the underlying tables
      (more like raw SQL update/delete operations) and do not have a specific instance.
### pre_save

```python
pre_save(sender: Type["Model"], instance: "Model")
```

Sent for the `Model.save()` and `Model.objects.create()` methods.
`sender` is the `ormar.Model` class and `instance` is the model to be saved.
### post_save

```python
post_save(sender: Type["Model"], instance: "Model")
```

Sent for the `Model.save()` and `Model.objects.create()` methods.
`sender` is the `ormar.Model` class and `instance` is the model that was saved.
### pre_update

```python
pre_update(sender: Type["Model"], instance: "Model")
```

Sent for the `Model.update()` method.
`sender` is the `ormar.Model` class and `instance` is the model to be updated.
### post_update

```python
post_update(sender: Type["Model"], instance: "Model")
```

Sent for the `Model.update()` method.
`sender` is the `ormar.Model` class and `instance` is the model that was updated.
### pre_delete

```python
pre_delete(sender: Type["Model"], instance: "Model")
```

Sent for the `Model.delete()` method.
`sender` is the `ormar.Model` class and `instance` is the model to be deleted.
### post_delete

```python
post_delete(sender: Type["Model"], instance: "Model")
```

Sent for the `Model.delete()` method.
`sender` is the `ormar.Model` class and `instance` is the model that was deleted.
### pre_relation_add

```python
pre_relation_add(sender: Type["Model"], instance: "Model", child: "Model", relation_name: str, passed_args: Dict)
```

Sent for the `Model.relation_name.add()` method for `ManyToMany` relations and the reverse side of `ForeignKey` relations.

`sender` - the sender class, `instance` - the instance to which the related model is added, `child` - the model being added,
`relation_name` - the name of the relation to which the child is added; the add signals also receive `passed_args` - a dict of kwargs passed to `add()`.
### post_relation_add

```python
post_relation_add(sender: Type["Model"], instance: "Model", child: "Model", relation_name: str, passed_args: Dict)
```

Sent for the `Model.relation_name.add()` method for `ManyToMany` relations and the reverse side of `ForeignKey` relations.

`sender` - the sender class, `instance` - the instance to which the related model is added, `child` - the model being added,
`relation_name` - the name of the relation to which the child is added; the add signals also receive `passed_args` - a dict of kwargs passed to `add()`.
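As a sketch of the shape such a receiver takes (with plain objects standing in for real models, and the `@post_relation_add([SomeModel])` registration omitted), consider:

```python
import asyncio

log = []


# Hypothetical relation-add receiver; with ormar you would register it with
# the decorator. Here it is invoked directly for illustration, with plain
# strings standing in for real Model instances.
async def track_added(sender, instance, child, relation_name, passed_args, **kwargs):
    log.append(f"{relation_name}: added {child} with kwargs {passed_args}")


asyncio.run(
    track_added(
        sender=object,
        instance="album",
        child="track",
        relation_name="tracks",
        passed_args={"index": 1},
    )
)
print(log[0])  # tracks: added track with kwargs {'index': 1}
```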
### pre_relation_remove

```python
pre_relation_remove(sender: Type["Model"], instance: "Model", child: "Model", relation_name: str)
```

Sent for the `Model.relation_name.remove()` method for `ManyToMany` relations and the reverse side of `ForeignKey` relations.

`sender` - the sender class, `instance` - the instance from which the related model is removed, `child` - the model being removed,
`relation_name` - the name of the relation from which the child is removed.
### post_relation_remove

```python
post_relation_remove(sender: Type["Model"], instance: "Model", child: "Model", relation_name: str)
```

Sent for the `Model.relation_name.remove()` method for `ManyToMany` relations and the reverse side of `ForeignKey` relations.

`sender` - the sender class, `instance` - the instance from which the related model is removed, `child` - the model being removed,
`relation_name` - the name of the relation from which the child is removed.
### post_bulk_update

```python
post_bulk_update(sender: Type["Model"], instances: List["Model"], **kwargs)
```

Sent for the `Model.objects.bulk_update(List[objects])` method.
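A hypothetical receiver for this signal could iterate over the `instances` list; in the sketch below plain dicts stand in for real model instances and the `@post_bulk_update(Album)` registration is omitted:

```python
import asyncio

updated_pks = []


# Sketch of a post_bulk_update receiver; with ormar you would register it
# via the @post_bulk_update(Album) decorator.
async def after_bulk_update(sender, instances, **kwargs):
    # instances is the list that was passed to bulk_update()
    updated_pks.extend(item["id"] for item in instances)


asyncio.run(after_bulk_update(sender=object, instances=[{"id": 1}, {"id": 2}]))
print(updated_pks)  # [1, 2]
```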
## Defining your own signals

Note that you can create your own signals, although you will have to send them manually in your code, or subclass `ormar.Model`
and trigger your signals there.
Creating a new signal is super easy. The following example sets up a new signal with the name `your_custom_signal`.
```python
import databases
import sqlalchemy

import ormar

base_ormar_config = ormar.OrmarConfig(
    database=databases.Database("sqlite:///db.sqlite"),
    metadata=sqlalchemy.MetaData(),
)


class Album(ormar.Model):
    ormar_config = base_ormar_config.copy()

    id: int = ormar.Integer(primary_key=True)
    name: str = ormar.String(max_length=100)
    is_best_seller: bool = ormar.Boolean(default=False)
    play_count: int = ormar.Integer(default=0)


Album.ormar_config.signals.your_custom_signal = ormar.Signal()
Album.ormar_config.signals.your_custom_signal.connect(your_receiver_name)
```
Actually, under the hood, `signals` is a `SignalEmitter` instance that keeps a dictionary of known signals and allows you
to access them as attributes. When you try to access a signal that does not exist, the `SignalEmitter` will create one for you.
So the example above can be simplified to the following; the `Signal` will be created for you:
```python
Album.ormar_config.signals.your_custom_signal.connect(your_receiver_name)
```
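The auto-creating behavior described above can be sketched with a minimal, hypothetical emitter (not ormar's actual `SignalEmitter` implementation): an attribute access that misses the known-signals dictionary creates and stores a new signal.

```python
class MiniSignal:
    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)


class MiniSignalEmitter:
    def __init__(self):
        self.signals = {}

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails -
        # create and store the missing signal on first access.
        return self.signals.setdefault(name, MiniSignal())


emitter = MiniSignalEmitter()
emitter.your_custom_signal.connect(print)  # signal created on the fly
print(emitter.your_custom_signal is emitter.signals["your_custom_signal"])  # True
```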
Now, to trigger this signal you need to call the `send` method of the `Signal`.
```python
await Album.ormar_config.signals.your_custom_signal.send(sender=Album)
```
Note that `sender` is the only required parameter, and it should be an ormar `Model` class.
Additional parameters have to be passed as keyword arguments.
```python
await Album.ormar_config.signals.your_custom_signal.send(sender=Album, my_param=True)
```