Installation • Features • Usage • Roadmap • Contributing
The modern Python 3.8+ out-of-the-box reactive cache solution
Installation is as simple as:

```bash
pip install dory-cache
```
View the current documentation here
Dory's configuration is quite simple: on project initialization, just call the `setup` function as follows:
```python
from dory.setup import setup

setup(
    host=REDIS_HOST,
    port=REDIS_PORT,
    user=REDIS_USER,
    password=REDIS_PASSWORD,
)
```
Bloats
Bloats respond to the need for a simpler approach to designing a reactive cache in your system. They make cache configuration and management easy.
For example, let's pretend we have a model called `Product`, which can be either serialized or edited. So, to improve `Product` serialization performance, we cache the `Product` serialization view (`GET /product/<product_id>`):
```python
from datetime import timedelta

from dory.cache import cache
from dory.utils import F


@api.get('/product/<product_id>')
@cache(key=F('product_id'), timeout=timedelta(hours=1))
def get_product(request, product_id):
    """
    Serialize a Product
    """
    ...
```
Now everything works faster and as expected, but we did not consider that, since the `Product` can be edited (`PUT /product/<product_id>`), we could have cached an outdated version of it. We need a way to force the cache to refresh itself, and this is where Bloats come in handy!

Instead of caching the view with a generic cache decorator, decouple the cache configuration into a `Bloat`:
```python
from datetime import timedelta

from dory.bloats import Bloat, Field


class Product(Bloat):
    """
    Product's bloat
    """
    product_id: int = Field(...)

    class Options:
        timeout: timedelta = timedelta(hours=1)
        enabled: bool = True
```
```python
from dory.bloats.utils import F, cache


@api.get('/product/<product_id>')
@cache(Product(product_id=F('product_id')))
def get_product(request, product_id):
    """
    Serialize a Product
    """
    ...
```
And now, when a `Product` is edited, you can force the view to refresh the cache using the `Bloat` as a middleman:
```python
from dory.bloats.utils import F, destroy


@api.put('/product/<product_id>')
@destroy(Product(product_id=F('product_id')))
def edit_product(request, product_id):
    """
    Edit a Product
    """
    ...
```
Now your cache will always be in sync and it'll be configured in a cleaner way! 🔥
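For illustration, here is a rough sketch of the request flow (the base URL, product id, and payload below are assumptions for the example, not part of Dory's API), showing how `@cache` and `@destroy` cooperate:

```python
import requests

BASE = "http://localhost:8000"  # assumption: your API running locally

requests.get(f"{BASE}/product/42")   # cache miss: the view runs and the result is stored under the Product bloat
requests.get(f"{BASE}/product/42")   # cache hit: the cached serialization is returned
requests.put(f"{BASE}/product/42", json={"name": "New name"})  # @destroy expires the Product bloat
requests.get(f"{BASE}/product/42")   # cache miss again: a fresh Product is serialized and re-cached
```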
Dory provides several cache utilities with an out-of-the-box interface. For example, a decorator to cache views comfortably:
```python
from datetime import timedelta

from dory.cache import cache
from dory.utils import F


@api.get('/foo/<foo_id>')
@cache(key=F('foo_id'), timeout=timedelta(hours=1))
def foo(request, foo_id):
    """
    Render a Foo
    """
    ...
```
Django signals allow Bloats to expire themselves whenever a `post_save` signal is sent from the Bloat's designated Django model.
```python
from dory import bloats

from .api import models


class Product(bloats.Bloat):
    """
    Product's bloat
    """
    ...

    class Meta:
        django_model = models.Product
```
```python
from dory.bloats.utils import F, cache


@api.get('/product/<product_id>')
@cache(Product(product_id=F('product_id')))
def get_product(request, product_id):
    """
    Serialize a Product
    """
    ...
```
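As a sketch of the effect (the `pk` and `name` field below are assumptions for the example), editing the model through the Django ORM sends `post_save`, and Dory expires the bloat for you:

```python
from .api import models

# Assumption: a Product row with pk=42 and a `name` field exists.
product = models.Product.objects.get(pk=42)
product.name = "New name"
product.save()  # post_save is sent -> the Product bloat expires, so the next GET re-caches fresh data
```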
- Bloats 🐡 (See Bloats)
- Cache utils (See Cache utils)
  - Cache decorator
  - Ratelimit
- Django signals (See Django signals)
- Bloats v2
  - The v2 of the Bloats will implement the `.set()` method, capable not only of deprecating the current `Bloat` version, but also of filling it again. The design is still a WIP.
- Support more cache engines
Suggestions and contributions are extremely welcome! ❤️
Just open an issue or a PR, and I'll respond as soon as I can.