Quarkus: API for application data caching

Created on 25 Mar 2020 · 13 comments · Source: quarkusio/quarkus

Offer a programmatic API for application data caching, with features above and beyond the annotation-based ones.
Follow up of https://github.com/quarkusio/quarkus/issues/3306

CC @karesti and @gwenneg

Labels: area/cache, kind/enhancement

All 13 comments

One question:
Will Quarkus use an already established caching provider (e.g. Infinispan) for its API? I think Quarkus could indeed use, for example, "Infinispan-embedded" and provide its caching services (e.g. annotations and beyond) around this core.

Right now we only plan to use the existing cache implementation, which uses Caffeine.
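To make the discussion concrete, here is a minimal sketch of the get-or-compute semantics a Caffeine-style local cache provides. This is illustrative only: the class and method names are invented for this example and are not Quarkus or Caffeine API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative sketch of get-or-compute caching semantics.
// Names here are hypothetical, not part of any Quarkus API.
class GetOrComputeCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();

    // Returns the cached value for key, computing and storing it on a miss.
    V get(K key, Function<K, V> valueLoader) {
        return store.computeIfAbsent(key, valueLoader);
    }

    // Removes a single entry so it will be recomputed on the next lookup.
    void invalidate(K key) {
        store.remove(key);
    }
}
```

A real implementation would add eviction (size bounds, expiration), which is exactly what Caffeine brings on top of this basic map-based behavior.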

If we go and support alternative providers like Infinispan or Hazelcast, we need to be very careful as these involve remote calls and the API must not assume locality cc @Sanne

In my opinion, the _Quarkus Caching Layer_ should only use one cache provider internally. There is no need to support different caching providers (just as there is no need to support different JPA providers).

Quarkus should stick with only one good caching provider and build on that. I think caching providers like Infinispan are better suited for the _Quarkus Caching Layer_ than Caffeine alone, because I would like to see the following features in the _Quarkus Caching Layer_:

  • off-heap-cache
  • hot and cold cache
  • cached data should not be limited to RAM but should extend to storage (for inspiration see https://github.com/OpenHFT/Chronicle-Map or https://github.com/jankotek/mapdb or https://microstream.one/)
  • annotations for domain layer to enable and control the caching (maybe the JPA-annotations can be used and interpreted)
  • different transaction isolation levels with support for strong cache consistency (SERIALIZABLE > REPEATABLE_READ > READ_COMMITTED > READ_UNCOMMITTED). These isolation levels could be realized via programmatic JTA or @Transactional

as these involve remote calls and the API must not assume locality

Caching providers used in _embedded mode_ (e.g. infinispan-embedded) run locally and do not involve remote calls.

Sure: Infinispan is built on Caffeine, providing such additional capabilities. We'll likely base such capabilities on the Infinispan code base, either directly or by reimplementing bits and pieces with the help of the Infinispan team, as best suited.

as these involve remote calls and the API must not assume locality

Caching providers used in _embedded mode_ (e.g. infinispan-embedded) run locally and do not involve remote calls.

I know, but there is likely little value in swapping implementations in that case compared to the extra complexity of explaining how to configure things depending on the provider (more complex docs, for example).

One thing I miss is an enabled flag in the properties to enable or disable the extension. I have quite a long list of classes and methods annotated and would like to easily enable/disable the extension, for instance, to compare performance. Unless I missed something in the docs, today this is not possible and would require removing all annotations from my code.

You are absolutely right @seseso, there is currently no way to disable the cache extension using a property, but I could easily take care of that missing flag. It looks like a build time property (meaning the flag could not be changed at runtime) would be enough, what do you think about that?
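If such a flag were added, usage could look like the build-time properties other extensions expose. The exact property name below is an assumption for illustration, not a committed name:

```properties
# Hypothetical build-time flag: fixed at build time, not changeable at runtime
quarkus.cache.enabled=false
```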

It would solve my use case for sure. And it would bring it closer to other extensions that provide a similar flag!


I created #11138 and #11139 to address your request @seseso.

Should this be closed as done? https://quarkus.io/guides/cache

Should this be closed as done?

@agentgonzo I don't think so, this issue is about adding a "programmatic API", i.e. to be able to inject a Cache object representation and obtain a cached value for a given key.
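To sketch what such a programmatic API might look like: an injectable, named Cache abstraction whose lookups return a future, so the API does not assume locality (as raised earlier in this thread). All names here (Cache, getValue, InMemoryCache) are hypothetical assumptions for illustration, not a committed Quarkus design.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Hypothetical shape of a programmatic cache API: obtain a cached value
// for a given key, computing it on a miss. Not an actual Quarkus interface.
interface Cache<K, V> {
    // Future-based return keeps the contract compatible with remote
    // providers, which complete the lookup asynchronously.
    CompletableFuture<V> getValue(K key, Function<K, V> valueLoader);

    void invalidate(K key);
}

// A trivial local implementation backing the interface with a map.
class InMemoryCache<K, V> implements Cache<K, V> {
    private final ConcurrentHashMap<K, V> store = new ConcurrentHashMap<>();

    @Override
    public CompletableFuture<V> getValue(K key, Function<K, V> valueLoader) {
        // A local provider can complete immediately; a remote one would not.
        return CompletableFuture.completedFuture(
                store.computeIfAbsent(key, valueLoader));
    }

    @Override
    public void invalidate(K key) {
        store.remove(key);
    }
}
```

In a Quarkus application the Cache instance would presumably be injected by name rather than constructed directly; the sketch only shows the lookup contract.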

@mkouba is right, this is still an ongoing subject.
