
Client Side Caching with Otter #490

Open · rueian opened this issue Mar 6, 2024 · 9 comments
Labels: enhancement (New feature or request)

rueian (Collaborator) commented Mar 6, 2024

Otter is a high-performance lockless cache that uses proactive TTL expiration and the S3-FIFO eviction policy.

Locklessness is an important property for Redis server-assisted client-side caching, because invalidation messages from Redis should be applied as quickly as possible. Applying invalidation messages should be the first priority whenever possible.

In the current implementation, however, the pipelining goroutine handling invalidation messages competes with other reader goroutines to acquire the LRU lock. This delays invalidations and can further block the pipeline. Using Otter could solve this, as the sketch below illustrates.
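A minimal sketch of the contention (the types and names here are illustrative, not rueidis's actual internals):

```go
package cache

import "sync"

// lockedLRU stands in for the current mutex-guarded LRU cache.
type lockedLRU struct {
	mu      sync.Mutex
	entries map[string]string // plus the usual LRU bookkeeping
}

// Get is called concurrently by many reader goroutines.
func (c *lockedLRU) Get(key string) (string, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	v, ok := c.entries[key]
	return v, ok
}

// Delete is called by the single pipelining goroutine whenever a Redis
// invalidation message arrives. It must win the same lock that all the
// readers contend on, so invalidations are delayed, and while the
// pipelining goroutine waits here, everything behind it in the pipeline
// is blocked too. A lockless cache removes this serialization point.
func (c *lockedLRU) Delete(key string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	delete(c.entries, key)
}
```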

@rueian rueian added the enhancement New feature or request label Mar 6, 2024
@rueian rueian changed the title Client Side Caching with otter Client Side Caching with Otter Mar 6, 2024
ouvaa commented Mar 21, 2024

@rueian could you provide an option to use https://github.com/phuslu/lru?
Otter is slow on Set but fast on Get, while phuslu/lru is more balanced and uses less memory.

In other words: for write-intensive cache workloads phuslu/lru is the better choice, and for read-intensive ones Otter is. (A rough way to measure this on your own workload is sketched below.)
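One way to check this on your own workload mix is a parallel benchmark over a common interface; the `Cache` interface and the adapters you would write for otter and phuslu/lru are assumptions here, not either library's real API:

```go
package cache_test

import (
	"strconv"
	"testing"
)

// Cache is a minimal common interface; wrap otter and phuslu/lru
// behind it with small adapter types.
type Cache interface {
	Set(key, value string)
	Get(key string) (string, bool)
}

// benchmarkMix runs a mixed workload where one in every writeEvery
// operations is a Set and the rest are Gets.
func benchmarkMix(b *testing.B, c Cache, writeEvery int) {
	b.RunParallel(func(pb *testing.PB) {
		i := 0
		for pb.Next() {
			key := strconv.Itoa(i % 1024)
			if i%writeEvery == 0 {
				c.Set(key, "value")
			} else {
				c.Get(key)
			}
			i++
		}
	})
}

// e.g. benchmarkMix(b, otterAdapter, 2) for a write-heavy mix and
// benchmarkMix(b, otterAdapter, 100) for a read-heavy one.
```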

1a1a11a commented Mar 21, 2024

@ouvaa Do you have insights on why phuslu is better at writes?

rueian (Collaborator, Author) commented Mar 22, 2024

Hi @ouvaa, rueidis is read-intensive, but writes should be prioritized.

sshankar commented Oct 9, 2024

@rueian Do you have a work-in-progress branch with otter that we can test with?

rueian (Collaborator, Author) commented Oct 22, 2024

Hi @sshankar, sorry for my late reply.

Unfortunately, my progress on this is currently paused. I haven't finished adding a singleflight loading mechanism to otter. That is a key feature that otter currently lacks, and if we build the mechanism outside of otter, we will pay a high overhead for it.
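For reference, an external layer would look roughly like this (names are illustrative; golang.org/x/sync/singleflight is one way to build it). The overhead is that every cache miss pays for the Group's own mutex and map on top of the cache's synchronization:

```go
package cache

import "golang.org/x/sync/singleflight"

type Cache interface {
	Set(key, value string)
	Get(key string) (string, bool)
}

// loadingCache bolts singleflight onto a cache from the outside.
type loadingCache struct {
	cache Cache // the underlying cache, e.g. otter
	group singleflight.Group
}

// GetOrLoad returns the cached value or loads it, collapsing
// concurrent loads of the same key into a single call.
func (c *loadingCache) GetOrLoad(key string, load func() (string, error)) (string, error) {
	if v, ok := c.cache.Get(key); ok {
		return v, nil
	}
	// This Do call is the extra cost: a second lock acquisition and a
	// map lookup inside the Group on every cache miss.
	v, err, _ := c.group.Do(key, func() (interface{}, error) {
		value, err := load()
		if err == nil {
			c.cache.Set(key, value)
		}
		return value, err
	})
	if err != nil {
		return "", err
	}
	return v.(string), nil
}
```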

What makes you interested in otter?

04116 commented Jan 20, 2025

Hi @rueian, have you made any more progress on this one?
I just ran into the problem that when I write/update the cache heavily, profiling shows more contention in the LRU (the default cache).

rueian (Collaborator, Author) commented Jan 20, 2025

Hi @04116, as mentioned previously, progress on this is currently paused. In the meantime, I am working on a flattened cache implementation in #712 to get more accurate cache size estimation and lower GC overhead.
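To be clear, the code in #712 is more involved; the general flattening idea is just to keep values as raw bytes in one contiguous arena, so the GC has far fewer pointers to scan and the cache's byte size can be measured exactly. A toy sketch of that idea:

```go
package cache

// flatCache is a toy illustration only: values are packed back to back
// in a single []byte arena, so the GC sees one slice instead of one
// allocation per entry, and the total size in bytes is exact.
// (A real implementation would also compact or reuse freed space.)
type flatCache struct {
	arena   []byte
	offsets map[string][2]int // key -> [start, end) into arena
}

func newFlatCache() *flatCache {
	return &flatCache{offsets: make(map[string][2]int)}
}

func (c *flatCache) Set(key string, value []byte) {
	start := len(c.arena)
	c.arena = append(c.arena, value...)
	c.offsets[key] = [2]int{start, len(c.arena)}
}

func (c *flatCache) Get(key string) ([]byte, bool) {
	off, ok := c.offsets[key]
	if !ok {
		return nil, false
	}
	return c.arena[off[0]:off[1]], true
}

// SizeBytes is exact, unlike estimates based on entry counts.
func (c *flatCache) SizeBytes() int { return len(c.arena) }
```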

Would you mind sharing your profiling results and how you use the cache, so we can better understand your situation?
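If you suspect lock contention specifically, Go's mutex profile is more direct than the CPU profile; enabling it looks like this (standard runtime and net/http/pprof APIs):

```go
package main

import (
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof handlers
	"runtime"
)

func main() {
	// Sample roughly one in every five mutex contention events.
	runtime.SetMutexProfileFraction(5)
	go http.ListenAndServe("localhost:6060", nil)
	// ... application code ...
}
```

Then `go tool pprof http://localhost:6060/debug/pprof/mutex` shows which locks your goroutines wait on.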

ahapeter commented

Hi @rueian

I'm not sure it's actually the problem; I just saw these at the top of the profile:
[two profiling screenshots attached]

rueian (Collaborator, Author) commented Jan 21, 2025

Hi @ahapeter, thanks for the profile. I can't pinpoint the problem from it, but it looks like something related to GC overhead is involved. I think it may work better with the new flattened cache implementation in progress in #712.
