
ConcurrentTLru leaks memory by retaining removed elements' values #663

@snaumenko-st

Description


In our application we use the async, atomic ConcurrentTLru implementation to cache very large (~128 MB) string values.
We found that these values remain referenced from inside the cache even after they are removed from it.
Investigation showed that after an item is removed from the dictionary, it still remains in the LRU queues until a full cycle is performed and it is evicted from the cold queue.
The cycling logic also does not check the item's WasRemoved field and continues to move it between the queues.
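
A minimal repro sketch of the retention we observed. It uses the plain ConcurrentTLru for brevity (we actually use the async/atomic builder variant, but the retention is the same) and assumes the (capacity, timeToLive) constructor; the WeakReference check is only there to make the leak visible:

```csharp
using System;
using BitFaster.Caching.Lru;

class RemovedValueRetentionRepro
{
    static void Main()
    {
        var cache = new ConcurrentTLru<int, string>(9, TimeSpan.FromMinutes(10));

        // A large value standing in for our ~128 MB strings.
        var value = new string('x', 10_000_000);
        var weak = new WeakReference(value);

        cache.GetOrAdd(1, _ => value);
        value = null;

        // The dictionary entry is removed here...
        cache.TryRemove(1);

        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        // ...yet the string is still reachable from the LRU queue item,
        // so this prints True until the item cycles out of the cold queue.
        Console.WriteLine($"Value still alive after TryRemove: {weak.IsAlive}");
    }
}
```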

My suggestions are:

  • Implement IDisposable on AsyncAtomicFactory and similar classes so that the internal value is disposed and set to default when the item is removed from the dictionary.
  • Change the cycling logic so that it respects the WasRemoved field and evicts the item from the LRU as soon as possible (see the sketch after this list).
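
A hypothetical sketch of the second point, assuming items are moved between queues one step at a time during cycling. LruItem, Cycle, and the queue parameters are illustrative names, not the library's actual internals:

```csharp
using System.Collections.Concurrent;

// Hypothetical stand-in for the internal LRU item, carrying the
// WasRemoved flag discussed above.
class LruItem<K, V>
{
    public K Key;
    public V Value;
    public volatile bool WasRemoved;
}

static class CyclingSketch
{
    // Illustrative cycling step: before moving an item from one queue to
    // the next, discard it immediately if it was already removed from the
    // dictionary, and drop its value so the GC can reclaim it right away.
    public static void Cycle<K, V>(
        ConcurrentQueue<LruItem<K, V>> source,
        ConcurrentQueue<LruItem<K, V>> destination)
    {
        if (!source.TryDequeue(out var item))
        {
            return;
        }

        if (item.WasRemoved)
        {
            // Evict ASAP: do not re-enqueue, and release the (potentially
            // very large) value instead of waiting for a full LRU cycle.
            item.Value = default;
            return;
        }

        destination.Enqueue(item);
    }
}
```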


    Labels

    bug (Something isn't working)
