Performance considerations
The type of the cache key, and its associated Equals and GetHashCode methods, can influence cache lookup speed. For example, if the cache key is a string, prefer ordinal to culture-sensitive string comparison. The HashCode.Combine method used by autogenerated equality methods can be slower than handwriting the equivalent logic.
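As an illustration, here is a minimal sketch of a handwritten key type. The CacheKey struct and its fields are hypothetical; the point is that ordinal string comparison and a hand-rolled hash mix avoid the per-lookup cost of culture-sensitive comparison and of HashCode.Combine.

```csharp
using System;

// Illustrative only: CacheKey is a hypothetical key type, not part of the library.
public readonly struct CacheKey : IEquatable<CacheKey>
{
    public readonly int TenantId;
    public readonly string Name;

    public CacheKey(int tenantId, string name)
    {
        TenantId = tenantId;
        Name = name;
    }

    // Ordinal comparison avoids the cost of culture-sensitive string comparison.
    public bool Equals(CacheKey other) =>
        TenantId == other.TenantId && string.Equals(Name, other.Name, StringComparison.Ordinal);

    public override bool Equals(object? obj) => obj is CacheKey other && Equals(other);

    // Handwritten mix; can be faster than HashCode.Combine(TenantId, Name).
    public override int GetHashCode() =>
        unchecked((TenantId * 397) ^ StringComparer.Ordinal.GetHashCode(Name));
}
```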
All cache features cost performance. This library is designed so that disabling a feature eliminates its cost, so use only the cache features you require via the cache builder methods. Time-based expiry, atomic and scoped values each incur a slight lookup latency penalty, and the LRU hit counting logic slightly reduces concurrent throughput.
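As a sketch of opting in to only the features you need, the example below assumes the ConcurrentLruBuilder surface (the builder type, namespace and method names here are assumptions; check the builder documentation for the exact API):

```csharp
using System;
using BitFaster.Caching.Lru;

// Minimal cache: no expiry, no atomic factories, no metrics, so none of those costs are paid.
var simple = new ConcurrentLruBuilder<string, byte[]>()
    .WithCapacity(1024)
    .Build();

// Opt in to features only where they are required; each adds a small cost.
var expiring = new ConcurrentLruBuilder<string, byte[]>()
    .WithCapacity(1024)
    .WithAtomicGetOrAdd()                          // atomic value creation
    .WithExpireAfterWrite(TimeSpan.FromMinutes(5)) // time-based expiry
    .Build();
```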
When an event handler is registered, each raised event incurs a heap allocation for the event args.
When implementing a custom time-based expiry policy, IExpiryCalculator methods can be called at very high frequency. To get the lowest latency and highest throughput, avoid heap allocations and minimize computation.
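For example, here is a minimal sketch of an allocation-free calculator. The IExpiryCalculator method signatures, the Duration type and its FromTimeSpan factory are assumptions about the interface shape, and the CachedItem value type is hypothetical.

```csharp
using System;
using BitFaster.Caching;

// A hypothetical value type carrying its own time-to-live.
public readonly struct CachedItem
{
    public readonly TimeSpan TimeToLive;
    public CachedItem(TimeSpan ttl) => TimeToLive = ttl;
}

// Stateless expiry policy: no per-call heap allocations and minimal computation,
// since these methods can be called at very high frequency.
public sealed class TtlExpiryCalculator : IExpiryCalculator<string, CachedItem>
{
    public Duration GetExpireAfterCreate(string key, CachedItem value)
        => Duration.FromTimeSpan(value.TimeToLive);

    // Sliding behavior on read: extend by the item's TTL without allocating.
    public Duration GetExpireAfterRead(string key, CachedItem value, Duration current)
        => Duration.FromTimeSpan(value.TimeToLive);

    public Duration GetExpireAfterUpdate(string key, CachedItem value, Duration current)
        => Duration.FromTimeSpan(value.TimeToLive);
}
```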