Performance considerations
Always measure performance in the context of your application; the tips here are intended to cover the basics.
The type of the cache key, and its associated Equals and GetHashCode methods, can influence cache lookup speed. For example, if the cache key is a string, prefer ordinal to culture-sensitive string comparison. The HashCode.Combine method used when auto-generating equality methods can be slower than handwriting the equivalent logic.
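As a rough sketch, a hypothetical composite key with handwritten ordinal equality might look like the following (the key shape and hash constant are illustrative assumptions; benchmark against HashCode.Combine for your own types):

using System;

public readonly struct CacheKey : IEquatable<CacheKey>
{
    public readonly int TenantId;
    public readonly string Name;

    public CacheKey(int tenantId, string name)
    {
        TenantId = tenantId;
        Name = name;
    }

    // Ordinal string comparison avoids culture-sensitive lookups.
    public bool Equals(CacheKey other) =>
        TenantId == other.TenantId && string.Equals(Name, other.Name, StringComparison.Ordinal);

    public override bool Equals(object obj) => obj is CacheKey other && Equals(other);

    // Handwritten combine; compare against HashCode.Combine(TenantId, Name) in a benchmark.
    public override int GetHashCode() =>
        unchecked((TenantId * 397) ^ StringComparer.Ordinal.GetHashCode(Name));
}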
All cache features have a performance cost. This library is designed so that disabling a feature eliminates its cost, so enable only the features you need via the cache builder methods. Time-based expiry, atomic values and scoped values each add a slight lookup latency penalty. LRU metrics slightly reduce concurrent throughput.
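For example, a minimal sketch using ConcurrentLruBuilder (the namespace and builder method names reflect my reading of the builder API; verify against the builder documentation):

using System;
using BitFaster.Caching.Lru;

// A bare LRU: no expiry, metrics, events, atomic or scoped values,
// so none of those features add lookup cost.
var cache = new ConcurrentLruBuilder<string, int>()
    .WithCapacity(1024)
    .Build();

// Each feature is opted into explicitly and only then pays its cost.
var cacheWithFeatures = new ConcurrentLruBuilder<string, int>()
    .WithCapacity(1024)
    .WithMetrics()
    .WithExpireAfterWrite(TimeSpan.FromMinutes(5))
    .Build();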
Events incur an event args heap allocation, but only when an event handler is registered.
IExpiryCalculator methods can be called at very high frequency. To get the lowest latency and highest throughput, avoid heap allocations and minimize computation within the GetExpireAfter* methods.
An efficient expire-after-write calculator is shown below. Since the expiry time is fixed, it can be calculated once up front at initialization instead of on every create/read/update call, avoiding floating-point multiplication on the hot path.
using BitFaster.Caching;

public class ExpireAfterWrite : IExpiryCalculator<string, int>
{
    // Fixed expiry computed once at construction, not on every call.
    private readonly Duration timeToExpire = Duration.FromMinutes(5);

    public Duration GetExpireAfterCreate(string key, int value) => timeToExpire;

    // Reads do not extend lifetime; keep the remaining time unchanged.
    public Duration GetExpireAfterRead(string key, int value, Duration current) => current;

    public Duration GetExpireAfterUpdate(string key, int value, Duration current) => timeToExpire;
}
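The calculator is then supplied when the cache is constructed; a minimal sketch assuming the builder's WithExpireAfter method (verify the exact registration API against the expiry documentation):

var expiringCache = new ConcurrentLruBuilder<string, int>()
    .WithCapacity(1024)
    .WithExpireAfter(new ExpireAfterWrite())
    .Build();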