We are experiencing an issue where our API becomes unresponsive, and upon investigating the logs, we found that the application is encountering an OutOfMemoryException. This issue appears to be related to Redis caching, specifically during the execution of KeyExpireAsync and other cache-related operations.
Below are two log samples, captured on different days, showing the OutOfMemoryException (a sketch of the application-side call path that produces these traces follows the second sample):
[21:16:29 WRN] Exception of type 'System.OutOfMemoryException' was thrown.
System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
at StackExchange.Redis.PhysicalBridge.HandleWriteException(Message message, Exception ex)
at StackExchange.Redis.PhysicalBridge.WriteMessageTakingWriteLockAsync(PhysicalConnection physical, Message message, Boolean bypassBacklog) in /_/src/StackExchange.Redis/PhysicalBridge.cs:line 1164
at StackExchange.Redis.PhysicalBridge.TryWriteAsync(Message message, Boolean isReplica, Boolean bypassBacklog) in /_/src/StackExchange.Redis/PhysicalBridge.cs:line 225
at StackExchange.Redis.ConnectionMultiplexer.ExecuteAsyncImpl[T](Message message, ResultProcessor`1 processor, Object state, ServerEndPoint server) in /_/src/StackExchange.Redis/ConnectionMultiplexer.cs:line 2167
at StackExchange.Redis.RedisBase.ExecuteAsync[T](Message message, ResultProcessor`1 processor, ServerEndPoint server) in /_/src/StackExchange.Redis/RedisBase.cs:line 54
at StackExchange.Redis.RedisDatabase.KeyExpireAsync(RedisKey key, Nullable`1 expiry, ExpireWhen when, CommandFlags flags) in /_/src/StackExchange.Redis/RedisDatabase.cs:line 859
at StackExchange.Redis.RedisDatabase.KeyExpireAsync(RedisKey key, Nullable`1 expiry, CommandFlags flags) in /_/src/StackExchange.Redis/RedisDatabase.cs:line 851
at Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache.RefreshAsync(IDatabase cache, String key, Nullable`1 absExpr, Nullable`1 sldExpr, CancellationToken token)
at Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache.GetAndRefreshAsync(String key, Boolean getData, CancellationToken token)
at Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache.GetAsync(String key, CancellationToken token)
at Volo.Abp.Caching.DistributedCache`2.GetAsync(TCacheKey key, Nullable`1 hideErrors, Boolean considerUow, CancellationToken token)
[20:24:23 WRN] Exception of type 'System.OutOfMemoryException' was thrown.
System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
at StackExchange.Redis.PhysicalBridge.HandleWriteException(Message message, Exception ex)
at StackExchange.Redis.PhysicalBridge.WriteMessageTakingWriteLockAsync(PhysicalConnection physical, Message message, Boolean bypassBacklog)
at StackExchange.Redis.PhysicalBridge.TryWriteAsync(Message message, Boolean isReplica, Boolean bypassBacklog)
at StackExchange.Redis.ConnectionMultiplexer.TryPushMessageToBridgeAsync[T](Message message, ResultProcessor`1 processor, IResultBox`1 resultBox, ServerEndPoint& server)
at StackExchange.Redis.ConnectionMultiplexer.ExecuteAsyncImpl[T](Message message, ResultProcessor`1 processor, Object state, ServerEndPoint server, T defaultValue)
at StackExchange.Redis.RedisBase.ExecuteAsync[T](Message message, ResultProcessor`1 processor, T defaultValue, ServerEndPoint server)
at StackExchange.Redis.RedisDatabase.HashGetAsync(RedisKey key, RedisValue[] hashFields, CommandFlags flags)
at Microsoft.Extensions.Caching.StackExchangeRedis.RedisExtensions.HashMemberGetAsync(IDatabase cache, String key, String[] members)
at Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache.GetAndRefreshAsync(String key, Boolean getData, CancellationToken token)
at Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache.GetAsync(String key, CancellationToken token)
at Volo.Abp.Caching.DistributedCache`2.GetAsync(TCacheKey key, Nullable`1 hideErrors, Boolean considerUow, CancellationToken token)
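Both traces end in Volo.Abp.Caching.DistributedCache`2.GetAsync, i.e. an ordinary distributed-cache read through the ABP caching abstraction. A minimal sketch of the kind of call that produces this path (the cache item type and key are hypothetical):

using System.Threading.Tasks;
using Volo.Abp.Caching;
using Volo.Abp.DependencyInjection;

// Hypothetical cache item type; any POCO used with IDistributedCache<T>
// takes the same path through RedisCache.GetAsync.
public class UserProfileCacheItem
{
    public string DisplayName { get; set; }
}

public class UserProfileAppService : ITransientDependency
{
    private readonly IDistributedCache<UserProfileCacheItem> _cache;

    public UserProfileAppService(IDistributedCache<UserProfileCacheItem> cache)
    {
        _cache = cache;
    }

    public async Task<UserProfileCacheItem> GetAsync(string userId)
    {
        // GetAsync goes through Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache,
        // which issues HMGET (HashGetAsync) and, when a sliding expiration is set,
        // EXPIRE (KeyExpireAsync) - the two Redis commands visible in the traces above.
        return await _cache.GetAsync(userId);
    }
}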
Additional Observations from Redis Server Info:
We captured Redis server statistics during an API outage, and we noticed the following:
- Memory usage remains low (~100MB) despite having 2.4GB available.
- Total connections received is extremely high (1,842,338 connections).
- Expired keys count is significant (75,042 keys).
- No out-of-memory messages detected from Redis itself (total_oom_messages: 0).
- The maxmemory policy is set to allkeys-lru, but there are no evicted keys.
This suggests that memory is not the bottleneck but possibly an excessive number of key expiration events or a high rate of new connections.
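For reference, the same counters can be pulled programmatically with StackExchange.Redis (already pulled in transitively by the caching package); a minimal sketch, with connection-string handling left as a placeholder:

using System;
using System.Linq;
using System.Threading.Tasks;
using StackExchange.Redis;

public static class RedisStatsProbe
{
    private static readonly string[] WantedFields =
    {
        "used_memory_human", "total_connections_received",
        "expired_keys", "evicted_keys", "maxmemory_policy"
    };

    public static async Task DumpAsync(string connectionString)
    {
        using var muxer = await ConnectionMultiplexer.ConnectAsync(connectionString);

        // INFO is a server-level command, so it is issued against a specific endpoint.
        var server = muxer.GetServer(muxer.GetEndPoints().First());
        var sections = await server.InfoAsync();

        // InfoAsync returns the INFO output grouped by section; flatten it and print
        // only the fields inspected during the outage.
        foreach (var field in sections.SelectMany(s => s).Where(f => WantedFields.Contains(f.Key)))
        {
            Console.WriteLine($"{field.Key}: {field.Value}");
        }
    }
}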
Issue Details:
- The API stops responding when this error occurs.
- The error originates from Redis cache operations, likely causing memory exhaustion at the application level (see the sketch after this list).
- We are using the ABP framework’s caching system with Redis (Volo.Abp.Caching.DistributedCache).
- The issue is intermittent.
- The Redis server itself does not appear to be running out of memory or rejecting connections.
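Since the exception is thrown inside the application process while Redis itself looks healthy, one way to confirm application-level memory pressure is to log the process's own figures whenever the warning fires; a minimal sketch (where it gets called is up to the hosting code):

using System;
using System.Diagnostics;

public static class ProcessMemoryProbe
{
    public static string Snapshot()
    {
        var process = Process.GetCurrentProcess();
        var gcInfo = GC.GetGCMemoryInfo();

        // A 32-bit worker process is limited to a few GB of virtual address space,
        // so private bytes near that ceiling can produce an OutOfMemoryException
        // even though the Redis server reports low memory usage.
        return $"working set: {process.WorkingSet64 / (1024 * 1024)} MB, " +
               $"private bytes: {process.PrivateMemorySize64 / (1024 * 1024)} MB, " +
               $"managed heap: {gcInfo.HeapSizeBytes / (1024 * 1024)} MB";
    }
}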
Steps Taken:
- Reviewed our Redis cache configuration and did not identify obvious misconfigurations.
- Monitored memory usage and noticed it remains low (~100MB).
- Restarting the API temporarily resolves the issue, but it reoccurs after some time.
Additional Info:
- ABP Framework version: v7.4.2
- UI Type: Angular
- Database System: SQL Server
- Tiered (for MVC) or Auth Server Separated (for Angular): Auth Server Separated
- We are using the Azure cloud to host our services (App Services, Azure Cache for Redis, etc.)
4 Answer(s)
Hi,
The logs show that the problem starts with write operations at the physical link layer (PhysicalBridge). ABP uses the Microsoft.Extensions.Caching.StackExchangeRedis package and we don't have any custom code there. Honestly, I think the problem may have a different cause; for example, a user reported a similar problem on Stack Overflow. In your case, where are you deploying the application, and is this error only happening in production?
Reference: https://stackoverflow.com/a/72729503/9922629
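For reference, the wiring underneath is the standard registration from that package; a minimal sketch, with the configuration values as placeholders:

using Microsoft.Extensions.DependencyInjection;

public static class RedisCacheSetup
{
    public static void AddRedisDistributedCache(IServiceCollection services, string redisConfiguration)
    {
        // Registers RedisCache as the IDistributedCache implementation;
        // every DistributedCache<T>.GetAsync call in the traces above goes through it.
        services.AddStackExchangeRedisCache(options =>
        {
            options.Configuration = redisConfiguration; // e.g. the "Redis:Configuration" value
            options.InstanceName = "MyApp:";            // hypothetical key prefix
        });
    }
}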
Hi, thanks for the response. We have our platform hosted on Azure App Services. I just reviewed the Configuration blade of my App Service and, interestingly, the Stack setting was empty and the Platform was set to 32 Bit. I modified these values in my development environment and everything is up and running.
I will do maintenance on my app tonight, configure the .NET Stack and 64 Bit Platform, and monitor how it behaves tomorrow with these changes.
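One quick way to verify that the new Platform setting actually takes effect after the deployment is to log the process bitness at startup; a minimal sketch:

using System;

public static class PlatformCheck
{
    public static void LogBitness()
    {
        // With Platform = 32 Bit on the App Service this prints False;
        // after switching the Platform setting to 64 Bit it should print True.
        Console.WriteLine($"Is64BitProcess: {Environment.Is64BitProcess}");
        Console.WriteLine($"Is64BitOperatingSystem: {Environment.Is64BitOperatingSystem}");
    }
}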
Hi,
Great, keep us updated.
Hi, in fact the issue seemed to be related to the Platform setting we were using on the Azure App Service. We had it set to 32 Bit and updated it to 64 Bit, and now the issue is resolved. Thanks for your assistance with this issue.