ASP.NET Core: The connection pool has been exhausted

Dung Do Tien Apr 16 2021 380

I have a project using ASP.NET Core 3.1, Dapper and PostgreSQL 9.x.

I deploy it with Docker on Ubuntu 18.04, and I have some code that gets data from PostgreSQL, shown below:

public async Task<int> GetTotalProductByStatusSalonIdAsync(int salonId, int status)
{
    StringBuilder query = new StringBuilder();
    query.Append("SELECT count (productid) ");
    query.Append("FROM product ");
    query.AppendFormat("WHERE salonid > 0 AND salonid = {0} ", salonId);
    query.AppendFormat(" AND status & {0} = {0} ", status);
    query.AppendFormat("AND {0} > startdate ", DateTime.Now.Ticks);
    query.AppendFormat("AND {0} < enddate ", DateTime.Now.Ticks - StaticVariable.DateExtendToExpires);
    query.Append("AND ispublish = TRUE ");
    var result = await QuerySingleOrDefaultAsync<int>(query.ToString());
    return result;
}

I tested it locally on my computer and it works very well, but after I published it and many visitors (~100) started hitting my site, it throws many errors: The connection pool has been exhausted, either raise MaxPoolSize (currently 100) or Timeout (currently 15 seconds)

Npgsql.NpgsqlException (0x80004005): The connection pool has been exhausted, either raise MaxPoolSize (currently 100) or Timeout (currently 15 seconds)
   at Npgsql.ConnectorPool.<>c__DisplayClass38_0.<<Rent>g__RentAsync|0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at Npgsql.NpgsqlConnection.<>c__DisplayClass41_0.<<Open>g__OpenAsync|0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at Npgsql.NpgsqlConnection.Open()
   at CMS.QuizDev.Infrastructure.UnitOfWork.UnitOfWorkBase..ctor(IDbConnection dbConnection) in /src/Cores/DVG.Autoportal.Core/Infrastructure/UnitOfWork/UnitOfWorkBase.cs:line 18
   at ResolveService(ILEmitResolverBuilderRuntimeContext , ServiceProviderEngineScope )
   at ResolveService(ILEmitResolverBuilderRuntimeContext , ServiceProviderEngineScope )
   at ResolveService(ILEmitResolverBuilderRuntimeContext , ServiceProviderEngineScope )
   at Microsoft.Extensions.DependencyInjection.ActivatorUtilities.GetService(IServiceProvider sp, Type type, Type requiredBy, Boolean isDefaultParameterRequired)

I checked, and the max pool size of my database is 100.

Please tell me how to fix it.

2 answers found.

    Nguyen Truong Giang Apr 16 2021

    Maybe you have to increase the max pool size of your PostgreSQL to a larger value, such as 1000.

    To do that, you have to change it in the postgresql.conf file.

    1. Increase max_connections and shared_buffers

    in /var/lib/pgsql/{version_number}/data/postgresql.conf

    change

    max_connections = 100
    shared_buffers = 24MB

    to

    max_connections = 1000
    shared_buffers = 240MB

    The shared_buffers configuration parameter determines how much memory is dedicated to PostgreSQL to use for caching data.

    • If you have a system with 1GB or more of RAM, a reasonable starting value for shared_buffers is 1/4 of the memory in your system (see the example after this list).
    • It's unlikely that using more than 40% of RAM will work better than a smaller amount (like 25%).
    • Be aware that if your system or PostgreSQL build is 32-bit, it might not be practical to set shared_buffers above 2 ~ 2.5GB.
    • Note that on Windows, large values for shared_buffers aren't as effective, and you may find better results keeping it relatively low and using the OS cache more instead. On Windows, the useful range is 64MB to 512MB.
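
    For example, assuming a dedicated database server with 8GB of RAM, the 1/4 guideline above works out to roughly:

    shared_buffers = 2GB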

    2. Change kernel.shmmax

    You would need to increase the kernel's maximum shared memory segment size so it is slightly larger than shared_buffers.

    In the file /etc/sysctl.conf, set the parameter as shown below. It will take effect when the system reboots (the following line sets the kernel max segment size to 96MB):

    kernel.shmmax=100663296
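
    If you do not want to wait for a reboot, the change can usually be applied immediately by reloading the file (as root):

    sysctl -p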

    But I want you to know that increasing it may not resolve the issue for you if your traffic is large. You need to check all your code and make sure that you do not connect to the DB inside a loop.
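
    As an illustration of that last point, here is a rough sketch (the connection string, class and method names are placeholders for the example, not the asker's real code) of the difference between leaking a connection on every loop iteration and reusing one properly disposed connection:

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using Dapper;
    using Npgsql;

    public static class ProductCountExample
    {
        // Placeholder connection string. The exception message also suggests the client-side
        // pool limit can be raised here, e.g. "Maximum Pool Size=200" (the default is 100).
        private const string ConnString =
            "Host=localhost;Database=mydb;Username=app;Password=secret;Maximum Pool Size=100";

        // Anti-pattern: a new connection is created on every iteration and never disposed,
        // so it is never returned to the pool. After ~100 leaked connections the pool is exhausted.
        public static async Task<Dictionary<int, int>> CountPerSalonBadAsync(IEnumerable<int> salonIds)
        {
            var counts = new Dictionary<int, int>();
            foreach (var salonId in salonIds)
            {
                var conn = new NpgsqlConnection(ConnString);   // leaked: no using/Dispose
                await conn.OpenAsync();
                counts[salonId] = await conn.QuerySingleAsync<int>(
                    "SELECT count(productid) FROM product WHERE salonid = @salonId",
                    new { salonId });
            }
            return counts;
        }

        // Better: open one connection for the whole batch and dispose it when done,
        // so it goes back to the pool immediately.
        public static async Task<Dictionary<int, int>> CountPerSalonAsync(IEnumerable<int> salonIds)
        {
            var counts = new Dictionary<int, int>();
            using var conn = new NpgsqlConnection(ConnString);
            await conn.OpenAsync();
            foreach (var salonId in salonIds)
            {
                counts[salonId] = await conn.QuerySingleAsync<int>(
                    "SELECT count(productid) FROM product WHERE salonid = @salonId",
                    new { salonId });
            }
            return counts;
        }
    }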


    Raja T Apr 16 2021

    I'm not sure, but I got the same exception. In my case it was thrown because I had a lot of code connecting to the database inside a loop.

    With small traffic you will never see it, and it also does not impact your site.

    Please review all your source code. Maybe you have to use some caching to reduce connections to the DB, such as an in-memory cache, Redis, or Memcached.
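
    As a rough sketch of that idea (the cache key format, the 30-second lifetime and the IProductRepository abstraction are made up for the example), the count from the question could be wrapped with ASP.NET Core's IMemoryCache so most requests never touch PostgreSQL at all:

    using System;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Caching.Memory;

    public interface IProductRepository
    {
        // Hypothetical repository that runs the SQL from the question.
        Task<int> GetTotalProductByStatusSalonIdAsync(int salonId, int status);
    }

    public class CachedProductCountService
    {
        private readonly IMemoryCache _cache;            // registered with services.AddMemoryCache()
        private readonly IProductRepository _repository;

        public CachedProductCountService(IMemoryCache cache, IProductRepository repository)
        {
            _cache = cache;
            _repository = repository;
        }

        public Task<int> GetTotalProductByStatusSalonIdAsync(int salonId, int status)
        {
            // One cache entry per (salonId, status) pair; 30 seconds is an arbitrary example lifetime.
            var key = $"product-count:{salonId}:{status}";
            return _cache.GetOrCreateAsync(key, entry =>
            {
                entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30);
                return _repository.GetTotalProductByStatusSalonIdAsync(salonId, status);
            });
        }
    }

    The same shape also works with a distributed cache (Redis, Memcached) if you run more than one instance of the site.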
