Hi, I’m running 38 ETL projects on an 11-node Spark cluster, and the FATHMM step is failing with “PSQLException: FATAL: sorry, too many clients already”. I’ve raised Postgres max_connections to 200, but I still get the error.
What max_connections value do you use for Postgres?
Here’s what I think the relevant ETL settings are:
spark.default.parallelism: 33 # number of partitions
SPARK_WORKER_CORES=3
SPARK_WORKER_INSTANCES=1
Postgres setting:
max_connections = 200
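For what it’s worth, here’s my back-of-the-envelope math, assuming each concurrent Spark task holds one JDBC connection and all 38 projects can overlap (both assumptions on my part, I haven’t verified how the FATHMM step pools connections):

```python
# Rough estimate of peak Postgres connections under the settings above.
nodes = 11
worker_instances = 1   # SPARK_WORKER_INSTANCES
cores_per_worker = 3   # SPARK_WORKER_CORES
projects = 38

# Max tasks running at once across the cluster (matches
# spark.default.parallelism = 33).
concurrent_tasks = nodes * worker_instances * cores_per_worker

# Worst case: every project's tasks hold a connection simultaneously.
peak_connections = concurrent_tasks * projects

print(concurrent_tasks)   # 33
print(peak_connections)   # 1254
```

If that worst case is anywhere near reality, 1254 potential connections would blow well past max_connections = 200, which would explain the error regardless of how high I bump the limit within reason.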
Thanks