ECONNREFUSED for Postgres on Node.js with Docker

Tags: node.js, PostgreSQL, Docker, Sequelize.js

node.js Problem Overview


I'm building an app running on Node.js with PostgreSQL, using Sequelize as the ORM. To avoid installing a real Postgres daemon and Node.js directly on my own machine, I'm using containers with docker-compose.

When I run docker-compose up, it starts the Postgres database

database system is ready to accept connections

and the Node.js server, but the server can't connect to the database:

Error: connect ECONNREFUSED 127.0.0.1:5432

If I run the server without containers (with Node.js and Postgres installed directly on my machine), it works.

But I want it to work correctly with containers, and I don't understand what I'm doing wrong.

Here is the docker-compose.yml file:

web:
  image: node
  command: npm start
  ports:
    - "8000:4242"
  links:
    - db
  working_dir: /src
  environment:
    SEQ_DB: mydatabase
    SEQ_USER: username
    SEQ_PW: pgpassword
    PORT: 4242
    DATABASE_URL: postgres://username:[email protected]:5432/mydatabase
  volumes:
    - ./:/src
db:
  image: postgres
  ports:
  - "5432:5432"
  environment:
    POSTGRES_USER: username
    POSTGRES_PASSWORD: pgpassword

Could someone help me please?

(someone who likes docker :) )

node.js Solutions


Solution 1 - node.js

Your DATABASE_URL refers to 127.0.0.1, which is the loopback adapter. This means "connect to myself".

When running both applications (without using Docker) on the same host, they are both addressable on the same adapter (also known as localhost).

When running both applications in containers, they are no longer both on localhost. Instead you need to point the web container at the db container's IP address on the Docker network, which docker-compose assigns for you.

Change:

127.0.0.1 to the linked service's name (here, db)

Example:

DATABASE_URL: postgres://username:[email protected]:5432/mydatabase

to

DATABASE_URL: postgres://username:pgpassword@db:5432/mydatabase

This works thanks to Docker links: the web container has a file (/etc/hosts) with a db entry pointing to the IP that the db container is on. This is the first place a system (in this case, the container) will look when trying to resolve hostnames.

Solution 2 - node.js

For future readers: if you're using Docker Desktop for Mac, use host.docker.internal instead of localhost or 127.0.0.1, as suggested in the documentation. I ran into the same connection refused problem: the backend api service couldn't connect to Postgres via localhost/127.0.0.1. Below are my docker-compose.yml and environment variables for reference:

version: "2"

services:
  api:
    container_name: "be"
    image: <image_name>:latest
    ports:
      - "8000:8000"
    environment:
      DB_HOST: host.docker.internal
      DB_USER: <your_user>
      DB_PASS: <your_pass>
    networks: 
      - mynw
    
  db:
    container_name: "psql"
    image: postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: <your_postgres_db_name>
      POSTGRES_USER: <your_postgres_user>
      POSTGRES_PASSWORD: <your_postgres_pass>
    volumes:
      - ~/dbdata:/var/lib/postgresql/data
    networks:
      - mynw

Solution 3 - node.js

If you pass the database variables separately instead of a single URL, you can set the database host explicitly:

DB_HOST=<POSTGRES_SERVICE_NAME> #in your case "db" from docker-compose file.
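As an example, here is a minimal sketch of assembling a pg Pool config from such separate variables; the function name buildPgConfig and the default service name "db" are illustrative assumptions, not from any library:

```javascript
// Hypothetical helper: build a pg Pool config from environment
// variables, defaulting the host to the compose service name "db"
// and the port to Postgres's default 5432.
function buildPgConfig(env) {
  return {
    host: env.DB_HOST || 'db',        // the service name from docker-compose.yml
    port: Number(env.DB_PORT || 5432),
    user: env.DB_USER,
    password: env.DB_PASS,
    database: env.DB_NAME,
  };
}

// Typical usage: new Pool(buildPgConfig(process.env))
```

The point is simply that the host must come from configuration, so the same code works both inside and outside a compose network.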

Solution 4 - node.js

I had two containers, one called postgresdb and the other called node.

I changed my Node queries.js from:

const { Pool } = require('pg');

const pool = new Pool({
    user: 'postgres',
    host: 'localhost',
    database: 'users',
    password: 'password',
    port: 5432,
})

To

const { Pool } = require('pg');

const pool = new Pool({
    user: 'postgres',
    host: 'postgresdb',
    database: 'users',
    password: 'password',
    port: 5432,
})

All I had to do was change the host to my container name ("postgresdb"), and that fixed it for me. I'm sure this can be done better, but I only learned docker-compose and Node.js in the last two days.

Solution 5 - node.js

If none of the other solutions worked for you, consider manually wrapping pgPool.connect() with a retry on ECONNREFUSED, since the database container may simply not be ready yet:

const { Pool } = require('pg');

const pgPool = new Pool(pgConfig);
const pgPoolWrapper = {
    async connect() {
        for (let nRetry = 1; ; nRetry++) {
            try {
                const client = await pgPool.connect();
                if (nRetry > 1) {
                    console.info('Now successfully connected to Postgres');
                }
                return client;
            } catch (e) {
                if (e.toString().includes('ECONNREFUSED') && nRetry < 5) {
                    console.info('ECONNREFUSED connecting to Postgres, ' +
                        'maybe container is not ready yet, will retry ' + nRetry);
                    // Wait 1 second
                    await new Promise(resolve => setTimeout(resolve, 1000));
                } else {
                    throw e;
                }
            }
        }
    }
};

(See this issue in node-postgres for tracking.)
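The same retry idea can be factored into a generic helper, decoupled from pg itself. This is a sketch under stated assumptions: the name withRetry is made up, and the error check relies on the error message containing ECONNREFUSED, as in the wrapper above:

```javascript
// Hypothetical generic helper: retry an async factory while it throws
// ECONNREFUSED, waiting delayMs between attempts, up to maxRetries.
async function withRetry(factory, maxRetries = 5, delayMs = 1000) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await factory();
    } catch (e) {
      if (!String(e).includes('ECONNREFUSED') || attempt >= maxRetries) {
        throw e; // a different error, or out of retries: give up
      }
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Typical usage: const client = await withRetry(() => pgPool.connect());
```

A healthcheck plus depends_on condition in docker-compose is the cleaner long-term fix, but an application-level retry also covers database restarts at runtime.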

Solution 6 - node.js

As mentioned in the docker-compose networking documentation:

> Each container can now look up the hostname web or db and get back the appropriate container’s IP address. For example, web’s application code could connect to the URL postgres://db:5432 and start using the Postgres database.

> It is important to note the distinction between HOST_PORT and CONTAINER_PORT. In the above example, for db, the HOST_PORT is 8001 and the container port is 5432 (postgres default). Networked service-to-service communication uses the CONTAINER_PORT. When HOST_PORT is defined, the service is accessible outside the swarm as well.

> Within the web container, your connection string to db would look like postgres://db:5432, and from the host machine, the connection string would look like postgres://{DOCKER_IP}:8001.

So DATABASE_URL should be postgres://username:pgpassword@db:5432/mydatabase
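The HOST_PORT/CONTAINER_PORT distinction from the quoted passage can be sketched as a small helper; connectionUrls is a hypothetical name and the IP is just an example:

```javascript
// Hypothetical helper illustrating the quoted docs: a compose ports
// entry has the form "HOST_PORT:CONTAINER_PORT". Service-to-service
// traffic uses the service name and CONTAINER_PORT; traffic from the
// host machine uses the Docker IP and HOST_PORT.
function connectionUrls(service, portsEntry, dockerIp) {
  const [hostPort, containerPort] = portsEntry.split(':');
  return {
    serviceToService: `postgres://${service}:${containerPort}`,
    fromHost: `postgres://${dockerIp}:${hostPort}`,
  };
}
```

So for a ports entry of "8001:5432", the web container talks to postgres://db:5432 while the host machine talks to port 8001.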

Solution 7 - node.js

Here is a small variation on handling this.

As Andy says in his answer:

  • "you need to point the web container to the db container's"

And taking into consideration the official docker-compose documentation on links:

  • "Links are not required to enable services to communicate - by default, any service can reach any other service at that service's name."

Because of that, you can keep your docker-compose.yml this way:

docker-compose.yml

version: "3"
services:
    web:
      image: node
      command: npm start
      ports:
         - "8000:4242"
      # links:
      #   - db
      working_dir: /src
      environment:
        SEQ_DB: mydatabase
        SEQ_USER: username
        SEQ_PW: pgpassword
        PORT: 4242
        # DATABASE_URL: postgres://username:[email protected]:5432/mydatabase
        DATABASE_URL: "postgres://username:pgpassword@db:5432/mydatabase"
      volumes:
          - ./:/src
    db:
      image: postgres
      ports:
          - "5432:5432"
      environment:
        POSTGRES_USER: username
        POSTGRES_PASSWORD: pgpassword

That said, keeping the links is a nice way to be explicit while coding, so your approach is fine too.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: Stainz42
Solution 1 - node.js: Andy
Solution 2 - node.js: Abu Shumon
Solution 3 - node.js: MEDZ
Solution 4 - node.js: Philip Jay Fry
Solution 5 - node.js: leventov
Solution 6 - node.js: mdmundo
Solution 7 - node.js: Franco Gil