QFieldCloud Self-Hosted – Issue with Offline Editing and PostGIS Access via host.docker.internal

Hi everyone,

I’m currently setting up a self-hosted QFieldCloud instance on a Linux VM using Docker Compose.

My QGIS project uses PostGIS layers stored in a PostgreSQL database running on the same VM as the Docker containers. Everything works fine because I pass my database URL via secrets.

But soon I’ll have to close public access to my database. I would like to know whether QField can work with PostGIS data whose host is host.docker.internal. I have configured the Docker Compose file with:

extra_hosts:
  - "host.docker.internal:host-gateway"

and configured the secrets with host.docker.internal as the host. This doesn’t seem to work, but I would like your opinion. I’m thinking that maybe the update of the PostGIS data is done from the server?
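For reference, here is roughly what my Compose override looks like (a sketch; the service name `worker_wrapper` is from my setup and may differ from yours, so apply it to whichever containers actually open the PostGIS connection):

```yaml
# docker-compose override sketch: make the Docker host reachable from
# inside the container under the name host.docker.internal.
services:
  worker_wrapper:
    extra_hosts:
      # host-gateway resolves to the Docker host's gateway IP at runtime
      - "host.docker.internal:host-gateway"
```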

Thanks


Hey,

I don’t have the answer to this, but let me try to find someone who can help with this question, and we will get back to you.


Hi,
I have news on my problem. The connection works when I put the local IP of the server in the pg_service.conf file: not localhost or 127.0.0.1, but the IP on my VM’s local network. This is already a good start, but this IP is likely to change from time to time, and being able to use host.docker.internal or another alias would be great.
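For anyone hitting the same thing, this is the kind of pg_service.conf entry that works for me now (a sketch; the service name, IP address, and credentials are placeholders from my setup):

```
# pg_service.conf entry referenced by the QGIS project's PostGIS layers
[myproject]
host=192.168.1.50   # the VM's local-network IP; localhost/127.0.0.1 fail from inside the containers
port=5432
dbname=gisdb
user=gisuser
```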

Hi @JLHI_JLHI

If you give the QFieldCloud QGIS containers access to the PostGIS database, and your PostGIS layers are configured for offline editing, there is no need to expose your PostGIS database to the outside world.

If you need any further support, please send us an email at sales@qfield.cloud
