Download speed from self-hosted cloud instance is very low

Hello,
I’m trying to sync a project to QField via a self-hosted QFieldCloud instance, and downloading the project (~1 GB) takes more than an hour. Do you have any idea what the problem could be?
When I download the files via the web interface of QFieldCloud/admin, I get the full download speed (>50 MB/s).


Imported from GitHub discussion by @meyerlor on 2024-07-15T13:26:39Z

Hey meyerlor. I am not sure why you experience this, especially compared to the admin interface. It sounds more like a networking or device issue. I can assure you there is no speed throttling on the server side. :slight_smile:

Also, note there is a difference in how files are delivered depending on whether you are in debug mode or not.

Hope these help to locate the source of your issue.

You can also check the performance of GitHub - opengisch/qfieldcloud-sdk-python: The official QFieldCloud SDK and CLI; it uses the same API endpoint as QField for downloading files.


Imported from GitHub comment by @suricactus on 2024-07-15T15:17:20Z

Thanks for the pointers. I just used the SDK for the first time; thanks for that nice tool to play around with, it certainly helps to debug/compare performance!
So I made several tests:

  • uploaded a large (1 GB) .png to the project and downloaded it with
qfieldcloud-cli download-files --filter *.png 4d1f44b9-4cd6-4dbf-83ad-6139195e266f C:/temp/qfieldcloud-cli/
Downloading project "4d1f44b9-4cd6-4dbf-83ad-6139195e266f" files to C:/temp/qfieldcloud-cli/…
Dokumente_Schacht/Orthofotos_2024_10cm.png: 100%|████████████████████████████████| 1.30G/1.30G [00:16<00:00, 79.1Mit/s]
Downloaded 1 file(s).
  • tried to download all files (more than 2,000 attachment files, each ~100 KB): this took ages (although the download speed per file is ~2 MB/s, it needed ~13 minutes to download 260 MB of files, at which point I aborted)

→ so the file count is the culprit, not the network, I guess
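That conclusion fits a simple model: each file pays a fixed per-request overhead (TLS, auth, redirects), so with thousands of tiny files the overhead, not the bandwidth, dominates. A back-of-envelope sketch — the 0.3 s overhead is an assumed value, not a measurement:

```python
# Rough model of sequential downloads: total time = transfer time + per-file overhead.
# The overhead value is illustrative; real overhead depends on TLS, auth, storage
# redirects, etc.

def download_time(n_files: int, avg_size_mb: float,
                  bandwidth_mb_s: float, per_file_overhead_s: float) -> float:
    """Estimated total download time in seconds."""
    transfer = n_files * avg_size_mb / bandwidth_mb_s
    overhead = n_files * per_file_overhead_s
    return transfer + overhead

# One 1300 MB file at 80 MB/s: transfer dominates, well under a minute.
one_big = download_time(1, 1300, 80, 0.3)

# 2600 files of ~0.1 MB each at the same bandwidth: overhead dominates,
# on the order of 13 minutes -- matching the observation above.
many_small = download_time(2600, 0.1, 80, 0.3)

print(f"one big file: {one_big:.1f} s")
print(f"many small files: {many_small:.1f} s")
```

Parallelism shrinks the overhead term by the number of concurrent connections, which is why the connection cap mentioned below matters so much for attachment-heavy projects.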

I tried to deactivate DEBUG in the .env file and use the docker-compose.override.prod.yml instead of the local.yml, but then I get internal API errors when packaging a project, so I reverted to DEBUG=1 and the local.yml.

I would love to try this “Monster project” on the official cloud with our organization account, but unfortunately packaging never succeeds, as I always run into the 10-minute timeout :frowning:


Imported from GitHub comment by @meyerlor on 2024-07-17T09:18:03Z

Note QField is limited to 8 parallel file downloads at any point in time. It’s pretty much the same for all other clients (including web browsers) that you use.
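That 8-download cap behaves like any bounded worker pool. A generic sketch in Python (QField itself is not written in Python, and `fetch_file` here is a stand-in for a real HTTP request):

```python
from concurrent.futures import ThreadPoolExecutor
import time

MAX_PARALLEL = 8  # cap on concurrent file downloads, as described above

def fetch_file(name: str) -> str:
    """Stand-in for an HTTP download; sleeps briefly to simulate network latency."""
    time.sleep(0.01)
    return name

files = [f"attachment_{i:04d}.jpg" for i in range(100)]

# At most MAX_PARALLEL downloads are in flight at any moment; the rest queue.
with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
    downloaded = list(pool.map(fetch_file, files))

print(f"downloaded {len(downloaded)} files")
```

With per-file latency dominating, the pool finishes roughly `n_files / MAX_PARALLEL` latency periods, which is why a few thousand small files still take minutes even on a fast link.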

I am pretty sure that if you set DEBUG=0, you will gain some speed. I guess the API errors are because of QFieldCloud/.env.example at master · opengisch/QFieldCloud · GitHub. If you search the QField(Cloud) related issues/discussions, I have already explained what it is and what it should be configured to.


Imported from GitHub comment by @suricactus on 2024-07-17T11:24:22Z

I would love to try this “Monster project” on the official cloud with our organization account, but unfortunately packaging never succeeds, as I always run into the 10-minute timeout :frowning:

Are those files needed for the project itself? If not, you can mark them as “attachments” and they will not be downloaded in your app.qfield.cloud packaging job; therefore you should be able to fit within the 10-minute limit.


Imported from GitHub comment by @suricactus on 2024-07-17T11:25:26Z

suricactus, many thanks again; this answer set me on the right path. All I had to do was add the minio/geo-db services to the docker-compose.override.prod.yaml (so far I have no external S3 service running); now I can set DEBUG=0 and use the docker-compose.yaml together with the prod.yaml.
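For anyone who lands here with the same setup, the change described above amounts to copying the storage-related services from the local override into the prod override. A rough sketch only — the service name, image command, and `STORAGE_*` variable names are assumed from QFieldCloud’s local override and should be copied from your actual checkout rather than retyped from here:

```yaml
# docker-compose.override.prod.yaml (sketch)
# With DEBUG=0 and no external S3 endpoint, the prod override still needs an
# object store; re-declare the minio (and geo-db) services that the local
# override normally provides.
services:
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: ${STORAGE_ACCESS_KEY_ID}
      MINIO_ROOT_PASSWORD: ${STORAGE_SECRET_ACCESS_KEY}
    volumes:
      - minio_data:/data
```

Whether bundling minio in prod is a good long-term choice is a separate question; an external S3-compatible service is the more common production layout.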

Speed is now much better (260 MB took 8 min; still not super fast, but at least you can see the bar moving :slight_smile: )

The files are (water) hydrant/manhole pictures/drawings which my field workers need to be able to check at any time. I have marked the folders as attachments in the QFieldSync plugin, but they are still downloaded/processed and count toward the time limit, if I understand it right?!


Imported from GitHub comment by @meyerlor on 2024-07-17T14:20:24Z

You can check the job logs to see whether these files are downloaded. If they are, you have a config problem with the attachment dirs. If you are sure it is not a config problem, please open a new issue or contact support at support.qfield.cloud.

For more detailed support, you can drop us an email at sales@qfield.cloud.


Imported from GitHub comment by @suricactus on 2024-07-17T16:20:51Z

Sub-subfolders need to be listed separately as attachment dirs, I learned :D. Got it solved now; it is running on the official cloud perfectly well! Thanks a bunch!
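The gotcha, sketched generically — this models non-recursive directory matching to show why nested folders need their own entries; it is not QFieldSync’s actual implementation:

```python
from pathlib import PurePosixPath

def is_attachment(path: str, attachment_dirs: list[str]) -> bool:
    """Non-recursive matching: only files whose immediate parent directory is
    listed count as attachments. Illustrative only."""
    return str(PurePosixPath(path).parent) in attachment_dirs

# Only the top-level folder is listed, as in the initial (failing) config.
dirs = ["Dokumente_Schacht"]

print(is_attachment("Dokumente_Schacht/photo.jpg", dirs))       # True: parent listed
print(is_attachment("Dokumente_Schacht/2024/photo.jpg", dirs))  # False: nested folder not listed
```

Under this kind of matching, every nested folder that holds attachment files must appear in the list itself, e.g. `dirs = ["Dokumente_Schacht", "Dokumente_Schacht/2024"]`.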


Imported from GitHub comment by @meyerlor on 2024-07-21T11:29:33Z