I’m having issues getting QGIS to connect to farmOS with the farmOS_wfs module.
Background: I’m running farmOS 2.0.0-beta2, farmos_wfs 1.2.1, and QGIS 3.22.3. I believe I have configured QGIS correctly, or at least as per the installation instructions.
When I try to add the WFS layer it just keeps asking for my username and password. I’m not super familiar with QGIS and I’m not seeing any errors it may be providing, so I’m not really sure where to look next.
Besides installing the WFS module in farmOS, is there anything I need to configure for OAuth2? Any other ideas on where to look for troubleshooting?
Thanks @Symbioquine! I thought there had to be some sort of log in QGIS, I just didn’t know where to look.
When I try to add the layer and press Connect, the username/password box appears. I can try my username and password over and over again with nothing happening until I press Cancel. Then it pops up with this error and log messages.
```
curl -i -X POST -d "client_id=farm&client_secret=&username=Nick&password=my_secret_password_here&grant_type=password&scope=openid" https://mydomain.com/oauth/token
```
Replacing:
- my_secret_password_here with your password
- mydomain.com with your domain
Don’t post the actual token if it works, but it would be useful to see if it succeeds or fails and what headers/response-code/error-messages come back…
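For reference, a successful OAuth2 password-grant request should typically come back with an HTTP 200 and a JSON body containing access_token, token_type, expires_in, and refresh_token fields (the standard OAuth2 token response), while a failure will usually be a 400 or 401 with an error field explaining why.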
This was a fresh install of beta2. A week or so ago I accidentally moved to the 2.x-dev version before moving back to beta2.
farm_ui was installed. I uninstalled it and reinstalled it and have the same clients listed.
I’m still using 1.x for my main production site and just playing with 2.x in the meantime. I’m waiting for beta3 before I move over completely on a fresh install. If this seems to be a weird installation error I can just hold off until the fresh new install.
That code tries to create keys for OAuth2, but if it fails then the rest of that function doesn’t run. Maybe that’s happening to you? Do you see a farm_api error in Administration > Reports > Recent Log Messages?
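If you have shell access, you can also check from the command line (assuming Drush is available in your installation):

```
# list recent log entries from the farm_api module
drush watchdog:show --type=farm_api --count=20
```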
We should probably remove that return; do you agree @paul121? It only prevents the farm consumer from being created, which isn’t helpful.
@BOTLFarm if you comment out that return; statement, and reinstall farm_api, I bet you’ll see the farm client show up.
That doesn’t resolve the keys issue though… that code is essentially trying to create a keys directory one level up from the webroot. If you create that yourself, and make it writable by Apache (before you reinstall the module) then maybe it will work!
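Something like this, for example (just a sketch; the /opt/drupal paths assume the standard farmOS image layout, and Drush is assumed to be available):

```
# create the keys directory one level up from the webroot (/opt/drupal/web)
mkdir -p /opt/drupal/keys

# make it owned by the Apache user so the module can write keys into it
chown www-data:www-data /opt/drupal/keys
chmod 750 /opt/drupal/keys

# then reinstall farm_api so the key-creation code runs again
drush pm:uninstall farm_api -y && drush pm:enable farm_api -y
```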
@mstenta Looks like you may have made a mistake in your last post when you mentioned farm_ui instead of farm_api?
I did uninstall farm_api and reinstall it. This did create the farm client for me, but I was still having key issues since no directory was assigned. I was able to provide a temporary directory for testing, and it all works now. The WFS connection to QGIS is working.
I’m looking at a clean install now that I have played around with 2.x for a while, and I’m ready to switch over my production site soon.
Thanks again everyone for your help. This community is amazing!
Looks like I’m still having an issue with this on a clean install of beta3. The farm client was created, but there is still the issue of it not creating the /opt/drupal/keys/ directory. Should this be done automatically with a docker-compose install, or is it something that needs to be made manually with permissions set? Also, this folder is outside the sites directory, so wouldn’t it get deleted every time the container was closed?
> Should this be done automatically with a docker-compose install, or is it something that needs to be made manually with permissions set?
No, unfortunately this is not something that the Docker Compose installation currently handles automatically, so we should open an issue for this. In the meantime you need to create the directory manually and set the permissions so it is readable only by Apache (the www-data user).
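For example (assuming the farmOS service in your docker-compose.yml is named www; adjust to your setup):

```
# create the keys directory inside the running container
docker-compose exec www mkdir -p /opt/drupal/keys

# restrict it to the Apache user
docker-compose exec www chown www-data:www-data /opt/drupal/keys
docker-compose exec www chmod 750 /opt/drupal/keys
```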
> Also, this folder is outside the sites directory, so wouldn’t it get deleted every time the container was closed?
Yes, this is true! Sounds like we need some additions to the documentation to describe best practice around this.
The best thing to do is probably mount that directory as another Docker volume.
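Roughly like this in docker-compose.yml (a sketch; the www service name and the host path are assumptions, so adjust to match your setup):

```yaml
services:
  www:
    volumes:
      # persist the OAuth2 keys directory on the host so it survives container rebuilds
      - './keys:/opt/drupal/keys'
```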