Hello, I have farmOS working on an Ubuntu system. I used the guide to install it with Docker Compose, which worked very well! I also used the guide to try to get HTTPS working, but unfortunately I am stuck.
Here are some screenshots of all the steps I have taken to get HTTPS working:
Looks like it is working; it's just that the certs are not from a trusted CA (which is true, as they are self-signed).
If you click on “Not secure” beside the URL bar, you should be able to see more detail on the issue with the certs.
You may have to export the cert and import it into your browser's trusted certificate authorities section.
Or you could enable brave://flags/#allow-insecure-localhost, which allows untrusted certs for localhost only.
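For reference, a self-signed cert like the one the HTTPS guide has you use can be generated with openssl. The filenames and the CN here are just examples; match them to whatever your farmOS/proxy config expects:

```shell
# Generate a self-signed certificate for localhost, valid for one year.
# File names and the CN are examples, not what the farmOS guide necessarily uses.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout farmos.key -out farmos.crt \
  -days 365 -subj "/CN=localhost"

# Inspect the result (subject and issuer are identical for a self-signed cert)
openssl x509 -in farmos.crt -noout -subject -issuer
```

The .crt file is also what you'd export and import into each browser's trust store if you go that route.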
@keylum88 Yeah, that’s expected behavior with a self-signed certificate. Self-signed certificates give you HTTPS, but browsers will still complain that the certificate is untrusted.
I’m not sure about Brave, but in Firefox you can click the Advanced button on the screen you see and there’s an option to “proceed” anyway. It will still show a red strike through the https in the address bar, but you’ll be able to use the site.
Thanks for the reply. This is my first time setting up HTTPS, so I don’t completely understand it. My goal is to have HTTPS on my localhost so that any device that accesses it will show the HTTPS lock icon in its address bar. I don’t want users to have to configure anything in their browser. Is there a way to achieve this? Do I have to use a self-signed cert?
If you need other devices to connect to it, then you either need to configure each device to trust your certificate, or you need to get a real SSL certificate. Let’s Encrypt is a great free option, but your site needs to be publicly accessible from the internet in order for that to work. Otherwise you can pay for a certificate and install that locally.
(These are all general considerations for HTTPS websites in general - not specific to farmOS, FYI.)
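For the publicly-accessible case, the simplest Let's Encrypt path is certbot. This is just a sketch: the domain below is a placeholder, certbot must be installed, and it only works if port 80 on that domain actually reaches your server from the internet:

```shell
# Sketch only: requires certbot installed and a publicly reachable domain.
# 'farm.example.com' is a placeholder - use your real domain.
sudo certbot certonly --standalone -d farm.example.com

# On success, the cert and key land under:
#   /etc/letsencrypt/live/farm.example.com/
# and you point your web server / proxy config at those files.
```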
OK, thanks for the advice. I’m trying to set up a self-contained system that isn’t internet-dependent. I didn’t realize that SSL was something you bought even if its use is local. It’s always something that my host took care of.
Buying a certificate is the “old fashioned” way, and is still a simple way to get one.
It is possible to use Let’s Encrypt with the DNS-based challenge, but that’s a lot more complicated to set up, and because Let’s Encrypt certificates expire every 3 months, you really want to automate the renewal.
The website needs an internet connection at least periodically, but it doesn’t necessarily need to be publicly accessible from the internet if you use the dns-01 challenge instead of the http-01 challenge.
In my opinion, the easiest way to do that is via Dehydrated which has a plugin architecture allowing many DNS providers to work automatically just by installing the right plugin. There’s also matrix-org/docker-dehydrated which makes it a bit easier to get set up in Docker.
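As a rough sketch of what the dns-01 setup looks like in Dehydrated's config (the hook script path here is illustrative - it comes from whichever DNS provider plugin you install):

```shell
# /etc/dehydrated/config (fragment)
CHALLENGETYPE="dns-01"
HOOK="/usr/local/bin/dns-hook.sh"   # path to your DNS provider's hook script

# /etc/dehydrated/domains.txt lists the domain(s) to issue certs for,
# one per line, e.g.:
#   farm.example.com
```

Dehydrated then calls the hook to create the TXT record Let's Encrypt checks, so the web server itself never needs to be reachable from the internet.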
I personally use AWS Route53 because IAM users can be configured with very fine-grained permissions, so the user I’m exposing on the host to Dehydrated/Docker/etc. only has permission to update the DNS records - e.g. not modify/transfer my domain(s). (Compared with something like NameCheap - my registrar - which only has account-wide API access. So if the credentials became compromised, much more damage could occur.)
I’ve been using Cloudflare Tunnels to serve up Docker instances on their own CNAMEs on my domain. No need to register certs - Cloudflare takes care of them, and it even resolves your CNAME to the IP/port of your container internally and serves it via HTTPS even if the Docker container is running HTTP. There is a Cloudflare Docker container you run to provide the tunnel.
I might add, there are several authentication methods to secure access too; he covers that in the video.
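For anyone wanting to try this, the connector container is typically run something like the following (the token is a placeholder you get from the Cloudflare Zero Trust dashboard when you create the tunnel):

```shell
# Run the Cloudflare tunnel connector in Docker.
# <TUNNEL_TOKEN> is a placeholder from the Zero Trust dashboard.
docker run -d --name cloudflared --restart unless-stopped \
  cloudflare/cloudflared:latest tunnel --no-autoupdate run --token <TUNNEL_TOKEN>
```

The mapping of hostnames to internal container addresses is then configured in the dashboard rather than on the host.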