Having trouble getting HTTPS to work for localhost environment

Hello, I have farmOS working on an Ubuntu system. I used the guide to install it with Docker Compose, which worked very well! I also used the guide to try to get HTTPS working, but unfortunately I am stuck :frowning:

Here are some screenshots of all the steps I have taken to get the HTTPS working:

First I created the certs:
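(The screenshot isn't shown here, but the cert generation was roughly like this; the output paths are just examples and should match whatever the NGINX config expects:)

```shell
# Create a directory for the certs (example path).
mkdir -p certs

# Generate a self-signed key and certificate for localhost, valid for one year.
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout certs/localhost.key \
  -out certs/localhost.crt \
  -subj "/CN=localhost"
```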

Then I created the NGINX config file:
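(Screenshot not shown; the config was along these lines. The cert paths and the `www` upstream name are assumptions matching the compose service names below:)

```nginx
server {
    listen 443 ssl;
    server_name localhost;

    # Paths inside the proxy container (assumed; mounted via docker-compose).
    ssl_certificate     /etc/nginx/certs/localhost.crt;
    ssl_certificate_key /etc/nginx/certs/localhost.key;

    location / {
        # "www" is the farmOS service name on the compose network.
        proxy_pass http://www;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```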

Then I edited the settings file:
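(Screenshot not shown; the edit was the reverse proxy settings in settings.php, roughly as below. The subnet is an example; it should be the Docker network the proxy container sits on:)

```php
// Tell Drupal/farmOS it is behind the NGINX reverse proxy.
$settings['reverse_proxy'] = TRUE;
// Example subnet; use your actual Docker network range.
$settings['reverse_proxy_addresses'] = ['172.18.0.0/16'];
```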

Then I edited the docker-compose YML file:
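(Screenshot not shown; the change added a proxy service roughly like this. The image and mount paths are examples:)

```yaml
services:
  proxy:
    image: nginx:latest
    depends_on:
      - www
    ports:
      - "443:443"
    volumes:
      # Example host paths for the NGINX config and certs created above.
      - ./nginx.conf:/etc/nginx/conf.d/default.conf
      - ./certs:/etc/nginx/certs
```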

But when I go to https://localhost it gives me this page:

I am running three containers right now:

farmos_proxy_1
farmos_www_1
farmos_db_1

Not sure where to go from here. Thank you so much for your help!

-Alex

1 Like

Looks like it is working; the certs just aren't from a trusted CA (which is true, as they are self-signed).
If you click on "Not secure" beside the URL bar you should be able to see more detail on the issue with the certs.

You may have to export the cert and import it into your browser's trusted certificate authorities section.

Or you could use brave://flags/#allow-insecure-localhost, which will allow untrusted certs for localhost only.

2 Likes

@keylum88 Yeah, that's expected behavior with a self-signed certificate. Self-signed certificates allow you to have HTTPS, but browsers will still complain that the certificate is untrusted.

I'm not sure about Brave, but in Firefox you can click the Advanced button on the screen you see and there's an option to "proceed" anyway. It will still show a red strike through the https in the address bar, but you'll be able to use the site.

2 Likes

Thanks for the reply. This is my first time setting up HTTPS so I don't completely understand it. My goal is to have HTTPS on my localhost so that any device that accesses it will have the HTTPS lock icon in its address bar. I don't want them to have to configure anything on their browser. Is there a way to achieve this? Do I have to use a "self-signed" cert?

1 Like

If you need other devices to connect to it, then you either need to configure each device to trust your certificate, or you need to get a real SSL certificate. Let's Encrypt is a great free option, but your site needs to be publicly accessible from the internet in order for that to work. Otherwise you can pay for a certificate and install that locally.

(These are general considerations for any HTTPS website, not specific to farmOS, FYI.)

2 Likes

OK, thanks for the advice. I'm trying to set up a self-contained system that isn't internet dependent. I didn't realize that SSL was something you bought even if its use is local. It's always something that my host took care of :slight_smile:

1 Like

Buying a certificate is the "old fashioned" way, and is still a simple way to get one. :slight_smile:

It is possible to use Let's Encrypt with a DNS-based challenge, but that's a lot more complicated to automate, and because Let's Encrypt certificates expire every 3 months you really want to automate it.

1 Like

This is partially true…

The website needs an internet connection at least periodically, but it doesn't necessarily need to be publicly accessible from the internet if you use the dns-01 challenge instead of the http-01 challenge.

In my opinion, the easiest way to do that is via Dehydrated, which has a plugin architecture allowing many DNS providers to work automatically just by installing the right plugin. There's also matrix-org/docker-dehydrated which makes it a bit easier to get set up in Docker.
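For a rough idea, the Dehydrated side is only a couple of config lines; the hook path below is hypothetical and would come from whichever provider plugin you install:

```shell
# /etc/dehydrated/config (sketch, not a complete config)
CHALLENGETYPE="dns-01"
# Hypothetical hook script supplied by your DNS provider plugin.
HOOK="/etc/dehydrated/hooks/dns-provider-hook.sh"
CONTACT_EMAIL="you@example.com"
```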

3 Likes

Check out docker-dehydrated/dns-01.md at master · matrix-org/docker-dehydrated · GitHub if you are considering going down that path.

I personally use AWS Route 53 because IAM users can be configured with very fine-grained permissions, so the user I'm exposing on the host to Dehydrated/Docker/etc. only has permission to update the DNS records - e.g. not modify/transfer my domain(s) :sweat_smile:. (Compared with something like NameCheap - my registrar - which only has account-wide API access, so if the credentials became compromised, much more damage could occur.)

3 Likes

I've been using Cloudflare Tunnels to serve up Docker instances on their own CNAMEs on my domain. No need to register certs; Cloudflare takes care of them, and even resolves your CNAME to the IP/port of your container internally and serves it via HTTPS even if the Docker container is running HTTP. There is a Cloudflare Docker container you run to do the tunnel.

I might add, there are several authentication methods to secure access too; he covers that in the video.
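For reference, the cloudflared container can be run as a compose service along these lines (a sketch; the tunnel token comes from the Cloudflare dashboard and is deliberately not filled in):

```yaml
services:
  cloudflared:
    image: cloudflare/cloudflared:latest
    command: tunnel --no-autoupdate run
    environment:
      # Token from the Cloudflare Zero Trust dashboard (not shown here).
      - TUNNEL_TOKEN=<your-tunnel-token>
    restart: unless-stopped
```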

1 Like