Building a new website? A short security check-list for you

Abdulla Abdurakhmanov
3 min read · Mar 8, 2021

Just to remind ourselves of the basic and common security recommendations that apply in most cases.

1. Use HTTPS (and HTTP/2) whenever possible.

With services like Let’s Encrypt you don’t have an excuse anymore. It won’t affect your performance/resources/budgets much either. Don’t believe me? Have a look at a (now really old) study from Google.
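For example, with Certbot (assuming an nginx-based setup with the Certbot nginx plugin installed; example.net is just a placeholder for your domain), obtaining and installing a certificate is essentially a one-liner:

# Obtain a Let's Encrypt certificate and configure nginx to use it
sudo certbot --nginx -d example.net -d www.example.net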

2. Disable obsolete TLS v1.0/1.1 and insecure algorithms.

This might be challenging, depending on which framework/software you’re using, but you need to disable the insecure protocols and allow only TLS 1.2/TLS 1.3. Some environments already disable them by default (rustls, for example), but for others you need to do it yourself.

For example, this is one of my JDK configs:

jdk.tls.disabledAlgorithms=SSLv2Hello, SSLv3, TLSv1, TLSv1.1, DES, DESede, RC4, MD5withRSA, DH keySize < 1024, \
EC keySize < 224, DES40_CBC, RC4_40, \
TLS_RSA_WITH_AES_256_CBC_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA, \
TLS_RSA_WITH_AES_128_CBC_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, \
TLS_RSA_WITH_AES_256_GCM_SHA384, TLS_RSA_WITH_AES_128_GCM_SHA256, \
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, \
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA

You should be careful with this if you are still working with old/legacy clients that don’t support new TLS protocols.
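To verify, you can probe your server with openssl (assuming your openssl build still includes the legacy protocol options; example.net is a placeholder); the first two handshakes should now fail:

# Should be rejected once TLS 1.0/1.1 are disabled
openssl s_client -connect example.net:443 -tls1
openssl s_client -connect example.net:443 -tls1_1

# Should still succeed
openssl s_client -connect example.net:443 -tls1_2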

3. Enable Strict-Transport-Security everywhere.

Protecting your users from MitM and downgrade attacks is really as simple as that: just add this header to all of the responses from your website.
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Strict-Transport-Security
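A typical value looks like this (the max-age of roughly two years is an illustrative choice; add includeSubDomains only if all your subdomains are served over HTTPS):

Strict-Transport-Security: max-age=63072000; includeSubDomains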

4. Remove everything that identifies your software, environment and their versions.

  • Check HTTP response headers from your website. Remove/replace all of the X-Powered-By, Server, etc. headers that provide hints about your software and its versions.
  • Check HTTP responses when errors happen (404, 403, 500 pages, REST API responses, etc). Disable all call stacks and error details about your software that are visible to end users. Introduce custom error pages.
  • Using console tools like cURL, netcat, etc., try sending malformed URLs and check that none of the responses hint at what software you are using (see the example right after this list).
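A quick check with cURL might look like this (example.net is a placeholder; watch for Server, X-Powered-By and similar headers, as well as stack traces, in the output):

# Inspect the response headers of a normal page
curl -I https://example.net

# Inspect an error response for stack traces and version strings
curl -i https://example.net/this-page-does-not-exist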

5. Prevent loading your websites in frames/iframes.

If you don’t need your pages to be embedded in frames for your particular case (needing it is uncommon, I believe), then you should disable it by default using the Content-Security-Policy (CSP) frame-ancestors directive, for example:

Content-Security-Policy: frame-ancestors 'none'

There are other useful CSP directives as well, but blocking framing is needed so commonly that it was worth mentioning separately.

6. Don’t trust any user input.

Even if the majority of your users have no malicious intent, there will also be an evil one, as well as someone whose device has been compromised and contains malicious software.

So, escape all user input by default (or better, use web frameworks; the majority of them do that by default). Use appropriate data type checks and length/size limitations.

This is a good starting link for the details.
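As a minimal illustration only (a hand-rolled sketch, not a replacement for your framework's templating or a vetted encoder library), escaping user input before embedding it into HTML might look like this in Java:

public final class InputSanitizer {

    // Illustrative limit; pick what makes sense for the field in question.
    private static final int MAX_INPUT_LENGTH = 256;

    // Escapes the characters that are significant in HTML and enforces a length limit.
    public static String escapeHtml(String input) {
        if (input == null) {
            return "";
        }
        if (input.length() > MAX_INPUT_LENGTH) {
            throw new IllegalArgumentException("Input is too long");
        }
        StringBuilder sb = new StringBuilder(input.length());
        for (char c : input.toCharArray()) {
            switch (c) {
                case '&':  sb.append("&amp;");  break;
                case '<':  sb.append("&lt;");   break;
                case '>':  sb.append("&gt;");   break;
                case '"':  sb.append("&quot;"); break;
                case '\'': sb.append("&#x27;"); break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }
}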

7. Secure your cookies.

The default config for your session cookies must be (see the combined example after this list):

  • HttpOnly
  • Secure
  • SameSite=Strict (or at least Lax, which is now the default in modern browsers, but it is better to set it explicitly for all users).
  • Max-Age=<appropriate-duration-in-days-not-years>
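Put together, a session cookie might look like this (the name, value and seven-day lifetime are placeholders to adapt to your case):

Set-Cookie: session_id=<opaque-random-id>; HttpOnly; Secure; SameSite=Strict; Max-Age=604800; Path=/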

8. Use CORS in a secure way.

If you’re using CORS, don’t put a * into the Access-Control-Allow-Origin header; validate the request’s Origin against a limited list of domains that are really supposed to be the source of your browser requests, and return only a matching origin.
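For example, for a request coming from an allowed origin, the response might carry these headers (app.example.net is a placeholder; Vary: Origin keeps caches from reusing the answer for other origins):

Access-Control-Allow-Origin: https://app.example.net
Vary: Origin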

9. Don’t use sensitive or personal information as URL parameter values.

Even though they are still encrypted by HTTPS, they can easily leak by accident: into logs, browser developer tools, and so on.

So, avoid this if possible:
https://example.net?my-access-code=<access-code>
(sometimes it isn’t possible, for example when sharing links in emails without identification; in those cases, I recommend using time-limited tokens, etc.)

10. Don’t forget to protect WebSockets, SSE/EventSource endpoints.

Check access to these endpoints as well. They also consume resources such as sockets, so leaving them unprotected can also make DDoS attacks easier.
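As a sketch, assuming a JSR-356 (javax.websocket) container and a placeholder allowed origin, rejecting WebSocket handshakes from unknown origins could look like this (a real deployment should also verify authentication, e.g. the session cookie, during the handshake):

import javax.websocket.server.ServerEndpointConfig;

// Refuses WebSocket handshakes whose Origin header doesn't match the allowed one.
public class RestrictedOriginConfigurator extends ServerEndpointConfig.Configurator {

    private static final String ALLOWED_ORIGIN = "https://app.example.net"; // placeholder

    @Override
    public boolean checkOrigin(String originHeaderValue) {
        // Returning false makes the container reject the upgrade request.
        return ALLOWED_ORIGIN.equals(originHeaderValue);
    }
}

It can then be attached to an endpoint with @ServerEndpoint(value = "/ws", configurator = RestrictedOriginConfigurator.class).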

11. Don’t rely on robots.txt.

Use it, of course, to manage how your website is indexed, but it isn’t a security tool: malicious crawlers simply ignore it.

— —

Surely, if you deal with important data (PCI DSS scope, something in healthcare, etc.), this is not good enough on its own; it is just a good start.
