Caching in NGINX
How to cache page responses to speed up your website
Under high traffic, server resources can be exhausted quickly. Caching speeds up request processing, reduces response times, and lowers the load on your database and PHP. When caching is enabled, the page is generated once and the result is stored in the cache for a set period (the TTL). Until the TTL expires, users receive the pre-generated copy of the page instead of hitting the backend. Let’s see how to configure caching in Nginx.
Enabling Caching in Nginx
To enable caching in Nginx, you first need to declare a cache with the proxy_cache_path directive: the directory where cached responses are stored on disk and a shared-memory zone for cache keys (here named all, 32 MB). This is done in the http section of /etc/nginx/nginx.conf:
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=all:32m;
Don’t forget to create the directory for cache storage:
# mkdir /var/cache/nginx
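If you also want to cap how much disk space the cache may use and automatically drop entries that have not been requested for a while, proxy_cache_path accepts the max_size and inactive parameters. The values below are illustrative, not recommendations:
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=all:32m max_size=1g inactive=60m;
With max_size set, the cache manager removes the least recently used entries once the limit is exceeded.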
Configuring the Cache and Main Servers
To use caching, move your main server to a different port (for example, 81), and run a caching server on the standard port 80. The caching server will either serve cached content or proxy requests to the main server:
server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:81/;
        proxy_cache all;
        proxy_cache_valid any 1h;  # Cache every response for 1 hour
    }
}
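To check that caching actually works, you can expose the cache status in a response header. $upstream_cache_status is a standard Nginx variable; the header name X-Cache-Status is just a convention:
server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:81/;
        proxy_cache all;
        proxy_cache_valid any 1h;
        add_header X-Cache-Status $upstream_cache_status;  # MISS, HIT, EXPIRED, ...
    }
}
Request the same page twice with curl -I: the first response should show MISS, the second HIT (until the TTL expires).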
Main server configuration:
server {
    listen 81;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt  { access_log off; log_not_found off; }

    access_log /var/log/nginx/letsclearitup.access.log;
    error_log  /var/log/nginx/letsclearitup.error.log error;

    # Pass PHP scripts to PHP-FPM over a Unix socket
    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        include fastcgi_params;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/letsclearitup.sock;
        fastcgi_hide_header X-Powered-By;
    }

    # PHP-FPM status page, local access only
    location /status {
        fastcgi_pass unix:/run/php/letsclearitup.sock;
        include fastcgi.conf;
        allow 127.0.0.1;
        deny all;
    }
}
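After editing both server blocks, test the configuration and reload Nginx (the commands below assume a systemd-based distribution):
# nginx -t
# systemctl reload nginx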
Disabling Cache for Users with Cookies
If the request carries cookies (for example, from a logged-in user), you can skip the cache for it with the proxy_no_cache and proxy_cache_bypass directives:
server {
    listen 80;

    location / {
        set $do_not_cache 0;
        if ($http_cookie ~* ".+") {
            set $do_not_cache 1;
        }
        proxy_pass http://127.0.0.1:81/;
        proxy_cache all;
        proxy_cache_valid any 1h;
        proxy_no_cache $do_not_cache;      # don't store the response
        proxy_cache_bypass $do_not_cache;  # don't answer from the cache
    }
}
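The regular expression above skips the cache for any cookie. If you only want to bypass it for specific cookies, such as a PHP session cookie, match on the cookie name instead (PHPSESSID here is an assumption; use whatever your application actually sets):
if ($http_cookie ~* "PHPSESSID") {
    set $do_not_cache 1;
}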
Caching Error Responses
You can also cache error responses (404, 502, 503) to reduce load from requests to non-working parts of the site:
server {
    listen 80;

    location / {
        set $do_not_cache 0;
        if ($http_cookie ~* ".+") {
            set $do_not_cache 1;
        }
        proxy_pass http://127.0.0.1:81/;
        proxy_cache all;
        proxy_cache_valid 404 502 503 1m;  # cache error responses for 1 minute
        proxy_cache_valid any 1h;
        proxy_no_cache $do_not_cache;
        proxy_cache_bypass $do_not_cache;
    }
}
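A related directive, proxy_cache_use_stale, lets Nginx serve a stale cached copy when the backend times out or returns an error, instead of passing the failure on to the visitor. A minimal sketch, to be placed next to the other proxy_cache_* directives:
proxy_cache_use_stale error timeout http_502 http_503;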
FastCGI (PHP-FPM) Caching
Nginx supports caching FastCGI responses. To enable this, add the following to the http section of /etc/nginx/nginx.conf:
fastcgi_cache_path /var/cache/fpm levels=1:2 keys_zone=fcgi:100m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";
Create the cache directory:
# mkdir /var/cache/fpm
Then in the main server configuration, add:
server {
    listen 81;
    ...
    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        include fastcgi_params;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/letsclearitup.sock;
        fastcgi_hide_header X-Powered-By;
        fastcgi_cache fcgi;
        fastcgi_cache_valid 200 60m;  # Cache 200 responses for 1 hour
    }
    ...
}
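As with proxy caching, you will normally want to skip the FastCGI cache for requests that must stay dynamic, such as POST requests or requests from logged-in users. A sketch of that logic (the PHPSESSID cookie name is an assumption):
server {
    listen 81;

    # Decide whether this request may use the cache
    set $skip_cache 0;
    if ($request_method = POST) {
        set $skip_cache 1;
    }
    if ($http_cookie ~* "PHPSESSID") {
        set $skip_cache 1;
    }

    location ~ \.php$ {
        # ...existing fastcgi_* directives from above...
        fastcgi_cache fcgi;
        fastcgi_cache_valid 200 60m;
        fastcgi_cache_bypass $skip_cache;  # don't answer from the cache
        fastcgi_no_cache $skip_cache;      # don't store the response
    }
}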
Note
On VPS and dedicated servers you have full control over the NGINX configuration, including flexible caching settings, which lets you tune site performance and keep server load in check.