Varnish Cache

For dynamic web projects, complex layouts and rising user numbers can significantly drag down performance. Reverse proxies make it possible to curtail these losses and relieve web servers by answering client requests on their behalf. Reverse proxies work by storing requested material (static content, like pictures, as well as frequently requested dynamic pages) in their cache. One very popular caching software is Varnish. Unlike many of its competitors, Varnish was designed from the ground up to be a web accelerator. Installing and configuring Varnish Cache requires root privileges on the web server and a Unix-based operating system.

How Varnish Cache works

In the chain of processes triggered by a data request, Varnish is positioned directly upstream from the web server that holds the desired content. A page is still generated by the origin server the first time it is requested, but the Varnish proxy stores the request and the required content. Further requests of this kind are then dealt with by loading the desired data directly from the Varnish Cache. The software keeps all cached data in the working memory (RAM) and lets the operating system decide what is swapped out to the server's hard drive. This helps avoid storing data on the hard drive and in the cache at the same time.
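
Whether a request was answered from the cache or had to be passed on to the web server can be made visible with a few lines of VCL. The following is a minimal sketch in Varnish 4 syntax that adds an X-Cache header (the header name is just an example) to every response for debugging purposes:

sub vcl_deliver {
    # obj.hits counts how many times this object has been served from the cache
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
    } else {
        set resp.http.X-Cache = "MISS";
    }
}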

Varnish also functions as a load balancer. With the help of the round-robin procedure, incoming client requests are handled as separate worker threads that the Varnish Cache processes in turn. A fixed limit determines how many threads can be active at the same time. Once this threshold is reached, all further requests end up in a queue where they wait to be processed. Incoming connections are only blocked once the queue's limit is reached as well.
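
Which backends the requests are distributed across is declared in VCL. The following is a minimal sketch for Varnish 4.x using the bundled directors module; the backend addresses and ports are placeholders:

vcl 4.0;
import directors;

backend server1 { .host = "192.0.2.10"; .port = "8080"; }
backend server2 { .host = "192.0.2.11"; .port = "8080"; }

sub vcl_init {
    # create a round-robin director and register both backends
    new rr = directors.round_robin();
    rr.add_backend(server1);
    rr.add_backend(server2);
}

sub vcl_recv {
    # hand each incoming request to the director
    set req.backend_hint = rr.backend();
}

The limit on simultaneously active worker threads mentioned above can be adjusted with varnishd's thread_pool_min and thread_pool_max parameters.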

Configuring Varnish reverse proxies is mostly done via the Varnish Configuration Language (VCL). This makes it possible to write hooks (here: a technique that allows users to integrate their own code into the application). Once a VCL script is loaded, it is translated into the programming language C and compiled into a program library; the VCL instructions are then linked into the Varnish cache. If the CMS, e-commerce software, or web application in use supports the markup language ESI (Edge Side Includes), Varnish can also serve pages that are not fully cacheable: ESI tags placed in the HTML files label the dynamic content, and during client requests Varnish Cache recognizes these tags and reloads the corresponding content separately.
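
In practice, two things are needed for this: ESI processing has to be activated in the VCL, and the dynamic fragments have to be marked in the HTML templates. A minimal sketch in Varnish 4 syntax (the include path is just an example):

sub vcl_backend_response {
    # tell Varnish to parse this response for ESI tags
    set beresp.do_esi = true;
}

In the HTML delivered by the backend, the dynamic fragment is then referenced with an ESI include tag:

<esi:include src="/user-panel" />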

The pros and cons of Varnish hosting

In many cases, optimizing customized hosting solutions with a Varnish cache can be the answer to challenges brought about by the growing complexity and rising user rates of your web project. This doesn’t mean that the software is best suited to all web presences, though. Take a look at the pros and cons of Varnish hosting in the overview we’ve created below:

Advantages:

✔ Faster loading times thanks to caching in the RAM
✔ Web server relief
✔ Supports ESI
✔ Operating system exports content to the server's hard drive
✔ Load distribution based on the round-robin procedure
✔ Flexible configuration possibilities with VCL

Disadvantages:

✘ No substantial optimization for systems that don't support ESI
✘ Increased complexity and error rate
✘ Doesn't support TLS/SSL (HTTPS)
✘ Demanding set-up and configuration
✘ Only for Unix systems

The comparison above illustrates once more that Varnish hosting is only a viable alternative to the caching functions of clients and web servers when working with web applications that support ESI. Additionally, setting up and configuring the Varnish Cache with ESI tags can prove to be taxing. And since Varnish doesn't support TLS/SSL connections itself, you'll need an additional proxy server for secure transfers.
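
One common way to close this gap is to terminate TLS in a dedicated proxy such as Hitch and forward the decrypted traffic to Varnish. The following is a minimal sketch of a hitch.conf, assuming Varnish listens on port 80; the certificate path is a placeholder:

frontend = "[*]:443"
backend  = "[127.0.0.1]:80"
pem-file = "/etc/hitch/example.com.pem"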

Off-putting as some of these points may seem, a properly configured Varnish Cache with ESI tags can speed up your web projects in a way that conventional caching methods can't. This greatly decreases loading times for visitors, helping you achieve a greater overall conversion rate in the long run. These efforts are also rewarded with a better search engine ranking and a significantly relieved web server, which is no longer responsible for processing all incoming connections. Varnish hosting is especially popular with those operating online stores and websites with a variety of content.

Installing Varnish Cache

Administrative rights on the Unix system in use are required in order to install the Varnish Cache. Additionally, the web server located downstream from the Varnish Cache needs to be installed before you can begin. The following instructions lay out the necessary steps for installing and configuring Varnish. The example that follows uses an Ubuntu operating system and an Apache web server:

1. First step:

By default, Varnish is included in Ubuntu's package management, but not always in the latest version. For this reason, the Varnish project provides an online repository that can be used when installing the software. Here's how to add it as a package source:

sudo apt-get install apt-transport-https
curl https://repo.varnish-cache.org/GPG-key.txt | sudo apt-key add -
echo "deb https://repo.varnish-cache.org/ubuntu/ trusty varnish-4.1" | sudo tee -a /etc/apt/sources.list.d/varnish-cache.list
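
To check which version of Varnish the new repository provides and that it takes precedence over Ubuntu's own package, you can optionally run:

apt-cache policy varnish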

2. Second step:

For the next step, update the package lists and install Varnish:

sudo apt-get update
sudo apt-get install varnish
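
A quick way to confirm that the installation succeeded is to print the installed version:

varnishd -V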

3. Third step:

At this point, the Varnish file should be configured so that the software ‘knows’ where it can find web content:

sudo nano /etc/default/varnish

Change entries under ‘DAEMON_OPTS’ as follows:

DAEMON_OPTS="-a :80 \
-T localhost:6082 \
-f /etc/varnish/default.vcl \
-S /etc/varnish/secret \
-s malloc,256m"
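
In this configuration, -a defines the address and port on which Varnish accepts client requests (port 80), -T the management interface, -f the VCL file to be loaded, -S the secret file used to authenticate access to the management interface, and -s the storage backend, in this case a 256 MB cache held in RAM (malloc).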

4. Fourth step:

Save the changes and open the default.vcl file:

sudo nano /etc/varnish/default.vcl

Enter port 8080 as the source from which Varnish fetches its content (the Apache web server):

backend default {
.host = "127.0.0.1";
.port = "8080";
}
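
To make sure the VCL file contains no syntax errors before restarting, you can optionally let Varnish compile it; the following command translates the VCL into C and prints the result, or aborts with an error message:

varnishd -C -f /etc/varnish/default.vcl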

5. Fifth step:

Next, configure Apache to use port 8080 instead of the default port 80. Open the corresponding Apache port configuration file:

sudo nano /etc/apache2/ports.conf

Change the port number in the 'NameVirtualHost' and 'Listen' entries as follows:

NameVirtualHost 127.0.0.1:8080
Listen 127.0.0.1:8080

6. Sixth step:

To complete the configuration, adjust the VirtualHost entry in the default site file (/etc/apache2/sites-available/default) to 127.0.0.1:8080, using the same method as in step 5.

7. Seventh step:

Restart Apache and Varnish to finish the installation:

sudo service apache2 restart
sudo service varnish restart
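
If everything worked, Varnish now answers requests on port 80 while Apache only listens locally on port 8080. One way to verify this from the server itself is to inspect the response headers:

curl -I http://127.0.0.1/

The response should contain the 'Via' and 'X-Varnish' headers added by Varnish; for cacheable content, repeated requests should also show an increasing 'Age' value. The varnishstat and varnishlog tools provide more detailed information about cache hits and misses.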

To find additional instructions on how to install Varnish on other Unix-based operating systems as well as the software’s program code, head to the download section on Varnish’s official website.
