{"id":1399,"date":"2013-04-18T16:24:10","date_gmt":"2013-04-18T14:24:10","guid":{"rendered":"http:\/\/www.alkannoide.com\/?p=1399"},"modified":"2013-06-14T17:06:38","modified_gmt":"2013-06-14T15:06:38","slug":"mistserver-optimize-the-http-delivery-via-caching","status":"publish","type":"post","link":"https:\/\/www.alkannoide.com\/2013\/04\/18\/mistserver-optimize-the-http-delivery-via-caching\/","title":{"rendered":"Mistserver – Optimize the HTTP delivery via caching"},"content":{"rendered":"
In the previous post, we looked at the new features of MistServer (version 1.1).<\/a> For this test, I run MistServer and Varnish on the same server. Here is the architecture.
\n<\/a>Today, I present an idea suggested by a friend, Nicolas Weil<\/a>. We talked about content caching for an architecture based on MistServer, especially for the HTTP-based formats. We thought about Varnish<\/a>, an HTTP accelerator<\/a>. The idea is to keep in cache the different fragments generated by MistServer. This article will not cover the RTMP or TS parts, as these protocols are not HTTP-based.<\/p>\n
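<p>As a quick illustration of the caching side, here is a minimal Varnish VCL sketch (Varnish 3.x syntax). It assumes MistServer's HTTP connector listens on its default port 8080 on the same machine, and the 60-second TTL for the fragments is only an example value, not a recommendation from MistServer or Varnish.<\/p>\n
<pre>
# Minimal sketch: Varnish in front of MistServer's HTTP output
# (assumed to be reachable on 127.0.0.1:8080)
backend mistserver {
    .host = \"127.0.0.1\";
    .port = \"8080\";
}

sub vcl_recv {
    # send every incoming request to the MistServer backend
    set req.backend = mistserver;
}

sub vcl_fetch {
    # keep the generated HTTP fragments in cache for a short time (example value)
    set beresp.ttl = 60s;
    return (deliver);
}
<\/pre>\n
<p>With something like this in place, the first request for a given fragment is served by MistServer, and the following requests for the same fragment are answered directly from the Varnish cache.<\/p>\n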