To speed up a MediaWiki site whose content uses many templates but is otherwise essentially static once the templates have done their work, I would like to set up a squid server as described in
https://www.mediawiki.org/wiki/Manual:PurgeList.php
and
https://www.mediawiki.org/wiki/Manual:Squid_caching
and then populate the squid cache server "automatically" with a script that makes wget / curl calls to all pages of the MediaWiki. My guess is that after this procedure each individual page will be in the squid cache (if I make it big enough), and every subsequent access will be served from squid.
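For the warm-up crawl I am thinking of something like the following (only a sketch; it assumes the wiki is reachable at http://XXXXXX/mediawiki and that every page can be reached by following links from the default Special:AllPages listing):

# Request every page once so squid stores it; --delete-after discards the
# downloaded files, since only the caching side effect matters here.
wget -r -l inf --delete-after "http://XXXXXX/mediawiki/index.php/Special:AllPages"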
How can I make this work? For instance:
- How to check the configuration?
- How to find out how much memory is needed?
- How can I check that the pages are in the squid3 cache?
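For the last two points, my untested idea (assuming squid's cache manager is enabled, the squidclient tool is installed, and Main_Page is the default front page) would be:

# Fetch the same page twice and inspect the X-Cache header that squid adds;
# the second response should report a HIT if the page was cached.
curl -s -D - -o /dev/null http://XXXXXX/mediawiki/index.php/Main_Page | grep -i X-Cache

# Ask squid's cache manager for memory/storage statistics (host and port
# matching the http_port line from my squid.conf below).
squidclient -h xx.xxx.xxx.xxx -p 80 mgr:info | egrep -i 'memory|storage'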
What I have tried so far:
I started by figuring out how to install squid.
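Assuming a Debian/Ubuntu system (which the squid3/apache2 service names below suggest), the install itself is just:

# "squid3" is the Debian/Ubuntu package name for squid 3.x;
# "squidclient" provides the cache-manager tool used in the checks above.
sudo apt-get update
sudo apt-get install squid3 squidclient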
I found out my IP address xx.xxx.xxx.xxx (not disclosed here) via ifconfig eth0.
In /etc/squid3/squid.conf I put

http_port xx.xxx.xxx.xxx:80 transparent vhost defaultsite=XXXXXX
cache_peer 127.0.0.1 parent 80 3130 originserver
acl manager proto cache_object
acl localhost src 127.0.0.1/32
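As for checking the configuration, I assume squid's own parser can do a first sanity check (sketch, using the Debian squid3 binary name):

# Parse the config file and report syntax errors without starting squid.
squid3 -k parse -f /etc/squid3/squid.conf
# Apply a changed config to an already running squid without a full restart.
squid3 -k reconfigure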
Then I set up the apache2 server.
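One detail I am unsure about (an assumption on my part): with squid bound to xx.xxx.xxx.xxx:80, apache must not listen on the same address and port, so I would bind it to the loopback only, matching the cache_peer 127.0.0.1 line above:

# /etc/apache2/ports.conf -- bind apache to the loopback only, so that
# squid alone owns the public address on port 80 (assumption, not verified).
Listen 127.0.0.1:80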
I added

$wgUseSquid = true;
$wgSquidServers = array('xx.xxx.xxx.xxx');
$wgSquidServersNoPurge = array('127.0.0.1');

to my LocalSettings.php.
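Once LocalSettings.php knows about squid, I understand from the purgeList.php manual linked above that purges can also be sent in bulk from the maintenance directory (sketch; I am assuming the --all option exists in my MediaWiki version):

# Send PURGE requests for all pages to the configured squid server(s);
# run from the root of the MediaWiki installation.
php maintenance/purgeList.php --all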
Then I restarted apache2 and started squid3 with
service squid3 restart
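To see whether both daemons actually came up and grabbed the ports I expect (the public IP for squid, the loopback for apache), a quick check would be:

# List listening TCP sockets with their owning processes; a refused
# connection usually means the expected listener is missing or bound
# to a different address.
sudo netstat -tlnp | egrep 'squid|apache'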
and made the first attempt to access using
wget --cache=off -r http://XXXXXX/mediawiki
result:
Resolving XXXXXXX (XXXXXXX)... xx.xxx.xxx.xxx
Connecting to XXXXXXX (XXXXXXX|xx.xxx.xx.xxx|:80)... failed: Connection refused.
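My next step for diagnosing the refused connection would be squid's own log (path assumed from the Debian squid3 package):

# Startup errors, e.g. a failed bind on xx.xxx.xxx.xxx:80, end up here.
tail -n 50 /var/log/squid3/cache.log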