How to save a complete HTML page with all media and preserve its structure

Looking for a Linux application (or Firefox extension) that will let me save an HTML page with its layout and keep the page intact. Firefox does an almost perfect job, but doesn't capture images referenced in the CSS.

The ScrapBook extension for Firefox gets everything, but flattens the directory structure.

I wouldn't even mind if all the folders became children of the index page.

+4
4 answers

See Mirroring a site using wget

wget --mirror -w 2 -p --html-extension --convert-links http://www.yourdomain.com
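
For reference, an annotated equivalent, assuming a reasonably recent GNU wget (where --html-extension has been renamed --adjust-extension; the domain is a placeholder):

wget --mirror --wait=2 --page-requisites --adjust-extension --convert-links http://www.yourdomain.com

--mirror turns on recursion with timestamping, --wait=2 pauses two seconds between requests, --page-requisites also downloads the images, stylesheets and scripts each page needs, --adjust-extension saves files with an .html suffix, and --convert-links rewrites links so the copy works locally. Newer wget releases also follow url(...) references inside downloaded CSS, which addresses the CSS-image gap from the question.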
+5

Have you tried wget?

+2

wget -r does what you want, and if not, there are many flags to configure it. See man wget. A fuller sketch is shown below.
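
A minimal sketch of a fuller invocation (GNU wget flag names; the depth and URL are placeholders to adjust):

wget -r -l 2 -p -k -E -np http://www.example.com/page.html

-r recurses, -l 2 limits the depth to two levels, -p grabs page requisites (images, CSS), -k converts links for local browsing, -E appends an .html extension where needed, and -np stops wget from climbing into parent directories.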

Another option is curl, which is even more flexible for individual transfers. See http://curl.haxx.se/.
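
Note that curl does not recurse on its own, so for this task it is best suited to grabbing single pages; a minimal sketch (the output filename is a placeholder):

curl -L -o page.html http://www.yourdomain.com/

-L follows redirects and -o writes the response body to page.html.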

+1

Teleport Pro is great for this kind of thing. You can point it at complete websites, and it will download a local copy, maintaining the directory structure and, if necessary, replacing absolute links with relative ones. You can also specify whether to fetch content from third-party sites linked from the source site.

0

Source: https://habr.com/ru/post/1276788/
