I've run into a problem while working on a project. I want to crawl certain sites of interest and save them as "full web pages", complete with styles and images, so I can create mirrors of them. More than once I've bookmarked a site to read later, only to find it gone a few days afterward because it had been hacked and the owner had no backup of the database.
Of course, reading a file with PHP is easy enough using fopen("http://website.com", "r") or fsockopen(), but the main goal is to save complete web pages, so that if a site goes down it stays available to others; a programming "time machine", if you will. :)
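For context, here is roughly what I can already do, a minimal sketch that just grabs the raw HTML; the URL and output path are placeholders, and it assumes the saved/ directory exists:

```php
<?php
// Minimal sketch: fetch a page's raw HTML over HTTP and save it to disk.
// Requires allow_url_fopen; URL and output path are placeholders.
$handle = fopen('http://website.com', 'r');
if ($handle === false) {
    die("Could not open the URL\n");
}
$html = stream_get_contents($handle);
fclose($handle);

file_put_contents('saved/index.html', $html);
```

This obviously saves only the bare HTML, without any of the stylesheets or images the page depends on.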
Is there a way to do this without having to parse the page and fetch and save every linked resource individually?
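To be concrete, this is the per-resource approach I'd like to avoid, sketched with DOMDocument; the URL handling is naive (it ignores absolute URLs and relative-path subtleties) and all names and paths are placeholders:

```php
<?php
// The per-link approach: parse the HTML, then fetch every image and
// stylesheet one by one. Requires allow_url_fopen.
$base = 'http://website.com/';
$html = file_get_contents($base);

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from real-world malformed markup

// Collect asset URLs from <img src> and stylesheet <link href> tags.
$assets = [];
foreach ($doc->getElementsByTagName('img') as $img) {
    $assets[] = $img->getAttribute('src');
}
foreach ($doc->getElementsByTagName('link') as $link) {
    if ($link->getAttribute('rel') === 'stylesheet') {
        $assets[] = $link->getAttribute('href');
    }
}

// Download each asset next to the saved page.
foreach ($assets as $src) {
    $data = file_get_contents($base . ltrim($src, '/'));
    if ($data !== false) {
        file_put_contents('saved/' . basename($src), $data);
    }
}
```

Doing this properly for every page, with correct URL resolution, CSS-referenced images, scripts, and so on, gets messy fast, which is why I'm hoping there's a better way.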
Objective-C solutions are welcome too, since I'm trying to learn it as well.
Thanks!