A user pointed me to a suspicious website, and so far the suspicions look valid. We plan to do a post, but of course I want to archive everything first. Two problems:
When I submit the URLs I want to archive, archive.is doesn't capture everything on a given page; pictures, for example, are missing. I'm not very experienced with archive.is, but submitting one page URL at a time should do the trick? Well, it doesn't for me: the archived page looks nothing like the original. See these screenshots of the page before and after archiving: http://imgur.com/a/qtEe4
Second problem: even if I gave out the URL here for someone else to archive, we still couldn't capture everything, because some pages are only accessible if you are registered. I won't give away my login data, and I don't want more people registering there either: the site seems to be run by a small organization, and a burst of new registrations in a short timeframe would look suspicious.
So I hope to get help with archiving! Any ideas what I am doing wrong?
IShallNotFear ago
I just read an article about a tool called "wget". Maybe that could help you. Apparently you can download files from websites quickly with it; the article said a leaker used it to hand thousands of documents to Wikileaks. The article is here: https://www.labnol.org/software/wget-command-examples/28750/ The tool is here: https://www.gnu.org/software/wget/
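I haven't tried it on your site, so this is only a rough sketch, but something along these lines might grab a page together with its pictures. The URL is just a placeholder, and cookies.txt stands for cookies you'd export from your own logged-in browser session, so nobody else would need your login:

```
# Mirror the page plus the images/CSS/JS it needs, and rewrite links so the
# copy works offline. --load-cookies lets wget reuse an existing login session.
wget --mirror \
     --page-requisites \
     --convert-links \
     --adjust-extension \
     --no-parent \
     --wait=2 --random-wait \
     --load-cookies cookies.txt \
     https://example.org/some/page
```

The --wait and --random-wait options just slow the requests down so the crawl doesn't hammer their small server or stand out in their logs.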
I apologize if I am posting too many suggestions. I'm just trying to find a solution that works.
sensitive ago
On the contrary! I just don't want to waste your time on something that most likely isn't possible here. Your suggestions are highly appreciated.