I should preface this by saying that I’m not really a data hoarder, but I’ve been tasked with downloading and preserving the content of some MediaWiki-based wikis, plus a couple of Fandom ones (which, to my knowledge, also run on MediaWiki), which I feel is in the spirit of this sub.
So far I’ve figured out that I can use WikiTaxi to browse dumps, but the trouble I’m running into is that, while Wikipedia’s XML dumps are readily available for download, other MediaWiki sites aren’t so forthcoming with theirs from what I’ve seen.
Is there any way I can get my hands on those dumps? Preferably through a self-contained program or similar that doesn’t require running Python (I’m nowhere near tech-savvy enough to use Python, and since there are several wikis involved I’d like to keep the process as simple as humanly possible).
I would genuinely appreciate any advice or information you might be able to provide.
Use WikiTeam’s dumpgenerator.
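For what it’s worth, a typical invocation looks roughly like the following (the wiki URL is a placeholder; point it at the target wiki’s `api.php` endpoint, which you can usually find at `Special:Version` on the wiki itself):

```shell
# Sketch of a WikiTeam dumpgenerator run, assuming the script is in the
# current directory and Python is installed; the URL below is a placeholder.
python dumpgenerator.py --api=https://wiki.example.com/api.php --xml --images
```

`--xml` grabs the full page history as an XML dump and `--images` downloads the media files alongside it. It does mean running Python, but it’s a single command rather than any actual programming, and you can repeat it once per wiki.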