Does anyone know of something a bit more user-friendly than wget that I can use to download a Wikipedia page PLUS the largest available version of every thumbnail image on that page? In other words, I want to be able to easily download Wikipedia pages for offline use and have the full-sized images downloaded at the same time.
I would love something that I could just feed a txt file of urls to.
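For concreteness, here's a rough Python sketch of the behavior I'm hoping a tool provides, not a finished solution. It assumes the requests and beautifulsoup4 packages, and relies on the usual Wikipedia thumbnail URL pattern (.../thumb/a/ab/Foo.jpg/220px-Foo.jpg), where dropping the /thumb segment and the trailing size component yields the full-size original; urls.txt and the output directory are just placeholder names:

    # Sketch: download a page's HTML plus the full-size original of each
    # thumbnail image. Assumes requests + beautifulsoup4 are installed.
    import os
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    def full_size_url(thumb_url):
        """Convert a Wikipedia thumbnail URL to the full-size original."""
        if "/thumb/" not in thumb_url:
            return thumb_url
        # Drop the "/thumb" segment and the trailing "220px-Foo.jpg" part.
        base = thumb_url.replace("/thumb/", "/", 1)
        return base.rsplit("/", 1)[0]

    def save_page_with_images(page_url, out_dir="offline"):
        os.makedirs(out_dir, exist_ok=True)
        html = requests.get(page_url, timeout=30).text
        soup = BeautifulSoup(html, "html.parser")
        for img in soup.find_all("img"):
            src = img.get("src")
            if not src:
                continue
            # urljoin also handles protocol-relative "//upload.wikimedia.org/..." srcs.
            url = full_size_url(urljoin(page_url, src))
            name = os.path.basename(urlparse(url).path)
            resp = requests.get(url, timeout=30)
            if resp.ok:
                with open(os.path.join(out_dir, name), "wb") as f:
                    f.write(resp.content)
        # Save the page's HTML alongside the downloaded images.
        page_name = os.path.basename(urlparse(page_url).path) or "index"
        with open(os.path.join(out_dir, page_name + ".html"), "w", encoding="utf-8") as f:
            f.write(html)

    if __name__ == "__main__":
        # One URL per line in a plain text file (filename is a placeholder).
        with open("urls.txt") as f:
            for line in f:
                url = line.strip()
                if url:
                    save_page_with_images(url)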