The winter semester is coming to an end. Please help us to preserve as many documents of this semester as possible. This is a team effort. If you have something that we are still missing, please feel free to upload it right here. (We've already added 3 files in the last 30 days). You are also always welcome to send us an e-mail at info@pasta.place
You are welcome to automatically download/scrape all files from this page. (I can't prohibit it anyway.) If you do, please direct your bot to pasta.place/scraping/. There you'll find a lightweight version of this page that isn't as computationally heavy for the server. (And you won't skew the file-access statistics this way. So please use it.)
Run your script preferably at night so it doesn't disrupt the service. The repository currently contains around 25 GB of documents.
Linux:
wget -r --no-parent -N -l inf -t 5 --limit-rate=2m -w 1 -nH -U "Wget-Scraper" --https-only https://pasta.place/scraping/
On subsequent executions this will only re-download files that have changed since the last download. This reduces server load and increases download speed.
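If you want to automate the nightly run suggested above, one option is a cron job. A sketch, assuming a hypothetical mirror directory ~/pasta-mirror (adjust the path and time to taste):

```
# crontab entry: run the mirror every night at 03:15,
# appending wget's output to a log file in the mirror directory
15 3 * * * cd ~/pasta-mirror && wget -r --no-parent -N -l inf -t 5 --limit-rate=2m -w 1 -nH -U "Wget-Scraper" --https-only https://pasta.place/scraping/ >> wget.log 2>&1
```

Add it with "crontab -e". Because of the -N timestamping flag, the nightly run stays cheap once the initial mirror exists.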
Windows:
You can download the latest wget.exe and put it in the directory where you want to store the downloaded files.
Then open a cmd window and change to that directory.
cd \path\to\download\folder
wget.exe -r --no-parent -N -l inf -t 5 --limit-rate=2m -w 1 -nH -U "Wget-Scraper" --https-only https://pasta.place/scraping/
On subsequent executions this will only re-download files that have changed since the last download. This reduces server load and increases download speed.