
Update 'dev/tools/nextcloud-S3-migration/README.md'

Pierre-Yves Barriat, 1 year ago
Commit b804f82e50
1 file changed, 17 insertions and 14 deletions
      dev/tools/nextcloud-S3-migration/README.md

dev/tools/nextcloud-S3-migration/README.md

@@ -13,9 +13,8 @@ It will transfer files from **local** based primary storage to a **S3** primary
 Before you start, it is probably wise to set $DO_FILES_CLEAN (occ files:cleanup)
 and $DO_FILES_SCAN (occ files:scan --all) to '1' once, and let Nextcloud do some checking; then you'll start out as clean as possible.
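
If you prefer to run those two checks by hand instead of through the script, a sketch — the web-server user 'apache', the 'php81' binary and the Nextcloud path are assumptions taken from the commands later in this README:

```bash
# Hypothetical manual equivalent of $DO_FILES_CLEAN / $DO_FILES_SCAN;
# adjust the user, PHP binary and path to your own setup:
cd /var/www/nextcloud                        # assumed Nextcloud root
sudo -u apache php81 occ files:cleanup       # drop stale file-cache rows
sudo -u apache php81 occ files:scan --all    # re-scan all users' files
```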
 
-0. make sure that the Nextcloud cron job is disabled and make sure that you local data is sufficiently large
-
-1. go to the nextcloud root folder and install composer
+- make sure that the Nextcloud cron job is disabled, and make sure that your local data storage is sufficiently large
+- go to the Nextcloud root folder and install composer
 
 ```bash
 wget https://getcomposer.org/installer
@@ -27,37 +26,41 @@ rm -f composer-setup.php
 chmod 755 bin/*
 ```
 
-2. install 'aws/aws-sdk-php'
+- install 'aws/aws-sdk-php'
 
 ```bash
 runuser -u apache -- ./bin/composer.phar require aws/aws-sdk-php
 ```
 
-3. place 'storage.config.php', 'localtos3.php' and 'tests3.php' in the nextcloud root folder (and set your S3 credentials!)
-4. set & check all the config variables in the beginning of the scripts!
-5. check your S3 config 
+- place 'storage.config.php', 'localtos3.php' and 'tests3.php' in the nextcloud root folder (and set your S3 credentials!)
+
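The exact 'storage.config.php' ships with this tool, but as a rough sketch, a Nextcloud S3 object-store block generally looks like the following — every value (bucket, keys, hostname, region) is a placeholder to be replaced with your own:

```php
<?php
// Sketch of an S3 primary-storage block — all values are placeholders;
// copy your real credentials into the file shipped with this tool.
$CONFIG = [
  'objectstore' => [
    'class' => '\\OC\\Files\\ObjectStore\\S3',
    'arguments' => [
      'bucket'         => 'nextcloud',        // your bucket name
      'key'            => 'YOUR_ACCESS_KEY',
      'secret'         => 'YOUR_SECRET_KEY',
      'hostname'       => 's3.example.com',
      'port'           => 443,
      'use_ssl'        => true,
      'region'         => 'us-east-1',
      'use_path_style' => true,               // needed for most non-AWS S3
    ],
  ],
];
```
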
+- set & check all the config variables at the beginning of the scripts!
+
+- check your S3 config
 
 ```bash
 sudo -u apache php81 -d memory_limit=1024M tests3.php
 ```
 
-6. start with the highest $TEST => 2 (complete dry run, just checks en dummy uploads etc. NO database changes what so ever!)
+- start with the highest $TEST => 2 (a complete dry run: just checks and dummy uploads etc. NO database changes whatsoever!)
 
 ```bash
 sudo -u apache php81 -d memory_limit=1024M localtos3.php
 ```
 
-7. set $TEST to a 'small test user", upload the data to S3 for only that user (NO database changes what so ever!)
-8. set $TEST to 1 and run the script yet again, upload (**and check**) all the data to S3 (NO database changes what so ever!)
-9. set $TEST to 0 and run the script again (this is LIVE, nextcloud will be set into maintenance:mode --on while working ! **database changes!**)
+- start the migration:
 
-> **DO NOT** skip ahead and go live ($TEST=0) as the first step.. then your downtime will be very long!
+  > **DO NOT** skip ahead and go live ($TEST=0) as the first step, or your downtime will be very long!
+ 
  1. set $TEST to a 'small test user', and upload the data to S3 for only that user (NO database changes whatsoever!)
  2. set $TEST to 1 and run the script again, uploading (**and checking**) all the data to S3 (NO database changes whatsoever!)
  3. set $TEST to 0 and run the script one more time (this is LIVE: Nextcloud will be set into maintenance:mode --on while working! **database changes!**)
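
The three phases above boil down to re-running the same command while editing $TEST inside localtos3.php between runs (the user name below is a placeholder):

```bash
# Same invocation each time; only $TEST inside localtos3.php changes:
sudo -u apache php81 -d memory_limit=1024M localtos3.php  # $TEST = 'alice' (one test user)
sudo -u apache php81 -d memory_limit=1024M localtos3.php  # $TEST = 1 (all users, no DB changes)
sudo -u apache php81 -d memory_limit=1024M localtos3.php  # $TEST = 0 (LIVE run)
```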
 
-With performing 'the move' at step 8 you will decrease the downtime (with maintenance mode:on) immensely!
+By performing 'the move' in step 2 you will decrease the downtime (with maintenance mode:on) immensely!
 This is because the script first checks whether it has already uploaded the latest version of a file; if so, it can skip to the next one and does not need to (slowly) upload it to your S3 bucket!
 With a little luck the final run (with $TEST=0) can be done within a minute!
 
-> **NOTE** step 8 will take a very long time when you have a lot of data to upload!
+> **NOTE** step 2 will take a very long time when you have a lot of data to upload!
 
 If everything worked, you might want to delete the data in the local data folder.
 You probably also want to delete this script (and 'storage.config.php') after running it.
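
The cleanup can be sketched as follows — file names as used in this README; the data path is a placeholder, so triple-check it before deleting anything:

```bash
# Remove the migration helpers from the Nextcloud root once the move
# has been verified ('rm -f' is safe to re-run if they are already gone):
rm -f localtos3.php tests3.php storage.config.php
# Then, only after verifying everything is on S3, the old local data:
# rm -rf /path/to/nextcloud/data/*   # placeholder path — triple-check!
```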