@krishnathapa,
As the import runs synchronously, you should prepare your server for a task that will take a very long time.
Thank you for your quick response. Could you please provide a bit more insight on this?
1. How long might it take on an AWS server?
2. What changes do I need to make on our server?
3. I could also try it in batches. Any idea how many users would be reasonable per batch?
Regarding 1) and 2), I cannot tell you. This is something you should test and check with your sysadmin.
You cannot do 3), because if you enable “delete users that are not in CSV”, every batch will delete the users that belong to the other batches.
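To make that concrete, here is a minimal sketch (plain Python with made-up user names and batch sizes, not the actual import tool): each partial CSV only lists a third of the accounts, so with the delete option enabled every run would flag the users from the other batches for deletion.

```python
# Sketch of why splitting the CSV into batches breaks the
# "delete users that are not in CSV" option: every batch treats the
# users from the other batches as missing and would delete them.

existing_users = {f"user{i:05d}" for i in range(1, 30001)}  # 30,000 accounts on the site
full_csv = sorted(existing_users)                           # the complete CSV keeps everyone

# Split the CSV into three batches of 10,000 users each.
batches = [full_csv[i:i + 10000] for i in range(0, len(full_csv), 10000)]

for n, batch in enumerate(batches, start=1):
    # With the delete option enabled, each batch run would flag every
    # account that is absent from that batch's file.
    would_delete = existing_users - set(batch)
    print(f"batch {n}: keeps {len(batch)}, would delete {len(would_delete)}")

# Each batch keeps 10,000 users and would delete the other 20,000,
# so the delete option is only safe with a single complete CSV.
```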
Thank you!
It seems like this will be complex and risky. One more question: what will happen if the server terminates the deletion process partway through?
Can you suggest any other way to achieve this?
Best
Krishna
What will happen if the server terminates the deletion process partway through?
Not all users would be deleted.
Can you suggest any other way to achieve this?
No. We are working on an asynchronous process to avoid such a long-running task (as we have already done for the export process).
In any case, with such a high number of users it could create problems in any situation.
Okay. Will it be able to handle data this large once you move to the async process?
It will be better, but the delete process would still run synchronously as the last step.
This can be dangerous.
If you have that many users, I hope you have a good IT team that can handle it all safely.
Thank you for your help, @carazo. I will see what I can do from here.