How do I fix the drop in the number of pages in search?

Hello!
I recently decided to clean up duplicate pages on the website and added a blocking rule to robots.txt. It blocks quite a lot of URLs.
In Yandex.Webmaster the counts began to fall: loaded pages (-30%), pages in search (-50%), excluded pages (-50%).
Traffic also dropped by 50%.
Questions: why such a sharp drop in pages in search? I did NOT block any pages that I actually need.
How do I stop the drop in pages in search, and the drop in traffic?
Should I wait for a full re-indexing of the site? How long will that take?
I am going to remove the duplicates; is it better to do this now or later?
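For context, a deduplication block in robots.txt often looks like the sketch below (the paths are hypothetical, not taken from the asker's site). Note how a broad wildcard pattern can block far more than the intended duplicates:

```
User-agent: *
# Hypothetical example: blocking parameterized duplicate URLs
Disallow: /*?sort=
Disallow: /*?page=
# A broader pattern like the one below also catches legitimate
# pages whose canonical URL happens to contain a "?":
# Disallow: /*?*
```

If a rule like the last one was used, a drop in loaded and indexed pages well beyond the duplicate set would be expected.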
July 2nd 19 at 14:24
3 answers
July 2nd 19 at 14:26
In such cases it is better to start by analyzing the search entry pages, for example through Yandex.Metrica. Only after making sure the search engines do not consider the duplicates more relevant should you exclude them. Perhaps you have excluded a page that was actually bringing in traffic.
July 2nd 19 at 14:28
It is better to close a page from the index with
<meta name="robots" content="noindex, nofollow">
since search engines can be slow or unreliable in honoring robots.txt.
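A minimal sketch of where the tag goes (assumed generic page, not the asker's markup). One caveat: for the crawler to see this tag, the page must NOT also be blocked in robots.txt, otherwise the robot never fetches the HTML at all:

```
<!DOCTYPE html>
<html>
<head>
  <title>Duplicate page</title>
  <!-- "noindex" keeps the page out of the index;
       "nofollow" additionally tells robots not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>...</body>
</html>
```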

Secondly: are you sure the right pages dropped out of the index? If you are not, check in the same Webmaster tools. If only the duplicates dropped out and the needed pages remained, it means your duplicates had been ranking as the relevant versions; you need to dig in and understand why.
If pages you actually need dropped out of the index, then you made a mistake in robots.txt.
And I do not quite understand you: you excluded a lot of pages (by your own account) and are surprised that the loaded/in-search/excluded counts in Webmaster are falling. Everything is working as it should.
As for the traffic, you need to investigate. - cesar_Ritch commented on July 2nd 19 at 14:31
July 2nd 19 at 14:30
On December 9, Yandex.Webmaster introduced a new tool for working with new and removed pages: https://yandex.ru/blog/webmaster/novosti-vebmaster...
You can try it.

Find more questions by tags robots.txt, Search engine optimization