How do I fix the drop in the number of pages in search?
I recently decided to clean up duplicate pages on the website and added a Disallow block to robots.txt. It covers a lot of URLs.
In Yandex.Webmaster, loaded pages began to fall (by 30%), pages in search (by 50%), and excluded pages (by 50%).
Traffic also dropped by 50%.
Questions:
Why such a sharp drop in pages in search? I did NOT block the needed pages.
How do I stop the drop in pages in search and the drop in traffic?
Should I wait for a full re-indexing of the site? How long will that take?
I still plan to remove the duplicates; is it better to do that now or later?
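A broad Disallow rule can easily block far more URLs than intended. As a hedged illustration (the actual robots.txt and site paths are not shown in the question, so the rules and URLs below are invented), Python's standard urllib.robotparser can check which URLs a draft robots.txt blocks before it goes live:

```python
import urllib.robotparser

# Hypothetical rule intended to block only duplicate /catalog/print/ pages,
# but written as a bare prefix, so it matches everything under /catalog.
broad_rules = """User-agent: *
Disallow: /catalog
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(broad_rules.splitlines())

print(rp.can_fetch("*", "https://example.com/catalog/print/widget"))  # False: blocked, as intended
print(rp.can_fetch("*", "https://example.com/catalog/widget"))        # False: a needed page, blocked too!

# A narrower prefix blocks only the duplicates.
narrow_rules = """User-agent: *
Disallow: /catalog/print/
"""

rp2 = urllib.robotparser.RobotFileParser()
rp2.parse(narrow_rules.splitlines())

print(rp2.can_fetch("*", "https://example.com/catalog/print/widget"))  # False: still blocked
print(rp2.can_fetch("*", "https://example.com/catalog/widget"))        # True: needed page stays crawlable
```

Running every important URL through a check like this before deploying the new robots.txt would have shown whether needed pages were about to be blocked.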
In such cases it is better to start by analyzing the entry pages from search, for example via Yandex.Metrica. Only after making sure that search engines do not consider the duplicates more relevant should you exclude them. You may well have blocked pages that were actually bringing traffic.
cesar_Ritch answered on July 2nd 19 at 14:28
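The comparison of search-entry pages suggested above can be sketched like this. The page paths and visit counts are invented for illustration; real numbers would come from an analytics export such as Yandex.Metrica:

```python
# Hypothetical search-entry visit counts per landing page,
# before and after the robots.txt change.
before = {"/catalog/widget": 120, "/catalog/print/widget": 15, "/about": 40}
after = {"/catalog/widget": 10, "/catalog/print/widget": 0, "/about": 38}

# Pages whose entry traffic fell by more than half are the first
# suspects for having been blocked by mistake.
suspects = {
    page: before[page] - after.get(page, 0)
    for page in before
    if after.get(page, 0) < before[page] / 2
}
for page in sorted(suspects, key=suspects.get, reverse=True):
    print(page, "lost", suspects[page], "visits")
```

Any needed page that shows up in this list is a candidate for being caught by an over-broad Disallow rule.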
It is better to remove pages from the index with <meta name="robots" content="noindex, nofollow">;
robots.txt is a blunt instrument, since one broad rule can knock needed pages out along with the duplicates.
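As a sketch of how you might audit which pages actually carry the noindex directive, using only the Python standard library (the HTML below is a made-up duplicate page, not from the question):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content values of <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

# A hypothetical duplicate page marked for exclusion from the index:
html = (
    '<html><head>'
    '<meta charset="utf-8">'
    '<meta name="robots" content="noindex, nofollow">'
    '</head><body>Duplicate print view</body></html>'
)

parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)                                # ['noindex, nofollow']
print(any("noindex" in d for d in parser.directives))   # True
```

Note that for the meta tag to work, the page must remain crawlable: a page blocked in robots.txt will never have its noindex directive seen.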
Secondly, are you sure the right pages dropped out of the index? If not, check in the same Yandex.Webmaster which ones did. If only the duplicates dropped out while the needed pages remained, it means your duplicates were the versions search considered relevant, and you need to look into why.
If the pages that dropped from the index are ones you need, then your robots.txt rules are blocking them by mistake.