How to unblock access to a URL in the robots.txt file?


What am I doing wrong?
June 3rd 19 at 19:16
1 answer
June 3rd 19 at 19:18
This looks very much like a glitch, since robots.txt contains no rules prohibiting these URLs. As a workaround, I suggest adding Allow lines to robots.txt for each blocked URL and resubmitting the site to Google for checking.
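For illustration, a robots.txt with explicit Allow lines for the blocked URLs might look like this (the paths here are hypothetical; substitute the actual URLs Google reports as blocked):

```
User-agent: *
Allow: /page/about/
Allow: /category/news/
Disallow: /wp-admin/
```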

It is also doubtful, but theoretically possible, that Disallow: */page/ is preventing the site's URLs from being indexed.
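As a minimal sketch (not from the original question), you can check which URLs a given set of rules blocks with Python's standard urllib.robotparser; the rules and URLs below are hypothetical examples:

```python
# Check which URLs a robots.txt rule set blocks, using the stdlib parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /page/about
Disallow: /page/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Note: the stdlib parser matches rules by simple prefix, first match
# wins, and does NOT support Google-style '*' wildcards inside paths --
# a pattern like 'Disallow: */page/' is treated literally here.
print(parser.can_fetch("*", "https://example.com/page/about"))  # True (Allow matches first)
print(parser.can_fetch("*", "https://example.com/page/2"))      # False (blocked by Disallow)
```

Keep in mind that Google itself does support `*` wildcards and uses longest-match precedence, so this sketch only approximates how crawlers that follow the original prefix-matching spec behave.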
Tried it, but it's not helping...
For the second day now, the site hasn't appeared in the search. - Katherine_Hickle commented on June 3rd 19 at 19:21
Remove the Disallow rules for all robots and submit the site for review.

Then add them back one at a time. I looked very closely and don't see any specific problems in your robots.txt. - jovanny_Turcotte commented on June 3rd 19 at 19:24
I put Allow everywhere; still the same... - Katherine_Hickle commented on June 3rd 19 at 19:27

Find more questions by tags: Creating a site map, Search engine optimization, robots.txt, WordPress