I have an online store, and the generated sitemap.xml contains 4,000 pages. What is the correct way to submit them to search engines? Should I disallow everything in robots.txt and trim the sitemap down to about 100 pages, then open the rest gradually as indexing progresses? Or can I just submit everything at once and let the search engine decide?
dashawn.Mayert15 answered on June 14th 19 at 18:05
There is no point in gradually "feeding" pages to search engines; just open everything up. By and large, the search engine does not care: it indexes the pages, analyzes the text, and shows them in the results. If everything is OK, it then looks at behavioral factors (and lowers the ranking if those are bad).
Eulalia22 answered on June 14th 19 at 18:07
And what do 4,000 pages mean to a search engine? Next to nothing; it is hardly different from zero pages.
The raw number of pages on a site has never been a ranking factor by itself.
Sites with hundreds of millions of pages launch today and open everything at once.
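For reference, the sitemap protocol allows up to 50,000 URLs (and 50 MB uncompressed) per sitemap file, so all 4,000 pages fit comfortably in a single sitemap.xml with no splitting or staging needed. A minimal sketch of generating one; the domain and URL paths are placeholders, not from the question:

```python
# Minimal sketch: build a single sitemap.xml for a few thousand URLs.
# The sitemap protocol allows up to 50,000 URLs per file, so 4,000 pages
# fit in one sitemap with no need for gradual "feeding".
# example.com and the /product/ paths below are placeholder values.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    # Emit the standard urlset wrapper with one <url><loc> entry per page,
    # XML-escaping each URL.
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

urls = [f"https://example.com/product/{i}" for i in range(1, 4001)]
sitemap = build_sitemap(urls)
print(sitemap.count("<loc>"))  # 4000 entries, well under the 50,000 limit
```

After generating the file, you would simply reference it from robots.txt (`Sitemap: https://example.com/sitemap.xml`) and submit it once via the search engines' webmaster tools.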