How do I block access to robots.txt for everyone except search engines?

I want to deny access to robots.txt with a deny from all rule in .htaccess, but still allow certain search engines in. Bots can be identified by HTTP_USER_AGENT, for example %{HTTP_USER_AGENT} !^yandex.*

How do I write such a rule in .htaccess so that everyone gets a 403 except specific bots?
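Roughly speaking, I have in mind something like this mod_rewrite sketch (I am not sure of the exact syntax, which is why I'm asking):

RewriteEngine On
# intended sketch: forbid robots.txt unless the User-Agent looks like a known crawler
RewriteCond %{HTTP_USER_AGENT} !(googlebot|yandex) [NC]
RewriteRule ^robots\.txt$ - [F]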
July 2nd 19 at 13:31
1 answer
July 2nd 19 at 13:33
# mark requests whose User-Agent looks like a known crawler
SetEnvIfNoCase User-Agent .*google.* search_robot
SetEnvIfNoCase User-Agent .*yandex.* search_robot
SetEnvIfNoCase User-Agent .*bot.* search_robot

# deny everyone, then let the marked requests through
Order Deny,Allow
Deny from All
Allow from env=search_robot
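A side note: Order/Deny/Allow is Apache 2.2 syntax. If the server runs Apache 2.4 without mod_access_compat, the equivalent rule would presumably use Require instead; a sketch, scoped to robots.txt only:

SetEnvIfNoCase User-Agent "(google|yandex|bot)" search_robot
<Files "robots.txt">
    # Apache 2.4 style: allow only requests marked as coming from a crawler
    Require env search_robot
</Files>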
A block like this is easy to bypass:
curl_setopt($ch, CURLOPT_REFERER, 'google.com');
curl_setopt($ch, CURLOPT_USERAGENT, 'search_robot'); - Bernie commented on July 2nd 19 at 13:36
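For illustration, the same spoof is a one-liner from the command line (example.com is a stand-in for the real site):

curl -A "YandexBot" http://example.com/robots.txt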
: Of course, I know that. My answer is a direct answer to the question, nothing more. Since the author of the question is aware of the general principles of web server configuration, I assume he also understands that filtering by User-Agent only keeps out the honest. - Mark.Kub commented on July 2nd 19 at 13:39
: : Sorry, I'm not competent in these matters, which is why I asked for expert opinions. I just need the exact code to put in .htaccess so that everyone gets a 403 except the robots. There's no need to dig deep into things like curl_setopt and so on; basic fool-proofing is enough, anything beyond that is full-blown server administration, and this is just shared hosting. Thank you. - delbert commented on July 2nd 19 at 13:42
: The code I quoted above allows access for three robots and denies everyone else. But since the check is based on the User-Agent, anyone can fetch your robots.txt from a browser or with cURL simply by "pretending" to be a robot. - Mark.Kub commented on July 2nd 19 at 13:45
Did the code work, or do you need further explanation? - Mark.Kub commented on July 2nd 19 at 13:48
: Good afternoon. Thank you very much for taking the time. This:
SetEnvIfNoCase User-Agent .*google.* search_robot
SetEnvIfNoCase User-Agent .*yandex.* search_robot

Order Deny,Allow
Deny from All
Allow from env=search_robot

is not working, and neither is the other variant (the tags were stripped here; I'll repost the code below). The code sits at the very beginning of .htaccess, which of course is in the site root. :( - delbert commented on July 2nd 19 at 13:51
It didn't display correctly...
Here is the code again, with spaces added:

SetEnvIfNoCase User-Agent .*google.* search_robot
SetEnvIfNoCase User-Agent .*yandex.* search_robot

< FilesMatch robots.txt >
Order Deny,Allow
Deny from All
Allow from env=search_robot
< /FilesMatch >

And if I just use < Files robots.txt >, it unfortunately doesn't work either (I can open мойсайт.ру/robots.txt without any problem). - delbert commented on July 2nd 19 at 13:54
: Do you have mod_setenvif installed/enabled? - Mark.Kub commented on July 2nd 19 at 13:57
: I asked my hosting provider for help, and here is their answer. It's not that simple. "We wrote a rule that blocks access to the robots.txt file for users while still allowing access for the specified user agents:

SetEnvIfNoCase User-Agent "^User-agent" search_bot

order deny,allow
deny from all
allow from env=search_bot

However, robots.txt is a static file and is served by the NGINX web server, so the .htaccess rule does not apply to it.

We can switch how static files are served for your website, and then the rule will work correctly.

But this may increase the website's resource usage, because all static files would then be handled by the Apache web server."

So now I'm weighing it up: how much would the load increase... should I switch static file handling over or not?.. - delbert commented on July 2nd 19 at 14:00
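For what it's worth, if robots.txt really is served directly by NGINX, the same idea would have to go into the NGINX config rather than .htaccess; a rough sketch (assuming access to the server block, which shared hosting usually doesn't give you):

location = /robots.txt {
    # forbid unless the User-Agent looks like a known crawler
    if ($http_user_agent !~* "(googlebot|yandex)") {
        return 403;
    }
}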
Toster clips the code. It was like this (only without the spaces):
SetEnvIfNoCase User-Agent "^User-agent" search_bot
< Files robots.txt >
order deny,allow
deny from all
allow from env=search_bot
< /Files > - delbert commented on July 2nd 19 at 14:03
Toster does not cut off code if you wrap it in a code tag.
Unless your robots.txt is requested a thousand times per second, you won't see any increase in load. Well, yes, who could have guessed that robots.txt was being served differently.
The code you quoted in your last comment should not be used as-is; it's just a template, not working code. - Mark.Kub commented on July 2nd 19 at 14:06
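For completeness, a filled-in version of that template might look roughly like this once static files are switched over to Apache (a sketch with example crawler patterns substituted for the placeholder; it still only keeps out the honest):

SetEnvIfNoCase User-Agent "googlebot" search_bot
SetEnvIfNoCase User-Agent "yandex" search_bot
<Files "robots.txt">
    order deny,allow
    deny from all
    allow from env=search_bot
</Files>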
: Yes, I understand it's about the template: in place of "^User-agent" you have to substitute something specific (the search engine name or a fragment of it). As for the clipped code, unfortunately there's no way to format it in comments (like this reply to you); only in a full answer can you wrap text in a code block or pick a specific language. I'll take your advice... indeed, a static robots.txt isn't requested often enough to noticeably increase the overall load on the server... - delbert commented on July 2nd 19 at 14:09
: If Toster didn't support the code tag in comments, how could I have written this: "What service/program to take a screenshot to look like on MAC?" (see the comments on my answer)? You just have to type the tags manually instead of using the nice toolbar.
You keep arguing about things you don't know. - Mark.Kub commented on July 2nd 19 at 14:12
: I'm sorry, this is just the first question I've asked on Toster... I didn't know tags could be typed in manually. I'll keep it in mind in the future. - delbert commented on July 2nd 19 at 14:15
