Disallow indexing of subdomains in Robots.txt

Unfortunately, a plain robots.txt has no directive that can target a single subdomain, so I had to resort to PHP.

So, create a robots.php file with the following code:

<?php
header('Content-type: text/plain');
// Strip the main domain from the requested host; anything left over
// means the request came in on a subdomain.
// Replace yourdomain.com with your actual domain.
$subdomain = str_replace('yourdomain.com', '', $_SERVER['HTTP_HOST']);
if ($subdomain !== '') {
    echo "User-agent: googlebot\n";
    echo "Disallow: /\n\n";
}
?>
User-agent: *
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
#other rules

Sitemap: https://<?php echo $_SERVER['SERVER_NAME']; ?>/sitemap.xml

In this example, when the robots file is requested from a subdomain, the directives User-agent: googlebot and Disallow: / are added, which prevents Google from indexing that subdomain. You can specify your own values for individual User-agent entries.
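Note that with the str_replace check above, www.yourdomain.com would also be treated as a subdomain and receive the extra Disallow block. If that is not what you want, one variant is to whitelist the hosts that should keep normal indexing; a minimal sketch, where example.com and the host list are assumptions to adapt:

<?php
header('Content-type: text/plain');
// Hypothetical variant: only hosts in this list are treated as the main
// site; every other host (i.e. any subdomain) gets the extra block.
$mainHosts = array('example.com', 'www.example.com');
$host = strtolower($_SERVER['HTTP_HOST']);
if (!in_array($host, $mainHosts, true)) {
    echo "User-agent: googlebot\n";
    echo "Disallow: /\n\n";
}
?>

Because the comparison is exact, any new subdomain you create later gets the block automatically without touching the file.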

Once the robots.php file has been placed in the root folder of the site, we need to adjust the .htaccess file by adding the following line:

RewriteRule ^robots\.txt$ /robots.php [L,QSA]
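This rule assumes mod_rewrite is available and already switched on. If your .htaccess has no rewrite block yet, a minimal sketch would be:

<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteRule ^robots\.txt$ /robots.php [L,QSA]
</IfModule>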

Delete the robots.txt file and check that the new robots file works, for example by requesting robots.txt on the main domain and on a subdomain and comparing the output. (By the way, on some hosts the rewrite from robots.txt to robots.php only works when both files are present in the site root, so whether you can safely delete robots.txt needs to be checked for your hosting.)

