How to Disable Webflow Subdomain Indexing?

You can stop search engines from indexing a page by publishing a robots.txt file. As the admin of a website or blog, you can stop an entire webflow.io subdomain, or a particular page, folder, or image, from being indexed. Disabling subdomain indexing also has advantages, such as keeping pages like your 404 page from being listed in search results.

What’s a subdomain?

A subdomain is generally treated as a standalone site branched off from the main domain.

For example, suppose the admin of a blog at theshift.com wanted to create a subdomain for a shop. The subdomain could be shop.theshift.com. Now that you understand what a subdomain is, let’s move on to the most important part.

How to disable indexing of the Webflow subdomain?

You can stop Google and other search engines from indexing your site’s webflow.io subdomain by deactivating indexing in your site’s settings. Here’s how to get it done without stress:

Step #1. Open your site settings, go to the SEO tab, and scroll down to the indexing section.

Step #2. Switch the “Disable Webflow subdomain indexing” toggle to “Yes”.

Step #3. Click the Save changes button and publish your site.

Once these steps have been carried out and saved, a unique robots.txt file gets published on the subdomain that stops all search engines from indexing it.
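The published file looks roughly like the sketch below, which tells every crawler to stay away from the whole subdomain (the exact file Webflow generates may differ):

```txt
# Served at https://your-site.webflow.io/robots.txt
User-agent: *
Disallow: /
```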

How to generate a robots.txt file?

A robots.txt file contains a list of URLs on a website that search engines are meant to avoid crawling.

To create a robots.txt file:

Step #1. Go to your site settings and find the indexing section in the SEO tab.

Step #2. Add your robots.txt rules.

Step #3. Click Save changes and publish your site.
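As an illustration, the rules you paste in might look like this (the /drafts/ folder name here is just an example, not something Webflow creates for you):

```txt
# Block all crawlers from one folder, allow everything else
User-agent: *
Disallow: /drafts/
```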

Looking to disable a particular Webflow page or URL?

Are you trying to prevent the discovery of a particular page or URL on your blog? If so, don’t use a robots.txt file to stop those URLs from being crawled — a page blocked by robots.txt can still be indexed if other sites link to it. Instead, use either of the options below:

  1. Don’t publish the pages. Instead, save them as drafts.
  2. Apply a noindex meta tag to stop search engines from indexing the page’s content.
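For the second option, the tag goes in the page’s `<head>` section and looks like this:

```html
<meta name="robots" content="noindex">
```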

Related: 8 Best Webflow SEO Tips for Better Rankings

Frequently Asked Questions

Can a robots.txt file prevent Webflow assets from being indexed?

A robots.txt file only applies to content served from the same domain where the file lives. Webflow serves assets from a global CDN rather than from the custom domain where your robots.txt file is hosted, so a robots.txt file can’t stop Webflow assets from being indexed.

How can I remove the robots.txt file from my site settings?

Once a robots.txt file has been created and published in your site settings, it can’t simply be removed to undo its effect. It can, however, be replaced with new rules that allow the site to be crawled again.

Conclusion

Always save the changes made to your site and make sure to republish it. If any issues come up or persist after applying your robots.txt rules to your published site, contact customer support or reach out to an expert for help.
