How to fix a disallow issue in robots.txt

I have noticed that some of my pages are disallowed from crawling in my robots.txt file, but in the SEO settings I have set those same pages to be shown in search engines. How can I fix this problem?
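To see why a Disallow rule blocks a page regardless of the SEO settings, you can test the rules yourself. Below is a minimal sketch using Python's standard `urllib.robotparser`; the rules and URLs are made up for illustration. (Note that Google matches the most specific rule, while Python's parser uses the first rule that matches, which is why the `Allow` line comes first here.)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a blanket Disallow on /private/ with a
# specific Allow carved out for one page that should stay indexable.
robots_txt = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The carved-out page is crawlable; everything else under /private/ is not.
print(rp.can_fetch("*", "https://example.com/private/public-page.html"))  # True
print(rp.can_fetch("*", "https://example.com/private/secret.html"))       # False
```

If the check returns False for a page you want indexed, the fix is to edit (or let your site builder regenerate) robots.txt, not the SEO settings alone.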


Message 1 of 4

@freelancermurad 

Submit the sitemap.xml more often, and hope that Google indexes your pages.

That said, the basketball rooms page didn't have any content; almost-empty pages are not a priority for indexing.

Message 2 of 4

Solution:

Try using the Rank Math or Yoast SEO plugin and enable the auto-generated sitemap option.

Make sure you have submitted the sitemap in Search Console as well, and keep working at it.
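For reference, an auto-generated sitemap is just an XML file listing the pages you want crawled. A minimal sketch (the URL and date below are placeholders, not from the site in question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>https://www.example.com/page-to-index.html</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
</urlset>
```

This is the file you point Search Console at when you submit the sitemap.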

Regards,

Jobs

Message 3 of 4

@ghotojab @freelancermurad This is Weebly; there is no need to add WordPress plugins on a Weebly site (also at @Bernadette).

If Google wants to index your site, it will; you can't hurry them. Equally, if Google doesn't want to, well, improve your SEO and maybe it will.

Message 4 of 4
This thread has been archived and locked. Please start a new thread if you would like to continue the conversation.