How to fix a disallow issue in robots.txt
I have seen that some of my pages are disallowed from crawling in my robots.txt file, even though I have set those pages to be shown in search engines in the SEO settings. How can I fix this problem?
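To see exactly which pages robots.txt blocks, here is a minimal sketch using Python's standard urllib.robotparser; the domain and page paths are placeholders to replace with your own.

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL: point this at your own site's robots.txt.
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# Placeholder paths: list the pages you expect to be indexable.
for url in ("https://example.com/", "https://example.com/basketball-rooms"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "disallowed"
    print(f"{url}: {verdict} for Googlebot")
```

If a page you enabled in the SEO settings prints as disallowed here, the robots.txt actually being served does not match those settings.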
Submit the sitemap.xml more often and hope that Google indexes your pages. That said, the basketball rooms page didn't have any content, and almost-empty pages are not a priority for indexing.
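For reference, a sitemap is just a list of `<loc>` URLs inside a `<urlset>`; the sketch below generates a minimal one with Python's standard library. The URLs are placeholders, and Weebly normally generates sitemap.xml for you, so this only illustrates the structure crawlers expect to find.

```python
from xml.etree import ElementTree as ET

# Placeholder URLs: substitute the pages you want indexed.
pages = ["https://example.com/", "https://example.com/basketball-rooms"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = page  # each entry only needs a <loc>

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```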
Solution:
Try the Rank Math or Yoast SEO plugin and enable its auto-generated sitemap option. Make sure you have submitted the sitemap in Google Search Console as well, and keep working at it.
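To confirm the sitemap is discoverable by crawlers, you can also check that robots.txt advertises it via the standard Sitemap: directive; a minimal sketch follows, with the domain as a placeholder.

```python
from urllib.request import urlopen

# Placeholder domain: substitute your own site.
with urlopen("https://example.com/robots.txt") as resp:
    robots = resp.read().decode("utf-8", errors="replace")

# "Sitemap:" is a standard robots.txt field that tells crawlers
# where to find your sitemap.xml.
sitemaps = [line.split(":", 1)[1].strip()
            for line in robots.splitlines()
            if line.lower().startswith("sitemap:")]
print(sitemaps if sitemaps else "no Sitemap directive found")
```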
Regards,
@ghotojab @freelancermurad This is Weebly; there is no need to add WordPress plugins to a Weebly site (also @Bernadette).
If Google wants to index your site, it will; you can't hurry them. Equally, if Google doesn't want to, well, improve your SEO and maybe it will.