Nerdy Bot etc

I received the following robots.txt content when trying to get my site indexed and crawled:

User-agent: NerdyBot
Disallow: /

User-agent: dotbot
Crawl-delay: 10

User-agent: *
Disallow: /ajax/
Disallow: /apps/

I have read elsewhere on the site that there is no need to be concerned. However, that does not tell me how to overcome the fact that my site is not being fetched. I have been verified as the site owner, but after two weeks of trying I just cannot get Google to co-operate. Any help would be very welcome, please.

Square Champion

Hi @theoldandgrey6.  Question: is your site a Square/Weebly hosted site?  If so, then disallowing NerdyBot (which is an AI bot crawler) is new to me.  I didn’t know Square was doing that yet, though I’m happy if they are.

 

A quick perusal of StackOverflow.com tells me that Google’s bots can sometimes decide to ignore a site completely when they see a disallow directive in the robots.txt file. So, if you are able to edit that txt file (and Square isn’t managing it for you), edit it and add the following at the end:

 

User-agent: Googlebot
Allow: /

 

No one seems to know what is up with Google here, but this seems to work for most folks.
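If you want to sanity-check the file before or after editing it, Python’s standard-library robots.txt parser can show you what each crawler is actually allowed to fetch. A minimal sketch — the rules string mirrors the robots.txt from the original post, with the suggested Googlebot group appended at the end:

```python
# Check what a robots.txt permits, using only the Python standard library.
import urllib.robotparser

# Mirrors the robots.txt from the post, plus the suggested Googlebot group.
rules = """\
User-agent: NerdyBot
Disallow: /

User-agent: dotbot
Crawl-delay: 10

User-agent: *
Disallow: /ajax/
Disallow: /apps/

User-agent: Googlebot
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches its own group, so the whole site is open to it.
print(parser.can_fetch("Googlebot", "/ajax/page"))   # True
# NerdyBot is still blocked from everything.
print(parser.can_fetch("NerdyBot", "/"))             # False
# Any other crawler falls back to the * group.
print(parser.can_fetch("SomeOtherBot", "/apps/x"))   # False
print(parser.can_fetch("SomeOtherBot", "/shop"))     # True
```

You can also point `set_url()` at your live site’s /robots.txt and call `read()` to test the file that is actually being served.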

 

Also, keep in mind that it can take Google’s bots many days, even weeks, to find new sites.  So patience is about all you have on your side while you wait for this to happen.

 

Let me know if you have any other questions.

Chip A.
Square Expert & Innovator and member of the Square Champions group. (But NOT a Square employee, just a seller like you)

Was my post helpful? Take a moment to mark it as a solution. Marked solutions help other sellers find possible resolutions to similar problems. Also, if you find your solution elsewhere (say, through Support), it is helpful to come back to your post and tell us about it, then mark that as a Solution. Solutions are what this Community is all about!