
robots.txt not updated after change to Search engine visibility

When my site was being constructed, I had the Search engine visibility option under SEO settings set to "Hidden from search engine results". The site has since gone live, and I switched the option to "Visible to search engines". I would expect this setting to update the robots.txt file with the appropriate allow or disallow rules so that search engines are blocked from, or allowed to, crawl the site.

 

I have also enabled the "Online listing on Google" sales channel through Square. After the sync process, all products fail with the following error messages in the Square dashboard:

 

  • Update your website SEO settings to be visible to search engines.
  • Mobile page not crawlable due to robots.txt. Update your robots.txt file to allow user-agents "Googlebot" and "Googlebot-Image" to crawl your site.

You can easily confirm that the robots.txt file is not getting updated by browsing to https://www.yourdomain.com/robots.txt. The contents of my robots.txt file are as follows:

 

User-agent: *
Disallow: /s/search
Disallow: /s/cart/
Disallow: /store/checkout
Disallow: /store/status
Disallow: /product/*/*/leave-review

User-agent: Googlebot
Disallow:
User-agent: Googlebot-Image
Disallow:
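In case it helps anyone reproduce this without a browser, here is a quick sketch using Python's standard-library robots.txt parser, with the file contents above inlined so it runs offline ("SomeOtherBot" is just a made-up agent for contrast):

```python
from urllib import robotparser

# The robots.txt contents shown above, inlined so this runs offline.
ROBOTS_TXT = """\
User-agent: *
Disallow: /s/search
Disallow: /s/cart/
Disallow: /store/checkout
Disallow: /store/status
Disallow: /product/*/*/leave-review

User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot has its own group with an empty Disallow, so everything is
# crawlable for it, while the generic "*" rules still block cart/checkout
# pages for any bot without a named group.
for agent in ("Googlebot", "SomeOtherBot"):
    for path in ("/", "/s/cart/"):
        print(agent, path, rp.can_fetch(agent, path))
```

With these rules, the script reports that Googlebot may fetch everything, while the unnamed bot is blocked from /s/cart/ but not from /.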

It appears to me that there is a service/job/task that is failing, or not running at all, when this setting is changed, so the robots.txt file is not being properly updated.

 

Has anyone seen this same issue? If so, have you been successful working with support to get the robots.txt file updated?

Message 1 of 4
Square Community Moderator

Hello @ZaraJo 👋

Happy to take a look! 

Is this for the site https://www.zarajobeautysupply.com/?

 

How long ago did you enable search engine visibility? 

 

I'll keep an eye out for your reply. 

Thank you. 

Frances
Community Moderator, Square
Sign in and click Mark as Best Answer if my reply answers your question.
Message 2 of 4

Hello @frances_a,

 

The site you have listed is the correct one, and I enabled search engine visibility around the 20th. I have since noticed a number of Square Online sites with the same disallow rules, so this may be a larger Square issue, though I'm not certain.

 

Any help you can provide is greatly appreciated.

Message 3 of 4
Square Community Moderator

Hi! 

@ZaraJo

 

Thank you for confirming. 

I went to take a look and your site is appearing as a result on Google!

 

The pages listed after "Disallow:" in the top portion of the file are the ones that cannot be crawled. Pages like cart and checkout should not be indexed. The * in the User-agent line means those rules apply to all crawlers.

 

According to this robots.txt, you are allowing Googlebot and Googlebot-Image to crawl your site.

"Disallow:" only affects a User-agent if followed by /. 

So, in your case, for Google crawlers, Disallow is "turned off".
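To illustrate the difference, here is a small sketch using Python's standard-library robots.txt parser, comparing an empty "Disallow:" against "Disallow: /" (the helper name is just for this example):

```python
from urllib import robotparser

def googlebot_allowed(robots_txt: str, path: str = "/") -> bool:
    """Parse a robots.txt string and ask whether Googlebot may crawl path."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", path)

# Empty Disallow: nothing is blocked, so Googlebot may crawl everything.
print(googlebot_allowed("User-agent: Googlebot\nDisallow:"))    # True

# "Disallow: /" blocks the whole site for that user-agent.
print(googlebot_allowed("User-agent: Googlebot\nDisallow: /"))  # False
```

So the empty "Disallow:" lines in the file above are effectively explicit permission for those crawlers.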

 

Let me know if you have further questions. 

 

Thank you

Frances
Community Moderator, Square
Message 4 of 4