I think the X-Robots-Tag header added to the sitemap.xml response is preventing Google from crawling pages on a site whose sitemap is generated with the sitemap framework.
This is because the bot detects X-Robots-Tag: noindex when it checks the sitemap. I think this is a serious issue for SEO. Please advise whether I need to change this, or whether I have possibly gone against a Django best practice.
Are all of your pages rendering with the <meta name="robots" content="noindex" /> element? The noindex value will exclude a page, but the Django sitemap framework should only be applying it to the sitemap.xml and sitemap index pages.
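For context, that header does not come from your own code: the sitemap view in django.contrib.sitemaps.views is wrapped in a small decorator that adds X-Robots-Tag to the sitemap responses it serves. Roughly paraphrased (this is not the verbatim source, and the exact header value depends on your Django version), it behaves like this:

```python
# Rough paraphrase of Django's x_robots_tag decorator in
# django.contrib.sitemaps.views -- not the actual source, and the exact
# header value varies between Django versions.
from functools import wraps


def x_robots_tag(view_func):
    @wraps(view_func)
    def wrapper(request, *args, **kwargs):
        response = view_func(request, *args, **kwargs)
        # Mark the sitemap response itself as not indexable; the URLs
        # listed inside the sitemap are unaffected by this header.
        response["X-Robots-Tag"] = "noindex"
        return response

    return wrapper
```

Because the header only describes the sitemap.xml response itself, it should not, in principle, keep the pages listed in the sitemap out of the index.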
It might be helpful for you to include what your sitemap configuration looks like as well.
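For reference, a minimal setup usually looks something like the sketch below (the app, model, and route are placeholders, not taken from your project):

```python
# urls.py -- placeholder sketch; the app, model, and names are hypothetical.
# Requires 'django.contrib.sitemaps' (and 'django.contrib.sites') in INSTALLED_APPS.
from django.contrib.sitemaps import Sitemap
from django.contrib.sitemaps.views import sitemap
from django.urls import path

from myapp.models import Article  # hypothetical model


class ArticleSitemap(Sitemap):
    changefreq = "weekly"
    priority = 0.5

    def items(self):
        return Article.objects.all()


urlpatterns = [
    path(
        "sitemap.xml",
        sitemap,
        {"sitemaps": {"articles": ArticleSitemap}},
        name="django.contrib.sitemaps.views.sitemap",
    ),
]
```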
Sorry for the late reply. It isn't on all pages, just the sitemap.xml. The problem is that this has an unexpected side effect: when Googlebot crawls my site and finds the noindex tag, it gives up indexing the entire website, simply because it checks the sitemap before any other page on the site. I'll share my configuration soon if it helps. When I commented it out, my indexing request was accepted just fine; otherwise Google responds that it found noindex in the X-Robots-Tag, which is really weird because I don't even set the X-Robots-Tag header anywhere in my configuration.
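For anyone hitting the same thing, one quick way to see which responses actually carry the header locally is Django's test client. A rough sketch, assuming the sitemap is served at /sitemap.xml and run from python manage.py shell:

```python
# Rough check of which local responses carry X-Robots-Tag.
# Paths here are examples; adjust them to your own URLs.
from django.test import Client

client = Client()

for path in ["/sitemap.xml", "/"]:
    response = client.get(path)
    print(path, response.status_code, response.get("X-Robots-Tag"))
```

If only /sitemap.xml shows the header, it is Django's sitemap view adding it rather than your server configuration.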