suresh
Recently I had a problem with my robots.txt file: all of my website URLs have been moved to blocked URLs. Please help me as soon as possible. Thanks!

Best Answer
Go to the Blocked URLs report in Google Search Console and click the affected URL. A popup will appear with three options: Fetch as Google, owner validation, and a test run of robots.txt.
Run the robots.txt test; it will list any errors in your robots.txt file. Then fix those errors by giving Googlebot permission to crawl.
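A common culprit when every URL shows as blocked is a blanket Disallow rule. As a rough sketch (assuming your file currently contains the first form), the before-and-after might look like this:

    # Blocks ALL crawlers from the ENTIRE site -- this would explain
    # every URL showing as blocked:
    User-agent: *
    Disallow: /

    # Corrected version: allow crawling, blocking only paths you
    # actually want kept out (the /admin/ path here is just an example):
    User-agent: *
    Disallow: /admin/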
You can also try using 'Fetch as Google' on the sitemap URL itself, so you can see the headers and content that the server sends to Google.
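If you'd rather check the same thing yourself, here is a minimal Python sketch (the sitemap URL is a placeholder, substitute your own) that requests the sitemap with Googlebot's user-agent string and prints the status, headers, and the start of the body the server returns:

    import urllib.request

    # Placeholder URL -- replace with your own sitemap location.
    url = "https://www.example.com/sitemap.xml"

    # Send the request with Googlebot's user-agent string so the server
    # responds roughly as it would to Google's crawler.
    req = urllib.request.Request(
        url,
        headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.reason)   # HTTP status line
        print(resp.headers)               # response headers
        # First 500 bytes of the body, enough to spot an error page.
        print(resp.read(500).decode("utf-8", "replace"))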
If you don't find any issue, it may just have been a transient problem. Try re-submitting the sitemap in the Sitemaps section.
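Before re-submitting, you can also sanity-check that Googlebot is allowed again using Python's standard urllib.robotparser (the domain and paths below are placeholders for your own URLs):

    from urllib import robotparser

    # Placeholder domain -- substitute your own site.
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Check a few representative URLs the way Googlebot would.
    for path in ["https://www.example.com/", "https://www.example.com/blog/"]:
        allowed = rp.can_fetch("Googlebot", path)
        print(path, "->", "allowed" if allowed else "BLOCKED")

If every URL prints "allowed", the robots.txt fix is live and the re-submitted sitemap should process normally.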