SEO
-
Failed: Robots.txt Unreachable Error in Google Search Console – Solved
in Tips
Here's how I solved the "Failed: Robots.txt Unreachable" and "Robots.txt Not Fetched" errors in Google Search Console. The issue is serious, since it prevents Googlebot from crawling and indexing your website and its pages. The solution I found is to unblock the Googlebot IPs in your web hosting server's firewall. Yes, although Google bot…
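One way to check whether a blocked crawler actually belongs to Google is to test its IP against Google's published crawl ranges before whitelisting it in the firewall. A minimal sketch, assuming a single well-known Googlebot CIDR (the authoritative list is the JSON file Google publishes for Googlebot; the hardcoded range here is illustrative only):

```python
import ipaddress

# Illustrative assumption: one well-known Googlebot crawl range.
# The full, current list is published by Google as googlebot.json.
GOOGLEBOT_RANGES = [ipaddress.ip_network("66.249.64.0/19")]

def is_googlebot_ip(ip: str) -> bool:
    """Return True if the address falls inside a known Googlebot range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in GOOGLEBOT_RANGES)
```

An IP that passes this check is a candidate for a firewall allow rule; one that fails may be a crawler impersonating Googlebot.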
-
Force HTTPS to Avoid Duplicate Page Indexing by Google
in Tips
Enabling access to both the HTTPS and HTTP versions of your website may lead to duplicate content. For instance, Google Search may index both the HTTPS and HTTP versions of your site's pages, which may create unnecessary duplicates in the search results. Having duplicate pages indexed isn't good from an SEO perspective. Furthermore,…
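A common way to force HTTPS so that only one version of each page gets indexed is a permanent server-side redirect. A minimal sketch, assuming an Apache server with mod_rewrite enabled (for other servers, such as nginx, the equivalent is a `return 301` rule):

```apache
# Redirect all HTTP requests to their HTTPS equivalent (301 = permanent)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status tells search engines the HTTP URLs have moved permanently, so the HTTPS versions consolidate the ranking signals instead of competing as duplicates.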