14 reasons why your site isn’t showing up in Google


Why isn’t my website showing up on Google? When will it be indexed? Why is my website missing from search results? These are all very popular questions. No surprise there: more visibility in Google search results means more traffic to your site, more inbound leads, higher sales, and so on.

There are many possible reasons why your website isn’t showing up in Google search results. Perhaps your site is not SEO friendly? Read on to find out the most common causes.

Table of contents

  1. Time
  2. Links
  3. Robots.txt
  4. HTTP header tags
  5. Canonical
  6. Robots meta tag
  7. No unique content
  8. Redirect chain
  9. Blocked resources
  10. Server errors
  11. Manual and algorithmic Google penalties
  12. Password
  13. Competition
  14. Long page load times

Infographic: five reasons why your site isn't showing in Google, by LinkedIn Ad Creator/Builder Alan O'Rourke

1. Time

First of all: time. The most common reason a page isn’t appearing in Google is that not enough time has passed since it was published. Google needs time to find a page and index it; only then can it show the page in the search results.

How do you find out whether a page is indexed? Type site: followed by the page URL in Google.


Or check it directly in Google Search Console.


Type in your page URL in the search bar at the top of Search Console’s dashboard and check the index status.

Less competitive content may appear at the top of search results shortly after being indexed. For competitive topics, similar content is most probably already available, which means you will need time and backlinks to your page for it to become visible in Google.

2. Links

Links are invariably one of the most important factors in page quality evaluation. Search engines are able to find a page if it has incoming links. The higher the domain score (Pulno Domain Power), the faster Google finds the page, and the greater the chances that articles posted on this website will rank higher. Obviously, this is a broad generalization; however, sites without backlinks are less likely to appear in top positions.


In order to be indexed, a page needs:

  • internal links - a proper site structure and incoming internal links
  • external links - the more high-quality backlinks to the website, the greater the chances a page will rank higher. For competitive keywords, it is better when backlinks point directly to the given page, not just the homepage.

Links to individual pages may be placed in the sitemap file, provided you regularly update the file. However, it is advisable to have the links directly on the website (homepage and other pages).

3. Robots.txt

The robots.txt file may be used to block search engines from crawling certain pages on your website. The file is located at the root of the website host, e.g.:

https://www.example.com/robots.txt
If the robots.txt file looks like this

User-agent: *
Disallow: /example/

and the page address is:

https://www.example.com/example/page.html

then this page won’t be crawled by the search engines. Of course, there are exceptions to the rule (e.g. when there are external links pointing directly to this page, it may still appear in the index without its content), but in general, pages blocked in the robots.txt file are not indexed by search engines.
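To illustrate, the rules above can be evaluated locally with Python's standard urllib.robotparser module; the URLs are the hypothetical example.com addresses from the example, not real pages:

```python
from urllib import robotparser

# Parse the robots.txt rules from the example above (no network access needed).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /example/",
])

# A URL under /example/ is blocked for all crawlers...
print(rp.can_fetch("*", "https://www.example.com/example/page.html"))  # False
# ...while the rest of the site remains crawlable.
print(rp.can_fetch("*", "https://www.example.com/index.html"))         # True
```

This is the same logic well-behaved bots apply before fetching a page, so it is a quick way to verify you haven't blocked something by accident.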

4. HTTP header tags

HTTP headers are part of the response the server sends along with a page, and search engine bots read them as well. You can specify the pages you don’t want indexed by setting the X-Robots-Tag value to noindex in the HTTP header.

X-Robots-Tag: noindex

This way you let the search engines know that the page should not be indexed. It is a good way to keep non-HTML resources, such as PDF files, out of the index.

When communicating with the search engine, the server sends over an HTTP response status code. The 200 code means the request has been processed successfully on the server. If your page returns a code starting with:

  • 4 (e.g. 404) - the page is not available
  • 3 (e.g. 301) - the page is redirected
  • 5 (e.g. 500) - the server has encountered an error

If the status code is other than 200, the page will most likely not be indexed properly.
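The checks above can be sketched as a small helper function; the function name and its return labels are purely illustrative, not a real crawler's rules:

```python
def indexability_hint(status_code: int, headers: dict) -> str:
    """Rough, illustrative verdict on whether a response looks indexable,
    based on the HTTP status code and the X-Robots-Tag header."""
    x_robots = headers.get("X-Robots-Tag", "").lower()
    if "noindex" in x_robots:
        return "blocked by X-Robots-Tag"
    if status_code == 200:
        return "indexable"
    if 300 <= status_code < 400:
        return "redirected"
    if 400 <= status_code < 500:
        return "not available"
    if 500 <= status_code < 600:
        return "server error"
    return "other"

print(indexability_hint(200, {"X-Robots-Tag": "noindex"}))  # blocked by X-Robots-Tag
print(indexability_hint(301, {}))                           # redirected
print(indexability_hint(500, {}))                           # server error
```

Note that the X-Robots-Tag check comes first: even a 200 response is kept out of the index if the header says noindex.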

5. Canonical

The canonical parameter is used to point search engines to the preferred page that should be indexed.

Suppose two URLs serve the same content, for example (the parameterized address is hypothetical):

https://www.example.com/red-boots.html?color=red

If you add a rel=canonical element to that page that looks like this:

<link rel="canonical" href="https://www.example.com/red-boots.html" />

the search engines will stop indexing the parameterized page and will start indexing the canonical page:

https://www.example.com/red-boots.html
You can point rel=canonical to the very same page, however, if you specify a different page in rel=canonical, the search engines will index the canonical page. Remember that the rel=canonical parameter is only a suggestion for the search engines and is not always recognized.
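As a sketch of how a crawler might pick this element up, Python's standard html.parser can extract the canonical URL from a page's markup; the class name is ours and the URL is the example one used above:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag found."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="https://www.example.com/red-boots.html" /></head>')
print(finder.canonical)  # https://www.example.com/red-boots.html
```

Running a check like this against your own pages is a quick way to confirm that every variant of a URL declares the canonical you intended.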

6. Robots meta tag

To block a page from being indexed and stop it from appearing in Google, you can also add the robots meta tag to the page. Adding the element below to the page’s <head> section

<meta name="robots" content="noindex,follow" />

will certainly block it from showing up in Google. It is worth noting that long-term usage of noindex,follow will eventually lead to the links on the page being treated as nofollow.
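A minimal sketch of how such a directive can be read out of the markup, again using Python's standard html.parser (the class name is ours, purely illustrative):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Extract the comma-separated directives from a <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives = [d.strip().lower()
                               for d in attrs.get("content", "").split(",")]

parser = RobotsMetaParser()
parser.feed('<meta name="robots" content="noindex,follow" />')
print(parser.directives)          # ['noindex', 'follow']
print("noindex" in parser.directives)  # True
```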

7. No unique content (duplicate content)

Google relies heavily on your website’s text content. If:

  • the page content is of low quality, or
  • the content is copied from other websites,

the chances are that the page won’t be indexed by Google or rank high. It is true that there are many counterexamples and instances of duplicate content showing up in Google. These results, however, don’t hold top positions for long and are likely to see significant drops with the next algorithm update. Making sure your content is high quality increases your chances of better rankings.

Pages with little or no unique content are crawled by search engine bots less frequently and therefore reach much lower positions in search results.

8. Redirect chain

Another reason your site is missing from search results may be multiple redirects. If a redirect is set up properly, search engines will index the destination URL. However, if there are many hops in a redirect chain (more than 5) or if there’s a loop, the page might not be indexed. Such a page won’t show up in Google at all.
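The hop limit and loop detection can be sketched with a toy redirect map; the URLs and the resolve helper are hypothetical, not how Googlebot actually works:

```python
# Hypothetical redirect map: each URL points to the URL it redirects to.
REDIRECTS = {
    "http://example.com/a": "http://example.com/b",
    "http://example.com/b": "http://example.com/c",
}

def resolve(url: str, redirects: dict, max_hops: int = 5):
    """Follow redirects; return (final_url, hop_count),
    or (None, reason) when the chain loops or is too long."""
    seen = set()
    hops = 0
    while url in redirects:
        if url in seen:
            return None, "redirect loop"
        seen.add(url)
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            return None, "too many hops"
    return url, hops

print(resolve("http://example.com/a", REDIRECTS))  # ('http://example.com/c', 2)
```

Two hops is fine; a chain of six or a page that redirects back to itself would come back flagged, which is roughly the situation where search engines give up.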

9. Blocked resources

Apart from text, search engines crawl other resources as well, such as:

  • CSS files
  • JavaScript files
  • images, video, audio, etc.

If search engine bots are blocked from crawling these files, they might have a problem fully understanding the site content. As with the cases mentioned earlier, it is better to let the bots access these resources too. Don’t block them in the robots.txt file or with HTTP headers.

10. Server errors

Frequent server errors have a negative effect on the crawl rate as well. If search engine bots often receive a 5xx response status, the chances of that page being indexed decrease.

11. Manual and algorithmic Google penalties

Google penalties may be one more reason your site isn’t showing up in the search results. The status of manual penalties set by Google can be verified in Google Search Console.


Recognizing an algorithmic penalty is a subject for a whole separate post. You can always check whether your domain name appears in the top positions in Google. Another helpful tool is the site: search operator. If a search for site:yourdomainname.com brings fewer results than the number of pages you have actually published, you may suspect an algorithmic penalty. An unnatural backlink profile is a very common reason for this type of penalty.

12. Password


Search engines can’t access web pages secured with a password. Setting a password is a good way to block access to a page on a production or test server. However, if you want your pages indexed, don’t block them with a password.

13. Competition

An estimated 5.7 million articles are published every day. The chances that an article on the same topic as yours already exists are incredibly high. It is good to know your competition: search for the keywords related to the content you aim to create and see why the articles in top positions actually rank so high. Most commonly, these are long, attractive posts, full of relevant images and with a clear structure. If you want to rank high, you need to create quality content and gain excellent backlinks.


The situation is similar to the one in sports - only the most hardworking can achieve the best results.

14. Long page load times

Slow-loading pages have much lower conversion rates and are less likely to be indexed properly. A slow server, especially one generating a lot of errors (see point 10), may make search engines ignore your pages.


There may be many reasons why Google hasn’t indexed your website. Checking all of them manually is not only difficult (e.g. HTTP headers), but also time-consuming. If some of your pages are not indexed, I suggest using Pulno to check the issues instantly and automatically.

Jacek Wieczorek is the co-founder of Pulno. Since 2006, he has been optimizing and managing websites that generate traffic counted in hundreds of thousands of daily visits. 
