This article was originally published in December 2018 and was last updated on August 10, 2022.

Ranking on search engines isn’t an exact science. The algorithm that determines website rankings relies on hundreds of factors. Google Search uses bots to seek out information on websites, a process professionals call “crawling.” Crawling is how search engines find new and updated pages to add to their index. Understanding the basics of crawling can therefore help you determine whether Google even knows your site exists.

What Are Google Spiders?

It is widely understood that Google and other search engines send little bots, also called spiders, crawling across the whole Internet. Google spiders crawl websites by visiting each and every site on the web to assess its value and gather the information that feeds the complex algorithms that ultimately determine the site’s rank on a search engine results page (SERP). These little bots have a huge job!

These bots are not actual robots. They are small pieces of software that go from site to site, reading code. Google has dubbed them bots or spiders. These Google spiders crawl all over the web, including your site, combing through the code and relaying what they find back to the search engine.

How Often Does Google Crawl a Site?

Naturally, some factors cause certain sites to be crawled more frequently than others. Google is fairly open about its spiders and how often and why they crawl sites. It maintains a whole page, called the Webmaster Guidelines, dedicated to helping people learn how the spiders crawl sites.

How long Google takes to crawl a website depends on the many factors at play in how, when, and why Google crawls your site. There is no hard-and-fast rule that applies to every site, and the time frame can range from a couple of days to a couple of weeks.

A good rule of thumb: the frequency of Google spider visits depends on how often you update your site. For sites that are constantly adding and updating content, the Google spiders will crawl more often—sometimes multiple times a minute! For a small site that is rarely updated, however, the bots may only crawl every few days.

Knowing how often Google crawls your site helps you see whether your efforts are working and which areas need further optimization. To see how long it takes Google to crawl your site, open the Crawl Stats report in Search Console. Follow the simple steps below:

Steps to Determine How Long it Takes Google to Crawl Your Site

  1. Log into Search Console.
  2. Click on Settings located at the bottom of the left-hand navigation.
  3. There will be a ‘Crawl Stats’ section. Click the ‘Open Report’ link to view the past 90 days’ worth of data.

If you have an older website that is experiencing issues with being crawled and indexed, the design of the site may be a problem. Sometimes, sites are temporarily unavailable when Google attempts to crawl, so checking the crawl stats and looking at errors can help you make changes to get your site crawled. 

Google also has two different spiders: a desktop crawler that simulates a user on a desktop computer, and a mobile crawler that simulates a user on a mobile device. The Settings section of Search Console shows which crawler was used to crawl and index your website.
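If you dig through your own server logs, you can tell the two crawlers apart by their user-agent header. As a rough sketch, the substrings matched below (“Googlebot”, “Android”, “Mobile”) follow Google’s published crawler tokens, but treat the exact strings as illustrative and check Google’s current crawler documentation before relying on them:

```python
# Classify which Google crawler fetched a page, based on the request's
# user-agent header. The matching substrings are illustrative, based on
# Google's published crawler tokens.

def classify_google_crawler(user_agent: str) -> str:
    """Return 'mobile', 'desktop', or 'other' for a user-agent string."""
    if "Googlebot" not in user_agent:
        return "other"
    if "Android" in user_agent or "Mobile" in user_agent:
        return "mobile"
    return "desktop"

# Abbreviated example user-agent strings as they might appear in a log:
desktop_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
mobile_ua = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Mobile "
             "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

print(classify_google_crawler(desktop_ua))  # desktop
print(classify_google_crawler(mobile_ua))   # mobile
```

This only inspects the header a client claims, so it can be spoofed; it is a quick way to eyeball logs, not to verify that a visitor really is Googlebot.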

Do I Want Google to Crawl and Index My Website?

YES! If you want your website to show up in Google search engine results pages (SERPs), you need Google spiders crawling your site and indexing pages, starting with the most valuable ones (this is why sitemaps are important for SEO). You want Google to start crawling your site immediately and as frequently as possible. Without Google’s bots regularly crawling your site, you have no chance of ranking well.

The more Google’s spiders crawl your website, the more they will trust your site, register the information and updates you have made, and relay that information back to Google’s SERPs. And the better your on-site SEO tactics and the more often the Google bots crawl, the higher your site will continue to rank.

Note: There may be pages you do not want to be crawled or indexed. In that case, learn more about what a robots.txt file is and why it’s important to have one!

How to Get Google to Crawl Your Site

Google indexes billions of web pages, so it would be impossible to crawl every page every day. If you’re wondering “how to crawl my website in Google,” the following information should help. As with any business, Google has to use its resources wisely. If your website has errors that block access, Google won’t keep sending bots to your site. More importantly, Google won’t send users to your site, which will decrease traffic and tank your online marketing efforts. Although Google doesn’t guarantee indexing, you can get Google to crawl your website more frequently by implementing these strategies.

1. Check server connectivity errors

First and foremost, use Search Console to check for errors and usability issues. Then, fix those issues to let the bots crawl your site.

2. Review the robots.txt file

Another good spot to review before getting too deep in the weeds is the robots.txt file. While it can be necessary to exclude certain pages from being crawled and indexed, valuable pages sometimes end up listed by mistake (it happens)! If that’s the case, clean up your robots.txt file by removing the rules that block pages you do want Google spiders to crawl.
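You can sanity-check your rules before publishing them with Python’s standard-library robots.txt parser. This is a minimal sketch; the file contents and paths below are hypothetical, so substitute your own site’s rules and URLs:

```python
# Check which paths a robots.txt actually blocks, using Python's
# standard-library parser. The rules and paths here are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /services/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# /admin/ is blocked on purpose; /services/ may be blocked by mistake.
for path in ("/admin/", "/services/", "/blog/"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

Running a check like this makes an accidental `Disallow` on a money page jump out before the spiders ever see it.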

3. Obtain high authority backlinks

A backlink, also called an inbound link, is a link from another website that signals to search engines that your site has some authority. When others share your content, search engines take notice, which can get Google to crawl your website. The more valuable the link, the better. Diversify your strategy and pursue low-hanging fruit from relevant local sources. While it’s nice to get a backlink from a well-known website like Forbes, smaller sites can be just as valuable and sometimes even more relevant.

4. Make frequent site updates

E-A-T is growing in importance in the world of SEO, and one of the best ways to convey it on your website is by creating new content or repurposing existing content that no longer performs well. Fresh content also encourages Google spiders to visit your site. Write about your employees, your industry, and your products, and include other content types: video, pictures, graphs, and interactive content.

5. Upload a sitemap

A sitemap outlines the pages on your site and helps search engines read your website more effectively. Bots read the sitemap, which tells Google how often you update your content. Google itself says that a sitemap doesn’t guarantee indexing, but it can help the bots learn about your site.

6. Manually request indexing

In Search Console, you can either submit your sitemap to Google, requesting that bots crawl and index your website, or manually request indexing for individual pages. While this isn’t a guarantee that your page will be crawled and indexed, it does help alert Google to the new changes you have made to your website.

7. Internally link between pages on your website

Internal linking is another strategy that can help search engines better understand how pages are connected and get Google to crawl your site. If you have pages that rank well, you can internally link to other pages that may not be performing as well (assuming that they are relevant enough to be linked to) to transfer some of the value.

8. Check the technical SEO aspects of your content

Make sure each page has a strong title that showcases its content. Write great meta descriptions that identify what’s on the page. Keep URLs concise, ideally under 50 characters. And make sure each page loads quickly.
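Checks like these are easy to automate. In the sketch below, the 50-character URL limit comes from this article, while the title and meta-description limits are common industry guidelines, not Google rules, so adjust them to taste:

```python
# A quick lint for on-page SEO basics. The URL limit comes from the
# article; the title and description limits are assumed guidelines.
TITLE_MAX = 60         # assumed guideline: ~60 chars usually displays fully
DESCRIPTION_MAX = 160  # assumed guideline for meta descriptions
URL_MAX = 50           # "ideally under 50 characters" per the article

def lint_page(url: str, title: str, description: str) -> list[str]:
    """Return a list of warnings for a page's basic technical SEO."""
    warnings = []
    if len(url) > URL_MAX:
        warnings.append(f"URL is {len(url)} chars (aim for <= {URL_MAX})")
    if not title:
        warnings.append("missing title")
    elif len(title) > TITLE_MAX:
        warnings.append(f"title is {len(title)} chars (aim for <= {TITLE_MAX})")
    if not description:
        warnings.append("missing meta description")
    elif len(description) > DESCRIPTION_MAX:
        warnings.append(f"meta description is {len(description)} chars")
    return warnings

result = lint_page(
    "https://www.example.com/blog/crawling",
    "How Often Does Google Crawl a Site?",
    "",  # no meta description yet
)
print(result)  # ['missing meta description']
```

Run something like this over every URL in your sitemap and you have a lightweight technical SEO audit.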

9. Share content

Sharing is a great way to encourage bots to visit your site. First, share your content on social media. Second, distribute content within your industry communities. Third, find influential sites where you can contribute guest blogs. Finally, invite credible, E-A-T-worthy guest bloggers to write for your website, and get influencers in your industry to link to your pages.

How to Get Google to Crawl My Site Immediately

If you are wondering how to get Google to crawl your site immediately, you’re in the right place. Remember, it takes time for Google’s spiders to crawl the billions of pages on the web. However, you can easily request a crawl manually through Google Search Console’s URL inspection tool. The tool submits a request to Google about your URL, placing it in a priority queue. Although there is no way to make Google index pages instantly, the manual request method is a great option. Follow the steps below:

  1. Sign into Google Search Console.
  2. Copy and paste the URL you want indexed into the “inspection” search bar.
  3. Click “Test Live URL” to make sure the page loads correctly.
  4. Click “Request Indexing” to place the URL into a priority queue.

Keep in mind that requesting indexing for a URL multiple times won’t change its place in the priority queue. Once the request is made, Google’s spiders will crawl and index the submitted URL as soon as possible.

Get Google to Crawl Your Website More Often!

SEO isn’t an exact science. A closely guarded algorithm determines website rankings. Even so, there are tactics you can use to get Google to crawl your site more often, rank higher in search, and drive traffic to your site. At Boostability, we have decades of experience and millions of data points that we use to help small businesses boost their organic presence and revenue. Check out our white label SEO services to learn more about our team and our technology, and see how we have helped hundreds of small businesses succeed online!



Kristine is the Director of Marketing at Boostability. She brings a decade’s worth of communications strategy work to the company. Kristine has a master’s degree in Leadership and Communications from Gonzaga University and an undergraduate degree in Broadcast Journalism from BYU. She’s worked in television news, public relations, communications strategy, and marketing for over 10 years. In addition to being a part of the marketing team, Kristine enjoys traveling, sports, and all things nerdy.


  • Josh, April 9, 2015 @ 7:43 am

    Awesome article! The two things that I want to work on are adding fresh content more often and earning more inbound dofollow links. One question I have had is if Google regards comments as fresh content. If so, in theory, the more interaction an article gets by people leaving comments, the more often Google will crawl the site. Anyone know for sure if comments are perceived by Google as fresh content worthy of more frequent crawling?

  • Jamison Michael Furr, April 9, 2015 @ 11:34 am

    @disqus_BkikKRpjfe:disqus I don’t actually have an answer, but that is a really good question that I wouldn’t have even thought of – I want to know too now!

  • Caz*, April 11, 2015 @ 8:59 pm

    That is a great question. It depends how the site is built out and how the commenting system works. For example, if your comments are built out as a plugin, that means they’re also most likely built out as an iFrame within the page and not just a part of the page. Therefore, your iframe is technically getting credit for new, engaging content every time a comment is left. However, Google gets smarter and smarter in how they crawl a page. At this point, I don’t know that they are doing a perfect job of linking iFrame content in as credit toward the page the iFrame is in because they’d have to program it extensively in order to give credit only if the iFrame site is owned by the same site where the iFrame exists. Otherwise, anyone could add an iFrame of these comments, for example, to their site and receive credit. However, they do easily track and understand visits, repeating visits, unique visits, bounce rate, and all those tasty Google Analytics numbers which translate to quite the SEO boosting treat! So much of how relevancy is defined comes down to how the site is surfed. Google has shifted to trusting the user’s desire and intent rather than trying to derive relevancy from the code on the page. Obviously, if people are coming back to the site often, that is more important than the number of unique words presented on a page and the percentage of words that may be considered “keywords,” for example. Does that make sense @disqus_BkikKRpjfe:disqus ? And answer your curiosity too @jamisonmichaelfurr:disqus ?

  • Longboards USA, April 13, 2015 @ 10:54 am

    hmmm about: “Google visits them an average of 17 times a day.” are you sure about that?

    Thought that it more like an average of 17 pages per day (over last 90day) – metric is “Pages crawled per day”.
    Looking at the crawl it shows that google comes in almost every day (for a short time) grabs a number of pages and comes back the next day to grab some more.
    Does not mean all the other metrics are updated daily as they only seem to change 1 or 2 times a week. (eg. index status, links to your website..)


  • SS, April 13, 2015 @ 11:19 am

    As for fresh content, does changing the home page with new copy or layout count? Does Google Blogger count as fresh content, if that’s not part of your website pages? Is changing copy on inner pages a good idea or waste of time. Seems that fresh content would only totally new pages or am I wrong?

  • Jessi Losee, April 13, 2015 @ 12:51 pm

    Great article Jake! This is quite a common question among clients and you hit the nail on the head with your answer.

  • Caz*, April 13, 2015 @ 8:11 pm

    These are such great questions!

    Updating content across your site does count as new, updated, relevant content. However, you also don’t want to be updating all the time or changing what you have too often. Regular updates are expected – something like a one time per year overhaul of your website to make sure all the of the content remains relevant and no links are broken. Updating new content to the site can be done by adding new, relevant questions, and updating pages that are expected to be updated more often like testimonials, case studies, FAQs pages, and blogs.

    As for a blogger blog, although this is owned by Google unless you are hosting the blog directly on the same server that your website exists, it doesn’t really count as updates to your site so much as Google seeing it as another site that is linking back to your site. Does that make sense? To get the best results from blog updates, you’ll want to use a service that will take each new blog post and publish it to a given folder within the same root folder where your site exists with your web host.

    Does that answer all your questions?

  • Caz*, April 13, 2015 @ 8:11 pm

    What advice do you generally give when being asked this question?

  • Jamison Michael Furr, April 17, 2015 @ 3:09 pm

    Yes! thanks so much @cazbevan:disqus 🙂

  • Maria Williams, April 23, 2015 @ 10:46 am

    Good job Jake! this is a helpful article and like you mentioned that we should have fresh content if we want to Google to crawl the website more frequently. You mention that adding a blog will be beneficial to always have fresh content on the website but there are some cases that client’s wont do it, so my guess my questions is Do you recommend to change the content on the pages on the website? if so, How often would you change the content?

  • Yucel Yalim, August 30, 2015 @ 5:02 pm

    I thought my site was being crawled pretty regular… and perhaps it is… tho things seem to take a while to trickle thru google… that is… things don’t always happen right away… even after a definite google event…

  • Jason Newsted, October 13, 2015 @ 4:46 pm

    What about using the Fetch As Google feature? Isn’t that the simplest way to make Google crawl your webpages?

  • Caz*, October 21, 2015 @ 11:32 pm

    Fetch as Google does allow your website to be shown as Google sees it. You can ask Google to crawl your website in Google Search Console. However, these features do not guarantee listing your entire website as potential search items. The only way to ensure your site is mapped correctly in Google’s eyes is to give them a specific XML file to crawl. This tells Google more than it can see just from following links on your site.

  • Caz*, October 21, 2015 @ 11:33 pm

    You can see how many pages Google, Bing, and Yahoo are aware of with this free app

  • decker strength., December 27, 2015 @ 3:01 pm

    Good to know. Thanks for the article!

  • Caz*, December 28, 2015 @ 8:55 pm

    Glad this article could help you!

  • seonewtool, January 20, 2016 @ 11:13 pm

    Google index checker analyses on how easily and quickly google is able to crawl or index on a website. This tool is also useful in checking the google index stats of multiple websites at a time.

  • Andrew Williams, February 17, 2016 @ 2:04 pm

    Jake is on point with this article. So much good information in Google Search Console to see your crawl errors and crawl history of when Google is crawling your site.

  • Find DankWeed, February 28, 2016 @ 11:32 am

    very informative article!!

Comments are closed.