Last Update: September 16, 2019

Ranking on search engines isn’t an exact science. The algorithm that determines website ranking relies on hundreds of factors. Google Search uses bots to seek out information on websites, a process professionals call “crawling.” Crawling is how search engines find new and updated pages to add to their index. Understanding the basics of crawling can therefore help you figure out whether Google even knows your site exists.


Know the Crawl Stats for Your Website

Google offers a free service, called Search Console, to help website owners understand how their sites perform. Search Console provides analytical data that can deepen your understanding of SEO and help you increase your visibility on the web. The tool shows when Googlebot last visited your site. Knowing how often Google crawls your site will tell you whether your efforts are working.

According to Google, “crawling and indexing are processes which can take some time and which rely on many factors.” It’s estimated to take anywhere from a few days to four weeks before Googlebot indexes a new site. If you have an older website that isn’t being crawled, the design of the site may be the problem. Sometimes sites are temporarily unavailable when Google attempts to crawl them, so checking the crawl stats and reviewing errors can help you make the changes needed to get your site crawled. Google also uses two different crawlers: a desktop crawler that simulates a user on a desktop computer, and a mobile crawler that simulates a user on a mobile device.
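For a raw view of crawl activity alongside Search Console, your server’s access log records every request Googlebot makes. Below is a minimal sketch in Python; the log path and the combined log format are assumptions, so adjust them for your server, and note that matching on the user-agent string alone is spoofable (verifying genuine Googlebot traffic requires a reverse-DNS check).

```python
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust for your server

# Combined log format: IP - - [date] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

googlebot_hits = Counter()  # day -> number of Googlebot requests
error_paths = Counter()     # paths that returned 4xx/5xx errors to Googlebot

with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:  # user-agent match only; spoofable
            continue
        match = LINE_RE.match(line)
        if not match:
            continue
        ip, timestamp, method, path, status = match.groups()
        day = timestamp.split(":", 1)[0]  # e.g. "16/Sep/2019"
        googlebot_hits[day] += 1
        if status.startswith(("4", "5")):
            error_paths[path] += 1

for day in sorted(googlebot_hits, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(f"{day}: {googlebot_hits[day]} Googlebot requests")

print("\nPages returning errors to Googlebot:")
for path, count in error_paths.most_common(10):
    print(f"  {count}x {path}")
```

If Googlebot stops showing up in the log for days at a time, or the error list keeps growing, that is exactly the kind of problem the crawl stats report will surface.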


How Can You Boost Your Crawl Rate?

Google indexes billions of webpages. As a result, it would be impossible to crawl every page every day. As with any business, Google has to use its resources wisely. If your website has errors that prevent access, Google won’t keep sending bots to your site. More importantly, Google won’t send users to your site, which will decrease traffic and tank your online marketing efforts. Google wants users to have a good experience, so when bots can’t access your site or find the relevant information they’re looking for, Google decreases your ranking in search results.

Google’s algorithm cares about quality content. You can have wonderful blogs, landing pages, and other content on your site, but if Google’s bots can’t crawl your site to find and index that content, it does you no good. Although Google doesn’t guarantee website indexing, you can encourage Google to crawl your site more frequently.

Check server connectivity errors.

Use Search Console to check for errors and usability issues, then fix those issues so the bots can crawl your site.
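You can also spot-check connectivity yourself. Here’s a minimal sketch using Python’s `requests` library; the URL list is a placeholder (in practice you would pull it from your sitemap), and the goal is simply to catch the timeouts, refused connections, and error status codes that Search Console reports as server errors.

```python
import requests

# Hypothetical pages to spot-check; in practice, read these from your sitemap.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/contact/",
]

for url in URLS:
    try:
        # Bots follow redirects too, so allow them and report the final status.
        response = requests.get(url, timeout=10, allow_redirects=True)
        label = "OK" if response.ok else "ERROR"
        print(f"{label:5} {response.status_code} {url}")
    except requests.RequestException as exc:
        # DNS failures, timeouts, and refused connections all land here;
        # these are the "server connectivity" errors that keep bots away.
        print(f"FAIL  {url} ({exc})")
```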

Use inbound links.

An inbound link is a link from another website, and it tells Google your site has some authority. Others are sharing your content, which makes the search engines take notice. The higher the authority of the linking site, the better. Having local agencies link to your site is good; getting a site like Forbes or the Huffington Post to link to you is even better.

Make frequent site updates.

Add new material to your website. Fresh content encourages bots to crawl your site. Add a blog as a platform for generating new content. Write about your employees, your industry, and your products. In addition, include other types of content: video, pictures, graphs, and interactive content.

Create a sitemap.

A sitemap outlines the pages on your site. Bots read the sitemap, which tells Google how often you update your content. Google itself says that a sitemap doesn’t guarantee indexing, but it can help the bots learn about your site.
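To make that concrete: a sitemap is just an XML file listing your URLs, optionally with a `<lastmod>` date that hints at how fresh each page is. Here’s a minimal sketch that generates one with Python’s standard library; the pages and dates are hypothetical.

```python
import xml.etree.ElementTree as ET

# Hypothetical pages and their last-modified dates (YYYY-MM-DD).
PAGES = [
    ("https://www.example.com/", "2019-09-16"),
    ("https://www.example.com/blog/", "2019-09-10"),
    ("https://www.example.com/services/", "2019-08-01"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

for loc, lastmod in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

# Write sitemap.xml to your web root so it is served at /sitemap.xml.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```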

Submit your link to search engines.

In Search Console, you can submit your sitemap to Google to ask the bots to crawl your pages. Similarly, you can submit your link directly to other search engines. If you have pages that already rank well, link to your new page from those existing pages.
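Besides the Search Console interface, Google has also supported notifying it of a sitemap over plain HTTP. A minimal sketch, assuming a hypothetical sitemap address; submission mechanisms change over time, so treat this as an illustration and check Google’s current documentation.

```python
import requests
from urllib.parse import quote

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical address

# Google's sitemap "ping" endpoint. It's also worth listing the sitemap
# in robots.txt with a line like:  Sitemap: https://www.example.com/sitemap.xml
ping = "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")
response = requests.get(ping, timeout=10)

# A 200 means the ping was received; it does not guarantee crawling or indexing.
print(response.status_code)
```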

Check the technical SEO aspects of your content.

Make sure you have strong titles that showcase the content of each page. Write compelling meta descriptions that identify the content on the page. Keep URLs short, ideally under 50 characters. Furthermore, make sure that each page loads quickly.
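Checks like these are easy to script. Here’s a minimal sketch using `requests` and BeautifulSoup (both library choices, and the audited URL, are assumptions) that flags a missing title or meta description, warns on a long URL, and prints a rough response-time figure.

```python
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> None:
    """Print basic on-page SEO warnings for one URL."""
    if len(url) > 50:
        print(f"WARN: URL is {len(url)} characters (aim for under 50)")

    response = requests.get(url, timeout=10)
    # Time until the response headers arrived: a rough speed proxy,
    # not a full page-load measurement.
    print(f"Response time: {response.elapsed.total_seconds():.2f}s")

    soup = BeautifulSoup(response.text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    if not title:
        print("WARN: missing <title>")

    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    if not description:
        print("WARN: missing meta description")

audit_page("https://www.example.com/")  # hypothetical URL
```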

Share content.

Sharing is a great way to encourage bots to crawl your site. First, share your content on social media. Second, distribute content within your industry communities. Third, find influential sites where you can contribute guest blogs. Finally, invite guest bloggers to write for your site, and get influencers in your industry to link to your page.


Is Your Site Reaching Its Full Potential?

SEO isn’t an exact science; a closely guarded algorithm determines website ranking. Even so, there are tactics you can use to help your site rank higher in search and drive traffic to your site. At Boostability, we have decades of combined experience and millions of data points. Ready to learn more and start getting your website ranking? Contact us today to set your website up for success.


This post was originally published in December 2018 and has been updated with current information.



Kristine is the Director of Content with Boostability. She brings a decade's worth of communications strategy work to the company. In addition to being a part of the marketing team, Kristine enjoys traveling, sports, and all things nerdy.

19 Comments

  • Josh, April 9, 2015 @ 7:43 am

    Awesome article! The two things that I want to work on are adding fresh content more often and earning more inbound dofollow links. One question I have had is if Google regards comments as fresh content. If so, in theory, the more interaction an article gets by people leaving comments, the more often Google will crawl the site. Anyone know for sure if comments are perceived by Google as fresh content worthy of more frequent crawling?

  • Jamison Michael Furr, April 9, 2015 @ 11:34 am

    @Josh I don’t actually have an answer, but that is a really good question that I wouldn’t have even thought of – I want to know too now!

  • Caz*, April 11, 2015 @ 8:59 pm

    That is a great question. It depends how the site is built out and how the commenting system works. For example, if your comments are built out as a plugin, they’re also most likely built out as an iFrame within the page and not just a part of the page. Therefore, your iFrame is technically getting credit for new, engaging content every time a comment is left. However, Google gets smarter and smarter in how they crawl a page. At this point, I don’t know that they are doing a perfect job of counting iFrame content as credit toward the page the iFrame is in, because they’d have to program it extensively in order to give credit only if the iFrame site is owned by the same site where the iFrame exists. Otherwise, anyone could add an iFrame of these comments, for example, to their site and receive credit.

    However, they do easily track and understand visits, repeat visits, unique visits, bounce rate, and all those tasty Google Analytics numbers, which translate to quite the SEO-boosting treat! So much of how relevancy is defined comes down to how the site is surfed. Google has shifted to trusting the user’s desire and intent rather than trying to derive relevancy from the code on the page. Obviously, if people are coming back to the site often, that is more important than the number of unique words presented on a page and the percentage of words that may be considered “keywords,” for example.

    Does that make sense @Josh? And answer your curiosity too @Jamison?

  • Longboards USA, April 13, 2015 @ 10:54 am

    hmmm about: “Google visits them an average of 17 times a day.” Are you sure about that?

    I thought it was more like an average of 17 pages per day (over the last 90 days) – the metric is “Pages crawled per day”.
    Looking at the crawl stats, it shows that Google comes in almost every day (for a short time), grabs a number of pages, and comes back the next day to grab some more.
    That doesn’t mean all the other metrics are updated daily, as they only seem to change once or twice a week (e.g. index status, links to your website…).

    go4it,
    Rene

  • SS, April 13, 2015 @ 11:19 am

    As for fresh content, does changing the home page with new copy or a new layout count? Does Google Blogger count as fresh content if it’s not part of your website pages? Is changing copy on inner pages a good idea or a waste of time? It seems that only totally new pages would count as fresh content, or am I wrong?

  • Jessi Losee, April 13, 2015 @ 12:51 pm

    Great article Jake! This is quite a common question among clients and you hit the nail on the head with your answer.

  • Caz*, April 13, 2015 @ 8:11 pm

    These are such great questions!

    Updating content across your site does count as new, updated, relevant content. However, you also don’t want to be updating all the time or changing what you have too often. Regular updates are expected – something like a once-per-year overhaul of your website to make sure all of the content remains relevant and no links are broken. Adding new content to the site can be done by adding new, relevant pages, and by updating pages that are expected to change more often, like testimonials, case studies, FAQ pages, and blogs.

    As for a Blogger blog: although Blogger is owned by Google, unless you are hosting the blog directly on the same server where your website exists, it doesn’t really count as updates to your site so much as Google seeing it as another site that is linking back to your site. Does that make sense? To get the best results from blog updates, you’ll want to use a service that will take each new blog post and publish it to a folder within the same root folder where your site exists with your web host.

    Does that answer all your questions?

  • Caz*, April 13, 2015 @ 8:11 pm

    What advice do you generally give when being asked this question?

  • Jamison Michael Furr, April 17, 2015 @ 3:09 pm

    Yes! Thanks so much @Caz 🙂

  • Maria Williams, April 23, 2015 @ 10:46 am

    Good job, Jake! This is a helpful article, and like you mentioned, we should have fresh content if we want Google to crawl the website more frequently. You mention that adding a blog is beneficial for always having fresh content on the website, but in some cases clients won’t do it. So I guess my question is: do you recommend changing the content on the existing pages of the website? If so, how often would you change it?

  • Yucel Yalim, August 30, 2015 @ 5:02 pm

    I thought my site was being crawled pretty regular… and perhaps it is… tho things seem to take a while to trickle thru google… that is… things don’t always happen right away… even after a definite google event…

  • Jason Newsted, October 13, 2015 @ 4:46 pm

    What about using the Fetch As Google feature? Isn’t that the simplest way to make Google crawl your webpages?

  • Caz*, October 21, 2015 @ 11:32 pm

    Fetch as Google does allow your website to be shown as Google sees it, and you can ask Google to crawl your website in Google Search Console. However, these features do not guarantee that your entire website will be listed in search results. The only way to ensure your site is mapped correctly in Google’s eyes is to give them a specific XML file to crawl. This tells Google more than it can see just from following links on your site.

  • Caz*, October 21, 2015 @ 11:33 pm

    You can see how many pages Google, Bing, and Yahoo are aware of with this free app http://www.FreeWebsiteScore.com

  • decker strength., December 27, 2015 @ 3:01 pm

    Good to know. Thanks for the article!

  • Caz*, December 28, 2015 @ 8:55 pm

    Glad this article could help you!

  • seonewtool, January 20, 2016 @ 11:13 pm

    Google index checker analyses how easily and quickly Google is able to crawl or index a website. This tool is also useful for checking the Google index stats of multiple websites at a time.

  • Andrew Williams, February 17, 2016 @ 2:04 pm

    Jake is on point with this article. There is so much good information in Google Search Console for seeing your crawl errors and the history of when Google crawls your site.

  • Find DankWeed, February 28, 2016 @ 11:32 am

    very informative article!!

Comments are closed.