Every SEO term you’ll ever need to know, whether you want to learn something new or simply gain a better understanding of your SEO campaign.
A forward from an old URL to a new URL that permanently transfers all search authority. Current inbound links and traffic are redirected to the new URL. 301s are used for permanent URL redirection when the webmaster intends to replace the page with a different one for good.
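For illustration, a 301 can be declared with a single rule in an Apache .htaccess file (the paths and domain below are placeholders, not a prescribed setup):

```apache
# Permanently redirect the old URL to the new one,
# passing link authority and traffic along with it
Redirect 301 /old-page/ https://www.example.com/new-page/
```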
A forward from an old URL to a new URL for temporary purposes. They differ from 301s in that search engines treat them as temporary and keep the original URL in their index. They are typically used for A/B testing and for maintaining a good user experience while a webpage is down.
Also called a 404 Not Found or Page Not Found error (distinct from a Server Not Found error, which means the browser could not reach the server at all). It means the server cannot find the requested resource. 404 errors show when a user clicks on a broken or dead link, or requests a page that no longer exists.
Borrowed from newspaper publishing, above the fold refers to the content that appears in the top half of a webpage before the user scrolls. Like headlines in newspapers, the top section of a website should grab attention, and its appearance on desktop, tablet, and mobile devices should all be taken into account.
Currently known as Google Ads, it’s the system where advertisers bid on keywords for their ads to show up in Google Search results above organic results. These ads can be service offerings, products, videos, or brief copy.
Every year Google and other search engines make hundreds of updates to their algorithms for serving up search results. These updates can range from small tweaks to Natural Language Processing to large hits to current SEO strategies. When these algorithm updates are big enough they are named (i.e. Penguin, Panda, or Hummingbird).
Descriptions of images that are displayed when the image itself cannot be rendered. The purpose of alt text is not necessarily to describe the contents of the image literally but to give context and purpose to the overall page. Alt text also serves users who rely on screen readers, so it improves accessibility as well as SEO.
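In HTML, alt text is set with the alt attribute of an img element; the filename and description here are hypothetical:

```html
<!-- The alt attribute is read by crawlers and shown (or read aloud)
     when the image itself cannot be rendered -->
<img src="blue-running-shoes.jpg" alt="Pair of blue running shoes on a wooden floor">
```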
An abbreviation for Accelerated Mobile Pages, it’s an online publishing format created by Google. This markup language is similar to HTML but optimized for mobile web browsing.
Text that serves as the clickable “anchor” of an HTML hyperlink. Search engines give anchor text extra weight because it typically describes what the hyperlink points to, though that does not have to be the case. Strategies around anchor text have changed over time as search engines evolve and learn how to weigh the importance of content.
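In markup, the anchor text is simply the visible text between the opening and closing anchor tags (the URL and wording here are placeholders):

```html
<!-- "local SEO services" is the anchor text search engines read -->
<a href="https://www.example.com/services/">local SEO services</a>
```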
Backlinks are hyperlinks that direct a user from one website to another. Search engines view these connections as integral to their ranking algorithms. Depending on the frequency, quality and timeliness of links they can have a wide range of effects on the authority of your site. If your site has backlinks on another site then you would consider those “inbound links” as they are driving traffic back to your site from another.
A valuable term in website analytics. A “bounce” is when someone navigates away from your site without interacting with it further. Bounce rate refers to the percentage of visitors who leave your site after viewing only one page.
Keywords that include a company’s name or are related to a brand’s identity. In our case, any keyword that includes “Boostability” is a branded keyword regardless of the other words around it. These are keywords customers use when looking for specific information about a business or brand.
A software application that allows users to access the internet by interpreting HTML files and displaying a graphical user interface in the format intended by the website owner. Popular browsers include Google Chrome, Safari, Firefox, Microsoft Edge, and Internet Explorer.
A listing of local businesses in different categories. Directories can often be found online but they may be printed as well. Consumers use business directories to find businesses to meet their needs. Many of these directories include paid services or premium options to help businesses be seen on their site more prominently.
HTML link tags that tell search engines that different URLs host the same information and that only one should be considered for indexation. Webmasters use canonical URLs to avoid issues with duplicate information. This tactic is similar to the 301 Redirect without actually redirecting traffic. Instead, it identifies the canonical URL as the one with the most authority so that duplicate pages pass their signals to it.
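A canonical tag is a single line placed in the head section of each duplicate page, pointing at the preferred URL (a placeholder address is used here):

```html
<!-- Tells crawlers which version of the page to index -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```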
Opinion content generated by users to comment on the subject matter of a certain page. Comments can be used on blogs or other sites to allow readers to interact with the content by posting a comment.
There are two types of comment sections, gated and non-gated. Non-gated comments are open to everyone on the internet. Gated sections require users to identify themselves in some way to the site as a way of monitoring what is being commented.
Information developed by a company to inform their consumers of products, services or answers. Content can include videos, blog posts, articles, social media posts, etc. Content is how you communicate with your consumer.
An important indicator in website and business performance. A conversion occurs when a visitor to a website performs a desired action like completing a purchase or filling out a form. What a conversion looks like will differ from website to website depending on what the site’s goals are.
The percentage of unique visitors who complete the desired action on a website and turn into customers. In most cases the higher the conversion rate, the more successful your website is.
The act of optimizing lead funnels and interactions to achieve a better conversion rate. This can include changing assets, media or altering content on your website. Many marketers will use A/B testing to see whether method A converts better than method B for CRO.
A small packet of data that is sent from a website and stored browser-side. They are typically used to remember information regarding e-commerce or record browsing activity. There are many different types of cookies that serve different purposes, but the modern web uses authentication cookies to know whether or not to serve up sensitive information.
The most important pages and pieces of content on a website. These are pieces of content that other pages on your site link to in order to direct users to your best content. This internal linking structure can also guide web crawlers to know what you consider your best content. Cornerstone pieces are detailed, typically longer, and teach what a user needs to know about a subject. Normally, cornerstone pieces aren’t meant to sell content or push products, but to inform. Cornerstone content also includes the most competitive keywords that a site wants to rank for.
The act of scraping and indexing content from the internet for the purpose of serving search results to a search engine and browser. There are many types of crawls but one of the most common is a website crawl. This type will start at the homepage and grab links from there to search through a whole site. Each link followed from that point is another level in depth.
Each website has what is called a “crawl budget” that is determined by the bot. A crawl budget is the maximum number of pages that the crawler will visit on a website. This typically is only an issue for larger sites but can affect the indexation of pages further down the crawl.
A software application that fetches, analyzes and files information from web servers. A web crawler systematically browses the internet for the purpose of indexation. Because of the enormous size of the internet even the biggest and most complex crawler falls short of a fully complete index. A robots.txt file can request that crawlers only visit specific parts of a website or nothing at all.
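A minimal robots.txt sketch, with placeholder paths, might ask all crawlers to skip one directory and point them to the sitemap:

```
# Applies to every crawler
User-agent: *
# Request that /admin/ not be crawled
Disallow: /admin/
# Help crawlers find the full list of pages
Sitemap: https://www.example.com/sitemap.xml
```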
The act of a website being removed from the search results page. Typically being de-indexed comes from a violation or manipulation of the rules set up by search engines like Google. These infractions come in the form of spammy content, toxic linking, or keyword stuffing, among others.
“De-indexed” is often used synonymously with a Google Manual Penalty, but a penalty is not always the cause. A human error on the back-end of your site can also cause de-indexation. Depending on the cause, you will need to take the appropriate action to have it reversed.
Direct Traffic is attributed to the portion of traffic on your site when Google Analytics cannot determine the source or channel. Typically this happens when users come to your site through entering your URL into their browser directly or using a bookmark. Although it should be said that this is not the full answer. Traffic can also be Direct if incomplete UTMs have been placed on links or shared social media links from apps that don’t transfer referral information.
The default behavior of an HTML anchor element: with no rel attribute set, search engines treat the link as one that passes along authority. This is the opposite of the nofollow attribute, which tells search engines not to pass authority.
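The contrast can be shown with two hypothetical links; the first passes authority by default, while the second carries rel="nofollow":

```html
<!-- No rel attribute: authority is passed ("dofollow" by default) -->
<a href="https://www.example.com/">trusted partner</a>

<!-- rel="nofollow": asks search engines not to pass authority -->
<a href="https://www.example.com/untrusted/" rel="nofollow">untrusted source</a>
```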
The amount of time a user spends on a website. The duration starts when the user lands on the site and ends when they leave. Google has used dwell time as an indicator for page rank as this metric can be an indicator of satisfied user intent.
See White Hat SEO
E-A-T is a term that Google uses for the Quality Rater’s Guidelines for judging how much clout the website and author has. The correlation between website and author and how much expertise, authority and trust they have is used to determine if they provide quality content or not. There are many different strategies for improving your website’s E-A-T score that you can check out here.
Links that point to any domain other than the one they appear on. They lead to an external source domain.
Special boxes found in the SERPs that provide results in a format that is dissimilar to the normal lists of links. If your domain ranks for a Featured Snippet for a search term then it will NOT HAVE AN ORGANIC LISTING on the same results page. Because of this there are many different strategies that go into whether trying to land a snippet is ideal or not.
Not every search query has a featured snippet in the SERP but they are typically included on question queries. They are different from Knowledge Cards from Google’s Knowledge Graph that are used to enhance the results page.
Typically understood as “timely” content regarding breaking news stories but can also mean the frequency at which a site posts content to add new pages or even how often they update content. The concept of fresh content can be extremely powerful for SEO rankings if implemented correctly.
The term “evergreen” content is sometimes mixed in with fresh content but is different in ideation and execution. Evergreen content is content that is meant to stand the test of time while fresh content is the idea that you hit the topic while the iron is hot and move on.
A technology company that was started by Larry Page and Sergey Brin that has a wide variety of products and services. The initial product was the Google Search Engine that was created with the idea of PageRank as opposed to frequency of search terms and has evolved into what we now know today.
Google Search is always evolving and has many algorithm updates that change what results are being shown and how to rank for those results. The industry that seeks to optimize for and understand those changes and how they relate to businesses is called Search Engine Optimization or SEO.
Paid advertisements which appear in SERPs with the Search Network or appear on websites through the Display Network and Google’s AdSense program.
AdWords are focused around keywords: advertisers make a list of target keywords that are pertinent to their business’ offerings, then bid on them. How much they are willing to pay, combined with their Quality Score, determines which ads appear.
Launched in 2005, Google Analytics is a service offered by Google that tracks and reports website traffic. This website activity is displayed as different metrics like session duration, bounce rate, events, etc. You can modify Analytics to track conversions as well as assess traffic trends to help optimize your site for better user experience.
A free tool paired with Google Ads to help select which keywords to target and how to be competitive in bids for those keywords in the Google Search Network.
This tool is also used for more robust keyword research for campaigns and also for historical statistics. You can see the competitiveness of specific keywords and terms as well as how many times per month any term is searched in Google’s search engine.
A free tool for businesses and organizations to manage their online presence on Google’s Search and Maps. You can verify your business location, post updates, respond to reviews and tell your story from this platform. Google My Business is intrinsically tied to local SEO in that it helps Google recognize your business as an entity located in a specific location.
When a Google reviewer has looked at your site and deemed that it is in violation of the Quality Rater Guidelines, you can receive a Manual Penalty. These penalties are not to be confused with algorithmic devaluation that can come from changes to Google’s core algorithm.
You can check to see if your site has a Manual Penalty in the Google Search Console under the Security & Manual Actions tab. Reversing a penalty can be a complicated fix but if you have completed all the allotted changes Google requests of you then you can submit your site for a re-evaluation with a reconsideration request.
The Quality Rater Guidelines are a set of guidelines given to a group of people that Google contracts to check sites manually to better train their algorithm’s results. The Rater Guidelines are located right here and can be viewed publicly. You can read through them and see exactly what Google is looking for and extrapolate what they view as quality content.
A web service provided by Google that allows webmasters to check indexing status as well as test for mobile effectiveness. This product is constantly being updated with new features and reports to help webmasters understand how Google visits their site. Some of the most prominent features are the implementation of sitemaps and robots.txt files on GSC.
A free tool that allows you to manage and deploy marketing tags, triggers and variables on your website without modifying the underlying code. This is especially valuable to marketers as they can implement pixels and code without fear of harming their site. Third-party tags can all be managed with Google Tag Manager and campaign tracking can give amazing insights on performance.
Guidelines that have been published for the benefit of webmasters so they know what Google considers “good” and “bad” practices. For example, they have a few basic principles like:
Make pages primarily for users, not for search engines.
Don’t deceive your users.
Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.
Google also lets us know what we should be avoiding. Things like link schemes or little to no original content on your site. They have also denounced hidden text and abusing structured data markup. The Google Webmaster Guidelines are very similar to the Quality Rater Guidelines but they are written for webmasters as opposed to raters.
The web crawler software that Google uses to collect documents and build a searchable index. GoogleBot is characterized by two different types of web crawlers: a desktop crawler and a mobile crawler. You can use your robots.txt file to give specific instructions to the GoogleBot or other web crawlers on how to crawl your site.
As of July 1, 2019, Google announced that all new websites crawled by GoogleBot will be on a mobile-first index standard. This means that when a site is crawled, the mobile version of the site will be served to their index. Making your site “mobile friendly” is now more important than ever to avoid indexation issues.
The act of writing content for another business or website where you are featured as a guest author in order to attract traffic and boost authority with inbound links. The nature of guest posting has changed many times over the years. Prior to Matt Cutts’ 2014 denunciation of the practice it was commonplace to share content and links without regard for quality, but now only the best content and an organic relationship should be considered.
As with many SEO tasks, guest posting can be done in a “Black Hat” or unethical manner but if done correctly can be a great tool for digital marketing. Avoid spam bloggers and do not allow low-quality content to be shared on your site. Remember Google’s Quality Rater Guidelines when considering content from an unknown guest blogger and keep your site’s users in mind.
An introductory page of a website that typically serves as a “table of contents” for the overall site. They often are used to help a user navigate throughout the website or present headlines to entice potential customers.
A basic markup language for tagging text documents designed to display in a web browser. HTML documents are built from tags, along with character references and entity references. Tagging is done to modify font, color, graphic and hyperlink effects. Tags normally come in pairs like <h1> and </h1>.
HTML describes the structure of a web page, helps interpret the content through emphasis, and improves the accessibility of web documents.
The most common HTML headings or “tags” come in pairs like <h1> and </h1>, <h2> and </h2>, <h3> and </h3>. They help delineate importance from 1-6 with 1 being the most important tag on the page.
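In practice the hierarchy looks like this (the heading text is invented for illustration):

```html
<h1>SEO Glossary</h1>        <!-- the single most important heading on the page -->
<h2>Technical Terms</h2>     <!-- a major section -->
<h3>301 Redirect</h3>        <!-- a subsection within that section -->
```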
A protocol application for distributed, collaborative, hypermedia information systems. It functions as a request–response protocol in the client–server computing model and is the foundation of data communication for the World Wide Web.
An extension of HTTP, HTTPS is used for secure communication over a computer network. With this type of connection the protocol is encrypted. The request URL, query parameters, headers, and cookies are all encrypted to provide a more private and secure connection.
A clickable reference to another source of data. The data source is typically another webpage. Hyperlinks are most commonly set to text called Hypertext but can also be an icon, graphic or a picture. Hyperlinks were integral to the formation of the World Wide Web in that they connect websites together.
A measurement of how many users see an ad or other type of digital media. This measurement is not action-based and simply refers to how many times the media was seen. Impressions are generally used for campaigns aimed at increasing brand awareness.
A link coming from another website to your site. Inbound links serve to direct users from one site to your site to learn more and serve user intent based on the anchor text they incorporate. Authoritative inbound links are an important aspect of good SEO.
Where search engines store the information they have gathered and processed from across the internet. It serves as a database of all the content that meets their quality standards. Search engines pull results from their index whenever a user performs a search so websites must be indexable (visible to search engines) to be found by users.
Global search engines, like Google, maintain indexes for each separate market. Different countries can access results from national indexes that specialize results for the search behavior of that country. Likewise, local indexes focus on results catered to a specific region or city. Common queries like “near me” and those that contain geographical names will return results from local indexes.
How accessible and transparent a web page is for search engine web crawlers. The easier a site is for crawlers to download and catalog, the more indexable the site is. Indexability is a key factor in determining web positioning, or how visible a site is on the web.
A link on a site that links to another page on the same website. Internal linking allows your web pages to share their link authority and is the first signal to search engines that your pages are important. Your most important pages (i.e. product pages, your homepage, etc.) should be linked to the most.
The idea of geotargeting your SEO to be either friendly with many countries or targeting a specific country with your SEO strategy. This can be done with the use of language tags and performing SEO with language based keywords or location based keywords. International SEO is similar to local SEO in strategy and practice in that you tailor your content for a specific location. You may also need to consider different search engines with your international strategies like Yandex in Russia or Baidu for Chinese search results.
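One common implementation of these language tags is the hreflang link element, placed in a page’s head section; the URLs below are placeholders:

```html
<!-- Tell search engines which language/region version to serve -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/">
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/">
<!-- x-default marks the fallback version for unmatched locales -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```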
A word or concept of great significance. In SEO, keywords are used to best describe the content of specific pages. Search engines use keywords for data retrieval, helping you find sources associated with your keyword or keyword phrase. These search engines are always evolving to better serve user intent and are constantly being tweaked.
How difficult it is to rank for a specific keyword or keyword phrase. The higher the competition, the higher the difficulty of ranking. Search frequency, domain age, site structure and overall quality of content can help mitigate some of the issues that arise with high keyword difficulty. There are also many tools that help with keyword research and help determine the keyword difficulty.
Common phrases can be extremely competitive and not worth a small business’ time to try to rank for. That’s why doing proper keyword research and competitive analysis is integral to a quality SEO campaign.
The practice that SEO professionals use to determine the right keyword to target for a particular business so as to appear higher in the search results. These practices can vary widely depending on the website’s niche and campaign intent but are all similar in nature. Typically a researcher will use tools like Google Ads Keyword Planner, SEMrush’s Keyword Magic Tool, or Ahrefs’ or Moz’s Keyword Explorer to select keywords that are attainable for the website.
A knowledge base or index that Google uses to enhance SERPs with information gathered from various sources. Typically seen as an Infobox on the side of a SERP, a Knowledge Card was incorporated into Google’s search structure in 2012. The implementation of this information was done to provide a more useful and relevant experience to searchers.
A web page that appears in response to clicking on a link in the SERPs or marketing advertisement.
As part of SEO, link building is the act of increasing the number and quality of inbound links to a webpage. This is done to increase search engine rankings and traffic. There are many types of links a website can acquire of various levels of authority and trust as well as methods for gaining these links. Link building can build brand awareness but, if done poorly, can result in a penalty from Google. When link building, always remember to add value to end users and abide by Google’s terms of service for best results.
A term used to describe how much authority a link is able to pass to another site or webpage. Link equity is closely tied to Google’s old PageRank in that it views a backlink profile as a way to determine search engine rankings. Some of the key indicators of good link equity are relevancy, indexation and overall quality of the sites involved in the link exchange.
The entirety of inbound links pointing towards your site constitutes its link profile. Having a good backlink profile means having multiple relevant and authoritative links pointing to your site. These links can come from a wide variety of locations like other domains, directories, comments or social media shares. They can also be a mix of nofollow and dofollow, a natural, healthy blend that search engines like to see.
A company that provides goods or services to a local population is considered a local business. Often identified as a “brick and mortar” location. Local businesses have different online strategies for getting foot traffic in their area than you would see for a national business without a location near a consumer. A local business can be a locally owned business or a corporate business with multiple locations operating in a specific area.
A group of three to five local business listings that appear in a map box relative to the searcher’s location or search query that includes a specific location. These results are pulled directly from Google My Business listings as opposed to the organic search index.
Optimizing a business’ online presence to rank for local searches. The strategies for local SEO are very similar to traditional but involve business listing sites like Yelp, Angie’s List and Manta as well as social campaigns to help rankings/visibility. Reputation management and review transparency also play a large role in Local SEO success. These strategies also include ranking for locational queries in order to pull local traffic to a physical location.
Part of a website’s head section typically under 160 characters and used to describe the page’s content. These descriptions are served up on SERPs and should be optimized to get engagement and click-through-rate. Meta descriptions help users see what is on the page as they appear in SERPs. A Meta Description is a type of Meta Tag but should not be used interchangeably as Tags can include many different elements that are only for crawlers.
Meta Tags are similar to descriptions but are used to more distinctly tell search engines what is on a page. They are used for the search engine’s benefit as they are found in the source code only. Sometimes the term Meta Tags is used in place of Meta Descriptions but they are, in fact, not to be used interchangeably.
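Both live in a page’s head section; a sketch with invented content:

```html
<!-- A meta description (one type of meta tag) shown in SERPs -->
<meta name="description" content="A plain-language glossary of every SEO term you need for a successful campaign.">
<!-- Other meta tags exist purely for crawlers, for example: -->
<meta name="robots" content="index, follow">
```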
Modifying a website’s content to ensure that mobile visitors have a good experience.
This can be achieved by optimizing videos, content, images, buttons, using responsive templates and considering site load speed. Changes like these to a site will help overall user experience. Google offers their Accelerated Mobile Pages as a solution to many of these mobile optimization challenges.
The mobile version of your website is what Googlebot crawls and archives in their search index. This is different from what it was in the past as it would pull the desktop version of a site. Mobile-first does not mean mobile-only and if a website doesn’t have a mobile version served up then the desktop version will be included in the index. Not having a mobile version of your site COULD harm your ranking potential.
The capability of a website to be rendered on a mobile device or the ability for a web page to render appropriately on different size windows or screens.
There are two types of mobile-friendly sites—an entirely separate site designed to serve up on mobile and a responsive design that is a site that auto-adjusts itself based on screen size. If a site is truly “mobile-friendly” then it will be easy to navigate and load on mobile devices. If a user is able to easily navigate your website no matter the screen size it can help with Google ranking factors in an SEO campaign.
The process of moving through a website or the SERPs in a web browser. Think of web navigation as a road map that enables website visitors to explore and discover different areas.
An HTML attribute that instructs search engines that the hyperlink should not influence the ranking of the link’s target relative to the source. A nofollow attribute will not stop a search engine’s crawl but will diminish or lessen the authority typically passed. Nofollowed links are not worthless, because a healthy backlink profile has a good mix of both followed and nofollowed links.
Optimizations done on sites other than your own for the purpose of increasing your site’s search engine ranking. This is done by sharing content, review cultivation, influencer outreach, guest blogging and social media. Good SEO strategies take into account both on and offsite SEO strategies for overall success.
Optimizations done to your own website for the purpose of increasing your search engine rankings. This can be done by optimizing keywords, content, titles, meta descriptions, headlines, microdata, site speed, internal links, mobile friendliness and alt tags. Because these things are all done on your own website they are considered onsite optimizations and are a good starting point for any SEO campaign.
Links that are gained without an explicit agreement to exchange links. They are placed in a piece of content to enhance the content or to validate the stance of the author. They can also be called authoritative links. The idea of organic link building is that you produce content of such high quality that people and sites organically link to your own.
On a SERP there are two types of results: organic and paid. Organic rankings are everything that is not a paid result. On the front page there can be anywhere from 7 to 10 organic results with the most desirable being the first result. With the advent of featured snippets and what is considered “position zero” by some there is debate as to whether being in top position is better or ranking for the featured snippet is better.
Traffic that comes to your site from a search engine results page (SERP) for free. On a SERP there are two types of results: organic and paid. If you have a page on your site that ranks highly for a search query you have more potential for acquiring more organic traffic.
Google’s PageRank was named after one of their founders, Larry Page. It was a link analysis algorithm that assigned a numerical value of 0-10 to denote the importance or authority of a webpage. On April 15, 2016 Google turned off the toolbar display that showed PageRank data. While Google still uses PageRank to determine rankings in search results, it is no longer visible to site owners and is not considered an important metric to track.
Page Authority (PA) is often confused with PageRank but should not be used synonymously. Page Authority is a number between 1-100 that Moz uses to denote the quality of a page based on their own proprietary algorithm and is still visible with their MozBar tool.
The amount of time it takes for a web page to load. There are different ways to measure page speed; some of the most common are time to first byte, first meaningful paint, and fully loaded page. There are some things each webmaster can do to help their loading time, like upgrading your hosting, cleaning up your code and optimizing your images. There are many different tools you can use to check your page load speed, like Google’s own tool.
In April 2010 Google made page speed a ranking factor and in 2018 they ramped up the importance of this metric in their algorithm.
A short description of a page on a website. This is typically under 70 characters and appears on the top of the browser window and in SERPs. A page title often has the focus keyword for the page.
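In HTML the page title is the title element in the head section (the wording here is made up for illustration):

```html
<!-- Shown in the browser tab and as the clickable headline in SERPs -->
<title>SEO Glossary | Example Company</title>
```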
Any traffic to your site that the webmaster has paid for. This includes advertising promotions on PPC campaigns and social media advertisements. The strategies involved for obtaining the right kind of paid traffic can be very complex and can be extremely targeted towards specific audiences.
See Manual Penalty
A type of Internet marketing where advertisers pay a fee each time their ads are clicked. The most popular platform for PPC advertising is Google Ads, but other platforms like Bing and Facebook have adopted this type of paid advertising.
Bid-based PPC is when advertisers compete against other advertisers in a private auction for a specific ad spot that is typically tied to a specific keyword. This auction plays out automatically every time a visitor triggers an ad spot. These ad spots can either be in the SERPs or hosted on a third-party partner website that then shares the ad revenue pulled.
Content is the method by which a website presents its audience with information. This content can take the form of text, graphics, video, or audio. Quality content provides value for users, generates more sales, is evergreen, shareable, and SEO-optimized, and/or enhances the usability of the site.
The actual quality level of content can be interpreted from a few different angles. Quality content for users might be different than quality content for web crawlers. Google is always tweaking its algorithm to bring those two in line, but sometimes they can differ. Following Google’s Quality Rater Guidelines can be a great touchstone for what qualifies as quality content.
A backlink to your site that fits into your content and backlink profile in a manner that enhances the visitor experience, increases traffic, and/or verifies authority. Quality links can differ depending on your campaign goals and the type of website you manage.
A query, in web search terms, is a question that is entered on a search engine to satisfy the informational needs of a user.
Wikipedia describes four different types of web queries:
Informational queries – Queries that cover a broad topic (e.g., colorado or trucks) for which there may be thousands of relevant results.
Navigational queries – Queries that seek a single website or web page of a single entity (e.g., youtube or delta air lines).
Transactional queries – Queries that reflect the intent of the user to perform a particular action, like purchasing a car or downloading a screen saver.
Connectivity queries – Queries that report on the connectivity of the indexed web graph (e.g., Which links point to this URL?, and How many pages are indexed from this domain name?).
Web search queries can be structured to be more specific than just a textual question with search operators.
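For example, a few widely supported search operators (exact behavior varies by engine, and example.com is a placeholder domain):

```text
"exact phrase"          results must contain this exact phrase
site:example.com seo    limit results to pages on example.com
intitle:glossary        pages with "glossary" in the page title
seo -ppc                results about seo, excluding the term ppc
```

Operators like these let users (and SEO professionals researching rankings) narrow a query far beyond a plain keyword.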
A machine learning search engine algorithm update that Google released on October 26, 2015. This update was a great step toward Google’s search engine understanding user intent and serving up better, more relevant results. SEO tactics surrounding this update emphasized quality content.
A search result position that is displayed on a results page. There are different factors that play into positions on a results page with some of the most prominent being backlinks, website optimization and content structure.
See 301 Redirect or 302 Redirect
A traffic type that Google identifies as a visit arriving from a source other than its search engine. This can include people clicking links on another website or on social media.
The idea of relevance in relation to SEO and search engines is how well content corresponds to a search term or keyword. The more closely content relates to the intent behind a keyword, the higher it should rank for that term.
Influencing the way people think about you or your brand online. A good online reputation helps sales and is essential in a modern competitive landscape for businesses of all sizes. Small businesses live and die by their online reviews, and campaigns directed toward gathering good reviews are a legitimate strategy that should not be ignored.
Companies, brands, and people seek reputation management services when they see negative news being shared. Because this type of PR is done online, companies use social media and SEO tactics to lead their reputation management campaigns.
Often used interchangeably with Featured Snippet, but a rich result is more encompassing than just a featured snippet. Rich results can include People Also Ask boxes, video results, knowledge cards, or carousels. The strategy for ranking for these results involves adding microdata markup (like Schema.org) to the page.
A standardized way websites communicate with web crawlers about which areas of the website should be crawled and which shouldn’t. Not all crawlers follow robots.txt directives, as some scan for security vulnerabilities. Not to be confused with a sitemap, a robots.txt file is often used in conjunction with a traditional XML sitemap.
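A minimal, illustrative robots.txt file (the paths, bot name, and domain are placeholders, not recommendations for any real site):

```text
# Allow all well-behaved crawlers, but keep them out of /admin/
User-agent: *
Disallow: /admin/

# Block one specific (hypothetical) crawler entirely
User-agent: BadBot
Disallow: /

# Point crawlers to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain (e.g. example.com/robots.txt), and compliance is voluntary on the crawler’s part.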
A vocabulary created collaboratively by Google, Yahoo, Microsoft, and Yandex to structure metadata (microdata) on web pages. This markup can help search engines understand the content of a page and website for enhanced results on the SERP.
An example would be a website’s homepage having logo, organization, contact, and social media markup for knowledge card results. There are three ways to implement Schema.org markup on your site: Microdata (inline HTML attributes), RDFa, and JSON-LD, with JSON-LD being the most popular and recommended method.
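As a sketch, organization markup for a homepage might look like this in JSON-LD (the company name and all URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/exampleco",
    "https://twitter.com/exampleco"
  ]
}
</script>
```

The script block sits in the page’s HTML and is invisible to visitors; crawlers read it to connect the site to its logo and social profiles.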
A GUI element used to accept user input for queries of a database. Typically seen in search engines but a search box can be used for any database that has a search function.
A software system designed to crawl websites, perform web searches, and archive data to be queried by a user at a later time. Each search engine has its own algorithm that is continually updated with better, more intelligent functions to crawl, search, and index more effectively.
When a search engine decides on rankings, it weighs specific factors differently to attribute a position to each result on the SERP. Frequently, when you hear about algorithm updates, the community is referring to the Google search engine and its algorithm.
A form of digital marketing that involves the promotion of websites by increasing their visibility in SERPs. SEM encompasses PPC, SEO, social media, and website and content optimizations directed at results page rankings.
The process of increasing website traffic through improvements in visibility on a search engine results page. You can improve your visibility through various methods that many SEO companies offer. SEO is solely focused on natural or organic results and omits paid search methods like PPC or ad space.
An SEO campaign is fulfilled differently depending on the website, audience, competition, and niche. Because of this variety, an SEO professional needs to be competent in how search engines work, algorithm updates, search volume, search terms, website structure, and audience tendencies to have success. Typically, optimizations fall under two classes: on-site and off-site. This means you need to take into account both your website and influences outside of it.
The pages displayed by a search engine in response to a query.
These results used to be a simple 1-10 position ranking that started with the best, most linked-to result being in the top spot on the page. But due to many changes in the online landscape there are now snippets, cards, carousels, ads and other elements that are competing directly with the traditional 1-10 spots.
The list of web pages a user has visited recently. Typically this data is recorded by the web browser along with page title and time of visit. Sometimes third-party services record web browsing history for various reasons.
A question posed to a search engine in order to search online databases for a response of web pages. Historically, search queries were associated with a search term or keyword, but as algorithms get smarter and better understand human speech, we move further away from short keywords and toward actual conversational dialog.
Traffic that visits your site from a search engine.
The number of searches that are anticipated for a specific keyword in a 30 day time period.
A form of online writing that contains keywords, helps rankings, and increases traffic. When writing, you have to take into account both the user and a search engine crawler. One may understand the context of the content while the other may not, so using keywords to help illustrate intent is essential.
A URL that illustrates context to users and crawlers. Understanding how search engine crawlers use URLs is essential to selecting a good URL slug that is SEO-friendly.
Best practices for this type of URL selection vary, but the general consensus has been that the slug should contain the keyword the page looks to rank for.
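For instance, compare a parameterized URL with a descriptive, keyword-bearing slug (example.com and the paths are placeholders):

```text
Hard to read:  https://www.example.com/index.php?p=8274&cat=3
SEO-friendly:  https://www.example.com/blog/seo-glossary
```

The second URL tells both users and crawlers what the page is about before it even loads.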
An SEO reseller is an individual or a business that outsources an SEO strategy and deliverables to a third-party and then sells the SEO service under their own branding to their list of clients. The third-party can be either a freelancer, digital marketing agency, or a dedicated SEO agency like Boostability.
A process by which a webmaster or SEO agency audits a website to see how optimized it is for search engines. This includes technical aspects like page speed, site structure, and mobile optimization, as well as content and keyword placement. Commonly used tools include Screaming Frog, SEMrush, and Ahrefs.
A list of results pulled for a query or search. These results are pulled from an index archived by a crawler. A SERP can change many times a day, as modifications to search engine algorithms happen regularly, along with ranking changes due to new content and ranking signals.
Features that modify the typical 1-10 spot display of a search engine results page.
These features constantly change but are traditionally seen as rich results like featured snippets or carousels, though they can also be as simple as paid ads. Recently, due to the COVID-19 pandemic, Google has included pagination for terms that are more ambiguous. For example, if you search “COVID-19” you will get different tabs you can flip through, like Overview, Statistics, Health Info, Testing, Coping, and News. These types of changes could see widespread implementation for more ambiguous, informational terms.
Site age refers to the amount of time a domain has existed. Site age is a factor in SEO ranking only as far as content is concerned. If a site has been around for a long time, it is assumed that it is constantly being worked on and producing new content. If that is not the case, site age means nothing, as no credibility is being built. Just because your site is older does not mean it will rank better.
A sitemap is a list of pages on a website.
There are two main types of sitemaps that SEO deals with. One is a structured listing intended for web crawlers and search engines that tells them what to do and where to go. The other is a user-visible listing that helps with navigation of your site by visitors. This is typically done in a hierarchical fashion for ease of access.
When a sitemap is intended for a crawler it is typically called sitemap.xml and helps the bot find pages that aren’t accessible through links.
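A minimal XML sitemap might look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/seo-glossary</loc>
    <lastmod>2020-06-15</lastmod>
  </url>
</urlset>
```

Each url entry lists a page the crawler should know about, and the optional lastmod date hints at when the page last changed.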
Social media marketing is the use of social media platforms to connect with your audience to build your brand, increase sales, and drive website traffic.
The major social media platforms that marketers use (at the moment) are Facebook, Instagram, Twitter, LinkedIn, Pinterest, YouTube, TikTok, WhatsApp and Snapchat.
Social media marketing can indirectly help your SEO in terms of visibility for your content. If someone shares and likes your content, more people will visit your site, and search engines view these visits favorably for ranking purposes over time.
Spam, in the SEO sense of the word, is the deliberate manipulation of search engine indexes and algorithms to better your position on SERPs. This is done in many ways such as keyword stuffing, poor link building practices and duplicate content.
The history of quality SEO has evolved significantly over the years. Many tactics that were once accepted as common, good practice have fallen out of favor with search engines and, in turn, the SEO community.
Structured data is another term for microdata and describes entities and their relationships through a standardized list of attributes (aka Schema.org). This data is organized, or structured, to follow a format or rule set that is readable by crawlers. You often see this in SERPs as rich results, star reviews (in the past), or breadcrumbs.
A subdomain is a separate part of a main domain that typically illustrates different parts of an organization. For example, blog.boostability.com indicates that blog content is available under the domain boostability, with com being the top-level domain, or TLD.
Not all sites have subdomains, and their use is a hotly debated topic in the SEO realm. Most SEO experts hold the opinion that subfolders serve the same purpose, require less overhead to manage, and are therefore preferable to subdomains.
A metric used to determine how long a user stays on a page of a website. Longer time on page indicates that the content being viewed is satisfying the visitor’s intent. Short times on page can indicate the content is too thin or no longer relevant for users. This metric is also used by search engines when determining SERP rankings.
Content that inadequately explains a concept or a proposed idea. Thin content may not be long enough, or lack important details. With the general trend towards actionable content, your site may not be indexed if your content does not provide answers to users or encourage them to move forward in their customer journey. Thin content can sometimes be identified by higher than normal bounce rates.
White label marketing is the act of providing a marketing service or product to a third-party business to brand under their own logo and branding, not yours (despite the fact that you’re delivering on the actual work). The third-party business will sell the marketing product or service as their own.
This is a common practice in various industries and can be a very beneficial partnership! Learn more about white label marketing here.
Similar to white label marketing mentioned above, white label SEO services is the act of providing service to a third-party business to then market under their own branding. It’s typically a partnership between the white label SEO provider and another agency that will then sell the service under their own branding to their clients.
It’s a common and extremely beneficial partnership because it allows the third-party agency to scale their business with minimal resources and bandwidth, reduces risk in offering another service that they may not be experts in, and provides a new revenue stream. And for the white label SEO provider, it’s recurring revenue!