SEO is the process of tuning our website content so that it is optimized for both visitors and search engine bots. This increases the chances of our website appearing in search engine results pages in response to user queries. With millions of websites and trillions of webpages, ranking our content at the top position on search engines is not an easy task. Every blogger or website owner must understand the basic SEO concepts before diving into the deep pool of SEO. This blog post aims at explaining the most common SEO terms in simple language so that even newbie bloggers can understand the concepts.
When we provide a link to another page on our website, we could provide the complete URL of the target webpage. For example, consider the hyperlink tag below.
<a href="https://www.blogtriangle.com/about-us">About BlogTriangle</a>
This is an absolute URL as it contains the entire address of the webpage.
Algorithms are programs created by Google and other search engines that decide how a page is ranked. Google, for example, considers more than 200 factors when ranking content. It runs several algorithms, such as Penguin, Panda, and Hummingbird, that focus on specialized tasks. For example, the Panda algorithm is designed to identify duplicate content, plagiarism, and keyword stuffing.
ALT text is an HTML attribute that we add to images to provide a text alternative for search engines, since bots cannot identify or interpret images on their own.
They depend on this alt text to identify the content of the image. An apt ALT text helps the search engines better understand and correlate our content and images. This might also help improve our search engine ranking.
This is the clickable text in a hyperlink. For example, if I create a link to a blog post mentioning top SEO mistakes then my hyperlink HTML would be as follows.
<a href="https://www.blogtriangle.com/top-7-seo-mistakes-bloggers-should-avoid/">Common SEO Mistakes</a>
When we see this link in the browser, the text displayed is Common SEO Mistakes. From an SEO perspective, it is advisable to write anchor text that is relevant to the page we are linking to.
A backlink is created when one website links to a webpage on another domain. For example, if I link to a Wikipedia page in this blog post, I am providing a backlink for Wikipedia. Backlinks from reputed sites give us an SEO advantage because search engines recognize them as a sign of our authority or reputation. On the other hand, low-quality backlinks adversely affect our SEO, as search engines consider them a signal of spammy or irrelevant content on our site.
Baidu is the most popular search engine in China. It dominates the market of the world’s most populous country, handling more than 76 percent of online search queries.
So if we focus on a Chinese audience, it is essential to sign up for Baidu Webmasters.
Bing, owned by Microsoft, is another popular search engine that claims about 33% of the search engine market share in the USA.
Site owners focusing on an American audience should definitely sign up for the Bing Webmasters portal and optimize their sites as per its guidelines.
8. BLACK HAT SEO
This methodology follows practices that are unethical and do not adhere to search engine guidelines. It might give a temporary boost in search engine rankings but always carries the risk that our website or content may be banned from search results. The AI-powered algorithms of the search engines can detect such fraudulent techniques and will destroy any short-term advantage gained through black hat methods. These techniques are mostly employed by spam sites that are active only for short periods.
Google and other major search engines are continuously crawling the internet for updated information. This traversal of web pages is done by automated programs known as bots, also called web crawlers or spiders. These bots crawl our content based on the rules defined in the robots.txt file.
When a user queries for certain information on Google, the search engine displays a list of results based on the keywords and numerous other factors. If the user clicks on one of the websites on the results page and immediately clicks the back button on the browser to get back to the listings page, it is known as a bounce.
Google measures this bounce rate and uses it as a parameter for ranking. It is logical to assume that a user has not found the required information on a site if he clicks the back button to return to the results page. In other words, if more people leave a website immediately without taking any action, Google assumes that the specific content may not be satisfactory for users. A high bounce rate will surely affect our rankings.
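As a rough sketch of the arithmetic (real measurement is done by analytics tools; the session data below is made up for illustration), bounce rate is simply the share of sessions that ended without any further interaction:

```python
# Hypothetical session log: True = visitor bounced (left immediately),
# False = visitor engaged with the page.
sessions = [True, False, True, True, False, False, True, False, True, False]

def bounce_rate(sessions):
    """Percentage of sessions that bounced."""
    return 100.0 * sum(sessions) / len(sessions)

print(f"Bounce rate: {bounce_rate(sessions):.1f}%")  # 5 of 10 bounced -> 50.0%
```

A lower value is better; fewer bounces generally means the page satisfied the searcher’s intent.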
Breadcrumb is a visual navigation system that shows users their current location within a website. This is especially useful for large websites, where a user might find it difficult to get back to the page from which he started navigating.
The above image depicts a sample breadcrumb on the Amazon website.
A broken link refers to a URL that is no longer active. We might link to several external web pages while creating our content, but we have no guarantee that these external links will be available forever. When Google bots try to index our content, they also check the links within our webpage. If a linked page is no longer available, the bots treat it as a dead link. Having many broken links on our site is an indicator of low-quality content. It is a good practice to audit our external links once in a while and remove all broken links.
There may be different pages with different URLs on our website that are identical or nearly identical in terms of content. As Google is not aware of the relationship between these identical pages, this may be treated as duplicate content, which causes ranking issues. The canonical tag is a way of informing search engines that a specific URL is a copy of another page and that we would like Google to display the original page in SERPs.
<link rel="canonical" href="https://example.com" />
The above line of HTML, placed in the page head, tells search engines that the URL “https://example.com” is the original version of the page on which this tag occurs.
A content management system (CMS) is an application that helps manage the creation, modification, and distribution of digital content. The most popular CMS platforms in the market are WordPress, Joomla, Drupal, etc. For example, this site, BlogTriangle, is managed using the WordPress CMS.
Content cloaking is another malpractice that tricks search engines by hiding content using HTML or CSS. In this process, many div sections containing a variety of keywords related to the topic are added to the page and then hidden from the user. Search engines read this content and assume the page is relevant to all the keywords in the hidden sections. These techniques might have worked long ago, but search engines can now easily identify such hidden content, and using it may result in a penalty.
The main revenue of Google and other similar search engines is from the ads displayed on the search results page. For example, if I search for “Dog food in Seattle” I get the following page.
As we can see, the top few results are marked as advertisements, and there are also a few sponsored links on the right side of the page. When a user clicks on any of these ads or sponsored links, the advertiser has to pay Google a certain amount of money. Cost Per Click (CPC) refers to the actual amount the advertiser pays for each click in these marketing campaigns.
The CPC mentioned above is just one model used in advertising. Another popular methodology determines the amount the advertiser has to pay based on the number of impressions. CPM means cost per thousand; the “M” represents the Roman numeral for 1,000. For example, if the CPM is fixed at $2.00, the advertiser has to pay $2.00 for every 1,000 views of the advertisement.
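Both billing models come down to simple arithmetic. A minimal sketch (the clicks, impressions, and rates below are made-up figures):

```python
def cpc_cost(clicks, cost_per_click):
    # CPC model: the advertiser pays a fixed amount per click
    return clicks * cost_per_click

def cpm_cost(impressions, cost_per_thousand):
    # CPM model: the advertiser pays a fixed amount per 1,000 views
    return impressions / 1000 * cost_per_thousand

print(cpc_cost(150, 0.50))    # 150 clicks at $0.50 each -> 75.0
print(cpm_cost(50000, 2.00))  # 50,000 views at a $2.00 CPM -> 100.0
```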
Search engines are continuously scanning the internet to identify new URLs and content to add to their database. This process of reading our content is known as crawling. This is done by automated programs known as bots. The relevant information from our website is then added to the search engine database. This information is later displayed to the users when a query is made on the search engines.
Cascading Style Sheets (CSS) is the code that provides the look and feel of our website. CSS determines how the different HTML components like buttons, text, links, drop-down boxes, etc. appear on our website. Bloggers need a basic understanding of CSS syntax so that they can easily customize the appearance of their blogs.
Click-through rate is another parameter used by Google to rank a website. CTR is the percentage of users who click our link out of the total number of people who had the opportunity to do so. If we rank higher on the SERPs, our CTR will naturally be higher as well. The first result on a SERP has an average CTR of 31.7%, whereas the 10th result has a CTR of just 3%.
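CTR itself is a simple ratio. A quick sketch (the click and impression counts are invented for illustration):

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage."""
    return 100.0 * clicks / impressions

# Our listing was shown 2,000 times and clicked 120 times
print(f"CTR: {ctr(120, 2000):.1f}%")  # -> CTR: 6.0%
```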
When bots encounter links on a web page, they have two options: follow the link or ignore it. This is determined by checking the HTML rel attribute.
<a href="http://www.example.com">Do Follow Link Sample</a>
If the link does not have the rel attribute specified, it is a do-follow link by default. This means that the search engine will follow this link.
Domain authority (DA) is a score developed by Moz that predicts how well a website will rank on SERPs. It is based on a scale from 1-100, where a higher score indicates a better ability to rank. Moz itself acknowledges that DA is not used by Google directly in determining search engine rankings, but the score can be used as a reference to compare the ranking potential of our site against our competitors.
The domain name is basically our website’s address, which users type in the URL bar of the browser to access our site. Before hosting a site, we need to first purchase a domain name from providers like GoDaddy. A domain name is unique, meaning no two websites can have the same domain name. For example, consider this website: the browser URL of this page reads https://www.blogtriangle.com/seo-glossary-common-seo-terms-explained, and here blogtriangle.com is the domain name. Domain names come with different extensions such as .com, .org, etc., and each of these is considered a different address.
Google is the world’s most popular search engine, handling billions of searches per day. In SEO, the term search engine has almost become synonymous with Google.
Every site owner or blogger must optimize their content as per Google guidelines. This is quite essential for our site to appear on the SERPs. If Google doesn’t find our site then it literally means our site does not exist.
25. GREY HAT SEO
This refers to the use of technically legal but ethically arguable approaches for improving our site rankings. The borderline between black hat and grey hat is really thin, so we should be very careful while implementing such techniques, as they may one day be reclassified as black hat. One example of this strategy is submitting our website to web directories to gain backlinks.
HTML allows us to define heading tags from H1 to H6.
These tags are ordered by importance. For example, the H1 tag is the most important tag on a page and is usually its main heading. Similarly, tags from H2 to H6 are used to organize the content in a structured manner.
These tags are also significant from an SEO perspective as search engines find relevant information about the content from these tags.
HyperText Markup Language describes the structure of a webpage. The different components on a page, such as text, paragraphs, buttons, radio buttons, and drop-downs, are HTML components. It is basically a markup language that browsers understand in order to render the content of a webpage.
HTTP stands for Hypertext Transfer Protocol. It is the system that defines the communication between our computer and the server. When we enter a URL in the browser and press Enter, the browser makes a request to the server to download the content, and this communication is governed by HTTP. The protocol is not secure, which is why we now have HTTPS.
HTTPS stands for Hypertext Transfer Protocol Secure. It works just like HTTP except that the communication is encrypted and therefore more secure. Google and other search engines prefer websites with HTTPS in place so that user information is better protected. We can buy an SSL certificate for our website to enable HTTPS communication.
A clickable link on a webpage is known as a hyperlink. When we click this link it will take us to another webpage on the same or a different domain. Hyperlinks mostly appear blue in color unless custom CSS styles are specified.
A hyperlink is created using HTML syntax, for example:
<a href="https://www.blogtriangle.com/">Visit BlogTriangle</a>
The above code will display the hyperlink as the clickable text Visit BlogTriangle.
Images play a crucial role in improving our search engine rankings and hence deserve a special mention in SEO. The main aspects of image SEO are image optimization, ALT text creation, and choosing the right resolution and format.
This is one of the most common SEO terminologies used by experts. As explained before, during crawling the search engines fetch relevant information from our website and add it to their database. This process of adding our site content and URLs to the search engine database is known as indexing. Google’s index is so huge that it contains trillions of web pages, billions of images, and millions of books from all over the internet.
This is the most popular term in the SEO industry. Keywords are terms within our content that help search engines identify the subject matter. Search engines receive input from users in the form of words or sentences, and these input phrases are matched against our content when delivering results. Therefore, keywords are of ultimate importance in SEO.
Keyword density is the percentage of times a particular keyword appears on a page compared to the total number of words on that page. In the earlier days of search engines, keyword density was considered the most important factor for determining content relevance, but search engines have advanced far beyond this stage and now consider many external as well as internal factors when ranking a page. There is no ideal number for keyword density; keywords should appear naturally in the content flow rather than according to some predefined rule.
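The calculation itself is straightforward. A naive sketch for a single-word keyword (real tools also handle phrases, stemming, and punctuation, which this does not):

```python
def keyword_density(text, keyword):
    """Percentage of words in text that exactly match the keyword (single words only)."""
    words = text.lower().split()
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "SEO basics explained: learn SEO the easy way with our SEO glossary"
print(keyword_density(sample, "SEO"))  # 3 of 12 words -> 25.0
```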
Keyword stuffing is the malpractice of trying to trick search engines by repeatedly adding keywords to the content to increase keyword density. Unfortunately, many people follow this practice under the assumption that search engines will rank their content higher due to the high percentage of keywords. But search engines have become smarter and easily recognize such techniques. If search engines suspect our content of keyword stuffing, we are definitely going to be penalized.
The brief description that follows the website link and header text on SERPs is known as the meta description. For example, if I search for Madison Square Garden, the following snippet from Britannica shows a brief description of this place.
This snippet gives a general idea of the webpage content and helps users decide whether to navigate to the search result or not. The meta description is a key factor in improving CTR.
If the encountered link has a rel="nofollow" attribute, as shown below, the bots will generally ignore the link and not crawl the target page.
<a href="http://www.example.com" rel="nofollow">No Follow Link Sample</a>
This simply means that if we get nofollow links from other sites it is not going to give us any SEO advantage.
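A minimal sketch of how a crawler could separate follow from nofollow links, using Python’s standard html.parser (real search engine bots are of course far more sophisticated):

```python
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Collects each anchor's href and whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower()
            self.links.append((attrs.get("href"), "nofollow" in rel.split()))

parser = LinkClassifier()
parser.feed('<a href="http://www.example.com">Do Follow</a>'
            '<a href="http://www.example.com" rel="nofollow">No Follow</a>')
for href, nofollow in parser.links:
    print(href, "-> ignore" if nofollow else "-> follow")
```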
This refers to SEO-related activities that happen outside our website, such as link building, domain authority, social media, and local SEO. It is the most difficult and time-consuming part of SEO and requires a well-defined strategy.
This refers to the set of measures or actions we can implement directly on our website to improve our search engine rankings. Some examples of on-page SEO elements are title tag optimization, meta description optimization, and changing ALT tags. On-page SEO mostly deals with optimizing our site and content, including speed and performance.
Page Authority (PA) is another metric developed by Moz that indicates how well a page will rank on SERPs. Just like DA, the Page Authority score ranges from 1-100. The difference between DA and PA is that DA applies to the entire website, whereas PA applies to a specific page.
The amount of time required for our website to load in the browser once we enter the URL is basically our page speed. Google considers this one of its ranking factors. Google’s research indicates that as page load time goes from 1 second to 10 seconds, the probability of a mobile visitor bouncing increases by 123%.
Page speed depends on a variety of factors such as our hosting plan, page size, number of images, and number of HTTP requests. Google PageSpeed Insights is a free tool by Google that analyzes our website and provides a detailed report on our page speed.
A page view is counted each time a page of our website is loaded in the browser. If we reload the page, it is counted as a separate view.
Page view gives us a basic idea of the popularity of a web page.
We do not have to provide the entire address when using relative URLs. For example, consider the hyperlink tag below.
<a href="/about-us">About BlogTriangle</a>
This is a relative URL. It assumes that the page we link to is on the same site, so the browser derives the initial part of the address from the current URL. SEO experts have varying opinions about the choice between absolute and relative URLs for interlinking.
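The way a browser resolves a relative URL against the current page can be reproduced with Python’s standard urllib.parse (the page addresses below are just examples):

```python
from urllib.parse import urljoin

# The browser combines the current page's address with the relative URL
current_page = "https://www.blogtriangle.com/blog/some-post"
print(urljoin(current_page, "/about-us"))
# -> https://www.blogtriangle.com/about-us
```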
Every single result that Google displays on the results page is known as a snippet.
For example, if I search for “Steak Recipes” I would get the following result.
This is an ordinary snippet. There is another category of snippet that is more informative and visually appealing.
This snippet has the rating, preparation time, calorie information, etc. As it is richer in information than the previous one, it is called a rich snippet. A rich snippet obviously improves CTR.
Robots.txt is a text file within our site that tells search engines which pages to crawl and which to exclude. There are definite guidelines from Google for creating a robots.txt file. Any misconfiguration of this file could block search engines from indexing our content, so we should handle it carefully.
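We can preview how crawlers will interpret our rules with Python’s standard urllib.robotparser (the rules and URLs below are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks the /private/ section for all bots
rules = """User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
```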
This is another common term that is popular in the SEO world. Search Engine Results Pages are pages displayed by search engines in response to a user query. The results are arranged in descending order of relevance. This page may also contain sponsored results and advertisements.
The above image is an example of SERPs in response to the query “Best Restaurants in Dublin”. The ultimate goal of any SEO strategy is to obtain the top position on SERPs.
47. SITE MAP XML
An XML sitemap is a page within our website that contains all the URLs on our site. It acts as a roadmap of our site for search engines. XML sitemaps do not directly improve our search rankings, but they help Google and other search engines crawl our content efficiently.
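The sitemap format itself is plain XML. A minimal sketch that builds one with Python’s standard library (the URLs are examples, and real sitemaps often include extra fields such as lastmod):

```python
import xml.etree.ElementTree as ET

# Build a minimal sitemap listing two example URLs
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in ("https://www.blogtriangle.com/",
            "https://www.blogtriangle.com/about-us"):
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc

print(ET.tostring(urlset, encoding="unicode"))
```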
48. WHITE HAT SEO
This refers to the use of strategies, techniques, and tactics to optimize our site to improve our search engine ranking. This methodology follows ethical and fair practices that are completely approved by search engine giants like Google. These techniques are time-consuming but the results are also long-lasting.
Yandex is a popular search engine that holds more than 55% of the search market share in Russia.
It is therefore quite obvious that if our site focuses on a Russian audience, it must be optimized as per Yandex guidelines.
When we enter a URL in the browser, we might land on a page with a completely different URL. This process of taking users or search engines to a different URL is known as a redirect. A redirect can be specified with a status code such as 301. A 301 redirect is considered a permanent redirect, so search engines assume that the original URL no longer exists.
A 302 redirect is considered a temporary redirect. Search engines do not discard the original URL and retain it in their index. Redirects should therefore be chosen based on purpose. From a user’s perspective, 301 and 302 redirects behave the same.
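The two redirect types correspond to standard HTTP status codes, which Python’s http module already knows. A tiny sketch of choosing between them (the helper function is hypothetical, not part of any library):

```python
from http import HTTPStatus

def redirect_status(permanent):
    """Return the appropriate HTTP status code for a redirect."""
    return HTTPStatus.MOVED_PERMANENTLY if permanent else HTTPStatus.FOUND

print(int(redirect_status(True)))   # 301: search engines drop the old URL
print(int(redirect_status(False)))  # 302: search engines keep the old URL indexed
```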
Note: I have added to this glossary the 51 common SEO terms that I thought were most important. The dictionary of SEO technical terms is so broad that it is not possible to cover the entire list in a single blog post. I will keep adding more terms and definitions to this list.