We come across many terms in day-to-day SEO work, and each one refers to a different part of the SEO process. In the list I prepared below, you can read the SEO terms you will encounter during an SEO project and what they mean. Here is a 2020 SEO glossary with terms and their meanings:

Glossary of SEO terms

SEO – Search Engine Optimization

SEO stands for "Search Engine Optimization". It covers the optimization and improvement work done to help a website rank better in search engines.

On-page SEO

Considered the first stage of search engine optimization, on-page SEO is the optimization work carried out entirely within the site itself, targeting Google's ranking factors.

Off-page SEO

Considered the second stage of search engine optimization, off-page SEO covers the work done outside the site to increase its authority and brand awareness. The first rule of this work is to get backlinks from quality, active, up-to-date sites that do not violate Google's policies.

Backlink

The backlink is one of the most frequently used SEO terms. A backlink, used in off-page SEO, is a link from one website to another through text, images or other elements. Backlinks are an important way to strengthen target keywords (anchor text) or queries in search results.

Hacklink – Black hat SEO

A hacklink is a type of backlink obtained illegally; when Google detects it, it can lead to serious penalties. It works by exploiting a vulnerability on a site to gain access to the system and then placing a backlink to the desired site without the site owner's knowledge or permission.

Sitemap.xml (Sitemap)

A sitemap is a document that lists all the links on a site (example: www.domain.com/sitemap.xml). Pages, videos, images and similar URLs are all included in the sitemap, and it helps search engine spiders such as Googlebot index all the pages and posts on the site. You can also use a sitemap to provide information about specific types of content on your site, such as videos and images. If your site runs on WordPress, you don't need to install an SEO plugin (Yoast SEO, RankMath, All in One SEO, etc.) just to create a sitemap. Otherwise, you can create one by entering your domain on this site. ==> https://www.xml-sitemaps.com/
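To make the structure concrete, here is a minimal sketch of how a basic sitemap.xml could be generated with Python's standard library; the page URLs are placeholders rather than real addresses.

```python
# Minimal sketch: building a basic sitemap.xml with Python's standard library.
# The URLs below are placeholders for your own pages.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/seo-terms",
    "https://www.example.com/contact",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write the file that would live at https://www.example.com/sitemap.xml
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```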

Robots.txt

Robots.txt is a special file that tells the bots analyzing your site which folders they may and may not access. Bots read the instructions in this file and crawl the site accordingly. For example, the "Disallow:" directive in this file specifies a directory that bots are not allowed to access.
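As a small sketch, the snippet below feeds an example robots.txt (the rules and URLs are made up) to Python's built-in robotparser and shows how a bot would interpret the Disallow directives.

```python
# Sketch: how a well-behaved bot reads Disallow rules from robots.txt.
# The rules and URLs are illustrative examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://www.example.com/blog/seo-terms"))  # True
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/login"))     # False
```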

Title

The title is the first element that tells what a page is about. By describing the content of your page, it helps both Googlebot and users. Keyword usage is very important in this area, and title adjustments are an important part of on-page SEO for improving rankings for your target keyword or query. (A short sketch covering both the title and the meta description follows the next entry.)

Meta-Description

The meta description summarizes what the content on the page is about. It is the snippet that appears in search results, so it should tell users what they will find on the page and be written in a way that encourages them to click through to your site. Using your keywords in this area is also important.
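To tie the last two entries together, here is a small sketch that pulls the title and meta description out of a page's HTML with Python's built-in html.parser; the HTML snippet is an invented example.

```python
# Sketch: extracting the <title> and meta description from a page's HTML,
# the two head elements discussed above. The HTML below is an example.
from html.parser import HTMLParser

class HeadTagParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = """<html><head>
<title>Web Developer Services | Example</title>
<meta name="description" content="Professional web development services.">
</head><body></body></html>"""

parser = HeadTagParser()
parser.feed(html)
print(parser.title)        # Web Developer Services | Example
print(parser.description)  # Professional web development services.
```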

URL

The URL is a small but long-recognized ranking factor in SEO. Including your target keyword in your URL is a clear advantage. For example, if the keyword for the page you are optimizing is "Web Developer", the page's URL should ideally be "domain.com/web-developer".
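As a quick sketch of the idea, this helper turns a target keyword into a URL-friendly slug; the keyword and domain are just examples.

```python
# Sketch: turning a target keyword into an SEO-friendly URL slug,
# e.g. "Web Developer" -> "domain.com/web-developer".
import re

def slugify(keyword: str) -> str:
    slug = keyword.lower().strip()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # replace spaces/punctuation with "-"
    return slug.strip("-")

print("domain.com/" + slugify("Web Developer"))  # domain.com/web-developer
```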

Canonical URL

Another common SEO term is the canonical URL. A canonical URL is used when two pages on a website have very similar or identical content. It prevents pages with the same or similar content from being indexed separately; only the canonical (preferred) URL is indexed.
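A brief sketch of the idea: several URL variants of the same page all declare one canonical address in their head section, so only that address is indexed. The URLs are illustrative.

```python
# Sketch: several URL variants of one page pointing to a single canonical URL.
CANONICAL = "https://www.example.com/shoes/"

variants = [
    "https://www.example.com/shoes/",
    "https://www.example.com/shoes/?sort=price",
    "https://www.example.com/shoes/?utm_source=newsletter",
]

# Each variant would carry the same tag in its <head>:
print(f'<link rel="canonical" href="{CANONICAL}">')

for url in variants:
    status = "canonical, gets indexed" if url == CANONICAL else "duplicate, consolidated"
    print(f"{url} -> {status}")
```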

Nofollow link

Nofollow is a directive telling Googlebot not to follow a link. When linking to a site, the rel="nofollow" attribute in the HTML tells bots not to follow that link. In effect, Googlebot is told: "I'm linking to this URL, but don't follow it or pass my site's value on to it." (A sketch covering both nofollow and dofollow follows the next entry.)

Dofollow link

Dofollow means the opposite: the links leaving our site are trusted, may be followed by Googlebot, and the linked site can benefit from our site's authority. In practice there is no explicit "dofollow" attribute in HTML; any link without rel="nofollow" is followed by default, which tells Googlebot: "I'm linking to this URL and I trust it, so it can benefit from my site's value."
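To make the distinction between the two link types concrete, here is a deliberately simplified sketch; the anchor tags are invented, and a real crawler would parse the HTML properly rather than checking a substring.

```python
# Sketch covering the two entries above: a link is "nofollow" when its rel
# attribute says so; a plain link without rel="nofollow" is followed by default.
links = [
    '<a href="https://example.com/partner" rel="nofollow">Partner</a>',
    '<a href="https://example.com/trusted-source">Trusted source</a>',
]

def is_nofollow(anchor_html: str) -> bool:
    # Very small check for illustration only.
    return 'rel="nofollow"' in anchor_html

for a in links:
    verdict = "nofollow (no value passed)" if is_nofollow(a) else "followed (passes value)"
    print(a, "->", verdict)
```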

Redirect

Many techniques are used in SEO work, but two types of redirect are by far the most common: 301 and 302. A 301 redirect is a permanent redirect from one URL to another. For example, it hurts us if a URL we no longer use on our site ends up returning a 404. For this reason, after removing a page we no longer need, we should 301-redirect its URL to a related page or one with similar content.

A 302 redirect, on the other hand, temporarily redirects one URL to another. It is typically used for temporary situations, such as short-term changes to a page's address.
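As an illustration, the sketch below makes a single HTTP request without following redirects and reports whether the response is a 301 or a 302; the host and path are placeholders, and the snippet needs network access to run.

```python
# Sketch: inspecting the HTTP status code to tell a 301 (permanent) from a
# 302 (temporary) redirect. http.client does not follow redirects on its own.
import http.client

def check_redirect(host: str, path: str) -> None:
    conn = http.client.HTTPSConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    if resp.status in (301, 302):
        kind = "permanent" if resp.status == 301 else "temporary"
        print(f"{path} -> {resp.status} ({kind} redirect) to {resp.getheader('Location')}")
    else:
        print(f"{path} -> {resp.status}")
    conn.close()

check_redirect("www.example.com", "/old-page")
```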

Broken link

A broken link is a link that exists but no longer works. If a page we link to from our site is not working, has been deleted or cannot be reached for any other reason, that link is broken and should be updated or removed. You can find broken links on your site using the Screaming Frog SEO tool. You can also find your broken links with this site. ==> https://www.brokenlinkcheck.com/
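In the same spirit as those tools, here is a small sketch that checks a list of URLs and flags the ones that return an error or cannot be reached; the URLs are placeholders, and the snippet needs network access.

```python
# Sketch: flagging broken links (4xx/5xx or unreachable) in a list of URLs.
import urllib.request
import urllib.error

def find_broken(urls):
    broken = []
    for url in urls:
        try:
            req = urllib.request.Request(url, method="HEAD")
            urllib.request.urlopen(req, timeout=10)
        except urllib.error.HTTPError as e:    # e.g. 404 Not Found
            broken.append((url, e.code))
        except urllib.error.URLError as e:     # DNS failure, timeout, etc.
            broken.append((url, str(e.reason)))
    return broken

links = ["https://www.example.com/", "https://www.example.com/deleted-page"]
print(find_broken(links))
```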

Keyword

A keyword, the concept at the heart of SEO research, is the query a user types into a search engine before arriving at your site. In other words, whatever a user searched for on Google or another search engine before clicking through to your site is a keyword. SEO experts and site owners do keyword research to find the keywords that will bring them the quality traffic they need.

Keyword Density

Keyword density shows how often the keyword we are targeting is used in the content on a page. How many times a keyword appears in the content, the title or anywhere else on the page matters. Some websites ignore keyword density entirely and stuff their pages with keywords; Google treats such pages as spam and penalizes them.
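A rough sketch of the calculation, assuming density is simply the number of keyword occurrences divided by the total word count:

```python
# Sketch: a rough keyword-density calculation (occurrences / total words * 100).
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    if not words:
        return 0.0
    hits = text.lower().count(keyword.lower())
    return 100 * hits / len(words)

content = "SEO terms explained: this SEO glossary covers common SEO terms."
print(f"{keyword_density(content, 'SEO'):.1f}% keyword density")
```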

Googlebot

Googlebot is the search giant Google's bot that crawls the entire web. If site owners do not restrict access, it analyzes every page it sees and adds it to Google's database. Googlebot does not only analyze a single page; it also follows and analyzes the other pages linked from it. Its reach therefore expands enormously, and the data it stores grows accordingly.

Crawl Budget

Crawl budget is how many of a site's pages Googlebot views in a day. It covers how many times bots visit a page, how often and when. Site speed, the number and quality of links to your site, site size and a few other signals determine how often Googlebot visits.

Blocking Googlebot from unnecessary and weak content pages is important for optimizing the crawl budget. Googlebot visits only a certain number of pages per day, and that number is not fixed. To use the crawl budget efficiently, we should therefore point Googlebot at the pages we value and keep it away from unnecessary ones: pages with no SEO value can be marked "noindex" and blocked from Googlebot in the robots.txt file.

Index

Indexing is what happens after search engines have analyzed a site: the crawled pages are added to the search engine's database and listed in its index. Where a page then appears in the search results depends on a range of ranking criteria.

NoIndex

The "noindex" tag, used to instruct search engine bots, tells bots not to index the page with this tag. Search engine bots that see the noindex command while analyzing the page will analyze the page, but will not show it in the directory and search results.

Spam

Spam, in the search engine sense, refers to improper actions taken to gain rankings by manipulating search engine bots instead of informing visitors; each search engine evaluates it against its own criteria. For example, repeating a keyword over and over on a page to gain an extra ranking advantage is spam.

Anchor Text

Every link from one site to another contains text. In backlinks, this clickable text is called the "anchor text".

.htaccess file

A .htaccess file is a document that lets you change a website's configuration at the directory level using commands interpreted by the server. The name stands for "Hypertext Access", and the file is used by Apache and many other compatible web servers.

SERP (Search Engine Results Page)

The SERP is the results page listing the sites that appear after you enter a query into a search engine. Where a site appears in this list depends on SEO, and SEO work must be done properly to reach the top positions. Bots analyze your site and rank it higher if they find it relevant enough to be listed at the top of the results page.

Domain Authority

Domain authority is a score that reflects how strong your website's domain name is on the internet. Calculated by SEO tools from hundreds of search-engine-related criteria, it can also be thought of as brand awareness. Sites with high domain authority have an advantage over their competitors in SEO. To increase a domain's authority, you need to focus on factors such as backlinks, original and quality content, social media signals and giving the best answer to user queries.

Page Authority

Page authority is a different metric from domain authority and varies from page to page on your site; each page has its own value. Backlinks, time on page, content quality, originality and links from social media differ for every page, which is why page authority is different for each of your pages.

Google Sandbox

The Google Sandbox is a filtering system applied to sites that violate Google's policies or use manipulative SEO. Google, which aims to give users the best quality content, sandboxes sites that fail to meet these standards or generate spam. A website that is sandboxed and hit with a filter penalty no longer appears for its targeted keywords or queries.

Page speed

Page speed has officially been recognized by Google as an SEO factor since July 2018. For Google, which aims to give site visitors the fastest, highest-quality and most relevant answer, page speed has become especially important in recent years: widespread mobile use in particular has made speed essential, which is why page speed now matters so much for SEO. You can use this link to measure page speed. ==> https://developers.google.com/speed/pagespeed/insights/

Algorithm

An algorithm is the ranking model a search engine uses to index and order pages according to certain criteria. Search engines regularly update their algorithms to deliver the best content to users, aiming to surface genuinely high-quality results. There have been many such algorithm updates, including Panda, Penguin, Hummingbird, Pigeon, Mobile, RankBrain, Possum, Fred and Medic.

Breadcrumb

A breadcrumb is the navigation trail on a site: it shows the path taken to reach any given page, for example a hierarchy like "Home > SEO > SEO service". This trail is also exposed to search engine bots such as Googlebot, and breadcrumbs can appear hierarchically on search engine results pages.
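One common way to hand this trail to search engines is BreadcrumbList structured data (JSON-LD); the sketch below builds it for the "Home > SEO > SEO service" example, with placeholder URLs.

```python
# Sketch: a breadcrumb trail expressed as schema.org BreadcrumbList JSON-LD.
import json

crumbs = [
    ("Home", "https://www.example.com/"),
    ("SEO", "https://www.example.com/seo/"),
    ("SEO service", "https://www.example.com/seo/seo-service/"),
]

breadcrumb_list = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(crumbs, start=1)
    ],
}

print(json.dumps(breadcrumb_list, indent=2))
```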

Content (Blog, Article)

Content is a key factor in SEO. True to its purpose and its service, Google wants to list the best, most accurate and highest-quality content for users, and builds its rankings around that goal. It is therefore very important to have strong, comprehensive and rich content that best answers the query a user has made.

Duplicate Content

Duplicate content is one of the things Google dislikes. Using copied content can hurt SEO and can lead to penalties, so it should be avoided as much as possible; always aim to provide original content to your visitors. You can use tools like Grammarly or Copyscape to find duplicate content.

Internal links

Links from one page to another page or post within the same site are called internal links. An internal linking strategy should be planned accordingly to emphasize the importance and authority of the pages on the site.

External links

External links are outbound links from your site to other sites. Unless there is a specific reason not to, it is good practice to add a nofollow attribute to links that leave your site.
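To round off the last two entries, here is a small sketch that separates internal links from external ones by comparing hostnames; the site host and URLs are invented.

```python
# Sketch: classifying links as internal or external by their hostname.
from urllib.parse import urlparse

SITE_HOST = "www.example.com"

links = [
    "https://www.example.com/blog/seo-terms",
    "https://another-site.com/resource",
    "/contact",  # relative URLs stay on the same site
]

for link in links:
    host = urlparse(link).netloc
    kind = "internal" if host in ("", SITE_HOST) else "external"
    print(f"{link} -> {kind}")
```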

You can check out my other posts about SEO here: https://kananmirza.com/kateqoriya/seo

 
Kanan Mirzayev
Full Stack Web Developer
