Technical optimization is a job for professionals, covering everything from layout to full website programming.

Metrics and analytics

Be sure to install tools from Google and Yandex!

Be sure to connect them in pairs:

  • Google: Webmasters + Analytics
  • Yandex: Metrica + Webmasters
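For reference, the standard Google Analytics 4 tag that goes into the page `<head>` looks like this; `G-XXXXXXX` is a placeholder for your own measurement ID, and Yandex Metrica generates a similar counter snippet in its interface:

```html
<!-- Google Analytics 4 tag; replace G-XXXXXXX with your measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>
```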

Technical website optimization is the software configuration of a resource so that search engine robots can scan it completely and accurately, which improves indexing and search ranking.

Sitemap.xml

Don't add pages manually: generate the sitemap automatically, and structure it according to the XML Schema file format of the sitemap protocol.
A sitemap is needed for crawlers to index pages that don't have links from the homepage or are poorly interlinked.
Define the structure virtually: in a sitemap, you can clarify for search engines which sections belong to what. Indispensable for a catalog.
Typically found at domain/sitemap.xml, e.g. https://topseoboss.com/sitemap.xml
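A minimal sitemap in the standard sitemap-protocol format looks like this; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://topseoboss.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://topseoboss.com/blog/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

The `lastmod`, `changefreq`, and `priority` fields are optional; `loc` is required for every URL.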

Eliminating duplicate pages

Make clean URLs.
Remove the trailing slash from URLs and set an automatic redirect to the version without it.
Add an SSL certificate, switch to the https protocol, and redirect http to https.
Do without www, /index.html, /index.php, /index, and /main: there should be a 301 redirect from all of them.
And, without fail, specify the canonical URL on each page.
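The redirect rules above can be sketched as nginx configuration; this is only an illustration, assuming an nginx-served site at topseoboss.com, and the certificate paths and URL patterns must be adapted to your engine:

```nginx
# Redirect http and www to the canonical https://topseoboss.com
server {
    listen 80;
    server_name topseoboss.com www.topseoboss.com;
    return 301 https://topseoboss.com$request_uri;
}

server {
    listen 443 ssl;
    server_name www.topseoboss.com;
    # ssl_certificate /path/to/cert.pem;
    # ssl_certificate_key /path/to/key.pem;
    return 301 https://topseoboss.com$request_uri;
}

server {
    listen 443 ssl;
    server_name topseoboss.com;
    # ssl_certificate /path/to/cert.pem;
    # ssl_certificate_key /path/to/key.pem;

    # Strip the trailing slash (except for the root URL)
    rewrite ^/(.+)/$ /$1 permanent;

    # 301 from index aliases to the homepage
    location = /index.html { return 301 /; }
    location = /index.php  { return 301 /; }
    location = /index      { return 301 /; }
    location = /main       { return 301 /; }
}
```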

Robots.txt

User-agent: * means all robots, i.e. these are the general directives.
Disallow: /folder blocks access to a directory, which is logical for the technical folders of the engine. The admin panel /admin and the site search /search also go here.
We also prohibit technical query strings like ?from=webmaster, which can be appended during transitions from external services.
For User-Agent: Yandex, the directive for clearing parameters from Yandex.Direct is Clean-Param: yandex-source&utm_source&utm_medium&utm_campaign.
For User-Agent: Googlebot, the directives are the same as the general ones; only the Disallow section is closed explicitly. See the live example:
https://topseoboss.com/robots.txt
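Putting the points above together, a robots.txt along these lines could be used; the blocked folder names here are illustrative, not a prescription:

```
User-agent: *
Disallow: /admin
Disallow: /search
Disallow: /*?from=

User-agent: Yandex
Disallow: /admin
Disallow: /search
Disallow: /*?from=
Clean-Param: yandex-source&utm_source&utm_medium&utm_campaign

User-agent: Googlebot
Disallow: /admin
Disallow: /search
Disallow: /*?from=

Sitemap: https://topseoboss.com/sitemap.xml
```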

301, 302, 404

301 redirects are permanent and 302 redirects are temporary.
Error 404 means a non-existent page.
All known deleted pages must be redirected to the homepage or the previous level of the hierarchy with a 301 redirect. If a page is expected to reappear, use a temporary 302 redirect instead: Googlebot will keep crawling the redirect and waiting for the page to return. This is useful when your geolocation rankings change, for example if you are an artist performing the same show in different countries.
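To verify which status code a URL actually returns, you can request it without following redirects. A minimal Python sketch using only the standard library; the example URL at the bottom is a placeholder:

```python
import urllib.request
from urllib.error import HTTPError

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw 301/302 is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # makes urllib raise HTTPError carrying the redirect code

def check_url(url):
    """Return (status_code, Location header or None) for a single request."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, None
    except HTTPError as e:
        # Redirects and 404s surface here; the Location header shows the target.
        return e.code, e.headers.get("Location")

# Example (placeholder URL):
# check_url("https://topseoboss.com/old-page")
```

Running this against every known deleted URL quickly shows which pages still answer 404 instead of the intended 301.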

GET parameters

These are configured in robots.txt. For Google, simply specifying rel=canonical is sufficient. For Yandex, there are two other options: adding the Clean-param or Disallow directives. For more details, see the official source: How to find and disable unnecessary GET parameters.
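For the Google route, the canonical hint is a single tag in the page head; the URL here is a placeholder:

```html
<link rel="canonical" href="https://topseoboss.com/catalog/page/">
```

Every variant of the page with GET parameters attached should carry the same canonical URL, so the duplicates consolidate onto one address.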

Micro-markups

Ideally, you should be able to edit the markup page by page.
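A common form of micro-markup is schema.org JSON-LD embedded in the page. A minimal sketch for an article; all values are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical website optimization",
  "datePublished": "2024-01-01",
  "author": { "@type": "Person", "name": "Author Name" }
}
</script>
```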

Check for errors and fix them