
Basic SEO Checklist for Self-Promotion

Site audit. Do it yourself.
Here you can conduct a free SEO audit of your site. Work through this list of the most important points for self-promotion in 2024, make sure each one is applied on your site, and add your own items as needed.


1. Keyword Research Checklist


1.1. Determine the target audience

The target audience is the group of Internet users who may be interested in visiting the site, getting information, buying goods, or ordering the services presented on it.
When defining the target audience, compile a portrait of it:
  • gender, age, marital status
  • place of residence
  • education, employment
  • financial and social status
  • other data
Understanding the portrait of the target visitor lets you see exactly how they search for information, which words they use most often, and which phrases they almost never use.

1.2. Find keywords

People use keywords to find websites, and they do not always search the way the site owner would like. It is therefore important to understand which keywords can lead people to your product, service, or content. If you optimize landing pages for those phrases, they are more likely to be found by the target audience in search results.

1.2.1. Brainstorm

Think about which words or phrases best describe the site or the business it represents.
After generating ideas, group them and rate each one.

1.2.2. Google Search Suggestions

Search engines offer a good way to see the most popular queries for a given keyword.
Use Google's search suggestions to expand your brainstormed phrases.
To automate the collection of search suggestions, you can use tools such as Kparser and Keyword Tool.

1.2.3. Statistics and keyword selection services

Major search engines provide their own statistics services, where you can not only expand your keyword list but also find new options and evaluate the popularity of each query.

1.2.4. Site statistics

Analyze which key phrases people use to find your site or specific pages, using the web analytics services installed on the site (Google Analytics or others).
Site statistics can show far more options than third-party services.

1.2.5. Webmaster Dashboards

Query statistics on impressions and clicks are also provided in Google Search Console. There you can see which phrases your site already has visibility for.

1.2.6. Words used by competitors

Find sites that are similar to yours or represent a similar business. You can find out which keywords these sites use for promotion by:
  • analyzing the HTML code (phrases in titles, meta tags, content) and text links;
  • browsing open visit statistics on those sites;
  • using third-party services such as Serpstat and Ahrefs.
Through the same services, it is also useful to look at the keywords for which competitors order PPC advertising.

1.3. Expand the query core

Got the keywords? Great!
Now is the time to expand the resulting list with word combinations, synonyms, abbreviations, and typos. After all, people do not search with single-word queries alone; they use many different variants!

1.3.1. Single-word and multi-word queries

What additional single-word queries can you add to the list you compiled earlier?
Can you expand the list with multi-word queries?

1.3.2. Synonyms and abbreviations

Add synonyms and abbreviations to the list of keywords you have chosen.
For example, people can search for a laptop in different ways:
  • "notebook"
  • "netbook"
  • "laptop" or simply "lappy"
  • "macbook"
You can also include slang words here.

1.3.3. Word combinations

Use several key phrases in different combinations to get new options.
For example, you can try this scheme:
  • Cheap - … - in Kiev;
  • New - Laptops - Asus;
  • Used - netbooks - Acer;
  • The best - … - from the warehouse.
You can also swap word order.

1.4. Check the key phrases selected through brainstorming and combination. Are users looking for them?

Keywords and search queries are different concepts. Search queries are what people actually type into the search form, and they are what matter for the site, since they can attract traffic.
Check which of the key phrases you selected (through brainstorming or combination) nobody actually searches for, and remove them from the overall list.
The easiest way to check the list is with the Keyword Tool service.

1.5. View the number of search results for selected phrases

A very large number of found documents indirectly indicates how competitive a search query is and, accordingly, how difficult it will be to promote.

1.6. Select promising key phrases

Depending on your goal, select from the resulting query core the most promising key phrases: those that will bring maximum traffic while requiring a reasonable promotion effort.
Over time you can expand the list, but it is better to start where you can achieve an effect at minimal cost.

2. Plan or Optimize Your Website Structure


2.1. Group key phrases

Working with one large list of keywords is inconvenient. Usually keywords are grouped by a number of characteristics, and you then work with each group individually. A list of keyword groups can help you understand which sections the site needs and create an optimal structure.

2.1.1. Group keywords into categories and subcategories

Select the main categories from the resulting query core and assign the relevant keywords to them. Distribute the remaining words into subcategories. Usually the categories are high- and medium-frequency phrases, and the subcategories are medium- and low-frequency phrases.
You can also use so-called clusterizers to group queries, as in the sketch below.
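As a rough illustration of what clusterizers automate, here is a minimal Python sketch that groups phrases by a shared topic word. The phrase list and topic words are hypothetical, and real tools compare search-result overlap rather than just words:

  from collections import defaultdict

  # Hypothetical query core; in practice this comes from your keyword research.
  phrases = [
      "buy a gaming laptop",
      "gaming laptop for a child",
      "cheap kitchen furniture",
      "kitchen furniture Kyiv",
  ]

  # Naive grouping: phrases sharing a topic word land in the same cluster.
  topics = ["laptop", "furniture"]
  clusters = defaultdict(list)
  for phrase in phrases:
      for topic in topics:
          if topic in phrase:
              clusters[topic].append(phrase)
              break

  for topic, group in clusters.items():
      print(topic, "->", group)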

2.1.2. Create a schematic tree of the site

For example, the tree for a furniture store might look like this:
Furniture:
  • Beds
  • Kitchen furniture
  • Children's furniture
  • Furniture for the living room:
    • Coffee tables
    • Wardrobes
    • Closets
    • Chests
  • Dining room
  • Cushioned furniture

2.2. Optimize the structure of the site

The structure of the site means the set of important pages and the links between them. Search engines must have access to the important pages, understand the priority of each document, and be able to quickly crawl and index the necessary content.
You can visualize the site structure with tools like Screaming Frog or Sitebulb.

2.2.1. All navigation links are available in HTML

Search robots do not handle the structure of sites built mainly in JavaScript or Flash well. It is important to expose the structure through plain text links. Disable JavaScript and Flash in your browser (you can also disable styles) and check whether the browser still displays the necessary navigation. A scripted version of this check is sketched below.
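A minimal sketch of the same check without a browser: fetch the page over plain HTTP (no JavaScript executed) and list the links present in the raw HTML. example.com is a placeholder:

  import requests
  from bs4 import BeautifulSoup

  # Fetch the raw HTML, exactly as a non-JavaScript crawler would see it.
  html = requests.get("https://example.com/").text
  soup = BeautifulSoup(html, "html.parser")

  # Links rendered only by JavaScript will be missing from this output.
  for link in soup.find_all("a", href=True):
      print(link["href"], "-", link.get_text(strip=True))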

2.2.2. Any page is available from the main page with a maximum of two clicks

Important pages and sections should be quickly accessible (measured in the number of clicks from the main page).
The more clicks it takes to reach a page, the less significant and less visited it is. "Distant" pages are indexed by search engines with difficulty and re-indexed with long delays.
Pages at nesting levels 2-3 get more frequent visits from search robots and have the highest priority for them.

2.2.3. Important sections of the site are linked from the main page

The link from the main page is the most significant. Priority sections are best placed at the second nesting level.

2.2.4. Use breadcrumbs

Breadcrumbs are navigational links that help search engines better understand the structure of the site and also improve its usability.

2.2.5. Create sitemap.xml

A sitemap is a file where you list information about all the pages of your site and indicate the relationships between them. A minimal example is shown below.
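A minimal sitemap.xml sketch with placeholder URLs (only the <loc> element is required for each entry):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://example.com/</loc>
      <lastmod>2024-01-15</lastmod>
    </url>
    <url>
      <loc>https://example.com/furniture/beds/</loc>
    </url>
  </urlset>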

2.3. Optimize page URLs


2.3.1. Reflect site structure in the page address

Displaying the site structure in page URLs is useful not only for usability but also for correct clustering of the site into sections by search algorithms. It can also affect snippets in search results.

2.3.2. Friendly URLs for internal page addresses

Friendly URLs let you use keywords in page addresses and can increase the click-through rate of links.

2.3.3. Short internal page URLs

Page URLs should be short and concise, reflecting the content just as headings do. This is useful in many situations, including when users send each other a link to a page or post it on social media.
Long links are also often cut off when placed on some sites. Keep them short!

3. Technical SEO Checklist


3.1. Configure HTTPS

A secure protocol makes the site safer for both administrators and users and increases trust in it. HTTPS is also one of the ranking signals.

3.2. Speed up the site

You can check the current site speed with tools such as PageSpeed Insights, GTmetrix, and WebPageTest.

3.2.1. The size of the HTML code does not exceed 100-200 kilobytes

The larger the HTML code, the longer the page takes to load and the more browser resources are needed to render it. Small pages load quickly for both users and search robots.
You can check the size of the HTML code using the developer console built into the browser or through external services such as Sitechecker. You can check the size of pages in bulk using the Screaming Frog crawler.

3.2.2. Page loading speed does not exceed 3-5 seconds

Check how long it takes to load your site on desktop and mobile. Optimally, the page loads fully within 3-5 seconds. Otherwise, look for the bottlenecks that slow the site down and work on them.

3.2.3. There is no unnecessary garbage in the HTML code of the page

When analyzing site content, search engine parsers strip unneeded information from the code, such as comments or inline scripts and styles. To make their job easier, reduce page size, and speed up parsing of the site, remove everything superfluous from the HTML and move large blocks of scripts into separate files, enabling caching for them.

3.2.4. Optimize images

Images are one of the most common bottlenecks preventing maximum loading speed. Use every image optimization technique: reduce file size, pick the optimal format, and defer loading until the image is about to appear on the user's screen (lazy loading).

3.2.5. Configure caching

A cache is temporary storage whose contents are served in response to a user's request instead of the original file, which significantly speeds up page loading. Depending on where it is stored, caching is divided into client-side caching, server-side caching, and caching on the side of search platforms.
Server-side caching is configured with special caching plugins and depends on the CMS. Client-side caching can often be enabled yourself via the .htaccess file or the same caching plugins.

3.2.6. Enable compression

Compressed or archived data is smaller and therefore transfers faster over the network. Check whether text documents on your site are served compressed and, if necessary, enable this feature. A scripted check is sketched below.
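A minimal Python sketch of such a check: request the page with compression allowed and inspect the response headers (example.com is a placeholder):

  import requests

  # requests sends "Accept-Encoding: gzip, deflate" by default;
  # we state it explicitly here for clarity.
  response = requests.get(
      "https://example.com/",
      headers={"Accept-Encoding": "gzip, deflate"},
  )

  # If compression is enabled, the server reports it in this header.
  print(response.headers.get("Content-Encoding", "no compression"))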

3.2.7. Set up AMP pages

Accelerated Mobile Pages (AMP) is a technology used by Google to deliver content to mobile users almost instantly. To implement it, you create separate AMP versions of pages on the site and link them to the main ones. For details, read the documentation.

3.3. Optimize site indexing


3.3.1. Good server uptime

Uptime is the time of continuous server operation. An uptime of 98% means your site will be down for about 7.3 days per year; 99% means about 3.7 days. As a rule, the simpler the hosting, the worse the uptime. Study this issue before purchasing hosting and uploading the site: poor uptime can negatively affect its indexing and ranking.
Set up availability monitoring for your server through Uptime Robot, which sends reports when problems occur.

3.3.2. Sitemap.xml added to the webmaster panel

The sitemap.xml you created must be added to Google Search Console. That way the search engine quickly learns of its existence and can promptly crawl the pages listed in it.

3.3.3. JavaScript does not contain content that is important for indexing

Search engines can execute JavaScript, but they do so not immediately when crawling a site, only in a second stage. If important content is loaded with JavaScript, the search engine will not see it right away, and in some cases may not see it at all. To prevent this, serve all important content to search engines directly in the HTML code. Disable JavaScript in your browser, reload the page, and check whether the content that matters for indexing is still there.

3.3.4. No frames are used

Frames are an obsolete technology. If you still happen to use them, it is better to rework those pages.

3.3.5. Server logs, the admin panel, and subdomains with the test version of the site are closed from indexing

All sections of the site that should not appear in the index must be closed from indexing. It is better to close the admin panel with the robots noindex meta tag, and test subdomains with an entry in robots.txt, as in the illustration below.
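For illustration (the paths and subdomain are placeholders): a noindex meta tag in the <head> of admin pages, and a robots.txt on the test subdomain that blocks everything:

  <meta name="robots" content="noindex">

  # https://test.example.com/robots.txt
  User-agent: *
  Disallow: /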

3.3.6. Pages declare a consistent encoding

The actual encoding of the file, the encoding specified in the document's HTTP headers, and the one declared in the HTML code must all match. Modern sites are recommended to use UTF-8, which supports text in different languages as well as special characters and emoji. The two declarations that must agree are shown below.
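For illustration, the matching pair of declarations:

  Content-Type: text/html; charset=utf-8    (HTTP response header)
  <meta charset="utf-8">                    (inside the HTML <head>)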

3.4. Get rid of duplicates

Duplicate content prevents the site from being properly indexed and ranked in search. In large quantities, it can reduce the authority of the site.

3.4.1. Primary mirror selected (with or without www)

A site mirror is a domain or subdomain that duplicates the site's content and acts as an alias. A site must have only one primary mirror, and all other mirrors must redirect to it with a 301 status code. Otherwise, several versions of the site can end up in the index.
If the main version of your site is without www, check what happens when a user opens it with the www prefix. Check not only that the redirect exists but also its status code: only a 301 redirect is used for merging mirrors. A scripted check is sketched below.
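A minimal Python sketch of the check (example.com is a placeholder; swap the URLs if your primary mirror uses www):

  import requests

  # Request the non-primary mirror without following redirects.
  response = requests.get("https://www.example.com/", allow_redirects=False)

  print(response.status_code)              # expected: 301
  print(response.headers.get("Location"))  # expected: https://example.com/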

3.4.2. The robots.txt file is configured

The robots.txt file contains directives for search crawlers, which can be used to deny access to certain sections of the site. Make sure robots.txt is configured so that everything important is available to the robot for crawling and everything unimportant is closed. A minimal example follows.
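A minimal robots.txt sketch (the paths are placeholders for illustration):

  User-agent: *
  Disallow: /admin/
  Disallow: /search/
  Sitemap: https://example.com/sitemap.xml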

3.4.3. Home page not available at /index.php or /index.html

Requests for pages with /index.php or /index.html should return a 404 status code or redirect to the canonical page, if the code does not use the canonical tag.

3.4.4. Old URLs redirect visitors to new pages

If the site previously had a different address structure, search engines need to be informed of the change. This is done with 301 redirects from old addresses to new ones. If a page has no analog in the new structure, redirect to a very similar page or its parent section, or, as a last resort, return a 404 status code to keep duplicates from appearing.

3.4.5. Rel=canonical is used

If the same content is available on several pages, you can mark one of them as canonical using rel=canonical; it will then stand in for all the available versions and be shown in search results.
An example of specifying a canonical page in the document:
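  <link rel="canonical" href="https://example.com/page/">

(the URL is a placeholder; the tag is placed in the page's <head>)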

3.4.6. Non-existent pages return a 404 error

To prevent unplanned duplicates on the site, all non-existent pages should return a 404 status code. You can verify this by requesting an obviously non-existent URL on the site and checking the server response code, for example via https://httpstatus.io, or with the sketch below.
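A minimal Python check (example.com and the path are placeholders; pick a deliberately nonsensical URL on your own site):

  import requests

  response = requests.get("https://example.com/definitely-not-a-real-page-12345")

  # Should print 404. A 200 here means "soft 404" pages, a duplicate risk.
  print(response.status_code)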

3.4.7. Check the site using Netpeak Spider

You can find additional duplicates on the site using a parser program that crawls all pages and generates reports. Duplicate pages often share identical titles. Use Netpeak Spider for the check; a simplified sketch of the same idea is below.
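A minimal Python sketch that flags URLs sharing the same <title> (the URL list is hypothetical; in practice take it from a crawl or your sitemap):

  import requests
  from bs4 import BeautifulSoup
  from collections import defaultdict

  urls = [
      "https://example.com/",
      "https://example.com/index.html",
      "https://example.com/about/",
  ]

  pages_by_title = defaultdict(list)
  for url in urls:
      html = requests.get(url).text
      title_tag = BeautifulSoup(html, "html.parser").title
      title = title_tag.get_text(strip=True) if title_tag else ""
      pages_by_title[title].append(url)

  # Titles shared by more than one URL point at likely duplicates.
  for title, pages in pages_by_title.items():
      if len(pages) > 1:
          print(f"Possible duplicates ({title!r}): {pages}")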

3.5. Users and robots see the same content

If users and robots see different content on the same page, this can be treated as cloaking, for which the site may suffer. Unforeseen problems also often arise from an incorrectly configured GeoIP redirect. Make sure robots and users see the same content, regardless of region or other client data.

3.6. Reliable hosting is used!


3.6.1. Virus protection is used

Websites are often hacked to plant viruses or spam content. Site authority suffers from this, and search engines try not to send traffic to infected resources. To prevent this, keep the site secure, conduct security audits, and use reliable hosting.

3.6.2. Protection against DDoS attacks is used

A DDoS is an attack on a website with the aim of taking it down. Usually it imitates high traffic from many different sources, which the hosting cannot withstand, so the site stays unavailable for the duration of the attack unless additional protection measures are taken. The more popular the site, the more often competitors attack it. Make sure you have DDoS protection tools in place.

3.6.3. Backups are configured

It happens that sites go down and cannot be restored, except from backups, if any were ever made. Set up regular automatic backups so that in an emergency you escape with minimal losses.

3.6.4. Server availability monitoring is configured

Your server may be unavailable exactly when search robots are actively crawling it, and the sooner you learn about a crash, the sooner you can fix it. For this, configure availability monitoring; the easiest way is through special services.

3.7. The site is registered in the webmaster's panel

Search engines provide a special panel where you can monitor the status of your site and respond to problems in time. Use this opportunity to keep an eye on your resources!

4. On-page and Content SEO Checklist


4.1. Optimize <title> headers

If you are building a quality resource, make page titles usable (in various respects) and optimized for search engines.

4.1.1. Use a short and descriptive title

Use short and descriptive titles; otherwise their text will not fit into the headlines that search engines display in search results.

4.1.2. Display the content of the pages in the title

The title should not merely exist; it should reflect the content of the page.

4.1.3. Make headers attractive to click

Which headers would you click on yourself? Use keywords that are important to and expected by users, as if you were optimizing an ad.

4.1.4. Use keywords in title

The <title> is the most important text element for SEO. Use the keywords here that you want users to find you by.

4.1.5. Insert important words at the beginning of the heading

The beginnings of sentences and phrases are read first, so this is the best place for important words: the keywords for this page and/or the words motivating a person to click the header.
When a site is bookmarked, important words at the beginning of the header also help find it faster.

4.1.6. Title is unique within the network

Titles should be unique not only within one site but across the entire web. This makes it easier for the user to choose.

4.1.7. Emojis are used

Emoji in titles and descriptions help you stand out from the surrounding text and attract additional user attention. You can find suitable emoji by keyword at https://collaborator.pro/emoji/

4.2. Optimize snippets

A snippet is the description of a site in search results. It includes the title and the description text itself, as well as additional visual elements such as the company's office address or sitelinks.
A good snippet can significantly increase the number of clicks to the site.

4.2.1. Text in meta description no more than 250 characters

Make the meta description 100 to 250 characters long. Shorter ones look poor, and longer ones get cut off.

4.2.2. The description is written in a way that attracts attention and encourages the user to act

Like any other text, a good description that addresses the user and encourages action works more effectively and increases the site's CTR in search.

4.2.3. The description contains the keyword

The meta description text is only a suggested snippet for search engines; it is displayed only if it is relevant to the query (the keyword occurs in the description).
In most cases, the snippet is taken from the page content (the part most relevant to the query).

4.2.4. Use microdata markup

Microformats let you structure content and explicitly indicate the semantic meaning of individual blocks of text, which can be reflected in snippets in search. A sketch of one common markup format is below.
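One widespread form of structured data is schema.org markup in JSON-LD; a minimal sketch with placeholder values:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Gaming laptop",
    "offers": {
      "@type": "Offer",
      "price": "999.00",
      "priceCurrency": "USD"
    }
  }
  </script>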

4.3. Optimize content


4.3.1. Use unique content

Unique content is loved not only by readers. Pages with such content get into the index faster and are easier to promote.
Even if the site has thousands of product description pages, write each description manually to keep it unique.

4.3.2. Format content

Formatting includes numbered and bulleted lists, tables, and other elements. Structured and formatted content addresses users' information needs better and often improves behavioral factors.

4.3.3. Insert keyword phrases in H1-H6

Don't forget to use <h1> – <h6> tags with keywords. These tags not only help structure the content but also briefly describe it. Words in h-headings carry much more weight than words in regular paragraphs.

4.3.4. Key phrase occurs in the text

The text should contain the key phrases you want users to find the page by. If a key phrase is unnatural and never occurs in the language, do not use a direct occurrence on the page; use its natural variant instead.

4.3.5. There is no invisible text

For whom did you add this invisible text? :)
Sooner or later someone will see it, and all your optimization efforts will come to nothing. Approach site optimization responsibly; stay as white-hat as possible.

4.3.6. There is no duplicate content

Duplicate content often causes the search engine to pick the wrong page as relevant.
If it is not an intra-site duplicate but copy-paste from another resource, such pages are indexed poorly and are even harder to rank in the top for the necessary phrases. Get rid of duplicates!

4.3.7. Keywords are used in the alt attribute (if there are images)

In the alt attribute of the <img> tag, you can specify alternative text that is displayed when images are disabled in the browser. Search engines also use this text, associating it with the image.
Using keywords in alt, you can increase the page's relevance for those phrases and also appear in image search, which can attract additional traffic. For example:
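(a hypothetical image and alt text, for illustration)

  <img src="gaming-laptop-asus.jpg" alt="ASUS gaming laptop with a 17-inch screen">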

4.3.8. There is no pop-up ads that cover the main content

Pop-up ads have a knack for annoying visitors and distracting them from their own thoughts, and this spoils the karma of the site as a whole.
Search engines try not to show sites with poor usability at the top, regardless of how good the content is.

4.3.9. The text on the page consists of at least 250 words

A page with at least 250 words can get into the index quickly and has room for several occurrences of the desired keywords.
Pay special attention to this point when writing descriptions for the goods in an online store.

5. Off-page SEO Checklist


5.1. Optimize internal links


5.1.1. Every page has at least one incoming text link

All pages of the site must be interlinked. If your site has pages that no other page links to, then:
  • they will be harder to index, since the robot may never find them;
  • they will have little internal weight, which will hamper their promotion in search engines.
You can check for such pages using any available crawler: Screaming Frog, Netpeak Spider, and other programs and services.

5.1.2. The number of internal links on the page is not more than 200

The number 200 is very conditional; 300 internal links can also be acceptable.
Do not add internal links unnecessarily: this wastes the crawl budget.
This mainly concerns sites with hundreds of thousands of pages; for sites with a few thousand pages, the recommendation is not relevant.

5.1.3. Keywords are used in internal links

This does not mean that you should give all internal links commercial anchors like:
  • laptop price discount;
  • laptop Kyiv cheap.
For excessive use of such anchors, search engines can sanction you for spam techniques.
Say you place 100 internal links to a page selling gaming laptops. The ideal anchor list for internal linking is 100 different key phrases from mid- and low-frequency queries, such as:
  • buy a gaming laptop;
  • a gaming laptop for a child;
  • how much does a gaming laptop cost, etc.
This mechanic enriches the semantics of the landing page and lets it rank for more queries.

5.1.4. The Wikipedia principle is applied

The main principle of linking is to think about the user. The landing page should meet expectations; do not try to squeeze internal links into inappropriate places.
The best example of internal linking is Wikipedia: you always understand what will happen when you click an internal link. Do the same.

5.1.5. Primary navigation is accessible to non-JavaScript crawlers

Do not hide menus or other internal navigation elements behind JavaScript. That prevents search robots from following them and from passing internal weight.

5.1.6. All links work (no broken links)!

Make sure all internal links are valid and do not lead to 404 pages. This can be checked with any available crawler: Screaming Frog, Netpeak Spider, and other programs and services.

5.2. Moderate outgoing links


5.2.1. Control the quantity and quality of external outgoing links

It's perfectly fine to link to other sites; this is how you participate in the Internet ecosystem.
However, be careful about who you link to, and do not link too much.
Link to sites and pages that are relevant to your site's content, and use common sense about quantity.

5.2.2. Irrelevant and unmoderated outgoing links are closed with rel=nofollow

If for some reason you link to sites you do not trust, mark those links with the nofollow attribute.
If comments are open without moderation and are not protected from spam, mark the external links in them with nofollow as well.

5.2.3. Links posted by visitors are moderated

Moderate user-generated content to avoid spam.

5.2.4. There is no link farm

Do not create special pages from which you place 100-200 external links to unverified or low-quality sites.
If you want to list your partners and there are more than 10-20 of them, it is better to make a separate page describing each of them, or use pagination.

5.3. Build backlinks to the site


5.3.1. Positive link profile dynamics

Maintain constant positive link profile dynamics. If your site keeps receiving links, users are talking about it, which is a positive signal for search engines.
Link building is a process; set it up so that the site receives new links daily or weekly.
One of the easiest ways to achieve this is crowd marketing.

5.3.2. Work is underway to increase the authority of the site

The authority (trust) of a site is a very important criterion for evaluating a link profile. The higher the trust, the easier it is for new and old pages of your site to occupy leading positions in search results.
You can evaluate a site's trust with third-party SEO services.
To increase the authority of the site, get backlinks from donor sites with high trust.
Well suited for this task are:
  • links from main pages;
  • links from guest articles on trusted sites;
  • site-wide (run-of-site) links.

5.3.3. External link anchors do not overuse keywords

When acquiring links to your site, watch the anchor list. The ideal ratio of anchor to non-anchor links is 20% to 80%.
Important! Excessive use of commercial phrases in the anchor list is very risky and can lead to sanctions from search engines.

5.3.4. Unique domains are used

To increase the site's trust, use backlinks from unique domains.
Reusing domains works well for the targeted promotion of landing pages.

5.3.5. Links are placed from thematic documents

Strive to ensure that the topic of the page providing the backlink matches the topic of the target page.

5.3.6. The range of donors is expanding

Use platforms with a wide range of topics, as well as media platforms.
Suppose you are promoting a site for installing air conditioners. Expand the topics as follows:
  • personal blogs of air conditioner installers;
  • repair services sites;
  • construction and design services sites;
  • sites about technology;
  • sites that have a technology section;
  • business sites;
  • media platforms.
Your task is to grow backlinks constantly. To do this, find opportunities, be creative, and keep expanding the topics so you never hit a ceiling.

5.3.7. Tracking of received links is set up

Control your investments by tracking the placed links. If you use link exchanges, they have built-in monitoring functionality.
To track links obtained by other methods, you can use backlink monitoring services.
If a site owner has deleted your link, try to negotiate its restoration.

5.3.8. Development of the link profile is tracked

At least once a week, check your site's link profile:
  • pay attention to new links and their sources;
  • make sure there are no spammy links from competitors;
  • control the anchor list.