A completely white-hat link-building method invented by Brian Dean. Many people don’t know about Skyscraper link building, because such methods are rarely used in everyday practice. What is the essence?
1. Find content relevant to your niche.
2. Find the sites that link to this content.
3. Create better content on the same topic.
4. Reach out to those linking sites and suggest they link to your improved content instead.
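The outreach step above can be sketched as a small script. This is a minimal illustration, assuming you have exported the referring pages of the target content (e.g. from Ahrefs) to a CSV; the column name `referring_page` is hypothetical and depends on your export.

```python
import csv
import io
from urllib.parse import urlparse

def build_outreach_list(backlinks_csv: str) -> list[dict]:
    """Deduplicate referring pages by domain, so each site owner
    is contacted only once. The CSV column name is hypothetical."""
    seen_domains = set()
    targets = []
    for row in csv.DictReader(io.StringIO(backlinks_csv)):
        domain = urlparse(row["referring_page"]).netloc
        if domain in seen_domains:
            continue  # one pitch per site is enough
        seen_domains.add(domain)
        targets.append({"domain": domain, "page": row["referring_page"]})
    return targets

sample = """referring_page
https://blog.example.com/seo-tips
https://blog.example.com/link-building
https://news.example.org/roundup
"""
print(build_outreach_list(sample))
```

From here you would look up a contact email per domain and send the pitch; that part is manual (or handled by outreach tools) and is out of scope for the sketch.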
Brian Dean reports that this method worked: from 226 emails he got 15 links, so the return was decent. But it is important to keep the nuances in mind:
Brian Dean’s authority may have boosted the response rate;
link builders work under fairly strict KPIs, so there may be no time or resources for this kind of outreach (roughly speaking, you have to write the material, pitch it, then wait for the webmaster’s reply and consent before getting placed).
A list of donor sites for Skyscraper link building
404 pages: find broken links and profit
Using Ahrefs, find broken links in your niche (sometimes you can even get placed on Wikipedia).
Then write to the site owner and point out that your topically relevant content could be placed where the broken link is now.
The cooperation can be paid or free.
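Before pitching, it is worth re-verifying that the links are actually dead. A minimal sketch of such a check, with the status lookup injectable so the filtering logic works offline (the default path uses the `requests` library for real HEAD requests):

```python
def find_broken_links(urls, get_status=None):
    """Return the URLs that respond 404 or are unreachable.

    `get_status` maps a URL to an HTTP status code (or None on error).
    It is injectable so the logic is testable without network access;
    by default it performs a real HEAD request via `requests`.
    """
    if get_status is None:
        import requests  # pip install requests

        def get_status(url):
            try:
                return requests.head(url, allow_redirects=True,
                                     timeout=10).status_code
            except requests.RequestException:
                return None
    return [u for u in urls if get_status(u) in (404, None)]

# Offline demo with stubbed statuses (hypothetical URLs):
statuses = {
    "https://site.example/alive": 200,
    "https://site.example/dead": 404,
}
print(find_broken_links(statuses, get_status=statuses.get))
# ['https://site.example/dead']
```

Note that some servers reject HEAD requests; a production checker would fall back to GET on unexpected status codes.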
Practical guide: building a semantic core for a single page and reaching the TOP of search
In this article, I will share a fast and efficient algorithm for collecting the semantic core for a single page of a site. The method is best suited to service sites and content projects.
Semantics is the foundation of effective SEO.
If the foundation is shaky, the result is anyone’s guess.
Why for one page?
It’s simple. Many specialists try to assemble the semantic core for all promoted pages of the site at once: every service, every product, and so on; then cluster it all, build the structure, write the meta tags, everything in one pass.
In practice, quality suffers quite a lot with this approach. As a result, some clusters rank worse, some never reach the top at all, and even links combined with behavioral-factor manipulation won’t help.
A detailed semantic core for a single page allows you to:
- discard keywords where the ranking sites are out of our weight class;
- identify a clear intent;
- get a list of similar phrases;
- get it right the first time instead of redoing it later.
It will take up to 30 minutes to work out the semantics for one page.
Algorithm of operation
1. Pick a keyword-research service you prefer. These can be: Ahrefs, Semrush, Keys.so, Mangools and so on. I use Ahrefs regularly, so that’s what I’ll use here.
2. For example, let’s take the topic “typography” and imagine that we need to promote a website on this topic. We take the keyword “typography” and begin the analysis: select the region and the search engine.
Let’s start analyzing the data. First, I open the “main topics” tab, where Ahrefs has already grouped the keywords.
We look through the phrases that interest us. At this stage, we immediately check which sites rank in the SERP and skip a keyword if there are no direct competitors there, only sites of a different type or subject.
We add the phrases of interest to a list inside the service (in my case, Ahrefs) or to an Excel file, whichever is more convenient.
3. Also look at the search suggestions right away. Google has 3 types of suggestions: in the search bar, in the block under the search results, and the hints that appear after you click a top-ranked result and return to the SERP.
4. We consolidate all keywords into a table with these parameters: search volume, keyword difficulty (KD, available in Ahrefs), and traffic potential (TP, available in Ahrefs). That’s it, the list is ready; you can move on to building the structure, writing content briefs, and so on.
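Step 4 can be done in a few lines of Python instead of a spreadsheet. A minimal sketch, assuming hypothetical keyword rows with the metric names used above (`volume`, `kd`, `tp`); real export column names may differ:

```python
import csv
import io

# Hypothetical rows as they might come from a keyword-tool export.
rows = [
    {"keyword": "typography basics", "volume": 1900, "kd": 12, "tp": 2400},
    {"keyword": "what is typography", "volume": 5400, "kd": 35, "tp": 6100},
    {"keyword": "typography rules", "volume": 880, "kd": 9, "tp": 950},
]

def to_table(rows):
    """Sort keywords by search volume (descending) and render
    the final table as CSV text."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["keyword", "volume", "kd", "tp"])
    writer.writeheader()
    for row in sorted(rows, key=lambda r: r["volume"], reverse=True):
        writer.writerow(row)
    return out.getvalue()

print(to_table(rows))
```

The resulting CSV opens directly in Excel or Google Sheets, so the list plugs straight into the structure and content-brief work.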
Do I need to parse keywords from competitors?
You can do that too, but in practice the resulting semantic core for the page will not be truly complete. There is no guarantee that competitors keep their pages’ semantics up to date, so some interesting phrases may be missed.
At the same time, competitors may rank for phrases that appear neither in their texts nor in their meta tags; if you work those missed phrases into your own site, your positions will be higher.
It is much more efficient to select keywords manually and quickly build a list of 20 to 100 phrases; the size depends on the niche. You can then compare this list with what the service has parsed from a given competitor’s page and, so to speak, feel the difference.
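The comparison between your manual list and the competitor parse is just a set difference. A small sketch with made-up phrases:

```python
def compare_lists(manual, competitor):
    """Case-insensitive comparison of two keyword lists:
    which phrases appear only in one of them?"""
    m = {k.strip().lower() for k in manual}
    c = {k.strip().lower() for k in competitor}
    return {
        "only_manual": sorted(m - c),       # our ideas the competitor misses
        "only_competitor": sorted(c - m),   # their phrases we missed
    }

diff = compare_lists(
    ["Typography", "typography rules"],
    ["typography", "typography fonts"],
)
print(diff)
```

The `only_competitor` bucket is where you usually “feel the difference”: phrases worth reviewing and possibly adding to your own list.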
How to update the semantic core for an existing page?
You should update the semantics of a site’s pages at least once a year; in practice it is usually needed every six months. For Google, the most complete semantics lives in Google Search Console (GSC), and no third-party service sees this data in full.
I will demonstrate the update algorithm on a small website of a regional cosmetology center.
1. Let’s say we need to update the semantic core for the “Trichology” service page. Go to GSC, open the Performance report, and set up filters by URL and country.
That’s it, we have a list of phrases for which our page was shown in search. Pay attention to the number of keywords in the screenshot. Our task is to find out whether all the target keywords are already used on the page or whether it can be improved.
2. Export the list of phrases and upload it to a keyword-research service; in my case, Keywords Explorer in Ahrefs. Select keywords by analyzing the SERP and save the chosen phrases to the final list.
3. Now we need to match the page’s text, including all h1-h6 headings and meta tags, against the list from step 2. This can be done with an online service such as overlead.me, or with your own tool.
Enter the URL; the tool parses the meta tags and content. In the window on the right, paste the filtered keywords from step 2 and click the “check” button.
We get a simple, clear result for each phrase, see what can be improved, prepare a brief for content revision, and claim the TOP of the search results.
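If you prefer your own tool over an online service, the core of the check fits in a short script. A naive sketch using only the standard library: it takes the page HTML as a string (fetching it is left out) and does verbatim, case-insensitive matching, while real tools also account for word forms and meta-tag attributes.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of a page (title, headings, body)."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def check_keywords(html: str, keywords: list[str]) -> dict[str, bool]:
    """For each phrase, report whether it occurs verbatim
    (case-insensitively) in the page text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks).lower()
    return {kw: kw.lower() in text for kw in keywords}

# Hypothetical page for the "Trichology" example:
page = ("<html><head><title>Trichology</title></head>"
        "<body><h1>Hair loss treatment</h1></body></html>")
print(check_keywords(page, ["hair loss treatment", "scalp diagnostics"]))
# {'hair loss treatment': True, 'scalp diagnostics': False}
```

Every phrase that comes back `False` is a candidate for the content-revision brief.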