After the emergence of many SEO optimization tools, the agency Uplix (in partnership with Oncrawl) seems to have found the formula to take the field beyond its infancy.
The object of this progress?
A machine dedicated to Predictive SEO, able to tell whether a web page template will rank well or not, and why!
How is that possible?
The following will show you that it is easier than it seems!
Do you know chess?
Today, powerful AIs such as Stockfish and AlphaZero help International Grandmasters prepare, evaluating positions and recommending the best moves.
Well, it turns out that ULI (Uplix Lab IA) does exactly the same: taking into account the rules of Google's game, the strengths and weaknesses of competitors, and the keywords of the target query, the algorithm estimates your position in the SERP with an accuracy of up to 92%.
It then suggests areas for improvement so you can rank better.
Of course, the first task was to gather all known ranking factors in order to teach them to the AI.
Uplix's experts handled that with ease. Then, thanks to Oncrawl's know-how, the algorithm benefits from a large database built by crawling the sites concerned.
Indeed, to know which pages rank well, you have to know the strengths and weaknesses of the main competitors! The predictions are then verified and delivered as SEO recommendations to the site owner.
This is where Uplix's tool works wonders: it can not only explain why one web page ranks better than another, but also tell you to what extent the suggested modifications will impact the ranking.
Indeed, most identifiable and measurable ranking factors generally concern:
- content (e.g. word count);
- performance (e.g. loading time);
- popularity (e.g. backlinks) of a site.
There are dozens of them, and their weight differs according to the query and the competition.
ULI's job is therefore to tell you how to prioritize these features, or "ranking criteria". The site owner then simply has to follow the algorithm's recommendations.
Accordingly, if ULI believes that keywords in your H2s matter more than indexing speed, it will be up to the web editor to take action rather than the developer.
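To make this concrete, here is a minimal sketch of what "prioritizing features" can mean in practice. The feature names and weights below are invented for illustration; they are not Uplix's actual model, just a toy ranking of criteria by the magnitude of their learned influence.

```python
# Illustrative sketch (not ULI's actual code): a page is described by
# measurable ranking features, and a hypothetical trained model assigns
# each feature a weight. All names and values here are invented.

features = {
    "word_count": 1450,        # content factor
    "load_time_ms": 820,       # performance factor
    "backlinks": 37,           # popularity factor
    "h2_keyword_match": 1.0,   # target keywords present in H2 headings
}

# Hypothetical learned weights: larger magnitude = more influence on
# ranking for this particular query and competitive landscape.
weights = {
    "word_count": 0.15,
    "load_time_ms": -0.40,     # slower pages rank worse
    "backlinks": 0.30,
    "h2_keyword_match": 0.55,
}

# Recommendations are prioritized by the weight of each feature.
priorities = sorted(weights, key=lambda f: abs(weights[f]), reverse=True)
print(priorities)
# ['h2_keyword_match', 'load_time_ms', 'backlinks', 'word_count']
```

Here the H2 keyword criterion outranks loading time, so the recommendation would go to the web editor first, exactly as described above.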
Artificial intelligence exists to sharpen good human intuitions. Yet even with machine learning, ULI reports a minimum error rate of 8%.
A flaw in the algorithm?
Not quite. There are indeed some unpredictable ranking factors, including:
- the user's search history (past searches guide future results in the SERPs);
- the user's behavior (although this is not fully established, some experts suspect that browsers like Chrome observe user behavior while browsing so that Google can further refine its results pages);
- the arrival of a new competitor;
- a manual penalty from Google…
Hence the 8% margin of error.
It is incredibly simple!
When the tool is given a set of features to check, it performs the following manipulation: it removes one ranking factor from its estimates and compares the error level before and after.
The diagram below illustrates this with the example of the interface's response time to user interaction.
When ULI no longer takes this criterion into account, it records an error rate of about 20% instead of 8%. This 12-point gap is what ranks response time as the most important criterion for organic ranking in this specific case. The guarantee of results is therefore devilishly mathematical!
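The ablation method described above can be sketched in a few lines. This is an illustrative toy, not ULI's implementation: the scoring function, weights, and page data are all invented. The point is only the mechanism: drop one factor, re-measure the prediction error, and treat the increase as that factor's importance.

```python
# Sketch of ablation-based factor importance (illustrative, invented data):
# predict with all factors, then predict again with one factor removed,
# and compare the mean error before and after.

def predict(page, use_response_time=True):
    """Toy linear scorer with made-up weights."""
    score = 0.02 * page["word_count"] + 0.5 * page["backlinks"]
    if use_response_time:
        score -= 0.03 * page["response_time_ms"]
    return score

# Hypothetical pages with an observed "rank_score" to predict.
pages = [
    {"word_count": 1200, "backlinks": 40, "response_time_ms": 300, "rank_score": 35},
    {"word_count": 900,  "backlinks": 10, "response_time_ms": 900, "rank_score": -4},
    {"word_count": 1500, "backlinks": 25, "response_time_ms": 200, "rank_score": 36},
]

def mean_abs_error(use_response_time):
    errs = [abs(predict(p, use_response_time) - p["rank_score"]) for p in pages]
    return sum(errs) / len(errs)

full = mean_abs_error(True)       # error with all factors
ablated = mean_abs_error(False)   # error without response time
print(f"error with all factors: {full:.2f}")
print(f"error without response time: {ablated:.2f}")
# The gap (ablated - full) is the importance assigned to response time.
```

In this toy dataset, removing response time inflates the error sharply, which is exactly the kind of gap (8% to 20% in the article's example) that flags a factor as decisive.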
Given the countless sites that need adjustments (or even a redesign), a Predictive Ranking tool would make it possible to:
- give easy-to-understand recommendations through a user-oriented dashboard;
- prioritize modifications with an action plan, and thus adjust the budget by focusing on the essentials;
- fully optimize a web page template even before it is published and indexed;
- quickly boost a site's positioning at strategic times for specific queries.
Uplix and Oncrawl are developing the beta version of their machine-learning tool dedicated to Predictive Ranking.
Before long, any website owner, regardless of budget and field of activity, will be able to benefit from a fast and ultra-precise audit.
Even taking the margin of error into account, it will be possible to identify a site's main problems, and even to more easily infer the external factors that prevent a page from taking the lead in the SERPs for a specific query.
In short, it's a bit like going to a doctor who has a scanner capable of checking your entire body, diagnosing in one pass whatever may prevent it from functioning normally.
We'll see when this innovation, still under construction, will be released!
Read the full article on: https://www.uplix.fr/predictive-rankings-seo/