In mid-August, Semrush – the most popular tool for analysing one's own website and those of competitors – released a report on the main errors affecting site optimisation and ranking. Using Site Audit, one of the tools of the SEO analysis platform, data from 100,000 websites and 450 million pages were collected and analysed. The result is a ranking of the most frequent SEO errors and of the most common on-site SEO actions.
No site can do without SEO. From Mountain View, Google states that its search engine ranks results in the SERP on the basis of content quality: "Content is king" is a constant refrain in Google's official communications to webmasters. Yet anyone involved in search engine positioning knows from experience that on-page SEO techniques strongly influence a page's position in the results. Improving content through skilled web-oriented writing, optimising images and tags, and using long-tail keywords and internal links are just some of the actions that fall under on-page SEO.
The Semrush 2017 Report was compiled this year on the basis of three fundamental factors:
Crawlability. How easily and completely search engine crawlers can scan a site strongly affects its indexing, favouring the ranking of sites that are easier to analyse and navigate.
On-page optimisation. This covers techniques that improve content quality: optimising text, images and tags, and using long-tail keywords, internal links and backlinks, to name just a few on-page SEO actions. The choice of domain, the robots.txt file and many other aspects also affect optimisation.
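As a concrete illustration, a minimal robots.txt (with hypothetical URLs, not taken from the report) that allows all crawlers, blocks one private section, and declares the sitemap location might look like this:

```text
# Allow all crawlers, keep one section out of the index,
# and tell crawlers where the sitemap lives.
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Declaring the sitemap here is one simple way to avoid the missing-sitemap issue the report flags below.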
Technical SEO. Technical SEO acts on the technical aspects of website and web page optimisation. Improving the architecture, the distribution of static elements, responsive CSS, the URL structure and the server configuration all strongly affect organic ranking.
What are the most frequent errors?
The infographic published by Semrush offers interesting and, in some respects, unexpected findings:
More than 80 percent of the sites analysed have 4XX errors. These are error messages about the interaction between client and server, relating to resources the server cannot find or is not allowed to serve. The most common is error 404, returned when a resource is not found; another common one is 400, related to malformed requests. Only 10 percent of the HTTP errors found are 5XX, i.e. problems on the server side. Around 30 percent of errors concern internal or external links that are broken or carry the nofollow attribute. Redirect problems are also numerous: in its commentary, Semrush notes that one site in two has link problems. The report also highlights sitemap issues: 35 percent of the sites analysed had no sitemap or did not reference it correctly in the robots.txt file.
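To make the 4XX/5XX distinction concrete, here is a small sketch of how a site audit might bucket the status codes collected during a crawl. The function name, categories and sample data are illustrative assumptions, not Semrush's actual tooling:

```python
from collections import Counter

def classify_status(code: int) -> str:
    """Map an HTTP status code to a broad audit category."""
    if 400 <= code < 500:
        return "client error (4XX)"   # e.g. 404 Not Found, 400 Bad Request
    if 500 <= code < 600:
        return "server error (5XX)"   # e.g. 500 Internal Server Error
    if 300 <= code < 400:
        return "redirect (3XX)"       # worth auditing for chains and loops
    return "ok"

# Hypothetical status codes as a crawler might have collected them:
crawled = [200, 404, 200, 500, 301, 404, 400, 200]
print(Counter(classify_status(c) for c in crawled))
```

Counting the buckets this way makes it easy to see at a glance whether a site's problems are mostly client-side (missing pages, bad requests) or server-side.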
More than 93 percent of the sites examined have a low text-to-HTML ratio, i.e. the ratio of the text visible to the user to the total amount of markup in the HTML source. The code can often be slimmed down, for example by moving inline CSS or JavaScript shared across multiple pages out of the page and into external files. In addition, 70 percent of sites have duplicate content, and more than half have missing or duplicate meta descriptions.
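The text-to-HTML ratio can be sketched with Python's standard library alone: extract the text a user would see (skipping script and style content) and divide its length by the length of the full HTML source. This is a simplified assumption about how the metric is computed; audit tools may weigh whitespace and markup differently:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only the text a user would see, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth counter for script/style nesting

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def text_html_ratio(html: str) -> float:
    """Length of visible text divided by length of the full HTML source."""
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.parts).strip()
    return len(visible) / len(html) if html else 0.0

page = ("<html><head><style>body{color:red}</style></head>"
        "<body><p>Hello world</p></body></html>")
print(round(text_html_ratio(page), 3))
```

Moving the inline style block above into an external file would shrink the HTML source while leaving the visible text unchanged, which is exactly how the ratio improves.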
One of the main services offered by ByTek Marketing is SEO consulting, provided also for nationally known brands. Contact us if you are looking for SEO consulting or Digital PR services.