MathSight discovers the secret of Penguin 2.1: Flesch-Kincaid and Dale-Chall readability tests

MathSight, the revolutionary machine learning SEO platform, has made a breakthrough discovery about the algorithm employed by Google’s Penguin 2.1.

Following weeks of analysis, MathSight has confirmed, to a 99.9% confidence level, that Penguin 2.1 uses the Flesch-Kincaid and Dale-Chall readability tests as part of a combination of metrics to evaluate linking content.

MathSight Systems Architect, Frank Kelly, said: “We began to suspect the influence of readability tests shortly after the launch of Penguin 2.1 in October, but have been waiting until we had enough data to rigorously test the hypothesis. We found that both the Flesch-Kincaid Grade Level and Dale-Chall readability tests are having an impact, with the number of words per sentence and the ratio of rare to common words both playing a part in site traffic following a Google Penguin update.

“With Penguin 2.1, Google is favouring web pages with well-written, well-researched and sophisticated content. So, for example, the lower the ratio of rare words to total words on offsite linking pages, the more detrimental those backlinks are to your SEO.”
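The two tests named above are standard published formulas, so the metrics Kelly describes (words per sentence, rare-word ratio) can be computed directly. The sketch below is illustrative only: the vowel-group syllable counter and the caller-supplied "familiar word" list are simplified stand-ins, not the word lists or implementation used by Google or MathSight.

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels, minimum one.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # Published formula: 0.39 * (words/sentences)
    #                  + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

def dale_chall_score(text, familiar_words):
    # Published formula: 0.1579 * (% difficult words)
    #                  + 0.0496 * (words/sentences),
    # plus 3.6365 when difficult words exceed 5%.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
    pct_difficult = 100 * sum(w not in familiar_words for w in words) / len(words)
    score = 0.1579 * pct_difficult + 0.0496 * len(words) / len(sentences)
    if pct_difficult > 5:
        score += 3.6365
    return score
```

On this reading, a page with short sentences and few rare words scores a low grade level on both tests, which is exactly the profile the article claims Penguin 2.1 penalises in linking pages.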

In addition to readability-related metrics, MathSight has also identified that Penguin 2.1's treatment of a page is affected by its classified ‘page type’ (e.g. blog posts vs. commercial landing pages).

“We are still mathematically confirming the types of content to which we can link a reward from the Penguin change, in terms of inbound organic search traffic,” said Kelly.

According to MathSight Managing Director, Andreas Voniatis, Penguin 2.1’s ‘probabilistic’ algorithm is an effective method of evaluating the presence of so-called ‘dirty’ links.

He said: “The last decade has witnessed online businesses buying links on the basis of PageRank or a website’s inherent authority. The search engines, in particular Google, have had to respond by formulating new algorithms that look beyond the volume and quality of links.

“Targeting web pages with poor readability is algorithmically the most efficient way of discounting links from web pages whose content has not been written professionally and lacks relevant research.”

So what are the implications for the SEO industry? Voniatis believes SEOs simply need additional data that helps them to evaluate links the same way Penguin would.

“Instead of only looking at a site’s authority, the SEO should use additional data to evaluate its readability scores before deciding whether or not to nurture an editorial relationship with a website for the purposes of link outreach.”
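Voniatis's advice amounts to a screening step before outreach. A minimal sketch of such a filter is below; the threshold values and the dictionary shape of each prospect are hypothetical assumptions for illustration, not figures published by MathSight.

```python
def screen_prospects(prospects, min_grade=8.0, min_rare_ratio=0.05):
    """Keep only outreach candidates whose content looks well written
    and sophisticated, per the article's claims: a sufficiently high
    Flesch-Kincaid grade and a sufficiently high rare-word ratio.
    Both thresholds are illustrative, not published values.

    prospects: list of dicts with 'url', 'fk_grade', 'rare_ratio' keys
    (a hypothetical data shape assumed for this sketch).
    """
    return [p["url"] for p in prospects
            if p["fk_grade"] >= min_grade
            and p["rare_ratio"] >= min_rare_ratio]

candidates = [
    {"url": "a.example", "fk_grade": 10.2, "rare_ratio": 0.08},  # passes both
    {"url": "b.example", "fk_grade": 4.1,  "rare_ratio": 0.09},  # grade too low
    {"url": "c.example", "fk_grade": 11.0, "rare_ratio": 0.01},  # too few rare words
]
shortlist = screen_prospects(candidates)
```

Note the direction of the rare-word test: per the quote above, a *lower* ratio of rare words on a linking page is claimed to be *more* detrimental, so the filter keeps pages at or above the threshold.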
