SEO News Round Up: September 2020

– 12th October 2020 –

 

What happened in the world of SEO in September?

There were some insightful SEO updates in September! For regular updates on the world of digital marketing and our company, you can sign up to our Monthly Newsletter.

 

Best Content for Backlinks?

A study was published at the start of the month by Fractl examining how well different content pieces acquire backlinks and social shares. The study examined 5,000 articles and the results they achieved; articles included had at least 25 backlinks, and new sites were excluded.

The findings

It was the shorter pieces, with an average length of 700 words, that performed best and gained the most backlinks. The content types that performed best for social shares were:

  • How-to articles
  • List/video
  • How-to article with video
  • List/infographic
  • Newsletter

If you don’t already, it’s definitely worth making sure these content types feature in your marketing calendar.

Overall, this is welcome news, because it shows you don’t need big campaign pieces to build up your website’s backlink profile. Smaller pieces can be just as effective, if not more so. I think it’s also worth remembering that whilst this study shows short content works, different campaigns have different content needs. Those needs always have to be taken into consideration, so short content shouldn’t become your automatic go-to for every outreach effort.

 

 

View your robots.txt as Bingbot

This month Bing announced the release of their enhanced robots.txt tester. In the words of Bing, the tool is designed so that “Webmasters can submit a URL to the robots.txt Tester tool and it operates as Bingbot and BingAdsBot would, to check the robots.txt file and verifies if the URL has been allowed or blocked accordingly”. So, you can use this tester tool to highlight any errors in your file and identify what changes are needed.

You can also use the tool to make edits to the file and see how they impact crawling. As the robots.txt file is one of the few directives search engines like Google strictly follow, being able to properly view and test it makes this tool an asset many people will be grateful for.
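If you want to run a similar check yourself outside of Bing’s tool, the sketch below uses Python’s built-in robotparser to test whether a URL is allowed or blocked for a given user agent. It’s only a rough local equivalent (it won’t replicate every quirk of Bingbot’s own parser), and the URLs shown are placeholders you’d swap for your own site.

```python
from urllib.robotparser import RobotFileParser

# Placeholder URLs - substitute your own robots.txt and the page you want to check.
robots_url = "https://www.example.com/robots.txt"
page_url = "https://www.example.com/private/report.html"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt file

# can_fetch() returns True if the given user agent is allowed to crawl the URL
for agent in ("bingbot", "Googlebot", "*"):
    status = "allowed" if parser.can_fetch(agent, page_url) else "blocked"
    print(f"{agent}: {status} for {page_url}")
```

If the output isn’t what you expect, that’s your cue to revisit the Allow and Disallow rules in the file before a crawler trips over them.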

 

 

Core Update Recovery Times

During a Webmaster Hangout, John Mueller was asked whether websites need to wait for the next core update to recover. The question arose from situations where a site is negatively impacted by one update but only recovers when the next update is released.

What are Core Algorithm Updates?

If you’re not in the know, a Core Algorithm Update is when Google makes notable changes to its Search algorithm. The algorithm is amended daily, but periodically bigger changes are made that have more noticeable impacts on the SERPs. These changes are considered Core Algorithm Updates and are usually announced by Google on Twitter when they’re rolled out.

Why ask about recovery times?

Some websites see a greater impact from update to update than others. Where some websites will go through an update with minimal change, others can see big rewards, or losses. When a site is negatively impacted, its rankings can often stay low, despite SEOs and web developers working to improve it, until the next update is released and the website finally recovers.

As updates can be months apart, you can understand the concern of webmasters.

However, Mueller countered the idea that sites need to wait for the next update, saying that Google is continuously refreshing and updating its index.

What is likely the case is that where mass ranking fluctuations have happened, working to counter them isn’t a quick win and does take time. Mueller also added that the next update could be reinforcing the direction your changes have taken, allowing you to see a bigger change in your performance. Not necessarily helpful advice in the short term, but useful knowledge for the long term.

 

 

Googlebot to soon crawl HTTP/2 sites

As of next month, Googlebot will be able to crawl websites over HTTP/2.

HTTP/2 (h2) is an updated version of the HTTP protocol. It’s considered to be faster and more efficient than HTTP/1.1. The semantics don’t change, but the formatting and the way data is transferred between client and server do. H2 allows all requests and responses to pass over a single TCP connection, which is more efficient: the fewer connections needed, the less crawl time Googlebot uses.

To start with, Googlebot will crawl a selection of sites over h2. These sites are selected based on whether they support h2 and whether crawling them over h2 would be beneficial to both the website and Googlebot.
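If you’re wondering whether your own server already supports h2, one quick way to check is to see which protocol it negotiates via ALPN during the TLS handshake. The sketch below uses only Python’s standard library; example.com is a placeholder host, so swap in your own domain.

```python
import socket
import ssl

HOST = "example.com"  # placeholder - replace with your own domain

# Build a TLS context that advertises HTTP/2 (h2) via ALPN,
# with HTTP/1.1 as the fallback if the server doesn't support h2.
context = ssl.create_default_context()
context.set_alpn_protocols(["h2", "http/1.1"])

with socket.create_connection((HOST, 443)) as tcp:
    with context.wrap_socket(tcp, server_hostname=HOST) as tls:
        negotiated = tls.selected_alpn_protocol()

print(f"{HOST} negotiated: {negotiated}")
```

If the result is "h2", the server advertises HTTP/2 support; "http/1.1" (or None) means it doesn’t, and there would be no h2 version for Googlebot to crawl.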

It’s been a few years since the upgraded protocol was introduced, so it’s exciting to see this progress coming into play.

 

 

Did we miss anything?

If there was anything else that happened in September that caught your eye, feel free to tweet us @upriseUPSEM, email us at hello@upriseup.co.uk, or simply send us a message through our contact page. We’d love to hear from you.

 
