
Barry Adams (Polemic Digital): News publishers really need to double down on what makes them unique in terms of their journalistic output

30 January 2023

As a news publisher, it can be hard to protect your website from the effects of Google algorithm updates. In this interview, Barry Adams, independent SEO consultant (Polemic Digital), co-founder of the News & Editorial SEO Summit and four-time speaker at Friends of Search, shares his insights on how news publishers can double down on their unique journalistic output and build topical authority to weather algorithm updates.

Barry Adams will be speaking at the anniversary edition of Friends of Search on March 23 at the Kromhouthal in Amsterdam. Tickets are available at friendsofsearch.com/nl

In the SEO world, you are known for your extensive experience with SEO for publishers. How did you come to focus on this area?

That was a typical case of ‘right place, right time’. In 2009 I moved to Northern Ireland from my native Eindhoven, and one of my first jobs there was as the in-house SEO specialist at the local newspaper, the Belfast Telegraph.

I had decent SEO knowledge at the time, but knew very little about the peculiarities of SEO for news websites. The Belfast Telegraph’s online editor and I ran many experiments to see what worked and how to achieve success in Google News and the Top Stories news boxes.

That was very educational and boosted the Belfast Telegraph to levels of visibility in Google that it shouldn’t realistically have been able to achieve, as it was just a small local paper.

The successes we achieved there helped open many doors for me. Later, when I’d started working as a freelance SEO consultant, certain opportunities arose from the work I did for the Belfast Telegraph, and I seized those chances to develop my expertise in SEO for news websites – a severely underserved niche within SEO. The ball got rolling from there, and over the years I’ve been fortunate enough to work with some of the biggest names in news publishing.

Given your experience, you likely encounter similar problems across websites. What are some key issues that publishers really need to get right but often don’t?

Typical issues I encounter with many of my news clients are incorrect image sizes, which can hinder articles’ performance, especially in the Discover feed. I also see problems around tag pages and topic hubs, and inconsistent internal linking strategies.

A technical area where I often see problems is pagination; it’s frequently implemented in such a way that Google is unable to crawl past the first page of articles.
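
To make that contrast concrete – a minimal sketch of my own with placeholder URLs, not any client’s actual code – crawlable pagination means every next page is reachable through a plain anchor in the server-rendered HTML, rather than only via a JavaScript ‘load more’ button:

```typescript
// Minimal sketch: server-rendered pagination that Googlebot can follow.
// A JavaScript-only "load more" button exposes no href, so the crawler
// never discovers page 2 and beyond.
function renderPagination(basePath: string, page: number, totalPages: number): string {
  const links: string[] = [];
  if (page > 1) {
    links.push(`<a href="${basePath}?page=${page - 1}">Previous</a>`);
  }
  if (page < totalPages) {
    // A plain anchor with a real URL is crawlable; an onclick handler is not.
    links.push(`<a href="${basePath}?page=${page + 1}">Next</a>`);
  }
  return `<nav class="pagination">${links.join(" ")}</nav>`;
}

console.log(renderPagination("/news", 1, 50));
// -> <nav class="pagination"><a href="/news?page=2">Next</a></nav>
```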

The place that SEO takes within a news site’s editorial workflow is also an important area that’s often neglected. If you want to succeed in Google, you need to take SEO seriously and embed good habits in your editorial teams.

The use of structured data (Article, NewsArticle, BlogPosting, LiveBlogPosting, etc.) is a commonly applied optimization for pages. Are there certain technical things that a publisher should always optimize?

For news publishers, proper implementation of structured data is not optional. Google relies heavily on the attributes provided in (News)Article structured data to index the article and accord it a proper place in its news ecosystem. Accurately defining sufficient attributes is key.
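
As an illustration – a minimal sketch using standard schema.org property names, with placeholder values rather than any real publisher’s markup – a NewsArticle block might look like this when serialised as JSON-LD:

```typescript
// Minimal sketch of NewsArticle structured data serialised as JSON-LD.
// The property names are standard schema.org attributes; all values
// here are placeholders, not any real publisher's data.
const newsArticle = {
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  headline: "Example headline of the article",
  datePublished: "2023-01-30T09:00:00+01:00",
  dateModified: "2023-01-30T11:30:00+01:00",
  author: [
    { "@type": "Person", name: "Jane Reporter", url: "https://example.com/authors/jane-reporter" },
  ],
  publisher: {
    "@type": "Organization",
    name: "Example News",
    logo: { "@type": "ImageObject", url: "https://example.com/logo.png" },
  },
  image: ["https://example.com/images/story-hero.jpg"], // hi-res, ideally 1200px+ wide
};

// Embed in the page head so Google can read it without rendering.
console.log(`<script type="application/ld+json">${JSON.stringify(newsArticle)}</script>`);
```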

Lately some publishers have been redeveloping their aging technology stacks, and it’s easy to go down the wrong route there with regards to JavaScript. Google says it can index JavaScript, but in the context of news it’s catastrophic to rely on client-side JavaScript to load content and links. News websites need to include all links and content in the raw HTML they serve to clients and Googlebot if they want any chance of ranking in Google’s news elements.
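
A quick way to spot-check this – a rough sketch assuming Node 18+ for the global fetch API, with a placeholder URL and markers – is to fetch a page without executing any JavaScript and verify that the key content and links are already present in the raw response:

```typescript
// Rough sketch: fetch the raw HTML without executing any JavaScript
// (much like Google's fast-track indexing of news) and verify that
// critical content and links are present before client-side rendering.
async function checkRawHtml(url: string, mustContain: string[]): Promise<void> {
  const response = await fetch(url);
  const html = await response.text();
  for (const fragment of mustContain) {
    console.log(`${html.includes(fragment) ? "OK     " : "MISSING"} ${fragment}`);
  }
}

// Placeholder URL and fragments, for illustration only.
checkRawHtml("https://example-news-site.com/some-article", [
  "<h1>",             // the headline should be server-rendered
  'href="/politics/', // section links should be plain anchors in the HTML
]).catch(console.error);
```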

Lastly, crawl speed is a critical area for news websites, as they tend to be very large sites with millions of pages. The faster Googlebot can crawl your site, the more easily it can find newly published and updated articles.

In various blogs, you write about the importance of topical authority for websites. Why is this so important and how can you determine a website’s topical authority?

I believe that from 2018 onwards Google has really focused on topical authority, which is reflected in many core algorithm updates the search engine has rolled out since then. News websites are often impacted by those core updates, as Google re-evaluates their editorial content to see what topics a publisher has true authority in.

When you want to determine a site’s topical authority, look at its top-ranking pages that aren’t articles. Section pages and tag pages also rank in Google search, often for broad keywords related to their topic. For example, a search like ‘Donald Trump news’ will show – in addition to the usual news stories – a set of topic pages from news publishers in the top 10 organic results. Those publishers have very strong topical authority on the Donald Trump topic.
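
To make that check systematic – a hypothetical sketch where the export file name, its columns, and the /topic/, /tag/ and /section/ URL patterns are all assumptions to adapt to your own site – you could filter a Search Console performance export down to non-article pages:

```typescript
import { readFileSync } from "node:fs";

// Hypothetical sketch: filter a Search Console performance export
// (assumed columns: page,query,clicks,position) down to section and
// tag pages - the non-article pages whose rankings signal topical authority.
const rows = readFileSync("gsc-performance-export.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1) // drop the header row
  .map((line) => {
    const [page, query, clicks, position] = line.split(",");
    return { page, query, clicks: Number(clicks), position: Number(position) };
  });

// URL patterns for hub pages; adjust to your own site's structure.
const hubPatterns = [/\/topic\//, /\/tag\//, /\/section\//];

const hubRankings = rows
  .filter((row) => hubPatterns.some((pattern) => pattern.test(row.page)))
  .filter((row) => row.position <= 10) // first-page rankings only
  .sort((a, b) => b.clicks - a.clicks);

console.table(hubRankings.slice(0, 20));
```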

Improving topical authority is a matter of regularly publishing unique, quality content on that topic, and categorising that content appropriately within site sections and tag pages.

Recently, we have noticed that Google is taking a long time to index new content on websites. Do you also see this problem with the websites you work for, or do large news sites receive “special treatment”?

Whenever I see indexing issues on news websites, it’s either because Google is having a hiccup or the news site in question has a technical issue. Usually news articles get indexed very quickly – in a matter of minutes.

I believe Google treats news a bit differently. Because of the speed with which news moves, Google takes a few shortcuts when indexing news articles. For example, the rendering phase of indexing – where client-side JavaScript is executed – is skipped for news articles.

This helps speed up the process of indexing news stories, so that Google can almost always show the latest articles on any given newsworthy topic. Users expect Google to show the latest news, so these shortcuts help Google fulfill that expectation.

There are several ways to check if a website has been discovered by Google (crawling) and included in the search results (indexing). How do you determine this on a large scale? Are there specific analyses or tools you can use to reach certain conclusions?

Getting content indexed isn’t really a problem for most news websites, due to the rapid indexing processes that Google uses for news articles. Rarely do I see issues with indexing on publishing sites. Whenever I do see problems there, Google Search Console is the best source of data to analyse and troubleshoot the issue.

Between the various Page Indexing reports and the awesome Crawl Stats report, there is usually an indication of what the root problem is. This can be uncrawlable pagination, slow server response times, bad coding (usually reflected in high percentages of Page Resources crawl effort in the Crawl Stats report), or issues with canonicalisation.

Do you use log file analysis for your clients? If so, how do you do that for extremely large websites? Are there any limitations?

Funny anecdote: I once asked a large British news publishing client for access to their server log files. Their devs responded with a question of their own: “How big is your hard drive?”

News sites are crawled so intensely by Google – especially popular news publishers – that log file analysis is not really a practical endeavour. Besides, the Crawl Stats report in Search Console is usually sufficient to find potential crawl issues.

Since 2018, Google Discover has become an increasingly important traffic source for publishers. Do you have tips to ensure that websites are prominently visible in Google Discover? And does this work differently in Google News?

Discover has a broader scope than Google News; it’s normal to see articles from websites in your Discover feed that you’ll never be able to find in Google News.

Google News is still keyword/topic-based; people perform a search, and Google matches the query’s topic to relevant news stories. Discover, on the other hand, is all about interests. Google will show you articles in your Discover feed that it believes may interest you, based on what it knows about you.

Strong headlines and introduction paragraphs are key tactics for enhanced visibility in Discover, as is defining a hi-res image (at minimum 1200 pixels wide) in the NewsArticle structured data. Ensure your articles have a clear topical focus and tap into the known interests that Discover uses to categorise articles. Experimentation also helps you find your optimal headline structure.
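
As a sanity check on that image requirement – a sketch that assumes images are declared as schema.org ImageObject entries with an explicit width; plain image URLs carry no dimensions and are skipped – you could flag anything below the 1200px minimum:

```typescript
// Sketch: warn when a NewsArticle's structured-data image is declared
// with a width below the 1200px minimum that Discover favours. Plain
// URL strings carry no dimensions, so they are skipped here.
interface ImageObject {
  "@type": "ImageObject";
  url: string;
  width?: number;
  height?: number;
}

function checkDiscoverImages(images: Array<ImageObject | string>): void {
  for (const image of images) {
    if (typeof image === "string") continue; // no dimensions to check
    if (image.width !== undefined && image.width < 1200) {
      console.warn(`Image below 1200px wide: ${image.url} (${image.width}px)`);
    }
  }
}

checkDiscoverImages([
  { "@type": "ImageObject", url: "https://example.com/hero.jpg", width: 800 },
]);
```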

Are there ways to track traffic from Discover and News outside of Search Console? Do you see differences in achieving certain goals or engagement per traffic source?

Outside of Search Console, there is very limited scope to accurately track traffic from Discover and Top Stories carousels on Google. We can track visits from the news.google.com vertical, but for most publishers this would be low single-digit percentages of total traffic. Discover and the Top Stories news boxes are where most Google traffic to a news site will come from, but that traffic lacks proper referral strings. Discover often shows as Direct traffic in your web analytics, and Top Stories traffic is heaped together with regular Google Organic traffic.

Some third-party tools have found ways to attribute this traffic fairly accurately. One of my favourite tools for this is Marfeel Compass, which has the added advantage of being fully compliant with EU data protection legislation (contrary to Google Analytics, which in addition to being quite shit is also technically illegal in the EU).

In the past year, there have been a significant number of Google updates that have brought about major shifts in search results. Have you noticed any of these changes?

Oh yes, definitely. Whenever a major Google algorithm update rolls out, news websites are often at the frontline of the effects. It’s rare to see an algo update roll out that doesn’t impact news publishers.

Often the news sites that are impacted are not doing anything wrong as such. They publish quality journalism and have strong topical authority, but Google still ‘punishes’ those sites – usually because it decides to give less space to news in general.

This makes SEO for news sites quite frustrating at times. It’s hard to predict where your SEO efforts will take you, as Google can decide to downgrade news in the next algo update and all your gains to date will be undone.

Google’s decision to pay news websites for reusing their content in search results is a response to pressure from the EU. What is your opinion on the Extended News Previews agreement from Google?

It’s a propaganda exercise for Google. The financial cost to Google is so small compared to their revenue that it’s barely a rounding error on their balance sheet. But for news publishers, who have struggled to find appropriate business models in this era of free online news, that money can mean the difference between profit and bankruptcy.

I do think huge tech platforms like Google, Facebook, and Apple, which serve as gatekeepers to the internet, have a responsibility to share their success with the creators they built their empires on. A company like Google does not become worth 2+ trillion dollars on its own content alone – it became so huge by enabling people to find relevant content created by others. There is a responsibility for that success to be shared with those creators, which is something YouTube does, for example. Google, as YouTube’s owner, doesn’t seem to have an inherent issue with distributing its wealth to content creators, so I don’t understand its vocal reluctance to do the same with the creators it shows in its search results.

I’m not sure the Extended News Previews agreement is the right approach in the long term, but at least it’s a start.

Are there any new developments in publisher SEO, or do you have expectations of what will be important in the near future?

I suspect 2023 will be another tough year for news sites, with pain from algorithm updates and general declines in traffic from the major tech platforms.

I think news publishers really need to double down on what makes them unique in terms of their journalistic output. You can’t be a jack of all trades anymore; a news site needs specific strengths that it caters to.

That results in stronger topical authority in Google, and more importantly it builds a loyal audience that is genuinely interested in your output. A loyal audience means less reliance on the tech platforms, and more monetisation opportunities.

On March 23rd, you will be back on stage at Friends of Search. What will you be speaking about and why should we not miss it?

As this is the 10th edition of Friends of Search, and the 5th edition I’m speaking at, I want to give an overview of what has changed in SEO in the past 10 years. Not only will this show how far Google and SEO have come in the past decade, it can also inform us of where Google may be going in the next decade – and where SEO needs to follow.