
Bartosz Góralewicz (Onely): “The internet is only going to get bigger and more complex and difficult to render”

15 November 2021

With Google increasingly revealing the workings behind their platforms and tools, SEOs have to evolve and become even more technical, according to Bartosz Góralewicz, CEO at Onely. Still, there’s a reluctance among SEOs to evolve: “SEOs simply don’t care.”

In this interview Gerk Mulder (Briljante Geesten) speaks with Bartosz Góralewicz (Onely), one of the star speakers in the line-up of the upcoming edition of Friends of Search, on November 16 (Amsterdam) and 17 (Brussels).

Bartosz Góralewicz, finally! You are a kind of mythical figure to me. In 2016, when I was still working at my former SEO agency, we were still speculating about the future of link building. In those days, we were also very angry at JavaScript. Those were also the days we could still extract organic keywords from Google Analytics. And there it was, an experiment from Poland: what we only dared to dream about (but had no idea how to set up and implement) had suddenly become reality. In 2016 you gave us the JavaScript SEO Experiment, which my colleagues and I jokingly referred to as “eksperyment”. Your experiment was a revelation in the SEO agency world. Everybody was talking about it. What did the experiment bring you?

It means a lot to hear you were so positive about the experiment. Throughout all these years, I keep finding out more and more about what happened in 2016. We actually gathered a lot of understanding of how Google renders now. And to be honest, the research way back then seems a bit naive. As of 2021, we’ve actually done a few other experiments and pieces of research I expected to go even more viral than the one in 2016.

What I got from our 2016 research, long story short, is that Google doesn’t know a lot about its own mechanisms. They don’t know how they work. Obviously, they design them with the best intentions. But then a lot of those elements don’t work as planned. This led us to the indexing and rendering research we started three years ago, which resulted in finding literally hundreds of rendering issues. The problem is that this research is way more significant than the first JavaScript experiment in 2016, but nobody gives a peep. Take partial indexing: it affects pretty much every single e-commerce store. Did you know that only 45% of big webshops get indexed? This should be the single most shocking element in the industry, but SEOs don’t really seem to care.
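A rough way to sanity-check a claim like that against your own site is to sample URLs from your sitemap and ask Google whether each one is actually indexed. Below is a minimal sketch assuming Search Console access through the google-api-python-client and its URL Inspection endpoint; the site URL, sitemap URL and credentials file are hypothetical placeholders.

```python
# Minimal sketch: estimate what share of your sitemap URLs Google reports as indexed.
# Assumes Search Console access via google-api-python-client and the URL Inspection API;
# SITE_URL, SITEMAP_URL and the service-account file are placeholders.
import xml.etree.ElementTree as ET

import requests
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://example-shop.com/"                # Search Console property
SITEMAP_URL = "https://example-shop.com/sitemap.xml"

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
search_console = build("searchconsole", "v1", credentials=creds)

# Pull the URLs listed in the sitemap.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

sample = urls[:200]  # the inspection endpoint is rate-limited, so sample
indexed = 0
for url in sample:
    result = search_console.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": SITE_URL}
    ).execute()
    state = result["inspectionResult"]["indexStatusResult"].get("coverageState", "")
    if "indexed" in state.lower() and "not" not in state.lower():
        indexed += 1

print(f"{indexed}/{len(sample)} sampled URLs are reported as indexed")
```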

That’s shocking indeed! I have heard from an SEO at the biggest e-commerce store in the Netherlands how hard it is to get their content and products indexed. That is why crawler CPU and crawl budgets are getting stricter, which results in us keeping our sites really light. Is that the best way to handle it?

This is not as complex as creating a viral link campaign. I would assume that creating a 100% organic backlink campaign that’s really good is way more difficult. It has a lot more metrics, so you simply cannot control it by just optimizing your website and having it rendered and indexed properly. But Google is indeed decreasing crawling.

I had a discussion with Martin Splitt after a webinar on rendering we hosted. Martin kept saying – and this is a common problem with Googlers – that rendering is not the problem and that Google does not have a problem with rendering and indexing. But at the webinar, he said that only 25% of a website is indexed, while websites should be fully indexable. The problem is the heavy rendering. Google cannot openly come out and say “Folks, we’re struggling with rendering” because it reflects badly on them. And other search engines are working on a different solution.

We have some interesting data, something that will be of interest to you. We haven’t published this yet. It’s something I want to talk about at Friends of Search: Google actually decreased crawling rates threefold. Right now they’re only crawling a third of what they crawled in June. And it’s becoming less and less.

So is that why you are calling it Rendering SEO and not Technical SEO anymore? 

Technical SEO got so much more technical throughout the last two years that I don’t think technical SEO is a proper name for it anymore. I remember technical SEO being just about optimizing titles five years ago. Now, it’s all about core web vitals, the critical rendering path, how Google renders, and whether content is visible for different user agents. I could go on about this for hours. So, I think this is one of the most exciting times for technical SEO because we never had this much data about how Google works. Google’s not really a black box anymore. Maybe it’s more of a gray box now. But despite this knowledge, it’s very difficult to get the community to shift towards being this technical, being this deep in the code, working this closely with dev teams. Still, I do believe that a lot of technical SEOs eventually want to evolve. They have a massive opportunity.
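To make the “is content visible for different user agents” part concrete: one rough check is to compare the raw HTML a crawler receives with the DOM after JavaScript has run, and see how much of the page’s text only exists after rendering. A minimal sketch, assuming Playwright and BeautifulSoup are installed; the URL and the Googlebot user agent string here are illustrative.

```python
# Minimal sketch: compare raw HTML with the JavaScript-rendered DOM to see how much
# of a page's visible text depends on rendering. Assumes `requests`, `beautifulsoup4`
# and `playwright` (after `playwright install chromium`); the URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from playwright.sync_api import sync_playwright

URL = "https://example-shop.com/category/shoes"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"


def visible_words(html: str) -> set[str]:
    """Return the set of words in the page's visible text."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return set(soup.get_text(" ", strip=True).split())


# 1. Raw HTML, roughly what a crawler sees before rendering.
raw_html = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=30).text

# 2. Rendered DOM, roughly what a rendering service sees after executing JavaScript.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page(user_agent=GOOGLEBOT_UA)
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

only_after_rendering = visible_words(rendered_html) - visible_words(raw_html)
print(f"{len(only_after_rendering)} words appear only after JavaScript rendering")
```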

It’s very interesting that we learn more about how Google works, but that they are hiding their rendering issues at the same time. What are your thoughts on that?

I don’t think Google has enough quality assurance in place to double-check that what they think is happening is actually happening. This is mostly because of their reliance on buggy tools like Google Search Console. There are enough examples of URLs that have been in the index for two years but are not reported as such in Search Console, or the other way around. This is something we checked at a large scale.

What Google should say is “we have a problem with heavy JavaScript websites”, which is hard for a company like Google to admit, as it would be for any large company. Google actually did start talking about the indexing queue for the first time. So now we know that there is a queue to get indexed, and that you can get kicked out as well.

Technical SEO is now about an in-depth understanding of the critical rendering path, of how the web browser works for the core web vitals, and of how the web rendering service on Google’s side works for SEO purposes. And those two are completely different ecosystems. But they’re both very complex. The internet is only going to get bigger and more complex and expensive to render.

In the whole field of SEO, how would you divide your attention between rendering on the one hand, and content and links on the other?

I think this is not one of those black-and-white answers. If you have a small website that isn’t pushing 3 megabytes of JavaScript, I would focus on nice architecture and good content marketing. But if you have a large, long-existing website, then I would tackle the technical side first.

For instance, if you have a small WordPress website, ensure your theme is lightweight, with no heavy stuff like JavaScript. If you’ve done that, I would only focus on building some links through the most white-hat method you’re able to. But at the same time, if you’re a software house, getting organic links is fairly easy. So, depending on that, I would find a way that’s the least aggressive, let’s call it that. On the other hand, if you are a company like Walmart, Amazon, or the Guardian, I wouldn’t worry about links at all and focus heavily on handling the technical side.
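As a rough check on whether a theme really is lightweight, you can add up how many bytes of JavaScript a page ships. A minimal sketch using requests and BeautifulSoup; the URL is a placeholder, and inline scripts are simply counted by the length of their embedded source.

```python
# Minimal sketch: add up the JavaScript bytes a page ships as a rough
# "is my theme lightweight?" check. Assumes `requests` and `beautifulsoup4`;
# the URL is a placeholder.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

URL = "https://example-blog.com/"

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

total_bytes = 0
for script in soup.find_all("script"):
    src = script.get("src")
    if src:
        # External script: download it and count its size.
        total_bytes += len(requests.get(urljoin(URL, src), timeout=30).content)
    else:
        # Inline script: count the embedded source.
        total_bytes += len(script.get_text().encode("utf-8"))

print(f"~{total_bytes / 1_000_000:.2f} MB of JavaScript on {URL}")
```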

So did you start your SEO career with purely link building? How did your career progress over the years?

In the beginning of my twelve-year career it was purely black-hat, not something I’m very proud of. But I think this allowed me to learn quite a bit about both ends, so I can see the full spectrum. Just to give you an idea, we started with affiliate websites. We would rank high for some of the most competitive keywords, but all short-term and very black-hat and nothing to be extremely proud of. Even though it was good money at times, we had a bit of a hard time with the first couple of Google updates. I kept digging into why this was happening. That eventually drove me to patents. And to be honest, this was the most exciting thing ever because I thought, “If we can understand that, we can deliver way more to the business owners than through this kind of ridiculous nonsense, the black-hat thing.”

When I first started researching this, and started talking and publishing about it, especially in Poland, people would say, “Okay, there are white-hat SEOs, and there are SEOs with traffic,” and all those funny jokes like, “This is never going to work out.” I wanted to find a niche, and I wanted to understand something that nobody knew about. We started researching some websites that couldn’t recover. That’s how we found one website with a preloader wheel that was blocking the indexing of the whole website. That was shocking, and this led us to JavaScript research.

Cool to see how you have evolved in your SEO journey, and also, thanks for fucking up the internet! So, from link building to patents, which led to JavaScript and rendering. What do you think is coming our way as SEOs? Are we still going to have a job (Jono Alderson and Rand Fishkin certainly think we won’t)?

Fucking up the internet, guilty, haha! I remember those conversations. They changed a little bit, but they never disappeared. So, after the first Panda and the first Penguin, I remember a lot of people thought that everyone was going to move to a different search engine, either Bing, DuckDuckGo or something like that. But they didn’t and they won’t. Google is going to take more and more because they’re a company.

But if you think about it, this is going to affect the very low-hanging fruit that people are used to getting. So, if your daily bread and butter is something that can be answered directly in Google, there’s something significantly wrong with that. Imagine you’re a weather forecast company: it’s either going to be Google showing the weather directly in search results, or there’s going to be an app on your phone, and so on. So, I think this is just the progress of technology. Still, this is slow progress. If you had told me five years ago what SEO would be like right now, I would not have believed you. SEO is now different per vertical, per way of working. In some verticals, success is 90% about links. But it is really shifting slowly.

It’s the same with content. Even how you create the content is shifting. What we’re seeing is not really about the length of the content. It’s about who can tackle that content first. There are a lot of websites with 300 words that completely outrank websites with 3,000 words, simply because the 3,000-word one is not written very well. And Google is really focusing on killing those queries quickly. The focus is slowly shifting from “I wrote the most on the topic” to “I wrote it in a way that’s the most digestible.” So besides not needing many links, you also don’t need a ton of content. Maybe you just need to make your content better or just somehow unique. In conclusion, I believe that we will still have jobs, but that they will be more and more technical and focused on content.

So in short, your advice for future SEOs would be: get technical and know your end-user?

90% of the problems coming to us are about website owners not fully understanding who’s coming to their website and why, and what their challenges are. Are they using the same device? Are they from different cities? What is their intent? In the end, it’s about connecting the technological layer with the UX and psychological layers. And if these two are in line, there’s no stopping you.
