JavaScript SEO with @bart_goralewicz #vcbuzz

Search engine optimization is an ever-evolving industry.

When we started out, the best practice was to avoid JavaScript at all costs.

These days Google is much more advanced, so JavaScript can be SEO-friendly.

Let’s discuss!

***Add #VCBuzz chats to your calendar here.

***Please sign in here to follow the chat -> twchat.com/hashtag/vcbuzz

About @bart_goralewicz

@bart_goralewicz is founder and CEO at @onelycom, the advanced technical SEO agency.

His research into JavaScript and how Google deals with it paved the way for JavaScript SEO – a narrow discipline within technical SEO that Onely is particularly famous for. He’s also a trailblazer when it comes to rendering SEO. 

Questions we discussed

Q1 How did you become a digital marketer? Please share your career story!

I started around 2010 with affiliate marketing, trying to make some extra $ by ranking websites (loans, casinos, etc.).

I was fairly good at it at the time, but I quickly realized that this wasn't what I wanted to do with my life. We started the Elephate agency around 2013, and I found my mojo recovering large websites from penalties.

I was already quite technical, since succeeding in highly competitive affiliate niches wasn't easy, and what I learned from black hat SEO was to constantly search for new paths.

Questioning Google's statements was my default mode. I remember Google saying that they crawl the web just like modern browsers, and I was like… "this cannot be right".

This is how the jsseo.expert experiment came to my mind. I spent a few weeks talking to devs, Googlers, and some of the brightest minds I could find online. I would send private emails to people like Ilya Grigorik and ask them questions. This was one of the best moments in my career. I understood that even though I wasn't a developer, I could provide value to the community. I could change its course.

Seeing Google's reaction to my research and how they addressed it was like a drug. I couldn't believe it. I had been in SEO for 5-6 years and had tremendous impostor syndrome. The truth was that back then, a Polish surname didn't help in landing clients.

I got totally immersed in technical SEO. I felt that I had found my path. This was (still is) my obsession. I’ve built a huge R&D team led by @TomekRudzki. If you know Tomek, you know what happens next.

Q2 Please tell us more about your JS research and what you were able to confirm.

In 2020 we released research going way beyond JS SEO, research that was (again) confirmed and appreciated by Googlers.

Last month we launched ziptie.dev and we have more cool stuff coming this year.

What I found out is:
– cached content doesn't guarantee its indexing, and
– Google approaches inline vs. external vs. bundled JavaScript differently (see the sketch below).

Different JS frameworks have different indexing issues. Google doesn't do very well with JS-dependent content, which is still the case.

The results showed that:
– Google tends to partially skip indexing of the content, even for HTML/CSS pages.
– Google prioritizes the indexing of the most prominent elements of a site.
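To make the inline vs. external vs. bundled distinction concrete, here is a minimal sketch (my own illustration, not part of the original research) that fetches a page and roughly classifies its script tags. The bundle heuristic and the example URL are assumptions, not findings from the study.

```typescript
// Minimal sketch (not from the original research): roughly classify how a
// page loads JavaScript, since Google was found to treat inline, external,
// and bundled scripts differently. Uses built-in fetch plus a naive regex;
// a real audit would use a proper HTML parser.

type ScriptKind = "inline" | "external" | "bundled";

async function classifyScripts(url: string): Promise<Record<ScriptKind, number>> {
  const html = await (await fetch(url)).text();
  const counts: Record<ScriptKind, number> = { inline: 0, external: 0, bundled: 0 };

  // Inspect every <script ...> opening tag and look at its src attribute.
  for (const tag of html.match(/<script\b[^>]*>/gi) ?? []) {
    const src = /src\s*=\s*["']([^"']+)["']/i.exec(tag)?.[1];
    if (!src) counts.inline++; // code embedded directly in the HTML
    else if (/bundle|chunk|\.min\./i.test(src)) counts.bundled++; // heuristic for build artifacts
    else counts.external++; // plain external file
  }
  return counts;
}

// The URL is a placeholder.
classifyScripts("https://example.com").then(console.log);
```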

Q3 What are the typical SEO myths when it comes to JavaScript?

JavaScript isn’t evil (a popular myth). Just like anything, you need to learn to use it properly (JS SEO). JS may harm your website (screenshot below), but fixing it isn’t as complicated as some think.

All crawled pages get indexed. It may not always be about JS content, but Google doesn’t guarantee to index everything it finds and crawls. Google must prioritize what is added to the index, choosing the most valuable URLs.

Rendering depends only on JavaScript usage. Rendering JS content on one page does not mean Google will be able to execute it at scale. If you have a large site, perform a thorough analysis to ensure that Google can process thousands of pages across it.

Treating prerendering as a silver bullet. Prerendering adds an extra level of complexity as it requires you to maintain two separate structures of your site. Also, it may slow down your server and make the renderer fail.
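To illustrate why prerendering/dynamic rendering means maintaining two versions of the site, here is a rough sketch (my own illustration, assuming an Express server and a hypothetical prerender() helper): bots get a prerendered snapshot, users get the normal JS app, and the two code paths can drift apart while the prerendering step adds server load.

```typescript
// Rough sketch of dynamic rendering (my illustration, not a recommended setup).
// Bots get a prerendered HTML snapshot; users get the regular client-side app.
// These are two separate structures to keep in sync, and the prerendering step
// is the extra work that can slow the server down or make the renderer fail.
import express from "express";

const app = express();
const BOT_UA = /googlebot|bingbot|duckduckbot/i;

// Hypothetical helper: a real setup would render the URL in headless Chrome
// or call a prerendering service, then cache the result.
async function prerender(path: string): Promise<string> {
  return `<!doctype html><html><body><!-- prerendered snapshot of ${path} --></body></html>`;
}

app.get("*", async (req, res) => {
  const ua = req.get("user-agent") ?? "";
  if (BOT_UA.test(ua)) {
    res.send(await prerender(req.originalUrl)); // path 1: crawlers
  } else {
    res.sendFile("index.html", { root: `${process.cwd()}/dist` }); // path 2: regular users
  }
});

app.listen(3000);
```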

Q4 How can one tell if their site is crawlable by Google?

Ensure your critical assets are not blocked in robots.txt. Blocking important JavaScript files may lead to severe rendering and indexing issues. Use the URL Inspection tool or the Mobile-Friendly test to ensure you’re not blocking any resources by mistake.
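As a quick illustration (my own sketch, not an Onely tool), the check below fetches robots.txt and flags critical asset URLs matched by simple Disallow rules. It ignores user-agent groups, Allow precedence, and wildcards, so it complements rather than replaces the URL Inspection tool; the origin and asset paths are placeholders.

```typescript
// Smoke-test sketch (my example): flag critical JS/CSS paths that match a
// plain "Disallow:" prefix rule in robots.txt. It deliberately ignores
// user-agent groups, Allow rules, and wildcards, so use the URL Inspection
// tool or the Mobile-Friendly Test for the authoritative answer.

async function findBlockedAssets(origin: string, assetPaths: string[]): Promise<string[]> {
  const robotsTxt = await (await fetch(`${origin}/robots.txt`)).text();

  const disallows = robotsTxt
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => /^disallow:/i.test(line))
    .map((line) => line.slice(line.indexOf(":") + 1).trim())
    .filter((rule) => rule.length > 0);

  return assetPaths.filter((path) => disallows.some((rule) => path.startsWith(rule)));
}

// The origin and asset paths are placeholders; use your real bundle URLs.
findBlockedAssets("https://example.com", ["/static/js/app.js", "/assets/main.css"])
  .then((blocked) => console.log("Critical assets blocked by robots.txt:", blocked));
```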

Crawl the site with the Googlebot user agent. Are there any important links missing? ➡️ Compare the links you see in the raw HTML with the links visible after rendering. Beware of links relying on JavaScript, as Google may not follow them.
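A minimal sketch of the first half of that comparison (my example, with a placeholder URL): fetch the page with a Googlebot user agent and list the links present in the raw HTML; anything that only appears after JavaScript runs will be missing from this list.

```typescript
// Sketch (my example): fetch a page as Googlebot and list the <a href> links
// present in the raw HTML. Links injected only by JavaScript won't appear
// here; compare the output with what you see after rendering.

const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function linksInRawHtml(url: string): Promise<string[]> {
  const res = await fetch(url, { headers: { "user-agent": GOOGLEBOT_UA } });
  const html = await res.text();

  const links = new Set<string>();
  for (const match of html.matchAll(/<a\b[^>]*href\s*=\s*["']([^"']+)["']/gi)) {
    links.add(match[1]);
  }
  return [...links];
}

// The URL is a placeholder.
linksInRawHtml("https://example.com").then((links) =>
  console.log(`${links.length} links found in the raw HTML`, links)
);
```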

Check how different templates are rendered in the URL Inspection Tool, the Mobile-Friendly Test, or the Rich Results Test:
– Check if all important links and content are in the DOM.
– Ensure your JS files don't negatively affect how Google sees your content.
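For the rendered side of that check, here is a sketch (my example, assuming Puppeteer is available; the URL and the "must contain" phrase are placeholders) that renders a template in headless Chrome and reports whether the important links and content made it into the DOM.

```typescript
// Sketch (my example, assuming Puppeteer is installed): render a page in
// headless Chrome and check that important links and content end up in the
// DOM. The URL and the "must contain" phrase below are placeholders.
import puppeteer from "puppeteer";

async function checkRenderedDom(url: string, mustContain: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" }); // wait for JS to settle

  // Links present after rendering; compare with the raw-HTML list above.
  const renderedLinks = await page.evaluate(() =>
    Array.from(document.querySelectorAll("a[href]")).map(
      (a) => (a as HTMLAnchorElement).href
    )
  );

  // Is the critical content (e.g. a product description) in the rendered DOM?
  const hasContent = (await page.content()).includes(mustContain);

  await browser.close();
  console.log({ url, renderedLinks: renderedLinks.length, hasContent });
}

checkRenderedDom("https://example.com/product/123", "placeholder product description");
```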

Q5 What are your favorite technical SEO tools?

@ziptiedev for indexing checks. I spend hours analyzing the indexing stats of large websites. This is SO much fun. Fun fact: technical SEO agencies' websites tend to be very poorly indexed 😉

For crawls & GSC data I am a @Ryte_EN die-hard fan. 

I love Chrome Dev Tools for anything around Core Web Vitals, critical rendering paths, etc. I learn something new all the time, and I feel like there is still so much more to learn.

I have a ton of nostalgia for @screamingfrog, as it was my first crawler. I use it for "on-the-go" quick checks. We do have a ton of other tools at @OnelyCom, but these 4 are my go-to daily tools to help me get the data I need.

It is a helpful tool. It lets you see a page through Google's Web Rendering Service's eyes. However, it is not as reliable as the URL Inspection Tool. It's optimized for performance (in the sense of delivering a response in a reasonable amount of time), and since Google uses cached resources, the MFI can show failures for files that are otherwise loaded OK in GSC.

My biggest issue with it, though, is that it shows a rather theoretical scenario. Google can render most JavaScript, and that's great! BUT… having talked to hundreds of website owners and devs struggling with JS SEO-related drops, it often leads to the assumption that Google can AND WILL index JS-dependent content. My experience shows that such assumptions cost millions of $ in revenue to some of the world's biggest websites.

Similar to Google recommending (until recently) prerendering/dynamic rendering as a solution to JS-related issues… please don't quote me on this, but I wish Google used "it depends" more when publishing documentation around JS SEO issues 🙂
