React is a JavaScript library that helps developers build responsive websites with pleasant UIs. However, SEO (Search Engine Optimization) for React sites initially didn’t work well, mainly because search engines had trouble rendering JavaScript. To address this, the React team and Google worked on simplifying the crawling and rendering of React webpages, though some React SEO problems still remain. As a React developer, you should be able to manage the SEO issues in your React website.
Before we get into the details of the challenges in React SEO and how to solve them, we need to understand how the Google search engine actually processes pages.
How does Google process webpages?
♦ Crawling
Every search engine has a crawler that scans web pages and sites; Google’s crawler is Googlebot. It sends GET requests to a server for the URLs in the crawl queue and saves the response contents. Googlebot does this for HTML, JS, CSS, image files, and so on. It also crawls sitemaps and webpages submitted by managed web hosts.
♦ Processing
URLs found within <a href> links in the HTML are added to the crawl queue, along with resource URLs (CSS/JS) found within <link> tags and images within <img src> tags. If Googlebot finds a noindex tag at this point, the process stops: the bot won’t render the content, and Caffeine, Google’s indexer, won’t index it.
♦ Indexing
When Googlebot finds new webpages, it needs to understand their content to recognize the SEO-worthiness of each page. Text is the most straightforward content for Googlebot to understand, followed by images and videos. This underlines the importance of suitable titles, headings, meta descriptions, and topical content in making your website more SEO-friendly.
♦ Rendering
Googlebot runs JS code with a headless Chromium browser to find content that appears in the DOM but not in the HTML source. It does this for every HTML URL.
♦ Ranking
Once Googlebot reports back to Google, the last step is to rank your website based on how relevant your content is to users’ queries. Google uses this ranking database to display search results whenever a user searches for a specific keyword. This shows why you need to optimize your site’s content not only for humans but for crawling bots as well. And this is where React and SEO begin clashing.
Why is React SEO challenging?
Optimizing React websites is challenging because of the fundamental nature of React’s JavaScript code and the way it renders webpages. Several issues emerge while optimizing it.
♦ Loading time
Rendering JavaScript is CPU-intensive, so it takes more time to display on the client side compared to plain HTML. JavaScript also needs to make network calls to fetch content from the server, which further increases the load time for users. Googlebot notices this delay in fetching the requested information, and this influences the ranking of your website.
♦ Crawl Delay
Because React relies heavily on JavaScript, Googlebot often encounters empty content in its first crawl. React uses an app shell model, so the initial HTML visible to the bot doesn’t contain any useful information; Googlebot needs to run JS to make the content of the page visible. This delays the indexing stage, and the delay can become significant when a website has numerous pages.
♦ Metatags
Meta tags let Google and other websites generate accurate headings and descriptions for a webpage. However, these crawlers don’t run JS for each webpage; they just refer to the <head> tag of the page to fetch the metadata. Because React renders all content, including meta tags, on the client side, it’s hard to provide accurate metadata for individual pages.
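One way around this is to produce the <head> contents on the server, per route. The sketch below is purely illustrative (the route map, titles, and `renderHeadTags` helper are hypothetical; many teams use a library such as react-helmet instead), but it shows the idea: crawlers get correct metadata without executing any JavaScript.

```javascript
// Hypothetical per-route metadata map for a sample site.
const routeMeta = {
  '/': { title: 'Home | Sample Store', description: 'Welcome to Sample Store.' },
  '/products': { title: 'Products | Sample Store', description: 'Browse our catalog.' },
};

// Build the <head> fragment for a route so crawlers that only read the
// initial HTML still see an accurate title and description.
function renderHeadTags(path) {
  const meta = routeMeta[path] || routeMeta['/'];
  return [
    `<title>${meta.title}</title>`,
    `<meta name="description" content="${meta.description}">`,
  ].join('\n');
}
```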
♦ Sitemap
A sitemap is a file that lists the webpages, images, videos, and other content on your website, along with the relationships between them. Google uses the sitemap file to crawl your website. React has no built-in way to generate a sitemap, so additional tools are required.
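A sitemap is simple enough that you can also generate one yourself from your route list at build time. The following is a minimal sketch (the domain and routes are placeholders) that emits XML in the sitemaps.org format:

```javascript
// Build a minimal sitemap.xml string from a list of routes.
// In a real build step you would write this to the public folder.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map((route) => `  <url><loc>${baseUrl}${route}</loc></url>`)
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls + '\n</urlset>';
}

const xml = buildSitemap('https://sample.com', ['/', '/about', '/products']);
```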
General issues in React SEO
Here are some of the general issues in React SEO.
♦ Indexing
As mentioned earlier, Googlebot first fetches the HTML information from the queue and defers the related JavaScript execution until later. This adds to the time taken to index each web page, impacting the website’s SEO score.
♦ Managing errors while rendering
HTML and JavaScript take different approaches to handling errors during rendering. The JavaScript parser is intolerant of even the slightest error, so indexing fails instantly: script execution stops as soon as the parser encounters the error, and Googlebot indexes the page as a blank page.
♦ GoogleBot’s Crawl Budget and Crawling Limit
The crawl budget refers to the maximum number of pages that search engine bots can crawl in a specific period of time, and Google caps the number of fetches for any given website. Many websites built with JavaScript are not indexed because of response time: React websites respond more slowly, so Googlebot moves on to crawl the next website.
♦ Indexing of Single Page Applications
React helps in building Single Page Applications (SPAs), where a single page is loaded once and other necessary details are dynamically loaded when required. Compared to standard multi-page apps, SPAs are fast, responsive, and give users a smooth, linear experience. However, they have some limitations in terms of SEO. SPAs deliver content after the page has already loaded, so if a bot crawls the page before the content is loaded, it will treat it as an empty page. This means a significant part of the site won’t get indexed, and your website’s SEO ranking will suffer.
Challenges faced in React SEO
There are a few more challenges in React SEO: rendering lists and wasted renders.
♦ Rendering Lists
React has some issues with rendering long and complex lists. Imagine hundreds of images on an e-commerce website that need to adapt to multiple devices. Such lists take a long time to load, and the delay is quite noticeable, especially on low-end systems.
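The usual fix is list virtualization (as provided by libraries such as react-window): render only the rows currently visible in the viewport, plus a small buffer, instead of all items at once. The core arithmetic is simple; the sketch below assumes fixed-height rows and hypothetical parameter values:

```javascript
// Compute which list items should actually be rendered for the current
// scroll position. "overscan" adds a few extra rows above and below the
// viewport so fast scrolling doesn't show blank gaps.
function visibleRange(scrollTop, viewportHeight, itemHeight, itemCount, overscan = 2) {
  const first = Math.max(0, Math.floor(scrollTop / itemHeight) - overscan);
  const last = Math.min(
    itemCount - 1,
    Math.ceil((scrollTop + viewportHeight) / itemHeight) + overscan
  );
  return { first, last };
}
```

With 1,000 rows of 50px in a 600px viewport, only about 17 rows are mounted at any time instead of all 1,000.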
♦ Wasted Renders
React developers often render complex data structures, like group chats containing large collections of text, images, and message excerpts that must be re-rendered. These re-renders can cause a noticeable drop in app performance for end users: when numerous list items are rendered again and again, it places an unnecessary drain on device resources like the processor and battery. The application can visibly slow down, which users experience as lag.
So, these were some challenges in SEO for React. Here are some solutions for improving SEO in React.
Improving React SEO
♦ Pre-Rendering
You can improve React SEO with pre-rendering. The concept refers to intercepting crawling requests from bots and sending a cached, pre-rendered static HTML version of your site. When the request comes from a user rather than a bot, the normal client-side page is served.
Benefits of using pre-rendering for your React website:
- Compatibility with the newest web features
- Handles different types of modern JavaScript and transforms them into static HTML
- Minimal or no change to the codebase
- Simple implementation
♦ Server-Side Rendering (SSR)
If you use server-side rendering, Googlebot can easily crawl your website from its HTML document, because server-side rendering delivers all the webpage content in the HTML file sent from the server to the client. SSR is a great way to boost React SEO.
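The final step of SSR is injecting the markup your components produce into the app shell before the server responds. In a real app the markup comes from ReactDOMServer.renderToString; here it is a placeholder string so the sketch stays self-contained:

```javascript
// App-shell template; "<!--app-->" marks where the rendered React
// markup is spliced in. The client bundle later hydrates #root.
const template = '<html><body><div id="root"><!--app--></div></body></html>';

// Produce the full HTML response. appHtml stands in for the output of
// ReactDOMServer.renderToString(<App />).
function renderPage(appHtml) {
  return template.replace('<!--app-->', appHtml);
}
```

Because the response already contains the populated markup, crawlers that never execute JavaScript still see the real content.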
♦ Build a static or dynamic web app
Static and dynamic web apps use server-side rendering, which helps crawlers like Googlebot index webpages with ease. Although SPAs are richer in content and can deliver a more polished user experience, you can still achieve a lot with static and dynamic web applications.
Static apps are suitable for building landing pages, and dynamic apps help in building sites like marketplaces. Also, if you want to utilize SSR for your SPAs, you can do it by using the Next.js JavaScript framework. It helps create static web apps and works well with rich SPAs as well.
SPAs commonly encounter problems such as missing 3xx redirects (JavaScript redirects are used instead) and 4xx status codes not being returned for “not found” URLs. These errors can lead Google to index faulty pages, or make it difficult for internal SEO audits to catch pages producing 404 errors.
Next.js can help solve these issues by letting you set the status code you want, including 3xx redirects and 4xx status codes.
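In Next.js, this is expressed through the object returned from getServerSideProps: returning `{ notFound: true }` makes the page respond with a real 404, and `{ redirect: ... }` issues a genuine 3xx redirect. The data lookup below is hypothetical; only the shape of the returned object follows the Next.js convention:

```javascript
// Decide the response for a product page the way a Next.js
// getServerSideProps function would. "product" stands in for the
// result of a hypothetical database or API lookup.
function getServerSidePropsResult(product) {
  if (!product) {
    return { notFound: true }; // Next.js responds with HTTP 404
  }
  if (product.movedTo) {
    // Next.js issues a permanent 3xx redirect to the new URL
    return { redirect: { destination: product.movedTo, permanent: true } };
  }
  return { props: { product } }; // normal 200 render
}
```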
♦ Avoid using Hash
It might not be a significant issue, but avoiding hashed URLs is always recommended.
https://sample.com/#/example
The problem with hashed URLs like this is that Google doesn’t see anything beyond the hash. SPAs with client-side routing can use the History API to serve such pages under clean paths instead. React Router and Next.js can help you with this smoothly.
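Everything after “#” is a URL fragment, which crawlers ignore. The helper below (an illustrative sketch, not from any router) shows how a hash route maps onto the clean path a History-API-based router would use:

```javascript
// Convert a hash-routed URL like https://sample.com/#/example into the
// equivalent clean path URL a crawler can actually see.
function hashToPath(hashUrl) {
  const url = new URL(hashUrl);
  if (!url.hash.startsWith('#/')) return hashUrl; // nothing to convert
  return url.origin + url.hash.slice(1); // drop the '#', keep the path
}
```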
♦ Use href links
SPAs often trigger navigation with <div> or <button> elements rather than <a href> links. This practice can hurt your React SEO score: when Googlebot crawls a URL, it looks for more URLs to crawl inside the href attributes of <a> elements. If it doesn’t find an <a href> link, it won’t crawl those URLs at all. So, for every URL you want Google to discover, include <a href> links to it on your site.
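A rough illustration of what the crawler can discover: the extractor below (a simplified regex sketch, nothing like Googlebot’s real parser) pulls URLs only from <a href> attributes, so a click handler on a <div> contributes nothing.

```javascript
// Collect the href values of <a> tags in an HTML string. Click
// handlers on <div>/<button> elements are invisible to this pass,
// just as they are to a crawler's link discovery.
function extractCrawlableLinks(html) {
  const links = [];
  const re = /<a\s[^>]*href="([^"]+)"/gi;
  let match;
  while ((match = re.exec(html)) !== null) links.push(match[1]);
  return links;
}

const sampleHtml =
  '<a href="/products">Products</a>' +
  '<div onclick="goTo(\'/hidden\')">Hidden</div>';
```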
Wrapping up the article
These were some crucial React SEO recommendations from Webbybutter. Of course, there are many other aspects to consider when working on SEO for a website. You should follow SEO best practices such as XML sitemaps, mobile-first development, semantic HTML, and others. The skilled professionals at our company can handle this for you. Contact us for more details about SEO strategies and how you can improve React SEO.