Dear NextJs Aficionado,

So you want to build a fast, well-optimized website that makes you feel proud and shows the world your high level of NextJs development skills?

Well, you are at the right place!

Optimizing your NextJs project for higher performance, better user experience, and higher conversion rates can be challenging. It won’t be easy and will demand hard work from you, but the result is worth it.

Why?

Because in the end, you will be able to put on the web something better than 95% of everything else. Something that loads fast, looks sleek and attracts free traffic from search engines like a magnet.

It will be based on the trendiest web technologies… And it will demonstrate your superb skill at creating web assets capable of generating profits more easily.

That’s the foundation of every successful online business…

Who wouldn’t get impressed by such things?

If there is anybody like that, obviously they don’t understand anything about the modern web, and we shouldn’t care about their opinion.

So are you ready to start optimizing?


Are you interested in the full-stack story of JavaScript (And Beyond)?

Discover a newsletter about stuff you cannot find in your JavaScript books & courses without reading between the lines. Deep research. No fluff. Original content.

Click here to learn more…

General SEO: The Foundations of a Well-Optimized NextJs Site

Some of the stuff we’ll discuss later will be NextJs-specific. Still, Search Engine Optimization doesn’t depend on the programming language, the framework in use, or any other implementation details.

Most SEO strategies, tactics, and recommendations are just as relevant to a Nextjs website as they are to a Laravel website or a Django website.

In other words, they are universal…

So I congratulate you on your wise decision to improve your web development skills regarding SEO. They will serve you well your whole life.

One of the first things you need to do is lay the foundations of a well-optimized site, and here’s precisely what that means:

1. Install an SSL Certificate

Once upon a time, the web was considered a safer place. Back then, people didn’t need encryption or other security measures because all they did was chat anonymously with friends and write their blogs.

Now we use our credit cards on the web, and you can hardly find a website or a web app that doesn’t collect personal information. Some apps even collect sensitive data like health status, financial information, etc.

So we need to take security measures to protect that information right away. The most basic way to protect it is by installing an SSL Certificate.

It used to be expensive and challenging in the past, when only a handful of trusted companies offered those certificates, but nowadays it’s a piece of cake. What’s more, it can even be free if you’re just starting out and don’t process sensitive data.

When the communication between the browser and the server happens over a secure HTTPS connection, it’s far less likely that it will be intercepted and tampered with somewhere along the way.
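
The certificate itself is usually issued by your hosting platform, your CDN, or a free authority like Let’s Encrypt rather than by NextJs itself. Once HTTPS works, you can additionally tell browsers to keep using it by sending an HSTS header. Here is a minimal sketch (tune the max-age value to your needs):

```js
// next.config.js — a minimal sketch: send a Strict-Transport-Security (HSTS)
// header with every response so browsers stick to HTTPS
module.exports = {
  async headers() {
    return [
      {
        source: '/:path*',
        headers: [
          {
            key: 'Strict-Transport-Security',
            value: 'max-age=63072000; includeSubDomains; preload',
          },
        ],
      },
    ];
  },
};
```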

2. Sitemap XML

Search engine bots discover the pages of your website in 3 general ways:

  • By crawling it page by page and visiting each link they find
  • By visiting your pages through links they found somewhere else on the web
  • By processing a special file you send them, called a sitemap

The sitemap is an XML (Extensible Markup Language) document that lists all website pages.

When you add new pages, the XML sitemap should update automatically, giving search engine bots a reason to come back. There’s no guarantee they will return right away, and there’s no guarantee they will process all URLs in a sitemap.

Still, it’s always a good idea to provide one.
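
If your pages come from a CMS or database, one way to provide the sitemap in the NextJs pages router is to generate the XML on request. Here is a minimal sketch; the domain and the hard-coded slugs are placeholders you would replace with your own data source:

```jsx
// pages/sitemap.xml.js — a minimal sketch; replace BASE_URL and the slug
// list with your real domain and data source
const BASE_URL = 'https://your-domain.com';

function generateSiteMap(slugs) {
  const urls = slugs
    .map((slug) => `<url><loc>${BASE_URL}/${slug}</loc></url>`)
    .join('');
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>${BASE_URL}</loc></url>
  ${urls}
</urlset>`;
}

export async function getServerSideProps({ res }) {
  // In a real project you would fetch these slugs from your CMS or database
  const slugs = ['about', 'blog/nextjs-seo-checklist'];
  res.setHeader('Content-Type', 'text/xml');
  res.write(generateSiteMap(slugs));
  res.end();
  return { props: {} };
}

// The component renders nothing; the XML is written directly to the response
export default function SiteMap() {
  return null;
}
```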

3. Install an Analytics Tool

One of the main benefits of SEO is that it’s performance-based, which means site owners can measure (most of the time) the results of their actions.

To do so, they need to collect data about their visitors – their number, country, pages visited, forms filled, etc. That data is collected by tools like Google Analytics or Matomo Analytics. There are many web analytics solutions.

Your goal is to find out which solution should be implemented and make it work correctly so that the site owners and SEO specialists can access those precious usage stats.
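
As an example, wiring up Google Analytics with the built-in next/script component could look roughly like this (a sketch; G-XXXXXXX stands for your own GA4 measurement ID):

```jsx
// pages/_app.js — a sketch: load the Google Analytics tag after the page
// becomes interactive so it doesn't block rendering
import Script from 'next/script';

export default function MyApp({ Component, pageProps }) {
  return (
    <>
      <Script
        src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"
        strategy="afterInteractive"
      />
      <Script id="ga-init" strategy="afterInteractive">
        {`
          window.dataLayer = window.dataLayer || [];
          function gtag(){dataLayer.push(arguments);}
          gtag('js', new Date());
          gtag('config', 'G-XXXXXXX');
        `}
      </Script>
      <Component {...pageProps} />
    </>
  );
}
```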

4. Set up a Google Search Console Property

One of the main targets of SEO is to make the website attractive for organic traffic from Google. Google’s the biggest search engine in the world, and according to Statista, it has a market share of 85.55%.

That’s huge…

Nobody can afford to ignore The Big G if they want cheaper traffic.

The good news is that Alphabet provides the so-called Google Search Console, a tool that shows stats about organic visits from the search engine, along with plenty of other helpful information that helps us keep our website in good shape.

So it’s no surprise that you need to set up a Google Search Console property for your project.
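
If you choose the HTML-tag verification method, the meta tag Search Console gives you can be rendered on the home page, roughly like this (a sketch; the content value is the token from your own property):

```jsx
// pages/index.js — a sketch; paste the verification token that Google
// Search Console generates for your property
import Head from 'next/head';

export default function Home() {
  return (
    <>
      <Head>
        <meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />
      </Head>
      <main>{/* home page content */}</main>
    </>
  );
}
```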

5. Provide a robots.txt

robots.txt is a file that must be accessible from the root of your domain (https://your-domain.com/robots.txt). It holds special instructions for search engine bots like where they can find a sitemap, which pages they can visit, and more.

It’s a good idea to provide such a file, especially when your website offers private areas and many admin page URLs that shouldn’t be crawled or indexed.

That way, you save hosting resources and help the bot be more efficient.
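
In NextJs, anything you place in the public folder is served from the root of the domain, so a minimal robots.txt could look like this (a sketch; adjust the disallowed paths to your own private and admin areas):

```txt
# public/robots.txt — served at https://your-domain.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /api/

Sitemap: https://your-domain.com/sitemap.xml
```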

6. Proper 404 pages

When a page works as planned, the HTTP status code must be exactly 200. Respectively, if the user requests a page that doesn’t exist, the code must be 404 while the URL remains unchanged.

If there is a server error and the page can’t be served temporarily, then the code must be 500. And if the page is moved away to a new address, there must be a 301 redirect.

Later we will talk more about the proper HTTP Codes.
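
In the NextJs pages router, a custom pages/404.js is picked up automatically and served with a real 404 status code. A minimal sketch:

```jsx
// pages/404.js — NextJs serves this component whenever a route is not found,
// keeping the URL unchanged and returning a 404 status code
export default function NotFound() {
  return (
    <main>
      <h1>404 - Page Not Found</h1>
      <p>The page you are looking for doesn’t exist or has been moved.</p>
    </main>
  );
}
```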

7. WWW or Non-WWW

Usually, you can access your website in two ways – with the www subdomain or without it. I believe it’s a matter of preference which one becomes the primary, but you must choose one of them.

So if you choose the shorter version of your domain (non-www) as the primary, you must redirect (301) all www requests to that primary domain.

Otherwise, search engine bots will find the same content on two separate URL addresses and “think” of it as “duplicate content.”

It’s terrible for your SEO.
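
If you prefer to handle this inside NextJs instead of at the DNS or CDN level, a host-based redirect in next.config.js could look roughly like this (a sketch; swap the host values if you picked the www version as primary):

```js
// next.config.js — a sketch of a permanent www → non-www redirect
module.exports = {
  async redirects() {
    return [
      {
        source: '/:path*',
        has: [{ type: 'host', value: 'www.your-domain.com' }],
        destination: 'https://your-domain.com/:path*',
        permanent: true, // permanent redirect (HTTP 308), treated like a 301 by search engines
      },
    ];
  },
};
```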


Web Performance: Business Impact and Recommendations For Achieving Higher Speed

Good website performance says several important things about you (or about the site owners, in case you aren’t optimizing the site for yourself):

  • Fast websites sell better. For example, when the retailer AutoAnything cut its page load time in half, it saw a 12% to 13% increase in sales.
  • Fast websites attract more traffic. For example, when Pinterest reduced perceived wait times by 40%, search engine traffic and sign-ups increased by 15%.
  • Performant websites generate lower hosting expenses. There aren’t popular case studies on the topic, but it’s conventional wisdom: if your website needs less RAM and uses less CPU time, its operational costs will be lower.

That’s how important it is to create something fast. It will shine in your CV if you do it for somebody else, and it will definitely bring you measurable business results if you do it for yourself.

Let’s see what you can do…

8. Optimize for better Core Web Vitals

You can first analyze selected pages of your website with tools like PageSpeed Insights or BuhalBu’s Kit and find your current CWVs.

If you need more information on what they are, you can find some explanation in my previous article, “Technical SEO Intricacies: What Are The All-Important “Page Speed Metrics”?”

So basically, you run the tool, get suggestions, and then implement them to achieve better numbers. I promise to provide more detailed tutorials on this topic soon.
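
If you also want to see those metrics from real visitors, the pages router lets you export a reportWebVitals function from _app. A minimal sketch that only logs to the console (in production you would send the values to your analytics endpoint instead):

```jsx
// pages/_app.js — NextJs calls reportWebVitals for every metric (LCP, CLS,
// FID/INP, TTFB, etc.) measured in the visitor's browser
export default function MyApp({ Component, pageProps }) {
  return <Component {...pageProps} />;
}

export function reportWebVitals(metric) {
  console.log(metric.name, metric.value);
}
```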

9. Good Lighthouse/PageSpeed Insights/BuhalBu’s Kit Scores

The Core Web Vitals are not the only helpful information you can get from tools like the three mentioned in the subtitle. They will give you many more suggestions for making your website faster and improving its user experience.

They use the same API, so the suggestions they give you will be very similar or, in other words, interchangeable.

Here are the main differences:

  • Google Lighthouse analyzes your web pages on the go and gives you the so-called “Lab Data.” It’s data that is collected during the analysis.
  • Page Speed Insights is one more tool from Google that will show your Core Web Vitals if available. If they’re not available, it will show you Lab Data.
  • BuhalBu’s Kit is my SEO tool that gives you a handy user interface on top of the widely available Page Speed Insights API. It gives you almost the same data, but you can analyze URLs in bulk (which saves time) and allows you to save your reports for referencing in the future (you can even rerun them to get fresh data).

So you put your tool into action and then implement most of the suggestions until all scores get colored in green. (Again, I will provide more detailed tutorials soon.)

10. Set Up a CDN Service

The Wikipedia definition for CDN is the following:

“A content delivery network, or content distribution network (CDN), is a geographically distributed network of proxy servers and their data centers. The goal is to provide high availability and performance by distributing the service spatially relative to end-users. “

Source: Wikipedia

In other words, if you use such a network, many servers around the world cache your web pages. And when your users are geographically close to one of those “proxy servers,” they get the cached content much faster.

So this is one handy trick to improve the website’s speed quickly.
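
For server-rendered pages, you can also hint to the CDN how long it may keep a response cached. A sketch (the s-maxage and stale-while-revalidate values are arbitrary placeholders):

```jsx
// pages/some-page.js — a sketch: the Cache-Control header tells a CDN or
// proxy it may serve the cached response for 60 seconds before revalidating
export async function getServerSideProps({ res }) {
  res.setHeader(
    'Cache-Control',
    'public, s-maxage=60, stale-while-revalidate=300'
  );
  return { props: {} };
}

export default function SomePage() {
  return <h1>Served (and cached) at the edge</h1>;
}
```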

11. Implement gzip/brotli Compression

One way to make your site load faster is by reducing the page weight, so there is less information that travels back as a response to an HTTP request.

The first way to reduce page weight is by trimming the HTML code and its textual content, resizing your images, and leaving some things to be loaded dynamically.

The second way to achieve the same result is by minifying and compressing your assets – HTML documents, CSS files, js files, etc.

Next.js provides gzip compression by default, but there are some additional considerations.

If you use a “reverse proxy” like Nginx or you have a CDN service in place, you may be able to leave the compression to those services and offload it from the Node.js process.
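
In that case, you could switch off the built-in gzip, roughly like this (a sketch; only do it when you are sure the proxy or CDN compresses responses itself):

```js
// next.config.js — a sketch: disable the built-in compression because Nginx
// or the CDN already applies gzip/brotli further upstream
module.exports = {
  compress: false,
};
```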

12. Optimize your NextJs code

Next.Js is a superb React framework that makes our lives way easier.
And precisely because we can do so much stuff so easily, we sometimes create heavy pages.

In a case like that, we can use solutions like dynamic imports, run a bundle analyzer in search of unused code, replace external libraries with custom solutions, and more.
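
A dynamic import could look roughly like this (a sketch; HeavyChart and its path are hypothetical stand-ins for any heavy component in your project):

```jsx
// A sketch of a dynamic import: the heavy component gets its own bundle and
// is only downloaded in the browser when this page actually renders it
import dynamic from 'next/dynamic';

const HeavyChart = dynamic(() => import('../components/HeavyChart'), {
  ssr: false,                     // skip server-side rendering if it isn't needed
  loading: () => <p>Loading…</p>, // lightweight placeholder while the chunk loads
});

export default function ReportPage() {
  return (
    <main>
      <h1>Monthly Report</h1>
      <HeavyChart />
    </main>
  );
}
```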

You need to understand that Next.Js is SEO-friendly, but it’s not SEO-ready. I hope this long article proves that statement.

Titles & Metadata: How to Help Search Engine Bots and Browsers Better Understand Your Web Pages, Part I


As intelligent human beings, we can understand what we look at. We use our past experience and knowledge to make sense of the world. We also use the surrounding context to get a clearer understanding, and we are capable of sophisticated means of communication like talking, writing, and reading to pass information to one another.

The machines are still way behind us…

In some areas, they really excel, but most of the time, they are not capable of the same “level of understanding” as us, so they need a little help.

For example, we can clearly understand the topic of an article by reading just its summary. We can even judge the content’s quality from those couple of sentences.

That’s not so easy for a bot…

It can do orders of magnitude more work and can eventually be an indispensable helper in processing huge amounts of documents, but it can’t “feel” the subtle nuances. It can’t judge what’s true and what’s false, what’s good and what’s bad. It just can’t make sense of the world. Yet.

We help it by providing it with additional information in the form of titles, meta tags, links, and semantic HTML. The first three live in the HEAD tag of HTML Documents, and the last one resides inside the BODY tag.

So if we want all search engine bots to understand our web pages well, we need to do the following:

13. Provide a Unique Title Tag for Each Page

The Title Tag is one of the strongest SEO signals, so it must always be in place. It must be unique, so that no other page on our website (and preferably on the whole web) has the same title.

It must be neither too short nor too long (15-70 characters).

Also, it’s a good idea to implement a way to edit page titles separately from the document’s headline, because the latter may be way longer than 70 characters, and the two shouldn’t be identical.

14. Provide a Unique Meta Description

The meta description tag is not a major SEO signal, but it helps humans and bots better understand what your web page is about, so they can make an informed decision about whether to visit the page.

That’s why every page must have a unique meta description between 60 and 320 characters. The content of the tag should be provided by the page’s author and not generated automatically, although there are cases where some automation is unavoidable.
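
With the pages router, both the title (point 13) and the meta description can be set per page through next/head. A sketch, assuming the post object and its hand-written SEO fields come from your own data source:

```jsx
// pages/blog/[slug].js — a sketch; `post` is assumed to be fetched in
// getStaticProps/getServerSideProps and to carry editable SEO fields
import Head from 'next/head';

export default function BlogPost({ post }) {
  return (
    <>
      <Head>
        <title>{post.seoTitle}</title>
        <meta name="description" content={post.seoDescription} />
      </Head>
      <article>
        {/* the headline may be longer than the 70-character title */}
        <h1>{post.headline}</h1>
        {/* ... */}
      </article>
    </>
  );
}
```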

15. Proper Char Encoding

Nowadays, UTF-8 is the web standard for character encoding. It covers most alphabets and many other characters, like emoji.

16. No Meta Refresh Tags

Using a meta refresh tag is an outdated way to update stale content on a web page or force the page to redirect to another URL. It’s not SEO-friendly, and it’s strongly advisable never to use it.

There are better ways to get the same results.

17. Viewport Tag

As described in the docs:

“The browser’s viewport is the area of the window in which web content can be seen. This is often not the same size as the rendered page, in which case the browser provides scrollbars for the user to scroll around and access all the content.

Some mobile devices and other narrow screens render pages in a virtual window or viewport, which is usually wider than the screen, and then shrink the rendered result down so it can all be seen at once. Users can then pan and zoom to see different areas of the page. For example, if a mobile screen has a width of 640px, pages might be rendered with a virtual viewport of 980px, and then it will be shrunk down to fit into the 640px space.”

source: developer.mozilla.org

So every web page must have a Viewport Tag if we want it to be properly served on mobile devices.

Mobile devices generated 54% of web traffic in 2022, which is a big reason to make our websites mobile-friendly.
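
NextJs normally injects a default viewport meta tag on its own, so you only need to touch it if you want a custom value. In that case, the usual place is _app (a sketch):

```jsx
// pages/_app.js — a sketch: override the default viewport tag only if you
// really need a custom value
import Head from 'next/head';

export default function MyApp({ Component, pageProps }) {
  return (
    <>
      <Head>
        <meta name="viewport" content="width=device-width, initial-scale=1" />
      </Head>
      <Component {...pageProps} />
    </>
  );
}
```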

18. Provide a lang attribute

It’s a way to tell machines what language the content on the page is in. Not that they can’t figure it out on their own, but it’s better to help them out.
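
In the pages router, the lang attribute is typically set on the <Html> element in a custom _document. A minimal sketch (use your content’s actual language code):

```jsx
// pages/_document.js — a sketch of a custom Document that sets lang="en"
import { Html, Head, Main, NextScript } from 'next/document';

export default function Document() {
  return (
    <Html lang="en">
      <Head />
      <body>
        <Main />
        <NextScript />
      </body>
    </Html>
  );
}
```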

19. Doctype Declaration

There was a time when we needed to choose between several versions of HTML. There was HTML 4.01, HTML 5, and XHTML (which I really liked).

Thank god we now have the “living standard,” and life is much easier.

But still…

Let’s put that Doctype declaration in place.

20. Provide a Canonical Link

“If you have a single page that’s accessible by multiple URLs, or different pages with similar content (for example, a page with both a mobile and a desktop version), Google sees these as duplicate versions of the same page. Google will choose one URL as the canonical version and crawl that, and all other URLs will be considered duplicate URLs and crawled less often.

If you don’t explicitly tell Google which URL is canonical, Google will make the choice for you, or might consider them both of equal weight, which might lead to unwanted behavior, as explained in Reasons to choose a canonical URL.”

Source: Google
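
With next/head, a canonical link can be rendered on every page, roughly like this (a sketch; the domain and the slug prop are placeholders for your own routing):

```jsx
// A sketch: each page declares the single preferred URL for its content
import Head from 'next/head';

export default function ArticlePage({ slug }) {
  const canonicalUrl = `https://your-domain.com/kb/${slug}`;
  return (
    <>
      <Head>
        <link rel="canonical" href={canonicalUrl} />
      </Head>
      {/* page content */}
    </>
  );
}
```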


Content Structure: How to Help Search Engine Bots and Browsers Better Understand Your Web Pages, Part II

As I mentioned, bots need help to understand the content better. That’s why we provide metadata, style sheets, scripts, etc.

But all this is not enough…

The content must be formatted and tagged correctly.

That includes…

21. A Unique H1 headline

Each page of the website must have precisely one H1 headline, and it must be unique sitewide. It should be neither too short nor too long. There are no specific recommendations for its length, but it must be long enough to communicate the page’s topic.

22. Proper Headline Hierarchy

It’s also essential that the rest of the headline tags (H2-H6) are used appropriately to create a hierarchical structure of topics and subtopics. (Screen readers really depend on that, too.)

In the past, those tags were often used for styling purposes, which is no longer advisable. If a tag isn’t denoting a subtopic of the main content, it must not be used just to make a title bolder or bigger.

23. HTML Semantic Elements

Elements like <div> or <span> don’t imply any additional information about their content, so they are generic, non-semantic elements. On the other hand, elements like <nav>, <footer>, <main>, etc. have a specific role in the overall document structure, so they are semantic elements.

Semantic elements help search engine bots understand the content.
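
To tie points 21, 22, and 23 together, here is a rough sketch of a semantically structured page (the headlines and content are placeholders):

```jsx
// A sketch: one unique H1, H2s for the subtopics, and semantic elements
// instead of anonymous <div> wrappers
export default function ArticleLayout() {
  return (
    <>
      <header>
        <nav>{/* site navigation */}</nav>
      </header>
      <main>
        <article>
          <h1>NextJs SEO Checklist</h1>
          <section>
            <h2>General SEO</h2>
            <p>…</p>
          </section>
          <section>
            <h2>Web Performance</h2>
            <p>…</p>
          </section>
        </article>
      </main>
      <footer>{/* site footer */}</footer>
    </>
  );
}
```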

24. No Mixed Content

“Mixed content occurs when initial HTML is loaded over a secure HTTPS connection, but other resources (such as images, videos, stylesheets, scripts) are loaded over an insecure HTTP connection.”

Source: web.dev

This means there should be no links, images, videos, or anything else that loads over an insecure HTTP connection.

25. Well-coded Anchors

There’s no upper limit on the number of anchors a webpage can have in its body, but those anchors should be carefully curated: all of them must point to secure and safe URL addresses.

All of them must have meaningful anchor text describing the target page’s content. And if an anchor points to an external webpage, its rel attribute must include the value “noopener.”
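
A sketch of what that might look like on a NextJs page (the URLs are placeholders; note that NextJs versions before 13 expect an <a> element inside <Link>):

```jsx
// A sketch: next/link for internal pages, and a descriptive,
// noopener-protected anchor for the external link
import Link from 'next/link';

export default function FurtherReading() {
  return (
    <ul>
      <li>
        <Link href="/kb/nextjs-seo/core-web-vitals">
          How to improve your Core Web Vitals
        </Link>
      </li>
      <li>
        <a
          href="https://web.dev/vitals/"
          target="_blank"
          rel="noopener noreferrer"
        >
          Google’s overview of Web Vitals
        </a>
      </li>
    </ul>
  );
}
```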

Proper URLs: How to Help Search Engine Bots and Browsers Better Understand Your Web Pages, Part III

Every resource on the web has a unique identifier that helps us find and access it – its URL. So those URLs must be structured in an SEO-friendly way.

Moreover, search engine bots analyze the page’s URL to get more context about its content.

That’s why…

26. Allow URL Management by Users

There are various requirements like these:

  • The target keyword must be included in the URL
  • The URL mustn’t include underscores
  • The URL mustn’t have too many URL parameters
  • The URL must be as short as possible

Many of the rules for SEO-friendly URLs can’t be applied automatically, so users need a way to apply them manually.
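
A small helper like the one sketched below could suggest an SEO-friendly default that the author can still overwrite in your CMS or admin UI (suggestSlug is a hypothetical name, not part of NextJs):

```js
// A sketch: turn a title into a lowercase, hyphen-separated slug suggestion
export function suggestSlug(title) {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9\s_-]/g, '') // drop punctuation and special characters
    .replace(/[\s_]+/g, '-')       // spaces and underscores become hyphens
    .replace(/-+/g, '-')           // collapse repeated hyphens
    .replace(/^-|-$/g, '');        // trim leading/trailing hyphens
}

// suggestSlug('NextJs SEO: The Complete Checklist!')
// => 'nextjs-seo-the-complete-checklist'
```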

27. No Overly Long URLs

Generally, all URLs should be as short as possible, but sometimes the underlying structure, put in place by the web developer, adds bloat.

Bad Example:

https://buhalbu.com/knowledgebase/category/nextjs-seo/article/some-really-long-automatic-article-slug-thats-not-seo-friendly

Good Example:

https://buhalbu.com/kb/nextjs-seo/some-article-slug

28. No Duplicates

Every URL must point to a unique page on the website. If two URLs load the same content, it’s considered “duplicate content,” which can lead to negative consequences.

This point can be seen as an extension of point 7 (General SEO), where we redirected the www subdomain.

Image Optimization: The Big Impact Elements On The Page

According to the HTTP Archive, images make up more than 60% of the data loaded on web pages, so their file size significantly impacts page loading speed. If you want a faster website, use smaller images.

Next.Js offers a convenient React component for images that automatically applies many of the best SEO practices regarding image resources.

Still, bear in mind the following:

29. Optimize The Largest Contentful Paint

“You should add the priority property to the image that will be the Largest Contentful Paint (LCP) element for each page. Doing so allows Next.js to specially prioritize the image for loading (e.g. through preload tags or priority hints), leading to a meaningful boost in LCP.

The LCP element is typically the largest image or text block visible within the viewport of the page. When you run next dev, you’ll see a console warning if the LCP element is an <Image> without the priority property.”

source: nextjs.org
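
In practice, that could look roughly like this (a sketch; the hero image path and dimensions are placeholders):

```jsx
// A sketch: the hero image is usually the LCP element, so it gets the
// priority prop, while below-the-fold images keep the default lazy loading
import Image from 'next/image';

export default function HomePage() {
  return (
    <main>
      <Image
        src="/images/hero.jpg" // a hypothetical file in /public/images
        alt="A developer optimizing a NextJs website"
        width={1200}
        height={630}
        priority
      />
      {/* ...the rest of the page... */}
    </main>
  );
}
```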

30. Don’t Overdepend On The Image Component

This component is so handy that we often surrender to the idea that it can handle whatever images we pass to it.

That’s far from the truth…

If you don’t process your images beforehand, normalizing them and persisting the optimized copies, Next.Js will re-optimize them after every deployment on a serverless infrastructure, which will cost you a lot of CPU time.
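
One possible approach, sketched below, is a small one-off script that uses the sharp library (a separate dependency, not part of Next.Js) to resize the originals once and persist the optimized copies into the public folder:

```js
// scripts/optimize-images.js — a sketch; run it before deployment so the
// optimized copies are persisted instead of being regenerated every time
const sharp = require('sharp'); // assumption: installed separately (npm i sharp)
const fs = require('fs');
const path = require('path');

const INPUT_DIR = 'raw-images';     // hypothetical folder with the originals
const OUTPUT_DIR = 'public/images'; // optimized copies served by NextJs

fs.mkdirSync(OUTPUT_DIR, { recursive: true });

for (const file of fs.readdirSync(INPUT_DIR)) {
  const name = path.parse(file).name; // keep the descriptive filename (see point 32)
  sharp(path.join(INPUT_DIR, file))
    .resize({ width: 1600, withoutEnlargement: true }) // cap the dimensions
    .webp({ quality: 80 })                             // persist a smaller format
    .toFile(path.join(OUTPUT_DIR, `${name}.webp`));
}
```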

31. Allow Users To Manage the Image’s Alt Attribute

One of the age-old SEO rules says that every IMG tag must have a proper alt attribute defined. Several more rules related to this attribute guide the user on what information to provide as a valid text alternative to the actual image.

So there must be a mechanism that allows manual input of the alt attribute for every image.

32. Don’t Play With The Filenames

Once upon a time, replacing the filename with something that’s automatically generated, like a combination of the image’s MD5 hash and the upload date, was trendy.

Nowadays, filenames should include the target keywords as an additional hint for search engine bots about the page’s topic, so it’s considered a bad SEO practice to replace a descriptive filename with an auto-generated one.

33. Drink a Beer… You made it!

It’s time to celebrate! Call your friends and go have some fun.

Outro: Checklist Completed, A Website Optimized

For the last several minutes, we talked about 33 things you must bear in mind when optimizing a nextjs project for search engines. I put all of my focus on the technical side of optimization (NextJs Technical SEO).

Besides these 33 optimization techniques, there is at least as much optimization work related to building backlinks, PR activities, local SEO, etc.

We’ll talk about all this in future articles.

I consider the technical part to have the biggest priority because if your website doesn’t work correctly and isn’t structured well, that’s a huge SEO drawback that hinders its success.

Something that now has no chance of happening if you complete this checklist.

Perhaps I proved my point that NextJs SEO isn’t easy and requires work, but once you complete the list, you will have a fast, well-optimized website that really shines in your CV or portfolio.

Cheers,
Sashe Vuchkov