Here is what I ended up doing: conditionally load Blazor for bots. Detect crawlers and skip loading the Blazor script entirely for them.
I don't believe this is the best way to use Blazor with SEO-indexed pages, but it works.

```cshtml
@page "/exercises/left/{id:int?}"
@using Keyboardy.Backend.Client.Services
@model Keyboardy.Backend.Pages.Exercises.LeftModel
@{
    // Normalize both headers before matching; some crawlers identify
    // themselves via the From header rather than User-Agent.
    var userAgent = Request.Headers["User-Agent"].ToString().ToLowerInvariant();
    var from = Request.Headers["From"].ToString().ToLowerInvariant();
    var isGoogleBot =
        userAgent.Contains("googlebot") || userAgent.Contains("crawler") || from.Contains("googlebot");
}

<section class="section">
    <h1>This is important information</h1>
    <p>This…
```
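The snippet above only computes the flag; the actual "skip" happens wherever the Blazor script tag is rendered. A minimal sketch of that half, assuming the flag is passed to the layout through a hypothetical `ViewData["IsCrawler"]` key and that the host page references `blazor.server.js` (adjust the script path for your hosting model):

```cshtml
@* In the page: hand the flag to the layout (ViewData key is an assumption). *@
@{
    ViewData["IsCrawler"] = isGoogleBot;
}

@* In _Layout.cshtml: render the Blazor script only for human visitors,
   so crawlers receive the prerendered HTML without the interactive runtime. *@
@if (!(ViewData["IsCrawler"] as bool? ?? false))
{
    <script src="_framework/blazor.server.js"></script>
}
```

With this split, bots still see the server-rendered markup (the `<section>` content above), they just never download or boot the Blazor runtime.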

Answer selected by Abdulrhman5