Hello Blazor Team
A few days ago I wrote this answer https://github.com/aspnet/Blazor/issues/1299#issuecomment-412340860 to one of the questions. Now I'm thinking about the features Blazor already offers and its current limitations. We all know that Blazor is not perfect yet, but in my opinion it is good enough to create first experimental projects and publish them to show people around the world how wonderful it is. There is one big missing feature - server-side rendering aspnet/AspNetCore#5464. A web site that is not available for indexing by Google/Bing/Yahoo robots simply does not exist on the internet.
Some features mentioned in https://github.com/aspnet/Blazor/issues/1299#issuecomment-412340860 are not implemented at all or have bugs/limitations, but in basic, experimental projects we can try to find a workaround or simply not use them. Unfortunately there is no workaround for SSR and SEO. If possible, please prioritize this task.
@Andrzej-W I don't think this is still the case, as the big search engine bots now crawl SPAs without problems:
https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html
An article about the Bing/Yahoo case: https://www.screamingfrog.co.uk/bing-javascript/
My opinion about SSR: it is a nice feature, but I don't think it should be a priority. I would add that a SPA framework is mostly for applications with high user interaction (apps that 15 years ago would have been desktop apps). For content-centric applications that need SEO there are better choices, like ASP.NET Core MVC or Razor Pages.
I have to disagree, Blazor will be the next thing. It will replace ASP.NET MVC and ASP.NET Core MVC.
I agree with @GoranHalvarsson - I also want to replace my ASP.NET Core applications with Blazor.
@RemiBou there is a lot of interactivity in "standard" web applications:
All those sites have to have good SEO. Search engines still have problems with dynamic sites and WebAssembly is not the same as pure JavaScript. I have searched for "How is Blazor working for you" in Google and found these 3 sites:
http://www.arvidnyden.com/
https://www.flis.io/
https://www.gabrielrasdale.com/
Now try to search for:
site:https://www.gabrielrasdale.com/ forecast
and Google returns nothing. Here
site:https://www.flis.io/ forecast
also nothing.
This one works:
site:https://www.flis.io/ Interop
Other search engines tested (Bing, Ask.com, Yandex) do not work at all.
Please remember that good SEO also means correct elements in <head>, like:
<title>My Blazor site</title>
<link rel="alternate" hreflang="pl" href="https://example.com/pl">
<link rel="alternate" hreflang="en" href="https://example.com/en">
<meta name="description" content="Check my beautiful site immediately">
to name a few. Do you want to have good integration with social media sites? You have to have these elements in <head>:
<meta property="og:description" content="...">
<meta property="og:locale" content="...">
<meta property="og:site_name" content="...">
<meta property="og:title" content="...">
<meta property="og:image" content="...">
<meta property="og:url" content="...">
to name a few. Don't expect that Facebook or Twitter will run your app locally and wait until you insert those tags dynamically. We really need server-side rendering and we must be able to generate correct metatags for each page (strictly speaking, for each route, because we need different metatags for example.com/product/1 and example.com/product/2).
@Andrzej-W Thanks for your interest in Blazor and for all you've been doing to support the Blazor community. Server-side prerendering is absolutely something we want to do. It's just not a priority right now because we have more basic issues to tackle with the component model and the .NET runtime (download size, performance, handling basic user interactions, etc.). I think we understand the need for this feature and we are already tracking it with aspnet/AspNetCore#5464, so I'm going to go ahead and close this issue. I hope that's ok!
For anyone that comes across the lack of server-side rendering, this is a class I just wrote to work around the issue, which was preventing my site from being crawled by AdSense.
I have confirmed that it works (using the Google Search Console) and it does add some, but not a significant amount of, overhead to crawling response times. It might be better to do something similar but prerender all of the content, store it in a folder and serve it upon request by a crawler - I dunno; a rough sketch of that idea follows the middleware code below.
It is also faster if you point HostURI at localhost.
It supports all known crawlers inside the JSON data file from https://github.com/monperrus/crawler-user-agents.
It depends on the NuGet package Selenium.WebDriver v3.141.0 and you also must have Chrome installed on the target machine.
Usage goes something like this, where you set the MagicWord string to something you expect to exist in the HTML after rendering is complete (such as a header or footer ID found in the rendered Blazor components):
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    SeleniumRenderMiddleware.MagicWord = "language_bar";
    app.UseMiddleware<SeleniumRenderMiddleware>();
    app.UseServerSideBlazor<App.Startup>();
}
```C#
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Net;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.JSInterop;
using OpenQA.Selenium.Chrome; // NuGet package: Selenium.WebDriver v3.141.0

namespace SeleniumRender
{
    // Matches the JSON structure of crawler-user-agents.json, hence the lower-case property names.
    public class UserAgents
    {
        public string pattern { get; set; }
        public string url { get; set; }
        public string[] instances { get; set; }
        public string addition_date { get; set; }
        public string description { get; set; }
    }

    public class SeleniumRenderMiddleware
    {
        // GOOGLE CHROME MUST BE PRE-INSTALLED ON THE TARGET MACHINE FOR THIS TO WORK.
        private static Dictionary<string, string> s_userAgents = new Dictionary<string, string>();
        private static ChromeOptions s_chromeOptions = new ChromeOptions() { AcceptInsecureCertificates = true };
        private static ChromeDriver s_chromeDriver;

        // Pick the chromedriver archive that matches the host OS.
        private static string s_chromeDriverFilename
        {
            get
            {
                if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows)) return "chromedriver_win32.zip";
                else if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX)) return "chromedriver_mac64.zip";
                else if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux)) return "chromedriver_linux64.zip";
                throw new Exception("Are you running OS/2 Warp?!");
            }
        }

        private static string s_chromeVersion = "73.0.3683.20";
        public static string HostURI;
        public static string MagicWord = "0xDEADBEEF";
        private RequestDelegate _next;

        static SeleniumRenderMiddleware()
        {
            string chromeDriverLocation = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
            string agentsPath = Path.Combine(chromeDriverLocation, "agents.json");

            using (var client = new WebClient())
            {
                // Download the crawler user-agent list once.
                if (File.Exists(agentsPath) == false)
                    client.DownloadFile("https://raw.githubusercontent.com/monperrus/crawler-user-agents/master/crawler-user-agents.json", agentsPath);

                // Download and extract chromedriver once (on Windows the binary is chromedriver.exe).
                if (File.Exists(Path.Combine(chromeDriverLocation, "chromedriver.exe")) == false)
                {
                    byte[] zipFile = client.DownloadData($"https://chromedriver.storage.googleapis.com/{s_chromeVersion}/{s_chromeDriverFilename}");
                    using (var ms = new MemoryStream(zipFile))
                    {
                        ms.Seek(0, SeekOrigin.Begin);
                        ZipArchive archive = new ZipArchive(ms);
                        archive.ExtractToDirectory(chromeDriverLocation);
                    }
                }
            }

            // Index every known crawler user-agent string for a fast per-request lookup.
            foreach (var userAgent in Json.Deserialize<UserAgents[]>(File.ReadAllText(agentsPath)))
                foreach (var instance in userAgent.instances)
                    s_userAgents.Add(instance, userAgent.pattern);

            s_chromeOptions.AddArgument("headless");
            s_chromeDriver = new ChromeDriver(chromeDriverLocation, s_chromeOptions);
        }

        public SeleniumRenderMiddleware(RequestDelegate next) => _next = next;

        public async Task Invoke(HttpContext context)
        {
            // Normal visitors go straight to Blazor; only known crawlers get the Selenium-rendered page.
            if (s_userAgents.ContainsKey(context.Request.Headers["User-Agent"]) == false)
                await _next.Invoke(context);
            else
            {
                if (HostURI == null)
                    HostURI = (context.Request.IsHttps ? "https://" : "http://") + context.Request.Host.Value;

                // Assigning Url makes the headless browser navigate to the requested path.
                s_chromeDriver.Url = HostURI + context.Request.Path;
                s_chromeDriver.Navigate();

                // Wait until the MagicWord shows up in the rendered markup, then return it.
                while (s_chromeDriver.PageSource.Contains(MagicWord) == false)
                    await Task.Delay(100);

                context.Response.ContentType = "text/html";
                await context.Response.WriteAsync(s_chromeDriver.PageSource);
            }
        }
    }
}
```
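As a rough sketch of the caching idea mentioned above (the PrerenderCache folder name, the 24-hour lifetime and the GetOrRenderPageAsync helper are my own assumptions, not part of the middleware as posted), the crawler branch of Invoke could serve from disk and only fall back to Selenium on a cache miss:

```C#
// Sketch only: members that would live inside SeleniumRenderMiddleware.
// Serves a cached copy of the prerendered HTML and re-renders when the entry is missing or stale.
private static readonly string s_cacheDir =
    Path.Combine(Path.GetDirectoryName(Assembly.GetEntryAssembly().Location), "PrerenderCache");

private static async Task<string> GetOrRenderPageAsync(string hostUri, PathString path)
{
    Directory.CreateDirectory(s_cacheDir);

    // Derive a safe file name from the request path ("/product/1" -> "product_1.html").
    string fileName = string.IsNullOrEmpty(path.Value) || path.Value == "/"
        ? "index.html"
        : path.Value.Trim('/').Replace('/', '_') + ".html";
    string cachePath = Path.Combine(s_cacheDir, fileName);

    // Assumed cache lifetime of 24 hours.
    if (File.Exists(cachePath) &&
        DateTime.UtcNow - File.GetLastWriteTimeUtc(cachePath) < TimeSpan.FromHours(24))
    {
        return await File.ReadAllTextAsync(cachePath);
    }

    // Cache miss: render with the same headless Chrome instance and store the result.
    s_chromeDriver.Url = hostUri + path;
    while (s_chromeDriver.PageSource.Contains(MagicWord) == false)
        await Task.Delay(100);

    string html = s_chromeDriver.PageSource;
    await File.WriteAllTextAsync(cachePath, html);
    return html;
}
```

The crawler branch would then just set the content type and do `await context.Response.WriteAsync(await GetOrRenderPageAsync(HostURI, context.Request.Path));`.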
Interesting solution, thanks @andras-ferencz!
One security note: if someone manages to spoof the IP address for the googleapis.com or githubusercontent.com domains on your server, you could end up downloading malware to it.
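One possible mitigation (just a sketch; the hash value below is a placeholder you would have to pin yourself for the exact chromedriver build): verify the downloaded archive against a known SHA-256 before extracting it, with a helper called right after DownloadData and before ExtractToDirectory.

```C#
using System;
using System.Security.Cryptography;

// Sketch: refuse to extract a chromedriver download that does not match a pinned hash.
// ExpectedSha256 is a placeholder, not a real value.
static class DownloadVerifier
{
    private const string ExpectedSha256 = "<pinned-sha256-of-chromedriver-zip>";

    public static void Verify(byte[] zipFile)
    {
        using (var sha256 = SHA256.Create())
        {
            string actual = BitConverter.ToString(sha256.ComputeHash(zipFile))
                .Replace("-", string.Empty)
                .ToLowerInvariant();
            if (!string.Equals(actual, ExpectedSha256, StringComparison.OrdinalIgnoreCase))
                throw new InvalidOperationException("chromedriver download did not match the pinned checksum.");
        }
    }
}
```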
SEO Workaround for 0.8.0
@SteveSandersonMS -- Thank you. This is getting SOOOO close to just magical.
In the .razor components:
@functions {
    public static readonly Dictionary<string, string> BlazorBag = new Dictionary<string, string>()
    {
        { "Title", "Welcome to Blazor" },
    };
}
In the primary Index.cshtml:
@{
    var Bag = new BlazorBag(typeof(App), Request.Path.Value);
}
<head>
    <title>@Bag["Title"]</title>
</head>
In the helper class:
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Html;

public class BlazorBag
{
    private readonly Dictionary<string, string> _cachedBag;

    public BlazorBag(Type type, string path)
    {
        if (_cachedBag == null)
        {
            path = path.TrimStart('/');
            if (path.Equals(string.Empty, StringComparison.InvariantCultureIgnoreCase))
            {
                path = "Index";
            }
            var page = path;
            // Find the compiled page component whose full name ends with "Pages.<route>"
            // and read its static BlazorBag dictionary via reflection.
            var pageType = type.Assembly.GetTypes().FirstOrDefault(t => t.FullName.EndsWith("Pages." + page, StringComparison.InvariantCultureIgnoreCase));
            _cachedBag = pageType?.GetField("BlazorBag")?.GetValue(null) as Dictionary<string, string>;
        }
    }

    // Missing keys render as an empty string instead of throwing.
    public IHtmlContent this[string key] =>
        new HtmlString(_cachedBag != null && _cachedBag.TryGetValue(key, out var value) ? value : "");
}
Theory
This uses the magic of reflection to read the static dictionary in each compiled .razor component. It uses a small helper class to expose it as an indexer. If someone has a more elegant suggestion, I would love to hear it. Remember, this is my hack to make it work; ymmv.
@ElectricHavoc With 0.9 having enabled prerendering, there is a terrible, sinful, unforgivable hack you can do by splitting up the Razor page Index.cshtml, _on the Server_, putting any interop scripts there, and opening and closing tags by using @Html.Raw to prevent errors.
e.g. in Index.cshtml:
@page "{*clientPath}"
<!DOCTYPE html>
@Html.Raw("<html>")
@Html.Raw("<head>")
<base href="/" />
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width">
<script>alert("some script");</script>
<app>@(await Html.RenderComponentAsync<Client.App>())</app>
<script src="_framework/components.server.js"></script>
@Html.Raw("</body>")
@Html.Raw("</html>")
Then likewise inside of MainLayout.cshtml, _in the App_, inject the SEO metatags, close the head tag, open a body element and continue as abnormal. (For some reason, the build process doesn't scream when you have unclosed body tags in the App - probably an oversight.)
<title>@Title</title>
<meta name="description" content="@Desc">
<link rel="alternate" hreflang="en" href="@Hreflang_En" />
</head>
<body>
@functions {
    public string Title { get; set; } = "Some Title";
    public string Desc { get; set; } = "Some Description";
    public string Hreflang_En { get; set; } = "~/";
}
Because of how prerendering works, it does the initial rendering of the whole page, preventing any breakage; the SEO just works, the updates to the Title and Description are visible to the user as soon as they're changed, and AdSense/Google/Bing/Media.net et al. are happy.
Now the question is: Do we file this as a bug or a feature?
@honkmother - That's kinda exactly what I'm doing but without the need to hack apart the tag. In my hack, the original Index.cshtml is intact. Since both are only effective for prerendering... will your solution actually change the title on navigate without refresh?

@ElectricHavoc Yes. As soon as you change Title, or whatever other tags, the user will see it without a refresh.
The problem with having the tag inside of Index.cshtml is that it is a Razor page, and not a Component, so you can't really change the contents after it is loaded, which is OK for SEO, but not ideal for the user.
I tried your solution (which was really clever by the way!) but the title didn't change on navigation so I came up with this blasphemous solution until we have some way to do tag management.
@honkmother Ah. Yeah, that's interesting. I'll have to give it a try then. Title would be the only thing I think that really matters... I was going to try to solve that with a JS interop call, but knowing this hack could work does make me wonder if there is a use case besides the title that the user would care about.
You are also correct about it being purely a Razor page in my example; that was intentional since I was only concerned about that first load / bot navigation.
I was actually previously doing a JS interop call, but it doesn't work for SEO purposes as far as I can tell since not all crawlers will execute the code or recognize the change being made post-load, namely AdSense. There absolutely are some (limited number of) use cases - I haven't tested it yet, but swapping between alternative stylesheets comes to mind for a light/dark mode.
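For what it's worth, one way to let individual pages drive the layout's Title/Description without a JS interop call is a small shared state service. This is only a sketch under stated assumptions - the PageMetadata type below is made up, not part of Blazor or of either hack above:

```C#
using System;

// Sketch only: a scoped service pages can set and MainLayout can render from,
// so <title>/<meta> content updates on navigation without a refresh.
public class PageMetadata
{
    public event Action OnChange;

    private string _title = "Some Title";
    public string Title
    {
        get => _title;
        set { _title = value; OnChange?.Invoke(); }
    }

    private string _description = "Some Description";
    public string Description
    {
        get => _description;
        set { _description = value; OnChange?.Invoke(); }
    }
}
```

Register it as a scoped service, inject it into MainLayout (re-rendering via StateHasChanged when OnChange fires) and into each page, and have the page set the Title in its init lifecycle method; the prerendered HTML then carries the right values for crawlers and the user still sees updates when navigating.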
any update on SSR?
SSR is critically needed not only for SEO (people who say bots can render SPAs know nuffin) but also for the initial page load to improve user experience.
@danroth27 if this is a duplicate where is the original?
@d668 here is original
Also known as Server-Side Rendering (SSR)
@d668 Prerendering in server-side Blazor is now available in the default template - it works out of the box. If you want to use it in client-side Blazor, here is an example repository: https://github.com/danroth27/ClientSideBlazorWithPrerendering
Awesome news I will test it, thanks!