Initial static HTML app 'shell'

We use Elm for our eCommerce stores and have been doing so for about five years now. The store is an SPA written in Elm (i.e. using Browser.application), so everything is ultimately rendered by Elm. We've traditionally server-rendered the header area of the page as part of the app shell (the downloaded HTML page), believing it would speed up time to first paint (or some similar metric).
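
For concreteness, the Elm side of that setup looks roughly like this. It's a heavily simplified sketch with made-up names, not our actual code; the point is that Browser.application owns the whole body, so whatever header the shell ships gets replaced as soon as the app boots:

```elm
module Main exposing (main)

import Browser
import Browser.Navigation as Nav
import Html exposing (div, header, text)
import Url exposing (Url)


type alias Model =
    { key : Nav.Key }


type Msg
    = UrlRequested Browser.UrlRequest
    | UrlChanged Url


main : Program () Model Msg
main =
    Browser.application
        { init = \_ _ key -> ( { key = key }, Cmd.none )
        , update = update
        , view = view
        , subscriptions = \_ -> Sub.none
        , onUrlRequest = UrlRequested
        , onUrlChange = UrlChanged
        }


update : Msg -> Model -> ( Model, Cmd Msg )
update _ model =
    ( model, Cmd.none )


-- Browser.application controls the whole <body>, so this view replaces
-- the server-rendered header from the app shell once the app starts.
view : Model -> Browser.Document Msg
view _ =
    { title = "Store"
    , body =
        [ header [] [ text "Header rendered by Elm" ]
        , div [] [ text "Page content" ]
        ]
    }
```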

We’re now making some changes to the header and are reconsidering the complexity of keeping the server and client rendering code in sync (the server-rendering code is not written in Elm). We’ve been doing some research around SEO best practices in this area and are struggling to find anything meaningful. What are people’s general thoughts on this? Is it best to just keep it simple and render a completely blank placeholder? Or maybe something generic but visual that represents the rough site layout graphically? Or something else entirely?

You should check out elm-pages. It may or may not fit your use case depending on how your SPA is set up, but server-side rendering in pure Elm is supported explicitly. Check out server-rendered routes.

Thanks. I use elm-pages for a couple of other projects. Unfortunately it’s not appropriate for our main eCommerce stores, for various reasons, not least because we have a bunch of working code that we do not wish to throw away and rewrite from scratch.

This has reminded me that I made an SPA in Elm… and I have been meaning to look into whether (or not) the pages are being indexed by Google and Bing. I thought I heard or read somewhere that their crawlers execute JavaScript now.

My initial investigation makes it look like the answer is no: the crawlers don’t seem to have executed any of the (Elm-generated) JavaScript.

However… there are some things for me to fix, including adding a sitemap. I’m going to try some of their suggestions and see if it helps.
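
On the sitemap: it needs surprisingly little. Just as a sketch, something along these lines would cover the basics; it’s written in Elm only because the rest of the project is, the names are made up, and any script that writes out the same XML would work just as well:

```elm
module Sitemap exposing (sitemapXml)

-- Rough sketch: render a minimal sitemap.xml from a list of absolute
-- page URLs. How the string actually gets written out (build script,
-- CI step, etc.) is left out here.


sitemapXml : List String -> String
sitemapXml urls =
    let
        urlEntry url =
            "  <url><loc>" ++ url ++ "</loc></url>"
    in
    String.join "\n"
        ([ "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
         , "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">"
         ]
            ++ List.map urlEntry urls
            ++ [ "</urlset>" ]
        )
```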

At this point I don’t think good SEO is going to make or break this side project, so it’s not the worst thing in the world if it doesn’t work. But yeah, if I ever decide I need better SEO, I’d consider migrating the project to elm-pages and doing a lot more static site generation.


Google definitely crawls our pages, even content the static shell does not render. So it is executing the Elm-compiled JavaScript.

Does it have good “snippets” (the excerpts from the page shown in the search results) for those pages? In order to crawl a page, Google just needs a link to it; it doesn’t need to be able to see its contents.

OK, I read your initial post a little more carefully. My understanding, with all that, is that Google (and probably Bing and the others) want you to use their tools to analyze your website, things like Lighthouse and Google Search Console, and if those tools say your website is good, then it’s good. I’m sure there are ways to go above and beyond… but when you get into those waters Google is a lot more likely to “change the rules”, and it will be hard for you to find out.

As I’ve been going down this rabbit hole, an article about the Semantic Web just showed up in my news feed:

I don’t think I knew about JSON-LD.

Google even has a “Rich Results Test” so you can see if you have it set up right:

https://search.google.com/test/rich-results

Seems pretty cool, I might try to get some of this stuff to work. It looks like it should just all work in Elm.
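
One way I could see doing it from Elm is building the JSON-LD with Json.Encode and handing it through a port to a few lines of JavaScript that insert the `<script type="application/ld+json">` tag. A rough sketch, with a made-up port name and made-up product fields:

```elm
port module StructuredData exposing (Product, pushProductJsonLd)

import Json.Encode as Encode


-- Hypothetical outgoing port; the JS side would append the encoded value
-- to <head> inside a <script type="application/ld+json"> tag.
port setJsonLd : Encode.Value -> Cmd msg


-- Illustrative product record, not a real store model.
type alias Product =
    { name : String
    , sku : String
    , priceUsd : String
    }


productJsonLd : Product -> Encode.Value
productJsonLd product =
    Encode.object
        [ ( "@context", Encode.string "https://schema.org" )
        , ( "@type", Encode.string "Product" )
        , ( "name", Encode.string product.name )
        , ( "sku", Encode.string product.sku )
        , ( "offers"
          , Encode.object
                [ ( "@type", Encode.string "Offer" )
                , ( "price", Encode.string product.priceUsd )
                , ( "priceCurrency", Encode.string "USD" )
                ]
          )
        ]


pushProductJsonLd : Product -> Cmd msg
pushProductJsonLd product =
    setJsonLd (productJsonLd product)
```

The JavaScript side would just subscribe to that port and insert the script element, so the structured data ends up in the DOM once the app has rendered.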

All that said… I do have a lingering suspicion that if all (or most) of my site were statically generated (just HTML files), Google would “appreciate” that on some level.


This is quite interesting. I was sure we had this all worked out and had concluded that Google was indeed traversing our pages. All of them are indexed, but we do have a sitemap. However, if you search for particular terms from each of the product pages (e.g. the product’s SKU) then you are returned the specific product page. This suggests that Google is indeed capable of reading pages rendered via JavaScript.

However, there are no rich results for our pages. So this is definitely something that we need to look into.

Thanks for the input!

Here’s an interesting link on the topic:

During the crawl, Google renders the page and runs any JavaScript it finds using a recent version of Chrome, similar to how your browser renders pages you visit. Rendering is important because websites often rely on JavaScript to bring content to the page, and without rendering Google might not see that content.

