RealWorld API Backend with elm-concurrent-task

Hello friends,

I’ve been playing around with elm-concurrent-task, exploring what a backend HTTP setup could look like with it. I ended up implementing the RealWorld backend API spec, and I had a lot of fun in the process!

(Please correct me if I’m wrong, but) I think this is the first spec-compliant Elm backend implementation.

Links

How it works

  • The API is an Elm worker running on Node.js; it uses elm-concurrent-task to interop with a few JS libraries and a Postgres database.
  • All of the business logic can be found in Routes.elm and in the Domain modules.
  • Each endpoint handler is a ConcurrentTask, with a thin layer of interop over a few JS libraries (bcryptjs, jsonwebtoken, and node-pg).
  • All of the code in Lib is pretty generic and could more or less be extracted into separate packages as-is (for JWT, Bcrypt, Postgres, and an HTTP server).
  • It’s deployed as a single Vercel serverless function, with Vercel Postgres for storage.

I’m quite pleased with how clean it’s turned out (particularly the HTTP layer). The database layer is pretty minimal and not very type-safe, but I could definitely iterate on it.

Curious to hear any thoughts or suggestions.


This replaces my naive implementation and extends it with so much more functionality, type safety, and better code.

Are you using this architecture in production?

Am I reading the code correctly that you start up one Elm worker and then accept HTTP requests concurrently? How do you avoid mixing up requests and their corresponding responses?

An Elm worker has state (a model), so it should be possible to implement a rate limiter inside Elm, right?

After reading the code further, it seems that elm-concurrent-task is the piece that makes all this possible (as you stated in the first line :wink:).

Big thanks to you for sharing this!

Thanks! Glad it’s been useful.

I’m using something close to this in production, it’s low traffic but it behaves well!

Yep, you’re right about the rate limiter: the server endpoints are all stateless, but underneath there’s the stateful program:

You could add logic + state for rate limiting at this level if you wanted.
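To make that concrete, here’s a minimal sketch of the kind of per-client counter such a stateful layer could keep. This is plain JavaScript and all names are illustrative; in the actual project this logic would live in the Elm worker’s model rather than in JS:

```javascript
// Hypothetical fixed-window rate limiter: at most `limit` requests
// per `windowMs` milliseconds for each client key.
function makeRateLimiter({ limit, windowMs }) {
  const hits = new Map(); // key -> { count, windowStart }
  return function allow(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= windowMs) {
      // First request in a fresh window: reset the counter.
      hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= limit;
  };
}

// Usage: allow at most 2 requests per minute for a given client key.
const allow = makeRateLimiter({ limit: 2, windowMs: 60_000 });
console.log(allow("1.2.3.4", 0)); // true
console.log(allow("1.2.3.4", 1)); // true
console.log(allow("1.2.3.4", 2)); // false (over the limit)
console.log(allow("1.2.3.4", 60_001)); // true (new window)
```

The same shape translates naturally to an Elm update function: the `Map` becomes a `Dict` in the model, and `allow` becomes a message handler that returns the updated model plus an allow/deny decision.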

Making sure requests and responses aren’t mixed up relies on a clever trick (credit to @eberfreitas): you create a new Promise for each request and pass its resolve function through Elm (as a Json.Encode.Value). When the response is ready, JS calls that resolve function.
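A rough, self-contained sketch of that trick is below. The port name and the fake app are assumptions for illustration, not the project’s actual code; the key point is that each request’s `resolve` function travels with it and comes back only with its own response:

```javascript
// Stand-in for the compiled Elm worker's ports: here we just echo
// the request path asynchronously, as the Elm task runner would.
const fakeElmApp = {
  ports: {
    onRequest: {
      send: ({ resolve, request }) => {
        setTimeout(
          () => resolve({ status: 200, body: `handled ${request.path}` }),
          0
        );
      },
    },
  },
};

// Each request gets its own Promise; the resolve function is passed
// through Elm as an opaque value, so concurrent requests can't be
// mixed up even though a single worker handles them all.
function handle(app, request) {
  return new Promise((resolve) => {
    app.ports.onRequest.send({ resolve, request });
  });
}

// Usage: two concurrent requests resolve independently.
Promise.all([
  handle(fakeElmApp, { path: "/articles" }),
  handle(fakeElmApp, { path: "/tags" }),
]).then(([a, b]) => {
  console.log(a.body); // "handled /articles"
  console.log(b.body); // "handled /tags"
});
```

This works because values crossing an Elm port as `Json.Encode.Value` are opaque to Elm: the function reference survives the round trip untouched and JS can call it later.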

It’s pretty handy as it makes it very easy to use with any server framework.

Here’s the code:

Here’s the article:

