Conditional JSON decoder with optional fields

Hi all,

I have some JSON from a 3rd party API that can take a few different forms. For each response, there are

  • Shared fields present in every response (id, timestamp, etc.)
  • A type field (present in every response) whose value is another JSON object with id and name keys
  • An additional key whose name is the name of the type object. Some types don’t actually have any additional metadata (and thus have no extra key in the response JSON).

For example:

{
  "id": 1,
  "timestamp": "etc",
  "type": {"id": 1, "name": "chair"},
  "chair": {
    // (Chair metadata goes here)
  }
}

Or...

{
  "id": 2,
  "timestamp": "etc",
  "type": {"id": 4, "name": "bed"},
  "bed": {
    // (Bed metadata goes here)
  }
}

Or...

{
  "id": 2,
  "timestamp": "etc",
  "type": {"id": 4, "name": "stool"},
  // (no metadata)
}

This is complicated by the fact that the metadata for each type has a different shape.

I could naively decode this into an Elm record with lots of optional fields:

type alias NaiveRecord =
    { id : Int
    , timestamp : Timestamp
    , recordType : NaiveRecordType
    , chair : Maybe ChairMetadata
    , bed : Maybe BedMetadata
    , table : Maybe TableMetadata
    -- and so on...
    }

type alias NaiveRecordType =
    { id : Int
    , name : String
    }

However, this seems bad for a couple of reasons:

  • It makes (theoretically) impossible states possible: the metadata field for each record should be Just a if and only if recordType is the corresponding one.
  • Any production code using this type has to unwrap metadata from Maybes even when we know the value is present (or rather, if the value isn’t present, something has gone very wrong and we would like to handle that case differently!).

So, I would rather use a type like:

type alias Record =
    { id : Int
    , timestamp : Timestamp
    , recordType : RecordType
    }

type RecordType
    = Chair ChairMetadata
    | Bed BedMetadata
    | Table TableMetadata
    | Stool      -- Has no additional metadata
    | Bookcase   -- Has no additional metadata
    -- And so on

However, I’m not sure quite how to decode this from the JSON, as it requires checking the type's name field and then (possibly) nesting the top-level metadata field within the RecordType.

My current best guess is to use the “naive” type as an intermediate type. For example:

decodeNaiveRecord : Decoder NaiveRecord
-- i.e. relatively cookiecutter decoder with Decode.mapN or similar

fromNaiveRecord : NaiveRecord -> Decoder Record
-- i.e. succeed if "type" matches the metadata field, fail otherwise

decodeRecord : Decoder Record
decodeRecord =
    Decode.andThen fromNaiveRecord decodeNaiveRecord
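
For example, fromNaiveRecord might look roughly like this (untested sketch, covering only the chair/bed/stool cases for brevity):

fromNaiveRecord : NaiveRecord -> Decoder Record
fromNaiveRecord naive =
    case ( naive.recordType.name, naive.chair, naive.bed ) of
        ( "chair", Just meta, _ ) ->
            Decode.succeed (Record naive.id naive.timestamp (Chair meta))

        ( "bed", _, Just meta ) ->
            Decode.succeed (Record naive.id naive.timestamp (Bed meta))

        ( "stool", _, _ ) ->
            Decode.succeed (Record naive.id naive.timestamp Stool)

        _ ->
            Decode.fail ("type/metadata mismatch for: " ++ naive.recordType.name)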

I have 2 main questions that I would like help with:

  • Is my “target” type (Record) a good design for this problem, or is there a better solution/pattern I should be aware of?
  • If it is an acceptable design, what is the best approach to decode the JSON into it? Is there any clear and elegant way to do so without going via an intermediate data type?

Thanks!

This is a classic case for Json.Decode.andThen

e.g. (untested)

import Json.Decode exposing (..)

decodeRecordType : Decoder RecordType
decodeRecordType =
  (at ["type", "name"] string)
    |> andThen decodeRecordTypeDetails

decodeRecordTypeDetails : String -> Decoder RecordType
decodeRecordTypeDetails name =
  case name of
    "bed" -> decodeBed
    "chair" -> decodeChair
    "stool" -> succeed Stool
    _ -> fail ("unknown record type: " ++ name)
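
where decodeBed and decodeChair would read the sibling metadata field from the same root object, something like (also untested; the metadata decoders are assumed to exist):

decodeBed : Decoder RecordType
decodeBed =
  -- runs against the root object, so it can see the "bed" field
  map Bed (field "bed" bedMetadataDecoder)

decodeChair : Decoder RecordType
decodeChair =
  map Chair (field "chair" chairMetadataDecoder)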

Thanks @wondible

I might be misunderstanding the behaviour of at. With this solution, wouldn’t the decoder returned by decodeRecordTypeDetails operate on the actual name of the type?

So although the types line up, I don’t think this would work because we need to apply that decoder to a field in the original outer JSON.

I tried something which I think was similar and got the following result:

Got bad body (Problem with the value at json[0].type:

    {
        "id": 35,
        "name": "Chair"
    }

Expecting an OBJECT with a field named `type`) when attempting to load JSON!

Have I misunderstood?

This is a really challenging data structure you have here. You’re right that andThen won’t work. My current thought is to have multiple decoders that attempt to decode each of the metadata shapes, and then use Json.Decode.andThen on both the type information and the collection of potential metadata fields to check whether the metadata for that type was found. Something like:

decodeData =
    map5
        (\id timestamp type_ maybeBed maybeChair ->
            ( Data id timestamp, type_, { bed = maybeBed, chair = maybeChair } )
        )
        (field "id" int)
        (field "timestamp" string)
        (at [ "type", "name" ] string)
        (maybe (field "bed" bedMetaDecoder))
        (maybe (field "chair" chairMetaDecoder))
        |> andThen
            (\( dataFn, type_, rec ) ->
                case type_ of
                    "chair" ->
                        case rec.chair of
                            Just chairMeta ->
                                succeed (dataFn chairMeta)

                            Nothing ->
                                fail "Expected 'chair' metadata"

                    "bed" ->
                        case rec.bed of
                            ...

                    _ ->
                        fail "unknown type"
            )

Thanks, @wolfadex

Am I right in thinking this is essentially the solution going via an intermediate data structure?

map5 ... to get to the intermediate data structure, andThen \(lambda...) to get from the intermediate to the final data structure?

at ["type", "name"] string
    |> andThen decodeRecordTypeDetails

No, it will not. It would operate on the root object. Here is an Ellie that demonstrates that it works.
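
A minimal self-contained sketch of the principle (field names hypothetical): the decoder returned from andThen is applied to the same root value, so it can read sibling fields.

import Json.Decode as Decode

-- The chained decoder sees the same root object, so it can read the
-- sibling field whose name we just extracted:
demo : Result Decode.Error String
demo =
    Decode.decodeString
        (Decode.at [ "type", "name" ] Decode.string
            |> Decode.andThen (\name -> Decode.field name Decode.string)
        )
        """{ "type": { "name": "note" }, "note": "hello" }"""

-- demo == Ok "hello"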

Thank you very much, @albertdahlin! I updated the Ellie to add the common fields for my own understanding here.

In practice, the JSON I’m decoding has more than 8 fields, so I’ve been using Json.Decode.Pipeline for the decoding, which was still causing some issues. But with the help I’ve received from this thread, I think I should be able to get through it now.

I also found Json.Decode.Extra.when, which seems like it would be another way to solve this problem. (Edit: I just needed to use custom ... instead of required "type" ... at the appropriate point in the pipeline.)
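
For reference, a rough sketch of how custom can slot into the pipeline (untested; decodeTimestamp is a placeholder, and decodeRecordType is the andThen decoder from earlier in the thread):

import Json.Decode as Decode exposing (Decoder)
import Json.Decode.Pipeline exposing (custom, required)

decodeRecord : Decoder Record
decodeRecord =
    Decode.succeed Record
        |> required "id" Decode.int
        |> required "timestamp" decodeTimestamp
        -- custom runs decodeRecordType against the whole object,
        -- not a single named field, which is what we need here
        |> custom decodeRecordType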

Thanks again for all the help!
