I’ve been thinking about the Avro file format and how nice a library for it could be.
It’s written (but not quite ready for prime time, as Named Types aren’t supported yet).
Here’s a feel for the API:
type alias Degree =
    { name : String }

type alias Student =
    { name : String, age : Int, sex : Maybe String, degrees : List Degree }

type alias Academic =
    { name : String, school : String, title : String }

type People
    = Student_ Student
    | Academic_ Academic

degreeCodec : Codec Degree
degreeCodec =
    success Degree
        |> requiring "name" string .name
        |> record { baseName = "degree", nameSpace = [] }

studentCodec : Codec Student
studentCodec =
    success Student
        |> requiring "name" string .name
        |> requiring "age" int .age
        |> optional "sex" string .sex
        |> withFallback "degrees" (array degreeCodec) [] .degrees
        |> record { baseName = "student", nameSpace = [] }

academicCodec : Codec Academic
academicCodec =
    success Academic
        |> requiring "name" string .name
        |> requiring "school" string .school
        |> requiring "title" string .title
        |> record { baseName = "academic", nameSpace = [] }

peopleCodec : Codec People
peopleCodec =
    imap (foldPeople Err Ok) (foldResult Student_ Academic_) <|
        union studentCodec academicCodec
Then one calls:
makeDecoder : Codec a -> Schema -> Maybe (Bytes.Decode.Decoder a)
makeEncoder : Codec a -> a -> Bytes.Encode.Encoder
to read data written with a Schema using your Codec, or to write data with the schema derived from your Codec.
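For a feel of how these fit together, a round trip might look something like the sketch below. This is an assumption on my part, not the library's documented usage: `schemaOf` is a hypothetical accessor for the schema a Codec carries, and the value construction relies only on `elm/bytes`.

```elm
-- Sketch only: `schemaOf : Codec a -> Schema` is hypothetical.
alice : Student
alice =
    { name = "Alice", age = 26, sex = Nothing, degrees = [] }

encoded : Bytes.Bytes
encoded =
    Bytes.Encode.encode (makeEncoder studentCodec alice)

decoded : Maybe Student
decoded =
    makeDecoder studentCodec (schemaOf studentCodec)
        |> Maybe.andThen (\decoder -> Bytes.Decode.decode decoder encoded)
```

The `Maybe` in `makeDecoder`'s result reflects schema resolution: reading can fail up front if the writer's Schema can't be reconciled with your Codec.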
This uses the profunctor applicative technique to make the API nice for building Schemas, decoders, and encoders in one pass.
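Roughly, the intuition (sketched here with assumed names; the library's actual types may well differ) is that a Codec bundles the schema with both halves of the profunctor, so each builder step like `requiring` extends all three at once:

```elm
-- Hypothetical shape of the codec, for intuition only.
type alias Codec a =
    { schema : Schema           -- grows as each field is declared
    , writer : a -> Value       -- contravariant half: how to serialise an a
    , reader : Value -> Maybe a -- covariant half: how to parse an a back
    }
```

This is why one declaration per field suffices: the field name feeds the schema, the accessor (e.g. `.name`) feeds the writer, and the field codec feeds the reader.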
Anyone think this would be useful? I’m writing it currently because I feel that Elm might actually be excellent for writing Kafka consumers and transformers. Still working on that half, though.