Elm Silent Teacher - An Interactive Way to Learn Elm

I published a post on Elm Silent Teacher, an educational game designed to help you learn the Elm programming language through interactive exercises.

https://michaelrätzel.com/blog/elm-silent-teacher-an-interactive-way-to-learn-elm

As I help beginners get started with Elm, I often wonder how we can make learning this programming language easier. That is how I discovered the ‘Silent Teacher’ shared by @Janiczek at https://discourse.elm-lang.org/t/silent-teacher-a-game-to-teach-basics-of-programming/1490

These compact, interactive learning experiences are an excellent resource for students, so I expanded on the approach and developed it further.

The core loop - learning through bite-sized exercises

In the core interaction loop, the app presents an Elm expression and prompts us to evaluate it.

Of course, evaluating an expression in our brain requires some knowledge of the programming language. So we enter our best guess and initially often get it wrong.

After submitting an answer, the system checks it for correctness. If the answer is incorrect, the Silent Teacher points out the correct answer. Once we are ready for the next challenge, we can continue with a new, similar exercise on the same topic.

And as we get more exercises right, the system shows more advanced challenges. The user gradually learns about programming language elements and functions from the core libraries through many small repetitions.
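
To make this concrete, here is an invented example of what such an exercise could look like (not necessarily one of the actual course exercises):

```elm
-- The app shows an expression like this and asks for its value:
String.repeat 2 "la"

-- Typing the value it evaluates to counts as the correct answer:
-- "lala"
```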

Elm Silent Teacher - evolved

Compared to the earlier implementation, I adopted a different approach for encoding exercises, simplifying the authoring process. Course authors no longer need to provide a function that computes the correct answer. An interpreter now takes care of this part automatically, running the evaluation on the user’s device in the web browser.
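
As an illustration, an exercise group can be authored as nothing more than the expression texts. The record shape below is hypothetical, but it captures the idea that no answer function is needed:

```elm
-- Hypothetical encoding: the author lists only the expressions.
-- The interpreter evaluates them in the browser to obtain the answers.
type alias ExerciseGroup =
    { topic : String
    , expressions : List String
    }

integerArithmetic : ExerciseGroup
integerArithmetic =
    { topic = "Integer arithmetic"
    , expressions = [ "3 + 4", "7 - 2 * 3" ]
    }
```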

Since the interpreter is readily available, users can also experiment with additional expressions. After submitting their answer, they can access a REPL-like sandbox initialized with the exercise’s expression. In this exploration mode, users can modify the expression and observe how the evaluation changes accordingly.

If you take the trip, let me know how it goes. I’d love to hear your thoughts.

17 Likes

The approach reminds me a bit of “The Little Schemer”, which is one of my favorite computer books.

This section didn’t work for me:

1 Like

Haha, I made the same mistake at first. You need the quotes ("lal") to make it valid Elm.
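
For anyone else who trips over this: in Elm, the double quotes are what make the answer a string literal.

```elm
"lal"   -- a String value, which is what the exercise expects
lal     -- without quotes, this is read as a name (and here an undefined one)
```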

2 Likes

Thank you for the link. I have difficulty focusing on/with a book, but “find out what computing is really about” sounds compelling.

Thank you for the feedback!
As @lue-bird pointed out, the quotes were missing in that submission.

Since this kind of confusion happens frequently, I wondered what we could do to avoid it in the future. To fix this, I added visual highlights on the subsequences of the solution text that differ from the submitted answer.
This screenshot shows how it looks now, highlighting the added quotes with a wavy underline:

Credits to @jinjor for publishing the elm-diff library, which makes it easy to compute the difference between the expected and submitted strings.
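
Here is a minimal sketch of how such a highlight could be derived with elm-diff (assuming its Diff.diff / Change API; this is not the app’s exact code):

```elm
import Diff exposing (Change(..))

-- Characters that appear in the expected answer but not in the
-- submission; these are the spans to underline.
missingFromSubmission : String -> String -> List Char
missingFromSubmission expected submitted =
    Diff.diff (String.toList submitted) (String.toList expected)
        |> List.filterMap
            (\change ->
                case change of
                    Added char ->
                        Just char

                    _ ->
                        Nothing
            )

-- missingFromSubmission "\"lal\"" "lal" == [ '"', '"' ]
```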

2 Likes

Awesome!

I’m trying it out now. Since there is no part that teaches the operators you’re quizzed on, it would be great if the engine checked incorrect answers against common beginner misconceptions, e.g. if I am prompted with

I probably assumed the /= operator had something to do with division. So if the engine checks 3 // 3 and it evaluates to the answer I gave, we could give a nice hint:

Wrong answers are a great opportunity to teach the rules of the language.
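
For reference, and assuming the prompt was something like 3 /= 3, the two operators behave like this in Elm:

```elm
3 /= 3
--> False   -- (/=) means "not equal"

3 // 3
--> 1       -- (//) is integer division, the operator a Python user might expect
```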

2 Likes

Welcome back, and thank you for sharing your ideas for improvements!
People will likely receive these hints very well. At the moment, I am not sure how I would implement the function that guesses the student’s alternative intent.

Reading the hint in your mockup gave me another idea: we could show explanatory text when the user hovers over a syntax element, like this example’s ‘strange’ operator symbol.

This is cool! Here are my thoughts while doing it:

  • It felt a bit too long for the number of topics it covered. I felt like reducing the repetition by about 20% might feel better. Alternatively, I think interleaving some of the topics, having more variations of problems within each topic, or occasional problems that combine multiple previous skills would help improve that too.
  • There were five or six times where I got the exact same question twice in a row. Maybe that can be filtered out somehow.
  • Annoying multiplication came up several times (specifically 7 * 9 and 8 * 6). Since the content was meant to be about learning Elm and not about practicing multiplication, I found that distracting.
  • On a large screen, the problem is at the very top, and the scoring is at the very bottom. Moving them both closer to the center would make it smoother to stay focused on the flow.
  • I found myself hesitant to enter a guess when I wasn’t sure – I’m not sure why exactly I had that “harsh judgement” fear, but it was there for me, maybe just because it is structured very much like a test? I’m not sure what can be done about that, but if there were some way to get the players comfortable with “failure” and experimentation, I think that would make it better fulfill its goal. FWIW, Elden Ring does that by having a scripted death as the first encounter – maybe there’s some inspiration there? :joy:
1 Like

Coming to the Silent Teacher after quite some time:

  • I do agree that explaining // etc. might help. Perhaps the explanation could happen on failure instead of before the whole section starts.
  • But then there’s still the fear of failure - before starting, the tool could say that an error will allow you to retry on a similar example (you won’t miss anything / skip to the following section if you fail one exercise). Or lessen the FOMO in some other way.
  • I absolutely agree that there’s too much arithmetic :100: Perhaps each exercise could have a tweaked number of repetitions. let..in and function application could be there for longer while 3*8-like exercises could only have 2 or 3 repetitions.
1 Like

Hey Aaron, thank you for sharing your experience in such detail!

I have made some revisions to the app to address the issues you brought up.

Interleaving different kinds of exercises also improves retention, so I opted for interleaving to address the ‘too monotonous’ feel.
When designing the new way of mixing, I thought of how we use increasing intervals with spaced repetition for more effective learning. Accordingly, the implemented mixing does not distribute the instances evenly but starts each new type of exercise with a higher frequency.
Below is an example of how the arrangement of exercises differs between the new version and the previous version:

List.concat (previous):
🍅🍅🍅🍅🍅🍄🍄🍄🍄🍕🍕🍕🍕🍒🍒🍒
interleaveExercisesRecursive (new):
🍅🍅🍅🍄🍄🍅🍄🍅🍕🍕🍄🍕🍒🍒🍕🍒
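
Here is a small sketch of the idea behind the new mixing. It is hypothetical and simplified, not the app’s actual interleaveExercisesRecursive, but it shows how a newly introduced group can start dense and then thin out:

```elm
-- Insert the new group's items into the existing sequence with growing
-- gaps (1, 2, 3, ... earlier items between them), so the new exercise
-- type shows up most frequently right after it is introduced.
interleaveWithGrowingGaps : List a -> List a -> List a
interleaveWithGrowingGaps earlier new =
    interleaveHelp 1 earlier new

interleaveHelp : Int -> List a -> List a -> List a
interleaveHelp gap earlier new =
    case new of
        [] ->
            earlier

        next :: rest ->
            List.take gap earlier
                ++ (next :: interleaveHelp (gap + 1) (List.drop gap earlier) rest)

-- interleaveWithGrowingGaps [ "a", "a", "a", "a" ] [ "b", "b", "b" ]
--     == [ "a", "b", "a", "a", "b", "a", "b" ]
```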

Absolutely! You won’t see this happen anymore in the new version. I changed it to remember past challenges and filter the candidates for new instances accordingly.
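
The filtering itself can be as simple as this hypothetical sketch (the real app also has to pick among the remaining candidates, e.g. by variation):

```elm
-- Drop any candidate that is identical to the challenge just shown.
withoutLastChallenge : Maybe challenge -> List challenge -> List challenge
withoutLastChallenge lastChallenge candidates =
    List.filter (\candidate -> Just candidate /= lastChallenge) candidates
```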

I removed these annoying instances of multiplication exercises.

I changed the layout to keep them closer together, so you will no longer see such a weird arrangement. I also experimented with making it automatically zoom in, depending on the available viewport size. Ultimately, I did not include that because I am unsure if there are scenarios where an automatic zoom would be a problem.

Interesting, I did not know that game and looked up a gameplay video of that encounter. Now I wonder what packaging the existing learning/repetition mechanic into a video game would do :thinking:
At the moment, I don’t know if such a different mode of presentation would help to get comfortable with failure and experimentation. I am sure there are lots of ways we can encourage people more here, but at the moment, I don’t have a concrete idea.

1 Like

Thank you for the insights, Martin, and thanks again for the discovery and inspiration for starting this in the first place!

I also assume that offering that explanation on failure will be better received than upfront. If someone already has extensive experience with Python, they might immediately understand that particular operator (//) and perceive an unconditional explanatory element as a distraction.

Great point! I did not realize people might assume they would miss something after a failed attempt. I’m not sure yet how to best communicate this. Perhaps changing the button label from ‘continue’ to ‘retry’ when the attempt failed :thinking:

Thank you, I tweaked the repetition counts for less arithmetic and more let..in blocks and more function application examples :+1:
