In the latest update to learntla, the author covers optimizing TLA+ model checking, walking through 15 different optimizations that significantly improve runtime, some by as much as 10x. They also mention a recent slump in their writing and their decision to learn Raku, a language readers recommended after their rant about dynamic languages. They describe Raku as a language designed by intelligent gremlins, pointing to its unusual use of Unicode operators and alphanumeric infix operators. The author explores several of Raku's features, including user-defined operators, multiple dispatch, and miscellaneous conveniences such as CLI flags, methods preloaded onto objects, and junctions for making multiple comparisons at once. They note that the documentation is poor and that the heavy reliance on symbols makes the language hard to search, but overall they express interest in its potential and hope for improvements in compile times and documentation.
https://buttondown.email/hillelwayne/archive/raku-a-language-for-gremlins/
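To give a flavor of the features named above, here is a minimal Raku sketch (not taken from the post) showing junctions, a user-defined Unicode infix operator, and multiple dispatch; the "±" operator and the "describe" routine are hypothetical names chosen for illustration.

    # A junction lets one comparison cover several values at once.
    my $n = 7;
    say "prime-ish" if $n == 2 | 3 | 5 | 7;   # any-junction: true if $n matches any of them

    # Operators are routines with special names, so you can define your own,
    # including Unicode ones (this ± operator is made up for illustration).
    sub infix:<±>(Numeric $a, Numeric $b) { ($a - $b, $a + $b) }
    say 10 ± 3;                                # (7 13)

    # Multiple dispatch: the candidate whose signature matches is chosen.
    multi describe(Int $x) { "an integer: $x" }
    multi describe(Str $x) { "a string: $x" }
    say describe(42);                          # an integer: 42
    say describe("gremlin");                   # a string: gremlin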