
News about Dyalog
Sep 23, 2025
APL Forge: 2025 Winner Announced

2025 was the second year of the APL Forge, an initiative intended as a catalyst for the next generation of APL-based applications and tools, inspiring people with good ideas to use APL to turn those ideas into reality.
This annual competition asks individuals, groups, and even companies to submit their libraries or potential commercial applications/tools for assessment. A submission can be proprietary or permissively licensed, and open or closed source, but its core must be written in a currently supported version of Dyalog (see the full eligibility criteria).
The 2025 round of the APL Forge closed in June; judging is now complete, and Borna Ahmadzadeh has been named the winner for his submission, APLearn.
"I'm a computer science student in Toronto, Canada working towards my bachelor's degree at York University. Although my experience primarily lies in machine and deep learning, I'm also very curious about theoretical computer science and language design, leading me to explore different programming paradigms, including array programming – more specifically, APL – just over a year ago. APL held a double interest for me: It was very different from the object-oriented and functional languages I was used to, and it demanded a completely new perspective on programming (data-parallel operations, no branches, and so on). At the same time, it struck me as a more elegant tool for conveying concepts and ideas that scientific computing frameworks in languages like Python could only awkwardly express. I was particularly keen on this point because machine learning tasks regularly deal with complex multi-dimensional arrays, and APL seemed like an appealing alternative to the currently Python-dominated landscape.
"My first experiment in this area was trap, an implementation of transformers (for example, GPT) in APL. I was surprised by how concise the code was compared to the Python reference, even though the latter wasn't written from scratch, instead utilising the deep learning package PyTorch. To take trap a step further, I started to implement more traditional machine learning models such as linear regression and support vector machines in APL, forming the basis of APLearn. Transformers are conceptually straightforward since they are, in terms of concrete implementation, little more than a sequence of matrix multiplications. On the other hand, machine learning algorithms are often trickier, relying on more than just basic linear algebra. For example, random forests depend heavily on trees, and Lasso regression is solved using an iterative, non-parallel approach called co-ordinate descent. This created difficulties at times because it wasn't always obvious how to translate, say, ifs or for loops into APL, and I had to actively force myself to avoid falling back on a more standard, imperative mindset. During this time, I maintained a correspondence with Aaron Hsu, the developer behind the Co-dfns APL compiler, concerning performance improvements, and he encouraged me to submit my projects to APL Forge.
"In the future, I plan to study array programming in greater depth, with a focus on machine/deep learning. I believe APL is a viable tool for quickly and succinctly solving many computational problems, especially in research, and I hope I'll have the opportunity to take advantage of the many benefits it offers in my work."
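To give a flavour of the contrast Borna describes, here is a minimal Dyalog APL sketch (invented for this article; it is not code from APLearn or trap). It fits an ordinary least-squares regression directly with the matrix-divide primitive ⌹, then reaches the same coefficients iteratively with the power operator ⍣ – the kind of loop-free construct that typically stands in for an imperative for loop. The data and the 0.01 learning rate are arbitrary illustrative choices.

    ⍝ Ordinary least squares with matrix divide (data invented for illustration)
    X ← 5 2⍴1 2 2 3 3 5 4 4 5 6    ⍝ 5 observations, 2 features
    y ← 5 8 12 11 15               ⍝ targets
    A ← X,1                        ⍝ append a column of ones for the intercept
    b ← y⌹A                        ⍝ least-squares coefficients in one primitive

    ⍝ The same fit without an explicit loop: one gradient-descent step,
    ⍝ applied repeatedly with the power operator ⍣ in place of a for loop
    step ← {⍵-0.01×(⍉A)+.×(A+.×⍵)-y}
    b2 ← (step⍣10000) 3⍴0          ⍝ converges to b from a zero start

The first form leans on a single linear-algebra primitive, much as Borna notes for transformers; the second shows how iterative algorithms such as gradient or co-ordinate descent can be expressed without imperative control flow.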
Borna will present his winning work at Dyalog '26.
Congratulations, Borna!