projects
software related
I've used languages such as Python, Mathematica, and Macaulay2 for academic work for many years,
and I've used Java in coursework to learn OOP. I know data structures and algorithms reasonably well,
at least to the extent covered in a few graduate algorithms courses. In short, I have experience
writing code for theoretical work, but I'd like more experience with
development-style projects. Here are projects I'm either currently working
on or would like to work on toward that end.
-
Writing An Interpreter In Go
This is a book written by Thorsten Ball that
walks you through building a fully working interpreter for a C-like language called Monkey. Why am I doing this? Because I want
to understand how programming languages work, and because the interpreter is written in Go, a language I'd also like to learn.
In progress.
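The book's first stage is a lexer, which chops source text into tokens before any parsing happens. Here's a rough sketch of the idea in Python (rather than the book's Go), using a hypothetical, much-reduced token set, not the book's actual code:

```python
# A toy lexer for a handful of Monkey-style tokens: it walks the source
# text once and emits (type, literal) pairs.
def lex(source):
    keywords = {"let": "LET", "fn": "FUNCTION"}
    single = {"=": "ASSIGN", "+": "PLUS", ";": "SEMICOLON",
              "(": "LPAREN", ")": "RPAREN",
              "{": "LBRACE", "}": "RBRACE", ",": "COMMA"}
    tokens = []
    i = 0
    while i < len(source):
        ch = source[i]
        if ch.isspace():
            i += 1                       # skip whitespace
        elif ch in single:
            tokens.append((single[ch], ch))
            i += 1
        elif ch.isalpha():               # identifier or keyword
            j = i
            while j < len(source) and source[j].isalnum():
                j += 1
            word = source[i:j]
            tokens.append((keywords.get(word, "IDENT"), word))
            i = j
        elif ch.isdigit():               # integer literal
            j = i
            while j < len(source) and source[j].isdigit():
                j += 1
            tokens.append(("INT", source[i:j]))
            i = j
        else:
            tokens.append(("ILLEGAL", ch))
            i += 1
    return tokens

print(lex("let five = 5;"))
# → [('LET', 'let'), ('IDENT', 'five'), ('ASSIGN', '='), ('INT', '5'), ('SEMICOLON', ';')]
```

The parser then turns this token stream into a syntax tree, which the evaluator walks to run the program.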
-
Writing A Compiler In Go
This is the sequel to the above book in which we write a compiler for the Monkey language.
-
Advent of Code: in Go and Python
I'm starting with the 2022 set before this year's calendar.
Advent of Code is fun, and it forces you to understand
the language you're working with more deeply than solving
algorithm and data structure problems alone does. Solving each puzzle
first in Python and then in Go is a great way to learn Go more quickly.
-
Learning how to use NeoVim
Why? Because I want some experience not relying on an IDE.
It's also painful. I'm currently editing the HTML for this
website in NeoVim, and even getting NeoVim set up
properly and learning the basic commands was a struggle. But I'm
seeing progress, and that's the point.
-
Learn C or C++
I don't know which yet. As stated before, I mostly use Python, which abstracts away many low-level concepts. I want to understand the
ideas that Python handles for you.
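Memory management is one of those ideas: in C you allocate and free memory yourself, while CPython counts references to each object and frees it for you when the count hits zero. A small sketch of that bookkeeping (the names and sizes here are arbitrary):

```python
# CPython tracks how many names and containers refer to each object.
# In C, the equivalent of "del" would be an explicit call to free().
import sys

data = [0] * 1000          # in C: malloc a block and initialize it
alias = data               # a second reference to the SAME list, no copy made

# getrefcount reports at least 3 here: data, alias, and the temporary
# reference created by passing the object to getrefcount itself.
assert sys.getrefcount(data) >= 3

del alias                  # drops a reference; the list itself survives
assert sys.getrefcount(data) >= 2

# When the last reference disappears, CPython frees the memory —
# the step a C programmer performs by hand.
```

Reference counting is a CPython implementation detail, but it's exactly the kind of machinery that stays invisible until you've had to manage memory yourself.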
-
Learning Git and GitHub for version control
-
Some TBD project in Go, C/C++, or whatever languages become necessary
machine learning and data science related
In an effort to understand the current literature on large language models
and natural language processing, I have begun reading foundational papers such as
Attention Is All You Need, which introduced the transformer
as a replacement for the then state-of-the-art recurrent and convolutional
neural networks, and
BERT: Pre-training of Deep Bidirectional Transformers for
Language Understanding, which showed how pre-trained
transformer-based models can serve as general-purpose starting points for numerous natural language tasks.
-
Implementing Transformer Models in Python
Rather than just understanding the theory that underpins transformer models,
I am working on a Python implementation of the model described in
Attention Is All You Need.
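The core operation in that model is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal single-head sketch in pure Python (a simplification for illustration; the full model adds learned projections, multiple heads, and the rest of the architecture):

```python
# Scaled dot-product attention from Attention Is All You Need,
# written with plain lists so the arithmetic is explicit.
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    # Q: list of query vectors; K: list of key vectors (same dimension d_k);
    # V: list of value vectors, one per key.
    d_k = len(K[0])
    out = []
    for q in Q:
        # Dot-product score of this query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query attending over two key/value pairs: the query matches the
# first key more strongly, so the first value dominates the weighted sum.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))
```

The sqrt(d_k) scaling keeps the dot products from growing with dimension and pushing the softmax into regions with vanishing gradients, which is the paper's stated reason for the factor.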