print("Hello, Mojo🔥") ➤ The Brand New Programming Language for AI Development
An overview of Modular's new programming language, with code examples & how to get started in the Mojo Playground
Where's your Mojo?
Modular is a platform that aims to unify ML/AI infrastructure. They recently created a new programming language called Mojo. Having seen the challenges of programming across the entire stack & the limitations of existing languages when it comes to targeting accelerators and heterogeneous systems in the AI field, they designed Mojo to be innovative, scalable, & compatible with the Python ecosystem!
Mojo's goal is to provide a powerful programming model with features like compile-time metaprogramming, adaptive compilation techniques, caching, & support for current and emerging accelerators. It recognizes the importance of the host CPU as a fallback for operations that specialized accelerators can't handle.
"Mojo aims to address the challenges faced by applied AI systems and offers a single language solution."
Python was chosen as the foundation for Mojo because of its widespread use, popularity in the AI ecosystem, and its simplicity and expressiveness. Mojo is designed to be part of the Python family, providing full compatibility with the Python ecosystem while also offering predictable low-level performance and control.
The goal is to avoid fragmentation within the software ecosystem & make it easy for Python users to adopt Mojo without having to go through a painful migration process. While Mojo strives for compatibility with Python, there are some intentional differences.
"Applied AI systems need to address all these issues, and we decided there was no reason it couldn't be done with just one language. Thus, Mojo was born." – Modular's Mojo Docs
Mojo aims to be a first-class language in its own right, not one limited by compatibility concerns, while still using the CPython runtime for full compatibility with the existing ecosystem. Modular plans to provide a mechanical migration tool to help users migrate their code from Python to Mojo. Personally, I can't wait for this!
Python has certain limitations that Mojo aims to overcome. For example, Python's poor low-level performance, the global interpreter lock (GIL), & the complexity of building hybrid libraries using low-level bindings to C and C++ are all challenges that Mojo aims to solve.
By providing a unified language for accelerators, simplifying development, improving debugging capabilities, and addressing deployment challenges, Mojo hopes to make programming in the AI field easier and more efficient.
Code Examples
Before we get into the fun stuff, you might be wondering. . .
How can I try Mojo?
When you receive access, log in to the Mojo Playground using the email address you provided on the sign-up form.
Share feedback, ideas, issues, or just chat with other users in our Mojo community. Also feel free to share your Mojo code with others.
So if you haven't gotten access to the playground yet, I've compiled some code examples directly from the Mojo Playground with simplified explanations.
Since Mojo hasn't launched to the public yet, you can only access it through the Mojo Playground. Luckily, I received access and have taken some notes on the differences & additions that Mojo builds on top of the Python ecosystem.
Let & Var
Many times, systems programmers want to declare a value that is immutable, but Python does not offer this functionality. Mojo's solution is the let (immutable) & var (mutable) declarations, which both support lexical scoping & name shadowing:
# Code example from the Mojo Playground
def your_function(a, b):
    let c = a
    # Uncomment to see an error:
    # c = b  # error: c is immutable
    if c != b:
        let d = b
        print(d)

your_function(2, 3)
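The snippet above only exercises let. For contrast, here's a minimal sketch in the same playground syntax (the function name mutable_vs_immutable is my own) showing that a var declaration stays mutable while a let does not:

```mojo
def mutable_vs_immutable(a, b):
    var x = a   # var: mutable, so reassignment is allowed
    x = b
    let y = a   # let: immutable; reassigning y would be a compile-time error
    print(x, y)

mutable_vs_immutable(2, 3)
```

The distinction only matters for the binding itself: both declarations follow the same lexical-scoping rules described above.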
alias: named parameter expressions
It's fairly common to want to give a name to values that are determined at compile time. While var is used to define a value that's determined at runtime and let is used to define a constant that's determined at runtime, we need a way to define a temporary value that's determined at compile time. Mojo does this using an alias declaration. For example, the DType struct uses aliases for its enumerators to implement a simple enum (although the actual internal implementation details may vary).
# Code Example from the Mojo Playground
struct dtype:
    alias invalid = 0
    alias bool = 1
    alias si8 = 2
    alias ui8 = 3
    alias si16 = 4
    alias ui16 = 5
    alias f32 = 15
So, if you want to use DType.f32 as a parameter expression, you can do so using alias. Aliases obey scope just like var and let do, & naturally, you can also use local aliases within functions.
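To make that concrete, here's a small sketch (assuming the playground's parameterized SIMD type, which takes a DType and a width; the alias names are my own) showing an alias used wherever a parameter expression is expected:

```mojo
# Sketch only, assuming the playground's SIMD[dtype, width] type
alias float_type = DType.f32
alias width = 4

fn make_ones() -> SIMD[float_type, width]:
    # The aliases are substituted at compile time as parameter expressions
    return SIMD[float_type, width](1.0, 1.0, 1.0, 1.0)
```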
If you're finding my blog post useful or interesting, consider subscribing & giving it a like below! It really helps my channel grow & thrive.
Adaptive Compilation & Autotuning
Mojo's parameter expressions let you write portable algorithms that use parameters, just like you can in other languages. But when you're writing code that needs to be high performance, you still have to choose specific values for the parameters.
For example, if you're writing a high performance numeric algorithm, you might want to use memory tiling to speed it up. But the dimensions you use for the tiling depend on a lot of factors, like the hardware features that are available, the size of the cache, what gets fused into the kernel, & other details like this.
Even something as simple as vector length can be pretty tricky to manage. The vector length that a machine can handle depends on the datatype being used, and not all datatypes (like bfloat16) are fully supported on all implementations. Mojo makes this easier by providing an autotune function in its standard library. So if you want to write an elementwise algorithm that works with any vector length and apply it to a buffer of data, you could write it like this:
# Code Example from the Mojo Playground
from Autotune import autotune
from Pointer import DTypePointer
from Functional import vectorize

fn buffer_elementwise_add[
    dt: DType
](lhs: DTypePointer[dt], rhs: DTypePointer[dt], result: DTypePointer[dt], N: Int):
    """Perform elementwise addition of N elements in RHS and LHS and store
    the result in RESULT.
    """

    @parameter
    fn add_simd[size: Int](idx: Int):
        let lhs_simd = lhs.simd_load[size](idx)
        let rhs_simd = rhs.simd_load[size](idx)
        result.simd_store[size](idx, lhs_simd + rhs_simd)

    # Pick vector length for this dtype and hardware
    alias vector_len = autotune(1, 4, 8, 16, 32)

    # Use it as the vectorization length
    vectorize[vector_len, add_simd](N)

# And then call the function like this:
let N = 32
let a = DTypePointer[DType.f32].alloc(N)
let b = DTypePointer[DType.f32].alloc(N)
let res = DTypePointer[DType.f32].alloc(N)

# Initialize arrays with some values
for i in range(N):
    a.store(i, 2.0)
    b.store(i, 40.0)
    res.store(i, -1)

buffer_elementwise_add[DType.f32](a, b, res, N)
print(a.load(10), b.load(10), res.load(10))
TL;DR
Mojo's deployment narrative is very similar to that of the C language. Mojo is not just a language for AI/ML applications; it is already much more than that! It's essentially a unique version of Python that lets developers write ready-to-deploy applications quickly and easily, & take advantage of accelerators and available cores.
To wrap things up,
Mojo sets itself apart from other solutions in the Python ecosystem. Keep in mind that no single programming language will suit all of your needs, but from what I've seen so far, Mojo is onto something here, & personally, I'm excited to see where it will go in the future.
Mojo seeks to simplify & improve programming in the AI field by offering a new language that combines the strengths of Python with specific enhancements for accelerators & systems programming.
& that's the Mojo.
In Summary
Modular introduces Mojo, a new programming language designed to address the challenges of programming across the ML/AI infrastructure stack.
Mojo aims to be innovative, scalable, and compatible with the Python ecosystem, providing a powerful programming model with features like compile-time metaprogramming & adaptive compilation techniques.
Mojo recognizes the importance of the host CPU as a fallback for operations that specialized accelerators can't handle, offering support for current and emerging accelerators.
Python was chosen as the foundation for Mojo due to its widespread use, popularity in the AI ecosystem, simplicity, & expressiveness.
Mojo aims to avoid fragmentation in the software ecosystem, making it easier for Python users to adopt without a semi-painful migration process.
Mojo aims to overcome Python's limitations in low-level performance, the global interpreter lock (GIL), & the complexity of building hybrid libraries using low-level bindings to C & C++.
Mojo provides a unified language for accelerators, simplifies development, improves debugging capabilities, & addresses deployment challenges in the AI/ML field.
Mojo's deployment narrative is similar to that of the C language, enabling developers to write ready-to-deploy applications that leverage accelerators & available cores.
Here are some latest news articles about Mojo for your reading pleasure:
Mojo may be the biggest programming language advance in decades
Modular finds its Mojo, a Python superset with C-level speed
Introducing Mojo: The Programming Language Revolutionizing AI Development
That's all for this week! Give me a shout if you have any feedback, stories, or insights to share with me. Other than that, I'll see you next week!
Happy learning,
Ashley