🔮

A language tour

The Speed of Julia

Fast as C. Expressive as Python. Built for science. The two-language problem, solved.


01 — The Two-Language Problem

Prototype and production in one

Scientists used to prototype in Python or R, then rewrite performance-critical code in C or Fortran. Julia refuses this compromise. It compiles to efficient native code via LLVM while remaining as interactive as a scripting language.

"We want a language that's open source, with a liberal license. We want the speed of C with the dynamism of Ruby. We want a language that is homoiconic, with true macros like Lisp. We want something as usable for general programming as Python. Something as easy for statistics as R."

— Why We Created Julia, Jeff Bezanson et al., 2012
speed.jl
# Julia is JIT-compiled — first call compiles, subsequent calls are fast
function sum_squares(n::Int)::Float64
    total = 0.0
    for i in 1:n
        total += i^2
    end
    return total
end

# @time macro measures runtime
@time sum_squares(1_000_000)
# first call:  ~0.5s (compilation + execution)
# second call: ~0.001s (just execution — C speed)

# Type annotations are optional but unlock full optimisation
function fast_dot(a::Vector{Float64}, b::Vector{Float64})
    sum(@. a * b)  # @. makes every operation in the expression elementwise
end

The @. macro broadcasts every operation in the expression — a * b becomes element-wise multiplication, and chained dotted operations fuse into a single loop when the expression is lowered, without allocating intermediate temporaries.


02 — Multiple Dispatch

Functions that specialise on all argument types

Julia's most distinctive feature is multiple dispatch: a function can have many methods, and the most specific one is chosen based on the types of all arguments. This isn't just method overloading — it's the foundation for extensible, composable libraries.

dispatch.jl
abstract type Animal end
struct Dog <: Animal; name::String; end
struct Cat <: Animal; name::String; end

# Different behaviour for different combinations of types
interact(a::Dog, b::Dog) = "$(a.name) and $(b.name) play fetch!"
interact(a::Cat, b::Cat) = "$(a.name) ignores $(b.name)."
interact(a::Dog, b::Cat) = "$(a.name) chases $(b.name)!"
interact(a::Cat, b::Dog) = "$(a.name) hisses at $(b.name)."

interact(Dog("Rex"), Cat("Whiskers"))
# "Rex chases Whiskers!"

This isn't OOP method dispatch — both argument types determine which method runs. This makes operator overloading, unit systems, and automatic differentiation composable across independent packages.
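A short sketch of that composability (the Meters type is illustrative, not from the original): adding one method to Base's + operator makes a new type work with generic code like sum, which was written long before the type existed.

```julia
# A unit-like wrapper type — purely illustrative
struct Meters
    value::Float64
end

# One new method on Base's + operator...
Base.:+(a::Meters, b::Meters) = Meters(a.value + b.value)

Meters(3.0) + Meters(4.5)                      # Meters(7.5)

# ...and generic functions defined in terms of + now accept Meters too
sum([Meters(1.0), Meters(2.0), Meters(3.0)])   # Meters(6.0)
```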


03 — Broadcasting

Vectorise anything with a dot

Julia's broadcasting system lets you apply any function element-wise to arrays by adding a dot. You don't need a special vectorised version โ€” just dot-apply the scalar function. The compiler fuses loops for you.

broadcasting.jl
# A scalar function
clamp01(x) = max(0.0, min(1.0, x))

pixels = [-0.3, 0.5, 1.2, 0.8, -0.1]

# Dot-broadcasting applies it element-wise — no loop, no special version
clamped = clamp01.(pixels)  # [0.0, 0.5, 1.0, 0.8, 0.0]

# Works with any operator too
a = [1, 2, 3]; b = [4, 5, 6]
a .* b   # [4, 10, 18]  — elementwise multiply
a .^ 2   # [1, 4, 9]   — elementwise square

# @. broadcasts every operation in an expression
@. sin(a) + cos(b)  # fused into a single loop, no temps

Dot-broadcasting is syntactic sugar that compiles to a single fused loop. There's no runtime overhead from the abstraction — the code the compiler generates is identical to a hand-written loop.
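To make the fusion claim concrete, here is a fused broadcast next to the hand-written loop it lowers to (the variable names are illustrative):

```julia
a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]

# One fused pass over the data
fused = @. sin(a) + cos(b)

# The hand-written equivalent the compiler effectively generates
manual = similar(a)
for i in eachindex(a, b)
    manual[i] = sin(a[i]) + cos(b[i])
end

fused == manual   # true — same operations, same order, same results
```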


04 — Metaprogramming

Code that writes code, the right way

Julia macros operate on the abstract syntax tree, transforming code before it compiles. Unlike C preprocessor macros, which paste text, they're hygienic and work on structured syntax. They're how Julia implements tools like @time, @show, and @test.

macros.jl
# @assert โ€” readable test that shows the failing expression
@assert 2 + 2 == 4

# @show โ€” prints variable name and value
x = 42
@show x  # prints: x = 42

# @benchmark from BenchmarkTools — statistical timing
using BenchmarkTools
@benchmark sort(rand(1000))

# Custom macro โ€” transform code at parse time
macro swap(a, b)
    # esc() marks a and b as the caller's variables, bypassing macro hygiene
    :(local tmp = $(esc(a)); $(esc(a)) = $(esc(b)); $(esc(b)) = tmp)
end

x, y = 1, 2
@swap x y
# x is now 2, y is now 1

The :(expression) syntax creates an AST node — a quoted expression. Macros receive and return these nodes, letting you rewrite syntax patterns before compilation.
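A quick look at what those AST nodes contain (the values shown are what Julia's parser produces):

```julia
ex = :(1 + 2 * 3)     # quote, don't evaluate

ex.head               # :call — a function-call node
ex.args               # Any[:+, 1, :(2 * 3)] — the operator, then its operands

eval(ex)              # 7 — run the expression as code

# Expressions can also be built directly from their parts
built = Expr(:call, :+, 10, 32)
eval(built)           # 42
```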


05 — Unicode and Maths

Code that looks like the maths

Julia supports Unicode identifiers natively — you can write α, ∑, π, ∈ directly in your code. Mathematical algorithms read like their textbook descriptions, closing the gap between notation and implementation.

math.jl
# Type directly from LaTeX: \alpha + Tab → α
function gaussian(x, μ, σ)
    (1 / (σ * √(2π))) * exp(-(x - μ)^2 / (2σ^2))
end
end

# Implicit multiplication: 2x means 2 * x
function circle_area(r)
    π * r^2  # π is a built-in constant
end

# ∈ from set theory works in loops
primes = [2, 3, 5, 7, 11]
for p ∈ primes
    println(p^2)
end

Implicit multiplication (2x) and Unicode identifiers mean that the formula from a research paper can be typed almost verbatim into Julia. The code is the maths.
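Unicode names aren't limited to constants: many symbols parse as infix operators, so you can define your own. A small sketch (using ⊕ for saturating byte addition is an illustrative choice, not a built-in meaning):

```julia
# ⊕ (\oplus + Tab) parses as a binary operator once a method is defined
⊕(a::UInt8, b::UInt8) = a > typemax(UInt8) - b ? typemax(UInt8) : a + b

0x10 ⊕ 0x20   # 0x30 — ordinary addition when there's room
0xf0 ⊕ 0x20   # 0xff — clamps at the maximum instead of overflowing
```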


Why Julia is rising

🔬

Scientific Computing

Differential equations, optimisation, machine learning — Julia's ecosystem is purpose-built for scientific work.

🤝

Interoperability

Call Python, R, C, and Fortran from Julia without FFI boilerplate. Use the existing scientific Python ecosystem seamlessly.
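The C path needs no package at all. A minimal sketch, assuming a Unix libc is loaded in the process (Python and R interop go through the PyCall and RCall packages):

```julia
# ccall is built into the language: (symbol, return type, argument types, args...)
len = ccall(:strlen, Csize_t, (Cstring,), "hello")   # calls C's strlen directly

Int(len)   # 5 — no wrapper code, no build step
```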

🧮

Built-in Parallelism

Threads, distributed computing, and GPU acceleration — all first-class, all composable with the same language.
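A minimal sketch of the threading model (start Julia with e.g. julia -t 4; the result is the same for any thread count):

```julia
results = zeros(100)

# Iterations are divided among the available threads
Threads.@threads for i in 1:100
    results[i] = i^2
end

sum(results)   # 338350.0 — same answer as the serial loop
```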

📊

Automatic Differentiation

Zygote.jl differentiates ordinary Julia code via source-to-source transformation. Write normal functions; get gradients for free.
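Zygote's user-facing call is just gradient(f, x). The idea behind AD can be sketched self-contained with forward-mode dual numbers, where the chain rule lives in ordinary methods found by multiple dispatch (the Dual type and derivative helper are illustrative, not Zygote's internals):

```julia
struct Dual
    val::Float64   # value of the function
    der::Float64   # value of its derivative
end

# Each method encodes one differentiation rule
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:+(a::Dual, k::Number) = Dual(a.val + k, a.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.:*(k::Number, a::Dual) = Dual(k * a.val, k * a.der)

# Seed the derivative slot with 1 and read it back out
derivative(f, x) = f(Dual(x, 1.0)).der

f(x) = 3x * x + 2x + 1
derivative(f, 2.0)   # 14.0 — matches f′(x) = 6x + 2 at x = 2
```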

⚡

LLVM Backend

Julia compiles to native code for every platform LLVM supports, including GPUs — with no separate compilation step.

📦

Reproducible Environments

Built-in package manager with exact version pinning and project-local environments. Reproducibility by default.
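The workflow, using only the built-in Pkg standard library (the project name is illustrative):

```julia
using Pkg

Pkg.activate("MyProject")   # create/use MyProject/Project.toml
Pkg.add("DataFrames")       # dependency recorded in Project.toml,
                            # exact versions pinned in Manifest.toml
Pkg.instantiate()           # on another machine: recreate the exact environment
```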