A language tour
Fast as C. Expressive as Python. Built for science. The two-language problem, solved.
01 — The Two-Language Problem
Scientists used to prototype in Python or R, then rewrite performance-critical code in C or Fortran. Julia refuses this compromise. It compiles to efficient native code via LLVM while remaining as interactive as a scripting language.
"We want a language that's open source, with a liberal license. We want the speed of C with the dynamism of Ruby. We want a language that is homoiconic, with true macros like Lisp. We want something as usable for general programming as Python. Something as easy for statistics as R."
— Why We Created Julia, Jeff Bezanson et al., 2012

# Julia is JIT-compiled — first call compiles, subsequent calls are fast
function sum_squares(n::Int)::Float64
    total = 0.0
    for i in 1:n
        total += i^2
    end
    return total
end

# @time macro measures runtime
@time sum_squares(1_000_000)   # first call: ~0.5s (compilation + execution)
@time sum_squares(1_000_000)   # second call: ~0.001s (just execution — C speed)

# Type annotations are optional but unlock full optimisation
function fast_dot(a::Vector{Float64}, b::Vector{Float64})
    sum(@. a * b)   # @. broadcasts elementwise
end
The @. macro broadcasts every operation in the expression: a * b becomes element-wise multiplication, and all the dotted operations fuse into a single loop rather than allocating a temporary array for each step.
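As a minimal sketch (variable names are illustrative), the fused form and a hand-written loop compute the same result:

```julia
# Sketch: @. rewrites every operation as a broadcast and fuses them.
a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]

y = similar(a)
@. y = a * b + 1        # expands to y .= (a .* b) .+ 1: one fused, in-place loop

# The equivalent hand-written loop:
z = similar(a)
for i in eachindex(a)
    z[i] = a[i] * b[i] + 1
end

y == z   # true
```

Because the left-hand side of `@. y = …` is also dotted, the result is written directly into `y` with no intermediate arrays.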
02 — Multiple Dispatch
Julia's most distinctive feature: a function can have many methods, and the most specific one is called based on the types of all arguments. This isn't just method overloading — it's a foundation for extensible, composable libraries.
abstract type Animal end
struct Dog <: Animal; name::String; end
struct Cat <: Animal; name::String; end

# Different behaviour for different combinations of types
interact(a::Dog, b::Dog) = "$(a.name) and $(b.name) play fetch!"
interact(a::Cat, b::Cat) = "$(a.name) ignores $(b.name)."
interact(a::Dog, b::Cat) = "$(a.name) chases $(b.name)!"
interact(a::Cat, b::Dog) = "$(a.name) hisses at $(b.name)."

interact(Dog("Rex"), Cat("Whiskers"))   # "Rex chases Whiskers!"
This isn't OOP method dispatch — both argument types determine which method runs. This makes operator overloading, unit systems, and automatic differentiation composable across independent packages.
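To illustrate the composability claim, here's a hedged sketch: a hypothetical Meters type (the name and methods are illustrative, not from any real units package) gains arithmetic by adding methods to Base operators, and generic code written without Meters in mind then works with it.

```julia
# Sketch: a hypothetical unit type that extends Base operators.
struct Meters
    value::Float64
end

Base.:+(a::Meters, b::Meters) = Meters(a.value + b.value)
Base.:*(k::Real, m::Meters)   = Meters(k * m.value)

# Base.sum was written long before Meters existed; it only needs +,
# so dispatch finds our new method automatically.
lengths = [Meters(1.0), Meters(2.5), Meters(0.5)]
total = sum(lengths)        # Meters(4.0)
scaled = 3 * Meters(2.0)    # Meters(6.0)
```

This is the pattern real packages like Unitful.jl and the AD ecosystem rely on: new types plug into existing generic functions by adding methods, not by editing the original code.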
03 — Broadcasting
Julia's broadcasting system lets you apply any function element-wise to arrays by adding a dot. You don't need a special vectorised version — just dot-apply the scalar function. The compiler fuses loops for you.
# A scalar function
clamp01(x) = max(0.0, min(1.0, x))

pixels = [-0.3, 0.5, 1.2, 0.8, -0.1]

# Dot-broadcasting applies it element-wise — no loop, no special version
clamped = clamp01.(pixels)   # [0.0, 0.5, 1.0, 0.8, 0.0]

# Works with any operator too
a = [1, 2, 3]; b = [4, 5, 6]
a .* b    # [4, 10, 18] — elementwise multiply
a .^ 2    # [1, 4, 9] — elementwise square

# @. broadcasts every operation in an expression
@. sin(a) + cos(b)   # fused into a single loop, no temps
Dot-broadcasting is syntactic sugar that compiles to a single fused loop. There's no runtime overhead from the abstraction — the code the compiler generates is identical to a hand-written loop.
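Broadcasting also combines arguments of different shapes, not just same-size arrays; a short sketch of the shape rules (values chosen for illustration):

```julia
# Sketch: broadcasting expands singleton dimensions to match shapes.
col = [1.0, 2.0, 3.0]     # 3-element vector (acts like a 3×1 column)
row = [10.0 20.0]         # 1×2 matrix

grid = col .+ row         # 3×2 matrix holding every pairwise sum
# grid == [11.0 21.0; 12.0 22.0; 13.0 23.0]

# Scalars broadcast against anything
scaled = 2 .* col .+ 1    # [3.0, 5.0, 7.0]
```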
04 — Macros
Julia macros operate on the abstract syntax tree, transforming code before it compiles. Unlike C preprocessor macros, they're hygienic and work on structured syntax rather than raw text. They're how Julia implements tools like @time and @test.
# @assert — readable test that shows the failing expression
@assert 2 + 2 == 4

# @show — prints variable name and value
x = 42
@show x   # prints: x = 42

# @benchmark from BenchmarkTools.jl — statistical timing
using BenchmarkTools
@benchmark sort(rand(1000))

# Custom macro — transform code at parse time.
# esc() deliberately escapes hygiene so the caller's own variables are swapped;
# tmp stays hygienic, so it can't collide with a caller variable named tmp.
macro swap(a, b)
    quote
        tmp = $(esc(a))
        $(esc(a)) = $(esc(b))
        $(esc(b)) = tmp
    end
end

x, y = 1, 2
@swap x y   # x is now 2, y is now 1
The :(expression) syntax creates an AST node — a quoted expression. Macros receive and return these nodes, letting you rewrite syntax patterns before compilation.
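A small sketch of working with quoted expressions directly (the variable names are illustrative):

```julia
# Sketch: a quoted expression is an Expr tree you can inspect and build.
ex = :(a + b * c)
ex.head            # :call
ex.args            # Any[:+, :a, :(b * c)]

# Build the same tree programmatically, then evaluate it
a, b, c = 1, 2, 3
built = Expr(:call, :+, :a, Expr(:call, :*, :b, :c))
eval(built)        # 7
```

Macros receive arguments already parsed into this Expr form, which is why they can pattern-match on structure instead of doing textual substitution.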
05 — Unicode & Mathematics
Julia supports Unicode identifiers natively — you can write α, √, π, ∈ directly in your code. Mathematical algorithms read like their textbook descriptions, closing the gap between notation and implementation.
# Type directly from LaTeX: \alpha + Tab → α
function gaussian(x, μ, σ)
    (1 / (σ * √(2π))) * exp(-(x - μ)^2 / (2σ^2))
end

# Implicit multiplication: 2x means 2 * x
function circle_area(r)
    π * r^2   # π is a built-in constant
end

# ∈ from set theory works in loops
primes = [2, 3, 5, 7, 11]
for p ∈ primes
    println(p^2)
end
Implicit multiplication (2x) and Unicode identifiers mean that the formula from a research paper can be typed almost verbatim into Julia. The code is the maths.
06 — The Whole Picture
- Scientific ecosystem: differential equations, optimisation, machine learning. Julia's packages are purpose-built for scientific work.
- Interoperability: call Python, R, C, and Fortran from Julia without FFI boilerplate, and use the existing scientific Python ecosystem seamlessly.
- Parallelism: threads, distributed computing, and GPU acceleration are all first-class and composable within the same language.
- Automatic differentiation: Zygote.jl differentiates Julia code automatically. Write normal functions; get gradients for free via multiple dispatch.
- Portable compilation: Julia compiles to native code for every platform LLVM supports, including GPUs, with no separate compilation step.
- Reproducibility: a built-in package manager with exact version pinning and project-local environments. Reproducibility by default.
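As one concrete example of the parallelism point above, here's a hedged sketch using the standard Base.Threads module (start Julia with -t N to enable N threads; the function name is illustrative):

```julia
using Base.Threads

# Sketch: a thread-parallel reduction using an atomic accumulator,
# so concurrent updates from different threads never race.
function parallel_sum_squares(n)
    total = Atomic{Float64}(0.0)
    @threads for i in 1:n
        atomic_add!(total, Float64(i)^2)   # atomic update of the shared total
    end
    return total[]
end

parallel_sum_squares(1000)   # 333833500.0 (= n(n+1)(2n+1)/6 for n = 1000)
```

The same code runs correctly on a single thread; @threads simply partitions the loop across however many threads are available. For performance-critical reductions, per-task partial sums are usually faster than a contended atomic, but the atomic keeps the sketch short and race-free.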