
Interpreter Pattern in C#: When and How to Build a DSL

Interpreter pattern in C# / .NET 10: build a tiny DSL for discount rules, when Expression trees beat hand-written interpreters, and when to use neither.

Table of contents
  1. What problem does the Interpreter pattern solve in C#?
  2. How does the textbook Interpreter look?
  3. How do Expression trees give you the same pattern for free?
  4. How do you build a tiny DSL parser in .NET 10?
  5. When does the Interpreter pattern misfire?
  6. How does Interpreter compare to Strategy and Visitor?
  7. What does a real .NET 10 example look like?
  8. Frequently asked questions

The marketing team wants to author discount rules without filing a ticket. "20% off if cart contains BOOK and total over 50". "Free shipping for orders weighing under 2 kg in the US". By next quarter they expect twenty rules, by next year a hundred. Hard-coding each rule in C# means every rule needs a deployment.

The Interpreter pattern is one answer. Define a small grammar for the rule language, parse rule strings into an abstract syntax tree, and have each node implement Evaluate(context). The Interpreter is the rarest GoF pattern in real .NET applications, because Expression trees, Roslyn scripting, and existing rule engines cover most cases. But understanding it tells you what those tools do under the hood — and when to actually reach for it.

What problem does the Interpreter pattern solve in C#?

The pattern earns its place when non-developers must author behaviour at runtime, in a language smaller than C#. Three concrete shapes:

  1. Business rule DSL. Discount rules, validation rules, eligibility rules. Edited by analysts, deployed without a build.
  2. Query languages. A search box that accepts tag:book price<50 author:rowling. Each token becomes an AST node; the evaluator filters the data store.
  3. Configuration with logic. A YAML file that says enable: when env=prod and rollout>50%. Avoids "config that is really code" anti-patterns by giving the config a declared, bounded grammar.

What is not an Interpreter problem: "I want to dispatch one of a few methods at runtime" — that is Strategy or Command. "I want users to write C#" — use Roslyn scripting; the pattern is overkill.

How does the textbook Interpreter look?

A class per grammar construct, each with an Eval(context) method that returns a value:

public sealed record CartContext(string[] Skus, decimal Total, double WeightKg, string Country);

public interface IRule { bool Eval(CartContext ctx); }

public sealed record ContainsSku(string Sku) : IRule
{
    public bool Eval(CartContext ctx) => ctx.Skus.Contains(Sku);
}

public sealed record TotalGreaterThan(decimal Threshold) : IRule
{
    public bool Eval(CartContext ctx) => ctx.Total > Threshold;
}

public sealed record And(IRule Left, IRule Right) : IRule
{
    public bool Eval(CartContext ctx) => Left.Eval(ctx) && Right.Eval(ctx);
}

public sealed record Or(IRule Left, IRule Right) : IRule
{
    public bool Eval(CartContext ctx) => Left.Eval(ctx) || Right.Eval(ctx);
}

The user-typed rule "cart contains BOOK and total over 50" parses into an And whose left child is a ContainsSku("BOOK") and right child is a TotalGreaterThan(50m). Calling .Eval(ctx) walks the tree.
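Built by hand, that tree is two nested constructor calls (a minimal sketch reusing the types above):

```csharp
// "cart contains BOOK and total over 50" as a hand-built AST
IRule rule = new And(new ContainsSku("BOOK"), new TotalGreaterThan(50m));

var ctx = new CartContext(new[] { "BOOK", "MUG" }, 75m, 1.0, "US");
Console.WriteLine(rule.Eval(ctx)); // True: BOOK is present and 75 > 50
```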

The structural picture (the tree, not a class hierarchy):

flowchart TB
    And[And]
    And --> Contains[ContainsSku 'BOOK']
    And --> Total[TotalGreaterThan 50]

Add a new operator (Not, WeightUnder) by writing one new class. The interpreter does not know about specific rules; the rules do not know about each other.
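Not and WeightUnder are each a handful of lines following the same shape (a sketch, not part of the listing above):

```csharp
// Negation: wraps any existing rule.
public sealed record Not(IRule Inner) : IRule
{
    public bool Eval(CartContext ctx) => !Inner.Eval(ctx);
}

// Supports the "under 2 kg" shipping rule from the introduction.
public sealed record WeightUnder(double MaxKg) : IRule
{
    public bool Eval(CartContext ctx) => ctx.WeightKg < MaxKg;
}
```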

How do Expression trees give you the same pattern for free?

System.Linq.Expressions is the BCL implementation of Interpreter. You build the AST by composing typed expression objects, then either walk it (LINQ-to-Entities translates to SQL) or compile it to a delegate:

using System.Linq.Expressions;

ParameterExpression ctx = Expression.Parameter(typeof(CartContext), "c");

Expression contains = Expression.Call(
    typeof(Enumerable),
    nameof(Enumerable.Contains),
    new[] { typeof(string) },   // type argument: Enumerable.Contains<string>
    Expression.Property(ctx, nameof(CartContext.Skus)),
    Expression.Constant("BOOK"));

Expression total = Expression.GreaterThan(
    Expression.Property(ctx, nameof(CartContext.Total)),
    Expression.Constant(50m));

Expression body = Expression.AndAlso(contains, total);

var lambda = Expression.Lambda<Func<CartContext, bool>>(body, ctx);
Func<CartContext, bool> rule = lambda.Compile();

rule(myCart) returns true or false. EF Core takes a similar tree and turns it into a SQL WHERE clause. The Interpreter pattern is hidden inside the BCL classes; you never write class And : IRule yourself.

For most application-internal rule logic, this is the right answer. Hand-written interpreters earn their keep when the user writes the rules — and writes them in a string the application has to parse.
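The "walk it" half is what ExpressionVisitor is for. This sketch just prints every node it visits, which is the same kind of traversal EF Core performs during SQL translation:

```csharp
public sealed class PrintingVisitor : ExpressionVisitor
{
    public override Expression? Visit(Expression? node)
    {
        if (node is not null)
            Console.WriteLine($"{node.NodeType,-12} {node}");
        return base.Visit(node); // recurse into children
    }
}
```

new PrintingVisitor().Visit(lambda) prints the Lambda, AndAlso, Call and GreaterThan nodes, plus the member accesses, constants and parameter underneath them, in traversal order.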

How do you build a tiny DSL parser in .NET 10?

The minimum-viable parser for the rule language above:

public static class RuleParser
{
    public static IRule Parse(string source)
    {
        var tokens = source.Split(' ', StringSplitOptions.RemoveEmptyEntries);
        return ParseAnd(tokens, 0, out _);
    }

    private static IRule ParseAnd(string[] tokens, int i, out int next)
    {
        var left = ParseAtom(tokens, i, out i);
        while (i < tokens.Length && tokens[i].Equals("and", StringComparison.OrdinalIgnoreCase))
        {
            var right = ParseAtom(tokens, i + 1, out i);
            left = new And(left, right);
        }
        next = i;
        return left;
    }

    private static IRule ParseAtom(string[] tokens, int i, out int next)
    {
        if (tokens[i].Equals("contains", StringComparison.OrdinalIgnoreCase))
        {
            next = i + 2;
            return new ContainsSku(tokens[i + 1]);
        }
        if (tokens[i].Equals("total>", StringComparison.OrdinalIgnoreCase))
        {
            next = i + 2;
            return new TotalGreaterThan(decimal.Parse(tokens[i + 1]));
        }
        throw new InvalidOperationException($"Unknown token: {tokens[i]}");
    }
}

// Usage
IRule rule = RuleParser.Parse("contains BOOK and total> 50");
bool applies = rule.Eval(new CartContext(new[] {"BOOK", "MUG"}, 75m, 1.0, "US"));

This is the whole Interpreter pattern compressed into 30 lines: a grammar, a parser, an AST, a walker. For a real rule engine you would replace the recursive-descent parser with a generated one (Sprache, Pidgin, Superpower) and add operator precedence — but the shape stays the same.
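If the language later needs or with lower precedence than and, that is one more recursive-descent level (a sketch of a method that would sit alongside ParseAnd inside RuleParser, with Parse calling ParseOr instead):

```csharp
// Lower precedence than "and": each operand of "or" is a full and-chain.
private static IRule ParseOr(string[] tokens, int i, out int next)
{
    var left = ParseAnd(tokens, i, out i);
    while (i < tokens.Length && tokens[i].Equals("or", StringComparison.OrdinalIgnoreCase))
    {
        var right = ParseAnd(tokens, i + 1, out i);
        left = new Or(left, right);
    }
    next = i;
    return left;
}
```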

When does the Interpreter pattern misfire?

Three traps:

  1. Grammar creep. The language starts as and/or plus a few predicates, then gains variables, loops, and functions until it is a worse C#. When users keep asking for "one more construct", Roslyn scripting or an off-the-shelf rule engine is the honest answer.
  2. Building it for three rules. A parser, an AST, a storage layer, and an editing UI to replace a switch expression that changes once a quarter. Count the rules before you count the classes.
  3. Trusting the input. Rule strings come from users, so a parser without length limits, nesting-depth limits, and clear error messages becomes a denial-of-service vector and a support burden.

How does Interpreter compare to Strategy and Visitor?

Pattern     | What is the abstraction?        | When does it apply?                                  | Modern .NET
Interpreter | Each AST node has Eval          | Runtime-authored rules / queries                     | Expression<Func<>>
Strategy    | One pluggable algorithm         | Choose at runtime which fixed implementation to call | Func<> injected via DI
Visitor     | Add an operation to a fixed AST | New operations on existing nodes                     | Pattern matching switch

The cleanest split: Interpreter is for parsing and evaluating text; Visitor is for adding operations to an existing tree; Strategy is for swapping a single algorithm. Visitor and Interpreter often pair: build the AST with Interpreter, walk it with Visitor.
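In modern C# that pairing is often just a switch expression acting as the Visitor over the rule AST (a sketch; it assumes the rule nodes are positional records, so they deconstruct):

```csharp
public static class RuleText
{
    // A Visitor in pattern-matching form: one new operation, zero new node classes.
    public static string Describe(IRule rule) => rule switch
    {
        ContainsSku(var sku)    => $"contains {sku}",
        TotalGreaterThan(var t) => $"total > {t}",
        And(var l, var r)       => $"({Describe(l)} and {Describe(r)})",
        Or(var l, var r)        => $"({Describe(l)} or {Describe(r)})",
        _ => throw new NotSupportedException(rule.GetType().Name)
    };
}
```

RuleText.Describe on the tree from the textbook section yields "(contains BOOK and total > 50)".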

What does a real .NET 10 example look like?

For ninety per cent of cases, the real .NET 10 example uses Expression trees rather than a hand-written interpreter. Here is the discount-rule scenario solved that way:

public sealed record CartContext(string[] Skus, decimal Total, double WeightKg, string Country);

public sealed class CompiledRule
{
    public CompiledRule(string source, Expression<Func<CartContext, bool>> expr)
    {
        Source   = source;
        Compiled = expr.Compile();
    }
    public string Source { get; }
    public Func<CartContext, bool> Compiled { get; }
    public bool Applies(CartContext ctx) => Compiled(ctx);
}

public sealed class DiscountRuleEngine
{
    private readonly List<CompiledRule> _rules = new();

    public void Add(string source, Expression<Func<CartContext, bool>> expr)
        => _rules.Add(new CompiledRule(source, expr));

    public IEnumerable<string> Matching(CartContext ctx)
        => _rules.Where(r => r.Applies(ctx)).Select(r => r.Source);
}

// Caller (rules expressed in C# at startup)
var engine = new DiscountRuleEngine();
engine.Add("books over 50",
    c => c.Skus.Contains("BOOK") && c.Total > 50m);
engine.Add("light orders to US",
    c => c.WeightKg < 2.0 && c.Country == "US");

var ctx = new CartContext(new[] { "BOOK", "MUG" }, 75m, 1.0, "US");
foreach (var match in engine.Matching(ctx))
    Console.WriteLine($"matched: {match}");

If marketing must author rules in their own string format, layer the RuleParser from the previous section on top to translate text into either an IRule AST or an Expression<Func<>>. The evaluation uses the BCL's interpreter; only the parser is yours.
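That translation layer is itself a small recursive walk over the IRule tree (a sketch; it assumes the rule nodes are positional records and covers only the node types from the textbook section):

```csharp
public static class RuleCompiler
{
    public static Func<CartContext, bool> Compile(IRule rule)
    {
        var ctx = Expression.Parameter(typeof(CartContext), "c");
        return Expression.Lambda<Func<CartContext, bool>>(Translate(rule, ctx), ctx).Compile();
    }

    // One case per AST node type: the Visitor side of the pairing.
    private static Expression Translate(IRule rule, ParameterExpression ctx) => rule switch
    {
        ContainsSku(var sku) => Expression.Call(
            typeof(Enumerable), nameof(Enumerable.Contains), new[] { typeof(string) },
            Expression.Property(ctx, nameof(CartContext.Skus)),
            Expression.Constant(sku)),
        TotalGreaterThan(var t) => Expression.GreaterThan(
            Expression.Property(ctx, nameof(CartContext.Total)),
            Expression.Constant(t)),
        And(var l, var r) => Expression.AndAlso(Translate(l, ctx), Translate(r, ctx)),
        Or(var l, var r)  => Expression.OrElse(Translate(l, ctx), Translate(r, ctx)),
        _ => throw new NotSupportedException(rule.GetType().Name)
    };
}
```

RuleCompiler.Compile(RuleParser.Parse("contains BOOK and total> 50")) then yields the same delegate as the hand-built expression tree earlier.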

A pragmatic ending: most C# developers will never write a hand-rolled Interpreter, and that is fine. What matters is recognising the pattern when you see it in EF Core, in Expression<>, in your favourite rule engine — and recognising when not to build your own. The framework usually wins.

Frequently asked questions

Are Expression trees in C# the Interpreter pattern?
Yes. Expression<Func<TIn, TOut>> is a tree of nodes (BinaryExpression, MethodCallExpression, ConstantExpression) where each node knows how to evaluate or compile itself. EF Core walks the tree to translate it to SQL — that is interpretation. The .NET BCL gives you the pattern; you almost never need to write your own AST classes.
When should I build a tiny DSL instead of using Expression trees?
When the users of the rules are not C# programmers. Marketing managers want to write if cart_contains BOOK and total > 50 then 20% off, not cart => cart.Items.Any(i => i.Sku == "BOOK") && cart.Total > 50. A tiny custom DSL with a parser and Interpreter classes can be safer than letting non-engineers write Roslyn scripts.
Is Roslyn scripting the same as the Interpreter pattern?
Roslyn scripting compiles C# at runtime — it is a compiler with an interpreter on top. The classical Interpreter pattern is one rung lower: you define your own abstract syntax tree and walk it. Use Roslyn when the language is a subset of C# and the users can write C#; use the pattern when you need a smaller, sandboxed language.
When does building any DSL feel like overkill?
When you have one to three rules and they change once a quarter. A switch expression and a deployment is simpler than a parser, an AST, an interpreter, a config UI, and a rule storage layer. The DSL becomes worth it around the time the rule list crosses 30 entries or stakeholders need to change rules without a deploy.
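At that small end of the scale, the two rules from the introduction fit in a couple of expressions (a sketch; the 20% figure is from the intro, the member names are illustrative):

```csharp
public static class Discounts
{
    // "20% off if cart contains BOOK and total over 50"
    public static decimal DiscountPercent(CartContext ctx) =>
        ctx.Skus.Contains("BOOK") && ctx.Total > 50m ? 20m : 0m;

    // "Free shipping for orders weighing under 2 kg in the US"
    public static bool FreeShipping(CartContext ctx) =>
        ctx is { WeightKg: < 2.0, Country: "US" };
}
```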