31 Jan 2017

On mocks and stubs in Python (free monad or interpreter pattern)

A few weeks ago I watched a video where Ken Scambler talks about mocks and stubs. In particular he talks about how to get rid of them.

What he's talking about is free monads, but I feel he's glossing over a lot of details. Based on some of the questions asked during the talk I think I share that feeling with some people in the audience. Specifically, I feel he skipped over how the actions, the interpreter, and the rest of the program actually fit together in practice.

Every time I've used mocks and stubs for unit testing I've had a feeling that "this can't be how it's supposed to be done!" So to me, Ken's talk offered some hope, and I really want to know how applicable the ideas are in mainstream OO/imperative languages.

The example

To play around with this I picked the following function (in Python):

import os

def count_chars_of_file(fn):
    fd = os.open(fn, os.O_RDONLY)
    text = os.read(fd, 10000)
    n = len(text)
    os.close(fd)
    return n

It's small and simple, but I think it suffices to highlight a few important points. So the goal is to rewrite this function such that calls to IO operations (actions, e.g. os.read) are replaced by data (an instance of some data type) conveying the intent of the operation. This data can later be passed to an interpreter of actions.
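To make that concrete before diving in, here is the shape of the idea; the file name and the plain tuple are made up for illustration (the real action type is introduced further down):

# Instead of performing the IO call directly, e.g.
#
#     fd = os.open('example.txt', os.O_RDONLY)
#
# we build a plain value that only describes the intent; an interpreter
# can decide later whether and how to actually perform it:
action = ('open', ['example.txt'])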

Thoughts on the execution of actions and the interpreter pattern

When reading the examples in the description of the interpreter pattern, what stands out to me is that they are either

  1. a list of expressions, or
  2. a tree of expressions

that is passed to an interpreter. Will this do for us when trying to rewrite count_chars_of_file?

No, it won't! Here's why:

  • A tree of actions doesn't really make sense. Our actions are small and simple, they encode the intent of a single IO operation.
  • A list of actions can't deal with interspersed non-actions; in this case it's the line n = len(text) that causes a problem.

The interpreter pattern misses something that is crucial in this case: running the interpreter must be intermingled with running non-interpreted code. The way I think of it is that not only does the action need to be present and dealt with, but so does the rest of the program; that latter thing is commonly called a continuation.

So, can we introduce actions and rewrite count_chars_of_file such that we pause the program when interpretation of an action is required, interpret it, and then resume where we left off?

Sure, but it's not really idiomatic Python code!

Actions and continuations

The IO operations (actions) are represented as a named tuple:

import collections

Op = collections.namedtuple('Op', ['op', 'args', 'k'])

and the functions returning actions can then be written as

def cps_open(fn, k):
    return Op('open', [fn], k)

def cps_read(fd, k):
    return Op('read', [fd], k)

def cps_close(fd, k):
    return Op('close', [fd], k)

The interpreter is then an if statement checking the value of op.op with each branch executing the IO operation and passing the result to the rest of the program. I decided to wrap it directly in the program runner:

def runProgram(prog):
    def runOp(op):
        if op.op == 'open':
            fd = os.open(*op.args, os.O_RDONLY)
            return op.k(fd)
        elif op.op == 'read':
            text = os.read(*op.args, 10000)
            return op.k(text)
        elif op.op == 'close':
            os.close(*op.args)
            return op.k()

    while isinstance(prog, Op):
        prog = runOp(prog)

    return prog

So far so good, but what does all of this do to count_chars_of_file?

Well, it's not quite as easy to read any more (basically it's been rewritten in continuation-passing style, CPS):

def count_chars_of_file(fn):

    def cont_1(text, fd):
        n = len(text)
        return cps_close(fd, lambda n=n: n)

    def cont_0(fd):
        return cps_read(fd, lambda text, fd=fd: cont_1(text, fd))

    return cps_open(fn, cont_0)
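Running it is then a matter of handing the returned chain of actions to runProgram; the file name below is just a made-up example:

# 'example.txt' is a made-up file name used for illustration.
n = runProgram(count_chars_of_file('example.txt'))
print(n)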

Generators to the rescue

Python does have a notion of continuations in the form of generators[1]. By making count_chars_of_file into a generator it's possible to remove the explicit continuations, and the program actually resembles the original one again.
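For anyone who hasn't used generator.send before, the mechanics in isolation look something like this (the generator and the values are made up for the example):

def demo():
    # Execution pauses at each yield; whatever is passed to send()
    # becomes the value of that yield expression when we resume.
    x = yield 'first request'
    y = yield 'second request'
    return x + y

g = demo()
print(g.send(None))   # start the generator, prints 'first request'
print(g.send(1))      # resume with x = 1, prints 'second request'
try:
    g.send(2)         # resume with y = 2, the generator returns 3
except StopIteration as e:
    print(e.value)    # prints 3

The yield expressions are exactly the pause points where an interpreter gets to act.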

The type for the actions loses one member, and the functions creating them lose an argument:

Op = collections.namedtuple('Op', ['op', 'args'])

def gen_open(fn):
    return Op('open', [fn])

def gen_read(fd):
    return Op('read', [fd])

def gen_close(fd):
    return Op('close', [fd])

The interpreter and program runner must be modified to step the generator until its end:

def runProgram(prog):
    def runOp(op):
        if op.op == 'open':
            fd = os.open(op.args[0], os.O_RDONLY)
            return fd
        elif op.op == 'read':
            text = os.read(op.args[0], 10000)
            return text
        elif op.op == 'close':
            os.close(op.args[0])
            return None

    try:
        op = prog.send(None)
        while True:
            r = runOp(op)
            op = prog.send(r)
    except StopIteration as e:
        return e.value

Finally, the generator-version of count_chars_of_file goes back to being a bit more readable:

def count_chars_of_file(fn):
    fd = yield gen_open(fn)
    text = yield gen_read(fd)
    n = len(text)
    yield gen_close(fd)
    return n
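This is also where the connection back to mocks and stubs becomes concrete: since count_chars_of_file now only yields data, a test can run it against a pure interpreter instead of the real one, without touching the file system. A minimal sketch, where runProgramPure and the canned responses are made up for the example:

def runProgramPure(prog, responses):
    # 'responses' maps an action name to the canned value the fake
    # interpreter answers with; no IO is performed.
    try:
        op = prog.send(None)
        while True:
            op = prog.send(responses[op.op])
    except StopIteration as e:
        return e.value

# A test of count_chars_of_file without mocks, stubs, or a file system:
fake = {'open': 42, 'read': b'hello', 'close': None}
assert runProgramPure(count_chars_of_file('ignored.txt'), fake) == 5

Running it for real is still just runProgram(count_chars_of_file(fn)).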

Generators all the way

Limitations of Python generators mean that we either have to push the interpreter (runProgram) down to where count_chars_of_file is used, or make all intermediate layers into generators and rewrite the interpreter to deal with this. The latter could look something like this:

import types

def runProgram(prog):
    def runOp(op):
        if op.op == 'open':
            fd = os.open(op.args[0], os.O_RDONLY)
            return fd
        elif op.op == 'read':
            text = os.read(op.args[0], 10000)
            return text
        elif op.op == 'close':
            os.close(op.args[0])
            return None

    try:
        op = prog.send(None)
        while True:
            if isinstance(op, Op):
                r = runOp(op)
                op = prog.send(r)
            elif isinstance(op, types.GeneratorType):
                r = runProgram(op)
                op = prog.send(r)

    except StopIteration as e:
        return e.value
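As an example of such an intermediate layer, a caller that combines two counts has to be a generator itself and yield the sub-program; the function name and the commented-out file names are made up:

def count_chars_of_two_files(fn0, fn1):
    # Yield the sub-generators; the interpreter above recognises them,
    # runs them recursively, and sends their results back in.
    n0 = yield count_chars_of_file(fn0)
    n1 = yield count_chars_of_file(fn1)
    return n0 + n1

# total = runProgram(count_chars_of_two_files('a.txt', 'b.txt'))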

Final thoughts

I think I've shown one way to achieve, at least parts of, what Ken talks about. The resulting code looks almost like "normal Python". There are some things to note:

  1. Exception handling is missing. Exceptions raised by the IO operations could either be passed into the generator via generator.send as ordinary values, which would make the handling code look decidedly non-Pythonic, or injected at the paused yield using generator.throw, which would let count_chars_of_file use an ordinary try/except (see the sketch after this list).
  2. Using this approach means the language must have support for generators (or some other way to represent the rest of the program). I think this rules out Java, but probably it can be done in C#.
  3. I've only used a single interpreter here, but I see no issues with combining interpreters (to combine domains of operations like file operations and network operations). I also think it'd be possible to use it to realize what De Goes calls Onion Architecture.
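Regarding the first point, here is a minimal sketch of what exception injection could look like, assuming the same imports as above (runProgramThrow is a made-up name):

def runProgramThrow(prog):
    # A variant of runProgram that injects IO errors into the generator
    # with generator.throw instead of returning them as values.
    def runOp(op):
        # Same dispatch as in runProgram above.
        if op.op == 'open':
            return os.open(op.args[0], os.O_RDONLY)
        elif op.op == 'read':
            return os.read(op.args[0], 10000)
        elif op.op == 'close':
            os.close(op.args[0])
            return None

    try:
        op = prog.send(None)
        while True:
            try:
                r = runOp(op)
            except OSError as e:
                # Raise the error inside the generator, at the yield where
                # it is currently paused; a try/except around that yield in
                # count_chars_of_file can then catch it as usual.
                op = prog.throw(e)
            else:
                op = prog.send(r)
    except StopIteration as e:
        return e.value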

Would I ever advocate this approach for a larger Python project, or even any project in an OO/imperative language?

I'm not sure! I think that testing using mocks and stubs has led to a smelly code base each and every time I've used it, but this approach feels a little too far from how OO/imperative code is usually written. I would love to try it out and see what the implications really are.

Footnotes:

[1] I know, I know, coroutines! I'm simplifying and brushing over details here, but I don't think I'm brushing over any details that are important for this example.

Tags: python free_monad