
DepthGuard, a.k.a. "It seemed like a good idea at the time"

Reviewing my biggest blunder as a software developer.

Way back in 2012 I was employed as a Senior Developer at a small company in my home town. One of my first tasks for a new client was to build a new CMS that could be used as the foundation for all of their future projects.

So, I got to work building a fresh new project, and the in-things at the time were Entity Framework and ASP.NET MVC. I had taken note of various methodologies and design patterns, and the prevailing ideas of the day were:

  • Inversion of Control via Dependency Injection
  • Separation of concerns
  • Single Responsibility (and other SOLID principles)
  • Loosely-coupled components

Now, taking a lot of these principles to a very fine granularity, what I ended up with was:

  • Entities - The actual record objects managed by Entity Framework
  • Models - Our problem-domain specific representations of objects/ideas/concepts
  • Type Converters - Mapping from type one to the other

Naturally, the best fit for this solution was AutoMapper, and given I wanted to ensure that each class I created had a single responsibility and wasn't tightly coupled to anything else, I leant heavily on the ITypeConverter interface provided by AutoMapper:

interface ITypeConverter<TSource, TDestination>
{
    TDestination Convert(ResolutionContext context);
}

Now, this interface itself is innocent enough, and should you use AutoMapper correctly, it poses no problems. But I didn't use AutoMapper correctly; clearly, I wanted to ensure there was no dependency on AutoMapper! So what did I do? I created a service that hid the implementation of AutoMapper. Remember, no tight coupling!

interface IMappingService
{
    TDestination Map<TSource, TDestination>(
        TSource source, 
        TDestination destination = default(TDestination))
        where TDestination : class, new();
}

public class MappingService : IMappingService
{
    public TDestination Map<TSource, TDestination>(
        TSource source, 
        TDestination destination = default(TDestination))
        where TDestination : class, new()
    {
        return Mapper.Map(source, destination);
    }
}

Let's pivot slightly and talk about Entity Framework. Entity Framework, like other OR/Ms, supports the notion of navigation properties. These are properties on your entities which are automatically hydrated by Entity Framework, either through eager loading, lazy loading, or automatic fix-up of references between already-loaded entities. That last one is quite important. Let's imagine a simple scenario:

public class Cart
{
    public int Id { get; set; }
    public virtual ICollection<CartItem> Items { get; set; }
}

public class CartItem
{
    public int Id { get; set; }
    public int CartId { get; set; }
    public virtual Cart Cart { get; set; }
}

When Entity Framework is loading my Cart object, where I have told it to load the Items collection too (side note - best not to do this; it generates some bloated SQL results), it will automatically patch up the Cart property of each CartItem. So already we can see that we can navigate from Cart -> CartItem -> Cart -> CartItem, etc.

I found this out the hard way, because when I ran my code with its nicely separated classes, I appeared to hit infinite recursion. I thought to myself, surely AutoMapper should handle this automatically?

And it does, when you're using the Mapper type directly. I was not; I was calling into AutoMapper afresh each time, because I was going via my shiny MappingService, so AutoMapper couldn't track what had already been mapped and patch in the existing object.
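To make the shape of the problem concrete, here's a sketch of how one of those converters went back out through the service for every nested member. The converter and model names here (CartConverter, CartModel, CartItemModel) are hypothetical illustrations, not the original code, and the Convert signature is the era's AutoMapper API:

```csharp
// Hypothetical converter in the style of that codebase. Every nested
// member is mapped by calling back into IMappingService, and each of
// those calls invokes Mapper.Map from scratch - a brand-new resolution
// context with no memory of the objects already being mapped.
public class CartConverter : ITypeConverter<Cart, CartModel>
{
    private readonly IMappingService _mappingService;

    public CartConverter(IMappingService mappingService)
    {
        _mappingService = mappingService;
    }

    public CartModel Convert(ResolutionContext context)
    {
        var cart = (Cart)context.SourceValue;
        return new CartModel
        {
            Id = cart.Id,
            // Fresh mapping call per item: AutoMapper cannot see that
            // `cart` is already mid-map, so mapping item.Cart recurses
            // Cart -> CartItem -> Cart -> ... until the stack gives out.
            Items = cart.Items
                .Select(item => _mappingService.Map<CartItem, CartItemModel>(item))
                .ToList()
        };
    }
}
```

Each hop through IMappingService severed the context that would otherwise have let AutoMapper short-circuit the cycle.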

So it was at this stage I made the biggest technical mistake I have ever made. I implemented a dirty kludge, instead of addressing the design flaw in the application.

Enter DepthGuard.

DepthGuard was a check within my mapping service, which restricted how many levels of nesting should be supported for mapping complex object graphs.

public class MappingService
{
    [ThreadStatic]
    private static int _depthGuard;
    private static readonly int _depathGuardLimit 
        = int.Parse(From.AppSetting("DepthGuardLimit", "5"));
        
    public TDestination Map<TSource, TDestination>(
        TSource source,
        TDestination destination = default(TDestination))
        where TDestination : class, new()
    {
        if (_depthGuard > _depathGuardLimit)
            return default(TDestination);
            
        _depthGuard++;
        var result = Mapper.Map(source, destination);
        _depthGuard--;
        
        return result;
    }
}

Can you take a look and spot what is going to happen? If not, here's what happens:

  • When we call into Map<TSource, TDestination>, we increment the thread-static _depthGuard value.
  • When we have finished mapping, we decrement that same value.
  • If the current _depthGuard value is greater than the limit, we return default(TDestination); as TDestination is constrained to class, new(), this means we'll end up returning a null value.
  • The typo _depathGuardLimit is still there; I never bothered to correct it.

The idea was that for those complex object graphs managed by Entity Framework, we'd map most of the graph, depending on the nesting level of the graph itself. I know, you're already weeping for me, but hey, at least it was configurable!
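"Configurable" meaning something like the following - the key name comes from the code above, though I'm assuming here that it lived in the usual web.config appSettings section:

```xml
<!-- web.config: raise the nesting limit without touching code -->
<appSettings>
  <add key="DepthGuardLimit" value="5" />
</appSettings>
```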

Fast forward 11 years to today, and I have spent the better part of 3 solid days attempting to debug an Azure App Service app, which is powered by that same 11-year-old technology. I was getting what appeared to be random NullReferenceExceptions thrown all over the place, and I couldn't explain them. For a moment I thought it was Entity Framework's fault. Perhaps it was not loading those navigation properties at all?

It is lucky, really, that Visual Studio provides the facility to remotely debug an App Service instance. The only pain in setting this up is that I needed to manually download the build artifacts from Azure DevOps and load the symbols from there, as they are not published to a feed anywhere.

It was pure luck that, at a random breakpoint, I noticed objects just not mapping correctly, and that led me to find that the default DepthGuard limit of 5 is not enough for this particular problem domain.
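For the record, what I should have reached for back then was AutoMapper's own graph handling instead of a hand-rolled guard. A sketch, using the profile-based configuration of later AutoMapper versions rather than the 2012-era static API, and the same hypothetical model names as before:

```csharp
// Sketch: let AutoMapper handle the cycle itself.
public class CartProfile : Profile
{
    public CartProfile()
    {
        // PreserveReferences makes AutoMapper reuse instances it has
        // already mapped within the same operation - which is exactly
        // what a Cart <-> CartItem cycle needs.
        CreateMap<Cart, CartModel>()
            .PreserveReferences();

        // Alternatively, MaxDepth bounds how deep the mapper will
        // recurse into self-referencing members.
        CreateMap<CartItem, CartItemModel>()
            .MaxDepth(3);
    }
}
```

Either option keeps the depth concern inside the mapping configuration, where AutoMapper can actually see the whole operation.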

DepthGuard, to this date, is my least proud moment in software development. I might get that put on a t-shirt.