Consider this pseudocode:

[code language="text" gutter="false"]
Initialize COM;
Initialize Direct3D device;
Open Direct3D window;

while (Program Window Open)
begin
    VertexBuffer := D3DDevice.CreateVertexBuffer(size=100);
    VertexBuffer.Release;
end;

Close Direct3D window;
Release Direct3D;
Uninitialize COM;
[/code]
Is there a memory leak in the above code? This is just pseudocode, so I’m not trying to be tricky with syntax and I haven’t omitted any key steps in init or shutdown. The answer I should get is “no”. And yet, in reality, the answer is “maybe”.

All the code does is allocate a vertex buffer (which is the first step of any drawing operation in Direct3D), and then immediately release the buffer. Simple. There’s no memory leak here in the application. There’s no memory leak in Direct3D.

In the final stages of preparing my Five Hundred game for release, I started running test builds on all of the PCs I could get my hands on: everything from a Win7 Core i7 with 12 GB RAM and a GeForce 770, through a Win8 Surface Pro tablet, down to an old Core 2 Duo with Intel Q35 Express integrated graphics (i.e. an old low-end office PC).

I noticed something very weird on the Core 2 Duo system. In Task Manager, over time, the Working Set (memory usage) continually increased. FPC’s heap trace tools showed no leaks. DXSpy showed all of the Direct3D objects were being constructed and released correctly. Working Set usage stayed static on all the other systems, only increasing on this one. What was going on?

Shortcutting past a few hours of debugging: I boiled the problem down to the minimal steps to reproduce, which was essentially the code above. It turns out that this particular Direct3D driver leaks memory: from the application’s perspective, and DirectX’s perspective, everything is all good, but there’s a reference remaining in the driver that will never be cleaned up until the application ends. The conclusive proof was that when using the “reference rasteriser” (where D3D just runs a software emulation instead of using the hardware drivers) on the very same system, the problem just disappeared.
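If you want to run the same driver-vs-application test yourself, it’s a one-parameter change when creating the device. In D3D9 terms it looks roughly like this (pseudocode sketch; check the exact parameter list against the DirectX SDK docs for your version):

[code language="text" gutter="false"]
// Normal hardware path, goes through the vendor driver:
D3D.CreateDevice(Adapter, D3DDEVTYPE_HAL, Window, Flags, PresentParams, Device);

// Reference rasteriser: pure software emulation, bypasses the driver:
D3D.CreateDevice(Adapter, D3DDEVTYPE_REF, Window, Flags, PresentParams, Device);
[/code]

If the Working Set growth disappears under D3DDEVTYPE_REF with everything else unchanged, the leak is below the API: in the driver, not in your code.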

After a lot of time spent searching the web, it turns out I wasn’t the first to come across this issue, but every answer I could find just told the questioner “you’re obviously not releasing your resources correctly, check again”, and that was the end of the discussion. While that’s always the first thing to check, and to check very thoroughly, it isn’t *always* the cause. I wanted this written up on the net, where it will hopefully show up in search results, so that anyone who hits this in future can be spared the hours it takes to discover that the problem isn’t in their code.

Fortunately, the memory leak is avoidable. In an optimal program, you shouldn’t really be allocating and releasing vertex buffers all the time: the goal is to allocate as many as you need and then re-use them throughout the application. I had planned to do this anyway, so finding this just raised the priority on it for me. And it’s not the end of the world: at least it’s not D3D hardware resources being wasted, just working set, which will eventually be swapped out because it’s never touched again. It’s irritating though.
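In the same pseudocode style as above, the workaround is simply to hoist the allocation out of the loop:

[code language="text" gutter="false"]
Initialize COM;
Initialize Direct3D device;
Open Direct3D window;

// Allocate once, up front:
VertexBuffer := D3DDevice.CreateVertexBuffer(size=100);

while (Program Window Open)
begin
    // Lock / write vertices / unlock and draw from the same buffer
    // each frame. No per-frame Create/Release means the driver never
    // gets a chance to accumulate its leaked references.
    VertexBuffer.Lock;
    Write vertices;
    VertexBuffer.Unlock;
    Draw;
end;

// Release once, at shutdown:
VertexBuffer.Release;

Close Direct3D window;
Release Direct3D;
Uninitialize COM;
[/code]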

Those Intel chipsets are out of support; the last driver update was in 2009, so this will never be fixed. There are still a large number of these PCs out there, however, because those chipsets went into almost every XP-era system that didn’t ship with a dedicated graphics card (including millions of office PCs). Supporting those systems is essential for me with a Five Hundred card game, because I imagine a lot of the potential players won’t have invested in the latest gaming-spec hardware.

Next time (and I now plan to write articles for this blog every Sunday from now on!), I will talk about the effect of different code optimizations on Direct3D performance for low-end (these old XP-era Intel GMA systems) and high-end (GeForce 770) machines. I was quite surprised by my findings!
