D3D12 in Multi-GPU

A_Jerbi
Posts: 1
Joined: Tue Dec 01, 2020 2:10 pm


Post by A_Jerbi »

Hello everyone!

We developed a Warp and Blend plugin for setting up multiple projectors with Prepar3D. It uses a D3D12 texture, and while it runs seamlessly on a single GPU, it fails in multi-GPU setups with this entry in ContentErrors.txt: "DirectX device removed. Reason: dxgi_error_device_hung"

Is this a limitation of the current SDK regarding multi-GPU and DirectX 12, or are we missing some configuration that enables it?

* Please note that we are using the latest version of the SDK (5.1.12) and V5 of the software, and we keep up with hotfixes and GPU driver updates.

We would highly appreciate any help and assistance,

Best,
Jerbi
Beau Hollis
Lockheed Martin
Posts: 2452
Joined: Wed Oct 06, 2010 3:25 pm

Re: D3D12 in Multi-GPU

Post by Beau Hollis »

Sorry for the late reply on this. Prepar3D does support multiple GPUs, and this extends to add-ons; however, as with DX11 add-ons in v4, a bit of additional work is required in the plugin code to ensure that things work properly.

Internally, Prepar3D stores one instance of your registered rendering/texture plugin. In the render callbacks, we provide the device, command queue, and resources needed for rendering. If the texture is used on more than one GPU, your render callback will be hit once per GPU, and a different set of device and resource pointers is passed each time. For one plugin instance to work correctly on multiple GPUs simultaneously, it needs to create a set of DX resources for each GPU. We typically create a struct/class to hold the per-device resources and create an array of four, then use the adapter index to grab the correct set. The OpenGLTexture sample provides a good example of this:

Code:

void OpenGLTexture::Render(IRenderDataV500* pRenderData)
{
    if (pRenderData == nullptr || pRenderData->GetDevice() == nullptr)
    {
        return;
    }

    if (pRenderData->GetOutputColor())
    {
        CComPtr<IRenderDataResourceV500> spColorResource = pRenderData->GetOutputColor();
        // Index the per-device resource set by adapter ID so each GPU
        // renders with the device, queue, and resources created for it.
        m_GLDX[pRenderData->GetAdapterID()].Render(pRenderData->GetDevice(), 
            pRenderData->GetCommandQueue(D3D12_COMMAND_LIST_TYPE_DIRECT),
            spColorResource,
            (int) pRenderData->GetTextureWidth(),
            (int) pRenderData->GetTextureHeight());
    }
}
For a display-specific add-on, you might also be able to simply register one plugin instance per GPU, each with a different name.
Beau Hollis
Prepar3D Software Architect