FPS drop when Video Output is enabled

Hi all,

I am experiencing a sharp drop in FPS when I enable Video Outputs. My file setup:

  • 7x mp4 video files, 60 minutes each, 24 fps, 2160x3840 portrait format, ~4 GB per file
  • 7x Video Reader --> Render to Canvas nodes in one layer
  • 7x canvases (2160 x 3840)
  • 7x Video Outputs, one canvas mapped to each
  • 7x physical screens

As soon as I enable one of the video outputs, my FPS drops from 60 to about 47, even when the timeline is not playing. If I enable a second output, it drops to 36, then 26 with a third output, and so on from there.

The GPU and other system resources seem fine.

Questions / solutions

  • Is the file size of each video too big?
  • Should I consolidate my videos into a 2D layer / single texture, rather than using 7x Render to Canvas nodes?
  • Is this just too many pixels for LightAct to push at a good frame rate?

Video here of this test.

Thanks!
Scott

Hi,

I would recommend using a structured approach to pinpoint the cause of this issue. For example, you mentioned that the FPS drops when you enable one of the video outputs. If I’m reading this correctly, all 7 video files are being read, played, and rendered to their respective canvases perfectly fine at a high framerate as long as you don’t enable a video output? If that’s true, then I’m guessing your issue is not related to the video files at all but to the GPUs and the hardware side of things, right?

Also, since you’ve been testing this setup for quite a while, did you experience this on your other computer too?

Last but not least, does the issue happen even if you try to render, say, 7 generative content nodes (e.g. Color Palette) instead of the 7 video files?

Thanks,
Mitja

Hi Mitja,

Appreciate these ideas! Yes, I am able to read, play, and render the video files to their canvases, and map the video outputs, with a consistent 60 fps frame rate, as long as the video outputs are disabled. But as soon as I enable a Video Output, the frame rate drops.

You’re right – I have been testing this for a while, and yes, I have seen the same FPS issue on my testing PC. However, I have not had access to the production PC, with 7x outputs, until this week. My testing PC has only 2x video outs, and did not have nearly the same GPU / CPU specs as the production PC – I’m running a Quadro M2200, believe it or not, and our production PC has 2x A6000 – so I had been hoping the dramatic increase in GPU speed / VRAM would solve the FPS issue.

I tested with the Color Sine node passing the texture to each video screen, and the result was the same: 60 FPS until I enabled the video outputs. This tells me it is not an issue with our mp4s being too heavy – does that sound right to you?

Video here of the test.
Video here showing the hardware performance / GPU usage.

Are there any other LightAct parameters I should be considering, or a different render pipeline / workflow I should test? I am going to look into possible issues on the GPU side.

Thanks,
Scott

Hi Scott,

We can definitely rule out the video files as the cause, because if they were, we wouldn’t see an FPS drop when using a Color Sine node.

I would bet that the issue is somehow connected to the fact that you are using 2 GPUs. Judging from the fact that FPS drops when you enable the outputs, I would say that the texture has to stream from one GPU to the other and this slows down the performance. Are you using NVLink at all?

What I would also be interested in is this:

  • Try to find out which GPU LightAct is running on, and see if the FPS drops when you enable only the outputs on that same GPU.
  • Just for testing purposes, remove one GPU and see how the FPS behaves.

I’m sorry I can’t be of more help here, but we never use multiple GPUs, so my experience is a bit limited. @John might be able to help you more, though, as it seems he has set up a multi-GPU system successfully.

All best,
Mitja

Hi smsmith,

I can’t see your video. Can you reshare it with “anyone with the link” access enabled?
We use a computer with 3 GPUs:
1x RTX 4000 for monitoring
2x RTX 5000 with NVLink (though in our case NVLink only slightly increases performance), with 2x 4 outputs.
In NVIDIA Mosaic, I set up a desktop spanning the 8 outputs and set it as the “primary screen” (very important; I think this is your problem).
In the NVIDIA driver, you have to set LightAct to run on the GPU your outputs are connected to (the RTX 5000s in our case).
When you launch LightAct, the app opens on the primary screen. I use the Shift + Windows + Left/Right Arrow shortcut to move the LightAct window onto the monitoring screen. Sometimes I still get an FPS drop depending on what I run; then I need to click once anywhere in the output display to focus LightAct’s output window.

Hi Jonathan, Mitja,

I’ve reshared the links – apologies for the access issue.

Video here of the original test, with 7x 4K videos rendered to 7x canvases.

Video here of the Color Sine test.
Video here showing the hardware performance / GPU usage during the Color Sine test.

Thanks for taking the time. @John I’m going to take some time to think through your setup and will get back with the results!

@John,

Thanks again for the reply and for sharing your experience.

I am not using NVIDIA Mosaic currently. I was trying instead to have 8 individual displays set up in the Windows “Display” options. The setup is like this:

Display 1 is the GUI (1080p), and 2-8 are the exhibition displays (4K UHD).

I’m not against Mosaic; I’m just inexperienced with this kind of system. Should I be using NVIDIA Mosaic instead? If I understand correctly, with Mosaic I would only have 2x displays visible, and I could use Mosaic to spread Display 2 over outputs 2-8. Is that correct? Something like this:

(I am making these diagrams myself, since I don’t have access to the production PC at the moment, sorry if they look strange!)

Thanks,
Scott

Yep, that’s it. And don’t forget to focus the output window. With the two-desktop system, you will have one LightAct window, and the second one is the big full-screen window covering the entire second desktop (which needs to be set as primary). It’s this LightAct window that needs to be focused. With TeamViewer, I think you need to switch to the second desktop and just click anywhere.

@John great – thanks so much. I will have access to the PC again in a couple of days to test Mosaic.

@meetya I did some testing yesterday and learned a couple of things:

  • The FPS drop still happens even when one of the GPUs is disabled. I did a test with 3x outputs from LightAct (the 4th GPU output was used for the LightAct GUI). The FPS dropped from 60 to 30 when I enabled the video outputs.
  • I changed the Power Management Mode for LightAct in the NVIDIA Control Panel to “Prefer Maximum Performance”, and this did not change the FPS issue.
  • The GPUs are linked by an SLI bridge, not NVLink. I realize LightAct might not be optimized for multiple GPUs anyway, but is there any chance the SLI bridge is a problem? Should we get an NVLink?

Will get back to you guys once I’ve tested Mosaic. Fingers crossed.

Thanks,
Scott

The bridge between the cards is the same for SLI or NVLink. You have to select in the NVIDIA settings which mode you want:
SLI - GPU share -> all the outputs of one of the two cards are disabled.
NVLink - VRAM share -> all the outputs work.

You should try SLI mode to force LightAct to manage only one card and see what happens.
But you definitely need NVLink mode. On our side, we only saw a slightly better performance with it.

Ah – Thanks @John. I just learned from our AV integrator that there actually is no bridge between the two graphics cards. So I guess I can’t do the VRAM share.

Still, I appreciate you sharing your experience!

Hello there. We’re trying something similar with a Notch block (100k particles), which runs fine by itself, but as soon as we enable the 7 outputs we get about the same issue: the FPS drops to around 30.

We have 2x RTX 3090, not bridged. Should we try to NVLink them?

Thanks

Hi,

I believe the experience of most of our users with multiple GPUs is that yes, NVLink is required. Otherwise, all the textures have to cross the PCIe bus between the GPUs, and that’s very slow.
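To put some rough numbers on that, here is a back-of-envelope sketch. The assumptions are mine, not measured: uncompressed RGBA8 textures at 60 fps, with every frame crossing the bus once.

```python
# Rough cross-GPU texture bandwidth for Scott's setup.
# Assumptions: uncompressed RGBA8 (4 bytes/pixel), 60 fps,
# each frame crosses the PCIe bus exactly once.
WIDTH, HEIGHT = 2160, 3840      # one portrait UHD screen
BYTES_PER_PIXEL = 4             # RGBA8
FPS = 60
SCREENS = 7

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
gb_per_second = bytes_per_frame * FPS * SCREENS / 1e9

PCIE3_X16_GBPS = 15.75          # approx. usable one-way throughput, PCIe 3.0 x16

print(f"Per screen:  {bytes_per_frame / 1e6:.1f} MB/frame")
print(f"All screens: {gb_per_second:.1f} GB/s at {FPS} fps")
print(f"PCIe 3.0 x16 budget: ~{PCIE3_X16_GBPS} GB/s")
```

Under those assumptions, seven portrait UHD streams come close to the usable one-way throughput of a PCIe 3.0 x16 link, so even modest overhead could drag the frame rate down.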

Cheers,
Mitja

Hi @julienrobert – I’ve been waiting to post until my solution is finalized, but maybe an update now would be helpful to you. We have not been able to test our full system yet due to logistical delays, but I will be on site next week and should be able to finalize it and update everyone here. I am actually not sure whether our GPUs are bridged (I am not on site, so my information is a bit limited), but we had some success with this method. It might be worth testing before buying an NVLink bridge.

I was able to test a 1x4 4K Mosaic setup, following the conversation with Mitja and Jonathan above. In LightAct, I set up 1x video screen with a resolution of 8640x3840, read by 1x video output. This single video screen displays the content for 4x physical screens lined up next to each other: I created a single texture in a 2D Scene containing all 4 screens’ content, rendered that texture to a canvas in a separate layer, and set that layer as the Source for the 1x4 video screen.
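For anyone reproducing this, the spanned Mosaic resolution is just the portrait screens’ widths added up. A quick sanity check (`mosaic_resolution` is a hypothetical helper of mine, not a LightAct or NVIDIA API):

```python
def mosaic_resolution(columns, screen_w=2160, screen_h=3840):
    """Total desktop size for a 1xN row of portrait UHD screens."""
    return columns * screen_w, screen_h

assert mosaic_resolution(4) == (8640, 3840)   # the 1x4 test described above
assert mosaic_resolution(7) == (15120, 3840)  # the full 7-screen wall
```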

You have to set up NVIDIA Mosaic for this 8640x3840 output; in the Windows Display settings, it then appears as a single large external display. Be sure to check “Make this my main display” so that LightAct playback takes full advantage of your GPUs.

It successfully ran at the full 60 fps frame rate, and the video playback looked great on the screens.

Again, we have not yet been able to run a test with all 7 screens and Mosaic, but I’m optimistic.

I haven’t verified this, but the technical staff I’m working with have told me that if you want to have a combined video output using Mosaic and ALSO have a separate control monitor, you need that control monitor to be on a separate GPU from the Mosaic outputs. So, in our case, we want to have:

  • 1x7 4K screens, portrait mode, all using Mosaic, playing back 1x video output from LightAct (15120 x 3840 resolution), all on display in the exhibition space
  • 1x 1080p screen, landscape mode, showing the LightAct GUI for monitoring and control (only visible to technical staff)

For this setup, we have 2x A6000 GPUs dedicated to the 7x 4K screens, but we need an additional GPU just for the 1080p screen. This seems consistent with how @John uses LightAct and Mosaic in his setup.

I’ll report back once we’ve tested all this next week, but wanted to chime in.

Hello all,

Getting back to this thread, finally, after a lot of testing and various other delays. Our installation is up and running, using the method @John described. The setup is:

  • 1x video screen, 15120 x 3840, in LightAct, with all 7 videos in portrait mode lined up next to each other.
  • 1x7 setup in NVIDIA Mosaic, with each screen in portrait mode = 15120 x 3840
  • Map single LightAct video output to 15120 x 3840 output

We tried to add an additional control monitor output, but we were not able to. We tried setting up an additional Mosaic grid with the control screen on the 8th GPU output, and we tried adding a 3rd graphics card solely for the control screen. Neither was successful. The AV/IT team I was working with told me that moving 7x 4K streams is pushing the limit of the data that can move across the PCIe bus on our PC, something they’d never seen before. Adding the additional screen pushed it over the edge and would crash the Mosaic setup or cause the computer to fail to boot.

We are using 2x NVIDIA A6000 cards with no SLI bridge. Apparently the bridge in this application actually reduces performance, since the GPU must dedicate resources to moving data across it (I think – not totally sure).

@John is right that it’s critical that the video output from Windows be the “main display” or primary output. We tried a couple of configurations with 2 Mosaic grids (1x4 + 1x3), and we were unable to get the performance above 11-12 fps. When running with a single output, it is the primary display by default, and LightAct runs great. (I’m also outputting almost 90 universes of DMX.)
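For scale, the DMX traffic is tiny next to the video texture traffic. A rough sketch, assuming full 512-channel Art-Net frames at DMX512’s ~44 Hz practical maximum refresh (the 18-byte packet header is approximate):

```python
# Rough Art-Net bandwidth for ~90 DMX universes.
# Assumptions: full 512-channel frames, 44 Hz refresh,
# ~18-byte ArtDmx header per packet.
UNIVERSES = 90
CHANNELS = 512
HEADER = 18
REFRESH_HZ = 44

bytes_per_second = UNIVERSES * (CHANNELS + HEADER) * REFRESH_HZ
print(f"{bytes_per_second / 1e6:.2f} MB/s")  # roughly 2 MB/s
```

Around 2 MB/s, so under these assumptions the DMX output itself is unlikely to affect the frame rate.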

Anyway, wanted to report back that we’ve got it running, and it’s very exciting! Thanks to the folks here for your help.

Scott

Thanks for the detailed report, Scott! Can’t wait to see some pictures 🙂