CMOS image sensor with macroscopic lens - Zemax needed?

Hi,

In our project, we coat a diffracting layer on a CMOS image sensor. The transmission of the diffracting layer is angle dependent, and we would like to evaluate its transmission in a realistic imaging application. To simulate the transmission when imaging an object through lenses, should I use Zemax to first compute the light field after the lenses and then use that field as the source for the FDTD simulation? Or should I use the Point Spread Function simulation described here? What is the difference between these two methods?
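To clarify what I am ultimately trying to estimate, here is a rough sketch of the post-processing I had in mind if the two tools are used separately: take the angle-dependent transmission T(θ) of the layer from a plane-wave FDTD sweep, and average it over the cone of incidence angles that the lens delivers to a given pixel (chief ray angle plus the half-angle set by the f-number, both of which Zemax can report). Everything below is a placeholder illustration, not taken from either tool's API:

```python
import numpy as np

# Placeholder FDTD result: transmission of the diffracting layer vs. incidence
# angle, e.g. from a plane-wave angle sweep (angles in degrees, made-up values).
angles_deg = np.linspace(0, 40, 81)
T_layer = 0.9 - 0.004 * angles_deg          # stand-in for the real T(theta) curve

# Placeholder lens data that Zemax could provide for one field point:
chief_ray_angle_deg = 15.0                   # chief ray angle at the sensor
f_number = 2.8                               # working f-number of the lens
half_cone_deg = np.degrees(np.arcsin(1.0 / (2.0 * f_number)))  # marginal-ray half-angle

# Angles actually reaching the pixel: a cone centred on the chief ray.
theta = np.linspace(chief_ray_angle_deg - half_cone_deg,
                    chief_ray_angle_deg + half_cone_deg, 201)
theta = np.clip(theta, angles_deg.min(), angles_deg.max())

# Interpolate the FDTD transmission onto these angles and average
# (uniform weighting here; real pupil apodization would change the weights).
T_interp = np.interp(theta, angles_deg, T_layer)
T_effective = T_interp.mean()

print(f"Half-cone angle: {half_cone_deg:.1f} deg")
print(f"Effective transmission at this field point: {T_effective:.3f}")
```

My question is essentially whether the Zemax-to-FDTD field import or the PSF workflow does this more rigorously (including diffraction, polarization, and coherence effects) than this simple ray-cone averaging.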

I have watched the videos introducing the interoperability between Zemax and FDTD, but there was no example involving a CMOS image sensor. It would be great if you could give more details.

Thanks!