I am trying to simulate a simple Gires-Tournois interferometer and am getting unexpected results for the magnitude and phase of the device's reflectivity.
The simulation setup is a 0.5 um thick dielectric slab with a constant refractive index of 2.5 placed on top of a PEC. A plane-wave source is injected along the positive Z axis, and the phase/magnitude information is extracted via a DFT monitor placed behind the source. The source bandwidth spans 1 um to 10 um.
The reflectivity magnitude looks like this:
This is close to the expected |R|=1, but clearly not within numerical accuracy, which is what I would expect for a lossless dielectric on a PEC. Running the simulation longer or increasing the mesh accuracy does not change the result.
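For reference, the analytical |R| can be checked with a short script (a minimal sketch using the standard Airy formula with a PEC back reflection of -1; the n = 2.5 and d = 0.5 um values are taken from the setup above, and normal incidence is assumed):

```python
import numpy as np

# Air / dielectric (n = 2.5, d = 0.5 um) / PEC at normal incidence.
# Airy formula with r_PEC = -1; the structure is lossless, so it is an
# all-pass reflector and |r| should equal 1 at every wavelength.
n, d = 2.5, 0.5e-6
wl = np.linspace(1e-6, 10e-6, 2001)         # source wavelength range, 1-10 um
k0 = 2 * np.pi / wl
beta = n * k0 * d                           # one-way phase through the slab
r01 = (1 - n) / (1 + n)                     # air -> dielectric Fresnel coefficient
r = (r01 - np.exp(2j * beta)) / (1 - r01 * np.exp(2j * beta))

print(np.abs(np.abs(r) - 1).max())          # deviation from |r| = 1 (machine precision)
```

Since the numerator and denominator have equal magnitude for a lossless slab, any deviation from |R|=1 in the simulation must come from the numerics rather than the physics.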
The reflectivity group delay looks like this, compared with the analytical result I would expect:
Here something is more obviously wrong: the resonances seem to occur at the wrong frequencies, and the simulated group delay also increases toward higher frequencies, which does not happen analytically.
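The analytical curve I am comparing against can be computed with something like the following sketch (same Airy formula as above; the exp(-i*omega*t) sign convention is assumed, so the group delay is +dphi/domega):

```python
import numpy as np

# Analytical group delay tau = d(phi)/d(omega) for the air/slab/PEC stack
# (n = 2.5, d = 0.5 um). For a lossless all-pass reflector the delay
# oscillates around the round-trip time 2*n*d/c and is periodic in
# frequency -- there is no overall increase toward higher frequencies.
n, d = 2.5, 0.5e-6
c = 299792458.0
omega = np.linspace(2 * np.pi * c / 10e-6, 2 * np.pi * c / 1e-6, 20001)
beta = n * omega * d / c
r01 = (1 - n) / (1 + n)
r = (r01 - np.exp(2j * beta)) / (1 - r01 * np.exp(2j * beta))
phi = np.unwrap(np.angle(r))
tau = np.gradient(phi, omega)               # group delay in seconds, positive everywhere

print(tau.min() * 1e15, tau.max() * 1e15)   # delay extrema in fs
```

The delay peaks sit at the resonances (odd multiples of a quarter-wave slab thickness), so any frequency shift or drift in the simulated curve relative to this should be diagnostic.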
I am also including a copy of my simulation file:
gires_tourn.fsp (302.9 KB)
I would appreciate any input as to where I may be setting up or analyzing the simulation incorrectly.