Hi, I am trying to run a simulation where I send a focused beam onto my device, and I want to calculate, at a certain point, what we call the ‘field factor’: how much the power is enhanced or reduced due to interference from the reflected wave. It should be 1 with no reflected wave, and anywhere from 0 to 4 for a fully reflected coherent wave.
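For context, the 0-to-4 range comes from coherently adding the incident and reflected fields: with unit incident amplitude and reflected complex amplitude r, the intensity is |1 + r·e^(iφ)|², which spans 0 to 4 when |r| = 1 and is exactly 1 when r = 0. A quick NumPy sanity check of that arithmetic (the variable names here are just illustrative):

```python
import numpy as np

# Field factor F = |1 + r*exp(i*phi)|^2 for unit incident amplitude
# and reflected complex amplitude r (illustrative notation).
phi = np.linspace(0, 2 * np.pi, 361)

f_full = np.abs(1 + 1.0 * np.exp(1j * phi)) ** 2  # |r| = 1: fully reflected
f_none = np.abs(1 + 0.0 * np.exp(1j * phi)) ** 2  # r = 0: no reflection

print(f_full.min(), f_full.max())  # ~0 and 4
print(f_none.min(), f_none.max())  # 1 and 1
```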
Here is the project file for my simulation:
pnf simulation.fsp (247.7 KB)
And here is a screenshot:
The device consists of two electrodes on either side of a trench etched into SiO2. At the bottom of the trench there are two thin strips of Pt metal. I am interested in the field factor between the two electrodes, where a carbon nanotube is suspended. I am focusing a beam onto the plane of the carbon nanotube.
I have the simulation set up and working, but I’m having a hard time understanding the units of the monitors. I am currently using a frequency-domain field and power monitor (labeled Power NT) in the region of interest, with CW normalization on. When I look at the E field between the electrodes as a function of wavelength, I get a plot that looks like what I want:
However, when I run the same simulation with all of the structures removed, I expect to get roughly 1 independent of lambda, but I don’t:
When I switch the source to a plane wave, though, I do get roughly 1, as I would expect.
How can I normalize the data correctly for the focused beam? Also, am I correct about what this monitor is measuring: the electric field amplitudes of the incoming and reflected beams, coherently added?
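In case it helps clarify what I’m after: one workaround I’ve considered is normalizing per wavelength against the empty reference run itself, i.e. dividing the device run’s |E|² at the monitor point by the empty run’s |E|² at the same point, so the beam’s own wavelength-dependent focusing divides out. A minimal sketch of that arithmetic, with made-up placeholder arrays standing in for the exported monitor data:

```python
import numpy as np

# Placeholder data: E-field magnitude vs wavelength at the monitor point,
# exported from two runs with the same focused-beam source --
# one with the device structures, one with everything removed.
wavelength  = np.linspace(1.0e-6, 2.0e-6, 5)              # placeholder grid
E_device    = np.array([1.80, 0.60, 1.20, 1.90, 0.30])    # run with structures
E_reference = np.array([0.95, 1.02, 0.98, 1.05, 0.97])    # empty reference run

# Field factor: device intensity normalized, wavelength by wavelength,
# by the focused beam's own intensity at that point.
field_factor = np.abs(E_device) ** 2 / np.abs(E_reference) ** 2
print(field_factor)
```

Is this kind of reference-run normalization the right approach for a focused beam, or is there a built-in normalization I’m missing?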
Thanks for any help.