Tsunami
I'm a third-year Engineering student, majoring in Physics. This year I have an introductory course in Photonics. Unfortunately, "introductory" says it all: as soon as I try to figure something out in depth, I have to look it up on the net.
I'm having trouble finding info about the following:
When light is transmitted through a waveguide (in real-life situations this waveguide will usually be an optical fiber), the light always stays inside the guide because there is total internal reflection at the walls. This happens because the refractive index of the inner medium is higher than that of the outer medium (that's still pretty straightforward).
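Just to check my own understanding of that part, this is how I compute the critical angle for total internal reflection; the index values here are made up by me for illustration, not taken from my notes:

```python
import numpy as np

# Made-up values, roughly like a step-index silica fiber.
n_inner = 1.48   # refractive index of the core (inner medium)
n_outer = 1.46   # refractive index of the cladding (outer medium)

# Snell's law: total internal reflection happens for incidence angles
# (measured from the wall normal) larger than the critical angle,
# where sin(theta_c) = n_outer / n_inner.
theta_c = np.degrees(np.arcsin(n_outer / n_inner))
print(f"critical angle: {theta_c:.1f} degrees from the normal")
```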
An important property of waveguides is that they provide a good way to make light take a curve: it's perfectly possible to make a waveguide in the shape of a quarter-circle, and thus make the light take a smooth 90° bend. However, my course notes say there will always be a certain loss of light in this bend, and you can minimize this loss by making the bend radius larger (I mean the radius of curvature of the bend, i.e. the distance from the geometric centre of the curve to the guide). They also say that when the difference in refractive index between the inner and outer medium (n_inner − n_outer) is greater, the radius can be chosen a little smaller.
I don't know what that is based on. So my question is: how do the bend radius and the difference in refractive index relate to the amount of light that is lost (read: transmitted) through the walls of the waveguide?
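For what it's worth, here is the naive ray picture I tried to work out myself. The core half-width a and the index values are made up, and I'm not at all sure this simple model is adequate, but it at least reproduces the trend my notes describe: a ray running parallel to the axis near the inner wall hits the outer wall of the bend at an angle that gets shallower as the bend gets tighter, and total internal reflection fails once that angle drops below the critical angle.

```python
import numpy as np

# Made-up example values; only the scaling matters.
n_inner, n_outer = 1.48, 1.46   # core and cladding indices (assumed)
a = 5e-6                        # core half-width in metres (assumed)

# Ray picture: a ray travelling parallel to the axis at the inner wall
# (radius R - a from the bend centre) follows a straight chord tangent
# to that circle and hits the outer wall (radius R + a) at an angle
# theta from the normal with
#   sin(theta) = (R - a) / (R + a).
# Total internal reflection requires sin(theta) >= n_outer / n_inner,
# which gives a minimum bend radius
#   R_min = a * (n_inner + n_outer) / (n_inner - n_outer).
R_min = a * (n_inner + n_outer) / (n_inner - n_outer)
print(f"minimum bend radius in this crude model: {R_min*1e3:.2f} mm")

# The same formula shows the trade-off from my notes: a larger index
# difference (n_inner - n_outer) allows a smaller bend radius.
for dn in (0.005, 0.02, 0.08):
    r = a * (2 * n_inner - dn) / dn
    print(f"  delta_n = {dn:5.3f}  ->  R_min ~ {r*1e3:6.2f} mm")
```

What bothers me is that this picture predicts a hard cutoff radius, while my notes talk about a gradual loss that you can only minimize, never eliminate, so I suspect the real answer needs more than ray optics.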
I hope I'm posting in the right category; this is my first visit here. Hopefully I'll be able to answer some questions for other people too, instead of merely asking them.
Whatever the case, I thank you for at least reading my question in full, and for whatever help you wish to give me in addition to that.