So I know resolving power is given by:

Resolvance = λ/Δλ = nN

where n is the order of the fringe being observed and N is the number of slits. So I was thinking there are a few ways of increasing resolvance:

1. Decrease the slit width.
2. Increase the order of the fringe we observe.
3. Increase the number of slits.

However, wouldn't the first option cause a decrease in transmitted light, and hence wouldn't our spectrum have decreasing intensity? Also, for the second option, it is known that the intensity of the maxima decreases quite quickly from one order to the next... Hence, would our peak have poor intensity as well?
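To make the R = nN trade-off concrete, here is a minimal sketch computing how many slits must be illuminated to resolve a given wavelength pair in a given order. The sodium D-doublet numbers are a standard textbook illustration, not taken from this thread:

```python
# Sketch: chromatic resolving power of a grating, R = lambda/dlambda = n*N.
# The sodium D-doublet values below are illustrative textbook numbers.
import math

def resolvance(order: int, n_slits: int) -> int:
    """R = n * N for an N-slit grating observed in order n."""
    return order * n_slits

# Sodium D lines at 589.0 nm and 589.6 nm: splitting them needs
# R >= 589.0 / 0.6, roughly 982.
required_R = 589.0 / 0.6

for order in (1, 2, 3):
    # Minimum number of illuminated slits needed in this order
    n_min = math.ceil(required_R / order)
    print(f"order {order}: need N >= {n_min} slits")
```

This shows the point behind option 2: working in a higher order lets a grating with fewer illuminated slits reach the same resolvance, at the cost of the intensity fall-off the question raises.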
Right, you'll have to find the best slit width, taking both effects into account. And the intensity falling off from one order to the next? Not if the slits are narrow ;).
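The "not if the slits are narrow" remark can be illustrated with the standard N-slit intensity formula: a (sin(Nδ/2)/sin(δ/2))² interference term modulated by a sinc² single-slit envelope. Narrower slits broaden the envelope (so higher orders stay brighter) but transmit less total light. The function and parameter names below are my own, a sketch rather than anything from the thread:

```python
# Sketch of the N-slit intensity pattern:
#   I = (sin(N*delta/2) / sin(delta/2))**2 * sinc**2(beta/2)
# delta = phase difference between adjacent slits, beta = slit-width phase.
# Narrower slits -> smaller beta -> flatter envelope -> brighter high orders,
# but less transmitted light overall.
import math

def grating_intensity(delta: float, beta: float, n_slits: int) -> float:
    """Relative intensity of an N-slit grating at inter-slit phase delta."""
    half = delta / 2.0
    if abs(math.sin(half)) < 1e-12:
        interference = float(n_slits ** 2)  # limit at a principal maximum
    else:
        interference = (math.sin(n_slits * half) / math.sin(half)) ** 2
    b = beta / 2.0
    envelope = 1.0 if abs(b) < 1e-12 else (math.sin(b) / b) ** 2
    return interference * envelope

# Principal maxima scale as N^2 while their angular width narrows as 1/N,
# which is why adding slits sharpens the peaks without dimming them:
print(grating_intensity(0.0, 0.0, 10))   # 100.0
print(grating_intensity(0.0, 0.0, 100))  # 10000.0
# A wider slit (larger beta) suppresses the same maximum:
print(grating_intensity(0.0, math.pi, 100))
```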
I'm not sure what you are asking about: technologies used for high-resolution spectroscopy, grating design, something else?
I'm interested in grating designs that could increase resolving power, and the resulting disadvantages.
Are you sure you want to get additional disadvantages?

- reduce the number of slits
- use a different width for different slits
- make the slits too small, or too wide
- use the screen too close to the grating, or too far away

If you want to have advantages, avoid all those points.
Grating design is a complex subject that must take into account not just the density of 'lines' but also the detailed shape (the 'blaze angle', for example). I recommend starting by reading this: http://gratings.newport.com/library/handbook/handbook.asp It's the gold-standard reference. Also, I recommend looking at what equipment NIST or other standards labs use to perform precision (say, 1 part in 10^15) spectroscopy.