A double-slit diffraction pattern is formed on a distant screen. If the separation between the slits decreases, what happens to the distance between interference fringes? Assume the angles involved remain small.

The distance between interference fringes remains the same.
The effect cannot be determined unless the distance between the slits and the screen is known.
The distance between interference fringes also decreases.
The distance between interference fringes increases.

Answer:

The distance between interference fringes increases.

Explanation:

In a double-slit diffraction pattern, the angular position of the n-th maximum in the pattern (measured with respect to the central maximum) is given by

[tex]\sin \theta = \frac{n \lambda}{d}[/tex]

where

[tex]\theta[/tex] is the angular position

[tex]\lambda[/tex] is the wavelength

d is the separation between the slits

n is the order of the maximum (n = 0, 1, 2, ...)

In this problem, the separation between the slits decreases: this means that d in the formula decreases. As we can see, [tex]\sin \theta[/tex] (and so [tex]\theta[/tex] as well) is inversely proportional to d: so, if d decreases, the angular separation between the fringes increases. Since the angles are small, [tex]\sin \theta \approx \tan \theta = \frac{y}{L}[/tex], where y is the position of a fringe on the screen and L is the distance between the slits and the screen; therefore the spacing between adjacent fringes on the screen is [tex]\Delta y = \frac{\lambda L}{d}[/tex], which also increases as d decreases.
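To make this concrete, here is a minimal numeric sketch of the small-angle fringe-spacing formula [tex]\Delta y = \frac{\lambda L}{d}[/tex]. The wavelength, screen distance, and slit separations below are illustrative assumptions (none of these values are given in the problem); the point is only to show that halving d doubles the fringe spacing.

```python
# Numeric check of the small-angle fringe-spacing formula: delta_y = lambda * L / d.
# All values below are assumed for illustration, not taken from the problem.

wavelength = 600e-9   # wavelength in meters (600 nm, visible light)
L = 2.0               # slit-to-screen distance in meters

def fringe_spacing(d):
    """Small-angle spacing between adjacent fringes for slit separation d (meters)."""
    return wavelength * L / d

for d in [0.5e-3, 0.25e-3]:  # halve the slit separation
    print(f"d = {d*1e3:.2f} mm  ->  fringe spacing = {fringe_spacing(d)*1e3:.2f} mm")

# Output:
# d = 0.50 mm  ->  fringe spacing = 2.40 mm
# d = 0.25 mm  ->  fringe spacing = 4.80 mm
# Halving d doubles the fringe spacing: the fringes spread apart.
```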

So, the correct answer is

The distance between interference fringes increases.