tink200 Posted April 25, 2008

Hi guys, I have been trying to figure out this question for days now. I think it's a case of wrong formulas, but I'm not entirely sure, so any help would be greatly appreciated!

I have been told that monochromatic light from a laser is directed towards a diffraction grating with 300 lines per millimetre. A diffraction pattern on a screen situated beyond the grating consists of a series of spots of light. The first order spot is situated a distance (d) of 2.00 m from the centre of the grating, at a diffraction angle of 11.8 degrees. Assume that the speed of light is 3.0x10^8 m s^-1. The questions are:

1) Calculate the distance (s), in mm, from the centre of the diffraction pattern to the first order spot. (I have been using the equation s = sin(11.8°) x d; please correct me if I am wrong.)

2) Calculate the wavelength of the light of the laser. This is where I am running into problems: I cannot use wavelength = c / frequency because I do not know the frequency, so instead I am using wavelength = sin(11.8°) x d / s. Is this correct?

3) Calculate the frequency. No problem there, I assume it would be f = c / wavelength.

The problem with the way I am doing it is that the colour of the light is supposed to be in the visible part of the spectrum, but the best I have come up with so far is two values either side of the visible range! All I need is clarification of the formulas I am using, and if I am wrong then a point in the right direction. Thanks
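[In symbols, and purely as a restatement of the setup and the formulas proposed above (writing D for the 2.00 m distance, θ₁ for the 11.8° angle, and m = 1 for the first order; the relations for 1) and 2) are the proposals being asked about, not necessarily the right ones):

\[ N = 300\ \text{mm}^{-1}, \quad \theta_1 = 11.8^\circ, \quad D = 2.00\ \text{m}, \quad c = 3.0 \times 10^{8}\ \text{m s}^{-1} \]
\[ \text{(1)}\ s = D\sin\theta_1, \qquad \text{(2)}\ \lambda = \frac{D\sin\theta_1}{s}, \qquad \text{(3)}\ f = \frac{c}{\lambda} \]]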
timo Posted April 25, 2008 If I understand the setup correctly, then 1) is correct but your approach to 2) is wrong. Draw that standard picture of what happens at the grating to see why (I mean this picture: http://en.wikipedia.org/wiki/Image:TwoSlitInterference.svg).
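[For anyone who wants to check the arithmetic once the diagram makes the geometry clear, here is a minimal sketch of the calculation, assuming the standard grating relation d sin θ = mλ, that the 2.00 m figure is the grating-to-screen distance, and that the spot is first order (m = 1). The key point is that the d in the grating equation is the slit spacing of the grating (1/300 mm), not the 2.00 m distance to the screen; with these assumptions the wavelength does land in the visible range.

import math

# A minimal sketch of the check timo is pointing at, using the standard
# grating relation d*sin(theta) = m*lambda. It assumes the 2.00 m figure
# is the grating-to-screen distance and that the spot is first order (m = 1).

lines_per_mm = 300
d = 1e-3 / lines_per_mm       # slit spacing in metres (about 3.33e-6 m)
theta = math.radians(11.8)    # first-order diffraction angle
D = 2.00                      # grating-to-screen distance in metres (assumed)
c = 3.0e8                     # speed of light in m/s

s = D * math.tan(theta)       # centre of pattern to first-order spot
                              # (D*sin(theta) gives nearly the same value at this angle)
lam = d * math.sin(theta)     # grating equation with m = 1
f = c / lam                   # frequency from c = f * lambda

print(f"s      = {s * 1e3:.0f} mm")    # roughly 418 mm
print(f"lambda = {lam * 1e9:.0f} nm")  # roughly 680 nm, i.e. visible red
print(f"f      = {f:.2e} Hz")          # roughly 4.4e14 Hz]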