Hello,
You seem to be generally confused. For example, you wrote:
> I'm guessing that I can treat "s" as a constant since it's "given" for the conditional PDF
You shouldn't guess, of course.
> I also know that adding L + E will result in a normally distributed random variable.
Why? Is anything at all given about L and E? What if they were both identically 0? Then the sum is 0, and that's hardly a normal distribution.
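To make that concrete: *if* it were given that L and E are independent normal random variables, then the sum really is normal. A quick simulation sketch (my own illustration, assuming standard normals, which is not given in your problem) shows what you would expect in that case:

```python
import random
import statistics

random.seed(0)

# Assume, purely for illustration, that L and E are independent standard normals.
# Then D = L + E should be N(0, sqrt(2)): mean 0, variance 1 + 1 = 2.
samples = [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(100_000)]

print(statistics.mean(samples))      # close to 0
print(statistics.variance(samples))  # close to 2
```

Without such an assumption on L and E, no conclusion of this kind is available.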
> So D is also a random variable, right?
I don't see how this follows from the previous point. It is true, but it should be obvious to you why, not something you have to ask about.
Of course, I don't mean to come across as belittling: my point is to explain why I think you're generally confused, and I want to help. When I'm confused, I find it always helps to go back to the definitions. For example, go back to the definition of a random variable: verify, and understand why, D is indeed a random variable.
Then look up the definition of something like f(d|s). Most probably your book will have defined it as
f(d|s) = \frac{f(d,s)}{f(s)}
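If that definition feels abstract, a discrete toy example (my own, with made-up numbers unrelated to your D and S) shows it really is just a ratio of the joint to the marginal:

```python
# Toy joint pmf p(d, s) over a 2x2 grid -- purely illustrative numbers.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def marginal_s(s):
    # p(s) = sum over all d of p(d, s)
    return sum(p for (d, s2), p in joint.items() if s2 == s)

def conditional(d, s):
    # Definition: p(d | s) = p(d, s) / p(s)
    return joint[(d, s)] / marginal_s(s)

print(conditional(0, 1))  # 0.2 / (0.2 + 0.4)
```

Note that the conditional probabilities for a fixed s sum to 1, as a distribution should.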
so by definition we need f(d,s) and f(s). (You could try something like Bayes' rule, as you did, but that also involves terms of the form f(x|y), so you can guess it won't simplify matters.)
Now the question becomes: how do I get f(d,s) and f(s)? Your integration radar is indeed correct: it is the clue that, in the end, we only need f(d,s). Is it obvious how f(s) follows from f(d,s)?
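In case it helps to see what that integration step looks like numerically, here is a rough sketch with a made-up joint density (an independent standard-normal pair, chosen only because the answer is then known in closed form):

```python
import math

def joint(d, s):
    # Made-up joint density: D and S independent standard normals.
    return math.exp(-(d * d + s * s) / 2) / (2 * math.pi)

def marginal_s(s, lo=-10.0, hi=10.0, n=10_000):
    # f(s) = integral over d of f(d, s), here as a crude midpoint Riemann sum.
    h = (hi - lo) / n
    return sum(joint(lo + (i + 0.5) * h, s) for i in range(n)) * h

# For this toy joint, the marginal should match the standard normal density:
print(marginal_s(0.7))
print(math.exp(-0.7 ** 2 / 2) / math.sqrt(2 * math.pi))
```

The details of the numerical integration don't matter; the point is that the marginal is obtained from the joint by integrating out the other variable.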
Now the problem has been reduced to "what is the joint probability distribution of D and S?".
You most probably saw a method to obtain the probability distribution of a function of random variables. If this doesn't ring a bell, reread your course notes and give it some thought :)
I'll be here if you have more questions.