There are quite a lot of resources that cover introductory probability and random variables, but some of the topics I mentioned are more likely to be found through a Google search or in journal papers (if you don't get the information directly from, say, internal university lecture notes).
Something like this should be OK for an overview of the core topics:
https://www.amazon.com/dp/0321795431/?tag=pfamazon01-20
Simulation and MCMC are quite new topics (in terms of the established theory), and a lot of research is still going into these areas for various applications (finance, biostatistics, and general statistical theory).
Basically, the things you want to look at are the joint distribution and establishing independence between variables; you then use these concepts (in conjunction with the appropriate results) to work out the minimum number of underlying random variables and what their joint distribution is (given sufficient information to figure it out).
If you have the definitions of the random variables, you can first count the number of underlying variables (say u = xy, v = x^2y, w = xy^2; then x and y are the two underlying variables), and then use the definition of independence to check whether those underlying variables actually are independent.
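To make the counting idea concrete, here is a minimal sketch (my own illustration using sympy, not something from a particular textbook) showing that u, v, w above really only have two degrees of freedom, since being built from just x and y forces a constraint between them, namely v*w = u^3:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

u = x * y
v = x**2 * y
w = x * y**2

# u, v, w are three variables, but they are built from only two
# underlying ones (x and y), so they must satisfy a constraint.
print(sp.simplify(v * w - u**3))  # prints 0, confirming v*w = u^3
```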
If the variables are independent, then the joint distribution is a product of the individual distributions: P(A = a, B = b) = P(A = a)*P(B = b). If they aren't, then the joint distribution can't be simplified or separated out like this.
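As a rough sanity check of that factorization, you can compare an empirical joint probability against the product of the empirical marginals (this is just an illustrative sketch; the die-and-coin setup is an assumption on my part, not specific to your problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two independent discrete variables (assumed for this example):
# a fair die roll and an unrelated coin flip.
a = rng.integers(1, 7, size=n)
b = rng.integers(0, 2, size=n)

# Empirical joint probability P(A = 3, B = 1)
joint = np.mean((a == 3) & (b == 1))

# Product of the empirical marginals P(A = 3) * P(B = 1)
product = np.mean(a == 3) * np.mean(b == 1)

# These should agree up to sampling noise if A and B are independent.
print(joint, product)
```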
If you want to look at a function of random variables, then depending on what you want, you will need to look at the things I mentioned above (transformation theorem, characteristic functions, ratio and product results, convolution theorem, MCMC simulation, normal simulation, etc.).
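If you go the simulation route, the basic idea is just: sample the underlying variables, push them through the function, and look at the empirical distribution. Here is a minimal sketch, assuming (purely for illustration) two independent normals with the sum as the function, since there the convolution theorem gives an exact answer to compare against:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# X ~ N(1, 2^2) and Y ~ N(-3, 1^2), independent (assumed example)
x = rng.normal(1.0, 2.0, size=n)
y = rng.normal(-3.0, 1.0, size=n)

z = x + y  # any function of (x, y) could go here, e.g. x*y or x/y

# Convolution theorem: Z ~ N(1 + (-3), 2^2 + 1^2) = N(-2, 5)
print(z.mean(), z.var())  # should be close to -2 and 5
```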
If you only care about the mean and variance (or other moments), then you don't need any of the above: you can use the standard results from an introductory statistics book (for example, E[X+Y] = E[X] + E[Y] always, and Var(X+Y) = Var(X) + Var(Y) when X and Y are independent).
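For example (again my own sketch, with arbitrary distributions assumed), the moment rules give you the mean and variance of a linear combination without ever working out its full distribution:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Arbitrary (assumed) independent variables: a uniform and an exponential
x = rng.uniform(0.0, 1.0, size=n)   # E[X] = 1/2, Var(X) = 1/12
y = rng.exponential(2.0, size=n)    # E[Y] = 2,   Var(Y) = 4

w = 2 * x + 3 * y

# Moment rules: E[W]   = 2*E[X] + 3*E[Y]       = 7
#               Var(W) = 4*Var(X) + 9*Var(Y)   = 1/3 + 36
print(w.mean(), w.var())  # should be close to 7 and 36.33
```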