I have been studying the processing time of an industrial process. The current analysis just consists of computing the mean, as if the time were normally distributed. I took a sample of data, made a histogram, and realized it is not normally distributed at all (although the normal distribution isn't a terrible fit). I also noticed that the distribution has two peaks.

After spending some time watching the process, I found that an error can occur which is correctable, but correcting it takes almost twice as long to complete the process. So I split the data into two sets: one without the error and one with the error. Both the error-free and error histograms fit a gamma distribution well (I don't know if this is the best choice). The error-free process has a mode of about 4 minutes, and the process with the error has a mode of about 9 minutes. I also looked at how often the error occurs, and it was around 30% of the time.

My question: is there a way to recombine these two distributions, together with the probability of error, into a single distribution so I can define a mode for the whole process? I want to change the process to reduce the likelihood of the error, but as with everything it is all about $$$, so I have to be able to justify what I want to do.
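To make the setup concrete, here is a sketch of what I'm imagining: a two-component gamma mixture weighted by the 30% error probability, with the overall mode found numerically. The gamma shape/scale parameters below are made up, chosen only so the component modes land near my observed 4 and 9 minutes; I would fit the real parameters from my data.

```python
import numpy as np
from scipy.stats import gamma

# Hypothetical parameters -- chosen only so the gamma modes, (k-1)*theta,
# land near the observed values; fit your own data to get real estimates.
p_err = 0.30                  # observed probability that the error occurs
k_ok, theta_ok = 5.0, 1.0     # error-free component: mode = (5-1)*1 = 4 min
k_err, theta_err = 10.0, 1.0  # error component:      mode = (10-1)*1 = 9 min

def mixture_pdf(x):
    """Density of the overall process: a 70/30 mixture of the two gammas."""
    return ((1 - p_err) * gamma.pdf(x, k_ok, scale=theta_ok)
            + p_err * gamma.pdf(x, k_err, scale=theta_err))

# The mixture mode has no closed form, so locate it on a fine grid.
# Depending on how much the components overlap, the mixture can be
# unimodal or bimodal; argmax returns the global mode either way.
x = np.linspace(0.01, 30, 30000)
mode_est = x[np.argmax(mixture_pdf(x))]
print(f"estimated overall mode: {mode_est:.2f} min")
```

With these made-up parameters the components overlap enough that the global mode sits a little above 4 minutes; whether that single number is the right summary, or whether the two component modes should be reported separately, is part of what I'm asking.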