This isn't really a complex question; it's more a problem of definition. The exercises in my book ask me to compute the probabilities of various events and then, after each one, to decide whether the occurrence of the event is usual or unusual. The book has a graphic showing that at a probability of 1 the event will definitely occur, at 0.5 there is a 50/50 chance, near 0 the event is unlikely to occur, and at exactly 0 the event is impossible. Elsewhere in the chapter it describes a probability of 1/1000 as "very unlikely," but it never defines an "unusual event" with any numerical cutoff. So what threshold should I use for an unusual event? Less than 0.05? Thanks.
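For what it's worth, here is a small sketch of the rule I'm guessing at: classify an event as "unusual" when its probability falls below some cutoff. The 0.05 cutoff here is my assumption, not something the book states; the endpoint labels come from the book's graphic.

```python
def classify(p, cutoff=0.05):
    """Classify a probability using the book's graphic, plus an assumed
    'unusual' cutoff (0.05 is my guess; the book never states one)."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("a probability must lie in [0, 1]")
    if p == 0.0:
        return "impossible"   # the graphic's label at 0
    if p == 1.0:
        return "certain"      # the graphic's label at 1
    if p < cutoff:
        return "unusual"      # my assumed definition
    return "usual"

# The book calls 1/1000 "very unlikely"; under a 0.05 cutoff it
# would also count as unusual, while a 50/50 event would not.
print(classify(1 / 1000))  # unusual
print(classify(0.5))       # usual
```

Is this cutoff-based reading the intended one, and is 0.05 the conventional value?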