This isn't really a complex question, more a matter of definition. The exercises in my book ask me to compute probabilities for various scenarios and then, after each one, to say whether the occurrence of the event is usual or unusual. The book has a graphic showing that at a probability of 1 the event will definitely occur, at 0.5 it's a 50/50 chance, near zero it says "unlikely to occur," and of course at 0 the event is impossible. Elsewhere in the chapter it describes a probability of 1/1000 as "very unlikely," but it never defines an "unusual event" with any numerical cutoff. What should I take as the threshold for an unusual event? Less than 0.05? Thanks.
Context is everything - it seems you are being asked for a value judgement. What counts as usual or unusual will depend on the circumstances. Over here we give students language like: 99% = "almost certain", 95% = "likely", 68% = "probable", 34% = "possible", 5% = "unlikely", 1% = "almost never" ... for normally distributed stats. But it's all hogwash ... if a surgical procedure has a 1% chance of killing the patient but is performed thousands of times a day, then deaths from this procedure are very common, and saying they "almost never" happen would be silly.
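To make the surgery example concrete, here is a quick sketch (the per-event probability of 1% is from the answer; the 1000-trials-per-day figure is an assumed illustration): a "rare" event becomes commonplace once you repeat the trial enough times.

```python
p = 0.01   # probability the event occurs in a single trial (from the example)
n = 1000   # assumed number of trials per day, for illustration

# Expected number of occurrences per day: n * p
expected = n * p

# Probability of at least one occurrence in n independent trials:
# 1 minus the probability that it never happens.
p_at_least_one = 1 - (1 - p) ** n

print(expected)         # about 10 occurrences per day on average
print(p_at_least_one)   # practically certain that it happens at least once
```

So a probability that would be called "unusual" for a single trial (well under the 0.05 cutoff many textbooks use) still produces events every day under repetition, which is the answer's point about context.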