Probability of arranging identical objects (High School)

Summary
The discussion concerns the probability that a random arrangement of a word with repeated letters, here NUNAVUT, has an N at each end. The total number of distinct arrangements of the 7 letters is 7! / (2! x 2!) = 1260, and the arrangements with an N at both ends give two candidate answers depending on whether the two N's are distinguished: treating them as identical gives 60/1260 ≈ 0.048, while doubling for the swapped N's gives 120/1260 ≈ 0.095. The conversation emphasizes that the counting rule must be applied consistently to both the numerator and the denominator; doing so gives 1/21 ≈ 0.048 either way.
MeesaWorldWide
TL;DR
If all the letters of the word NUNAVUT are randomly rearranged, what is the probability that there will be an N at each end?
Total ways to arrange the 7 letters: 7! / (2! x 2!) = 1260

Ways to have an N at each end: N _ _ _ _ _ N
There are 5 other letters in the middle, and two of them repeat (U), so the middle 5 are found by 5! / 2! = 60

Now, here is where I am unsure what to do. Since the N's are identical, do we distinguish between them? If we swapped the two N's, there would still be an N at each end, but would that be considered a separate case? If so, we do 60 x 2 = 120.
Then the probability of an N being at each end is 120 / 1260 ≈ 0.095

If not, then the probability is simply 60 / 1260 ≈ 0.048

Any insight here would be appreciated. 0.095 seems a bit high to me, but not treating N _ _ _ _ _ N and N _ _ _ _ _ N as different cases also makes sense...
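As a sanity check, here is a minimal brute-force sketch in Python (not from the original post; it just counts distinct visible arrangements):

```python
from itertools import permutations

word = "NUNAVUT"
# A set collapses arrangements that look the same, so the two N's and
# the two U's are counted once per distinct visible arrangement.
arrangements = set(permutations(word))                  # 7!/(2!*2!) = 1260
n_at_ends = [a for a in arrangements if a[0] == "N" and a[-1] == "N"]

print(len(arrangements))                     # 1260
print(len(n_at_ends))                        # 60
print(len(n_at_ends) / len(arrangements))    # 0.0476... = 1/21
```

Counting distinct arrangements in both the numerator and the denominator gives 60/1260 = 1/21 ≈ 0.048; doubling only the numerator would mix two different counting rules.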
 
The word has the letters N, N, U, U, A, V, T. It is not clear whether the two N's and the two U's are distinguished in our count. Suppose we do distinguish them, for example by writing one of each pair in lower case:
N n U u A V T

I think the counting rule should be stated first and then applied consistently to both the favourable cases and the total.

Under the N n U u A V T rule, the number of all cases is 7! = 5040, and the number of cases with N at one end and n at the other is 5! x 2 = 240.

So the probability is 240 / 5040 = 1/21.
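The same answer can be checked with the letters labelled so that all seven are distinct; a short sketch (the labels N1, N2, U1, U2 are just for illustration):

```python
from itertools import permutations

letters = ["N1", "N2", "U1", "U2", "A", "V", "T"]      # all seven distinct
cases = list(permutations(letters))                     # 7! = 5040
both_ends = [p for p in cases if p[0][0] == "N" and p[-1][0] == "N"]

print(len(cases))                    # 5040
print(len(both_ends))                # 240 = 5! * 2
print(len(both_ends) / len(cases))   # 0.0476... = 1/21 again
```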
 
There's no need for counting in this case. The probability that the first letter is N is ##\frac 2 7##. And, the probability that the last letter is N, given the first letter is N, is ##\frac 1 6##. The probability of both is the product of these, hence ##\frac 1 {21}##.
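A rough Monte Carlo check of the same result (the trial count below is arbitrary):

```python
import random

letters = list("NUNAVUT")
trials = 200_000
hits = 0
for _ in range(trials):
    random.shuffle(letters)                 # a uniformly random rearrangement
    if letters[0] == "N" and letters[-1] == "N":
        hits += 1

print(hits / trials)   # fluctuates around 1/21 ~ 0.0476
```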
 
PS: note that this calculation applies to any two chosen positions for the N's. The probability that the first two letters are both N is the same, by the same calculation; likewise the first and third letters, or the third and sixth, etc.
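That claim is easy to verify exhaustively; a sketch over all 21 position pairs:

```python
from fractions import Fraction
from itertools import combinations, permutations

arrangements = set(permutations("NUNAVUT"))
for i, j in combinations(range(7), 2):
    favourable = sum(1 for a in arrangements if a[i] == "N" and a[j] == "N")
    print(i, j, Fraction(favourable, len(arrangements)))   # 1/21 for every pair
```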
 