Tensor contraction lowers a tensor's rank by summing over a pair of indices: a rank-2 tensor in three dimensions, with nine components, contracts over its two indices to a single scalar (its trace), which naturally raises the question of what happens to the information in the other components. The discussion notes that while the individual components are indeed discarded, the scalar that survives is coordinate-independent, much as the dot product collapses two vectors into a single invariant number. This is precisely what makes contraction legitimate and useful: it produces invariant quantities, which is essential in fields like general relativity. The Ricci scalar, obtained by contracting the Ricci tensor with the inverse metric (R = g^{μν} R_{μν}), condenses curvature information into one coordinate-independent number, even though the discarded components cannot be recovered from it, a point the discussion keeps returning to. Ultimately, tensor contraction is a fundamental operation that trades component-level information for a meaningful, invariant scalar.
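For concreteness, here is a small numerical sketch (using NumPy and Cartesian components, so index placement does not matter; it is not taken from the discussion) of that trade-off: nine components of a rank-2 tensor collapse to one scalar under contraction, and that scalar is unchanged by a rotation of the axes even though every individual component changes.

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-2 tensor in three dimensions: nine independent components.
T = rng.standard_normal((3, 3))

# Contraction over its two indices, T^i_i, i.e. the trace: nine numbers become one.
scalar = np.einsum('ii', T)

# An arbitrary rotation matrix (Q from a QR decomposition is orthogonal).
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Tensor components in the rotated frame: T'_{ab} = Q_{ai} Q_{bj} T_{ij}.
T_rot = np.einsum('ai,bj,ij->ab', Q, Q, T)

# The components differ, but the contracted scalar is invariant.
print(scalar, np.einsum('ii', T_rot))
assert np.isclose(scalar, np.einsum('ii', T_rot))
```

The component-by-component differences between T and T_rot show that information really is lost in the contraction; what remains is the frame-independent number, the same pattern exploited when the Ricci scalar is built from the Ricci tensor.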