Main Question or Discussion Point
I've been eating organic food for the past year, and I just recently heard that it's not any healthier for you. Is that true? I spend about a third of my paycheck on organic food because it's really expensive, and now I'm finding out it may not be any better for me. I just want to know if that's actually the case.