I just saw a news article about Mitt Romney calling for Obama to "take responsibility for his failures." I hate when someone says "you need to take responsibility." No one ever "takes responsibility" for good things that happen; that's called taking credit. In fact, nobody ever "takes responsibility." Taking responsibility is something that people in a position of power tell those under them to do, as a backhanded way of blaming them.