Schrodinger's Dog
whatta said:Not really, you will have short moments of increased self-respect.
There will be a lot of benefit in those short moments when you are dead, won't there?
whatta said:Not really, you will have short moments of increased self-respect.
Schrodinger's Dog said:OK so is it possible to have a truly selfless act given what you just typed, given the whole of humanity or x group?
Schrodinger's Dog said:If, for example, your conscious decision is to save 400,000 people and die yourself, and you are not religious and believe you will get no reward for your action, nor will anyone else ever benefit except, obviously, the 400,000 people — but all of them will be totally unaware your action saved them, and you will die in such a way that no one even knew you were there, and thus you will be reported missing and no one will connect your act with you, etc., is this truly selfless?
out of whack said:I don't believe so for the reason I gave in my previous post.
I think you can formulate any dilemma in any manner you wish and it would still not matter. In your example, if you are motivated to make this decision then clearly you expect that the outcome will be to save these people, and clearly you desire this outcome. The exact reasons why you desire this outcome are personal. It could be that you would not want to keep on living with the knowledge that you sacrificed 400,000 people. It could be that failure to act would violate your self-respect. Or anything along these lines. A motivator can be avoidance of a negative as well as desire of a positive. Regardless of the specific motive, the motive exists and it is what you personally want. The fact that different people would make different decisions simply reflects different personal motivators.
Schrodinger's Dog said:In other words, you know that to not act means life and to act means death but the saving of 400,000 people, yet you have no time to ponder the implications either way; in fact the decision must be made practically instantaneously and instinctively, and then of course what I said above also follows.
In essence is this a selfless act?
Ie it is not motivational exactly, it is just reflexive action.
Schrodinger's Dog said:pure instinct [...]
Ie it is not motivational exactly, it is just reflexive action.
baywax said:You still have the notion of consequence of action here. The consequence seems to be driving the action and thus appears to be motivated by the perceived consequence of saving 400,000 people. So, it is still a matter of motive, i.e. how good it feels to save 400,000 over how unknown and scary death is going to be.
The motivation of a fear of the unknown (death) cannot overrule the immediate motivation of knowing (known, [primarily exemplified in Bruce Willis movies]) how good it will feel to save 400,000 lives.
How can a person be so selfish as to save 400,000 lives then die as a result of their actions?
out of whack said:If you don't have any conscious input in your action then does the concept of selflessness (or selfishness) even apply? I think this would go outside the intent of a discussion on value theory. You may as well be talking about plants turning towards the sun: no value judgement, just a reaction.
Schrodinger's Dog said:What if someone had lived in a cave for twenty years and, upon coming out into the world, had no idea whether such an act was acceptable or not? They then had to decide instantly whether to save ten men like them and die, or to live. They have no example of what is good and bad moral conduct, or indeed any understanding of whether either decision would make them feel good or bad; they only know that if they let the ten men live, they die, and vice versa. Neither choice has any gratification prospects. The choice is a virgin choice, without preconceived morality or ideas; in fact said person would only know how he would feel after the decision was made either way.
Schrodinger's Dog said:It might at a subconscious level.
Schrodinger's Dog said:Actually I'm of the position that there is no such thing as a selfless act myself, I'm just seeing if anyone else can think of one. I tend to agree that selflessness by its definition denotes an act of morality. And morality requires a framework for a decision; without it you might as well be a robot.
Gelsamel Epsilon said:Blarrrrgh, just read the other thread.
baywax said:A truly selfless act.
Let's try to calm our extremism when it comes to seeking out this truly selfless act.
Let's try to see that there is an act with a dual purpose, one that has both selfless and selfish motives simultaneously. Considering this possibility, there truly are acts of selflessness, but they are acts with a dual purpose where the same act "in the same breath" satisfies both purposes.
For instance, the doctor who has just finished 23 hours on duty and stays another 8 hours because of an emergency surgery may be satisfying his or her ego or sense of duty, but there is an overwhelming percentage of selflessness to his or her actions as well.
He may derive some selfish pleasure from attempting to save a person's life during those 8 hours, but when you weigh, in a purely selfish manner, how much he'd rather be sleeping or at home against his actual actions, there is a huge element of selflessness keeping him at his station.
baywax said:I'm joining the position. But what makes you think we're not organic robots with moral and/or empathic programming?
Schrodinger's Dog said:What turned me off reading that is that it sounded like one of those Wittgensteinian games of "how do we define x?" and not a discussion, so I didn't bother. I can't stand playing define-the-word games: what is the ontology of ontology? If there are no words, do we exist? Blah, blah, blah... zzz. It's perhaps the most boring and unfruitful area of philosophy since Plato stood up and said "I'm more pissed than you! Prove I'm not!"
However, I am willing to admit that since I haven't read it, and since I wouldn't touch it with a ten-foot barge pole as it is described, I may well be wrong.
It's a good point, but you're assuming there isn't some sort of mathematical duality here; in other words, that increasing x cancels out decreasing y, or that they are somehow totally dependent, when in reality x and y increase or decrease independently, with some interplay.
I realize a mathematical model isn't really apt, but it will simplify what I mean:
A truly selfish act would score, say, 100 on a scale of 1 to 100 for x, with y at 0, i.e. no redeeming features.
And conversely, a truly selfless act would score 100 for y, with x at 0, i.e. no selfish motivational issues.
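The two-axis model above can be put as a minimal sketch in code (a hypothetical illustration only: the names `Act`, `x`, `y` and the 0-100 scale are my reading of the poster's description, not anything the poster specified):

```python
# Sketch of the two-axis model: selfishness (x) and selflessness (y) are
# scored independently on a 0-100 scale, rather than as two ends of one axis.
from dataclasses import dataclass

@dataclass
class Act:
    x: int  # selfish motivation, 0-100
    y: int  # selfless motivation, 0-100

def is_truly_selfish(act: Act) -> bool:
    """Maximal selfish motivation with no redeeming features (x=100, y=0)."""
    return act.x == 100 and act.y == 0

def is_truly_selfless(act: Act) -> bool:
    """Maximal selfless motivation with no selfish component (y=100, x=0)."""
    return act.y == 100 and act.x == 0

# A "dual purpose" act like baywax's tired doctor scores on both axes at once,
# so it is neither of the pure cases.
doctor = Act(x=20, y=90)
print(is_truly_selfless(doctor))  # → False
print(is_truly_selfish(doctor))   # → False
```

The point the code makes is only structural: because x and y are independent fields rather than complements of one scale, an act can rank high on both at once, which is exactly the "dual purpose" case.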
I don't think we are robots either, but then we'd have to establish that free will exists and that we are not just part of our materialist programming to really prove that.
Gelsamel Epsilon said:No, there is more than "I save my life so I can care for my children".
It is the parents' duty to do so, and they don't want to see their children brought up wrong = the selfish motivation toward saving your own life for your kids (as well as your own survival being a motivation).
Gelsamel Epsilon said:Only if you agree with your above premise.
Gelsamel Epsilon said:Love stories, romances and fantasies are not part of instinct; only sex drive is a part of that.
And I disagree with Instinct = Selfless, because Instinct =/= no-thought; if thought is required in the decision, then a weighing of options happens and a selfish direction is chosen.
You have got me thinking though; I guess you could call reflex arcs "selfless".
Gelsamel Epsilon said:What can I say other than I simply disagree with your model.
-Job- said:I think unconscious or unintentional acts can be considered selfless, but I realize we're probably talking about intentional, conscious selfless acts.
I think the base question is whether or not it's possible for an act to be unable to benefit the actor in any way.
Clearly we're not willing to take the actor's word that his/her act was selfless, so to prove that selfless acts are possible we try to find an act that will not benefit the actor in any way at all and have someone perform it. But I think that this may be impossible.
Is it the case that, for any given act, you can construct a scenario/interpretation in which the actor benefits from the act? If this is true then we can never know that the actor did not perform the act to reap the benefits of that one specific scenario.
I think selfless acts are possible, what is impossible is to verify or know beyond any possible doubt that the act was indeed selfless.
-Job- said:Is it the case that, for any given act, you can construct a scenario/interpretation in which the actor benefits from the act? If this is true then we can never know that the actor did not perform the act to reap the benefits of that one specific scenario.
moving finger said:If we qualify this to say "Is it the case that, for any given conscious act, you can construct a scenario/interpretation in which the actor benefits from the act?" then I think the answer is yes. All conscious acts provide feedback to the person doing the act - by definition, we carry out a conscious act because we have consciously chosen to carry out that act, and that conscious decision has repercussions on our perceptions of the world and of ourselves in light of that act. Our decision is made for a reason or reasons (otherwise it would simply be a random selection rather than a conscious decision), and we can never be certain that all of those reasons are completely selfless.
Only in the case of perfectly random acts (which we can be sure are genuinely random) could we be sure that there are no selfish "reasons" for the act (simply because there are NO reasons for a genuinely random act!).
MF